I like to begin my day with a bowl of cereal. It’s quick, light and, if I make the right choice, good for me. I even have the freedom to make a less healthy choice – but when I go to the supermarket and look over the array of options in front of me, I need only look at the back of the box to know what I’ll be eating and, most importantly, whether the ingredients are nutritious.
Today’s vast market in mental health apps feels similar to picking cereal. With waiting lists for NHS mental health services being months long, apps appear to have the potential to ease pressure and are relatively low cost. Their accessibility and light-touch approach suggest that individuals in need of support will be able to manage their own mental health.
But increasingly, we’re seeing reports that, for all their promise, mental health apps may not be all they’re made out to be. With questions being asked about the need for regulation, is it possible that these apps are doing more harm than good?
The frustrating answer is: we don’t yet have enough information to say one way or the other.
Unlikely to cause damage, but may not support wellbeing
It’s unlikely that most apps are actively damaging people’s mental health, although some encourage behaviour that’s unlikely to support wellbeing.
To get around the issue of responsibility, many apps categorise themselves as wellness rather than therapy. They cannot offer advice that would need to be regulated, but they can point to services that may offer more help. This approach also reduces their responsibility for monitoring problems such as someone reporting that they are going to self-harm.
Apps are also different from face-to-face therapy, as they’re generally designed to be used in short, ten-minute bursts that are accessible as and when they’re needed.
There’s certainly plenty of choice. App stores are bloated with options offering different levels and kinds of support. Unfortunately, hardly any offer extensive evidence of their effectiveness – in terms of controlled trials and in-depth evaluation rather than user reviews – and even if they did, the app store wouldn’t tell you that before you downloaded.
This leaves potential users in a situation where they don’t know what they’re getting, and it could be stopping them from accessing actual evidence-based care.
Effective interventions should always be based on evidence, but they also require the user to engage with them over a period of time. While they’re easy to download, apps are also easy to ignore. There are numerous examples of trials using app-based interventions in which participants download but never actually open the app, or which show a very steep drop-off in engagement after a few sessions.
It’s clear to me that these apps are more than a passing fad. When the National Institute for Health and Care Excellence (Nice) made the decision to approve eight online interventions in March 2023, I (cautiously) welcomed the idea. Digital therapies have the potential to offer additional support for people in need and provide a welcome bridge between sessions of therapy.
Importantly, these eight apps will be scientifically assessed for how and where apps can be effective in the real world, laying the foundations for the rest of the market to follow.
Four principles for mental health apps
More than anything, people need to know what they’re getting, and we need to see greater transparency from providers. This is likely to increase their market share, so it really is in their interests.
In 2019, Stephen Schueller and I set out four principles that all mental health apps should work towards, in order to provide the most transparent possible offering to their users. What personal information is collected, and how is it used? Were target users involved in the design of the app? How much should you use it, and is it safe? Are there measurable benefits to using the app?
My view in 2019 was that formal regulation wasn’t necessarily needed. Anyone making false claims in an app can be reported to advertising standards authorities, and anyone providing actual therapy already operates within a regulated market – although it’s always worth reminding those seeking therapy to do their due diligence and find the therapist that is right for them.
My view about regulation hasn’t changed, but this isn’t to say that no changes are needed. While set up as libertarian havens, app stores such as Google Play and Apple’s App Store need to adopt some rules for the way that health apps are marketed.
By tightening up what’s considered a “health” app and setting out clear rules like those I’ve listed, mental health app consumers should have greater confidence that the app they’re downloading has the power to help them. Essentially, consumers need to be able to choose their mental health apps as easily as I can make a choice about my cereal.