For Health Apps, Questions Over Privacy and Efficacy

Yves here. Yours truly is a privacy fetishist (a very much uphill battle), particularly about sensitive data like health information. For instance, I inconvenience myself in how I deal with my insurer so that they do not have access to, or even the right to obtain, my medical test results. Most people don’t have the desire to be that stringent. Even so, I can see ways for “apps,” as in programs, to compile health data, like blood sugar or blood pressure readings over time, and apply some metrics that would be helpful. But uploading data to the cloud is integral to just about all of these tools. That seems to me unduly risky, particularly since this article questions whether these apps are as beneficial as they are touted to be.
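To make that point concrete, here is a minimal, hypothetical sketch of the kind of metric such a program could compute entirely on the user’s own machine, with no cloud upload involved. The file name `bp_log.csv` and its column layout are assumptions for illustration, not any real app’s format:

```python
# Hypothetical sketch: the kind of local-only analysis described above.
# Assumes readings live in a CSV on the user's own machine
# (columns: date, systolic, diastolic) -- nothing leaves the device.
import csv
from datetime import date
from statistics import mean

def load_readings(path):
    """Read (date, systolic, diastolic) rows from a local CSV file."""
    readings = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            readings.append((
                date.fromisoformat(row["date"]),
                int(row["systolic"]),
                int(row["diastolic"]),
            ))
    return sorted(readings)  # oldest first

def recent_average(readings, window=7):
    """Mean of the last `window` readings -- a simple trend metric."""
    systolics = [s for _, s, _ in readings]
    diastolics = [d for _, _, d in readings]
    return mean(systolics[-window:]), mean(diastolics[-window:])

if __name__ == "__main__":
    readings = load_readings("bp_log.csv")  # hypothetical local file
    sys_avg, dia_avg = recent_average(readings)
    print(f"Average of last 7 readings: {sys_avg:.0f}/{dia_avg:.0f} mmHg")
```

Nothing about computing a rolling average requires a remote server; the cloud dependency in most commercial apps is a business-model choice, not a technical one.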

By Claudia López Lloreda, a senior contributor at Undark and a freelance science journalist covering life sciences, health care, and medicine. Originally published at Undark.

In recent years, health apps have seen an explosion in development and usage: Today, there are about 350,000 apps available to help people with everything from tracking weight, to finding a therapist, to helping identify skin lesions.

But while studies have found that some of these apps — known as “mHealth” apps, short for “mobile health” — can help individuals manage their health, questions persist about their effectiveness and risks when deployed in the real world, as well as whether they are appropriately regulated.

Such risks can include mistakes in diagnostic assistance and inaccurate care recommendations. Some users also decry negative interactions with the apps, such as stigmatizing and distressing language, or worry about privacy breaches. Meanwhile, the health apps that do provide effective guidance can falter or shut down, leaving people without care they depend on.

According to one analysis, the market size for mHealth apps in 2023 was around $32 billion, and analysts estimate that the market will only continue to grow. But even with the benefits of some mental health apps, Kang said, the lack of regulation is a problem.

“I’m an enthusiast for these technologies,” said Stephen Gilbert, who researches medical device regulatory science at Dresden University of Technology in Germany. But, he added, the public must understand how they work, and regulators should “take out the really worst performing companies who are being clearly irresponsible.”


Today, appropriate and affordable health care can be hard to come by for some. In 2023, about 8 percent of the U.S. population lacked health insurance, limiting access to care. Mobile health apps could broaden access to health care services and help fill that void.

A 2022 review found that the apps can provide timely support for patients, ease health care costs, and improve clinical outcomes. And an analysis published in 2024 found that mental health apps could help reduce depression and anxiety symptoms.

There are several strong meta-analyses that back up “that mental health apps can actually improve mental health,” said Kostadin Kushlev, a psychology researcher at Georgetown University. “If you look at the literature, there is evidence that they can work.”

But other research has flagged small effect sizes. And Kushlev also pointed out that studies are typically done in controlled settings; research into the effectiveness of mental health apps once they are on the market is relatively lacking. “It’s not clear whether these apps actually work at this very fundamental level.”

There is also little evidence showing how these apps, particularly mental health apps, compare to standard treatment. For example, some app-based therapy has been found to mitigate depression symptoms, but it’s still not clear how that compares to seeing a therapist.

Choosing an app over typical therapy could also come with risks: “If somebody is actually really depressed and maybe suicidal, any day they’re losing and not getting proper care could be essentially a life and death situation,” said Kushlev. In 2024, a platform called BetterHelp, which is advertised as an online counseling and therapy provider, was sued by a former customer who claimed the app matched her with a therapist who was not licensed in her state. In response to a request for comment that BetterHelp senior public relations manager Megan Garner asked be attributed to a spokesperson, the company noted: “As a matter of policy, we do not comment on active legal proceedings.”

Some users already report negative experiences with health care apps and perceive them as providing subpar care. A recent study led by Kang, published as a peer-reviewed conference paper, scoured mental health app reviews in the Google Play and Apple App stores. In an analysis of around 6,000 reviews of 36 apps, Kang focused on those that reported a negative experience. For example, some features remained behind paywalls, which users said made them feel unsupported and undeserving of mental health care. They also reported feeling distressed thinking their private data could be shared with or sold to third parties, such as Facebook. While doctors are bound by strict rules of confidentiality, those same regulations don’t apply to apps. One study revealed that fewer than half of mobile apps for depression had a privacy policy.

Some apps make it particularly tricky for users to cancel in an effort to keep them subscribed. Other apps may opt for aggressive marketing or scare tactics to retain users or push paid subscriptions, which can be especially harmful for vulnerable populations.

One health app user told Undark that she experienced this firsthand. Earlier this year, she signed up for an app called Reframe, which claims to help users curb drinking through a “neuroscience-based program.” However, when the user — who asked to remain anonymous because of concerns about stigma — canceled her subscription, Reframe sent her an email warning her that “Your Next Drink Could Cause Sudden Liver Rupture.” While prolonged excess alcohol intake can cause liver damage, sudden rupture of the organ is uncommon and is not generally associated with drinking.

“It felt like such an aggressive email to send to people who are probably at a vulnerable place,” she said. It also felt “incredibly manipulative and irresponsible, especially from an app that claims to be science-based,” she added. An email to Undark that Reframe’s support team attributed to CEO and co-founder Vedant Pradeep noted that “the vast majority of our 32K+ reviews on the app store indicate that Reframe is actually among the most empathetic and judgement-free platforms available. But with the volume of people we’re helping, there will always be those who feel differently — and they’d be right. There is a lot more we can do to improve Reframe to be able to cater to everyone.”

Apps can also be outright wrong in their assessments and care recommendations. For example, early research into apps to help detect skin lesions and melanoma found that they were associated with a high likelihood of missing melanomas. A review by Australia’s health department cited research that other apps, such as those used to calculate drug dosages, can also be wrong, with some errors resulting in a significant change in prognosis.

Kang noted that “whenever anything new and exciting comes out” in the health care sphere, it’s common for the public to focus on the benefits and advances. But, she added, “it’s not until someone actually gets hurt that you start to dig more deeply into those risks.”


Experts say it’s critical that consumers get help to navigate the health app landscape, and have called for more regulation. But it’s not entirely clear exactly who should step in. “There seems to be kind of a void in the responsibility,” said Tera Reynolds, an assistant professor in the Department of Information Systems at the University of Maryland, Baltimore County (UMBC).

Many mobile health apps available to users are not regulated by one consistent mechanism. Some are classified as medical devices and thus fall under the purview of the Food and Drug Administration, while others are subject to a patchwork of regulations depending on the app’s target users and its functions. The 21st Century Cures Act, a law designed to accelerate medical advancements, excludes some health apps from FDA oversight by classifying them as “general wellness” tools rather than medical devices; examples include apps that claim to help manage weight or increase mental acuity. And much of the data collected by apps is not covered by HIPAA, the privacy law that protects sensitive health information.

Apps are subject to regulation by the Federal Trade Commission, which may intervene if an app is reported for making deceptive claims or failing to abide by rules like the Health Breach Notification Rule, which requires companies to notify customers following unauthorized acquisition of their personal health information. In the case of BetterHelp, the FTC ordered the company to pay out $7.8 million after it shared consumers’ health data with advertisers. The settlement, BetterHelp noted in a statement on its website, was not an admission of wrongdoing.

The diversity of functions and designs across apps may make consistent oversight uniquely challenging; ultimately, experts write, it falls to the apps to regulate themselves. To make mental health apps safer and more effective, Kang suggested that app developers work directly with clinicians and health providers. And Gilbert added that developers should monitor complaints from users to improve and refine their apps. Clinicians, too, should be aware of which apps have evidence behind them and exercise caution when recommending specific ones to patients, according to the American Psychiatric Association.

Still, it is difficult to regulate the ever-growing market of health apps, so identifying which ones are appropriate for a specific need may ultimately fall to the consumer. In collaboration with a group that studies software ethics at UMBC, Kang is currently working to develop a sort of digital nutritional label that can provide users with information they can use to determine which apps might be helpful and which might be harmful to them. Other researchers have proposed similar app labels, which could highlight, for example, how many peer-reviewed studies support an app’s claims, or signal how well an app secures personal user data.
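As a rough illustration of the idea, here is a minimal sketch of what such a label might encode if expressed as a data structure. The field names below are assumptions chosen to mirror the criteria mentioned above (peer-reviewed evidence, privacy, data security), not the actual schema of the UMBC project:

```python
# Hypothetical sketch of a digital "nutrition label" for a health app.
# The fields are illustrative assumptions, not the UMBC project's design.
from dataclasses import dataclass, asdict
import json

@dataclass
class AppLabel:
    app_name: str
    peer_reviewed_studies: int       # published evidence behind the app's claims
    has_privacy_policy: bool
    shares_data_with_third_parties: bool
    data_encrypted_at_rest: bool
    regulatory_status: str           # e.g. "FDA-cleared device" or "general wellness"

def render_label(label: AppLabel) -> str:
    """Serialize the label so a store listing could display or compare it."""
    return json.dumps(asdict(label), indent=2)

# Made-up example app, for illustration only.
example = AppLabel(
    app_name="ExampleMoodTracker",
    peer_reviewed_studies=0,
    has_privacy_policy=False,
    shares_data_with_third_parties=True,
    data_encrypted_at_rest=False,
    regulatory_status="general wellness",
)
print(render_label(example))
```

The appeal of a standardized label is that it would let consumers compare apps on the same few dimensions at a glance, much as food labels let shoppers compare calories and ingredients without reading the underlying science.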

“There is potential for apps in almost any of these spaces to do either good or harm,” said Gilbert, “depending on how responsibly they are developed.”


3 comments

  1. Solideco

Yves – what do you do to keep your insurer from seeing your test results? Admittedly, I never considered that this was even possible and would love to know how (presuming you are willing to share).

    1. Yves Smith Post author

It’s a function of my policy. I have an old-fashioned indemnity plan: I could give practitioners my card, let them get paid by the insurer, and have them chase me for my share. If the insurer is the direct payer, they have the right to the MD records. I instead pay and submit for reimbursement. They only have the right to the records in the event of a dispute over whether the service was necessary. They don’t bother with routine bills, and big stuff like surgeries has to be pre-approved. They still do get billing codes and procedure codes.

  2. The Rev Kev

I think that a good rule of thumb is that if you download an app to track your health, that app will always sell out your private health information to anybody that has a buck. And it won’t even matter if you actually paid for that app. Silicon Valley has an unending thirst for people’s data, and some of the most profitable data that they can collect is that to do with people’s health. And now with AI being jammed into some of these apps, the situation will grow worse. It’s like the Facebook way of operating. You give them your personal private information for free – and then they sell you out to all and sundry.

