

Do Therapy Apps Really Protect Your Privacy?

Photo: fizkes (Shutterstock)

It’s no surprise that after a year of unprecedented (remember that word?) isolation, there was a significant surge in users seeking out remote therapy and mental health apps. In terms of increasing mental health awareness and decreasing mental health stigma, more people seeking out help is a positive thing. However, as with anything you download onto your phone, therapy apps come with data and privacy concerns—perhaps even more concerns than other apps, considering the sensitive nature of what you disclose to virtual therapists.

Text-based therapy and mental health apps like Talkspace, MindDoc, and (perhaps the most well-known) BetterHelp offer major benefits. These apps are typically more affordable and certainly more convenient than in-person appointments. But in 2020, Jezebel reporters investigated the “loosely regulated world of online therapy,” with a particular focus on BetterHelp. Similarly, earlier this year, researchers in Consumer Reports’ Digital Lab evaluated seven popular therapy apps, including BetterHelp, to uncover what happens to your personal information after you share it with the app.

Read on to see what these reports found, so you can make an educated decision about whether text-based therapy apps are right for you.

What information does BetterHelp say it protects?

BetterHelp claims to take confidentiality seriously. Here’s what they list out in their FAQ about privacy protection:

  • Everything you tell your therapist is confidential.
  • We don’t cooperate or work with any insurance companies so nothing needs to be shared, reported or filed with them.
  • You can always click the “Shred” button next to each message that you’ve sent so it will no longer show in your account.
  • All the messages between you and your therapist are secured and encrypted by banking-grade 256-bit encryption.
  • Our servers are distributed across multiple Tier 4 AWS Data Centers for optimal security and protection.
  • Our browsing encryption system (SSL) follows modern best practices.
  • Our databases are encrypted and scrambled so they essentially become useless in the unlikely event that they are stolen or inappropriately used.

Additionally, according to Eric Silver at eCounseling.com, BetterHelp’s browsing encryption system (SSL) is provided by Comodo, “a world leader in data security.”
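
To put those claims in more concrete terms, “banking-grade 256-bit encryption” generally refers to a cipher like AES-256: data encrypted with a 256-bit key is unreadable to anyone who obtains the ciphertext without the key. The short Python sketch below illustrates the general idea only; it uses the open-source cryptography package and is not BetterHelp’s actual implementation, which the company doesn’t publish.

    # Generic illustration of 256-bit (AES-256-GCM) encryption at rest.
    # This is a sketch of the technique, not BetterHelp's code.
    # Requires: pip install cryptography
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # In a real service the key would live in a key management system,
    # never stored next to the data it protects.
    key = AESGCM.generate_key(bit_length=256)
    aesgcm = AESGCM(key)

    message = b"Hypothetical therapy message"
    nonce = os.urandom(12)  # AES-GCM requires a unique nonce per message

    ciphertext = aesgcm.encrypt(nonce, message, None)
    print(ciphertext.hex())  # gibberish to anyone without the key

    assert aesgcm.decrypt(nonce, ciphertext, None) == message

The point the FAQ is making is that a stolen database full of ciphertext like this is useless on its own. The separate question, covered below, is what the company itself does with your data while it holds the keys.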

So what does all this mean for you? Silver argues that the measures described above ensure that your information is supremely safe. He even goes on to say that files saved to an in-office therapist’s computer are “just as [if not more] susceptible to attack as BetterHelp.” However, I’d argue that many users’ concerns are less about hackers accessing their encrypted information and more about how BetterHelp itself deliberately collects and shares that information with third parties.

What does the privacy policy mean?

Privacy policies aren’t exactly well-known for their clarity, and BetterHelp’s most recently updated policy is no exception. Jeff Guenther, a professional therapist who goes by @therapyden on TikTok, recently went viral for exploring what he finds problematic about BetterHelp’s practices. In several videos discussing the app, Guenther digs into the company’s claim that “the data [BetterHelp] collects is not used for marketing or any other purposes except as specified in this Privacy Policy.” Guenther believes this could be rephrased as BetterHelp saying “the way we collect data is not being used for marketing except for when we use it for marketing.”

In addition to standard demographic info (like your age, gender, and location), some of the most notable data points that BetterHelp collects fall under “visitor activity.” This includes how often users log into the app, how long they spend online, and how many messages they exchange with their therapist. The Jezebel report points out that although we’re conditioned to share every thought we’ve ever had on social media, it still feels “pretty spooky” that BetterHelp would share how often you talk to your therapist with third parties like Snapchat and Pinterest—“even if it’s covered in the fine print.”

And even if this sounds like “normal tech company shenanigans,” Guenther adds, “it is not normal in the mental health industry.” (It should be noted that Guenther closes his videos by asserting that no one should feel “guilty” for using BetterHelp, and that there are “amazing” therapists on that app; he takes issue with the company’s privacy policies, not people who use these kinds of apps.)

Where does your information go?

When Jezebel reporters signed up for BetterHelp in order to monitor what kinds of information the company collected and shared, they discovered that—under the “ostensible goal of better tracking user behavior”—users’ sensitive information does indeed get shared with advertisers. BetterHelp responded to Jezebel saying its methods were standard and that they “typically far exceed all applicable regulatory, ethical and legal requirements.” And they’re not wrong: Jezebel’s reporting found that BetterHelp is well within its legal rights to share your therapy habits with Facebook. This means that until the laws change, the issue is the same one we face every time we “read” the “terms of service” for any app: We hand over intimate information to tech companies that protect themselves in the fine print.

What about HIPAA?

HIPAA is the key federal health data law that protects your information—sometimes. Unfortunately, “technical practices have moved past what laws like HIPAA were designed to address, and until regulations evolve, these companies owe it to consumers to do better,” says Bill Fitzgerald, a privacy researcher in CR’s Digital Lab who led the mental health app research.

BetterHelp specifically says that all their therapists are HIPAA compliant. This is notable when you consider that HIPAA doesn’t apply at all to more loosely defined “mental health” apps, as Consumer Reports explains:

The law doesn’t protect data just because it’s related to your health. It applies only to information collected and held by “covered entities,” such as insurance companies, healthcare providers, and the “business associates” that provide them with services such as billing. Those companies sign business associate agreements that restrict them from doing anything with the data other than help the providers run their businesses, unless they get explicit permission from a patient.

However, Consumer Reports’ investigation found that BetterHelp’s HIPAA protections get blurry, particularly when the app sends data to social media platforms like Facebook. BetterHelp president Alon Matas told Consumer Reports that Facebook “can use data from other apps on an aggregated level, not on an individual basis.” The major takeaway here is that companies like Facebook can easily take the fact that you use a mental health app and then combine that knowledge with other data points collected by the app. Taken together, it’s safe to assume the ads that get shown to you are informed by your online therapy habits. So, while Facebook’s policies say that sensitive data like your medical symptoms isn’t used for targeted ads, mental health apps are still providing a range of data points that could be considered fair game.

Making an educated decision

To be fair, the privacy issues explored above are by no means exclusive to BetterHelp or any other mental health app. Consumer Reports points out the similarities between mental health services and basically every app we download (in particular, their report highlights the fact that all these apps assign unique IDs to individual smartphones, which can be tracked and combined with other data for targeted advertising). The question that remains is whether it’s okay for mental health services to work the same as all the other apps that we’ve grown accustomed to harvesting our data.
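
To make that mechanism concrete, here’s a deliberately simplified, hypothetical sketch (the field names, IDs, apps, and events are invented for illustration; this is not any real company’s pipeline) of how a shared device advertising ID lets an ad platform join events from unrelated apps into a single profile.

    # Hypothetical example: joining app events on a device advertising ID.
    # All IDs, app names, and events below are invented for illustration.
    from collections import defaultdict

    events = [
        {"ad_id": "device-a", "app": "therapy_app", "event": "message_sent"},
        {"ad_id": "device-a", "app": "shopping_app", "event": "viewed_sleep_aids"},
        {"ad_id": "device-b", "app": "game_app", "event": "level_completed"},
    ]

    # One join on the shared ID is all it takes to build a cross-app profile.
    profiles = defaultdict(list)
    for event in events:
        profiles[event["ad_id"]].append((event["app"], event["event"]))

    print(profiles["device-a"])
    # [('therapy_app', 'message_sent'), ('shopping_app', 'viewed_sleep_aids')]

Even though no message content appears anywhere in that data, the profile for “device-a” already reveals that the person behind it uses a therapy app, which is exactly the kind of signal ad targeting runs on.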

Mental health is incredibly personal, and in an ideal world, we wouldn’t have to worry about how our information gets shared and misused by services like BetterHelp. Alas, until laws and regulations catch up, your decision to use text-based therapy apps comes down to your idea of informed consent. If you’re already comfortable posting about your mental health on Facebook directly, then using mental health apps might be worth the privacy risks for you. Remote therapy options are critical tools for many people, and for now, these privacy concerns don’t mean you have to dismiss those options entirely.

Finding what works for you

We’ve previously covered how to tell whether teletherapy could work for you, as well as the pros and cons of text-based therapy (especially for anyone who wants to give it a shot but isn’t ready to commit to in-person sessions). And finally, here’s our guide to finding the right therapist for you.

  


