Expanded from an earlier post titled “Can you trust your smartphone to be your therapist?”
I’m a long-time advocate for therapy, and as you can tell, I’m very open about it. I can’t imagine life without a person I can share anything with, free of any possible adverse consequences. I realize that there are multitudes of people who find therapy scary, frivolous, or simply unaffordable and inaccessible. It can feel unrealistic to carve out time for therapy, and to some it might even seem selfish when there is so much else going on in the world. The barriers to getting help are myriad.
The advent of e-therapy addressed many of those barriers. Forms of mental health support via the internet have existed since the 1980s, and email and chatroom counseling became more widely available through the 1990s and 2000s, but the industry came into its own in 2012, when Talkspace launched.
According to Talkspace’s site, it is “an online therapy platform that quickly evolved to offer unlimited messaging therapy. It wasn’t the first company to provide chat rooms for therapists and clients to work together. It did, however, provide online therapy on a historic scale.” 
BetterHelp came along a year later with a mission of “Making professional counseling accessible, affordable, convenient – so anyone who struggles with life’s challenges can get help, anytime, anywhere.” 
There is even a therapy service, The Difference, that you can summon through Amazon Alexa. And there are hundreds of mental health apps available, many of them free. (Here’s a list: https://www.psycom.net/25-best-mental-health-apps )
These options give anyone with a smartphone the ability to get the help they need. They offer users anonymity, a range of price options, and access to someone 24/7. It seems like a winning solution.
As in any situation, though, it is best to understand the full picture. Despite the promise of anonymity, there is an astounding amount of data that you as a user broadcast, and that data can be used to the company’s advantage, not just your own.
According to an investigation by the New York Times, several Talkspace employees say the company “regards treatment transcripts as another data resource to be mined.”  Additionally, two former employees said the company’s data scientists shared common phrases from clients’ transcripts with the marketing team so that it could better target potential customers.
Both Talkspace and BetterHelp have interesting origin stories: both were developed and helmed by entrepreneurs, not mental health professionals. And both are for-profit companies, which is fairly uncommon in the world of mental health (with the exception of places like recovery centers, which are a very different issue).
In a 2019 article, Alon Matas, the founder of BetterHelp, is quoted as saying, “We’re growing exponentially and we have barely scratched the surface…I’m committed to keep it growing. I really feel lucky that we’re able to do something that touches people’s lives so directly and does social good, and makes a good profit for shareholders.”
When I first read this, it struck me as horrifying. The idea of profiting from people’s illness seems incredibly wrong. I started to ask myself, though, whether my repulsion was rooted in older thinking and in the long-standing stigma around mental illness, a stigma that needs to be challenged. Both Talkspace and BetterHelp claim that destigmatizing mental health is core to their mission. And as I mentioned, I am very open about going to therapy. Perhaps this is a positive change?
The site Jezebel signed up for BetterHelp to better understand what the company does with its users’ data. According to Jezebel, they “used software to collect, analyze, and eavesdrop on the data sent from the application to its various servers.” They found that BetterHelp told Facebook, Google, Snapchat, and Pinterest that its users were considering treatment from BetterHelp. This does not sound out of the ordinary; after all, once I started researching this post, my various social media feeds began sprouting ads for both services. That’s just how the world works now.
Jezebel goes on to state that Facebook was alerted each time they went into the app, and that the metadata sent to Facebook included when and how long they were on the app, as well as their location. They were able to trace how their data was shared even further, to a research and analytics firm called MixPanel. They state that the data was anonymized according to HIPAA requirements. Again, is this out of the ordinary? Is it cause for concern?
Is there truly a problem with a company that delivers therapy being profit-driven? Why wouldn’t a company use the rich data that is theirs (based on users’ consent) to improve its product and reach a broader audience?
I’m not here to villainize e-therapy at all. As I stated, I advocate for therapy and think that every human being could benefit from it at various points in their life, and the increased access is extremely positive. That said, users need to think through how they are getting that help. To me, though, it’s most important that they DO get help.
And please, if you or someone you know is in crisis, call the National Suicide Prevention Lifeline (Lifeline) at 1-800-273-TALK (8255), or text the Crisis Text Line (text HELLO to 741741). Both services are free and available 24 hours a day, seven days a week. The deaf and hard of hearing can contact the Lifeline via TTY at 1-800-799-4889.