The Risk Isn't Hypothetical

Data breaches in mental health care aren't some far-off risk -- they've already happened, exposing deeply personal information to the world. And with new AI platforms (and existing EHR platforms) recording and/or storing entire sessions and transcripts, the damage from the next breach will be far worse.

If you're a therapist or a client, it's easy to assume your data is safe. Of course companies don't want a data breach. But privacy rules, tech safeguards, and good intentions can only go so far. Mistakes and oversights happen. And in mental health care, breaches have already happened -- in ways that turned people's lives upside down.

Wired wrote a piece on a breach that took place a few years ago, which you can read in full here. (But we'll also give you a quick recap!)


(Screenshot of the article from Wired.com.)

So here's what happened:

In 2020, Finland's largest network of private mental health providers, Vastaamo, was hacked. Not just names and addresses -- but therapists' actual session notes, line by line. It's the kind of breach that forces you to think differently about what "private" means.

Tens of thousands of people got ransom emails: pay €200 (about $230) in Bitcoin, or your most personal conversations would be posted online. Miss the deadline and the price jumped to €500 (about $570). Some victims saw their files leaked anyway -- details about suicide attempts, childhood abuse, addictions -- permanently searchable on the internet.

One victim summed it up:

"Being honest about my mental health turned out to be a bad idea."

He worried about identity theft, about employers or strangers finding his history, and about his own mother discovering things he had only ever told his therapist.

And this happened in Finland, a country often held up as a digital health leader, with national health systems that are generally considered secure. Vastaamo wasn't a shady fly-by-night operation. It was popular. Well-funded. Growing fast. And still, a mix of weak technical choices and gaps in oversight meant the door was wide open.

Here's the part that should make people stop and think: Vastaamo's breach involved therapist-written summaries of sessions -- essentially the same kind of data a traditional EHR platform stored just a few years ago. But many of today's new AI mental health startups (and EHR platforms) want to go much further: recording entire therapy sessions, transcribing them word for word, and storing those transcripts, even if just temporarily. If a breach like Vastaamo's is devastating, a breach at one of these companies could be catastrophic. We're talking about not just notes, but every single thing said in the room, preserved for a hacker to publish -- and for anyone to download.

If you work in mental health, the lesson isn't "be scared". It's "be honest". Digital systems always carry risk. Even if you do everything right, you can't control what happens once data leaves your hands. And in therapy, "data" isn't just a name or number in a spreadsheet -- it's a record of what someone trusted you enough to say out loud.


We're not trying to be Debbie Downer here. And people who speak out against unnecessarily recording therapy sessions aren't being alarmists or fear-mongering. They're being realistic. Pretending the risk doesn't exist doesn't make it go away. That's exactly why we want to highlight that this has already happened -- just a few years ago.

At Quill, we've made a simple decision: we don't record your sessions at all. We're not sitting on a massive database of everything your clients ever said, because that database is exactly what hackers dream about. We never want to handle that data in the first place. Instead, you give us a session summary after the fact, we help you turn it into clean, professional notes, and that's the end of it. (And we don't store that data either.)

Regardless of whether you choose to work with us at Quill, please don't be persuaded by the companies pouring millions of dollars into advertising and marketing to convince you that recording your sessions will be fine. Because it won't always be fine.

Breaches aren't hypothetical. They've happened. They will happen again. The best protection is to avoid storing anything you don't absolutely need -- and to think twice before creating it at all.

Published on Aug. 11, 2025.

Tags: AI, Data Breaches, General Privacy

Quill Therapy Solutions

What is Quill?

Quill streamlines progress notes for therapists, saving time by generating notes from a verbal or typed session summary. With privacy at its core, Quill never records client sessions, protecting the therapist-client relationship and avoiding ethical and confidentiality risks. Just record a summary, click a button, and Quill generates your notes for you.

Try Quill for free today, no credit card required. And for unlimited notes (and other types of therapy documentation), it's only $20/month. (Even less for teams.)

Try Quill and save time on notes.