Last week, I received an email from TherapyStreamline informing me that my abandonment issues had been downgraded to the Standard tier. Apparently, my childhood neglect didn’t meet the severity threshold for Premium processing. The algorithm suggested I might want to upgrade my subscription or, failing that, consider developing a more marketable form of psychological damage.
I was reminded of the medieval practice of selling indulgences. For a fee, the Church would reduce your time in purgatory. TherapyStreamline has simply automated the process and added push notifications.
The company’s pitch is seductive in its simplicity: artificial intelligence can categorise your psychological baggage with the same efficiency it uses to recommend television programmes. Your traumas get sorted into neat subscription tiers. Standard users receive validation for common anxieties—workplace stress, mild social phobia, that persistent feeling that everyone’s lying to you. Premium subscribers unlock the good stuff: complex PTSD, intergenerational trauma, the kind of dissociative episodes that require actual clinical vocabulary.
It’s psychological triage by price point. Suffering as a service.
The Enclosure of Inner Space
The commodification of mental health didn’t begin with TherapyStreamline, of course. We’ve been turning emotional labour into profit centres for decades now. Instagram influencers monetise their breakdowns. TikTok therapists dispense CBT techniques between advertisements for meal replacement shakes. LinkedIn has become a confessional booth where executives perform vulnerability whilst angling for speaking fees.
But TherapyStreamline represents something more ambitious: the complete privatisation of psychological legitimacy. Your pain isn’t real until it’s been categorised, quantified, and assigned a monthly billing cycle.
The algorithm works like this: you complete an initial assessment—two hundred questions ranging from “How often do you feel sad?” to “Did your parents’ divorce involve litigation or just screaming?” The AI analyses your responses, cross-references them with your income data (voluntarily provided, naturally), and determines which tier of service your trauma deserves.
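I haven't seen the codebase, obviously, and every field name, weight, and threshold below is my own invention, but if the marketing copy is honest, the tier assignment can hardly be more sophisticated than this sketch:

```python
# Hypothetical reconstruction of the tier assignment. I have no access
# to TherapyStreamline's code; every field name, weight, and threshold
# below is invented for illustration.

def assign_tier(assessment_answers: list[int], annual_income: float) -> str:
    """Score the two hundred questions, then let income decide how
    seriously the score is taken."""
    # Assume each answer is self-reported distress on a 0-10 scale.
    severity = sum(assessment_answers) / len(assessment_answers)

    # The means-testing step: comfortable users get their suffering
    # discounted before the tiers are even consulted.
    if annual_income > 60_000:
        severity *= 0.7  # "likely lifestyle-related"

    if severity >= 8.0:
        return "Platinum"  # verified and approved for processing
    if severity >= 5.0:
        return "Premium"   # human contact, Tuesdays 2-4 PM
    return "Standard"      # Dr. Sarah: "That sounds really difficult"
```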
Standard tier gets you chatbot sessions with an AI therapist named “Dr. Sarah,” who offers the kind of generic validation you could get from a particularly empathetic autocomplete function. “That sounds really difficult” appears to be her primary training data.
Premium tier unlocks video sessions with actual humans, though these occur at the pleasure of the algorithm’s scheduling optimisation. If your panic attack isn’t scheduled for a Tuesday between 2 and 4 PM, you’ll need to manage it on your own.
The Platinum tier—because of course there’s a Platinum tier—offers something called “Trauma Concierge Service.” I haven’t worked out what this means, but the promotional materials show a woman in expensive activewear crying into a headset whilst her smartwatch monitors her cortisol levels.
The Violence of Optimisation
Here’s where it gets interesting. The algorithm doesn’t just categorise your trauma; it judges whether you deserve to have it.
Users have reported receiving notifications suggesting their depression might be “circumstantial rather than clinical” based on their income bracket. One woman was told her anxiety was likely “lifestyle-related” because she earned enough to afford yoga classes. The implication being: you’re financially comfortable, so your suffering must be optional. Have you tried simply being grateful?
This is means-testing for emotional validity. The algorithm has learned what we’ve always suspected: society believes your pain is inversely proportional to your bank balance. Suffering is for the poor. The comfortable are merely having “a moment.”
The company insists this is a feature, not a bug. Their FAQ explains that the AI is “helping users maintain perspective on their mental health challenges within their broader life context.” Which is Silicon Valley for: we’ve automated the dismissive relative who tells you to cheer up because children are starving in Africa.
I’ve seen the internal metrics. The algorithm occasionally flags users for having “disproportionate emotional responses to objectively minor stressors.” In other words, it’s developed the capacity for gaslighting. Your trauma isn’t trauma; it’s a character flaw disguised as a chemical imbalance.
The Subscription Model of Suffering
TherapyStreamline’s founders include two former Spotify product managers and a Stanford psychiatrist who appears to have had a nervous breakdown and emerged believing that mental health is just another content library requiring better recommendation engines.
In a recent interview, the CEO explained that they’re “democratising access to mental health support.” This is a fascinating use of the word “democratising,” which apparently now means “restricting service quality based on ability to pay whilst pretending the algorithm is objective.”
The pricing structure tells you everything you need to know. Standard tier is nine pounds ninety-nine per month—roughly the cost of two fancy coffees, according to their marketing materials. Because your childhood trauma should cost the same as a week’s caffeine habit.
Premium is forty-nine pounds ninety-nine. For that, you get human therapists, crisis support outside business hours, and the algorithm stops suggesting your problems aren’t real.
Platinum is one hundred and ninety-nine pounds monthly. At this tier, you’re not paying for better therapy; you’re paying for the algorithm to take you seriously. You’re purchasing legitimacy. Your suffering has been verified and approved for processing.
The investors love it. Mental health is a growth market, after all. Everyone’s anxious, depressed, or dissociating, and now there’s an app that can rank your psychological damage like a credit score.
The Tyranny of Categorisation
What disturbs me most isn’t the pricing model or the algorithmic gaslighting. It’s the users who’ve internalised the system’s logic.
I’ve read forum posts from people apologising for their “basic tier trauma.” One man wrote that he felt guilty taking up Premium resources when his depression was “probably just genetic and not that interesting.” A woman described feeling relieved when the algorithm upgraded her anxiety to Premium tier because it meant her suffering was “officially valid.”
We’ve outsourced the legitimacy of our pain to a recommendation engine built by people who think human psychology is like the Netflix algorithm but for feelings.
The algorithm occasionally gets it catastrophically wrong, of course. Last month, it downgraded a user’s complex PTSD to Standard tier mid-treatment because she received a salary increase. Her trauma was real when she was poor but became optional once she could afford Premium subscription fees. The logic is flawless: if you can pay for therapy, you don’t need it.
TherapyStreamline’s response was to add a feature where users can “appeal” the algorithm’s decisions. You submit documentation—therapy records, diagnostic assessments, perhaps a personal essay explaining why your suffering deserves upgrade consideration. It’s like applying for a mortgage, except instead of proving you can repay the loan, you’re proving your trauma is severe enough to warrant human attention.
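Pure guesswork again, but the salary-triggered downgrade implies the re-evaluation runs on every change to your income data. A minimal sketch, reusing the invented assign_tier from earlier (the names and rules are mine, not theirs):

```python
# Hypothetical sketch of the mid-treatment re-evaluation, as implied by
# the salary-increase downgrade. Assumes assign_tier from the earlier
# sketch; everything else here is invented too.
from dataclasses import dataclass

@dataclass
class User:
    name: str
    assessment_answers: list[int]
    tier: str

def notify(user: User, message: str) -> None:
    print(f"[push notification to {user.name}] {message}")

def on_income_update(user: User, new_income: float) -> None:
    """Re-run the tier assignment whenever the income data changes."""
    old_tier = user.tier
    user.tier = assign_tier(user.assessment_answers, new_income)
    if user.tier != old_tier:
        # The flawless logic: being able to afford therapy
        # disqualifies you from receiving it.
        notify(user, f"Your {old_tier} plan is now {user.tier}. "
                     "You may appeal with documentation.")

# The case above: nothing about her trauma changed, only her salary.
patient = User("C.", assessment_answers=[7] * 200, tier="Premium")
on_income_update(patient, new_income=95_000.0)  # Premium becomes Standard
```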
The Eternal Return of Indulgences
We’ve been here before, obviously. Every technological revolution promises to democratise something whilst actually just creating new hierarchies with better branding.
The printing press was supposed to democratise knowledge; it gave us propaganda and clickbait five centuries before the internet. Social media promised to connect humanity; it gave us algorithmic rage optimisation and the attention economy. Streaming services promised unlimited entertainment; they gave us subscription fatigue and the constant anxiety that you’re watching the wrong programme.
Now we’ve got mental health apps that promise to democratise therapy whilst actually creating a system where the algorithm decides if your suffering is premium enough to deserve human contact.
The medieval Church at least had the decency to make salvation expensive for everyone. TherapyStreamline has innovated beyond that: they’ve made validation cheap and abundant, whilst reserving actual care for those who can afford the upgrade.
The Return
I received another email from TherapyStreamline yesterday. The algorithm has analysed my recent activity—apparently I’ve been opening the app without completing sessions, which suggests “engagement issues.” They’re offering me a discount on Premium tier as an “investment in my mental health journey.”
The phrasing is exquisite. It’s not a purchase; it’s an investment. Not treatment, but a journey. The algorithm has learned the language of wellness culture, that peculiar dialect where everything’s an opportunity for growth and nothing’s ever just broken.
I won’t be upgrading. Not because I don’t need help—I almost certainly do—but because I refuse to let an algorithm built by venture capital decide if my damage is premium enough to process.
My abandonment issues will remain Standard tier. The algorithm can go to hell, or whatever the digital equivalent is. Perhaps an infinite loop of its own optimisation metrics.
At least my trauma’s authentic. Can’t say the same for their therapy.