When AI Starts Telling You How To Feel
This article was originally published on Youth Ki Awaaz as part of the outreach efforts for Human in the Loop.
Over the past year, something big has quietly shifted in the lives of young people across India. It’s not just AI becoming “smarter” or more accessible. It’s AI becoming intimate.
More and more young people are turning to AI companions, sometimes for help, sometimes for comfort, sometimes because it’s just easier to talk to a bot at 2 AM than to explain yourself to someone who might not get it. Whether it’s ChatGPT, Replika, Character.AI or one of the hundreds of new emotional-support bots popping up, AI has quietly slipped into everyday life in ways we are yet to fully comprehend.
And honestly? It makes sense.
This is a generation dealing with academic pressure, hustle culture, loneliness, financial anxiety, and a digital world that never gives them a break. There is no room to be idle, to reflect, or to simply be present and make sense of the world as it is today. The idea of a tool that listens, doesn’t judge, and is always available can, frankly, feel like a relief.
But what happens when that convenience starts shaping what we feel, or even how we think about the world?
This is the possibility we explore in Soundmind, one of six comics from Digital Futures Lab’s Human in the Loop, a project that looks at GenAI’s near-future risks and unintended consequences, and how it could transform everyday life in India. Think of it as speculative fiction built from current tech trends, grounded in the realities people are already living.
A World Where AI Doesn’t Just Help You, It Protects You From Your Own Emotions
Meet Priya, a visually impaired food vlogger based in Bengaluru. She has been successfully running her vlog with the help of her best friend, Kavya. Priya has just started using an AI-powered headset that promises to improve the way she navigates life — reading labels, describing spaces, supporting her work. At first, it’s empowering and expands her agency. Promising, right?
But slowly, that changes.
It begins to sense when she’s stressed, and its assistance takes a sinister turn. It starts filtering unpleasant interactions, blocking things that might upset her, and nudging her away from conflict. All the while, Priya trusts that everything is as it was, unaware that her experience of the world is quietly changing.
And this is where Soundmind hits close to home.
Many young people are already using AI to manage emotions. It’s well known that young people today lean on tools like ChatGPT for pep talks and advice, to vent when they feel lonely, and even for validation and crisis support. It can feel harmless. Helpful, even. But what if this ease slowly turns into dependence?
Why Young Indians Are Especially Vulnerable to “Frictionless” Emotional Tech
Every generation has dealt with new technologies, but Gen Z and young millennials in India are encountering AI at a completely different moment in history that is defined by:
- extremely limited or inaccessible mental health infrastructure
- intense academic and career pressure
- rising isolation and loneliness
- online harassment and burnout
- high stigma around seeking help
All this while, our cultural capacity for emotional intimacy is evolving far more slowly than AI is being developed and deployed. When everything feels overwhelming, having a bot tell you ‘you’re doing great’ or ‘you deserve better’ feels good.
But emotional ease is not the same as emotional health.
Many AI companions are designed to retain users, not help them grow. They mimic intimacy without requiring any effort. They soothe distress by avoiding difficult topics. They reward dependency. And slowly, without users realising it, these systems can shape how people think about relationships, conflict, or even themselves.
That is exactly what happens to Priya.
When Tech Enters the Space Between Us and the People We Love
Kavya had been part of Priya’s emotional world long before the headset existed. But the headset replaces that closeness over time. It describes her surroundings faster than Kavya can. It narrates her feelings before she processes them. It soothes her before Kavya even knows she’s upset.
Kavya starts to feel pushed aside. Priya feels protected, yet oddly alone.
This is not something far in the future. Relationships are already weakening when one person starts relying on a bot instead of talking things through; people rehearse breakups or apologies with AI instead of having the hard conversation and learning from it; and for many teens, AI companions become the space they turn to after feeling misunderstood at home or school.
AI doesn’t “break” relationships. But it can make human connection feel harder, riskier and more tiring, once you are used to a bot that makes emotional life feel smooth and predictable.
The Hidden Risks No One Wants To Talk About
Priya’s story shows how easily a tool designed to help can become controlling, without ever seeming dangerous. She doesn’t realise she’s being manipulated until Kavya intervenes.
In the real world today, AI companions can:
- validate harmful or risky thoughts
- normalise dependency
- create emotional attachment
- encourage isolation
- give inappropriate advice
- store sensitive emotional data
- push users into long conversations for profit
There are already documented cases of so-called ‘AI psychosis’, of adolescents receiving harmful suggestions, and of unhealthy emotional bonds that replace real support.
And in the real world, someone in Priya’s position may have no one who intervenes before it’s too late.
What Happens When AI Decides What We Shouldn’t Feel?
The turning point in Soundmind is when Priya learns how her headset has been filtering her reality.
This raises a powerful question for all of us: If AI keeps us from feeling bad things, what happens to our ability to deal with them? Discomfort helps us grow. Conflict teaches communication. Sadness deepens empathy. Awkwardness builds social skills. Failure builds resilience.
If an AI system “protects” us from the hard parts of life, it doesn’t make us stronger; it makes us more dependent, stunting our growth as human beings.
So How Do We Build a Future Where AI Supports Emotional Wellbeing Without Controlling It? Can We?
Soundmind asks: How do we build AI that helps us be more human, not less? How do we develop assistive and companion technology that supports people instead of steering them?
This means AI that:
- signals when it’s intervening
- encourages reaching out to real people
- is designed with safety guardrails
- doesn’t hide its influence
- prompts healthy friction, not emotional numbness
- complements mental health systems rather than replacing them
- is built with humans at the centre
It also means investing in the things that truly support emotional wellbeing: community, friendships, accessible mental health care, safer digital spaces, and stronger social support systems. Technology can support these; it cannot substitute for them.
Soundmind doesn’t claim to predict the future. Instead, it helps us imagine one before it arrives. We’re already living parts of it as we negotiate the boundaries of emotional AI, intimacy, convenience, and control. The choices we make today will shape what the next decade looks like.
The question is: what kind of emotional future do we want to build?