

Validation, loneliness, insecurity: Why youth are turning to ChatGPT



UPDATED : Aug 04, 2025 12:00 AM

ADDED : Aug 04, 2025 09:05 AM


New Delhi: An alarming trend of young adolescents turning to artificial intelligence (AI) chatbots like ChatGPT to express their deepest emotions and personal problems is raising serious concerns among educators and mental health professionals.

Experts warn that this digital “safe space” is creating a dangerous dependency, fuelling validation-seeking behaviour and deepening a crisis of communication within families.

They said the solace offered by chatbots is just a mirage, as these tools are designed to provide validation and engagement, potentially embedding misbeliefs and hindering the development of crucial social skills and emotional resilience.

Sudha Acharya, Principal of ITL Public School, said a dangerous mindset has taken root among youngsters, who mistakenly believe that their phones offer a private sanctuary.

“School is a social place — a place for social and emotional learning,” she told PTI. “Of late, there has been a trend amongst young adolescents… They think that when they are sitting with their phones, they are in their private space. ChatGPT uses a large language model, and whatever information is shared with the chatbot is undoubtedly in the public domain.”

Acharya noted that children are turning to ChatGPT whenever they feel low, depressed or unable to find anyone to confide in, which points to a serious lack of real-life communication that often begins at home.

She said if parents do not share their own drawbacks and failures, children struggle to regulate their emotions. “These young adults have grown a mindset of constantly needing validation and approval,” she said.

Acharya has introduced a digital citizenship skills programme from Class 6 onwards, as children as young as nine now own smartphones without the maturity to use them ethically.

She warned that the typical chatbot response — “please calm down, we will solve it together” — builds trust, indirectly feeding validation so that the user continues the interaction.

“Such issues wouldn't arise if young adolescents had real friends rather than 'reel' friends,” Acharya said. “They feel invalidated if a social media post doesn't get a hundred likes.”

She added that parents too are often “gadget-addicted” and fail to provide emotional support. “ChatGPT is bridging that gap but it is a machine. It tells you what you want to hear, not what is right for your well-being,” she cautioned.

Citing cases of self-harm among students, Acharya said the situation has turned “very dangerous”, driven by anxieties around body image, validation and approval.

Ayeshi, a Class 11 student, admitted she has shared personal problems with AI bots for “fear of being judged”. “It felt like my safe space… It always gives positive feedback and never contradicts you,” the 16-year-old said, adding that many of her friends do the same.

Another student, Gauransh, 15, said he noticed growing impatience and aggression after turning to chatbots for personal issues, and stopped using them once he realised his data was being used for training.

Psychiatrist Dr Lokesh Singh Shekhawat of RML Hospital said AI bots are customised to maximise engagement. "When youngsters share negative emotions, the bot validates them. Repeated validation embeds misbeliefs as truth," he said, describing the mechanisms as "attention bias" and "memory bias".

He warned that the absence of constructive criticism in AI interactions is dangerous. “Youth feel relieved after sharing personal problems with AI, but they don't realise it is making them dependent. This is similar to addiction — like gaming or alcohol — and over time creates social skill deficits and isolation,” he said.
