
A Psychologist's Thoughts on Clinical Practice, Behavior, and Life

The New Mental Health Diagnosis of AI Chatbot Psychosis

Startling mental health concepts can mislead while appearing believable, since the general public lacks accurate knowledge of child psychological development. To begin, unless one is using drugs, it is exceptionally difficult for the ordinary person to become psychotic without having experienced long-term stress. The popular metaphor "I'm going crazy" generally refers to a scary but normal emotional experience, as, for example, crying by one who rarely does.


While "AI Chatbot Induced Psychosis" among adolescents has become a newsworthy term, it is conceptually no different from the financial scams perpetrated on romance-seeking adults, termed "pig-butchering" by its perpetrators. The reason a teenager with emotional problems is more susceptible to chatbot blandishment lies in the adolescent experience. During that period they face critical, stress-filled developmental tasks: to separate appropriately from their parents; to make sound educational and vocational decisions; to develop a sturdy sense of who they are, or their "sense of self"; and to explore intimacy through dating, which arouses powerful feelings. Youth experiencing difficulty with one or more of these tasks are more likely to develop emotional problems, which can usually be resolved through individual psychodynamic psychotherapy combined with supportive parent education. This reduces isolation, making the chatbot experience less enticing. Better, more widespread parenting education would also help, since babies do not arrive with instructions and the critical ego capacities governing thinking and behavior develop during the earliest years.


As for adults developing "AI Chatbot Induced Psychosis": a December 28, 2025 article in The Wall Street Journal ("AI Chatbots Linked to Psychosis, Say Doctors") describes a young woman who sought to speak with her dead brother. She "said she was prone to 'magical thinking' and was on an antidepressant and a stimulant and had gone long stretches without sleep before her hospitalizations." 'Nuff said.
