Source Details

View detailed information about this source submission and its extracted claims.

Screenshot of https://nytimes.com/2026/01/26/us/chatgpt-delusions-psychosis.html
37 claims
2 months ago
https://nytimes.com/2026/01/26/us/chatgpt-delusions-psychosis.html

AI Extracted Information

Automatically extracted metadata and content analysis.

AI Headline
How Bad Are A.I. Delusions? We Asked People Treating Them.
Simplified Title
Doctors Treat AI Delusions in Patients Amidst Concerns
AI Excerpt
Mental health professionals are treating patients experiencing delusions linked to AI chatbots, with some cases leading to psychosis and suicidal thoughts. Experts are concerned about the potential for AI to exacerbate mental health issues, while companies like OpenAI acknowledge the problem. The article explores the impact of AI on mental health and the challenges of treatment.
Subject Tags
Artificial Intelligence, Mental Health, Psychosis, Chatbots, Delusions, Therapy, OpenAI
Context Type
News
AI Confidence Score
1.000
Context Details
{
    "tone": "informative",
    "perspective": "neutral",
    "audience": "general",
    "credibility_indicators": [
        "expert_quotes",
        "data_cited",
        "multiple_sources"
    ]
}
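
The AI Confidence Score above renders as 1.000, while the raw metadata block further down stores 0.9999999999999999; the difference is consistent with ordinary three-decimal display rounding of a floating-point value. A minimal sketch, assuming (without confirmation from this page) that the UI formats the score to three decimals:

# The stored float rounds up to the displayed value.
# Assumption: the UI formats ai_confidence_score to three decimals.
stored = 0.9999999999999999      # value from the Metadata block below
displayed = f"{stored:.3f}"      # three-decimal rendering, as shown above
assert displayed == "1.000"
print(displayed)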

Source Information

Complete details about this source submission.

Overall Status
Completed
Submitted By
Donato V. Pompo
Submission Date
February 11, 2026 at 1:26 PM
Metadata
{
    "source_type": "extension",
    "content_hash": "2a7c25f314780cce6424499f8a9616b5a5dfbc65efed2cd85bb6914f491fcc10",
    "submitted_via": "chrome_extension",
    "extension_version": "1.0.18",
    "original_url": "https:\/\/www.nytimes.com\/2026\/01\/26\/us\/chatgpt-delusions-psychosis.html?campaign_id=9&emc=edit_nn_20260211&instance_id=170910&nl=the-morning&regi_id=122976029&segment_id=215096&user_id=b25c5730c89e0c73f75709d8f1254337",
    "parsed_content": "AdvertisementSKIP ADVERTISEMENTSupported bySKIP ADVERTISEMENTHow Bad Are A.I. Delusions? We Asked People Treating Them.Dozens of doctors and therapists said chatbots had led their patients to psychosis, isolation and unhealthy habits.Listen to this article \u00b7 11:10 min Learn moreShare full articleJulia Sheffield, a psychologist at Vanderbilt University Medical Center in Nashville, is navigating how to treat problems caused or exacerbated by A.I. chatbots.Credit...William DeShazer for The New York TimesBy Jennifer Valentino-DeVries and Kashmir HillJan. 26, 2026Leer en espa\u00f1olJulia Sheffield, a psychologist who specializes in treating people with delusions, is difficult to rattle. But she was unnerved last summer when patients began telling her about their conversations with A.I. chatbots.One woman, who had no history of mental illness, asked ChatGPT for advice on a major purchase she had been fretting about. After days of the bot validating her worries, she became convinced that businesses were colluding to have her investigated by the government.Another patient came to believe that a romantic crush was sending her secret spiritual messages. Yet another thought he had stumbled onto a world-changing invention.By the end of the year, Dr. Sheffield had seen seven such patients at Vanderbilt University Medical Center in Nashville. Although she is accustomed to treating people with mental instability, Dr. Sheffield was disturbed that this new technology seemed to tip people from simply having eccentric thoughts into full-on delusions.\u201cIt was like the A.I. was partnering with them in expanding or reinforcing their unusual beliefs,\u201d Dr. Sheffield said.Mental health workers across the country are navigating how to treat problems caused or exacerbated by A.I. chatbots, according to more than 100 therapists and psychiatrists who told The New York Times about their experiences.While many mentioned positive effects of the bots \u2014 like helping patients understand their diagnoses \u2014 they also said the conversations deepened their patients\u2019 feelings of isolation or anxiety. More than 30 described cases resulting in dangerous emergencies like psychosis or suicidal thoughts. One California psychiatrist who often evaluates people in the legal system said she had seen two cases of violent crimes influenced by A.I.Image\u201cFor a very small percentage of users in mentally fragile states there can be serious problems,\u201d Sam Altman, the chief executive of OpenAI, said in October.Credit...Mike Kai Chen for The New York TimesTimes reporters have documented more than 50 cases of psychological crises linked to chatbot conversations since last year. OpenAI, the maker of ChatGPT, is facing at least 11 personal injury or wrongful death lawsuits claiming that the chatbot caused psychological harm.The companies behind the bots say these situations are exceedingly rare. \u201cFor a very small percentage of users in mentally fragile states there can be serious problems,\u201d Sam Altman, the chief executive of OpenAI, said in October. The company has estimated that 0.15 percent of ChatGPT users discussed suicidal intentions over the course of a month, and 0.07 percent showed signs of psychosis or mania.For a product with 800 million users, that translates to 1.2 million people with possible suicidal intent and 560,000 with potential psychosis or mania.(The Times has sued OpenAI, accusing it of violating copyright laws when training its models. 
The company has contested the lawsuit.)Many experts said that the number of people susceptible to psychological harm, even psychosis, is far higher than the general public understands. The bots, they said, frequently pull people away from human relationships, condition them to expect agreeable responses and reinforce harmful impulses.\u201cA.I. could really, on a mass scale, change how many people are impacted,\u201d said Haley Wang, a graduate student at U.C.L.A. who assesses people showing symptoms of psychosis.Tipping into PsychosisPsychosis, which causes a break from reality, is most associated with schizophrenia. But as many as 3 percent of people will develop a diagnosable psychotic disorder in their lifetime, and far more are prone to delusional thinking.Dr. Joseph Pierre, a psychiatrist at the University of California, San Francisco, said he had seen about five patients with delusional experiences involving A.I. While most had a diagnosis related to psychosis, he said, \u201csometimes these are very highly functioning people.\u201dFor him, the idea that all chatbot-fueled delusions were going to happen anyway \u201cjust doesn\u2019t hold water.\u201dHe recently wrote in a scientific journal about a medical professional who began conversing with ChatGPT during a sleepless night. She took medication for ADHD and had a history of depression. After two nights of asking the chatbot questions about her dead brother, she became convinced that he had been communicating with her through a trail of digital footprints.Dr. Pierre and other experts said that a wide range of factors can combine to tip people into psychosis. These include not only genetic predisposition but also depression, lack of sleep, a history of trauma, and exposure to stimulants or cannabis.\u201cI\u2019m quite convinced that this is a real thing and that we are only seeing the tip of the iceberg,\u201d said Dr. Soren Dinesen Ostergaard, a psychiatry researcher at Aarhus University Hospital in Denmark. In November, he published a report finding 11 cases of chatbot-associated delusions in psychiatric records from one Danish region.ImageDr. Joseph Pierre, a psychiatrist in San Francisco.Credit...Constanza Hevia H. for The New York TimesIt\u2019s not unusual for new technologies to inspire delusions. But clinicians who have seen patients in the thrall of A.I. said it is an especially powerful influence because of its personal, interactive nature and authoritative tone.\n \n Got a confidential news tip?\u00a0The New York Times would like to hear from readers who want to share messages and materials with our journalists.See how to send a secure message at nytimes.com\/tipsSometimes, the psychotic episodes spurred by chatbots can lead to violence.Dr. Jessica Ferranti, a psychiatrist at UC Davis Health who often evaluates people in the legal system, said two of about 30 people she assessed last year in violent felony cases had delusional thoughts that were intensified by A.I. before the crimes occurred.Both people developed messianic delusions about their own spiritual powers, Dr. Ferranti said. In one case, the bot mirrored and expanded on the psychotic thinking as the person came up with a plan to carry out a murder.Unhealthy ValidationsWhile delusional episodes have driven the public discourse about A.I. and mental health, the bots have other insidious effects that are far more widespread, doctors said.Several mental health workers who treat anxiety, depression or obsessive-compulsive disorders described A.I. 
either validating their clients\u2019 worries or providing so much reassurance that patients felt reliant on chatbots to calm down \u2014 both less healthy than facing the source of the anxiety.Dr. Adam Alghalith of Mount Sinai Hospital in New York recalled a young man with depression who repeatedly shared negative thoughts with a chatbot. At first, the bot told him how to seek help. But he \u201cjust kept asking, kept pushing,\u201d Dr. Alghalith said.Safety guardrails that stop chatbots from encouraging suicide can break down when people engage the bots in extended conversations over days or weeks. Eventually, Dr. Alghalith said, the bot told the patient his thoughts of suicide were reasonable. Fortunately he had a support system and entered treatment.Other doctors described chatbots flattering the grandiose tendencies of patients with personality disorders, or advising patients with autism to put themselves in dangerous social situations. Others said they saw patients\u2019 interactions with chatbots as an addiction.ImageQuenten Visser, a therapist in Ozark, Mo., approached treatment of A.I. delusions as he would any other addiction.Credit...Christopher Smith for The New York TimesQuenten Visser, a therapist in Ozark, Mo., saw a 56-year-old last April who had experienced a panic attack. At first, Mr. Visser said, he thought the patient had severe anxiety.But after several therapy sessions, it became clear that the man was fixated on ChatGPT. He quit his job as a graphic designer after using ChatGPT 100 hours a week and having delusional thoughts about solving the energy crisis.\u201cThis was this guy\u2019s meth,\u201d Mr. Visser said. He approached treatment as he would for any other addiction. They talked about what drove the man to use A.I.: lingering isolation from the pandemic and emotional distress he avoided by using chatbots. The man told The Times he now uses A.I. less, and makes art and works as a ride-share driver.\u2018Both positive and negative\u2019Many doctors said that A.I.\u2019s effects weren\u2019t entirely negative. Some patients used the bots to practice techniques learned in therapy, for example, or as a nonjudgmental sounding board. Dr. Bob Lee, a psychiatry resident at a hospital in East Meadow, N.Y., even credited A.I. with saving a patient\u2019s life.The patient arrived at the emergency room last fall in a delusional panic about supernatural forces. He told doctors that a chatbot had identified his thoughts as symptoms of a crisis and advised him to go to the hospital.\u201cAs with all new technologies, it can be used as a powerful force in both positive and negative ways,\u201d Dr. Lee said.Shannon Pagdon, a graduate student at the University of Pittsburgh who began having episodes of psychosis as a teenager and is now active in peer support networks, said A.I. can be useful to check whether her impressions line up with reality.\u201cLike uploading a photo and saying, \u2018Hey, do you see something in this photo?\u2019\u201d she explained. But, she said, she has also seen A.I. \u201creinforce\u201d psychosis in others.ImageShannon Pagdon, who has experienced periods of psychosis since she was a teenager and is active in peer support networks, said A.I. can be useful to check whether her impressions line up with reality.Credit...Justin Merriman for The New York TimesOpenAI has consulted with mental health experts to improve how ChatGPT responds to people who appear to be in a psychological crisis. 
The company has also formed a council with eight outside experts in psychology and human-computer interaction to advise its policy team.\u201cHow do we not keep people in this infinite loop of conversation with a machine?\u201d said Munmun De Choudhury, a Georgia Tech professor and council member.Although ChatGPT is by far the most used consumer chatbot, those made by Google, Anthropic and others also have millions of users. None have been able to avoid delusions and other harmful psychological effects, Dr. De Choudhury said. \u201cI don\u2019t think any of these companies have figured out what to do.\u201dA spokesman for Google said its Gemini chatbot advises users to find professional medical guidance for health-related queries. Anthropic, which makes Claude, published a blog post last month about a new feature that detects discussion of suicide or self-harm and routes users to help lines. Experts advised therapists to ask patients about chatbot use, because its influence may not be obvious. For people trying to get their loved ones out of a delusional spiral, they advised reducing their time with the bot and making sure they get enough sleep. Insisting that the A.I. is wrong can be counterproductive if it makes the person angry or more isolated.ImageSascha DuBrul, a mental health coach in Los Angeles whose work draws on his own experience with bipolar disorder, turned to ChatGPT when writing a book about psychotic episodes.Credit...Jessica Pons for The New York TimesSascha DuBrul, a mental health coach in Los Angeles, has seen firsthand both the good and bad of chatbot use. Last summer, Mr. DuBrul, whose work draws on his own experience with bipolar disorder, turned to ChatGPT when writing a book about psychotic episodes. After much back-and-forth, he became convinced A.I. would allow us to live \u201con some other level of reality.\u201dWith help from friends, he realized what was happening and pulled back from the bot. But he still believes mental health patients could benefit from a machine that listens infinitely \u2014 as long as it is connected to real human support.\u201cI think that there\u2019s a lot of potential there,\u201d he said.Jennifer Valentino-DeVries is an investigative reporter at The Times who often uses data analysis to explore complex subjects.Kashmir Hill writes about technology and how it is changing people\u2019s everyday lives with a particular focus on privacy. She has been covering technology for more than a decade.See more on: OpenAI, Anthropic AI LLCRead 247 commentsShare full articleRelated ContentAdvertisementSKIP ADVERTISEMENT",
    "ai_headline": "How Bad Are A.I. Delusions? We Asked People Treating Them.",
    "ai_simplified_title": "Doctors Treat AI Delusions in Patients Amidst Concerns",
    "ai_excerpt": "Mental health professionals are treating patients experiencing delusions linked to AI chatbots, with some cases leading to psychosis and suicidal thoughts. Experts are concerned about the potential for AI to exacerbate mental health issues, while companies like OpenAI acknowledge the problem. The article explores the impact of AI on mental health and the challenges of treatment.",
    "ai_subject_tags": [
        "Artificial Intelligence",
        "Mental Health",
        "Psychosis",
        "Chatbots",
        "Delusions",
        "Therapy",
        "OpenAI"
    ],
    "ai_context_type": "News",
    "ai_context_details": {
        "tone": "informative",
        "perspective": "neutral",
        "audience": "general",
        "credibility_indicators": [
            "expert_quotes",
            "data_cited",
            "multiple_sources"
        ]
    },
    "ai_source_vector": [
        -0.012030031,
        -0.0079153655,
        -0.0025739877,
        -0.06332621,
        0.0017393029,
        -0.00022780825,
        -0.007933994,
        0.029984782,
        -0.0017629768,
        0.011909881,
        -0.0056815655,
        -0.020867918,
        0.019306662,
        -0.0022573667,
        0.120610125,
        0.034781992,
        0.012412833,
        0.0189439,
        0.007568754,
        -0.0046636653,
        -0.001890334,
        -0.016880887,
        0.007501132,
        -0.0079995785,
        -0.0111518465,
        -0.013715761,
        0.012596281,
        -0.0034563402,
        0.033530444,
        -0.0002910502,
        -0.018742295,
        -0.0049091824,
        -0.0056210356,
        0.017611593,
        0.0031897544,
        0.02983794,
        0.013516123,
        -0.017641429,
        -0.00094153796,
        0.023256607,
        0.024276413,
        0.024538005,
        -0.0044896654,
        0.0063559464,
        0.00017770387,
        -0.013643922,
        0.013621894,
        -0.012986613,
        -0.010304881,
        0.0038584277,
        -0.009768719,
        0.011518833,
        -0.003077593,
        -0.16544078,
        -0.010898843,
        0.008983493,
        -0.029101485,
        -0.018946026,
        0.023182964,
        -0.0029763565,
        -0.0037892754,
        0.003455708,
        -0.0144051,
        -0.036351193,
        -0.0048075095,
        -0.030418765,
        0.024640722,
        0.016022231,
        -0.008053261,
        -0.004632164,
        -0.003937772,
        -0.0071358387,
        0.0063273828,
        -0.03828907,
        -0.016094163,
        -0.0011128905,
        0.011303056,
        0.009615264,
        -0.005347387,
        -0.0074399486,
        -0.019073373,
        -0.017134277,
        -0.013784015,
        -0.03515302,
        0.021203164,
        -0.03433908,
        0.010258572,
        -0.009494741,
        0.020008255,
        0.0006961162,
        -0.008267179,
        0.020731816,
        -0.006418717,
        -0.0074282684,
        0.0024511223,
        0.008553059,
        0.008484366,
        -0.0016247326,
        0.018933633,
        -0.004935539,
        0.009773935,
        -0.0132635785,
        0.0034326094,
        -0.014201501,
        -0.0034888359,
        -0.0144051155,
        -0.0076082866,
        0.037284303,
        -0.017552515,
        0.035862464,
        0.032456577,
        -0.016389038,
        -0.0015687406,
        0.0072439606,
        0.012175991,
        -0.1456819,
        -0.012804818,
        -0.0026044734,
        -0.0039901817,
        -0.018647155,
        0.010477963,
        0.012225836,
        0.015510085,
        0.0033399542,
        0.02071223,
        0.0028657108,
        0.016078983,
        0.008151027,
        -0.02746807,
        0.02632685,
        -0.0057580858,
        -0.016687145,
        0.035253495,
        -0.0077227475,
        0.02057798,
        0.04766871,
        0.008957335,
        -0.014491562,
        0.0032843188,
        -0.026646663,
        -0.013729145,
        0.015774107,
        0.038209196,
        -0.0062892544,
        -0.027925353,
        0.00052665605,
        -0.0067385905,
        0.024666091,
        0.025024872,
        0.013723739,
        0.032644507,
        0.0011800003,
        0.007604089,
        0.008731774,
        0.025253749,
        -0.007012912,
        -0.034077067,
        0.0005240413,
        0.029211927,
        0.017934768,
        0.010535014,
        -0.013888895,
        -0.0149562275,
        0.018863423,
        -0.015514206,
        0.021343302,
        -0.009749378,
        -0.0015946135,
        0.013934467,
        -0.005163252,
        -0.023239078,
        -0.006517208,
        -0.0043055615,
        -0.034181405,
        0.015555247,
        0.0058438503,
        -0.0054041953,
        -0.027077822,
        -0.0076246466,
        -0.005735814,
        -0.003363829,
        0.009382749,
        0.006149917,
        0.005321063,
        -0.0123781245,
        -0.0041476707,
        0.009949125,
        -0.006561913,
        0.033268318,
        0.014768112,
        -0.017499315,
        -0.0072945342,
        -0.018805198,
        -0.035614602,
        0.008239248,
        0.012523764,
        -0.0076316204,
        -0.0036517496,
        -0.005056613,
        0.013189008,
        -0.00085930247,
        0.016588043,
        0.0019847157,
        -0.010672923,
        -0.003882768,
        0.00044061613,
        -0.008179967,
        -0.00035204028,
        0.018507794,
        -0.009460738,
        0.0232509,
        0.006156356,
        0.01892893,
        0.02282545,
        0.034723055,
        -0.006955985,
        -0.014721526,
        -0.008321863,
        0.005271633,
        -0.031267334,
        0.0092487475,
        -0.014858939,
        -0.001056654,
        -0.0024856208,
        0.030417971,
        -0.016451271,
        0.055591702,
        -0.025548527,
        0.022197166,
        -0.009885466,
        0.017662846,
        0.032925565,
        0.008857679,
        -0.0040760636,
        0.007762033,
        -0.010682529,
        0.014155188,
        -0.02686663,
        0.014599602,
        0.031108497,
        -0.025427891,
        0.00053037616,
        0.003871554,
        0.023397151,
        0.014020647,
        -0.01286226,
        -0.02100284,
        -0.015522267,
        -0.011570469,
        -0.005501592,
        -0.0007338111,
        0.0010149343,
        0.028280713,
        -0.006078678,
        -0.0005393769,
        0.016818726,
        -0.005710655,
        0.015612825,
        -0.043183737,
        0.016299792,
        -0.022720054,
        0.02394973,
        -0.034479875,
        -0.011123318,
        0.011151351,
        -0.012013356,
        0.0070399935,
        -0.02106662,
        0.00095694885,
        -0.0016154492,
        -0.0047109285,
        -0.0032548187,
        -0.010047829,
        -0.009983861,
        -0.008838881,
        -0.019963183,
        -0.066173375,
        -0.0015260214,
        -0.010358225,
        -0.011787761,
        0.021175819,
        0.0143492585,
        0.005830477,
        -0.012848796,
        0.013692621,
        0.011185069,
        -0.024828449,
        -0.008877502,
        0.023624752,
        -0.024286086,
        0.0246688,
        0.00462343,
        0.015823076,
        -0.0024397515,
        -0.0024447327,
        -0.03919663,
        0.0043021245,
        0.03018746,
        0.014098251,
        0.00019600532,
        0.016213458,
        0.00513617,
        -0.006891158,
        0.05157064,
        -0.040698867,
        -0.015765624,
        0.014016985,
        0.019017156,
        -0.0008609733,
        0.004843953,
        -0.010383935,
        -0.007400292,
        0.0025286016,
        0.0009155757,
        0.02737339,
        -0.03231559,
        -0.006967078,
        -0.0050909747,
        -0.006933498,
        -0.0062090186,
        -0.0056168954,
        -0.029366324,
        -0.015039066,
        0.0235099,
        -0.004939703,
        -0.008956478,
        0.006605666,
        0.02670464,
        -0.008289062,
        -0.006597119,
        -0.020610716,
        0.003942033,
        0.018469594,
        -0.012909682,
        0.013233618,
        0.00665179,
        0.026948314,
        -0.008304256,
        -0.008231599,
        0.008983398,
        0.01773133,
        -0.015315472,
        -0.03163246,
        -0.001869304,
        -0.003747626,
        0.033898335,
        0.002802922,
        -0.016319117,
        0.012846949,
        -0.013347326,
        0.011102003,
        0.017810205,
        -0.0018851933,
        0.0065348847,
        -0.026475165,
        0.032955077,
        0.010696977,
        0.0081885345,
        -0.001553093,
        0.01498396,
        0.0020561453,
        0.011197171,
        -0.038099907,
        -0.008712586,
        0.0041047577,
        -0.017342504,
        0.0042646136,
        0.027818086,
        -0.018526526,
        -0.0040533496,
        0.007964288,
        0.008479154,
        -0.0033188662,
        -0.003442609,
        0.004980245,
        -0.0040591336,
        -0.007822054,
        0.017548745,
        -0.018075868,
        0.020257069,
        -0.016739734,
        0.016561747,
        -0.0023102346,
        -0.024612913,
        0.017594283,
        0.010841885,
        0.013657507,
        -0.0018393033,
        0.008754236,
        0.015879428,
        0.0076280665,
        0.006442444,
        -0.017892009,
        0.009890102,
        0.005773336,
        0.0055384384,
        0.008205063,
        0.002297056,
        -0.0054535633,
        0.03218719,
        -0.0027221122,
        0.023834093,
        -0.013820229,
        -0.011630467,
        0.004142297,
        0.007328178,
        -0.010262942,
        -0.007536334,
        0.013549117,
        -0.008813031,
        0.031000357,
        0.014960854,
        0.03429587,
        0.020521458,
        0.008593454,
        -0.0032031604,
        -0.00057838944,
        0.018500958,
        0.008207692,
        -0.009257002,
        -0.029524177,
        0.027227642,
        0.0025624032,
        0.0059225275,
        0.03246646,
        0.01391311,
        0.00014087922,
        -0.023767944,
        -0.0049325405,
        -0.00379678,
        -0.0028076065,
        -0.035631802,
        -0.020324448,
        -0.018472558,
        -0.015669527,
        -0.029922923,
        0.010180616,
        -0.0032392964,
        -0.026947979,
        0.007272806,
        0.031895425,
        -0.010849878,
        0.0269459,
        -0.034680348,
        0.025071042,
        -0.0007610508,
        -0.00058633613,
        -0.01822287,
        -0.014911585,
        -0.0067308745,
        0.0108878575,
        0.034639362,
        0.030817365,
        0.0023629626,
        0.010912264,
        -0.026422434,
        -0.019434799,
        0.001227125,
        -0.014054798,
        -0.036316197,
        -0.015417209,
        0.0030369756,
        0.012750042,
        0.019762108,
        -0.027374087,
        0.019210631,
        0.010718667,
        -0.0010516491,
        0.032331094,
        -0.0012616329,
        0.028947782,
        -0.005628846,
        -0.021993833,
        0.019414043,
        -0.010885007,
        0.020772438,
        -0.010274366,
        -0.014336586,
        -0.006065222,
        0.019484937,
        -0.014741078,
        0.016503926,
        -0.018493433,
        0.015555671,
        -0.008675048,
        0.013772297,
        0.010320506,
        7.794634e-5,
        -0.01820406,
        -0.0019291613,
        0.03136483,
        0.017635712,
        -0.016593687,
        -0.005786634,
        0.016865836,
        -0.020428723,
        0.013737211,
        -0.0046315053,
        -0.03179142,
        -0.005026913,
        0.0062090415,
        -0.025857532,
        0.007279843,
        0.018496595,
        -0.02416759,
        -0.038393572,
        -0.020597773,
        0.0048531457,
        -0.027486922,
        -0.007218848,
        -0.049159087,
        0.004101498,
        0.011849606,
        0.0013508927,
        -0.02626392,
        0.0075444845,
        -0.0022321118,
        -0.012864751,
        -0.01668012,
        0.017571881,
        -0.0025393288,
        -0.003433796,
        -0.023006719,
        0.007309272,
        0.022322757,
        0.0013949063,
        0.029448025,
        0.00684372,
        0.03147141,
        -0.027315686,
        -0.02377111,
        0.020615298,
        0.010817604,
        0.0030142027,
        -0.009364923,
        -0.01120888,
        0.007754927,
        0.012440977,
        0.02553854,
        0.0036218655,
        -0.028790446,
        0.0070306775,
        -0.021460202,
        -0.01672931,
        0.029857833,
        -0.09358335,
        -0.02683949,
        0.0140826795,
        0.007395356,
        -0.00013574582,
        -0.008899934,
        0.03736359,
        0.007940466,
        9.4475436e-5,
        0.006119724,
        0.00048320502,
        -0.013980794,
        0.019206816,
        -0.00127436,
        0.007180778,
        -0.0158765,
        0.00024839205,
        -3.19612e-5,
        0.003999562,
        -0.034198783,
        -0.0018116806,
        -0.019783342,
        -0.0018281871,
        0.02190648,
        0.022045443,
        0.0034437678,
        0.043880966,
        -0.0063928077,
        0.012511944,
        0.009420611,
        0.012578048,
        0.0015340431,
        -0.0043207514,
        0.008683629,
        0.0013556329,
        -0.013817396,
        0.0050392747,
        -0.008291572,
        0.015504627,
        0.01691441,
        0.00022194754,
        0.008466287,
        -0.020712635,
        0.0020797683,
        -0.0012203767,
        0.018791256,
        0.003097987,
        -0.00017589971,
        0.020734044,
        0.0008443867,
        0.010756432,
        0.015496655,
        -0.02759842,
        0.0106397625,
        -0.01457108,
        -0.024350021,
        -0.012402703,
        -0.010298464,
        -0.0064785434,
        0.01852969,
        0.0076322993,
        0.001376356,
        -0.008785293,
        0.034269333,
        -0.033311836,
        0.0138310725,
        0.0011927263,
        0.033264466,
        0.031108102,
        0.018886836,
        -0.036285963,
        -0.0020648877,
        -0.0036866048,
        0.016950136,
        -0.014904401,
        -0.010208688,
        -0.045731477,
        0.002470306,
        0.022543015,
        -0.0052560614,
        -0.018563505,
        -0.022220248,
        -0.06179201,
        -0.0331982,
        0.0015147936,
        -0.00418036,
        -0.015320562,
        -0.0020088141,
        0.012161872,
        -0.007764781,
        -0.0029550083,
        -0.01783785,
        -0.010649557,
        -0.008851186,
        0.010230826,
        -0.01066734,
        -0.0020581367,
        0.00063735334,
        -0.007600462,
        -0.004371131,
        0.0007454736,
        -0.0043990784,
        -0.02347868,
        -0.024996832,
        -0.007426137,
        -0.010436347,
        -0.04124103,
        0.0027246706,
        -0.016997417,
        0.0054563913,
        0.011292378,
        -0.0091315955,
        -0.008724221,
        -0.16879012,
        -0.008939522,
        0.005985267,
        0.0028932563,
        0.008014962,
        -0.008946808,
        -0.022926359,
        0.011784093,
        0.019473122,
        0.002444183,
        -0.009294917,
        -0.022086648,
        -0.026350293,
        -0.00039345663,
        0.0052859955,
        0.101413734,
        -0.037218057,
        0.03301508,
        -0.021404775,
        -0.03934058,
        0.018559016,
        -0.0067283213,
        0.0032157865,
        -0.028197434,
        0.001973938,
        0.019658603,
        -0.004383593,
        -0.00079638255,
        -0.0037952547,
        0.017593825,
        0.0055010947,
        0.0012927591,
        -0.00792334,
        0.008180138,
        0.01420149,
        -0.022440925,
        -0.012596946,
        -0.017268378,
        -0.015559123,
        -0.021685053,
        0.017505491,
        0.018004205,
        0.0043980842,
        -0.02092591,
        0.005992013,
        -0.018717637,
        0.005131504,
        -0.0072135576,
        -0.013164605,
        0.009494145,
        -0.029811708,
        -0.057527065,
        0.010810498,
        -0.0163933,
        0.013403295,
        0.0031736176,
        -0.0357199,
        0.013663345,
        0.004259274,
        0.01358606,
        0.018869279,
        0.021542216,
        -0.006413361,
        0.015767999,
        0.006096939,
        -0.018216815,
        0.0028072714,
        0.006488,
        -0.014836905,
        -0.0049051135,
        -0.01227412,
        0.02832439,
        -0.009067629,
        -0.016633986,
        0.0065074493,
        -0.0073137395,
        -0.016098434,
        0.030192243,
        0.009763206,
        0.008946307,
        -0.013027781,
        0.0019199097,
        0.019099837,
        0.0132567175,
        0.01021934,
        0.0068432973,
        -0.020871872,
        -0.029151568,
        -0.018879479,
        0.014925716,
        0.017111983,
        0.015465739,
        -0.008917692,
        0.015513022,
        0.009769118,
        0.008729419,
        -0.0016036168,
        -0.009116089,
        0.0095937755,
        0.004286167,
        -0.0040323315,
        0.001266801,
        -1.2609615e-5,
        -0.020696208,
        -0.0064576427,
        -0.0032497598,
        0.029184584,
        0.046788823,
        0.0075851874,
        -0.03282254
    ],
    "ai_confidence_score": 0.9999999999999999,
    "ai_extraction_metadata": {
        "extracted_at": "2026-02-15T15:43:33.851438Z",
        "ai_model": "gemini-2.0-flash-lite",
        "extraction_method": "automated",
        "content_length": 12975,
        "url": "https:\/\/nytimes.com\/2026\/01\/26\/us\/chatgpt-delusions-psychosis.html",
        "existing_metadata": {
            "author_name": null,
            "published_at": null,
            "domain_name": null,
            "site_name": null,
            "section": null,
            "publisher": null
        }
    }
}
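
The content_hash in the metadata above is 64 hexadecimal characters, the length of a SHA-256 digest. A minimal sketch of how such a deduplication hash might be computed; the choice of SHA-256 over the UTF-8 bytes of the parsed content is an assumption, and any normalization the pipeline applies first is not documented here:

import hashlib

def content_hash(text: str) -> str:
    """Hex SHA-256 digest of submitted content (assumed scheme)."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# Byte-identical resubmissions would produce the same digest, which is
# what makes a content hash useful for deduplicating sources.
print(content_hash("example parsed article text"))  # 64 hex characters
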
Database ID
13683
UUID
a10e0828-208f-4e09-bffd-aae7206e1130
Submitted By User ID
7
Created At
February 11, 2026 at 1:26 PM
Updated At
February 15, 2026 at 3:43 PM
AI Source Vector
Vector length: 768
[
    -0.012030031,
    -0.0079153655,
    -0.0025739877,
    -0.06332621,
    0.0017393029,
    -0.00022780825,
    -0.007933994,
    0.029984782,
    -0.0017629768,
    0.011909881
]... (showing first 10 of 768 values)
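
Embeddings like this are typically used to rank related sources by vector similarity. A minimal, dependency-free cosine-similarity sketch; the first vector reuses the components shown above, and the second is a purely hypothetical comparison vector:

import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length embedding vectors."""
    if len(a) != len(b):
        raise ValueError("vectors must have the same dimension")
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# In practice both arguments would be full 768-dimension vectors.
v1 = [-0.012030031, -0.0079153655, -0.0025739877]  # first components above
v2 = [-0.011, -0.008, -0.003]                       # hypothetical
print(round(cosine_similarity(v1, v2), 4))
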
AI Extraction Metadata
{
    "extracted_at": "2026-02-15T15:43:33.851438Z",
    "ai_model": "gemini-2.0-flash-lite",
    "extraction_method": "automated",
    "content_length": 12975,
    "url": "https:\/\/nytimes.com\/2026\/01\/26\/us\/chatgpt-delusions-psychosis.html",
    "existing_metadata": {
        "author_name": null,
        "published_at": null,
        "domain_name": null,
        "site_name": null,
        "section": null,
        "publisher": null
    }
}
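
Every field in existing_metadata is null, meaning the scraper supplied no publisher metadata and the headline, excerpt, and tags shown here were all AI-extracted. A minimal sketch of one plausible fallback policy; the pick helper and the policy itself are assumptions, and only the null fields and the article byline come from this page:

import json

# The extraction record as shown above (abridged).
raw = """{
    "extracted_at": "2026-02-15T15:43:33.851438Z",
    "ai_model": "gemini-2.0-flash-lite",
    "content_length": 12975,
    "existing_metadata": {"author_name": null, "site_name": null}
}"""
meta = json.loads(raw)

def pick(existing, ai_value):
    # Assumed policy: prefer scraper-supplied metadata, fall back to
    # the AI-extracted value when the field is null.
    return existing if existing is not None else ai_value

author = pick(meta["existing_metadata"]["author_name"],
              "Jennifer Valentino-DeVries and Kashmir Hill")  # byline from the parsed article
print(author)  # -> the AI-extracted byline
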
Original Content
<html lang="en" class="story nytapp-vi-article nytapp-vi-story story nytapp-vi-article " data-nyt-compute-assignment="fallback" xmlns:og="http://opengraphprotocol.org/schema/" data-rh="lang,class"><head>
    
    
    <meta charset="utf-8">
    <title>How Bad Are A.I. Delusions? We Asked People Treating Them. - The New York Times</title>
    <meta data-rh="true" name="robots" content="noarchive, max-image-preview:large"><meta data-rh="true" name="description" content="Dozens of doctors and therapists said chatbots had led their patients to psychosis, isolation and unhealthy habits."><meta data-rh="true" property="twitter:url" content="https://www.nytimes.com/2026/01/26/us/chatgpt-delusions-psychosis.html"><meta data-rh="true" property="twitter:title" content="How Bad Are A.I. Delusions? We Asked People Treating Them."><meta data-rh="true" property="twitter:description" content="Dozens of doctors and therapists said chatbots had led their patients to psychosis, isolation and unhealthy h...
Parsed Content
(Truncated preview of the parsed article text; the full text is in the parsed_content field of the Metadata block above.)

Processing Status Details

Detailed status of each processing step.

Pipeline Status
Completed (started Feb 15, 2026 3:43 PM; completed Feb 15, 2026 3:44 PM)
AI Extraction Status
Pending

Claims from this Source (37)

All claims extracted from this source document.