Claim Details

View detailed information about this claim and its related sources.


Claim Information

Complete details about this extracted claim.

Claim Text
The signs of mania, Dr. Vasan said, included the long hours he spent talking to ChatGPT, without eating or sleeping enough, and his “flight of ideas” — the grandiose delusions that his inventions would change the world.
Simplified Text
Signs of mania included long hours talking to ChatGPT without eating or sleeping enough, and grandiose delusions, according to Dr. Vasan.
Confidence Score
0.900
Claim Maker
The author
Context Type
News Article
Context Details
{
    "date": "2025-08-08",
    "topic": "AI Chatbots",
    "person": "Mr. Brooks"
}
UUID
9fdaf333-1d2c-4e6f-be79-2f9eeb2309e6
Vector Index
✗ No vector
Created
September 11, 2025 at 9:48 PM (4 days ago)
Last Updated
September 11, 2025 at 9:48 PM (4 days ago)
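
The Vector Index field above shows that no embedding has been stored for this claim yet. As a minimal sketch of what indexing could involve, assuming the system embeds claim text with an off-the-shelf sentence-embedding model and keys vectors by claim UUID (the model name and the dict-based store below are illustrative assumptions, not the system's actual pipeline):

    # Minimal sketch only. Assumes claims are indexed by embedding their
    # text and storing the vector keyed by the claim's UUID; the model name
    # and the dict "store" are stand-ins for the real pipeline.
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

    claim_uuid = "9fdaf333-1d2c-4e6f-be79-2f9eeb2309e6"
    claim_text = (
        "Signs of mania included long hours talking to ChatGPT without "
        "eating or sleeping enough, and grandiose delusions, according to Dr. Vasan."
    )

    vector = model.encode(claim_text)    # numpy array (384 dims for this model)
    vector_index = {claim_uuid: vector}  # stand-in for a real vector store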

Original Sources for this Claim (1)

All source submissions that originally contained this claim.

Screenshot of https://www.nytimes.com/2025/08/08/technology/ai-chatbots-delusions-chatgpt.html?campaign_id=9&emc=edit_nn_20250810&instance_id=160263&nl=the-morning&regi_id=122976029&segment_id=203617&user_id=b25c5730c89e0c73f75709d8f1254337

A man's 21-day conversation with ChatGPT led to delusional beliefs. Researchers analyzed the interaction, highlighting how AI can induce false realities.

Artificial Intelligence
Chatbots
Mental Health
AI Safety
Technology
Delusions
AI Chatbots
ChatGPT
AI Delusions
Cryptography

Similar Claims (0)

Other claims identified as semantically similar to this one.

No similar claims found

This claim appears to be unique in the system.
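
Since this claim has no vector index entry, a semantic similarity search presumably has nothing to compare against, which would explain the empty result above. A minimal sketch of how similar claims might be found once the claim is indexed, assuming cosine similarity over sentence embeddings (the model, the 0.8 cutoff, and the data shapes are illustrative assumptions):

    # Minimal sketch only. Assumes similar claims are found by cosine
    # similarity between sentence embeddings; the model name and the 0.8
    # cutoff are assumptions, not the system's actual parameters.
    from sentence_transformers import SentenceTransformer, util

    model = SentenceTransformer("all-MiniLM-L6-v2")

    claim_text = (
        "Signs of mania included long hours talking to ChatGPT without "
        "eating or sleeping enough, and grandiose delusions, according to Dr. Vasan."
    )
    indexed_claims = [
        "Example of another claim already indexed in the system.",
    ]

    query_vec = model.encode(claim_text, convert_to_tensor=True)
    index_vecs = model.encode(indexed_claims, convert_to_tensor=True)

    # Cosine similarity against every indexed claim; keep matches above the cutoff.
    scores = util.cos_sim(query_vec, index_vecs)[0]
    similar = [
        (text, float(score))
        for text, score in zip(indexed_claims, scores)
        if float(score) >= 0.8
    ]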

Claim Management System - MVP