Several attacks involving OpenAI’s chatbot—including Tumbler Ridge and FSU—raise urgent questions about the technology.
Consumers are increasingly turning to artificial intelligence chatbots for health information, a new report from Rock Health says. Thirty-two percent of respondents in the 2025 Consumer Adoption of ...
Your chatbot may not feel anything, but new research shows emotion-like signals inside AI can shape responses, steer ...
Harassing bots with “funny violence.” Confiding about a broken heart. Chatting with a block of cheese. Filling a void of ...
Part of what makes us human is the unique way we think and solve problems. But using large language models like ChatGPT might be eroding this uniqueness and leading humans to think and communicate the ...
Here’s the question nobody’s asking but everybody should be: When you ask an AI for the truth, whose truth are you getting? Researchers at MIT, the University of East Anglia and a dozen other ...
Research shows media coverage of AI chatbot use and mental health focuses on instances of user psychosis and suicide.
A paper in JAMA Psychiatry says mental health providers should ask if patients are using artificial intelligence chatbots, just as they would ask patients about sleep habits and substance use.
A new study suggests that AI-powered chatbots may pose psychological risks, particularly for individuals who are already vulnerable to mental health conditions such as psychosis. Dr. Hamilton Morrin, ...
Younger Americans are more likely to use social media at least sometimes for health information than their older peers.