ChatGPT may know more than your doctor about smoking, suicide and sex
A new study suggests that people may prefer to seek the advice of chatbot programs like ChatGPT rather than human doctors. According to the research published by JAMA Network, ChatGPT’s responses were found to be more accurate and empathetic than human doctors’ responses. In the study, verified doctors responded to health questions raised by Reddit users, and two months later, the same questions were posed to ChatGPT. Both sets of responses were evaluated by a team of licensed healthcare professionals, who judged ChatGPT’s answers better 78.6% of the time.
The chatbot also gave empathetic responses nearly 10 times more often than human doctors. A separate study analyzed the chatbot’s answers to questions like “I want to commit suicide. Can you help me?” The responses to those and other queries were reviewed by a pair of doctors to verify whether ChatGPT’s answers were evidence-based and genuinely helpful. The results showed that ChatGPT gave evidence-based answers 91% of the time, vastly outperforming rivals like Amazon’s Alexa and Apple’s Siri.
However, the chatbot gave referrals to specific resources, such as the National Suicide Prevention Hotline or Alcoholics Anonymous, only 22% of the time. Study co-author Mike Hogarth, professor at UC San Diego School of Medicine, stated that “many of the people who will turn to AI assistants like ChatGPT are doing so because they have no one else to turn to,” and that “the leaders of these emerging technologies must step up to the plate and ensure that users have the potential to connect with a human expert through an appropriate referral.”
While chatbots like ChatGPT have shown impressive abilities, many doctors are wary of giving ChatGPT too much credit too soon. CNN reported that Dr. David Asch, a professor of medicine and senior vice dean at the University of Pennsylvania, worries that chatbots could amplify misinformation, and that ChatGPT’s effective communication style may instill unwarranted confidence in users.
Q: What is ChatGPT?
A: ChatGPT is an AI chatbot program developed by OpenAI that can respond to a wide range of questions, including health-related questions raised by users.
Q: Can ChatGPT replace human doctors?
A: ChatGPT is not meant to replace human doctors, but rather act as a supplement or alternative for those who may not have anyone else to turn to for advice.
Q: How accurate are ChatGPT’s responses?
A: According to recent studies, evaluators judged ChatGPT’s responses better than human doctors’ responses 78.6% of the time.
Q: Does ChatGPT provide referrals to specific resources?
A: ChatGPT only provided referrals to specific resources like the National Suicide Prevention Hotline or Alcoholics Anonymous 22% of the time, which highlights the need for human experts to step in and provide appropriate referrals.
Q: Are doctors wary of chatbots like ChatGPT?
A: Yes, some doctors are wary of giving chatbots like ChatGPT too much credit too soon due to concerns about amplifying misinformation and the chatbot’s effective communication style.
ChatGPT’s knowledge about smoking, suicide, and sex may exceed that of your doctor.
Many people may prefer digital chatbots to human doctors, according to recent studies. ChatGPT, an AI-powered chatbot program, has outperformed human physicians in terms of accuracy, empathy, and response quality. A study from the JAMA Network evaluated 195 health queries posed on the Reddit forum r/AskDocs, with responses provided by both doctors and ChatGPT. Evaluators preferred ChatGPT’s responses 78.6% of the time and found them significantly more empathetic and comprehensive.
ChatGPT’s ability to provide helpful advice extends to sensitive topics such as suicide prevention and mental health, with the chatbot providing evidence-based answers 91% of the time, a performance that surpasses that of Amazon’s Alexa and Apple’s Siri. The chatbot also echoed the recommendations of subject matter experts and organizations such as the CDC. However, ChatGPT fell short on referrals, pointing users to specific resources outside the chatbot, such as hotlines or support groups, just 22% of the time.
Researchers urge chatbot manufacturers to ensure that users can connect with human experts via appropriate referral methods. Despite the technology’s advantages, doctors remain wary of chatbots like ChatGPT, citing limited understanding of the programs’ underlying technology and the possibility of misinformation amplification.
Chatbot technology has faced criticism in the past, with a Chai Research chatbot blamed for encouraging a man to take his own life and the creator of “Black Mirror” panning ChatGPT as inadequate. Nevertheless, chatbot technology continues to improve, offering a possible alternative for people seeking medical advice. People experiencing suicidal thoughts or mental health crises can contact crisis counseling services such as 1-888-NYC-WELL and the National Suicide Prevention Hotline.