A study suggests that the chatbot ChatGPT has a better "patient care style" than some doctors - at least when its written advice is rated for quality and empathy, according to The Guardian.
The findings highlight the potential for AI to play a role in medicine, according to the study authors, who suggest that such apps could help doctors communicate with patients. “The opportunities to improve healthcare with AI are huge,” said Dr. John Ayers of the University of California, San Diego.
Others, however, point out that the findings do not mean ChatGPT is actually a better doctor, and caution against delegating clinical responsibility to it, given the chatbot's tendency to produce inaccurate "facts."
The study used data from the AskDocs forum, where members can post medical questions that are answered by verified healthcare professionals. The team randomly sampled 195 exchanges in which a verified doctor answered a general question, then posed the original questions to ChatGPT and asked it to respond.
A panel of three licensed healthcare professionals, who did not know whether the response came from a human doctor or AI, rated the responses for quality and empathy.
Overall, the panel preferred ChatGPT's responses to those provided by a human 79 percent of the time. ChatGPT's responses were also rated good or very good 79 percent of the time, compared with 22 percent of clinicians' responses, and 45 percent of ChatGPT's responses were rated empathetic or very empathetic, compared with only 5 percent of doctors' responses.
"These findings suggest that tools like ChatGPT can efficiently craft high-quality, personalized medical advice for review by clinicians, and we are beginning that process," said Dr Christopher Longhurst, of the University of California, San Diego.
Professor James Davenport, from the University of Bath, who was not involved in the study, said: “The paper does not say that ChatGPT can replace doctors, but it does call, quite legitimately, for more research into whether artificial intelligence can help doctors... and how.”
Others have warned against relying on AI for factual information because of its tendency to generate made-up and inaccurate “facts.”
Published on Saturday, April 29, 2023. Source: http://nabdapp.com/t/118792435