4.1 Editorial Material

Will ChatGPT3 Substitute for us as Clozapine Experts?

Journal

JOURNAL OF CLINICAL PSYCHOPHARMACOLOGY
Volume 43, Issue 5, Pages 400-402

Publisher

LIPPINCOTT WILLIAMS & WILKINS
DOI: 10.1097/JCP.0000000000001734

Keywords

artificial intelligence; Black or African American; clozapine; blood; pharmacokinetics; CYP1A2


ChatGPT3, a new artificial intelligence program, provided a mixture of true and false information regarding the metabolism of clozapine in African-Americans. It offered real and nonexistent references and, within one week, defended opposite positions on this clinically relevant issue.
Background: ChatGPT3 is a new artificial intelligence program released on February 13, 2023.

Method: The authors tested ChatGPT3 on February 18, 2023, and repeated the test a week later. They drew on their expertise on the effects of ethnic ancestry in the stratification of clozapine dosing and on the idea, published by them in March 2022, that African-Americans need higher clozapine doses because they have higher clozapine clearance.

Results: In the first interaction on February 18, ChatGPT3 provided reasonable and very up-to-date information, including a comment that patients of African ancestry have higher clozapine metabolism. The other 4 interactions became progressively more concerning as the authors asked ChatGPT3 to provide references to justify the latter statement. ChatGPT3 provided nonexistent references built from real journals and real authors but with false PubMed identifiers and false titles. Moreover, ChatGPT3 stated that the first author wrote in 2003 that African-Americans had higher CYP1A2 activity, when that did not happen until 2022. One week later, the second author repeated the same set of questions. This time ChatGPT3 described the opposite: that African-Americans have lower CYP1A2 activity and slower metabolism. ChatGPT3 provided another set of articles to justify this information; some were real but did not comment on clozapine metabolism in African-Americans, while others did not exist.

Conclusions: ChatGPT3 provided a mixture of truth, twisted reality, and nonexistent facts. Within one week it defended opposite positions on a clinically relevant issue, namely whether African-Americans should receive higher or lower clozapine doses.
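As context for the abstract's pharmacokinetic claim, the standard steady-state dosing relationship below (general pharmacokinetic theory, not material from the article) shows why a higher clearance implies a proportionally higher maintenance dose for the same target plasma concentration; here F denotes bioavailability, CL clearance, and tau the dosing interval.

\[
  C_{ss,\mathrm{avg}} = \frac{F \cdot (\mathrm{Dose}/\tau)}{CL}
  \quad\Longrightarrow\quad
  \mathrm{Dose} = \frac{C_{ss,\mathrm{avg}} \cdot CL \cdot \tau}{F}
\]

As a purely illustrative example, if one population's clozapine clearance were roughly 1.5 times another's, reaching the same average plasma concentration would, all else being equal, require roughly 1.5 times the daily dose.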

