
Language models like ChatGPT make it easier to fabricate science research

Scientists at the State University of New York (SUNY) Downstate Health Sciences University and Weill Cornell Medical Center in New York have found that advanced language models such as ChatGPT can make it easier for fraudsters to produce fabricated research papers. The findings are reported in an article published in the journal Patterns.

The researchers demonstrated this by using ChatGPT to generate well-written but entirely fictional research abstracts. A hypothetical fraudster could submit such fake texts to several scientific journals. If editors accepted them, ChatGPT could then be used to write a fully fabricated article with fake data, non-existent clinical trial participants, and meaningless results.

The scientists conducted an analogue of the Turing test, asking human experts to examine both human-written and AI-generated texts. The experts misidentified 32% of the abstracts generated by the language model and 14% of the human-authored abstracts.
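To make the evaluation concrete, here is a minimal sketch of how such per-class misidentification rates could be computed from expert judgments. The function and the sample data are illustrative inventions, not the study's actual data or code.

```python
# Toy illustration of the evaluation described above: experts label each
# abstract as "human" or "ai", and we compute, for each true class, the
# fraction of abstracts the experts mislabeled.

def misidentification_rate(judgments, true_label):
    """Fraction of abstracts with this true label that experts mislabeled."""
    relevant = [(truth, guess) for truth, guess in judgments if truth == true_label]
    missed = sum(1 for truth, guess in relevant if guess != truth)
    return missed / len(relevant)

# (true_label, expert_guess) pairs -- made-up sample, not the study's data
judgments = [
    ("ai", "human"), ("ai", "ai"), ("ai", "ai"), ("ai", "human"),
    ("human", "human"), ("human", "human"), ("human", "ai"), ("human", "human"),
]

print(misidentification_rate(judgments, "ai"))     # 0.5 on this toy sample
print(misidentification_rate(judgments, "human"))  # 0.25 on this toy sample
```

In the study's terms, the first number corresponds to the 32% of AI-generated abstracts mistaken for human work, and the second to the 14% of human abstracts mistaken for AI output.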

The fabricated texts were also run through three online AI-text detectors, which in the vast majority of cases correctly identified the abstracts as machine-generated. This suggests that detection tools could give publishers an effective way to combat fraudsters. However, when the experts first passed a text through an AI-powered online paraphrasing tool, the abstract was classified as human-written. The authors therefore conclude that more advanced AI-detection tools are needed.
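The screening-and-evasion workflow described above can be sketched as a small pipeline. Both functions below are toy stand-ins invented for illustration: a real pipeline would call an actual AI-text detector and an actual paraphrasing service, neither of which is named in the article.

```python
# Sketch of the workflow: run a detector over a submission, optionally after
# paraphrasing. The "detector" and "paraphraser" here are trivial stand-ins.

def toy_detector(text):
    """Toy stand-in for an AI-text detector: flags text carrying a marker.
    A real detector would return a model-based score, not a substring check."""
    return "[AI]" in text  # True means "looks machine-generated"

def toy_paraphraser(text):
    """Toy stand-in for a paraphrase tool that happens to strip the marker."""
    return text.replace("[AI]", "").strip()

def screen_submission(text, detector, paraphraser=None):
    """Apply the detector, optionally paraphrasing first, as the experts did."""
    if paraphraser is not None:
        text = paraphraser(text)
    return detector(text)

abstract = "[AI] We report a randomized trial of ..."
print(screen_submission(abstract, toy_detector))                   # True: caught
print(screen_submission(abstract, toy_detector, toy_paraphraser))  # False: evades
```

The two calls mirror the article's finding: the detector catches the raw machine-generated text, but a paraphrasing pass lets the same content slip through.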

Today, creating a fake study credible enough to pass review takes considerable time and effort, which can make the task too tedious for fraudsters. AI, however, can potentially do the job in minutes, making counterfeits far easier to produce. This could be exploited, for example, by unscrupulous drug manufacturers, or by medical professionals seeking funding and career advancement.
