
Schumacher interview and the misuse of ChatGPT


Michael Schumacher

There are many, like the nonagenarian former US Secretary of State Henry Kissinger, who are hugely enthused by the AI tool ChatGPT, which can churn out reasonable articles on demand. The ethical questions raised by software-generated writing in schools and colleges have already been debated. The potential for misuse in journalism, which was lurking beneath the surface and which many feared even to mention, exploded when the German magazine Die Aktuelle published a ChatGPT-based interview with German motor racing legend Michael Schumacher, who was incapacitated in a 2013 accident and who has not been seen in public since, protected by his family. The headline said, “Michael Schumacher, the first interview”, and the strapline above it said, “It sounded deceptively real”.

The argument that the strapline was a clarification of sorts, signalling that this was no real interview, might not serve as a defence, because readers are drawn in primarily by the headline. More importantly, the quotes that ChatGPT generated for the interview are indeed deceptive. One of them says, “My life has completely changed since [the accident]. That was a horrible time for my wife, my children and the whole family.” It only shows that ChatGPT can churn out plausible answers, and that makes it all the harder to distinguish the real from the artificially generated one.

The publishers of the magazine have apologised for the article and sacked the managing editor, even as the family planned legal action against the magazine and Schumacher fans fumed. Die Aktuelle had run into trouble over Schumacher stories before. In 2014, the magazine carried a front-page story headlined “Awake” alongside a picture of Schumacher, which turned out to be about other celebrities who had fallen into a deep coma and woken up. In a story published in 2015 it said Schumacher’s wife Corinna had found new love, when the story was actually about his daughter Gina. While the two earlier stories about Schumacher and his family were inaccurate, the latest one is fake, and therefore far more serious, with moral implications. A newspaper does not print fiction unless it says so explicitly. That is why, if a newspaper reports what a person never said, it is indeed a case of libel.

There is a need to distinguish between AI and its uses, and the problems created by AI tools such as ChatGPT. The problems that could arise from the misuse of ChatGPT can clearly be seen in the fake interview with a differently-abled Schumacher, and in how it could spread false news like a prairie fire. There is a crying need for protocols on how ChatGPT and tools like it are to be used, in journalism as well as in other activities such as scientific research and scholarship.

It is possible to argue that, even as one uses computers and tools like search engines to gather information, facts could be assembled through ChatGPT-like tools, saving time and labour that could then be spent thinking about the conclusions to be drawn from the evidence gathered in the first place. The argument cannot be dismissed out of hand, but there is a need to be aware of its misuse in small and big ways. The ChatGPT-generated interview with Schumacher is both small and big. The strapline, “It sounded deceptively real”, can be read as a statement that the interview is not real but merely sounds real. The immediate impact, however, is that most people will not stop to read the fine print, and the false information that there was an interview with Schumacher will become a fact before the clarification in the strapline is noticed.
