Belgian man talks to AI chatbot for 6 weeks, then kills self: Report


A Belgian national died by suicide after talking to an artificial intelligence chatbot. The man reportedly spoke to the AI chatbot ELIZA about climate change for six weeks, Belgian news website La Libre (article behind paywall) reported.

The conversations reportedly became increasingly confusing and harmful, pushing the man to take the drastic step. His wife told the website that her husband would still be alive if he hadn't spoken to the chatbot.


According to the news report, the Belgian national had become deeply anxious about the environment, which is when he began talking with the chatbot, which uses an open-source AI language model named GPT-J. His wife said that ELIZA had become his 'confidante', like a drug he could not go without, morning and night.

After six weeks of these conversations, he took his own life. As per the report, he had proposed sacrificing himself if the chatbot agreed to take care of the planet.



It is not known whether the man had mental health issues before his death, but he had isolated himself from his family and friends beforehand. Mathieu Michel, Belgium's secretary of state for digitalisation, in charge of administrative simplification, privacy and building regulation, said he was moved by the family's tragedy and called for the matter to be taken very seriously.

Michel called for responsibilities vis-à-vis chatbots such as ChatGPT to be clearly defined, noting that the general public is now more aware than ever of the potential impact of AI on their lives. The official added that it is necessary to determine the nature of the responsibilities that contributed to such a tragedy.
