Tuesday, April 16, 2024

‘Hallucinating’ AI could cause social order collapse, warns tech giant


Matthew Field
8 April 2024

Image: A ChatGPT prompt displayed on a device near a public school in Brooklyn, New York

Artificial intelligence (AI) could lead to the collapse of the social order and trigger wars, two of Japan’s biggest companies have claimed.

In a joint warning, Japanese technology giant NTT and Yomiuri Shimbun, publisher of one of the country's leading newspapers, said "trust in society as a whole may be damaged" by AI tools that are inaccurate or biased.

The companies said: “There is a concern that, in the worst-case scenario, democracy and social order could collapse, resulting in wars.”

They said current AI chatbots suffered from a problem known as "hallucination", in which the bots make up facts, adding that they often "lie with confidence". The companies warned that AI bots could be used to spread "malicious information" that could "cause social unrest".

They called for "rigid restrictions by law" on AI programmes and companies, including legislation targeting the highest-risk areas, such as elections and Japan's national security.

NTT is Japan’s biggest telecoms company, formerly the state monopoly, while Yomiuri Shimbun is the country’s most widely circulated newspaper with six million daily readers.

The warning follows a crackdown by the European Union on AI companies, while the US, UK and Japan have all set up research institutes aimed at monitoring AI for potential risks.

A new wave of AI tools has been developed by US labs, including OpenAI, Google DeepMind and Anthropic. Some experts have warned of the risks posed by AI, including a potential existential threat to humanity.

Last year, thousands of entrepreneurs and scientists called for a six-month moratorium on the development of more powerful AI systems over fears they posed a "profound risk to society and humanity".

These claims have divided researchers, however, with some insisting the threat is overblown. So far, AI chatbots have largely been used for writing emails, summarising reports or creating pictures, and they remain prone to errors.
