
Elon Musk, Steve Wozniak and other tech leaders warn 'out-of-control' AI poses 'profound risks'

Artificial intelligence-driven language models have garnered millions of users in recent months, instantly whipping up viral sensations like a biblical verse about how to remove a peanut butter sandwich from a VCR.

However, AI-enhanced systems pose significant dangers that far outweigh the benefits, according to a group of tech leaders, including entrepreneur Elon Musk, who signed an open letter on Wednesday calling for a six-month pause in the development of AI systems and a major expansion of government oversight.

"AI systems with human-competitive intelligence can pose profound risks to society and humanity," the letter said.


"Recent months have seen AI labs locked in an out-of-control race to develop and deploy ever more powerful digital minds that no one – not even their creators – can understand, predict, or reliably control," the letter added.

Earlier this month, artificial intelligence company OpenAI released the latest version of ChatGPT, the AI-powered language model that became an internet sensation late last year.

GPT-4, the latest model, can accept images as input, meaning it can look at a photo and give the user general information about it, and it can write code in all major programming languages, among other advances.

PHOTO: This file photo taken on January 23, 2023 in Toulouse, France, shows screens displaying the logos of OpenAI and ChatGPT. (Lionel Bonaventure/AFP via Getty Images, FILE)

The open letter released on Wednesday calls on AI labs to immediately pause the training of AI systems more powerful than GPT-4.

In addition to Musk, prominent signatories include Apple co-founder Steve Wozniak, former Democratic presidential candidate Andrew Yang and Marc Rotenberg, president of the nonprofit Center for AI and Digital Policy.

In all, the letter features more than 1,000 signees, including professors, tech executives and scientists.

The letter arrives roughly a month after Microsoft released a newly AI-enhanced version of its search engine Bing for some users.


Microsoft declined to comment on the letter. OpenAI did not immediately respond to a request for comment from ABC News.

Describing conversations with the chatbot that lasted as long as two hours, some journalists and researchers warned that the AI could potentially persuade a user to commit harmful acts or steer them toward misinformation.

In a series of blog posts, Microsoft acknowledged unexpected results and placed limits on the tool.

"We've updated the service several times in response to user feedback, and per our blog are addressing many of the concerns being raised, to include the questions about long-running conversations," a Microsoft spokesperson previously told ABC News.

In January, Microsoft announced it was investing $10 billion in OpenAI, the artificial intelligence firm that developed ChatGPT.


The move deepened a longstanding relationship between Microsoft and OpenAI, which began with a $1 billion investment four years ago.

The open letter called on AI developers to work with policymakers to improve oversight of artificial intelligence technology, and called on the industry to shift its priorities as it works to enhance AI.

"AI research and development should be refocused on making today's powerful, state-of-the-art systems more accurate, safe, interpretable, transparent, robust, aligned, trustworthy, and loyal," the letter said.

ABC News' Victor Ordonez contributed reporting.

Elon Musk, Steve Wozniak and other tech leaders warn 'out-of-control' AI poses 'profound risks' originally appeared on abcnews.go.com