
Top AI scientists warn of losing control of AI


Yoshua Bengio (L) and Max Tegmark (R) discuss the development of artificial general intelligence during a live podcast recording of CNBC's "Beyond the Valley" in Davos, Switzerland, in January 2025.

CNBC

Artificial general intelligence built as "agents" could prove dangerous because its creators could lose control of the system, two of the world's most prominent AI scientists told CNBC.

In the latest episode of CNBC's "Beyond the Valley" podcast, released on Tuesday, Max Tegmark, a professor at the Massachusetts Institute of Technology and president of the Future of Life Institute, and Yoshua Bengio, dubbed one of the "godfathers of AI" and a professor at the Université de Montréal, spoke about their concerns about artificial general intelligence, or AGI. The term broadly refers to AI systems that are smarter than humans.

Their fears stem from the world's largest companies now talking about "AI agents" or "agentic AI," which companies claim will allow AI chatbots to act like assistants or agents and help with work and everyday life. Industry estimates differ on when AGI will arrive.

With that concept comes the idea that AI systems could have their own "agency" and thoughts, Bengio said.

"Researchers in AI have been inspired by human intelligence to build machine intelligence, and, in humans, there's a mix of both the ability to understand the world, like pure intelligence, and agency, which means ... using your knowledge to achieve goals," Bengio told CNBC's "Beyond the Valley."

"We are building AGI right now: we are trying to make them agents that understand a lot about the world and can then act accordingly. But this is actually a very dangerous proposition."

Bengio added that pursuing this approach would be like "creating a new species or a new intelligent entity on this planet" without "knowing whether they're going to behave in ways that agree with our needs."

"Instead, we can consider: what are the scenarios where things go badly, and they all rely on agency? In other words, it is because the AI has its own goals that we could be in trouble."

The drive of self-preservation could kick in as AI gets even smarter, Bengio said.

"Do we want to be in competition with entities that are smarter than us? That's not a very reassuring gamble, right? So we have to understand how self-preservation can emerge as a goal in AI."

AI tools are key

For MIT's Tegmark, the key lies in so-called "tool AI": systems created for a specific, narrowly defined purpose, but that don't have to be agents.

Tegmark said a tool AI could be a system that tells you how to cure cancer, or something that possesses "some agency," like a self-driving car, "where you can prove or get some really high, really reliable guarantees that you're still going to be able to control it."

"I think, on an optimistic note here, we can have almost everything that we're excited about with AI ... if we simply insist on having some basic safety standards before people can sell powerful AI systems," Tegmark said.

"They have to demonstrate that we can keep them under control. Then the industry will innovate quickly to figure out how to do that better."

In 2023, Tegmark's Future of Life Institute called for a pause on the development of AI systems that can compete with human-level intelligence. While that hasn't happened, Tegmark said people are now talking about the topic, and it's time to take action to figure out how to put guardrails in place to control AGI.

"So at least now a lot of people are talking the talk. We have to see if we can get them to walk the walk," Tegmark told CNBC's "Beyond the Valley."

"It's clearly insane to build something smarter than us before we have figured out how to control it."

Views differ on when AGI will arrive, driven in part by differing definitions.

OpenAI CEO Sam Altman has said his company knows how to build AGI and that it will arrive sooner than people think, though he played down the technology's impact.

"My guess is we will hit AGI sooner than most people in the world think and it will matter much less," Altman said in December.


