Artificial intelligence may be the technology of the future, but could it bring us more harm than good?
That is what some AI researchers and tech experts fear, and why more than 1,000 of them signed an open letter earlier this year urging a ‘pause’ on some AI development.
Eliezer Yudkowsky, a decision theorist at the Machine Intelligence Research Institute who has studied AI for more than 20 years, thinks that letter does not go far enough.
He says AI could actually lead to the end of humanity, and that we need international agreements now to prevent the technology from advancing too far. He even believes a ‘shooting war’ may be needed to stop a country that refuses to comply.
Yudkowsky recently spoke with FOX News Rundown’s Jessica Rosenthal about why he is sounding the alarm over AI and why action is needed now. He offered some blunt warnings and compared AI’s threat to that of nuclear war.
Due to time constraints, we could not include the conversation in our weekday editions of the FOX News Rundown. In a FOX News Rundown Extra exclusive, you will hear our entire, unedited interview with AI expert Eliezer Yudkowsky.