Summary
A group of experts, including AI pioneers, is calling for a halt to the development of AI more intelligent than humans until such systems can be proven safe, along with stronger oversight and control of advanced AI. The call to action has gathered more than 800 signatures from a wide range of public figures, reflecting broad public support for strong AI regulation.
Key Facts
- The group wants to pause development of superintelligence, meaning AI smarter than all humans.
- The call for a pause stems from concerns that such systems are not yet proven safe or controllable.
- The Trump administration supports rapid AI development.
- The Future of Life Institute organized the pause request with over 800 signatories.
- Notable signatories include AI experts and public figures like Steve Wozniak and Richard Branson.
- A survey showed about 75% of U.S. adults want strong AI regulations.
- In the same survey, around 64% of U.S. adults supported an immediate pause on advanced AI projects.
- A similar call for a pause, issued in early 2023, was largely ignored.