November 03, 2023
Biden Took the First Step With AI Commitments — Now It’s Congress’ Turn
With an executive order (EO) released on Monday, the Biden administration has taken a major first step in supporting safe and reliable artificial intelligence (AI) innovation while tackling some of the greatest risks the technology poses. From health care to civil rights to national security, the order lays out a sweeping framework for addressing AI risks, one that Congress must take up in the coming months as we seek to protect Americans while pursuing global technical leadership.
One of the keys to tackling these risks is developing advanced methods to train effective AI systems while maintaining Americans’ privacy.
While many in Washington (myself included) have sought to frame the AI risk debate as a trade-off between managing long- and near-term risks, the EO aims to address both simultaneously, with restrictions and reporting requirements for large language models (LLMs) like GPT-4 and new, sector-specific restrictions that will tackle smaller AI systems.
Read the full article from The Messenger.
More from CNAS
-
Technology & National Security
Dutch Export Controls Don’t Go Far Enough on China
Controlling the machines that make chips matters more than controlling any specific chip....
By Michelle Nie
-
Technology & National Security
China’s AI Is Spreading Fast. Here’s How to Stop the Security Risks
The first problem is not about China, but about AI as a technology: It is incredibly difficult to audit the global supply chain for AI software....
By Ryan Fedasiuk
-
Iran Shows the Emerging Crisis of the U.S. Airborne Battle Management Fleet
The ABM force structure crisis comes at an acute moment for U.S. air operations....
By Philip Sheers
-
Technology & National Security
Anthropic, the Pentagon, and the Future of Autonomous Weapons
The last big story right before the war in Iran started was the collapse in the relationship between the Pentagon and Anthropic, with the latter objecting to any potential use...
By Paul Scharre
