September 16, 2024

Regulating AI Is Easier Than You Think

Artificial intelligence is poised to deliver tremendous benefits to society. But, as many have pointed out, it could also bring unprecedented new horrors. Because AI is a general-purpose technology, the same tools that will advance scientific discovery could also be used to develop cyber, chemical, or biological weapons. Governing AI will require sharing its benefits widely while keeping the most powerful systems out of the hands of bad actors. The good news is that there is already a template for how to do just that.

In the 20th century, nations built international institutions to allow the spread of peaceful nuclear energy while slowing the proliferation of nuclear weapons by controlling access to the raw materials, weapons-grade uranium and plutonium, that underpin them. That risk has been managed through institutions such as the Nuclear Non-Proliferation Treaty and the International Atomic Energy Agency. Today, 32 nations operate nuclear power plants, which collectively provide 10% of the world's electricity, and only nine countries possess nuclear weapons.

Countries can do something similar for AI today. They can regulate AI from the ground up by controlling access to the highly specialized chips needed to train the world's most advanced AI models. Business leaders and even U.N. Secretary-General António Guterres have called for an international governance framework for AI similar to the one that governs nuclear technology.

The U.S. can work with other nations to build on this foundation and put in place a structure to govern computing hardware across the entire lifecycle of an AI model: chip-making equipment, chips, data centers, the training of AI models, and the trained models that result from this production cycle.

The most advanced AI systems are trained on tens of thousands of highly specialized computer chips. These chips are housed in massive data centers, where they churn through data for months to train the most capable AI models. The chips are difficult to produce, their supply chain is tightly controlled, and large numbers of them are needed to train an advanced model.

Read the full article from TIME.
