Op-Ed: Unregulated nuclear AI is here

  • The U.S. has still not created AI-specific regulations for nuclear power, unlike other countries
  • AI is already in use at four live U.S. nuclear reactors despite the regulatory void
  • Industry analysts downplay the risks, but nuclear experts strongly disagree
  • The situation reflects a broader trend of deregulation driven by Big Tech and the current administration 

Unlike other nuclear powers worldwide, the U.S. has still not updated its regulations to govern the use of AI in nuclear power stations. 

Am I the only person worried about this? 

AI plays a dual role in America’s energy crisis. It’s the cause of the problem — demanding more power than the current electrical infrastructure can deliver — but also a key part of the solution, enabling grid and power plant operators to increase efficiency and lower the cost of delivering power to AI data centers and factories.

But at what cost? 

The U.S. Nuclear Regulatory Commission (NRC) has yet to define regulations specifically designed for artificial intelligence, a story I broke back in October last year. When asked about this by Fierce Network, the NRC adopted a Schrödinger’s cat position: claiming that existing rules are sufficient to manage AI in nuclear power plants, while simultaneously acknowledging that it has not developed any rules for doing so.

The slow pace isn’t surprising. Writing AI regulations for nuclear systems—incorporating thermodynamics, fluid dynamics, nuclear physics, and human-machine interfaces—is insanely complicated. But the absence of regulation hasn’t stopped the U.S. nuclear industry from charging ahead with AI. 

Four U.S. nuclear reactors have already deployed AI in live operational environments. Two are owned by Constellation Energy, which uses AI for critical operations including predictive maintenance, fuel cycle optimization, and reactor core temperature monitoring.

The others belong to PG&E and Purdue University. Hermes Power and Oklo Inc. have announced plans to activate AI in production settings within two years. Dozens of other pre-commissioning trials are underway.

Fierce Network asked research firm Gartner whether using unregulated AI in nuclear power was a concern. Analyst Bob Johnson said, “No, that’s the simple answer.” He also noted, “Nuclear power is needed to run the data centers that run AI but it doesn’t need AI to run the power plant.”

This surprised one of FNTV’s sources, a designer of the fission reactors aboard the U.S. Navy’s fleet of intercontinental ballistic missile submarines. He noted that nuclear power plants are widely considered to be the most complex industrial systems ever built — even more so than the Large Hadron Collider. 

Another FNTV source — a former head of the UK’s National Nuclear Laboratory — called Gartner’s position “perplexing.” 

“[It’s] dangerous,” he said. “Obviously.” 

You could dismiss this as Gartner getting something important wrong — again. But it highlights a much deeper issue: the collapse of regulatory oversight in the U.S. If we can’t establish rules for AI in nuclear energy, what can we regulate? 

Complete deregulation may in fact be the endgame — for both the current U.S. administration and the Big Tech companies behind it, which are spending billions on lawyers and lobbyists to take the brakes off America’s AI, energy, and data governance.

It’s a toxic trait — though admittedly not as toxic as a nuclear fuel pool leaking borated coolant contaminated with radioactive fission products.

Steve Saunders is a British-born communications analyst, investor and digital media entrepreneur with a career spanning decades.


Op-eds from industry experts, analysts or our editorial staff are opinion pieces that do not represent the opinions of Fierce Network.