Joe Biden’s Big AI Plan Sounds Scary—but Lacks Bite
“We intend that the actions we are taking domestically will serve as a model for international action, understanding that AI developed in one nation can impact the lives and livelihoods of billions of people around the world,” Harris said Monday. “Fundamentally, it is our belief that technology with global impact requires global action.”
Ahead of Biden’s address, WIRED sat down with Arati Prabhakar, director of the White House Office of Science and Technology Policy. She was quick to bat away a question about who has the president’s ear on generative AI. “Let’s start at the beginning,” she offered. “This is the biggest action anyone’s ever taken anywhere in the world on AI, and the president’s driven it from the beginning.”
Prabhakar also didn’t much want to talk about how a law designed for wartime could be casually used to enact new tech regulation. “The Defense Production Act has been used in times of crisis when we face significant national security issues,” Prabhakar said, noting its use to speed up delivery of vaccines and PPE during the pandemic.
But is the US—WIRED asked in countless ways—in a national crisis caused by AI on the level of a war or global pandemic?
“The national security issues are not the whole story but one part that does require paying attention to,” Prabhakar said. “That’s the reason for using the Defense Production Act for the very specific purpose of getting notification and disclosure related to powerful model development beyond where we are today.” It’s a nice theory that could end up being put to the test in court, if companies challenge this new use case for a law generally tapped only in emergencies.
Prabhakar steered the conversation back toward more everyday AI but ended up highlighting other limitations of Biden’s latest executive action. “The vast majority of actions in this executive order are about how we use the AI technologies that are already out in the world responsibly,” she said. “How do we make sure they don’t violate Americans’ privacy? How do we make sure that we don’t embed bias that changes, you know, where people can live and whether they get a loan or whether they go to jail or not?”
Great questions—but changing the rules for privacy, or mandating that private companies, say, universally watermark AI-altered images, requires legislation from Congress. Indeed, even as his White House goes all out to sell the expansive new executive order as a historic first, Biden admits that the order falls short of what the changes wrought by this ever-evolving technology demand. “This executive order represents bold action, but we still need Congress to act,” Biden said.