White House tries to balance AI’s 'enormous danger' and promise

The astonishing speed of AI innovation has presented the Biden administration with a complex challenge.

President Biden holds a meeting with his science and technology advisers at the White House on April 4 in Washington, D.C. (Kevin Dietsch/Getty Images)

Artificial intelligence could help detect disease, identify national security threats and resolve logistical challenges in global supply chains.

It can also spread misinformation, perpetuate institutional racism and, according to the technology’s most strident critics — including some responsible for its creation — actually destroy civilization.


A bipartisan desire to regulate Big Tech had been building even before ChatGPT started writing articles and inventing languages. Yet in the past, Silicon Valley was given a pass by Washington, regardless of which party was in office.

Given the immense promise and peril of artificial intelligence, the White House knows that it can no longer maintain a laissez-faire attitude.

Yahoo News spoke with Arati Prabhakar, director of the Office of Science and Technology Policy at the White House, about how the Biden administration plans to confront the multifaceted challenge posed by artificial intelligence.

Arati Prabhakar, director of the White House Office of Science and Technology Policy, speaks at the Summit for Democracy on March 30 in Washington, D.C. (Jacquelyn Martin/AP Photo)

'It’s important, when a technology is this powerful, this broad, this fast-moving, to know what you're navigating to. And that's why an articulation of values is essential.'

Last fall, the White House released its “Blueprint for an AI Bill of Rights.” The document outlined areas of concern, including the promulgation of harmful content, algorithmic discrimination, intrusions into privacy and the need for sustained human oversight. But these are merely proposals, a set of recommendations that private industry would be free to ignore.

'I think everyone, including the developers, was surprised and sometimes even astonished by the pace at which advancements have come.'

The introduction of ChatGPT late last year has been described as a world-changing moment, one that seemed to usher in a stage of technological progress long anticipated and now finally arrived.

Since then, corporate competition in the sector has been intense. Some worry that the rush to innovate is preventing a badly needed national conversation about how AI should — and should not — be used.

'I think of it as a new dimension of reality distortion. These are the kinds of new risks we're going to be grappling with.'

U.S. Surgeon General Vivek Murthy speaks at a meeting of the U.S. Conference of Mayors on Jan. 18 in Washington, D.C. (Nathan Posner/Anadolu Agency via Getty Images)

U.S. Surgeon General Vivek Murthy recently issued an advisory on what he has described as an “epidemic of loneliness.” Digital technology is in large part responsible for Americans’ deepening isolation, and while some believe artificial intelligence could offer an escape from solitude, it could also bind people even more tightly to their devices, replacing real-world interactions with ones choreographed by an algorithm.

'We’ve pressed them to step up.'

Last week, Vice President Kamala Harris met with executives from technology companies working on artificial intelligence. Biden himself dropped in on the meeting, in an apparent sign that he was paying attention to the debate over the future role of AI.

“I just came by to say thanks,” the president said. “What you're doing has enormous potential — and enormous danger. I know you understand that.”