US intel says AI is boosting, but not revolutionizing, foreign efforts to influence the 2024 elections

Artificial intelligence is helping “improve” rather than “revolutionize” influence operations from Russia and Iran aimed at November’s US elections, the Office of the Director of National Intelligence said in an assessment released Monday.

“The (US intelligence community) considers AI a malign influence accelerant, not yet a revolutionary influence tool,” an ODNI official told reporters.

The new US assessment is a counterpoint to some of the media and industry hype about AI-related threats. But the technology is still a top concern for US intelligence officials monitoring threats to the presidential election.

The risk to US elections from foreign, AI-generated content depends on the ability of foreign operatives to overcome restrictions built into many AI tools, to develop their own sophisticated AI models, or to “strategically target and disseminate” AI-generated content, the official said. “Foreign actors are behind in each of these three areas.”

Foreign operatives are using AI to try to overcome language barriers in targeting US voters with disinformation, according to US officials.

Iran, for example, has used AI to generate content in Spanish about immigration, which Tehran perceives as a divisive US political issue, the ODNI official said. Tehran-linked operatives have also used AI to target voters across the political spectrum on polarizing issues like the Israel-Gaza conflict, the official said. US officials believe Tehran is trying to undercut former President Donald Trump’s candidacy.

Russia has generated the most AI content related to the US election of any foreign power, according to the ODNI official. The AI-laced content, spanning videos, photos, text and audio, has been consistent with Moscow’s efforts to boost Trump’s candidacy and denigrate Vice President Kamala Harris’ campaign, the official said.

China, meanwhile, is using AI “to amplify divisive U.S. political issues,” but not to try to shape specific US election outcomes, the new US intelligence assessment said.

Foreign operatives have also embraced plenty of old-school influence techniques this election cycle, such as staging videos rather than generating them with AI.

US intelligence agencies believe that Russian operatives staged a video, which circulated on X earlier this month, falsely claiming that Harris paralyzed a young girl in a 2011 hit-and-run accident, the ODNI official said. The Russians promoted the story through a website pretending to be a local San Francisco media outlet, according to Microsoft researchers.

Another Russian-made video, which drew at least 1.5 million views on X, claimed to show Harris supporters attacking an attendee of a Trump rally, according to Microsoft.

US intelligence agencies warned in July that Russia planned to “covertly use social media” to try to sway public opinion and undermine support for Ukraine in swing states.

“Russia is a much more sophisticated actor in the influence space in general, and they have a better understanding of how US elections work and where to target and what states to target,” the ODNI official said.

This isn’t the first US presidential election in which foreign powers have considered deploying AI capabilities.

Operatives working for the Chinese and Iranian governments prepared fake, AI-generated content as part of a campaign to influence US voters in the closing weeks of the 2020 election campaign but chose not to disseminate the content, CNN previously reported. Some US officials who reviewed the intelligence at the time were unimpressed, believing it showed China and Iran lacked the capability to deploy deepfakes in a way that would seriously impact the 2020 presidential election.
