Rubio on Big Tech and Foreign Influence Operations: “This is a Minefield”

Sep 18, 2024 | Press Releases

Vice Chairman Marco Rubio (R-FL) of the Senate Select Committee on Intelligence delivered opening remarks during a hearing on the responsibilities of U.S. technology companies to help secure our nation from foreign influence operations in the 2024 elections.

  • “I think it’s important to understand what our policies are today in terms of identifying what is disinformation—what is actually generated by a foreign adversary—versus the amplification of a pre-existing belief in America, which has left a lot of people in a position of being labeled collaborators when, in fact, they just hold views that on that one issue happen to align with what some other country hopes we believe as well. I’m very interested to learn what our internal policies are—and our companies’—because I think it’s a minefield…” – Senator Rubio

Click here for video and read the transcript below:

“This is a tricky and difficult topic, because I think there are two kinds of things we’re trying to address. 

“The first is generated disinformation. I think you’re going to describe some of those efforts today. That is some foreign adversary—Iran, China, Russia—creating or making something up and then amplifying it. They push it out there, and they hope people believe it. It’s actually something I remember giving a speech about back in 2018 or 2019, warning about AI-generated videos that were going to be the wave of the future in terms of trying to influence what people think and see. We’ve seen some of that already play out. That’s pretty straightforward. 

“Let me tell you where it gets complicated. I use this as an example, not because I generally agree with it, but because I think it’s an important example—there are people in the United States who believe that perhaps we shouldn’t have gotten involved with Ukraine, or shouldn’t have gotten involved in the conflict in Europe. Vladimir Putin also happens to believe that and hopes that we will conclude that. There is someone out there saying something that, whether you agree with them or not, is a legitimate political view that’s pre-existing. Now some Russian bot decides to amplify the views of an American citizen who happens to hold those views. The question becomes, is that disinformation? Or misinformation? Is that an influence operation because an existing view is being amplified? 

“It’s easy to say, just take down the amplifiers. But the problem is, it stigmatizes the person whose view it is. The accusation is that that person isn’t simply holding a view. They’re holding the same view that Vladimir Putin happens to have on that one topic, or something similar to what he has. As a result, they themselves must be an asset. That’s problematic. 

“It’s complicated. Recall that in 2020—this is now well-known—there was Hunter Biden’s laptop. There was a story [about it] in the New York Post, and 51 former—I say former, because I have people calling me all the time saying ‘intelligence officers,’ these are former[s]—intelligence officials went out and said, ‘This has all of the attributes of a Russian disinformation campaign.’ As a result, the New York Post, who posted the original story, had their story censored and taken down, their account locked. There was a concerted effort on the basis of that letter to silence a media outlet in the United States on something that actually turned out not to be Russian disinformation, even though I imagine the Russians wanted to spread that story. They might have amplified it, but it also happened to be factual. 

“We know, based on the letter from the CEO of Meta, that the government pressured him during the COVID pandemic to censor certain views. He expressed regret about agreeing to some of that. There are people in this country that have had their accounts locked or, in some cases, canceled out because they questioned the efficacy of masks. We now know Dr. [Anthony] Fauci agreed that masks were not a solution to all the problems.

“The lab leak theory, at one time, was considered a conspiracy and a flat-out lie. Now our own intelligence agencies are saying it’s 50 percent likely, just as likely as a natural origin. 

“This is a tricky minefield. It’s even trickier now, because Russia is still doing it more than anybody else. But you don’t need to have a big, expensive operation to pursue some of this. I think we should anticipate that in the years to come the Iranians are going to get into this business. They already are. The Chinese are going to get into this business. They already are. You see them using that in other countries to sow discord and division. It’s also North Korea and maybe even friendly states who have a preference on how American public opinion turns [whom we have to be concerned about]. 

“I think it’s important to understand what our policies are today in terms of identifying what is disinformation—what is actually generated by a foreign adversary—versus the amplification of a pre-existing belief in America, which has left a lot of people in a position of being labeled collaborators when, in fact, they just hold views that on that one issue happen to align with what some other country hopes we believe as well. I’m very interested to learn what our internal policies are—and our companies’—because I think it’s a minefield…. I don’t want to sow discord. That’s one of the dangers that we’re now flirting with.”