Eight Revealing Moments From the Second Day of Russia Hearings – WIRED

On their second day on Capitol Hill, lawyers from Facebook, Twitter, and Google took a bipartisan beating as they faced tough questions about the role their platforms played in Russian attempts to divide the American electorate. Members of the Senate Intelligence Committee grilled the tech executives about their responses to Russian interference in the 2016 election, arguing that the companies are not taking seriously what Congress considers a kind of cyberwarfare. Moreover, some members said the companies’ business models are built to enable the kind of disinformation campaigns Russians used to sow discord.

“Russians have been conducting information warfare for decades,” said Democratic Sen. Mark Warner in his opening remarks. “But what is new is the advent of social-media tools with the power to magnify propaganda and fake news on a scale that was unimaginable back in the days of the Berlin Wall. Today’s tools seem almost purpose-built for Russian disinformation techniques.”

The hearing revealed new and startling insight into the ways in which Russians pitted Americans against each other, and reinforced the notion that social-media ads are only a portion of the threat from foreign actors. Senators also forced the tech execs to explain how they police content on their platforms in different parts of the world.

Here were the most revealing exchanges.

“It’s hard to attend an event in Houston, Texas when you’re trolling from St. Petersburg, Russia.” – Republican Sen. Richard Burr

Burr, the committee chair, highlighted two Facebook posts from a Russian propaganda group called the Internet Research Agency that created a conflict on the streets of Houston by drawing two groups of protestors to fake “rallies” at the same place and time. One post, shared by the fake Facebook page Heart of Texas, promoted a purported protest against the “Islamization of Texas.” The second post, uploaded by the fake page United Muslims of America, promoted an event aimed at saving “Islamic knowledge.” Both groups bought ads to publicize their events, spending about $200 in total.

Burr then showed images of the resulting clash outside the Islamic Center in Houston, dramatizing how fake accounts can produce real conflict. Skeptics of the impact of Russian meddling in the US election have argued that just because Russia endeavored to influence American voters doesn’t mean they did. But the fact that people showed up for these protests, designed to foment anger on both sides, demonstrates that influence.

“Do you believe that any of your companies have identified the full scope of Russian active measures?” – Warner

“I have to say no.” – Facebook General Counsel Colin Stretch

In September, Facebook acknowledged that it had discovered 3,000 ads from 470 accounts connected to the Internet Research Agency. It has since revealed that those accounts collectively created 80,000 pieces of content that may have been shared, both organically and through ads, with 126 million people. It shared that information with Twitter and Google. Now Twitter says it has identified 2,752 accounts linked to the Internet Research Agency, while Google says it has identified 18 YouTube channels connected to the group.

But Warner said he’s concerned that much of what we know about Russian actions on these platforms is “derivative” of Facebook’s initial findings. In response to his question, executives of Twitter and Google also said they did not believe their companies had yet uncovered the full extent of Russian activities.

The dialogue illustrates an important point: The companies have been slow to investigate and respond to Russian meddling, which started in 2015, more than two years ago. “Many of us on this committee have been raising these issues since the beginning of this year,” Warner said. “Our claims were frankly blown off by the leadership of your companies.”

“Is a foreign influence campaign a violation of the terms of service of any of the three companies represented here today?” – Republican Sen. Marco Rubio.

Facebook says it deleted the accounts connected to the Internet Research Agency because the accounts were fake, a violation of its terms of service. Twitter says it deleted another 36,746 Russian bot accounts because its terms of service prohibit the use of automated accounts to spread spam on the service. And YouTube argues it allows the Russian propaganda media company RT to continue publishing videos because RT hasn’t explicitly violated the company’s rules around inciting hate speech or violence.

In response to Rubio’s question, Twitter’s general counsel Sean Edgett said foreign influence did not directly violate Twitter’s terms of service. “We don’t have state-sponsored manipulation of elections as one of our rules,” he said. “The other rules like inflammatory ads content would take down most of these posts, but we don’t outright ban it.”

Federal law bars foreign nationals from interfering in US elections. But the emphasis on fake accounts raises the question of whether the companies would act against foreign agencies that deployed trolls using real names and faces to spread the same messages. Guided by Rubio’s questions, Facebook’s Stretch said the company complies with laws in other countries that restrict speech, such as a German law that makes it a crime to deny the Holocaust. The implication of Rubio’s remarks: Why aren’t the companies enforcing the US law banning foreign interference in elections?

“Do any of you have any information that registered voter data was uploaded and used to customize advertising or messaging to individual voters?” – Rubio

“We haven’t seen evidence of that.” – Twitter’s Edgett

“The same is true for Facebook.” – Facebook’s Stretch

Facebook’s revelation that Russians had purchased ads prompted speculation about whether the Russians had help targeting the ads, potentially from the Trump campaign or its allies. The companies made clear Wednesday that they have no evidence that voter lists were used. Facebook’s and Twitter’s own ad tools offered the Internet Research Agency all of the targeting capabilities it needed.

Burr also noted that the Russians targeted ads at both “safe” states politically and “swing” states. He said nearly five times as many ads were targeted at Maryland, which Hillary Clinton won comfortably, as at Wisconsin, a key swing state that Trump won unexpectedly. He urged listeners not to consider Russian interference as an effort to prop up one candidate over another. “It is short-sighted and dangerous to selectively focus on one piece of information and think that somehow tells the whole story,” he said.

“Their strategy is to take a crack in our society and turn it into a chasm.” – Independent Sen. Angus King

As details of the Russian ad campaign have leaked to the public, questions have grown about what, exactly, the Internet Research Agency sought to accomplish. The content simultaneously supported conservative and liberal viewpoints. It attacked immigrants and welcomed them. It denounced white supremacism and denied its existence. Conservatives have used this as a defense of President Trump, arguing that Russians had no influence on the election outcome.

But Wednesday’s hearing made clear that the Russians achieved another outcome—stoking divisions and anger among Americans. Setting aside who won the race, anger at and distrust of the American electoral system were a central outcome of the 2016 election. In clearly demonstrable ways, it was Russians who generated that anger.

Some, including the platforms themselves, have tried to frame the question of shutting down this content as a free-speech issue. Republican Sen. James Lankford challenged that view. “This is not an opposition of free speech battle. This is actually a battle to try to protect free speech,” he said. “If two Americans have a disagreement, let’s have at it. If an outsider wants to come to it, we do have a problem with that.”

“You’ve created these platforms, and now, they’re being misused, and you have to be the ones to do something about it. Or we will.” – Democratic Sen. Dianne Feinstein

Feinstein’s remark, from a California senator generally viewed as friendly to tech, underscored how members of both parties are exploring regulatory fixes to the problems revealed during the campaign.

Burr noted that the companies are not exempt from federal laws requiring political advertisers to publicly disclose their funding. “I hope if there’s a takeaway from this, it’s that everybody’s going to adhere to FEC law,” he said.

Democratic Sen. Joe Manchin, meanwhile, pushed all three companies to support recently introduced legislation called the Honest Ads Act, which would require tech platforms to publish disclosures on political and issue-based ads and retain databases with additional information about who’s behind the ad, as TV and radio stations do now. Twitter and Facebook recently announced similar features, though it’s not clear if the disclosures will match those of broadcasters.

Both Facebook and Twitter said they would work with regulators on a legislative solution. Sen. Feinstein suggested they’d better act fast, saying, “We are not going to go away, gentlemen.”

“Are you also intending to turn over to the committee any kind of direct messages that went on among the different accounts?” – Democratic Rep. Joaquin Castro

This question, referring to direct messages on Twitter and private chats on Facebook that phony Russian social-media accounts may have sent individual users, came during a House Intelligence Committee hearing Wednesday afternoon. So far, neither company has shared that content with Congress, and judging by their responses, they don’t seem poised to do so any time soon.

“Direct messages are the private communications of our users. We take that privacy right and responsibility very seriously,” said Twitter’s Edgett. Facebook’s Stretch said the question raised “thorny issues,” adding, “We’re happy to take a look at it and do what we can.”

The exchange demonstrates that despite gestures of transparency, these companies are generally responding to requests rather than volunteering information. It also underscores the many additional corners of Russia’s online influence campaign that have yet to be explored.

“Have your investigations looked at whether the Trump campaign was sharing Russian content? Have they looked at whether the Russians were sharing Trump campaign content?” – Democratic Rep. Jackie Speier

Perhaps the most cryptic back and forth of the day came toward the end of the House hearing. Speier noted two tweets that appeared around the same time during the campaign—one from the Trump campaign and one from RT—that both dealt with the subject of Hillary Clinton’s health. Speier asked the companies if they noted similarities in the content generated by Russian entities and by the Trump campaign.

Twitter and Facebook had both earlier said they had no evidence that the Russians and the Trump campaign targeted the same users. But this was a different question, relating to the similarity of content. And executives of both companies ducked. Facebook’s Stretch said, “We provided all relevant information to the committee, and we do think it’s an important function of this committee, because you have access to broader set of information than any single company will.”

The answer was puzzling not only because it was inconclusive, but also because the companies themselves are in the best position to know whether a post or tweet was repurposed and used by another account. Even if content from one account turned up elsewhere, that doesn’t necessarily imply collusion. Such retweets and repostings happen all the time. Viral content by its very nature has a way of being coopted.

UPDATE, 8PM: This article and headline have been updated to include comments from the afternoon hearing of the House Intelligence Committee.
