Posted by Guest on November 03, 2017 in Blog

by Sarah Seniuk

Probes into the influence of Russian entities during the 2016 presidential election produced disturbing evidence that social media platforms, including Facebook, were used to manipulate our election. We learned that as many as 126 million Americans may have seen some of the 80,000 pieces of Russia-created content on Facebook, and that 2,752 Twitter accounts have been linked to Russian propaganda. Much of this content was aimed at exploiting existing social and political divisions in the country. While the attention on foreign interference in the election is merited, we focused on the fact that real people and organizations were impersonated and hateful content went undetected because our social networks already allow hate to organize there. AAI Executive Director Maya Berry shared that concern directly with Facebook COO Sheryl Sandberg at a meeting focused on the harmful impact of such hate on Arab American and American Muslim communities. The conversation was a frank discussion of these serious problems and an excellent beginning to a necessary, continuing dialogue. After the meeting, we joined other civil rights organizations in calling on Facebook not only to fully disclose all of the activities on its platform that trace back to Russian operatives, but also to work with us to meaningfully counter the impact of hate campaigns on its platform (you can read our full letter here).

Early lines of questioning from committee members were aimed at understanding the scope of Russian actions - direct and indirect, covert and overt, automated and coordinated - and at determining which of these tactics were allowed under current user policies and which were violations. While each of the three companies has particular concerns due to the structure of its platform, a unifying concern is the purchasing of political ads and the need to make the origins and intent of those ads more transparent to those they reach. An initial revelation from Stretch of Facebook confirmed that in their retrospective investigations, they had discovered accounts and content dating back to 2015.

Facebook representative Stretch highlighted the importance of establishing the authenticity of posts - meaning that the content creator was in fact who they presented themselves to be. Edgett of Twitter discussed concerns over bots and the spread of both automated and coordinated content from Russian-linked accounts. These two delivery mechanisms allowed Russian actors to both actively and passively promote 1.4 million tweets aimed at dividing people during the 2016 election. Salgado and Walker both addressed the need to prevent the disproportionate promotion of misinformation and malicious ads through Google itself, as well as YouTube. Despite insistence from each of the tech representatives that the Russian-originated content represented a very small portion of overall content generated during that time, the number of those potentially affected was staggering: as many as 126 million Americans may have seen some of the 80,000 pieces of content created on Facebook; 2,752 Twitter accounts have been identified as linked to the Russian propaganda team the Internet Research Agency; and at least 18 YouTube channels are similarly connected.

“Authenticity” for Stretch and Edgett was a continuous concern in balancing the openness of the platforms against the threats of outside groups seeking to maliciously influence and divide. And this is an issue which moved well beyond the purchasing of political ads. Facebook saw the creation of fake groups that attracted followers around divisive issues, and that even planned protests in Houston which real American citizens attended. Russian actors were able to cultivate spaces which fostered division, hate, and distrust. Representatives from each company communicated their dedication to expanding their ad teams and redefining what counts as a political ad, expanding their security teams to investigate malicious and suspicious activities, and providing more easily accessible information regarding the origins of future political ads purchased on each of these sites.

As Senator Durbin (D-IL) brought up on day one of these hearings, though, the issue goes beyond the influence of foreign actors to the intent of the posts, pages, and ads themselves. Americans are, of course, rightfully upset when we perceive foreign interference, Durbin argued, but we should also be concerned about content, no matter its origin, which promotes hate, violence, and division. Durbin mentioned a letter which AAI signed, along with 18 other rights groups, in which this very concern was raised. The letter called on Facebook to make concrete changes to policies and practices in order to limit the perpetuation of dangerous hate speech through social networking. Legislatively, steps have already been taken ahead of these requests with the introduction of the Honest Ads Act in both the House and Senate to address at least part of this problem. The act aims to hold internet ads to the same disclosure standards as their television and radio counterparts.

The final hearing, with the House Intelligence Committee, saw the release of many of the malicious and divisive ads which have been the key subject of this investigation. And as many of the representatives noted during their questioning, the ads were not aligned with one party or ideology - they played multiple sides of contentious issues. “Heritage, not hate. The South will rise again!” was emblazoned above a Confederate flag in an ad promoting the group “South United”, and successfully earned around 40,000 clicks. Two groups, “Blacktivists” and “Back the Badge”, were created simultaneously, each actively promoting divisive content on opposite sides of the issues linked to the Black Lives Matter movement. One pro-Hillary ad depicted a photo of Hillary beside a woman in a hijab, stating “Support Hillary. Save American Muslims,” while a competing ad purportedly supported a burqa ban. If one thing was apparent to the industry representatives and our members of Congress, it was that Russian interference was not about the support of one particular party or candidate, but about dangerously dividing the American citizenry.

Perhaps the most telling display of the lived effects of Russian disruption was the organizing of a protest and counter-protest in front of an Islamic center in Houston, Texas. The Russian-run “Heart of Texas” group, which spouted fear of Islam, planned a protest in May 2016 to “Stop Islamification of Texas,” and the Russian-run counter group “United Muslims of America” responded in kind, organizing a counter-protest to “Save Islamic Knowledge.” Dozens of real Americans showed up in support of both groups and agendas.

What these hearings have ultimately revealed is the complexity of balancing free speech against suppression, and private industry against legislative protections. Senators and representatives from both parties struggled with questions over the responsibility of all involved parties, and the long-term solution may lie in both greater education and greater transparency.

The full Senate hearings of days one and two can be found here, and the first House hearing can be found here.

Sarah Seniuk is a 2017 fall intern at the Arab American Institute.