
Scale of Russian Operation Detailed

December 23, 2018

The title of this post is identical to the title of an article by Craig Timberg and Tony Romm in the 17 Dec ’18 issue of the Washington Post. Subtitles are: EVERY MAJOR SOCIAL MEDIA PLATFORM USED and Report finds Trump support before and after election. The report is the first analysis of the millions of posts provided by major technology firms to the Senate Intelligence Committee.

The research was done by Oxford University’s Computational Propaganda Project and Graphika, a network analysis firm. It provides new details on how Russians worked at the Internet Research Agency (IRA), which U.S. officials have charged with criminal offenses for interfering in the 2016 campaign. The IRA divided Americans into key interest groups for targeted messaging. The report found that these efforts shifted over time, peaking at key political moments, such as presidential debates or party conventions. This report substantiates facts presented in prior healthy memory blog posts.

The data sets used by the researchers were provided by Facebook, Twitter, and Google and covered several years up to mid-2017, when the social media companies cracked down on the known Russian accounts. The report also analyzed data separately provided to House Intelligence Committee members.

The report says, “What is clear is that all of the messaging clearly sought to benefit the Republican Party and specifically Donald Trump. Trump is mentioned most in campaigns targeting conservatives and right-wing voters, where the messaging encouraged these groups to support his campaign. The main groups that could challenge Trump were then provided messaging that sought to confuse, distract and ultimately discourage members from voting.”

The report provides the latest evidence that Russian agents sought to help Trump win the White House. Democrats and Republicans on the panel previously studied the U.S. intelligence community’s 2017 finding that Moscow aimed to assist Trump, and in July said the investigators had reached the correct conclusion. Nevertheless, some Republicans on Capitol Hill continue to doubt the nature of Russia’s interference in the election.

The Russians aimed energy at activating conservatives on issues such as gun rights and immigration, while sapping the political clout of left-leaning African American voters by undermining their faith in elections and spreading misleading information about how to vote. Many other groups such as Latinos, Muslims, Christians, gay men and women received at least some attention from Russians operating thousands of social media accounts.

The report offered some of the first detailed analyses of the role played by YouTube and Instagram in the Russian campaign as well as anecdotes about how Russians used other social media platforms—Google+, Tumblr and Pinterest—that had received relatively little scrutiny. The Russians also used email accounts from Yahoo, Microsoft’s Hotmail service, and Google’s Gmail.

While reliant on data provided by the technology companies, the authors also highlighted the companies’ “belated and uncoordinated response” to the disinformation campaign and, once it was discovered, their failure to share more with investigators. The authors urged that in the future the companies provide data in “meaningful and constructive” ways.

Facebook provided the Senate with copies of posts from 81 Facebook pages and information on 76 accounts used to purchase ads, but it did not share posts from other accounts run by the IRA. Twitter has made it challenging for outside researchers to collect and analyze data on its platform through its public feed.

Google submitted information in an especially difficult way for researchers to handle, providing content such as YouTube videos but not the related data that would have allowed a full analysis. The researchers wrote that the YouTube information was so hard to study that they instead tracked the links to its videos from other sites in hopes of better understanding YouTube’s role in the Russian effort.

The report expressed concern about the overall threat social media poses to political discourse within and among nations, warning that companies once viewed as tools for liberation in the Arab world and elsewhere are now a threat to democracy.

The report also said, “Social media have gone from being the natural infrastructure for sharing collective grievances and coordinating civic engagement to being a computational tool for social control, manipulated by canny political consultants and available to politicians in democracies and dictatorships alike.”

The report traces the origins of Russian online influence operations to Russian domestic politics in 2009 and says that ambitions shifted to include U.S. politics as early as 2013. The efforts to manipulate Americans grew sharply in 2014 and every year after, as teams of operatives spread their work across more platforms and accounts to target larger swaths of U.S. voters by geography, political interests, race, religion and other factors.

The report found that Facebook was particularly effective at targeting conservatives and African Americans. More than 99% of all engagements—meaning likes, shares and other reactions—came from 20 Facebook pages controlled by the IRA including “Being Patriotic,” “Heart of Texas,” “Blacktivist” and “Army of Jesus.”

Given that Trump lost the popular vote, it is difficult to believe that he could have carried the Electoral College without this impressive support from the Russians. One can also envisage Ronald Reagan thrashing about in his grave knowing that the Republican presidential candidate was heavily indebted to Russia and that so many Republicans still support Trump.
© Douglas Griffith and healthymemory.wordpress.com, 2018. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Social Media Putting Democracy at Risk

February 24, 2018

This blog post is based on an article titled “YouTube excels at recommending videos—but not at detecting hoaxes” by Craig Timberg, Drew Harwell, and Tony Romm in the 23 Feb 2018 issue of the Washington Post. The article begins, “YouTube’s failure to stop the spread of conspiracy theories related to last week’s school shooting in Florida highlights a problem that has long plagued the platform: It is far better at recommending videos that appeal to users than at stanching the flow of lies.”

To be fair, YouTube’s fortunes are based on how well its recommendation algorithm is tuned to the tastes of individual viewers. Consequently, the recommendation algorithm is its major strength. YouTube’s weakness in detecting misinformation was on stark display this week as demonstrably false videos rose to the top of YouTube’s rankings. The article notes that one clip that mixed authentic news images with misleading context earned more than 200,000 views before YouTube yanked it Wednesday for breaching its rules on harassment.

The article states, “These failures this past week—which also happened on Facebook, Twitter, and other social media sites—make it clear that some of the richest, most technically sophisticated companies in the world are losing against people pushing content rife with untruth.”

YouTube apologized for the prominence of these misleading videos, which claimed that survivors featured in news reports were “crisis actors” appearing to grieve for political gain. YouTube removed these videos and said the people who posted them outsmarted the platform’s safeguards by using portions of real news reports about the Parkland, Fla., shooting as the basis for their conspiracy videos and memes that repurposed authentic content.

YouTube made a statement that its algorithm looks at a wide variety of factors when deciding a video’s placement and promotion. The statement said, “While we sometimes make mistakes with what appears in the Trending Tab, we actively work to filter out videos that are misleading, clickbait or sensational.”

It is believed that YouTube is expanding the fields its algorithm scans, including a video’s description, to ensure that clips alleging hoaxes do not appear in the trending tab. HM recommends that humans be involved with the algorithm scans to achieve man-machine symbiosis. [to learn more about symbiosis, enter “symbiosis” into the search block of the Healthymemory blog.] The company has pledged on several occasions to hire thousands more humans to monitor trending videos for deception. It is not known whether this has been done or if humans are being used in a symbiotic manner.
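To make the idea concrete, the kind of metadata scan described above can be sketched as a toy keyword filter. This is purely illustrative: the phrase list, function names, and matching logic here are assumptions for the sketch, and YouTube’s actual systems are far more sophisticated and not public.

```python
# Toy illustration (NOT YouTube's actual system): screen candidate videos
# for hoax-associated phrases in their metadata before they reach "trending".
HOAX_PHRASES = ["crisis actor", "false flag", "staged shooting"]

def is_suspect(title: str, description: str) -> bool:
    """Return True if any hoax-associated phrase appears in the metadata."""
    text = f"{title} {description}".lower()
    return any(phrase in text for phrase in HOAX_PHRASES)

def filter_trending(candidates: list[dict]) -> list[dict]:
    """Keep only candidates whose metadata passes the naive keyword check."""
    return [v for v in candidates
            if not is_suspect(v["title"], v["description"])]

videos = [
    {"title": "Local news report", "description": "Coverage of the vigil"},
    {"title": "EXPOSED: crisis actor caught", "description": "the truth"},
]
print([v["title"] for v in filter_trending(videos)])  # → ['Local news report']
```

A sketch like this also shows why human involvement matters: keyword lists are trivially evaded by misspellings or paraphrase, which is exactly the gap a man-machine symbiosis would have human reviewers fill.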

Google also seems to have fallen victim to falsehoods, as it did after previous mass shootings, via its auto-complete feature. When users type the name of a prominent Parkland student, David Hogg, the word “actor” often appears in the auto-complete field, a feature that drives traffic to a subject.
