Posts Tagged ‘Internet Research Agency’

What to Do About Disinformation

December 3, 2019

The title of this post is identical to the title of a section in Richard Stengel’s informative book, Information Wars. The book covers not only the State Department, but also the actions Stengel took in doing his job there. The most useful part of the book, however, is the section What to Do About Disinformation. Several posts are needed here, and even then they cannot do justice to the information the book provides.

The Library of Congress was created in 1800; today it holds some 39 million books, and the internet generates 100 times that much data every second. Information definitely is the most important asset of the 21st century. Polls show that people feel bewildered by the proliferation of online news and data. Mixed into this daily tsunami is a great deal of information that is false alongside what is true.

Disinformation undermines democracy because democracy depends on the free flow of information. That’s how we make decisions. Disinformation undermines the integrity of our choices. According to the Declaration of Independence “Governments are instituted among Men, deriving their just powers from the consent of the governed.” If that consent is acquired through deception, the powers from it are not just. Stengel states that it is an attack on the very heart of our democracy.

Disinformation is not news that is simply wrong or incorrect. It is information that is deliberately false in order to manipulate and mislead people.

Definitions of important terms follow:
Disinformation: The deliberate creation and distribution of information that is false and deceptive in order to mislead an audience.
Misinformation: Information that is false, though not deliberately so; it is created inadvertently or by mistake.
Propaganda: Information that may or may not be true that is designed to engender support for a political view or ideology.

“Fake news” is a term Donald Trump uses to describe any content he does not like. But Trump did not originate the term. The term was familiar to Lenin and Stalin and almost every other dictator of the last century. Russians were calling Western media fake news before Trump, and Trump in his admiration of Russia followed suit. Stengel prefers the term “junk news” to describe information that is false, cheap, and misleading that has been created without regard for its truthfulness.

Most people regard “propaganda” as pejorative, but Stengel believes that it is—or should be—morally neutral. Propaganda can be used for good or ill. Advertising is a form of propaganda. What the United States Information Agency did during the Cold War was a form of propaganda. Advocating for something you believe in can be defined as propaganda. Stengel writes that while propaganda is a misdemeanor, disinformation is a felony.
Disinformation is often a mixture of truth and falsity; it does not have to be 100% false to qualify. Stengel writes that the most effective forms of disinformation mix information that is true with information that is false.

Stengel writes that when he was a journalist he was close to being a First Amendment absolutist, but he has changed his mind. He writes that in America the standard for protected speech has evolved since Holmes’s line about “falsely shouting fire in a theater.” In Brandenburg v. Ohio, the Court ruled that speech directed to inciting imminent lawless action, and likely to produce it, is not protected by the First Amendment.

Stengel writes that even outlawing hate speech will not solve the problem of disinformation. Government may not be the answer, but it has a role. He thinks that stricter government regulation of social media can incentivize the creation of fact-based content and disincentivize the creation of disinformation. Currently the big social media platforms optimize for content with greater engagement and virality, and such content can sometimes be disinformation or misinformation. Stengel thinks these incentives can be changed, in part through regulation and in part through more informed user choices.

What Stengel finds most disturbing is that disinformation is being spread in a way and through means that erode trust in public discourse and democratic processes. This is precisely what these bad actors want to accomplish. They don’t necessarily want you to believe them—they don’t want you to believe anybody.

As has been described in previous healthy memory blog posts, the creators of disinformation use all the legal tools on social media platforms that are designed to deliver targeted messages to specific audiences. These are the same tools—behavioral data analysis, audience segmentation, programmatic ad buying—that make advertising campaigns effective. The Internet Research Agency in St. Petersburg, Russia uses the same behavioral data and machine-learning algorithms that Coca-Cola and Nike use.
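
To make the targeting tools a little more concrete, here is a minimal sketch of interest-based audience segmentation in Python. Everything in it is invented for illustration (the field names, interests, and the engagement threshold); it is not drawn from Stengel’s book or from any platform’s actual advertising API, but it captures the basic idea of slicing users into audiences that can each receive a tailored message.

```python
from collections import defaultdict

# Minimal sketch of interest-based audience segmentation.
# All field names, interests, and the threshold below are hypothetical;
# real ad platforms expose far richer behavioral signals than this.
users = [
    {"id": 1, "interests": ["guns", "immigration"], "daily_minutes": 95},
    {"id": 2, "interests": ["veterans", "faith"], "daily_minutes": 20},
    {"id": 3, "interests": ["immigration", "faith"], "daily_minutes": 60},
]

def segment_by_interest(users, min_daily_minutes=30):
    """Group heavier users into one audience segment per declared interest."""
    segments = defaultdict(list)
    for user in users:
        if user["daily_minutes"] < min_daily_minutes:
            continue  # low-engagement users are less valuable targets
        for interest in user["interests"]:
            segments[interest].append(user["id"])
    return dict(segments)

print(segment_by_interest(users))
# {'guns': [1], 'immigration': [1, 3], 'faith': [3]}
```

Each resulting segment could then be paired with its own message, which is all that “audience segmentation” means at this level of description.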

All the big platforms depend on the harvesting and use of personal information. Our data is the currency of the digital economy. The business model of Google, Facebook, Amazon, Microsoft, and Apple, among others, depends on the collection and use of personal information, which they use to show targeted advertising. They collect information on where you go, what you do, whom you know, and what you want to know about, so they can sell that information to advertisers.

The important question is, who owns this information? These businesses argue that because they collect, aggregate, and analyze our data, they own it, and U.S. law agrees. But in Europe, under the EU’s General Data Protection Regulation, people own their own information. Stengel and HM agree that this is the correct model. America needs a digital bill of rights that protects everyone’s information as a new social contract.

Stengel’s concluding paragraph is “I’m advocating a mixture of remedies that optimize transparency, accountability, privacy, self-regulation, data protection, and information literacy. That can collectively reduce the creation, dissemination, and consumption of false information. I believe that artificial intelligence and machine learning can be enormously effective in combating falsehood and disinformation. They are necessary but insufficient. All three efforts should be—to use one of the military’s favorite terms—mutually reinforcing.”

Censorship, Disinformation, and the Burial of Truth

January 20, 2019

This is the eighth post in a series on a book by P.W. Singer and Emerson T. Brooking titled “LikeWar: The Weaponization of Social Media.” Initially, the notion that the internet would provide the basis for truth and independence seemed well supported. The Arab Spring was promoted on the internet. The authors write, “Social media had illuminated the shadowy crimes through which dictators had long clung to power, and offered up a powerful new means of grassroots mobilization.”

Unfortunately, this did not last. Not only did the activists fail to sustain their movement, but governments began to catch up. Tech-illiterate bureaucrats were replaced by a new generation of enforcers who understood the internet almost as well as the protestors did. They invaded online sanctuaries and used the very same channels to spread propaganda. And these tactics worked. The much-celebrated revolutions fizzled. In Libya and Syria, digital activists turned their talents to waging internecine civil wars. In Egypt, the baby named Facebook would grow up in a country that quickly turned back to authoritarian government.

The internet remains under the control of only a few thousand internet service providers (ISPs). These firms run the backbone, or “pipes,” of the internet. Only a few ISPs supply almost all of the world’s mobile data. Because two-thirds of all ISPs reside in the United States, most other countries are served by only a handful. The authors note that “Many of these ISPs hardly qualify as ‘businesses’ at all. Rather, they are state-sanctioned monopolies or crony sanctuaries directed by the whim of local officials.” Although the internet cannot be destroyed, regimes can control when the internet goes on or off and what goes on it.

Governments can control internet access and target particular areas of the country. India, the world’s largest democracy, cut mobile connections for a week in an area where violent protests had broken out. Bahrain instituted an internet curfew that affected only a handful of villages where antigovernment protests were brewing. When Bahrainis began to speak out against the shutdown, authorities narrowed their focus further, cutting access all the way down to specific internet users and IP addresses.
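
A crude way to picture how such a shutdown can be narrowed from an entire region down to individual subscribers is a layered block list checked on every connection. The sketch below is purely illustrative; the region names, networks, and IP addresses are made up, and no claim is made that any particular government’s filtering actually works this way.

```python
from ipaddress import ip_address, ip_network

# Illustrative sketch of layered network blocking: region-wide first,
# then narrowed to a village-sized subnet, then to single users.
# All names and addresses below are invented.
BLOCKED_REGIONS = {"province_x"}                    # region-wide mobile shutdown
BLOCKED_NETWORKS = [ip_network("203.0.113.0/24")]   # a village-sized block
BLOCKED_ADDRESSES = {ip_address("198.51.100.7")}    # a single targeted user

def is_blocked(region: str, address: str) -> bool:
    """Return True if a connection from this region/address would be cut."""
    addr = ip_address(address)
    if region in BLOCKED_REGIONS:
        return True
    if any(addr in net for net in BLOCKED_NETWORKS):
        return True
    return addr in BLOCKED_ADDRESSES

print(is_blocked("province_x", "192.0.2.10"))   # True: the whole region is dark
print(is_blocked("capital", "203.0.113.44"))    # True: village subnet blocked
print(is_blocked("capital", "198.51.100.7"))    # True: one user singled out
print(is_blocked("capital", "192.0.2.10"))      # False: everyone else stays online
```

The point of the sketch is only that the same mechanism scales smoothly from blacking out a province to silencing one account holder.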

The Islamic Republic of Iran has poured billions of dollars into its National Internet Project. It is intended as a web replacement, leaving only a few closely monitored connections between Iran and the outside world. Iranian officials describe it as creating a “clean” internet for its citizens, insulated from the “unclean” web that the rest of us use.

Outside the absolute-authoritarian state of North Korea (whose entire internet is a closed network of about 30 websites), the goal isn’t so much to stop the signal as to weaken it. Although government controls can be circumvented with extensive research and special equipment, the empowering parts of the internet are no longer for the masses.

Although the book discusses China, that discussion will not be included here as there are separate posts on the book “Censored: Distraction and Diversion Inside China’s Great Firewall” by Margaret E. Roberts.

The Russian government hires people to create chaos on the internet. Recruits are tempted by easy work and good money: writing more than 200 blog posts and comments a day, assuming fake identities, hijacking conversations, and spreading lies. This is an ongoing war of global censorship by means of disinformation.

Russia’s large media networks are in the hands of oligarchs, whose finances are deeply intertwined with those of the state. The Kremlin makes its positions known through press releases and private conversations, the contents of which are then dutifully reported to the Russian people, no matter how much spin it takes to make them credible.

Valery Gerasimov has been mentioned in previous healthy memory blog posts. He channeled Clausewitz in a speech, reprinted in a Russian military newspaper, declaring that “the role of nonmilitary means of achieving political and strategic goals has grown. In many cases, they have exceeded the power of the force of weapons in their effectiveness.” This has come to be known as the Gerasimov Doctrine, and it has been enshrined in the nation’s military strategy.

Individuals working at the Internet Research Agency assume a series of fake identities known as “sockpuppets.” The authors write, “The job was writing hundreds of social media posts per day, with the goal of hijacking conversations and spreading lies, all to the benefit of the Russian government.” For this work people were paid the equivalent of $1,500 per month. (Those who worked on the “Facebook desk” targeting foreign audiences received double the pay of those targeting domestic audiences.)

The following is taken directly from the text:

“The hard work of a sockpuppet takes three forms, best illustrated by how they operated during the 2016 U.S. election. One is to pose as the organizer of a trusted group. @Ten_GOP called itself the “unofficial Twitter account of Tennessee Republicans” and was followed by over 136,000 people (ten times as many as the official Tennessee Republican Party account). Its 3,107 messages were retweeted 1,213,506 times. Each retweet then spread to millions more users, especially when it was retweeted by prominent Trump campaign figures like Donald Trump Jr., Kellyanne Conway, and Michael Flynn. On Election Day 2016, it was the seventh most retweeted account across all of Twitter. Indeed, Flynn followed at least five such documented accounts, sharing Russian propaganda with his 100,000 followers at least twenty-five times.

The second sockpuppet tactic is to pose as a trusted news source. With a cover photo image of the U.S. Constitution, @partynews presented itself as a hub for conservative fans of the Tea Party to track the latest headlines. For months, the Russian front pushed out anti-immigrant and pro-Trump messages and was followed and echoed out by some 22,000 people, including Trump’s controversial advisor Sebastian Gorka.

Finally, sockpuppets pass as seemingly trustworthy individuals: a grandmother, a blue-collar worker from the Midwest, a decorated veteran, providing their own heartfelt take on current events (and whom to vote for). Another former employee of the Internet Research Agency, Alan Baskayev, admitted that it could be exhausting to manage so many identities. “First you had to be a redneck from Kentucky, then you had to be some white guy from Minnesota who worked all his life, paid taxes and now lives in poverty; and in 15 minutes you have to write something in the slang of [African] Americans from New York.”

There have been many other posts about Russian interference in Trump’s election. Trump lost the popular vote, and it is clear that he would not have won the Electoral College had it not been for Russia. Clearly, Putin owns Trump.

© Douglas Griffith and healthymemory.wordpress.com, 2019. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Scale of Russian Operation Detailed

December 23, 2018

The title of this post is identical to the title of an article by Craig Timberg and Tony Romm in the 17 Dec ’18 issue of the Washington Post. Subtitles are: EVERY MAJOR SOCIAL MEDIA PLATFORM USED and Report finds Trump support before and after election. The report is the first to analyze the millions of posts provided by major technology firms to the Senate Intelligence Committee.

The research was done by Oxford University’s Computational Propaganda Project and Graphika, a network analysis firm. It provides new details on how Russians working at the Internet Research Agency (IRA), which U.S. officials have charged with criminal offenses for interfering in the 2016 campaign, divided Americans into key interest groups for targeted messaging. The report found that these efforts shifted over time, peaking at key political moments, such as presidential debates or party conventions. This report substantiates facts presented in prior healthy memory blog posts.

The data sets used by the researchers were provided by Facebook, Twitter, and Google and covered several years up to mid-2017, when the social media companies cracked down on the known Russian accounts. The report also analyzed data separately provided to House Intelligence Committee members.

The report says, “What is clear is that all of the messaging clearly sought to benefit the Republican Party and specifically Donald Trump. Trump is mentioned most in campaigns targeting conservatives and right-wing voters, where the messaging encouraged these groups to support his campaign. The main groups that could challenge Trump were then provided messaging that sought to confuse, distract and ultimately discourage members from voting.”

The report provides the latest evidence that Russian agents sought to help Trump win the White House. Democrats and Republicans on the panel previously studied the U.S. intelligence community’s 2017 finding that Moscow aimed to assist Trump, and in July, said the investigators had come to the correct conclusion. Nevertheless, some Republicans on Capitol Hill continue to doubt the nature of Russia’s interference in the election.

The Russians aimed energy at activating conservatives on issues such as gun rights and immigration, while sapping the political clout of left-leaning African American voters by undermining their faith in elections and spreading misleading information about how to vote. Many other groups such as Latinos, Muslims, Christians, gay men and women received at least some attention from Russians operating thousands of social media accounts.

The report offered some of the first detailed analyses of the role played by YouTube and Instagram in the Russian campaign, as well as anecdotes about how Russians used other social media platforms—Google+, Tumblr and Pinterest—that had received relatively little scrutiny. They also used email accounts from Yahoo, Microsoft’s Hotmail service, and Google’s Gmail.

While reliant on data provided by the technology companies, the authors also highlighted the companies’ “belated and uncoordinated response” to the disinformation campaign and, once it was discovered, their failure to share more with investigators. The authors urged that in the future the companies provide data in “meaningful and constructive” ways.

Facebook provided the Senate with copies of posts from 81 Facebook pages and information on 76 accounts used to purchase ads, but it did not share posts from other accounts run by the IRA. Twitter has made it challenging for outside researchers to collect and analyze data on its platform through its public feed.

Google submitted information in a way that was especially difficult for researchers to handle, providing content such as YouTube videos but not the related data that would have allowed a full analysis. The researchers wrote that the YouTube information was so hard to study that they instead tracked the links to its videos from other sites in hopes of better understanding YouTube’s role in the Russian effort.
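
The workaround the researchers describe, following links to YouTube videos from posts on other platforms, can be pictured as a simple link-extraction pass over post text. The sketch below shows only that general approach; the sample posts, video IDs, and the regular expression are mine, not the report’s.

```python
import re
from collections import Counter

# Sketch of counting links to YouTube videos found in posts gathered from
# other platforms. The posts and the regex here are illustrative only.
YOUTUBE_LINK = re.compile(
    r"(?:youtube\.com/watch\?v=|youtu\.be/)([A-Za-z0-9_-]{11})"
)

posts = [
    "Watch this now https://www.youtube.com/watch?v=abcdefghijk #maga",
    "unbelievable https://youtu.be/abcdefghijk",
    "another one https://youtu.be/ZYXWVUTSRQ0",
]

# Tally how often each video ID is linked across the collected posts.
video_counts = Counter(
    match for post in posts for match in YOUTUBE_LINK.findall(post)
)
print(video_counts.most_common())
# [('abcdefghijk', 2), ('ZYXWVUTSRQ0', 1)]
```

Counting which videos are linked most often, and from where, gives at least an indirect picture of a channel’s reach even when the platform’s own data is unusable.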

The report expressed concern about the overall threat social media poses to political discourse within and among nations, warning that companies once viewed as tools for liberation in the Arab world and elsewhere are now a threat to democracy.

The report also said, “Social media have gone from being the natural infrastructure for sharing collective grievances and coordinating civic engagement to being a computational tool for social control, manipulated by canny political consultants and available to politicians in democracies and dictatorships alike.”

The report traces the origins of Russian online influence operations to Russian domestic politics in 2009 and says that ambitions shifted to include U.S. politics as early as 2013. The efforts to manipulate Americans grew sharply in 2014 and every year after, as teams of operatives spread their work across more platforms and accounts to target larger swaths of U.S. voters by geography, political interests, race, religion and other factors.

The report found that Facebook was particularly effective at targeting conservatives and African Americans. More than 99% of all engagements—meaning likes, shares and other reactions—came from 20 Facebook pages controlled by the IRA including “Being Patriotic,” “Heart of Texas,” “Blacktivist” and “Army of Jesus.”
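
The 99% figure is just a share-of-total calculation over per-page engagement counts. A toy version, with invented numbers rather than the report’s data, looks like this:

```python
# Toy computation of what share of total engagement the top pages account for.
# The page names echo those cited in the report; the counts are invented.
engagements = {
    "Being Patriotic": 6_200_000,
    "Heart of Texas": 5_500_000,
    "Blacktivist": 4_600_000,
    "Army of Jesus": 2_700_000,
    "minor page A": 40_000,
    "minor page B": 25_000,
}

total = sum(engagements.values())
top = sorted(engagements.values(), reverse=True)[:4]  # the report used its top 20 pages
share = sum(top) / total
print(f"Top pages account for {share:.1%} of engagements")  # ~99.7% with these numbers
```

The striking part of the report’s finding is not the arithmetic but how concentrated the engagement was in a handful of pages.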

Having lost the popular vote, it is difficult to believe that Trump could have carried the Electoral College without this impressive support from the Russians. One can also envisage Ronald Reagan thrashing about in his grave knowing that the Republican presidential candidate was heavily indebted to Russia and that so many Republicans still support Trump.
© Douglas Griffith and healthymemory.wordpress.com, 2018. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

The 2016 Election—Part One

July 20, 2018

This post is based on David E. Sanger’s “THE PERFECT WEAPON: War, Sabotage, & Fear in the Cyber Age.” In the middle of 2015 the Democratic National Committee asked Richard Clarke to assess the political organization’s digital vulnerabilities. He was amazed at what his team discovered. The DNC—despite its Watergate history, despite the well-publicized Chinese and Russian intrusions into the Obama campaign computers in 2008 and 2012—was securing its data with the kind of minimal techniques one would expect to find at a chain of dry cleaners. The way spam was filtered wasn’t even as sophisticated as what Google’s Gmail provides; it certainly wasn’t prepared for a sophisticated attack. And the DNC barely trained its employees to spot a “spear phishing” email of the kind that fooled Ukrainian power operators into clicking on a link that stole whatever passwords they entered. It lacked any capability for detecting suspicious activity in the network, such as the dumping of data to a distant server. Sanger writes, “It was 2015, and the committee was still thinking like it was 1792.”
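
One of the missing capabilities Sanger points to, noticing data being dumped to a distant server, boils down to watching outbound transfer volumes per destination and flagging anything unusually large going to an unfamiliar host. The sketch below is a generic illustration of that idea; the threshold, hostnames, and record format are invented, and it is not a description of any product the DNC could have bought.

```python
from collections import defaultdict

# Generic sketch of flagging suspiciously large outbound transfers to
# unfamiliar hosts. Threshold, hostnames, and record format are invented.
KNOWN_HOSTS = {"mail.example.org", "backup.example.org"}
ALERT_BYTES = 500 * 1024 * 1024  # flag more than ~500 MB to any one unknown host

# (destination_host, bytes_sent) records, as might be parsed from flow logs
transfers = [
    ("mail.example.org", 120_000_000),
    ("unknown-host.example.net", 300_000_000),
    ("unknown-host.example.net", 450_000_000),
]

totals = defaultdict(int)
for host, sent in transfers:
    totals[host] += sent

for host, sent in totals.items():
    if host not in KNOWN_HOSTS and sent > ALERT_BYTES:
        print(f"ALERT: {sent / 1e6:.0f} MB sent to unfamiliar host {host}")
```

Even a crude rule like this would have surfaced the kind of bulk exfiltration Sanger describes; the DNC simply had nothing watching.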

So Clarke’s team came up with a list of urgent steps the DNC needed to take to protect itself. The DNC said they were too expensive. Clarke recalled, “They said all their money had to go into the presidential race.” Sanger writes, “Of the many disastrous misjudgments the Democrats made in the 2016 elections, this one may rank as the worst.” A senior FBI official told Sanger, “These DNC guys were like Bambi walking in the woods, surrounded by hunters. They had zero chance of surviving an attack. Zero.”

When an intelligence report from the National Security Agency about a suspicious Russian intrusion into the computer networks at the DNC was tossed onto Special Agent Adrian Hawkins’s desk at the end of the summer of 2015, it did not strike him or his superiors at the FBI as a four-alarm fire. When Hawkins eventually called the DNC switchboard, hoping to alert its computer-security team to the FBI’s evidence of Russian hacking, he discovered that the DNC didn’t have a computer-security team. In November 2015 Hawkins contacted the DNC again and explained that the situation was worsening. This second warning still did not set off alarms.

Anyone looking for a motive for Putin to poke into the election machinery of the United States does not have to look far: revenge. Putin had won his election, but only by essentially assuring the outcome, and evidence of the fraud was captured on video that went viral.
Clinton, who was Secretary of State, called out Russia for its antidemocratic behavior. Putin took the declaration personally. The sight of actual protesters, shouting his name, seemed to shake the man known for his unchanging countenance. He also saw an opportunity: he declared that the protests were foreign-inspired. At a large meeting he was hosting, he accused Clinton of being behind “foreign money” aimed at undercutting the Russian state. Putin quickly put down the 2011 protests and made sure that there was no repetition in the aftermath of later elections. His mix of personal grievance at Clinton and general grievance at what he viewed as American hypocrisy never went away. It festered.

Yevgeny Prigozhin developed a large project for Putin: a propaganda center called the Internet Research Agency (IRA). It was housed in a squat four-story building in Saint Petersburg. From that building, tens of thousands of tweets, Facebook posts, and advertisements were generated in hopes of triggering chaos in the United States and, at the end of the process, helping Donald Trump, a man who liked oligarchs, enter the Oval Office.

The creation of the IRA marked a profound transition in how the Internet could be put to use. Sanger writes, “For a decade it was regarded as a great force for democracy: as people of different cultures communicated, the best ideas would rise to the top and autocrats would be undercut. The IRA was based on the opposite thought: social media could just as easily incite disagreements, fray social bonds, and drive people apart. While the first great blush of attention garnered by the IRA would come because of its work surrounding the 2016 election, its real impact went deeper—in pulling at the threads that bound together a society that lived more and more of its daily life in the digital space. Its ultimate effect was mostly psychological.”

Sanger continues, “There was an added benefit: The IRA could actually degrade social media’s organizational power through weaponizing it. The ease with which its “news writers” impersonated real Americans—or real Europeans, or anyone else—meant that over time, people would lose trust in the entire platform. For Putin, who looked at social media’s role in fomenting rebellion in the Middle East and organizing opposition to Russia in Ukraine, the notion of calling into question just who was on the other end of a Tweet or Facebook post—of making revolutionaries think twice before reaching for their smartphones to organize—would be a delightful by-product. It gave him two ways to undermine his adversaries for the price of one.”

The IRA moved on to advertising. Between June 2015 and August 2017 the agency and groups linked to it spent thousands of dollars on Facebook ads each month, a fraction of the cost of an evening of television advertising on a local American television station. In this period Putin’s trolls reached up to 126 million Facebook users, while on Twitter they made 288 million impressions. Bear in mind that there are about 200 million registered voters in the US and only 139 million voted in 2016.

Here are some examples of the Facebook posts: a doctored picture of Clinton shaking hands with Osama bin Laden, and a comic depicting Satan arm-wrestling Jesus. The Satan figure says, “If I win, Clinton wins.” The Jesus figure responds, “Not if I can help it.”

The IRA dispatched two of its experts, a data analyst and a high-ranking member of the troll farm. They spent three weeks touring purple states, did rudimentary research, and developed an understanding of swing states (something that doesn’t exist in Russia). This allowed the Russians to develop an election-meddling strategy and let the IRA target specific populations within those states that might be vulnerable to influence by social media campaigns operated by trolls across the Atlantic.

Russian hackers also broke into the State Department’s unclassified email system, and they might also have gotten into some “classified” systems. They also managed to break into the White House system. In the end, the Americans won the cyber battle in the State and White House systems, though they did not fully understand how it was part of an escalation of a very long war.

The Russians also broke into Clinton’s election office in Brooklyn. Podesta fell prey to a phishing attempt; when he changed his password through the bogus link, the Russians obtained access to sixty thousand emails going back a decade.