Posts Tagged ‘fake news’

Postmortem

December 18, 2019

The title of this post is identical to the title of a chapter in Messing with the Enemy, an excellent book by Clint Watts. The postmortem on Russia's influence and meddling in the presidential election of 2016 may never end. Trump was completely unconventional, uninformed, and unlikable in so many ways, and yet he had become the leader of the free world. Fake news entered the American lexicon, and Watts's pre-election detailing of Russian active measures on the internet became the subject of hot debate. Had fake news swayed the U.S. presidential election?

Social media companies began digging into the data. What they found spelled dangerous trends for democracy. Americans were increasingly getting their news and information from social media instead of mainstream media, and users were not consuming factual content. Fake news, that is, false or misleading stories from outlets of uncertain credibility, was being read far more widely than reporting from traditional newsrooms. EndTheFed.com and Political Insider produced four of the five most read false news stories in the three months leading up to the election. One story falsely claimed that Pope Francis had endorsed Donald Trump, and another falsely claimed that Hillary Clinton's emails hosted on WikiLeaks certified her as an ISIS supporter. Throughout December, fears of Russian election manipulation grew, and each day brought more inquiries into how Russia had trolled for Trump.

The American electorate remains divided, government operations are severely disrupted, and faith in elected leaders continues to fall. Apparently, the objectives of Russia’s active measures have been achieved. Watts concludes that Americans still don’t grasp the information war Russia perpetrated against the West, why it works, and why it continues.

Watts writes, “The Russians didn’t have to hack election machines; they hacked American minds. The Kremlin didn’t change votes; it won them, helping tear down its less-preferred candidate, Hillary Clinton, to promote one who shares their worldviews, Donald Trump.”

Watts continues, “Americans’ rapid social media consumption of news creates a national vulnerability for foreign influence. Even further, the percentage of American adults fifty and older utilizing social media sites is one of the highest in the world, at 50%. Younger Americans, aged eighteen to thirty-four, sustain a utilization rate of about 80%.” Deeper analysis by the Pew Research Center shows that U.S. online news consumers still get their information from news organizations more than from their friends, but they believe the friends they stay in touch with on social media applications provide information that is just as relevant.

A look at the Columbia Journalism Review’s media map demonstrates how social media encouraged information bubbles for each political leaning. Conservatives strongly centered their consumption around Breitbart and Fox News, while liberals relied on a more diverse spread of left-leaning outlets. For a foreign influence operation like the one the Russians ran against the United States, the highly concentrated right-wing social media landscape is an immediate, ripe target for injecting themes and messages. The American left is more diversely spread, making message targeting more difficult.

The Internet Research Agency in St. Petersburg, Russia, bought $4,700 in advertising and, through eighteen channels, hosted more than 1,000 videos that received more than 300,000 views.

The Russians created a YouTube page called Williams and Kalvin. The page’s videos showcase two black video bloggers with African accents who appear to read from a script claiming that Barack Obama created police brutality and calling Hillary Clinton an “old racist bitch.” The Williams and Kalvin page garnered 48,000 fans. Watts writes, “Russian influence operators employed most every platform—Instagram, Tumblr, even PokemonGo—but it was the Kremlin’s manipulation via Twitter that proved the most troubling.”

Watts concludes that U.S. government resources are needed to mount a truly effective effort. Intelligence agencies, Homeland Security, and the State Department need to rally and coordinate. Rex Tillerson was late in using the $80 million Congress had set aside for counterpropaganda resources, and then used only half of the appropriated amount. This is just a start, and a small one at that, of what America needs to do against Russian influence. The last sentence in this chapter reads, “Kislyak was right, and Putin must still wonder, ‘Why hasn’t America punched back?’”

What to Do About Disinformation

December 3, 2019

The title of this post is identical to the title of a section in Richard Stengel’s informative work, Information Wars. The book illuminates not only the State Department but also the actions Rick Stengel took performing his job. Its most useful part, though, is the section What to Do About Disinformation. Several posts are needed here, and even then they cannot do justice to the information the book provides.

The Library of Congress, created in 1800, today holds about 39 million books; the internet generates a hundred times that much data every second. Information may well be the most important asset of the 21st century. Polls show that people feel bewildered by the proliferation of online news and data, and mixed into this daily tsunami is a great deal of information that is false as well as true.

Disinformation undermines democracy because democracy depends on the free flow of information. That’s how we make decisions. Disinformation undermines the integrity of our choices. According to the Declaration of Independence, “Governments are instituted among Men, deriving their just powers from the consent of the governed.” If that consent is acquired through deception, the powers derived from it are not just. Stengel states that disinformation is an attack on the very heart of our democracy.

Disinformation is not news that is simply wrong or incorrect. It is information that is deliberately false in order to manipulate and mislead people.

Definitions of important terms follow:
Disinformation: The deliberate creation and distribution of information that is false and deceptive in order to mislead an audience.
Misinformation: Information that is false, though not deliberately; that is created inadvertently or by mistake.
Propaganda: Information that may or may not be true that is designed to engender support for a political view or ideology.

“Fake news” is a term Donald Trump uses to describe any content he does not like. But Trump did not originate the term. The term was familiar to Lenin and Stalin and almost every other dictator of the last century. Russians were calling Western media fake news before Trump, and Trump in his admiration of Russia followed suit. Stengel prefers the term “junk news” to describe information that is false, cheap, and misleading that has been created without regard for its truthfulness.

Most people regard “propaganda” as pejorative, but Stengel believes that it is—or should be—morally neutral. Propaganda can be used for good or ill. Advertising is a form of propaganda. What the United States Information Agency did during the Cold War was a form of propaganda. Advocating for something you believe in can be defined as propaganda. Stengel writes that while propaganda is a misdemeanor, disinformation is a felony.
Disinformation is often a mixture of truth and falsity; it doesn’t have to be 100% false to qualify. Indeed, Stengel writes that the most effective forms of disinformation mix information that is true with information that is false.

Stengel writes that when he was a journalist he was close to being a First Amendment absolutist. But he has changed his mind. He writes that in America the standard for protected speech has evolved since Holmes’s line about “falsely shouting fire in a theater.” In Brandenburg v. Ohio, the Supreme Court ruled that speech directed at inciting imminent lawless action, and likely to produce it, is not protected by the First Amendment.

Stengel writes that even outlawing hate speech would not solve the problem of disinformation. Government may not be the whole answer, but it has a role. He thinks stricter government regulation of social media could incentivize the creation of fact-based content and disincentivize the creation of disinformation. Currently the big social media platforms optimize for content with high engagement and virality, and such content can sometimes be disinformation or misinformation. Stengel thinks these incentives can be changed in part through regulation and in part through more informed user choices.

What Stengel finds most disturbing is that disinformation is being spread in a way, and through means, that erode trust in public discourse and democratic processes. This is precisely what these bad actors want to accomplish. They don’t necessarily want you to believe them—they don’t want you to believe anybody.

As has been described in previous healthy memory blog posts, the creators of disinformation use all the legal tools on social media platforms that are designed to deliver targeted messages to specific audiences. These are the same tools—behavioral data analysis, audience segmentation, programmatic ad buying—that make advertising campaigns effective. The Internet Research Agency in St. Petersburg, Russia uses the same behavioral data and machine-learning algorithms that Coca-Cola and Nike use.
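To make the targeting tools mentioned above concrete, here is a minimal, illustrative Python sketch of audience segmentation. The users, topics, and engagement counts are all invented for the example; real ad platforms work from far richer behavioral data, but the underlying idea of bucketing users by their dominant interest so each segment can be served a tailored message is the same:

```python
from collections import defaultdict

# Invented sample data: per-user engagement counts by topic, the kind of
# behavioral signal ad platforms derive from clicks, likes, and shares.
users = {
    "user_a": {"politics": 12, "sports": 1},
    "user_b": {"politics": 2, "sports": 9},
    "user_c": {"politics": 8, "finance": 10, "sports": 0},
}

def segment_by_dominant_interest(profiles):
    """Assign each user to the topic they engage with most.

    Returns a dict mapping each topic to the list of users for whom
    that topic is dominant -- the segments an advertiser (or a troll
    farm) would then target with tailored messages.
    """
    segments = defaultdict(list)
    for user, counts in profiles.items():
        top_topic = max(counts, key=counts.get)
        segments[top_topic].append(user)
    return dict(segments)
```

The point of the sketch is that nothing here is exotic: a few lines of ordinary code, fed with the behavioral data the platforms already collect, is enough to sort an audience into targetable buckets.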

All the big platforms depend on the harvesting and use of personal information. Our data is the currency of the digital economy. The business model of Google, Facebook, Amazon, Microsoft, and Apple, among others, depends on the collection and use of personal information. They use this information to show targeted advertising. They collect information on where you go, what you do, whom you know, and what you want to know about, so they can sell that information to advertisers.

The important question is, who owns this information? These businesses argue that because they collect, aggregate, and analyze our data, they own it. U.S. law agrees. But in Europe, under the EU’s General Data Protection Regulation, people own their own information. Stengel and HM agree that this is the correct model. America needs a digital bill of rights that protects everyone’s information as a new social contract.

Stengel’s concluding paragraph is “I’m advocating a mixture of remedies that optimize transparency, accountability, privacy, self-regulation, data protection, and information literacy. That can collectively reduce the creation, dissemination, and consumption of false information. I believe that artificial intelligence and machine learning can be enormously effective in combating falsehood and disinformation. They are necessary but insufficient. All three efforts should be—to use one of the military’s favorite terms—mutually reinforcing.”

Research Ties Fake News to Russia

November 28, 2016

The title of this post is identical to a front page story by Craig Timberg in the 25 November 2016 issue of the Washington Post.  The article begins, “The flood of ‘fake news’ this election season got support from a sophisticated Russian propaganda campaign that created misleading articles online with the goal of punishing Democrat Hillary Clinton, helping Republican Donald Trump, and undermining faith in American democracy, say independent researchers who tracked the operation.”

The article continues, “Russia’s increasingly sophisticated machinery—including thousands of botnets, teams of paid human ‘trolls,’ and networks of websites and social-media accounts—echoed and amplified right-wing sites across the Internet as they portrayed Clinton as a criminal hiding potentially fatal health problems and preparing to hand control of the nation to a shadowy cabal of global financiers. The effort also sought to heighten the appearance of international tensions and promote fear of looming hostilities with nuclear-armed Russia.”

Two teams of independent researchers found that the Russians exploited American-made technology platforms to attack U.S. democracy at a particularly vulnerable moment.  The sophistication of these Russian tactics may complicate efforts by Facebook and Google to crack down on “fake news.”

The research was done by Clint Watts, a fellow at the Foreign Policy Research Institute who has been tracking Russian propaganda since 2014, along with two other researchers, Andrew Weisburd and J.M. Berger. Their analysis can be found at warontherocks.com: “Trolling for Trump: How Russia Is Trying to Destroy Our Democracy.”

Another group, PropOrNot (http://www.propornot.com/), plans to release its own findings today showing the startling reach and effectiveness of Russian propaganda campaigns.

Here are some tips for identifying fake news:

Examine the URL, which is sometimes subtly changed.
Does the photo look photoshopped or unrealistic? (Drop it into Google Images.)
Cross-check with other news sources.
Think about installing Chrome plug-ins that flag dubious sites.
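The first tip, checking for subtly altered URLs, can even be partially automated. Below is a hedged, illustrative Python sketch (the list of trusted domains is a made-up sample, and a real checker would use a curated database) that flags domains suspiciously close to, but not the same as, well-known news outlets:

```python
from difflib import SequenceMatcher
from urllib.parse import urlparse

# Hypothetical sample of trusted outlets; a real tool would use a
# maintained database of legitimate news domains.
TRUSTED_DOMAINS = ["washingtonpost.com", "nytimes.com", "bbc.com", "reuters.com"]

def suspicious_lookalike(url, threshold=0.85):
    """Return the trusted domain a URL imitates, or None.

    A domain that is very similar to (but not identical to) a trusted
    one -- e.g. 'washingtonpost.com.co' or 'nytirnes.com' -- is a
    common fake-news tactic.
    """
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    for trusted in TRUSTED_DOMAINS:
        if domain == trusted:
            return None  # exact match: this is the real site
        similarity = SequenceMatcher(None, domain, trusted).ratio()
        if similarity >= threshold or domain.startswith(trusted + "."):
            return trusted
    return None
```

For example, `suspicious_lookalike("http://washingtonpost.com.co/news")` flags the imitation of washingtonpost.com, while the genuine nytimes.com passes cleanly. String similarity alone produces false positives and misses, so this is a screening aid, not a verdict, which is exactly why the other tips (cross-checking sources, examining photos) still matter.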