Posts Tagged ‘Stengel’

Advertising

December 8, 2019

The title of this post is identical to the title of one of the fixes needed for the internet proposed in Richard Stengel’s informative work, Information Wars. This is the last fix he provides.

He writes, “Advertisers use ad mediation software provided by the platforms to find the most relevant audiences for their ads. These ad platforms take into account a user’s region, device, likes, searches, and purchasing history. Something called dynamic creative optimization, a tool that uses artificial intelligence, allows advertisers to optimize their content for the user and find the most receptive audience. Targeted ads are dispatched automatically across thousands of websites and social media feeds. Engagement statistics are logged instantaneously to tell the advertiser and the platform what is working and what is not. The system tailors the ads for the audiences likely to be most receptive.”
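To make this concrete, here is a minimal sketch of what dynamic creative optimization amounts to: score each ad variant against a user profile and serve the highest-scoring one. All of the names, fields, and weights below are illustrative assumptions, not any platform’s actual software.

```python
# Hypothetical sketch of dynamic creative optimization: pick the ad
# variant predicted to engage a given user most. Every name and field
# here is an invented illustration, not a real ad platform's API.

def predicted_engagement(user, variant):
    """Toy relevance score: count overlaps between a user's recorded
    interests and the topics an ad variant is tagged with."""
    return len(set(user["likes"]) & set(variant["topics"]))

def choose_variant(user, variants):
    """Serve whichever creative scores highest for this user."""
    return max(variants, key=lambda v: predicted_engagement(user, v))

user = {"region": "US", "likes": ["running", "travel", "cooking"]}
variants = [
    {"id": "A", "topics": ["finance", "golf"]},
    {"id": "B", "topics": ["running", "travel"]},
]
print(choose_variant(user, variants)["id"])  # -> "B"
```

In a real system the outcome of each impression would be logged and fed back into the scoring model, which is the feedback loop Stengel describes.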

Of course, the bad guys use all these tools to find the audiences they want as well. The Russians became experts at using two parts of Facebook’s advertising infrastructure: the ads auction and something called Custom Audiences. In the ads auction, potential advertisers submit a bid for a piece of advertising real estate. Facebook not only awards the space to the highest bidder, but also evaluates how clickbaitish the copy is. The more eyeballs an ad is expected to get, the more likely it is to win the ad space, even if the bidder is offering a lower price. Since the Russians did not care about the accuracy of the content they were creating, they were willing to create sensational false stories that went viral. Hence, more ad space.
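A toy model of those auction mechanics may help, assuming a simplified scoring rule (bid multiplied by predicted engagement). Real platforms weigh many more factors, and the numbers here are invented for illustration.

```python
# Illustrative sketch (not Facebook's actual formula): an auction that
# ranks ads by bid weighted by predicted engagement. A sensational ad
# with a lower bid can beat a sober ad with a higher one.

ads = [
    {"name": "sober_ad",       "bid": 2.00, "predicted_clicks": 0.01},
    {"name": "sensational_ad", "bid": 1.00, "predicted_clicks": 0.05},
]

def effective_bid(ad):
    # Expected revenue to the platform: price offered x engagement.
    return ad["bid"] * ad["predicted_clicks"]

winner = max(ads, key=effective_bid)
print(winner["name"])  # -> "sensational_ad", despite the lower bid
```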

The Russians’ efforts in the 2016 election have been reviewed in previous healthy memory blog posts. The Trump organization itself used the same techniques and spent vastly more on these Facebook ads than the Russians did.

Stengel concludes that these techniques would reduce, but not eliminate, the amount of disinformation in our culture. He writes that disinformation will always be with us, because the problem is not facts, or the lack of them, or misleading stories filled with conjecture; the problem is us (Homo sapiens). There are all kinds of cognitive biases and psychological states, but the truth is that people are going to believe what they want to believe. It would be wonderful if the human brain came with a lie detector, but it doesn’t.

HM urges the reader not to take this conclusion offered by Stengel too seriously. It is true that human information processing is biased, because it needs to be. Our attention is quite limited. But rather than throwing in the towel, we need to deal with our biases as best we can. The suggestions offered by Stengel are useful to this end.

© Douglas Griffith and healthymemory.wordpress.com, 2019. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

The Media

December 7, 2019

The title of this post is identical to the title of one of the fixes needed for the internet proposed in Richard Stengel’s informative work, Information Wars. Stengel writes, “America doesn’t have a ‘fake news’ problem—it has a media literacy problem.”

He continues, “Millions of Americans aren’t able to tell a made-up story from a factual one. Few Americans examine the provenance of the information they get, and many will trust a story from an unknown source as much as one from the New York Times. Moreover, disinformationists have gotten better and better at creating stories and websites that appear legitimate. During the presidential campaign the Russians created sites with names like Denver Guardian, which appeared to be genuine news sites.”

Schools don’t teach media literacy, and they need to. Students need to learn how news organizations work, and how they identify the provenance of information. Stengel notes, “Making journalism a staple of secondary education would go a long way toward solving the ‘fake news’ problem.”

Stengel makes some recommendations that would radically improve online news by using the very technology on which this news is presented. He suggests that online the story should essentially deconstruct itself. Next to the text there should be links to the full transcripts of interviews the reporter did. Those links would also include the URLs of biographies of those in the story. Writers and editors should include links to the primary and secondary sources for the story—all the research—including other news stories, books, video, and scholarly articles. There should be links to other photos or videos that were considered for the story. He would even have a link to the original outline of the story so that the reader could see how it was conceived. The top of each story should feature a digital table of contents that shows each of these aspects of the story. This is a technologically modern and even more open version of what scholars do with footnotes and bibliographies. The basic idea is to show the reader every step of the story and to show how it turned out the way it did.
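Stengel’s self-deconstructing story is, in effect, a data structure. Here is a hedged sketch of how such a story record might be modeled; the field names are assumptions for illustration, not a published standard.

```python
# One way to model Stengel's "self-deconstructing" story: a structured
# record whose populated fields become the digital table of contents.
# Field names are invented for illustration.

story = {
    "headline": "Example Story",
    "body_url": "https://example.com/story",   # the article text itself
    "interview_transcripts": ["..."],          # full transcripts, per interview
    "subject_biographies": ["..."],            # URLs for people in the story
    "primary_sources": ["..."],                # documents, data, scholarship
    "secondary_sources": ["..."],              # other news stories, books, video
    "unused_media": ["..."],                   # photos/videos considered
    "original_outline": "...",                 # how the story was conceived
}

# A reader-facing table of contents is then just the populated fields:
toc = [field for field, value in story.items() if value]
print(toc)
```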

Stengel concludes this section by arguing that news organizations must get rid of online clickbait and so-called content recommendation networks and “Sponsored Stories” that Taboola and Outbrain perch at the bottom of the screen and pretend to be news. He states that their presence at the bottom of the page weakens and undermines the credible journalism above it.

Algorithms, Rating Systems, and Artificial Intelligence

December 6, 2019

The title of this post is identical to the title of one of the fixes needed for the internet proposed in Richard Stengel’s informative work, Information Wars. Currently, the algorithms that decide what story goes to the top of Google’s search results or Facebook’s newsfeed rely in large part on how viral a story is, meaning how often it was linked to or shared by other users. These algorithms correlate popularity with value: the working assumption is that the more popular a story is, the more valuable it is. So a story about a Kardashian quarrel is likely to outrank one about nuclear weapons in Pakistan being insecure. Research shows that stories that are emotional or sensational, which are also the stories more likely to be filled with misinformation, are shared much more widely than less emotional, less sensational ones. Consequently, these algorithms boost deceptive stories over factual ones. This also incentivizes people to create stories that are emotional and misleading, because such stories produce more advertising revenue.
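A stripped-down sketch of the ranking logic Stengel criticizes, with made-up numbers: when the score is driven by shares alone, the sensational story wins.

```python
# Popularity-as-value ranking: rank purely by how widely a story is
# shared. Share counts are invented for illustration.

stories = [
    {"headline": "Kardashian quarrel",              "shares": 90_000},
    {"headline": "Pakistani nukes may be insecure", "shares": 4_000},
]

ranked = sorted(stories, key=lambda s: s["shares"], reverse=True)
print(ranked[0]["headline"])  # -> "Kardashian quarrel"
```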

Currently, the algorithms that do this are black boxes that no one can see into. Platform companies should be compelled to be more transparent about their algorithms. If the companies had to publicly explain their formulas for relevance and importance, people would be able to make intelligent choices about the search engines they use. Wouldn’t you like to know the priorities of the search engine(s) you use?

Stengel notes that there has been a valuable movement toward offering ratings systems for news. These systems allow users to evaluate the trustworthiness of individual stories and the news organizations themselves. A study by the Knight Foundation found that when a news rating tool marked a site as reliable, readers’ belief in its accuracy went up. A negative rating for a story or brand made users less likely to use the information.

The Trust Project posts “Trust Indicators” for news sites, providing details of an organization’s ethics and standards. Slate has a Chrome extension called “This is Fake,” which puts a red banner over content that has been debunked, as well as on sites that are recognized as “serial fabricators.” Factmata is a start-up that is attempting to build a community-driven fact-checking system in which people can correct news articles. Stengel is on the board of advisors of NewsGuard, which labels news sites as trustworthy or not as determined by extensive research and a rigorous set of criteria.

Stengel writes that the greatest potential for detecting and deleting disinformation and “junk news” online is through artificial intelligence and machine learning. This involves using computer systems to perform human tasks such as visual perception, speech recognition, decision-making, and reasoning to detect and then delete false and misleading content. Pattern recognition finds collections of groupings of dubious content. Data-based network analysis can distinguish between online networks that are formed by actual human beings and those that are artificially constructed by bots. Companies can adjust their algorithms to favor human-created networks over artificial ones. The platforms can even offer a predictor, based on sourcing, data, and precedent, as to whether a certain piece of content is likely to be false.
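As a rough illustration of that network analysis: coordinated bot networks tend to post on rigid schedules and repeat near-identical content, while human communities are irregular and varied. The thresholds in this sketch are invented; production systems use far richer features and trained models.

```python
# Heuristic sketch of human-vs-bot network discrimination. The two
# signals and both cutoffs are assumptions for illustration only.

from statistics import pstdev

def looks_artificial(post_intervals_sec, duplicate_ratio):
    """Flag a cluster of accounts as bot-like if posting intervals
    are suspiciously regular and most content is near-duplicate."""
    regular = pstdev(post_intervals_sec) < 5.0   # near-clockwork timing
    repetitive = duplicate_ratio > 0.8           # mostly copied content
    return regular and repetitive

# A human community posts irregularly with varied content...
print(looks_artificial([40, 310, 95, 1200, 15], duplicate_ratio=0.1))  # False
# ...a bot farm posts every ~60 seconds with copied text.
print(looks_artificial([60, 61, 59, 60, 60], duplicate_ratio=0.95))    # True
```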

Of course, the bad guys can use it too. Stengel writes, “they are clearly developing their own systems to understand how their target audiences behave online and how to tailor disinformation for them so that they will share it.” Platforms can help advertisers and companies find and reach their best audiences, and this works for bad guys as well as good. Platforms have to work to stay one step ahead of the disinformationists by developing more nuanced AI systems to protect their users from disinformation and information they do not want.

Privacy and Elections

December 5, 2019

The title of this post is identical to the title of one of the fixes needed for the internet proposed in Richard Stengel’s informative work, Information Wars. If your private information is protected, you are less likely to be targeted by deceptive information and disinformation.

Stengel thinks that an online privacy bill of rights is a good idea. One thing that needs to be mandatory in any digital bill of rights: the requirement that platforms obtain consent from users to share or sell their information and notify users about the collection of their data. This is the absolute minimum.

Regarding elections, platforms need to alert people when a third party uses their private information for online advertising. Political campaigns are highly sophisticated in their ability to use your consumer information to target you with advertising. If they know what movies you like, what shoes you buy, and what books you read, they know what kind of campaign advertising you will be receptive to. At the same time, advertisers must give users the ability to opt out of any content they receive.

The following fix could happen quickly: treat digital and online campaign advertising with the same strictness and rigor as television, radio, and print advertising. Currently, the Federal Election Commission does not do this. Television and radio stations, as well as newspapers, must disclose the source of all political advertising and who is paying for it. This is not true for digital advertising. The Honest Ads Act, which was introduced by the late Senator John McCain, Senator Amy Klobuchar, and Senator Mark Warner, is an attempt to solve the problem of hidden disinformation campaigns by creating a disclosure system for online political advertising. It would require online platforms to obtain and disclose information on who is buying political advertising, as well as who the targeted audience is for the ads. It would also require platform companies to disclose whether foreign agents are paying for the ads. Platform companies would also be responsible for identifying bots so that voters know whether they are being targeted by machines or actual human beings. Stengel writes that all of this is both necessary and the absolute minimum.
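To make the Act’s disclosure requirements concrete, here is a hedged sketch of what a per-ad disclosure record might contain. The Act specifies categories of information, not a data format; the field names below are assumptions for illustration.

```python
# Hypothetical disclosure record for one online political ad,
# covering the categories the Honest Ads Act contemplates.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class PoliticalAdDisclosure:
    buyer: str                 # who bought the ad
    payer: str                 # who is paying for it
    targeted_audience: str     # how the audience was selected
    foreign_agent: bool        # is a foreign agent paying?
    delivered_by_bot: bool     # machine or human distribution?
    disclosed_at: datetime     # disclosure timestamp

ad = PoliticalAdDisclosure(
    buyer="Example PAC",
    payer="Example Donor LLC",
    targeted_audience="Registered voters, ages 40-65, swing counties",
    foreign_agent=False,
    delivered_by_bot=False,
    disclosed_at=datetime.now(),   # in real time, not a year later
)
print(ad)
```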

For this regulation to be effective, it must also be done in real time during campaigns. Currently, according to the Federal Election Commission, political campaigns do not have to disclose their ad buys until a year after the fact. This is absurd. People need to know if they are being fed disinformation and falsehoods, and to know this in a timely way so they can factor it into their decision-making. Immediacy is more important during political campaigns than at any other time. Finding out a year later that you were targeted with a false ad by a bot that influenced your vote is worse than useless.

Congress also needs to designate state and local election systems as national critical infrastructure. This would give the federal government broader powers to intervene in a crisis. The Obama administration tried to do this, but the Republican majority in Congress voted it down. Stengel writes, “This is an essential change, and should be a bi-partisan issue.”

Section 230

December 4, 2019

The title of this post is identical to the title of one of the fixes needed for the internet proposed in Richard Stengel’s informative work, Information Wars. New legislation is needed to create an information environment that is more transparent, more consumer-focused, and makes the creators and purveyors of disinformation more accountable. Stengel calls the subject of this section legislation’s original sin: the Communications Decency Act (CDA) of 1996. The CDA was one of the first attempts by Congress to regulate the internet. Section 230 of this act says that online platforms and their users are not considered publishers and have immunity from being sued for the content they post. It reads, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Congress’s motivation back in 1996 was not so much to shield these new platforms as to protect free speech. Congress didn’t want the government to police these platforms and thereby potentially restrict freedom of speech—it wanted the platforms to police themselves. Congress worried that if the platforms were considered publishers, they would be too draconian in policing their content and put a chill on the creation of content by third parties. The courts had suggested that if a platform exercised editorial control by removing offensive language, that made it a publisher and therefore liable for the content on its site. The idea of Section 230 was to give these companies a “safe harbor” to screen harmful content. The reasoning was that if they received a general immunity, they would be freer to remove antisocial content that violated their terms of service without violating constitutional free speech provisions.

Focus on the year this act was passed. This was the era of America Online, CompuServe, Netscape, Yahoo and Prodigy. That was a different world, and there was no way to anticipate the problems brought by Facebook. Stengel notes that Facebook is not like the old AT&T. Facebook makes money off the content it hosts and distributes; it just calls it “sharing.” Facebook makes the same amount of ad revenue from shared content that is false as from shared content that is true. Note that this problem is not unique to Facebook, but perhaps Facebook is the most prominent example.

Stengel continues, “If Section 230 was meant to encourage platforms to limit content that is false or misleading, it’s failed. No traditional publisher could survive if it put out the false and untrue content that these platforms do. It would be constantly sued. The law must incentivize the platform companies to be proactive and accountable in fighting disinformation. Demonstrably false information needs to be removed from the platforms. And that’s just the beginning.”

Stengel concludes this section as follows: “But let’s be realistic. The companies will fight tooth and nail to keep their immunity. So, revising Section 230 must encourage them to make good-faith efforts to police their content, without making them responsible for every phrase or sentence on their services. It’s unrealistic to expect these platforms to vet every tweet or post. One way to do this is to revise the language of the CDA to say that no platform that makes a good-faith effort to fulfill its responsibility to delete harmful content and provide information to users about that content can be liable for the damage that it does. It’s a start.”

What to Do About Disinformation

December 3, 2019

The title of this post is identical to the title of a section in Richard Stengel’s informative work, Information Wars. The book provides information not only about the State Department, but also about the actions Rick Stengel took in performing his job. But the most useful part of the book is this section, What to Do About Disinformation. Several posts are needed here, and even then, they cannot do justice to the information provided in the book.

The Library of Congress, created in 1800, today holds some 39 million books. The internet generates 100 times that much data every second. Information definitely is the most important asset of the 21st century. Polls show that people feel bewildered by the proliferation of online news and data. Mixed in with this daily tsunami there is a lot of information that is false as well as true.

Disinformation undermines democracy because democracy depends on the free flow of information. That’s how we make decisions. Disinformation undermines the integrity of our choices. According to the Declaration of Independence “Governments are instituted among Men, deriving their just powers from the consent of the governed.” If that consent is acquired through deception, the powers from it are not just. Stengel states that it is an attack on the very heart of our democracy.

Disinformation is not news that is simply wrong or incorrect. It is information that is deliberately false in order to manipulate and mislead people.

Definitions of important terms follow:
Disinformation: The deliberate creation and distribution of information that is false and deceptive in order to mislead an audience.
Misinformation: Information that is false, though not deliberately; that is created inadvertently or by mistake.
Propaganda: Information that may or may not be true that is designed to engender support for a political view or ideology.

“Fake news” is a term Donald Trump uses to describe any content he does not like. But Trump did not originate the term. The term was familiar to Lenin and Stalin and almost every other dictator of the last century. Russians were calling Western media fake news before Trump, and Trump in his admiration of Russia followed suit. Stengel prefers the term “junk news” to describe information that is false, cheap, and misleading that has been created without regard for its truthfulness.

Most people regard “propaganda” as pejorative, but Stengel believes that it is—or should be—morally neutral. Propaganda can be used for good or ill. Advertising is a form of propaganda. What the United States Information Agency did during the Cold War was a form of propaganda. Advocating for something you believe in can be defined as propaganda. Stengel writes that while propaganda is a misdemeanor, disinformation is a felony.

Disinformation is often a mixture of truth and falsity. It doesn’t have to be 100% false to be disinformation. Stengel writes that the most effective forms of disinformation are a mixture of true and false information.

Stengel writes that when he was a journalist he was close to being a First Amendment absolutist. But he has changed his mind. He writes that in America the standard for protected speech has evolved since Holmes’s line about “falsely shouting fire in a theater.” In Brandenburg v. Ohio, the court ruled that speech that led to or directly caused violence was not protected by the First Amendment.

Stengel writes that even outlawing hate speech will not solve the problem of disinformation. He writes that government may not be the answer, but it has a role. He thinks that stricter government regulation of social media can incentivize the creation of fact-based content and disincentivize the creation of disinformation. Currently the big social media platforms optimize for content that has greater engagement and virality, and such content can sometimes be disinformation or misinformation. Stengel thinks that these incentives can be changed in part through regulation and in part through more informed user choices.
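One concrete way such regulation or informed user choice could shift the incentive: weight engagement by a source-reliability score, such as a NewsGuard-style rating. This sketch uses invented weights and is an illustration of the idea, not any platform’s actual ranking.

```python
# Reliability-weighted ranking: engagement still matters, but
# unreliable sources are discounted. All numbers are made up.

stories = [
    {"headline": "Viral junk",     "shares": 80_000, "reliability": 0.1},
    {"headline": "Factual report", "shares": 10_000, "reliability": 0.9},
]

def adjusted_score(story):
    return story["shares"] * story["reliability"]

ranked = sorted(stories, key=adjusted_score, reverse=True)
print(ranked[0]["headline"])  # -> "Factual report" (9000 beats 8000)
```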

What Stengel finds most disturbing is that disinformation is being spread in a way and through means that erode trust in public discourses and democratic processes. This is precisely what these bad actors want to accomplish. They don’t necessarily want you to believe them—they don’t want you to believe anybody.

As has been described in previous healthy memory blog posts, the creators of disinformation use all the legal tools on social media platforms that are designed to deliver targeted messages to specific audiences. These are the same tools—behavioral data analysis, audience segmentation, programmatic ad buying—that make advertising campaigns effective. The Internet Research Agency in St. Petersburg, Russia, uses the same behavioral data and machine-learning algorithms that Coca-Cola and Nike use.

All the big platforms depend on the harvesting and use of personal information. Our data is the currency of the digital economy. The business model of Google, Facebook, Amazon, Microsoft, and Apple, among others, depends on the collection and use of personal information. They use this information to show targeted advertising. They collect information on where you go, what you do, whom you know, and what you want to know about, so they can sell that information to advertisers.

The important question is, who owns this information? These businesses argue that because they collect, aggregate, and analyze our data, they own it. In the U.S., the law agrees. But in Europe, according to the EU’s General Data Protection Regulation, people own their own information. Stengel and HM agree that this is the correct model. America needs a digital bill of rights that protects everyone’s information as a new social contract.

Stengel’s concluding paragraph is “I’m advocating a mixture of remedies that optimize transparency, accountability, privacy, self-regulation, data protection, and information literacy. That can collectively reduce the creation, dissemination, and consumption of false information. I believe that artificial intelligence and machine learning can be enormously effective in combating falsehood and disinformation. They are necessary but insufficient. All three efforts should be—to use one of the military’s favorite terms—mutually reinforcing.”

What Has Happened to the Global Engagement Center?

December 2, 2019

This post is a follow-up to the post titled “Information Wars.” A few weeks before the end of the Obama administration, Congress codified the Global Engagement Center (GEC) into law in the 2017 National Defense Authorization Act. Its mission was to “lead, synchronize, and coordinate efforts of the Federal Government to recognize, understand, expose, and counter foreign state and non-state propaganda and disinformation efforts aimed at undermining United States national security interests.”

Secretary of State Rex Tillerson did not request the money for more than a year. When he finally did make the request, a year and a half into the administration, he asked for only half of the $80 million that Congress had authorized. The sponsors of the legislation, Senators Portman and Murphy, said this delay was “indefensible” at a time when “ISIS is spreading terrorist propaganda and Russia is implementing a sophisticated disinformation campaign to undermine the United States and our allies.”

The GEC is no longer in the business of creating content to counter disinformation. It has become an entity that uses data science and analytics to measure and better understand disinformation. Over the past two years, a steady stream of people has quit or retired and the GEC has had a hard time hiring replacements.

What is especially worrisome is hearing Republicans, in debates with Democrats, arguing the talking points of Russian disinformation—for example, that Ukraine was involved in disrupting the 2016 election. One has to think that true Republicans like Ronald Reagan and John McCain are thrashing about in their graves.

© Douglas Griffith and healthymemory.wordpress.com, 2019. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Information Wars

December 1, 2019

The title of this post is identical to the title of an informative book by Richard Stengel, a former editor of Time magazine. During the second term of the Obama administration he was appointed and confirmed as the Under Secretary for Public Diplomacy. The book provides a detailed and interesting description of the organization and workings of the State Department.

Stengel was appointed to lead the Center for Strategic Counterterrorism Communications, which was integral to the Global Engagement Center. This is important because information warfare is the primary means by which terrorist organizations fight. Their campaigns were punctuated by despicable terrorist acts, but the primary messaging was done using the internet. Effective counter-messaging needed to be developed to counter the messaging of the terrorists.

Although ISIS and al-Qaeda are currently recognized as the primary terrorist organizations, it is important not to overlook the largest and most threatening terrorist organization, Russia. Our term “disinformation” is in fact an adaptation of the Russian word dezinformatsiya, which was the KGB term for black propaganda. The modern Russian notion of hybrid warfare comes from what is called the Gerasimov model. Gerasimov has appeared in previous healthy memory blog posts. He is the father of the idea that in the 21st century only a small part of war is kinetic. He has written that modern warfare is nonlinear with no clear boundary between military and nonmilitary campaigns. The Russians, like ISIS, merged their military lines of effort with their information and messaging line of effort.

In the old days, disinformation involved placing a false story (often involving forged documents) in a fairly obscure left-leaning newspaper in a country like India or Brazil; then the story was picked up and echoed in Russian state media. A well-known example of dezinformatsiya is the campaign, begun in the 1980s, that suggested that the U.S. had invented the AIDS virus as a kind of “ethnic bomb” to wipe out people of color.

Two other theorists of Russian information warfare are Igor Panarin, an academic and former KGB officer, and Alexander Dugin, a philosopher who has been called “Putin’s Rasputin.” Panarin sees Russia as the victim of information aggression by the United States. He believes there is a global information war between what he calls the Atlantic world, led by the U.S. and Europe, and the Eurasian world, led by Russia.

Dugin has a particularly Russian version of history. He says that the 20th century was a titanic struggle among fascism, communism, and liberalism, in which liberalism won out. He thinks that in the 21st century there will be a fourth way: Western liberalism will be replaced by a conservative superstate like Russia leading a multipolar world and defending tradition and conservative values. He predicts the rise of conservative strongmen in the West who will embrace these values. Dugin supports the rise of conservative right-wing groups all across Europe. He has formed relationships with white nationalist groups in America. Dugin believes immigration and racial mixing are polluting the Caucasian world. He regards rolling back immigration as one of the key tasks for conservative states. Dugin says that all truth is relative and a question of belief; that freedom and democracy are not universal values but peculiarly Western ones; and that the U.S. must be dislodged as a hyperpower through the destabilization of American democracy and the encouragement of American isolationism.

Dugin says that the Russians are better at messaging than anyone, and that they’ve been working on it as a part of conventional warfare since Lenin. So the Russians have been thinking and writing about information war for decades. It is embedded in their military doctrines.

Perhaps one of the best examples of Russia’s prowess at information warfare is Russia Today (RT). During HM’s working days his job provided the opportunity to view RT over an extensive period of time. What is most remarkable about RT is that it bears no obvious resemblance to information warfare or propaganda. It appears to be as innocuous as CNN. However, after long viewing one realizes that one is being drawn to accept the objectives of Russian information warfare.

Stengel notes that Russian propaganda taps into all the modern cognitive biases that social scientists write about: projection, mirroring, anchoring, confirmation bias. Stengel and his staff put together their own guide to Russian propaganda and disinformation, with examples.

*Accuse your adversary of exactly what you’re doing.
*Plant false flags.
*Use your adversary’s accusations against him.
*Blame America for everything!
*America blames Russia for everything!
*Repeat, repeat, repeat.

Stengel writes that what is interesting about this list is that it also seems to describe Donald Trump’s messaging tactics. He asks whether this is a coincidence, or some kind of mirroring?

Recent events have answered this question. The acceptance of the alternative reality that Ukraine has a secret server and was the source of the 2016 election interference is Putin’s narrative, developed by Russian propaganda. Remember that Putin was once a KGB agent. His ultimate success here is the acceptance of this propaganda by the Republican Party. There is an information war within the US that the US is losing.

© Douglas Griffith and healthymemory.wordpress.com, 2019. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.