Amplifying the Worst Social Behavior

April 4, 2019

This is the eighth post based on an important book by Roger McNamee titled “Zucked: Waking Up to the Facebook Catastrophe.” Roger writes, “The competition for attention across the media and technology spectrum rewards the worst social behavior. Extreme views attract more attention, so platforms recommend them. News Feeds with filter bubbles do better at holding attention than News Feeds that don’t have them. If the worst thing that happened with filter bubbles was that they reinforced preexisting beliefs, they would be no worse than many other things in society. Unfortunately, people in a filter bubble become increasingly tribal, isolated, and extreme. They seek out people and ideas that make them comfortable.”

Roger continues, “Social media has enabled personal views that had previously been kept in check by social pressure—white nationalism is an example—to find an outlet.” This leads one to ask whether Trump would have been elected via the Electoral College if it weren’t for social media. Trump’s base consists of Nazis and white supremacists and constitutes more than a third of the citizenry. Prior to the election, HM would never have believed that this was the case. Now he believes it and is close to being clinically depressed.

Continuing on, “Before the platforms arrived, extreme views were often moderated because it was hard for adherents to find one another. Expressing extreme views in the real world can lead to social stigma, which also keeps them in check. By enabling anonymity and/or private Groups, the platforms removed the stigma, enabling like-minded people, including extremists, to find one another, communicate, and, eventually, to lose the fear of social stigma.”

Once a person identifies with an extreme position on an internet platform, that person will be subject to both filter bubbles and human nature. There are two types of bubbles. Filter bubbles are imposed by others, whereas a preference bubble is a choice, although the user might be unaware of making it. By definition, a preference bubble takes users to a bad place, and they may not even be conscious of the change. Both filter bubbles and preference bubbles increase time on site, which is a driver of revenue. Roger notes that in a preference bubble, users create an alternative reality, built around values shared with a tribe, which can focus on politics, religion, or something else. “They stop interacting with people with whom they disagree, reinforcing the power of the bubble. They go to war against any threat to their bubble, which for some users means going to war against democracy and legal norms. They disregard expertise in favor of voices from their tribe. They refuse to accept uncomfortable facts, even ones that are incontrovertible. This is how a large minority of Americans abandoned newspapers in favor of talk radio and websites that peddle conspiracy theories. Filter bubbles and preference bubbles undermine democracy by eliminating the last vestiges of common ground among a huge percentage of Americans. The tribe is all that matters, and anything that advances the tribe is legitimate. You see this effect today among people whose embrace of Donald Trump has required them to abandon beliefs they held deeply only a few years earlier. Once again, this is a problem that internet platforms did not invent. Existing issues in society created a business opportunity that platforms exploited. They created a feedback loop that reinforces and amplifies ideas with a speed and at a scale that are unprecedented.”
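To make that feedback loop concrete, here is a toy simulation in Python. Everything in it (the “extremity” scores, the comfort window, the update rule) is invented for illustration; real platform rankers are vastly more complex. The point is only to show the dynamic Roger describes: a feed that serves the most attention-holding item a user will tolerate walks that user, step by step, away from the center.

    import random

    random.seed(1)

    # Toy model: every item has an invented "extremity" score in [0, 1].
    items = [{"id": i, "extremity": random.random()} for i in range(1000)]

    def recommend(user_leaning):
        # The user only engages with items near their current leaning...
        pool = [it for it in items
                if abs(it["extremity"] - user_leaning) < 0.15]
        # ...and an attention-ranked feed favors the most extreme
        # item inside that comfortable range.
        return max(pool, key=lambda it: it["extremity"]) if pool else None

    user_leaning = 0.5  # start the user at the center
    for step in range(15):
        shown = recommend(user_leaning)
        if shown is None:
            break
        # Exposure nudges the user toward what was shown: the loop closes.
        user_leaning += 0.5 * (shown["extremity"] - user_leaning)
        print(step, round(user_leaning, 3))

Run it and the printed leaning climbs steadily from 0.5 toward 1.0. Nobody in the simulation chose extremism; the cycle of “show what holds attention, then re-center on what was shown” produces it anyway.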

Clint Watts, in his book “Messing with the Enemy,” makes the case that in a preference bubble, facts and expertise can be the core of a hostile system, an enemy that must be defeated. “Whoever gets the most likes is in charge; whoever gets the most shares is an expert. Preference bubbles, once they’ve destroyed the core, seek to use their preference to create a core more to their liking, specially selecting information, sources, and experts that support their alternative reality rather than the real physical world.” Roger writes, “The shared values that form the foundation of our democracy proved to be powerless against the preference bubbles that have evolved over the past decade. Facebook does not create preference bubbles, but it is the ideal incubator for them. The algorithms ensure that users who like one piece of disinformation will be fed more disinformation. Fed enough disinformation, users will eventually wind up first in a filter bubble and then in a preference bubble. If you are a bad actor and you want to manipulate people in a preference bubble, all you have to do is infiltrate the tribe, deploy the appropriate dog whistles, and you are good to go. That is what the Russians did in 2016 and what many are doing now.”
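The claim that liking one piece of disinformation leads to being fed more of it is, at bottom, a similarity ranking. Here is a minimal sketch of that mechanism (the item names, topic vectors, and scoring below are invented for illustration; this is not Facebook’s actual system): each item is reduced to a vector, and the feed serves the unseen items closest to whatever the user already liked.

    from math import dist

    # Hypothetical catalog: item id -> invented 2-D topic vector.
    catalog = {
        "vaccine-myth-1":    (0.90, 0.10),
        "vaccine-myth-2":    (0.85, 0.15),
        "local-news":        (0.20, 0.80),
        "science-explainer": (0.10, 0.90),
    }

    def next_feed(liked_ids, k=2):
        # Average the vectors of everything the user liked...
        liked = [catalog[i] for i in liked_ids]
        center = tuple(sum(c) / len(liked) for c in zip(*liked))
        # ...then serve the k unseen items nearest that average.
        unseen = [i for i in catalog if i not in liked_ids]
        return sorted(unseen, key=lambda i: dist(catalog[i], center))[:k]

    print(next_feed(["vaccine-myth-1"]))
    # -> ['vaccine-myth-2', 'local-news']

One liked myth pulls its near-duplicate straight to the top of the feed, which is exactly the seeding step of the filter bubble described above.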