It Gets Even Worse

This is the ninth post based on an important book by Roger McNamee titled “Zucked: Waking Up to the Facebook Catastrophe.” This post picks up where the immediately preceding post, “Amplifying the Worse Social Behavior,” left off. Users sometimes adopt an idea suggested by Facebook, or by others on Facebook, as their own. For example, if someone is active in a Facebook Group associated with a conspiracy theory and then stops using the platform for a time, Facebook will do something surprising when they return. It might suggest other conspiracy theory Groups to join because they share members with the first conspiracy Group. Because conspiracy theory Groups are highly engaging, they are likely to encourage reengagement with the platform. If you join the Group, the choice appears to be yours, but the reality is that Facebook planted the seed. It did so because conspiracy theories are good for Facebook, not for you.

Research indicates that people who accept one conspiracy theory have a high likelihood of accepting a second one. The same is true of inflammatory disinformation. Roger accepts the fact that Facebook, YouTube, and Twitter have created systems that modify user behavior. Roger writes, “They should have realized that global scale would have an impact on the way people use their products and would raise the stakes for society. They should have anticipated violations of their terms of service and taken steps to prevent them. Once made aware of the interference, they should have cooperated with investigators. I could no longer pretend that Facebook was a victim. I cannot overstate my disappointment. The situation was much worse than I realized.”

Apparently, the people at Facebook live in their own preference bubble. Roger writes, “Convinced of the nobility of their mission, Zuck and his employees reject criticism. They respond to every problem with the same approach that created the problem in the first place: more AI, more code, more short-term fixes. They do not do this because they are bad people. They do this because success has warped their perception of reality. To them, connecting 2.2 billion people is so obviously a good thing, and continued growth so important, that they cannot imagine that the problems that have resulted could be in any way linked to their designs or business decisions. As a result, when confronted with evidence that disinformation and fake news spread over Facebook influenced the Brexit referendum and the election of Putin’s choice in the United States, Facebook took steps that spoke volumes about the company’s worldview. They demoted publishers in favor of family, friends, and Groups on the theory that information from those sources would be more trustworthy. The problem is that family, friends, and Groups are the foundational elements of filter and preference bubbles. Whether by design or by accident, they share the very disinformation and fake news that Facebook should suppress.”
