Posts Tagged ‘digital bots’

The Effects Facebook Has on Users

April 3, 2019

This is the seventh post based on an important book by Roger McNamee titled “Zucked: Waking Up to the Facebook Catastrophe.” Roger writes, “It turns out that connecting 2.2 billion people on a single network does not naturally produce happiness at all. It puts pressure on users, first to present a desirable image, then to command attention in the form of Likes or shares from others. In such an environment, the loudest voices dominate.” This can be intimidating. Consequently, we follow the human tendency to organize into clusters and tribes. This begins with people who share our beliefs. Most often this consists of family, friends, and Facebook Groups to which we belong. Facebook’s News Feed encourages every user to surround him- or herself with like-minded people. Notionally, Facebook allows us to extend our friends network to include a highly diverse community, but many users stop following people with whom they disagree. It usually feels good to cut off someone who provokes us, and lots of people do so. Consequently, friends lists become more homogeneous over time. Facebook amplifies this effect with its approach to curating the News Feed. Roger writes, “When content is coming from like-minded family, friends, or Groups, we tend to relax our vigilance, which is one of the reasons why disinformation spreads so effectively on Facebook.”

An unfortunate by-product of giving users what they want is filter bubbles. And there is a high correlation between the presence of filter bubbles and polarization. Roger writes, “I am not suggesting that filter bubbles create polarization, but I believe they have a negative impact on public discourse and politics because filter bubbles isolate the people stuck in them. Filter bubbles exist outside Facebook and Google, but gains in attention for Facebook and Google are increasing the influence of their filter bubbles relative to others.”

Although practically everyone on Facebook has friends and family, many users also are members of Groups. Facebook allows Groups on just about anything, including hobbies, entertainment, teams, communities, churches, and celebrities. Many Groups are devoted to politics, and they cross the full spectrum. Groups enable easy targeting by advertisers, so Facebook loves them. And bad actors like them for the same reason. Cass Sunstein, who was the administrator of the White House Office of Information and Regulatory Affairs during the first Obama administration, conducted research indicating that when like-minded people discuss issues, their views tend to become more extreme over time. Jonathan Morgan of Data for Democracy has found that as few as 1 to 2 percent of a group can steer the conversation if they are well coordinated. Roger writes, “That means a human troll with a small army of digital bots—software robots—can control a large, emotional Group, which is what the Russians did when they persuaded Groups on opposite sides of the same issue—like pro-Muslim groups and anti-Muslim groups—to simultaneously host Facebook events in the same place at the same time hoping for a confrontation.”

Roger notes that Facebook asserts that users control their experience by picking the friends and sources that populate their News Feed, when in reality artificial intelligence, algorithms, and menus created by Facebook engineers control every aspect of that experience. Roger continues, “With nearly as many monthly users as there are notional Christians in the world, and nearly as many daily users as there are notional Muslims, Facebook cannot pretend its business model does not have a profound effect. Facebook’s notion that a platform with more than two billion users can and should police itself also seems both naive and self-serving, especially given the now plentiful evidence to the contrary. Even if it were ‘just a platform,’ Facebook has a responsibility for protecting users from harm. Deflection of responsibility has serious consequences.”