The Role of Artificial Intelligence

March 31, 2019

This is the fourth post based on an important book by Roger McNamee titled “Zucked: Waking Up to the Facebook Catastrophe.” Companies like Facebook and Google use artificial intelligence (AI) to build behavioral prediction engines that anticipate our thoughts and emotions based on patterns found in the vast amount of data they have accumulated about users. Users’ likes, posts, shares, comments, and Groups have taught Facebook’s AI how to monopolize our attention. As a result, Facebook can offer advertisers exceptionally high-quality targeting.

This battle for attention requires constant innovation. In the early days of the internet, the industry learned that users adapt to predictable ad layouts, skipping over them without registering any of the content. There is a tradeoff when it comes to online ads: although it is easy to verify that the right person is seeing an ad, it is much harder to ensure that the person is paying attention to it. The platforms’ solution to the latter problem is to maximize the time users spend on the platform. If users devote only a small percentage of their attention to the ads they see, then the platforms try to monopolize as much of that attention as possible. So Facebook and other platforms add new content formats and products to stimulate more engagement. Text was enough at the outset. Next came photos, then mobile. Video is the current frontier. Facebook also introduces new products such as Messenger and, soon, dating. To maximize profits, Facebook and other platforms hide the data on the effectiveness of ads.

Platforms prevent traditional auditing practices by providing less-than-industry-standard visibility. Consequently advertisers say, “I know half my ad spending is wasted; I just don’t know which half.” Nevertheless, platform ads work well enough that advertisers generally spend more every year. Search ads on Google offer the clearest payback; brand ads on other platforms are much harder to measure. But advertisers need to put their message in front of prospective customers, regardless of where those customers are. When users gravitate from traditional media to the internet, the ad dollars follow them. Platforms do whatever they can to maximize daily users’ time on site.

As is known from psychology and persuasive technology, unpredictable, variable rewards stimulate behavioral addiction. Like buttons, tagging, and notifications trigger social validation loops. So users do not stand a chance. We humans have evolved a common set of responses to certain stimuli that can be exploited by technology. “Fight or flight” is one example. When presented with visual stimuli such as vivid colors (red is a trigger color), or a vibration against the skin near our pocket that signals a possibly enticing reward, the body responds in predictable ways, such as a faster heartbeat and the release of dopamine. These are meant to be momentary responses that increase the odds of survival in a life-or-death situation. Too much of this kind of stimulation is bad for all humans, but these effects are especially dangerous in children and adolescents. The first consequences include lower sleep quality, an increase in stress, anxiety, depression, an inability to concentrate, irritability, and insomnia. Some develop a fear of being separated from their phone.
Many users develop problems relating to and interacting with people. Children get hooked on games, texting, Instagram, and Snapchat in ways that change the nature of human experience. Cyberbullying becomes easy over social media because when technology mediates human relationships, the social cues and feedback loops that might normally cause a bully to experience shunning or disgust from their peers are absent.

Adults get locked into filter bubbles. Wikipedia defines a filter bubble as “a state of intellectual isolation that can result from personalized searches when a website algorithm selectively guesses what information a user would like to see.”