Posts Tagged ‘ISIS’

From Preference Bubbles to Social Inception:

December 18, 2019

The title of this post is identical to half of a title in Messing with the Enemy, an excellent book by Clint Watts. The second half of the title is "The Future of Influence." In previous posts HM has mentioned the tremendous optimism regarding the internet expressed in this blog when it began in 2009. Physical boundaries no longer mattered. People passionate about chess, cancer research, or their favorite television shows could find like-minded enthusiasts around the world wanting to share their thoughts and experiences. Those under oppressive regimes, denied access to information and the outside world, could leverage the web's anonymity to build connections, share their experiences, and hope for a better world, either at home or elsewhere. All these sources of knowledge became widely available to those with growth mindsets.

Unfortunately, hackers and cybercriminals were some of the first actors to exploit the internet in pursuit of money and fame. Hate groups and terrorists found the internet an anonymous playground for connecting with like-minded people. There might be only a handful of extremists, or possibly only one, in any given town, but with the internet there were now hundreds and even thousands of extremists, who used internet connections to facilitate the physical massing of terrorists in global safe havens or remote compounds.

The internet provided a virtual safe haven for bin Laden's al-Qaeda, allowing a small minority of Muslims inclined to jihadi extremism to connect with like-minded supporters. As counterterrorists searched the earth for al-Qaeda's head shed, the internet provided enough cover, capacity and space for the terror group to survive physically by thriving virtually. Watts writes, "This made al-Qaeda bigger, but not necessarily better—more diffuse and elusive, but vulnerable to fissures and difficult to manage."

Watts writes, "My experiences with the crowd—watching the mobs that toppled dictators during the Arab Spring, the hordes that joined ISIS, the counterterrorism punditry that missed the rise of ISIS, and the political swarms duped by Russia in the 2016 presidential election—led me to believe that crowds are increasingly dumb, driven by ideology, desire, ambition, fear, and hatred, or what might collectively be referred to as 'preferences.'"

Social media amplifies confirmation bias through the sheer volume of content provided, assessed, and shared. This amplification is compounded by users' interactions with their friends, family, and neighbors—people who, more often than not, think like they do, speak like they do, and look like they do.

Watts writes, "Confirmation bias and implicit bias working together pull social media users into digital tribes. Individuals sacrifice their individual responsibility and initiative to the strongest voices in their preferred crowd. The digital tribe makes collective decisions based on groupthink, blocking out alternative viewpoints, new information, and ideas. Digital tribes stratify over time into political, social, religious, ethnic, and economic enclaves. Status quo bias, a preference for the current state of affairs over a change, sets into these digital tribes, such that members must mute dissent or face expulsion from the group. Confirmation, implicit, and status quo bias, on a grand social media scale, harden preference bubbles. These three world-changing phenomena build upon one another to power the disruptive content bringing about the Islamic State and now shaking Western democracies."

Watts continues, "Clickbait populism—the promotion of popular content, opinions, and the personas that voice them—now sets the agenda and establishes the parameters for terrorism, governance, policy direction, and our future. Audiences collectively like and retweet that which conforms to their preferences. To win the crowd, leaders, candidates, and companies must play to these collective preferences."

This clickbait populism drives another critical emerging current: social media nationalism. As social media access increases and virtual bonds accelerate each year, digital nations increasingly form around online communities whose users have shared preferences.

Watts writes, "Social media nationalism and clickbait populism have led to a third phenomenon that undermines the intelligence of crowds, threatening the advancement of humanity and the unity of democracies: the death of expertise." Expertise is undermined by those on the internet who ignore facts and construct alternative realities.

Consider two preference bubbles: the ISIS boys and Trump supporters. For the ISIS boys, it was more important to have a caliphate than to do it right, more essential to pursue extreme violence than to effectively govern.

For Trump supporters, it is more important to win than to be correct, more important to be tough than to compromise and move forward. They appear to be living in an alternative reality that disdains factual information. The Republican Party can be regarded as one big preference bubble. To be fair, one might argue that the Democratic Party should also be regarded as a preference bubble, but within it one does not find the unanimity created in a true preference bubble.

Harmony, Disharmony, and the Power of Secrets

December 15, 2019

The title of this post is identical to the title of a chapter in Messing with the Enemy, an excellent book by Clint Watts. In 2002, CIA director George Tenet created the Harmony database as the intelligence community's central repository for all information gathered during counterterrorism operations. This database served as a single place for bringing together information for the conduct of the emerging field of DOMINEX (document and media exploitation). At first, the Harmony database assisted soldiers in picking up clues about enemy whereabouts and communications from many different battlefields and helped support the prosecution of alleged terrorists.

A Major Steve saw al-Qaeda's secrets from a different perspective. He focused on the strategic deliberations of terrorists, their biases and preferences, expense reports, likes and dislikes, and successes and failures, as well as what they thought of one another. In sum, these documents yielded insights into the group's strategic weaknesses and internal fractures.

Major Steve moved to the Combating Terrorism Center (CTC) at West Point, which offered an interface for the military and government to connect with top experts in the cultures, regions, languages, and politics challenging effective counterterrorism operations. Major Steve could unlock the Harmony database's secrets, create an open-source repository for the public, and enlist highly educated military officers stationed at West Point to study and collaborate with top professors around the world. In 2005, the CTC launched the Harmony Program "to contextualize the inner-functioning of al-Qaeda, its associated movement, and other security threats through primary source documents." In addition to conducting initial research on the materials, the program aimed "to make these sources, which are captured in the course of operations in Iraq, Afghanistan and other theaters, available to other scholars for further study."

The first study was titled Harmony and Disharmony: Exploiting al-Qaeda's Organizational Vulnerabilities. The study reviewed employee contracts, which showed that Arab recruits were paid more than African recruits, and that married volunteers with children received more benefits and vacation than single members. The report recommended that ineffective terrorists, rather than being plucked off the battlefield, should be left in the network if they can be reliably observed, even if they present easy targets. The report's justification for this recommendation was pulled from a 1999 email sent by Ayman al-Zawahiri to a Yemeni cell leader in which he scolded a subordinate, saying, "With all due respect, this is not an accounting. It's a summary accounting. For example, you didn't write any date, and many of the items are vague." Watts writes, "Nearly twenty years later, Zawahiri's letter offers some insights into why terrorists in the ranks sought to defect to ISIS after bin Laden's death: he was a stickler of a boss."

The key recommendation from the report follows: "increase internal dissension within al-Qaeda's leadership." Communiqués between al-Qaeda subordinates challenged the direction put out by the group's leaders and questioned whether orders should be obeyed. One member said that faulty leadership held the group back, asserting that bin Laden had rushed "to move without visions," and asked Khalid Sheikh Mohammed, mastermind of the 9/11 attacks, to reject bin Laden's orders.

Another study using the Harmony Database found that al-Qaeda, as a military organization, had never been particularly strong, and its success as a media organization masked deep internal divides between its leaders over strategic direction.

The Russians recognized that transparency movements relied on content, and compromising information seeded to WikiLeaks provided a new method for character assassination. The Russian intelligence services forged ahead compromising adversaries in cyberspace through the late 1990s and early 2000s. They secretly placed child pornography on the computers of defectors and intelligence officers and leaked sex videos of political opponents on the internet, creating media feeding frenzies. Outlets like WikiLeaks were a perfect vehicle for widespread character assassination of enemies worldwide, an open-source vehicle for planting information that allowed for plausible deniability.

Watts concludes this chapter as follows: "Many of the great chess masters have been Russian, and their leader, Vladimir Putin, is a lover of judo. Both require strategy, and victory routinely goes to those who employ their adversary's strengths against them. As Putin famously demonstrated his judo skills on YouTube, Edward Snowden settled into a Kremlin-provided safe house. Julian Assange stowed away in the Ecuadorian embassy. The Kremlin trolls practiced on audiences in Ukraine and Syria, and occasionally heckled me. As for the hackers swirling around the Syrian Electronic Army, some of them went offline, busy working on a new project. And Russia's cyber team came together for a new mission, with some new methods the world had yet to see and still doesn't quite comprehend."

What Has Happened to the Global Engagement Center?

December 2, 2019

This post is a follow-up to the post titled "Information Wars." A few weeks before the end of the Obama administration, Congress codified the Global Engagement Center (GEC) into law in the 2017 National Defense Authorization Act. Its mission was to "lead, synchronize, and coordinate efforts of the Federal Government to recognize, understand, expose, and counter foreign state and non-state propaganda and disinformation efforts aimed at undermining United States national security interests."

Secretary of State Rex Tillerson did not request the money for more than a year. When Tillerson finally did make the request, a year and a half into the administration, he asked for half of the $80 million that Congress had authorized. The sponsors of the legislation, Senators Portman and Murphy, said this delay was "indefensible" at a time when "ISIS is spreading terrorist propaganda and Russia is implementing a sophisticated disinformation campaign to undermine the United States and our allies."

The GEC is no longer in the business of creating content to counter disinformation. It has become an entity that uses data science and analytics to measure and better understand disinformation. Over the past two years, a steady stream of people has quit or retired and the GEC has had a hard time hiring replacements.

What is especially worrisome is hearing Republicans, in debates with Democrats, arguing the points of Russian disinformation, for example the claim that Ukraine was involved in disrupting the 2016 election. One has to think that true Republicans like Ronald Reagan and John McCain are thrashing about in their graves.

© Douglas Griffith and healthymemory.wordpress.com, 2019. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Information Wars

December 1, 2019

The title of this post is identical to the title of an informative book by Richard Stengel, a former editor of Time magazine. During the second term of the Obama administration, he was appointed and confirmed as the Under Secretary for Public Diplomacy. The book provides a detailed and interesting description of the organization and workings of the State Department.

Stengel was appointed to lead the Center for Strategic Counterterrorism Communications, which was integral to the Global Engagement Center. This is important because information warfare was the primary means by which terrorist organizations fought. It was punctuated by despicable terrorist acts, but the primary messaging was done using the internet. Effective counter-messaging needed to be developed to counter the messaging of the terrorists.

Although ISIS and al-Qaeda are currently recognized as the primary terrorist organizations, it is important not to overlook the largest and most threatening terrorist organization, Russia. Our term “disinformation” is in fact an adaptation of the Russian word dezinformatsiya, which was the KGB term for black propaganda. The modern Russian notion of hybrid warfare comes from what is called the Gerasimov model. Gerasimov has appeared in previous healthy memory blog posts. He is the father of the idea that in the 21st century only a small part of war is kinetic. He has written that modern warfare is nonlinear with no clear boundary between military and nonmilitary campaigns. The Russians, like ISIS, merged their military lines of effort with their information and messaging line of effort.

In the old days, disinformation involved placing a false story (often involving forged documents) in a fairly obscure left-leaning newspaper in a country like India or Brazil; then the story was picked up and echoed in Russian state media. A more modern version of dezinformatsiya is the campaign in the 1990s that suggested that the U.S. had invented the AIDS virus as a kind of “ethnic bomb” to wipe out people of color.

Two other theorists of Russian information warfare are Igor Panarin, an academic and former KGB officer, and Alexander Dugin, a philosopher who has been called "Putin's Rasputin." Panarin sees Russia as the victim of information aggression by the United States. He believes there is a global information war between what he calls the Atlantic world, led by the U.S. and Europe, and the Eurasian world, led by Russia.

Alexander Dugin has a particularly Russian version of history. He says that the 20th century was a titanic struggle among fascism, communism, and liberalism, in which liberalism won out. He thinks that in the 21st century there will be a fourth way: Western liberalism will be replaced by a conservative superstate like Russia leading a multipolar world and defending tradition and conservative values. He predicts that conservative strongmen rising in the West will embrace these values. Dugin supports the rise of conservative right-wing groups all across Europe. He has formed relationships with white nationalist groups in America. Dugin believes immigration and racial mixing are polluting the Caucasian world. He regards rolling back immigration as one of the key tasks for conservative states. Dugin says that all truth is relative and a question of belief; that freedom and democracy are not universal values but peculiarly Western ones; and that the U.S. must be dislodged as a hyperpower through the destabilization of American democracy and the encouragement of American isolationism.

Dugin says that the Russians are better at messaging than anyone, and that they've been working on it as part of conventional warfare since Lenin. The Russians have been thinking and writing about information war for decades; it is embedded in their military doctrines.

Perhaps one of the best examples of Russia's prowess at information warfare is Russia Today (RT). During HM's working days, his job provided the opportunity to view RT over an extended period of time. What is most remarkable about RT is that it appears to bear no resemblance to information warfare or propaganda. It appears as innocuous as CNN. Only after long viewing does one realize that one is being drawn to accept the objectives of Russian information warfare.

Stengel notes that Russian propaganda taps into all the modern cognitive biases that social scientists write about: projection, mirroring, anchoring, confirmation bias. Stengel and his staff put together their own guide to Russian propaganda and disinformation, with examples.

*Accuse your adversary of exactly what you’re doing.
*Plant false flags.
*Use your adversary’s accusations against him.
*Blame America for everything!
*America blames Russia for everything!
*Repeat, repeat, repeat.

Stengel writes that what is interesting about this list is that it also seems to describe Donald Trump’s messaging tactics. He asks whether this is a coincidence, or some kind of mirroring?

Recent events have answered this question. The acceptance of the alternative reality that Ukraine has a secret server and was the source of the 2016 election interference is Putin's narrative, developed by Russian propaganda. Remember that Putin was once a KGB agent. His ultimate success here is the acceptance of this propaganda by the Republican Party. There is an information war within the US, and the US is losing it.


The Conflicts That Drive the Web and the World

January 23, 2019

This is the eleventh post in a series of posts on a book by P.W. Singer and Emerson T. Brooking titled "Likewar: The Weaponization of Social Media." The title of this post is identical to the subtitle of the chapter titled "Likewar." In 1990, two political scientists with the Pentagon's think tank, the RAND Corporation, started to explore the security implications of the internet. John Arquilla and David Ronfeldt made their findings public in a revolutionary 1993 article titled "Cyberwar Is Coming!" They wrote that "information is becoming a strategic resource that may prove as valuable in the post-industrial era as capital and labor have been in the industrial age." They argued that future conflicts would be won not by physical forces, but by the availability and manipulation of information. They warned of "cyberwar," battles in which computer hackers might remotely target economies and disable military capabilities.

They went further and predicted that cyberwar would be accompanied by netwar. They explained: It means trying to disrupt, damage, or modify what a target population "knows" or thinks it knows about itself and the world around it. A netwar may focus on public or elite opinion, or both. It may involve public diplomacy measures, propaganda and psychological campaigns, political and cultural subversion, deception of or interference with the local media…In other words, netwar represents a new entry on the spectrum of conflict that spans economic, political, and social as well as military forms of 'war.'

Early netwar became the province of far-left activists and democratic protesters, beginning with the 1994 Zapatista uprising in Mexico and culminating in the 2011 Arab Spring. In time, terrorists and far-right extremists also began to gravitate toward netwar tactics. The balance shifted for disenchanted activists when dictators learned to use the internet to strengthen their regimes. For the authors, the moment came when they saw how ISIS militants used the internet not just to sow terror across the globe, but to win its battles in the field. For Putin's government, it came when the Russian military reorganized itself to strike back at what it perceived as a Western information offensive. For many in American politics and Silicon Valley, it came when the Russian effort poisoned the networks with a flood of disinformation, bots, and hate.

In 2011, DARPA’s research division launched the new Social Media in Strategic Communications program to study online sentiment analysis and manipulation. About the same time, the U.S. military’s Central Command began overseeing Operation Earnest Voice to fight jihadists across the Middle East by distorting Arabic social media conversations. One part of this initiative was the development of an “online persona management service,” which is essentially sockpuppet software, “to allow one U.S. serviceman or woman to control up to 10 separate identities based all over the world.” Beginning in 2014, the U.S. State Department poured vast amounts of resources into countering violent extremism (CVE) efforts, building an array of online organizations that sought to counter ISIS by launching information offensives of their own.

The authors say that as national militaries have reoriented themselves to fight global information conflicts, the domestic politics of these countries have also morphed to resemble netwars. The authors write, "Online, there's little difference in the information tactics required to 'win' either a violent conflict or a peaceful campaign. Often, their battles are not just indistinguishable but also directly linked in their activities (such as the alignment of Russian sockpuppets and alt-right activists). The realms of war and politics have begun to merge."

Memes and memetic warfare also emerged. Pepe the Frog was green and a dumb internet meme. In 2015, Pepe was adopted as the banner of Trump's vociferous online army. By 2016, he'd also become a symbol of a resurgent tide of white nationalism, declared a hate symbol by the Anti-Defamation League. Trump tweeted a picture of himself as an anthropomorphized Pepe. Pepe was ascendant by 2017. Trump supporters launched a crowdfunding campaign to erect a Pepe billboard "somewhere in the American Midwest." On Twitter, Russia's UK embassy used a smug Pepe to taunt the British government in the midst of a diplomatic argument.

Pepe formed an ideological bridge between trolling and the next-generation white nationalist, alt-right movement that had lined up behind Trump. The authors note that Third Reich phrases like "blood and soil," filtered through Pepe memes, fit surprisingly well with Trump's America First, anti-immigration, anti-Islamic campaign platform. The wink and nod of a cartoon frog allowed a rich, but easily deniable, symbolism.

Pepe transformed again when Trump won. Pepe became representative of a successful, hard-fought campaign—one that now controlled all the levers of government. On Inauguration Day in Washington, DC, buttons and printouts of Pepe were visible in the crowd. Online vendors began selling a hat printed in the same style as those worn by military veterans of Vietnam, Korea, and WW II. It proudly pronounced its wearer as a “Meme War Veteran.”

The problem with memes is that, by hijacking or chance, a meme can come to contain vastly different ideas than those that inspired it, even as it retains all its old reach and influence. And once a meme has been so redefined, it becomes nearly impossible to reclaim. Making something go viral is hard; co-opting or poisoning something that's already viral can be remarkably easy. U.S. Marine Corps Major Michael Prosser published a thesis titled "Memetics—a Growth Industry in US Military Operations." Prosser's work kicked off a tiny DARPA-funded industry devoted to "military memetics."

The New Wars for Attention and Power

January 22, 2019

This is the tenth post in a series of posts on a book by P.W. Singer and Emerson T. Brooking titled "Likewar: The Weaponization of Social Media." The title of this post is identical to the subtitle of the chapter titled "Win the Net, Win the Day."

In a 1974 RAND Corporation report that became one of terrorism's foundational studies, Brian Jenkins declared, "Terrorism is theater." The difference between the effectiveness of the Islamic State and that of terror groups in the past was not the brains of ISIS; it was the medium they were using. Mobile internet access could be found everywhere; smartphones were available in any bazaar. Advanced video and image editing tools were just one illegal download away, and an entire generation was well acquainted with their use. For those who weren't, there were free online classes offered by a group called Jihadi Design. It promised to take ISIS supporters "from zero to professionalism" in just a few sessions. The most dramatic change from terrorism of the past was that distributing a global message was as easy as pressing "send," with dispersal facilitated by a network of super-spreaders beyond any one state's control.

ISIS networked its propaganda, pushing out a staggering volume of online messages. In 2016, Charlie Winter counted nearly fifty different ISIS media hubs, each based in a different region with a different target audience, but all threaded through the internet. These hubs were able to generate over a thousand "official" ISIS releases, ranging from statements to online videos, in just a one-month period.

They spun their tale through narratives. Human minds are wired to seek and create narratives. Every moment of the day, our brains are analyzing new events and fitting them into the thousands of different narratives already stowed in our memories. In 1944, psychologists Fritz Heider and Marianne Simmel produced a short film that showed three geometric figures (two triangles and a circle) bouncing off each other at random. They screened the film to a group of research subjects and asked them to interpret the shapes' actions. All but one of the subjects described these abstract objects as living beings; most saw them as representations of humans. In the shapes' random movements they saw motives, emotions, and complex personal histories: the circle was "worried," one triangle was "innocent," and the other was "blinded by rage." Even in crude animation, all but one observer saw a story of high drama.

The first rule in building effective narratives is simplicity. In 2000, the average attention span of an internet user was measured at twelve seconds. By 2015 it had shrunk to eight seconds. During the 2016 election, Carnegie Mellon University researchers studied and ranked the complexity of the candidates' language (using the Flesch-Kincaid score). They found that Trump's vocabulary measured at the lowest level of all the candidates, comprehensible to someone with a fifth-grade education. This phenomenon is consistent with a larger historic pattern. Starting with George Washington's first inaugural address, which was one of the most complex overall, American presidents communicated at a college level only when newspapers dominated mass communication. But each time a new technology took hold, complexity dropped. The authors write, "To put it another way: the more accessible the technology, the simpler a winning voice becomes. It may be Sad! But it is True!"
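The Flesch-Kincaid score mentioned above is a simple arithmetic formula over word, sentence, and syllable counts. A minimal sketch of the grade-level version in Python, where the syllable counter is a rough vowel-group heuristic rather than the dictionary-based counting a production readability tool would use:

```python
import re

def count_syllables(word: str) -> int:
    """Rough syllable count: number of vowel groups, minimum 1."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    The result approximates the U.S. school grade needed to read the text.
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)
```

Short, monosyllabic sentences score near or below grade zero, while long sentences full of polysyllabic words score far higher, which is the pattern the researchers measured across candidates' speeches.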

The second rule of narrative is resonance. Nearly all effective narratives conform to what social scientists call "frames." Frames are products of specific languages and cultures, and they feel instantly and deeply familiar. To learn more about frames, enter "frames" into the search block of the healthy memory blog.

The third and final rule of narrative is novelty. Just as narrative frames help build resonance, they also serve to make things predictable. However, too much predictability can be boring, especially in an age of microscopic attention spans and unlimited entertainment. Moreover, there seems to be no lower limit on the quality of a narrative: some messages far exceed the limits of credibility, yet they are believed and spread.

Additional guidelines are to pull the heartstrings and feed the fury. The final guideline is inundation: drown the web, run the world.

Flynn

January 19, 2019

This is the seventh post in a series of posts on a book by P.W. Singer and Emerson T. Brooking titled "Likewar: The Weaponization of Social Media." A former director of the U.S. Defense Intelligence Agency (DIA) said, "The exponential explosion of publicly available information is changing the global intelligence system…It's changing how we tool, how we organize, how we institutionalize—everything we do." This is how he explained to the authors how the people who once owned and collected secrets—professional spies—were adjusting to this world without secrets.

U.S. intelligence agencies collected open-source intelligence (OSINT) on a massive scale through much of the Cold War. The U.S. embassy in Moscow maintained subscriptions to over a thousand Soviet journals and magazines, while the Foreign Broadcast Information Service (FBIS) stretched across 19 regional bureaus, monitoring more than 3,500 publications in 55 languages, as well as nearly a thousand hours of television each week. Eventually FBIS was undone by the sheer volume of OSINT the internet produced. In 1993, FBIS was creating 17,000 reports a month; by 2004 that number had risen to 50,000. In 2005, FBIS was shuttered. The former director of DIA said, "Publicly available information is now probably the greatest means of intelligence that we could bring to bear. Whether you're a CEO, a commander in chief, or a military commander, if you don't have a social media component…you're going to fail."

Michael Thomas Flynn was made the director of intelligence for the task force that deployed to Afghanistan. Then he assumed the same role for the Joint Special Operations Command (JSOC), the secretive organization of elite units like the bin Laden-killing navy SEAL team. He made the commandos into "net fishermen" who eschewed individual nodes and focused instead on taking down the entire network, hitting it before it could react and reconstitute itself. JSOC got better as Flynn's methods evolved, capturing or killing dozens of terrorists in a single operation, gathering up intelligence, and then blasting off to hit another target before the night was done. The authors write, "Eventually, the shattered remnants of AQI would flee Iraq for Syria, where they would ironically later reorganize themselves as the core of ISIS."

Eventually the Peter Principle prevailed. The Peter Principle holds that people rise in an organization until they reach their level of incompetence. The directorship of DIA was that level for Flynn, and he was forced to retire after 33 years of service. Flynn didn't take his dismissal well. He became a professional critic of the Obama administration, which brought him to the attention of Donald Trump. He used his personal Twitter account to push out messages of hate ("Fear of Muslims is RATIONAL"). He put out one wild conspiracy theory after another. His postings alleged that Obama wasn't just a secret Muslim, but a "jihadi" who "laundered" money for terrorists, and that if Hillary Clinton won the election she would help erect a one-world government to outlaw Christianity (notwithstanding that Hillary Clinton was and is a Christian). He also claimed that Hillary was involved in "Sex Crimes w Children." This resulted in someone going into a pizzeria, the supposed locus of these sex crimes with children, and shooting it up. Flynn was charged by the FBI with lying about his contact with a Russian official, based on a recorded phone conversation. This was a singularly dumb mistake for a former intelligence officer.

Crowdsourcing

January 18, 2019

This is the sixth post in a series of posts on a book by P.W. Singer and Emerson T. Brooking titled “Likewar: The Weaponization of Social Media.” During the terrorist attack on Mumbai, the resources of the internet, and Twitter in particular, were opened up to defend against the attack. When the smoke cleared, the Mumbai attack left several legacies. It was a searing tragedy visited upon hundreds of families. It brought two nuclear powers to the brink of war. And it foreshadowed a major technological shift. Hundreds of witnesses—some on-site, some from afar—had generated a volume of information that previously would have taken months of diligent reporting to assemble. By stitching these individual accounts together, the online community had woven seemingly disparate bits of data into a cohesive whole. The authors write, “It was like watching the growing synaptic connections of a giant electric brain.”

This Mumbai operation was a realization of “crowdsourcing,” an idea that had been on the lips of Silicon Valley evangelists for years. It had originally been conceived as a new way to outsource programming jobs, the internet bringing people together to work collectively, more quickly and cheaply than ever before. As social media use skyrocketed, the promise of crowdsourcing extended beyond business.

Crowdsourcing is about redistributing power: vesting the many with a degree of influence once reserved for the few. Crowdsourcing might be about raising awareness, or about money (also known as “crowdfunding”). It can kick-start a new business or throw support to people who might have remained little known. It was through crowdsourcing that Bernie Sanders became a fundraising juggernaut in the 2016 presidential election, raking in $218 million online.

For the Syrian civil war and the rise of ISIS, the internet was the “preferred arena for fundraising.” Besides allowing wide geographic reach, it expands the circle of fundraisers, seemingly linking even the smallest donor with their gift on a personal level. As The Economist explained, this was, in fact, one of the key factors that fueled the years-long Syrian civil war. Fighters sourced needed funds by learning “to crowd fund their war by using Instagram, Facebook and YouTube. In exchange for a sense of what the war was really like, the fighters asked for donations via PayPal. In effect, they sold their war online.”

In 2016 a hard-line Iraqi militia took to Instagram to brag about capturing a suspected ISIS fighter. The militia then invited its 75,000 online fans to vote on whether to kill or release him. Eager, violent comments rolled in from around the world, including many from the United States. Two hours later, a member of the militia posted a follow-up selfie; the body of the prisoner lay in a pool of blood behind him. The caption read, “Thanks for the vote.” In the words of Adam Linehan, a blogger and U.S. Army veteran, this represented a bizarre evolution in warfare: “A guy on the toilet in Omaha, Nebraska could emerge from the bathroom with the blood of some 18-year-old Syrian on his hands.”

Of course, crowdsourcing can be used for good as well as for evil.

Sharing

January 17, 2019

This is the fifth post in a series of posts on a book by P.W. Singer and Emerson T. Brooking titled “Likewar: The Weaponization of Social Media.” The authors trace the rise of sharing to Facebook’s rollout of a design update that included a small text box asking the simple question: “What’s on your mind?” Since then, the “status update” has allowed people to use social media to share anything and everything about their lives they want to, from musings and geotagged photos to live video and augmented-reality stickers.

The authors continue, “The result is that we are now our own worst mythological monster—not just watchers but chronic over-sharers. We post on everything from events small (your grocery list) to momentous (the birth of a child, which one of us actually live-tweeted). The exemplar of this is the “selfie,” a picture taken of yourself and shared as widely as possible online. At the current pace, the average American millennial will take around 26,000 selfies in their lifetime. Fighter pilots take selfies during combat missions. Refugees take selfies to celebrate making it to safety. In 2016, one victim of an airplane hijacking scored the ultimate millennial coup: taking a selfie with his hijacker.”

Not only are these postings revelatory of our personal experiences, but they also convey the weightiest issues of public policy. The first sitting world leader to use social media was Canadian prime minister Stephen Harper in 2008, followed by U.S. President Barack Obama. A decade later, the leaders of 178 countries had joined in, including former Iranian president Mahmoud Ahmadinejad, who, despite having banned Twitter during a brutal crackdown, changed his mind on the morality—and utility—of social media. He debuted online with a friendly English-language video as he stood next to the Iranian flag. He tweeted, “Let’s all love each other.”

Not just world leaders, but agencies at every level and in every type of government now share their own news, from some 4,000 national embassies to the fifth-grade student council of the Upper Greenwood Lake Elementary school. When the U.S. military’s Central Command expanded Operation Inherent Resolve against ISIS in 2016, Twitter users could follow along directly via the hashtag #TALKOIR.

Nothing actually disappears online. The data builds and builds and could reemerge at any moment. Law professor Jeffrey Rosen said that the social media revolution has essentially marked “the end of forgetting.”

The massive accumulation of all this information leads to revelations of its own. Perhaps the clearest example of this phenomenon is the first president to have used social media before running for office. Being both a television celebrity and a social media addict, Donald Trump entered politics with a vast digital trail behind him. The Internet Archive has a fully perusable, downloadable collection of more than a thousand hours of Trump-related video, and his Twitter account has generated around 40,000 messages. Never has a president shared so much of himself—not just words but even neuroses and particular psychological tics—for all the world to see. Trump is a man—the most powerful in the world—whose very essence has been imprinted on the internet. Knowing this, one wonders how such a man could be elected President by the Electoral College.

Tom Nichols, a professor at the U.S. Naval War College who worked with the intelligence community during the Cold War, explained the unprecedented value of this vault of information: “It’s something you never want the enemy to know. And yet it’s all out there…It’s also a window into how the President processes information—or how he doesn’t process information he doesn’t like. Solid gold info.” Reportedly, Russian intelligence services came to the same conclusion, using Trump’s Twitter account as the basis on which to build a psychological profile of him.

How to Convert Terrorists

November 16, 2017

This post is based in part on a feature article in the 19 August 2017 issue of the New Scientist titled “Anatomy of terror: What makes normal people extremists?” by Peter Byrne. Anthropologist Scott Atran of the University of Oxford’s Centre for Resolution of Intractable Conflicts asks the question, “What makes someone prepared to die for an idea?” He suggests that the answer comes in two parts. Jihadists fuse their individual identity with that of the group, and they adhere to “sacred values.” He writes that sacred values are values that cannot be abandoned or exchanged for material gain. They tend to be associated with strong emotions and are often religious in nature, but beliefs held by nationalists and secularists may earn the label too.

Atran argues that individuals in this state are best understood not as rational actors but as “devoted actors.” He says, “Once they’re locked in as a devoted actor, none of the classic interventions seem to work.” However, there can be openings. Although a sacred value cannot be abandoned, it can be reinterpreted. Atran relates the case of an imam he interviewed who had worked for ISIS as a recruiter but had left because he disagreed with their definition of jihad. For him, but not for them, jihadism could accommodate persuasion by non-violent means. As long as alternative interpretations are seen as coming from inside the group, they can be persuasive within it. Atran is now advising the US, UK, and French governments on the dynamics of jihadist networks to help them deal with terrorism.

Atran says that the key to combating extremism lies in addressing its social roots, and intervening early before anyone becomes a “devoted actor.” Until then there are all sorts of things that can be done. He says that one of the most effective countermeasures is community engagement. High-school football and the scouts movement have been effective responses to antisocial behavior among the disenfranchised children of US immigrants, for example.

Perspectives need to be changed. Tania Singer of the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany thinks brain training could achieve similar effects. Neuroscientists have identified two pathways in the brain by which we relate to others. One mobilizes empathy and compassion, allowing us to share another person’s emotions. The second activates theory of mind, enabling us to see a situation from the other’s perspective. Her group recently completed a project called ReSource in which 300 volunteers spent nine months training, first on mindfulness and then on compassion and perspective-taking, and corresponding structural brain changes were detectable in MRI scans.

Tania Singer notes that compassion evolved as part of an ancient nurturing instinct that is usually reserved for kin. To extend it to strangers, who may see the world differently from us, we need to add theory of mind. The full results from ReSource aren’t yet published, but Singer expects to see brain changes associated with perspective-taking training. She says that “only if you have both pathways working together in a coordinated fashion can you really move towards global cooperation.” By incorporating that training into school curricula, she suggests, we could build a more cohesive, cooperative society that is more resilient to extremism. To all of this, healthy memory says “Amen.”

Previous healthy memory posts have argued that had the prisoners held at Guantanamo been treated differently, an understanding could have been developed that would provide the basis for a new and more compelling narrative for these supposed terrorists. Once they had been converted, mindfulness training such as that in the ReSource program might have been highly effective.

© Douglas Griffith and healthymemory.wordpress.com, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and healthymemory.wordpress.com with appropriate and specific direction to the original content.

Web of Lies

May 1, 2016

“Web of lies: Is the internet making a world without truth?” is an article by Chris Baraniuk in the Feb 20-26, 2016 edition of the New Scientist. The World Economic Forum ranks massive digital misinformation as a geopolitical risk alongside terrorism. This problem is especially pernicious because misinformation is very difficult to correct (enter “misinformation” into the healthy memory search block to see relevant posts). Bruce Schneier, a director of the Electronic Frontier Foundation, says that we’re entering an era of unprecedented psychological manipulation.

Walter Quattrociocchi at the IMT Institute for Advanced Studies in Lucca, Italy, along with his colleagues, looked at how different types of information are spread on Facebook by different communities. They analyzed two groups: those who shared conspiracy theories and those who shared science news articles. They found that science stories received an initial spike of interest and were shared or “liked” frequently. Conspiracy theories started with a low level of interest, but sometimes grew to be even more prominent than the science stories overall. Both groups tended to ignore information that challenged their views. Confirmation bias leads to an echo chamber: information that does not fit with an individual’s world view does not get passed on. On social networks, people trust their peers and use them as their primary information sources. Quattrociocchi says, “The role of the expert is going to disappear.”
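The echo-chamber mechanism described above can be illustrated with a toy simulation. This is a minimal sketch, not the researchers' actual model; the group labels, network size, and sharing probability are all illustrative assumptions.

```python
import random

random.seed(1)  # fixed seed so the toy run is reproducible

def times_shared(post_view, users, share_prob=0.9):
    """Count how many users pass a post on. Confirmation bias is modeled
    crudely: only users whose worldview matches the post ever share it."""
    return sum(
        1
        for user_view in users
        if user_view == post_view and random.random() < share_prob
    )

# A network split evenly between two communities.
users = ["science"] * 500 + ["conspiracy"] * 500

science_reach = times_shared("science", users)
conspiracy_reach = times_shared("conspiracy", users)

# Each story saturates its own community (close to 450 of 500 users)
# and is never passed on by the other half of the network.
print(science_reach, conspiracy_reach)
```

However crude, the sketch captures the qualitative finding: once sharing is filtered through prior belief, each community amplifies its own content and the two populations stop exchanging information at all.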

DARPA, a research agency for the U.S. military, is funding a Social Media in Strategic Communication Program, which funds dozens of studies looking at everything from subtle linguistic cues in specific posts to how information flows across large networks. DARPA has also sponsored a challenge to design bots that can sniff out misinformation deliberately planted on Twitter.

Ultimately the aim of this research is to find ways to identify misinformation and effectively counter it, reducing the ability of groups like ISIS to manipulate events. Jonathan Russell, head of policy at the counter-terrorism think tank Quilliam in London, says, “They have managed to digitize propaganda in a way that is completely understanding of social media and how it’s used.” Russell says that a lack of other voices also gives the impression that they are winning; there is no other effective media coming out of Iraq and Syria. Quilliam has attempted to counter such narratives with videos like “Not Another Brother,” which depicts a jihadist recruit in desperate circumstances. It aims to show how easily people can be seduced by exposure to a narrow view of the world.

This research is key. Information warfare will account for an increasingly large share of conflict relative to kinetic effects.

Panagiotis Metaxas of Wellesley College believes that we have entered a new era in which the definition of literacy needs to be updated. “In the past to be literate you needed to know reading and writing. Today, these two are not enough. Information reaches us from a vast number of sources. We need to learn what to read, as well as how.”

Cognitive Misers and Democracy

February 17, 2016

Cognitive misers are people who do not like to exert the effort involved in thinking. In addition to entering “cognitive misers” into the healthymemory search block, you can also enter “System 1” or “Kahneman.” Cognitive misers like to believe in things because questioning beliefs or principles or learning new things involves cognitive effort and thinking.

A short while back I read a poll that I found extremely discouraging. The question asked what was more important to voters: a politician’s willingness to compromise or adherence to principles. Here is a breakdown of the responses by political party. Note that they do not add up to 100%, as some respondents refused to answer.

Group                   Principles        Willing to Compromise
All Voters             40%                  50%
Republicans        54%                   36%
Independents     40%                  47%
Democrats           23%                  68%

I guess the good news is that in every group except Republicans, a larger percentage indicated a willingness to compromise. Still, that percentage reached 50% in only two groups, and only among Democrats was the preference for compromise greater than two to one. If the results are representative, then I argue that these beliefs present a far greater existential threat to the democracy in the United States than does ISIS.
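A quick arithmetic check on the table above: since the two columns do not sum to 100%, the remainder in each row is the share who refused to answer. A minimal sketch:

```python
# Poll results from the table above: percent favoring principles
# vs. willingness to compromise, by group.
poll = {
    "All Voters":   (40, 50),
    "Republicans":  (54, 36),
    "Independents": (40, 47),
    "Democrats":    (23, 68),
}

# The unreported "refused to answer" share is whatever remains of 100%.
refused = {group: 100 - p - c for group, (p, c) in poll.items()}

for group, r in refused.items():
    print(f"{group:12s} refused/no answer: {r}%")
# All Voters 10%, Republicans 10%, Independents 13%, Democrats 9%
```

So roughly one voter in ten declined to answer in each group, which does not change the overall picture.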

Before addressing cognitive miserliness per se, let me remind readers what a democracy is supposed to be. A democracy is a system in which people vote for candidates, and the candidates vote for what they think are the correct policies but negotiate when needed to reach the most palatable policy that they can accept. There will be times when the vote goes against them, but they accept the result. They do not threaten to shut down the government, or actually shut down the government. As you know, this has already happened at least twice.

It is unfortunate that “politician” has negative connotations. Using “politician” in a pejorative sense, as in “he’s a politician” or “he is doing this for political reasons,” is both unfair and wrong. The first requirement of a politician is to make the political system work. Sometimes that might correspond to political beliefs, sometimes it will not. But beliefs or principles should not be the driving factor.

The advancement of mankind has been in direct proportion to the advancement of science. Key to science is thinking. Cognitive miserliness is anathema to effective science. Whatever beliefs science has are beliefs that are subject to change. If that is not the case, then the enterprise is not science. There have been enormous changes in science during my lifetime. There is not a single subject matter that has not changed. Until fairly recently, science believed that humans could not generate new neurons. In other words, there was no such thing as neurogenesis. Had I argued to the contrary as a graduate student, I would have quickly been booted out of graduate school. It was not until close to the end of the 20th century that neurogenesis was accepted and the notion of neuroplasticity was advanced.

I become particularly annoyed when I hear reporters accuse politicians of flip-flopping. It seems like this is the stock in trade for many reporters. This reminds me of the response the eminent economist John Maynard Keynes gave when he was accused of making a statement that was in conflict with previous comments. He responded, “When the facts change, I change my mind. What do you do, sir?” An argument can be made that opinions are not being changed by facts, but by political considerations. Here I would refer you to the remedial exposition on democracy I offered above.

I also argue that cognitive miserliness is a problem for the Supreme Court of the United States. There are two views of the Constitution. One is that it is a living document, written with the expectation that it would change with the times. The other, originalism, is that the Constitution needs to be interpreted in terms of what the authors intended. We need to remember that when the Constitution was written, slavery existed, black people were counted as three-fifths of a human being, and women could not vote. It should also be remembered that one of the most advanced scientists of the time, Benjamin Franklin, did not know what current high school physics students know. Moreover, I am virtually certain that if the framers of the Constitution knew what we know today, they would have written a different constitution. I am upset when the Supreme Court Justice who recently passed away is described as having had a brilliant mind. He was an originalist. He believed that what the framers of the Constitution believed at that time should provide the basis of judicial decisions. I regard such individuals as intellectual runts.

The results of cognitive miserliness are readily apparent in the United States.  Realize that the United States is the only advanced country that does not have a system of national health insurance.  What we do have is the country with the most expensive medical costs with results comparable to third world countries.  We are the only advanced country that has no control over the cost of prescription medications.  And we are the only country that has a major political party that refuses to believe in global warming.  We also have a major TV network that insists on always having a denier of global warming on a show where a scientist is presenting data bearing on global warming and its ramifications.  This is in spite of the fact that this is a small minority of scientists, some of whom are paid scientific guns to counter the overwhelming evidence.

The reason that is often presented is one of American Exceptionalism.  This exceptionalism is a product of cognitive miserliness.


Syndrome E

November 27, 2015

In the recent healthymemory blog post, “A Single Shifting Mega-Organism,” Syndrome E (E stands for evil) was briefly discussed. Syndrome E was developed to describe atrocities, mass killings, and genocides, such as the Holocaust and the killings by ISIS. The neurosurgeon Itzhak Fried describes these atrocities as examples of Syndrome E. He defined the following seven symptoms of Syndrome E:

Compulsive repetitive violence
Obsessive beliefs
Rapid desensitisation to violence
Flat emotional state
Separation of violence from everyday activities
Obedience to an authority
Perceiving group members as virtuous

Having decided that neuroscience has come a long way since his original paper in 1997 (“Syndrome E,” The Lancet, Volume 350, No. 9094, pp. 1845-1847), Fried organized a conference in Paris earlier this year to revisit the concept. Highlights of this conference were published in the New Scientist, November 14-20, 2015, in a feature by Laura Spinney.

Fried’s theory starts with the assumption that people normally have an aversion to harming others. If this is correct, the higher brain overrides this instinct in people with Syndrome E. So how might this occur?

The lateral regions of the prefrontal cortex (PFC), part of the newer brain, are sensitive to rules. The medial region of the PFC receives information from the limbic system, a primitive part of the brain that processes emotional states and is sensitive to our innate preferences. An experiment using brain scanning was designed to put these two parts of the brain in conflict, and both parts of the PFC were observed to light up. People followed the rule but still registered their personal preference, showing that activity in the lateral PFC overrode the personal preference. The idea is that in the normal brain the higher brain overrides signals coming from the primitive brain, whereas in the pathological brain with Syndrome E, the primitive brain prevails.

Fried suggests that people experience a visceral reaction when they kill for the first time, but some become rapidly desensitized. And the primary instinct not to harm may become more easily overcome when people are “just following orders.” Unpublished research using brain scans has shown that coercion makes us feel less responsible for our actions. Although coercion can cause people to take extraordinary actions (see the healthy memory blog post “Good vs. Evil”), there are individuals who are predisposed to violence and are just awaiting an opportunity.

Unfortunately, the question remains as to how to prevent people from joining such radicalized groups. Research in this area is just beginning and much more needs to be done (see the healthy memory blog post “Why DARPA is Studying Stories”). Being a neuroscientist, Fried not surprisingly thinks that we should use our growing neuroscientific knowledge to identify radicalization early, isolate those affected, and help them change. We wish him, and hopefully many others, success in this effort.

What is not mentioned in this article is that it can be advantageous for one group to adopt Syndrome E to take from or to take advantage of another group. Consider North America. Syndrome E was involved in vacating Native American lands for Europeans. Moreover, up until the Civil War, blacks were enslaved, and slavery was a key component of the economy of the United States. I sometimes ponder how North America would have been settled by Europeans had we had the moral and ethical standards of today.
