Posts Tagged ‘DARPA’

The Conflicts That Drive the Web and the World

January 23, 2019

This is the eleventh post in a series of posts on a book by P.W. Singer and Emerson T. Brooking titled “Likewar: The Weaponization of Social Media.” The title of this post is identical to the subtitle of the chapter titled “Likewar.” In 1990 two political scientists with the RAND Corporation, the Pentagon’s think tank, started to explore the security implications of the internet. John Arquilla and David Ronfeldt made their findings public in a revolutionary 1993 article titled “Cyberwar Is Coming!” They wrote that “information is becoming a strategic resource that may prove as valuable in the post-industrial era as capital and labor have been in the industrial age.” They argued that future conflicts would be won not by physical forces, but by the availability and manipulation of information. They warned of “cyberwar,” battles in which computer hackers might remotely target economies and disable military capabilities.

They went further and predicted that cyberwar would be accompanied by netwar. They explained: It means trying to disrupt, damage, or modify what a target population “knows” or thinks it knows about itself and the world around it. A netwar may focus on public or elite opinion, or both. It may involve public diplomacy measures, propaganda and psychological campaigns, political and cultural subversion, deception of or interference with the local media… In other words, netwar represents a new entry on the spectrum of conflict that spans economic, political, and social as well as military forms of ‘war.’

Early netwar became the province of far-left activists and democratic protesters, beginning with the 1994 Zapatista uprising in Mexico and culminating in the 2011 Arab Spring. In time, terrorists and far-right extremists also began to gravitate toward netwar tactics. The balance shifted for disenchanted activists when dictators learned to use the internet to strengthen their regimes. For us, the moment came when we saw how ISIS militants used the internet not just to sow terror across the globe, but to win its battles in the field. For Putin’s government it came when the Russian military reorganized itself to strike back against what it perceived as a Western information offensive. For many in American politics and Silicon Valley, it came when the Russian effort poisoned their networks with a flood of disinformation, bots, and hate.

In 2011, DARPA’s research division launched the new Social Media in Strategic Communication program to study online sentiment analysis and manipulation. About the same time, the U.S. military’s Central Command began overseeing Operation Earnest Voice to fight jihadists across the Middle East by distorting Arabic social media conversations. One part of this initiative was the development of an “online persona management service,” which is essentially sockpuppet software, “to allow one U.S. serviceman or woman to control up to 10 separate identities based all over the world.” Beginning in 2014, the U.S. State Department poured vast amounts of resources into countering violent extremism (CVE) efforts, building an array of online organizations that sought to counter ISIS by launching information offensives of their own.

The authors say that as national militaries have reoriented themselves to fight global information conflicts, the domestic politics of these countries have also morphed to resemble netwars. The authors write, “Online, there’s little difference in the information tactics required to ‘win’ either a violent conflict or a peaceful campaign. Often, their battles are not just indistinguishable but also directly linked in their activities (such as the alignment of Russian sockpuppets and alt-right activists). The realms of war and politics have begun to merge.”

Memes and memetic warfare also emerged. Pepe the Frog started out as a green, dumb internet meme. In 2015, Pepe was adopted as the banner of Trump’s vociferous online army. By 2016, he’d also become a symbol of a resurgent tide of white nationalism, declared a hate symbol by the Anti-Defamation League. Trump tweeted a picture of himself as an anthropomorphized Pepe. Pepe was ascendant by 2017. Trump supporters launched a crowdfunding campaign to erect a Pepe billboard “somewhere in the American Midwest.” On Twitter, Russia’s UK embassy used a smug Pepe to taunt the British government in the midst of a diplomatic argument.

Pepe formed an ideological bridge between trolling and the next-generation white nationalist, alt-right movement that had lined up behind Trump. The authors note that Third Reich phrases like “blood and soil,” filtered through Pepe memes, fit surprisingly well with Trump’s America First, anti-immigration, anti-Islamic campaign platform. The wink and nod of a cartoon frog allowed a rich, but easily deniable, symbolism.

Pepe transformed again when Trump won. Pepe became representative of a successful, hard-fought campaign—one that now controlled all the levers of government. On Inauguration Day in Washington, DC, buttons and printouts of Pepe were visible in the crowd. Online vendors began selling a hat printed in the same style as those worn by military veterans of Vietnam, Korea, and WW II. It proudly pronounced its wearer as a “Meme War Veteran.”

The problem with memes is that, by hijacking or chance, a meme can come to contain vastly different ideas than those that inspired it, even as it retains all its old reach and influence. And once a meme has been so redefined, it becomes nearly impossible to reclaim. Making something go viral is hard; co-opting or poisoning something that’s already viral can be remarkably easy. U.S. Marine Corps Major Michael Prosser published a thesis titled “Memetics—A Growth Industry in US Military Operations.” Prosser’s work kicked off a tiny DARPA-funded industry devoted to “military memetics.”

Could Sputnik be Responsible for the Internet?

January 14, 2019

This is the second post in a series of posts on a book by P.W. Singer and Emerson T. Brooking titled “Likewar: The Weaponization of Social Media.” Probably most readers are wondering what Sputnik is or was. Sputnik, launched by the Soviet Union, was the first artificial satellite to orbit the earth. The United States was desperately trying to launch such a satellite, but had yet to do so. A young HM appeared as part of a team of elementary school presenters on educational TV who made a presentation on Sputnik and on the plans of the United States to launch such a satellite. The young version of HM explained the plans for the rocket that was to launch a satellite. Unfortunately, the rocket HM briefed failed repeatedly, and a different rocket was needed for the successful launch.

The successful launch of Sputnik created panic in the United States about how far we were behind the Russians. Money was poured into scientific and engineering research and into the education of young scientists and engineers. HM personally benefited from this generosity as it furthered his undergraduate and graduate education.

Licklider and Taylor, the authors of the seminal paper “The Computer as a Communication Device,” were employees of the Pentagon’s Defense Advanced Research Projects Agency (DARPA). An internetted communications system was important to the U.S. military because it would banish its greatest nightmare: the prospect of the Soviet Union decapitating U.S. command and control with a single nuclear strike. But the selling point for the scientists working for DARPA was that linking up computers would be a useful way to share what was at the time incredibly rare and costly computer time. A network could spread the load and make it easier on everyone. So a project was funded to transform the Intergalactic Computer Network into reality. It was called ARPANET.

It is interesting to speculate about what would have been developed in the absence of the Soviet threat. It is difficult to believe that this would have been done by private industry. Perhaps it is a poor commentary on Homo sapiens, but it seems that many, if not most, technological advances have been developed primarily for warfare and defense.

It is also ironic to think that technology developed to thwart the Soviet Union would be used by Russia to interfere in American elections to ensure that their chosen candidate for President was elected.

© Douglas Griffith, 2019. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith with appropriate and specific direction to the original content.

Web of Lies

May 1, 2016

“Web of Lies: Is the internet making a world without truth?” is an article by Chris Baraniuk in the Feb 20-26, 2016 edition of the New Scientist. The World Economic Forum ranks massive digital misinformation as a geopolitical risk alongside terrorism. This problem is especially pernicious as misinformation is very difficult to correct (enter “misinformation” into the healthy memory search block to see relevant posts). Bruce Schneier, a director of the Electronic Frontier Foundation, says that we’re entering an era of unprecedented psychological manipulation.

Walter Quattrociocchi at the IMT Institute for Advanced Studies in Lucca, Italy, along with his colleagues, looked at how different types of information are spread on Facebook by different communities. They analyzed two groups: those who shared conspiracy theories and those who shared science news articles. They found that science stories received an initial spike of interest and were shared or “liked” frequently. Conspiracy theories started with a low level of interest, but sometimes grew to surpass the science stories in overall interest. Both groups tended to ignore information that challenged their views. Confirmation bias leads to an echo chamber. Information that does not fit with an individual’s world view does not get passed on. On social networks, people trust their peers and use them as their primary information sources. Quattrociocchi says “The role of the expert is going to disappear.”
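The echo-chamber dynamic described above can be sketched as a toy simulation. This is purely illustrative (the `simulate_sharing` function, the parameters, and the sharing rule are my own simplifications, not the study’s actual model): each agent holds one worldview and only reshares items that match it, so an item’s reach is effectively capped by the size and connectivity of its own community.

```python
import random

def simulate_sharing(n_agents=1000, n_rounds=20, fanout=5, seed=42):
    """Toy model of confirmation-bias-driven sharing.

    Each agent holds one worldview ('science' or 'conspiracy').
    An item spreads when a sharer exposes a few random peers, but
    only peers whose worldview matches the item reshare it further.
    Returns how many agents ever saw the item.
    """
    rng = random.Random(seed)
    views = ['science' if rng.random() < 0.5 else 'conspiracy'
             for _ in range(n_agents)]

    item_view = 'science'
    first_sharer = views.index(item_view)
    seen = {first_sharer}
    frontier = [first_sharer]

    for _ in range(n_rounds):
        nxt = []
        for agent in frontier:
            # each active sharer exposes a handful of random peers
            for peer in rng.sample(range(n_agents), fanout):
                if peer in seen:
                    continue
                seen.add(peer)
                # confirmation bias: only aligned agents pass it on
                if views[peer] == item_view:
                    nxt.append(peer)
        frontier = nxt
        if not frontier:
            break

    return len(seen)
```

Under this rule, an item saturates its own like-minded community quickly while cross-view exposure stays incidental, which is one simple way to picture why the two Facebook groups rarely influenced each other.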

DARPA, a research agency for the U.S. military, is funding the Social Media in Strategic Communication program, which funds dozens of studies looking at everything from subtle linguistic cues in specific posts to how information flows across large networks. DARPA has also sponsored a challenge to design bots that can sniff out misinformation deliberately planted on Twitter.

Ultimately the aim of this research is to find ways to identify misinformation and effectively counter it, reducing the ability of groups like ISIS to manipulate events. Jonathan Russell, head of policy at counter-terrorism think tank Quilliam in London, says, “They have managed to digitize propaganda in a way that is completely understanding of social media and how it’s used.” Russell says that a lack of other voices also gives the impression that they are winning; there’s no other effective media coming out of Iraq and Syria. Quilliam has attempted to counter such narratives with videos like “Not Another Brother,” which depicts a jihadist recruit in desperate circumstances. It aims to show how easily people can be seduced by exposure to a narrow view of the world.

This research is key. Information warfare will constitute an increasingly large share of conflict relative to kinetic effects.

Panagiotis Metaxas of Wellesley College believes that we have entered a new era in which the definition of literacy needs to be updated. “In the past to be literate you needed to know reading and writing. Today, these two are not enough. Information reaches us from a vast number of sources. We need to learn what to read, as well as how.”

Why DARPA is Studying Stories

October 3, 2015

Why DARPA is studying stories is the title of another section in Humans Are Underrated: What High Achievers Know That Brilliant Machines Never Will by Geoff Colvin. DARPA stands for the Defense Advanced Research Projects Agency. At times it has been called ARPA, by simply dropping the D. But regardless of the acronym, it has been sponsoring advanced research. The internet was developed from research sponsored by DARPA, as was GPS.

The U.S. defense establishment is so convinced that stories are at the foundation of today’s security environment that it has established a DARPA program called Narrative Networks. The program asks, “Why are some narrative themes successful at building support for terrorism?” The Narrative Networks program aims to understand how these stories contribute to radicalization, violent social mobilization, insurgency, and terrorism among populations.

Given that we can now destroy civilization several times over with nuclear weapons, it appears that we have already achieved the maximum in kinetic effects. But now our security is jeopardized by narratives. We need to know how to counter and neutralize these narratives.

A tremendous resource for conducting research on this problem has been overlooked: the large population of terrorists imprisoned in Guantanamo. This might be an overstatement, as we cannot confidently say that everyone imprisoned there is a terrorist; many have been languishing in prison without being tried. Some might even die having been falsely charged.

This population should have been used to develop and test different narratives with respect to their effectiveness. If certain narratives appeared to be effective for certain inmates, then the ultimate test would have been to release them. True, this is risky, but what right do we have to keep people imprisoned indefinitely without trial? If we saw that certain narratives were effective, then perhaps a more general campaign could be developed. This would be an effective war on terrorism, which is what we want. The term “War on Terror” is nonsensical. Terror is a tactic of warfare. It is analogous to saying war on tactical dogfights, or war on amphibious warfare.

© Douglas Griffith, 2015. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith with appropriate and specific direction to the original content.

Can Social Networking Make It Easier to Solve Real-World Problems?

September 23, 2012

An article in The Economist1 raised this question. According to the article, in 2011 Facebook analysed 72 million users of its social networking site and found that an average of 4.7 hops could link any two of them via mutual friends. This is even less than the Six Degrees of Separation popularized by John Guare in his play of the same name.

In the United States the Defense Advanced Research Projects Agency (DARPA) staged the Red Balloon Challenge in 2009. It was trying to determine how quickly and efficiently information could be gathered using social media. Competitors raced to find ten red weather balloons that had been tethered at random locations throughout the United States for a $40,000 prize. The winning team, from MIT, found all ten balloons in nine hours using the following incentive-based system to encourage participation. The first person to send the correct coordinates of a balloon received $2,000. Whoever recruited that person received $1,000, the recruiter’s recruiter received $500, and so on.

DARPA staged a new challenge this year, the Tag Challenge. This time the goal was to locate and photograph five people, each wearing a unique T-shirt, in five named cities across two continents. All five had to be identified within 12 hours from nothing more than a mugshot. The prize fund was $5,000. This time none of the teams managed to find all five targets. However, one team with members from MIT, the universities of Edinburgh and Southampton, and the University of California at San Diego did manage to find three, one in each of the following cities: New York, Washington DC, and Bratislava. This team had a website and a mobile app to make it easier to report findings and to recruit people. Each finder was offered $500 and whoever recruited the finder $100, so even someone who did not know anyone in one of the target cities had an incentive to recruit someone who did. The team promoted itself on Facebook and Twitter. Nevertheless, most participants just used conventional email. It was conjectured that in the future smartphones might have an app that can query people all over the world, who can then steer the query towards people with the right information.

To return to the title of this post, Can Social Networking Make It Easier to Solve Real-World Problems, I would conclude that if the problem involves finding someone or something, the answer is yes. But I think that real-world problems typically involve collaboration among diverse people. In this respect one might argue that social media are actually a detriment to solving real-world problems. Social media are good at bringing people of like minds together about something. If what is needed is collaboration among people of diverse opinions, this would not seem productive, and might very likely be counterproductive.

However, there still might be solutions using technology. Wikis provide a useful tool for collaboration. Another approach would be to have people of relevant but diverse perspectives interact with each other anonymously using computers. Physical cues and identities would be absent. This would negate or minimize ego or group involvement and allow an exchange of information and ideas with the goal of arriving at a viable consensus. The number of people who can collaborate at a given time appears to be a constraint.

1. “Six Degrees of Mobilization,” The Economist Technology Quarterly, September 2012, p. 8.

© Douglas Griffith, 2012. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith with appropriate and specific direction to the original content.