Deception & Misdirection

Anti-deception and collective intelligence: Dan Rather, George W. Bush, and how the Internet makes deception more difficult (part 2)


Continuing our series on deception in politics and public policy.


By Dr. Steven J. Allen (JD, PhD)

In last week’s installment, we looked at the effort to use “crowdsourcing” to catch the Boston Marathon bombers, at the system called “reCAPTCHA” that uses the work of millions of people to decipher words from old books and newspapers, and at the collective effort that caught CBS anchor Dan Rather when he tried to use fake National Guard documents to defeat President George W. Bush in the 2004 election. We cited the Rather case as an example of using the Internet to expose deception.

This week, let’s look at the origins of the Internet and how it relates to the concept of collective intelligence—what columnist and author James Surowiecki calls “the wisdom of crowds.”

The origin of the Internet and the blogosphere

Some historians trace the theoretical origin of the Internet to “As We May Think,” a July 1945 essay by Vannevar Bush, a science advisor to FDR, in which Bush proposed the creation of an easily accessible storehouse of knowledge with “trails” linking topics. Twelve years later, after the Soviets appeared to take the lead in the space race with Sputnik, the Advanced Research Projects Agency was created to restore American preeminence in technology. ARPA, part of the Defense Department, lost control of the space program to the civilian NASA and floundered for a time as it sought a new direction. One project that didn’t seem very important at the time was to link university computers in order to reduce the time the expensive machines spent idle. At the beginning, the network linked computers at UCLA, Stanford, UC-Santa Barbara, and the University of Utah.

Later, more and more people, mostly scientists, got on the network and began to use it to send messages (the first, apparently, concerning a razor a scientist had left behind after attending a conference). By the early ’70s, electronic mail was the main traffic on the network. Because it took about the same effort to send e-mail to one person as to many, e-mail was increasingly used by large discussion groups such as “SF [science fiction] Lovers.”

On this network, information was disassembled and sent in what scientists called “packets” along a system of multiple paths between any two points, with no central control that could be destroyed or damaged.  A breakdown in one part of the system was simply routed around. At the destination, the information was reassembled. Government officials realized that the network was the perfect solution to the problem of maintaining communications during a catastrophe such as a nuclear war. (Paul Baran of RAND had described such a network in a 1964 paper.) The network thus became a critical part of the (Cold) war effort.
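
The mechanics are easy to illustrate. Here is a minimal Python sketch of the packet idea, assuming a toy message and ignoring real-world details such as routing, checksums, and retransmission:

```python
# Illustrative sketch of packet switching, not the actual ARPANET protocol:
# a message is split into numbered packets, each of which may travel a
# different path and arrive out of order; the receiver reassembles them.
import random

def to_packets(message: str, size: int = 4):
    """Split a message into (sequence_number, chunk) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Rebuild the message by sorting packets on their sequence numbers."""
    return "".join(chunk for _, chunk in sorted(packets))

packets = to_packets("Information wants to be routed.")
random.shuffle(packets)            # simulate packets arriving in any order
assert reassemble(packets) == "Information wants to be routed."
```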

In 1974, Vint Cerf and Bob Kahn outlined a system, the Transmission Control Protocol / Internet Protocol, for allowing computers with different standards and platforms to communicate.  That removed the major barrier to connecting to what became known as the Internet.

Usenet, the system of interest groups or “newsgroups,” was added to the system in 1979. In 1983 came the Domain Name System (its design is generally credited to Paul Mockapetris), which replaced raw addresses, series of numbers such as 123.23.234.45, with names that humans could remember.
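
That division of labor survives today: names for humans, numbers for machines. A one-line Python illustration (the call requires network access, and the printed address is simply whatever the lookup returns at the time):

```python
# Translate a human-readable name into the numeric address machines use.
import socket

print(socket.gethostbyname("example.com"))  # prints an address like 93.184.216.34
```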

The last major piece of the Internet was created in 1990-92 by Tim Berners-Lee, a scientist at the European Particle Physics Laboratory (CERN), and his associates.  Berners-Lee and company wanted to use the Internet to share physics papers. They took advantage of the concept of hypertext—electronic text that need not be read in any particular order, in which notes are embedded in the text so that the reader can get more information on a given word or phrase by simply “clicking” on it.  The scientists also wanted to be able to take advantage of the Internet’s increasing speed by mixing pictures with the medium’s traditional text.

The result of their effort was the World Wide Web. Layered on top of the existing parts of the Internet such as e-mail and Usenet, the Web made the medium accessible to hundreds of millions of people who would never write a line of computer code or even of Hypertext Markup Language (HTML), the language of the Web. The National Center for Supercomputing Applications developed a “browser” (a program for reading the Web) called NCSA Mosaic, and some of Mosaic’s developers went into business for themselves to create Netscape, maker of the first widely adopted commercial browser.

By the late 1990s, many people used their own Web sites to post their personal diaries and their comments on politics, the arts, and other interests. In the spirit of comity, they linked to the similar Web sites of others, and they began to accept comments from readers elaborating on or contradicting their own points.  These sites evolved into “weblogs” (a term coined in December 1997); the creator of one weblog presented the term as “we blog,” and the sites became “blogs,” the creators “bloggers,” and the worldwide community of blogs the “blogosphere.”

One prominent blogger began a comprehensive list, counting 23 blogs as of the beginning of 1999. Another count that year put the number at 50. By midyear 1999, a number of tools were released for creating blogs, making blogging accessible to people without much expertise. People could use software to create blogs on their own sites or they could use specialized services that hosted blogs and made them easy to create. By 2004, estimates put the number of blogs at 2.4 million to 4.1 million. By February 2011, there were an estimated 156 million public blogs. Today, in the sense that anyone on Facebook is a blogger of sorts, there may be a billion bloggers out there.

Rebecca Blood, in “Weblogs: A History and Perspective” (September 7, 2000), described the typical blog:

[E]ditors present links both to little-known corners of the web and to current news articles they feel are worthy of note. Such links are nearly always accompanied by the editor’s commentary. An editor with some expertise in a field might demonstrate the accuracy or inaccuracy of a highlighted article or certain facts therein; provide additional facts he feels are pertinent to the issue at hand; or simply add an opinion or differing viewpoint from the one in the piece he has linked. . . . Weblog editors sometimes contextualize an article by juxtaposing it with an article on a related subject; each article, considered in the light of the other, may take on additional meaning, or even draw the reader to conclusions contrary to the implicit aim of each. . . . By writing a few lines each day, weblog editors begin to redefine media as a public, participatory endeavor.

Critics of the Internet often ridicule the medium, and blogs in particular, as the province of journalistic wannabes unfit to compete with the likes of The New York Times and CBS News. Early in the National Guard memos controversy, Jonathan Klein, former executive vice president of CBS News (and president of CNN/U.S. from 2004 to 2010), said on Fox News: “You couldn’t have a starker contrast between the multiple layers of checks and balances [at “60 Minutes”] and a guy sitting in his living room in his pajamas writing.”

In fact, the pajamaheddin, as some call themselves, are subject to faster and more effective checks than the so-called MSM (“Mainstream Media”). On a well-trafficked blog, inaccurate information is usually corrected, or at least challenged, quickly. Most newspapers rarely correct themselves except on matters of little importance, such as the spelling of names. Broadcast news media almost never make corrections. Members of the MSM never suggest that readers or viewers check out a competing publication or broadcast to get the other side of the story. Yet blogs on opposing sides of issues as contentious as the Iraq War link to one another and respond to each other’s arguments.

The blog community corrects for political bias, Yale Law Professor Jack Balkin noted, because “bloggers who write about political subjects cannot avoid addressing (and, more importantly, linking to) arguments made by people with different views. The reason is that much of the blogosphere is devoted to criticizing what other people have to say.”

There is nonsense on blogs and on the Internet as well. But to criticize the medium for carrying this crazy accusation or that urban legend is like criticizing the telephone system for carrying gossip – or, as President Clinton did after the Oklahoma City bombing, criticizing Rush Limbaugh and Talk Radio for the rants of some violence-promoting nut with a shortwave transmitter.

Because blogs are archived and the archives are immediately available, their biases and standards are easy to judge.  Does the blog do a good job in selecting news to highlight, in analyzing that news and making predictions about the course of events?  Does it expose the fallacies of opposing arguments and correct the arguments of those with whom it agrees?  Has the blog attracted a community of people who make comments worth reading?

The blog community makes use of the collective intelligence of the Internet to police itself.  Blogs that are the most interesting get the greatest readership, and the greatest number of links from other highly-rated blogs, which leads to even greater readership.  The result is a hierarchical pattern of readership similar to that of Web sites in general. According to Steven Johnson in his book Emergence, “The distribution of Web sites and their audiences appears to follow what is called a power law: the top ten most popular sites are ten times larger than the next hundred most popular sites, which are themselves ten times more popular than the next thousand sites.”
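
A power law is easy to see in miniature. The Python sketch below builds a hypothetical audience distribution; the exponent and the site counts are invented for illustration, and the exact ratio between tiers depends on the exponent chosen:

```python
# Sketch of a power-law audience: readership proportional to 1/rank**a.
# The exponent and totals are illustrative assumptions, not measured data.
a = 2.0
audience = {rank: 1.0 / rank ** a for rank in range(1, 10_001)}

top_10 = sum(audience[r] for r in range(1, 11))
next_100 = sum(audience[r] for r in range(11, 111))
print(f"top 10 sites draw {top_10 / next_100:.1f}x the next 100 combined")
```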

As blogs get more sophisticated, they use other methods to aggregate collective judgment and separate the best material from the dregs.  On some sites such as Slashdot, individual entries are rated; the reader can actually set his browser to read only entries that are rated, say, 3 or higher on a scale of -1 to +5.  In the effect known as “slashdotting,” a site with larger amounts of traffic links to a story on a smaller site, causing a massive upsurge in traffic (and, sometimes, causing the smaller site to crash).
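
The filtering logic itself is trivial, which is part of its appeal. A sketch in the Slashdot style, with invented comments and scores:

```python
# Hypothetical rating-based filtering: entries are scored from -1 to +5,
# and the reader sets a threshold for what gets displayed.
comments = [
    ("Insightful analysis of the bill", 5),
    ("FIRST POST!!!", -1),
    ("Useful link to the source data", 3),
    ("Me too", 0),
]

THRESHOLD = 3  # reader's choice: show only entries rated 3 or higher
for text, score in comments:
    if score >= THRESHOLD:
        print(f"[{score:+d}] {text}")
```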

At the online retailer Amazon.com, readers rate the reviews of books and other items written by other readers, which in turn creates a hierarchy of reviewers (a reader can become a “top 100 reviewer”).

“Event” futures markets, in which people bet on future events, usually work very well. For example, in 2004, TradeSports.com took bets on the presidential election outcome state by state. As of October 26, 2004, President Bush was listed as the likely winner in states with 296 electoral votes; he won with 286. Only Wisconsin, with its ten electoral votes, was called “wrong” by the bettors.

The success of such collective judgments supports the ideas of New Yorker columnist James Surowiecki.

In his book The Wisdom of Crowds, Surowiecki examined the ways in which, under the right conditions, collective judgments are better than individual judgments.  Among his examples:

* Eugenicist Francis Galton, in an effort to prove the stupidity of the average person, studied the results from a fair at which people guessed the weight of meat that would be produced when a certain ox was slaughtered.  Participants in the contest ranged from farmers and butchers to people with no special knowledge of oxen and meat. The average guess: 1,197 pounds. The actual figure: 1,198 pounds. Galton was disappointed.

* When the U.S. submarine Scorpion disappeared in the Atlantic in 1968, leaving a search area 20 miles wide, a naval officer put together a team of submarine experts, salvage men, mathematicians, and others with specialized knowledge. Using an averaging technique, he aggregated their guesses as to the sub’s location. The result was 220 yards from the actual location.

* When the Challenger space shuttle exploded, the stock market quickly targeted Morton Thiokol as the company that made the critical error; its stock fell relative to that of other possibly responsible companies. By the end of the day, Thiokol stock was down 12%, compared to 3% for the others. Countless individuals, acting collectively, had, it seemed, reached the conclusion that investigators would announce six months later.

In case after case cited by Surowiecki, the judgment of the group is better than the judgment of any, or almost any, individual member of the group—again, under the right conditions. Those conditions are that the members of the group are acting independently, so that individuals’ opinions are not affected by peer pressure or the human tendency to go along with the crowd (“groupthink”); that the membership of the group is diverse in background, knowledge, and way of thinking; and that there is a way to aggregate the individual opinions into a coherent whole, which can be as simple as averaging.
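
Those conditions can be demonstrated in a toy simulation. In the Python sketch below, the “crowd” is 800 invented guessers whose errors are independent random noise; under those assumptions, averaging cancels most of the error, just as in Galton’s ox contest:

```python
# Toy simulation of Surowiecki's conditions: many independent, diverse,
# individually noisy guesses, aggregated by simple averaging.
# The noise level and crowd size are assumptions chosen for illustration.
import random

TRUE_WEIGHT = 1198           # the ox's actual dressed weight, in pounds
random.seed(1)

guesses = [TRUE_WEIGHT + random.gauss(0, 150) for _ in range(800)]
crowd = sum(guesses) / len(guesses)

avg_individual_error = sum(abs(g - TRUE_WEIGHT) for g in guesses) / len(guesses)
print(f"crowd estimate: {crowd:.0f} lb (error {abs(crowd - TRUE_WEIGHT):.1f})")
print(f"average individual error: {avg_individual_error:.1f} lb")
```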

As the economist F.A. Hayek pointed out, a free-market system sets prices based on collective judgment – collective judgment that seems to meet Surowiecki’s conditions. Prices measure the relative value of different things, and the Soviets’ inability to make that judgment effectively, due to the lack of a free market, doomed their system.

The search engine Google uses something akin to Surowieckian principles to rank the results of Web searches. A site is ranked largely on the basis of how many Web sites link to that site, with each linking site’s “vote” weighted according to the number of sites that link to it, and so on. Thus, the Google rankings reflect the collective opinion of the World Wide Web regarding the relative value of different sites. (Google, founded in 1998, rose rapidly to the top of the multi-billion-dollar search engine industry. By 2002, “google” was being used as a verb.)
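
The core of that idea fits in a few lines. The following Python sketch is a simplified PageRank-style calculation over an invented four-page web; Google’s production algorithm involves far more than this, but the link-as-weighted-vote principle is the same:

```python
# Simplified PageRank-style ranking (illustrative, not Google's algorithm).
# Each page's score is split among the pages it links to, with a damping
# factor modeling a reader who occasionally jumps to a random page.
DAMPING = 0.85
links = {                      # hypothetical four-page web
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

rank = {page: 1.0 / len(links) for page in links}
for _ in range(50):            # iterate until the scores settle
    new = {page: (1 - DAMPING) / len(links) for page in links}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)
        for target in outlinks:
            new[target] += DAMPING * share
    rank = new

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```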

It is important to note that, for collective judgment to be effective, the membership of the group need not be uniformly of high intelligence. Ants act in a highly intelligent way as they run a colony, because they use highly efficient aggregation systems to answer such questions as “Where is the closest food?” Each ant responds to its local environment and releases pheromones that express its opinion to other ants, which pass that opinion along, adding pheromone signals of their own, and so on.
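
A toy model makes the mechanism visible. In the Python sketch below, with invented parameters, ants choose between two paths in proportion to the pheromone already laid; because the shorter path is reinforced faster, the colony converges on it without any single ant knowing the whole map:

```python
# Toy model of pheromone aggregation (all parameters are invented):
# ants pick a path in proportion to existing pheromone, deposit more in
# inverse proportion to path length, and trails slowly evaporate.
import random

lengths = {"short": 1.0, "long": 2.0}
pheromone = {"short": 1.0, "long": 1.0}

for _ in range(500):
    total = sum(pheromone.values())
    path = "short" if random.random() < pheromone["short"] / total else "long"
    pheromone[path] += 1.0 / lengths[path]   # shorter trip, stronger trail
    for p in pheromone:
        pheromone[p] *= 0.99                 # evaporation

print(pheromone)  # the short path ends up holding most of the pheromone
```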

This principle has also been illustrated by software agents, simple decision-making programs simulated in a computer. In experiments designed by Scott Page of the University of Michigan, the agents’ collective judgment turned out to be better when the agents thought about problems in diverse ways, even if some individual agents performed poorly.

There are other reasons beyond concepts of Artificial Intelligence to believe that a blog network would be a useful tool in intelligence analysis.

Blogs and the Internet have shown a remarkable ability to cross national borders and to provide “ground level” information in places around the world.

During the 1991 coup attempt in the Soviet Union, coup plotters got control of most broadcast facilities and other media, but were seemingly unaware of the ability of computer networks to get around government restrictions. The Internet was used to coordinate demonstrations and report on them, to disseminate alternative media reports and first-hand accounts of coup-related actions, and to persuade the world news media and foreign governments that the coup would not succeed (thereby helping ensure that it did not succeed).

By 2004, according to Daniel W. Drezner and Henry Farrell in Foreign Policy, one Internet service provider alone hosted 60,000 blogs in Iran, and the fourth most common language among bloggers was reported to be Farsi. For months before and after the beginning of the Iraq War, a 29-year-old architect called Salam Pax, “the Baghdad Blogger,” provided information to the world about events on the ground. In the cases of some countries such as Sudan and North Korea, people outside a country’s borders gather information from travelers and present it on their blogs.

John Gilmore, a founder of the Electronic Frontier Foundation, once said that the Internet “interprets censorship as damage and routes around it.”

Today, oppressive governments are testing Gilmore’s proposition, building their own alternative internets in order to limit the amount of information to which their people have access. The Obama administration has shown itself to be no friend of Internet freedom, with actions ranging from the shutdown (to U.S. users) of the predictions market Intrade to the attempt to blame the Benghazi attack on an anti-Islam YouTube video. The United Nations and other anti-freedom international organizations are seeking to establish an international regime for control of the medium and control of the worldwide flow of information.

Will they succeed? Can they succeed? We’ll examine that in a future article.

 

========================

Dr. Steven J. Allen (JD, PhD) is editor of the Capital Research Center publications Green Watch and Labor Watch. In the 1990s, he was editor of the Internet Political Report, the first publication on the relationship between politics and the Internet.
