Deception & Misdirection

Another Pew survey, another fake


[Continuing our series on deception in politics and policy.]

Pew—the Pew Research Center for the People & the Press—has published another poll showing people’s opinions on the Keystone XL pipeline. The results are highly favorable to the construction of the pipeline, which I support. I wish the results were true. But they’re fake.

Here’s how the survey was reported by the Los Angeles Times:

Nearly half of Democrats favor granting a permit for the construction of the controversial Keystone XL pipeline, according to a poll released Wednesday by the Pew Research Center for the People & the Press. . . .

The Pew poll showed that, despite the work of anti-pipeline activists, support for the project has remained solid, especially among Republicans and independents. Backers of the pipeline have argued that it would create jobs and secure more oil from a friendly, democratic country.

Overall, 61% of respondents favor building the pipeline, while 27% are opposed, a proportion that has held steady for the last year or so, according to Pew. About 49% of Democrats back the pipeline and 38% oppose it. The remaining 13% said they did not know.

The poll was conducted from Feb. 27 to March 16 among 3,335 adults.

So, according to Pew, 88% of respondents are either in favor of the pipeline or opposed. That’s an incredible result, truly unbelievable—and I mean that literally.

Here’s why: No one in his or her right mind should believe that 88% of the people surveyed have ever heard of the Keystone XL pipeline. That means the Pew survey responses were reactions to the wording of the question, not judgments about whether the pipeline should be built.

It’s like that recurring bit on the Jimmy Kimmel late-night show, in which Kimmel staffers conduct man-on-the-street interviews with questions like “What was your reaction when President Obama pardoned the sequester?” …and people respond as if the questions make sense. Often, the participants elaborate on their answers, providing additional information such as what they were doing when they watched the broadcast of the President’s announcement of his pardon of the sequester.

In the Pew survey, the wording of the question was “Do you favor or oppose building the Keystone XL pipeline that would transport oil from Canada’s oil sands region through the Midwest to refineries in Texas?” I’ll give the Pew researchers the benefit of the doubt and assume that they put a lot of effort into making the wording as fair as possible. But such a result—a vast majority of people expressing an opinion on a complex issue that doesn’t affect them directly—is inherently implausible. People who have never heard of the Keystone XL pipeline do not have an opinion on the Keystone XL pipeline, and no clever wording of the question can get around that problem.

Assuming that the Pew survey was conducted in the manner its sponsors claimed, and that it followed standard procedures for scientifically conducted polls, it is nevertheless fatally flawed, because the result makes no sense.

How do I know that fewer than 88%—probably a lot fewer than 88%—have heard of the pipeline? Because, based on more than 40 years’ experience analyzing public opinion polls, I know that most people simply don’t have opinions on issues that don’t affect them directly and that are more complex than questions like “For whom will you vote, Obama or Romney?” or “Have you had a favorable or unfavorable experience with recent changes to healthcare laws?” I don’t mean this as an insult to the average American. Most people are reasonably smart, but they have real lives, and they don’t spend a lot of time reading up on complex matters of public policy. To their detriment, they rely on political leaders to make public policy and on journalists to provide analysis of it.

I first noticed this problem with polls back in the 1980s, when 80% of respondents in one poll had an opinion on the proposed Nuclear Freeze but the number of people who had any idea what the Nuclear Freeze actually was came close to zero. The same thing happened in the 1990s with the chemical weapons treaty then under consideration: polls “showed” that most ordinary people had an opinion on the issue at a time when most Capitol Hill staffers didn’t.

Consider the following:

►At any given point, fewer than 70% (probably fewer than 60%) of U.S. adults can name the Vice President.

►In 2011, Newsweek gave a group of American citizens a version of the test given to immigrants seeking citizenship. Only 62% passed. The magazine reported that “Seventy-three percent couldn’t correctly say why we fought the Cold War. Forty-four percent were unable to define the Bill of Rights. And 6 percent couldn’t even circle Independence Day on a calendar.” The Newsweek story referenced a 2009 study published in the European Journal of Communication in which only 58 percent of Americans managed to identify the Taliban, against which the U.S. had been fighting a war for more than seven years.

►A 2013 Reuters/Ipsos poll had 27% of respondents selecting the correct definition of “quantitative easing” from a list of five answers. That’s barely better than the 20% who would have picked the correct answer at random. (If as few as eight or nine percent actually knew the answer and the rest guessed blindly, you’d get around 27% answering correctly.) No wonder that, when politicians like Sarah Palin talk about the issue, it flies right over the heads of most voters.

►Left-wing columnist Rosa Brooks wrote in 2006:

Last spring, one survey found that although 52% of Americans could name two or more of the characters from “The Simpsons,” only 28% could identify two of the freedoms protected under the 1st Amendment. Another recent poll found that 77% of Americans could name at least two of the Seven Dwarfs from “Snow White,” but only 24% could name two or more Supreme Court justices.

In September, the Annenberg Public Policy Center released a poll showing that only two-thirds of Americans could identify all three branches of government; only 55% of Americans were aware that the Supreme Court can declare an act of Congress unconstitutional; and 35% thought that it was the intention of the founding fathers to give the president “the final say” over Congress and the judiciary.

►In a 2008 issue of the Chronicle of Higher Education, Professor Ted Gup of Case Western Reserve University wrote about his experience with some of his students: “Nearly half of a recent class could not name a single country that bordered Israel. In an introductory journalism class, 11 of 18 students could not name what country Kabul was in, although we have been at war there for half a decade. Last fall only one in 21 students could name the U.S. secretary of defense. Given a list of four countries—China, Cuba, India, and Japan—not one of those same 21 students could identify India and Japan as democracies. Their grasp of history was little better. The question of when the Civil War was fought invited an array of responses – half a dozen were off by a decade or more. Some students thought that Islam was the principal religion of South America, that Roe v. Wade was about slavery, that 50 justices sit on the U.S. Supreme Court, that the atom bomb was dropped on Hiroshima in 1975.”

To make this point even clearer, consider a Pew survey released last September that asked people whether the amount of energy being produced in the U.S. has been increasing (the correct answer, chosen by 48%), staying the same (31%), or decreasing (12%). The commentary accompanying the poll suggested that “less than half” of people knew the correct answer. But even that overstates the number who knew the answer, because, given a choice of three responses, one-third of those answering purely by chance would have picked the correct one. If, say, 22% of respondents knew the correct answer and the other 78% guessed blindly, the share getting it right would be 22 + (78/3) = 48%. In other words, a 48% correct response is consistent with only 22% actually knowing the answer. (I’m simplifying things a little, but you get the point.)
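The same back-of-the-envelope correction applies to the quantitative-easing item above. Here is a minimal sketch of that arithmetic in Python; the function name is mine, and it uses the same simplified model acknowledged above—every respondent either knows the answer or guesses uniformly at random among the listed options:

```python
def true_knowledge_rate(observed_correct, num_options):
    """Estimate the share of respondents who actually knew the answer,
    assuming everyone else guessed uniformly at random among the options.

    Model: observed = know + (1 - know) / num_options, solved for `know`.
    """
    chance = 1.0 / num_options
    return (observed_correct - chance) / (1.0 - chance)

# Pew energy question: 48% correct among three options -> about 22% really knew.
print(f"{true_knowledge_rate(0.48, 3):.1%}")   # ~22.0%
# Reuters/Ipsos "quantitative easing": 27% correct among five options -> about 8.8%.
print(f"{true_knowledge_rate(0.27, 5):.1%}")   # ~8.8%
```

Under that assumption, Pew’s 48% on a three-option question and Reuters/Ipsos’s 27% on a five-option question both shrink to far smaller shares of people who genuinely knew the answer.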

The fake Keystone XL poll isn’t the first time Pew has done something like this. In 2009, Pew surveyed members of a left-wing activist group, the American Association for the Advancement of Science, and reported the results as representative of “scientists” (including such nonsense as the idea that only 6% of scientists are Republicans and that the ratio of liberals-to-conservatives among scientists is 11 times the ratio among the general population).

I don’t mean to pick on Pew. The vast majority of public opinion polls that are widely reported are misleading or outright fraudulent. Private polls, for which people pay lots of money, tend to be reasonably accurate. Public polls are sponsored by left-wing academics, interest groups, and news organizations, and reflect the biases of those people and organizations and their lack of understanding of how polls work. Let the reader beware.

Dr. Steven J. Allen

A journalist with 45 years’ experience, Dr. Allen served as press secretary to U.S. Senator Jeremiah Denton and as senior researcher for Newt Gingrich’s presidential campaign. He earned a master’s…