Sacrificing Scientific Skepticism: (Re)Discovering Disproof
What This Science Writer Learned About Climate Change
Summary: A veteran science journalist looks back on 15-plus years of reporting on science and climate issues to offer key lessons—and warnings—about climatology. Journalists and scientists should remain the most skeptical of professionals, particularly in areas like climatology that we don’t understand well.
The Possibility of a “Little Ice Age”
For over a decade, solar scientists have been observing an unusual change in the sun’s activity. They know it’s unusual because the data record goes back over 400 years, beginning with Galileo’s early telescopic observations in the early 1600s. The record shows that between 1645 and 1715, very few sunspots appeared; in fact, for long stretches the sun’s face was blank. Solar scientists call it the Maunder Minimum. During approximately that same period, Europe experienced an era known as the Little Ice Age. Rivers such as the Thames in England froze solid for months at a time, though they rarely froze so completely before or since. And Europe suffered short summers coupled with long, very cold winters.
Why is this potentially important? Two reasons.
First, many climate scientists have asserted that the buildup of CO2 and other greenhouse gases in the atmosphere exerts a much stronger effect on global temperatures than any changes in solar activity.
Second, the aforementioned solar behavior is mirroring what happened before the Maunder Minimum. After nearly three decades of studying the solar magnetic field (sunspots are actually explosive outcroppings of the sun’s magnetism), scientists believe that perhaps within two decades Europe will see another Little Ice Age.
In some ways, we could consider the onset of a “little” ice age, if it develops and lasts maybe a half-century, to be fortunate. Yes, such a temperature shift would no doubt cause difficulty for many people and nations. But the change would also produce two significant benefits. One, it would prove or disprove assertions about the degree of the sun’s direct influence on global temperatures—if sunspots disappear and Earth cools, then there’s no question solar activity is the preeminent climate driver. But if a Maunder Minimum recurs without a corresponding and widespread cooling, then the extra atmospheric CO2 has been sufficient to overcome a diminished sun.
Two, if a new Little Ice Age arrives, we should immediately abandon any thoughts of trying to cool the planet, because it would become obvious that doing so would be a terrible mistake. Instead, scientists and politicians could set their sights on either mitigating or preventing the next ice age, and, in the process, head off what surely would become a global catastrophe for humanity.
That’s yet another item on the list of large unknowns, climate-wise. Setting aside the apparent disagreement between the computer models and the temperature data, assume that global warming is happening and that human CO2 emissions are the primary cause. Could global warming finally disrupt the ice-age cycle, even though at least three supervolcanic eruptions apparently could not? And, if so, should we then regard the presumed driver of global warming—the burning of fossil fuels—as the savior of civilization?
It would be foolish, and potentially dangerous, to draw any such conclusions at this point. Some of the proxy data suggest that during the last interglacial period, about 120,000 years ago, the average temperature on Earth was 2 degrees Celsius higher than it is now. If true, somehow the planet achieved that amount of warmth without the aid of an industrial civilization.
So, another set of questions:
What source warmed Earth that much?
If an extra 2 degrees of warming could not head off the last ice age, how much more warmth does the planet need to escape the next one?
While we await the outcome of this latest stage in solar activity, and given the dichotomy between the climate-model projections and actual temperature data, should we accept that curbing CO2 output, via strict emissions rules and high taxes, is our best course? Should the governments of the world base their policy decisions on a scientific consensus? And should scientists whose research contradicts the consensus be disdained and ridiculed?
Science is About Disproof
Nearly a decade ago, I attended a weeklong fellowship for science journalists hosted by one of the country’s premier climate-science organizations. I attended daily workshops presenting the latest findings about the topic—including the assertion that CO2 buildup would offset any reduction in solar activity. I met with many of the scientists charged with collecting and analyzing the data and with creating and refining the models. I eagerly participated in tours of the facility and its sophisticated instruments and aircraft.
At the end of the fellowship, our hosts invited us to a panel discussion with a dozen or so of the scientists and instructors, the purpose being to review what we had learned and to comment and ask questions. When my turn came, I raised a point that had been troubling me the entire time of my visit. I told those gathered that it astounded me how casually and derisively the term “skeptic” had been used by many of the instructors and other speakers, as though it would be foolish to doubt anything presented that past week.
To vaguely hostile stares and even stunned silence, I asked those present if skepticism shouldn’t be the primary factor in all scientific endeavors. Shouldn’t scientists—and journalists—always be the most skeptical of professionals, even if, as Carl Sagan once said, “Skepticism does not sell well”?
I probably soured much of the good will I had generated during that week. But my point was, and is, valid. Science is not primarily about proof; science is about disproof. Nothing in science, absolutely nothing, should ever be taken at face value. This view isn’t new; it’s age old. Certainly, I’ve encountered it time and again during my science-writing years.
I was once privileged to cover the discovery of dark energy, a mysterious force that is causing the expansion of our 13.8-billion-year-old universe to accelerate and is thought to make up nearly seven-tenths of the total energy content of the cosmos.
The astronomers who discovered dark energy did so by using what at the time was called the “standard candle.” It’s the light emitted by a specific class of supernova called a Type Ia—an exploding star. This light brightens and dims with such precision that astronomers can deduce the explosion’s true brightness, and from that its distance; combined with the redshift of its light, and interpreted through Einstein’s general theory of relativity, those measurements reveal how fast even an extremely distant supernova is receding from Earth. Based on those measurements, dark energy’s discoverers hypothesized that the universe will eventually expand so fast that many billions of years from now all of the galaxies will grow so far apart they will no longer be visible to one another.
Except that subsequent research showed the light from Type Ia supernovae sometimes behaves not so precisely. Therefore, basing the concept of dark energy on such a yardstick is fundamentally flawed.
Eventually, other researchers discovered a different phenomenon, unrelated to supernovae, that seems to reconfirm dark energy. But no cosmologist is ready to declare dark energy and its effects indisputable facts. For that matter, some scientists still argue about whether the Big Bang—the explosive birth of our universe—actually happened. Such claims require rigorous examination and constant testing, just as Einstein’s general theory of relativity, now over a century old, remains solid because it has been continually tested.
Dark energy represents one example of the normal process of discovery within science. I’ll give you another, which for me remains the cautionary tale about the dangers of relying on consensus and of disdaining skepticism, in the final installment.
In the final installment of “Sacrificing Scientific Skepticism,” we examine the importance of independent science, dispassionate review, and avoiding consensus on global warming – the future depends on it.