We are a commune of inquiring, skeptical, politically centrist, capitalist, anglophile, traditionalist New England Yankee humans, humanoids, and animals with many interests beyond and above politics. Each of us has had a high-school education (or GED), but all had ADD so didn't pay attention very well, especially the dogs. Each one of us does "try my best to be just like I am," and none of us enjoys working for others, including for Maggie, from whom we receive neither a nickel nor a dime. Freedom from nags, cranks, government, do-gooders, control-freaks and idiots is all that we ask for.
Readers know that I am a collector of formal fallacies. I keep them on the mantel, well-dusted and polished. "Saving the Hypothesis" doesn't strike me as a formal, Aristotelian fallacy, but it surely is a common thing for folks to struggle to salvage a notion in which they are emotionally invested, regardless of new data. We all do that sometimes unless we catch ourselves BSing ourselves.
Larry Anderson at American Thinker proposed this fallacy in relation to the Global Warming/Climate Change/whatever topic. One quote:
I distinctly remember the first time I had an argument with someone about "climate change." It was a warm spring day in April of 1975. I was walking across the campus at Harvard headed for lunch. A fellow classmate (we were both juniors) ran up to me. He was really excited. He hollered out:
"Have you heard? The ice age is coming. They've proven it with computer studies at MIT."
"How did they prove that there is an ice age coming with computers?" I asked my friend.
"The planet is getting colder. They have the data. And it is going to keep getting colder. They have these computer models --"
I stopped him right there. I knew enough about computers to understand that they were not up to accurately predicting short-term weather patterns -- let alone an ice age.
"No way. Computers aren't that powerful."
"But they have the data and they have computers!"
"I know they do. But computers spit out whatever they are programmed to spit out. Load a computer with the data that the world has been getting colder; ask it what the weather will be tomorrow, and what do you think the computer is going to tell you? The world is getting hotter? If it does you'd better get a better computer. You hungry?" I replied and I continued on my way to lunch.
The original "climate change" hypothesis was that the planet was getting colder and that it would continue getting colder. That was a very simple hypothesis and was easily proven or refuted. Planet keeps getting colder = hypothesis correct. Planet gets warmer = hypothesis incorrect.
The world got warmer instead of colder. The "climate change" hypothesis was rewritten. This time the planet was facing a catastrophic meltdown. The world was not only getting warmer -- it was going to keep getting warmer at an ever increasing and life-threatening rate.
Somebody wrote long ago, dismissing some expert nonsense or other, "we know this by linear extrapolation, the same process by which we predict the price of tobacco in the year 2000..." Or, to update it, the stock price of the New York Times in the year 2012.
I'm reminded of Linderholm's Mathematics Made Difficult, with its chapter "Guess the Next Number," which recommends Lagrange interpolation for every such puzzle, on the grounds that it's more general and universally applicable, against the received notion that these puzzles are a good test of intelligence. Thus the next number for 1, 2, 4, 8, 16, _ is not 32 but 31.
Computers today could do it.
It ought to work for global warming. And it can't be defeated. Just add another term next year.
The only thing you can't predict about it is what it will predict.
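The Linderholm trick above is easy to check for yourself. Here is a minimal sketch (my own illustration, not from the original) of Lagrange interpolation: fit the unique degree-4 polynomial through the five points (1,1), (2,2), (3,4), (4,8), (5,16) and ask it for the "next number." It dutifully answers 31, not 32 -- and extrapolating one step further, it wanders off even faster.

```python
from fractions import Fraction

def lagrange(points, x):
    """Evaluate, at x, the unique polynomial passing through
    the given (xi, yi) points, via the Lagrange formula."""
    total = Fraction(0)
    for i, (xi, yi) in enumerate(points):
        term = Fraction(yi)
        for j, (xj, _) in enumerate(points):
            if j != i:
                # Basis polynomial: 1 at xi, 0 at every other xj
                term *= Fraction(x - xj, xi - xj)
        total += term
    return total

pts = [(1, 1), (2, 2), (3, 4), (4, 8), (5, 16)]
print(lagrange(pts, 6))  # 31 -- not the "obvious" 32
print(lagrange(pts, 7))  # 57 -- the fit diverges from 64 ever faster
```

Exact fractions are used so no floating-point fuzz muddies the joke: the interpolating polynomial really does pass through every data point perfectly, and is still dead wrong one step outside them.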
The climate modelers don't want to interpolate. They want to extrapolate. One of my math professors used to warn us students that the only thing riskier than extrapolation is foretelling the future.
The climate models are a joke. Those people are implicitly claiming they know how to write a computer program to foretell the future. Before you can program the computer to foretell the future, you have to know how to foretell the future so you can write the instructions for the computer. Remember, the computer is nothing but a fancy adding machine.
Dumb ideas are no real problem -- they last until familiarity chases them away. What's bothersome about current events is that a long list of scientists signed that UN report and to this day swear by it, sneering off disagreement as corrupt and/or stupid, even though the basic data acquisition has been shown to be faulty (so many of the temperature monitors badly placed).
You can slice & dice it any way you want, but in the end, it is a deliberate -- planned -- conspiracy.