First off, this post is going to be fairly involved. It deals with computer modeling, namely GCMs (Global Climate Models), and how they can get simple things wrong through over-use of "fudge factors."
Fudge factors, despite their name, are commonly used in more complex models. The reason is that some variable influences the behavior of the system in known ways but cannot be measured or modeled directly, and the practical way to handle it is to create a fudge factor that stands in for that variable's behavior; for all intents and purposes, one fudge factor could be standing in for ten or more variables. The point is that you want the model to match reality, and doing this is often what that takes.

But the inherent danger in using fudge factors is that, in a chaotic system such as climate, you can also end up hiding the behavior of your known variables inside these stand-in variables. So using too many fudge factors will inherently lead to problems down the road. I realize this is complicated, but I have tried to simplify it from Wikipedia and other sources to give some background on modeling the climate.
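To make that concrete, here is a minimal toy sketch in Python (entirely made-up numbers, nothing to do with a real climate model) of what a fudge factor does: several hidden influences get lumped into a single tuned constant chosen so the model matches observations over the calibration period.

```python
import numpy as np

# Toy example, made-up numbers: the "true" system is a known forcing plus
# three hidden influences we cannot measure separately.
rng = np.random.default_rng(0)
t = np.arange(50.0)
hidden = 0.30 * np.sin(t / 7.0) + 0.15 * np.cos(t / 3.0) + 0.05 * rng.standard_normal(50)
observed = 0.02 * t + hidden  # known forcing (0.02 * t) plus the hidden stuff

# The modeler cannot separate the hidden terms, so they get lumped into one
# tuned constant k -- the "fudge factor" -- chosen so the model best matches
# the observations over the calibration period.
known_forcing = 0.02 * t
k = float(np.mean(observed - known_forcing))  # simplest possible tuning
model = known_forcing + k

rmse = float(np.sqrt(np.mean((model - observed) ** 2)))
print(f"tuned fudge factor k = {k:+.3f}")
print(f"calibration-period fit error (RMSE) = {rmse:.3f}")
```

The fit over the calibration period looks respectable, and that is exactly the trap: the single constant k is quietly standing in for three separate behaviors.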
Modeling the climate is not easy. That being said, simple mistakes can be seen in the output of GCMs.
Since I want to keep this post simple, I won't go into detail on GCMs at this point in time, but for further reading, Wikipedia gives a much more complicated and harder-to-read description, so if you want to know more, here you go:
http://en.wikipedia.org/wiki/Global_climate_model
But let's analyze trends in GCM output and some of the mistakes that pop out. I want to remind everyone at this point why these mistakes occur: the over-use of fudge factors causes a model of a system to behave in strange ways, especially when you combine fudge factors with extrapolation. And any model that predicts future trends is, by definition, extrapolating into the future.
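Here is a small toy illustration in Python of why extrapolation is risky on its own, before fudge factors even enter the picture: a straight-line fit to a system that actually levels off looks decent in-sample, but overshoots badly once you project past the data.

```python
import numpy as np

# Toy illustration: the real system saturates (levels off near 1.0), but a
# straight-line fit to the early data has no way of knowing that.
t = np.arange(0, 30)
truth = 1.0 - np.exp(-t / 20.0)  # real behavior: approaches 1.0 and stays there

slope, intercept = np.polyfit(t, truth, 1)  # in-sample, the fit looks decent
t_future = np.arange(30, 80)
projection = slope * t_future + intercept   # ...but it just keeps climbing

print(f"true value at t=79:         {1.0 - np.exp(-79 / 20.0):.2f}")  # about 0.98
print(f"extrapolated value at t=79: {projection[-1]:.2f}")            # roughly double that
```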
With that in mind, we can analyze the trends seen in models. I am going to concentrate on just the super El Niño artifact for the rest of this post.
Here is some preliminary information on it:
http://motls.blogspot.com/2009/02/hansens-colossal-failures-super-el-nino.html
(This link provides a little background from a sceptic perspective.)
Some news about it and predictions of the future so to speak:
http://stevengoddard.wordpress.com/2011/07/12/hansen-on-the-super-el-nino/
http://ff.org/centers/csspp/library/co2weekly/20060414_02/20060414_05.html
With that, we can see that a super El Niño was predicted back in 2006-2007 for the immediate future. As you can see from some of my links, lots of sceptics dismissed this as just "fear-mongering," but I think the scientists genuinely believed the super El Niño was nothing more than what their model output led them to expect.
Why would the models output conditions on a global scale similar to a super El Niño? That is the main question I am going to answer for the rest of this post. But first, a brief explanation of ENSO, the cycle the El Niño effect is part of:
http://www.esrl.noaa.gov/psd/enso/
This is probably the best site for an overview, along with the latest ENSO figures. To explain ENSO briefly: it is a measure of basic fluctuations in ocean temperatures across a narrow stretch of the Pacific Ocean. These fluctuations, which are driven for the most part by wind, play a crucial role in our weather. ENSO effects cause droughts, floods, more frequent or less frequent (or stronger) hurricanes, all the way down to local temperature differences. In essence, ENSO is largely the driver of our weather patterns.
El Niño, for simplicity's sake, is simply the "hot" part of the ENSO cycle, which swings between La Niña (cold), normal conditions (no name), and, as I said, El Niño. A super El Niño as predicted would, for all intents and purposes, mimic permanent El Niño conditions pretty much world-wide. The droughts, floods, and changes in climate, if this came to pass, would be severe. But as I will explain, these predictions have not occurred yet, and odds are they never will, because they are nothing but a computer artifact.
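For a rough sense of what those phases mean in numbers, here is a minimal sketch of how the phases are commonly classified, using the ±0.5 °C thresholds that NOAA's Oceanic Niño Index applies to sea-surface-temperature anomalies in the Niño 3.4 region (the official classification also requires the anomaly to persist across several overlapping three-month seasons):

```python
def enso_phase(nino34_anomaly_c: float) -> str:
    """Classify an ENSO phase from a Niño 3.4 sea-surface-temperature
    anomaly in degrees C, using the usual +/-0.5 C thresholds from
    NOAA's Oceanic Niño Index. The official classification also
    requires the anomaly to persist for several overlapping seasons."""
    if nino34_anomaly_c >= 0.5:
        return "El Niño (warm phase)"
    if nino34_anomaly_c <= -0.5:
        return "La Niña (cold phase)"
    return "neutral"

for anomaly in (-1.2, -0.3, 0.1, 0.8, 2.3):
    print(f"{anomaly:+.1f} C -> {enso_phase(anomaly)}")
```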
When we look at trends and at the predictions that models make, we look at why the computer predicts those particular trends. In the case of ENSO heading towards permanent or super El Niño status, the cause is a trend of less frequent and weaker La Niñas and stronger, longer-lasting El Niños that occurred mostly between 1975 and 2010.

We have only recently scratched the surface of why ENSO leaned so strongly towards El Niño during that period. It comes down to a roughly 60-year ocean cycle in the Pacific called the PDO (Pacific Decadal Oscillation) that alternates between a warm and a cold phase. The main effect seen in a "cold" Pacific is longer, more severe La Niñas and shorter, tamer El Niños.
A warm Pacific is exactly the opposite.
For further reading on the PDO, see here:
http://jisao.washington.edu/pdo/
But let me explain the trend here. The models are attempting to predict the future from, first, a period of a cold PDO that showed dominant La Niña, followed by the period from 1975 to roughly 2010 that showed dominant El Niño. When you combine those two trends, the computer is just doing its job: it concludes that El Niño will stay dominant, and in fact grow more dominant, until ENSO is stuck in El Niño mode forever, getting worse and worse as time goes by, until someday La Niña is nothing more than a long-gone memory. So instead of reaching equilibrium as expected, the climate system heads down a catastrophic path that puts all life on this planet in jeopardy due to extreme changes.
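Here is a toy Python sketch of that exact failure mode (a made-up index, not real model output): hand a trend-fitter one cold half and one warm half of a 60-year cycle, and it reads the cycle as a one-way trend toward permanent El Niño, while the real cycle simply turns back down.

```python
import numpy as np

# Made-up index, not real data: pretend ENSO dominance follows a 60-year
# cycle -- negative means La Niña dominant, positive means El Niño dominant --
# with a cold half from roughly 1945 to 1975 and a warm half from 1975 to 2005.
years = np.arange(1945, 2011)
cycle = np.sin(2 * np.pi * (years - 1975) / 60.0)

# A model calibrated only on 1945-2010 sees "La Niña dominant, then El Niño
# dominant" and, knowing nothing about the 60-year cycle, reads that as a
# one-way trend.
slope, intercept = np.polyfit(years, cycle, 1)

future = np.arange(2011, 2071)
extrapolated = slope * future + intercept            # climbs forever
actual = np.sin(2 * np.pi * (future - 1975) / 60.0)  # turns back down

print(f"fitted trend: {slope:+.4f} per year, always toward El Niño")
print(f"extrapolated index in 2070: {extrapolated[-1]:+.2f}")
print(f"cyclical index in 2070:     {actual[-1]:+.2f}")
```

Nothing in that fit is numerically wrong; the model was simply never told the cycle exists, so extending the trend is all it can do.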
Obviously, that scenario just sounds silly. But that is the effect of the artifact. The simplest explanation is that the scientists do not understand this: because they do not understand the 60-year ocean cycle and how it can plant an artifact in their GCMs, they present the mistake as Gospel Truth and never admit they are wrong, or tell people that the ocean cycles caused this modeling mistake, since they did not take a cold or warm Pacific into account when they wrote their code.
The artifact is not just seen in the blatant example I have shown here, but in many other GCMs as minor quirks: local trends tend to match El Niño conditions in most cases, and the individual differences between GCMs come down to how strong they think the El Niño is going to be, based on, as I said previously, fudge factors.
But that is beside the point. If a computer model (GCM) can have such a simple artifact in it and not be noticed by the person most people consider the smartest climate modeler out there (Dr. Hansen), how can we be sure any of them are free of this bug/artifact to some extent? That is the question, obviously, and I hope this illustration was useful in understanding one simple way that computer models can go wrong in predicting the future. One small error in one fudge factor, and you get wildly unpredictable results.
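To show how little it takes, here is a toy demonstration using the logistic map, a textbook chaotic system that stands in for a climate model only in spirit: two runs whose tuned parameter differs by one part in a million end up giving completely different answers.

```python
# Toy demonstration (logistic map, not a climate model): in a chaotic
# system, a tiny error in one tuned parameter swamps the result.
def run(r: float, x0: float = 0.2, steps: int = 60) -> float:
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)  # logistic map, chaotic for r near 4
    return x

a = run(3.9)
b = run(3.9 + 1e-6)  # the "fudge factor" is off by 0.000001
print(f"r = 3.900000 -> {a:.6f}")
print(f"r = 3.900001 -> {b:.6f}")
print(f"difference after 60 steps: {abs(a - b):.6f}")
```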
Well, this is a lot of information on the climate and weather of our planet, and it is why I kept this post focused: I wanted most people to be able to read it and understand, to an extent, what these things are, not just those who have a computer background or understand trend analysis.
The entire super El Niño episode is just terrible: a simple computer artifact that anyone should have been able to find went uncaught and was publicized. This mistake is still not acknowledged in general scientific circles, so to this date I wonder whether they even realize why the GCMs got this wrong. To anyone who has read this article, it should be clear why the computers predicted the super El Niño. Let's hope these scientists fix their mistakes and do not repeat them.
Please post questions or corrections to this essay. It's a rough draft.