In the real world, we are in the midst of accelerating rates of change, so it is quite erroneous for the policy world to assume that the linear changes of the past will hold for the future.
Policy should instead be based on reasonable worst-case scenarios, and on avoiding those outcomes, rather than on the assumption that the future of the planet will be driven by simple linear change.
This means climate change policy must become much more aggressive in order to deal effectively with what the data show: we are now experiencing non-linear change.
Example #1 – Increasing atmospheric greenhouse gas concentration:
| Period | Average annual CO2 increase (Mauna Loa) |
|-----------|------------------------|
| 1989-1994 | 1.21 +/- 0.41 ppm |
| 2011-2016 | 2.39 +/- 0.32 ppm |
Despite global policy efforts to curb CO2 emissions, we are currently in the midst of an accelerating accumulation of CO2 in the atmosphere. On average, the annual growth rate of atmospheric CO2 has roughly doubled over the last 20-25 years (see the Mauna Loa CO2 data summarized in the table above). Indeed, 2016 set a new record with a 3.38 ppm annual increase, bringing the concentration to 404.2 ppm at the end of 2016 (and 408.8 ppm in June of 2017). Clearly, any previous policy emission scenarios designed to prevent us from reaching 400 ppm have failed, precisely because of these accelerating rates of change. This acceleration is highly relevant to the December 12, 2015 Paris Accord agreement to “keep a global temperature rise this century well below 2 degrees C …” Qualitatively, it seems quite unlikely that this goal can be met given the accelerating accumulation of CO2 in the atmosphere. In Example #3 we will show quantitatively that a proper non-linear analysis of the global temperature data implies the Paris Accord limit most definitely cannot be met. Overall, the Paris Accord is a good example of linear-based policy that seems quite out of touch with what is actually happening.
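As a concrete illustration, the table’s window averages can be reproduced from NOAA’s published annual growth-rate series for Mauna Loa. This is a minimal sketch: the file name and column layout below are assumptions about that download, not part of the original analysis.

```python
# Sketch: reproduce the table's window averages from NOAA's annual mean
# CO2 growth-rate file for Mauna Loa (assumed layout: "#" comment lines,
# then columns of year, annual increase, uncertainty).
import numpy as np

rates = {}  # year -> annual mean CO2 growth rate (ppm/yr)
with open("co2_gr_mlo.txt") as f:
    for line in f:
        if not line.strip() or line.startswith("#"):
            continue
        year, rate, *_ = line.split()
        rates[int(year)] = float(rate)

# average growth rate (mean +/- standard deviation) over the two windows
for y0, y1 in [(1989, 1994), (2011, 2016)]:
    window = [rates[y] for y in range(y0, y1 + 1)]
    print(f"{y0}-{y1}: {np.mean(window):.2f} +/- {np.std(window):.2f} ppm/yr")
```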
Example #2 – The rate of Arctic sea ice loss:
The rate of Arctic sea ice loss provides a good example of the difference between linear and non-linear fits to the data when policy extrapolates to the question of “when will the Arctic Ocean be free of ice in September?” The data, compiled from the National Snow and Ice Data Center (NSIDC), are shown in Figure 1, which plots average September sea ice extent vs. time. The record starts in 1979, the first year of satellite measurements. In Figure 1 we show three fits to these data, each extrapolated to zero ice extent in some future year.
A. Suppose that in 2007 a policy concern arises about an ice-free Arctic Ocean, and some international committee is formed to study the issue and produce policy recommendations. The red line in Figure 1 shows the requisite linear fit to the data available at that time (i.e., 1979-2006), which extrapolates to zero ice in the year 2106. The committee meeting is quite short: the problem won’t occur for 100 years, so why worry about it now?
B. Okay, we now reconvene the committee 10 years later to update the situation with 10 years of additional data. The linear extrapolation over 1979-2016 (black line in Figure 1) leads to an ice-free Arctic in 2070. Yes, the new data have produced a shorter timescale, but hey, that’s still 50 years away, so again, why worry about it now?
C. But wait, what is that “weird” green line in Figure 1? The green line, which predicts an ice-free September by 2035 (less than 20 years from now), is the best-fitting non-linear relation to the data. Does this matter? (A sketch of the fitting exercise behind all three lines follows below.)
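Here is a minimal sketch of that exercise, assuming the September extent series has been loaded from the NSIDC index into numpy arrays; the variable names, and a quadratic standing in for the unspecified non-linear form of the green line, are our assumptions.

```python
# Sketch: polynomial fits to September sea-ice extent, extrapolated to
# zero. `years` and `extent` are assumed arrays from the NSIDC September
# sea-ice index; a quadratic stands in for the unspecified non-linear fit.
import numpy as np

def ice_free_year(years, extent, degree):
    """Return the first root after the last data year of a degree-n
    polynomial fit to extent (10^6 km^2) vs. year."""
    coeffs = np.polyfit(years, extent, degree)
    roots = np.roots(coeffs)
    real = np.real(roots[np.isreal(roots)])
    future = real[real > years.max()]
    return future.min() if future.size else None

# Mimicking the three committee scenarios:
# ice_free_year(years[years <= 2006], extent[years <= 2006], 1)  # ~2106
# ice_free_year(years, extent, 1)                                # ~2070
# ice_free_year(years, extent, 2)                                # ~2035 (assumed form)
```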
The difference between the linear and non-linear predictions is 30-40 years, which is significant on human decision-making timescales. In the linear policy world, we would simply punt on the issue, since the crisis point lies far in the future. However, if the non-linear approach yields the correct trend but we remain stuck in our linear mindset, then it is quite likely that policy will be set only after children can visit Santa’s home on a cruise ship.
The lesson here is clear: when a system is undergoing an accelerating rate of change, the future becomes harder to predict, which adds uncertainty to the overall process. Instead of paralyzing the policy process, this increased uncertainty should focus efforts on basing policy on more accurate trend forecasting. In this case, the data clearly show that the rate of Arctic sea ice loss is accelerating.
Example #3 – Global Average Temperatures:
Here we make use of the composite land-ocean temperature anomalies, relative to the 1951-1980 baseline, as recently provided by NASA Goddard. In Figure 2 we show a linear fit to those data, smoothed with a 5-year window, and extrapolate it to the year 2050, which gives a predicted temperature anomaly ∆T = 0.75 C. To compare directly with the Paris Accord we must renormalize the anomaly to the 1881-1910 period, which serves as the “pre-industrial” level; that renormalized baseline adds +0.26 C to the 2050 value. Hence the predicted 2050 ∆T is +1.01 C – well beneath the stated goal of the Paris Accord, under the implicit policy assumption that “the linear trend is the most appropriate.” So, we can all smile and go home assured of mission accomplished.

But again, wait: that green data line doesn’t look like it is well represented by a simple straight black line, does it? Recent years are systematically departing from the linear trend. Does that matter? Oh wait, it will be okay because future years will relax back to the trend line, right? Hmm … is this an issue? Indeed, should these data be weighted equally, as if all temperature points over time are independent and equally valid? Or should the systematic departure of the recent data carry more weight, as indicative of an actual manifestation of climate change?
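For concreteness, here is a minimal sketch of the extrapolation-and-rebaselining arithmetic. It assumes `years` and `anom` are numpy arrays of calendar year and annual anomaly parsed from the GISS record (the names are ours, and we fit the annual values rather than the 5-year-smoothed series for brevity).

```python
# Sketch: linear extrapolation of the GISTEMP land-ocean anomaly to 2050,
# then shifting from the 1951-1980 reference to an 1881-1910
# "pre-industrial" baseline.
import numpy as np

def linear_2050(years, anom):
    slope, intercept = np.polyfit(years, anom, 1)
    dT = slope * 2050 + intercept                              # text: ~ +0.75 C
    # offset between the 1951-1980 and 1881-1910 reference periods
    shift = -anom[(years >= 1881) & (years <= 1910)].mean()    # text: ~ +0.26 C
    return dT + shift                                          # text: ~ +1.01 C
```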
For this kind of inter-annual data, which shows significant random variation (noise), it is often better to average the data over some timescale to better reveal the long-term trend. We therefore bin the data in units of 9 years, which gives 15 bins from 1880 to 2014; we initially leave the 2015 and 2016 data points out. Note that there are no special rules for how to bin and smooth data – one wants enough binning to suppress the noise and reveal the underlying waveform, but not so much that the waveform itself is smoothed away. Figure 3 shows the resulting plot. This fit predicts ∆T = 0.71 C (0.97 C relative to the Paris Accord baseline) – essentially the same as before, but now the trend is more easily discernible.
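A sketch of the binning step, using the same assumed arrays; 1880-2014 inclusive is exactly 135 years, so it divides evenly into 15 bins of 9.

```python
# Sketch: 9-year binning of the 1880-2014 annual anomalies (135 years
# -> 15 bins), holding 2015-2016 aside for the moment.
import numpy as np

def bin_9yr(years, anom):
    mask = (years >= 1880) & (years <= 2014)   # exactly 135 points
    y, a = years[mask], anom[mask]
    return y.reshape(15, 9).mean(axis=1), a.reshape(15, 9).mean(axis=1)
```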
Figure 3 clearly shows the well-known “mid-century” cooling that followed the earlier period of warming in the record. This cooling has been used to suggest that a similar event will happen in the near future. While the origin of the cooling is not fully settled, a very plausible hypothesis is that during this period industrial aerosol pollution dominated over greenhouse gas forcing, producing a period of global cooling. This is plausible because a) there was little law or regulation concerning industrial pollution at the time, and b) the atmospheric CO2 growth rate was then roughly 4-5 times lower than today (the 1958 Mauna Loa data show 0.6–0.7 ppm per year; 2016 was ~3.4 ppm).
We now address the issue of weighting by simply adding the 2015 and 2016 annual values as two additional points alongside the 9-year bins. Because each single year then enters the fit with the same weight as an entire 9-year bin, the most recent data are effectively up-weighted. Figure 4 shows the weighted fit, which leads to a higher predicted ∆T than the previous unweighted treatment. The scientific reasoning behind this weighting is straightforward: 2015 and 2016 were two successive record-breaking years, and that is likely an indication that we are indeed now in the non-linear regime of warming. However, whether or not the data are weighted, the linear fit is poor and does not pass any valid statistical goodness-of-fit test – as the eye can plainly see.
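A sketch of that implicit weighting, continuing from the binning sketch above (the 2015 and 2016 anomaly values would be taken from the GISTEMP record, not hard-coded here):

```python
# Sketch: appending the 2015 and 2016 annual anomalies as single points
# means each carries the same weight in the fit as a full 9-year bin.
import numpy as np

def weighted_linear_fit(bin_centers, bin_means, anom_2015, anom_2016):
    x = np.concatenate([bin_centers, [2015, 2016]])
    t = np.concatenate([bin_means, [anom_2015, anom_2016]])
    return np.polyfit(x, t, 1)   # slope, intercept of the Figure 4 style fit
```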
Now we apply better science and fit the data to a proper non-linear model, as shown in Figure 5. In this case most of the data points fall on the model curve. Wait a minute, that can’t be right – the “answer” now violates the Paris Accord (fake news, anyone?). What is going on here? Isn’t this just another example of scientists screwing with the data to mislead the rest of us? The future really doesn’t look like that, because the future is always linear, right?
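The text does not name the functional form behind Figure 5, so the sketch below uses a quadratic (constant acceleration) as the minimal stand-in – an assumption on our part, not the authors’ stated model.

```python
# Sketch under an assumed functional form: a quadratic fit to the binned
# points plus 2015/16 from the previous sketch; `shift` is the ~+0.26 C
# baseline offset computed earlier.
import numpy as np

def nonlinear_2050(x, t, shift):
    coeffs = np.polyfit(x, t, 2)
    return np.polyval(coeffs, 2050) + shift   # text: ~ +2.93 C by 2050
```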
Well, this is precisely the policy issue: how does the policy world deal with the large difference between the linear response of the system in the past and its current non-linear response? If, for instance, we “believe” that linear fits are best, then our future looks far less dire than that predicted by Figure 5. But the more correct non-linear fit predicts ∆T = 2.67 C by 2050 relative to the 1951-1980 baseline; adding the +0.26 C offset gives ∆T = 2.93 C above the “pre-industrial” level – approximately a 3 C rise by 2050, only halfway through this century – which strongly violates the Paris Accord.
So, what do we do? Do we pay attention to these non-linear trend estimates, or do we ignore them because the future is always a simple straight line? Do we keep our heads in the sand and wait for a few more years’ worth of data that might show the non-linear expectations above are not borne out? Or do we use better scientific principles to guide our policy making and recognize that we are now in the non-linear regime, which no longer offers the luxury of time – either for global average temperature change or for melting Arctic sea ice? This is the lesson here: if non-linear trend extrapolation is the best way to represent the future, then climate change policy needs to become more aggressive and be enacted more immediately. If the policy world ignores these increasing rates, then the real changes to come will be far greater than those predicted by the benign, linear approach to trend prediction.
In the policy world, the word “eventually” has always been code for “yeah, it might be a future problem, but it’s not a problem now, so we will ignore it …” If you are a millennial reading this, you should be pissed off. Ignoring data and where they are trending generally lets the next generation inherit a problem with a reduced timescale to “fix it” – and you are supposed to be the “fix it” generation. My generation made this problem for you; we ignored it and we accelerated it, and now you get to “fix it”. To do that, take heed of the final paragraph below.
The data presented here strongly suggest that eventually = now. The policy world needs to wake up to the reality of accelerating rates, as borne out by the data, and refrain from the continued wishful thinking that the physical world follows linear trends. It does not, and that stark reality cries out for more intensive and thoughtful policy and planning processes, so as to change our trajectory towards a more livable future – and to prevent Santa from drowning.
By Greg Bothun and Jordan Chess, Dept. of Physics, University of Oregon