1: Requirements for a global objective
2: Two degrees more. Relative to what? What really matters?
3.1: History of a figure – The artificiality of Stern-style economistic versions
3.2: History of a figure – The religious origin of the (supposedly) scientific version
4: What a world +2 ºC warmer would look like
5: The stability threshold of the climate system, and the problem of control
6: Determining the stability threshold from below
- Wiki – Focal point (game theory) – Wikipedia – 27/01/2011 – http://en.wikipedia.org/wiki/Focal_point_(game_theory)
“In game theory, a focal point (also called Schelling point) is a solution that people will tend to use in the absence of communication, because it seems natural, special or relevant to them. The concept was introduced by the Nobel Prize winning American economist Thomas Schelling in his book The Strategy of Conflict (1960). In this book (at p. 57), Schelling describes «focal point[s] for each person’s expectation of what the other expects him to expect to be expected to do.» This type of focal point later was named after Schelling.” - Conference of the Parties 15 – Copenhagen Accord – United Nations Framework Convention on Climate Change – 18/12/2009 – http://unfccc.int/home/items/5262.php
“To achieve the ultimate objective of the Convention to stabilize greenhouse gas concentration in the atmosphere at a level that would prevent dangerous anthropogenic interference with the climate system, we shall, recognizing the scientific view that the increase in global temperature should be below 2 degrees Celsius, on the basis of equity and in the context of sustainable development, enhance our long-term cooperative action to combat climate change. We recognize the critical impacts of climate change and the potential impacts of response measures on countries particularly vulnerable to its adverse effects and stress the need to establish a comprehensive adaptation programme including international support.” - Joseph Romm – Must-read IEA report explains what must be done to avoid 6°C warming – Climate Progress – 12/11/2008 – http://climateprogress.org/2008/11/12/must-read-iea-report-explains-what-must-be-done-to-avoid-6%C2%B0c-warming/
“The IEA’s conclusions on climate are even starker: ‘Without a change in policy, the world is on a path for a rise in global temperature of up to 6°C.’ The report does a good job explaining the practical difference between pursuing 450 ppm and 550 ppm. Yes, I’m aware that 350 ppm is a superior long-term target to minimize the risk of an ice-free planet and that the countless amplifying carbon cycle feedbacks mean 550 probably takes you to 1000 ppm and 6°C anyway, but the IEA is not so up on all the latest, depressing science, and it tends to temper its climate desperation with some practical short- and medium-term energy and political realities.” - M. Hajer (1993) – Discourse coalitions and the institutionalization of practice: the case of acid-rain in Britain – In: F. Fischer and J. Forester (Eds.) – The Argumentative Turn in Policy Analysis and Planning – Duke University Press
- Paul Baer and Dan Kammen – Climate scientists debate with prime minister – Environmental Research Web – 19/05/2009 – http://environmentalresearchweb.org/cws/article/opinion/39126
“I need your assistance to push this process in the right direction, and in that respect, I need fixed targets and certain figures, and not too many considerations on uncertainty and risk and things like that.” - Fiona Harvey and Jim Pickard – Stern takes bleaker view on warming – Financial Times – 16/04/2008 – http://www.ft.com/cms/s/d3e78456-0bde-11dd-9840-0000779fd2ac
“The Stern report on climate change underestimated the risks of global warming, its author said on Wednesday, and should have presented a gloomier view of the future. “We underestimated the risks … we underestimated the damage associated with temperature increases … and we underestimated the probabilities of temperature increases,” Lord Stern, former chief economist at the World Bank, told the Financial Times on Wednesday.” - Mark Hertsgaard – A scary new climate study will have you saying ‘Oh, shit!’ – Grist – 13/10/2009 – http://www.grist.org/article/2009-10-13-a-scary-new-climate-study-will-have-you-saying-oh-shit/
“G-8 leaders agreed in July to limit the global temperature rise to 2 ºC (3.6 F) above the pre-industrial level at which human civilization developed. Schellnhuber, addressing the Santa Fe conference, joked that the G-8 leaders agreed to the 2 ºC limit “probably because they don’t know what it means.” - Antonio Cerrillo – Christiana Figueres: «El planeta sólo será seguro si la temperatura no aumenta más de 1,5 grados» – La Vanguardia – 04/06/2011 – http://www.lavanguardia.com/medio-ambiente/20110604/54165103514/christiana-figueres-el-planeta-solo-sera-seguro-si-la-temperatura-no-aumenta-mas-de-1-5-grados.html
“From the standpoint of justice and of the survival of the planet’s most vulnerable populations, we can do nothing other than set the objective that the temperature should not rise more than 1.5 degrees. That is what guarantees the survival of the small island states and the sub-Saharan countries.” - Reference: Science, temperature relative to preindustrial
- James Hansen et al (2011) – The Case for Young People and Nature: A Path to a Healthy, Natural, Prosperous Future – Draft paper – 04/05/2011 – Columbia University Earth Institute, New York – http://www.columbia.edu/~jeh1/mailings/2011/20110505_CaseForYoungPeople.pdf – 14 authors
“Knowledge of Earth’s energy imbalance allows us to specify accurately how much CO2 must be reduced to restore energy balance and stabilize climate. CO2 must be reduced from the current level of 390 ppm to 360 ppm to increase Earth’s heat radiation to space by 0.5 W/m2, or to 345 ppm to increase heat radiation to space by 0.75 W/m2, thus restoring Earth’s energy balance and stabilizing climate.”
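A rough sense of where figures like these come from can be had with the widely used simplified expression for CO2 radiative forcing, ΔF ≈ 5.35 ln(C/C0) W/m2 (Myhre et al. 1998). The sketch below is only an order-of-magnitude check with that approximation, not the calculation Hansen et al. actually perform:

```python
# Illustrative check of the quoted ppm figures with the simplified CO2 forcing
# expression dF = 5.35 * ln(C / C0) W/m^2 (Myhre et al. 1998). This is a crude
# approximation, not the Hansen et al. (2011) calculation itself.
import math

def delta_forcing(c_ppm, c_ref_ppm=390.0):
    """Approximate change in radiative forcing (W/m^2) when CO2 moves from c_ref_ppm to c_ppm."""
    return 5.35 * math.log(c_ppm / c_ref_ppm)

for target in (360, 345):
    dF = delta_forcing(target)
    # A negative forcing change means more heat radiated to space.
    print(f"390 -> {target} ppm: {dF:+.2f} W/m^2 (extra outgoing radiation ~{-dF:.2f} W/m^2)")
```

With this cruder formula the extra outgoing radiation comes out near 0.4 and 0.65 W/m2, the same order as the 0.5 and 0.75 W/m2 quoted above; the difference reflects the more detailed radiative calculation behind the paper’s numbers.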
- Timothy M. Lenton (2011) – Beyond 2°C: redefining dangerous climate change for physical systems – Wiley Interdisciplinary Reviews: Climate Change doi:10.1002/wcc.107 – 10/03/2011 – School of Environmental Sciences, University of East Anglia
“Some potential thresholds cannot be meaningfully linked to global temperature change, others are sensitive to rates of climate change, and some are most sensitive to spatial gradients of climate change. In some cases, the heterogeneous distributions of reflective (sulfate) aerosols, absorbing (black carbon) aerosols, and land use could be more dangerous than changes in globally well-mixed greenhouse gases.” - George Monbiot – Giving Up On Two Degrees – The Guardian – 01/05/2007 – http://www.monbiot.com/archives/2007/05/01/1058/
“This is a cut in total emissions, not in emissions per head. If the population were to rise from 6 to 9 billion between now and then, we would need an 87% cut in global emissions per person. If carbon emissions are to be distributed equally, the greater cut must be made by the biggest polluters: rich nations like us. The UK’s emissions per capita would need to fall by 91%. But our governments appear quietly to have abandoned their aim of preventing dangerous climate change. If so, they condemn millions to death. What the IPCC report shows is that we have to stop treating climate change as an urgent issue. We have to start treating it as an international emergency.”
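Monbiot’s per-person figures follow from simple bookkeeping: for a given cut in total emissions, a growing population deepens the required per-capita cut. A minimal sketch using only the quote’s inputs (6 to 9 billion people, and the total cut of roughly 80% that the 87% per-person figure implies; the 80% is an inference, not a number stated in the excerpt):

```python
# Arithmetic behind the per-person figure: per-capita emissions scale as
# (remaining total emissions) * (population now / population later).
# The ~80% total cut is back-calculated from the quoted 87% per-person cut.
def per_capita_cut(total_cut, pop_now=6e9, pop_future=9e9):
    """Fraction by which per-person emissions must fall for a given cut in total emissions."""
    remaining_per_capita = (1.0 - total_cut) * pop_now / pop_future
    return 1.0 - remaining_per_capita

total_cut = 0.805  # assumed value implied by the quote
print(f"total cut {total_cut:.1%} with 6 -> 9 billion people "
      f"=> per-person cut {per_capita_cut(total_cut):.0%}")
```

The same bookkeeping, applied to a country whose per-capita emissions start above the global average, gives a steeper national cut, which is why the UK figure Monbiot quotes (91%) exceeds the 87% global per-person average.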
- Kevin Anderson et al (2008) – From long-term targets to cumulative emission pathways: Reframing UK climate policy – Energy Policy 36:3714-3722 doi:10.1016/j.enpol.2008.07.003 – 08/08/2008 – The Tyndall Centre for Climate Change Research, MACE, University of Manchester – 3 authors
«Only recently have CO2 stabilisation studies begun to incorporate the impact of carbon cycle-feedback mechanisms (Matthews, 2005; Jones et al., 2006), despite a number of previous studies illustrating that there will likely be decreases in terrestrial and oceanic carbon uptake due to climate change (Cox et al., 2000; Friedlingstein et al., 2001). The new cumulative carbon range for a 450ppmv stabilisation level published in the IPCC (2007) report and presented in Table 1 illustrates the significance of such feedbacks.» - Ayami Hayashi et al (2009) – Evaluation of global warming impacts for different levels of stabilization as a step toward determination of the long-term stabilization target – Climatic Change 98:87-112 doi:10.1007/s10584-009-9663-6 – 25/07/2009 – Research Institute of Innovative Technology for the Earth (RITE), Japan – 5 authors “The world total population living in water-stressed basins (in which annual runoff per capita is less than 1000m3) will increase rapidly to 3.77 billion people in 2050 from 1.171 billion people in 1990 as a result of population growth, even if the annual water resources remain at 1990 levels … 1.4 billion people will experience an increase in the water stress in 2050 in terms of the world total … The global production potential of wheat and rice will increase in 2050, 2100 and 2150 for all emission pathways …”
- Joyce E. Penner et al (2011) – Satellite methods underestimate indirect climate forcing by aerosols – Proceedings of the National Academy of Sciences PNAS doi: 10.1073/pnas.1018526108 – 01/08/2011 – Department of Atmospheric, Oceanic, and Space Sciences, University of Michigan – 3 authors
“Satellite-based estimates of the aerosol indirect effect (AIE) are consistently smaller than the estimates from global aerosol models, and, partly as a result of these differences, the assessment of this climate forcing includes large uncertainties.” - Bill Hare (2005) – Relationship between increases in global mean temperature and impacts on ecosystems, food production, water and socio-economic systems – UK Met Office conference «Avoiding dangerous climate change» – 03/02/2005 – Visiting Scientist, Potsdam Institute for Climate Impact Research – http://www.pik-potsdam.de/~mmalte/simcap/publications/Hare_submitted_impacts.pdf
«In this paper the relationship between increases in global mean temperature and the latter elements mentioned in Article 2 are explored in order to cast light on the risks posed by climate change.» - James S. Risbey and Terence J. O’Kane (2011) – Sources of knowledge and ignorance in climate research – Climatic Change doi:10.1007/s10584-011-0186-6 – Published online: 20/08/2011 – Centre for Australian Weather and Climate Research; CSIRO Marine and Atmospheric Research
“The deficiencies of ocean GCMs can be understood in contrast with the more mature NWP GCMs (Section 3.1). These deficiencies reflect the fact that ocean modelling is a more difficult problem. There are a number of reasons for this, but the relevant one for the purpose of this discussion is that the critical instability scale in the ocean occurs on spatial scales an order of magnitude finer than in the atmosphere and we simply don’t yet have powerful enough computers to resolve these scales.” - Ian Enting et al (2008) – The Science of Stabilising Greenhouse Gas Concentrations – Garnaut Climate Change Review – 01/04/2008 – The University of Melbourne
“Uncertainties in the science of the climate’s response to rising concentrations of greenhouse gases mean that it is not possible to state a precise relation between a chosen temperature target and corresponding targets for atmospheric concentrations and emissions. Rather, the choice of mitigation targets needs to be made on the basis of consideration of reduction of risk.” - Paul Baer and Dan Kammen – Climate scientists debate with prime minister – Environmental Research Web – 19/05/2009 – http://environmentalresearchweb.org/cws/article/opinion/39126
“I could not honestly go and tell the public that two degrees warming is safe. We’re already seeing a lot of impacts of the 0.7 degrees warming that we’ve had so far. So I consider two degrees not safe, and John Schellnhuber this morning asked about the question ‘Is Russian roulette dangerous?’ and in Russian roulette you have a one in six chance of something terrible happening, I think that when we go to two degrees we probably have more than a one in six chance of really bad impacts occurring … two degrees is really an upper limit, and it’s not something that, you know, we aim for two degrees but it’s OK if we end up at three. That was my key message … when you try to come to some number like two degrees, that’s a judgment that uses science, but it’s not for scientists to give you that number. It’s a risk game, and how much risk society wants to take.” - Convención marco de las Naciones Unidas sobre el cambio climático – Organización de las Naciones Unidas – 09/09/1992 – New York
“Article 2: The ultimate objective of this Convention and any related legal instruments that the Conference of the Parties may adopt is to achieve, in accordance with the relevant provisions of the Convention, stabilization of greenhouse gas concentrations in the atmosphere at a level that would prevent dangerous anthropogenic interference with the climate system. Such a level should be achieved within a time-frame sufficient to allow ecosystems to adapt naturally to climate change, to ensure that food production is not threatened and to enable economic development to proceed in a sustainable manner.” - Shardul Agrawala (1999) – Early science-policy interactions in climate change: lessons from the Advisory Group on Greenhouse Gases – Global Environmental Change 9:157-169 doi:10.1016/S0959-3780(99)00003-5 – 23/06/1999 – Science, Technology and Environmental Policy Program, Woodrow Wilson School of Public and International Affairs, Princeton University
“The consensus-based science advisory apparatus of the global climate regime is frequently faulted for achieving legitimacy at the cost of compromising the policy specificity of its assessments. The merits of an alternate corporatist model with closed-door interaction between a few experts and stakeholders are examined here by exhuming the legacy of the Advisory Group on greenhouse Gases (AGGG). The study concludes that while such a model engendered policy innovation in the short term, it caused the erosion of scientific and political support over the longer term, leading to the marginalization of the AGGG.” - Hans Joachim Schellnhuber (2008) – Global warming: Stop worrying, start panicking? – Proceedings of the National Academy of Sciences PNAS 105:14239-14240 doi:10.1073/pnas.0807331105 – 23/09/2008 – Potsdam Institute for Climate Impact Research
«Also, by construction, the IPCC vessel tends to steer clear of value judgments that might be easily converted into ‘‘policy-prescriptive’’ statements. The downside of this well-meaning attitude is that the 2007 report does not, for instance, make a systematic attempt to characterize what dangerous anthropogenic interference (DAI) with the natural climate system is all about. Again, all of the relevant information is implicitly contained in the IPCC tomes, most notably in chapter 19 of the Working Group II report (3) (see also ref. 4). Yet even that chapter shies away from updating the ‘‘burning embers diagram’’ (5), which provides a direct scientific way to gauge the political target of limiting global mean temperature (GMT) rise to less than 2°C (6) against avoided climate impacts.» - Le jour d’après – Wikipedia – Accessed: 12/05/2008 – http://fr.wikipedia.org/wiki/Le_Jour_d’apr%C3%A8s_(film,_2004)
“The Day After Tomorrow (Le Jour d’après) is a disaster film based on scientific hypotheses, although some points have been exaggerated. In the film the events unfold over an extremely short time span (a few days), whereas in reality scientists expect climate change to play out over many years, or even centuries. The production did, however, stand out for its special effects.” - E. Hawkins et al (2011) – Bistability of the Atlantic overturning circulation in a global climate model and links to ocean freshwater transport – Geophysical Research Letters 38, L10605 doi:10.1029/2011GL047208 – 25/05/2011 – NCAS-Climate, University of Reading – 7 authors – http://www.met.rdg.ac.uk/~ed/publications/hawkins_etal_2011_hysteresis.pdf
«Here we demonstrate AMOC bistability in the response to freshwater perturbations in the FAMOUS AOGCM – the most complex AOGCM to exhibit such behaviour to date. The results also support recent suggestions that the direction of the net freshwater transport at the southern boundary of the Atlantic by the AMOC may be a useful physical indicator of the existence of bistability. We also present new estimates for this net freshwater transport by the AMOC from a range of ocean reanalyses which suggest that the Atlantic AMOC is currently in a bistable regime, although with large uncertainties. More accurate observational constraints, and an improved physical understanding of this quantity, could help narrow uncertainty in the future evolution of the AMOC and to assess the risk of a rapid AMOC collapse.» - Michael Oppenheimer (1998) – Global warming and the stability of the West Antarctic Ice Sheet – Nature 393:325-332 doi:10.1038/30661 – Environmental Defense Fund – 28/05/1998 – http://www.geo.utexas.edu/courses/387h/papers/oppenheimer/1998/nature.pdf
“It is not possible to place high confidence in any specific prediction about the future of WAIS. Nevertheless, policy makers are confronted by three scenarios that span the range of plausible outcomes for WAIS, assuming continued growth in greenhouse-gas emissions according to rates characteristic of most of the IPCC ‘IS92’ emissions projections (ref). The following ice-sheet scenarios reflect the assumption that there is a gradual dynamic response to removal of the ice shelf, no dynamic response, and very rapid dynamic response, respectively.” - Richard S.J. Tol (2006) – Europe’s long-term climate target: A critical evaluation – Energy Policy 35:424-432 – 20/01/2006 – Research unit Sustainability and Global Change, Hamburg University and Centre for Marine and Atmospheric Science – http://www.mi.uni-hamburg.de/fileadmin/fnu-files/publication/tol/RM7208.pdf
“This target is supported by rather thin arguments, based on inadequate methods, sloppy reasoning, and selective citation from a very narrow set of studies. In the scientific literature on ‘dangerous interference with the climate system’, most … studies do not make specific recommendations, with the exception of cost-benefit analyses, which unanimously argue for less stringent policy targets. However, there are also a few ‘scientific’ studies that recommend a target without supporting argumentation. Overall, the 2 ºC target of the EU seems unfounded.” - Elizabeth A. Stanton (2009) – Negishi Welfare Weights: The Mathematics of Global Inequality – Climatic Change 107:417-432 doi:10.1007/s10584-010-9967-6 – Published online: 16/12/2010 – Stockholm Environment Institute – http://sei-us.org/Publications_PDF/SEI-WorkingPaperUS-0902.pdf
“Like any economic models, IAMs are not value-free – many of the assumptions that go into building climate-economics models are based on moral judgments, and not on scientific facts. The choice of a discount rate is, perhaps, the best known example. The optimal course of action recommended by an IAM can only be understood in the context of the discount rate employed – the higher the discount rate the lower the value that we place on the well-being of future generations.” - H. E. Goeller and Alvin M. Weinberg (1978) – The Age of Substitutability – The American Economic Review 68:1-11 – 28825 – Oak Ridge National Laboratory; Institute For Energy Analysis, Oak Ridge – http://www.jstor.org/stable/pdfplus/2951003.pdf
“On the basis of their scrutiny of these geological and technological data, Goeller and Weinberg pronounce the principle of infinite substitutability: With the exception of phosphorus and some trace elements for agriculture, mainly cobalt, copper and zinc, and finally the CH_x (coal, oil and gas), society can exist on near-inexhaustible resources for an indefinite period.” - H. E. Goeller and Alvin M. Weinberg (1978) – The Age of Substitutability – The American Economic Review 68:1-11 – Oak Ridge National Laboratory; Institute For Energy Analysis, Oak Ridge – http://www.jstor.org/stable/pdfplus/2951003.pdf
“Politicians and the wider public, having little understanding of the technicalities of such models, may place too much faith in their predictions. While this danger is widely manifest, the problem with this response is that it leaves policy design to the whims of political selection, and it provides no guidance as to the optimal level of emissions and emissions control. For a central insight which environmental economics brings is that the optimal level of pollution is not normally zero: it is where the marginal costs of abatement equal the marginal costs of the pollution.” - Carlo C. Jaeger and Julia Jaeger (2010) – Three Views of Two Degrees – European Climate Forum – Published online: 01/01/2010 – Potsdam Institute for Climate Impact Research, European Climate Forum and Beijing Normal University; European Climate Forum – http://www.european-climate-forum.net/fileadmin/ecf-documents/publications/ecf-working-papers/jaeger__three-views-of-two-degrees__ecf-working-paper-2-2010.pdf
“If a 2.5 °C temperature rise leads to a 2% loss of GDP in 2100, then a 2° limit may lead to a 1.5% loss, so that marginal benefits would be 0.5% of GDP in 2100. If on the other hand a 2.5° limit leads to a reduction of annual growth by 0.003%, then a 2° limit may lead to a reduction of annual growth by 0.0006%, and so to marginal costs of 0.07% of GDP in 2100. This, however, would imply that a 2° limit is way too loose, and the optimal policy would be to aim for 1 or even 0.5°. Things look different if one introduces discounting (which CEU 2005 does not).”
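The closing remark about discounting is where most of the disagreement in the rest of this section originates: a damage of a given size in 2100 shrinks dramatically in present-value terms as the discount rate rises. A minimal sketch of that standard arithmetic follows, with illustrative rates chosen to span the range argued over in the Stern–Nordhaus exchange cited below:

```python
# Present value today of a damage worth 1% of GDP in 2100, under constant
# discount rates. The rates are illustrative only; PV = X / (1 + r)^t.
years = 2100 - 2010
for rate in (0.001, 0.014, 0.03, 0.055):
    pv = 1.0 / (1.0 + rate) ** years  # in units of "% of GDP today"
    print(f"discount rate {rate:4.1%}: 1% of GDP in 2100 is worth {pv:.2f}% of GDP today")
```

At near-zero rates, damages in 2100 count almost at face value; at market-style rates of several percent they nearly vanish, which is why the choice of discount rate, rather than the climate science, drives much of the dispute in the Stern critiques quoted below.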
- William D. Nordhaus (1991) – To slow or not to slow: the economics of the greenhouse effect – The Economic Journal 101:920-937 – Published online: 01/07/1991 – http://www.sc-eco.univ-nantes.fr/~tvallee/memoire/model-DICE/nordhaus/to-Slow-or-Not-to-Slow-The-Economics-of-The-Greenhouse-Effect.pdf
“Notwithstanding these simplifications, the approach laid out here may help clarify the questions and help identify the scientific, economic, and policy issues that must underpin any rational decision. Once the fundamental concepts are clear, it is relatively straightforward to move to a more detailed disaggregated approach so as to fine tune the calculations. But whether we use simple approaches like the present one or more elaborate models, we must balance costs and damages if we are to preserve our precious time and resources for the most important threats to our health and happiness.” - Nicholas Stern (2006) – Stern review on the economics of climate change – Cambridge University Press, Cambridge, UK – Grantham Institute, India Observatory, and STICERD at the London School of Economics and Political Science – http://mudancasclimaticas.cptec.inpe.br/~rmclima/pdfs/destaques/sternreview_report_complete.pdf
- Stern afirma que efectos cambio climático – Milenio – Accessed: 17/04/2008 – http://www.milenio.com/index.php/2008/04/17/225694/
“People who think this is scaremongering are completely wrong. If anything, I was too moderate.” - David Adam – I underestimated the threat, says Stern – The Guardian – Published online: 18/04/2008 – http://www.guardian.co.uk/environment/2008/apr/18/climatechange.carbonemissions
“He pointed to last year’s reports from the Intergovernmental Panel on Climate Change (IPCC) and new research which shows that the planet’s oceans and forests are soaking up less carbon dioxide than expected. He said: «Emissions are growing much faster than we’d thought, the absorptive capacity of the planet is less than we’d thought, the risks of greenhouse gases are potentially bigger than more cautious estimates and the speed of climate change seems to be faster.» Stern said the new findings vindicated his report, which has been criticised by climate sceptics and some economists as exaggerating the possible damage. «People who said I was scaremongering were profoundly wrong,» he told a conference in London.” - William Nordhaus (2007) – Critical Assumptions in the Stern Review on Climate Change – Science 317:201-202 doi:10.1126/science.1137316 – Published online: 13/07/2007 – Yale University – Department of Economics; National Bureau of Economic Research (NBER) – http://nordhaus.econ.yale.edu/nordhaus_stern_science.pdf
“The Stern Review recommended urgent, immediate, and sharp reductions in greenhouse-gas emissions. These findings differ markedly from economic models that calculate least-cost emissions paths to stabilize concentrations or paths that balance the costs and benefits of emissions reductions.” - William Nordhaus – The Stern Review on the Economics of Climate Change – Published online: 03/05/2007
“An examination of the Review’s radical revision of the economics of climate change finds, however, that it depends decisively on the assumption of a near-zero time discount rate combined with a specific utility function. The Review’s unambiguous conclusions about the need for extreme immediate action will not survive the substitution of assumptions that are consistent with today’s marketplace real interest rates and savings rates.” - Clive L. Spash (2007) – The economics of climate change impacts a la Stern: novel and nuanced or rhetorically restricted? – Ecological Economics 63:706-713 doi:10.1016/j.ecolecon.2007.05.017 – Published online: 27/06/2007 – CSIRO, Sustainable Ecosystems Division – http://www.clivespash.org/ee2007_spashonstern.pdf
“The basic issue is not the detail but the whole approach (Spash, 2007b). Economists finding themselves facing a complex and long-term problem recognise many of the key issues. Stern repeatedly tell us that there is considerable uncertainty over cause–effect relationships, that these will be outside empirical observation (Stern, 2006: 293 ft nt7), that their model relies upon “nonexistent data” (Stern, 2006: 153), and that ethics and social values are crucial to the decision. However they then squeeze all issues to fit within an existing theoretical model which is totally inadequate for addressing the problems they themselves have outlined.” - John P. Weyant (2008) – A Critique of the Stern Review’s Mitigation Cost Analyses and Integrated Assessment – Review of Environmental Economics and Policy 2:77–93 doi:10.1093/reep/rem022 – 23/04/2008 – Professor of management science and engineering and the director of the EnergyModeling Forum at Stanford University
“Although the Review makes many significant contributions, I believe that the framework the Review adopted to formulate policy recommendations is difficult to understand, and probably not a great fit with the problem being addressed, and that the specific recommendation for large emission reductions in the short run to facilitate stabilization of greenhouse gases in the atmosphere at 550 (or even 450) parts per million CO2 equivalents in the atmosphere is not justified by the analysis.” - Carlo Jaeger et al (2008) – Stern’s Review and Adam’s fallacy – Climatic Change 89:207–218 doi:10.1007/s10584-008-9436-7 – Published online: 08/07/2008 – PIK–Potsdam Institute for Climate Impact Research, Potsdam – 3 authors – http://www.eci.ox.ac.uk/publications/downloads/schellnhuber08-stern-and-adam.pdf
“The Review has compared climate change to experiences of suffering like World War I. That war, however, hardly affected global GDP. The long-term damages to be expected from business-as-usual greenhouse gas emissions include loss of the coastal cities of the world over the next millennia … current and future international climate mitigation policies … Business leaders worried about climate change need to pay attention to the tensions between ethical and economic concerns. Otherwise, a credibility crisis threatens global climate policy. An important step to establish the credibility needed for effective climate policy will be to gradually move towards a regime where emission permits are auctioned, not handed out as hidden subsidies. The revenues generated by permit auctions should be used to establish a global system of regional climate funds.” - Eric Neumayer (2007) – A missed opportunity: the Stern review on climate change fails to tackle the issue of non-substitutable loss of natural capital – Global Environmental Change 17:297-301 doi:10.1016/j.gloenvcha.2007.04.001 – Professor of Environment and Development, London School of Economics and Political Science (LSE), Department of Geography and Environment and Centre for Environmental Policy and Governance – http://eprints.lse.ac.uk/3059/1/A_missed_opportunity_(LSERO).pdf
“The choice of a low discount rate is the main reason for the Review’s divergence in conclusions compared to other economic studies. I argue that the Review’s ethical reasons for a low discount rate are defendable, but unlikely to find wider public support.” - Ted Trainer – The Stern Review; A Critical Assessment of Its Mitigation Optimism – Published online: 18/12/2007 – http://ssis.arts.unsw.edu.au/tsw/Stern.18.12.07.html
“The argument in this paper is that even if these statements are correct they are seriously misleading, because the Review does [not] take appropriate energy or emission targets, it does not deal with most of the steps that must be taken to solve the greenhouse problem, and it relies inappropriately on economic modelling and especially because it makes at least highly challengeable assumptions regarding the potential of renewable energy. Other reasons are given to support conclusions that are dramatically at variance with those of Stern, i.e., that the greenhouse problem cannot be solved without large reductions in aggregate world economic activity, possibly of the order of 75% or more, and therefore its solution is not compatible with the continuation of consumer-capitalist society.” - Martin L. Weitzman (2009) – On Modeling and Interpreting the Economics of Catastrophic Climate Change – The Review of Economics and Statistics 91:1-19 doi:10.1162/rest.91.1.1 – http://www.mitpressjournals.org/doi/pdf/10.1162/rest.91.1.1
“With climate change as prototype example, this paper analyzes the implications of structural uncertainty for the economics of low-probability, high-impact catastrophes. Even when updated by Bayesian learning, uncertain structural parameters induce a critical “tail fattening” of posterior-predictive distributions. Such fattened tails have strong implications for situations, like climate change, where a catastrophe is theoretically possible because prior knowledge cannot place sufficiently narrow bounds on overall damages. This paper shows that the economic consequences of fat-tailed structural uncertainty (along with unsureness about high-temperature damages) can readily outweigh the effects of discounting in climate-change policy analysis.” - Thomas Bruckner et al (2003) – Methodological Aspects of The Tolerable Windows Approach – Climatic Change 56:73–89 doi:10.1023/A:1021388429866 – Institute for Energy Engineering, Technical University of Berlin – 4
“In a nutshell, the TWA can be described as follows: on the basis of a set of prescribed constraints (‘guardrails’) that exclude intolerable climate change impacts and unacceptable mitigation measures, the admissible range of future emissions paths is sought by investigating the dynamic relationships linking the causes and effects of global climate change.” - Ferenc L. Toth (2003) – Climate Policy In Light Of Climate Science: The ICLIPS Project – Climatic Change 56:7-36 doi:10.1023/A:1021376128049 – Potsdam Institute for Climate Impact Research – http://www.pik-potsdam.de/~fuessel/download/help_iit/publications/cc03_iclips_paper.pdf –
“The paper introduces the Tolerable Windows Approach (TWA) as a decision analytical framework for addressing global climate change. It is implemented as an integrated assessment model (IAM) developed in the project on Integrated Assessment of Climate Protection Strategies (ICLIPS) … Key features of the TWA are compared with those of cost-benefit and cost-effectiveness frameworks. An overview of the ICLIPS IAM framework is provided together with its methodological foundations. Main features of the individual models are presented, covering the climate, the aggregated economic, and the impact models.” - Wissenschaftlicher Beirat der Bundesregierung Globale Umweltveränderungen (WBGU) (1995) – Special Report. Scenario for the derivation of global CO2 reduction targets and implementation strategies – WBGU (German Advisory Council on Global Change) – 01/03/1995 – Statement on the occasion of the First Conference of the Parties to the Framework Convention on Climate Change in Berlin – http://www.wbgu.de/fileadmin/templates/dateien/veroeffentlichungen/sondergutachten/sn1995/wbgu_sn1995_engl.pdf
«In our scenario, a mean value for the burden on global society of 5% of GGP is taken as the maximum tolerable limit (to the extent that this burden can be expressed in monetary terms). It has to be taken into account that, given the uneven spatial distribution of climate impacts, some states may be affected much more seriously than others … the extreme reduction requirements that arise after approx. 30 years would then exceed the elasticity of the world economic system.» - WBGU (1997) – Targets for Climate Protection – German Advisory Council on Global Change – http://www.wbgu.de/fileadmin/templates/dateien/veroeffentlichungen/sondergutachten/sn1997/wbgu_sn1997_engl.pdf
“The tolerable window approach … does not attach absolute priority to environmental protection at the expense of economic and social objectives… avoids a number of problems that arise when performing cost-benefit analysis … Losses in one category (e.g. the irrecoverable loss of essential life-giving resources) cannot be arbitrarily compensated for by gains in a different category (e.g. a regional increase in recreational value).” - German Advisory Council on Global Change WBGU (1995) – Special Report. Scenario for the derivation of global CO2 reduction targets and implementation strategies – Statement on the occasion of the First Conference of the Parties to the Framework Convention on Climate Change in Berlin – 17/02/1995
“The “backwards mode” has to be used instead for such answers: taking account of the impacts of climate change on human beings and nature, the “window” of tolerable future climate change is defined. The next step is to calculate the global emission profiles which ensure conformity within that window.” - Gary W. Yohe (1999) – The Tolerable Windows Approach: Lessons and Limitations – Climatic Change 41:3-295, doi:10.1023/A:1005451718361 – Department of Economics, Wesleyan University + Center for Integrated Study of the Human Dimensions of Global Change, Carnegie Mellon University
“Where does it fall short? It says nothing about how to build and sustain the institutions within which policy-makers will negotiate which constraints are to be allowed and which are not. It says nothing about how to design the mitigation policy that will most efficiently maintain tolerable climate change. It says nothing about how to build and sustain the institutions with which the globe will adjust this policy as our understanding of tolerable change evolves. But it does stand ready to complement other analytical approaches in informing these processes.” - Richard S.J. Tol (2006) – Europe’s long-term climate target: A critical evaluation – Energy Policy 35:424-432 – Published online: 20/01/2006 – Research unit Sustainability and Global Change, Hamburg University and Centre for Marine and Atmospheric Science
“The 0.1 ºC/decade target can be traced to the late 1980s, but then the trace vanishes. Apocryphal evidence [ref] holds that the 0.1 ºC/decade target is appropriate for a plant species on the shores of a lake in North America. This study was never published, but mentioned at dinner during an early climate conference. Someone else repeated the information the next day in plenary, and an urban legend was born. As the 0.1 ºC/decade target has vanished from policy discussions—perhaps because natural variability is greater—it need not concern us any further. However, it is important to know whether the 2 ºC is valid or similarly based on bogus science.” - Samuel Randalls (2010) – History of the 2 ºC climate target – Wiley Interdisciplinary Reviews: Climate Change 1:598-605 DOI:10.1002/wcc.62 – Published online: 01/07/2010 – Department of Geography, University College London
“Legend has it that the targets emerged from a dinner conversation about unpublished work on plant species on the shores of a North American Lake, which was subsequently reported in a plenary conference session.” - Wissenschaftlicher Beirat der Bundesregierung Globale Umweltveränderungen (1995) – Special Report. Scenario for the derivation of global CO2 reduction targets and implementation strategies. Statement on the occasion of the First Conference of the Parties to the Framework Convention on Climate Change in Berlin – WBGU (German Advisory Council on Global Change) – March 1995 – http://www.wbgu.de/fileadmin/templates/dateien/veroeffentlichungen/sondergutachten/sn1995/wbgu_sn1995_engl.pdf
“The first principle, preservation of Creation in its present form, is presented within this scenario in the form of a tolerable “temperature window”. This window is derived from the range of fluctuation for the Earth’s mean temperature in the late Quaternary period. This geological epoch has shaped our present-day environment, with the lowest temperatures occurring in the last ice age (mean minimum around 10.4 °C) and the highest temperatures during the last interglacial period (mean maximum around 16.1 °C). If this temperature range is exceeded in either direction, dramatic changes in the composition and function of today’s ecosystems can be expected. If we extend the tolerance range by a further 0.5 °C at either end, then the tolerable temperature window extends from 9.9 °C to 16.6 °C. Today’s global mean temperature is around 15.3 °C, which means that the temperature span to the tolerable maximum is currently only 1.3 °C.”
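The window itself is only arithmetic on the numbers given in the quote; restating it makes the 1.3 °C of remaining headroom explicit:

```python
# Restatement of the WBGU "tolerable temperature window" using only the
# figures quoted above (an illustration, not an independent result).
T_MIN_QUATERNARY = 10.4  # mean minimum of the last ice age (deg C)
T_MAX_QUATERNARY = 16.1  # mean maximum of the last interglacial (deg C)
MARGIN = 0.5             # tolerance added at either end (deg C)
T_TODAY = 15.3           # global mean temperature cited as "today" (deg C)

low, high = T_MIN_QUATERNARY - MARGIN, T_MAX_QUATERNARY + MARGIN  # 9.9 to 16.6
print(f"tolerable window: {low:.1f}-{high:.1f} deg C; "
      f"headroom above today: {high - T_TODAY:.1f} deg C")
```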
- Potsdam Memorandum – Global Sustainability: A Nobel Cause – 10/10/2007
“In order to achieve Climate Stabilization, a post-2012 regime should comprise the following key elements: A global target such as the 2º C-limit relative to pre-industrial levels …” - Nick Brooks et al (2005) – Climate stabilisation and “dangerous” climate change: A review of the relevant issues – Background paper – Tyndall Centre for Climate Change Research, University of East Anglia; Centre for Ecology & Hydrology, Wallingford; Ministry of the Environment, Stockholm, Sweden; Cambridge Econometrics, Covent Garden, Cambridge; Tyndall Centre (North), School of Mechanical Engineering
“Estimates of climate sensitivity and future warming are generally based on an assumption that the climate system responds to increasing GHG concentrations in a more-or-less linear fashion. However, there is concern that, above critical temperature thresholds, large “singular” events may be triggered … Many existing climate models are unable to reproduce transient non-linear changes in climate such as have occurred in the past, suggesting that the likelihood of abrupt climate change may have been underestimated.” - James Hansen et al (2011) – The Case for Young People and Nature: A Path to a Healthy, Natural, Prosperous Future – Draft paper – 04/05/2011 – Columbia University Earth Institute, New York – http://www.columbia.edu/~jeh1/mailings/2011/20110505_CaseForYoungPeople.pdf – 14
“The conclusion is that global warming of 1°C relative to 1880-1920 mean temperature (i.e., 0.75°C above the 1951-1980 temperature or 0.3°C above the 5-year running mean temperature in 2000), if maintained for long, is already close to or into the ‘dangerous’ zone. The suggestion that 2°C global warming may be a ‘safe’ target is extremely unwise based on critical evidence accumulated over the past three decades. Global warming of this amount would be putting Earth on a path toward Pliocene-like conditions, i.e., a very different world marked by massive and continual disruptions to both society and ecosystems. It would be a world in which the world’s species and ecosystems will have had no recent evolutionary experience, surely with consequences and disruptions to the ecosystem services that maintain human communities today. There are no credible arguments that such rapid change would not have catastrophic circumstances for human well-being.” - Hansen (2005) – A slippery slope: How much global warming constitutes “dangerous anthropogenic interference”. An Editorial Essay – Climatic Change 68:269-279 doi:10.1007/s10584-005-4135-0 – NASA Goddard Institute for Space Studies and Columbia University Earth Institute
“The 2 ◦C scenario cannot be recommended as a responsible target, as it almost surely takes us well into the realm of dangerous anthropogenic interference with the climate system.” - Lynn Rosentrater (2005) – 2° is too much! Evidence and Implications of Dangerous Climate Change in the Arctic – WWF International Arctic Programme – 7 http://www.panda.org/downloads/arctic/050129evidenceandimplicationshires.pdf
“In the Arctic, even a slight shift in temperature, pushing averages to above freezing, can bring about rapid and dramatic changes in an ecosystem that is defined by being frozen.” - Katherine Richardson et al (2009) – Global Risks, Challenges & Decisions – International Scientific Congress Climate Change – 12/03/2009 – Australian National University, ETH Zürich, National University of Singapore, Peking University, University of California – Berkeley, University of Cambridge, University of Copenhagen, University of Oxford, The University of Tokyo, Yale University – http://www.climatecongress.ku.dk – 12
“A 2 ºC guardrail, which was thought in 2001 to have avoided serious risks for all five reasons for concern, is now inadequate to avoid serious risks to many unique and threatened ecosystems and to avoid a large increase in the risks associated with extreme weather events. Third, the risks of large scale discontinuities, such as the tipping elements described above, were considered to be very low in 2001 for a 2 ºC increase but are now considered to be moderate for the same increase. In summary, although a 2 ºC rise in temperature above pre-industrial remains the most commonly quoted guardrail for avoiding dangerous climate change, it nevertheless carries significant risks of deleterious impacts for society and the environment.” - Bruce T. Anderson (2011) – Near-term increase in frequency of seasonal temperature extremes prior to the 2°C global warming target – Climatic Change doi:10.1007/s10584-011-0196-4 – Published online: 08/09/2011 – Department of Geography and Environment, Boston University
“Results indicate that given a 2°C global mean temperature increase it is expected that for 70–80% of the land surface maximum seasonal-mean temperatures will exceed historical extremes (as determined from the 95th percentile threshold value over the second half of the 20th Century) in at least half of all years, i.e. the current historical extreme values will effectively become the norm. Many regions of the globe—including much of Africa, the southeastern and central portions of Asia, Indonesia, and the Amazon—will reach this point given the “committed” future global-mean temperature increase of 0.6°C (1.4°C relative to the pre-industrial era) and 50% of the land surface will reach it given a future global-mean temperature increase of between 0.8 and 0.95°C (1.6–1.75°C relative to the pre-industrial era). These results suggest substantial fractions of the globe could experience seasonal-mean temperature extremes with high regularity, even if the global-mean temperature increase remains below the 2°C target.”
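A hedged way to see why “the current historical extreme values will effectively become the norm” is to treat a region’s seasonal-mean temperature as roughly normally distributed from year to year: shifting the mean by a fraction of the interannual standard deviation sharply raises the chance of exceeding the old 95th-percentile season. The normal assumption and the shift values below are illustrative only, not taken from the model analysis in the paper:

```python
# Probability that a season exceeds the historical 95th-percentile value once
# the mean has warmed by a given number of interannual standard deviations.
# Assumes a normal year-to-year distribution (an illustrative simplification).
from statistics import NormalDist

z95 = NormalDist().inv_cdf(0.95)  # ~1.645
for shift in (0.5, 1.0, 1.5, 2.0):
    p = 1.0 - NormalDist().cdf(z95 - shift)
    print(f"mean shift = {shift:.1f} sigma -> {p:.0%} of years exceed the old extreme")
```

Once the mean shift reaches roughly 1.6 standard deviations, more than half of all years exceed what used to be a one-in-twenty season, which is the sense in which the old extremes become the norm.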
- F. Simon, G. Lopez-Abente, E. Ballester, F. Martinez (2005) – Mortality in Spain during the heat waves of summer 2003 – Eurosurveillance 10:156-161 – Published online: 01/07/2005 – Centro Nacional de Epidemiología, Instituto de Salud Carlos III; Programa de Epidemiología Aplicada de Campo (PEAC) – http://www.eurosurveillance.org/ViewArticle.aspx?ArticleId=555 – 4 authors
“Meteorological information was provided by the Instituto Nacional de Meteorología (National Institute of Meteorology). Spain experienced three heat waves in 2003. The total associated excess deaths were 8% (43,212 observed deaths compared with 40,046 expected deaths). Excess deaths were only observed in those aged 75 years and over (15% more deaths than expected for the age group 75 to 84 and 29% for those aged 85 or over).” - J.M. Robine et al (2008) – Death toll exceeded 70,000 in Europe during the summer of 2003 – Comptes Rendus Biologies 331:171-178 – Published online: 31/12/2007 – INSERM, Démographie et santé – http://www.ncbi.nlm.nih.gov/pubmed/18241810 – 7 authors
“Daily numbers of deaths at a regional level were collected in 16 European countries. Summer mortality was analyzed for the reference period 1998-2002 and for 2003. More than 70,000 additional deaths occurred in Europe during the summer 2003. Major distortions occurred in the age distribution of the deaths, but no harvesting effect was observed in the months following August 2003. Global warming constitutes a new health threat in an aged Europe that may be difficult to detect at the country level, depending on its size. Centralizing the count of daily deaths on an operational geographical scale constitutes a priority for Public Health in Europe.” - Bruce T. Anderson (2011) – Intensification of seasonal extremes given a 2°C global warming target – Climatic Change doi:10.1007/s10584-011-0213-7 – Published online: 08/09/2011 – Department of Geography and Environment, Boston University
“Hotspot regions include much of eastern and northern South America, large portions of Africa, and regions throughout Indonesia—regions where the base-temperature increases, while more moderate than elsewhere, are relatively large compared to the interannual-to-decadal variations in seasonal mean temperatures (BN09). In these regions, given a 1.2°C increase in global-mean temperatures it is more likely than not (and in some cases highly likely) that each year will have a 3-month period in which temperatures exceed the most extreme value experienced during the last half of the 20th Century. As found in BN09, these regions are also likely to be the ones in which the average seasonal-mean temperatures first exceed the most extreme value experienced during the last half of the 20th Century.” - James Hansen et al (2005) – Earth’s Energy Imbalance: Confirmation and Implications – Science 308:1431-1435 doi:10.1126/science.1110252 – Published online: 03/06/2005 – NASA Goddard Institute for Space Studies and Columbia University Earth Institute – 15 authors
“Implications include (i) the expectation of additional global warming of about 0.6 °C without further change of atmospheric composition.” - David S. Battisti and Rosamond L. Naylor (2009) – Historical Warnings of Future Food Insecurity with Unprecedented Seasonal Heat – Science 323:240-244 – Published online: 09/01/2009 – Department of Atmospheric Sciences, University of Washington; Program on Food Security and the Environment, Stanford University
“We used observational data and output from 23 global climate models to show a high probability (>90%) that growing season temperatures in the tropics and subtropics by the end of the 21st century will exceed the most extreme seasonal temperatures recorded from 1900 to 2006. In temperate regions, the hottest seasons on record will represent the future norm in many locations … What if the average future seasonal temperature were to exceed the hottest seasons on record (fig)? Entering a whole new realm of high seasonally averaged temperatures, not just multiday heat waves, will surely challenge the global population’s ability to produce adequate food in the future or even to cope physically with chronic heat stress, unless major adaptations are made.” - Hit the brakes hard – Real Climate – Published online: 29/04/2009
“We feel compelled to note that even a “moderate” warming of 2°C stands a strong chance of provoking drought and storm responses that could challenge civilized society, leading potentially to the conflict and suffering that go with failed states and mass migrations. Global warming of 2°C would leave the Earth warmer than it has been in millions of years, a disruption of climate conditions that have been stable for longer than the history of human agriculture. Given the drought that already afflicts Australia, the crumbling of the sea ice in the Arctic, and the increasing storm damage after only 0.8°C of warming so far, calling 2°C a danger limit seems to us pretty cavalier.” - Chris D. Thomas et al (2004) – Extinction risk from climate change – Nature 427:145-148 doi:10.1038/nature02121 – Published online: 08/01/2004 – Centre for Biodiversity and Conservation, School of Biology, University of Leeds – 19 authors
“Exploring three approaches in which the estimated probability of extinction shows a power-law relationship with geographical range size, we predict, on the basis of mid-range climate-warming scenarios for 2050, that 15–37% of species in our sample of regions and taxa will be ‘committed to extinction’.” - Harry J. Dowsett et al (1999) – Middle Pliocene Paleoenvironmental Reconstruction: PRISM2 – US Geological Survey – Published online: 01/01/1999 – U.S. Geological Survey – http://pubs.usgs.gov/of/1999/of99-535/ – 7 authors
“Important features of PRISM2 compared to modern are: … 3. Sea level change of + 25 meters which requires substantial reduction of the Antarctic Ice Sheet.” - James E. Hansen and Makiko Sato (2011) – Paleoclimate Implications for Human-Made Climate Change – In: «Climate Change at the Eve of the Second Decade of the Century: Inferences from Paleoclimate and Regional Aspects: Proceedings of Milutin Milankovitch 130th Anniversary Symposium» (Eds. A. Berger, F. Mesinger and D Šijački) – Published online: 19/01/2011 – NASA Goddard Institute for Space Studies and Columbia University Earth Institute, New York – http://arxiv.org/abs/1105.0968v3
“We have presented evidence in this paper that prior interglacial periods were less than 1°C warmer than the Holocene maximum. If we are correct in that conclusion, the EU2C scenario implies a sea level rise of many meters. It is difficult to predict a time scale for the sea level rise, but it would be dangerous and foolish to take such a global warming scenario as a goal.” - Timothy M. Lenton et al (2008) – Tipping elements in the Earth’s climate system – Proceedings of the National Academy of Sciences PNAS 105:1786-1793 doi:10.1073/pnas.0705414105 – Published online: 12/02/2008 – School of Environmental Sciences, University of East Anglia, and Tyndall Centre for Climate Change Research – 7 authors
“Changes in fire frequency probably contribute to bistability and will be amplified by forest fragmentation due to human activity. Indeed land-use change alone could potentially bring forest cover to a critical threshold. Thus, the fate of the Amazon may be determined by a complex interplay between direct land-use change and the response of regional precipitation and ENSO to global forcing.” - Hideo Shiogama et al (2011) – Observational constraints indicate risk of drying in the Amazon basin – Nature Communications 2:253 doi:10.1038/ncomms1252 – Published online: 29/03/2011 – Atmospheric Environment Division, National Institute for Environmental Studies, Tsukuba, Japan – 7 authors
“Here, we show that, although the ensemble mean assessment suggested wetting across most of South America, the observational constraints indicate a higher probability of drying in the Amazon basin. Thus, over-reliance on the consensus of models can lead to inappropriate decision making.” - Daniel C Nepstad et al (2008) – Interactions among Amazon land use, forests and climate: prospects for a near-term forest tipping point – Philosophical Transactions of the Royal Society of London B 363:1737-1746 doi:10.1098/rstb.2007.0036 – Published online: 11/02/2008 – Woods Hole Research Center – 4 authors
“Rising worldwide demands for biofuel and meat are creating powerful new incentives for agro-industrial expansion into Amazon forest regions. Forest fires, drought and logging increase susceptibility to further burning while deforestation and smoke can inhibit rainfall, exacerbating fire risk. If sea surface temperature anomalies (such as El Niño episodes) and associated Amazon droughts of the last decade continue into the future, approximately 55% of the forests of the Amazon will be cleared, logged, damaged by drought or burned over the next 20 years, emitting 15–26 Pg of carbon to the atmosphere. Several important trends could prevent a near-term dieback. As fire-sensitive investments accumulate in the landscape, property holders use less fire and invest more in fire control. Commodity markets are demanding higher environmental performance from farmers and cattle ranchers. Protected areas have been established in the pathway of expanding agricultural frontiers. Finally, emerging carbon market incentives for reductions in deforestation could support these trends.” - Simon L. Lewis et al (2011) – The 2010 Amazon Drought – Science 331:554 DOI:10.1126/science.1200807 – Published online: 04/02/2011 – School of Geography, University of Leeds
“By using relationships between drying and forest biomass responses measured for 2005, we predict the impact of the 2010 drought as 2.2 × 1015 grams of carbon [95% confidence intervals (CIs) are 1.2 and 3.4], largely longer-term committed emissions from drought-induced tree deaths, compared with 1.6 ×1015 grams of carbon (CIs 0.8 and 2.6) for the 2005 event.” - Peter M. Cox et al (2008) – Increasing risk of Amazonian drought due to decreasing aerosol production – Nature 453:212-215 doi:10.1038/nature06960 – Published online: 08/05/2008 – School of Engineering, Computing and Mathematics, University of Exeter – 9 autores
“The Amazon rainforest plays a crucial role in the climate system, helping to drive atmospheric circulations in the tropics by absorbing energy and recycling about half of the rainfall that falls on it. This region (Amazonia) is also estimated to contain about one-tenth of the total carbon stored in land ecosystems, and to account for one-tenth of global, net primary productivity [ref]. The resilience of the forest to the combined pressures of deforestation and global warming is therefore of great concern [ref], especially as some general circulation models (GCMs) predict a severe drying of Amazonia in the twenty-first century [refs].” - Corinne Le Quéré, Michael R. Raupach, Josep G. Canadell, Gregg Marland et al (2009) – Trends in the sources and sinks of carbon dioxide – Nature Geoscience 2:831-836 doi:10.1038/ngeo689 – Published online: 17/11/2009 – School of Environmental Sciences, University of East Anglia – 24 authors – http://www.globalcarbonproject.org/global/pdf/lequere-et-al.-2009.trends-sources-and-sinks.nature-geo.pdf
“In the past 50 years, the fraction of CO2 emissions that remains in the atmosphere each year has likely increased, from about 40% to 45%, and models suggest that this trend was caused by a decrease in the uptake of CO2 by the carbon sinks in response to climate change and variability. Changes in the CO2 sinks are highly uncertain, but they could have a significant influence on future atmospheric CO2 levels. It is therefore crucial to reduce the uncertainties.” - J.E.N. Veron (2008) – Mass extinctions and ocean acidification: biological constraints on geological dilemmas – Coral Reefs 27:459–472 – Published online: 06/05/2008 – Coral Reef Research, Australia
“By process of elimination, primary causes of mass extinctions are linked in various ways to the carbon cycle in general and ocean chemistry in particular with clear association with atmospheric carbon dioxide levels. The prospect of ocean acidification is potentially the most serious of all predicted outcomes of anthropogenic carbon dioxide increase. This study concludes that acidification has the potential to trigger a sixth mass extinction event and to do so independently of anthropogenic extinctions that are currently taking place.” - Joel B. Smith et al (2009) – Assessing dangerous climate change through an update of the Intergovernmental Panel on Climate Change (IPCC) ‘reasons for concern’ – Proceedings of the National Academy of Sciences PNAS 106:4133-4137 doi:10.1073/pnas.0812355106 – Published online: 17/03/2009 – Stratus Consulting, Inc. – http://www.pnas.org/content/106/11/4133.full.pdf+html – 15 authors
“In summary, the shifting of risk transitions to lower GMTs is derived from assessment of (i) strengthened observations of impacts already occurring because of warming to date, (ii) better understanding and greater confidence in the likelihood of climatic events and the magnitude of impacts and risks associated with increases in GMT, (iii) more precise identification of particularly affected sectors, groups, and regions, and (iv) growing evidence that even modest increases in GMT above levels circa 1990 could commit the climate system to the risk of very large impacts on multiple-century time scales.” - John Bechhoefer (2005) – Feedback for physicists: A tutorial essay on control – Reviews of Modern Physics 77:783-836 – 31/08/2005 – Department of Physics, Simon Fraser University – http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.124.7043&rep=rep1&type=pdf
“Feedback and control theory are important ideas that should form part of the education of a physicist but rarely do. … [they] are such important concepts that it is odd that they usually find no formal place in the education of physicists … Introductory engineering textbooks … are long (800 pages is typical) … their examples are understandably geared more to the engineers than to the physicist.” - Edward Lorenz (1963) – Deterministic Nonperiodic Flow – Journal of Atmospheric Sciences 20:130-141 – Published online: 01/03/1963 – Massachusetts Institute of Technology
“Finite systems of deterministic ordinary nonlinear differential equations may be designed to represent forced dissipative hydrodynamic flow. Solutions of these equations can be identified with trajectories in phase space. For those systems with bounded solutions, it is found that nonperiodic solutions are ordinarily unstable with respect to small modifications, so that slightly differing initial states can evolve into considerably different states. Systems with bounded solutions are found to possess bounded numerical solutions. A simple system representing cellular convection is solved numerically. All of the solutions are found to be unstable, and almost all of them are nonperiodic.” - Donella Meadows et al (1972) – The Limits to Growth. A Report for the Club of Rome’s Project for the Predicament of Mankind – Universe Books, New York
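As a minimal numerical sketch of the sensitivity to initial conditions that the Lorenz (1963) abstract describes, the following Python snippet integrates two almost identical initial states of the Lorenz system and prints their growing separation. The parameter values (sigma = 10, rho = 28, beta = 8/3) and the simple Euler integration are illustrative assumptions, not taken from any of the works cited here.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0):
    # One forward-Euler step of the Lorenz (1963) equations.
    x, y, z = state
    dxdt = sigma * (y - x)
    dydt = x * (rho - z) - y
    dzdt = x * y - beta * z
    return state + dt * np.array([dxdt, dydt, dzdt])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-8, 0.0, 0.0])   # second trajectory, perturbed by 1e-8
for step in range(3001):
    if step % 500 == 0:
        print(f"t = {step * 0.01:5.1f}   separation = {np.linalg.norm(a - b):.3e}")
    a, b = lorenz_step(a), lorenz_step(b)
```

The separation grows by many orders of magnitude over a few tens of time units, which is the practical meaning of “slightly differing initial states can evolve into considerably different states”.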
- Matthew R. Simmons (2000) – Revisiting «The Limits to Growth»: Could the Club of Rome have been correct, after all?
“Phase One of the predicament of mankind never really made it to Phase Two. Instead, rather than merely ignoring this work so that its chilling conclusions and the issues it raised were forgotten, too many «experts» decided to use this thoughtful work as an easy target of intellectual scorn.” - Graham M. Turner (2008) – A comparison of The Limits to Growth with 30 years of reality – Global Environmental Change 18:397-411 doi:10.1016/j.gloenvcha.2008.05.001 – Published online: 13/05/2008 – CSIRO Sustainable Ecosystems
“The analysis shows that 30 years of historical data compares favourably with key features of a ‘business as usual’ scenario, which results in collapse of the global system midway through the 21st Century.” - Andrew Jarvis, 2010. Personal communication
- Wilfrid Bach (1980) – The CO2 issue – what are the realistic options? – Climatic Change 3:3-5 doi:10.1007/BF02423165
“[A] broad systems approach … to help define some ‘threshold’ value of CO2-induced climate change beyond which there would likely be a major disruption of the economic, social and political fabric of certain societies … An assessment of such a critical CO2- level ahead of time could help to define those climatic changes, which would be acceptable and those that should be averted if possible.” - Daniel A. Lashof (1988) – The dynamic greenhouse: Feedback processes that may influence future concentrations of atmospheric trace gases and climatic change – Climatic Change 14:213-242 doi:10.1007/BF00134964 – Published online: 01/01/1989 – U.S. Environmental Protection Agency
“The gain from biogeochemical feedbacks is estimated to be 0.05–0.29 compared to 0.17–0.77 for geophysical climate feedbacks. The potentially most significant biogeochemical feedbacks are probably release of methane hydrates, changes in ocean chemistry, biology, and circulation, and changes in the albedo of the global vegetation. While each of these feedbacks is modest compared to the water vapor feedback, the biogeochemical feedbacks in combination have the potential to substantially increase the climate change associated with any given initial forcing.” - Georgi M. Dimirovski et al (2006) – Control system approaches for sustainable development and instability management in the globalization age – Annual Reviews in Control 30:103-115 doi:10.1016/j.arcontrol.2006.01.004 – Published online: 19/06/2006 – Dogus University, Faculty of Engineering, Acibadem, Kadikoy, Turkey – 6 authors
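The feedback-gain ranges quoted from Lashof can be combined under the standard linear-feedback assumption, in which individual gains add and the amplified response equals the no-feedback response times 1/(1 − f). The short sketch below applies that textbook algebra to the quoted ranges; the combination rule and the reading of a gain of 1 or more as runaway are assumptions of the sketch, not claims from the paper.

```python
# Linear feedback algebra (an assumption of this sketch): individual gains add,
# and the amplified response is 1 / (1 - f) times the no-feedback response;
# a combined gain of 1 or more means a linearly unstable (runaway) system.
geophysical = (0.17, 0.77)     # gain range quoted by Lashof
biogeochemical = (0.05, 0.29)  # gain range quoted by Lashof

for label, f_total in [("low ends", geophysical[0] + biogeochemical[0]),
                       ("high ends", geophysical[1] + biogeochemical[1])]:
    if f_total < 1.0:
        print(f"{label}: combined gain f = {f_total:.2f}, "
              f"amplification = {1 / (1 - f_total):.2f}")
    else:
        print(f"{label}: combined gain f = {f_total:.2f} >= 1, "
              f"linearly unstable (runaway)")
```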
“The civilization of mankind in the globalization age depends heavily on advanced information technologies emerging from automation and decision expertise and their respective scientific disciplines. The broad area of social systems, being essentially human centred systems, is a cross-, inter- and multi-disciplinary challenge to the control community. Social systems of contemporary civilization are reviewed from the systems science viewpoint and on the grounds of recent developments in control science and technology. Recent developments have emphasised the social responsibility of the control community during the on-going globalization and changes from the Cold-War bipolar world to a unipolar one, on the way to mankind’s multi-polar world of the future.” - Alessio Alexiadis (2007) – Global warming and human activity: A model for studying the potential instability of the carbon dioxide/temperature feedback mechanism – Ecological Modelling 203:243-256 – Published online: 08/01/2007 – UCY-CompSci, European Marie Curie Transfer of Knowledge Center (TOK-DEV) for the Computational Sciences, Department of Mechanical and Manufacturing Engineering , University of Cyprus
“This means that there is a 14% chance that the pole is already in the unstable region and that the temperatures and the concentrations that we experience today are just the initial transient of the typical run-away behavior of an unstable system. Besides this, the effect of the variables that have not been included as inputs (i.e. aerosols, ice albedo, other greenhouse gases, etc.) must be considered. Their effect in the past is already in the parameters since the model is trained from historical data. When the model is used to calculate future scenarios, however, it automatically assumes that these phenomena behave with the same dynamic they showed in the past.” - Benjamin M. Sanderson et al (2007) – Towards constraining climate sensitivity by linear analysis of feedback patterns in thousands of perturbed-physics GCM simulations – Climate Dynamics 30:175-190 doi:10.1007/s00382-007-0280-7 – Published online: 03/07/2007 – AOPP, Department of Physics, University of Oxford, Clarendon Laboratory – 4 autores
“A linear analysis is applied to a multi-thousand member «perturbed physics» GCM ensemble to identify the dominant physical processes responsible for variation in climate sensitivity across the ensemble. Model simulations are provided by the distributed computing project, climateprediction.net … Our validation does not rule out all the strong tropical convective feedbacks leading to a large climate sensitivity.” - Marcia B. Baker and Gerard H. Roe (2009) – The Shape of Things to Come: Why Is Climate Change So Predictable? – Journal of Climate 22:4574-4589 – Published online: 10/02/2009 – Department of Earth and Space Sciences, University of Washington
“The framework of feedback analysis is used to explore the controls on the shape of the probability distribution of global mean surface temperature response to climate forcing. It is shown that ocean heat uptake, which delays and damps the temperature rise, can be represented as a transient negative feedback.” - S. E. Huisman et al (2009) – Robustness of multiple equilibria in the global ocean circulation – Geophysical Research Letters 36, L01610, doi:10.1029/2008GL036322 – Published online: 15/02/2009 – Institute for Marine and Atmospheric Research Utrecht – 4 autores
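The “shape of the distribution” framework summarized in the Baker and Roe quote can be illustrated with the closely related result, developed by the same authors elsewhere, that a symmetric uncertainty in the total feedback gain f yields a skewed, fat-tailed distribution of equilibrium warming dT = dT0 / (1 − f). In the sketch below the no-feedback warming of 1.2 ºC and the assumed Gaussian spread of f are illustrative numbers only.

```python
import numpy as np

rng = np.random.default_rng(0)
dT0 = 1.2                                # assumed no-feedback warming for doubled CO2, ºC
f = rng.normal(0.65, 0.13, 100_000)      # assumed symmetric uncertainty in the feedback gain
f = f[f < 0.95]                          # drop near-runaway samples to keep dT finite
dT = dT0 / (1.0 - f)                     # equilibrium warming with feedbacks

print(f"median warming     : {np.median(dT):.1f} ºC")
print(f"5th-95th percentile: {np.percentile(dT, 5):.1f} - {np.percentile(dT, 95):.1f} ºC")
print(f"mean minus median  : {dT.mean() - np.median(dT):.2f} ºC (long right tail)")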
“In an idealized Atlantic-Pacific ocean model we study the steady state solutions versus freshwater input in the northern North Atlantic. We find that four different states, the Conveyor (C), the Southern Sinking (SS), the Northern Sinking (NS) and the Inverse Conveyor (IC), appear as two disconnected branches of solutions, where the C is connected with the SS and the NS with the IC. We argue that the latter has the intriguing consequence that the parameter volume for which multiple steady states exist is greatly increased.” - Stephen E. Schwartz (2010) – Feedback and sensitivity in an electrical circuit: an analog for climate models – Climatic Change doi:10.1007/s10584-010-9903-9 – Published online: 27/07/2010 – Atmospheric Sciences Division, Brookhaven National Laboratory
“This analogy is quite valuable in interpreting the sensitivity of the climate system, but usage of this algebra and terminology in the climate literature is often inconsistent, with resultant potential for confusion and loss of physical insight”. - David Wasdell – The ‘Apollo-Gaia Project’. Historical Background – Unit for Research into Changing Institutions – Published online: 18/06/2007 – http://www.apollo-gaia.org/A-GProjectDevelopment.pdf
“In the early months of 2005, several scientists in the field of Climate Change and Footprint Overshoot were beginning to assert that political decision-making and effective action were not responding rationally to clearly presented scientific material. While some of the resistance and inertia could doubtless be attributed to the influence of vested interests and the political fear of losing electoral support, the intensity of the dynamics of denial pointed to powerful underlying social processes that were largely unconscious.” - David Wasdell (2009) – Runaway Climate Change: Boundary Conditions & Implications for Policy – World Forum on Enterprise and the Environment – Published online: 07/07/2009 – Founder and Director of the Unit for Research into Changing Institutions – http://www.feasta.org/forum/files/interim_scientific_report_153.pdf
“With four years analytic work behind us, the conceptual design of a system dynamics platform for future climate modelling is nearing completion … This elegant, energy-based approach, avoids inappropriate dependency on climate sensitivity ratios. The critical inputs depend on iterative approximations of the contribution of the various temperature-sensitive feedback mechanisms to the value of radiative forcing, taking account of the time-delay variables in each mechanism. Initial quantification of some of the critical parameters and ratios is complete.” - Hans J. Schellnhuber (1999) – Earth system analysis and the Second Copernican Revolution – Nature 402:C19-C23 – Published online: 02/12/1999 – Potsdam Institute for Climate Impact Research
“The second Copernican revolution will be completed only if we take this responsibility – in spite of irreducible cognitive deficits as once lamented by Alfonso X of Castile: «If the Lord Almighty had consulted me before embarking on the Creation, I would have recommended something simpler.»” - James Valverde A. Jr et al (2004) – Sequential climate decisions under uncertainty: An integrated framework – Environmental Modeling and Assessment 4:87-101 doi:10.1023/A:1019056032181 – Published online: 28/10/2004 – Department of Operational Research, London School of Economics
“We develop an integrated framework for evaluating sequential greenhouse gas abatement policies under uncertainty. The analysis integrates information concerning the magnitude, timing, and impacts of climate change with data on the likely effectiveness and cost of possible response options. Reduced-scale representations of the global climate system, drawn from the MIT Integrated Global System Model, form the empirical basis of the analysis. The method is illustrated in application to emissions control policies of the form considered under the United Nations Framework Convention on Climate Change.” - Andrew J. Jarvis et al (2005) – An Incremental Carbon Emissions Strategy for the Global Carbon Cycle Using State Variable Feedback Control Design – Stabilization 2005 – Department of Environmental Sciences and Centre for Research on Environmental Systems and Statistics, Lancaster University – http://stabilisation.metoffice.com/posters/Jarvis_Andrew.pdf – 4 authors
“It is shown that, provided a reasonable description of the short to medium term dynamics of the global carbon cycle can be identified, this new approach to specifying carbon emissions for policy making appears robust to model uncertainty and exogenous disturbance. It also provides a control law that can be implemented ‘on-line’ and so is adjusted in relation to the latest measured levels of atmospheric CO2 as time progresses.” - O. Bahn et al (2008) – A stochastic control model for optimal timing of climate policies – Automatica 44:1545-1558 – Published online: 19/05/2008 – GERAD and MQG, HEC Montréal
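To give a feel for the kind of ‘on-line’ control law described in the Jarvis et al. quote, the toy sketch below sets emissions each year in proportion to the gap between the latest measured CO2 concentration and a target, applied to a one-box atmosphere with a constant airborne fraction. It is not the authors’ state-variable design; every number in it is an assumption chosen for illustration.

```python
# Toy proportional control law (illustrative only): emissions are set each year from
# the latest "measured" CO2 concentration; the atmosphere is a one-box model with a
# constant airborne fraction.  All numbers are assumptions.
target_ppm = 450.0
conc = 390.0            # starting CO2 concentration, ppm
airborne = 0.45         # assumed airborne fraction of emissions
gtc_per_ppm = 2.13      # approximate atmospheric conversion, GtC per ppm
gain = 0.2              # controller gain, GtC/yr per ppm of remaining gap

for year in range(2012, 2113):
    emissions = max(0.0, gain * (target_ppm - conc))   # on-line feedback law
    conc += airborne * emissions / gtc_per_ppm          # toy carbon cycle update
    if (year - 2012) % 20 == 0:
        print(f"{year}: emissions = {emissions:5.1f} GtC/yr, CO2 = {conc:6.1f} ppm")
```

Because the control law is recomputed from the measured concentration at every step, emissions ramp down automatically as the target is approached, which is the basic appeal of treating the emissions pathway as a feedback problem rather than a fixed schedule.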
“A stochastic control model is proposed as a paradigm for the design of optimal timing of greenhouse gas (GHG) emission abatement. The resolution of uncertainty concerning climate sensitivity and the technological breakthrough providing access to a carbon-free production economy are modeled as controlled stochastic jump processes. The optimal policy is characterized using the dynamic programming solution to a piecewise deterministic optimal control problem.” - Andrew Jarvis et al (2009) – Stabilizing global mean surface temperature: A feedback control perspective – Environmental Modelling & Software 24:665-674 doi:10.1016/j.envsoft.2008.10.016 – 24/12/2008 – Lancaster Environment Centre, Lancaster University; Engineering Department, Lancaster University; Fenner School of Environment and Society, Australian National University; School of Electrical Engineering and Telecommunications, University of New South Wales, Sydney – 4 autores
“The growing field of research based around Model Predictive Control (ref) is deeply involved in addressing control issues which appear very closely related to those encountered in climate mitigation research … a very large, high order simulation model can be emulated over a whole range of its parameter values, so that the emulation model in this complete form can be used in place of the large model for prediction and control purposes.” - S. Hallegatte et al (2009) – An Approach to Climate Change in Terms of Feedback Loops – Geophysical Research Abstracts – European Geosciences Union – Published online: 01/01/2009 – CIRED (EHESS/CNRS)
“In strongly coupled systems, it is difficult to establish causal relationships. In front of this difficulty, the formulation in terms of feedback loops looks promising, and has already been used in a static framework in the climatology field. This talk proposes an original methodology to characterize feedback loops and to generalize the static feedback gain to take into account the dynamics … This also provides insights on future research topics: it shows in particular that accounting for the diversity of the economic time scales is essential.” - Michael Funke and Michael Paetz (2011) – Environmental policy under model uncertainty: a robust optimal control approach – Climatic Change doi:10.1007/s10584-010-9943-1 – Published online: 21/12/2010 – Department of Economics, Hamburg University
“We hope that further applications of the robust modelling technique will soon follow, making use of increasing processor speeds which makes robustness analysis feasible for larger climate models requiring more computational time. By doing this, the gap between robust control theory and its application may be closing.” - Robert Lempert and Shawn McKay (2011) – Some thoughts on the role of robust control theory in climate-related decision support – An Editorial Comment – Climatic Change 107:241-246 doi:10.1007/s10584-011-0135-4 – Published online: 12/07/2011 – RAND Corporation
“Control theory also offers additional descriptions of adaptive policies, representations of uncertainty, and decision criteria than those used by Funke and Paetz. There are literally hundreds of different control algorithms that may have potential use in climate policy studies.” - William F. Ruddiman (2003) – The anthropogenic greenhouse era began thousands of years ago – Climatic Change 61:261-93 – Department of Environmental Sciences, University of Virginia – http://stephenschneider.stanford.edu/Publications/PDF_Papers/Ruddiman2003.pdf
“The anthropogenic era is generally thought to have begun 150 to 200 years ago, when the industrial revolution began producing CO2 and CH4 at rates sufficient to alter their compositions in the atmosphere. A different hypothesis is posed here: anthropogenic emissions of these gases first altered atmospheric concentrations thousands of years ago. This hypothesis is based on three arguments.” - William F. Ruddiman (2005) – Los tres jinetes del cambio climático. Una historia milenaria del hombre y el clima – Turner, Madrid – Department of Environmental Sciences, University of Virginia
- International Conference on the Assessment of the Role of Carbon Dioxide and of Other Greenhouse Gases in Climate Variations and Associated Impacts (1985: Villach, Austria) – Report of the International Conference on the assessment of the role of carbon dioxide and of other greenhouse gases in climate variations and associated impacts – WMO/UNEP/ICSU
“…[B]eyond 1 degree C may elicit rapid, unpredictable and non-linear responses that could lead to extensive ecosystem damage.” - Summary of Report before Negotiations – Summary Report After Negotiations – Desmogblog – 10/04/2007 – http://www.desmogblog.com/sites/beta.desmogblog.com/files/side-by-side-before-and-after-WG2-negotiations_0.pdf
- James Hansen (2007) – Scientific reticence and sea level rise – Environmental Research Letters 2 024002 doi:10.1088/1748-9326/2/2/024002 – Published online: 24/05/2007 – NASA Goddard Institute for Space Studies and Columbia University Earth Institute – http://pubs.giss.nasa.gov/docs/2007/2007_Hansen.pdf
“I believe there is a pressure on scientists to be conservative. Papers are accepted for publication more readily if they do not push too far and are larded with caveats. Caveats are essential to science, being born in skepticism, which is essential to the process of investigation and verification. But there is a question of degree. A tendency for ‘gradualism’ as new evidence comes to light may be ill-suited for communication, when an issue with a short time fuse is concerned. However, these matters are subjective. I could not see how to prove the existence of a ‘scientific reticence’ about ice sheets and sea level. Score one for the plaintiff, and their ally and ‘friend of the court’, the United States federal government.” - Bernard Barber (1961) – Resistance by Scientists to Scientific Discovery – Science 134:596-602 doi:10.1126/science.134.3479.596 – Director of the Centre for the Study of Knowledge Expertise Science at Cardiff University, UK – http://web.missouri.edu/~hanuscind/8710/Barber1961.pdf
“Too often, unfortunately, where resistance by scientists has been noted, it has been merely noted, merely alleged, without detailed substantiation and without attempt at explanation. Sometimes, when explanations are offered, they are notably vague and all-inclusive, thus proving too little by trying to prove too much. One such explanation is contained in the frequently repeated phrase, «After all, scientists are also human beings,» a phrase implying that scientists are more human when they err than when they are right (11). Other such vague explanations can be found in phrases such as «Zeitgeist,» «human nature,» «lack of progressive spirit,» fear of novelty, and «climate of opinion.” - James Hansen (2007) – Huge sea level rises are coming – unless we act now – New Scientist 2614 – 25/07/2007 – NASA Goddard Institute for Space Studies and Columbia University Earth Institute – http://www.newscientist.com/article/mg19526141.600-huge-sea-level-rises-are-coming–unless-we-act-now.html
“John Mercer effect: … I noticed that researchers who suggested that his paper was alarmist were regarded as more authoritative. It seems to me that scientists downplaying the dangers of climate change fare better when it comes to getting funding … After I published a paper in 1981 that described the likely effects of fossil fuel use, the US Department of Energy reversed a decision to fund my group’s research, specifically criticising aspects of that paper. I believe there is pressure on scientists to be conservative.” - J. H. Mercer (1978) – West Antarctic ice sheet and CO2 greenhouse effect: a threat of disaster – Nature 271:321-325 doi:10.1038/271321a0 – Published online: 26/01/1978 – Institute of Polar Studies, The Ohio State University
“If the global consumption of fossil fuels continues to grow at its present rate, atmospheric CO2 content will double in about 50 years. Climatic models suggest that the resultant greenhouse-warming effect will be greatly magnified in high latitudes. The computed temperature rise at lat 80° S could start rapid deglaciation of West Antarctica, leading to a 5 m rise in sea level.” - David G. Vaughan – West Antarctic Ice Sheet collapse – the fall and rise of a paradigm – Climatic Change 91:65-79 DOI 10.1007/s10584-008-9448-3 – Published online: 20/08/2008 – British Antarctic Survey, Natural Environment Research Council
“Indeed, all of the elements of the positive-feedback cycle that would, according to Mercer, lead inexorably to collapse, have now been observed on Pine Island Glacier: thinning of the ice shelf, inland migration of the grounding line, acceleration of the main trunk of the glacier, and thinning rates on the interior basins. In short, if thirty years ago Mercer and his colleagues had described the changes they would have expected as diagnostic of emergent collapse, this is the list that they might have written. Furthermore, the changes are occurring in the area of WAIS, which was always considered to be most vulnerable to collapse.” - Bruce R. Barkstrom (1984) – The Earth Radiation Budget Experiment (ERBE) – Bulletin of the American Meteorological Society 65:1170-1185 – Atmospheric Sciences Division, NASA Langley Research Center
“The Earth Radiation Budget Experiment (ERBE) is the first multisatellite system designed to measure the Earth’s radiation budget… are expected to provide a substantial improvement in the accuracy of the radiation budget on regional as well as global scales . This paper also provides a brief description of the implementation of the ERBE Project, including the ERBE Science Team.” - Gravity Recovery and Climate Experiment – Wikipedia – 04/10/2011 – http://en.wikipedia.org/wiki/Gravity_Recovery_and_Climate_Experiment
«The data so far obtained by GRACE are the most precise gravimetric data yet recorded: they have been used to re-analyse data obtained from the LAGEOS experiment to try to measure the relativistic frame-dragging effect. In 2006, a team of researchers led by Ralph von Frese and Laramie Potts used GRACE data to discover the 480-kilometer (300 mi) wide Wilkes Land crater in Antarctica, which probably formed about 250 million years ago.[ref] GRACE has been used to map the hydrologic cycle in the Amazon River basin and the location and magnitude of post-glacial rebound from changes in the free air gravity anomaly. GRACE data have also been used to analyze the shifts in the Earth’s crust caused by the earthquake that created the 2004 Indian Ocean tsunami.[ref] Scientists have recently developed a new way to calculate ocean bottom pressure—as important to oceanographers as atmospheric pressure is to meteorologists—using GRACE data.[ref]” - Referencia pendiente
- Michael E. Mann and Philip D. Jones – Global Surface Temperatures Over the Past Two Millennia – Geophysical Research Letters 30(15) 1820 doi:10.1029/2003GL017814 – Published online: 14/08/2003 – Department of Environmental Sciences, University of Virginia; Climatic Research Unit, University of East Anglia – http://holocene.meteo.psu.edu/shared/articles/mannjones03.pdf
“We present reconstructions of Northern and Southern Hemisphere mean surface temperature over the past two millennia based on high-resolution ‘proxy’ temperature data which retain millennial-scale variability. These reconstructions indicate that late 20th century warmth is unprecedented.” - Jeffrey P. Severinghaus et al (1998) – Timing of abrupt climate change at the end of the Younger Dryas interval from thermally fractionated gases in polar ice – Nature 391:141-146 doi:10.1038/34346 – Published online: 08/01/1998 – Graduate School of Oceanography, University of Rhode Island – http://icebubbles.ucsd.edu/Publications/YoungerDryas.pdf – 5 authors
“The climate change was synchronous (within a few decades) over a region of at least hemispheric extent, and providing constraints on previously proposed mechanisms of climate change at this time … during the Younger Dryas, the summit of Greenland was 15 ± 3 ºC colder than today.” - Achim Brauer et al (2008) – An abrupt wind shift in Western Europe at the onset of the Younger Dryas cold period – Nature Geoscience 1:520-523 doi:10.1038/ngeo263 – Published online: 01/08/2008 – GFZ German Research Centre for Geosciences – http://geoweb.princeton.edu/people/sigman/paperpdfs/Brauer08.pdf – 5 authors
«Our data indicate an abrupt increase in storminess during the autumn to spring seasons, occurring from one year to the next at 12,679 yr BP, broadly coincident with other changes in this region.” - Stephen H. Schneider and Terry L. Root (1996) – Ecological implications of climate change will include surprises – Biodiversity and Conservation 5:1109-1119 – Published online: 01/01/1996 – Department of Biological Sciences and Institute for International Studies, Stanford University, School of Natural Resources & Environment, University of Michigan
“More consideration is needed to estimate extreme events or ‘surprises’. This is particularly important at the intersection of disciplines like climate and ecology because the potential for large discontinuities is high given all the possible climate/biota interactions. The vast disparities in scales encountered by those working in traditional ecology (typically 20 m) and climatology (typically 200 km) make diagnoses of such interactions difficult, but these can be addressed by an emerging research paradigm we call strategic cyclical scaling (SCS).” - Stephen H. Schneider et al (2000) – Costing non-linearities, surprises and irreversible events – Pacific and Asian Journal of Energy 10:81- http://stephenschneider.stanford.edu/Publications/PDF_Papers/SchneiderKKDAzar2000.pdf – 3 authors
“Non-linearities and the likelihood of rapid, unanticipated events (surprises) require that costing methods use a wide range of estimates for key parameters or structural formulations and that, when possible, results be cast in probabilistic terms rather than central tendencies since the latter mask the policy-relevant wide range of potential results such a diversity of approaches implies.” - Peter Schwartz and Doug Randall (2003) – An Abrupt Climate Change Scenario and Its Implications for United States National Security. Imagining the Unthinkable – The Pentagon – October 2003 – http://www.climate.org/PDF/clim_change_scenario.pdf
“The purpose of this report is to imagine the unthinkable – to push the boundaries of current research on climate change so we may better understand the potential implications on United States national security … We have created a climate change scenario that although not the most likely, is plausible, and would challenge United States national security in ways that should be considered immediately.” - J.G. Lockwood (2001) – Abrupt and Sudden Climatic Transitions and Fluctuations: A Review – International Journal of Climatology 21:1153-1179 doi:10.1002/joc.630
“The point needs to be stressed that the climate system is a non-linear system, often far from equilibrium and, therefore, should be expected to show many complex patterns with sudden jumps from one distribution pattern to the next.” - Referencia pendiente
- National Academy of Sciences (2002) – Committee on Abrupt Climate Change, Ocean Studies Board, Polar Research Board, Board on Atmospheric Sciences and Climate, Division on Earth and Life Studies, National Research Council (2002) – Abrupt Climate Change: Inevitable Surprises – http://www.nap.edu/openbook.php?isbn=0309074347
“Recent scientific evidence shows that major and widespread climate changes have occurred with startling speed. For example, roughly half the north Atlantic warming since the last ice age was achieved in only a decade, and it was accompanied by significant climatic changes across most of the globe. Similar events, including local warmings as large as 16°C, occurred repeatedly during the slide into and climb out of the last ice age.” - Meinrat O. Andreae et al (2005) – Strong present-day aerosol cooling implies a hot future – Nature 435:1187-1190 doi:10.1038/nature03671 – Published online: 30/06/2005 – Max Planck Institute for Chemistry; Hadley Centre for Climate Prediction and Research; Centre for Ecology and Hydrology – http://irina.eas.gatech.edu/EAS_spring2006/Andreae2005.pdf – 3 authors
“Strong aerosol cooling in the past and present would then imply that future global warming may proceed at or even above the upper extreme of the range projected by the Intergovernmental Panel on Climate Change” - Veerabhadran Ramanathan and Y. Feng (2008) – On avoiding dangerous anthropogenic interference with the climate system: Formidable challenges ahead – Proceedings of the National Academy of Sciences PNAS 105:14245-14250 doi:10.1073/pnas.0803838105 – Published online: 23/03/2008 – Scripps Institution of Oceanography, University of California at San Diego – http://scrippsnews.ucsd.edu/Releases/doc/zpq038084771p.pdf
“About 90% or more of the rest of the committed warming of 1.6°C will unfold during the 21st century, determined by the rate of the unmasking of the aerosol cooling effect by air pollution abatement laws and by the rate of release of the GHGs-forcing stored in the oceans. The accompanying sea-level rise can continue for more than several centuries. Lastly, even the most aggressive CO2 mitigation steps as envisioned now can only limit further additions to the committed warming, but not reduce the already committed GHGs warming of 2.4°C” - Guus J. M. Velders et al (2009) – The large contribution of projected HFC emissions to future climate forcing – Proceedings of the National Academy of Sciences PNAS doi:10.1073/pnas.0902817106 – Published online: 14/05/2009 – Netherlands Environmental Assessment Agency – http://www.epa.gov/greenchill/downloads/Velders_PNAS.pdf – 5 authors “Global HFC emissions in 2050 are equivalent to 9–19% (CO2-eq. basis) of projected global CO2 emissions in business-as-usual scenarios and contribute a radiative forcing equivalent to that from 6–13 years of CO2 emissions near 2050. This percentage increases to 28–45% compared with projected CO2 emissions in a 450-ppm CO2 stabilization scenario.”
- Referencia pendiente
- Vasilis Dakos et al (2008) – Slowing down as an early warning signal for abrupt climate change – Proceedings of the National Academy of Sciences PNAS 105:14308-14312 – Published online: 23/09/2008 – Max Planck Institute for Meteorology – “We analyze eight ancient abrupt climate shifts and show that they were all preceded by a characteristic slowing down of the fluctuations starting well before the actual shift. Such slowing down, measured as increased autocorrelation, can be mathematically shown to be a hallmark of tipping points.”
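The “slowing down” indicator used by Dakos et al. can be computed as lag-1 autocorrelation in a sliding window. The sketch below applies it to a synthetic AR(1) series whose memory is deliberately made to drift toward 1 as a stand-in for critical slowing down; it uses no real climate data.

```python
import numpy as np

rng = np.random.default_rng(1)
n, window = 2000, 400
phi = np.linspace(0.2, 0.97, n)          # AR(1) memory drifting toward 1 (assumed)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.normal()

for start in range(0, n - window + 1, window):
    seg = x[start:start + window]
    ac1 = np.corrcoef(seg[:-1], seg[1:])[0, 1]   # lag-1 autocorrelation in the window
    print(f"window {start:4d}-{start + window:4d}: lag-1 autocorrelation = {ac1:.2f}")
```

The printed autocorrelation rises steadily toward 1 as the synthetic series approaches its “tipping point”, which is the statistical signature the paper looks for in palaeoclimate records.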
- Marten Scheffer et al (2009) – Early-warning signals for critical transitions – Nature 461:53-59 doi:10.1038/nature08227 – Published online: 03/09/2009 – Department of Environmental Sciences, Wageningen University, The Netherlands – http://deepeco.ucsd.edu/~george/publications/09_critical_transitions.pdf – 10 authors
“Although a trend in the indicators may serve as a warning, the actual moment of a transition remains difficult to predict … We are far from being able to develop accurate models to predict thresholds in most complex systems, ranging from cells to organisms, ecosystems or the climate. We simply do not understand all the relevant mechanisms and feedbacks sufficiently well … if we have reasons to suspect the possibility of a critical transition, early-warning signals may be a significant step forwards when it comes to judging whether the probability of such an event is increasing.” - Peter Ditlevsen (2010) – Tipping points: Early warning and wishful thinking – Geophysical Research Letters doi:10.1029/2010GL044486 – Published online: 24/08/2010 – University of Copenhagen, Niels Bohr Institute, Centre for Ice and Climate – http://www.leif.org/EOS/2010GL044486.pdf –
“The early warning of climate changes or structural change in any dynamical system driven through a bifurcation, can only be obtained if increase in both variance and autocorrelation is observed. Conclusions drawn based solely on one of the signals and not the other are invalid. Furthermore, detecting increased autocorrelation, or critical slow down, with statistical significance is difficult. For the DO climate transitions, increased variance and autocorrelation are not observed. These shifts are thus noise induced with very limited predictability, and early detection of them in the future might be wishful thinking.” - Timothy Lenton (2011) – Early warning of climate tipping points – Published online: 01/06/2011 – School of Environmental Sciences, University of East Anglia – http://www.slideshare.net/Stepscentre/tim-lenton-early-warning-of-climate-tipping-points
“Conclusion: 1) Tipping elements in the climate system could be triggered this century by anthropogenic forcing; 2) The Greenland and West Antarctic ice sheets probably represent the largest risks; 3) Some tipping points can be anticipated in principle, but sufficiently high-resolution, long records are often lacking; 4) A change in the number of climate states can be detected, in a noisy climate system that is moving between states; 5) Improved understanding is needed to help policy makers “avoid the unmanageable and manage the unavoidable.” - Doug McNeall et al (2011) – Analyzing abrupt and nonlinear climate changes and their impacts – WIREs Climate Change 2:663–686 doi:10.1002/wcc.130 – Published online: 29/06/2011 – Met Office Hadley Centre – 4 authors
“The uncertainty in the nature and strength of climate–carbon cycle feedbacks leads to increased uncertainty in the rate of global warming arising from a given emissions scenario. This uncertainty mainly affects the upper end of the range of warming, due to the model consensus that the feedback is positive. Therefore, consideration of climate–carbon cycle feedbacks raises the upper limit of the projected range of temperature responses, but does not significantly affect the lower limit.” - Timothy M. Lenton (2011) – Early warning of climate tipping points – Nature Climate Change 1:201–209 doi:10.1038/nclimate1143 – Published online: 19/06/2011 – College of Life and Environmental Sciences, University of Exeter, UK, and School of Environmental Sciences, University of East Anglia
“Recent assessments give an increased probability of future tipping events, and the corresponding impacts are estimated to be large, making them significant risks. Recent work shows that early warning of an approaching climate tipping point is possible in principle, and could have considerable value in reducing the risk that they pose.” - Marlowe Hood – Top UN climate scientist backs ambitious CO2 cuts – Google News – 25/08/2009 – AFP – http://www.google.com/hostednews/afp/article/ALeqM5hacayDuUcngLmhNkplHB5VtG5GNw
«As chairman of the Intergovernmental Panel on Climate Change (IPCC) I cannot take a position because we do not make recommendations,» said Rajendra Pachauri when asked if he supported calls to keep atmospheric carbon dioxide concentrations below 350 parts per million (ppm). «But as a human being I am fully supportive of that goal. What is happening, and what is likely to happen, convinces me that the world must be really ambitious and very determined at moving toward a 350 target,» he told AFP in an interview. In its benchmark 2007 report, the IPCC said that the key for preventing dangerous global warming was to keep CO2 concentrations below 450 ppm.» - Bill McKibben – Remember This: 350 Parts Per Million – The Washington Post, 28/12/2007 – http://www.washingtonpost.com/wp-dyn/content/article/2007/12/27/AR2007122701942.html
“But what may turn out to be the most crucial development went largely unnoticed. It happened at an academic conclave in San Francisco. A NASA scientist named James Hansen offered a simple, straightforward and mind-blowing bottom line for the planet: 350, as in parts per million carbon dioxide in the atmosphere. It’s a number that may make what happened in Washington and Bali seem quaint and nearly irrelevant. It’s the number that may define our future.” - James Hansen et al (2008) – Target Atmospheric CO2: Where Should Humanity Aim? – The Open Atmospheric Science Journal 2:217-231 – Published online: 01/02/2008 – NASA Goddard Institute for Space Studies and Columbia University Earth Institute – http://www.columbia.edu/~jeh1/2008/TargetCO2_20080407.pdf – 10 autores
“If humanity wishes to preserve a planet similar to that on which civilization developed and to which life on Earth is adapted, paleoclimate evidence and ongoing climate change suggest that CO2 will need to be reduced from its current 385 ppm to at most 350 ppm, but likely less than that.” - James Hansen et al (2008) – Target Atmospheric CO2: Where Should Humanity Aim? – The Open Atmospheric Science Journal 2:217-231 – Published online: 01/02/2008 – NASA Goddard Institute for Space Studies and Columbia University Earth Institute – http://pubs.giss.nasa.gov/docs/2008/2008_Hansen_etal.pdf – 10 authors
“CO2 amount must be reduced to 325-355 ppm to increase outgoing flux 0.5-1 W/m2, if other forcings are unchanged. A further imbalance reduction, and thus CO2 ~300-325 ppm, may be needed to restore sea ice to its area of 25 years ago.” - Christine Ehlig-Economides and Michael J. Economides (2010) – Sequestering carbon dioxide in a closed underground volume – Journal of Petroleum Science and Engineering 70:123-130 doi:10.1016/j.petrol.2009.11.002 – Published online: 28/04/2010 – Department of Petroleum Engineering, Texas A&M University; Department of Chemical Engineering, University of Houston – http://twodoctors.org/manual/economides.pdf
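The 325-355 ppm range quoted from Hansen et al. follows, to first order, from the standard simplified expression for CO2 radiative forcing, dF ≈ 5.35 ln(C/C0) W/m2, starting from the then-current 385 ppm. A quick check is sketched below; the 5.35 coefficient is the usual Myhre et al. (1998) value, and treating the whole 0.5-1 W/m2 change in outgoing flux as a CO2 forcing change is a simplifying assumption of the sketch.

```python
import math

c0 = 385.0                           # ppm, starting concentration used in the quote
for flux_increase in (0.5, 1.0):     # desired increase in outgoing flux, W/m2
    c = c0 * math.exp(-flux_increase / 5.35)   # invert dF = 5.35 * ln(C / C0)
    print(f"outgoing flux +{flux_increase} W/m2  ->  CO2 of roughly {c:.0f} ppm")
```

The result, roughly 350 ppm for 0.5 W/m2 and roughly 320 ppm for 1 W/m2, reproduces the quoted range to within rounding.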
“The implications of this work are profound. A simple analytical model shows immediate results very similar to those that take hours to produce with numerical simulation … Neither of these bodes well for geological CO2 sequestration and the findings of this work clearly suggest that it is not a practical means to provide any substantive reduction in CO2 emissions, although it has been repeatedly presented as such by others.” - Quanlin Zhou and Jens T. Birkholzer (2011) – On scale and magnitude of pressure build-up induced by large-scale geologic storage of CO2 – Greenhouse Gas Science and Technology 1:11–20 DOI:10.1002/ghg3 – Published online: 17/02/2011 – Lawrence Berkeley National Laboratory, Berkeley – http://esd.lbl.gov/files/about/staff/quanlinzhou/Paper21_PDF.pdf
“These studies show that the limiting effect of pressure build-up on dynamic storage capacity is not as significant as suggested by Ehlig-Economides and Economides, who considered closed systems without any attenuation effects.” - David Adam – Roll back time to safeguard climate, expert warns – The Guardian – Published online: 15/09/2008 – http://www.guardian.co.uk/environment/2008/sep/15/climatechange.carbonemissions
“Professor John Schellnhuber, director of the Potsdam Institute for Climate Impact Research in Germany, told the Guardian … ‘It is a very sweeping argument, but nobody can say for sure that 330ppm is safe,’ he said. ‘Perhaps it will not matter whether we have 270ppm or 320ppm, but operating well outside the [historic] realm of carbon dioxide concentrations is risky as long as we have not fully understood the relevant feedback mechanisms.” - Christian Azar and Henning Rodhe (1997) – Targets for Stabilization of Atmospheric CO2 – Science 276:1818-1819 doi:10.1126/science.276.5320.1818 – 20/06/1997 – Institute of Physical Resource Theory, Chalmers University of Technology–Göteborg University; Department of Meteorology, Stockholm University – http://www.atmos.washington.edu/~davidc/ATMS211/articles_required/Azar_Rodhe_97_targets.pdf
“Scientists must join the debate, despite the difficulty of pinpointing the correct number; otherwise, action may not be taken. Analysis suggests that a value nearer to 350 ppmv should be adopted, until it can be proven that a higher value is safe.” - James Hansen (2005) – A slippery slope: How much global warming constitutes “dangerous anthropogenic interference”. An Editorial Essay – Climatic Change 68:269-279 doi:10.1007/s10584-005-4135-0 – 01/02/2005 – NASA Goddard Institute for Space Studies and Columbia University Earth Institute – http://www.columbia.edu/~jeh1/2004/dai_complete_20041026.pdf
“These conclusions, together with the discussion above about time constants, imply that global warming of more than 1 ºC above today’s global temperature would likely constitute “dangerous anthropogenic interference” with climate. In turn, given the current planetary energy imbalance and empirical modeling evidence that climate sensitivity is about 3/4 ºC per W/m2, this implies that we should seek to keep long-term additional climate forcings from exceeding about 1 W/m2.” - Simon Retallack (2005) – Setting a long-term climate objective – Institute for Public Policy Research – 01/10/2004 – Research Fellow – http://www.gtinitiative.org/documents/SettingLongTermClimateObjective.pdf
«Atmospheric concentrations of CO2 have already reached 376ppm. For concentrations to be brought down, human emissions over the century would need to amount to less than the total absorbed by the natural world over that period. Assuming a natural carbon sink of 4 billion tons of carbon (GtC) per year that would leave a budget of about 380 GtC that could be emitted from 2000-2100. That in turn would need to be achieved by reducing global CO2 emissions annually by about 2.5% in absolute terms from 2010. For the medium term, that would imply global emission reductions of 15% below 2000 levels or 10% below 1990 levels by 2020. For the longer term, it would imply global emission reductions of about 60% by 2050 and of about 90% by 2100.» - Malte Meinshausen et al (2005) – Multi-Gas Emissions Pathways to Meet Climate Targets – Climatic Change 75:151-194 doi:10.1007/s10584-005-9013-2 – Accepted: 07/08/2005 – Swiss Federal Institute of Technology (ETH Zurich), Environmental Physics, Environmental Science Department – http://www.pik-potsdam.de/~mmalte/simcap/publications/denElzen_Meinshausen_2005_nonCO2_NCGG_4%20Utrecht.pdf – 7 authors
“The ability of the [presented method ‘Equal Quantile Walk’ (EQW)] to analyze emission implications in a probabilistic multi-gas framework is demonstrated. The probability of overshooting a 2 ºC climate target is derived by using different sets of EQW radiative forcing peaking pathways. If the probability shall not be increased above 30%, it seems necessary to peak CO2 equivalence concentrations around 475 ppm and return to lower levels after peaking (below 400 ppm).” - David Archer (2006) – How much CO2 emission is too much? – Real Climate – 06/11/2006 – http://www.realclimate.org/index.php/archives/2006/11/how-much-co2-emission-is-too-much/
“Of the 450 ppm, 170 ppm would be from fossil fuels (given an original natural pCO2 of 280 ppm). 170 ppm equals 340 Gton C, which divided by the peak airborne fraction of 60% yields a total emission slug of about 570 Gton C. How much is 570 Gton C? We have already released about 300 Gton C, and the business-as-usual scenario projects 1600 Gton C total release by the year 2100. Avoiding dangerous climate change requires very deep cuts in CO2 emissions in the long term, something like 85% of business-as-usual averaged over the coming century. Put it this way and it sounds impossible. Another way to look at it, which doesn’t seem quite as intractable, is to say that the 200 Gton C that can still be “safely” emitted is roughly equivalent to the remaining traditional reserves of oil and natural gas. We could burn those until they’re gone, but declare an immediate moratorium on coal, and that would be OK, according to our defined danger limit of 2°C. A third perspective is that if we could limit emissions to 5 Gton C per year starting now, we could continue doing that for 250/5 = 50 years.» - Paul Baer with Dr Michael Mastrandrea (2006) – Designing emissions pathways to reduce the risk of dangerous climate change – Institute for Public Policy Research – 01/11/2006 – http://www.ecoequity.org/wp-content/uploads/2009/05/high_stakes.pdf
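The back-of-envelope arithmetic in the Archer post quoted above can be re-run directly. The 2 GtC-per-ppm conversion below is the one implied by his own figures, and the small differences from his rounded numbers are just rounding.

```python
gtc_per_ppm = 340 / 170          # ~2 GtC per ppm, as implied by the post's own numbers
allowed_ppm = 450 - 280          # fossil-fuel CO2 above pre-industrial at the 450 ppm peak
airborne_fraction = 0.60         # peak airborne fraction assumed in the post
already_emitted = 300            # GtC emitted so far, per the post

peak_atmospheric_slug = allowed_ppm * gtc_per_ppm            # ~340 GtC in the air
total_allowable = peak_atmospheric_slug / airborne_fraction  # ~570 GtC emitted in total
remaining = total_allowable - already_emitted
print(f"total allowable emissions : about {total_allowable:.0f} GtC")
print(f"remaining budget          : about {remaining:.0f} GtC "
      f"(about {remaining / 5:.0f} years at 5 GtC/yr)")
```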
“Although our focus is on ‘peak and decline’ scenarios, we include here an example of a calculation of risk associated with a more familiar stabilization scenario that is a common focus of the policy debate, in which CO2 concentrations reach 450 ppm and are held at that level indefinitely. In this example, in which other non-CO2 GHGs are counted separately (but in fact held to rather optimistically low levels) we calculate the following estimated risks for the likelihood of exceeding various temperature thresholds in the next 200 years: Risk of exceeding 2ºC: between 46 and 85 per cent; Risk of exceeding 2.5ºC: between 21 and 55 per cent; Risk of exceeding 3ºC: between 11 and 24 per cent; Risk of exceeding 3.5ºC: between 4 and 11 per cent. Scenarios in which CO2 concentrations reach 500 or 550 ppm have a correspondingly greater risk of exceeding 2°C: 70-95 per cent and 78-99 per cent respectively. Why the range? Why can we not give a more precise estimate of the likely temperature increase? ” - Amy L. Luers (2007) – How to Avoid Dangerous Climate Change- A Target for U.S. Emissions Reductions – Union of Concerned Scientists – Union of Concerned Scientists – http://www.ucsusa.org/assets/documents/global_warming/emissions-target-report.pdf
“A seemingly modest increase of 11 percent in the stabilization target (from 450 to 500 ppm CO2eq) will increase the chances of a greater than 2°C increase in global average temperature from 50-50 to 70-30, and of a greater than 3°C increase from 30-70 to 50-50.” - Andrew J. Weaver et al (2007) – Long term climate implications of 2050 emission reduction targets – Geophysical Research Letters 34 L19703 doi:10.1029/2007GL031018 – 06/10/2007 – http://climate.uvic.ca/people/alvaro/Emi_2050.pdf – 3 autores
“Even when emissions are stabilized at 90% below present levels at 2050, this 2.0°C threshold is eventually broken. Our results suggest that if a 2.0°C warming is to be avoided, direct CO2 capture from the air, together with subsequent sequestration, would eventually have to be introduced in addition to sustained 90% global carbon emissions reductions by 2050.” - James Hansen et al (2007) – Dangerous human-made interference with climate: a GISS modelE study – Atmospheric Chemistry and Physics 7:2287-2312 – 07/05/2007 – NASA Goddard Institute for Space Studies and Columbia University Earth Institute – http://arxiv.org/ftp/physics/papers/0610/0610115.pdf – 47 authors
«The alternative scenario, with peak added forcing ~1.5 W/m2 in 2100, keeps further global warming under 1ºC if climate sensitivity is ~3 ºC or less for doubled CO2. The alternative scenario keeps mean regional seasonal warming within 2σ (standard deviations) of 20th century variability, but other scenarios yield regional changes of 5–10σ, i.e. mean conditions outside the range of local experience. We conclude that a CO2 level exceeding about 450 ppm is “dangerous”, but reduction of non-CO2 forcings can provide modest relief on the CO2 constraint.» - Cameron Hepburn and Nicholas Stern (2008) – A new global deal on climate change – Oxford Review of Economic Policy 24:259-279 doi:10.1093/oxrep/grn020 – Smith School of Enterprise and the Environment and James Martin Institute, Saïd Business School, University of Oxford, and New College, Oxford; Grantham Institute, India Observatory, and STICERD at the London School of Economics and Political Science – http://oxrep.oxfordjournals.org/cgi/reprint/24/2/259
«A global target of stabilizing greenhouse-gas concentrations at between 450 and 550 parts per million carbon-dioxide equivalent (ppm CO2e) has proven robust to recent developments in the science and economics of climate change. Retrospective analysis of the Stern Review (2007) suggests that the risks were underestimated, indicating a stabilization target closer to 450 ppm CO2e.” - H. D. Matthews and Ken Caldeira (2008) – Stabilizing climate requires near-zero emissions – Geophysical Research Abstracts 10 EGU2008-A-08242 ID:1607-7962/gra/EGU2008-A-08242 – 27/02/2008 – Department of Geography, Planning and Environment, Concordia University; Department of Global Ecology, Carnegie Institution of Washington, Stanford – http://www.see.ed.ac.uk/~shs/Climate%20change/Geo-politics/Matthews_Caldeira%20zero%20carbon.pdf
“To hold climate constant at a given global temperature requires near-zero future carbon emissions … future anthropogenic emissions would need to be eliminated in order to stabilize global-mean temperatures … any future anthropogenic emissions will commit the climate system to warming that is essentially irreversible on centennial timescales”. - EG Science – The 2 °C target – European Commission – 09/07/2008
“In order to meet the 2°C target with at least a 50% probability, atmospheric CO2eq concentration would need to be stabilised at approximately 440ppm or lower. Stabilization at 400ppm CO2eq or lower would raise the probability of keeping the temperature increase below 2°C to above 66%.” - PRES/96/188 – 1939th European Council meeting, Luxembourg – European Union – Published online: 25/06/1996 – http://europa.eu/rapid/pressReleasesAction.do?reference=PRES/96/188&format=HTML&aged=1&language=ES&guiLanguage=es
“The Council acknowledges that, according to the IPCC Second Assessment Report, stabilization of atmospheric CO2 concentrations at a level equal to double the pre-industrial level, i.e. 550 ppm, will ultimately require total global emissions to be less than 50% of their current level; that concentration level would already cause an increase in global mean temperature of approximately 2 ºC above the pre-industrial level.” - Kevin Anderson et al (2008) – From long-term targets to cumulative emission pathways: Reframing UK climate policy – Energy Policy 36:3714-3722 doi:10.1016/j.enpol.2008.07.003 – 08/08/2008 – The Tyndall Centre for Climate Change Research, MACE, University of Manchester – http://www.ecodiy.org/Kevin-anderson-publication.pdf – 3 authors
«Only recently have CO2 stabilization studies begun to incorporate the impact of carbon cycle-feedback mechanisms (Matthews, 2005; Jones et al., 2006), despite a number of previous studies illustrating that there will likely be decreases in terrestrial and oceanic carbon uptake due to climate change (Cox et al., 2000; Friedlingstein et al., 2001). The new cumulative carbon range for a 450 ppmv stabilization level published in the IPCC (2007) report and presented in Table 1 illustrates the significance of such feedbacks.» - Referencia pendiente
- Gabriele C. Hegerl, Francis W. Zwiers et al (2007) – Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change [Susan Solomon et al (eds.)] – Intergovernmental Panel on Climate Change (IPCC) – http://www.ipcc.ch/pdf/assessment-report/ar4/wg1/ar4-wg1-faqs.pdf
“Climate-carbon cycle coupling is expected to add carbon dioxide to the atmosphere as the climate system warms, but the magnitude of this feedback is uncertain. This increases the uncertainty in the trajectory of carbon dioxide emissions required to achieve a particular stabilization level of atmospheric carbon dioxide concentration. Based on current understanding of climate carbon cycle feedbacks, model studies suggest that to stabilize at 450 ppm carbon dioxide could require the cumulative emissions over the 21st century be reduced from an average of approximately 670 gigatons carbon to approximately 490 GtC.” - Myles R. Allen et al (2009) – Warming caused by cumulative carbon emissions towards the trillionth tonne – Nature 458:1163-1166 doi:10.1038/nature08019 – 30/04/2009 – Department of Physics, University of Oxford – http://www.fraw.org.uk/files/climate/allen_2009.pdf – 7 authors
“Total anthropogenic emissions of one trillion tonnes of carbon (3.67 trillion tonnes of CO2), about half of which has already been emitted since industrialization began, results in a most likely peak carbon-dioxide induced warming of 2 ºC above pre-industrial temperatures, with a 5–95% confidence interval of 1.3–3.9 ºC.” - Malte Meinshausen et al (2009) – Greenhouse-gas emission targets for limiting global warming to 2 °C – Nature 458:1158-1162 doi:10.1038/nature08017 – 30/04/2009 – Potsdam Institute for Climate Impact Research – 8 autores
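The Allen et al. result quoted above amounts to an approximately linear relation between cumulative carbon emissions and peak CO2-induced warming, roughly 2 ºC per trillion tonnes of carbon as the most likely value. A one-screen sketch of that proportionality follows; the cumulative totals tried below are arbitrary illustrations, while the 0.5 TtC already-emitted figure comes from the quote itself.

```python
warming_per_ttc = 2.0      # most likely peak warming per trillion tonnes of carbon, ºC
already_emitted_ttc = 0.5  # emitted since industrialization began, per the quote

for total_ttc in (0.75, 1.0, 1.5):     # illustrative cumulative totals, TtC
    peak = warming_per_ttc * total_ttc
    remaining = 1000 * (total_ttc - already_emitted_ttc)
    print(f"cumulative {total_ttc:.2f} TtC -> most likely peak warming {peak:.1f} ºC "
          f"(remaining budget {remaining:.0f} GtC)")
```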
“Limiting cumulative CO2 emissions over 2000–50 to 1,000 Gt CO2 yields a 25% probability of warming exceeding 2 ºC—and a limit of 1,440 Gt CO2 yields a 50% probability—given a representative estimate of the distribution of climate system properties … for the scenarios considered, the probability of exceeding 2 °C rises to 53–87% if global GHG emissions are still more than 25% above 2000 levels in 2020.” - Ian Allison et al (2009) – The Copenhagen Diagnosis, 2009: Updating the World on the Latest Climate Science – UNSW Climate Change Research Centre Australia – 01/11/2009 – UNSW Climate Change Research Centre, Australia – http://www.copenhagendiagnosis.com/ – 26 authors
«If global warming is to be limited to a maximum of 2oC above pre-industrial values, global emissions need to peak between 2015 and 2020 and then decline rapidly. To stabilize climate, a decarbonized global society – with near-zero emissions of CO2 and other long-lived greenhouse gases – need to be reached well within this century. More specifically, the average annual per-capita emissions will have to shrink to well under 1 metric ton CO2 by 2050. This is 80-90% below the per-capita emissions in developed nations in 2000.” - Seth Shulman (2008) – Undermining Science. Suppression and Distortion in the Bush Administration – University of California Press – ISBN: 978-0-520-24702-4 – 202 págs.
“An astonishing 46% of the climate scientists surveyed [sample: 1600] even reported that they or their colleagues had been personally pressured by Bush administration officials to eliminate the words ‘climate change’ or ‘global warming’ from official documents, presumably in an Orwellian effort to try to suppress public attention to the issue” - Chris Mooney (2005) – The Republican War on Science – Perseus Books – ISBN: 978-0-465-04675-1
“As head of Nixon’s Council on Environmental Quality at the time, Russell Train [(WWF)] publicly raised questions about the SST program’s environmental impact … Train angered the Nixon White House, but got away with criticizing the SST. Garwin wasn’t so lucky. The physicist’s freelancing contributed to Nixon’s decision, after his 1972 reelection, to dissolve PSAC and abolish the office of presidential science adviser, a landmark moment in the relationship between scientists and government, and one that laid the groundwork for much of the politicization that came later… Jerome Wiesner later wrote: ‘He chose to kill the messenger’.” - Shawn Lawrence Otto (2011) – Fool Me Twice: Fighting the Assault on Science in America – Rodale Books – ISBN-13: 978-1605292175 – 384 pp. – 11/10/2011 – 2009 IEEE-USA National Distinguished Public Service Award
- Mark Bowen (2008) – Censoring Science. Inside the Political Attack on Dr. James Hansen and the Truth of Global Warming – Penguin Group – ISBN-10: 0525950141 – 336 pp.
“On the 20th of January, as it began to appear that Time magazine would let the story slip, Jim gave roughly the same information to an old contact, Andrew Revkin, the lead global warming correspondent for The New York Times. On the 24th, Larry Travis was hit and severely injured by a truck as he walked across Broadway on his way to work, Jim’s (Hansen) car was also broken into around that time, and the house in New Jersey in which he and Annie had raised their children burned to the ground. Darnell Cain, Jim’s assistant, admits to being ‘sufficiently lazy and negligent to not update the NASA public records with Jim’s new address when he moved to Pennsylvania.’” - Committee Report: White House Engaged in Systematic Effort to Manipulate Climate Change Science – Administration Oversight Environment Politics and Science – Published online: 12/12/2007 – http://democrats.oversight.house.gov/images/stories/documents/20071210101633.pdf
“Inescapable conclusion: the Bush Administration has engaged in a systematic effort to manipulate climate change science and mislead policymakers and the public about the dangers of global warming.”