This blog examines global warming and its effects. Henry Auer has strong interests in global warming and climate change, their scientific basis, science policy, electoral reform, U.S. regional issues, and other related topics.
Summary. The day that Hurricane Harvey made landfall in Texas the water temperature in the Gulf of Mexico was about 3-7°F higher than the average for 1961-1990. This is important, because warmer water releases more moisture into the air than cooler water, feeding heavier rainfall. This contributed to the extreme, unprecedented flooding in the Houston area caused by Harvey. More moisture also leads to stronger winds in storms.
Climate models project that if humanity continues to burn fossil fuels without restraint the added carbon dioxide produced will lead to sharply higher global average temperatures. These will produce more frequent and intense extreme weather and climate events, which bring serious socioeconomic harms to society. One model study of storm activity along the Texas coast finds that the probability of an event will triple, from 6% per year to 18% per year, by the end of this century if emissions continue unabated.
Climate scientists have been warning of major climate consequences from man-made greenhouse gas emissions for almost three decades. Those predictions have not changed, indeed have only improved, as scientific capabilities grew. If humanity had responded earlier, the costs of action would have been lower or spread over longer times. In the absence of past action at the scale needed, now is the time to act.
Introduction. Climate scientists understand that the long-term global average temperature will continue to increase largely in response to the increased concentration of carbon dioxide (CO2) and other greenhouse gases (GHGs) in the atmosphere. CO2 is increasing because it is the combustion product of humanity’s burning of fossil fuels (i.e., fuels based on carbon: coal, petroleum products and natural gas). Other GHGs likewise arise from human activity. CO2 is especially significant because, once emitted into the air, it resides there for centuries; after about one-third of it dissolves into the ocean, the remainder continues accumulating, with no balancing natural process removing it from the atmosphere.
The greenhouse effect originating from these excess GHGs raises the global average temperature. The temperature will remain elevated in coming centuries as the excess GHGs continue residing in the atmosphere.
One effect of higher temperatures at the surface of lakes and oceans is that more water evaporates as the water temperature rises, by about 4% per degree F (about 7% per degree C). In addition, evaporation of water vapor requires the input of heat; as a result the surrounding air momentarily cools off. Conversely, as water vapor condenses, such as in cloud and raindrop formation, heat is released, warming the surrounding air momentarily. These temperature changes lead to local winds.
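The scaling described above can be sketched numerically. This is an illustrative back-of-the-envelope calculation only, not a climate model; the roughly 7% per °C rate is the figure given in the text, and the function name is our own.

```python
# Illustrative sketch of the vapor-capacity scaling described above:
# the moisture-holding capacity of air rises by roughly 7% for each
# degree Celsius of warming (about 4% per degree Fahrenheit).

def vapor_capacity_increase(delta_t_celsius, rate_per_c=0.07):
    """Fractional increase in the air's water-vapor capacity after
    warming by delta_t_celsius degrees, compounding the per-degree rate."""
    return (1 + rate_per_c) ** delta_t_celsius - 1

# One degree C of warming -> about 7% more moisture available to storms;
# two degrees -> about 14.5% when compounded.
print(f"{vapor_capacity_increase(1.0):.1%}")  # 7.0%
print(f"{vapor_capacity_increase(2.0):.1%}")  # 14.5%
```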
Storms such as hurricanes sweep over ocean water and entrap large amounts of water vapor. When the vapor condenses the liquid falls to the ground as rain. As this activity intensifies strong winds result. Climate scientists foresee that as the earth warms, storms such as hurricanes will potentially carry more water vapor and generate stronger winds than in earlier decades.
Continued GHG emissions will lead to a higher incidence of extreme hurricanes.
The United Nations-sponsored Intergovernmental Panel on Climate Change (IPCC) published its Fifth Assessment Report, Part 1, in 2013. It includes climate model projections of the relationship between the excess CO2 accumulated in the atmosphere from human activity and the predicted increase in the global average temperature resulting from the added CO2.
The models were run assuming four CO2 emission scenarios up to the year 2100: the most stringent ends GHG emissions beyond 2050, while the least stringent continues current use, and unabated future growth in use, of fossil fuels. The results are shown in the graphic below, which plots the dependence of the global average temperature on the atmospheric CO2 level, including the historical record of global average temperature from 1880 to 2010 in the lower left of the image.
Historical record of global annual temperature increase above the average for 1861-1880 (vertical axis) as a function of historical atmospheric accumulated weight of excess carbon dioxide, due to human use of fossil fuels, above the level in 1870 (black; lower left). The circles mark decades from 1870 to 2100. Future model projections of the same temperature-carbon dioxide dependencies are shown from 2010 to 2100, based on four scenarios describing the stringency of policy used to limit future emissions (dark blue, most stringent; light blue, next less stringent; orange, weak limits on emissions; red, continued emissions from unabated use of fossil fuels). Source: Intergovernmental Panel on Climate Change, Fifth Assessment Report, Working Group 1, Summary for Policymakers. http://www.climatechange2013.org/images/report/WG1AR5_SPM_FINAL.pdf.
The historical data show that in 2010 the global average temperature was about 0.9°C (1.6°F) higher than in 1870. The modeling shows that the most stringent scenario (dark blue) projects a temperature increase above the 1870 level of about 1.8°C (3.2°F) by 2050-2100. On the other hand, the scenario based on unconstrained continued use of carbon-containing fuels (red) foresees that the global average temperature in 2100 will be about 4.7°C (8.5°F) above the 1870 temperature. Such a drastic increase in global temperature will lead to periods of time, and/or regions of the earth’s surface, experiencing one or more of fierce heat waves; extreme storms that may be more frequent, or have more intense rainfall and winds; droughts; wildfires; and pronounced increases in sea levels.
Hurricane Harvey pummeled Houston and neighboring regions with torrential rainfall in August 2017. While the hurricane would likely have happened anyway, rainfall was more intense because the water of the Gulf of Mexico was warmer than in the past. This is seen in a heat map for the Gulf, shown below for the day that Harvey made landfall.
(The legend under the heat map is the one appearing on the web site from which the map was copied.) The map makes clear that the excess heat in the Gulf of Mexico abutting the Texas coast, shown by the color code bar at the right, was a factor in the extreme rainfall and flooding generated by the storm. As explained in the Introduction, warmer water leads to more moisture evaporating into the storm. A second factor was that the hurricane lingered over the Houston area for several days. The total rainfall from the storm at Cedar Bayou was 51.88 in (1318 mm), perhaps the highest in the region.
The likelihood of a return event of a hurricane like Harvey increases 3-fold in the unrestrained emission scenario described above for the first graphic. K. Emanuel published an analysis of hurricane rainfall properties by modeling previous storms impacting Texas. This was carried out historically for the period 1980-2016, and with the unrestrained scenario for 2081-2100. He developed results assuming a storm with 500 mm (19.7 in) of rainfall, much less than the local maximum cited above for Cedar Bayou. The likelihood of such precipitation is evaluated at 6% per year for 2017, and increases three-fold to 18% per year by the final decades of this century if fossil fuel use remains unconstrained. Emanuel also found that for the period 1981-2000 the historical likelihood is modeled as 1% per year. Thus, his modeling shows that global warming has already increased hurricane/storm likelihoods in recent decades by a factor of six, and for the century-long interval from the end of the 20th century to the end of the 21st century by 18-fold.
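The ratios cited in this paragraph follow from simple arithmetic on the three modeled annual likelihoods. The probability values below are those reported above; this is just a consistency check.

```python
# Consistency check of the likelihood ratios cited from Emanuel's
# modeling. Values are annual probabilities of a Texas storm
# delivering 500 mm (19.7 in) of rainfall.

p_1981_2000 = 0.01   # modeled annual likelihood, late 20th century
p_2017      = 0.06   # estimated annual likelihood for 2017
p_2081_2100 = 0.18   # projected, unrestrained-emissions scenario

# Increase already realized between the late 20th century and 2017:
print(round(p_2017 / p_1981_2000))       # 6-fold

# Further tripling projected by the end of this century:
print(round(p_2081_2100 / p_2017))       # 3-fold

# Overall century-long increase:
print(round(p_2081_2100 / p_1981_2000))  # 18-fold
```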
Climate scientists have been warning for almost three decades of the hazards arising from increased levels of greenhouse gases in the atmosphere. The increase in atmospheric CO2 from 1958 to the present is shown below.
Atmospheric concentration of CO2 (parts of CO2 per million parts of air; ppm).
From the first IPCC Assessment Report in 1990 to the fifth in 2013-2014, the forecasts of future climate trends and harmful events have not changed. It was already understood in 1990 that manmade global warming imperiled our society’s wellbeing. What has changed is, first, an increase in the CO2 level from about 355 ppm in 1990 to 402 ppm in August 2016 (see the graphic just above); and second, increased certainty in the predictions of the effects of higher levels of greenhouse gases on the earth’s climate, arising from dramatic increases in the data available, the sophistication of climate models employed, and the computational power of modern supercomputers.
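The pace of that increase can be gauged from the two concentrations cited above. This is only an average over the interval; actual year-to-year growth varies and has itself been accelerating.

```python
# Average growth rate implied by the two CO2 concentrations cited above.
ppm_1990 = 355.0  # approximate atmospheric level in 1990
ppm_2016 = 402.0  # level in August 2016
avg_rate = (ppm_2016 - ppm_1990) / (2016 - 1990)
print(f"{avg_rate:.1f} ppm per year")  # about 1.8 ppm per year
```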
The flooding from the hurricanes that struck the Caribbean and southeastern U. S. in the summer of 2017 is just one example of the types of extreme events that climate scientists have foreseen over the past decades. Others include heat waves, droughts, forest wildfires and sea level rise. In the earlier decades they were only predictions, dismissed by many. But by today the warnings have come to pass; extreme events will increase in occurrence and severity as warming worsens.
The harms and damages inflicted by extreme events have major economic and societal consequences. The need arises to reconstruct damaged homes and facilities and to undertake projects that increase resiliency in the face of future climate threats. These costs ultimately fall on the population at large, for example from increased insurance premiums and higher taxes. Had earlier action been undertaken it is likely that such societal costs would have been lower, or at least spread out over longer time frames.
As of today, however, much of the response is on an emergency basis, i.e. as the response to unforeseen disasters. The U.S. in particular, as well as the world at large, should accept the reality of the climate change threat. We must make the investments now that are needed to minimize further greenhouse gas emissions and to adapt to the threats already with us.
Summary. The Environmental Protection Agency cancelled presentations by three EPA scientists at a conference about the effects of climate change on Narragansett Bay, Rhode Island. Their work focused on the effects of climate change on the ecology of the Bay. EPA gave no reason. It has been supporting this research for decades, but in its draft budget for the coming year the agency has zeroed out support for all 28 of its estuary projects.
Administrator Pruitt has publicly rejected the role of humans in causing climate change, suggesting there remains some disagreement over the issue. In fact, 99.99% of climate scientists affirm the reality that humans cause global warming. Scientists the world over agree fossil fuel use emits excess carbon dioxide which retains extra heat in the earth system, leading to warming and its harmful consequences.
Politicization of science harms the public because political considerations supersede scientific reality in developing policy. Here EPA suppressed research findings characterizing effects of worsening climate change. A professor of oceanography at the University of Rhode Island considers the muffling of his colleagues a deliberate act of scientific censorship. We must all resist further efforts at stifling research. We must reinstate bona fide science as the guide for our policies.
EPA Scientists Prevented from Speaking. The Environmental Protection Agency (EPA) abruptly cancelled the speaking presentations of three EPA scientists at a conference about the effects of climate change on Narragansett Bay, in Rhode Island, held on Oct. 23, 2017. One of them, a research ecologist at the EPA’s National Health and Environmental Effects Research Laboratory Atlantic Ecology Division in Rhode Island, was to give the keynote address. The other two, a postdoctoral researcher at the same EPA facility and a scientific contractor for EPA, were to be on a panel addressing the topic “The Present and Future Biological Implications of Climate Change.” EPA’s decision was relayed to the meeting organizers just one business day before the conference. No substantive reason was provided for the prohibition.
The NBE Program hosted the conference. On the day of the conference it issued a 500-page Technical Report entitled “The State of Narragansett Bay and Its Watershed”. The report identified climate change as one of three stressors affecting the condition of the bay, broken down into the effects of temperature change, precipitation intensity and frequency, and sea level rise.
The Narragansett Bay Estuary Program (NBE Program), long supported by EPA, has been studying the ecological health of the bay since 1985. The NBE was recognized as an “estuary of national significance” by the EPA’s National Estuary Program in 1988. For reasons such as these colleagues were surprised at the EPA’s prohibition.
Research support of US$26 million for all the 28 Estuary Programs, including the NBE Program, has been dropped from EPA’s proposed budget for 2018.
Politicization of science harms the public because political considerations supersede scientific reality in developing policy. Here EPA suppressed research findings characterizing effects of worsening climate change. EPA’s interference with its NBE Program employees stifles scientific study related to climate change. “It’s definitely a blatant example of the scientific censorship we all suspected was going to start being enforced at EPA,” said John King, professor of oceanography at the University of Rhode Island, the head of the science advisory committee of the NBE Program. He continued “[t]hey don’t believe in climate change, so I think what they’re trying to do is stifle discussions of the impacts of climate change.”
Administrator Pruitt rejects human-caused climate change. The NBE Program case is but one of the agency’s many political actions. In March 2017 the Administrator, Scott Pruitt, said “…there’s tremendous disagreement about the degree of impact [of ‘human activity on the climate’], so no, I would not agree that it’s a primary contributor to the global warming that we see”. Pruitt’s assertion conflicts directly with EPA’s own statement, reported by The Guardian on March 9, 2017, that “carbon dioxide is the ‘primary greenhouse gas that is contributing to recent climate change’”. (This statement could not be accessed on Oct. 26, 2017 using the Guardian’s link to the EPA page.) Pruitt’s statement is also contradicted by James Lawrence Powell’s journal article (Bulletin of Science, Technology & Society, 1–4, 2016; DOI: 10.1177/0270467616634958). He found that during 2013 and 2014 only 4 of 69,406 (0.0058%) authors of peer-reviewed journal articles dealing with climate change rejected the reality of man-made global warming. That is, there is essentially no disagreement on the impact of human activity on the climate.
EPA is cleansing its sites of references to climate change. EPA has purged most content dealing with climate change from its web pages related to helping state and local governments deal with climate change. The site, previously called “Climate and Energy Resources for State, Local and Tribal Governments,” is now titled “Energy Resources for State, Local and Tribal Governments,” dropping the lead word “Climate”. The original 375 web pages are now reduced to 175, with changes in content that an outside group terms “substantial”. Looking to the future, a draft outline of EPA’s plans for the next four years omits mention of climate change.
EPA is undertaking a review of automobile Corporate Average Fuel Economy standards that were intended to increase efficiency and reduce fuel use by about half by 2025, the Portland Press Herald reports. “Administrator Scott Pruitt is intent on subverting that agency’s mission. At the behest of automakers, he is now reconsidering vehicular emission standards that help protect public health, save consumers money, and guard against further climate disruption”, the newspaper writes. It reports that William D. Ruckelshaus, former EPA director under two Republican presidents, says that Pruitt’s approach appears more like “taking a meat ax to the protections of public health and the environment and then hiding [the ax].”
EPA recently announced a draft rule overturning the Clean Power Plan (CPP), the Obama administration’s detailed program to reduce carbon dioxide emissions from large electric generating plants. While Scott Pruitt was Attorney General of Oklahoma he helped lead more than 24 states in suing to overturn the CPP. Gina McCarthy, EPA Administrator under President Obama, said the proposal “is a wholesale retreat from EPA’s legal, scientific and moral obligation to address the threats of climate change.”
Carbon dioxide was identified as a greenhouse gas in the middle of the nineteenth century. A warning that the gas would contribute to warming of the atmosphere was first made in 1896. More recently, hundreds of climate scientists from countries all around the world have been researching this field for decades. As noted above, essentially all agree: Man-made emissions of carbon dioxide from burning fossil fuels, and of other greenhouse gases, are warming the earth system at geologically unprecedented speed. The effects of fossil fuel use lead to weather extremes, wildfires, sea level rise and changing habitats of pests and disease carriers. Toxins released from fossil fuels cause illnesses among the public. The costs of future mitigation of, and adaptation to, global warming keep rising as the threats become more severe.
The United States is the only country of the more than 190 nations that joined the Paris Agreement to withdraw from it. Of the other nations, only Syria never acceded to the Agreement. (In October 2017 the only other holdout, Nicaragua, joined the Agreement.) It is unconscionable that a nation as respected as the United States has refused to join the other nations of the world in recognizing the irrefutable scientific evidence and acting accordingly. The present U. S. administration, instead of mitigating emission rates of greenhouse gases, is consciously reversing previously enacted policies. The result can only be accelerated emissions of greenhouse gases, with the consequent worsening of all the effects of warming. This will be a legacy for all our children and further progeny, one our leaders cannot be proud of.
For these reasons we here in the U.S. must act to restrain Pruitt’s EPA policies and the framework envisioned by the Trump administration. We must reinstate bona fide science as the guide for our actions.
Summary. Two extreme rainfall events with catastrophic flooding occurred in the U. S. recently. The first was in the Baton Rouge, Louisiana, area in August 2016, and the second is ongoing at this writing in August 2017 in southeastern Texas including Houston.
Attribution of extreme events to global warming has become more reliable as a result of increased capabilities built into the statistical procedures employed in such analyses. Global warming likely contributed about 20% to the rainfall experienced in the Baton Rouge flooding event of 2016.
Global warming is now recognized to be due largely to emissions of greenhouse gases by humans. It is projected to grow worse in coming decades if stringent efforts are not made to reduce these emissions. In that case it is foreseen that extreme weather events may become more frequent and more severe.
Introduction. The southern United States has suffered two episodes of unprecedented rainfall and flooding in the past year. In August 2016 Baton Rouge, Louisiana and the surrounding area experienced torrential rain and rapid, extreme flooding beginning August 11 and extending beyond August 16. Major damage and human dislocations resulted from this catastrophe.
In 2017 Hurricane Harvey left the Gulf of Mexico and made landfall near Corpus Christi, Texas on August 25. Contrary to the paths of many hurricanes, Harvey degenerated into a tropical depression and stalled over southern Texas for days; as of this writing on August 29 it has drifted slowly to the northeast, hovering over Houston, Texas. At various locations it has drenched the land with 20-40 inches (50-100 cm) of rain over this time (accessed August 29, 2017), causing extreme flooding, especially in the Houston area. It is projected to continue northeastward toward Louisiana in the next day or more.
Flooding in Baton Rouge arose when an unusual weather pattern producing excessively rainy conditions slowed considerably over the region for several days. In the most severe case rain fell at a rate of 2–3 inches (5.1–7.6 cm) per hour, and produced a total of 24 inches (61 cm) of rainfall, with a maximum recorded as 31.4 inches (79.7 cm) in Watson, Louisiana. The National Weather Service estimated the likelihood of such an event as 0.1%. Flooding of eight rivers in the area led to major disruptions and damage, including damage to 146,000 homes, with tens of thousands of people relocated to emergency shelters. About 265,000 children, or one-third of Louisiana’s school pupils, were prevented from attending school. The economic impact has been estimated at between $10 billion and $15 billion.
Rainfall and flooding in southern Texas is continuing at the time of this writing, and is expected to migrate east toward Louisiana in the coming days. The amount of rainfall to date is extremely high; an interactive display of rainfall rates and total accumulated rainfall at various locations is available online (based on the National Weather Service; accessed August 29, 2017). As of this writing, the total for the Corpus Christi area is 20 inches (50 cm), with a maximum rate of almost 3 inches per hour (7.5 cm per hour) on August 26. The Houston area is far more seriously affected, according to the interactive map. One location northeast of Houston shows a total rainfall to date of 52 inches (130 cm) with a maximum rate of about 10 inches per hour (25 cm per hour). (The normal annual rainfall in Houston is 49.8 inches (126 cm).) Images and videos of the flooding, its damage and human tragedy can be seen currently on news sources and the internet. The economic impacts will certainly be extremely high.
Reports such as the Fourth National Climate Assessment draft (NCA) foresee worsening catastrophes such as those described here. The draft NCA was prepared by climate scientists and related specialists drawn from thirteen U. S. government departments and agencies, as well as a large number of scientists in nongovernmental research facilities. They critically assessed peer-reviewed research and similar public sources, including primary datasets and widely-recognized climate modeling frameworks. These standards assure that the findings of the report are objectively accurate, avoiding bias toward any unsubstantiated point of view. By law the NCA cannot make any policy recommendations.
Among its conclusions, the NCA finds it is “extremely likely” that activities by humans have been the “dominant” cause of the warming observed since the middle of the 20th century. It states with “very high confidence” that no alternatives, such as cyclical changes in solar energy reaching the Earth or variations in natural planetary factors, can explain the observed climate changes.
The NCA projects with “high confidence” that heavy precipitation events will continue increasing over the 21st century. As noted, these trends are attributed to human activity. They will likely worsen considerably as the climate warms.
Global warming contributes to the severity of extreme weather events. Of the excess heat retained by the earth, i.e., the land, air and sea, as a result of man-made global warming, 90% enters the waters of the ocean. The U. S. National Oceanographic and Atmospheric Administration finds that the sea surface temperature of the Gulf of Mexico in the early months of 2017 exceeded the average for 1981-2016 by about 0.75°C (1.3°F), and about equaled the record for that period. Since the amount of water vapor that air can hold increases by about 7% per °C (about 4% per °F), the warmer Gulf surface temperature increased the water vapor capacity of the air by about 5% compared to earlier years.
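The ~5% figure can be recovered from the two numbers just cited. This is a rough consistency check only, compounding the per-degree rate given above.

```python
# Consistency check of the ~5% figure cited above: a Gulf surface about
# 0.75 C warmer than the long-term average, with the air's water-vapor
# capacity rising about 7% per degree C (compounded).
rate_per_c = 0.07
delta_c = 0.75
increase = (1 + rate_per_c) ** delta_c - 1
print(f"{increase:.1%}")  # about 5%
```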
Since the complete weather system defined as hurricane/depression Harvey is spending a large fraction of its time over the Gulf, it recharges its moisture content continuously, indefinitely. Over land, much of this added moisture in the system falls as additional amounts of rain, compared to earlier years. Similar considerations hold for the Baton Rouge extreme event of 2016. The physical damage and human harm inflicted by such calamities is costly. Ultimately much of the burden becomes added expenditures imposed as taxes on the population at large.
Attribution of specific events to the general finding that global temperatures are rising has become far more reliable in recent years. The procedures use advanced statistical measures to assess whether the extent by which the extreme event exceeds historical records has explanations other than global warming. If not, a proportion of the overall extreme event may be attributed to the excess effect provided by global warming.
Since the Houston extreme rainfall and flooding event is still in progress, it is too early to attempt attribution of its causes. The Baton Rouge event, however, has been assessed by attribution methods. Wang and coworkers identified atmospheric weather patterns that promoted the catastrophic rainfall of this episode. Regional model simulations lead to an estimate that global warming since 1985 likely increased the observed rainfall by 20%.
Authoritative analyses of the earth’s climate show that the warming experienced to date is primarily due to man-made additions of greenhouse gases to the atmosphere. This enhances retention of heat within the earth system rather than radiating excess heat to space. Continued human activity that produces more greenhouse gases in the future is expected to worsen this effect, according to climate models, leading to excessive warming of the planet’s air, land and oceans. In such a case, one consequence is expected to be more severe, and more frequent, extreme weather events such as the Baton Rouge intense rain and flooding, and hurricane/tropical depression Harvey currently wreaking havoc in Texas and Louisiana.
Stringent reductions in further emissions of greenhouse gases are called for in order to lessen the impact of future extreme weather events.
A draft of the Fourth National Climate Assessment reports that the global average temperature for the 30-year period from 1986 to 2016 rose by 1.2°F (0.7°C). It is extremely likely that activities by humans have been the principal cause of this warming. Extreme temperature and rainfall events have increased over this time, as have forest wildfires.
Arctic land-based ice has been lost to melting, and the extent and thickness of sea ice have decreased. The mean sea level has risen about 7-8 in (about 18-20 cm) since 1900. Ocean waters have taken up 93% of the excess heat of the Earth system due to global warming since the 1950s.
Global greenhouse gas emissions are projected to continue, and consequently global temperatures will continue increasing and related trends will continue. Limiting the temperature increase to the intended target requires that humanity reduce annual emissions to near zero by 2100.
The draft states “Choices made today will determine the magnitude of climate change risks beyond the next few decades.”
The Fourth National Climate Assessment (NCA) is due in 2018 (see Background at the end of this post). However, the U.S. Global Change Research Program (USGCRP), which oversees preparation of the NCA, has prepared a Final Draft of a Climate Science Special Report (CSSR; see Note 1 at the end of the post) that has become publicly available as a freestanding document on which the actual NCA will be based.
This post is based on the CSSR Executive Summary (ES). Confidence levels and likelihoods given here in italics are taken directly from the ES. They are carefully defined in the CSSR. Phrases in quotes are taken verbatim from the CSSR text.
The Historical Record
The global average temperature for the recent period from 1986 to 2016 has risen above the average for the six decades 1901-1960 by 1.2°F (0.7°C) (very high confidence). The map below shows temperature increases gridded across the globe.
The map shows that since the reference decades the entire surface of the planet, both land and sea, has increased in regional average temperature. The greatest increase has occurred in Canada (especially in the northern and Arctic regions), Alaska, Siberia, northeastern China and eastern Brazil. Indeed, the Arctic is warming about twice as fast as the global average.
It is extremely likely that activities by humans have been the “dominant” cause of the warming observed since the middle of the 20th century. No alternatives, such as the cyclical changes in solar energy reaching the Earth or variations in natural planetary factors, can explain the observed climate changes (very high confidence).
Extreme climate-related weather events have increased in number and severity. Since 1980 the cost of such calamities in the U. S. is over US$1 trillion. Extreme events can impact water quality, agriculture, human health, infrastructure, and lead to disaster events. In the U. S. the number of high temperature records in the past 20 years is much higher than the number of low temperature records (very high confidence).
Heavy precipitation events in most regions of the U. S. have increased in intensity and frequency since 1901, especially in the northeast. (high confidence).
The occurrence of large forest wildfires has increased in the U. S. West and Alaska since the early 1980s (high confidence).
The waters of the oceans have absorbed about 93% of the heat accumulating in the Earth system due to global warming since the 1950s (very high confidence). This affects climate patterns around the world.
In the Arctic, ice sheets overlaying land have been melting for at least the last three decades; in some locations the rate of loss is accelerating (very high confidence). The rate of melting of ice sheets over Greenland has accelerated in the last few years (high confidence). As this ice melts the water flows to the ocean, resulting in a net increase of sea level. Arctic sea ice, in contrast, floats on the Arctic Ocean and has been imaged from satellites since such observations began; its area expands and contracts in freeze-thaw seasonal cycles without any net change to global sea levels. Rather, its extent responds to changes in air and sea temperatures. The least extent, i.e., the most melting, occurs typically in September. Striking images showing the loss of September sea ice from 1984 to 2016, both in thickness (color coded white as having been formed at least four years earlier) and in overall surface area, are shown in the images below:
Sea ice thickness has decreased by between 4.3 and 7.5 feet. September sea ice extent has decreased by 10.7% to 15.9% per decade (very high confidence). These changes reflect warming of the Arctic region over this time frame. It is virtually certain that human activity has contributed to Arctic surface temperature increases, sea ice loss, glacier mass loss and snow extent decline seen across the Arctic (very high confidence).
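The per-decade rates quoted above can be compounded over the 1984-2016 span of the satellite images (about 3.2 decades) to illustrate the cumulative loss. A steady geometric decline is assumed here purely for illustration; real year-to-year ice extent is far more variable.

```python
# Illustration of how the quoted per-decade decline rates for September
# Arctic sea ice extent (10.7% to 15.9% per decade) compound over the
# roughly 3.2 decades from 1984 to 2016. Assumes a steady geometric
# decline, which is a simplification.

def remaining_fraction(rate_per_decade, decades):
    """Fraction of September sea ice extent remaining after a steady
    fractional decline of rate_per_decade, compounded over decades."""
    return (1 - rate_per_decade) ** decades

for rate in (0.107, 0.159):  # low and high ends of the quoted range
    lost = 1 - remaining_fraction(rate, 3.2)
    print(f"{rate:.1%} per decade -> about {lost:.0%} lost since 1984")
```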
The mean sea level has risen about 7-8 in (about 18-20 cm) since 1900 (very high confidence). This is attributed “substantially” to human-induced climate change (high confidence). The rate of sea level rise is greater than any found in the last 2,800 years (medium confidence). Ocean waters are absorbing more than 25% of the carbon dioxide emitted into the atmosphere by burning fossil fuels. Carbon dioxide forms a weak acid when dissolved in water, increasing the ocean’s acidity (very high confidence). This negatively impacts marine ecosystems in many important ways.
Projected Future Climate Trends
Extreme climate-related weather events will continue for many decades.
By the end of the 21st century, if the world achieves significant reductions in greenhouse gas emissions, the global average temperature increase could be limited to 3.6°F (2.0°C) or less. This would require a pathway in which annual GHG emissions reach near zero by then. In contrast, minimal constraints on the annual emissions rate could result in a rise of 5.8-11.9°F (3.2-6.6°C) (high confidence).
Even if the annual rate of greenhouse gas (GHG) emissions were to fall to zero, the high burden of GHGs already accumulated in the atmosphere would persist for a long time. The CSSR foresees that even in this event the global average temperature would rise further (high confidence), perhaps by an additional 1.1°F (0.6°C) (medium confidence).
But realistic projections all foresee continued GHG emissions into the future. U. S. temperatures will continue to rise (very high confidence); new records for high temperatures will be frequent (virtually certain). Temperatures by the end of this century will be much higher than at present (high confidence). Heavy precipitation events are projected to continue increasing over the 21st century (high confidence). In the western U. S., large reductions in mountain snowpack, and more precipitation falling as rain rather than snow, are projected as the climate warms (high confidence). These trends are attributed to human activity (high confidence). They will likely worsen considerably as the climate warms (very high confidence). In the absence of reductions in emission rates, long-duration hydrological drought, due to decreased retention of soil moisture, becomes more likely by the end of the century (very high confidence).
Further warming is projected to lead to increases in wildfires (medium confidence).
If GHG emission rates continue unconstrained, the average sea surface temperature is projected to increase about 4.9°F (about 2.7°C) by 2100 (very high confidence).
The mean sea level will continue increasing, to varying extents depending on future emission rates, by at least 1 ft (30 cm; very high confidence) and as much as 4 ft (130 cm; low confidence) by 2100. If portions of the Antarctic ice sheet are lost due to high emission rates the upper bound could be as high as 8 ft (240 cm). It is extremely likely that sea level will continue rising beyond 2100 (high confidence) as ice continues melting.
Further loss in Arctic sea ice will continue throughout the 21st century, very likely resulting in a virtually ice-free ocean by the 2040s (very high confidence).
Conclusions of the CSSR
Limiting the total global average temperature increase to 3.6°F (2.0°C), or less, from a 19th century baseline will require significant constraints on future GHG emission rates. Even though annual emission rates decreased slightly in 2014 and 2015, they are still too high to meet commitments that nations made upon entering the 2015 Paris Agreement (high confidence). Indeed, present and projected emission rates would bring the atmospheric level of GHGs to levels so high that they have not occurred for at least the last 50 million years (medium confidence).
New carbon dioxide released “today” is long-lived, persisting in the atmosphere for decades to thousands of years. It is therefore the accumulated total that matters: the relationship between cumulative CO2 emissions and the increase in global temperature is nearly linear.
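Because of this near-linear relationship, a simple proportionality can sketch the stakes. The proportionality constant below (a "transient climate response to cumulative emissions") and the 40 GtCO2/yr figure for current global emissions are illustrative round numbers of mine, not values taken from the CSSR:

```python
# Sketch of the near-linear cumulative-emissions/temperature relationship.
# TCRE is treated here as a single illustrative constant; assessments
# actually give a range of values.

TCRE = 0.45  # deg C of warming per 1000 Gt of CO2 emitted (illustrative)

def warming_from_emissions(cumulative_gtco2):
    """Approximate warming (deg C) caused by a cumulative CO2 total."""
    return TCRE * cumulative_gtco2 / 1000.0

# At roughly 40 GtCO2/yr, each additional decade of unabated emissions
# adds about 400 GtCO2 to the cumulative total:
print(f"extra warming per decade ~ {warming_from_emissions(400):.2f} C")
```

The point of the linearity is that every tonne emitted commits the planet to a fixed increment of warming, regardless of when it is emitted; this is why the remaining "carbon budget" framing works at all.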
The ES states “Choices made today will determine the magnitude of climate change risks beyond the next few decades. Stabilizing global mean temperature below 3.6°F (2°C) or lower relative to preindustrial levels requires significant reductions in …CO2 emissions…before 2040 and likely requires net emissions to become zero….” If humanity continues emitting GHGs at rates higher than called for here we would reach the 3.6°F limit only two decades from now, with further temperature increases later.
Finally, changes that are unanticipated, or difficult or impossible to manage, may arise during the next century. Examples are complex (or simultaneously occurring) phenomena, and self-reinforcing changes (positive feedbacks). Such occurrences would accelerate the world’s changes past the accepted CO2 and temperature limits.
Issuance of this NCA is mandated by an act of Congress. It is important that this Final Draft, the CSSR, continue on its bureaucratic trajectory and be issued on schedule in 2018. Yet some of the scientists contributing to the Draft hold positions in departments or agencies whose heads have expressed disdain for, or opposition to, the phenomenon of global warming or the Paris Climate Agreement, have acted to reverse federal policies that limit extraction and use of fossil fuels, or have deleted pages concerning global warming from their agencies’ websites. They all work in an administration whose head has declared global warming to be a hoax. These situations potentially place the contributing scientists in conflicting positions. Their work is commendable and should be supported. The NCA should be issued without being altered, nor should it be suppressed.
[Update August 17, 2017: The science journal Nature has published a news article that discusses the CSSR and the political considerations facing the U. S. administration as it weighs issuing the Fourth NCA. Climate scientists are concerned about the fate of the Report. Nature notes that the Heartland Institute, a conservative think tank that promotes skepticism about global warming, is consulting with the Environmental Protection Agency on this issue.]
This NCA is only the latest in a long series of reports detailing the reality of warming and specifying the harms that global warming and climate change cause to our planet. In particular, it attributes the cause to human activity, including the burning of fossil fuels.
We must all undertake to reduce emissions of GHGs in our personal lives, and support policies promoting reductions at the state, national and international levels.
The U. S. Global Change Research Act of 1990 mandates preparation of assessments of global change every four years to “assist the nation and the world to understand, assess, predict, and respond to human-induced and natural processes of global change”. Each assessment reports the current state of scientific understanding of the effects of global change on the natural and human environments. Its tasks, however, do not include formulation of policies to address global warming.
Climate scientists and related specialists drawn from thirteen U. S. government departments and agencies (see Note 2 at the end of this post), as well as a large number of scientists in nongovernmental research facilities, prepared the CSSR and the NCA. They critically assessed peer-reviewed research and similar public sources, including primary datasets and recognized climate modeling frameworks.
1. USGCRP, 2017: Climate Science Special Report: A Sustained Assessment Activity of the U.S. Global Change Research Program [Wuebbles, D.J., D.W. Fahey, K.A. Hibbard, D.J. Dokken, B.C. Stewart, and T.K. Maycock (eds.)]. U.S. Global Change Research Program, Washington, DC, USA, 669 pp.
2. The federal scientists involved in preparing the NCA and the CSSR are drawn from the:
Department of Agriculture,
Department of Commerce (National Oceanic and Atmospheric Administration),
Summary. Scientific research is pursued as an unbiased, objective inquiry into the properties of the natural world. The foundations of climate science were laid over the last two hundred years, establishing that man-made production of carbon dioxide induces an atmospheric greenhouse effect. Current political influence seeks wrongly to raise doubts about these immutable facts.
These advances are all based on a common set of principles that underlie scientific investigation: the rigorous preservation of independent, unbiased research; pursuit of new scientific knowledge that builds on the results of previous studies; and research that either seeks to find support for new hypotheses by further experimentation or pursues open-ended research in order to gain new, detailed understanding of the natural world. As the posts exemplify, new knowledge can lead to new technologies that are readily commercialized and broadly benefit the public at large. Regardless of the scientific field, these advances resulted from the universal application of open, unbiased inquiry into the properties of our natural world.
Our understanding of the atmospheric greenhouse effect and the role of excess carbon dioxide produced by humanity’s use of fossil fuels began two centuries ago. The scientists involved were either of nobility or in royal or university research settings. As with scientific progress generally, the development of climate science was based on the same principles of inquiry detailed above. The field grew during a time when scientific endeavor was pursued for its own value. In contrast to the present, extra-scientific factors, such as political influences on science and the results it provided, were largely unknown.
Here five landmarks in the development of what we now call the atmospheric greenhouse effect are summarized and discussed. More complete presentations of each are given in the Details section further below.
De Saussure’s Heliothermometer. In the late 18th century Horace-Benedict de Saussure developed a box, blackened on the inside and covered by glass panes, containing a thermometer. In sunlight the temperature inside this box rose much higher than that of the air outside. He called the box a heliothermometer.
Jean-Baptiste Joseph Fourier was a French physicist and mathematician, interested in studying heat flow and thermal equilibrium at a global scale. In the 1820s he knew of de Saussure’s box, and analogized its properties to those of the Earth. He likened the glass panes to the Earth’s gaseous atmosphere. He distinguished between the visible light of the sun passing through the atmosphere and striking the Earth, and the invisible rays of heat radiation, which he surmised were confined by the atmosphere. The heat radiation that cannot escape to space warms the Earth’s surface. As one whose thinking was conceptual, he did not perform any experimental work based on his model.
John Tyndall was a British physicist who, in the late 1850s to 1860s, constructed a novel apparatus that permitted him to measure directly whether a gas absorbed heat radiation. He showed that carbon dioxide was among several gases he studied that absorb heat; water vapor also absorbs heat radiation. Tyndall’s results provided concrete evidence that components of the atmosphere retain heat within the Earth system, instead of letting it radiate into space.
In the 1890’s Svante Arrhenius, a Swedish physical chemist, feared that the excess carbon dioxide entering the atmosphere from burning fossil fuels (coal, oil and natural gas) would warm the Earth. He performed extensive calculations by pencil and paper supporting his concern, and predicted significant increases in global temperature if fossil fuel use were to continue.
Charles Keeling, an American geochemist, first measured the time dependence of the carbon dioxide level in the air, beginning in 1958. He showed that the amount was higher than at the time of Arrhenius, and that it increased year by year due to continued use of carbon-based (fossil) fuels by humankind. His observations confirmed the fears that Arrhenius expressed. Others have since confirmed the predicted rise in the temperature of the Earth.
The foundation of scientific investigation was laid more than four hundred years ago. It is based on an unbiased pursuit of new knowledge, gained by factual investigation into the properties of the natural world without preconceived biases on how the results should turn out. Recent posts here have provided examples of scientific findings and technological advances in the nineteenth and twentieth centuries.
Development of climate science followed the same principles: unfettered, open inquiry directed only to gaining a better understanding of our climate. This post highlights five main contributors to this endeavor, starting in the late eighteenth century. Their work has led to an understanding of the atmospheric greenhouse effect, its basis in carbon dioxide and water vapor, and the rapid worsening of global warming as humanity’s use of fossil fuels has continued unabated. Developments in recent decades, building on the work of these five pioneers, make clear that the world’s energy economy has to decarbonize as rapidly as possible.
Yet commercial energy interests have exerted their considerable political influence to maintain the status quo. They seek to discredit the science of global warming, by questioning that conclusion without supporting scientific data. They could just as readily have embraced the new reality, and committed themselves to new business models, free of fossil fuels, yet which have comparable potential for entrepreneurship and pursuit of profit.
The Heliothermometer. In 1779 Horace-Benedict de Saussure, a meteorologist and geologist of noble origins, published a set of experiments based on a thermally insulated box he devised. It was lined on the bottom with blackened cork and topped by a set of three glass panes separated from one another by air gaps. (Jean-Louis Dufresne: Jean-Baptiste Joseph Fourier et la découverte de l’effet de serre. La Météorologie, Météo et Climat, 2006, 53, pp. 42-46) The box contained a thermometer. He found that when the sun shone on this apparatus the temperature inside was much higher than that of the outside air. De Saussure, however, did not try to understand the basis for his finding.
Jean-Baptiste Joseph Fourier was a French physicist and mathematician of the early nineteenth century. His interests lay in understanding the physics of heat, and in deducing the sources of heat that led to the ambient temperature of the Earth and its atmosphere. He was granted a faculty position at the École Polytechnique, and was elected to the Académie des Sciences.
Fourier concerned himself with heat fluxes of the entire Earth system (even though direct measurements had to wait until satellites became available about 150 years later; Jerome Whitington, 2016, The Terrestrial Envelope: Joseph Fourier’s Geological Speculation). He considered that de Saussure’s heliothermometer provided an analogy for the Earth. As described by Dufresne (cited above), Fourier first noted that the heat accumulated within the box is not dissipated by circulation to its exterior, and second, that the heat arriving from the sun as (visible) light differs from what he called “hidden” (i.e., invisible) light. Rays from the sun penetrate the glass covers of the box and reach its bottom, heating the interior walls and the air they contain. The energy re-emitted by these warmed surfaces is no longer “luminous” (i.e., not visible) and has only the properties of “dark” (invisible) heat rays. Heat rays do not freely pass through the glass covers of the box, or through its walls; rather, heat accumulates within it. The temperature in the box increases until thermal balance is reached, such that the heat added from the sun is offset by the slow dissipation of heat through the walls.
Heat radiation had been discovered earlier during Fourier’s lifetime and he probably was familiar with this phenomenon. Today we identify heat as infrared radiation, and de Saussure’s heliothermometer as a fine example of a greenhouse. Indeed any car standing closed in the sun becomes a greenhouse. When we get in it we are immediately immersed in a very hot atmosphere.
According to Dufresne, Fourier drew the analogy between the glass covers of de Saussure’s heliothermometer and the Earth’s atmosphere. He understood that the atmosphere is transparent to the visible light of the sun, which then reaches the surface of the Earth. The surface is heated by the sunlight and emits its energy as “dark”, or heat, radiation. Fourier wrote (Dufresne; this writer’s translation): “Thus the temperature is raised by the barrier presented by the atmosphere, because the heat easily penetrates the atmosphere in the form of visible light, but cannot pass through the air [i.e., back into space] when converted into dark [i.e. invisible] light.” This is the phenomenon which we now call the greenhouse effect exerted by the atmosphere.
John Tyndall showed that carbon dioxide absorbs heat radiation. Tyndall was a British physicist whose research centered on radiation and energy. He became a fellow of the Royal Society in 1852, and later a professor at the Royal Institution of Great Britain.
He studied whether the gaseous components of air absorb radiant heat. He devised an apparatus, shown in the image below, that compares the absorption of heat by a gas to that of a reference:
Tyndall’s differential spectrometer for measuring radiant heat absorption by a gas. The gas was introduced into the long tube in the upper center. Loss due to absorption of radiant heat by the gas was compared to a reference heat signal produced at the left. The losses were compared in the double-conical thermopile at left center, and the resulting electrical signal was measured by the galvanometer (a sensitive measuring device) at the lower center.
Source: James Rodger Fleming, “Historical Perspectives on Climate Change”, Oxford University Press, 2005; attributed in turn to John Tyndall, “Contributions to Molecular Physics in the Domain of Radiant Heat” (London, 1872).
In 1859, among other gases, Tyndall studied oxygen, nitrogen (the major components of air), water vapor and carbon dioxide. He found that oxygen and nitrogen had minimal absorption of heat radiation, and that water vapor was a strong absorber. His experiments placed water vapor and carbon dioxide as main contributors to the role of the atmosphere in retaining heat radiation. He stated “if…the chief influence be exercised by the aqueous vapor [i.e., water], every variation…must produce a change of climate. Similar remarks would apply to the carbonic acid [i.e., carbon dioxide] diffused through the air…” (cited by Crawford, “Arrhenius’ 1896 Model of the Greenhouse Effect in Context”, Ambio, Vol. 26, pp 6-11, 1997). Tyndall’s specific findings extended the theory that Fourier had set out in more general terms (see above) more than three decades earlier (Rudy M. Baum, Sr., “Future Calculations; The First Climate Change Believer”, in Distillation, 2016).
[Climate scientists are not concerned about a danger to our planet from warming due to water vapor. The amount of water vapor in the air at any temperature has an upper limit: what we call 100% relative humidity. When that limit is reached, water vapor in the air returns to Earth as liquid (rain) or solid (snow, hail) precipitation. The water vapor content of the atmosphere can never exceed this upper bound, though the bound does rise somewhat with temperature. In contrast, the atmospheric content of carbon dioxide has no upper bound. That is why scientists urge us to decarbonize our energy economy.]
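That upper bound rises by roughly 6-7% per °C of warming, a consequence of the Clausius-Clapeyron relation. A small sketch using the Tetens approximation, a standard empirical formula for the saturation vapor pressure of water (the coefficients are the commonly published ones):

```python
import math

# Saturation vapor pressure over water via the Tetens approximation,
# a standard empirical fit; output is in kPa for temperature in deg C.
def saturation_vapor_pressure(temp_c):
    """Approximate saturation vapor pressure (kPa) at temp_c (deg C)."""
    return 0.61078 * math.exp(17.27 * temp_c / (temp_c + 237.3))

# How much more water vapor can air hold per extra degree of warming?
e15 = saturation_vapor_pressure(15.0)  # roughly Earth's mean surface temp
e16 = saturation_vapor_pressure(16.0)
print(f"capacity increase per extra degree: {100 * (e16 / e15 - 1):.1f}%")
```

This is also why a warmer Gulf of Mexico fed heavier rainfall during Harvey: warmer air above warmer water holds measurably more moisture before condensing it out as precipitation.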
Svante Arrhenius was a Swedish physical chemist working around the turn of the 20th century. He had wide-ranging interests in various aspects of chemistry, including the effects of carbon dioxide on the temperature of the Earth. He won the Nobel Prize in Chemistry in 1903, and became the Director of the Nobel Institute in 1905. He was motivated by the desire to understand the origins of past ice ages. Knowing of Tyndall’s work on carbon dioxide, he raised the possibility that humanity’s use of coal and other fossil fuels would lead to excess warming.
Arrhenius estimated the intensity of heat absorption by water vapor and carbon dioxide in the atmosphere from data gathered by an American astronomer, Samuel Langley (described by Crawford). He then calculated how the warming would change as the quantities of these gases varied. This is important in today’s context because the increase in the quantity of carbon dioxide is the principal cause of warming today. [His calculations were carried out entirely with pencil and paper; it is estimated that he performed between 10,000 and 100,000 individual computations.]
Importantly, Arrhenius identified human use of fossil fuels as a significant contributor to increased amounts of atmospheric carbon dioxide. He predicted that if the gas amounts increased by 50% the temperature would rise by 3°C (5.4°F); for a doubling of carbon dioxide in the atmosphere he predicted an increase in global temperature of 5° to 6°C (9.0 to 10.8°F; J. Uppenbrink, Science, 272, p. 1122).
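Arrhenius's two figures are mutually consistent with a logarithmic law: warming proportional to the logarithm of the CO2 concentration, so that every doubling adds the same temperature increment. A minimal sketch using his own per-doubling estimate (modern central estimates of this "climate sensitivity" are lower, roughly 3°C per doubling):

```python
import math

# Arrhenius-style greenhouse rule: warming scales with the logarithm of
# the CO2 concentration ratio, so each doubling adds a fixed increment.
# The per-doubling value here is Arrhenius's own (illustrative; modern
# estimates are lower).

def arrhenius_warming(c_ratio, warming_per_doubling=5.5):
    """Warming (deg C) when CO2 rises to c_ratio times its initial level."""
    return warming_per_doubling * math.log(c_ratio, 2)

print(f"+50% CO2 -> {arrhenius_warming(1.5):.1f} C")  # matches his ~3 C
print(f"doubling -> {arrhenius_warming(2.0):.1f} C")
```

The logarithmic form explains why climate projections are usually stated per doubling of CO2 rather than per ppm added.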
His projection was met with skepticism at the time, because the actual amounts of the gas then being added to the atmosphere were thought to be inconsequential, and because some assumed that much of the gas would be absorbed by the oceans.
Charles Keeling was the first to show that atmospheric carbon dioxide is increasing with time. He was an atmospheric scientist working at the Scripps Institution of Oceanography in the U. S., beginning in 1956. (Scripps was the nucleus for the University of California campus at San Diego.) A few years earlier he had developed an instrument that measures atmospheric carbon dioxide content in real time. Keeling began monitoring carbon dioxide high on the slopes of the volcano Mauna Loa, in Hawaii, 3,000 m (ca. 2 mi) above sea level. Because of its remote location and high altitude this site was thought to be largely unaffected by short-term changes arising from human activities.
Keeling had earlier determined that the carbon dioxide level was higher than in the 19th century. After three years he showed clearly that the level was increasing with time. He, and more recently the Mauna Loa laboratory, have tracked the gas level in the atmosphere; the results (now called the “Keeling Curve”) to 2016 are shown here:
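A back-of-envelope reading of the curve's endpoints (approximately 315 ppm when measurements began in 1958 and approximately 404 ppm in 2016, rounded annual values) gives the average growth rate over the record:

```python
# Average growth rate between two (approximate) points on the Keeling
# Curve. A straight-line average hides the fact that the annual increase
# has itself been speeding up over the record.

def mean_annual_growth(ppm_start, ppm_end, year_start, year_end):
    """Average CO2 increase in ppm per year between two observations."""
    return (ppm_end - ppm_start) / (year_end - year_start)

rate = mean_annual_growth(315.0, 404.0, 1958, 2016)
print(f"average rise ~ {rate:.1f} ppm per year")
```

In recent years the observed annual increase has run well above this long-term average, reflecting the continued growth of global fossil fuel use.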
A principal motivation for Keeling’s interest in the carbon dioxide content of air came from Arrhenius’s prediction 60 years earlier that addition of carbon dioxide to the air from burning fossil fuels could increase global temperature. In contrast to doubts about warming raised in Arrhenius’s time, Keeling’s measurements show unequivocally that the carbon dioxide level is rising rapidly (see the graphic above).
Keeling also devised the measurement of the ratios of the isotopes of carbon in atmospheric carbon dioxide. This permitted others to show clearly that the excess carbon dioxide in the atmosphere can only arise from carbon of plant origin, i.e., from the industrial-scale burning of coal, oil and natural gas by humanity. Keeling’s work was the basis for a report from the U. S. National Science Foundation in 1963, and of the U. S. President’s Science Advisory Committee in 1965, warning of the dangers of excess and increasing levels of heat-trapping gases (such as carbon dioxide) in the atmosphere.
Summary. Our understanding of the human immune system made tremendous progress all through the 20th century, continuing even to the present day. As part of this effort, important advances have been made in vaccine development, saving millions of lives worldwide. But false reports linking vaccination with autism have led to rejection of vaccination by many parents, who fear incorrectly that vaccination may trigger the later appearance of autism.
Science can only proceed by open-ended inquiry, untainted by preconceived biases. Unscientific proposals that are counter to the results of objective inquiry, such as the harmful movement to shun vaccination, are unproductive. They harm society at large by diverting attention and wasting resources. Human progress relies on critical verification of scientific discovery, and on building further on the progress made.
Allison had married relatively late, and was happy to have her husband and infant son as her family.
As a parent, she was aware of the controversy in the popular press surrounding the question of a possible link between administering vaccines to infants and the later development of autism. Having done considerable research on the controversy during her pregnancy, Allison realized she needn’t have concerns: she concluded that the claimed connection with autism was mistaken. She proceeded to give her son all the vaccines recommended for growing children, and he grew into a healthy child.
+ + + + +
The modern practice of medicine has benefited enormously from the results of scientific research in the biological and medical sciences. We all are better off from the results of these efforts. The previous post dealt with the discovery of the structure of DNA and the genetic code, and the immune system. Here I discuss the immune system in vaccination, and fallacies surrounding use of vaccination.
Antibodies. One of the ways that the immune system reacts to, and fights against, a foreign (likely disease-causing) particle, such as a virus, cell, or parasite, is to generate antibodies (a group of special protein molecules) that specifically react against particular structures on the particle (the antigen). The antibodies bind molecule-to-molecule to the antigen, inactivating the particle (see the previous post) and preventing the disease or minimizing its harmful effects.
Vaccines. Vaccination is another process involving the immune system. It has long been used as a way of deliberately confronting the immune system of a healthy person with a component obtained from, or similar to, a foreign antigen that causes disease in humans, in order to generate protective immunity against later appearance of the disease. The process was originally developed by Edward Jenner in the 18th century for preventing smallpox. It was known anecdotally that milkmaids who contracted cowpox from infected cows rarely became infected with smallpox. Cowpox causes an illness in humans similar to smallpox, but much weaker.
Jenner’s scientific breakthrough was to take the fluid produced in the cowpox infection, and to introduce it under the skin of healthy people. He found that the treated people did not develop smallpox. (We now understand that his procedure induced the immune system to produce antibodies that specifically attack the cowpox- or smallpox-causing substance, a virus. A protein on the cowpox virus is sufficiently like the corresponding protein on the smallpox virus that the human antibodies that bind to cowpox also bind to smallpox, thus inactivating it.)
Many vaccines have been developed since then to immunize people against infection from viruses or bacteria. Today commercial vaccines contain a very small amount of a preservative to keep them sterile. Most vaccination is for young infants and toddlers, to protect against diseases such as diphtheria, whooping cough and measles.
Vaccines and Autism. Public fear over use of vaccines has developed over the past two decades. It started with an erroneous and misleading report of a connection between vaccination and autism in children. (See Science (2017), Vol.356, pp.364-373.)
In 1998 a physician, Andrew Wakefield, published a report in a respected British medical journal suggesting that use of the measles-mumps-rubella triple vaccine could lead to later development of autism. His report led to a 20% reduction in vaccinations in Britain. But in 2004 a journalist found that Wakefield had been involved in trying to patent a competing measles vaccine, a serious conflict of interest. The journal, in further investigations of its own, found serious ethics violations by Wakefield and retracted his original paper in 2010. A short time later the British General Medical Council revoked his license to practice medicine. Thus Wakefield’s original findings were definitively determined to be invalid.
There is also a more general suspicion by many parents of an unwelcome intrusion by government authorities into a family’s health care decisions, as well as other nonscientific fears.
More recently, the Journal of the American Medical Association reported no difference in rates of autism between vaccinated and unvaccinated children. (See Science, cited above.)
Thimerosal, a mercury-based preservative, had been used in very small amounts in vaccines until 2001, when its use was discontinued in almost all vaccine compositions. Even so, four years later in 2005 the lawyer Robert F. Kennedy Jr. (late President John F. Kennedy’s nephew) alleged in Rolling Stone and Salon that the U. S. government was hiding evidence that use of thimerosal led to increased incidence of autism. The data Kennedy cited were mistaken, and in 2011 Salon retracted his article.
Both the U. S. Centers for Disease Control and Prevention and the United Nations’ World Health Organization have concluded that thimerosal causes no health problems in children. (See Science, cited above.) Separately, a report published online in Nature on July 12, 2017 compares social interactions of infants and toddlers among pairs of identical twins, pairs of fraternal twins, and paired but unrelated single-birth infants and toddlers. A genetic component was detected among the children in the study who were later diagnosed with autism, but was absent in the other children. This work shows that our scientific understanding of autism is growing as rigorous investigation proceeds.
Public refusal of vaccination places the children of rejecting parents in considerable danger, because the infections that the vaccines inhibit cause serious or fatal diseases. If enough children remain unvaccinated, contagion can spread the diseases among them. By contrast, the World Health Organization estimates that 2 to 3 million lives a year are saved because of successful vaccination programs.
Remarkable advances in biological and medical science, including immunology, have been made since the end of World War II. Among these is cancer immunotherapy, in which antibodies are created to antigens on cancer cells, as if the cancers were foreign invaders like viruses and bacteria (see the previous post).
Immunological research has also led to use of new, effective vaccines, raising immunity to dangerous diseases in our bodies. To our detriment, nonscientific, indeed antiscientific, campaigns against the use of vaccines in children have increased the number of susceptible children among the populations of many developed countries, including the U.S., potentially to dangerous levels. The spread of virulent infections is enabled by the failure to vaccinate. (See Science, cited above.)
Scientific investigation seeks to expand our knowledge and understanding of the world we live in. Reflecting on the state of knowledge at any time, a curious scientist poses a question or suggests a hypothesis. Experiments directed toward answering the question or verifying the hypothesis are carried out as an objective pursuit, further characterizing the natural world, without introducing preconceived biases. Conclusions are then drawn based on the new results obtained. These frequently lead to practical applications that improve our health and prolong lives.
The intrinsic value of scientific study should be defended and supported by all to continue its progress, and to promote human welfare.
Summary. Biological science has made tremendous progress since James Watson and Francis Crick presented their model for the structure of deoxyribonucleic acid, DNA. Their model built on, and provided explanation for, important findings about DNA that were already known.
Since then, important progress has been made, for example, in analyzing the base sequences, i.e., the chemical code, of specific genes and using that information for diagnostic and therapeutic purposes. Gene sequences that identify mutations in breast cancer genes alert physicians to the possibility that virulent malignancies may develop later in life. Another genetic aberration, in the HER2 protein, permits targeting it with a therapeutic antibody, a major step forward.
Science can only proceed by open-ended inquiry, untainted by preconceived biases. Unscientific proposals that oppose the results of objective inquiry are unproductive. They harm society at large by diverting attention and wasting resources. Human progress relies on critical verification of scientific discovery, and on building further on the progress made.
Allison just left her gynecologist’s office, with a referral to have a breast biopsy. She was apprehensive, of course. While breast cancer can arise in women of all ages, she had married relatively late and only recently had her first child, a son, and she worried about him and her husband.
A couple of weeks later Allison went in for the biopsy procedure. In addition to the well-established examination of the samples from the specimen under a microscope, a small portion was also sent for DNA analysis, which searches for a genetic match for a mutation associated with some breast cancer cases. When such a match is found, the patient is more likely to develop breast cancer during her lifetime. The pathologist found no visual evidence for cancer in Allison’s samples, and the mutant DNA also was not present.
Now, with the negative biopsy result, and the information from the genetic analysis, Allison can continue to enjoy watching her son develop. But she will be under close medical scrutiny for the possibility that a cancer may develop later, as the years pass.
+ + + + +
The modern practice of medicine has benefited enormously from the results of scientific research in the biological and medical sciences. Here I describe the discovery of the structure of DNA and the genetic code, and the immune system. The next post discusses the immunology of vaccination and the unproven fears that vaccines may be connected to autism.
DNA and the Genetic Code. Modern understanding of genetics has progressed dramatically since 1953, when the double helical molecular structure of DNA, the material containing the genetic code of all cells, was worked out by James Watson and Francis Crick (drawing critically on X-ray diffraction data from Rosalind Franklin). Other renowned scientists of the time had tried but failed to predict the structure. Watson and Crick found that the structure fits together correctly only if, among the four DNA building blocks, adenine on one strand of the helix is paired with thymine on the opposing strand, and guanine is paired with cytosine (see the graphic below).
Diagram showing the double helix structure of the DNA molecule. Guanine (green) on one strand can only bind with cytosine (red) on the other, and thymine (orange) only with adenine (blue).
The sequence of the four building blocks, generally called nitrogenous bases, along the DNA strand defines a code that specifies how each protein is synthesized in the cell. Proteins are the large molecules that variously a) carry out the cell’s molecular business, b) provide structural elements that give cells their sizes and shapes, or c) communicate with neighboring and distant cells of the organism.
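The base-pairing rule described above means that one strand completely determines the other. A minimal sketch (the function name is illustrative, not from the post) shows how the opposing strand can be read off, remembering that the two strands run in opposite directions:

```python
# Watson-Crick base pairing: A pairs with T, G pairs with C.
PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complementary_strand(strand):
    """Return the base sequence of the opposing DNA strand.

    The opposing strand runs antiparallel, so the paired bases
    are read in the reverse direction.
    """
    return "".join(PAIRS[base] for base in reversed(strand))

print(complementary_strand("ATGC"))  # -> GCAT
```

Applying the function twice returns the original sequence, reflecting the fact that either strand can serve as the template for the other.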
Breast Cancer Susceptibility. As time passed, the base sequences of many single genes were worked out. Among them were the two breast cancer, or BRCA, genes, BRCA1 and BRCA2. A mutant, or variant, form of either BRCA gene can be identified easily by a genetic analysis that searches for the mutant base sequence in a subject’s DNA. Carrying the mutant form raises the lifetime risk of developing breast cancer to roughly 50% or higher, compared with about 12% in the general population.
Antibody therapy. Certain breast cancers can be treated with some new drugs that are the result of major recent advances. A breast cancer biopsy specimen can be assessed for the presence of a larger than normal level of a surface protein, the human epidermal growth factor receptor 2 (HER2) protein, which makes the cancer highly malignant. Such breast cancers can be treated with a new biotechnology product, a monoclonal antibody drug called Herceptin®, that binds to and inactivates HER2.
Polyclonal antibodies. One of the ways that the immune system reacts to, and fights against, a foreign (likely disease-causing) particle, such as a virus, cell, or parasite, is to generate antibodies (a group of special protein molecules) that specifically react against particular structures on the particle (the antigen). The antibodies bind molecule-to-molecule to the antigen, inactivating the particle.
Our natural immune system works by creating many antibody proteins that differ at the molecular level one from another, but that all bind to the foreign particle. Each antibody protein arises from a distinct, specific cell in the immune system. Because the natural immune reaction generates this large number of antibody species, collectively they are called polyclonal antibodies (see the image below).
Monoclonal antibodies. Immunological research revealed that different antibody molecules in the polyclonal set bind differently to the antigen (see the image above). Some that bind weakly only partly inactivate the antigen, whereas others that bind quite tightly are efficient as inactivators.
In 1975, Georges Köhler and César Milstein (Nature, 1975 Aug 7;256(5517):495-7) developed a technique that fuses individual antibody-producing immune cells with immortal tumor cells, creating hybrid cells that never die. They cultured each hybrid cell separately in a growth medium, so that each culture contained only the daughter cells originating from a single fusion cell, and the antibody molecules in each culture were therefore all identical to one another. These are called monoclonal antibodies (see the image above). The resulting monoclonal antibodies could then be tested for how well they neutralized the antigen.
Herceptin® is the trade name for a monoclonal antibody drug that reacts specifically with the HER2 protein on a tumor cell surface, preventing the normal functioning of the receptor and inhibiting tumor growth. In essence Herceptin® treats HER2, a normal constituent of human cells, as a foreign substance, attacking and inactivating it.
Remarkable advances in biological and medical science have been made since the discovery of the DNA double helix and the identification of the genetic code. Since about 2000 we have had the sequences of all the genes in human DNA. This permits identification of normal and pathological conditions in humans. Understanding of the origins of many cancers has resulted, and has led to new therapies. Among these is cancer immunotherapy, in which antibodies are created against antigens on cancer cells, as if the cancers were foreign invaders like viruses and bacteria.
Science advances by the objective pursuit of new information about the natural world. The results frequently lead to practical applications that improve our health and prolong lives. The intrinsic value of scientific study should be defended and supported by all to continue its progress.
Summary. Our lives in the early 21st century benefit from remarkable changes wrought by science and technology in the last 200 years. As a human endeavor, science consists of a method of inquiry into the natural world based on open-minded investigation, rather than one biased in one direction or another that develops support for a desired point of view.
Certain science-based phenomena have come to light in recent decades that adversely affect human health or damage the environment. Rigorous study showed that, in each case, products or practices of large corporations turned out to be responsible. Those commercial interests sought to raise questions about the scientific results in the minds of the public, rather than continue further research to develop sound solutions to the problems.
We humans have benefited from the advances provided by science and technology. We cannot justifiably select the science we like and dismiss the science that we don’t.
Benedict (Benny to his friends) is waking slowly after staying late at a party last night. He’s already enveloped in the soothing sounds of his favorite music, Sounds from Space, which invariably puts him in a mellow mood; his clock radio switched the music on at the alarm time. He also plays music on his CD player; over the years he’s accumulated an extensive library of CDs. His tastes run quite eclectic.
At last Benny swings himself out of bed and hops on to his Stair Stepper for a workout. It’s equipped with a TV monitor so he can watch the latest news as he exercises.
After a leisurely breakfast, he gets ready to head out for his weekly frisbee match.
After the vigorous physical exertion of the game, he comes home and turns on his air conditioner to make his apartment more comfortable. Air conditioners are effective because they lower the air temperature, but equally importantly, they remove some humidity from the air. Lower humidity makes the body feel cooler because its perspiration evaporates more easily, cooling the skin.
Later, that afternoon, Benny has decided to attend a lecture at the local library on the shoreline habitats for all manner of wildlife. Lately he’s become even more interested in the natural world, and how different species interact in their habitats. The lecturer is using a computer-driven digital projector, and he emphasizes his discussion as he goes along using a laser pointer.
In the evening, Benny and Valerie, his girlfriend, go out for dinner and come back to relax with a movie streamed over the internet.
* * * * *
Benny’s day, a rather routine one in today’s world, benefited from many products that rely on developments in science and technology. Here we’ll discuss two classes of appliance, and a third that, after careful scientific investigation, became quite controversial.
Telegraph, radio and television. For all of history before the industrial revolution, news, books and artwork traveled only as fast as humans could carry them. Walking and travel by horseback could transmit physical objects, whereas drumbeats, smoke signals and semaphore signaling could communicate more terse messages.
In the 1830s and 1840s clusters of inventors in the U.S. and England separately developed the telegraph. In the U.S., one of those was Samuel Morse. The previous post mentioned that nineteenth century physicists developed an understanding of the reciprocal interactions between electricity and magnetism. With the telegraph, a key pressed by a sender completed an electric circuit so that current could flow instantaneously as far as a conducting wire could be strung. At the destination, the current activated an electromagnet to sound a click. In addition to developing the technology, Morse invented Morse code, in which patterns of short and long signals encode every letter of the alphabet. The technology developed into the Western Union Company (cofounded by Ezra Cornell, for whom the university is named), which strung wires across the U.S. This revolutionary technology liberated the transmission of information from the historical limits of personal or visual/auditory messaging.
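The coding idea is simple enough to sketch in a few lines. The table below uses the modern International Morse Code (Morse’s original American code differed in some details), with just enough letters for a short example:

```python
# A small subset of International Morse Code, for illustration only.
MORSE = {"S": "...", "O": "---", "E": ".", "T": "-", "A": ".-", "N": "-."}

def encode(message):
    # Encode letter by letter, separating letters with spaces.
    return " ".join(MORSE[ch] for ch in message.upper())

print(encode("sos"))  # -> ... --- ...
```

Every letter becomes a distinct pattern of short and long signals, which is exactly what freed messages from the limits of physical transport.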
The telephone extended the electromagnetic transmission of coded messages to the direct, immediate transmission of sound, especially the human voice.
The laws of physics relating to electromagnetism also led to radio and television transmission. Perhaps, if you live in an older home, you’ve noticed that a window sash will buzz or vibrate in its track as an airplane or a truck passes by. The window sash has its own characteristic vibration frequency. The sound from the passing plane or truck can set the window vibrating, but only if the sound waves have the same pitch as the sash’s natural vibration. This is variously called forced vibration or sympathetic vibration.
Radio and TV transmission and reception work the same way. A radio transmitter is designed to emit radio waves at a specific vibration frequency. If a receiver circuit in a radio or TV is adjusted to vibrate at the same frequency, the broadcast signal is picked up by the receiver, amplified, and delivered as sound and picture images. If the tuner is not adjusted to the appropriate frequency it will not receive the broadcast signal.
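The tuning behavior described above follows the textbook formula for a driven, damped oscillator, whose steady-state response peaks when the driving frequency matches the circuit’s natural frequency. A sketch (the frequencies and damping value are illustrative, not from the post):

```python
import math

def response(f_drive, f_natural=1000.0, damping=5.0):
    """Relative steady-state amplitude of a damped oscillator
    driven at f_drive; largest when f_drive equals f_natural."""
    return 1.0 / math.sqrt((f_natural**2 - f_drive**2)**2
                           + (damping * f_drive)**2)

# The on-frequency signal is amplified far more than an off-tune one.
print(response(1000.0) / response(900.0) > 10)  # -> True
```

This sharp preference for one frequency is what lets a receiver pick a single station out of the many signals arriving at its antenna.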
Benny’s air conditioner is filled with a refrigerant gas, a chlorofluorocarbon. The technological principles underlying operation of refrigerators and air conditioners were explained in the preceding post.
Use of chlorofluorocarbons (CFCs) is an example where a useful technology turns out to have harmful consequences. When they were developed and entered the market, the use of CFCs as refrigerants and in other applications became widespread. During the 1980’s, however, researchers discovered that the amount of ozone in the stratosphere (a zone centered around 15 mi. above Earth’s surface) was diminishing compared to earlier years. Stratospheric ozone is beneficial because it filters out ultraviolet light from incident sunlight. (This should not be confused with ground level ozone, a health hazard, which is produced by smog on hot days.) If ozone becomes depleted, more ultraviolet (UV) light can reach the surface of the earth. The additional UV could increase the incidence of skin cancer the world over if the ozone depletion were to continue.
After some years atmospheric scientists showed clearly that chlorofluorocarbons (CFCs) caused the ozone depletion. These compounds enter the atmosphere when refrigeration equipment leaks its refrigerant or is improperly disposed of; when we use spray cans, such as hair spray; and when CFCs are used as industrial foaming agents. Even a small amount of CFCs has a powerful destructive effect because the active component derived from CFCs is re-used in the chemistry of ozone destruction many times over. For this discovery, Paul Crutzen, Mario Molina, and Frank Rowland were awarded the Nobel Prize in 1995.
In light of this new understanding almost 50 of the world’s nations, the main producers and users of CFCs, agreed to the Montreal Protocol of 1987 to phase out use of these compounds.
Corporate Interests generated doubt and delay. Early on the manufacture of CFCs, and of the spray cans that use them, became a lucrative business. Rigorous scientific research, pursued as a quest for understanding of basic properties of the natural world, led to evidence showing that CFCs were responsible for destroying stratospheric ozone. As this evidence was accumulating, however, the companies sought to neutralize the impact of the scientific results (Wikipedia; N. Oreskes and E. M. Conway, “Merchants of Doubt”, 2010, Bloomsbury Press, New York), without offering scientific evidence to support their position.
In one paper, prepared by Greenpeace for the 9th meeting of participants in the Montreal Protocol in 1997, a threefold corporate strategy of disinformation used by a major corporation was summarized:
Deny that CFCs are responsible. The corporation wrote in 1979: "No ozone depletion has ever been detected...all ozone depletion figures to date are based on a series of uncertain projections."
Delay. In the years surrounding the signing of the Montreal Protocol, this corporation sought to delay implementation of its terms by lobbying activities. In 1986 it testified before Congress: "we believe that there is no immediate crisis that demands unilateral regulation."
Dominate. The industry had already developed alternatives to CFCs, closely related in chemical structure to the banned compounds, by which they intended to dominate the world market for refrigerants and propellants.
This post and the preceding one, and perhaps a few more to come, strive to point out that humanity benefits from scientific endeavor, in all its varied subject matter. Scientists work by pursuing characterization of our natural world in an open, unbiased fashion. The results of scientific investigations and the technologies that result from those studies benefit our lives in innumerable ways. The progress we humans have made began largely with the industrial revolution in the nineteenth century; it represents a revolutionary departure from the way of life humans had known throughout history.
Telegraph and radio communication show how scientific development permitted humans to communicate instantaneously across great distances. Prior to this time, human communication traveled only as fast as we could move across land and sea.
The example of CFCs used as refrigerants and propellants likewise shows how research creates new materials intended to have beneficial properties. The detrimental aspect of their use, promoting the destruction of stratospheric ozone, was unforeseen. It is thanks to further atmospheric research that the mechanism of ozone destruction was unequivocally identified, and still newer substances that avoid this downside were created. (Unfortunately, both CFCs and the newer refrigerants are extremely potent greenhouse gases. It will require still further efforts to overcome this detriment.)
When the drawback of CFCs was identified the powerful corporations that manufactured them sought to diminish the significance in the mind of the public of the scientific research underlying the problem. But science proceeds in the same way regardless of whether we consider the results to be favorable or harmful. As shown above, the same scientific process led to potential solutions that overcame the disadvantages.
The public at large, and corporate entities impacted by research results, cannot cherry pick the results they like and dismiss the ones they don’t. Rigorous pursuit of the scientific method is the only way forward.
Summary. Our daily routines, as we go about our lives in the early 21st century, benefit from revolutionary changes wrought by science and technology in the last 200 years. As a human endeavor, science consists of a framework of inquiry into the natural world based on open-minded investigation, rather than one in which scientists seek evidence or arguments that support preconceived biases and reject evidence that refutes those preconceptions.
Certain phenomena came to light in recent decades that adversely affected human health or damaged the environment. Rigorous scientific study showed that, in each case, human activity involving products or practices of large corporations turned out to be responsible. Those commercial interests sought to invalidate the scientific results in the minds of the public, rather than continue further research to develop sound solutions to the problems.
We humans have welcomed the advances provided by science and technology. We cannot justifiably select the science we like and dismiss the science that we don’t.
The Daily Routine
Janice gets up in the morning and gets ready to go to work. She switches on the light and the TV to get the latest news and weather. For breakfast, she takes a quick snack from the fridge and heats it up in the microwave oven. She gets into her battery-powered electric car, which she bought just a few weeks ago; she’s really impressed with its ease of use and responsiveness on the road.
Once in the office, Janice turns on a networked computer that contains more computing power than the massive mainframe computers of a generation ago. Her coworkers include many colleagues scattered around the U.S., with whom she effortlessly teleconferences directly from her workspace. This saves many hours that would be lost in travel time flying to another location for a face-to-face meeting, as well as travel expenses. Her day is turning out to be highly productive as a result, and saves her company money in the process.
Back home in the evening, Janice has a dinner composed of foods grown using advances in agriculture that promote higher crop yields; farmers benefit greatly from weather and climate research that helps them plan effectively for the best sowing and harvesting operations.
* * * *
The Science and Technology That Janice Likely Takes for Granted
Science. As a human endeavor, science is a framework of thought and experiment carried out in an open-ended, fact-based fashion. Scientists seek to make sense of our physical world, both animate and inanimate. By not having preconceived notions of how they want an investigation to turn out, they probe physical reality in ways that add to our body of knowledge, and that suggest further investigation of questions that may have arisen in earlier work. New information obtained from these efforts may have direct practical significance having the potential to lead to products that improve our lives.
Technology, or applied science, seeks to optimize characteristics of a system to solve a specific practical problem or to make a specific article with an intended practical use.
Modern life. Like Janice, we all benefit from the progress of science and technology in our daily lives, and relish the conveniences and capabilities of new devices or processes as they reach the market. We, the public at large, accept these with open arms, whether we “understand” the scientific principles that govern their operation or not. We do not question the truth or validity of the science that undergirds these objects that ease our daily life; indeed, we welcome it because of the benefits it brings to our lives.
The scientific basis underlying some of the items and phenomena that Janice encounters in her daily routine is set forth at the end of this post in the Details section.
But some scientific questions, or technological accomplishments, have turned out to provide adverse consequences. Smoking tobacco became associated with lung disease, including cancer. Pristine forests and fish in lakes downwind of coal-fired electric generating plants began to die inexplicably, which was ultimately attributed to acid rain from burning coal. The ozone in the stratosphere, which absorbs harmful ultraviolet radiation in sunlight, became depleted relatively suddenly. Research showed that certain chemicals developed to serve as propellants in spray cans were responsible.
Each of these cases is associated with a powerful and lucrative industry. Careful scientific investigation, using the same conceptual approaches as outlined above, in these cases succeeded in providing a sound scientific basis for the harm that each phenomenon produced. Yet the industries involved mounted strong public relations campaigns (not based on science) to discredit the science in order to sow doubts about the scientific explanations.
But we cannot cherry pick which science we like and which science we disavow. Open-ended, unbiased investigation leads us universally to the scientific progress we welcome and depend on. In the examples above, scientific study not only explained the origin of the respective adverse effects but also suggested how to remedy the problems. Thus, here too the scientific method has led to benefits that promote our wellbeing and the integrity of the physical world we inhabit.
Electricity. The laws of physics governing the interactions between electrically conducting materials (such as metal wires) and magnetic fields were identified during the nineteenth century. The phenomena are reciprocal: wires moving through a magnetic field generate electrical current, and electrical current flowing through wires wound around a core generates a strong magnetic field. Electric motors run the generating process in reverse, passing electrical current through wound wires to produce rotational mechanical motion.
Thomas Edison on the one hand, and Nikola Tesla and Charles Steinmetz on the other, developed competing ways of generating and distributing electricity: Edison championed direct current, while Tesla and Steinmetz developed alternating current. Tesla joined the Westinghouse company; their alternating-current technology won out. Steinmetz joined the General Electric Research Laboratories.
Refrigerators. The intrinsic physical properties of most gases are such that when the gas is compressed it releases heat to its environment, and when the pressurized gas expands it cools down, absorbing heat from the environment. Refrigerators work by expanding the gas in the chamber that needs cooling, absorbing heat from the food in the chamber so that the food is cooled. The refrigerator then compresses the gas outside the chamber, releasing the heat to the environment. (In recent decades, the reciprocal process has been applied in heat pumps: a gas is expanded in an external environment, absorbing heat, and compressed inside a home, releasing heat to warm the interior space.)
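The compression-expansion cycle just described can move several units of heat for each unit of electrical work. The thermodynamic upper bound on this ratio, the Carnot limit (a standard result, not derived in the post), depends only on the two temperatures involved:

```python
def carnot_cop_cooling(t_cold_c, t_hot_c):
    """Ideal (Carnot) coefficient of performance of a refrigerator:
    heat removed from the cold chamber per unit of work input.
    Celsius inputs are converted to kelvin."""
    t_cold = t_cold_c + 273.15
    t_hot = t_hot_c + 273.15
    return t_cold / (t_hot - t_cold)

# A kitchen refrigerator at 4 C in a 25 C room:
print(round(carnot_cop_cooling(4.0, 25.0), 1))  # -> 13.2
```

Real refrigerators fall well short of this ideal, but the calculation shows why pumping heat across a small temperature difference is far cheaper than generating the equivalent heating or cooling directly.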
Microwave ovens. Physicists who had developed an understanding of electricity went on to build instruments that emit microwaves. A second group of physicists, who developed quantum theory over several decades in the early twentieth century, showed that materials can specifically absorb microwaves (among other forms of energy) according to the laws of quantum physics. Water is one such substance, and it warms as it absorbs. A microwave oven generates the specific type of microwave radiation that water absorbs, and works by efficiently warming the water contained in various foods.
Electric cars. Electric cars depend critically on high-capacity batteries, which to date are based on lithium. The basis for this technology originates in fundamental investigations by chemists, mostly in the nineteenth century; one contribution was the systematics of the chemical periodic table. Lithium is, atom for atom, a very light element, a first physical property favorable for use in batteries. Second, chemists found that lithium’s intrinsic ability to provide electrical energy is among the highest of all the chemical elements. These two inherent attributes make lithium an optimal choice for electric car batteries. Current research and development is directed toward making the batteries as efficient and long-lasting as possible.
Agricultural production. The Austrian friar Gregor Mendel was the first to discover the laws that govern inheritance of traits in organisms. Working with pea plants, he showed by conventional breeding experiments that intrinsic factors (now called genes) govern how physical traits are passed from generation to generation. (His work was painstakingly slow, since only one generation of pea plants can be grown per year.)
Agricultural breeders utilize Mendelian genetics to enhance the properties of commercially significant plants and animals. These properties may include nutritional value, hardiness, and drought and/or heat tolerance, for example. The results of these projects benefit us, the consumers, as we make our grocery purchases.
The Agricultural Research Service (ARS), a division of the U. S. Department of Agriculture, conducts ongoing characterization and forecasting of near-term weather as well as the longer-term seasonal climate. Farmers use the information provided by these projects advantageously to plan their activities: planting, fertilizing, and harvesting. The work of the ARS is summarized in the pamphlet “Science in Your Shopping Cart” .
Summary. This post discusses three newspaper articles concerning global warming-induced sea level rise, all of which appeared during the third week of April 2017.
Sea level rise is inexorable, already irreversibly “baked in” to the planet’s climate, because melting of ice in the summer season is not restored by new snow and ice in the winter, and because the melted water flows away into the ocean.
Sea level rise is already causing human societal and economic damage around the world. It will continue unabated, and likely worsen, in future centuries. To minimize these harms, the world has to minimize greenhouse gas emissions to near zero as soon as possible. This process would be significantly advanced by adhering to the Paris climate agreement.
The Washington Post reported on April 26, 2017 that the rate of sea level rise now foreseen by scientists is considerably higher than published only four years ago by the Intergovernmental Panel on Climate Change (IPCC) in its Fifth Assessment Report.
The Post article reports that the projections are a collaborative effort among 90 scientists, which was subjected to peer review by 28 other scientists. Climate models based on two scenarios for continued rates of emission of greenhouse gases to the year 2100 were used for the projections. One is a moderately stringent policy limiting emission rates. The second is a scenario based on continued unconstrained emission rates comparable to those that reflect today’s fuel use. The results are shown in the following table, which also includes the 2013 IPCC projections for comparison.
Predicted sea level rise by 2100 [2013 IPCC projection in brackets]:
Moderate emissions-limiting scenario: at least 52 centimeters (1.7 feet) [32 centimeters (1 foot)]
Unconstrained emissions scenario: at least 74 centimeters (2.4 feet) [45 centimeters (1.5 feet)]
The updated estimates take into account the increased rate of melting of the Greenland Ice Sheet and Antarctic ice shelves recently observed, and expansion of the liquid ocean due to its higher temperature, among other contributing sources.
This article reports that Tasmania, the island south of the Australian mainland, is already suffering the ravages of sea level rise. The shoreline is being eroded by rising seas, and trees are being uprooted and falling into the sea. An abandoned shoreline coal mine is being filled in by the sea. The article states “The ocean is rising in large part…because people the world over have burned so much coal, pumping planet-warming carbon dioxide into the air. Perhaps a new stone marker [referencing a seaside prisoners’ graveyard] ought to be planted above the eroding mine: Cause, Meet Effect.” A Tasmanian ecologist stated, with some irony, “It’s a smoking gun for sea-level rise causing an acceleration of erosion. And it’s coal! Mined for burning!”
The article summarizes manifestations of worsening global warming: “In country after country, managers of national parks and other historic sites are realizing that climate change, with its coastal flooding and erosion, rising temperatures and more intense rainstorms, represents a profound risk to the heritage they are trying to preserve.” It mentions damage to the Statue of Liberty’s foundation by Hurricane Sandy, loss of most of the glaciers in America’s Glacier National Park, damage to Australia’s Great Barrier Reef due to rising ocean temperature (vindicating a 10-year old prediction), among many other examples.
Singapore is an independent island city-nation just off the coast of the southern tip of Malaysia. It is a thriving metropolis, whose economic base is commerce and the financial industry. The article notes that Singapore has felt the limitations of its small land area for decades. This has constrained the ways it can develop additional useful real estate as its fortunes continue to grow.
In recent years this quandary has been worsened by the encroachment of rising sea levels. Singapore fortunately has the financial resources to expand its land area artificially by reclaiming it from the sea. The image above shows one example. The city sinks massive ocean-resistant caissons (seen above from the air) into the sea bed surrounding its natural land base, forming empty rectangular enclosures. It then imports huge quantities of sand or pulverized rock and fills in the rectangles, creating new land which, when completed, will provide new surface area for development. The new land is high enough to withstand sea level rise in the coming years.
The article contrasts the case of affluent Singapore with other, more impoverished, island “micro-nations” that are losing the battle against rising seas. Solomon Islands is a Pacific Ocean nation on six major islands and several hundred smaller islands, with an area of 11,000 sq. mi. The article notes that five small islands have already disappeared under rising seas. Kiribati has bought 6,000 acres of land 1,000 miles away in Fiji for resettlement of its people. The Maldives is considering a similar purchase in Australia. Some of the people living on the island micro-nations of Tuvalu, the Marshall Islands and Nauru have already departed.
Newspaper reports on sea level rise. The examples cited in the articles above pinpoint the flooding, and consequent damages, to be expected along coastlines all over the world as sea levels continue rising. Man-made global warming, the main cause of the rising seas, is unequivocally due to humanity’s burning of carbon-containing fuels for energy, emitting the greenhouse gas carbon dioxide into the atmosphere. (Other man-made greenhouse gases also contribute to warming.)
The fundamental problem is that carbon dioxide remains resident in the atmosphere for centuries because no natural processes remove it at the speed and on the massive scale needed to balance the excess amounts that we produce. As a result, warming will continue worsening until emissions are reduced effectively to near zero.
Polar melting. As noted in the Summary, the long-term average temperature of air in contact with the Greenland ice sheet, and of ocean water in contact with the Antarctic ice shelves, is already warm enough to lead to net melting of these ice reservoirs, raising global sea levels. We cannot return to a planetary regime with a lower temperature (which might slow or stop melting of the ice) because of the permanence of carbon dioxide in the atmosphere. Consequently, sea levels are projected to rise for centuries. Projected higher temperatures will worsen this trend.
The “social cost of carbon” is an economic term for a framework that attempts to place direct financial costs, as well as indirect societal costs, on the consequences of carbon dioxide-induced global warming. This is necessary because the direct costs of using carbon-containing fossil fuels end at the point of sale of the fuel. The costs incurred as consequences of the resulting global warming are not reflected in the sale price.
This may be contrasted, for example, with the costs that residents bear to have their household waste removed by tax-supported services, or the charges that they pay for treatment of their waste water. The separate expense of handling the waste is directly borne by property owners and/or municipal taxpayers. No analogous cost for waste treatment is built into the cost structure of fossil fuel-derived energy use. This accounting gap is what pricing the social cost of carbon seeks to fill.
Contributions to the social cost of carbon are seen in the newspaper excerpts presented here. Singapore is fortunate in having the resources to protect itself from sea level encroachment. The other oceanic island micro-nations mentioned here do not; they face existential threats in the near future.
In the U.S., coastal communities in Miami and south Florida, as well as Norfolk, Virginia, now suffer fair-weather flooding at high tide, due to higher sea levels, that had not occurred previously. Their cost of carbon lies in the extensive, expensive barriers they are forced to put in place to minimize the flooding. Likewise, the New York region is planning to construct similar barriers as a defense against future storm surges like that of Hurricane Sandy. None of these projects was foreseen in earlier budgeting processes. The additional expenses for them become unexpected taxpayer burdens at the state and local levels. They clearly represent social costs of carbon that are not included in the prices paid for fossil fuels at the time of use.
Three simultaneously published newspaper articles have pointed out the present and future harms to humanity due to sea level rise. The rising level is due to humanity’s burning of fossil fuels, worsening the carbon dioxide-induced greenhouse effect and producing warmer global average temperatures that melt polar ice caps.
We must work together to minimize future increases in the carbon dioxide burden of the atmosphere in order to slow continued sea level rise. (The world’s temperature is already too high to stop it outright.) The Paris climate agreement of 2015 is a good start on this path. All nations of the world should embrace its provisions and strengthen the emission limits it has established. Rejecting the agreement would be done at humanity’s peril.