Skeptical Science examines the science and arguments of global warming skepticism. Common objections like 'global warming is caused by the sun', 'temperature has changed naturally in the past' or 'other planets are warming too' are examined to see what the science really says.
We return to the topic of sea-surface temperature measurements, which we took up in the article “Ocean Temperature – Part 1”. This time, we discuss issues concerning satellite measurements.
Around 1800, Marc-Auguste Pictet performed an experiment in Geneva in which he showed that the temperature of a cold or hot object can be monitored remotely. His experiment is illustrated in Figure 1. Two mirrors, A and B, each with a diameter of 40 centimetres and a focal length of 40 centimetres, were placed nearly 5 metres apart. When a cold object (a glass bulb filled with ice) was placed at the focus of mirror A, the temperature at point D immediately began to drop. This effect did not occur for a thermometer placed at any other point. This experiment still surprises many people, even though few of them have trouble believing that a hot object at focal point C would raise the temperature of the thermometer. Today we know that the experiment illustrates the propagation of infrared electromagnetic waves, rather than heat exchange produced by the mixing of air between the mirrors. It also shows that mirrors reflect and focus infrared radiation just as they do visible light.
Figure 2: Explanation of Pictet’s experiment.
The experiment is explained further in Figure 2. Every body emits infrared radiation, with an intensity that depends on its temperature and is usually equal in all directions. These rays are gathered and reflected by mirror A, which is made of a material that reflects rather than absorbs infrared radiation. In an analogous process, mirror B then focuses the radiation.
One can imagine a number of complications in conducting this experiment. For example, the mirror can focus not only the radiation coming from the focal point C, but also other radiation emitted nearby. In this case, the measurement would reflect not only the change in the examined object’s temperature, but a combination of the temperatures of the object and its surroundings. Furthermore, mirrors are never perfectly reflective. Also, the lower the intensity of radiation, the lower the sensitivity of such a measurement.
Satellite measurements in infrared and microwave frequency bands
Figure 3: Special Sensor Microwave Imager/Sounder (SSMIS). Source: NSIDC.
Similar temperature-estimation techniques based on emitted radiation are currently used to measure sea-surface temperature from satellites orbiting the Earth. Figure 3 shows a satellite used for microwave radiation measurements. One of its components is a mirror that scans the Earth and focuses the incoming radiation onto a sensor that records the energy of the signal. We do not place another mirror on the ocean’s surface, of course; instead, radiation is gathered from a large area. Sea-surface temperature measurements are made using infrared radiation with wavelengths between 3.7 and 12 micrometres, and microwave radiation with frequencies ranging from 6 to 200 GHz, which corresponds to wavelengths between 0.15 and 5 centimetres.
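As a quick sanity check, the microwave frequency range quoted above can be converted to wavelengths with the standard relation λ = c/f. A minimal sketch in Python:

```python
# Sanity check: convert the 6-200 GHz microwave band used for SST sensing
# into wavelengths via lambda = c / f.
C = 2.998e8  # speed of light, m/s

def wavelength_cm(freq_ghz):
    """Wavelength in centimetres for a frequency given in GHz."""
    return C / (freq_ghz * 1e9) * 100.0

print(f"6 GHz   -> {wavelength_cm(6):.2f} cm")    # 5.00 cm
print(f"200 GHz -> {wavelength_cm(200):.2f} cm")  # 0.15 cm
```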
Figure 4: Relation of spectral flux density of irradiated energy to wavelength of microwave band. Source: The COMET Program.
For a given temperature, radiation emission decreases with increasing wavelength (Figure 4). Therefore, microwave sensing requires collecting information from a larger area of the ocean than infrared sensing. This makes it difficult to make high spatial resolution measurements of Earth’s surface in the microwave band (Figure 5).
Figure 5: Images show the details of sea-surface temperature distributions depending on the resolution of satellite measurements. In infrared, the resolution is significantly higher (1 km) than for microwave measurements (around 25 km).
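The wavelength dependence shown in Figure 4 follows from Planck's law of thermal emission. A short sketch comparing the spectral radiance of a surface at a typical sea temperature in the infrared window (10 micrometres) and in the microwave band (1 centimetre, a wavelength chosen purely for illustration):

```python
import math

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Planck spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.expm1(H * C / (wavelength_m * KB * temp_k))
    return a / b

sst = 288.0  # a typical surface temperature, K
b_ir = planck(10e-6, sst)  # infrared window
b_mw = planck(1e-2, sst)   # microwave band

# Emission at 10 micrometres exceeds emission at 1 cm by many orders of
# magnitude, which is why microwave sensing must gather radiation from a
# much larger patch of ocean.
print(f"IR/microwave radiance ratio: {b_ir / b_mw:.2e}")  # ~3.4e+10
```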
Just as in Pictet’s experiment, the surroundings may influence satellite temperature measurements through the emission of radiation by sources other than the ocean itself. Such effects are related to the radiation of clouds, of particles suspended in the air, and of the gases that constitute the air. This occurs even though SST measurements are performed at wavelengths known as “atmospheric windows,” at which the atmosphere absorbs (and emits) little radiation. The problem of how to determine temperature from measurements at different wavelengths is complex and requires the introduction of atmospheric corrections and comparisons with direct measurements; in other words, satellite algorithms based on the laws of physics (the equation of radiative transfer) are empirically corrected against direct measurements made at the same time and place. This is called parameterisation of the relation between observed temperature and satellite estimates. In essence, the temperature Tb measured in space depends not only on the ocean temperature Ts and its emissivity e, but also on an atmospheric correction and on the reflection of radiation in the ocean-atmosphere system. This is illustrated in Figure 6.
Tb = e · Ts + Tatmosphere + Treflection

where Tatmosphere is the contribution emitted by the atmosphere and Treflection is the contribution from radiation reflected in the ocean-atmosphere system.
Figure 6: Contributions to temperature measured at the top of the atmosphere (upward arrows): (i) from direct emission by ocean surface; (ii) from direct emission by atmosphere; and (iii) from the reflection of incoming radiation. Source: K. N. Liou, An Introduction to Atmospheric Radiation.
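The decomposition above can be illustrated numerically. All values below are invented for illustration only, not real retrieval quantities:

```python
# Illustrative numeric decomposition of the brightness temperature seen at
# the top of the atmosphere, Tb = e*Ts + Tatm + Trefl. Every number here
# is made up for illustration, not a real retrieval value.

def brightness_temp(emissivity, sst_k, t_atm_k, t_refl_k):
    """Brightness temperature at the sensor, in kelvin."""
    return emissivity * sst_k + t_atm_k + t_refl_k

# In the microwave band water has low emissivity (~0.5), so the surface
# term supplies only part of the signal and corrections matter a lot.
tb = brightness_temp(emissivity=0.5, sst_k=290.0, t_atm_k=20.0, t_refl_k=5.0)
print(f"Tb = {tb:.1f} K")  # Tb = 170.0 K
```

The point of the sketch is that the measured Tb is far from the true 290 K surface temperature, which is why the retrieval algorithms and calibrations discussed below are essential.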
Satellite measurements of sea surface temperature are also affected by cloud droplets and rain. Cloud droplets are around 10 microns in diameter, small enough that microwave radiation is only slightly disturbed by the presence of clouds. This changes when it starts to rain, however: raindrops can be as large as one millimetre in diameter, and they scatter microwave radiation. In the infrared, the size of cloud droplets is comparable to the wavelengths used for measurements (around 10 micrometres), which is why infrared radiation cannot be used to measure sea surface temperature in the presence of clouds or rain.
Physical properties of water
It is worth mentioning, at least briefly, the physical properties of seawater, because they determine the quality of satellite measurements. For infrared waves, the information about temperature comes from a layer a dozen or so microns thick below the air-ocean interface – the skin depth of the ocean. Radiation with infrared wavelengths of 3.7–12 micrometres is strongly absorbed by water. The situation is similar in the microwave band – waves with a frequency of 6–200 GHz penetrate water only to a depth of around 1 millimetre below the air-ocean interface. Figure 7 shows how the loss of intensity depends on depth; you can see that visible light penetrates water much deeper than infrared or microwave radiation.
Figure 7: Intensity of absorption for electromagnetic waves of different wavelengths.
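The rapid loss of intensity with depth can be sketched with a simple exponential (Beer-Lambert-style) attenuation model. The e-folding depths below are assumed, order-of-magnitude values consistent with the text, not measured constants:

```python
import math

# Exponential (Beer-Lambert-style) attenuation: intensity falls off as
# exp(-z / d), where d is the e-folding ("skin") depth. The depths below
# are assumed, order-of-magnitude values.

def remaining_fraction(depth_m, efold_m):
    """Fraction of radiation intensity surviving to a given depth."""
    return math.exp(-depth_m / efold_m)

ir_skin = 15e-6  # assumed infrared e-folding depth, ~15 micrometres
mw_skin = 1e-3   # assumed microwave e-folding depth, ~1 mm

# At 0.1 mm below the surface almost no infrared signal survives,
# while microwave radiation is barely attenuated.
print(f"IR:        {remaining_fraction(1e-4, ir_skin):.4f}")  # ~0.0013
print(f"microwave: {remaining_fraction(1e-4, mw_skin):.2f}")  # ~0.90
```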
Another physical parameter of water is its emissivity. It defines how “black” water is for a given wavelength. It turns out that if we were able to see in infrared, water would be almost completely black, and in microwaves, it would look like a dirty mirror, because water has low emissivity in the microwave band.
Algorithms, calibration, systematic errors
Sea surface temperature estimation from satellite data is an inverse (retrieval) problem; it is not a direct measurement. Algorithms for calculating temperature from satellite measurements are often based on radiative transfer equations, but they also rely strongly on the calibration of data against direct measurements and on techniques for the approximate inversion of complex mathematical equations. Although satellite measurements have the remarkable advantage of global coverage, they also have some significant downsides. Satellite measurements are not continuous, and a typical satellite’s lifespan is a couple of years, after which it needs to be replaced. For this reason, climatological data series based on satellite measurements, which by their nature span a long time, are subject to inaccuracies and demand caution in their interpretation. Sometimes a new-generation satellite carries a similar instrument to its predecessor, as in the case of the American Advanced Very High Resolution Radiometer (AVHRR), but the interpretation of data spanning many years of measurements still requires caution because of difficulties with calibration and the gradual degradation of sensors. Some new-generation instruments have been designed with climate measurements in mind. One such instrument is the Along-Track Scanning Radiometer (ATSR), but for now its measurements cover a relatively short period of time. On the other hand, satellite temperature measurements play a central role in weather forecasting, the analysis of oceanic phenomena and seasonal forecasting.
It might seem that the progress from temperature measurements in water buckets to drifting buoys is so enormous that further advances in measurement techniques are no longer necessary. Even so, life still calls for new solutions – sometimes required by new technologies, sometimes by the mundane realities of politics and economics. For example, in recent years it has become dangerous to deploy buoys around the Horn of Africa because of armed pirates operating in these waters. In some regions, drifters are quickly caught and destroyed by fishermen. Satellite programmes are discontinued for lack of funding or because of political cuts to complex research programmes.
Despite all that, climate research obviously continues. For example, atmospheric physicists and oceanographers who conduct observations in the equatorial Pacific have formed a group, the “Tropical Pacific Observing System,” and are actively planning future sea surface temperature measurements. One of the group’s recommendations is to create an observational network based on direct measurements with low-systematic-error instruments; for example, they call for the improvement of thermistors used in drifters, so that measurements do not depend on how long a drifter has been in the water. They also suggest new research challenges: understanding diurnal temperature variations should be an important element of the research, because we do not know whether night-time measurements are sufficiently representative. The group argues that direct measurements should be conducted in cloudy and rainy regions, where satellite measurements are difficult or even impossible. Temperature-measurement planning also includes the development of new measurement techniques. One of them is the system called Argo, which became fully functional quite recently, in 2006. It consists of floats – auto-submersible buoys that spend the majority of their time at a depth of 1000 metres, where the influence of drift is relatively small. Once every 10 days, the floats ascend to the surface of the ocean, measuring temperature and salinity during their ascent. The measurements are currently stopped about half a metre before the floats reach the ocean’s surface, to prevent pollutants from forming a deposit on the sensors. During the short period on the surface, they transmit data to a satellite. Scientists also call for a certain percentage of Argo measurements to be made using a much shorter cycle, which would allow us to understand changes in ocean temperature throughout the day.
Figure 8: Argo system. Floats ascend from the depths every 10 days, collecting data on water temperature and salinity during the ascent. During the short period on the surface, they transmit data to a satellite. Source: Argo.
Another measurement system under consideration is the use of autonomous sailboats, known as sailing drones, like the prototype shown in Figure 9. They can make measurements over large areas of the ocean and are less susceptible to drift.
It is hard to predict what the future will bring, but it is worth doing the preparatory research well in advance: satellite-based projects require more than a decade of advance planning, and ground-based projects several years.
Figure 9: Photograph of a prototype sailing drone for meteorological and ocean measurements. Source: Saildrone.
Piotr J. Flatau, University of California, San Diego
Carella et al. (2017). “Measurements and models of the temperature change of water samples in Sea Surface Temperature buckets.” Quarterly Journal of the Royal Meteorological Society.
Jones (2016). “The reliability of global and hemispheric surface temperature records.” Advances in Atmospheric Sciences 33(3): 269–282.
Kennedy (2014). “A review of uncertainty in in situ measurements and data sets of sea surface temperature.” Reviews of Geophysics 52(1): 1–32.
Kent et al. (2010). “Effects of instrumentation changes on sea surface temperature measured in situ.” Wiley Interdisciplinary Reviews: Climate Change 1(5): 718–728.
Reverdin et al. (2010). “Temperature measurements from surface drifters.” Journal of Atmospheric and Oceanic Technology 27(8): 1403–1409.
The original article was written by Piotr J. Flatau from the University of California San Diego (UCSD) for our Polish partner site Nauka o Klimacie and translated into English by Juliusz P. Braun. You can access the original article by clicking on the logo:
Among the eleven modeling teams, the key findings were consistent. First, a carbon tax is effective at reducing carbon pollution, although the structure of the tax (the price and the rate at which it rises) is important. Second, this type of revenue-neutral carbon tax would have a very modest impact on the economy in terms of gross domestic product (GDP). In all likelihood it would slightly slow economic growth, but by an amount that would be more than offset by the benefits of cutting pollution and slowing global warming.
Meanwhile, House Republicans are again on the verge of introducing a Resolution denouncing a carbon tax as “detrimental to American families and businesses, and is not in the best interest of the United States.”
The strong economic case for a carbon tax
The modeling teams examined four carbon tax scenarios, with starting prices of $25 or $50 per ton of carbon dioxide, rising at 1% or 5% per year. These are somewhat modest policy scenarios; CCL proposes a starting tax of $15 per ton rising at $10 per year, and the CLC proposes $40 per ton rising around 4% per year. The most aggressive policy considered by the Stanford EMF teams ($50 per ton rising 5% per year) falls in between these two proposals.
The carbon price each year 2020–2050 in proposals by Citizens’ Climate Lobby (blue), the Climate Leadership Council (red), and the four approaches modeled by the Stanford EMF teams (green). Illustration: Dana Nuccitelli
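The price trajectories in the figure can be reproduced from the starting prices and growth rules quoted above. A minimal sketch (the real proposals may differ in details such as the start year):

```python
# Carbon-price paths, 2020-2050, built from the starting prices and
# growth rules described in the text. Illustrative only.

def linear_path(start, per_year, years):
    """Price rises by a fixed dollar amount each year."""
    return [start + per_year * t for t in range(years)]

def exponential_path(start, rate, years):
    """Price rises by a fixed percentage each year."""
    return [start * (1 + rate) ** t for t in range(years)]

years = 31  # 2020 through 2050 inclusive
ccl = linear_path(15, 10, years)              # Citizens' Climate Lobby
clc = exponential_path(40, 0.04, years)       # Climate Leadership Council
emf_high = exponential_path(50, 0.05, years)  # most aggressive EMF case

print(f"2050 prices: CCL ${ccl[-1]:.0f}, CLC ${clc[-1]:.0f}, "
      f"EMF high ${emf_high[-1]:.0f}")
# 2050 prices: CCL $315, CLC $130, EMF high $216
```

Note how the linear $10-per-year rise overtakes both percentage-growth paths well before 2050, which is why the rate of increase matters more than the starting price.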
The modeling studies consistently found that for all four carbon tax policies considered, whether the revenue is returned via rebate checks or by offsetting income taxes, the direct economic impact is minimal:
in every policy scenario, in every model, the U.S. economy continues to grow at or near its long-term average baseline rate, deviating from reference growth by no more than about 0.1% points. We find robust evidence that even the most ambitious carbon tax is consistent with long-term positive economic growth, near baseline rates, not even counting the growth benefits of a less-disrupted climate or lower ambient air pollution
The last sentence is critical. The analyses consistently found that coal power plants would be the biggest losers if a carbon tax were implemented, and the costs associated with health impacts from other pollutants released by burning coal (e.g. soot and mercury) are substantial. Phasing out coal power plants results in significant health and economic benefits to society.
So does slowing global warming, of course. A working paper recently published by the Federal Reserve Bank of Richmond concluded that US economic growth would be 0.2–0.5% per year slower if we stay on our current climate path (3–3.5°C global warming) than if we meet the 2°C Paris target. This compares favorably to the less than 0.1% per year slowing of US economic growth under the carbon tax scenarios.
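To see why small annual growth-rate differences matter, one can compound them over the rest of the century. A back-of-envelope sketch using the figures quoted above:

```python
# Compound the annual growth-rate drags quoted in the text over roughly
# 2020-2100. A rough arithmetic illustration, not an economic model.

def cumulative_drag(annual_drag, years):
    """Fraction of GDP lost by the end, relative to a no-drag baseline."""
    return 1 - (1 - annual_drag) ** years

years = 80  # roughly 2020 to 2100
print(f"carbon tax (0.1%/yr):     {cumulative_drag(0.001, years):.1%}")  # 7.7%
print(f"climate path (0.5%/yr):   {cumulative_drag(0.005, years):.1%}")  # 33.0%
```

Even the upper-end climate drag of half a percent per year compounds to losing roughly a third of baseline GDP by 2100, dwarfing the carbon tax's effect.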
In short, climate change will slow American economic growth. If we don’t curb global warming, the economic impact will be larger. If we implement a carbon tax to help meet the Paris climate targets, the economic impact will be negligible, and will be offset by the benefits of phasing out dirty coal power plants.
Carbon taxes are effective at cutting pollution
The Stanford EMF studies also consistently concluded that a carbon tax is an effective way to curb carbon pollution, especially in the power sector:
carbon price scenarios lead to significant reductions in CO2 emissions, with the vast majority of the reductions occurring in the electricity sector and disproportionately through reductions in coal … Expected economic costs (not accounting for any of the benefits of GHG and conventional pollutant mitigation), in terms of either GDP or welfare, are modest
The analyses also found that the rate of increase of the carbon tax was more important than the starting price.
A chronological listing of news articles posted on the Skeptical Science Facebook Page during the past week.
Heat Records Falling Around the World in 2018
Above: A sampling of all-time high temperatures reported around the world in 2018 thus far, rounded to the nearest degree Fahrenheit. Most of these were set in late June and early July (see details below). The reading of 51.3°C (124.3°F) at Ouargla, Algeria, is the highest reliably measured temperature on record for Africa. Background image credit: NASA Earth Observatory.
The first five months of 2018 were the fourth warmest in global records going back to 1880, according to NOAA. Along the way, a number of extreme heat events have occurred already this year. In recent weeks across the Northern Hemisphere, these records have included an impressive number of all-time highs (an all-time high is the warmest temperature reported on any date at a given location).
Setting an all-time high is no small accomplishment, especially for locations that have long periods of record (PORs). All-time highs are especially noteworthy when you consider that, on average, the planet is warming more during winter than during summer, and more at night than during the day. Urban heat islands are no doubt contributing somewhat to the heat records achieved in large urban areas, but the extreme heat of 2018 has also played out in remote rural areas without any urban heat islands.
As of July 13, the U.S. Records summary page maintained by NOAA showed that 18 U.S. locations had set or tied all-time highs so far this year, as opposed to 10 locations that set or tied all-time lows. There is an even sharper contrast between the number of all-time warm daily lows (40) and all-time cool daily highs (5), which has been a common pattern in recent years.
Here is a summary of some of the more significant heat-related events of the year-to-date around the world, in chronological order. Note that in some cases, extremely high temperatures recorded in the early 20th century are not considered reliable because of instrument placement and/or observing practices (as was the case with the infamous and ultimately disqualified El Azizia world heat record). All of the all-time highs shown below are valid for the climatological records that are considered reliable at a given location. All records are shown in the units used locally, followed by conversions to Celsius or Fahrenheit. (The United States is the only major country on Earth that does not primarily use the metric system.)
Ocean waters are rising because of global warming. They are rising for two reasons. First, and perhaps most obvious, ice is melting. There is a tremendous amount of ice locked away in Greenland, Antarctica, and in glaciers. As the world warms, that ice melts and the liquid water flows to the oceans.
The other reason why water is rising is that warmer water is less dense – it expands. This expansion causes the surface of the water to rise.
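The expansion effect can be estimated with the back-of-envelope formula Δh ≈ α·ΔT·H. The expansion coefficient and warming-layer depth below are assumed, illustrative values, not results from any particular study:

```python
# Back-of-envelope thermosteric (expansion-driven) sea level rise:
# dh ~ alpha * dT * H. Both constants below are assumed, rough values.

ALPHA = 2e-4  # thermal expansion coefficient of seawater, 1/K (assumed)
H = 700.0     # depth of the warming layer, m (assumed)

def thermosteric_rise_cm(delta_t_k):
    """Sea level rise in cm from warming the upper layer by delta_t_k."""
    return ALPHA * delta_t_k * H * 100.0

# Warming the upper 700 m by half a degree expands the column by ~7 cm.
print(f"{thermosteric_rise_cm(0.5):.1f} cm")  # 7.0 cm
```

This is only one of the two contributions; melting land ice adds to it, which is why century-scale projections reach tens of centimetres or more.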
Rising oceans are a big deal. About 150 million people live within 1 meter (3 feet) of sea level. About 600 million live within 10 meters (33 feet) of sea level. As waters rise, these people will have to go somewhere. It is inevitable that climate refugees will have to move their homes and workplaces because of rising waters.
In some places, humans will be able to build sea walls to block off the water’s rise. But in many places, that won’t be possible. For instance, Miami, Florida, sits on porous bedrock that allows sea water to permeate through the soils. You cannot wall that off. In other places, sea walls would be prohibitively expensive.
It isn’t just the inevitable march of sea level that is an issue. Rising waters make storm surges worse. A great example is Superstorm Sandy, which hit the US East Coast in 2012 and caused approximately $65bn in damage. The cost was higher than it would have been without the sea level rise caused by global warming.
Climate scientists do their best to project how much and how fast oceans will rise in the future. These projections help city planners prepare future infrastructure. My estimate is that oceans will be approximately 1 meter higher in the year 2100; that is what our infrastructure should be prepared for. What I don’t know is how much this will cost us as a society.
A very recent paper was published that looked into this issue. The authors analyzed the cost of sea level rise if we limit the Earth to 1.5°C or 2°C warming. They also considered the future cost under “business as usual” scenarios.
What the authors found was fascinating. If humans take action to limit warming to 1.5°C, they estimate sea level will rise 52 cm by the year 2100. If humans hold global warming to 2°C, sea levels will rise by perhaps 63 cm by 2100.
The difference (11 cm) could cost $1.4 tn per year if no other societal adaptation is made. This is a staggering number and in itself, should motivate us to take action.
But the authors went further: they considered an even higher future temperature scenario (essentially business as usual). In that future, global flood costs would increase to a whopping $14tn per year.
In the study, the authors considered which countries and regions would suffer most. It turns out that upper-middle-income countries will be worst affected, particularly China. Higher-income countries have a slightly better prognosis because of their present flood protection standards. But make no mistake about it: we will all suffer, and the suffering will be very costly.
There are four important takeaways from this study.
How have we measured the temperature of the ocean’s upper layer in the last 150 years? How does understanding physical processes and observational errors help to standardise climate data and understand climate change?
Why do we measure ocean temperature?
Figure 1: Convective clouds over the Western Pacific. Source: NOAA.
In the tropical atmosphere, tall clouds such as those shown in Figure 1 are governed by sea surface temperature, and one of the surprising hypotheses of climate change (known as the Thermostat Hypothesis) states that they, in turn, limit the maximum temperatures above the ocean by reflecting sunlight. These kinds of feedback loops between ocean temperature and other meteorological and oceanographic features, such as clouds, atmospheric circulation, ocean currents, and precipitation, are the reason why measuring sea surface temperature is an important element of today’s climate-change debate.
Sea surface temperature (SST) is also one of the climate indices with the longest history of direct measurements. Because the ocean covers about 70% of Earth’s surface, changes in its surface temperature are a key factor in determining the global surface temperature of the planet.
Figure 2: Global changes in sea surface temperature based on two different climate databases, which only use temperature measurements above the ocean. The data are expressed as deviations from the 1961–1990 average. Source: Jones, 2016.
Figure 3: Global sea-surface temperature distribution on 13 December 2010 (La Niña) and on 3 December 2015 (El Niño). The data are based on measurements from the following instruments: the Advanced Very High Resolution Radiometer (AVHRR), the Moderate Resolution Imaging Spectroradiometer (MODIS) on the Terra and Aqua satellites, and the Advanced Microwave Scanning Radiometer-EOS (AMSR-E). Source: PODAAC.
Sea surface temperature is also important in forecasting many atmospheric phenomena. Figure 3 shows the sea surface temperature distribution in December 2010 and in December 2015. There are visible differences in the Pacific between these two years.
The late 2010 – early 2011 season (on the left) was a period of La Niña, and the late 2015 – early 2016 season (on the right) was a period of a strong El Niño. During El Niño, the water temperature off the coast of South America is higher than during La Niña. Not only does it influence precipitation in South America, but it also affects the weather in the middle latitudes.
The occurrence of El Niño or La Niña can be diagnosed by indices based on the measurements of the ocean’s upper layer temperature.
Many details of oceanic flows are visible in the satellite images of sea surface temperature. In Figure 4, you can see large areas of warm water in the western Pacific and in the eastern Indian Ocean, which lead to high precipitation in Indonesia. In the eastern Pacific, a narrow swath of lower temperatures is marked as a “Cold tongue.” This is where the so-called Tropical Instability Waves develop.
Figure 4: 24 August 2014. A huge pool of warm water is visible in the western Pacific. In the eastern Pacific, the “Cold tongue” of lower sea surface temperature and the developing disturbances known as the Tropical Instability Waves can be seen. Source: PODAAC.
Figure 5 shows the northern area of the Atlantic. In the blue frame, you can see the Gulf Stream, which carries warm water from the subtropics northwards, and then towards Europe, causing European winters to be warmer than in other areas on similar latitudes.
Figure 5: Sea surface temperature on 24 August 2014. In the blue frame, you can see the meandering Gulf Stream.
Figure 6: Sea surface temperature around the eye of hurricane Frances. You can see the “cold trail” left by the hurricane. Source: Pearn Niiler, UCSD.
Measuring the ocean’s upper layer temperature is also important in forecasting tropical cyclones. In Figure 6, which shows the sea surface temperature distribution around the eye of hurricane Frances, you can see that the eye moves towards warmer areas, leaving a “cold trail” behind. The existence of such trails means that hurricanes draw a lot of their energy from the ocean, converting it to the kinetic energy of winds through the condensation of water vapour. A warm anomaly of sea surface temperature under a moving storm can cause it to rapidly intensify. It is also possible (although still largely up for debate) that the rise of ocean temperatures caused by global warming can increase the number and intensity of hurricanes.
Figure 7: Definitions of different sea surface temperatures. SSTskin is measured in infrared, SSTsub-skin is measured in microwaves. Source: GHRSST.
But what exactly is sea-surface temperature? Before we move on to the measurement techniques, let us have a look at Figure 7, which illustrates the vertical profile of temperature in the ocean’s upper layer, up to a depth of around 10 metres (foundation depth). Figure 7 shows typical night-time and daytime temperature distributions. This “foundation” sea-surface temperature, marked with a blue star, is usually observed throughout the entire layer by the end of the night, and it is the parameter used to express the changes in sea-surface temperature.
However, as shown in Fig. 7, there are “different” sea surface temperatures, which depend on the measurement depth. At the interface of air and ocean, a white star marks SSTint temperature. It is usually a little lower than the temperature of the water directly beneath it. This is because energy is emitted through infrared radiation. SSTint is the temperature that would be observed at the very surface of the water, but its definition has mostly theoretical importance, because it is impossible to directly measure the temperature at this interface.
The red and yellow stars mark the sea surface temperatures measured at around 10 micrometres (for comparison, a human hair is about 60 micrometres thick) and a couple of millimetres from the air-water interface. It turns out that infrared electromagnetic radiation penetrates the ocean only to a very small depth. Microwave radiation reaches slightly deeper, up to a few millimetres. Water temperatures measured at these depths are called the skin and sub-skin sea surface temperatures, corresponding to infrared and microwave measurements respectively. They can be calculated indirectly from satellite measurements.
The green star represents sea-surface temperature as observed at a given depth. It can be measured, for example, using a bucket, a scoop, drifting buoys or water drawn from overboard to cool a ship’s engine. As measurements can be taken at different depths, it is generally denoted by SST with added information about the depth of measurement, e.g. SST(18cm).
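The different definitions can be summarised in a small lookup structure. The depths below are the indicative values mentioned in the text, not precise standards:

```python
# Summary of the sea-surface temperature definitions sketched in Figure 7,
# keyed by name, with the measurement source and an indicative depth in
# metres (values are the rough figures quoted in the text).

SST_DEFINITIONS = {
    "SSTint":      ("air-sea interface; theoretical only", 0.0),
    "SSTskin":     ("infrared satellite measurements", 10e-6),
    "SSTsub-skin": ("microwave satellite measurements", 1e-3),
    "SST(depth)":  ("in-situ: buckets, drifters, engine intakes", 0.18),
    "SSTfnd":      ("foundation: below the diurnal warm layer", 10.0),
}

for name, (source, depth_m) in SST_DEFINITIONS.items():
    print(f"{name:12s} {depth_m:10.2e} m  {source}")
```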
Sea surface temperature changes when the upper layer of the ocean absorbs sunlight. This leads to diurnal temperature variations: a warm layer is formed during the day, and usually disappears at the end of the night. The vertical profile of this warm layer is not yet thoroughly understood. Its structure depends on how transparent water is, which is linked to the amount of phytoplankton, the wind speeds at the ocean’s surface and the vertical flow of water. All throughout the night and day, energy is radiated from the ocean surface in infrared. Temperature is also governed by water evaporation.
In recent years, measurements from all techniques have been adjusted to determine the foundation sea surface temperature defined above. In other words, no matter what measurement techniques were used or when the observations were performed, we try to introduce systematic corrections so that all measurements share the same point of reference. In practice, these systematic corrections are sometimes very hard to implement, especially if the measurements were performed several decades ago.
Direct measurements – at first, we used wooden buckets
Buckets tied to ropes would be thrown off a ship, then pulled back up (Figure 8). Next, thermometers would be put inside the buckets to measure the water’s temperature. This is how measurements of sea surface temperature began around 150 years ago. The decision to start such measurements had been made during the International Maritime Conference in 1853, mainly to improve the safety of marine navigation. They were performed by voluntary observing ships during their regular voyages. As it turns out, this is currently the longest-running measurement series available for assessing changes in Earth’s surface temperature.
Figure 8: A wooden bucket, a canvas bucket and a modern German scoop [Carella 2017].
Other climate data and methods of evaluating changes in temperature are based on paleoclimatic indices – that is, information coming not from direct measurements, but from indirect assessments or numerical models. Thermometers in water buckets offer direct measurements, which is why they are so important.
Although a measurement protocol known as “the Abstract Log” had been in place since 1853, many different techniques were still used afterwards, making it difficult to standardise the results today. Measurement techniques have evolved over the 150 years: wooden buckets were replaced by canvas and rubber buckets; later, the temperature of water used to cool ships’ engines was measured; moored buoys have been used since 1971, and buoys drifting on the ocean’s surface since 1978. Today, satellite measurements are also in common use. At times, external circumstances changed – during WWII, European ships rarely performed measurements, so most of the data comes from American vessels. Furthermore, no one measured temperature during the night, so as not to reveal the ship’s position.
What happens to the water in a bucket after it is pulled on board? During the day, the Sun shines on the bucket and wind drives evaporation. Depending on the material the bucket is made of, the water temperature adjusts faster or slower toward the air temperature. Scientists have used various techniques to estimate how the temperature changes in a bucket of water standing on deck. One such method is a laboratory measurement (Figure 9). The results of such a procedure can be expressed through mathematical expressions that account for factors such as incoming solar radiation, loss of energy through infrared radiation, size and material of the bucket, wind speed, water mixing, and the initial differences in temperature and humidity between air and water. For example (Figure 10), if a measurement was performed 3 minutes after pulling the bucket up and the water was 5°C warmer than the air, the water would have cooled by 0.2°C; in other words, the reports systematically understate sea surface temperature. This “cold error” is now accounted for by correcting sea surface temperature measurements, which is one of the methods of standardising climate data.
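The bucket correction can be pictured as an exponential relaxation of the water temperature toward the air temperature, with a cooling rate that grows with wind speed. The sketch below is purely illustrative: the function name and all coefficients are invented, and the published models (such as the one behind Figure 10) also account for solar radiation, evaporation, bucket material and stirring.

```python
import math

def bucket_cooling(t_water, t_air, minutes, wind_speed, k_base=0.01):
    """Estimate water temperature after `minutes` on deck.

    Newton-style exponential relaxation toward air temperature; the
    per-minute rate k grows with wind speed (coefficients are made up).
    """
    k = k_base * (1.0 + 0.3 * wind_speed)  # assumed cooling rate, 1/minute
    return t_air + (t_water - t_air) * math.exp(-k * minutes)

# Example: water 5 °C warmer than air, read after 3 minutes in a 3.5 m/s wind.
t_measured = bucket_cooling(t_water=20.0, t_air=15.0, minutes=3, wind_speed=3.5)
cold_error = 20.0 - t_measured  # the amount the report understates SST
```

With these invented coefficients the bucket loses a few tenths of a degree in three minutes, the same order of magnitude as the cold error quoted above.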
Figure 9: Illustration of experimental measurements to assess temperature change inside various buckets and scoops. The procedure uses precision thermometers to measure air and water temperature, a ventilator and an automatic stirrer (Carella et al., 2017).
Figure 10: Measured water temperature (black, continuous line) vs. time and initial temperature difference between air and water. Case d1: water temperature is 5°C higher than air temperature; d2: water about 1°C colder than surrounding air; d3: water 5°C colder than air. Only results for weak stirring are presented. Wind speed is 3.5 m/s. Shading indicates model results. Source: Carella et al., 2017.
Direct measurements – drifters
If you throw a ball into a river, it will float downstream. In the late 1970s, the first buoys were built and deployed to drift in the ocean. Drifters, as they are called, consist of a sphere containing temperature-measuring equipment, batteries and a system for communicating with satellites passing overhead. From positions obtained by a GPS receiver, the buoy’s speed and direction of movement can be determined, and the data relayed via satellite let us estimate sea surface temperature at a depth of around 18 centimetres below the air-water interface.
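Recovering a drifter’s speed and direction from two successive GPS fixes is a standard spherical-Earth calculation. A minimal sketch (the function name and example positions are hypothetical):

```python
import math

def drift_velocity(lat1, lon1, lat2, lon2, hours):
    """Speed (m/s) and bearing (degrees clockwise from north) between
    two GPS fixes, using the haversine distance on a spherical Earth."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine great-circle distance
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    # Initial bearing from the first fix toward the second
    bearing = math.degrees(math.atan2(
        math.sin(dlam) * math.cos(p2),
        math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlam)))
    return dist / (hours * 3600.0), bearing % 360.0

# Example: a buoy drifting 0.1 degree east along the equator in 24 hours,
# roughly 11 km, i.e. a typical current speed of about 0.13 m/s due east.
speed, bearing = drift_velocity(0.0, 140.0, 0.0, 140.1, 24)
```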
A typical buoy is about 40 centimetres across and is tethered to something that resembles a “holey sock”: a 7-metre-long drogue cylinder that makes the buoy follow the velocities and directions of the water currents around 15 metres below the ocean’s surface. A large number of drifters released into the ocean approximately 500 kilometres apart can help to gauge sea surface temperature over vast areas.
In the last two decades, about 1300 buoys have been drifting on the ocean’s surface at any given time. Some of them provide information not only about ocean currents and water temperature, but also about surface pressure and/or water salinity. Sometimes drifters are deployed ahead of tropical cyclones to measure the hurricane’s cold wake, that is, the decrease in water temperature caused by the interaction between the ocean and the atmospheric circulation of the cyclone.
The fact that buoys drift, pushed by wind and ocean currents, is in itself not good for global temperature measurements, because after some time drifters tend to end up on a shore or clustered together. For this reason, a few dozen new drifters are deployed every year. To measure temperature, drifters typically use a small sensor: an epoxy-coated thermistor in a steel tube that protrudes from the bottom of the spherical buoy casing. The precision of such measurements is around 0.1°C, but because drifters spend a couple of years in the ocean, the absolute temperatures they report may drift systematically over time; such drifts are usually relatively minor. Surface buoys are an important source of global sea-surface temperature measurements because they can be deployed virtually anywhere, unlike ship-based measurements, which are generally confined to the sea routes between economically important regions.
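A thermistor reports temperature as an electrical resistance, which is commonly converted to degrees with the Steinhart-Hart equation. The sketch below uses generic textbook coefficients for a 10 kΩ NTC thermistor, not the calibration of any actual drifter sensor:

```python
import math

def thermistor_to_celsius(resistance_ohm,
                          a=1.129241e-3, b=2.341077e-4, c=8.775468e-8):
    """Convert an NTC thermistor resistance to temperature via the
    Steinhart-Hart equation: 1/T = a + b*ln(R) + c*ln(R)^3, T in kelvin.
    Default coefficients are generic values for a 10 kOhm thermistor."""
    ln_r = math.log(resistance_ohm)
    inv_t = a + b * ln_r + c * ln_r ** 3  # 1/T in 1/kelvin
    return 1.0 / inv_t - 273.15

# With these generic coefficients, 10 kOhm corresponds to about 25 °C.
temp_c = thermistor_to_celsius(10000.0)
```

In practice each sensor is individually calibrated, and the slow systematic drifts mentioned above are what the correction schemes for long-deployed buoys try to remove.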
We should stop to think about what exactly “global temperature measurements” are. On land, temperature measurement stations are placed about 300 kilometres apart, and measurements are performed at so-called synoptic hours – simultaneously throughout the world. The word “synoptic” derives from Greek words meaning “together with” and “view,” which is why in oceanography and meteorology, it means simultaneous measurements. Due to the vastness of the global oceans, taking simultaneous measurements every 300 kilometres has always been difficult. Drifters allowed us to obtain a more “synoptic” and global image of water temperature.
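One way to picture how scattered drifter reports become a “synoptic” global field is to bin observations into latitude-longitude boxes and average within each box. A minimal sketch, with an invented box size and made-up observations:

```python
from collections import defaultdict

def grid_average(observations, box_deg=3.0):
    """Average scattered SST reports into lat/lon boxes of `box_deg` degrees.

    `observations` is a list of (lat, lon, sst) tuples; returns a dict
    mapping (lat_box_index, lon_box_index) -> mean SST in that box.
    """
    sums = defaultdict(lambda: [0.0, 0])
    for lat, lon, sst in observations:
        key = (int(lat // box_deg), int(lon // box_deg))
        cell = sums[key]
        cell[0] += sst
        cell[1] += 1
    return {key: total / count for key, (total, count) in sums.items()}

# Example: three drifter reports, two of which fall in the same 3-degree box.
obs = [(10.2, 140.5, 28.1), (11.0, 140.9, 28.5), (20.4, 150.0, 25.0)]
grid = grid_average(obs)
```

Real analysis products use far more sophisticated interpolation and error weighting, but the basic step of reducing irregular observations to a regular grid is the same.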
Figure 12: Drifting buoy (white-and-blue) with a red “sock.”
Figure 13: Drifting and moored buoy distribution in July 2017. Source: JCOMMOPS.
Buoy array in equatorial regions
One of the most important achievements of the last two decades was establishing a measurement array of moored buoys (see Figure 14) throughout the equatorial regions of the Pacific, the Indian Ocean and the Atlantic (TAO/TRITON, RAMA and PIRATA). They are used not only to measure sea surface temperature, but also for measurements below the surface and for atmospheric measurements. The buoys are deployed at various latitudes between 10° S and 10° N. Moored buoys play an important role in calibrating satellite instruments. On the other hand, they require relatively frequent and expensive maintenance: a ship needs to visit them every dozen or so months to replace the instruments. Due to these costs, the future of this network is currently uncertain.
Figure 14: Photograph of a buoy used in the TAO array in the equatorial Pacific. Source: PMEL.
Figure 15: Map of the TAO array of moored buoys in the Pacific. Source: PMEL.
In the second part of the article, you will read about the specifics of satellite measurements and about a network of autonomous diving measurement buoys. See you next time!
The original article was written by Piotr J. Flatau from the University of California San Diego (UCSD) for our Polish partner site Nauka o Klimacie and Juliusz P. Braun translated it into English. You can access the original article by clicking on the logo:
Story of the Week... Toon of the Week... Quote of the Week... Coming Soon on SkS... Climate Feedback Reviews... Poster of the Week... SkS Week in Review...
Story of the Week...
Global warming may be twice what climate models predict
Sunset. Credit: Patrik Linderstam, Unsplash
Future global warming may eventually be twice as warm as projected by climate models and sea levels may rise six metres or more even if the world meets the 2°C target, according to an international team of researchers from 17 countries.
The findings published last week in Nature Geoscience are based on observational evidence from three warm periods over the past 3.5 million years when the world was 0.5°C-2°C warmer than the pre-industrial temperatures of the 19th Century.
The research also revealed how large areas of the polar ice caps could collapse and significant changes to ecosystems could see the Sahara Desert become green and the edges of tropical forests turn into fire dominated savanna.
“Observations of past warming periods suggest that a number of amplifying mechanisms, which are poorly represented in climate models, increase long-term warming beyond climate model projections,” said lead author, Prof Hubertus Fischer of the University of Bern.
“This suggests the carbon budget to avoid 2°C of global warming may be far smaller than estimated, leaving very little margin for error to meet the Paris targets.”
Quote of the Week...
The dirty little secret behind 'clean energy' wood pellets
Wood pellets will be counted as renewable energy, the EPA administrator, Scott Pruitt, has said, even though the EPA’s scientific board is still working on its advice on their environmental impact. Photograph: Alamy Stock Photo
“Philosophically it looks good but practically it looks pretty bad in many cases,” said William Schlesinger, a biogeochemist and member of the US Environmental Protection Agency advisory board.
“When you cut down existing trees and burn them, you immediately put carbon dioxide in the air. None of the companies can guarantee they can regrow untouched forest to capture the same amount of carbon released. The whole renewable forest industry is kind of a hoax in terms of its benefit as climate mitigation.”
Schlesinger added, however, that burning wood can result in lower emissions than coal if managed and certified properly and could be used as a “bridge fuel” as solar and wind energy continues to expand.
Scott Pruitt, the administrator of the EPA, recently announced that wood pellets will be classified as renewable energy similar to solar or wind power.
This has caused alarm among some experts, including those on the EPA’s own scientific board, which is still working on its own advice on the environmental impact of burning wood to generate energy. “Pruitt announcing that before we weighed in was appalling – frankly it was insulting to our existence,” said Schlesinger.
“If you burn young trees and regrow them, it might not be too bad. If you venture into older trees or forests that have never been cut before, that can be very bad.”
Six scientists analyzed the article and estimate its overall scientific credibility to be 'neutral'.
This article in the Daily Mail describes recent record warmth occurring in a number of different places. But while the article is correct to note that climate change results in more frequent and stronger heatwaves, scientists who reviewed it found that the article (and particularly the headline) is not entirely accurate in its explanation of how these heatwaves relate to climate change.
A chronological listing of news articles posted on the Skeptical Science Facebook Page during the past week.
Climate change will get a whole lot worse before it gets better, according to game theory
A firefighter douses flames from a backfire in San Andreas, California, Getty Images /Josh Edelson
It’s going to get a lot worse before it gets any better. According to new research published in Nature, humanity will witness marked sea level rises and frequent killer heatwaves before governments take decisive action against climate change. And to predict the future, mathematicians have turned to game theory.
The paper, published by a team of mathematicians, uses game theory to explain why it is so hard to protect the environment, updating the theory to model the effects of climate change, overuse of precious resources and pollution of pristine environments.
The bad news is that the model suggests that, when it comes to climate change, things might have to get demonstrably worse before they can get better. The good news, on the other hand, is that game theory could help policymakers to craft new and better incentives to help nations cooperate in international agreements.
Taking action on climate is about a lot more than our energy economy. Climate disruption is the leading threat to our built environment, an accelerant of armed conflict, and a leading cause of mass migration. Its effects intensify and prolong storms, droughts, wildfires, and floods — resulting in the US spending as much on disaster management in 2017 as in the three decades from 1980 to 2010.
Out of control wildfire approaching Estreito da Calheta, Portugal. September 2017. Photograph: Michael Held
Fiscal conservatism and national security require a smart, focused, effective solution that protects our economy and our values.
Political division between the major parties in Washington has left the burden of achieving that solution largely on Democratic administrations using regulatory measures that — for all their smart design and ambition — cannot be transformational enough to carry us through to a livable future.
Conservatives say the nation needs an insurance policy. Business leaders want to future-proof their operations and investments. Young people are demanding intervention on the scale of the Allies’ efforts to rebuild Europe after World War II.
The International Monetary Fund — whose mission is to ensure national dysfunction doesn’t undermine the solvency of public budgets and lead to failed states — warns that nations that depend heavily on publicly subsidized fossil fuels are endangering their future solvency by investing in a way that destroys future economic resilience. Resilience intelligence requires diversification and innovation on a massive scale.
The rapid expansion of green bonds is making clear the deep need for clean economy holdings among major banks and institutional investors. Climate-smart finance, still a new concept, is expected to be the standard for both public and private-sector actors at all levels within 10 to 20 years.
Main Street economies suffer when too much of the money in circulation flows to finance, without clear incentives to lend to small businesses. The steadily rising monthly carbon dividend makes sure more of the money in circulation flows through small businesses, locking in that incentive and making the whole economy more efficient at creating wealth for the average household. Photograph: Joseph Robertson
Unpaid-for pollution and climate disruption limit our personal freedom and then, by adding cost and risk to the whole economy, undermine our collective ability to defend our freedom and secure future prosperity. Even with record oil and gas production, the US still depends heavily on foreign regimes hostile to democracy that manipulate supply and undermine the efficiency of our everyday economy.
A study by Regional Economic Models, Inc., which modeled the interacting economy-wide impacts of monthly household carbon dividends, found real disposable personal income rising for at least 20 years after the first dividends show up in the mail. Details at http://CitizensClimateLobby.org/REMI-Report. Illustration: Regional Economic Models, Inc.
Ask any small business owner if they would rather have higher or lower hidden business costs built into everything they buy from their suppliers. Of course, they would prefer lower hidden costs and risks, and for consumers to have more money in their pockets.
That is how carbon dividends work.
A simple, upstream fee is paid at the source by any entity that wants to sell polluting fuels that carry such hidden costs and risks. This is administratively simple, light-touch, economy-wide, and fair to all.
100% of the revenues from that fee are returned to households in equal shares, every month. This ensures the Main Street economy keeps humming along.
Because both the fee and the dividend steadily rise, pollution-dependent businesses — and the banks that finance them — can see the optimal rate of innovation and diversification to liberate themselves from the subsidized pollution trap. The whole economy becomes more competitive and more efficient at delivering real-world value to Main Street.
To ensure energy intensive trade-exposed industries are not drawn away by other nations keeping carbon fuels artificially cheap, a simple border carbon adjustment ensures a level playing field, while adding negotiating power to US diplomatic efforts, on every issue everywhere.
Climate change will increase ice shelf melt rates around Antarctica. That’s the not-very-surprising conclusion of my latest modelling study, done in collaboration with both Australian and German researchers, which was just published in the Journal of Climate. Here’s the less intuitive result: much of the projected increase in melt rates is actually linked to a decrease in sea ice formation.
That’s a lot of different kinds of ice, so let’s back up a bit. Sea ice is just frozen seawater. But ice shelves (as well as ice sheets and icebergs) are originally formed of snow. Snow falls on the Antarctic continent, and over many years compacts into a system of interconnected glaciers that we call an ice sheet. These glaciers flow downhill towards the coast. If they hit the coast and keep going, floating on the ocean surface, the floating bits are called ice shelves. Sometimes the edges of ice shelves will break off and form icebergs, but they don’t really come into this story.
Climate models don’t typically include ice sheets, or ice shelves, or icebergs. This is one reason why projections of sea level rise are so uncertain. But some standalone ocean models do include ice shelves. At least, they include the little pockets of ocean beneath the ice shelves – we call them ice shelf cavities – and can simulate the melting and refreezing that happens on the ice shelf base.
We took one of these ocean/ice-shelf models and forced it with the atmospheric output of regular climate models, which periodically make projections of climate change from now until the end of the century. We completed four different simulations, consisting of two different greenhouse gas emissions scenarios (“Representative Concentration Pathways” or RCPs) and two different choices of climate model (“ACCESS 1.0”, or “MMM” for the multi-model mean). Each simulation required 896 processors on the supercomputer in Canberra. By comparison, your laptop or desktop computer probably has about 4 processors. These are pretty sizable models!
In every simulation, and in every region of Antarctica, ice shelf melting increased over the 21st century. The total increase ranged from 41% to 129% depending on the scenario. The largest increases occurred in the Amundsen Sea region, marked with red circles in the maps below, which happens to be the region exhibiting the most severe melting in recent observations. In the most extreme scenario, ice shelf melting in this region nearly quadrupled.
So what processes were causing this melting? This is where the sea ice comes in. When sea ice forms, it spits out most of the salt from the seawater (brine rejection), leaving the remaining water saltier than before. Salty water is denser than fresh water, so it sinks. This drives a lot of vertical mixing, and the heat from warmer, deeper water is lost to the atmosphere. The ocean surrounding Antarctica is unusual in that the deep water is generally warmer than the surface water. We call this warm, deep water Circumpolar Deep Water, and it’s currently the biggest threat to the Antarctic Ice Sheet. (I say “warm” – it’s only about 1°C, so you wouldn’t want to go swimming in it, but it’s plenty warm enough to melt ice.)
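The density argument can be made concrete with a deliberately simplified linear equation of state; the function name and coefficients below are rough illustrative values, not a real oceanographic formula such as TEOS-10:

```python
def seawater_density(temp_c, salinity_psu):
    """Very simplified linear equation of state for seawater (kg/m^3).
    Reference values and expansion coefficients are rough illustrative
    numbers, chosen only to show the sign of each effect: density falls
    with temperature and rises with salinity."""
    rho0, t0, s0 = 1027.0, 10.0, 35.0  # reference density, temperature, salinity
    alpha = 0.15  # thermal expansion, kg/m^3 per degree C (assumed)
    beta = 0.78   # haline contraction, kg/m^3 per psu (assumed)
    return rho0 - alpha * (temp_c - t0) + beta * (salinity_psu - s0)

# Brine rejection: sea ice formation leaves the surface water saltier,
# and hence denser, than before, so it sinks and drives vertical mixing
# that vents the heat of the warmer deep water to the atmosphere.
fresh_surface = seawater_density(-1.8, 34.0)
salty_surface = seawater_density(-1.8, 35.5)  # after brine rejection
warm_deep = seawater_density(1.0, 34.7)       # "warm" Circumpolar Deep Water
```

Even with these crude numbers, the brine-enriched surface water comes out denser than both its fresher neighbour and the warm deep water, which is why it sinks and stirs the column.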
In our simulations, warming winters caused a decrease in sea ice formation. So there was less brine rejection, causing fresher surface waters, causing less vertical mixing, and the warmth of Circumpolar Deep Water was no longer lost to the atmosphere. As a result, ocean temperatures near the bottom of the Amundsen Sea increased. This better-preserved Circumpolar Deep Water found its way into ice shelf cavities, causing large increases in melting.
This link between weakened sea ice formation and increased ice shelf melting has troubling implications for sea level rise. Unfortunately, models like the one we used for this study can’t actually be used to simulate sea level rise, as they have to assume that ice shelf geometry stays constant. No matter how much ice shelf melting the model simulates, the ice shelves aren’t allowed to thin or collapse. Basically, this design assumes that any ocean-driven melting is exactly compensated by the flow of the upstream glacier such that ice shelf geometry remains constant.
Of course this is not a good assumption, because we’re observing ice shelves thinning all over the place, and a few have even collapsed. But removing this assumption would necessitate coupling with an ice sheet model, which presents major engineering challenges. We’re working on it – at least ten different research groups around the world – and over the next few years, fully coupled ice-sheet/ocean models should be ready to use for the most reliable sea level rise projections yet.