Research shows that African-American women and the uninsured are four times more likely to lose their jobs after a diagnosis of breast cancer, despite its high survival rates.
Job loss following early-stage breast cancer diagnosis is associated with race and insurance status, but not with any clinical or treatment-related factors, the new study suggests.
Further, African-American patients and those without private insurance were also more likely to return to a lesser job within the first two years of being cancer-free.
“We examined post-treatment employment patterns in early-stage breast cancer patients,” says Christine Ekenga, assistant professor at the Brown School at Washington University in St. Louis and lead author of the paper in Cancer.
“During the two-year follow-up, African-American patients were four times more likely to leave the workforce than white patients, and uninsured/publicly insured patients were 4.7 times more likely to leave the workforce than privately insured patients.”
As part of a larger cohort study of 1,096 patients with early-stage breast cancer and similarly aged women without breast cancer, the researchers analyzed data from 723 working-age women, ages 40 to 64 (347 patients and 376 controls), to evaluate four employment trajectories: sustained unemployment, diminished employment, emerging employment, and sustained employment.
The study focused on patients with early-stage breast cancer, a population that has an excellent prognosis for disease-free survival.
Fatigued patients, African-American patients, and publicly insured or uninsured patients were more likely to experience diminished employment after two years of follow-up.
The study collected information about participants’ inability to work, but participants were not asked why they were unable to work. A follow-up study aims to address this question.
Although cancer survivors are more likely to report unemployment than individuals without a cancer history, working after diagnosis may represent a return to normalcy for some patients with breast cancer, Ekenga writes in the paper.
“In addition to the added benefit of employer-sponsored health insurance, paid employment has the potential to mitigate the financial stresses associated with cancer.
“Moreover, for women with breast cancer, employment could play a significant role in post-diagnostic health. Health benefits associated with employment include an increased sense of purpose, higher self-esteem, and a stronger sense of social support from others, all of which have been associated with improved quality of life.”
The National Cancer Institute and the Breast Cancer Stamp Fund; the National Institute for Occupational Safety and Health; the National Cancer Institute Postdoctoral Training in Cancer Prevention and Control; and the Cancer Center Support Grant funded the work.
When an invasive rose bush dominates urban parks, ticks there are twice as likely as ticks in uninvaded forest fragments to carry the bacterium that causes Lyme disease, research shows.
But outdoor enthusiasts can’t dodge disease just by staying away from thick stands of invasive plants within forest areas. The trend reverses at a broader scale, when you compare invaded forests to forests that mature trees dominate.
A rich leaf layer on the forest floor is a welcome mat for ticks—infected and not.
“While rose appears to increase B. burgdorferi transmission by bringing together ticks and infectious host animals, this is not the whole story,” says Solny Adalsteinsson, a staff scientist at Tyson Research Center of Washington University in St. Louis who conducted the research in New Castle County, Delaware. That state has one of the nation’s highest rates of Lyme disease per capita.
Instead, Adalsteinsson says, there are far more ticks tramping around in forests without the invasive rose, Rosa multiflora.
The difference likely results from the amount of leaf litter on the ground. Ticks live for up to two years, and they need places to hide when they’re not actively downing their blood meals. Forests with mature trees, aged more than 100 years, tend to have thick layers of fallen leaves on the ground, but the soil is bare in many younger forest fragments that are choked with invasive bushes.
When spending time outdoors, learn to recognize the conditions that make for good tick habitat—including bushy, invasive undergrowth that could provide hiding places for mice, birds, and other tick hosts. Those thickets could be local hot spots for disease. But also know that a rich leaf layer on the forest floor is a welcome mat for ticks—infected and not.
“Because there are so many more ticks in the uninvaded forest, even though the ticks are less likely to be carrying a pathogen, your overall chance of encountering an infected tick is going to be greater in an uninvaded forest fragment,” Adalsteinsson says. “It’s a sheer numbers game at that point.”
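The “numbers game” Adalsteinsson describes is simple expected-value arithmetic: your expected encounters with infected ticks scale with tick density times infection prevalence. A minimal sketch, using made-up densities and prevalences purely for illustration (the study reports prevalence roughly doubling in invaded patches, not these exact figures):

```python
def expected_infected(tick_density, infection_prevalence):
    """Expected infected ticks encountered per unit of habitat sampled."""
    return tick_density * infection_prevalence

# Hypothetical numbers for illustration only:
# invaded thicket: fewer ticks overall, but double the infection rate.
invaded = expected_infected(tick_density=10, infection_prevalence=0.40)
# uninvaded mature forest: far more ticks, half the infection rate.
uninvaded = expected_infected(tick_density=50, infection_prevalence=0.20)
```

With these hypothetical numbers, the uninvaded forest yields 10 expected infected ticks per sampled unit versus 4 in the thicket, even though each individual tick there is only half as likely to be infected.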
Christians who are comparatively well represented in the medical field, such as those who are Korean-American, understand the relationship between faith and health differently than those who are not, such as African-Americans and Latinos, a new study suggests.
Researchers found that 80 percent of black and Latino Americans interviewed in the study said they believe in the potential healing power of religious faith, while nearly two-thirds of Korean-Americans interviewed said that a religious environment mainly provides support for individuals with regard to health decisions, but made few mentions of prayer or divine healing.
“Every time that I’m sick, I believe that God can heal my sickness…”
In the study, the researchers examined views on the relationship between faith and health for two groups that are overrepresented in American Christianity and underrepresented in medical careers (African-Americans and Latinos), as well as the views of a group that is similarly religious but comparatively well represented in medical professions (Korean-Americans).
The researchers say they were motivated to pursue this research by the growing number of studies on collaborations between churches and health care providers. Such partnerships are often established without consideration of how racial representation in medical professions might shape distrust of medicine in religious communities.
“Each of the groups emphasized the prevalence of health initiatives already taking place in their congregations, ranging from exercise classes to informational seminars,” says coauthor Elaine Howard Ecklund, founding director of Rice University’s Religion and Public Life Program. “But while each group expressed optimism about potential partnerships between churches and medical providers, the groups differed in their views on the relationship between faith and health.”
While the majority of blacks and Latinos interviewed expressed confidence in the potential healing power of Christianity, most Koreans interviewed said that a religious environment can provide support for individuals with regard to making health decisions (such as which doctor to visit for a specific condition), but they did not often mention prayer or divine healing.
Excerpts from interviews with each group are included in the paper.
“Every time that I’m sick, I believe that God can heal my sickness,” said one member of a Latino church. Another member said, “I trust medicine a lot. But I think my first choice is God.”
Ecklund says that the view of God as the creator of science helped congregants substantiate trust in medicine. One African-American church member said, “I think that God gives us access to certain things to help us to be better, to serve him more.”
This did not preclude mistrust on the part of the interview respondents, however, coauthor Cleve Tinsley notes.
“Narratives regarding distrust of the medical community arose almost exclusively among African-American respondents. The Tuskegee syphilis experiment seemed to have a notable legacy within this community’s cultural memory, as the experiment came up unsolicited,” says Tinsley, a PhD student at Rice.
Korean-American respondents often saw the benefits of religion as practical: they tended to downplay the efficacy of prayer while highlighting the support of the religious community, support that often came from medical professionals within the church.
One Korean-American surveyed said about the relationship between faith and health: “I think mostly in terms of just realizing that (congregation members are) not alone, that there is a community out there that will go through it with you type of thing. More of a support, I suppose.”
Another respondent said that while faith “plays a tremendous role” in coping with the stress that is related to health issues, it should not necessarily be the “primary way to deal with an actual ailment.”
The paper includes interviews with 19 church leaders representing 18 different organizations and 28 congregation members as well as observations from three different Christian congregations. Researchers selected the respondents from the Religion, Inequality, and Science Education project, a larger study exploring how minority Christian congregations view science and medicine.
Stress a muscle and it gets stronger. Mechanically stress a new rubbery material—say with a twist or a bend—and it automatically stiffens by up to 300 percent, the engineers say.
In lab tests, mechanical stresses transformed a flexible strip of the material into a hard composite that can support 50 times its own weight.
…the new material could be used in medicine to support delicate tissues or in industry to protect valuable sensors.
This new composite material doesn’t need outside energy sources such as heat, light, or electricity to change its properties. And it could be used in a variety of ways, including applications in medicine and industry.
The researchers found a simple, low-cost way to produce particles of undercooled metal—that’s metal that remains liquid even below its melting temperature. Researchers created the tiny particles (they’re just 1 to 20 millionths of a meter across) by exposing droplets of melted metal to oxygen, creating an oxidation layer that coats the droplets and stops the liquid metal from turning solid. They also found ways to mix the liquid-metal particles with a rubbery elastomer material without breaking the particles.
When this hybrid material is subject to mechanical stresses—pushing, twisting, bending, squeezing—the liquid-metal particles break open. The liquid metal flows out of the oxide shell, fuses together, and solidifies.
“You can squeeze these particles just like a balloon,” says lead author Martin Thuo, assistant professor of materials science and engineering at Iowa State University. “When they pop, that’s what makes the metal flow and solidify.”
The result, lead author Michael Bartlett says, is a “metal mesh that forms inside the material.”
Thuo and Bartlett, also an assistant professor of materials science and engineering at Iowa State, say the popping point can be tuned to make the liquid metal flow after varying amounts of mechanical stress. Tuning could involve changing the metal used, changing the particle sizes, or changing the soft material.
In this case, the liquid-metal particles contain Field’s metal, an alloy of bismuth, indium, and tin. But Thuo says other metals will work, too.
“The idea is that no matter what metal you can get to undercool, you’ll get the same behavior,” he says.
The engineers say the new material could be used in medicine to support delicate tissues or in industry to protect valuable sensors. There could also be uses in soft and bio-inspired robotics or reconfigurable and wearable electronics.
“A device with this material can flex up to a certain amount of load,” Bartlett says. “But if you continue stressing it, the elastomer will stiffen and stop or slow down these forces.”
The Iowa State University Research Foundation is working to patent the material, which is available for licensing. Iowa State startup funds for Thuo and Bartlett supported development of the new material. Thuo’s faculty fellowship also helped support the project.
A new technology platform lets scientists systematically modify and customize bacteriophages, viruses that can attack and kill specific bacteria.
These “phages” occur everywhere in the natural world. Precisely because they are matched to just one specific type of bacteria, researchers and medics hope that phages can be engineered to combat certain bacterial infections. For example, the food industry is already using phages to destroy pathogens in food products.
However, genetically engineering phages in order to customize them for specific uses continues to be a very challenging and time-consuming process. It is particularly difficult to modify phages to combat Gram-positive bacteria such as Staphylococcus. Incorporating a synthetic phage genome into Gram-positive bacteria has so far been very problematic, as their cell walls are so thick.
As reported in PNAS, however, the new platform enables scientists to genetically modify phage genomes systematically, provide them with additional functionality, and finally reactivate them in a bacterial “surrogate”—a cell-wall deficient Listeria cell, or L-form.
The new phage workbench lets scientists create such viruses very quickly, and the “toolbox” is extremely modular: it allows the researchers to build almost any bacteriophage for different purposes, with a great variety of functions.
“Previously it was almost impossible to modify the genome of a bacteriophage,” says team leader Martin Loessner, professor of food microbiology at ETH Zurich. The methods were also very inefficient. For example, a gene was only integrated into an existing genome in a tiny fraction of the phages. Isolating the modified phage was therefore often like searching for a needle in a haystack.
“In the past we had to screen millions of phages and select those with the desired characteristics. Now we are able to create these viruses from scratch, test them within a reasonable period, and if necessary modify them again,” Loessner stresses.
Rupture and collect
Samuel Kilcher, a specialist in molecular virology, used synthetic biology methods to plan the genome of a bacteriophage on the drawing board and assemble it in a test tube from DNA fragments. At the same time new, additional functions were incorporated in the phage genome, such as enzymes to dissolve the bacterial cell wall.
In addition, Kilcher is able to remove genes that give a phage unwanted properties, such as the integration into the bacterial genome or the production of cytotoxins.
In order to reactivate a phage from synthetic DNA, the genome was introduced into spherical, cell wall-deficient but viable forms of the Listeria bacterium (L-form Listeria). Based on the genetic blueprint, these bacterial cells then produce all the components of the desired phage and ensure that the virus particles are assembled correctly.
The researchers also discovered that spherical Listeria cells are not only capable of creating their own specific phages, but also those able to attack other bacteria. Usually, a host only generates its own specific viruses. L-form Listeria are therefore suitable as a virtually universal incubator for bacteriophages. If the Listeria cells are then brought to the point where they rupture (lysis), the bacteriophages are released and can be isolated and multiplied for use in therapy or diagnostics.
“A key prerequisite for using effective synthetic bacteriophages is that their genome is unable to integrate into the host’s genome,” Kilcher emphasizes. If this happens, the virus no longer presents a threat to the bacterium. Using this new method, however, the scientists were able to simply reprogram such integrative phages so that they become interesting again for antibacterial applications.
What about resistance? Or escape?
The two researchers are not particularly worried about potential resistances against the phages. And even if there were any, for example due to a bacterium changing its surface structures to prevent the virus from attaching, the new technology makes it possible to develop a suitable phage against which a bacterium has not yet developed resistance.
The researchers also think the danger of unintended release is very small: because the bacteriophages—both natural and synthetic—are extremely host-specific, they cannot survive for long without their host. This high specificity also prevents the bacteriophages from switching to a new host bacterium. “Adapting to the surface structure of a different host would take an awful long time in nature,” Loessner says.
The researchers have applied for a patent for their technology. Now they hope to find licensees to produce the phages for therapy and diagnostics.
Hurricanes aren’t to blame for most of the large storm surges in the northeastern United States, a new study indicates.
Instead, extratropical cyclones, including nor’easters and other non-tropical storms, generate most of the large storm surges in the Northeast, according to the new study in the Journal of Applied Meteorology and Climatology. They include a freak November 1950 storm and devastating nor’easters in March 1962 and December 1992.
Researchers found intriguing trends after searching for clusters of, or similarities among, storms, says study coauthor Anthony J. Broccoli, chair of the environmental sciences department in the School of Environmental and Biological Sciences at Rutgers University. It’s a new way of studying atmospheric circulation.
Understanding the climatology of storm surges driven by extratropical cyclones is important for evaluating future risks, especially as sea-level rise continues, the researchers say.
“The clusters are like rough police artist sketches of what surge-producing storms look like,” Broccoli says. “Like facial recognition software, clustering is trying to find storms that look like one another.”
“We wanted to understand the large-scale atmospheric circulation associated with storm surges,” says Arielle J. Catalano, the study’s lead author and a doctoral student in the graduate program in atmospheric science at Rutgers–New Brunswick. “It’s an atmospheric approach to the surge-producing storms.”
The study covered the 100 largest storm surges driven by extratropical cyclones at Sewells Point in Norfolk, Virginia; the Battery in southern Manhattan in New York City; and Boston, Massachusetts. It excluded hybrid systems, like Superstorm Sandy, that shifted from tropical to non-tropical or were tropical up to 18 hours before peak surges.
The scientists examined tide gauge records from the early 20th century through 2010. They analyzed atmospheric circulation during storms to look for clusters, and studied climate variability patterns that influenced circulation in the Northeast. They also looked at the probability of surges linked to much larger-scale atmospheric patterns that cover vast areas.
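As a rough illustration of the clustering step (not the study’s actual data or algorithm: the pressure fields and the minimal k-means below are invented for the sketch), grouping storms by the similarity of their circulation patterns might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: each "storm" is a flattened sea-level
# pressure field (a 4x4 grid of values, in hPa) at the time of peak surge.
# Two synthetic regimes mimic distinct circulation patterns.
storms = np.vstack([
    rng.normal(1000, 2, (20, 16)),  # deep-low, nor'easter-like fields
    rng.normal(1015, 2, (20, 16)),  # higher-pressure fields
])

def kmeans(X, k=2, iters=25):
    """Minimal k-means: group storms whose pressure fields look alike."""
    # Deterministic init: spread initial centers across the dataset.
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].astype(float)
    for _ in range(iters):
        # Assign each storm to its nearest cluster center.
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of its assigned storms.
        centers = np.stack([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(storms)
```

With two well-separated synthetic regimes, the 20 storms drawn from each regime end up sharing a cluster label; finding such groups of look-alike circulation patterns is what the “facial recognition” analogy describes.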
They found that the biggest surges develop when slowly moving extratropical cyclones (low pressure systems) encounter a strong anticyclone, or high-pressure system. That scenario leads to a tighter pressure gradient (the contrast between low and high pressure) and longer-lasting onshore winds, the study says.
This favorable environment for large storm surges is influenced by large-scale atmospheric patterns, including El Niño, the Arctic Oscillation, the North Atlantic Oscillation, and the Pacific-North American pattern.
Though Superstorm Sandy in 2012 led to the largest storm surge on record at the Battery, extratropical cyclones spawned 88 of the 100 largest surges there.
The November 1950 “Great Appalachian Storm,” with wind gusts exceeding 140 mph in the mid-Atlantic region, generated the highest extratropical cyclone surge at the Battery: nearly 7.9 feet. That’s only 20 percent smaller than Sandy’s surge—13 percent smaller if sea-level rise is not considered, the study says.
The water level during the 1950 storm was lower than during Sandy because the surge peaked at close to low tide. Future extratropical cyclones could cause Sandy-like flooding and coastal damages.
At Sewells Point, the highest surge was 5.4 feet in November 2009, while the highest surge at Boston was nearly 6.3 feet in February 2010. Of the 100 largest surges at those locations, extratropical cyclones were responsible for 71 at Sewells Point and 91 at Boston.
“The elephant in the room is sea-level rise,” Broccoli says. “That will likely matter more than how storms may change in the future, but what happens will be a combination of the two factors.”
A protein called RIPK3 could soon help minimize the impact of the influenza season and become a critical player in the fight against lung infections, according to new research.
Each year, influenza kills half a million people globally, with the elderly and very young most often the victims. In fact, the Centers for Disease Control and Prevention reported that 37 children had died in the United States during the current flu season. Aside from getting the flu shot and practicing smart hand hygiene, there are no other methods of prevention.
Maziar Divangahi’s lab has taken on the challenge of trying to understand how the mechanisms of the immune system fight the flu in the hopes of finding new immunotherapies to combat the virus.
The discovery of the RIPK3 protein, which is involved in the regulation of immune response to the flu, means help may be on the horizon.
‘A helpful sidekick’
The researchers have long been interested in understanding the immune system in relation to influenza. They already knew a type of protein called type I IFN (produced by macrophages, a type of white blood cell) stimulates cells to block virus production. In the case of the flu, type I IFN helps in restricting the replication of the flu virus in our lungs.
But the question remained: how did the type I IFN networks operate and what mechanisms were involved to promote their efficacy?
“…by understanding exactly how RIPK3 works to boost IFN’s potency we can look at avenues in the manufacturing of anti-flu drugs.”
The team uncovered an exciting and surprising function of the RIPK3 protein, which sits in the cytoplasm of cells, including macrophages. Most past studies found RIPK3 to be involved in a form of cell death, but in the case of flu-infected macrophages, RIPK3 behaves differently.
It turns out that when it comes to the flu, RIPK3 actually functions as a helpful sidekick to type I IFN’s pathway by increasing IFN production, thereby helping to block the replication of the influenza virus. Further, the team found that macrophages that lack RIPK3 are highly susceptible to the flu infection.
“That underlines the importance of RIPK3 in mounting an effective immune response to the virus,” says senior author Maziar Divangahi, who is a scientist at the Meakins-Christie Laboratories, a member of the Translational Research in Respiratory Diseases Program at the Research Institute of the McGill University Health Centre, and a professor of medicine at McGill University.
“What is exciting is that by understanding exactly how RIPK3 works to boost IFN’s potency we can look at avenues in the manufacturing of anti-flu drugs.”
Fighting flu, helping tuberculosis?
The research also turned up another interesting discovery related to RIPK3. In this case, it involved the immunity to another lung infection, tuberculosis. The immune cells of interest here were, as with the flu, macrophages. With TB, the researchers investigated the same biochemical pathway involved in RIPK3, yet surprisingly, the outcome was the opposite for TB.
Stimulating RIPK3 with TB created a surfeit of dying macrophages and promoted the survival and dissemination of the disease rather than blocking it as in the case of the flu. Both findings support the fact that RIPK3 is a critical player in immunity involving lung infections.
“Pulmonary infections like TB and influenza are significant global problems. Former avenues of research haven’t always had the most promising results,” says Jeff Downey, a PhD student in Divangahi’s lab and the first author of the influenza paper. “Looking at these new pathways and new ideas can be really helpful in potentially finding new therapies in both cases.”
A foundation grant from the Canadian Institutes of Health Research supports Divangahi. Funding from the Faculty of Medicine of McGill University and the Fonds de recherche du Quebec–Santé supported the lead authors of the flu study. The FRSQ Fellowship supports Nargis Khan, the lead author of the TB study.
As societies become wealthier and more gender equal, women are less likely to obtain degrees in STEM, according to new research. The researchers call this a “gender-equality paradox.”
The underrepresentation of girls and women in science, technology, engineering, and mathematics (STEM) fields occurs globally. Although women are currently well represented in life sciences, they continue to be underrepresented in inorganic sciences, such as computer science and physics.
In their study, researchers also discovered a near-universal sex difference in academic strengths and weaknesses that contributes to the STEM gap.
Findings from the study could help refine education efforts and policies geared toward encouraging girls and women with strengths in science or math to participate in STEM fields.
The researchers found that, throughout the world, boys’ academic strengths tend to be in science or mathematics, while girls’ strengths are in reading. Students who have personal strengths in science or math are more likely to enter STEM fields, whereas students with reading as a personal strength are more likely to enter non-STEM fields, according to David Geary, professor of psychological sciences in the University of Missouri’s College of Arts and Science.
These gender differences in academic strengths, along with differences in interest in science, may explain why the gender gap in STEM fields has been stable for decades, and why current approaches to address it have failed.
“We analyzed data on 475,000 adolescents across 67 countries or regions and found that while boys’ and girls’ achievements in STEM subjects were broadly similar in all countries, science was more likely to be boys’ best subject,” Geary says.
“Girls, even when their abilities in science equaled or exceeded those of boys, often were likely to be better overall in reading comprehension, which relates to higher ability in non-STEM subjects. As a result, these girls tended to seek out other professions unrelated to STEM fields,” he says.
Surprisingly, this trend was larger for girls and women living in countries with greater gender equality. The authors call this a “gender-equality paradox,” because countries lauded for their high levels of gender equality, such as Finland, Norway, or Sweden, have relatively few women among their STEM graduates.
In contrast, more socially conservative countries such as Turkey or Algeria have a much larger percentage of women among their STEM graduates.
“In countries with greater gender equality, women are actively encouraged to participate in STEM; yet, they lose more girls because of personal academic strengths,” Geary says. “In more liberal and wealthy countries, personal preferences are more strongly expressed. One consequence is that sex differences in academic strengths and interests become larger and have a stronger influence on college and career choices than in more conservative and less wealthy countries, creating the gender-equality paradox.”
The combination of personal academic strengths in reading, lower interest in science, and broader financial security explains why so few women choose a STEM career in highly developed nations.
“STEM careers are generally secure and well-paid but the risks of not following such a path can vary,” says Gijsbert Stoet, professor in psychology at Leeds Beckett University in the UK. “In more affluent countries where any choice of career feels relatively safe, women may feel able to make choices based on non-economic factors. Conversely, in countries with fewer economic opportunities, or where employment might be precarious, a well-paid and relatively secure STEM career can be more attractive to women.”
Findings from this study could help target interventions to make them more effective, say the researchers. Policymakers should reconsider failing national policies focusing on decreasing the gender imbalance in STEM, the researchers add.
After completing training with the Network for Educator Effectiveness, principals improved their accuracy in evaluations of teachers, according to a new study.
In addition to creating greater accuracy, the training also encouraged discussion among principals and teachers about measurable goals.
“The training helps everyone in a school get on the same page about effective teaching.”
More than 90 percent of teacher evaluations in schools include direct observations by principals. However, the evaluations are often subjective, and if principals are not properly trained, the results may not be a fair representation of a teacher’s performance.
Christi Bergin, a research professor in the University of Missouri’s College of Education and one of the developers of the Network for Educator Effectiveness, says that improving teacher observation practices helps education leaders prioritize methods in a way that increases transparency.
“If we are going to put resources into teacher evaluation, then let’s do it in a way that is useful and promotes growth and insight,” Bergin says. “The training helps everyone in a school get on the same page about effective teaching.”
In the study, Bergin and colleagues used diagnostic statistics in an innovative way to identify specific teaching practices that principals find difficult to evaluate accurately. For example, “formative assessment,” which refers to ensuring all students are learning during a lesson, was especially difficult to evaluate accurately. Identifying evaluation challenges is helpful because it pinpoints where more training is needed.
Because raters can be a big source of error, the study’s findings are an encouraging sign that Network for Educator Effectiveness training is effective. Bergin says a standard training for principals may also help teachers be more informed on how principals judge their performance, which then can inform strategies to improve their practice and help promote growth.
“If teachers know their principals are getting high-quality training, then they not only know what to expect in their observations, but they can have confidence in the outcomes,” Bergin says. “The overall community can have faith in their schools knowing that their teachers are growing their skills.”
Bergin’s research team is currently analyzing whether principal characteristics, such as how many years of experience they have, can have a strong impact on their accuracy with evaluations.
The Network for Educator Effectiveness is the largest comprehensive teacher evaluation system in Missouri—more than 270 school districts use it, and it trains more than 1,500 principals and administrators every year in how to effectively evaluate teachers.
Current commitments won’t meet the Paris Agreement’s aspirational goals of limiting warming—and that could make the world a degree warmer and considerably more prone to extreme weather.
The difference between this UN goal and the actual country commitments is a mere 1 C, which may seem negligible. But a new study in Science Advances finds that even that 1-degree difference could increase the likelihood of extreme weather.
In this study, Noah Diffenbaugh, professor of earth system science at Stanford University’s School of Earth, Energy & Environmental Sciences, and colleagues expanded on previous work analyzing historical climate data, which demonstrated how greenhouse gas emissions have increased the probability of record-breaking hot, wet, and dry events in the present climate.
Now, the group analyzed similar models to estimate the probability of extreme weather events in the future under two scenarios of the Paris Agreement: increases of 1.5 to 2 degrees if countries live up to their aspirations, or 2 to 3 degrees if they meet the commitments that they have made.
“The really big increases in record-setting event probability are reduced if the world achieves the aspirational targets rather than the actual commitments,” says Diffenbaugh, who is also senior fellow in the Stanford Woods Institute for the Environment. “At the same time, even if those aspirational targets are reached, we still will be living in a climate that has substantially greater probability of unprecedented events than the one we’re in now.”
Droughts, floods, and heat
The new study is the latest application of an extreme event framework that Diffenbaugh and other researchers at Stanford have been developing for years. They have applied this framework to individual events, such as the 2012-2017 California drought and the catastrophic flooding in northern India in June 2013. In their 2017 paper on severe events, they found that global warming from human emissions of greenhouse gases has increased the odds of the hottest events across more than 80 percent of the globe for which reliable observations were available, while also increasing the likelihood of both wet and dry extremes.
The framework relies on a combination of historical climate observations and climate models that are able to simulate the global circulation of the atmosphere and ocean. The group uses output from these models run under two conditions: one that includes only natural climate influences, such as solar and volcanic activity, and another that also includes human influences such as rising carbon dioxide concentrations. The researchers compare the simulations to historical extreme event data to test whether natural influences alone, or natural and human influences together, best represent reality.
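The logic of this comparison can be sketched with synthetic numbers. The toy example below (not the study's actual data or models) compares how often a hypothetical record-setting hot event is exceeded in a simulated "natural forcings only" ensemble versus a "natural plus human forcings" ensemble, and computes the resulting risk ratio:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical annual-maximum temperatures (degrees C) from two model
# ensembles: natural forcings only, and natural plus human forcings.
# These synthetic normal distributions stand in for real model output.
natural_only = rng.normal(loc=35.0, scale=1.5, size=100_000)
natural_plus_human = rng.normal(loc=36.0, scale=1.5, size=100_000)

record = 38.0  # a hypothetical observed record-setting hot event

# Fraction of simulated years exceeding the record under each condition.
p_natural = (natural_only > record).mean()
p_human = (natural_plus_human > record).mean()

# The risk ratio quantifies how much the added human influence
# changes the odds of exceeding the record.
risk_ratio = p_human / p_natural
print(f"P(exceed | natural) = {p_natural:.4f}")
print(f"P(exceed | natural+human) = {p_human:.4f}")
print(f"risk ratio = {risk_ratio:.2f}")
```

In real attribution studies the distributions come from large climate-model ensembles rather than assumed normals, but the end product is the same kind of probability ratio the article describes.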
For the new study, the researchers expanded the number of climate models from their previous paper that had investigated the 1 degree of global warming that has already occurred, strengthening their earlier conclusions. Then, they used their findings to predict the probabilities of severe events in the two Paris Agreement scenarios.
Two very different scenarios
Although the researchers knew that increases in temperature would very likely lead to increases in severe events, the stark difference in the outcomes of the two scenarios surprised them.
The researchers found that emissions consistent with the commitments countries have made are likely to result in a more than fivefold increase in the probability of record-breaking warm nights over approximately 50 percent of Europe and more than 25 percent of East Asia.
This 2 to 3 degrees of global warming would also likely result in a greater than threefold increase in record-breaking wet days over more than 35 percent of North America, Europe, and East Asia. The authors found that this level of warming is also likely to lead to increases in hot days, along with milder cold nights and shorter freezes.
Meeting the Paris Agreement's goal of keeping global-scale warming to less than 2 degrees is likely to reduce the area of the globe that experiences greater than threefold increases in the probability of record-setting events. However, even at this reduced level of global warming, the world is still likely to see more record-setting events than in the present.
When people build a dam, plan the management of a river, or build on a floodplain, it is common practice to base decisions on past historical data. This study provides more evidence that these historical probabilities no longer apply in many parts of the world. The new analysis helps clarify what the climate is likely to look like in the future and could help decision makers plan accordingly.
“Damages from extreme weather and climate events have been increasing, and 2017 was the costliest year on record,” Diffenbaugh says. “These rising costs are one of many signs that we are not prepared for today’s climate, let alone for another degree of global warming.”
“But the good news is that we don’t have to wait and play catch-up,” Diffenbaugh adds. “Instead, we can use this kind of research to make decisions that both build resilience now and help us be prepared for the climate that we will face in the future.”
Additional coauthors of this paper are Deepti Singh, postdoctoral fellow at Columbia University and incoming faculty at Washington State University, and Justin Mankin, visiting research scholar and incoming faculty member at Dartmouth College, and scientist with Columbia University and the NASA Goddard Institute for Space Studies.
The School of Earth, Energy & Environmental Sciences and the Woods Institute for the Environment at Stanford University; The Earth Institute and Lamont-Doherty Earth Observatory of Columbia University; and the US Department of Energy funded this work.