A new method could lead to diagnostic tools that precisely measure the third-trimester growth and folding patterns of a baby’s brain in 3D.
The research could help to sound an early alarm on developmental disorders in premature infants that could affect them later in life.
“We all have the same components, but our brain folds are like fingerprints: Everyone has a different pattern.”
During the third trimester, a baby’s brain undergoes rapid development in utero. The cerebral cortex dramatically expands its surface area and begins to fold. Previous work suggests that this rapid and vital growth is an individualized process, with details varying from infant to infant.
“One of the things that’s really interesting about people’s brains is that they are so different, yet so similar,” says Philip Bayly, professor of mechanical engineering at the School of Engineering & Applied Science at Washington University in St. Louis. “We all have the same components, but our brain folds are like fingerprints: Everyone has a different pattern. Understanding the mechanical process of folding—when it occurs—might be a way to detect problems for brain development down the road.”
Engineering doctoral student Kara Garcia accessed magnetic resonance 3D brain images from 30 preterm infants that Christopher Smyser, associate professor of neurology, and his pediatric neuroimaging team had scanned. The researchers scanned the babies two to four times each during the period of rapid brain expansion, which typically occurs between 28 and 38 weeks of gestation.
Using a new computer algorithm, Bayly, Garcia, and their colleagues obtained accurate point-to-point correspondence between younger and older cortical reconstructions of the same infant. From each pair of surfaces, the team calculated precise maps of cortical expansion. Then, using a minimum energy approach to compare brain surfaces at different times, researchers picked up on subtle differences in the babies’ brain folding patterns.
“The minimum energy approach is the one that’s most likely from a physical standpoint,” Bayly says. “When we obtain surfaces from MR images, we don’t know which points on the older surface correspond with which points on the younger surface. We reasoned, that since nature is efficient, the most likely correspondence is the one that produces the best match between surface landmarks, while at the same time, minimizing how much the brain would have to distort while it is growing.
Scientists have developed a new way to track folding patterns in the brains of premature babies. It’s hoped this new process could someday be used for diagnosing a host of diseases, including autism and schizophrenia. (Credit: Bayly Lab)
“When you use this minimum energy approach, you get rid of a lot of noise in the analysis, and what emerged were these subtle patterns of growth that were previously hidden in the data. Not only do we have a better picture of these developmental processes in general, but doctors should hopefully be able to assess individual patients, take a look at their pattern of brain development, and figure out how it’s tracking.”
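The minimum-energy matching Bayly describes can be illustrated with a toy sketch. This is a hypothetical illustration, not the team’s actual algorithm, which optimizes over continuous cortical surfaces rather than a handful of points: given a few landmark points on a “younger” and an “older” surface, the preferred correspondence is the one that best aligns landmarks while minimizing how much pairwise distances must distort during growth.

```python
import itertools
import math

# Toy 2D "surfaces": a few landmark points on the younger and older
# reconstructions (coordinates are invented for illustration).
younger = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
older = [(0.0, 0.0), (1.3, 0.0), (0.0, 1.2), (1.3, 1.2)]  # grown surface

def landmark_mismatch(mapping):
    """Sum of squared distances between matched landmark points."""
    return sum((yx - ox) ** 2 + (yy - oy) ** 2
               for (yx, yy), (ox, oy) in zip(younger, [older[j] for j in mapping]))

def distortion(mapping):
    """How much pairwise distances change under the mapping (a strain proxy)."""
    total = 0.0
    for a, b in itertools.combinations(range(len(younger)), 2):
        d_young = math.dist(younger[a], younger[b])
        d_old = math.dist(older[mapping[a]], older[mapping[b]])
        total += (d_old - d_young) ** 2
    return total

def energy(mapping, alpha=1.0):
    # The minimum-energy principle: best landmark match while the surface
    # distorts as little as possible as it grows.
    return landmark_mismatch(mapping) + alpha * distortion(mapping)

# Brute-force search over all correspondences (feasible only at toy scale).
best = min(itertools.permutations(range(len(older))), key=energy)
print(best)  # the identity mapping (0, 1, 2, 3) has the lowest energy here
```

Scrambled correspondences pay a large penalty in both terms, so the physically plausible mapping, each point to its grown counterpart, wins.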
It’s a measurement tool that could prove invaluable in places such as neonatal intensive care units, where preemies face a variety of challenges. Understanding an individual’s precise pattern of brain development also could assist physicians trying to make a diagnosis later in a patient’s life.
“You do also find folding abnormalities in populations that have cognitive issues later in life, including autism and schizophrenia,” Bayly says. “It’s possible, if medical researchers understand better the folding process and what goes wrong or differently, then they can understand a little bit more about what causes these problems.”
The researchers report their findings in PNAS. The National Institutes of Health supported this work.
People’s willingness to use a Zika vaccine when it’s available will be influenced by how they weigh the risks associated with the disease and the vaccine—but also by their misconceptions about vaccines for other diseases, researchers say.
While a Zika vaccine is in development, researchers examined factors that will affect its eventual acceptance or rejection.
The study, which appears in the Journal of Public Health, shows that people’s erroneous beliefs about an association between the measles, mumps, and rubella (MMR) vaccine and autism were a predictor of people’s lessened intention to get a Zika vaccine.
“…misbelief about the MMR vaccine’s association with autism was more influential on the decision of whether to get vaccinated for Zika than even perceptions of Zika itself…”
Further, the study finds that people’s perceptions of the severity of the Zika virus as well as their general belief in the power of science to solve problems increased their intention to get the vaccine.
“When a new disease arises, people who lack understanding of the new threat may extrapolate from their knowledge of other diseases,” says Yotam Ophir, a PhD candidate at Penn’s Annenberg School for Communication who coauthored the study with Kathleen Hall Jamieson, director of the Annenberg Public Policy Center.
“We found that the misbelief about the MMR vaccine’s association with autism was more influential on the decision of whether to get vaccinated for Zika than even perceptions of Zika itself, which is worrisome, especially in light of the persistence of that misinformation.”
The study analyzes 2016 data from APPC’s Annenberg Science Knowledge (ASK) survey gathered during the outbreak of the Zika virus, which is mosquito-borne and can be sexually transmitted. When a pregnant woman is infected, the virus can cause an increased risk of birth defects, including microcephaly.
The study includes survey responses from 3,337 individuals, gathered between August 25 and September 26, 2016, as part of a larger, 34-week survey of US adults on attitudes, behavior, and understanding of the Zika virus.
Key findings include:
The more likely someone is to believe in the false association between the use of the MMR vaccine and autism, the less likely that person is to use a Zika vaccine.
People who believe in the ability of science to overcome problems were more likely to intend to use a Zika vaccine.
People who believe that Zika causes the birth defect microcephaly (which is accurate) and those who believe Zika is likely to cause death (which is inaccurate) were more likely to intend to vaccinate.
People who were engaged in behaviors to protect against Zika were less likely to intend to get the vaccination—which “may be the result of their confidence that their actions pre-empt the need to be vaccinated,” researchers say.
The findings have practical and theoretical implications, scientists say. Once the Zika vaccine exists, health communicators will have to cope with “vaccine hesitancy” and anti-vaccine communications. The research also adds evidence to the need for health communicators “to address a spill-over effect from misbeliefs about one vaccine on intention to use another.”
The bogus association between the MMR vaccine and autism has been disproven in numerous studies. However, the argument is still prominent among people who oppose vaccinations.
“Scientists often look at the effect of misinformed beliefs about the MMR vaccine on people’s intention to vaccinate children with the triple vaccine, but they don’t as often look at the dangerous spillover effects that these misbeliefs can have,” Ophir says.
Prior research has shown that it is very hard to completely debunk misinformation, such as the mistaken belief that the MMR vaccine causes autism, but the study results suggest that accurately communicating about the risks of Zika can help lessen the detrimental effects of the misbelief.
“Even if we can’t change what people think about the MMR vaccine, if we can give them an accurate picture of how vulnerable they are to a disease such as Zika, they can make a more informed decision about it,” Ophir says.
The Science of Science Communication endowment of the Annenberg Public Policy Center of the University of Pennsylvania funded the work.
Researchers have developed a climate record stretching 2,060 years into Mongolia’s past by using the natural archive of weather conditions stored in the rings of Siberian pines.
The researchers also combined information on past climate from the tree rings with computer models that can project future regional climate.
According to the researchers, the extreme wet and dry periods Mongolia has experienced in the late 20th and early 21st centuries are rare but not unprecedented, and future droughts may be no worse.
The most recent extended drought in Mongolia lasted from 2000 to 2010 and resulted in major livestock die-offs and a massive migration of nomadic herders to the capital city.
Researchers Kevin Anchukaitis and Amy Hessl walk across Mongolia’s Khorgo lava field to collect cross-sections of dead trees. (Credit: Neil Pederson/U. Arizona)
“We were able to quantify how unusual this drought was,” says coauthor Kevin J. Anchukaitis, an associate professor in the School of Geography and Development at the University of Arizona. “The drought was not unprecedented, but it has a 900-year return interval. It’s a once-in-a-millennium drought.”
Finding that future droughts would likely be no worse than those of the past was a surprise, says Anchukaitis, who led the modeling team. In other semi-arid regions of the world that he has studied, such as California and the Mediterranean, global warming already has changed precipitation and temperature patterns and thereby increased the risk of long-term drought.
Mongolia, a landlocked country in central Asia, has long, cold winters and short summers. Much of the country is cold, semi-arid grasslands that resemble eastern Montana.
“You would expect, based on everything we’ve been thinking about and reading as climate scientists, that elevated temperatures are going to lead to more severe droughts in semi-arid regions. But the models did not project increased frequency or severity of droughts,” says lead author Amy Hessl of West Virginia University.
The timing of the rainy season is the likely cause, the researchers say. Mongolia’s rainy season is in the summer, the warmer time of the year, whereas California and the Mediterranean have winter rains and dry summers. As global temperatures increase, continental regions with summer rains may get more precipitation, offsetting the effects on plants of higher temperatures.
Anchukaitis, who also has appointments in geosciences and in the Laboratory of Tree-Ring Research at UA, is interested in how past and current civilizations dealt with drought and climate variability. This new research is an outgrowth of previous research he, Hessl, and their colleagues conducted to figure out how past climate influenced the Mongol civilization.
Researchers constructed a drought history of Mongolia using tree rings from the Siberian pines growing on the Uurgat lava field, shown here with the Khangai Mountains in the background and seasonal homes of nomadic herders in the lower right. (Credit: Scott Nichols/U. Arizona)
Anchukaitis and his colleagues used their tree-ring record of past climate in Mongolia to reconstruct what the annual Palmer Drought Severity Index, or PDSI, would have been going back in time 2,060 years. The PDSI combines both temperature and precipitation to get a measure of soil moisture, one measure of how water-stressed a plant would be.
He and his colleagues combined the reconstruction of past annual PDSI measures with a set of climate model simulations called the Community Earth System Model to understand what influenced the Mongolian droughts for the period from 850 to 2100.
‘Tug of war’
The model incorporates information about past solar variability, volcanic eruptions, land use changes, and carbon dioxide emissions. For projections to the end of the 21st century, the model uses the future emissions scenario called RCP 8.5, in which the rate of emissions of greenhouse gases continues to increase.
Even with the highest level of greenhouse gas emissions and rising global temperatures, the model simulations indicate that future droughts in Mongolia would be no more severe than those of the past.
“There’s a tug of war between trends toward increased rainfall and more evaporative demand because of hotter temperatures. There’s uncertainty about which will win this tug of war,” Anchukaitis says.
“The simulations say that Mongolia dries between now and about 2050 because of higher temperatures, but then it turns around because of the increase in precipitation,” he says.
Many people in Mongolia are nomadic pastoralists. The vagaries of weather and climate particularly affect them, because the combination of winter and summer temperatures plus rainfall controls the number of cattle the grassland can support.
“What to me stands out is this deep uncertainty about the future, particularly when you have a society that is so vulnerable to climate variability,” he says, adding that uncertainty makes it hard to plan for coping with future climate change.
Anchukaitis says one of his next steps is translating the team’s estimates of future soil moisture into estimates of the future productivity of Mongolia’s grasslands.
The researchers report their findings in the journal Science Advances. National Geographic and the National Science Foundation funded the research.
Additional coauthors are from West Virginia University in Morgantown; the Lamont-Doherty Earth Observatory, Palisades, New York; NASA Goddard Institute for Space Studies, New York City; the National University of Mongolia, Ulaanbaatar; Harvard University; and Auburn University, Alabama.
Patients who beat cancer years or even decades ago still become fatigued more quickly than people without cancer histories, a new study shows.
“The main goal of cancer treatment has been survival, but studies like this suggest that we need also to examine the longer-term effects on health and quality of life.”
In a long-term aging study, cancer survivors reported more fatigue after treadmill tests and, on average, they walked slower than participants who had never had cancer, the scientists found.
“The main goal of cancer treatment has been survival, but studies like this suggest that we need also to examine the longer-term effects on health and quality of life,” says senior researcher Jennifer A. Schrack, assistant professor of epidemiology at Johns Hopkins University.
Successful cancer treatment has left a growing population of cancer survivors: 16 million in the United States as of 2016. But studies suggest that the lingering negative impacts of cancer treatments often resemble an accelerated aging process, with effects including cognitive impairments, heart disease, secondary cancers, and—documented in this new study in Cancer—fatigue.
Fatigue as a general feeling is difficult to measure in an objective way. Schrack and colleagues at the Bloomberg School of Public Health examined it in the context of physical exertion.
Their data came from the Baltimore Longitudinal Study of Aging, a project that has enrolled thousands of people in the Baltimore/Washington area since 1958. BLSA generally follows people for life, conducting periodic health checks. Since 2007, these checks have included measures of endurance and “fatigability” during walks and treadmill tests.
“Researchers at the National Cancer Institute suggested that we look at these BLSA data to see if there were differences in otherwise healthy older adult cancer survivors,” Schrack says. “We were surprised by the magnitude of the differences we found.”
The fatigability test involved a five-minute treadmill walk, after which BLSA participants rated their perceived exertion on a scale of 6 to 20. Ratings over 10 were considered “high perceived fatigability.”
After adjusting for gender- and health-related differences between 334 participants who had a history of cancer and 1,331 who didn’t, the researchers found that a cancer history was associated with a 1.6 times greater risk of high perceived fatigability.
By comparison, being older than 65 years brought a 5.7 times greater risk of high perceived fatigability—implying that the effect of a cancer history was more than a third as large as the effect of aging past 65.
Similarly, a cancer history was associated with 400-meter walk times averaging 14 seconds slower than those for participants with no cancer history—which again was a bit more than a third of the slowing effect (36 seconds) that came from aging past 65. The over-65 participants with a cancer history also deteriorated more steeply in their endurance-walk times from one checkup to the next, compared to those without a cancer history.
“These findings support the idea that a history of cancer is associated with higher fatigability and that this effect worsens with advancing age,” Schrack says.
The researchers aim to follow up with studies of larger groups of cancer survivors for whom there are more data on cancer type, treatment type, and other important factors. Such studies could distinguish the long-term adverse effects of different cancer treatment regimens, and could even help reveal the biological mechanisms underlying those adverse effects.
“The long-term goal is that doctors and patients will be able to take those specific long-term effects into account when they decide how to treat different cancers,” Schrack says.
The National Cancer Institute and the National Institute on Aging paid for the study. NIA funds the Baltimore Longitudinal Study of Aging.
Genetic analysis reveals new evidence to explain how the hogfish uses its skin to “see.”
The hogfish is a pointy-snouted reef fish that can go from pearly white to mottled brown to reddish in a matter of milliseconds as it adjusts to shifting conditions on the ocean floor.
Scientists have long suspected that animals with quick-changing colors don’t just rely on their eyes to tune their appearance to their surroundings—they also sense light with their skin. But exactly how “skin vision” works remains a mystery.
The hogfish can change from white to spotted brown to reddish depending on its surroundings. (Credit: Dean Kimberly/Lori Schweikert/Duke)
In a new study, researchers show that hogfish skin senses light differently from eyes. The results suggest that light-sensing evolved separately in the two tissues, says Lori Schweikert, a postdoctoral scholar with Sönke Johnsen, biology professor at Duke University.
With “dermal photoreception,” as it is called, the skin doesn’t enable animals to perceive details like they do with their eyes, Schweikert says. But it may be sensitive to changes in brightness or wavelength, such as moving shadows cast by approaching predators, or light fluctuations associated with different times of day.
Schweikert, Johnsen, and Duke postdoctoral associate Bob Fitak focused on the hogfish, or Lachnolaimus maximus, which spends its time in shallow waters and coral reefs in the western Atlantic Ocean, from Nova Scotia to northern South America. It can make its skin whitish to blend in with the sandy bottom of the ocean floor and hide from predators or ambush prey. Or it can take on a bright, contrasting pattern to look threatening or attract a mate.
The key to these makeovers are special pigment-containing cells called chromatophores, which, when activated by light, can spread their pigments out or bunch them up to change the skin’s overall color or pattern.
The researchers took pieces of skin and retina from a single female hogfish caught off the Florida Keys and analyzed all of its gene readouts, or RNA transcripts, to see which genes were switched on in each tissue.
Previous studies of other color-changing animals including cuttlefish and octopuses suggest the same molecular pathway that detects light in eyes may have been co-opted to sense light in the skin.
But Schweikert and colleagues found that hogfish skin works differently. Almost none of the genes involved in light detection in the hogfish’s eyes were activated in the skin. Instead, the data suggest that hogfish skin relies on an alternative molecular pathway to sense light, a chain reaction involving a molecule called cyclic AMP.
Just how the hogfish’s “skin vision” supplements input from the eyes to monitor light in their surroundings and bring about a color change remains unclear, Schweikert says. Light-sensing skin could provide information about conditions beyond the animal’s field of view, or outside the range of wavelengths that the eye can pick up.
Wild birds that are cleverer than others at foraging for food have different levels of a neurotransmitter receptor that has links to intelligence in humans, according to a study.
The findings could provide insight into the evolutionary mechanisms affecting cognitive traits in a range of animals.
As reported in Science Advances, the researchers caught Barbados bullfinches and black-faced grassquits near McGill University’s Bellairs Research Institute in Barbados. Bullfinches are bold, opportunistic, and innovative, while grassquits are shy and conservative. They are each other’s closest relative in Barbados and are cousins of Darwin’s finches from the Galápagos Islands.
A Barbados bullfinch innovation in the wild: opening sugar packets. (Credit: Louis Lefebvre/McGill)
In captivity, the problem-solving skills of the two species differed considerably in lab tests. Most of the bullfinches quickly figured out how to lift the lid off a jar of food, for example, while all the grassquits were stumped by the challenge. These performances were in line with the differences in the birds’ innovativeness in the wild—a trait that can help animals survive in changing environments.
The researchers then compared the expression of all genes in six parts of the brain of the two bird species.
A family of genes stood out: glutamate neurotransmitter receptors, especially in the part of the bird brain that corresponds to the human prefrontal cortex. Glutamate receptors are known to be involved in a variety of cognitive traits in humans and other mammals. In particular, a receptor known as GRIN2B, when boosted in transgenic mice, makes them better learners. Levels of that receptor were higher in the Barbados bullfinch than in the grassquit, the researchers found.
“By comparing an extremely innovative species like the Barbados bullfinch with a closely related conservative one like the black-faced grassquit, we gain insight into the evolutionary mechanisms that can lead to divergence in behavior,” says McGill biologist Jean-Nicolas Audet.
“It might be that mammals, including humans, and birds like the Barbados bullfinch use similar mechanisms to perform cognitively. If our results are confirmed in future studies, it would be a unique demonstration of convergent evolution of intelligence, involving the same neurotransmitter receptors despite the widely different brain structures of birds and mammals.”
Coauthors of the study are from McGill, Duke, and Harvard universities.
Funding for the research came in part from the Fonds de recherche du Québec – Nature et technologies, the Natural Sciences and Engineering Research Council of Canada, the Howard Hughes Medical Institute, and a Hydro-Québec doctoral scholarship.
Small populations of pathogenic bacteria may be harder to kill off than larger populations because they respond differently to antibiotics, a new study indicates.
“By tuning the growth and death rate of bacterial cells, you can clear small populations of even antibiotic-resistant bacteria using low antibiotic concentrations.”
The research shows that a population of bacteria containing 100 cells or fewer responds to antibiotics randomly—not homogeneously like a larger population.
“We’ve shown that there may be nothing special about bacterial cells that aren’t killed by drug therapy—they survive by random chance,” says lead author Minsu Kim, an assistant professor in the physics department at Emory University and a member of the university’s Antibiotic Resistance Center.
“This randomness is a double-edged sword,” Kim adds. “On the surface, it makes it more difficult to predict a treatment outcome. But we found a way to manipulate this inherent randomness in a way that clears a small population of bacteria with 100 percent probability. By tuning the growth and death rate of bacterial cells, you can clear small populations of even antibiotic-resistant bacteria using low antibiotic concentrations.”
The researchers developed a treatment model using a cocktail of two different classes of antibiotic drugs. They first demonstrated the effectiveness of the model in laboratory experiments on a small population of E. coli bacteria without antibiotic-drug resistance. In later experiments, they found that the model also worked on a small population of clinically isolated antibiotic-resistant E. coli.
“We hope that our model can help in the development of more sophisticated antibiotic drug protocols—making them more effective at lower doses for some infections,” Kim says. “It’s important because if you treat a bacterial infection and fail to kill it entirely, that can contribute to antibiotic resistance.”
Antibiotic resistance is projected to lead to 300 million premature deaths and a global healthcare burden of $100 trillion by 2050, according to the 2014 Review on Antimicrobial Resistance. The epidemic is partly driven by the inability to reliably eradicate infections of antibiotic-susceptible bacteria.
For decades, it was thought that simply reducing the population size of the bacteria to a few hundred cells would be sufficient because the immune system of an infected person can clear out the remaining bacteria.
“More recently, it became clear that small populations of bacteria really matter in the course of an infection,” Kim says. “The infectious dose—the number of bacterial cells needed to initiate an infection—turned out to be a few or tens of cells for some species of bacteria and, for others, as low as one cell.”
It was not well understood, however, why treatment of bacteria with antibiotics sometimes worked and sometimes failed. Contributing factors may include variations in the immune responses of infected people and possible mutations of bacterial cells to become more virulent.
Kim suspected that something more fundamental was a factor. Research has shown unexpected treatment failure for antibiotic-susceptible infections even in a simple organism like the C. elegans worm, a common model for the study of bacterial virulence.
By focusing on small bacterial populations, the team discovered how their dynamics differ from those of large ones. Antibiotics cause the number of bacterial cells to fluctuate randomly. When the growth rate topped the death rate by random chance, clearance of the bacteria failed.
The researchers used this knowledge to develop a low-dose cocktail drug therapy of two different kinds of antibiotics. They combined a bactericide (which kills bacteria) and a bacteriostat (which slows the growth of bacteria) to manipulate the random fluctuation in the number of cells and boost the probability of the cell death rate topping the growth rate.
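The role of chance in small populations can be illustrated with a simple stochastic birth-death sketch. This is a hypothetical toy model, not the team’s published analysis, and the rates are invented: each event is a single cell dividing or dying, and lowering the growth rate, as a bacteriostat would, tips the random fluctuations toward extinction.

```python
import random

def simulate_clearance(n0, birth, death, steps=2000, rng=None):
    """Stochastic birth-death walk; returns True if the population hits zero."""
    rng = rng or random.Random()
    n = n0
    for _ in range(steps):
        if n == 0:
            return True  # population cleared
        # Per-capita rates scale identically with n, so they cancel out of
        # the event choice: each event is a division with probability
        # birth / (birth + death), otherwise a death.
        if rng.random() < birth / (birth + death):
            n += 1
        else:
            n -= 1
    return n == 0

def clearance_probability(n0, birth, death, trials=2000, seed=1):
    rng = random.Random(seed)
    return sum(simulate_clearance(n0, birth, death, rng=rng)
               for _ in range(trials)) / trials

# Bactericide alone: growth slightly outpaces killing on average, so only
# some runs clear the population by chance.
p_single = clearance_probability(n0=20, birth=1.05, death=1.0)

# Adding a bacteriostat lowers the growth rate; the same random walk is now
# biased toward extinction and clears the population nearly every time.
p_cocktail = clearance_probability(n0=20, birth=0.5, death=1.0)
print(p_single, p_cocktail)
```

Identical starting populations with identical drugs thus produce different outcomes run to run, mirroring the randomness the study describes, while shifting the birth/death balance makes clearance nearly certain even at low doses.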
Not all antibiotics fit the model, and more research is needed to refine the method for applications in a clinical setting.
“We showed that the successful treatment of a bacterial infection with antibiotics is even more complicated than we thought,” Kim says. “We hope this knowledge leads to new strategies to fight against infections caused by antibiotic-resistant bacteria.”
The researchers report their findings in the journal eLife.
Many top academic journals continue to have low numbers of female authors, new research indicates.
Five years ago, Nature—one of the most prestigious research journals in science—published an editorial pledging to improve on the low number of women editors and authors in its pages.
“…it is not that women are not conducting research and publishing, they are just much less likely to get their work into the really high-profile journals…”
For many readers and scientists, that acknowledgement was a long time in coming. Yet with the hindsight of today’s re-examination of the treatment of women at all levels of society, the editorial could seem almost prescient.
In the time since that editorial, however, not much has changed, according to a new study, which appears online and is cited in a letter in Nature.
The preliminary study, by University of Washington psychology professor Ione Fine and doctoral student Alicia Shen, finds that many high-profile neuroscience journals had a low representation of female authors. For example, fewer than 25 percent of Nature research articles listed women as the first author—usually the junior scientist who led the research. Among last authors—typically the senior laboratory leader—just over 15 percent were women.
Nature‘s top-tier competitor, Science, had similarly low numbers of women authors.
Not much better than before
What most concerned the researchers was that over a 12-year period ending in 2017, the percentage of female authors in these journals showed little improvement: less than 1 percent annually, with many journals showing no increase at all.
For the sake of comparison, the researchers also looked at the number of women who received major National Institutes of Health grants during the same time period. Those numbers were much higher, and increased slowly but steadily, with just under 30 percent of grants in 2017 awarded to women.
“There are glass ceilings in technology, in politics, in business. It’s very hard not to believe that this is not just another glass ceiling.”
“These research grants are awarded based on significance, impact, and productivity. We shouldn’t see this huge discrepancy between NIH funding and last authorship in high impact journals,” Fine says.
It’s particularly troubling, the study’s authors say, given that publishing in high-profile journals is virtually imperative for winning academic awards or positions at top-ranked institutions.
Gender disparities in STEM fields have garnered more attention in recent years. While National Science Foundation-compiled data show that women make up a growing proportion of STEM faculty, their numbers remain significantly lower than those of men. A 2016 survey by the Society for Neuroscience showed that a little more than half of neuroscience doctorates are awarded to women, but women make up an average of only 30 percent of neuroscience faculty.
Other studies of gender and authorship have also pointed to the possible contribution of publication bias. A small-scale study focusing on Nature Neuroscience, in 2016, showed similar results to the new findings. And in 2013, a study led by the UW’s Jevin West and Carl Bergstrom, through an analysis of publications in the JSTOR digital library, found that women also are much less likely to be featured in prominent first- or last-author positions.
Bias beyond science
The issue extends beyond science: In spring 2017, an economics lecturer at the University of Liverpool found that papers written by female economists took an average of six months longer to get published than those written by men.
For this study, Shen, Fine, and their psychology coauthors, research associate Jason Webster and professor Yuichi Shoda, turned to the MEDLINE database of articles, which is hosted by the US National Library of Medicine. They focused on 15 journals that publish neuroscience research, accounting for nearly 167,000 research articles from 2005 to 2017, and analyzed the author bylines using another database that predicts gender based on more than 216,000 distinct first names.
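The core of such a byline analysis can be sketched in miniature. The names and lookup table below are invented stand-ins, not the study’s actual data or its 216,000-name database: for each article, take the first and last author’s first name, infer gender from a name dictionary, and tally the percentage of female authors in each byline position.

```python
# Toy articles as (first author, last author) first names. A real analysis
# would pull full bibliographic records from a source such as MEDLINE.
articles = [
    ("Alice", "Robert"),
    ("David", "Maria"),
    ("Susan", "James"),
    ("Michael", "John"),
    ("Emma", "Linda"),
]

# Stand-in for a large name-gender database; names without an entry
# are excluded from the tally rather than guessed.
NAME_GENDER = {
    "Alice": "F", "Maria": "F", "Susan": "F", "Emma": "F", "Linda": "F",
    "David": "M", "Robert": "M", "James": "M", "Michael": "M", "John": "M",
}

def pct_female(names):
    """Percentage of names classified as female, among classifiable names."""
    known = [NAME_GENDER[n] for n in names if n in NAME_GENDER]
    return 100.0 * known.count("F") / len(known)

first_authors = [first for first, _ in articles]
last_authors = [last for _, last in articles]
print(pct_female(first_authors))  # 60.0
print(pct_female(last_authors))   # 40.0
```

Running the same tally per journal and per year is what lets a study compare female representation across venues and track its change over time.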
Some journals did have a proportionate number of female authors. The journals with the highest percentage of female first authors were Neuropsychology Review (53 percent) and Brain (43 percent). Among last authors, numbers were highest in Neuropsychology Review (39 percent) and Current Opinion in Neurobiology (27 percent).
“From our analysis, it is not that women are not conducting research and publishing, they are just much less likely to get their work into the really high-profile journals,” Shen says.
The researchers suggest several solutions for all journals: record and report article authorship by gender; train reviewers to avoid bias; provide reviewers with more specific review criteria, akin to those required for grant awards; adopt double-blind reviewing; or establish byline quotas.
“It’s ridiculous to think bias isn’t at play in these very elite journals,” Fine says. “There are glass ceilings in technology, in politics, in business. It’s very hard not to believe that this is just another glass ceiling.”
Increasing the number of women faculty in STEM fields is the goal of the ADVANCE Center for Institutional Change at UW. But if publication presents a barrier, then some universities may be challenged to hire and promote women, says Eve Riskin, associate dean of engineering for diversity and access, professor of electrical engineering, and faculty director of ADVANCE at UW.
“Research shows that diverse teams lead to better solutions,” Riskin says. “Research also shows that female students in STEM do better when they have female faculty as instructors. Holding women to higher standards for publication makes it harder for universities to increase their number of female faculty members in STEM and in leadership positions.”
The study’s authors have also made their code publicly available, with the hope that students or faculty in other fields will take on the same challenge, determine the gender breakdown of bylines in a given set of journals, and call for change.
“These journals make a lot of money and wield a huge amount of power. Finding a way to fix this problem is the least they can do,” Fine says. “They are under the same legal obligations to avoid discrimination as other businesses.”
Scientists have discovered a key aspect of how DNA forms loops and wraps inside the cell nucleus—a precise method of “packing” that may affect gene expression.
The research, which appears in the journal Science, shows that a process known as hemimethylation plays a role in looping DNA in a specific way. The research also demonstrates that hemimethylation is maintained deliberately—not through random mistakes as previously thought—and is passed down through human cell generations.
“In order for a protein called CTCF to make loops in the DNA, we discovered that it needs to have hemimethylated DNA close by,” says biologist Victor Corces, whose Emory University lab conducted the research. “Nobody had previously seen that hemimethylated DNA has a function.”
Interior of a cell showing the nucleus with the chromatin fiber (yellow) arranged in the three-dimensional space by loops formed by the CTCF protein (shown in pink). The thin blue lines on the chromatin represent DNA. (Credit: Victor Corces/Emory)
Chromatin is made up of CTCF and other proteins, along with DNA and RNA. One role of chromatin is to fold and package DNA into more compact shapes. Growing evidence suggests that this folding process is not just important to fit DNA into a cell nucleus—it also plays a role in whether genes are expressed normally or malfunction.
The Corces lab specializes in epigenetics: The study of heritable changes in gene function—including chromatin folding—that do not involve changes in the DNA sequence.
DNA methylation, for example, can modify the activity of DNA by adding methyl groups to both strands of the double helix at the site of particular base pairs. The process can be reversed through demethylation.
As cells divide, they make a copy of their DNA. To do so, they have to untangle the two strands of the double helix and split them apart. Each parental strand then serves as the template for a new daughter strand.
“When cells divide, it’s important that they keep the methylation the same for both strands,” Corces says, noting that altered patterns of methylation are associated with cancer and other diseases.
Hemimethylation involves the addition of a methyl group to one strand of the DNA helix but not the other. Some researchers observing hemimethylation have hypothesized that they were catching it right after cell division, before the cell had time to fully methylate the newly replicated daughter strand. Another theory was that hemimethylation was the result of random mistakes in the methylation process.
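The distinction between full methylation and hemimethylation is simple to state in code. The sketch below is purely illustrative—`classify_site` and the sample data are hypothetical, not part of the Corces lab’s methylome-mapping methods—but it captures the definition: a site is hemimethylated when exactly one of the two strands carries the methyl mark.

```python
def classify_site(top_methylated, bottom_methylated):
    """Classify one site from per-strand methylation calls."""
    if top_methylated and bottom_methylated:
        return "fully methylated"
    if top_methylated or bottom_methylated:
        return "hemimethylated"
    return "unmethylated"

# Hypothetical (position, top strand methylated?, bottom strand methylated?)
sites = [
    (1024, True, True),    # methyl groups on both strands
    (2048, True, False),   # methyl group on one strand only
    (4096, False, False),  # no methyl groups
]

for position, top, bottom in sites:
    print(position, classify_site(top, bottom))
```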
Chenhuan Xu, a postdoctoral fellow in the Corces lab, developed new experimental methods for DNA methylome mapping to conduct the research for the paper. These methods allowed the researchers to observe hemimethylation on DNA in human cells in real time before, during, and after cell division. They also mapped it as the cells continued to replicate.
“If the parental DNA was hemimethylated, the daughter DNA was also hemimethylated at the same place in the genome,” Corces says. “The process is not random and it’s maintained from one cell generation to the next over weeks.”
The researchers found that hemimethylation only occurs near the binding sites of CTCF—the main protein involved in organizing DNA into loops.
“If we got rid of the hemimethylation, CTCF did not make loops,” Corces says. “Somehow, hemimethylation is allowing CTCF to make loops.”
And when CTCF makes a loop, it does so by binding to sites ahead of it, moving forward along the DNA sequence, the researchers observe.
“Research suggests that some disorders are associated with CTCF binding—either mutations in the protein itself or with the DNA sequence where the protein binds,” Corces says. “It comes back to the story of how important these loops are to the three-dimensional organization of chromatin, and how that organization affects the gene expression.”
New research could lead to the replacement of toxic materials that work so well in solar cells.
Any substitute for the lead-containing perovskites used in some solar cells would have to really perform. The perovskite semiconductors have been so promising and so efficient at converting sunlight into electricity that replacing them is a challenge.
What materials can produce semiconductors that work just as well, but are safe, abundant, and inexpensive to manufacture?
“Semiconductors are everywhere, right?” says Javier Vela, an associate professor of chemistry at Iowa State University and an associate of the US Department of Energy’s Ames Laboratory. “They’re in our computers and our cell phones. They’re usually in high-end, high-value products. While semiconductors may not contain rare materials, many are toxic or very expensive.”
Vela directs a lab that specializes in developing new, nanostructured materials. While thinking about the problem of lead in solar cells, he found a conference presentation by Massachusetts Institute of Technology researchers in which they suggested possible substitutes for perovskites in semiconductors.
Vela and his colleagues decided to focus on sodium-based alternatives and started an 18-month search for a new kind of semiconductor.
They came up with a compound that features sodium, which is cheap and abundant; bismuth, which is relatively scarce but is overproduced during the mining of other metals and is cheap; and sulfur, the fifth most common element on Earth.
“Our synthesis unlocks a new class of low-cost and environmentally friendly ternary (three-part) semiconductors that show properties of interest for applications in energy conversion,” the chemists write in their paper.
In fact, one of the paper’s coauthors, Rosales, is already working to create solar cells that use the new semiconducting material.
Vela says variations in synthesis—changing reaction temperature and time, choice of metal ion precursors, adding certain ligands—allow the chemists to control the material’s structure and the size of its nanocrystals. And that allows researchers to change and fine-tune the material’s properties.
Several of the material’s properties already suit solar cells: its band gap—the amount of energy required for a light particle to knock an electron loose—falls in the ideal range for solar conversion, and unlike other materials used in solar cells, it remains stable when exposed to air and water.
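What a band gap means for sunlight can be made concrete with the photon-energy relation E = hc/λ: a photon can free an electron only if its energy exceeds the gap. The sketch below is illustrative only—the 1.3 eV value is an assumed, typical solar-cell gap, not the measured gap of the sodium-bismuth-sulfide material.

```python
# Planck constant times the speed of light, expressed in eV·nm,
# so that E(eV) = HC_EV_NM / wavelength(nm).
HC_EV_NM = 1239.84

def cutoff_wavelength_nm(band_gap_ev):
    """Longest wavelength (nm) a semiconductor of this gap can absorb."""
    return HC_EV_NM / band_gap_ev

# For an assumed 1.3 eV gap, absorption extends into the near-infrared.
print(round(cutoff_wavelength_nm(1.3)), "nm")
```

Photons with wavelengths longer than this cutoff pass through unabsorbed, which is why gaps near 1.1–1.5 eV capture the solar spectrum efficiently.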
So, the chemists think they have a material that will work well in solar cells, but without the toxicity, scarcity, or costs.
“We believe the experimental and computational results reported here,” they write in their paper, “will help advance the fundamental study and exploration of these and similar materials for energy conversion devices.”