Researchers have sequenced and analyzed the genomes of the glowing bacteria living in the bulbs that hang off the heads of anglerfish.
The anglerfish lives most of its life in total darkness more than 1,000 meters below the ocean surface. Female anglerfish sport a glowing lure on top of their foreheads, basically a pole with a light bulb on the end, where bioluminescent bacteria live. The light-emitting lure attracts both prey and potential mates to the fish.
Little is known about anglerfish and their symbiotic relationship with these brilliant bacteria because the fish are difficult to acquire and study. The bacteria in the new study came from fish specimens collected in the Gulf of Mexico.
The researchers report their findings in the journal mBio. The analysis reveals that the bacteria have lost some of the genes that are needed to live freely in the water. That’s because the fish and bacteria developed a tight, mutually beneficial relationship, where the bacteria generate light while the fish supplies nutrients to the microbe.
“What’s particularly interesting about this specific example is that we see evidence that this evolution is still underway, even though the fish themselves evolved about 100 million years ago,” says lead author Tory Hendry, assistant professor of microbiology at Cornell University. “The bacteria are still losing genes, and it’s unclear why.”
“…this is a third type of situation where the bacteria are not actually stuck with their host but they are undergoing evolution.”
Most of the known symbiotic relationships between organisms and bacteria are between either a host and free-living bacteria that don’t evolve to maintain a symbiosis, or a host and intracellular bacteria that live inside the host’s cells and undergo huge reductions in their genomes through evolution.
The bacteria inside the anglerfish’s bulb represent a third type of symbiosis, where preliminary data suggest these bacteria may move from the anglerfish bulb to the water. “It’s a new paradigm in our understanding of symbiosis in general; this is a third type of situation where the bacteria are not actually stuck with their host but they are undergoing evolution,” Hendry says.
Genetic sequencing showed that the genomes of these anglerfish bioluminescent bacteria are 50 percent reduced compared with their free-swimming relatives. The bacteria have lost most of the genes associated with making amino acids and breaking down nutrients other than glucose, suggesting the fish may be supplying the bacteria with nutrients and amino acids.
At the same time, the bacteria have retained some genes that are useful in water outside the host. They retain full pathways to make a flagellum, the corkscrew tail that propels them through water. The bacteria have lost most of the genes involved in sensing chemical cues in the environment that may lead to food or other useful compounds, though a few remain, leaving a subset of chemicals they still respond to.
“They were pared down to something they cared about,” Hendry says.
The Gulf of Mexico Research Initiative funded the study.
A sample of ancient oxygen from a 1.4 billion-year-old evaporative lake deposit in Ontario provides fresh evidence of what the Earth’s atmosphere and biosphere were like leading up to the emergence of animal life, according to new research.
The findings, which appear in the journal Nature, represent the oldest measurement of atmospheric oxygen isotopes by nearly a billion years. The results support previous research suggesting that oxygen levels in the air during this time in Earth history were a tiny fraction of what they are today due to a much less productive biosphere.
“It has been suggested for many decades now that the composition of the atmosphere has significantly varied through time,” says Peter Crockford, a postdoctoral researcher at Princeton University and Israel’s Weizmann Institute of Science who led the study as a PhD student at McGill University. “We provide unambiguous evidence that it was indeed much different 1.4 billion years ago.”
An image of the history of life and atmospheric oxygen on Earth over its 4.6 billion year history. The magnifying glass shows a picture of cyanobacteria that would have dominated life on Earth across much of the Proterozoic beginning around 2.4 billion years ago. On the far right is an image of the Earth that highlights vegetation on the continents and chlorophyll concentrations in the ocean. What the new study shows is that these colors would have been much less vibrant in Earth’s deep past due to a smaller biosphere. (Credit: McGill)
The study provides the oldest gauge yet of what earth scientists refer to as “primary production,” in which micro-organisms at the base of the food chain—algae, cyanobacteria, and the like—produce organic matter from carbon dioxide and pour oxygen into the air.
Our planet, 1.4 billion years ago
“This study shows that primary production 1.4 billion years ago was much less than today,” says senior coauthor Boswell Wing, an associate professor of geological sciences at the University of Colorado at Boulder who helped supervise Crockford’s work at McGill.
“This means that the size of the global biosphere had to be smaller, and likely just didn’t yield enough food—organic carbon—to support a lot of complex macroscopic life,” says Wing.
To come up with these findings, Crockford teamed up with colleagues who had collected pristine samples of ancient salts, known as sulfates, found in a sedimentary rock formation north of Lake Superior.
The work also sheds new light on a stretch of Earth’s history known as the “boring billion” because it yielded little apparent biological or environmental change.
“Subdued primary productivity during the mid-Proterozoic era—roughly 2 billion to 800 million years ago—has long been implied, but no hard data had been generated to lend strong support to this idea,” notes study coauthor Galen Halverson, an associate professor of earth and planetary sciences.
“That left open the possibility that there was another explanation for why the middle Proterozoic ocean was so uninteresting, in terms of the production and deposit of organic carbon.” Crockford’s data “provide the direct evidence that this boring carbon cycle was due to low primary productivity.”
The findings could also help inform astronomers’ search for life outside our own solar system.
“For most of Earth history our planet was populated with microbes, and projecting into the future they will likely be the stewards of the planet long after we are gone,” says Crockford.
“Understanding the environments they shape not only informs us of our own past and how we got here, but also provides clues to what we might find if we discover an inhabited exoplanet,” he says.
Although ice-cold drinks and ice cream can cause sharp, shooting mouth pain and the occasional “brain freeze,” the two reactions are completely unrelated, says neurologist Roderick Spears.
“Brain freeze starts with a cold stimulus, such as ice cream, touching the palate, the roof of the mouth,” says Spears, a clinician in the neurology department in the Perelman School of Medicine at the University of Pennsylvania. The cold temperature causes vasoconstriction, when blood vessels constrict or shrink quickly.
“This pain can last for anywhere from a few seconds to a few minutes. But there’s an easy way to avoid it.”
But this isn’t what causes brain freeze. Instead, the pain comes from a rapid warming process called vasodilation, during which the vessels rebound to regular size to counteract the initial rapid cooling. This signal heads to the brain via the trigeminal nerve. Because the trigeminal nerve is responsible for facial sensation, people often perceive this ice cream-related discomfort in the forehead or face.
“This pain can last for anywhere from a few seconds to a few minutes,” says Spears. “But there’s an easy way to avoid it.” Slow down. A study published in BMJ discovered that brain freeze occurs more frequently when people consume ice cream quickly.
Such a solution can’t help someone whose teeth hurt from sensitivity to the cold, however, explains Panagiota Stathopoulou, a periodontist with Penn Dental Medicine. People with healthy teeth and gums shouldn’t experience tooth sensitivity, she says. If this is happening, it could indicate that something is wrong.
“When someone experiences tooth pain or sensitivity, pain stimuli comes in contact with the tooth either directly or indirectly,” says Stathopoulou, an assistant professor of periodontics and director of the postdoctoral periodontics program.
The tissue inside the root canal of the tooth, called the pulp, contains nerves that are responsible for the sharp, uncomfortable feelings some people occasionally experience when they consume cold food and drinks.
“This is not fun…”
Several problems can cause consistent and painful tooth sensitivity. Each tooth has several layers. The exterior layer is a hard white covering called enamel. Just beneath the enamel, a softer, bony tissue called dentin makes up the bulk of the tooth. And dentin wraps around the pulp cavity, which contains living tissue and nerves.
The enamel serves as the pulp’s first layer of defense, with dentin as backup. But dentin is porous and contains tunnels called tubules, which enable the pulp to communicate with the tooth’s exterior. Normally, such communication is crucial, but when enamel breaks down, those tubules allow all external oral stimuli, including ice cream, cold beer, even air, to travel directly through the porous dentin and into the pulp.
“This is not fun,” says Stathopoulou. “Additionally, the root of the tooth is normally protected by our gums. But if our gums recede, then we’ve lost that defense as well. Gum recession is often caused by overzealous brushing, and if it’s minor this problem can be solved easily by using a softer brush, better technique, and desensitizing toothpaste.”
Donald Trump and Planned Parenthood are the top recent advertisers, and young men were targeted most often, according to a new analysis of Facebook and Instagram political advertising.
“We wanted to quickly give voters easy tools to understand who is advertising and what they are advertising…”
Using complex data scraping methods, cybersecurity researchers analyzed more than 267,000 political ads that primarily ran between May 2018 and July 2018. They developed tools to enable the public to do their own analyses, using weekly updates that the researchers plan to conduct through the November elections.
Initial findings reveal the top recent political advertisers and their minimum impressions and spending:
The Trump Make America Great Again Committee: 4,127 ads, 26.4 million impressions, $190,400
Planned Parenthood Federation of America: 3,389 ads, 24.5 million impressions, $188,800
AAF Nation, LLC (manufacturer of political-themed clothing): 862 ads, 18.4 million impressions, $78,900
National Rifle Association: 213 ads, 18.3 million impressions, $58,000
Beto for Texas (Democrat running for Senate): 377 ads, 13.0 million impressions, $194,400
Priorities USA Action and Senate Majority PAC: 2,794 ads, 12.9 million impressions, $120,600
NowThis (liberal-leaning media company): 35 ads, 11.6 million impressions, $7,400
Donald J. Trump for President, Inc.: 5,396 ads, 11.3 million impressions, $83,700
4Ocean, LLC (focused on reducing ocean pollution): 78 ads, 10.6 million impressions, $68,200
Care2 (creates social networking around causes): 557 ads, 10.1 million impressions, $99,900
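The reported minimums above can be loaded into a short script for comparison. The impressions-per-dollar figure below is a derived metric for illustration, not one the researchers report:

```python
# Top advertisers as reported (ads, minimum impressions, minimum spend in dollars).
advertisers = [
    ("Trump Make America Great Again Committee", 4_127, 26_400_000, 190_400),
    ("Planned Parenthood Federation of America", 3_389, 24_500_000, 188_800),
    ("AAF Nation, LLC",                            862, 18_400_000,  78_900),
    ("National Rifle Association",                 213, 18_300_000,  58_000),
    ("Beto for Texas",                             377, 13_000_000, 194_400),
    ("Priorities USA Action / Senate Majority PAC", 2_794, 12_900_000, 120_600),
    ("NowThis",                                     35, 11_600_000,   7_400),
    ("Donald J. Trump for President, Inc.",      5_396, 11_300_000,  83_700),
    ("4Ocean, LLC",                                 78, 10_600_000,  68_200),
    ("Care2",                                      557, 10_100_000,  99_900),
]

# Derived metric: minimum impressions obtained per dollar of minimum spend.
by_efficiency = sorted(advertisers, key=lambda a: a[2] / a[3], reverse=True)

for name, ads, impressions, dollars in by_efficiency[:3]:
    print(f"{name}: {impressions / dollars:,.0f} impressions per dollar")
```

On these minimums, NowThis stands out: its 35 ads earned far more impressions per dollar than any other advertiser in the top ten.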
The data also reveal substantial online advertising by candidates in congressional and state races.
The researchers found Facebook and Instagram users viewed political ads at least 1.4 billion times—and impressions may have reached nearly 3.9 billion. (Facebook’s data provide only ranges.)
Advertisers targeted males aged 25-34 most frequently. The most ads per capita appeared in Washington, DC, followed by Nevada, Colorado, and Maine. The fewest appeared in Delaware, Nebraska, and New Hampshire.
A heat map showing how Facebook political advertising varied widely from state to state. (Credit: NYU)
Political spending equaled at least $13.9 million and could have been five times that—the uncertainty is due to the ranges provided in the original data. A significant number of ads—43,573—did not comply with Facebook’s new requirement that political ads list sponsors and were therefore shut down, but the researchers’ daily archiving captured these “unvetted sponsor” ads.
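The wide uncertainty in the totals follows directly from how Facebook reports the data: each ad’s impressions and spend come only as a range, so aggregates can only be bounded from below and above. A minimal sketch of that bookkeeping, using hypothetical per-ad ranges:

```python
# Facebook reports per-ad figures as ranges (e.g. "10K-50K impressions"),
# so any aggregate total can only be bracketed, never pinned down.
# These per-ad dollar ranges are hypothetical, for illustration only.
ad_spend_ranges = [(0, 100), (100, 500), (500, 1_000), (100, 500)]

min_total = sum(lo for lo, hi in ad_spend_ranges)  # guaranteed floor
max_total = sum(hi for lo, hi in ad_spend_ranges)  # possible ceiling

print(min_total, max_total)  # the true total lies somewhere in between
```

Summing the lower bounds gives the “at least” figures quoted in the study; summing the upper bounds explains how the true spending could be several times the floor.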
The researchers note that some of the offenders may have been caught off guard by the policy change. They also note that while Facebook reduced the time it takes to shut down these ads from 26.4 days to 5.6 days, the delay remains longer than the ads typically run.
The team reported the top five unvetted sponsors as identified by Facebook and their minimum impressions and spending:
American AF: 253 ads, 8.2 million impressions, $103,800
National Rifle Association of America/NRA: 56 ads, 7.9 million impressions, $78,500
I’ll Go Ahead and Keep My Guns, Thanks (listed as a media company): 26 ads, 7.6 million impressions, $120,300
China Xinhua News: 44 ads, 6.8 million impressions, $6,000
Walmart: 18 ads, 5.8 million impressions, $51,900
Next, the team plans to use its complex data scraping methods to reveal similar information for Twitter.
Damon McCoy, an assistant professor of computer science and engineering at New York University, conceived the Online Political Ads Transparency Project to build easy-to-use tools to collect, archive, and analyze political advertising data.
Although Facebook became the first major social media company to launch a searchable archive of political advertising, for both Facebook and Instagram, in May 2018, McCoy found the archive required time-consuming manual searches. He decided to apply versions of the data scraping techniques he had previously used against criminals, including human traffickers who advertised and used Bitcoin.
McCoy and his team praised Facebook for its pioneering transparency in establishing a public archive and its plan to launch an API (application programming interface) that will enable large-scale analysis; however, Facebook has not specified when in 2018 it will launch this API.
“We wanted to quickly give voters easy tools to understand who is advertising and what they are advertising, as well as how much is being spent to influence votes and the targets of the ads,” McCoy says.
Collaborators on the Online Political Ads Transparency Project are from NYU.
Scientists have previously overlooked the astonishing physical strength of the thin outer membrane that clings to E. coli‘s stout cell wall, according to a new study.
For over a century, scientists have studied E. coli, one of the bacteria that cause food poisoning, as a model for fighting infections. Such research has led to a variety of antibiotics that penetrate the protective cell walls of bacteria to kill them.
The new research, however, reveals that E. coli has managed to keep a big secret about its defenses.
Scientists had long known that many bacteria have outer membranes. But until now, researchers thought of the membrane as a layer of shrink wrap that simply made it tougher to get antibiotics into cells. As the new study shows, however, the outer membrane physically protects the cell and could be a good target for a new class of antibacterial drugs.
“We’ve discovered that the outer membrane can act as a suit of armor that is actually stronger than the cell wall,” says K. C. Huang, an associate professor of bioengineering and of microbiology and immunology at Stanford University. “It’s humbling to think that this function had been hiding in plain sight for all these years.”
Huang says the findings suggest new infection-fighting strategies for the roughly half of all bacterial species that, like E. coli, have outer membranes.
“If we can attack the outer membrane, infectious bacteria will be pre-weakened for targeting with antibiotic treatments that disrupt cells in other ways,” he says.
Behind the shield
All bacteria have a cell wall that surrounds and protects the cell’s inner workings. Many decades ago, scientists discovered that E. coli and many other bacteria have an additional layer, called an outer membrane, that surrounds their cell walls.
Since its discovery, this outer membrane has been used to classify bacteria into those that do and do not react to a common staining technique, called a Gram stain. Bacteria with outer membranes do not react to the chemical stain and are called Gram-negative. Bacteria with naked cell walls react to the stain and are classified as Gram-positive.
Both kinds of bacteria can become infectious and, when this occurs, the presence or absence of an outer membrane can also help determine how responsive they will be to antibiotics. Gram-negative bacteria—which have outer membranes—tend to be more resistant to antibiotics.
“Scientists knew that outer membranes were chemical shields,” Huang says. “Thus, it was easy to relegate this third layer to an annoyance when dosing the cell with antibiotics.”
Tests of strength
In recent years, however, researchers have had clues that the outer membrane is more important than they’d thought. In one study, Huang’s lab removed E. coli‘s cell wall but left its outer membrane intact. Unsurprisingly, the bacteria lost their cucumber shape and became blobs. But a large fraction of these blobs survived, multiplied, and ultimately regenerated new cucumber-shaped E. coli.
“…a strong outer membrane is the difference between life and death…”
Enrique Rojas, a former postdoctoral scholar in Huang’s lab and first author of the new paper, says that study was a clue that the outer membrane must play important structural and protective roles.
“We just listened to the data. Science is about data, not dogma,” says Rojas, now an assistant professor of biology at New York University.
Over the last four years, the group members tested the outer membrane’s structural powers.
They suddenly collapsed the pressure inside the bacteria, but instead of causing the cell wall to massively shrink, as prevailing assumptions would have predicted, they found that the outer membrane was strong enough to almost entirely maintain E. coli‘s cucumber shape.
In other experiments, they put E. coli cells through two hours of rapid increases and decreases in pressure. E. coli cells normally shrug off these repeated insults and grow as if no changes at all had occurred. However, when the researchers weakened the outer membrane, cells died quickly.
“The presence or absence of a strong outer membrane is the difference between life and death,” Huang says.
The experiments identified a handful of components that give the outer membrane its surprising strength. Drugs that destabilize the deceptively thin outer layer could help destroy infectious bacteria, Huang says.
Huang adds that the findings are part of an emerging field of study called mechanobiology. Whereas scientists once viewed cells as sacks of chemicals to study by chemical means, today a confluence of tools reveals the infinitely complex structural properties that make cells and organs tick.
“It’s a very exciting time to be studying biology,” Huang says. “We are approaching the point at which our tools and techniques are becoming precise enough to discern, sometimes at almost the atomic level, the physical rules that give rise to life.”
Additional coauthors are from Stanford; the University of California, San Francisco; and the University of Wisconsin-Madison.
Funding for the research came from the National Institutes of Health; the National Science Foundation; the Stanford Systems Biology Center and Simbios Center for Physics-Based Computation at Stanford; the Howard Hughes Medical Institute; the Swiss National Science Foundation; and the Allen Discovery Center program through the Paul G. Allen Frontiers Group.
Sending tests in the mail can boost rates of colorectal cancer screening, research shows.
In collaboration with the Mecklenburg County Health Department in Charlotte, researchers with UNC Lineberger’s Carolina Cancer Screening Initiative examined the impact of targeted outreach to more than 2,100 people insured by Medicaid who were not up-to-date with colorectal cancer screening.
The project resulted in a nearly 9 percentage point increase in screening rates for patients who received a screening kit in the mail compared with patients who received only a reminder, and it demonstrated that the method could serve as a model to improve screening on a larger scale. The findings appear in the journal Cancer.
50,600 deaths each year
The American Cancer Society estimates that more than 97,000 people will receive a colorectal cancer diagnosis in the United States this year, and the disease will result in approximately 50,600 deaths. It is the third most common type of cancer in the United States, and the second leading cause of cancer death.
“Preventive care amongst vulnerable populations rarely rises to the top of the mental queue of things that need to get done.”
While colorectal cancer screening has proven effective in reducing cancer deaths, researchers report too few people are getting screened. Current guidelines from ACS recommend regular screening with either a high-sensitivity stool-based test or a structural (visual) exam for average-risk people aged 45 years and older, and that colonoscopy should follow all positive results.
Despite these recommendations, studies have identified notable gaps in screening rates, including by race, geographic region, and other socioeconomic factors. Among patients who are insured, people with Medicaid have the lowest rates of colorectal cancer testing.
“There has been a national push to increase colorectal cancer screening rates since colorectal cancer is a preventable disease, but screening rates are only about 63 percent, and low-income and otherwise vulnerable populations tend to be screened at even lower rates,” says first author Alison Brenner, research assistant professor in the UNC School of Medicine’s internal medicine department.
Test kits in the mail
For the project, researchers mailed either reminders about colorectal cancer screening with instructions on how to arrange one with the health department, or reminders plus a fecal immunochemical test, or FIT kit, which can detect blood in the stool—a possible sign of colon cancer. The patient completes the test at home and returns it to a provider for analysis. Patients with a positive FIT result are then scheduled for a colonoscopy.
The researchers worked with the Mecklenburg County Health Department staff, who coordinated the reminders and mailings and ran the test analyses. They also partnered with Medicaid care coordinators to provide patient navigation support to patients who had abnormal test results and required a colonoscopy.
Twenty-one percent of patients who received FIT kits in the mail completed the screening test, compared with 12 percent of patients who just received a reminder. Eighteen people who completed FIT tests had abnormal results, and 15 of those people were eligible for a colonoscopy. Of the 10 who completed the colonoscopy, one patient had an abnormal result.
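The screening funnel reported above can be laid out as a quick sanity check, using the rounded figures from the study:

```python
# Completion rates from the study (rounded figures as reported).
mailed_fit_completion = 0.21  # completed screening among those mailed a FIT kit
reminder_completion = 0.12    # completed screening among reminder-only patients

# Difference in percentage points: the "nearly 9 point" gain in the paper.
gap_points = round((mailed_fit_completion - reminder_completion) * 100)

# Downstream funnel among completed FIT tests.
abnormal = 18   # FIT tests with abnormal results
eligible = 15   # of those, patients eligible for colonoscopy
completed = 10  # colonoscopies actually completed

print(gap_points, completed / eligible)
```

Note that only two-thirds of eligible patients completed the follow-up colonoscopy, which is why the study also emphasized patient navigation support after abnormal results.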
“Preventive care amongst vulnerable populations rarely rises to the top of the mental queue of things that need to get done,” Brenner says. “In North Carolina, many Medicaid recipients are on disability. Making something like colorectal cancer screening as simple and seamless as possible is really important. If it’s right in front of someone, it’s more likely to get done, even if there are simple barriers in place.”
Brenner says the study shows the potential to harness resources like the county health department for health prevention services.
The researchers plan to move forward to study whether they can implement their approach on a larger scale, and to understand all of the cost implications, says Stephanie Wheeler, associate professor in the UNC Gillings School of Global Public Health and the study’s senior author.
“This is looking at expanding the medical neighborhood—to harness community resources to target patients and in this case, insured patients, who are maybe not getting this from a primary health care organization, and how to increase screening rates in these types of vulnerable populations,” Brenner says.
UNC Lineberger supported the study through a Tier 2 Stimulus Award, the Centers for Disease Control and Prevention, and the National Cancer Institute Cancer Prevention and Control Research Network. Individual researchers had support from the National Institutes of Health and the University of North Carolina Royster Society of Fellows.
While the United States is deeply divided on many issues, there is remarkable consensus on climate change, according to new research.
“But the American people are vastly underestimating how green the country wants to be,” says Jon Krosnick, a professor of communication and of political science at Stanford University, about new findings from a poll he led on American attitudes about climate change.
“The majority doesn’t realize how many people agree with them…”
Researchers conducted the study with ABC News and Resources for the Future, a Washington, DC-based research organization. They polled a representative sample of 1,000 American adults nationwide from May 7 to June 11, 2018. The margin of error is +/- 3.5 percentage points.
The poll showed that Americans don’t realize how much they agree about global warming: 74 percent of Americans believe the world’s temperature has been rising, but respondents guessed that only 57 percent do.
“The majority doesn’t realize how many people agree with them,” says Krosnick. “And this may have important implications for politics: If people knew how prevalent green views are in the country, they might be more inclined to demand more government action on the issue.”
Breaking the numbers down along party lines, although Republicans and Democrats differ on the issue, the poll revealed that the gap is not as large as people perceive.
For example, 57 percent of Republicans believe the world’s temperature has probably been increasing over the past 100 years, and 66 percent believe that humans either mostly or partly caused the increase. However, respondents—who included Republicans, Democrats, and independents—thought only 43 percent of Republicans believe that the world’s temperature has probably been going up.
Respondents also underestimated Democrats’ opinions. Respondents thought 69 percent of Democrats believed global warming has probably been happening, but in reality, the proportion is much higher at 89 percent.
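The same perception gap shows up for every group in the poll. Laying the reported figures side by side makes the pattern explicit:

```python
# Share who believe warming is (probably) happening vs. what respondents
# guessed that share to be, in percent (figures from the poll).
actual  = {"all adults": 74, "Republicans": 57, "Democrats": 89}
guessed = {"all adults": 57, "Republicans": 43, "Democrats": 69}

# A positive gap means the public underestimates agreement in that group.
gaps = {group: actual[group] - guessed[group] for group in actual}
print(gaps)
```

Every gap is positive: whatever the group, Americans guessed belief in warming to be 14 to 20 points lower than it actually is.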
Steady belief in climate change
“Public belief in the existence and threat of global warming has been strikingly consistent over the last 20 years, even in the face of a current administration skeptical about climate change,” says Krosnick, who has been tracking public opinion about global warming since 1995.
“…Americans continue to send a strong signal to government about their preferences on this issue.”
To coincide with the release of the 2018 survey data, Krosnick has launched a comprehensive website with findings from surveys he has conducted over 20 years. Included are detailed graphs that show how attitudes toward climate issues and policy have trended over time.
Among the most striking findings of the new poll is that the proportion of Americans who say the issue is extremely important to them personally is at an all-time high: 20 percent (up 7 points from 2015), with 56 percent saying it’s either very important or somewhat important.
“Twenty percent of Americans might seem like a small group, but these are people who wake up every morning saying, ‘Another day, another opportunity to do something about climate change,'” Krosnick says. These people are overwhelmingly on the green side of the issue: Some 68 percent say that government should do more. “These are the folks who put pressure on government to take action, and that group has been growing.”
What policies Americans support
The researchers also asked survey participants about what climate policies they support.
Despite US withdrawal from the Paris Climate Agreement, some 81 percent of respondents believe that the country should try to cut the greenhouse gases that trap heat in the Earth’s atmosphere to meet the target in that agreement. Most of the greenhouse gas emitted today is carbon dioxide, which is released by burning fossil fuels (coal, natural gas, and oil).
One option to reduce greenhouse gas accumulations is to regulate those emissions through taxation.
More than two-thirds of survey respondents (67 percent) say the federal government should require companies to pay taxes for every ton of greenhouse gases they emit. In addition, some 78 percent say that a tax should be levied on oil, coal, or natural gas imported by a company from another country.
“Large majorities support some policy approaches and oppose others,” Krosnick says. “For example, the public objects to increasing taxes on gasoline and electricity designed to reduce consumption, perhaps because those taxes guarantee an increase in what consumers pay without a guarantee that emissions will actually be reduced.”
In the survey, people overwhelmingly favored renewable energy over the traditional oil industry. For example, 81 percent support tax breaks to companies that produce electricity from water, wind, and solar power. Americans also see an opportunity for future employment within this sector: 69 percent say the better way for the government to encourage job creation is by developing renewable energy rather than encouraging fossil fuel use.
The researchers also found broad distrust in the traditional energy sector. For example, 78 percent believe that oil companies have not been honest about their products’ role in global warming and think the companies have tried to cover it up. Their doubt is also reflected when it comes to creating American jobs: Only 21 percent believed that protecting the traditional energy industry was the better way for job growth.
Futurity, by Carsten Munk Hansen, University of Copenhagen
Researchers have discovered the charred remains of a flatbread that hunter-gatherers baked 14,400 years ago. It is the oldest direct evidence of bread found to date, predating the advent of agriculture by at least 4,000 years.
The findings suggest that bread production based on wild cereals may have encouraged hunter-gatherers to cultivate cereals, and thus contributed to the agricultural revolution in the Neolithic period.
Baking bread before farming
The researchers found the charred food remains at Shubayqa 1, a 14,400-year-old Natufian hunter-gatherer site in the Black Desert of northeastern Jordan.
“The presence of hundreds of charred food remains in the fireplaces from Shubayqa 1 is an exceptional find, and it has given us the chance to characterize 14,000-year-old food practices,” says first author Amaia Arranz Otaegui, an archaeobotanist at the University of Copenhagen.
Amaia Arranz-Otaegui and Ali Shakaiteer sampling cereals in the Shubayqa area. (Credit: Joe Roe via U. Copenhagen)
“The 24 remains analyzed in this study show that wild ancestors of domesticated cereals such as barley, einkorn, and oat had been ground, sieved, and kneaded prior to cooking. The remains are very similar to unleavened flatbreads identified at several Neolithic and Roman sites in Europe and Turkey. So we now know that bread-like products were produced long before the development of farming,” Otaegui says.
“The next step is to evaluate if the production and consumption of bread influenced the emergence of plant cultivation and domestication at all.”
“…extremely time-consuming production of bread based on wild cereals may have been one of the key driving forces behind the later agricultural revolution…”
“Natufian hunter-gatherers are of particular interest to us because they lived through a transitional period when people became more sedentary and their diet began to change,” explains Tobias Richter, an archaeologist at the University of Copenhagen who led the excavations at Shubayqa 1 in Jordan.
“Flint sickle blades as well as ground stone tools found at Natufian sites in the Levant have long led archaeologists to suspect that people had begun to exploit plants in a different and perhaps more effective way. But the flat bread found at Shubayqa 1 is the earliest evidence of bread making recovered so far, and it shows that baking was invented before we had plant cultivation. So this evidence confirms some of our ideas,” Richter says.
“Indeed, it may be that the early and extremely time-consuming production of bread based on wild cereals may have been one of the key driving forces behind the later agricultural revolution where wild cereals were cultivated to provide more convenient sources of food,” he says.
Lara Gonzalez Carratero, a PhD candidate at the Institute of Archaeology at University College London and an expert on prehistoric bread, analyzed the charred food remains with electron microscopy.
“That [bread] was produced before farming methods suggests it was seen as special…”
“The identification of ‘bread’ or other cereal-based products in archaeology is not straightforward. There has been a tendency to simplify classification without really testing it against identification criteria. We have established a new set of criteria to identify flatbread, dough, and porridge-like products in the archaeological record. Using scanning electron microscopy, we identified the microstructures and particles of each charred food remain,” says Gonzalez Carratero.
“Bread involves labor intensive processing which includes dehusking, grinding of cereals, and kneading and baking,” explains Dorian Fuller, a professor of archaeobotany. “That it was produced before farming methods suggests it was seen as special, and the desire to make more of this special food probably contributed to the decision to begin to cultivate cereals.
“All of this relies on new methodological developments that allow us to identify the remains of bread from very small charred fragments using high magnification,” Fuller says.
The Independent Research Fund Denmark funded the Shubayqa project. The Department of Antiquities of Jordan granted permission to excavate. Additional researchers from the University of Copenhagen, University College London, and the University of Cambridge contributed to the project.
Sugar improves memory in older adults—and makes them more motivated to perform difficult tasks at full capacity—according to new research.
The study finds that increasing blood sugar levels not only improves memory and performance, but also makes older adults feel happier during a task.
“Over the years, studies have shown that actively engaging with difficult cognitive tasks is a prerequisite for the maintenance of cognitive health in older age. Therefore, the implications of uncovering the mechanisms that determine older adults’ levels of engagement cannot be understated,” says study leader Konstantinos Mantantzis, a PhD student from the psychology department at the University of Warwick.
The researchers gave younger (18-27) and older (65-82) participants a drink containing a small amount of glucose, and got them to perform various memory tasks. Other participants were given a placebo drink containing artificial sweetener.
The researchers measured participants’ levels of engagement with the task, their memory score, mood, and their own perception of effort.
They found that increasing energy through a glucose drink can help both young and older adults to try harder compared to those who had the artificial sweetener. For young adults, that’s where the effects ended, though: glucose did not improve either their mood or their memory performance.
However, older adults who had a glucose drink showed significantly better memory and more positive mood compared to older adults who consumed the artificial sweetener.
Moreover, although objective measures of task engagement showed that older adults in the glucose group put more effort into the task than those who consumed the artificial sweetener, their own self-reports showed that they did not feel as if they had tried any harder.
The authors conclude that short-term energy availability in the form of raised blood sugar levels could be an important factor in older adults’ motivation to perform a task at their highest capacity.
Heightened motivation, in turn, could explain the fact that increased blood sugar levels also increase older adults’ sense of self-confidence, decrease self-perceptions of effort, and improve mood. However, more research is needed to disentangle these factors in order to fully understand how energy availability affects cognitive engagement, and to develop clear dietary guidelines for older adults.
“Our results bring us a step closer to understanding what motivates older adults to exert effort and finding ways of increasing their willingness to try hard even if a task seems impossible to perform,” says coauthor Friederike Schlaghecken, also from the psychology department.
Structural changes they induce in the brain suggest that psychedelics, such as LSD and MDMA, may be capable of repairing the circuits that malfunction in mood and anxiety disorders.
A wide range of psychedelic drugs increase the number of neuronal branches (dendrites), the density of small protrusions on these branches (dendritic spines), and the number of connections between neurons (synapses), report researchers in the journal Cell Reports.
“Ketamine is no longer our only option.”
“People have long assumed that psychedelics are capable of altering neuronal structure, but this is the first study that clearly and unambiguously supports that hypothesis. What is really exciting is that psychedelics seem to mirror the effects produced by ketamine,” says David Olson, assistant professor in the departments of chemistry and of biochemistry and molecular medicine at the University of California, Davis, who leads the research team.
Ketamine, an anesthetic, has been receiving a lot of attention lately because it produces rapid antidepressant effects in treatment-resistant populations, leading the US Food and Drug Administration to fast-track clinical trials of two antidepressant drugs based on ketamine. The antidepressant properties of ketamine may stem from its tendency to promote neural plasticity—the ability of neurons to rewire their connections.
“The rapid effects of ketamine on mood and plasticity are truly astounding. The big question we were trying to answer was whether or not other compounds are capable of doing what ketamine does,” Olson says.
Olson’s group has demonstrated that psychedelics mimic the effects of ketamine on neurons grown in a dish, and that these results extend to structural and electrical properties of neurons in animals. Rats treated with a single dose of DMT—a psychedelic compound found in the Amazonian herbal tea known as ayahuasca—showed an increase in the number of dendritic spines, similar to that seen with ketamine treatment.
DMT itself is very short-lived in the rat: Most of the drug is eliminated within an hour. But the “rewiring” effects on the brain could be seen 24 hours later, demonstrating that these effects last for some time.
Behavioral studies also hint at the similarities between psychedelics and ketamine. In another recent paper published in ACS Chemical Neuroscience, Olson’s group showed that DMT treatment enabled rats to overcome a “fear response” to the memory of a mild electric shock. This test is considered to be a model of post-traumatic stress disorder, or PTSD, and interestingly, ketamine produces the same effect.
Recent clinical trials have shown that like ketamine, DMT-containing ayahuasca might have fast-acting effects in people with recurrent depression, Olson says.
These discoveries potentially open doors for the development of novel drugs to treat mood and anxiety disorders, Olson says. His team has proposed the term “psychoplastogen” to describe this new class of “plasticity-promoting” compounds.
“Ketamine is no longer our only option. Our work demonstrates that there are a number of distinct chemical scaffolds capable of promoting plasticity like ketamine, providing additional opportunities for medicinal chemists to develop safer and more effective alternatives,” Olson says.