Liver disease deaths are growing more common in the U.S. and disproportionately affecting younger Americans, according to a recent study.
The paper, published in The BMJ just a day after a Centers for Disease Control and Prevention (CDC) report on rising liver cancer death rates, paints a troubling picture of how Americans’ drinking habits may be affecting their health. While the new study couldn’t prove causation, the researchers say drinking is likely to blame for the growing number of adults aged 25 to 34 who are dying from cirrhosis, or scarring of the liver.
The researchers used deaths logged in the CDC’s WONDER database between 1999 and 2016 to determine mortality trends during that period. In 2016 alone, more than 34,000 people died of cirrhosis, a 65% increase over the study period. Rates of hepatocellular carcinoma, the most common form of liver cancer, also doubled, with more than 11,000 people dying of the disease that year.
Although nearly every demographic saw increases in cirrhosis beginning in 2009, after a period of decline from 1999 to 2008, the trends were particularly stark among certain demographics.
Younger Americans, for example, saw the largest increase in their cirrhosis death rate (an average of 10.5% per year), even though older age groups still experience more deaths overall. Cirrhosis now accounts for about 1.4% of deaths in the 25-34 age group, largely driven by this population’s drinking habits, according to the paper.
A separate study published this week found that younger adults are at particularly high risk of starting and sustaining problem drinking habits. And The BMJ study’s authors note that the cirrhosis mortality trend’s start in 2009 — just after the 2008 financial crisis — is in line with research that has found young men to be particularly susceptible to alcohol misuse after unemployment or financial strain.
Native Americans, white Americans and Hispanic Americans have also seen significant increases in liver disease death rates since 1999, the paper says. Geographically, cirrhosis is growing particularly common in Kentucky, New Mexico, Arkansas, Indiana and Alabama.
Liver cancer, meanwhile, is on the decline among younger Americans, and on the rise among those in older age groups, according to the study — a finding consistent with the CDC’s recent report. Liver cancer deaths were most common among Asians and Pacific Islanders, but that group was also the only one to see a slight dip in its death rate during the study period.
The two recent reports add to a growing body of evidence that Americans’ drinking habits have grown increasingly problematic in recent years. A March CDC report, for example, found that 17% of the U.S. population binge drinks, and a February editorial also published in The BMJ blamed alcohol misuse, along with drugs and suicide, for a recent drop in U.S. life expectancy.
“The increasing mortality due to cirrhosis and hepatocellular carcinoma speak to the expanding socioeconomic impact of liver disease,” the BMJ authors write. “Adverse trends in liver related mortality are particularly unfortunate given that in most cases the liver disease is preventable.”
Bayer announced Friday that it will no longer sell a controversial, permanent form of birth control that thousands of women say led to serious complications.
Essure, a non-surgical sterilization device that is inserted into the fallopian tubes and prevents pregnancy by producing scar tissue that blocks sperm from fertilizing eggs, will no longer be available in the U.S. after December 31, 2018, Bayer announced in a statement. “This decision is based on a decline in U.S. sales of Essure in recent years and the conclusion that the Essure business is no longer sustainable,” the statement reads.
While Bayer maintains that “the safety and efficacy of Essure have not changed,” many women have different stories. More than 16,000 lawsuits have been filed against Bayer in relation to Essure, citing complications ranging from migraines and hair loss to organ perforation and unintended or dangerous pregnancies, according to ConsumerSafety.org. Many of these women have chronicled their ordeals in Essure Problems, a nearly 37,000-member Facebook group for women who are “suffering or have suffered from side effects which may be attributed to Essure.”
The Food and Drug Administration (FDA), which approved Essure in 2002, has also taken steps to warn women of the risks associated with the device. In 2016, it ordered Bayer to conduct a post-market safety study and to add warning labels and a patient risk checklist to the device’s packaging. This April, the FDA significantly restricted sales and distribution of Essure.
“Since the FDA ordered Bayer to conduct the post-market study and then to add a boxed warning and a Patient Decision Checklist to the labeling, there has been an approximate 70 percent decline in sales of Essure in the U.S.,” FDA Commissioner Scott Gottlieb said in an April statement. About 750,000 women have been implanted with the device since its approval.
Bayer, however, maintains that other factors are behind the product’s drop in popularity. “Several factors have contributed to declining interest in Essure among women in the U.S., including decreased use of permanent contraception overall, increased reliance on other birth control options, such as long-acting reversible contraceptives (LARCs), and inaccurate and misleading publicity about the device,” the company said in its statement.
Bayer halted sales of Essure in every country besides the U.S. last fall.
Researchers have long touted the mood-boosting effects of green space and spending time outdoors — and a new study emphasizes just how much of an impact your environment can have on your mental health.
The paper, published Friday in JAMA Network Open, found an association between urban restoration efforts in Philadelphia and the mental health of city residents. “Cleaning and greening” urban lots in Philadelphia was linked to a drop in neighborhood residents feeling depressed or worthless, and a slight uptick in overall resident mental health, the study says.
“Vacant lot greening is a very simple structural intervention that’s relatively low-cost and that can have a potentially wide or broad population impact,” says study co-author Dr. Eugenia South, an assistant professor of emergency medicine at the University of Pennsylvania’s Perelman School of Medicine. “Performing simple interventions to the neighborhood environment has an impact on health.”
For the study, a team of researchers identified 541 vacant lots in Philadelphia and divided them into clusters: groups of lots within a quarter-mile radius that all showed signs of urban blight, like illegal dumping, abandoned cars and overgrown vegetation. Next, they interviewed 442 adults living within one of these clusters. People were told they had been chosen for a study focused on “improving our understanding of urban health,” and answered questions about mental health. They did not know the researchers were involved in forthcoming urban greening efforts.
After the initial surveys were completed, the researchers randomly selected 37 lot clusters for a greening intervention that involved removing trash and debris, planting grass and trees, installing a fence and performing routine maintenance. Another 36 clusters had trash removed and received minor maintenance, but little in the way of added green space. The final 37 were left untouched.
Within 18 months of completing the restoration efforts, the researchers re-interviewed 342 of the original study participants, about a third of whom lived near one of the clusters assigned to the greening intervention. Compared to people who lived near lots with no improvements, these people experienced a 41% drop in depressive feelings and an almost 51% drop in feelings of worthlessness. Overall improvements to mental health didn’t quite reach statistical significance, but South says the researchers are “pretty confident that people are experiencing better mental health.”
The study’s results suggest that there’s something special about green space, as people living near a lot cluster that only went through trash removal did not see significant mental health benefits.
“The green space in and of itself is important,” South says. “There are several mechanisms through which that’s proposed to happen, including increased social connections and recovery from mental fatigue and coping with general life stress. The fact that it’s green space, and not, say, a parking lot, is important. The wooden fence also matters: That fence kind of delineates the space as a space that is now being cared for — it’s a space that people are paying attention to.”
Greening was particularly impactful for people living in neighborhoods below the poverty line. “The poorer neighborhoods are the most hard-hit, as far as the neighborhood environment being dilapidated and run down,” South says. “Those people are potentially the people who have the biggest health impact from the neighborhood environment, so making changes to this environment could have the biggest impact on them.”
Taken together, the results suggest that urban greening could offer a real opportunity for cities looking to improve population mental health, especially since it only cost about $1,600 to transform an abandoned lot, and $180 per year to maintain it, South says. She and her colleagues are already working with the Pennsylvania Horticultural Society to implement the program more widely in Philadelphia and says other cities have also expressed interest.
“This is not an end-all and be-all treatment for depression in any way — this goes along with other individual patient treatments — but when you think about the amount of money [spent on mental health care], it’s a pretty low-cost intervention,” South says.
A salmonella outbreak connected to raw turkey has affected 90 people in 26 states, the Centers for Disease Control and Prevention announced Thursday.
At least 40 people have been hospitalized due to the salmonella outbreak, according to the CDC. There are currently no deaths associated with this outbreak, which began last November.
It is unclear where the salmonella outbreak is coming from, and the CDC said it is still looking into which supplier or type of raw turkey product is linked to the outbreak. The agency is monitoring the outbreak in conjunction with the U.S. Department of Agriculture’s Food Safety and Inspection Service.
Lab tests have determined that contaminated raw turkey products from different sources are the cause of this salmonella outbreak. The people infected reported eating different types and brands of turkey products, with two victims reporting they lived in a household where raw turkey pet food was fed to their animals, the CDC said.
The salmonella outbreak affected people in Alaska, California, Colorado, Florida, Georgia, Hawaii, Iowa, Illinois, Indiana, Kansas, Kentucky, Massachusetts, Michigan, Minnesota, New Jersey, New York, North Carolina, Ohio, Oregon, Pennsylvania, South Carolina, South Dakota, Tennessee, Texas, Virginia and Wisconsin, according to the CDC. Minnesota had the most people affected by the salmonella outbreak with 13 people ranging in age from 1 to 91 years old getting sick.
People should handle raw turkey with care and always cook it thoroughly to prevent food poisoning. The CDC also advised consumers to wash their hands before and after preparing or eating any food made with raw turkey. At this time, the CDC hasn’t advised consumers to avoid eating fully cooked turkey products or told retailers to take those products off the shelves.
The most common symptoms of salmonella infection include diarrhea, fever and stomach cramps, which typically appear 12 to 72 hours after exposure to the bacteria, the CDC said. Illness typically lasts four to seven days, and most people recover without treatment. In severe cases, some patients may need to be hospitalized.
Scientists are learning that certain foods — either because of their natural ingredients or because of added chemicals — can have significant effects on health. One way in which food can exert this influence on health is through inflammation, which is triggered by the immune system and may have an impact on the risk of developing a number of chronic conditions.
Plenty of foods contribute to inflammation, from sugar to saturated fat. But in a new study, cured meats are under scrutiny. They usually contain nitrates, a group of chemicals used as a preservative to cure meats like jerky, meat sticks and hot dogs, and nitrates have also been linked to inflammation and unhealthy outcomes, including cancer and brain disorders.
Dr. Robert Yolken, professor of pediatrics at Johns Hopkins School of Medicine, and his colleagues have been investigating how exposure to certain environmental factors, including nitrates, might affect psychiatric disorders. In a study published in Molecular Psychiatry, he and his team studied 1,101 people, some of whom had psychiatric disorders and some who did not, who had also filled out questionnaires about whether they had eaten different types of food. (The survey did not collect information on how recently the people had consumed the foods, or how much they had eaten, but Yolken says most people answered based on their recent diets.) They were asked about cured meats, as well as about raw and uncooked meats.
Among people hospitalized for psychiatric disorders, those who ate cured meat, which included salami and various forms of dried meat sticks and jerky, had a nearly 3.5 times higher likelihood of having been admitted for mania, compared to people in the control group, who did not have psychiatric conditions. Mania, a state of abnormally elevated mood, energy and arousal, is most often associated with bipolar disorder. Certain kinds of cured meats, including beef or turkey jerky and meat sticks, were linked to the highest risk for mania, while prosciutto and dehydrated cured meats were not linked to a significantly increased risk.
Yolken says the association with nitrates was striking with mania only; the chemicals did not seem to be linked to a significant increase in other psychiatric disorders like bipolar disorder, schizophrenia or major depressive disorder. But, he notes, that may simply be due to the small number of people in the study affected by these conditions. He says that more research in larger populations is needed to determine if nitrates are linked to other psychiatric illnesses, and to confirm the association with mania.
He also says that the data do not suggest that eating foods high in nitrates will necessarily cause manic episodes. In the study, some people without a history of psychiatric disorders also consumed meats with nitrates. What the findings do suggest, however, is that nitrates might be one of the many factors that could contribute to mania, Yolken says. If that’s the case, controlling exposure may be one way to lower the risk of episodes, he says.
How nitrates contribute to manic episodes isn’t clear from the human data alone, but Yolken also conducted studies on rats to better understand how the chemicals might be affecting the brain. When he fed rats either normal chow or chow supplemented with a piece of commercial beef jerky, the animals eating the jerky began sleeping irregularly and became hyperactive within two weeks. He also worked with a jerky manufacturer to develop a nitrate-free jerky; animals fed this version did not develop hyperactivity or sleep disturbances, while rats fed commercial jerky containing nitrates (in the amount that would be found in a beef jerky stick or hot dog) did.
He also analyzed the intestinal bacteria in the animals, since the makeup of gut microbes can affect body processes like inflammation. Indeed, the animals eating the nitrate-enhanced diet showed different populations and patterns of bacteria than those not exposed to the chemicals. (It’s not known, however, what this bacterial shift might mean.)
If nitrates in the diet are affecting inflammation through gut bacteria, Yolken says, and if that inflammation is playing a role in psychiatric conditions like mania, then it’s possible that adjusting gut microbes with things like probiotics could be one way to affect people’s risk of psychiatric illnesses. In a previous study, Yolken and his team showed that people with bipolar disorder who were given certain probiotics were less likely to be re-hospitalized after a manic episode compared to people not treated with the probiotic.
Much more research is needed to determine the relationship — if there is one — between cured meats and psychiatric conditions. But Yolken is encouraged by what all of the data taken together could mean. “The question is how do we understand and control inflammation, and diet can certainly be one way,” he says. “It’s part of the overall question of how do we try to help people with mania and other psychiatric conditions — and one way may be to cut down on their environmental exposures.”
People diagnosed with cancer have a multitude of treatment options, many of which are standard therapies that have been well-studied to improve their chances of surviving their disease or avoiding recurrence.
But people are increasingly also folding complementary medicine approaches — which include nutrients, herbal remedies and other so-called natural supplements — into their cancer treatment regimens. While these are not nearly as well-studied as conventional therapies like surgery, radiation and chemotherapy, many people rely on them because they believe they can improve their chances of surviving their cancer or keeping recurrences at bay.
In a new study published in JAMA Oncology, researchers say that may not be the case. Dr. Skyler Johnson, chief resident in therapeutic radiology at Yale School of Medicine, and his colleagues analyzed data from nearly 1,300 people in the National Cancer Database with four common cancers: breast, prostate, lung or colorectal. Of those, 1,032 used only conventional medicine, and 258 used at least one conventional treatment and one or more complementary medicine strategies (which were recorded as “other unproven cancer treatments administered by nonmedical personnel”). These included IV, oral and topical therapies made up of vitamins, minerals or herbal supplements, says Johnson.
Johnson found that people who added a complementary medicine approach had twice the risk of dying during the nine-year study compared to people who only chose conventional treatment. The people who opted to add complementary medicine were also more likely to refuse surgery, chemotherapy, radiation and hormone therapy compared to people who did not. Johnson says it was this avoidance of recommended treatment that was largely responsible for the higher early death rate; among people who did all of their conventional treatments and complementary medicine therapies, there was no difference in survival.
“There is data showing that the majority of patients using complementary medicine therapies for cancer treatment are doing so because they believe they are going to improve their survival rates and cure rates,” says Johnson. “But they should know that there is no data suggesting that if you use these therapies you can improve your survival or cure rates.”
He says that when patients bring up complementary therapies, some doctors believe they “couldn’t hurt,” since they might make people feel more comfortable about their treatment. But “based on these data, there are clearly some issues with that approach,” he says.
He says there is some evidence from other studies that while many complementary approaches are “natural” and therefore believed to be safe, they can contain biologically active ingredients that can interact with chemotherapy and other cancer treatments, making them either less effective or even more toxic and detrimental to people’s health.
“We hope this information gives providers and patients pause to at least consider the chance that these complementary therapies might actually result in a detrimental effect,” he says.
The number of women who have a heart attack during pregnancy, labor or in the weeks following birth appears to be rising.
In a new study, published Wednesday in the journal Mayo Clinic Proceedings, researchers looked at more than 49 million births. Among the women who gave birth, 1,061 had a heart attack during labor and delivery, 922 had heart attacks during their pregnancy, and 2,390 had heart attacks after they gave birth.
Overall, the risk of having a heart attack was relatively low. But the risk increased 25% from 2002 to 2014, the researchers found, which they call a concerning rise. Among women who had a heart attack during or immediately following pregnancy, the in-hospital mortality rate was 4.5%, which the researchers say is surprisingly high, given that women of child-bearing age are otherwise considered at low risk for heart attacks.
“Although heart attacks in young women are rare, the time during and immediately after pregnancy is a particularly vulnerable period during which heart disease may be unmasked,” says study author Dr. Nathaniel Smilowitz, an interventional cardiologist and assistant professor of medicine at NYU Langone Health, in an email to TIME.
It’s still unclear what might be behind the increase in heart problems during and after birth, but the researchers speculate that there are a few contributing reasons. More women are having children later in life, and older women are more at risk for heart attacks than younger women. Compared to pregnant women in their 20s, pregnant women ages 35 to 39 have a five times greater risk of heart attack — and the risk is even higher for women over 40, the researchers note. Rates of conditions like obesity and diabetes, which increase a person’s risk of heart disease, are also much higher now than in the past. Part of the increase may also be due to better heart attack detection and data collection.
“All women should know their cardiovascular risk factors, such as high blood pressure, diabetes and obesity, and work with their doctors to control these factors before or early on during pregnancy,” says Smilowitz. “Women who are pregnant, or recently delivered, who develop chest pain or burning should recognize these warning signs and be evaluated by their doctor.”
With all the talk about the disease-fighting, life-extending superpowers of the Mediterranean diet, a lot of people are trying to cram more seafood into their meals. But while there are endless articles extolling the healthful glories of fatty, omega-3-rich fish like salmon and mackerel, there’s not much talk about shellfish—or whether these sea creatures deserve space on your shopping list.
As it turns out, they do. “Shellfish are high-quality protein sources—just like land animals—meaning they have all the essential amino acids,” says Faye Dong, professor emerita of food science and human nutrition at the University of Illinois. Those “essential” amino acids are ones your body can’t make on its own but are needed to support proper cellular function and muscle health—making them a crucial component of a healthy diet.
Like meat from land animals, shellfish have a range of cholesterol levels; shrimp, lobster and crab have a bit more than mussels, oysters and other mollusks. But there’s ongoing debate about whether dietary cholesterol really contributes to unhealthy levels of blood cholesterol. And, in any case, the amount in shellfish is much lower than in land animal sources of protein, like chicken or beef.
Shellfish meat is also low in fat. “And the fat it has falls under the category of healthy fats, meaning it’s low in saturated fat and high in the omega-3 fatty acids DHA and EPA,” says Ann Yaktine, director of the Food and Nutrition Board at the National Academy of Sciences. Shellfish aren’t nearly as impressive on the omega-3 front as salmon. But oysters, shrimp, crab, lobster and mussels have about 25% to 50% as many omega-3s per serving as the healthiest fatty fish.
Most shellfish also contain varying amounts of some hard-to-get micronutrients, depending on the type you’re eating. Selenium — a trace mineral important for cognitive and immune function — is most abundant in seafood. Shellfish are also rich sources of B vitamins, which help support nerve structure and cell function.
Finally, shellfish are good sources of some healthy minerals. Zinc, copper and iron can be difficult to get, Dong says, but they’re found in shellfish. Copper is especially abundant in lobsters and oysters, and it helps the body to make collagen, hemoglobin and other proteins necessary to human health and functioning. Zinc is important for immune function and wound healing, and oysters contain more zinc per serving than any other food. (Eat just two oysters, and you’ll meet the government’s recommended daily intake for zinc.) Clams, meanwhile, are great sources of iron.
That’s the good news. The bad news is that depending on the type of shellfish you’re eating and where it comes from, there are some potential contamination concerns.
Shrimp and lobster can accumulate heavy metals, namely lead and cadmium, a metal sometimes used in industrial manufacturing. And research suggests that frying (as opposed to boiling or steaming) can heighten an eater’s exposure. More research has turned up elevated levels of cadmium in some Pacific oysters.
There’s also some concern about the use of antibiotics in shrimp, says Larry Olmsted, author of Real Food, Fake Food, a book about food provenance and safety. Shrimp is the most popular seafood in America, but much of it is farmed in Southeast Asia at facilities that are known to use banned antibiotics, says Olmsted. “If I buy shrimp, I get wild caught shrimp from the Gulf of Mexico,” he says. If you’re concerned about over-fishing and sustainability, Olmsted recommends looking for the blue Marine Stewardship Council (MSC) seal. MSC is a non-profit that audits seafood producers for sustainable fishing practices.
Finally, one of your best safeguards against any potential risk is to vary the types of shellfish you consume. “Moderation goes a long way,” Dong says, adding that you’re probably safe eating any type of shellfish once or twice a week. “Even if you’re eating something that’s contaminated, your body can clear that out,” she says.
Yaktine agrees. “Variety is a good rule of thumb,” she says. In terms of food safety, nutrition and the avoidance of over-fishing a species, “spreading the wealth” when it comes to dining on shellfish is a prudent course.
Though it could not prove causation, a new study, published Tuesday in JAMA, found an association between heavy screen use among teenagers and symptoms of attention-deficit/hyperactivity disorder (ADHD). The disorder is characterized by difficulty paying attention, paired with hyperactivity and impulsive behavior, according to the Mayo Clinic.
In 2014, a group of researchers surveyed more than 3,000 California 10th graders about their digital media use and their self-reported frequency of symptoms that could indicate ADHD, such as difficulty completing tasks and trouble staying still. They then followed up with the roughly 2,600 teenagers who met the study criteria and did not initially show significant signs of ADHD every six months for the rest of high school, tracking their media habits and looking for signs of ADHD.
In the first survey, students reported engaging in an average of 3.62 digital media activities “many times per day”; checking social media, texting and browsing images or videos were the most common high-frequency behaviors. Each additional high-frequency habit reported at the start of the study was associated with a roughly 10% higher chance of that student later developing ADHD symptoms, the researchers found. Throughout the study period, 9.5% of the 114 students who said they had seven or more high-frequency digital habits at baseline developed ADHD symptoms, compared to 4.6% of the 495 students who said they did not frequently engage in any digital behaviors.
The researchers note in the paper that the study does not prove that screen use actually causes ADHD symptoms; it merely shows that the two are likely related. It’s possible, for example, that students with signs of ADHD are more likely than others to check their phones frequently. It’s also possible that some students who later tested positive for symptoms of ADHD already had those characteristics, though potentially at an undiagnosable severity, during baseline testing. Finally, separate environmental factors — such as parental influence — could play a role in both media use and attention style.
Still, the connection between digital media and ADHD shouldn’t be ignored, according to the researchers. The constant pinging of text messages and other notifications “could disrupt normative development of sustained attention and organization skills,” while the immediate reward of always being able to find new content online “could disrupt development of impulse control and patience,” they write.
More research will be required to determine the full extent of digital media’s influence on attention and focus, and more still to decide on appropriate public health interventions, if necessary. For now, parents can consult the American Academy of Pediatrics’ screen use guidelines, which recommend no more than an hour of daily screen time for kids ages 2 to 5, and “personalized media use plans” that allow plenty of time for exercise, sleep, schoolwork and screen-free time for school-age kids and adolescents.
(NEW YORK) — The head of the nation’s top public health agency says the opioid epidemic will be one of his priorities, and he revealed a personal reason for it: His son almost died from taking cocaine contaminated with the powerful painkiller fentanyl.
“For me, it’s personal. I almost lost one of my children from it,” Dr. Robert Redfield Jr. told the annual conference of the National Association of County and City Health Officials.
The AP viewed a video of his speech, which he delivered Thursday in New Orleans. Redfield declined to speak about it Monday, except to say in a statement: “It’s important for society to embrace and support families who are fighting to win the battle of addiction — because stigma is the enemy of public health.”
Redfield mentioned his younger son while talking about his priorities for the U.S. Centers for Disease Control and Prevention, where he started as director in March. He listed the opioid crisis first, calling it “the public health crisis of our time.”
Public records show that the son, a 37-year-old musician, was charged with drug possession in 2016 in Maryland. The outcome of the case is not available in public records.
Dr. Umair Shah, the head of Houston’s county health department, applauded the CDC director’s moment of candor.
“It was definitely an intimate moment that grabbed the audience of public health professionals,” said Shah, who just finished a term as president of the association.
About 70,000 Americans died of drug overdoses last year, according to preliminary CDC numbers released last week. That’s a 10 percent increase from the year before.
Most of the deaths involved opioids, which are driving the deadliest drug overdose epidemic in U.S. history. Growing numbers of recent deaths have been attributed to fentanyl and fentanyl-like drugs, which are relatively cheap and are sometimes cut by suppliers into heroin, cocaine or other drugs without buyers’ knowledge.
AP News Researcher Jennifer Farrar in New York contributed to this report.