Wolbachia in the gut makes fruit flies wimpy

D. melanogaster

Rohrscheib et al. have a recent paper in Applied and Environmental Microbiology showing a new way that microbes can manipulate host behavior.

Male Drosophila inoculated with Wolbachia engaged in fewer aggressive behaviors than uninfected controls. In effect, Wolbachia transformed male fruit flies into pacifists. How does this happen? The passive behavior caused by Wolbachia involves downregulation of octopamine, a biogenic amine neurotransmitter that has previously been shown to modulate aggressive behavior in fruit flies.

Read the entire paper here:

Wolbachia Influences the Production of Octopamine and Affects Drosophila Male Aggression

Read the abstract here.

Less is more in blood transfusion

A recently published study in the Lancet continues to reinforce the view that less is more for blood transfusions.

Jairath and colleagues compared a restrictive approach (in which patients were transfused when hemoglobin concentration fell below 80 g/L) with a liberal approach (in which transfusion was initiated when hemoglobin concentration fell below 100 g/L). The liberal approach was not superior to the restrictive approach, suggesting we should tolerate lower hemoglobin levels in patients before starting blood transfusions.
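For concreteness, here is a minimal sketch of the two trial arms expressed as decision rules. The threshold values come from the study description above; the function and variable names are my own illustrations, not anything from the paper.

```python
# Minimal sketch of the two transfusion strategies as decision rules.
# Thresholds are from the study described above; the names and the
# interface are illustrative only.

RESTRICTIVE_THRESHOLD = 80   # g/L: transfuse when hemoglobin falls below this
LIBERAL_THRESHOLD = 100      # g/L

def should_transfuse(hemoglobin_g_per_l: float, strategy: str) -> bool:
    """Return True if the chosen strategy would trigger a transfusion."""
    threshold = RESTRICTIVE_THRESHOLD if strategy == "restrictive" else LIBERAL_THRESHOLD
    return hemoglobin_g_per_l < threshold

# A patient at 90 g/L is transfused under the liberal strategy only:
print(should_transfuse(90, "restrictive"))  # False
print(should_transfuse(90, "liberal"))      # True
```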

As far as I know, most studies along these lines have shown either improved outcomes or no difference when restrictive transfusion thresholds are compared to liberal ones. These data support a less interventionist perspective on blood transfusion.

Read the Lancet paper here.

Ignoring the brain’s host defense function – perilous for patients!


The brain matters but we act like it doesn’t

I have vivid memories of my training in the trauma intensive care unit, where some of my patients clung to life while receiving life support, especially an unfortunate young man who had suffered a shotgun injury to the abdomen and died a prolonged death. We grew increasingly concerned as the number of failing organ systems increased: lungs (function replaced by intubation and artificial ventilation), heart (augmented with vasopressors and inotropes), kidneys (replaced with hemofiltration or dialysis). I remember thinking that if I were in that situation, I would want to be unconscious to the horror of the ICU.

Is it really better for our patients to be made unconscious with potent sedative medications?

A new study indicates that, like many things we do in medicine, less is more when it comes to sedation in critical illness.

Balzer and colleagues performed a retrospective study of 1,884 patients who were sedated in the intensive care unit. The bottom line: deep sedation early in the ICU stay was associated with increased in-hospital mortality and with more deaths at two-year follow-up.

Hmm. Why?

By deeply sedating our patients, we are artificially inducing brain failure. Remember that the higher the number of failed organ systems, the higher the mortality. Now, we wouldn’t disable the central processing unit (CPU) in our new car and expect the vehicle to function normally, right? Why do we assume that we can turn off the biggest CPU of them all, the human brain, without incurring a cost?
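To make the organ-counting logic concrete, here is a toy sketch in Python. It is emphatically not a validated severity score (real ICUs use instruments such as the SOFA score); the field names are my own, and the point is only to show how counting deep sedation as induced brain failure changes the tally.

```python
# Toy illustration: count organ systems whose function is being
# replaced or augmented artificially. NOT a validated clinical score;
# it only makes the "more failing organs, worse outlook" logic explicit.

from dataclasses import dataclass

@dataclass
class OrganSupport:
    mechanical_ventilation: bool = False      # lungs
    vasopressors_or_inotropes: bool = False   # heart and circulation
    dialysis_or_hemofiltration: bool = False  # kidneys
    deep_sedation: bool = False               # brain, if counted as induced failure

def failing_systems(s: OrganSupport) -> int:
    """Number of organ systems on artificial support."""
    return sum([s.mechanical_ventilation,
                s.vasopressors_or_inotropes,
                s.dialysis_or_hemofiltration,
                s.deep_sedation])

# Counting deep sedation as induced brain failure adds one more
# failing system for the same patient:
patient = OrganSupport(mechanical_ventilation=True, vasopressors_or_inotropes=True)
print(failing_systems(patient))  # 2
patient.deep_sedation = True
print(failing_systems(patient))  # 3
```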

Maybe we shouldn’t take the brain off-line when it has an important job to do. What job is that, you ask?

One crucial job of the brain is to coordinate immune defenses.

A recent study by Kim et al. in Nature Immunology detailed one such way that the brain coordinates immunity. During Listeria infection in mice, tumor necrosis factor (TNF) built up in the cerebrospinal fluid of the brain. TNF in the brain then caused mobilization of lymphocytes in the spleen and epididymal fat, and this increase in peripheral lymphocytes was dependent on TNF signaling in the brain. This is one, but certainly not the only, way that the brain coordinates an immune defense.

Do we cause harm when we inhibit these defenses by deeply sedating our patients in the intensive care unit? This hypothesis has yet to be tested, and it should be. Much evidence points toward an immunosuppressive effect of sedatives. I suspect that immunity is handicapped in a number of ways in deeply sedated ICU patients. Immune compromise may also contribute to the excess mortality seen in patients given sedative hypnotics in the general population.

Read Kim et al.’s Nature Immunology (2015) article – Rapid linkage of innate immunological signals to adaptive immunity by the brain-fat axis – here.

And

Immune regulation: Brain-fat axis in adaptive immunity

And

Weich et al. 2014. BMJ. Effect of anxiolytic and hypnotic drug prescriptions on mortality hazards: retrospective cohort study

Intensive feeding in the ICU kills patients


Since we published our clinical brief on illness anorexia (Alcock and Legrand 2014), described in the last entry on this blog, a new trial has examined whether giving intensive nutrition to critically ill patients with lung injury helps survival. Based on what we proposed in EMPH (Evolution, Medicine, and Public Health), and also on the idea of immune brinksmanship, we would predict that intensive feeding would be harmful.

Braunschweig and colleagues (2015) studied whether intensive nutrition (attempting to replace most calorie and nutrient needs) improves survival in ICU patients with acute lung injury (ALI), compared to standard nutrition, in which patients receive less. These patients are intubated and cannot eat, so physicians and nutritionists make feeding decisions for them. Nutritionists can calculate how many calories the body needs, and many advocate for full replacement. However, because gut motility is often impaired in critical illness, most patients receive less than their calculated needs in current practice.
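As a back-of-the-envelope illustration of that arithmetic: the 25 kcal/kg/day estimate below is a common rule of thumb for critically ill adults, not the equation used in the trial, and the fractional targets are illustrative assumptions rather than the trial’s actual arms.

```python
# Back-of-the-envelope feeding arithmetic. The 25 kcal/kg/day figure
# is a common rule of thumb, not the trial's equation; the fractions
# below are illustrative assumptions, not the study arms.

KCAL_PER_KG_PER_DAY = 25.0  # rough estimate for a critically ill adult

def estimated_needs_kcal(weight_kg: float) -> float:
    """Weight-based estimate of daily energy needs."""
    return weight_kg * KCAL_PER_KG_PER_DAY

def feeding_target_kcal(weight_kg: float, fraction_of_needs: float) -> float:
    """Daily calorie target delivered as a fraction of estimated needs."""
    return estimated_needs_kcal(weight_kg) * fraction_of_needs

weight = 70  # kg
print(feeding_target_kcal(weight, 1.0))  # full replacement: 1750.0 kcal/day
print(feeding_target_kcal(weight, 0.6))  # hypocaloric feeding: 1050.0 kcal/day
```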

The bottom line: this study was terminated early because of increased deaths in the intensive feeding group. Trying to normalize calorie replacement kills critically ill patients with ALI.


Illness Anorexia


As we discussed in the last post, illness is accompanied by a dramatic decrease in eating, but also an increase in carbohydrate secretion in the gut. These events point towards a coordinated adaptive response to infection and illness that might improve survival when sick. So, should we give our patients less nutrition in the hospital when they are critically ill?

The idea of permissive underfeeding in critical illness was proposed by Zaloga and Roberts (1994). They hypothesized that maximizing nutrition “may adversely affect the host response to injury, especially when given in excess of energy and protein needs.” Supporting the view that calorie restriction during illness is protective, Huang et al. (2012) showed that patients with higher illness severity had longer ICU stays when feeding was initiated early rather than late. In an observational trial, Arabi and colleagues (2010) reported that giving higher calories to patients in the ICU was associated with worse outcomes. Similar results were found by Krishnan et al. (2003). In a randomized controlled trial, Arabi and colleagues (2011) found that underfeeding (60% of calculated calorie needs) versus normal (100%) replacement resulted in improved survival in ICU patients, in keeping with an adaptive function of illness anorexia.

Ed Legrand and I recently summarized why illness anorexia may be an evolved adaptive response, and its clinical implications, in Evolution, Medicine, and Public Health, available here.

Starve a fever

A study by Pickard and colleagues in Nature showed that exposure to lipopolysaccharide (LPS) causes anorexia. This component of sickness behavior is well known, and LPS administration is a commonly used model for illness anorexia. The figure below shows a dramatic decrease in energy intake after receiving LPS:

Both solid bars (Fut2+) and open bars (Fut2-) show markedly decreased food intake after LPS.

In a new and important finding, these authors also showed that LPS makes the host secrete carbohydrates onto the intestinal epithelium. This effect depended on production of fucosylated oligosaccharides by the fucosyltransferase encoded by Fut2. Fut2-positive mice produce these carbohydrates rapidly after the stress of LPS, with the consequence of directly feeding gut commensal microbiota.

Thus, LPS is an agonist for epithelial fucosylation and has the effect of diverting a nutrient source to commensal bacteria. Fucosylation benefits the host in this instance by feeding beneficial barrier bacteria. Selection may have favored this mutualism because commensal bacteria help prevent invasion by gut pathogens.

In other words, sickness makes you eat less, but you may continue to feed your gut microbes. Exposure to LPS induces “sugaring” of intestinal epithelial cells with fucosylated oligosaccharides, as seen in the fluorescence image below:

[Fluorescence image: fucosylated oligosaccharides on intestinal epithelial cells after LPS exposure.]

So indeed our bodies do starve a fever (LPS is a well-known pyrogen), but we feed our commensals. (Now, by we I mean mammals.) This was a mouse study, and humans differ in at least one important respect: some humans lack functional FUT2 and cannot secrete fucosylated oligosaccharides. FUT2-negative humans are at higher risk of inflammatory bowel disease and neonatal sepsis. Not surprisingly, FUT2 in humans shows evidence of ongoing natural selection.

The consequences of FUT2 variation in humans were covered in detail in a previous post on this blog. Read on.

Related: Faecal microbiota composition in adults is associated with the FUT2 gene determining the secretor status.

And: Innate Susceptibility to Norovirus Infections Influenced by FUT2 Genotype in a United States Pediatric Population.

Nutrients with intrinsic anti-pathogen activity are healthy

The gut microbiome is the most important force driving the evolution of dietary inflammation.

Humans have coevolved with commensal organisms and pathogens since our distant ancestors became multicellular. Today, our bodies are a habitat for a multitude of microbes and viruses, the majority of which inhabit the gut, making up a community known as the microbiome. These microbes number as many as 100 trillion, and their genes outnumber human genes by a ratio of more than ten to one. Thus, for as long as humans and our predecessors have been eating, we have shared the food we eat with the bacteria in our guts.

Nutrients have Jekyll-and-Hyde effects on the microbiome: they are sometimes helpful and sometimes harmful. Nutrients that enhance the barrier function of the gut and prevent pathogen colonization and growth have evolved a thrifty signaling function, reducing the costs of an immune defense. Nutrients that impair the barrier function of the gut and increase the risk of pathogen colonization and invasion have the opposite effect. Melissa Franklin, Chris Kuzawa, and I compiled evidence from studies showing that inflammatory signaling by nutrients compensates for the effects they have on the gut microbiome.

Since 2012, even more evidence suggests that the immune effects of nutrients match their influence on the microbiome. Luckily for us, some of the foods that kill pathogens and pathobionts and promote beneficial microbes are also some of the tastiest: coffee, chocolate, many fruits, and the healthy fats found in the Mediterranean diet.

Results of a recent study on the effects of chocolate matched our predictions exactly: decreased immune investment with nutrients that inhibit pathogens and promote protective species. Less IgA production accompanied beneficial changes in microbiome composition with cacao feeding.

Cacao reduced fecal IgA

Closed circles represent the control diet; triangles and open circles represent the cacao/polyphenol diets.

From the paper: “In general, cocoa diets inhibited the growth of Staphylococcus, Streptococcus, and Clostridium histolyticum/C. perfringens (belonging to the Firmicutes phylum) produced by age.”

Our 2012 paper, Nutrient signaling: evolutionary origins of the immune-modulating effects of dietary fat, is available in this previous post.

Reference: Massot-Cladera, Malen, et al. “Impact of cocoa polyphenol extracts on the immune system and microbiota in two strains of young rats.” British Journal of Nutrition 112.12 (2014): 1944-1954.