There’s an awful calculus that takes place in malaria-stricken regions of the world.
Due to malnutrition, children in these areas often suffer from iron-deficiency anemia, which can lead to serious cognitive and motor impairments. While iron supplementation may sound like an obvious solution, it comes with a serious catch.
Studies in mice and humans suggest that iron promotes malarial infection, likely by increasing the number of red blood cells—the target for the Plasmodium parasites that cause the disease.
More red blood cells mean more infection, and more infection means more inflammation. When the disease spreads to the brain in cerebral malaria, this inflammation can cause lasting neurological and cognitive damage in survivors.
This conundrum has left health experts at odds with each other about whether children in sub-Saharan Africa and other regions of the world where malaria is prevalent should get iron supplementation. More than 70 percent of malaria deaths occur in children under age 5. This year alone, according to the World Health Organization, the disease has killed more than 300,000 African children in this age group.
Many health experts who are focused on malaria say the best course is to allow iron-deficiency anemia to protect kids from the infection. Others say the compromise to brain development caused by too little iron in the diet isn’t a fair tradeoff. The World Health Organization recommends that children in malaria-endemic regions get iron supplements, along with aggressive monitoring and treatment for malaria.
But what if there were a third option—a way to supplement with iron without increasing the malaria risk?
A collaboration among three groups at Penn State is poised to bring this possibility to light, and it’s all thanks to an accidental discovery.
Learn more in this Penn State Medicine article.