The Biomarkers Reflecting Inflammation and Nutritional Determinants of Anemia (BRINDA) research group was formed over a decade ago to improve the interpretation of micronutrient biomarkers in settings with inflammation. The BRINDA inflammation adjustment method uses regression correction to adjust for the confounding effects of inflammation on select micronutrient biomarkers and has provided important insights into micronutrient research, policy, and programming. However, users may face challenges when applying the BRINDA inflammation adjustment methods to their own data due to varying guidance on the adjustment approach for different biomarkers and the need to develop statistical programming to conduct these analyses. This may result in lost opportunities to have results of micronutrient data readily available during critical decision-making periods. Our research objectives are to 1) provide an all-in-one summary of the BRINDA method for adjusting multiple micronutrient biomarkers for inflammation, 2) evaluate whether malaria as a binary variable should be included in the BRINDA inflammation adjustment method, and 3) present a standardized and user-friendly BRINDA adjustment R package and SAS macro. This paper serves as a practical guidebook for the BRINDA inflammation adjustment approach and helps users apply the BRINDA R package and SAS macro to streamline their analyses.
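The regression-correction idea can be sketched in a few lines. This is a minimal illustration, not the published BRINDA implementation: the reference values and coefficients below are assumptions for demonstration only (in practice, BRINDA derives reference values from the survey's lowest inflammation decile and the coefficients from regressing the log biomarker on log CRP and log AGP).

```python
import math

# Illustrative reference values only; BRINDA derives these from the
# survey's lowest CRP/AGP decile on the natural-log scale.
LN_CRP_REF = math.log(0.10)   # assumed reference CRP (mg/L)
LN_AGP_REF = math.log(0.59)   # assumed reference AGP (g/L)

def brinda_adjust(biomarker, crp, agp, beta_crp, beta_agp):
    """Inflammation-adjusted biomarker via regression correction.

    The adjustment is done on the natural-log scale and removes only the
    portion of CRP/AGP above the reference value, so individuals at or
    below the reference are left unchanged.
    """
    ln_adj = math.log(biomarker)
    ln_adj -= beta_crp * max(math.log(crp) - LN_CRP_REF, 0.0)
    ln_adj -= beta_agp * max(math.log(agp) - LN_AGP_REF, 0.0)
    return math.exp(ln_adj)
```

For a biomarker such as ferritin, which rises with inflammation, positive coefficients pull elevated values back down before comparing them against a deficiency cutoff.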
Improving maternal and child nutrition is central to global development goals and reducing the noncommunicable disease burden. Although the process of becoming malnourished starts in utero, the consequences of poor nutrition extend across the life cycle and into future generations. The global nutrition targets for 2025 include reducing infant and young child growth faltering, halting the increase of overweight children, improving breastfeeding practices, and reducing maternal anemia. In this review, we address nutritional assessment, discuss nonnutritive factors that affect growth, and endorse the evidence-based interventions that should be scaled up to improve maternal and child nutrition.
Abstract
Background
In the management of pediatric osteomyelitis or septic arthritis, delay in treatment may affect outcome, while receipt of antibiotics prior to culture may affect culture results. We aimed to determine if pathogen identification decreased in cultures that were pretreated with antibiotics.
Methods
We conducted a retrospective cohort study of 584 hospitalized children between 30 days and 18 years of age admitted to two tertiary children’s hospitals. Logistic regression assessed the effect of antibiotic duration on blood, bone, joint aspirate, and “other” culture positivity.
Results
Overall, 42% of blood cultures, 70% of bone cultures, 39% of joint cultures, and 70% of "other" cultures were positive. Compared with children who did not receive antibiotics prior to culture, there were no significant differences in the odds of a positive culture among children whose cultures were pretreated with antibiotics, for any of the culture types: OR (95% CI) 0.90 (0.56–1.44) for blood cultures, 0.77 (0.25–2.34) for bone cultures, 0.71 (0.39–1.28) for joint cultures, and 1.18 (0.58–2.41) for "other" cultures; all p > 0.05. Furthermore, the duration (hours) of antibiotics in the pretreated cultures was also not a significant predictor of culture positivity (OR ranged from 0.99 to 1.00 for all cultures, p > 0.05).
Conclusions
Culture positivity was not associated with antibiotic pretreatment in any of the samples, even for longer duration of antibiotics prior to culture, though the small sample size of subgroups is an important limitation. In pediatric patients hospitalized with osteomyelitis and/or septic arthritis, early initiation of antibiotics may not affect culture positivity.
Anemia in women of reproductive age (WRA) (age range: 15–49 y) remains a public health problem globally, and reducing anemia in women by 50% by 2025 is a goal of the World Health Assembly.
We assessed the associations between anemia and multiple proximal risk factors (e.g., iron and vitamin A deficiencies, inflammation, malaria, and body mass index) and distal risk factors (e.g., education status, household sanitation and hygiene, and urban or rural residence) in nonpregnant WRA.
Cross-sectional, nationally representative data from 10 surveys (n = 27,018) from the Biomarkers Reflecting Inflammation and Nutritional Determinants of Anemia (BRINDA) project were analyzed individually and pooled by the infection burden and risk in the country. We examined the severity of anemia and measured the bivariate associations between anemia and factors at the country level and by infection burden, which we classified with the use of the national prevalences of malaria, HIV, schistosomiasis, sanitation, and water-quality indicators. Pooled multivariate logistic regression models were constructed for each infection-burden category to identify independent determinants of anemia (hemoglobin concentration <120 g/L).
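As a rough illustration of the bivariate step, the hemoglobin cutoff from the text and a 2×2 odds ratio with a Wald confidence interval can be computed as follows. This is a sketch of the univariate association only, not the pooled multivariate model.

```python
import math

def is_anemic(hemoglobin_g_per_l):
    # Cutoff for nonpregnant women of reproductive age used in the text
    return hemoglobin_g_per_l < 120.0

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table with a Wald 95% CI on the log scale.

    a/b = cases/noncases among the exposed;
    c/d = cases/noncases among the unexposed.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, (math.exp(math.log(or_) - 1.96 * se),
                 math.exp(math.log(or_) + 1.96 * se))
```

A CI that excludes 1.0 corresponds to a significant bivariate association; the multivariate models in the text additionally condition on the other risk factors.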
Anemia prevalence was ∼40% in countries with a high infection burden and 12% and 7% in countries with moderate and low infection burdens, respectively. Iron deficiency was consistently associated with anemia in multivariate models, but the proportion of anemic women who were iron deficient was considerably lower in the high-infection group (35%) than in the moderate- and low-infection groups (65% and 71%, respectively). In the multivariate analysis, inflammation, vitamin A insufficiency, socioeconomic status, and age were also significantly associated with anemia, but malaria and vitamin B-12 and folate deficiencies were not.
The contribution of iron deficiency to anemia varies according to a country’s infection burden. Anemia-reduction programs for WRA can be improved by considering the underlying infection burden of the population and by assessing the overlap of micronutrient deficiencies and anemia.
•Selenium (Se) concentration is a potential biomarker for assessing Se status at a population level in a low Se-status population.
•Variation in urine and plasma Se concentration of women and school-aged children in Malawi corresponded between clusters.
•Urine Se concentration explained more of the between-cluster variation in plasma Se concentration when data were corrected for hydration status.
•Urine Se concentration is a less useful biomarker at an individual level.
Plasma selenium (Se) concentration is an established population level biomarker of Se status, especially in Se-deficient populations. Previously observed correlations between dietary Se intake and urinary Se excretion suggest that urine Se concentration is also a potentially viable biomarker of Se status. However, there are only limited data on urine Se concentration among Se-deficient populations. Here, we test if urine is a viable biomarker for assessing Se status among a large sample of women and children in Malawi, most of whom are likely to be Se-deficient based on plasma Se status. Casual (spot) urine samples (n = 1406) were collected from a nationally representative sample of women of reproductive age (WRA, n = 741) and school-aged children (SAC, n = 665) across Malawi as part of the 2015/16 Demographic and Health Survey. Selenium concentration in urine was determined using inductively coupled plasma mass spectrometry (ICP-MS). Urinary dilution corrections for specific gravity, osmolality, and creatinine were applied to adjust for hydration status. Plasma Se status had been measured for the same survey participants. There was between-cluster variation in urine Se concentration that corresponded with variation in plasma Se concentration, but not between households within a cluster, or between individuals within a household. Corrected urine Se concentrations explained more of the between-cluster variation in plasma Se concentration than uncorrected data. These results provide new evidence that urine may be used in the surveillance of Se status at the population level in some groups. This could be a cost-effective option if urine samples are already being collected for other assessments, such as for iodine status analysis as in the Malawi and other national Demographic and Health Surveys.
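The three hydration corrections mentioned take simple closed forms; a minimal sketch is below. The reference values (specific gravity 1.020, osmolality 300 mOsm/kg) are common conventions chosen here for illustration, not values taken from this study.

```python
def sg_corrected(conc, sg_sample, sg_ref=1.020):
    """Specific-gravity correction (Levine-Fahy form): scale the measured
    concentration to a reference urine specific gravity."""
    return conc * (sg_ref - 1.0) / (sg_sample - 1.0)

def osmolality_corrected(conc, osm_sample, osm_ref=300.0):
    """Scale the measured concentration to a reference osmolality
    (mOsm/kg)."""
    return conc * osm_ref / osm_sample

def creatinine_ratio(conc, creatinine):
    """Express the analyte per unit of urinary creatinine."""
    return conc / creatinine
```

All three corrections rescale a dilute (or concentrated) spot sample toward what it would read at a standard hydration level, which is why corrected values track plasma status more closely.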
Iron deficiency is thought to be one of the most prevalent micronutrient deficiencies globally, but an accurate assessment in populations who are frequently exposed to infections is impeded by the inflammatory response, which causes iron-biomarker alterations.
We assessed the relation between soluble transferrin receptor (sTfR) concentrations and inflammation and malaria in preschool children (PSC) (age range: 6–59 mo) and women of reproductive age (WRA) (age range: 15–49 y) and investigated adjustment algorithms to account for these effects.
Cross-sectional data from the Biomarkers Reflecting the Inflammation and Nutritional Determinants of Anemia (BRINDA) project from 11,913 PSC in 11 surveys and from 11,173 WRA in 7 surveys were analyzed individually and combined with the use of a meta-analysis. The following 3 adjustment approaches were compared with estimated iron-deficient erythropoiesis (sTfR concentration >8.3 mg/L): 1) the exclusion of individuals with C-reactive protein (CRP) concentrations >5 mg/L or α-1-acid glycoprotein (AGP) concentrations >1 g/L, 2) the application of arithmetic correction factors, and 3) the use of regression approaches.
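The first two adjustment approaches can be sketched directly from the cutoffs given in the text; the correction-factor value below is an illustrative assumption, and the third (regression) approach, which fits survey-specific coefficients, is not reproduced here.

```python
CRP_CUTOFF, AGP_CUTOFF = 5.0, 1.0   # mg/L and g/L, from the text
STFR_CUTOFF = 8.3                    # mg/L: iron-deficient erythropoiesis

def prevalence_exclusion(records):
    """Approach 1: drop inflamed individuals, then estimate prevalence."""
    kept = [r for r in records
            if r["crp"] <= CRP_CUTOFF and r["agp"] <= AGP_CUTOFF]
    return sum(r["stfr"] > STFR_CUTOFF for r in kept) / len(kept)

def prevalence_correction_factor(records, cf_inflamed=0.9):
    """Approach 2: multiply sTfR of inflamed individuals by an arithmetic
    correction factor (cf_inflamed = 0.9 is an illustrative assumption,
    not a published factor)."""
    def adjusted(r):
        inflamed = r["crp"] > CRP_CUTOFF or r["agp"] > AGP_CUTOFF
        return r["stfr"] * (cf_inflamed if inflamed else 1.0)
    return sum(adjusted(r) > STFR_CUTOFF for r in records) / len(records)
```

The exclusion approach shrinks the denominator, while the correction-factor approach keeps everyone but rescales inflamed values, which is one reason the two can yield different adjusted prevalences.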
The prevalence of elevated sTfR concentrations incrementally decreased as CRP and AGP deciles decreased for PSC and WRA, but the effect was more pronounced for AGP than for CRP. Depending on the approach used to adjust for inflammation, the estimated prevalence of iron-deficient erythropoiesis decreased by 4.4–14.6 and 0.3–9.5 percentage points in PSC and WRA, respectively, compared with unadjusted values. The correction-factor approach yielded a more modest reduction in the estimated prevalence of iron-deficient erythropoiesis than did the regression approach. Mostly, adjustment for malaria in addition to AGP did not significantly change the estimated prevalence of iron-deficient erythropoiesis.
sTfR may be useful to assess iron-deficient erythropoiesis, but inflammation influences its interpretation, and adjustment of sTfR for inflammation and malaria should be considered. More research is warranted to evaluate the proposed approaches in different settings, but this study contributes to the evidence on how and when to adjust sTfR for inflammation and malaria.
The accurate estimation of zinc deficiency at the population level is important, as it guides the design, targeting, and evaluation of nutrition interventions. Plasma or serum zinc concentration (PZC) is recommended to estimate zinc nutritional status; however, concentrations may decrease in the presence of inflammation.
We aimed to assess the relation between PZC and inflammation in preschool children (PSC; 6–59 mo) and nonpregnant women of reproductive age (WRA; 15–49 y), and to compare different inflammation adjustment approaches, if adjustment is warranted.
Cross-sectional data from 13 nationally representative surveys (18,859 PSC, 22,695 WRA) from the Biomarkers Reflecting Inflammation and Nutritional Determinants of Anemia (BRINDA) project were analyzed. Correlation and decile analyses were conducted, and the following 3 adjustment methods were compared if a consistent negative association between PZC and C-reactive protein (CRP) or α-1-acid glycoprotein (AGP) was observed: 1) exclude individuals with CRP > 5 mg/L or AGP > 1 g/L; 2) apply arithmetic correction factors; and 3) use the BRINDA regression correction (RC) approach.
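A decile analysis of the kind described can be sketched as follows: rank individuals by an inflammation marker, split them into ten equal groups, and check whether the estimated deficiency prevalence trends across the groups. This is a minimal sketch that ignores survey weights and ties.

```python
def decile_prevalence(values, markers, cutoff):
    """Prevalence of deficiency (value < cutoff) within each decile of an
    inflammation marker, ordered from the lowest to the highest decile.
    A rising trend across deciles suggests adjustment may be warranted."""
    pairs = sorted(zip(markers, values))   # sort by marker concentration
    n = len(pairs)
    prevalences = []
    for d in range(10):
        chunk = pairs[d * n // 10:(d + 1) * n // 10]
        prevalences.append(sum(v < cutoff for _, v in chunk) / len(chunk))
    return prevalences
```

If prevalence is flat across CRP and AGP deciles, as was largely seen here for WRA, there is no consistent association to correct for, and adjustment would only add noise.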
In 6 of 12 PSC surveys, the estimated prevalence of zinc deficiency increased with increasing CRP deciles, and to a lesser extent, with increasing AGP deciles. In WRA, the association of PZC with CRP and AGP was weak and inconsistent. In the 6 PSC surveys in which adjustment methods were compared, application of RC reduced the estimated prevalence of zinc deficiency by a median of 11 (range: 4–18) percentage points, compared with the unadjusted prevalence.
Relations between PZC and inflammatory markers were inconsistent, suggesting that correlation and decile analyses should be conducted before applying any inflammation adjustments. In populations of PSC that exhibit a significant negative association between PZC and CRP or AGP, application of the RC approach is supported. At this time, there is insufficient evidence to warrant inflammation adjustment in WRA.
Vitamin and mineral deficiencies, particularly those of iron, vitamin A and zinc, affect more than two billion people worldwide. Young children are highly vulnerable because of rapid growth and inadequate dietary practices. Micronutrient powders (MNP) are single-dose packets containing multiple vitamins and minerals in powder form that can be sprinkled onto any semi-solid food. The use of MNP for home or point-of-use fortification of complementary foods has been proposed as an intervention for improving micronutrient intake in children under two years of age.
To assess the effects and safety of home (point-of-use) fortification of foods with multiple micronutrient powders on nutritional, health and developmental outcomes in children under two years of age.
We searched the following databases in February 2011: Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library), MEDLINE (1948 to week 2 February 2011), EMBASE (1980 to Week 6 2011), CINAHL (1937 to current), CPCI-S (1990 to 19 February 2011), Science Citation Index (1970 to 19 February 2011), African Index Medicus (searched 23 February 2011), POPLINE (searched 21 February 2011), ClinicalTrials.gov (searched 23 February 2011), mRCT (searched 23 February 2011), and World Health Organization International Clinical Trials Registry Platform (ICTRP) (searched 23 February 2011). We also contacted relevant organisations (25 January 2011) for the identification of ongoing and unpublished studies.
We included randomised and quasi-randomised trials with either individual or cluster randomisation. Participants were children under the age of two years at the time of intervention, with no specific health problems. The intervention was consumption of food fortified at the point of use with multiple micronutrient powders formulated with at least iron, zinc and vitamin A compared with placebo, no intervention or the use of iron-containing supplements, which is the standard practice.
Two review authors independently assessed the eligibility of studies against the inclusion criteria, extracted data from included studies and assessed the risk of bias of the included studies.
We included eight trials (3748 participants) conducted in low income countries in Asia, Africa and the Caribbean, where anaemia is a public health problem. The interventions lasted between two and 12 months and the powder formulations contained between five and 15 nutrients. Six trials compared the use of MNP versus no intervention or a placebo and the other two compared the use of MNP versus daily iron drops. Most of the included trials were assessed as at low risk of bias. Home fortification with MNP reduced anaemia by 31% (six trials, RR 0.69; 95% CI 0.60 to 0.78) and iron deficiency by 51% (four trials, RR 0.49; 95% CI 0.35 to 0.67) in infants and young children when compared with no intervention or placebo, but we did not find an effect on growth. In comparison with daily iron supplementation, the use of MNP produced similar results on anaemia (one trial, RR 0.89; 95% CI 0.58 to 1.39) and haemoglobin concentrations (two trials, MD -2.36 g/L; 95% CI -10.30 to 5.58); however, given the limited amount of data these results should be interpreted cautiously. No deaths were reported in the trials and information on side effects and morbidity, including malaria, was scarce. It seems that the use of MNP is efficacious among infants and young children six to 23 months of age living in settings with different prevalences of anaemia and malaria endemicity, regardless of whether the intervention lasts two, six or 12 months or whether recipients are male or female.
Home fortification of foods with multiple micronutrient powders is an effective intervention to reduce anaemia and iron deficiency in children six months to 23 months of age. The provision of MNP is better than no intervention or placebo and possibly comparable to commonly used daily iron supplementation. The benefits of this intervention as a child survival strategy or on developmental outcomes are unclear. Data on effects on malaria outcomes are lacking and further investigation of morbidity outcomes is needed. The micronutrient powders containing multiple nutrients are well accepted but adherence is variable and in some cases comparable to that achieved in infants and young children receiving standard iron supplements as drops or syrups.
The association between suboptimal infant feeding practices and growth faltering is well-established. However, most of this evidence comes from cross-sectional studies. To prospectively assess the association between suboptimal infant feeding practices and growth faltering, we interviewed pregnant women at 28-32 weeks' gestation and followed up their offspring at postnatal months 3, 9, 16 and 24 in rural Bangladesh. Using maternal recall over the past 24 hours, exclusive breastfeeding (EBF) status at 3 months, age at complementary feeding (CF) initiation, and receipt of minimum acceptable diet (MAD; as defined by WHO) at 9 months were assessed. Infant length and weight measurements were used to produce length-for-age (LAZ) and weight-for-length (WLZ) z-scores at each follow-up. Generalized estimating equations were used to estimate associations of LAZ and WLZ with infant feeding practices. All models were adjusted for baseline SES, infant sex, maternal height, age, literacy and parity. Follow-up was completed by 2189, 2074, 1969 and 1885 mother-child dyads at 3, 9, 16 and 24 months, respectively. Stunting prevalence increased from 28% to 57% between infant age 3 and 24 months. EBF at 3 months and age at CF initiation were not associated with linear infant growth, but receipt of MAD at 9 months was. By age 24 months, infants receiving MAD had attained a higher LAZ compared to infants who did not receive MAD (adjusted β = 0.25, 95% CI: 0.13-0.37). Although prevalence of stunting was already high at age 3 months, ensuring infants receive a diverse, high quality diet from 6 months onwards may reduce rates of stunting in the second year of life.
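The LAZ and WLZ values above come from the WHO growth standards, which apply an LMS (Box-Cox) transformation to reference data. As a simplified, normal-approximation sketch with hypothetical reference values:

```python
def z_score(measurement, ref_median, ref_sd):
    """Normal-approximation z-score. The actual WHO growth standards use
    an LMS (Box-Cox) transformation rather than this simple form, so
    treat this as a sketch only."""
    return (measurement - ref_median) / ref_sd

def is_stunted(laz):
    # Stunting is conventionally defined as length-for-age z-score < -2
    return laz < -2.0
```

For example, a child whose length sits two reference standard deviations below the reference median has LAZ = -2 and falls just at the stunting threshold; the reference median and SD used here are placeholders, not WHO values.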