In the last two decades the role of aerosol iron supply to the ocean has received growing attention. Research has mainly focused on three themes: how much iron is supplied to the ocean from dust; where this aerosol iron is deposited (depositional models); and modelling of the biogeochemical impact of iron supply to the ocean in the past, present and future. Here, we investigate the relationship between modes of iron supply (mechanisms, dissolution rate and timescales) to the upper ocean and the subsequent biological responses in the present day. The reported solubility of iron from dust ranges from 0.001% to 90%, and this variability appears to be linked to both aerosol properties and the leaching schemes employed. Consequently, biogeochemical modelling studies have used a wide range of iron dissolution rates (1–12%) and have reported a broad suite of biogeochemical responses. Re-examination of evidence from ocean observations of enhanced biological and/or biogeochemical responses to aerosol iron supply in the modern ocean suggests that much of it is flawed, and that there are only a few cases in which there is a causative link between dust supply and biological response. The small size of the resulting dataset is due to a wide range of confounding factors, including seasonality of the environmental factors controlling phytoplankton production (light, silicic acid, phosphate, iron) and the elemental stoichiometry of the aerosols (iron and other nutrients) during dissolution. Thus, the main impact of aerosol iron supply appears to be an initial rapid release of iron, followed by a slow and sustained release during its mixed-layer residence time, which may result in small increases in the dissolved iron mixed-layer inventory. The implications of such a mode of iron release from aerosol dust are explored using a simple dust/biota assessment test for both contemporary and paleoceanographic case studies.
We conclude that dust deposition can easily be mistakenly attributed as a primary cause of enhanced biological activity and that, due to the slow dissolution of iron, dust-mediated phytoplankton blooms are probably rare in the modern ocean.
Changing Mortality in Congenital Heart Disease
Khairy, Paul, MD, PhD; Ionescu-Ittu, Raluca, MSc; Mackie, Andrew S., MD, SM ...
Journal of the American College of Cardiology, 09/2010, Volume 56, Issue 14
Journal Article
Peer reviewed
Open access
Objectives This study sought to characterize temporal trends in all-cause mortality in patients with congenital heart disease (CHD). Background Historically, most deaths in patients with CHD occurred in early childhood. Notable advances have since been achieved that may impact mortality trends. Methods We conducted a population-based cohort study of patients with CHD in Quebec, Canada, from July 1987 to June 2005. A total of 8,561 deaths occurred in 71,686 patients with CHD followed for 982,363 patient-years. Results The proportion of infant and childhood deaths markedly declined from 1987 to 2005, with a reduction in mortality that exceeded that of the general population. The distribution of age at death transitioned from a bimodal to a unimodal, albeit skewed, pattern, more closely approximating the general population. Overall, mortality decreased by 31% (mortality rate ratio: 0.69, 95% confidence interval [CI]: 0.61 to 0.79) in the last (2002 to 2005) relative to the first (1987 to 1990) period of observation. Mortality rates decreased in all age groups below 65 years, with the largest reduction in infants (mortality rate ratio: 0.23, 95% CI: 0.12 to 0.47). In adults 18 to 64 years of age, the mortality reduction (mortality rate ratio: 0.84, 95% CI: 0.73 to 0.97) paralleled that of the general population. Gains in survival were mostly driven by reduced mortality in severe forms of CHD, particularly in children (mortality rate ratio: 0.33, 95% CI: 0.19 to 0.60), and were consistent across most subtypes. Conclusions Deaths in CHD have shifted away from infants and towards adults, with a steady increase in age at death and decreasing mortality.
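Mortality rate ratios with confidence intervals of the kind quoted above can be computed directly from death and patient-year counts. A minimal sketch, using hypothetical counts rather than the study's actual data, of a rate ratio with a Wald confidence interval on the log scale:

```python
import math

def rate_ratio_ci(deaths_a, py_a, deaths_b, py_b, z=1.96):
    """Mortality rate ratio (group A vs. group B) with a Wald
    confidence interval computed on the log scale."""
    rr = (deaths_a / py_a) / (deaths_b / py_b)
    # Standard error of log(RR) for independent Poisson death counts
    se_log = math.sqrt(1.0 / deaths_a + 1.0 / deaths_b)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Hypothetical counts (not from the paper): 300 deaths over 100,000
# patient-years in the later period vs. 400 deaths over 92,000
# patient-years in the earlier period.
rr, lo, hi = rate_ratio_ci(300, 100_000, 400, 92_000)
print(f"MRR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

A ratio below 1 with an upper confidence limit below 1 indicates a statistically significant mortality reduction in the later period.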
Aims
To add a spore germination step in order to reduce decontamination temperature and time requirements, from the current hot, humid air decontamination parameters (75–80°C, ≥72 h, 70–90% RH) down to ≤60°C and ≤24 h total decontamination time.
Methods and Results
Bacillus anthracis spore germination with l‐alanine+inosine+calcium dipicolinate (CaDPA) was quantified at 0–40°C, several time points and spore concentrations of 5–9 log10 per ml. Germination efficiency at 0–40°C was >99% at <8 log10 spores per ml. The temperature optimum was 20°C. Germination efficiency was significantly higher but slower at 0°C compared to ≥30°C at ≥8 log10 spores per ml. A single germinant application followed by 60°C, 1‐h treatment consistently inactivated >2 log10 (>99%) of spores. However, a repeat application of germinant was needed to achieve the objective of ≥6 log10 spore inactivation out of a 7 log10 challenge (≥99·9999%) for ≤24 h total decontamination time for nylon and aircraft performance coating.
Conclusions
l‐alanine+inosine+CaDPA stimulated germination across wide temperature and spore concentration ranges.
Significance and Impact of the Study
Germination expands the scope of spore decontamination to include materials from any industry sector that can be sprayed with an aqueous germinant solution.
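The log10 inactivation figures in the results above follow a standard conversion between log reduction and percent inactivation. A small sketch of that arithmetic (the spore counts are illustrative, not from the study):

```python
import math

def log10_reduction(n_initial, n_surviving):
    """Log10 reduction achieved by a decontamination treatment."""
    return math.log10(n_initial / n_surviving)

def percent_inactivated(log_red):
    """Convert a log10 reduction into percent of spores inactivated."""
    return (1.0 - 10.0 ** (-log_red)) * 100.0

# A 7-log10 challenge (1e7 spores) reduced to 10 survivors is a
# 6-log10 reduction, i.e. 99.9999% inactivation; a 2-log10
# reduction corresponds to 99% inactivation.
six_log = log10_reduction(1e7, 10)
```

This is why the abstract equates ">2 log10" with ">99%" and "≥6 log10" with "≥99.9999%".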
Objective Infants and children who undergo cardiopulmonary bypass and cardiac surgery are at risk of postoperative fluid overload. Peritoneal dialysis catheter (PDC) insertion and peritoneal dialysis are reported to be effective means of postoperative fluid management. We sought to test the hypothesis that PDC insertion in the operating room at the time of Norwood palliation would decrease the time to achieve a negative fluid balance in a group of neonates with hypoplastic left heart syndrome. Methods A single-center randomized controlled trial was performed. We randomized neonates with hypoplastic left heart syndrome to prophylactic PDC, with or without dialysis, or standard care (ie, no PDC). Results Twenty-two neonates were included; 10 were randomized to PDC and 12 were randomized to standard care. The mean time to first postoperative negative fluid balance was 2.70 ± 1.06 days for the prophylactic PDC group and 2.67 ± 0.65 days for the standard care group (P = .93). There was no difference between the 2 groups in time to lactate ≤ 2 mmol/L, maximum vasoactive-inotrope score on postoperative days 2 to 5, time to sternal closure, time to first extubation, modified clinical outcome score, or hospital length of stay. Twenty-one patients (95%) survived to hospital discharge. Four patients randomized to prophylactic PDC had 1 or more serious adverse events compared with no patients in the standard care group (P = .03). Conclusions Prophylactic PDC, with or without dialysis, did not decrease the time to achieve a negative fluid balance after the Norwood procedure, did not alter physiological variables postoperatively, and was associated with more serious adverse events.
We report on the simulated cloud processing of an aerosol iron sample derived from an Australian dust storm. Primary factors influencing the extent and rate of Fe mobilization were pH, duration of extraction and dust concentration. Fe was significantly mobilized below a threshold of pH ∼3.6. After initial rapid mobilization at low pH, the rate of Fe release was constant at constant pH. Between this threshold and pH ∼7.1, dissolved Fe fell to a minimum, but above this pH further dissolution of iron occurred, probably due to the formation of soluble ferrates. At our lowest dust concentrations (1–20 mg L−1) the rate of Fe extraction at low pH was constant, while at higher dust concentrations the rate was inversely proportional to dust concentration. Dissolution of iron from dust is thus a complex process, and these factors must be considered when modeling the input of iron to the oceans.
The paucity of enzymes that efficiently deconstruct plant polysaccharides represents a major bottleneck for industrial-scale conversion of cellulosic biomass into biofuels. Cow rumen microbes specialize in degradation of cellulosic plant material, but most members of this complex community resist cultivation. To characterize biomass-degrading genes and genomes, we sequenced and analyzed 268 gigabases of metagenomic DNA from microbes adherent to plant fiber incubated in cow rumen. From these data, we identified 27,755 putative carbohydrate-active genes and expressed 90 candidate proteins, of which 57% were enzymatically active against cellulosic substrates. We also assembled 15 uncultured microbial genomes, which were validated by complementary methods including single-cell genome sequencing. These data sets provide a substantially expanded catalog of genes and genomes participating in the deconstruction of cellulosic biomass.
Echocardiographic measurement of left ventricular (LV) mass is routinely performed in pediatric patients with elevated cardiovascular risk. The complex relationship between heart growth and body growth in children requires normalization of LV mass to determine its appropriateness relative to body size. LV mass is strongly determined by lean body mass (LBM). Using new LBM predictive equations, the investigators generated sex-specific LV mass-for-LBM centile curves for children 5 to 18 years of age.
This retrospective study used M-mode echocardiographic data collected from 1995 through 2003 from 939 boys and 771 girls between 5 and 18 years of age (body mass index < 85th percentile for sex and age) to create smoothed sex-specific LV mass-for-LBM reference centile curves using the Lambda Mu Sigma method. The newly developed reference centiles were applied to children with essential hypertension and with chronic kidney disease, groups known to be at high risk for LV hypertrophy (LVH). The identification of LVH was compared using two different normalization approaches: LV mass-for-LBM and LV mass index-for-age percentiles.
Among 231 children at risk for LVH, on average, relative LV mass was higher using the LV mass index-for-age percentile method than the LV mass-for-LBM percentile method. LVH was more likely to be diagnosed among overweight children and less likely among thin children.
This study provides new LV mass reference centiles expressing LV mass relative to LBM, the strongest determinant of LV mass. These reference centiles may allow more accurate stratification of cardiovascular risk in children.
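The Lambda Mu Sigma (LMS) smoothing behind centile curves like these maps a measurement to a z-score (and hence a centile) through three fitted parameters at each value of the covariate. A sketch of that calculation, with hypothetical L, M, S values rather than the paper's fitted curves:

```python
import math

def lms_z(y, L, M, S):
    """Z-score under the LMS method: z = ((y/M)**L - 1) / (L*S)
    for L != 0, and z = ln(y/M) / S in the limit L -> 0."""
    if abs(L) > 1e-12:
        return ((y / M) ** L - 1.0) / (L * S)
    return math.log(y / M) / S

# Hypothetical parameters at one LBM value (not from the paper):
# skewness L = 0.5, median LV mass M = 80 g, coefficient of
# variation S = 0.15. A measurement equal to the median scores
# z = 0; larger masses score z > 0.
z_median = lms_z(80.0, 0.5, 80.0, 0.15)
z_high = lms_z(95.0, 0.5, 80.0, 0.15)
```

A centile then follows from the standard normal CDF of z, which is how a measured LV mass is placed on a reference curve.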
Purpose
Patients with early breast cancer (eBC) are increasingly provided with different options, which may involve a sequence of different treatments and treatment modalities, and eligibility for certain adjuvant treatments depending upon pre-surgical and surgical outcomes. This study examined patient preferences around aspects of treatment decision-making in eBC.
Patients and Methods
A total of 452 patients with self-reported eBC in Germany (n=151), Italy (n=151), and Japan (n=150) completed an online survey about physician interactions and treatment side effects. The survey included best-worst scaling (BWS) to assess prioritization of 13 statements reflecting aspects of treatment decision-making. In a series of choice tasks, participants chose their most and least preferred options among subsets of 4 statements. Hierarchical Bayesian modeling was used to estimate BWS preference scores for each statement. BWS scores were based on the number of times a statement was chosen as most versus least preferred; scores total 100 for each patient.
Results
The most preferred aspects of treatment decision-making were "treatment aggressiveness matches personal risk" (mean BWS score = 13.49), "being told about what is coming" (13.18), deciding based on "own surgical outcome" (11.90), "avoiding unnecessary treatment" (10.35), and being "involved in treatment decisions" (9.44). The least preferred aspects were "not being asked about treatment decisions along the way" (3.27) and "receiving the same treatment as other patients" (3.41). Patients in Japan preferred "being told about what is coming", deciding based on "own surgical outcome", "avoiding unnecessary treatment", and being "involved in decisions" more than patients in Italy and Germany. Patients in Germany were more satisfied with their physician interactions and care, although their outcomes were not always better than those in Italy and Japan.
Conclusion
Patients value individualized treatment tailored to their risk of recurrence and tolerance of side effects, highlighting the need for focused patient education about options to encourage their engagement.
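The count-based intuition behind BWS scoring (most-minus-least choice counts, normalized so each respondent's scores total 100) can be sketched as below. The shift-and-scale normalization is one plausible scheme for illustration only; the study itself estimated scores with hierarchical Bayesian modeling.

```python
def bws_scores(best, worst, appearances):
    """Count-based best-worst scaling scores for one respondent.
    best[i]/worst[i]: times statement i was chosen most/least
    preferred; appearances: times each statement was shown.
    Raw best-minus-worst counts are shifted to be nonnegative,
    then scaled so the scores sum to 100."""
    raw = [b - w + appearances for b, w in zip(best, worst)]
    total = sum(raw)
    return [100.0 * r / total for r in raw]

# Hypothetical respondent: 3 statements, each shown 4 times.
# Statement 0 was always "most preferred", statement 2 always "least".
scores = bws_scores(best=[4, 1, 0], worst=[0, 1, 4], appearances=4)
```

The resulting scores preserve the ordering of the raw best-minus-worst counts while summing to 100 per respondent, matching the scale described in the abstract.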
Background In 2007, the American Heart Association (AHA) published revised guidelines for infective endocarditis (IE) prophylaxis. Population-based data with respect to the potential impact of these revised guidelines are lacking. Methods The Canadian Institute for Health Information Discharge Abstract Database was used to identify all hospitalizations between April 2002 and March 2013 having IE as a primary diagnosis. Hospitalization rates were determined using age-specific population data from Statistics Canada. Interrupted time series analysis was used to evaluate changes in the slope of hospitalization rates after the AHA guidelines were published. Results There were 9431 hospitalizations during the study period among 8055 patients (63% male patients). Time trend analysis showed an increase of 0.05 IE hospitalizations per 10 million population per month (95% confidence interval, 0.005-0.09; P = 0.029) from April 2002-March 2007 and an increase of 0.07 IE hospitalizations per 10 million population per month from April 2007-March 2013 (interaction P = 0.5213). Change-point analysis showed that a change in the slope occurred in April 2011, 4 years after publication of the revised AHA guidelines. Staphylococcus aureus was the most commonly reported organism (29.4%). Streptococcal infections decreased over time, beginning before the 2007 guidelines (P < 0.0001). The presence of a pacemaker or defibrillator was an increasingly prevalent risk factor over time (4% increase per year; P = 0.0178). Conclusions The rate of IE hospitalizations increased in Canada before and after the publication of the 2007 AHA guidelines, with no significant change in slope after 2007. These guidelines had no impact on the incidence of IE hospitalizations.
Background
Stress can play a role in disease incidence in all species via immunosuppression and has been implicated as a contributing factor in significant infectious diseases of koalas. Faecal cortisol measurement may represent a non-invasive methodology for quantifying stress in koalas.
Methods
We used an ACTH (adrenocorticotropic hormone) stimulation test (10 IU) to induce sustained secretion of cortisol, which was measured in serum samples from four koalas, and subsequently attempted to locate a corresponding elevation in either cortisol or corticosterone measurements within the faeces.
Results
Although ACTH administration resulted in an elevation of serum cortisol for at least 4 h post-injection, it was not possible to identify a corresponding peak in corticosterone or cortisol concentrations in faecal extracts collected at times consistent with the known gut transit time of the koala.
Conclusion
Faecal cortisol and corticosterone metabolites may not be reliable indices of acute changes in cortisol secretion in the koala and studies that attempt to use faecal cortisol as an index of stress will need to be interpreted with caution.