Abstract BACKGROUND We retrospectively investigated the incidence, morbidity, and mortality of neonatal necrotizing enterocolitis in China, with special emphasis on determining the predictors of necrotizing enterocolitis-associated mortality. METHODS We identified neonates as having necrotizing enterocolitis if they met the accepted diagnostic criteria. Data pertaining to the antenatal period, labor and birth, and the postnatal course of illness were collected. Multivariate analysis and logistic regression were used to analyze the risk factors. RESULTS There were 1167 cases of necrotizing enterocolitis identified from the 95 participating NICUs in mainland China in 2011, with incidences of 2.50% and 4.53% in LBW (birth weight <2500 g) and VLBW (birth weight <1500 g) infants, respectively. Stage 1, 2, and 3 disease was noted in 51.1%, 30.3%, and 18.6% of cases, respectively. The mortality from stage 2 and 3 necrotizing enterocolitis in this cohort was 41.7%. In VLBW infants, the important risk factors for mortality were small for gestational age (OR: 5.02, 95% CI 1.73-14.6; P=0.003) and stage 3 NEC (OR: 8.09, 95% CI 2.80-23.3; P<0.001). In moderate LBW infants (birth weight 1500-2499 g), the risk factors identified for mortality were sepsis during hospitalization (OR: 2.59, 95% CI 1.57-4.28; P<0.001) and stage 3 NEC (OR: 5.37, 95% CI 3.24-8.90; P<0.001). CONCLUSIONS Necrotizing enterocolitis remains an important cause of morbidity and mortality in prematurely born neonates in Chinese neonatal units. Awareness of the associated risk factors and appropriate
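The odds ratios above come from multivariate logistic regression, which this sketch does not reproduce; as an aside, the arithmetic behind an unadjusted odds ratio and its Wald 95% CI can be illustrated from a 2x2 table. All counts below are hypothetical, not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed deaths, b = exposed survivors,
    c = unexposed deaths, d = unexposed survivors."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# hypothetical example: 20/100 deaths among SGA infants vs 10/100 among the rest
or_, lo, hi = odds_ratio_ci(20, 80, 10, 90)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR 2.25 (95% CI 0.99-5.09)
```

A multivariate model adjusts each OR for the other covariates, so its estimates generally differ from these crude table-based ones.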
Bicuspid aortic valve (BAV) is the most common congenital cardiovascular defect in humans and is associated with substantial morbidity and mortality. Emerging evidence demonstrates that genetic risk factors play an important role in the pathogenesis of BAV. However, BAV is a genetically heterogeneous disorder, and the genetic defects underpinning BAV in most patients remain to be identified. In the present study, the coding exons and flanking introns of the NKX2.5 gene, which encodes a homeodomain-containing transcription factor essential for normal development of the aortic valve, were sequenced in 142 unrelated patients with BAV. The available relatives of the mutation carrier and 200 unrelated healthy subjects used as controls were also genotyped for NKX2.5. The functional characteristics of the mutation were delineated with a dual-luciferase reporter assay system. As a result, a novel heterozygous NKX2.5 mutation, p.K192X, was identified in a family with BAV transmitted in an autosomal dominant pattern. The nonsense mutation was absent in 400 control chromosomes. Functional analyses revealed that mutant NKX2.5 had no transcriptional activity compared with its wild-type counterpart. Furthermore, the mutation abolished the synergistic transcriptional activation between NKX2.5 and GATA5, another transcription factor crucial for aortic valvular morphogenesis. In conclusion, this study is the first to link an NKX2.5 loss-of-function mutation to enhanced susceptibility to human BAV, providing novel insight into the molecular mechanism of BAV and suggesting potential implications for genetic counseling and clinical care of families presenting with BAV.
It is unknown whether radiofrequency ablation (RFA) or antiarrhythmic therapy is superior when treating patients with symptomatic premature ventricular contractions (PVCs).
To determine the relative efficacy of RFA and antiarrhythmic drugs (AADs) in reducing PVC burden and improving left ventricular systolic function.
Patients with frequent PVCs (>1000/24 h) were treated either by RFA or with AADs from January 2005 through December 2010. Data from 24-hour Holter monitoring and echocardiography before and 6-12 months after treatment were compared between the 2 groups.
Of 510 patients identified, 215 (40%) underwent RFA and 295 (60%) received AADs. The reduction in PVC frequency was greater with RFA than with AADs (-21,799/24 h vs -8,376/24 h; P < .001). The left ventricular ejection fraction (LVEF) increased significantly after RFA (53%-56%; P < .001) but not after AAD therapy (52%-52%; P = .6). Of 121 (24%) patients with reduced LVEF, 39 (32%) had LVEF normalization to 50% or greater. LVEF was restored in 25 of 53 (47%) patients in the RFA group compared with 14 of 68 (21%) patients in the AAD group (P = .003). In multivariate analysis, a PVC coupling interval of less than 450 ms, less impaired left ventricular function, and RFA were independent predictors of LVEF normalization.
RFA appears to be more effective than AADs in PVC reduction and LVEF normalization.
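The comparison of LVEF restoration rates (25/53 vs 14/68, P = .003) can be checked approximately with a standard two-proportion z-test; the exact p-value depends on which test the authors used, so this is a rough cross-check rather than a reproduction of their analysis:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled proportion,
    without continuity correction."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                # pooled proportion
    se = math.sqrt(p * (1 - p) * (1/n1 + 1/n2))
    z = (p1 - p2) / se
    pval = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value
    return z, pval

# 47% (25/53) LVEF restoration with RFA vs 21% (14/68) with AADs
z, pval = two_proportion_z(25, 53, 14, 68)
print(f"z = {z:.2f}, p = {pval:.4f}")
```

The result (z around 3.1, p around .002) is consistent with the reported P = .003; a chi-square test with continuity correction or Fisher's exact test would give a slightly different value.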
Summary Background Two phase II trials in patients with previously treated advanced non-small-cell lung cancer suggested that gefitinib was efficacious and less toxic than chemotherapy. We compared gefitinib with docetaxel in patients with locally advanced or metastatic non-small-cell lung cancer who had been pretreated with platinum-based chemotherapy. Methods We undertook an open-label phase III study with recruitment between March 1, 2004, and Feb 17, 2006, at 149 centres in 24 countries. 1466 patients with pretreated (≥1 platinum-based regimen) advanced non-small-cell lung cancer were randomly assigned with dynamic balancing to receive gefitinib (250 mg per day orally; n=733) or docetaxel (75 mg/m2 intravenously in a 1-h infusion every 3 weeks; n=733). The primary objective was to compare overall survival between the groups, with co-primary analyses to assess non-inferiority in the overall per-protocol population and superiority in patients with high epidermal growth factor receptor (EGFR)-gene-copy number in the intention-to-treat population. This study is registered with ClinicalTrials.gov, number NCT00076388. Findings 1433 patients were analysed per protocol (723 in the gefitinib group and 710 in the docetaxel group). Non-inferiority of gefitinib compared with docetaxel was confirmed for overall survival (593 vs 576 events; hazard ratio [HR] 1·020, 96% CI 0·905–1·150, meeting the predefined non-inferiority criterion; median survival 7·6 vs 8·0 months). Superiority of gefitinib in patients with high EGFR-gene-copy number (85 vs 89 patients) was not proven (72 vs 71 events; HR 1·09, 95% CI 0·78–1·51; p=0·62; median survival 8·4 vs 7·5 months). In the gefitinib group, the most common adverse events were rash or acne (360 [49%] vs 73 [10%]) and diarrhoea (255 [35%] vs 177 [25%]); in the docetaxel group, neutropenia (35 [5%] vs 514 [74%]), asthenic disorders (182 [25%] vs 334 [47%]), and alopecia (23 [3%] vs 254 [36%]) were most common.
Interpretation INTEREST established non-inferior survival of gefitinib compared with docetaxel, suggesting that gefitinib is a valid treatment for pretreated patients with advanced non-small-cell lung cancer. Funding AstraZeneca.
Background Chronic spontaneous urticaria (CSU) is defined by itchy hives, angioedema, or both for at least 6 weeks. Omalizumab, an anti-IgE antibody that affects mast cell and basophil function, is a promising new treatment option. To date, however, the efficacy and safety of the different doses of omalizumab used in clinical trials for CSU have not been systematically analyzed and summarized. Objective We sought to assess the efficacy and safety of different doses of omalizumab for the treatment of CSU in a meta-analysis of clinical trial results. Methods Suitable trials were identified by searching the PubMed, Medline, Embase, and Web of Science databases and with the help of omalizumab's manufacturers. Only double-blind, randomized, placebo-controlled studies comparing omalizumab-treated with placebo-treated patients with CSU were included in this analysis. Results We identified 7 randomized, placebo-controlled studies with 1312 patients with CSU. Patients treated with omalizumab (75-600 mg every 4 weeks) had significantly reduced weekly itch and weekly wheal scores compared with the placebo group. Omalizumab's effects were dose dependent, with the strongest reduction in weekly itch and weekly wheal scores observed with 300 mg. Rates of complete response were significantly higher in the omalizumab group (relative risk, 4.55; P < .00001) and dose dependent, with the highest rates in the 300-mg group. Rates of patients with adverse events were similar in the omalizumab and placebo groups. Conclusion This meta-analysis provides high-quality evidence for the efficacy and safety of omalizumab in patients with CSU and supports treating these patients with 300 mg of omalizumab every 4 weeks.
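The pooled relative risk reported above comes from combining per-study effect estimates. A minimal sketch of fixed-effect (inverse-variance) pooling of log relative risks, using two entirely hypothetical trials rather than the actual omalizumab data:

```python
import math

def pooled_rr(studies, z=1.96):
    """Fixed-effect (inverse-variance) pooled relative risk with 95% CI.
    Each study is a tuple (events_treated, n_treated, events_control, n_control)."""
    num = den = 0.0
    for a, n1, c, n2 in studies:
        log_rr = math.log((a / n1) / (c / n2))
        var = 1/a - 1/n1 + 1/c - 1/n2   # variance of log(RR)
        w = 1 / var                      # inverse-variance weight
        num += w * log_rr
        den += w
    log_pooled = num / den
    se = math.sqrt(1 / den)
    return tuple(math.exp(log_pooled + s * z * se) for s in (-1, 0, 1))

# two hypothetical trials: 40/100 vs 10/100 responders, and 30/100 vs 15/100
lo, rr, hi = pooled_rr([(40, 100, 10, 100), (30, 100, 15, 100)])
print(f"pooled RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Published meta-analyses often use random-effects models instead, which widen the interval when studies disagree; the fixed-effect version above is the simplest case.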
Background Given the high morbidity and mortality of surgery and the reduction in quality of life caused by operative resection of the gastric cardia, a minimally invasive treatment without loss of curability is desirable for submucosal tumors (SMTs) of the esophagogastric junction (EGJ). Endoscopic submucosal dissection (ESD) has been used successfully for the removal of esophageal or gastric SMTs; however, the EGJ has been regarded as a difficult location for ESD because of its narrow lumen and sharp angle. Objective To evaluate the clinical impact of ESD for SMTs of the EGJ arising from the muscularis propria layer. Design Single-center, prospective study. Setting Academic medical center. Patients 143 patients with 143 SMTs of the EGJ originating from the muscularis propria layer. Interventions ESD. Main Outcome Measurements Complications, en bloc resection rate, local recurrence, and distant metastases. Results The average maximum diameter of the lesions was 17.6 mm (range 5-50 mm). The en bloc resection rate was 94.4% (135/143). All en bloc resection specimens showed both lateral and deep tumor-free margins, including 20 GI stromal tumors. Perforations occurred in 6 patients (4.2%, 6/143), and metal clips were used to occlude the defects. Four cases of pneumoperitoneum and 2 of pneumothorax caused by perforation resolved with nonsurgical treatment. Neither local recurrence nor distant metastasis occurred during 2 years of follow-up. Limitations Single center, short follow-up. Conclusions ESD appears to be a safe, feasible, and effective procedure, providing accurate histopathologic evaluation as well as curative treatment for SMTs of the EGJ originating from the muscularis propria layer.
Summary Background Umbilical-cord blood (UCB) is increasingly considered as an alternative to peripheral blood progenitor cells (PBPCs) or bone marrow, especially when an HLA-matched adult unrelated donor is not available. We aimed to determine the optimal role of UCB grafts in transplantation for adults with acute leukaemia, and to establish whether current graft-selection practices are appropriate. Methods We used Cox regression to retrospectively compare leukaemia-free survival and other outcomes for UCB, PBPC, and bone marrow transplantation in patients aged 16 years or over who underwent a transplant for acute leukaemia. Data were available on 1525 patients transplanted between 2002 and 2006. 165 received UCB, 888 received PBPCs, and 472 received bone marrow. UCB units were matched at HLA-A and HLA-B at antigen level, and HLA-DRB1 at allele level (n=10), or mismatched at one (n=40) or two (n=115) antigens. PBPC and bone-marrow grafts from unrelated adult donors were matched for allele-level HLA-A, HLA-B, HLA-C, and HLA-DRB1 (n=632 and n=332, respectively), or mismatched at one locus (n=256 and n=140, respectively). Findings Leukaemia-free survival after UCB transplantation was comparable with that after 8/8 and 7/8 allele-matched PBPC or bone-marrow transplantation. However, transplant-related mortality was higher after UCB transplantation than after 8/8 allele-matched PBPC transplantation (HR 1·62, 95% CI 1·18–2·23; p=0·003) or bone-marrow transplantation (HR 1·69, 95% CI 1·19–2·39; p=0·003). Rates of grades 2–4 acute and of chronic graft-versus-host disease (GvHD) were lower in UCB recipients than in allele-matched PBPC recipients (HR 0·57, 95% CI 0·42–0·77; p=0·002 and HR 0·38, 95% CI 0·27–0·53; p=0·003, respectively), while the incidence of chronic, but not acute, GvHD was lower after UCB than after 8/8 allele-matched bone-marrow transplantation (HR 0·63, 95% CI 0·44–0·90; p=0·01).
Interpretation These data support the use of UCB for adults with acute leukaemia when there is no HLA-matched unrelated adult donor available, and when a transplant is needed urgently. Funding National Cancer Institute, National Heart Lung and Blood Institute, National Institute of Allergy and Infectious Diseases (U24-CA76518); Health Resources and Services Administration (HHSH234200637015C); Office of Naval Research, Department of Navy (N00014-08-1-1207); Children's Leukemia Research Association; and a Scholar in Clinical Research Award from the Leukemia and Lymphoma Society.
Background It might be that early intestinal colonization by bacteria in westernized infants fails to give rise to sufficient immune stimulation to support maturation of regulatory immune mechanisms. Objective The purpose of the present study was to characterize the very early infantile microbiota by using a culture-independent approach and to relate the colonization pattern to development of atopic eczema in the first 18 months of life. Methods Fecal samples were collected from 35 infants at 1 week of age. Twenty infants were healthy, and 15 infants were given diagnoses of atopic eczema at the age of 18 months. The fecal microbiota of the infants was compared by means of terminal restriction fragment length polymorphism (T-RFLP) and temporal temperature gradient gel electrophoresis (TTGE) analysis of amplified 16S rRNA genes. Results By means of T-RFLP analysis, the median number of peaks, Shannon-Wiener index, and Simpson index of diversity were significantly lower for infants with atopic eczema than for infants remaining healthy, both in the whole group and among the Swedish infants, when AluI was used for digestion. The same was found when TTGE patterns were compared. In addition, TTGE analysis showed significantly fewer bands and lower diversity indices for the British atopic infants compared with those of the control subjects. Conclusion There is reduced diversity in the early fecal microbiota of infants who develop atopic eczema in the first 18 months of life.
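Both diversity measures named above are standard: the Shannon-Wiener index H' = -Σ p_i ln p_i and Simpson's index of diversity 1 - Σ p_i², computed over the relative abundances p_i of the observed peaks or bands. A minimal sketch with a made-up abundance profile (not the study's data):

```python
import math

def shannon_wiener(props):
    """Shannon-Wiener index H' = -sum(p * ln p) over relative abundances."""
    return -sum(p * math.log(p) for p in props if p > 0)

def simpson_diversity(props):
    """Simpson's index of diversity, 1 - sum(p^2)."""
    return 1 - sum(p * p for p in props)

# hypothetical relative abundances of three T-RFLP peaks
props = [0.5, 0.3, 0.2]
print(f"H' = {shannon_wiener(props):.3f}")          # 1.030
print(f"Simpson = {simpson_diversity(props):.2f}")  # 0.62
```

Both indices rise with the number of distinct taxa and with evenness of their abundances, which is why a sparser profile in atopic infants shows up as lower values on both.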
Background Interleukin 18 (IL-18) has been proposed as a biomarker for the early detection of acute kidney injury (AKI), but a broad range of its predictive accuracy has been reported. Study Design Meta-analysis of diagnostic test studies. Setting & Population Various clinical settings of AKI, including after cardiac surgery, after contrast infusion, in the emergency department, and in the intensive care unit. Selection Criteria for Studies Prospective studies that investigated the diagnostic accuracy of IL-18 level to predict AKI. Index Tests Increasing or increased urinary IL-18 excretion. Reference Tests The primary outcome was AKI development, mainly based on serum creatinine level (definitions varied across studies). The other outcome was in-hospital mortality. Results We analyzed data from 23 studies from 7 countries involving 4,512 patients. Of these studies, 18 could be included in the meta-analysis. Across all settings, the diagnostic odds ratio (DOR) for urinary IL-18 level to predict AKI was 4.22 (95% CI, 2.90-6.14), with sensitivity and specificity of 0.58 and 0.75, respectively. The area under the receiver operating characteristic curve (AUROC) of urinary IL-18 level to predict AKI was 0.70 (95% CI, 0.66-0.74). Subgroup analysis showed the DOR/AUROC of urinary IL-18 was 5.32 (95% CI, 2.92-9.70)/0.72 (95% CI, 0.68-0.76) in cardiac surgery patients and 3.65 (95% CI, 1.88-7.10)/0.66 (95% CI, 0.62-0.70) in intensive care unit or coronary care unit patients. After stratification for age, IL-18 level had better diagnostic accuracy in children and adolescents than in adults: 8.12 (95% CI, 3.79-17.41)/0.78 (95% CI, 0.75-0.82) versus 3.31 (95% CI, 2.28-4.80)/0.66 (95% CI, 0.62-0.70). There was no significant difference in the predictive performance of urinary IL-18 level across measurement time points. Limitations Various clinical settings; differing definitions of AKI, with serum creatinine level as the reference standard for the diagnosis of AKI.
Conclusions Urinary IL-18 is a useful biomarker of AKI with moderate predictive value across all clinical settings.
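The diagnostic odds ratio relates to sensitivity and specificity as DOR = [sens/(1-sens)] x [spec/(1-spec)]. Plugging in the summary estimates from the abstract gives a rough cross-check; the reported pooled DOR of 4.22 comes from the meta-analytic model itself, so the back-of-envelope value need not match it exactly:

```python
def diagnostic_odds_ratio(sens, spec):
    """DOR = (sens/(1-sens)) * (spec/(1-spec)): the odds of a positive test
    in patients who develop AKI divided by the odds in those who do not."""
    return (sens / (1 - sens)) * (spec / (1 - spec))

# summary estimates from the abstract: sensitivity 0.58, specificity 0.75
print(f"{diagnostic_odds_ratio(0.58, 0.75):.2f}")  # 4.14, close to the pooled 4.22
```

A DOR of 1 means the test is uninformative; the moderate value here matches the abstract's characterization of IL-18 as a biomarker with moderate predictive value.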
By studying charge trapping in germanium detectors operated at temperatures below 10 K, we demonstrate for the first time that the formation of cluster dipole states from residual impurities is responsible for charge trapping. Two planar detectors with different impurity levels and types are used in this study. When the localized charge carriers created by α particles at the top surface are drifted across a detector at a lower bias voltage, significantly more charge trapping is observed than at a higher bias voltage. The amount of charge trapping depends strongly on the type of charge carrier: electrons are trapped more than holes in a p-type detector, while holes are trapped more than electrons in an n-type detector. When both electrons and holes are drifted simultaneously, using the spatially distributed charge carriers created by γ rays inside the detector, the amount of charge trapping shows no dependence on the polarity of the bias voltage.