The purpose of the study was to evaluate reinfection and superinfection during treatment for recent hepatitis C virus (HCV) infection. The Australian Trial in Acute Hepatitis C (ATAHC) was a prospective study of the natural history and treatment of recent HCV infection. Reinfection and superinfection were defined by detection of infection with an HCV strain distinct from the primary strain (using reverse-transcription polymerase chain reaction [RT-PCR] and subtype-specific nested RT-PCR assays) in the setting of spontaneous or treatment-induced viral suppression (one HCV RNA <10 IU/mL) or persistence (HCV RNA >10 IU/mL from enrollment to week 12). Among 163 patients, 111 were treated; 79% (88 of 111) had treatment-induced viral suppression, and 60% (67 of 111) achieved a sustained virological response. Following treatment-induced viral suppression, recurrence was observed in 19% (17 of 88), including 12 with relapse and five with reinfection (4.7 cases per 100 person-years [PY]; 95% confidence interval [CI]: 1.9, 11.2). Among 52 untreated patients, 58% (30 of 52) had spontaneous viral suppression, and recurrence was observed in 10% (3 of 30), including two with reinfection. Following reinfection, alanine aminotransferase (ALT) levels >1.5× the upper limit of normal were observed in 71% (5 of 7). Among 37 patients with persistence, superinfection was observed in 16% (3 of 19) of those treated and 17% (3 of 18) of those untreated. In adjusted analysis, reinfection/superinfection occurred more often in participants with poorer social functioning at enrollment and in those with ongoing injecting drug use (IDU). Conclusion: Reinfection and superinfection can occur during treatment of recent HCV infection and are associated with poor social functioning and ongoing IDU. ALT levels may be a useful clinical marker of reexposure. (HEPATOLOGY 2012)
Abstract
Understanding the factors that affect water quality and the ecological services provided by freshwater ecosystems is an urgent global environmental issue. Predicting how water quality will respond to global changes not only requires water quality data, but also information about the ecological context of individual water bodies across broad spatial extents. Because lake water quality is usually sampled in limited geographic regions, often for limited time periods, assessing the environmental controls of water quality requires compilation of many data sets across broad regions and across time into an integrated database. LAGOS-NE accomplishes this goal for lakes in the northeastern-most 17 US states.
LAGOS-NE contains data for 51,101 lakes and reservoirs larger than 4 ha in 17 lake-rich US states. The database includes three data modules: lake location and physical characteristics for all lakes; ecological context (i.e., the land use, geologic, climatic, and hydrologic setting of lakes) for all lakes; and in situ measurements of lake water quality from the past three decades for a subset of approximately 2,600–12,000 lakes, depending on the variable. The database contains approximately 150,000 measures of total phosphorus, 200,000 measures of chlorophyll, and 900,000 measures of Secchi depth. The water quality data were compiled from 87 lake water quality data sets from federal, state, tribal, and non-profit agencies, university researchers, and citizen scientists. This database is one of the largest and most comprehensive of its type because it includes both in situ measurements and ecological context data. Because ecological context can be used to study a variety of other questions about lakes, streams, and wetlands, this database can also serve as the foundation for other studies of freshwaters at broad spatial and ecological scales.
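A database organized as linkable modules keyed by a common lake identifier can be analyzed by joining the modules. The sketch below illustrates that pattern with pandas; the table and column names are invented for illustration and do not reflect the actual LAGOS-NE schema.

```python
import pandas as pd

# Hypothetical sketch of linking LAGOS-style data modules by a shared lake ID.
# All column names here are illustrative assumptions, not the database's schema.
locus = pd.DataFrame({
    "lake_id": [1, 2, 3],
    "lake_area_ha": [5.2, 410.0, 88.5],
    "state": ["MI", "WI", "NY"],
})
water_quality = pd.DataFrame({
    "lake_id": [1, 1, 3],
    "sample_date": pd.to_datetime(["2001-06-15", "2008-07-02", "2010-08-20"]),
    "tp_ug_l": [12.0, 18.5, 30.1],   # total phosphorus
    "secchi_m": [4.1, 3.2, 1.8],     # Secchi depth
})

# A left join keeps every lake; lakes without samples get NaN water-quality
# values, mirroring the fact that only a subset of lakes have in situ data.
merged = locus.merge(water_quality, on="lake_id", how="left")
sampled = merged.dropna(subset=["tp_ug_l"])
mean_tp_by_state = sampled.groupby("state")["tp_ug_l"].mean()
```

The same join-then-aggregate pattern extends to the ecological-context module, which is what makes broad-scale comparative analyses possible.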
Background:
A large number of systematic reviews and meta-analyses regarding the meniscus have been published.
Purpose:
To provide a qualitative summary of the published systematic reviews and meta-analyses regarding the meniscus.
Study Design:
Systematic review; Level of evidence, 4.
Methods:
A systematic search of all meta-analyses and systematic reviews regarding the meniscus and published between July 2009 and July 2019 was performed with PubMed, CINAHL, EMBASE, and the Cochrane database. Published abstracts, narrative reviews, articles not written in English, commentaries, study protocols, and topics that were not focused on the meniscus were excluded. The most pertinent results were extracted and summarized from each study.
Results:
A total of 332 articles were found, of which 142 were included. Included articles were summarized and divided into 16 topics: epidemiology, diagnosis, histology, biomechanics, comorbid pathology, animal models, arthroscopic partial meniscectomy (APM), meniscal repair, meniscal root repairs, meniscal allograft transplantation (MAT), meniscal implants and scaffolds, mesenchymal stem cells and growth factors, postoperative rehabilitation, postoperative imaging assessment, patient-reported outcome measures, and cost-effectiveness. The majority of articles focused on APM (20%), MAT (18%), and meniscal repair (17%).
Conclusion:
This summary of systematic reviews and meta-analyses provides surgeons with a single source of the current evidence regarding the meniscus.
Ongoing and near-future imaging-based dark energy experiments are critically dependent upon photometric redshifts (a.k.a. photo-z's): i.e., estimates of the redshifts of objects based only on flux information obtained through broad filters. Higher-quality, lower-scatter photo-z's will result in smaller random errors on cosmological parameters, while systematic errors in photometric redshift estimates, if not constrained, may dominate all other uncertainties from these experiments. The desired optimization and calibration depend upon spectroscopic measurements for secure redshift information; this is the key application of galaxy spectroscopy for imaging-based dark energy experiments.
Hence, to achieve their full potential, imaging-based experiments will require large sets of objects with spectroscopically determined redshifts, for two purposes:

• Training: Objects with known redshift are needed to map out the relationship between object color and z (or, equivalently, to determine empirically calibrated templates describing the rest-frame spectra of the full range of galaxies, which may be used to predict the color–z relation). The ultimate goal of training is to minimize each moment of the distribution of differences between photometric redshift estimates and the true redshifts of objects, making the relationship between them as tight as possible. The larger and more complete our "training set" of spectroscopic redshifts is, the smaller the RMS photo-z errors should be, increasing the constraining power of imaging experiments.
Requirements: Spectroscopic redshift measurements for ∼30,000 objects over ≳15 widely separated regions, each at least ∼20 arcmin in diameter, and reaching the faintest objects used in a given experiment, will likely be necessary if photometric redshifts are to be trained and calibrated with conventional techniques. Larger, more complete samples (i.e., with longer exposure times) can improve photo-z algorithms and reduce scatter further, greatly enhancing the science return from planned experiments (increasing the Dark Energy Task Force figure of merit by up to ∼50%).
Options: This spectroscopy will most efficiently be done by covering as much of the optical and near-infrared spectrum as possible at modestly high spectral resolution (λ/Δλ ≳ 3000), while maximizing the telescope collecting area, field of view on the sky, and multiplexing of simultaneous spectra. The most efficient instrument for this would likely be either the proposed GMACS/MANIFEST spectrograph for the Giant Magellan Telescope or the OPTIMOS spectrograph for the European Extremely Large Telescope, depending on actual properties when built. The PFS spectrograph at Subaru would be next best and available considerably earlier, c. 2018; the proposed ngCFHT and SSST telescopes would have similar capabilities but start later. Other key options, in order of increasing total time required, are the WFOS spectrograph at TMT, MOONS at the VLT, and DESI at the Mayall 4 m telescope (or the similar 4MOST and WEAVE projects); of these, only DESI, MOONS, and PFS are expected to be available before 2020. Table 2-3 of this white paper summarizes the observation time required at each facility for strawman training samples. To attain secure redshift measurements for a high fraction of targeted objects and cover the full redshift span of future experiments, additional near-infrared spectroscopy will also be required; this is best done from space, particularly with WFIRST-2.4 and JWST.

• Calibration: The first several moments of redshift distributions (the mean, RMS redshift dispersion, etc.) must be known to high accuracy for cosmological constraints not to be systematics-dominated (equivalently, the moments of the distribution of differences between photometric and true redshifts could be determined instead). The ultimate goal of calibration is to characterize these moments for every subsample used in analyses (i.e., to minimize the uncertainty in their mean redshift, RMS dispersion, etc.) rather than to make the moments themselves small. Calibration may be done with the same spectroscopic dataset used for training if that dataset is extremely high in redshift completeness (i.e., no populations of galaxies to be used in analyses are systematically missed). Accurate photo-z calibration is necessary for all imaging experiments.
Requirements: If extremely low levels of systematic incompleteness (≲0.1%) are attained in training samples, the same datasets described above should be sufficient for calibration. However, existing deep spectroscopic surveys have failed to yield secure redshifts for 30–60% of targets, so this would require very large improvements over past experience. Such incompleteness would be a limiting factor for training, but catastrophic for calibration. If ≲0.1% incompleteness is not attainable, the best known option for calibration of photometric redshifts is to utilize cross-correlation statistics in some form. The most direct method uses cross-correlations between the positions on the sky of bright objects of known spectroscopic redshift and the sample of objects whose redshift distribution we wish to calibrate, measured as a function of spectroscopic z. For such a calibration, redshifts of ∼100,000 objects over at least several hundred square degrees, spanning the full redshift range of the samples used for dark energy, would be necessary.
Options: The proposed BAO experiment eBOSS would provide sufficient spectroscopy for basic calibrations, particularly for ongoing and near-future imaging experiments. The planned DESI experiment would provide excellent calibration with redundant cross-checks, but will start after the conclusion of some imaging projects. An extension of DESI to the Southern hemisphere would provide the best possible calibration from cross-correlation methods for DES and LSST.
We thus anticipate that our two primary needs for spectroscopy – training and calibration of photometric redshifts – will require two separate solutions. For ongoing and future projects to reach their full potential, new spectroscopic samples of faint objects will be needed for training; those new samples may be suitable for calibration, but the latter possibility is uncertain. In contrast, wide-area samples of bright objects are poorly suited for training, but can provide high-precision calibrations via cross-correlation techniques. Additional training/calibration redshifts and/or host galaxy spectroscopy would enhance the use of supernovae and galaxy clusters for cosmology. We also summarize additional work on photometric redshift techniques that will be needed to prepare for data from ongoing and future dark energy experiments.
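The training goal described above, minimizing the moments of the photometric-minus-spectroscopic redshift distribution, is commonly quantified with a few robust statistics. The sketch below computes three widely used metrics (median bias, normalized median absolute deviation, and catastrophic-outlier fraction) on a toy sample; the conventions shown (scaling residuals by 1 + z, a 0.15 outlier threshold) are common community choices, not prescriptions from this white paper.

```python
import numpy as np

# Illustrative photo-z quality metrics on a synthetic sample. The (1 + z)
# scaling and the NMAD estimator are standard conventions in the field.
rng = np.random.default_rng(0)
z_spec = rng.uniform(0.1, 2.0, size=10000)
# Toy photo-z's: small Gaussian scatter plus a 2% catastrophic-outlier tail.
z_phot = z_spec + 0.02 * (1 + z_spec) * rng.standard_normal(10000)
outliers = rng.random(10000) < 0.02
z_phot[outliers] += 0.5

dz = (z_phot - z_spec) / (1 + z_spec)           # normalized residuals
bias = np.median(dz)                             # robust first moment
sigma_nmad = 1.4826 * np.median(np.abs(dz - np.median(dz)))  # robust scatter
outlier_frac = np.mean(np.abs(dz) > 0.15)        # catastrophic-outlier fraction
```

Shrinking sigma_nmad and outlier_frac with larger, more complete training sets is precisely what drives the constraining-power gains described above.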
Background and Significance: For older adults (≥60 years) newly diagnosed with acute myeloid leukemia (AML), tumor-specific and patient-specific factors have been used to predict treatment-related mortality (TRM) and determine suitability for intensive antileukemic therapy. Comprehensive Geriatric Assessment (CGA) is a tool designed to comprehensively evaluate health among older adults. CGA assesses multiple health domains, including physical function, cognition, nutrition, mental health, polypharmacy, comorbidities, and social support, but does not currently incorporate body imaging. Sarcopenia is defined as loss of muscle mass and strength. It can be measured objectively by CT, DEXA, or bioimpedance analysis (BIA) in combination with tests of muscle strength, and it has been proposed as a more accurate nutritional marker than BMI or serum markers. Nearly all older adults with AML receive a CT scan during their cancer evaluation, so there is an opportunity to leverage CT sarcopenia measures to improve risk prediction. Our study aims to (1) assess the burden of malnutrition and sarcopenia in older adults with newly diagnosed AML undergoing induction and (2) determine the prognostic impact of traditional markers of nutrition and novel sarcopenia measures on TRM. Study Design and Methods: This is a prospective, observational study of 82 newly diagnosed, older adult patients with AML undergoing induction treatment at the University of Chicago Comprehensive Cancer Center (NCT05458258). Key inclusion criteria include age ≥60 years and receipt of induction therapy for newly diagnosed AML. Key exclusion criteria include presence of a pacemaker or defibrillator. To assess Aim 1, newly diagnosed older adult patients with AML will undergo a subjective global nutrition assessment and measurement of serum nutritional markers (prealbumin, albumin, CRP, and ferritin). Body composition (fat mass, fat-free mass, lean mass) will be assessed using BIA and CT.
Sarcopenia will be defined by the CT Skeletal Muscle Index (SMI) at L3 plus impairment on at least one of maximal hand grip strength, the 6-minute walk, the Timed Up and Go test (TUG), or the Short Physical Performance Battery (SPPB). Patients will also undergo an assessment of disability (instrumental activities of daily living [IADL] and activities of daily living [ADL] surveys), the SPPB, and the Montreal Cognitive Assessment (MoCA) prior to starting induction. All measures will be repeated at the start of post-remission therapy. Results from both timepoints will be compared against healthy controls matched by age, sex, and Charlson Comorbidity Index. Healthy control data will come from the Frailty, Activity, Body Composition, and Energy Expenditure (FACE) Aging dataset housed by the Department of Geriatrics at the University of Chicago, a 1-year longitudinal, observational study of frailty in older adults residing around the university. We will match one AML case to one control; to ensure that all subjects are matched, 2:1 propensity score matching will be used to generate matching controls for each case. To assess Aim 2, multivariable Cox proportional hazards regression models will be performed for each nutrition status and sarcopenia marker significant for TRM in univariate analysis. In multivariable analyses, we will control for age, European LeukemiaNet 2022 risk stratification, ECOG PS, and CGA measures as covariates. A sample size of 82 patients will give us 80% power to detect a hazard ratio of 3.0 for TRM, the primary end point, for CT-diagnosed sarcopenia using a Cox proportional hazards model with a 0.05 significance level and a 60-day mortality rate of 18%. We anticipate completion of enrollment within two years of study initiation. To date, 17 patients have been approached for consent, with 11 patients enrolled.
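The stated power calculation (80% power to detect a hazard ratio of 3.0 at a 0.05 significance level) can be sanity-checked with Schoenfeld's classic approximation for the number of events required in a Cox model. The sketch below is illustrative only: the assumed 50% sarcopenia prevalence is our assumption, and the study's own calculation likely used different inputs.

```python
from math import log
from statistics import NormalDist

def schoenfeld_events(hr, alpha=0.05, power=0.80, p_exposed=0.5):
    """Approximate number of events needed to detect hazard ratio `hr`
    with a two-sided Wald test in a Cox model (Schoenfeld's formula)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for alpha
    z_b = NormalDist().inv_cdf(power)           # critical value for power
    return (z_a + z_b) ** 2 / (p_exposed * (1 - p_exposed) * log(hr) ** 2)

# Assuming (for illustration only) that half the cohort is sarcopenic:
events_needed = schoenfeld_events(3.0)
```

Under these assumptions roughly 26 TRM events would be required; the abstract's figure rests on the study team's own design parameters.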
When complete, this trial will provide initial evidence necessary to recommend nutritional assessment as a part of the Comprehensive Geriatric Assessment in all older adult AML patients. Furthermore, it may provide evidence to support a future interventional study assessing the impact of improved nutritional status on TRM. Figure 1. NCT05458258 study schema.
ABSTRACT
We present the bright (Vmag = 9.12) multiplanet system TOI-431, characterized with photometry and radial velocities (RVs). We estimate the stellar rotation period to be 30.5 ± 0.7 d using archival photometry and RVs. Transiting Exoplanet Survey Satellite (TESS) Object of Interest (TOI)-431 b is a super-Earth with a period of 0.49 d, a radius of 1.28 ± 0.04 R⊕, a mass of 3.07 ± 0.35 M⊕, and a density of 8.0 ± 1.0 g cm−3; TOI-431 d is a sub-Neptune with a period of 12.46 d, a radius of 3.29 ± 0.09 R⊕, a mass of $9.90^{+1.53}_{-1.49}$ M⊕, and a density of 1.36 ± 0.25 g cm−3. We find a third planet, TOI-431 c, in the High Accuracy Radial velocity Planet Searcher (HARPS) RV data, but it is not seen to transit in the TESS light curves. It has an Msin i of $2.83^{+0.41}_{-0.34}$ M⊕ and a period of 4.85 d. TOI-431 d likely has an extended atmosphere and is one of the most well-suited TESS discoveries for atmospheric characterization, while the super-Earth TOI-431 b may be a stripped core. These planets straddle the radius gap, presenting an interesting case study for atmospheric evolution, and TOI-431 b is a prime TESS discovery for the study of rocky planet phase curves.
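The quoted densities follow from the masses and radii via the bulk-density relation ρ = ρ⊕ (M/M⊕)/(R/R⊕)³. A quick arithmetic check for TOI-431 b, using the point estimates from the abstract and Earth's mean density of 5.514 g cm⁻³:

```python
# Consistency check: bulk density from the quoted mass and radius,
# scaled to Earth's mean density (5.514 g cm^-3).
EARTH_DENSITY = 5.514  # g cm^-3

def bulk_density(mass_earth, radius_earth):
    """Density in g cm^-3 for mass and radius in Earth units."""
    return EARTH_DENSITY * mass_earth / radius_earth ** 3

rho_b = bulk_density(3.07, 1.28)  # TOI-431 b
```

The result, about 8.1 g cm⁻³, agrees with the published 8.0 ± 1.0 g cm⁻³ (the paper's value comes from the full posterior, not point estimates, so small differences are expected).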
Voltage-dependent sodium channels are uniformly distributed along unmyelinated axons, but are highly concentrated at nodes of Ranvier in myelinated axons. Here, we show that this pattern is associated with differential localization of distinct sodium channel α subunits to the unmyelinated and myelinated zones of the same retinal ganglion cell axons. In adult axons, Nav1.2 is localized to the unmyelinated zone, whereas Nav1.6 is specifically targeted to nodes. During development, Nav1.2 is expressed first and becomes clustered at immature nodes of Ranvier, but as myelination proceeds, Nav1.6 replaces Nav1.2 at nodes. In Shiverer mice, which lack compact myelin, Nav1.2 is found throughout adult axons, whereas little Nav1.6 is detected. Together, these data show that sodium channel isoforms are differentially targeted to distinct domains of the same axon in a process associated with the formation of compact myelin.
We report the discovery of an intermediate-mass transiting brown dwarf (BD), TOI-503b, from the TESS mission. TOI-503b is the first BD discovered by TESS, and it has a circular orbit around a metallic-line A-type star with a period of P = 3.6772 ± 0.0001 days. The light curve from TESS indicates that TOI-503b transits its host star in a grazing manner, which limits the precision with which we measure the BD's radius, R(b) = 1.34 (+0.26, −0.15) R(J). We obtained high-resolution spectroscopic observations with the FIES, Ondrejov, PARAS, Tautenburg, and TRES spectrographs, and measured the mass of TOI-503b to be M(b) = 53.7 ± 1.2 M(J). The host star has a mass of M(*) = 1.80 ± 0.06 M(ʘ), a radius of R(*) = 1.70 ± 0.05 R(ʘ), an effective temperature of T(eff) = 7650 ± 160 K, and a relatively high metallicity of 0.61 ± 0.07 dex. We used stellar isochrones to derive the age of the system to be ∼180 Myr, which places its age between that of RIK 72b (a ∼10 Myr old BD in the Upper Scorpius stellar association) and AD 3116b (a ∼600 Myr old BD in the Praesepe cluster). Given the difficulty in measuring the tidal interactions between BDs and their host stars, we cannot precisely say whether this BD formed in situ or has had its orbit circularized by its host star over the relatively short age of the system. Instead, we offer an examination of plausible values for the tidal quality factor for the star and BD. TOI-503b joins a growing number of known short-period, intermediate-mass BDs orbiting main sequence stars, and is the second such BD known to transit an A star, after HATS-70b. With the growth of the population in this regime, the driest region in the BD desert (35–55 M(J) sin i) is reforesting.
Binospec Software System
Kansky, Jan; Chilingarian, Igor; Fabricant, Daniel
Publications of the Astronomical Society of the Pacific, 07/2019, Volume 131, Number 1001
Journal article; peer reviewed; open access
Binospec is a high-throughput, 370 to 1000 nm imaging spectrograph that addresses two adjacent 8′ by 15′ fields of view. Binospec was commissioned in late 2017 at the f/5 focus of the 6.5 m MMT and is now available to all MMT observers. Here we describe the Binospec software used for observation planning, instrument control, and data reduction. The software and control systems incorporate a high level of automation to minimize observer workload. Instrument configuration and observation sequencing are implemented using a database-driven approach to maximize observatory efficiency. A web-based interface allows users to define observations, monitor status, and retrieve data products.
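A database-driven approach to observation sequencing typically stores observation definitions and their states in tables and selects the next action by query. The sketch below illustrates that general pattern with sqlite3; every table name, column, and target here is invented for illustration and is not taken from the actual Binospec software.

```python
import sqlite3

# Hypothetical sketch of a database-driven observation queue: observations are
# defined once as rows, then sequenced by priority and status. None of these
# names come from the real Binospec system.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE observations (
    id INTEGER PRIMARY KEY,
    target TEXT NOT NULL,
    config TEXT NOT NULL,              -- e.g. a grating/mask choice
    priority INTEGER NOT NULL,         -- 1 = highest
    status TEXT NOT NULL DEFAULT 'queued'
)""")
conn.executemany(
    "INSERT INTO observations (target, config, priority) VALUES (?, ?, ?)",
    [("NGC1275", "270gpm", 2), ("M87", "600gpm", 1), ("3C273", "270gpm", 3)],
)

def next_observation(conn):
    """Pop the highest-priority queued observation and mark it active."""
    row = conn.execute(
        "SELECT id, target FROM observations "
        "WHERE status = 'queued' ORDER BY priority LIMIT 1"
    ).fetchone()
    if row is None:
        return None
    conn.execute("UPDATE observations SET status = 'active' WHERE id = ?", (row[0],))
    return row[1]
```

Keeping state in the database rather than in the control process is what lets a web interface and the sequencer share a single consistent view of the queue.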