Heatwaves (HWs) are high-impact phenomena stressing both societies and ecosystems. Their intensity and frequency are expected to increase in a warmer climate over many regions of the world. While these impacts can be wide-ranging, they are potentially influenced by local to regional features such as topography, land cover, and urbanization. Here, we leverage recent advances in the very high-resolution modelling required to elucidate the impacts of heatwaves at these fine scales. Further, we aim to understand how the new generation of km-scale regional climate models (RCMs) modulates the representation of heatwaves over a well-known climate change hot spot. We analyze an ensemble of 15 convection-permitting regional climate model (CPRCM, ~2–4 km grid spacing) simulations and their driving, convection-parameterized regional climate model (RCM, ~12–15 km grid spacing) simulations from the CORDEX Flagship Pilot Study on Convection. The focus is on the evaluation experiments (2000–2009) and three subdomains with a range of climatic characteristics. During HWs, and generally in the summer season, CPRCMs exhibit warmer and drier conditions than their driving RCMs. Higher maximum temperatures arise from an altered heat flux partitioning, with daily peaks of latent heat up to ~150 W/m² larger in the RCMs than in the CPRCMs. This is driven by a 5–25% lower soil moisture content in the CPRCMs, which is in turn related to longer dry spell lengths (up to double). It is challenging to ascertain whether these differences represent an improvement. However, a point-scale, distribution-based evaluation of maximum temperature suggests that the warmer/drier tendency of the CPRCMs is likely more realistic than the RCMs, with ~70% of reference sites indicating added value over the driving RCMs, increasing to 95% when only the right tail of the distribution is considered. Conversely, a slight detrimental effect of the CPRCMs is found with the upscaled grid-to-grid approach over flat areas. Certainly, CPRCMs enhance dry conditions, with knock-on implications for summer-season temperature overestimation. Whether this improved physical representation of HWs also has implications for future changes is under investigation.
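The dry-spell diagnostic invoked above is straightforward to compute. Below is a minimal sketch of one common definition, the longest run of consecutive days with precipitation below a wet-day threshold; the 1 mm/day threshold and the sample series are assumptions for illustration, not values taken from the study.

```python
# Sketch: maximum dry-spell length from a daily precipitation series.
# The 1.0 mm/day wet-day threshold is an assumed, commonly used cutoff.

def max_dry_spell(precip_mm, wet_threshold=1.0):
    """Return the longest run of consecutive days with precip below threshold."""
    longest = current = 0
    for p in precip_mm:
        if p < wet_threshold:
            current += 1
            longest = max(longest, current)
        else:
            current = 0
    return longest

# Hypothetical daily series: the longest dry run spans four days.
series = [0.0, 0.2, 5.1, 0.0, 0.0, 0.3, 0.9, 2.4, 0.0]
print(max_dry_spell(series))  # -> 4
```

Comparing this statistic between the CPRCM and RCM grids over a summer season would quantify the "up to double" dry-spell difference the abstract reports.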
The advancement of computational resources has allowed researchers to run convection-permitting regional climate model (CPRCM) simulations. A pioneering effort promoting a multimodel ensemble of such simulations is the CORDEX Flagship Pilot Study (FPS) on "Convective Phenomena over Europe and the Mediterranean" over an extended Alps region. In this study, the Distribution Added Value metric is used to determine the improvement in the representation of daily mean near-surface wind speed across all available FPS hindcast simulations. The analysis is performed on normalized empirical probability distributions and uses station observation data as the reference. The use of a normalized metric allows for spatial comparison among different regions (coast and inland), altitudes, and seasons. This approach permits a direct assessment of the added value of the CPRCM simulations against their global driving reanalysis (ERA-Interim) and their respective coarser-resolution regional model counterparts. In general, the results show that CPRCMs add value to their global driving reanalysis or forcing regional model, owing to better-resolved topography or better representation of ocean-land contrasts. However, the nature and magnitude of the improvement in wind speed representation vary depending on the model, the season, the altitude, and the region. Among seasons, the improvement is usually larger in summer than in winter. CPRCMs generally display gains at low and medium altitudes. In addition, despite some shortcomings in comparison to ERA-Interim, which can be attributed to the assimilation of wind observations on the coast, the CPRCMs outperform the coarser regional climate models both along the coast and inland.
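As a concrete illustration of a distribution added value computation, the sketch below assumes a Perkins-style overlap skill score, the sum over shared histogram bins of the minimum of the model and observed bin frequencies, with added value expressed as the percentage change in that score from the coarse to the high-resolution model. The bin layout, normalization, and synthetic wind-speed samples are assumptions for illustration, not the study's exact formulation.

```python
import numpy as np

# Assumed skill score: S = sum_i min(f_model_i, f_obs_i) over shared bins,
# with DAV = 100 * (S_hires - S_coarse) / S_coarse. Bins and sample data
# are illustrative choices, not the FPS study's configuration.

def skill_score(model, obs, bins):
    fm, _ = np.histogram(model, bins=bins)
    fo, _ = np.histogram(obs, bins=bins)
    fm = fm / fm.sum()
    fo = fo / fo.sum()
    return np.minimum(fm, fo).sum()

def dav(hires, coarse, obs, bins):
    s_hr = skill_score(hires, obs, bins)
    s_lr = skill_score(coarse, obs, bins)
    return 100.0 * (s_hr - s_lr) / s_lr

rng = np.random.default_rng(0)
obs = rng.normal(5.0, 2.0, 5000)     # hypothetical observed wind speeds (m/s)
hires = rng.normal(5.2, 2.1, 5000)   # CPRCM-like sample: close to obs
coarse = rng.normal(6.5, 1.2, 5000)  # coarse-model-like sample: biased
bins = np.linspace(-2, 14, 33)
print(round(dav(hires, coarse, obs, bins), 1))  # positive -> added value
```

A positive DAV means the high-resolution distribution overlaps the observed one more than the coarse distribution does, which is the sense in which the abstract reports gains.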
Theories explaining why students drop out of college have evolved to emphasize interactions between students and their college environments. While the interactionist model underscores the influence of social integration on student retention, few studies have examined the role of students' social networks in the decision-making process. Drawing on a survey of 952 students at a public research-intensive university, this study examines student characteristics and behaviors, faculty-student interactions, and institutional characteristics in relation to whom students would turn to if considering dropping out of college. Findings indicate that students would be more likely to turn to friends or family than to campus faculty or staff. The frequency and perceived quality of interactions with campus personnel increase the likelihood that students would turn to them about this decision. Suggestions are provided for institutional practices that may improve college retention interventions.
LHCb data quality monitoring. Adinolfi, M; Archilli, F; Baldini, W, et al. Journal of Physics: Conference Series, 10/2017, Volume 898, Issue 9. Journal article, peer-reviewed, open access.
Data quality monitoring, DQM, is crucial in a high-energy physics experiment to ensure the correct functioning of the experimental apparatus during data taking. DQM at LHCb is carried out in two phases. The first is performed on-site, in real time, using unprocessed data directly from the LHCb detector, while the second, also performed on-site, requires the reconstruction of the data selected by the LHCb trigger system and occurs later. For the LHC Run II data taking, the LHCb collaboration has re-engineered the DQM protocols and the DQM graphical interface, moving the latter to a web-based monitoring system, called Monet, thus allowing researchers to perform the second phase off-site. To support the operator's task, Monet is also equipped with an automated, fully configurable alarm system, allowing its use not only for DQM purposes but also to track and assess the quality of LHCb software and simulation over time.
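A configurable alarm of the kind described could, in its simplest form, compare a monitored histogram against a reference and fire when the deviation exceeds a threshold. The sketch below uses a normalized chi-square-like statistic with an arbitrary threshold; it is an illustrative stand-in, not the Monet implementation.

```python
import numpy as np

# Sketch of an automated histogram-comparison alarm. The statistic
# (a normalized chi-square distance) and the threshold value are
# illustrative assumptions, not LHCb's actual alarm configuration.

def dqm_alarm(current, reference, threshold=2.0):
    """Return True when the monitored histogram deviates strongly from its reference."""
    cur = np.asarray(current, dtype=float)
    ref = np.asarray(reference, dtype=float)
    cur, ref = cur / cur.sum(), ref / ref.sum()
    mask = ref > 0
    chi2 = len(ref) * np.sum((cur[mask] - ref[mask]) ** 2 / ref[mask])
    return bool(chi2 > threshold)

reference = [10, 20, 40, 20, 10]                   # hypothetical reference histogram
print(dqm_alarm([11, 19, 41, 19, 10], reference))  # small drift -> False
print(dqm_alarm([40, 20, 10, 10, 20], reference))  # large shift -> True
```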
SUMMARY
Microseismic monitoring is a primary tool for understanding and tracking the progress of mechanical processes occurring in active rock fracture systems. In geothermal or hydrocarbon fields, or along seismogenic fault systems, the detection and location of microseismicity facilitates resolution of the fracture system geometry and the investigation of the interaction between fluids and rocks in response to stress field perturbations. Seismic monitoring aims to detect, locate, and characterize seismic sources. The detection of weak signals is often achieved at the cost of an increased number of false detections, related to transient signals generated by a range of noise sources, or to instrumental problems, ambient conditions, or human activity that often affect seismic records. A variety of fast and automated methods has recently been proposed to detect and locate microseismicity based on the coherent detection of signal anomalies, such as increases in amplitude or coherent polarization, at dense seismic networks. While these methods have proved very powerful for detecting weak events and reducing the magnitude of completeness, a major remaining problem is discriminating between weak seismic signals produced by microseismicity and false detections. In this work, the microseismic data recorded along the Irpinia fault zone (Southern Apennines, Italy) are analysed to detect weak, natural earthquakes using one such automated, migration-based method. We propose a new method for the automatic discrimination of real vs false detections, based on empirical data and information about the detectability (i.e. detection capability) of the seismic network. Our approach achieves high performance in detecting earthquakes without requiring visual inspection of the seismic signals, minimizing analyst intervention. The proposed methodology is automated, self-updating, and can be tuned to different success rates.
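One way to picture a detectability-based real-vs-false discriminator is as a check that the stations which actually triggered are consistent with those the network is empirically capable of detecting the event at. The sketch below reduces this to a simple trigger-fraction rule; the station codes, the 0.6 threshold, and the decision rule itself are illustrative simplifications, not the method proposed in the paper.

```python
# Illustrative sketch: a candidate detection is kept only if enough of the
# stations empirically expected to detect the event actually triggered.
# Station codes and the minimum fraction are hypothetical.

def classify_detection(triggered, detectable, min_fraction=0.6):
    """triggered, detectable: sets of station codes. Returns True for 'real'."""
    if not detectable:
        return False
    frac = len(triggered & detectable) / len(detectable)
    return frac >= min_fraction

detectable = {"ST01", "ST02", "ST03", "ST04", "ST05"}
real_like = {"ST01", "ST02", "ST03", "ST05"}   # 4/5 expected stations triggered
noise_like = {"ST02", "ST09"}                  # 1/5, plus an unexpected station
print(classify_detection(real_like, detectable))   # -> True
print(classify_detection(noise_like, detectable))  # -> False
```

Tuning `min_fraction` is analogous to tuning the success rate mentioned in the abstract: a lower value keeps more weak events at the cost of more false detections.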
Background: Karyotype analysis has been the standard method for prenatal cytogenetic diagnosis since the 1970s. Although highly reliable, its major limitation remains the requirement for cell culture, resulting in a delay of as much as 14 days in obtaining test results. Fluorescent in situ hybridisation (FISH) and quantitative fluorescent PCR (QF-PCR) rapidly detect common chromosomal abnormalities but do not provide a genome-wide screen for unexpected imbalances. Array comparative genomic hybridisation (CGH) has the potential to combine the speed of DNA analysis with a large capacity to scan for genomic abnormalities. We have developed a genomic microarray of approximately 600 large insert clones designed to detect aneuploidy, known microdeletion syndromes, and large unbalanced chromosomal rearrangements. Methods: This array was tested alongside an array with an approximate resolution of 1 Mb in a blind study of 30 cultured prenatal and postnatal samples with microscopically confirmed unbalanced rearrangements. Results: At 1 Mb resolution, 22/30 rearrangements were identified, whereas 29/30 aberrations were detected using the custom designed array, owing to the inclusion of specifically chosen clones giving increased resolution at genomic loci clinically implicated in known microdeletion syndromes. Both arrays failed to identify a triploid karyotype. Thirty normal control samples produced no false positive results. Conclusions: Analysis of 30 uncultured prenatal samples showed that array CGH is capable of detecting aneuploidy in DNA isolated from as little as 1 ml of uncultured amniotic fluid; 29/30 samples were correctly diagnosed, the exception being another case of triploidy. These studies demonstrate the potential for array CGH to replace conventional cytogenetics in the great majority of prenatal diagnosis cases.
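The dosage signal an array of this kind detects can be illustrated with per-clone log2(test/reference) ratios: a trisomic chromosome shifts its mean ratio toward log2(3/2) ~ 0.58, a monosomic one toward -1, while triploidy changes every chromosome's dose equally and so leaves all ratios balanced, consistent with the triploid cases both arrays missed. The thresholds and synthetic data below are assumptions for illustration.

```python
import numpy as np

# Sketch: chromosome-level gain/loss calling from array CGH log2 ratios.
# The +/-0.3 thresholds and the synthetic clone data are illustrative.

def call_chromosome(log2_ratios, gain_thresh=0.3, loss_thresh=-0.3):
    mean = float(np.mean(log2_ratios))
    if mean >= gain_thresh:
        return "gain"
    if mean <= loss_thresh:
        return "loss"
    return "balanced"

rng = np.random.default_rng(2)
normal_chr = rng.normal(0.0, 0.15, 40)    # clones on a balanced chromosome
trisomic_chr = rng.normal(0.58, 0.15, 40) # clones on a trisomic chromosome
print(call_chromosome(normal_chr))    # -> balanced
print(call_chromosome(trisomic_chr))  # -> gain
```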
A modern digital seismic network, with many stations optimally distributed over the causative seismic zone, enables the detection of very low magnitude earthquakes and the determination of their source parameters. It is essential to couple such networks with procedures for analyzing the huge amount of continuously recorded data, so as to monitor the space-time-magnitude evolution of natural and/or induced seismicity. Hence, the demand for near-real-time, automated data collection and analysis procedures to assist seismic network operators in microearthquake monitoring is growing. In response to this need, we designed a computational software platform, TREMOR, for fast and reliable detection and characterization of seismicity recorded by a dense local seismic network. TREMOR integrates different open-source seismological algorithms for earthquake signal detection, location, and source characterization in a fully automatic workflow. We applied the platform in play-back mode to the continuous waveform data recorded during one month at the Japanese Hi-net seismic network in the Nagano region (Japan) and compared the resulting catalog with the Japan Meteorological Agency bulletin in terms of number of detections, location pattern, and magnitudes. The results show that the completeness magnitude of the new seismic catalog decreased by 0.35 units of the local magnitude scale and, consequently, the number of events increased by about 60% with respect to the available catalog. Moreover, the fault plane solutions are coherent with the stress regime of the region, and the Vp/Vs ratio well delineates the main structural features of the area. According to our results, TREMOR is a valid tool for investigating and studying earthquakes, especially for identifying and monitoring natural or induced microseismicity.
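The completeness magnitude quoted above can be estimated from a catalog in several standard ways; the sketch below uses the maximum-curvature method, taking the modal bin of the non-cumulative magnitude distribution, often with a small positive correction. The bin width, the +0.2 correction, and the toy catalog are assumptions for illustration, not TREMOR's actual procedure.

```python
import numpy as np

# Sketch: completeness magnitude (Mc) by maximum curvature. Bins are
# centered on magnitude values to avoid edge effects; the +0.2 correction
# is a commonly applied but assumed adjustment.

def mc_max_curvature(mags, bin_width=0.1, correction=0.2):
    mags = np.asarray(mags, dtype=float)
    edges = np.arange(mags.min() - bin_width / 2,
                      mags.max() + bin_width, bin_width)
    counts, edges = np.histogram(mags, bins=edges)
    modal_center = edges[np.argmax(counts)] + bin_width / 2
    return round(modal_center + correction, 2)

# Hypothetical catalog: counts peak at M 1.0, so Mc = 1.0 + 0.2 = 1.2
mags = [0.8] * 3 + [0.9] * 5 + [1.0] * 12 + [1.1] * 9 + [1.2] * 7 + [1.5] * 2
print(mc_max_curvature(mags))  # -> 1.2
```

Lowering Mc by 0.35 units, as TREMOR does relative to the JMA bulletin, directly expands the usable portion of the catalog, which is how the ~60% event increase arises.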
We present the first seismic imaging of the crustal volume affected by the March-April 2021 Thessaly sequence, obtained by applying 3D seismic tomography to the aftershocks recorded by an unprecedented number of stations. The results, in terms of Vp, Vs, and Vp/Vs ratio and earthquake location parameters, depict blind fluid-filled inherited structures within the Northern Thessaly seismic gap. The tomographic images highlight the basal detachment accommodating the Pelagonian nappe onto the carbonates of the Gavrovo unit. The high Vp/Vs (>1.85) zone where most of the seismicity occurs increases from SE to NW, suggesting possible fluid accumulation at the NW edge of the seismogenic volume that could have contributed to the sequence evolution. The aftershock relocations correlate well with the fault planes of the three mainshocks proposed by several geodetic models, but also reveal additional possible faults, sub-parallel and antithetical to the main structures, that should not be overlooked for future seismic risk mitigation.
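The Vp/Vs ratio central to these images is classically estimated from a Wadati diagram, in which the S-minus-P delay grows linearly with the P travel time and the slope equals Vp/Vs - 1. The sketch below fits synthetic travel times generated under an assumed Vp/Vs of 1.85 (the threshold value quoted above); the data are illustrative, not from the sequence.

```python
import numpy as np

# Sketch: Vp/Vs from a Wadati diagram, ts - tp = (Vp/Vs - 1) * tp.
# Synthetic travel times assume Vp/Vs = 1.85 plus small noise.

rng = np.random.default_rng(1)
tp = np.linspace(1.0, 8.0, 25)                              # P travel times (s)
ts_minus_tp = (1.85 - 1.0) * tp + rng.normal(0, 0.02, tp.size)

slope, intercept = np.polyfit(tp, ts_minus_tp, 1)
vp_vs = 1.0 + slope
print(round(vp_vs, 2))  # recovers ~1.85
```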
Although surfactants have received considerable attention as a potential means for enhancing the recovery of organic compounds from the subsurface, only limited information is available regarding the micellar solubilization of common groundwater contaminants by nonionic surfactants. The purpose of this study was to examine the influence of surfactant properties and environmental factors on the solubility of dodecane, tetrachloroethylene (PCE), and 1,2-dichlorobenzene (DCB) in micellar solutions of Witconol 2722, Tergitol NP-15, and Witconol SN-120. A matrix of batch experiments was performed at 10 and 25 °C, and in the presence of CaCl2, for surfactant concentrations ranging from 0.5 to 15% by weight. Although the hydrophile-lipophile balance (HLB) values of the surfactants are similar, Witconol 2722 solubilized approximately three times more organic compound than the other surfactants, which was attributed to the greater alkyl chain length and ethoxylation of Witconol 2722. Results of HLB scans, conducted using Tergitol NP surfactants, showed that solubilization capacity was related to the micelle core volume and that cloud point effects can reduce the aqueous solubility of PCE. These findings demonstrate the importance of considering specific surfactant-organic interactions, cloud point temperature, and macroemulsion formation when selecting nonionic surfactants for use in subsurface remediation applications.
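Micellar solubilization above the CMC is commonly quantified by the molar solubilization ratio (MSR), the slope of apparent organic solubility versus surfactant concentration. The sketch below fits that slope by least squares; the concentration values are made-up numbers for illustration, not the paper's measurements.

```python
import numpy as np

# Sketch: molar solubilization ratio as the slope of organic solubility
# versus surfactant concentration above the CMC. Data are hypothetical.

surf = np.array([0.01, 0.02, 0.04, 0.08, 0.12])      # surfactant (mol/L), > CMC
solub = np.array([0.9, 2.0, 3.9, 8.1, 12.0]) * 1e-3  # apparent solubility (mol/L)

msr, intercept = np.polyfit(surf, solub, 1)
print(round(msr, 3))  # mol organic solubilized per mol micellar surfactant
```

Comparing MSR values fitted this way across surfactants is one standard route to the "three times more" comparison made above.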
The quantitative fluorescent PCR (QF-PCR) assay, introduced during the last few years, allows prenatal diagnosis of common chromosome aneuploidies within a few hours of sampling. We report the first assessment of QF-PCR performed on a large cohort of 18 000 consecutive clinical specimens analysed in two different centres. All samples were analysed by QF-PCR using several selected STR markers together with amelogenin and, occasionally, SRY for fetal sexing. Results were compared with those obtained by conventional cytogenetic analysis. In 17 129 tests, normal fetuses were detected by QF-PCR, with no false positives observed. All 732 cases of trisomy 21, 18, and 13, triploidies, and double trisomies, as well as all but one fetus with X and Y aneuploidies, were correctly diagnosed. Chromosome mosaicism could also be suspected in several samples. In some cases of in vitro culture failure, QF-PCR provided the only evidence of the fetal X, Y, 21, 18, and 13 chromosome complement. QF-PCR proved efficient and reliable in detecting major numerical chromosome disorders. The main advantages of the molecular assay are its very low cost, speed, and automation, enabling a single operator to perform up to 40 assays per day. QF-PCR relieves the anxiety of most parents within 24 h of sampling and accelerates therapeutic intervention in the case of an abnormal result. In countries where large-scale conventional cytogenetics is hampered by high cost and lack of technical expertise, QF-PCR may be used as the only prenatal diagnostic test.
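The core QF-PCR decision logic per STR marker can be summarized as: two peaks near a 1:1 height ratio indicate disomy, while three peaks, or two peaks near 2:1, indicate trisomy; a single homozygous peak is uninformative for that marker. The sketch below encodes that rule; the peak heights and ratio windows are illustrative assumptions, not the centres' validated thresholds.

```python
# Sketch: per-marker STR allele-ratio call in QF-PCR. Ratio windows
# (1:1 band, 2:1 band) are assumed illustrative values.

def call_str_marker(peak_heights, lo=0.8, hi=1.4, tri_lo=1.8, tri_hi=2.4):
    if len(peak_heights) == 3:
        return "trisomic"          # three alleles at one locus
    if len(peak_heights) == 2:
        a, b = sorted(peak_heights, reverse=True)
        ratio = a / b
        if lo <= ratio <= hi:
            return "normal"        # ~1:1 diallelic pattern
        if tri_lo <= ratio <= tri_hi:
            return "trisomic"      # ~2:1 diallelic pattern
        return "inconclusive"
    return "uninformative"         # single homozygous peak

print(call_str_marker([1050, 980]))       # ~1:1 -> normal
print(call_str_marker([2100, 1000]))      # ~2:1 -> trisomic
print(call_str_marker([900, 950, 1010]))  # three alleles -> trisomic
```

In practice a diagnosis rests on several concordant markers per chromosome, which is why the assay uses multiple selected STRs rather than a single locus.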