The problem of minimizing losses in distribution networks has traditionally been investigated using a single, deterministic demand level. This has proved to be effective, since most approaches are generally able to also result in minimum overall energy losses. However, the increasing penetration of (firm and variable) distributed generation (DG) raises concerns about the actual benefits of loss minimization studies that are limited to a single demand/generation scenario. Here, a multiperiod AC optimal power flow (OPF) is used to determine the optimal accommodation of (renewable) DG in a way that minimizes the system energy losses. In addition, control schemes expected to be part of the future Smart Grid, such as coordinated voltage control and dispatchable DG power factor, are embedded in the OPF formulation to explore the extra loss reduction benefits that can be harnessed with such technologies. The trade-off between energy losses and more generation capacity is also investigated. The methodology is applied to a generic U.K. distribution network, and the results demonstrate the significant impact that considering time-varying characteristics has on the energy loss minimization problem; they also highlight the gains that the flexibility provided by innovative control strategies can bring to both loss minimization and generation capacity.
The LHCb simulation application, Gauss, consists of two independent phases: the generation of the primary event and the tracking of particles produced in the experimental setup. For the LHCb experimental program it is particularly important to model B meson decays: the EvtGen code developed in CLEO and BABAR has been chosen and customized for the non-coherent B production occurring in pp collisions at the LHC. The initial proton-proton collision is provided by a different generator engine, currently PYTHIA 6, for massive production of signal and generic pp collision events. Beam gas events, background events originating from proton halo, cosmics and calibration events for different detectors can be generated in addition to pp collisions. Different generator packages, as available in the physics community or specifically developed in LHCb, are used for these different purposes. Running conditions affecting the generated events, such as the size of the luminous region, the number of collisions occurring in a bunch crossing and the number of spill-over events from neighbouring bunches, are modeled via dedicated algorithms appropriately configured. The design of the generator phase of Gauss will be described: a modular structure with well-defined interfaces specific to the various tasks (e.g. pp collisions, particle decays, selections) has been chosen. Different implementations are available for the various tasks, allowing them to be selected and combined as most appropriate at run time, as in the case of PYTHIA 6 for pp collisions or HIJING for beam gas. The advantages of such a structure, which for example allows new generator packages to be adopted transparently, will be discussed.
Microplastics (<5 mm) have been documented in environmental samples on a global scale. While these pollutants may enter aquatic environments via wastewater treatment facilities, the abundance of microplastics in these matrices has not been investigated. Although efficient methods for the analysis of microplastics in sediment samples and marine organisms have been published, no methods have been developed for detecting these pollutants within organic-rich wastewater samples. In addition, there is no standardized method for analyzing microplastics isolated from environmental samples. In many cases, part of the identification protocol relies on visual selection before analysis, which is open to bias. In order to address this, a new method for the analysis of microplastics in wastewater was developed. A pretreatment step using 30% hydrogen peroxide (H2O2) was employed to remove biogenic material, and focal plane array (FPA)-based reflectance micro-Fourier-transform infrared (micro-FT-IR) imaging was shown to successfully image and identify different microplastic types (polyethylene, polypropylene, nylon-6, polyvinyl chloride, polystyrene). Microplastic-spiked wastewater samples were used to validate the methodology, resulting in a robust protocol that was nonselective and reproducible (the overall identification success rate was 98.33%). The use of FPA-based micro-FT-IR spectroscopy also provides a considerable reduction in analysis time compared with previous methods, since samples that could take several days to be mapped using a single-element detector can now be imaged in less than 9 h (circular filter with a diameter of 47 mm). This method for identifying and quantifying microplastics in wastewater is likely to provide an essential tool for further research into the pathways by which microplastics enter the environment.
This review critically summarizes the neuropathology and genetics of schizophrenia, the relationship between them, and speculates on their functional convergence. The morphological correlates of schizophrenia are subtle, and range from a slight reduction in brain size to localized alterations in the morphology and molecular composition of specific neuronal, synaptic, and glial populations in the hippocampus, dorsolateral prefrontal cortex, and dorsal thalamus. These findings have fostered the view of schizophrenia as a disorder of connectivity and of the synapse. Although attractive, such concepts are vague, and differentiating primary events from epiphenomena has been difficult. A way forward is provided by the recent identification of several putative susceptibility genes (including neuregulin, dysbindin, COMT, DISC1, RGS4, GRM3, and G72). We discuss the evidence for these and other genes, along with what is known of their expression profiles and biological roles in brain and how these may be altered in schizophrenia. The evidence for several of the genes is now strong. However, for none, with the likely exception of COMT, has a causative allele or the mechanism by which it predisposes to schizophrenia been identified. Nevertheless, we speculate that the genes may all converge functionally upon schizophrenia risk via an influence upon synaptic plasticity and the development and stabilization of cortical microcircuitry. NMDA receptor-mediated glutamate transmission may be especially implicated, though there are also direct and indirect links to dopamine and GABA signalling. Hence, there is a correspondence between the putative roles of the genes at the molecular and synaptic levels and the existing understanding of the disorder at the neural systems level. Characterization of a core molecular pathway and a 'genetic cytoarchitecture' would be a profound advance in understanding schizophrenia, and may have equally significant therapeutic implications.
Subfossil pollen and plant macrofossil data derived from ¹⁴C-dated sediment profiles can provide quantitative information on glacial and interglacial climates. The data allow climate variables related to growing-season warmth, winter cold, and plant-available moisture to be reconstructed. Continental-scale reconstructions have been made for the mid-Holocene (MH, around 6 ka) and Last Glacial Maximum (LGM, around 21 ka), allowing comparison with palaeoclimate simulations currently being carried out as part of the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change. The synthesis of the available MH and LGM climate reconstructions and their uncertainties, obtained using modern-analogue, regression and model-inversion techniques, is presented for four temperature variables and two moisture variables. Reconstructions of the same variables based on surface-pollen assemblages are shown to be accurate and unbiased. Reconstructed LGM and MH climate anomaly patterns are coherent, consistent between variables, and robust with respect to the choice of technique. They support a conceptual model of the controls of Late Quaternary climate change whereby the first-order effects of orbital variations and greenhouse forcing on the seasonal cycle of temperature are predictably modified by responses of the atmospheric circulation and surface energy balance.
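The modern-analogue technique mentioned above can be illustrated with a minimal sketch: a fossil pollen assemblage is compared against a training set of modern surface samples with known climate, and the climate variable is estimated as the mean over the k closest analogues. The squared-chord dissimilarity is a standard choice for pollen proportions; all sample values below are hypothetical, and real reconstructions use hundreds of taxa and thousands of surface samples.

```python
import numpy as np

def squared_chord_distance(a, b):
    # Dissimilarity between two pollen assemblages given as taxon proportions.
    return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2)

def modern_analogue_reconstruction(fossil, modern_assemblages, modern_climate, k=5):
    """Estimate a climate variable for a fossil sample as the mean climate
    of its k most similar modern analogues."""
    d = np.array([squared_chord_distance(fossil, m) for m in modern_assemblages])
    best = np.argsort(d)[:k]
    return modern_climate[best].mean()

# Illustrative data: 3 taxa, 6 modern surface samples (hypothetical values).
modern = np.array([
    [0.60, 0.30, 0.10],
    [0.50, 0.40, 0.10],
    [0.20, 0.30, 0.50],
    [0.10, 0.20, 0.70],
    [0.55, 0.35, 0.10],
    [0.15, 0.25, 0.60],
])
# Hypothetical mean temperature of the coldest month (°C) at each modern site.
mtco = np.array([-2.0, -1.5, -12.0, -15.0, -1.8, -13.0])
fossil = np.array([0.58, 0.32, 0.10])
est = modern_analogue_reconstruction(fossil, modern, mtco, k=3)
```

Here the fossil sample resembles the three temperate sites, so the estimate averages their climate values; reconstruction uncertainty is typically derived from the spread among the selected analogues.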
Summary
Autologous and single‐donor allogeneic platelet preparations are increasingly being used in many areas of regenerative medicine. However, there are few properly controlled randomized clinical trials, and the preparation, content and characteristics of platelet preparations are generally poorly defined and controlled. The Platelet Physiology Subcommittee of the Scientific and Standardization Committee (SSC) of the ISTH formed a working party of experts with the aim of producing consensus recommendations for guidance on the use of platelets in regenerative medicine. Owing to a lack of investigations that provide definitive evidence for the efficacy, definition and use of different platelet preparations in regenerative medicine, there were insufficient data to develop evidence‐based guidelines. Therefore, the RAND method was used, which obtains a formal consensus among experts, particularly when scientific evidence is absent, scarce and/or heterogeneous. Using this approach, each expert scored as 'appropriate', 'uncertain' or 'inappropriate' a series of 45 statements about the practice of regenerative medicine with platelets, which included different sections on general aspects, platelet preparations, clinical trial design, and potential utility in different clinical scenarios. After presentation and public discussion at SSC meetings, the assessments were further refined to produce final consensus recommendations, which constitute the subject of the present report.
Light transmission aggregometry (LTA) is the most common method used to assess platelet function. However, there is no universal standard for its performance. The Platelet Physiology Subcommittee of the Scientific and Standardization Committee (SSC) of the International Society on Thrombosis and Haemostasis formed a working party of experts with the aim of producing a series of consensus recommendations for standardizing LTA. Owing to a lack of investigations that directly compared different methodologies for performing LTA studies, there were insufficient data to develop evidence-based guidelines. Therefore, the RAND method was used, which obtains a formal consensus among experts about the appropriateness of health care interventions, particularly when scientific evidence is absent, scarce and/or heterogeneous. Using this approach, each expert scored as "appropriate", "uncertain" or "inappropriate" a series of statements about the practice of LTA, which included pre-analytical variables, blood collection, blood processing, methodological details, choice of agonists and the evaluation and reporting of results. After presentation and public discussion at SSC meetings, the assessments were further refined to produce final consensus recommendations. Before delivering the recommendations, a formal literature review was performed using a series of defined search terms about LTA. Of the 1830 potentially relevant studies identified, only 14 publications were considered to be actually relevant for review. Based upon this additional information, 6 consensus statements were slightly modified. The final statements were presented and discussed at the SSC Meeting in Cairo (2010) and formed the basis of a consensus document, which is the subject of the present report.
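The consensus process used in the two reports above builds on the classic RAND/UCLA appropriateness method. A simplified sketch of its scoring logic is shown below: in the classic variant each panelist rates a statement on a 1–9 scale, the panel median determines the category, and a wide spread of scores is treated as disagreement. Note that the SSC exercises described here had experts assign categories directly, so the 1–9 cut-offs below are the commonly cited RAND/UCLA ones, not the SSC's exact procedure.

```python
from statistics import median

def classify_statement(scores):
    """Simplified RAND/UCLA-style appropriateness rating.
    `scores` is a list of 1-9 ratings, one per panel expert.
    Cut-offs follow the commonly cited convention: median 7-9 is
    'appropriate', 1-3 is 'inappropriate', otherwise 'uncertain';
    scores spanning both extremes indicate disagreement."""
    m = median(scores)
    disagreement = min(scores) <= 3 and max(scores) >= 7
    if disagreement:
        return "uncertain"
    if m >= 7:
        return "appropriate"
    if m <= 3:
        return "inappropriate"
    return "uncertain"
```

For example, a tightly clustered high-scoring panel yields "appropriate", while a panel split between extremes yields "uncertain" regardless of the median.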
•Automatic organ segmentation in 3D medical scans is an important yet challenging problem for medical image analysis, especially for the pancreas.
•As a solution, we present an automated system based on a two-stage cascaded approach: pancreas localization and pancreas segmentation.
•We design a complete deep-learning approach based on efficient holistically-nested convolutional networks applied to three orthogonal views.
•Quantitative evaluation on a public CT dataset of 82 patients shows state-of-the-art performance with a Dice score of 81.27 ± 6.27% in validation.
Accurate and automatic organ segmentation from 3D radiological scans is an important yet challenging problem for medical image analysis. Specifically, as a small, soft, and flexible abdominal organ, the pancreas demonstrates very high inter-patient anatomical variability in both its shape and volume. This inhibits traditional automated segmentation methods from achieving high accuracies, especially compared to the performance obtained for other organs, such as the liver, heart or kidneys. To fill this gap, we present an automated segmentation system for 3D computed tomography (CT) volumes that is based on a two-stage cascaded approach: pancreas localization and pancreas segmentation. For the first step, we localize the pancreas from the entire 3D CT scan, providing a reliable bounding box for the more refined segmentation step. We introduce a fully deep-learning approach, based on an efficient application of holistically-nested convolutional networks (HNNs) on the three orthogonal axial, sagittal, and coronal views. The resulting HNN per-pixel probability maps are then fused using pooling to reliably produce a 3D bounding box of the pancreas that maximizes the recall. We show that our introduced localizer compares favorably to both a conventional non-deep-learning method and a recent hybrid approach based on spatial aggregation of superpixels using random forest classification. The second, segmentation, phase operates within the computed bounding box and integrates semantic mid-level cues of deeply-learned organ interior and boundary maps, obtained by two additional and separate realizations of HNNs. By integrating these two mid-level cues, our method is capable of generating boundary-preserving pixel-wise class label maps that result in the final pancreas segmentation. Quantitative evaluation is performed on a publicly available dataset of 82 patient CT scans using 4-fold cross-validation (CV). We achieve a (mean ± std. dev.) Dice similarity coefficient (DSC) of 81.27 ± 6.27% in validation, which significantly outperforms both a previous state-of-the-art method and a preliminary version of this work, which report DSCs of 71.80 ± 10.70% and 78.01 ± 8.20%, respectively, using the same dataset.
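The localization step described above, fusing per-view probability maps and extracting a recall-oriented bounding box, can be sketched as follows. This is a minimal illustration with NumPy, not the authors' implementation: the element-wise maximum stands in for the pooling-based fusion, and the threshold and padding margin are hypothetical parameters.

```python
import numpy as np

def fuse_view_probabilities(p_axial, p_sagittal, p_coronal):
    # Element-wise max pooling across the three per-view probability
    # volumes (all resampled to the same 3D shape).
    return np.maximum.reduce([p_axial, p_sagittal, p_coronal])

def candidate_bounding_box(prob_volume, threshold=0.5, margin=2):
    """Tight 3D bounding box around voxels above `threshold`, padded by
    `margin` voxels on each side to favour recall over precision."""
    mask = prob_volume >= threshold
    if not mask.any():
        return None
    coords = np.argwhere(mask)
    lo = np.maximum(coords.min(axis=0) - margin, 0)
    hi = np.minimum(coords.max(axis=0) + margin + 1, prob_volume.shape)
    return tuple(slice(l, h) for l, h in zip(lo, hi))

# Toy volume: a high-probability blob in one region of a 10^3 grid.
vol = np.zeros((10, 10, 10))
vol[2:5, 3:6, 4:7] = 0.9
fused = fuse_view_probabilities(vol, vol * 0.8, vol * 0.5)
box = candidate_bounding_box(fused, threshold=0.5)
```

The returned slice tuple can then be used to crop the CT volume before the more expensive interior/boundary segmentation stage runs on the reduced region.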