Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line lists for these molecules. The line lists available today contain, for many species, up to several billion lines. Computing the spectral line profile created by pressure and temperature broadening, the Voigt profile, for all of these lines is becoming a computational challenge. Aims. We aim to create a method to compute the Voigt profile in a way that automatically focuses the computation time on the strongest lines, while still maintaining the continuum contribution of the large number of weaker lines. Methods. Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high-accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results. The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (~3.5 × 10⁵ lines per second per core on a standard current-day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe for how to perform the computations is given.
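The sampling idea can be illustrated with a minimal sketch (not the paper's implementation; all function names and parameters here are assumptions): because the Voigt profile is the convolution of a Gaussian and a Lorentzian, a Voigt-distributed frequency offset can be drawn as the sum of a normal and a Cauchy random variate, and binning equally weighted samples preserves the integrated line opacity by construction.

```python
import math
import random

def sample_voigt_line(center, sigma, gamma, strength,
                      n_samples, grid_min, grid_max, n_bins):
    """Monte Carlo sampling of a single Voigt line profile.

    A Voigt profile is the convolution of a Gaussian (standard deviation
    sigma) and a Lorentzian (half width gamma), so a Voigt-distributed
    frequency is the sum of a normal and a Cauchy random variate.  Each
    sample carries an equal share of the total line strength, so the
    integrated opacity is preserved by construction (up to samples that
    land outside the grid).
    """
    opacity = [0.0] * n_bins
    bin_width = (grid_max - grid_min) / n_bins
    weight = strength / n_samples  # equal weight per sample
    for _ in range(n_samples):
        # Gaussian component plus Cauchy component (inverse-CDF sampling).
        nu = (center
              + random.gauss(0.0, sigma)
              + gamma * math.tan(math.pi * (random.random() - 0.5)))
        i = int((nu - grid_min) / bin_width)
        if 0 <= i < n_bins:
            opacity[i] += weight / bin_width  # accumulate opacity density
    return opacity
```

In the spirit of the abstract, `n_samples` would then be scaled with line strength and local line density, so strong or isolated lines receive many samples while the numerous weak continuum-forming lines receive only a few each.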
Aims. Shadows in transitional disks are generally interpreted as signs of a misaligned inner disk. This disk is usually beyond the reach of current-day high-contrast imaging facilities. However, the location and morphology of the shadow features allow us to reconstruct the inner disk geometry. Methods. We derive analytic equations for the locations of the shadow features as a function of the orientation of the inner and outer disk and the height of the outer disk wall. In contrast to previous claims in the literature, we show that the position angle of the line connecting the shadows cannot be directly related to the position angle of the inner disk. Results. We show how the analytic framework derived here can be applied to transitional disks with shadow features. We use estimates of the outer disk height to put constraints on the inner disk orientation. In contrast with the results from Long et al. (2017, ApJ, 838, 62), we derive that for the disk surrounding HD 100453 the analytic estimates and interferometric observations result in a consistent picture of the orientation of the inner disk. Conclusions. The elegant consistency in our analytic framework between observation and theory strongly supports both the interpretation of the shadow features as coming from a misaligned inner disk and the diagnostic value of near-infrared interferometry for inner disk geometry.
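A small geometric building block illustrates why both disk orientations enter (a sketch under assumed conventions, not the paper's full derivation, which also accounts for the outer disk wall height): for a vanishing wall height, the shadow lies along the intersection of the two disk midplanes, whose direction is the cross product of the disk normals.

```python
import math

def disk_normal(incl, pa):
    """Unit normal of a disk given inclination and position angle (radians).

    Convention (an assumption of this sketch): a face-on disk (incl = 0)
    has its normal along the line of sight (+z), and the position angle
    rotates the tilt direction in the plane of the sky.
    """
    return (math.sin(incl) * math.cos(pa),
            math.sin(incl) * math.sin(pa),
            math.cos(incl))

def intersection_direction(n1, n2):
    """Direction of the line in which two planes through the origin
    intersect: the cross product of their normals."""
    return (n1[1] * n2[2] - n1[2] * n2[1],
            n1[2] * n2[0] - n1[0] * n2[2],
            n1[0] * n2[1] - n1[1] * n2[0])
```

For a nonzero wall height the shadow line is displaced from this midplane intersection, which is why, as the abstract stresses, the position angle of the line connecting the shadows is not simply the position angle of the inner disk.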
Context. In protoplanetary disks micron-size dust grains coagulate to form larger structures with complex shapes and compositions. The coagulation process changes the absorption and scattering properties of particles in the disk in significant ways. To properly interpret observations of protoplanetary disks and to place these observations in the context of the first steps of planet formation, it is crucial to understand the optical properties of these complex structures. Aims. We derive the optical properties of dust aggregates using detailed computations of aggregate structures and compare these computationally demanding results with approximate methods that are cheaper to compute in practice. In this way we wish to understand the merits and problems of approximate methods and define the context in which they can or cannot be used to analyze observations of objects where significant grain growth is taking place. Methods. For the detailed computations we used the discrete dipole approximation (DDA), a method able to compute the interaction of light with a complexly shaped, inhomogeneous particle. We compared the results to those obtained using spherical and irregular, homogeneous and inhomogeneous particles. Results. While no approximate method properly reproduces all characteristics of large dust aggregates, the thermal properties of dust can be analyzed using irregularly shaped, porous, inhomogeneous grains. The asymmetry of the scattering phase function is a good indicator of aggregate size, while the degree of polarization is probably determined by the size of the constituent particles. Optical properties derived from aggregates significantly differ from the most frequently used standard (“astronomical silicate” in spherical grains). We outline a computationally fast and relatively accurate method that can be used for a multiwavelength analysis of aggregate dust in protoplanetary disks.
Brain size variation over primate evolution and human development is associated with shifts in the proportions of different brain regions. Individual brain size can vary almost twofold among ...typically developing humans, but the consequences of this for brain organization remain poorly understood. Using in vivo neuroimaging data from more than 3000 individuals, we find that larger human brains show greater areal expansion in distributed frontoparietal cortical networks and related subcortical regions than in limbic, sensory, and motor systems. This areal redistribution recapitulates cortical remodeling across evolution, manifests by early childhood in humans, and is linked to multiple markers of heightened metabolic cost and neuronal connectivity. Thus, human brain shape is systematically coupled to naturally occurring variations in brain size through a scaling map that integrates spatiotemporally diverse aspects of neurobiology.
Abstract
The European FP7 project DIANA has performed a coherent analysis of a large set of observational data of protoplanetary disks by means of thermo-chemical disk models. The collected data include extinction-corrected stellar UV and X-ray input spectra (as seen by the disk), photometric fluxes, low- and high-resolution spectra, interferometric data, emission line fluxes, line velocity profiles, and line maps, which probe the dust, polycyclic aromatic hydrocarbons (PAHs), and the gas in these objects. We define and apply a standardized modeling procedure to fit these data with state-of-the-art modeling codes (ProDiMo, MCFOST, MCMax), solving continuum and line radiative transfer (RT), disk chemistry, and the heating and cooling balance for both the gas and the dust. 3D diagnostic RT tools (e.g., FLiTs) are eventually used to predict all available observations from the same disk model, the DIANA-standard model. Our aim is to determine the physical parameters of the disks, such as total gas and dust masses, the dust properties, the disk shape, and the chemical structure of these disks. We allow for up to two radial disk zones to obtain our best-fitting models, which have about 20 free parameters. This approach is novel and unique in its completeness and level of consistency. It allows us to break some of the degeneracies arising from pure spectral energy distribution (SED) modeling. In this paper, we present the results from pure SED fitting for 27 objects and from the all-inclusive DIANA-standard models for 14 objects. Our analysis shows a number of Herbig Ae and T Tauri stars with very cold and massive outer disks which are situated at least partly in the shadow of a tall and gas-rich inner disk. The disk masses derived are often in excess of previously published values, since these disks are partially optically thick even at millimeter wavelengths and so cold that they emit less than in the Rayleigh–Jeans limit. We fit most infrared to millimeter emission line fluxes to within a factor of 3, simultaneously with the SED, PAH features, and radial brightness profiles extracted from images at various wavelengths. However, some line fluxes may deviate by a larger factor, and sometimes we find puzzling data which the models cannot reproduce. Some of these issues are probably caused by foreground cloud absorption or object variability. Our data collection, the fitted physical disk parameters, as well as the full model output are available to the community through an online database (http://www.univie.ac.at/diana).
•Innovative pathway assessment was carried out based on the pilot-scale testing data.
•The TEA result showed the break-even selling price of biofuel was $2.23/gallon.
•The IRRs of the process and system were between −6.1% and 18.7%.
Combining algae cultivation and wastewater treatment for biofuel production is considered a feasible way to utilize both resources. An updated comprehensive techno-economic analysis (TEA) method that integrates resource availability into the techno-economic analysis was employed to evaluate wastewater-based algal biofuel production with consideration of wastewater treatment improvement, greenhouse gas emissions, biofuel production costs, and coproduct utilization. An innovative approach consisting of microalgae cultivation on centrate wastewater, microalgae harvest through flocculation, solar drying of biomass, pyrolysis of biomass to bio-oil, and utilization of co-products was analyzed and shown to yield strongly positive results in comparison with the alternatives. The estimated break-even selling price of biofuel ($2.23/gallon) is very close to the acceptable level. The approach would have better overall benefits, and the internal rate of return would increase up to 18.7%, if three critical components, namely cultivation, harvest, and downstream conversion, could achieve breakthroughs.
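The reported internal rate of return (IRR) is the discount rate at which the net present value (NPV) of the project's cash flows vanishes. A minimal sketch, with hypothetical cash flows rather than the study's data:

```python
def npv(rate, cash_flows):
    """Net present value of cash flows, one per period (t = 0, 1, 2, ...)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return by bisection on NPV(rate) = 0.

    Assumes NPV changes sign exactly once on [lo, hi], which holds for a
    conventional project: a single initial outlay followed by positive
    returns makes NPV strictly decreasing in the rate.
    """
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if npv(mid, cash_flows) > 0.0:
            lo = mid  # NPV still positive: the break-even rate is higher
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, an outlay of 100 followed by two returns of 60 yields an IRR of roughly 13%; a negative IRR, as at the low end of the reported range, simply means the discounted returns never recover the outlay at any positive rate.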
Context. High-contrast scattered light observations have revealed the surface morphology of several dozen protoplanetary disks at optical and near-infrared wavelengths. Inclined disks offer the opportunity to measure part of the phase function of the dust grains that reside in the disk surface, which is essential for our understanding of protoplanetary dust properties and the early stages of planet formation. Aims. We aim to construct a method that takes into account how the flaring shape of the scattering surface of an optically thick protoplanetary disk projects onto the image plane of the observer. This allows us to map physical quantities (e.g., scattering radius and scattering angle) onto scattered light images and to retrieve stellar-irradiation-corrected (r²-scaled) images and dust phase functions. Methods. The scattered light mapping method projects a power-law-shaped disk surface onto the detector plane, after which the observed scattered light image is interpolated backward onto the disk surface. We apply the method to archival polarized intensity images of the protoplanetary disk around HD 100546 that were obtained with VLT/SPHERE in the R′ band and VLT/NACO in the H and Ks bands. Results. The brightest side of the r²-scaled R′ band polarized intensity image of HD 100546 changes from the far to the near side of the disk when a flaring instead of a geometrically flat disk surface is used for the r²-scaling. The decrease in polarized surface brightness in the scattering angle range of ~40°–70° is likely a result of the dust phase function and the degree of polarization, which peak in different scattering angle regimes. The derived phase functions show part of a forward-scattering peak, which indicates that large, aggregate dust grains dominate the scattering opacity in the disk surface. Conclusions. Projection effects of a protoplanetary disk surface need to be taken into account to correctly interpret scattered light images. Applying the correct scaling for the correction of stellar irradiation is crucial for the interpretation of the images and the derivation of the dust properties in the disk surface layer.
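The projection step can be sketched as follows (an illustrative reimplementation under an assumed rotation convention, not the authors' code): a point on a power-law surface h(r) = h₀ (r/r₀)^β is tilted by the inclination and projected onto the image plane, and the irradiation correction then uses the physical scattering radius on the surface rather than the projected separation.

```python
import math

def project_surface_point(r, phi, incl, h0, r0, beta):
    """Project a point on a power-law disk surface h(r) = h0 * (r/r0)**beta
    onto the image plane, for a disk inclined by `incl` (radians) about
    the x-axis.  The shape parameters (h0, r0, beta) and the rotation
    convention are assumptions of this sketch; the full method also
    interpolates the observed image backward onto the surface grid.
    """
    h = h0 * (r / r0) ** beta
    x = r * math.cos(phi)
    y = r * math.sin(phi) * math.cos(incl) - h * math.sin(incl)
    return x, y

def irradiation_corrected(image_value, r):
    """Apply the r**2 stellar-irradiation scaling to one pixel, using the
    physical scattering radius r on the surface, not the projected
    separation from the star."""
    return image_value * r * r
```

The difference between the physical and projected radius is exactly what flips the apparent bright side of HD 100546 when a flaring instead of a flat surface is assumed.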
Background
The left pulmonary artery sling (LPAS) is a rare vascular anomaly in which the left pulmonary artery arises from the right pulmonary artery, passes over the right bronchus, and runs posteriorly between the trachea and esophagus. LPAS is frequently associated with cardiac and non-cardiac defects, including tracheobronchial abnormalities.
Objective
To evaluate the utility of multislice CT (MSCT) and helical CT (HCT) in diagnosing and defining the tracheobronchial anomaly and anatomic relationships between the trachea and aberrant left pulmonary artery.
Materials and methods
MSCT or HCT was performed in 27 children to determine the tracheobronchial anatomy and identify tracheobronchial stenosis. Eighteen children underwent surgery.
Results
According to the Wells [6] classification of LPAS, which includes two main types and two subtypes, there were eight cases of type 1A, five of type 1B, six of type 2A, and eight of type 2B in this group. Twenty-four of the 27 children had substantial tracheobronchial stenosis. Four children died before surgery; 18 underwent reanastomosis of the left pulmonary artery. Five children also had tracheoplasty; three died after surgery.
Conclusion
CT, especially MSCT, is an ideal modality for simultaneously identifying aberrant left pulmonary artery and any associated tracheobronchial anomaly. The Wells classification is useful for operative planning.
Advances in image segmentation of magnetic resonance images (MRI) have demonstrated that multi-atlas approaches improve segmentation over regular atlas-based approaches. These approaches often rely on a large number of manually segmented atlases (e.g. 30–80) that take significant time and expertise to produce. We present an algorithm, MAGeT-Brain (Multiple Automatically Generated Templates), for the automatic segmentation of the hippocampus that minimises the number of atlases needed whilst still achieving similar agreement to multi-atlas approaches. Thus, our method acts as a reliable multi-atlas approach when using special or hard-to-define atlases that are laborious to construct.
MAGeT-Brain works by propagating atlas segmentations to a template library, formed from a subset of target images, via transformations estimated by nonlinear image registration. The resulting segmentations are then propagated to each target image and fused using a label fusion method.
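As a toy illustration of the final fusion step (majority voting is one common label fusion choice, and the flat voxel layout here is an assumption; MAGeT-Brain supports other fusion methods):

```python
from collections import Counter

def majority_vote_fusion(candidate_labelings):
    """Fuse several candidate segmentations of the same target image by
    voxel-wise majority vote.

    Each candidate labeling is a flat sequence of integer labels, one
    per voxel; all candidates must have the same length.
    """
    fused = []
    for votes in zip(*candidate_labelings):
        # most_common(1) returns the label with the highest vote count
        fused.append(Counter(votes).most_common(1)[0][0])
    return fused
```

Because every template-library image contributes one candidate labeling per target, even a handful of atlases yields many votes per voxel, which is what lets the method get by with so few manually segmented atlases.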
We conduct two separate Monte Carlo cross-validation experiments comparing MAGeT-Brain and basic multi-atlas whole hippocampal segmentation using differing atlas and template library sizes, and registration and label fusion methods. The first experiment is a 10-fold validation (per parameter setting) over 60 subjects taken from the Alzheimer's Disease Neuroimaging Database (ADNI), and the second is a five-fold validation over 81 subjects having had a first episode of psychosis. In both cases, automated segmentations are compared with manual segmentations following the Pruessner-protocol. Using the best settings found from these experiments, we segment 246 images of the ADNI1:Complete 1Yr 1.5T dataset and compare these with segmentations from existing automated and semi-automated methods: FSL FIRST, FreeSurfer, MAPER, and SNT. Finally, we conduct a leave-one-out cross-validation of hippocampal subfield segmentation in standard 3T T1-weighted images, using five high-resolution manually segmented atlases (Winterburn et al., 2013).
In the ADNI cross-validation, using 9 atlases MAGeT-Brain achieves a mean Dice similarity coefficient (DSC) of 0.869 with respect to manual whole-hippocampus segmentations, and also exhibits significantly lower variability in DSC scores than multi-atlas segmentation. In the younger, psychosis dataset, MAGeT-Brain achieves a mean DSC of 0.892 and produces volumes that agree with manual segmentation volumes better than those produced by the FreeSurfer and FSL FIRST methods (mean difference in volume: 80 mm³, 1600 mm³, and 800 mm³, respectively). Similarly, in the ADNI1:Complete 1Yr 1.5T dataset, MAGeT-Brain produces hippocampal segmentations well correlated (r > 0.85) with SNT semi-automated reference volumes within disease categories, and shows a conservative bias with a mean difference in volume of 250 mm³ across the entire dataset, compared with FreeSurfer and FSL FIRST, which overestimate volume differences by 2600 mm³ and 2800 mm³ on average, respectively. Finally, MAGeT-Brain segments the CA1, CA4/DG, and subiculum subfields on standard 3T T1-weighted images with DSC overlap scores of 0.56, 0.65, and 0.58, respectively, relative to manual segmentations.
We demonstrate that MAGeT-Brain produces consistent whole hippocampal segmentations using only 9 atlases, or fewer, with various hippocampal definitions, disease populations, and image acquisition types. Additionally, we show that MAGeT-Brain identifies hippocampal subfields in standard 3T T1-weighted images with overlap scores comparable to competing methods.
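The Dice similarity coefficient used throughout these comparisons is DSC = 2|A ∩ B| / (|A| + |B|), the overlap of two segmentations normalized by their combined size. A minimal sketch on binary voxel masks:

```python
def dice_coefficient(a, b):
    """Dice similarity coefficient between two binary masks, given as
    flat sequences of 0/1 voxel values of equal length:
    DSC = 2 * |A intersect B| / (|A| + |B|)."""
    inter = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    size_a = sum(a)
    size_b = sum(b)
    if size_a + size_b == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * inter / (size_a + size_b)
```

A DSC of 1 means identical masks and 0 means no overlap; scores in the 0.55–0.65 range for subfields reflect how small and thin those structures are at standard 3T resolution.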
•We propose an automated MR image hippocampus (and subfield) segmentation method.
•Our method is optimised for use with a small number (<10) of training images.
•Consistent, accurate identification of the whole hippocampus and subfields.
•Validated on healthy, Alzheimer's disease, and first-episode psychosis subjects.
•Source code and high-resolution training subfield atlases are available online.
Given a transportation network having source nodes with evacuees and destination nodes, we want to find a contraflow network configuration, i.e., an ideal direction for each edge, that minimizes evacuation time. Contraflow is considered a potential remedy to reduce congestion during evacuations in the context of homeland security and natural disasters (e.g., hurricanes). This problem is computationally challenging because of the very large search space and the expensive calculation of evacuation time on a given network. To our knowledge, this paper presents the first macroscopic approaches to contraflow network reconfiguration that incorporate road capacity constraints, multiple sources, a congestion factor, and scalability. We formally define the contraflow problem based on graph theory and provide a framework of computational workload to classify our approaches. A greedy heuristic is designed to produce high-quality solutions with significant performance. A bottleneck relief heuristic is developed to deal with large numbers of evacuees. We evaluate the proposed approaches both analytically and experimentally using real-world datasets. Experimental results show that our contraflow approaches can reduce evacuation time by 40% or more.
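The greedy idea can be sketched as follows (an illustrative toy version, not the paper's algorithm; the caller-supplied evacuation-time evaluator and the edge layout are assumptions): repeatedly reverse the single edge whose reversal most reduces the evaluated evacuation time, and stop when no single reversal helps.

```python
def greedy_contraflow(edges, evaluate_time):
    """Greedy contraflow heuristic sketch.

    `edges` is a list of (u, v, capacity) tuples; reversing an edge
    swaps its endpoints.  `evaluate_time(edges)` is a caller-supplied
    function returning the (expensive) evacuation time for a
    configuration; this loop treats it as a black box.
    """
    edges = list(edges)
    best_time = evaluate_time(edges)
    improved = True
    while improved:
        improved = False
        for i, (u, v, cap) in enumerate(edges):
            trial = list(edges)
            trial[i] = (v, u, cap)  # tentatively reverse edge i
            t = evaluate_time(trial)
            if t < best_time:
                edges, best_time = trial, t
                improved = True
                break  # accept the flip, then restart the scan
        # loop exits when a full scan finds no improving reversal
    return edges, best_time
```

Since each scan calls the evaluator once per edge, the cost of this heuristic is dominated by evacuation-time evaluation, which motivates the paper's focus on cheap macroscopic time estimates.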