Abstract The Habitable Worlds Observatory Preliminary Input Catalog (HPIC) is a list of ∼13,000 nearby bright stars that are potential targets for the Habitable Worlds Observatory (HWO) in its search for Earth-sized planets around Sun-like stars. We construct this target list using the TESS and Gaia DR3 catalogs and develop an automated pipeline to compile stellar measurements and derived astrophysical properties for all stars. We benchmark the stellar properties in the HPIC against those of the manually curated ExEP HWO Precursor Science Stars list and find that, for the 164 best targets for exo-Earth direct imaging, our stellar properties are consistent. We demonstrate the utility of the HPIC by using it as an input for yield calculations to predict the science output of various mission designs, including those with larger telescope diameters and those focused on planet types other than Earth analogs, such as Jupiter-mass planets. The breadth and completeness of the HPIC are essential for accurate HWO mission trade studies, and the catalog will also be useful for other exoplanet investigations and for general astrophysical studies of the bright nearby stellar population.
The yield of Earth-like planets will likely be a primary science metric for future space-based missions and will drive telescope aperture size. Maximizing the exoEarth candidate yield is therefore critical to minimizing the required aperture. Here we describe a method for exoEarth candidate yield maximization that simultaneously optimizes, for the first time, the targets chosen for observation, the number of visits to each target, the delay time between visits, and the exposure time of every observation. This code calculates both the detection time and multi-wavelength spectral characterization time required for planets. We also refine the astrophysical assumptions used as inputs to these calculations, relying on published estimates of planetary occurrence rates as well as theoretical and observational constraints on terrestrial planet sizes and classical habitable zones. Given these astrophysical assumptions, optimistic telescope and instrument assumptions, and our new completeness code that produces the highest yields to date, we suggest lower limits on the aperture size required to detect and characterize a statistically motivated sample of exoEarths.
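A toy version of the single-visit completeness that underlies such yield calculations can be sketched as a Monte Carlo over randomly oriented circular orbits. The function name, the Lambert-phase contrast model, and all parameter values below are illustrative assumptions, not the yield code described above:

```python
import math
import random

def exo_earth_completeness(d_pc, iwa_arcsec, contrast_floor,
                           a_au=1.0, rp_au=4.26e-5, albedo=0.2,
                           n=100_000, seed=1):
    """Fraction of randomly oriented exo-Earths that are detectable.

    A planet counts as detected when its projected separation exceeds
    the inner working angle (IWA) and its planet/star flux ratio
    exceeds the contrast floor. (Illustrative sketch only.)
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # For isotropically oriented circular orbits, the cosine of the
        # star-planet-observer phase angle beta is uniform in [-1, 1].
        cos_beta = rng.uniform(-1.0, 1.0)
        beta = math.acos(cos_beta)
        s_au = a_au * math.sin(beta)  # projected separation in AU
        # Lambert phase function and resulting flux ratio.
        phase = (math.sin(beta) + (math.pi - beta) * cos_beta) / math.pi
        contrast = albedo * (rp_au / a_au) ** 2 * phase
        # 1 AU at 1 pc subtends 1 arcsec, so s_au / d_pc is in arcsec.
        if s_au / d_pc > iwa_arcsec and contrast > contrast_floor:
            hits += 1
    return hits / n
```

Because the same random samples are reused for a fixed seed, widening the inner working angle can only remove detections, which provides a simple sanity check on the implementation.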
Radio and X-ray emission from brown dwarfs (BDs) suggests that an ionized gas and a magnetic field of sufficient flux density must be present. We perform a reference study for late M-dwarfs (MDs), BDs, and giant gas planets to identify which ultracool objects are most susceptible to plasma and magnetic processes. Only thermal ionization is considered. We utilize the Drift-Phoenix model grid, where the local atmospheric structure is determined by the global parameters T_eff, log(g), and [M/H]. Our results show that it is not unreasonable to expect Hα or radio emission to originate from BD atmospheres, as in particular the rarefied upper parts of the atmospheres can be magnetically coupled despite having low degrees of thermal gas ionization. Such ultracool atmospheres could therefore drive auroral emission without the need for a companion's wind or an outgassing moon. The minimum threshold for the magnetic flux density required for electrons and ions to be magnetized is well below typical values of the global magnetic field of a BD or a giant gas planet. Na+, K+, and Ca+ are the dominant electron donors in low-density atmospheres (low log(g), solar metallicity) independent of T_eff. Mg+ and Fe+ dominate the thermal ionization in the inner parts of MD atmospheres. Molecules remain unimportant for thermal ionization. Chemical processes (e.g. cloud formation) affecting the most abundant electron donors, Mg and Fe, will have a direct impact on the state of ionization in ultracool atmospheres.
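The magnetization threshold referred to above comes from a simple criterion: a charge is magnetized when its cyclotron frequency ω_c = qB/m exceeds its collision frequency with the neutral gas, giving B_min = m ν_coll / q. A minimal sketch, in which the function name and the example collision frequency are assumptions for illustration:

```python
def min_field_for_magnetization(nu_coll_hz, mass_kg,
                                charge_c=1.602176634e-19):
    """Minimum magnetic flux density (tesla) for a charge to be magnetized.

    Magnetization condition: omega_c = q * B / m  >  nu_coll,
    hence B_min = m * nu_coll / q.
    """
    return mass_kg * nu_coll_hz / charge_c

# Example: electrons colliding at 1 GHz need only a few millitesla,
# far less than kilogauss-scale (0.1 T and above) BD magnetic fields.
M_ELECTRON = 9.1093837015e-31  # kg
b_min = min_field_for_magnetization(1.0e9, M_ELECTRON)
```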
Since its first use in 1990 to enhance production of α-amylase in E. coli, engineering of heterologous hosts to express the hemoglobin from the bacterium Vitreoscilla (VHb) has become a widely used strategy to enhance production of a variety of bioproducts, stimulate bioremediation, and increase growth and survival of engineered organisms. The hosts have included a variety of bacteria, yeast, fungi, higher plants, and even animals. The beneficial effects of VHb expression are presumably the result of one or more of its activities. The available evidence indicates that these include oxygen binding and delivery to the respiratory chain and oxygenases, protection against reactive oxygen species, and control of gene expression. In the past 4 to 5 years, the use of this “VHb technology” has continued in a variety of biotechnological applications in a wide range of organisms. These include enhancement of production of an ever wider array of bioproducts, new applications in bioremediation, a possible role in enhancing aerobic wastewater treatment, and the potential to enhance growth and survival of both plants and animals of economic importance.
Accurate cross-participant alignment within the medial temporal lobe (MTL) region is critical for fMRI studies of memory. However, traditional alignment approaches have been exceptionally poor at registering structures in this area due to significant inter-individual anatomic variability. In this study, we evaluated the performance of twelve registration approaches. Specifically, we extended several traditional approaches such as SPM's normalization and AFNI's 3dWarpDrive to improve the quality of alignment in the MTL region by using weighting masks or applying the transformations directly to ROI segmentations. In addition, we evaluated the performance of three fully deformable methods, DARTEL, Diffeomorphic Demons, and LDDMM, which are effectively unconstrained by the number of degrees of freedom. For each, we first assessed the method's ability to achieve optimal overlap between segmentations of subregions of the MTL across participants. Then we evaluated the smoothness of group average structural images aligned using each method to assess the blur that results when voxels of different tissue types are averaged together. In general, we found that when anatomical segmentation is possible, substantial improvement in registration accuracy can be gained in the MTL even with a small number of deformations. When segmentation is not possible, the fully deformable models provide some improvement over more traditional approaches and in a few cases even approach the performance of the ROI-based approaches. The best performance is achieved when both methods are combined. We note that these conclusions are not limited to the MTL and are easily extendable to other areas of the brain.
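Cross-participant overlap between segmentations of this kind is typically quantified with a Dice coefficient; a minimal sketch, not the code of any package named above:

```python
def dice(a, b):
    """Dice overlap of two binary label volumes given as flat sequences.

    Returns 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap,
    0.0 means no shared voxels.
    """
    inter = sum(1 for x, y in zip(a, b) if x and y)
    denom = sum(1 for x in a if x) + sum(1 for y in b if y)
    # Two empty segmentations overlap trivially.
    return 2.0 * inter / denom if denom else 1.0
```

Averaging this score over MTL subregion labels after each candidate registration gives a single per-method accuracy number to compare.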
In evergreen tropical forests, the extent, magnitude, and controls on photosynthetic seasonality are poorly resolved and inadequately represented in Earth system models. Combining camera observations with ecosystem carbon dioxide fluxes at forests across rainfall gradients in Amazônia, we show that aggregate canopy phenology, not seasonality of climate drivers, is the primary cause of photosynthetic seasonality in these forests. Specifically, synchronization of new leaf growth with dry season litterfall shifts canopy composition toward younger, more light-use efficient leaves, explaining large seasonal increases (~27%) in ecosystem photosynthesis. Coordinated leaf development and demography thus reconcile seemingly disparate observations at different scales and indicate that accounting for leaf-level phenology is critical for accurately simulating ecosystem-scale responses to climate change.
Landslide‐driven erosion is controlled by the scale and frequency of slope failures and by the consequent fluxes of debris off the hillslopes. In this paper, we tackle the magnitude‐frequency part of the process and develop a theory of initial slope failure and debris mobilization that reproduces the heavy‐tailed distributions (probability density functions, or PDFs) observed for landslide source areas and volumes. Landslide rupture propagation is treated as a quasi‐static, noninertial process of simplified elastoplastic deformation with strain weakening; debris runout is not considered. The model tracks the stochastically evolving imbalance of frictional, cohesive, and body forces across a failing slope and uses safety factor concepts to convert the evolving imbalance into a series of incremental rupture growth or arrest probabilities. A single rupture is simulated with a sequence of weighted “coin tosses” with weights set by the growth probabilities. Slope failure treated in this stochastic way is a survival process that generates asymptotically power‐law‐tail PDFs of area and volume for rock and debris slides; predicted scaling exponents are consistent with analyses of landslide inventories. The primary control on the shape of the model PDFs is the relative importance of cohesion over friction in setting slope stability; the scaling of smaller, shallower failures, and the size of the most common landslide volumes, are the result of the low cohesion of soil and regolith, whereas the negative power‐law‐tail scaling for larger failures is tied to the greater cohesion of bedrock.
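The flavor of such a survival process can be conveyed with a toy near-critical branching model: each active cell on the rupture front either arrests or triggers two new failing cells via a weighted coin toss, and at the critical weight the total failed area acquires a power-law tail. This sketch stands in for, and greatly simplifies, the elastoplastic force-balance weighting described above; all names and parameters are illustrative:

```python
import random

def rupture_size(rng, p_grow=0.5, cap=100_000):
    """Total failed area of one simulated rupture, in cell units.

    Critical-branching sketch: each active front cell arrests with
    probability 1 - p_grow or spawns two new active cells with
    probability p_grow. At p_grow = 0.5 the mean offspring is one and
    rupture sizes follow an asymptotic power-law tail (~ s**-1.5).
    """
    active, total = 1, 0
    while active and total < cap:
        total += 1  # one more cell fails
        offspring = 2 if rng.random() < p_grow else 0
        active += offspring - 1
    return total

# An inventory of ruptures: mostly small events plus a heavy tail.
sizes = [rupture_size(random.Random(i)) for i in range(2000)]
```

The heavy tail shows up directly in such an ensemble: the median event stays small while the largest events are orders of magnitude bigger.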
The evolution of many mountain landscapes is controlled by the incision of bedrock river channels. While the rate of incision is set by channel shape through its mediation of flow, the channel shape is itself set by the history of bedrock erosion. This feedback between channel geometry and incision determines the speed of landscape response to tectonic or climatic forcing. Here, a model for the dynamics of bedrock channel shape is derived from geometric arguments, a normal flow approximation for channel flow, and a threshold bed shear stress assumption for bedrock abrasion. The model dynamics describe the competing effects of channel widening, tilting, bending, and variable flow depth. Transient solutions suggest that channels may take ∼1–10 ky to adapt to changes in discharge, implying that channel disequilibrium is commonplace. If so, landscape evolution models will need to include bedrock channel dynamics if they are to probe the effects of climate change.
Coarse-grained (CG) simulation methods are now widely used to model the structure and dynamics of large biomolecular systems. One important issue for using such methods (especially with regard to using them to model, for example, intracellular environments) is to demonstrate that they can reproduce experimental data on the thermodynamics of protein–protein interactions in aqueous solutions. To examine this issue, we describe here simulations performed using the popular coarse-grained MARTINI force field, aimed at computing the thermodynamics of lysozyme and chymotrypsinogen self-interactions in aqueous solution. Using molecular dynamics simulations to compute potentials of mean force between a pair of protein molecules, we show that the original parametrization of the MARTINI force field is likely to significantly overestimate the strength of protein–protein interactions, to the extent that the computed osmotic second virial coefficients are orders of magnitude more negative than experimental estimates. We then show that a simple down-scaling of the van der Waals parameters that describe the interactions between protein pseudoatoms can bring the simulated thermodynamics into much closer agreement with experiment. Overall, the work shows that it is feasible to test explicit-solvent CG force fields directly against thermodynamic data for proteins in aqueous solutions and highlights the potential usefulness of osmotic second virial coefficient measurements for fully parametrizing such force fields.
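The osmotic second virial coefficient used in this comparison follows from a tabulated potential of mean force W(r) by a one-dimensional integral, B22 = -2π ∫ (exp(-W(r)/kT) - 1) r² dr. A minimal numerical sketch, where the function name and the hard-sphere check are illustrative rather than the study's analysis code:

```python
import math

def b22_from_pmf(r, w, kT=1.0):
    """Osmotic second virial coefficient from a tabulated PMF.

    B22 = -2*pi * integral of (exp(-W(r)/kT) - 1) * r**2 dr,
    evaluated with the trapezoidal rule on the sampled grid.
    """
    total = 0.0
    for i in range(len(r) - 1):
        f0 = (math.exp(-w[i] / kT) - 1.0) * r[i] ** 2
        f1 = (math.exp(-w[i + 1] / kT) - 1.0) * r[i + 1] ** 2
        total += 0.5 * (f0 + f1) * (r[i + 1] - r[i])
    return -2.0 * math.pi * total

# Sanity check: a hard sphere of diameter sigma has B22 = 2*pi*sigma**3/3.
sigma = 1.0
r = [i / 1000 for i in range(2001)]             # grid from 0 to 2*sigma
w = [1.0e6 if ri < sigma else 0.0 for ri in r]  # "infinite" repulsive core
b22 = b22_from_pmf(r, w)
```

A strongly negative B22 signals net attraction, which is why overestimated protein–protein interactions show up as virial coefficients orders of magnitude more negative than experiment.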