In many animal species, individuals engage in fights with conspecifics over access to limited resources (e.g. mates, food, or shelter). Most theory about these intraspecific fights assumes that damage has an important role in determining the contest winner. Thus, defensive structures that reduce the amount of damage an individual accrues during intraspecific competition should provide a fighting advantage.
Examples of such damage‐reducing structures include the dermal shields of goats, the dorsal osteoderms of crocodiles, and the armoured telsons of mantis shrimps. Although numerous studies have identified these defensive structures, no study has investigated whether they influence the outcomes of intraspecific fights.
Here we investigated whether inhibiting damage by enhancing an individual's armour influenced fighting behaviour and success in the giant mesquite bug, Thasus neocalifornicus (Insecta: Hemiptera: Coreidae).
We found that experimentally manipulated individuals (i.e. those provided with additional armour) were 1.6 times more likely to win a fight when compared to the control. These results demonstrate that damage, and damage‐reducing structures, can influence fighting success.
The implications of these results are twofold. First, our results experimentally support a fundamental assumption of most theoretical fighting models: that damage is a fighting cost that can influence contest outcomes. Second, these results highlight the importance of an individual's defensive capacity, which should not be overlooked in studies of animal contests.
A free Plain Language Summary can be found within the Supporting Information of this article.
•Hydrogen accounts for 5–10% of total energy in 2050 under 2 °C scenarios.•Hydrogen share increases for a 1.5 °C goal and reaches 15% if CCS is limited.•Transport is the largest consumer of hydrogen, followed by industry and power.•Synthetic hydrocarbons based on hydrogen and direct air capture are an option.•Holistic energy policy design is needed rather than heavy dependence on hydrogen.
Hydrogen-based energy carriers, including hydrogen, ammonia and synthetic hydrocarbons, are expected to help reduce residual carbon dioxide emissions in the context of the Paris Agreement goals, although their potential has not yet been fully clarified in light of their competitiveness and complementarity with other mitigation options such as electricity, biofuels and carbon capture and storage (CCS). This study aimed to explore the role of hydrogen in the global energy system under various mitigation scenarios and technology portfolios using a detailed energy system model that considers various energy technologies including the conversion and use of hydrogen-based energy carriers. The results indicate that the share of hydrogen-based energy carriers generally remains less than 5% of global final energy demand by 2050 in the 2 °C scenarios. Nevertheless, such carriers contribute to removal of residual emissions from the industry and transport sectors under specific conditions. Their share increases to 10–15% under stringent mitigation scenarios corresponding to 1.5 °C warming and scenarios without CCS. The transport sector is the largest consumer, accounting for half or more of hydrogen production, followed by the industry and power sectors. In addition to direct usage of hydrogen and ammonia, synthetic hydrocarbons converted from hydrogen and carbon captured from biomass or direct air capture are attractive transport fuels, growing to half of all hydrogen-based energy carriers. Upscaling of electrification and biofuels is another common cost-effective strategy, revealing the importance of holistic policy design rather than heavy reliance on hydrogen.
Background: This study aimed to establish a clinically useful nomogram to evaluate the probability of hypertension onset in the Chinese population. Methods and Results: A prospective cohort study was conducted in 2012–2013 and followed up in 2015 to identify new-onset hypertension in 4,123 participants. The dataset was divided into development (n=2,748) and verification (n=1,375) cohorts. After screening risk factors by lasso regression, a multivariate Cox regression risk model and nomogram were established. Among the 4,123 participants, 818 (19.8%) developed hypertension. The model identified 10 risk factors: age, waist-to-hip ratio, systolic blood pressure, diastolic blood pressure, high pulse rate, history of diabetes, family history of hypertension and stroke, intake frequency of bean products, and intensity of physical labor. The C-indices of the model in the development and validation cohorts were 0.744 and 0.768, respectively. After the inclusion of serum calcium and magnesium concentrations, the C-indices in the development and validation cohorts were 0.764 and 0.791, respectively, with areas under the curve for the updated model of 0.907 and 0.917, respectively. The calibration curve showed that the nomogram accurately predicted the probability of hypertension. The updated nomogram was clinically beneficial across thresholds of 10–60%. Conclusions: The newly developed nomogram has good predictive ability and may effectively assess hypertension risk in high-risk rural areas in China.
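The C-indices reported above measure how well predicted risks rank participants by time to hypertension onset. As an illustration only (not the study's code, and using hypothetical toy data), a minimal Harrell's C-index can be computed like this:

```python
import numpy as np

def concordance_index(time, event, risk):
    """Harrell's C-index: among comparable pairs, the fraction in which the
    subject with the earlier observed event has the higher predicted risk.
    Ties in risk count as half-concordant."""
    concordant = 0.0
    comparable = 0
    n = len(time)
    for i in range(n):
        if not event[i]:
            continue  # a pair is comparable only if subject i has an observed event
        for j in range(n):
            if time[j] > time[i]:  # subject j was event-free longer than i
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1.0
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / comparable

# hypothetical cohort: event = 1 means hypertension developed at that follow-up time
time  = np.array([2.0, 4.0, 6.0, 8.0])
event = np.array([1, 1, 0, 0])
risk  = np.array([0.9, 0.7, 0.3, 0.1])
print(concordance_index(time, event, risk))  # → 1.0 (risks rank events perfectly)
```

A C-index of 0.5 corresponds to random ranking, so the reported values of 0.744–0.791 indicate useful discrimination.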
Future scenarios and assessment studies used to prepare for long-term energy transitions and develop robust strategies to address climate change are highly dependent on the assessment of technology characteristics and availability.
The electric power sector in the United States recently experienced a significant cost escalation: for example, construction costs for large plants such as nuclear and coal-fired power plants doubled between 2003 and 2007. We assess the main drivers of this escalation. While many factors have affected costs, some of the most significant include the cost of materials (particularly metals and, to a lesser extent, cement); possible increases in labor requirements; aggressive worldwide competition for power plant design and construction resources, driven by high demand in Asia; market and regulatory changes; and general uncertainty about future regulations and climate policies.
We recalibrate power sector technology costs in the Global Change Assessment Model (GCAM) based on an extensive literature review of recent (post-2010) studies and in the process develop a coherent and updated set of current cost and performance assumptions for all major electricity-generating technologies.
While current cost and performance assumptions of electricity-generating technologies are key drivers of short-term technology deployment and technology mix in the electricity sector, medium- and long-term deployment pathways are significantly affected by assumed efficiency improvement rates and cost reductions. We develop and demonstrate a method to project efficiency and construction cost of power plants and report a sensitivity analysis to explore the importance of such assumptions in future scenarios generated by GCAM.
Aggressiveness trait-based selection is crucial for alleviating cannibalism in economic crab species and enhancing survival rates in aquaculture. However, efficient and simple methods for assessing aggressiveness are lacking. In this study, we measured the aggressiveness of the swimming crab Portunus trituberculatus through repeated mirror tests and fighting experiments. Factor analysis and the K-means algorithm were used to assess aggressiveness quantitatively and qualitatively. A combination of multiple linear regression and support vector machine (SVM) analyses was employed to construct an aggressiveness assessment model for swimming crabs and to explore the relationship between aggressiveness and fighting ability. The results showed significant correlations among repeated aggressive behaviors (attacking, chela extending, defending, crossing, reverse walking, and freezing). Aggression score was significantly correlated with fighting behaviors, and fighting ability differed significantly among aggressiveness levels. This suggests that aggressive behaviors are consistent within individuals and that aggressiveness, as a personality trait, affects the fighting ability of swimming crabs. The aggression score (Y) and the K-means clustering results can serve as assessment indicators of aggressiveness. The predictive variables for the quantitative assessment model were relative movement distance (X1) and freezing duration (X2). The optimized quantitative model had an adjusted R-square of 0.72, the smallest sigma, AIC, MSE, and RMSE values, and the best-fitting regression equation, Y = 0.023X1 − 0.001X2 − 0.002. The predictor variables for the qualitative assessment model were relative movement distance, freezing frequency, and freezing duration.
SVM was used to construct the qualitative model; its prediction accuracy was 92%, sensitivity 84%, and specificity 100%, indicating good classification and prediction performance. The machine learning-based aggressiveness assessment model constructed in this study provides a behavioral method for the selection and high-throughput measurement of economic crab species with desirable aggressiveness traits, giving it substantial industrial application value.
•An aggressiveness assessment model for crabs was first constructed by machine learning.•Aggression score was verified to assess crab aggressiveness quantitatively.•SVM was used for aggressiveness classification, with a prediction accuracy of 92%.•Aggressiveness positively correlated with fighting ability in swimming crabs.
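The reported regression equation and classifier metrics can be made concrete with a short sketch. The coefficients below come from the abstract (Y = 0.023X1 − 0.001X2 − 0.002); the crab measurements and confusion-matrix counts are hypothetical examples, not the study's data:

```python
def aggression_score(rel_move_dist, freeze_dur):
    """Quantitative aggression score from the reported regression:
    Y = 0.023*X1 - 0.001*X2 - 0.002, where X1 is relative movement
    distance and X2 is freezing duration."""
    return 0.023 * rel_move_dist - 0.001 * freeze_dur - 0.002

# hypothetical crabs: (relative movement distance, freezing duration)
active_crab = aggression_score(50.0, 10.0)   # moves a lot, rarely freezes
passive_crab = aggression_score(10.0, 120.0) # moves little, freezes often
print(active_crab, passive_crab)  # the more active crab scores higher

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity (true-positive rate) and specificity (true-negative rate),
    the metrics reported for the SVM qualitative classifier."""
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical counts consistent with 84% sensitivity and 100% specificity
print(sensitivity_specificity(21, 4, 25, 0))
```

Higher scores correspond to more aggressive individuals, so a simple threshold on Y (or the K-means cluster assignment) can be used for selection.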
Prospective Life Cycle Assessment (pLCA) is useful to evaluate the environmental performance of current and emerging technologies in the future. Yet, as energy systems and industries are rapidly shifting towards cleaner means of production, pLCA requires an inventory database that encapsulates the expected changes in technologies and the environment at a given point in time, following specific socio-techno-economic pathways. To this end, this study introduces premise, a tool to streamline the generation of prospective inventory databases for pLCA by integrating scenarios generated by Integrated Assessment Models (IAM). More precisely, premise applies a number of transformations on energy-intensive activities found in the inventory database ecoinvent according to projections provided by the IAM. Unsurprisingly, the study shows that, within a given socio-economic narrative, the climate change mitigation target chosen affects the performance of nearly all activities in the database. This is illustrated by focusing on the effects observed on a few activities, such as systems for direct air capture of CO2, lithium-ion batteries, electricity and clinker production as well as freight transport by road, in relation to the applied sector-based transformation and the chosen climate change mitigation target. This work also discusses the limitations and challenges faced when coupling IAM and LCA databases and what improvements are to be brought in to further facilitate the development of pLCA.
•Prospective LCA can benefit from projections of models such as IAM.•Premise streamlines the production of LCA databases based on prospective scenarios.•Emissions and energy efficiencies of major industries are aligned with IAM scenarios.•Stricter greenhouse gas targets result in larger transformations in the LCA database.•However, such targets result in increased LCA impacts other than global warming.
The role of digital transformation (DT) in economic development is a vital and recurring research topic. It is particularly relevant given the high percentage of digital transformation initiatives that fail to deliver the expected results, especially in small and medium enterprises (SMEs), which face greater challenges due to scarce resources and skills. This paper analyzes what is needed to make this transformation successful, both from an implementation perspective and from the standpoint of achieving the company's expected results. Through an extensive systematic literature review (SLR), this work reviews a wide variety of models that assess the readiness and maturity of the digital transformation of enterprises, with a focus on SMEs. Its primary objectives are (1) to review the existing studies and models that assess an organization's maturity and readiness in the context of digital transformation, focusing on SMEs; (2) to identify whether there are gaps given the importance of SMEs; and (3) to propose a standardized set of dimensions that should always be considered in a digital transformation assessment. The outcome of this research identifies apparent gaps in the assessment of digital transformation in SMEs and proposes a scalable, standardized set of categories and subcategories that can be used across any future assessment model. These contributions are all the more relevant given how little in-depth research exists at the intersection of SMEs and digital transformation. Doi: 10.28991/ESJ-2023-07-06-025
State-space modeling is an emerging approach to age-structured fisheries stock assessment that can accommodate multiple sources of variability in processes such as recruitment, abundance, and selectivity. By treating yearly deviations as random effects, integrating them out of the likelihood, and maximizing the resulting marginal likelihood, these models can estimate multiple process variances. Several fisheries software packages have been developed that use a state-space framework with marginal likelihood, which has increased their popularity and usage across the U.S. Atlantic coast, Canada, and Europe. However, robust testing is still needed to gauge the applicability of these models and to understand how they perform under a range of realistic levels of process and observation error. Using an assessment model fit to Gulf of Maine haddock as a baseline, we used a simulation-estimation procedure to test whether state-space stock assessment models could produce approximately unbiased and precise estimates over a range of process variances extending from zero to well above the levels estimated in the Gulf of Maine haddock assessment, or when observations were noisier (i.e., more variable around their true value) than had been assumed in the assessment. We fit alternative estimation models that differed in which process errors were included (recruitment, expected survival, and fishery selectivity). State-space models that specify random effects in all three processes produced approximately unbiased and precise estimates of biomass and exploitation for most operating models and are therefore recommended, except when variability in expected survival is absent (or very low), in which case the model is unlikely to converge.
A conventional statistical catch-at-age model, with recruitment estimated as a fixed effect for each year, deterministic expected survival, and constant selectivity, produced estimates comparable to those of the best state-space model, but it does not provide internal estimates of process variances and did not perform well when recruitment was highly variable. This work will facilitate the use of state-space stock assessment models and the choice of the parameterization that will produce the most accurate output to inform future predictions and management.
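The simulation-estimation procedure described above can be illustrated in miniature: simulate a process with known variability, estimate a quantity of interest across many replicates, and check the relative bias of the estimator. This is a generic sketch (not the haddock assessment model), with hypothetical values for mean log-recruitment and the process-error standard deviation:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_recruitment(n_years, mean_log_r, sigma_r, rng):
    """Yearly recruitment with lognormal process error, as in the
    operating models' recruitment deviations."""
    return np.exp(mean_log_r + rng.normal(0.0, sigma_r, n_years))

def relative_bias(estimates, truth):
    """Median relative error of an estimator across simulation replicates."""
    return np.median((estimates - truth) / truth)

# self-test of the procedure: the sample-mean estimator of mean recruitment
# should be approximately unbiased across replicates
sigma_r = 0.6
truth = np.exp(1.0 + 0.5 * sigma_r**2)  # analytical mean of the lognormal
est = np.array([simulate_recruitment(200, 1.0, sigma_r, rng).mean()
                for _ in range(500)])
print(relative_bias(est, truth))  # close to zero
```

In the actual study the estimated quantities are biomass and exploitation rate from a fitted state-space model, and the operating models vary which process variances are present; the bias check itself works the same way.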
•CAV data enabled adaptive signal control strategies that enhance the safety environment.•Novel mechanisms and data models evaluate the safety potential of CAV-generated data.•SSAM was integrated into the developed data fusion models to support safety evaluation.•The proof-of-concept study suggests positive operational and safety benefits.
Connected and autonomous vehicle (CAV) trajectories act as a "floating sensor" data source that provides high-resolution mobility data at intersections, but using these data to maximize safety while maintaining efficient operations is a complex traffic-management task. The literature indicates that methods for evaluating the safety benefits of CAV-generated data are still immature, primarily because the underlying mechanisms and data models needed to make the data intelligent enough to enhance the safety environment through adaptive traffic signal control are lacking. On top of the developed intelligent CAV-generated mobility data fusion model framework supporting adaptive traffic signal control, the parameters and models of the Surrogate Safety Assessment Model (SSAM) are integrated to indicate the risk of near-crashes and thereby evaluate the safety environment. A proof-of-concept study was conducted in Uptown Cincinnati, Ohio to test the developed data fusion models in terms of safety enhancement and operational benefits. In the tests, the adaptive signal plan supported by CAV-generated data was compared with basic signal plans (i.e., pretimed and actuated signal plans) supported by traditional detection systems. The results indicate that, compared with the basic signal plans, the adaptive signal plan can decrease total collision risk (measured as probability) by up to 91%, crossing collision risk by 71%, rear-end collision risk by 90%, and lane-changing collision risk by 100%. Meanwhile, it increases throughput by up to 6.8% and decreases average delay by up to 91.49%, queue length by 96.23%, and number of stops by 75.00%.
The operational efficiency benefits include reduced average delay and fewer stops; however, there was no improvement in collision severity, which is reflected in the high maximum speed and relative speed of the two vehicles involved in a potential collision.
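SSAM-style conflict screening rests on surrogate measures such as time-to-collision (TTC): a conflict is flagged when TTC for an interacting vehicle pair falls below a threshold (commonly around 1.5 s). As a minimal sketch of that core idea (not the paper's data fusion models, and with hypothetical gap and speed values):

```python
def time_to_collision(gap_m, v_follower, v_leader):
    """Time-to-collision in seconds for a car-following pair: the time until
    the follower reaches the leader if both hold their current speeds.
    Infinite when there is no closing speed (no conflict possible)."""
    closing_speed = v_follower - v_leader  # m/s
    if closing_speed <= 0:
        return float("inf")
    return gap_m / closing_speed

# hypothetical pairs (gap in metres, speeds in m/s)
print(time_to_collision(12.0, 15.0, 7.0))   # 1.5 s: at a typical threshold
print(time_to_collision(30.0, 10.0, 12.0))  # inf: follower is slower, no conflict
```

Relative speed at the conflict moment is what the paper uses as the severity proxy, which is why lower conflict counts can coexist with unchanged severity.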