Output power and beam quality are the two main bottlenecks for semiconductor lasers—the favourite light sources in countless applications because of their compactness, high efficiency and low cost. Both limitations stem from the fact that it becomes increasingly harder to stabilize a single laser mode over a larger chip area without multi-mode operation. Here we address this fundamental difficulty with the Dirac-vortex topological cavity [1], which offers optimal single-mode selection in two dimensions. Our topological-cavity surface-emitting laser (TCSEL) exhibits 10 W peak power, a sub-1° divergence angle and 60 dB side-mode suppression, among the best performance ever reported at 1,550 nm—the most important telecommunication and eye-safe wavelength, where high-performance surface emitters have always been difficult to make [2]. We also demonstrate the multi-wavelength capability of two-dimensional TCSEL arrays, which is not generally available in commercial lasers [2,3]. The TCSEL, as a new-generation high-brightness surface emitter, can be directly extended to any other wavelength range and is promising for an extremely wide variety of uses.

Researchers demonstrate a topological-cavity surface-emitting laser with 10 W peak power and sub-degree beam divergence at the 1,550 nm wavelength. The system is also capable of multiple-wavelength arrays.
•This paper introduces the concept of multi-valued measures.
•A directional output distance method is developed to deal with multi-valued measures.
•Two approaches, optimistic and pessimistic, are proposed.
•Each approach contains two models, individual and summative.
•A case study of 183 NUTS 2 regions in EU-28 countries validates the new models.
Data envelopment analysis (DEA) evaluates the relative efficiency of a set of comparable decision making units (DMUs) with multiple performance measures (inputs and outputs). Classical DEA models rely on the assumption that each DMU can improve its performance by increasing its current output levels and decreasing its current input levels. However, undesirable outputs (such as wastes and pollutants), which have to be minimized, are often produced together with desirable outputs. On the other hand, in some real-world situations we may encounter specific performance measures that take more than one value because they are measured by various standards. In this study, we refer to such measures as multi-valued measures, of which only one value should be selected. For instance, the unemployment rate is a multi-valued measure in economic applications since there are several definitions or standards for measuring it. As a result, selecting a suitable value for a multi-valued measure is a challenging issue and is crucial for the successful application of DEA. The aim of this study is to accommodate multi-valued measures in the presence of undesirable outputs. In doing so, we formulate two selecting directional distance models, individual and summative, and develop a pair of multiplier- and envelopment-based selecting approaches. Finally, we illustrate the applicability of the proposed method using real data on 183 NUTS 2 regions in 23 selected EU-28 countries.
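The directional distance models described above build on a standard DEA construction: for each DMU, a linear program finds how far the unit can move along a chosen direction (contracting inputs, expanding outputs) while staying inside the production set spanned by all observed DMUs. A minimal sketch of that baseline LP, under constant returns to scale and without the paper's multi-valued or undesirable-output extensions, might look like this:

```python
import numpy as np
from scipy.optimize import linprog

def directional_distance(X, Y, o, gx, gy):
    """Directional distance efficiency score for DMU `o`.

    X: (n_dmu, m) input matrix, Y: (n_dmu, s) desirable-output matrix.
    Maximise beta such that some convex-cone combination of observed
    DMUs uses at most x_o - beta*gx inputs and produces at least
    y_o + beta*gy outputs. beta = 0 means `o` lies on the frontier.
    """
    n, m = X.shape
    # Decision variables: [beta, lambda_1..lambda_n]; linprog minimises,
    # so we minimise -beta.
    c = np.concatenate(([-1.0], np.zeros(n)))
    # Input constraints:  sum_j lambda_j x_j + beta*gx <= x_o
    A_in = np.hstack((gx.reshape(-1, 1), X.T))
    b_in = X[o]
    # Output constraints: beta*gy - sum_j lambda_j y_j <= -y_o
    A_out = np.hstack((gy.reshape(-1, 1), -Y.T))
    b_out = -Y[o]
    res = linprog(c,
                  A_ub=np.vstack((A_in, A_out)),
                  b_ub=np.concatenate((b_in, b_out)),
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]
```

With two single-input, single-output DMUs `X = [[2],[4]]`, `Y = [[2],[2]]` and direction `gx = gy = [1]`, the first DMU scores 0 (efficient) and the second scores 1 (it can shed one unit of input while gaining one unit of output). The paper's models add selection variables for multi-valued measures and treat undesirable outputs separately; this sketch shows only the shared LP core.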
•Published data were compiled worldwide to quantify the effects of removing residues.
•We found overall reductions in total and available soil nutrients.
•This was due to increases in nutrient outputs and maybe changes in microbial activity.
•Soil fertility loss has a negative effect on the growth of the subsequent forests.
•We identified the causes of variability of the effects and hence mitigation measures.
Increasing attention is being paid to using modern fuelwood as a substitute for fossil energies to reduce CO2 emissions. In this context, forest biomass, particularly harvesting residues (branches), and stumps and associated coarse roots, can be used to supply fuelwood chains. However, collecting harvesting residues can affect soil properties and trees, and these effects are still not fully understood. The main objective of the present study was to compile published data worldwide and to quantify the overall effects of removing harvesting residues on nutrient outputs, chemical and biological soil fertility and tree growth, through a meta-analysis. Our study showed that, compared with conventional stem-only harvest, removing the stem plus the harvesting residues generally increases nutrient outputs, thereby leading to reduced amounts of total and available nutrients in soils and to soil acidification, particularly when foliage is harvested along with the branches. Losses of available nutrients in soils could also be explained by reduced microbial activity and mineralization fluxes, which in turn may be affected by changes in organic matter quality and environmental conditions (soil compaction, temperature and moisture). Soil fertility losses were shown to have consequences for the subsequent forest ecosystem: tree growth was reduced by 3–7% in the short or medium term (up to 33 years after harvest) in the most intensive harvests (e.g. when branches are exported with foliage). Combining all the results showed that, overall, whole-tree harvesting has negative impacts on soil properties and trees that may affect the functioning of forest ecosystems. Practical measures that could be taken to mitigate the environmental consequences of removing harvesting residues are discussed.
Mid-infrared (MIR) fiber lasers are in high demand for a variety of technological applications, including laser surgery, frequency metrology, and spectroscopy. This paper demonstrates stable continuous-wave laser operation in the 2.8 µm region using an Er3+-doped ZBLAN fiber in a free-space experimental setup. A laser output power of 12.4 mW was attained in a linear cavity when this fiber was pumped with 2.9 W from a 980 nm laser diode.
This note studies output-feedback consensus of linear multiagent systems (MASs) on directed graphs. In view of limited communication resources in MAS tasks, the aim of this note is to reduce interagent communication. A fully distributed dynamic output-feedback protocol is proposed, where an event-triggered strategy is designed to determine when to exchange protocol states while a time-triggered one is designed to determine when to sample relative outputs. Design conditions are first established for strongly connected graphs, and then refined for directed graphs with a spanning tree. In contrast to the existing works, the merits of the proposed protocol are threefold: 1) it is applicable for directed graphs; 2) it relies on relative outputs, rather than absolute outputs or absolute/relative states; and 3) it only requires intermittent communication but no continuous monitoring of neighboring agents.
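The core idea of event-triggered consensus, as described above, is that an agent re-broadcasts information only when its true state has drifted far enough from its last broadcast. A minimal sketch for single-integrator agents with plain state feedback (much simpler than the paper's dynamic output-feedback protocol, and with hypothetical threshold parameters) illustrates the intermittent-communication mechanism:

```python
import numpy as np

def simulate(A, x0, T=2000, dt=0.01, c0=0.5, decay=0.5):
    """Event-triggered consensus sketch for single-integrator agents.

    A[i, j] = 1 if agent i receives information from agent j.
    Each agent re-broadcasts its state only when the gap between its
    true state and its last broadcast exceeds a decaying threshold
    c0 * exp(-decay * t), so communication is intermittent rather
    than continuous.
    """
    n = len(x0)
    x = np.array(x0, dtype=float)
    xhat = x.copy()            # last-broadcast states
    events = 0
    for k in range(T):
        t = k * dt
        # Consensus input built from broadcast (not true) states.
        u = np.array([sum(A[i, j] * (xhat[j] - xhat[i]) for j in range(n))
                      for i in range(n)])
        x += dt * u
        thresh = c0 * np.exp(-decay * t)
        for i in range(n):
            if abs(x[i] - xhat[i]) > thresh:   # trigger: broadcast state
                xhat[i] = x[i]
                events += 1
    return x, events
```

On a directed 3-cycle (which is strongly connected and weight-balanced), the agents converge close to the average of their initial states while broadcasting far less often than once per sampling step. The paper's protocol replaces the state broadcast with an exchange of internal protocol states driven by relative outputs, but the trigger logic follows the same pattern.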
We model the performance of DMUs (decision-making units) using a two-stage network model. In the first stage of production DMUs use inputs to produce an intermediate output that becomes an input to a second stage where final outputs are produced. Previous black box DEA models allowed for non-radial scaling of outputs and inputs and accounted for slacks in the constraints that define the technology. We extend these models and build a performance measure that accounts for a network structure of production. We use our method to estimate the performance of Japanese banks, which use labor, physical capital, and financial equity capital in a first stage to produce an intermediate output of deposits. In the second stage, those deposits become an input in the production of loans and securities investments. The network estimates reveal greater bank inefficiency than do the estimates that treat the bank production process as a black box with all production taking place in a single stage.
The threat of global climate change has caused the international community to pay close attention to atmospheric levels of greenhouse gases such as carbon dioxide. Transportation sector carbon dioxide emissions efficiency (TSCDEE) is a key indicator used to prioritize sustainable development in the transportation sector. In this paper, the epsilon-based measure data envelopment analysis model with undesirable outputs is applied to estimate TSCDEE for 30 provinces in China from 2010 to 2016. We also analyze influencing factors using the spatial Durbin model. Research shows that the overall TSCDEE of the Chinese provinces studied was 0.618, indicating that most regions are still in need of improvements. The provinces with the highest TSCDEE are located in developed coastal regions of China. This study shows that factors such as transportation structure, traffic infrastructure level, and technological progress have prominent positive effects on TSCDEE, while both urbanization level and urban population density exert significantly negative effects on TSCDEE. The findings should have a far-reaching impact on the sustainable development of global transportation.
•Provides a more accurate method for estimating CO2 emissions of the transport sector.
•Focuses on transportation sector carbon dioxide emissions efficiency (TSCDEE).
•Technological progress is the key to improving TSCDEE among different factors in China.
•A higher mode share of water and rail transport contributes to higher TSCDEE in China.
•Urbanization presents a significantly negative effect on TSCDEE in China.
Unexpectedness in medical research
Aslan, Yasemin; Yaqub, Ohid; Sampat, Bhaven N.
Research Policy, October 2024, Volume 53, Issue 8
Journal article · Peer-reviewed · Open access
Whether research funding is targetable is one of the central unresolved questions of science policy. A particular question is how often research aimed at understanding one disease or problem spills over to others. This has been a perennial topic of debate at the world's largest single funding body of biomedical research, the U.S. National Institutes of Health (NIH). Critics of the agency's priority-setting process have repeatedly called for better alignment between funding and disease burden, and patient advocates for specific diseases have called for more funding for their causes. In response, opponents of planning have argued that research in one area frequently leads to advances in others. In this study, we provide new evidence to inform these debates by examining the extent to which research funding (grants) in one scientific or disease area leads to research findings (publications) in another. We used the NIH's Research, Condition, and Disease Categorization (RCDC) to identify categories for NIH grants awarded between 2008 and 2016. We applied machine learning to map text to these categories and used this model to categorize publications resulting from these grants. We categorized over 1.2 million publications, resulting from over 90,000 grants. We found that 70 % of the publications have at least one RCDC category not among their grant's categories, which we termed 'unexpected' categories. On average, 40 % of categories assigned to a publication were unexpected. After adjusting for similarity across some of the RCDC categories by empirically clustering the categories, we found 58 % of the publications had at least one unexpected category and, on average, 33 % of publication categories were unexpected. Our results suggest that disease orientation and clinical research were less likely to be associated with spillovers.
Grants resulting from targeted requests for applications were more likely to result in publications with unexpected categories, though the magnitude of the differences was relatively small.
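Once grants and publications are both labelled with category sets, the two headline statistics above (share of publications with at least one unexpected category, and the mean fraction of unexpected categories per publication) reduce to simple set arithmetic. A toy sketch of that calculation, with hypothetical category names standing in for RCDC labels:

```python
def unexpected_shares(grant_cats, pubs):
    """Compute the two spillover statistics used in the study.

    grant_cats: dict mapping grant id -> set of grant categories.
    pubs: list of (grant_id, set_of_publication_categories) pairs.
    A publication category is 'unexpected' if it is absent from the
    funding grant's category set.
    Returns (fraction of pubs with >=1 unexpected category,
             mean per-publication share of unexpected categories).
    """
    any_unexpected = 0
    shares = []
    for gid, cats in pubs:
        unexpected = cats - grant_cats[gid]   # set difference
        if unexpected:
            any_unexpected += 1
        shares.append(len(unexpected) / len(cats))
    return any_unexpected / len(pubs), sum(shares) / len(shares)
```

For example, a grant categorized {cancer, genetics} that yields one publication categorized {cancer} and another categorized {cancer, neuroscience} gives a 50 % publication-level rate and a 25 % mean category share. The study itself obtains the category sets from a text-classification model trained on RCDC-labelled grant text; this sketch covers only the counting step.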
•Examines how often research aimed at one disease or problem spills over to others.
•We used the NIH's RCDC to identify categories in over 90,000 grants.
•We applied machine learning to categorize over 1.2 million publications.
•We found 70 % of publications have at least one unexpected category.
•On average, 40 % of categories assigned to a publication were unexpected.
•Dimensionality reduction extends the class of simulators a Gaussian process can model.
•The emulator model accurately reproduces spatiotemporally varying inundation.
•Predictive uncertainty is quantified utilising the Gaussian process framework.
•Consistencies observed between emulator risk estimates and existing flood risk maps.
•The Gaussian process model was observed to outperform alternative approaches to emulation.
The computational limitations of complex numerical models have led to the adoption of statistical emulators across a variety of problems in science and engineering disciplines to circumvent the high computational costs associated with numerical simulations. In flood modelling, many hydraulic and hydrodynamic numerical models, especially when operating at high spatiotemporal resolutions, have prohibitively high computational costs for tasks requiring the instantaneous generation of very large numbers of simulation results. This study examines the appropriateness and robustness of Gaussian Process (GP) models to emulate the results from a hydraulic inundation model. The developed GPs produce real-time predictions based on the simulation output from the LISFLOOD-FP numerical model. An efficient dimensionality reduction scheme is developed to tackle the high dimensionality of the output space and is combined with the GPs to investigate the predictive performance of the proposed emulator for estimation of the inundation depth. The developed GP-based framework is capable of robust and straightforward quantification of the uncertainty associated with the predictions, without requiring additional model evaluations and simulations. Further, this study explores the computational advantages of using a GP-based emulator over alternative methodologies such as neural networks, by undertaking a comparative analysis. For the case study data presented in this paper, the GP model was found to accurately reproduce water depths and inundation extent by classification and to produce computational speedups of approximately 10,000 times compared with the original simulator, and 80 times compared with a neural network-based emulator.
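The combination described above (reduce the high-dimensional simulator output to a few components, then fit one GP per component over the input parameters) can be sketched in pure NumPy. This is a generic illustration only, with hypothetical hyperparameters; the paper's reduction scheme, kernels, and uncertainty quantification for LISFLOOD-FP are more elaborate:

```python
import numpy as np

def rbf(A, B, ls=0.3):
    """Squared-exponential kernel between two point sets (rows = points)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def fit_emulator(params, fields, n_components=2, ls=0.3, noise=1e-8):
    """PCA + per-component GP emulator of a high-dimensional field.

    params: (n_runs, d) simulator input parameters.
    fields: (n_runs, p) flattened simulator outputs (e.g. depth maps).
    """
    mean = fields.mean(0)
    # PCA via SVD of the centred output matrix.
    _, _, Vt = np.linalg.svd(fields - mean, full_matrices=False)
    basis = Vt[:n_components]                  # principal directions
    scores = (fields - mean) @ basis.T         # (n_runs, n_components)
    # One GP regression per retained component (shared kernel here).
    K = rbf(params, params, ls) + noise * np.eye(len(params))
    alpha = np.linalg.solve(K, scores)         # GP weight vectors
    return dict(mean=mean, basis=basis, params=params, alpha=alpha, ls=ls)

def predict(model, params_new):
    """Predict the full field at new parameter values."""
    Ks = rbf(params_new, model["params"], model["ls"])
    scores = Ks @ model["alpha"]               # predicted component scores
    return scores @ model["basis"] + model["mean"]
```

Because prediction is just two small matrix products, a fitted emulator of this shape answers in microseconds regardless of how expensive the original simulation was, which is the source of the speedups reported in the study.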