Financial institutions are interconnected directly by holding debt claims against each other (the network channel), and they are also bound by the market when selling assets to raise cash in distressed circumstances (the liquidity channel). The goal of our study is to investigate how these two channels of risk interact to propagate individual defaults into a systemwide catastrophe. We formulate a constrained optimization problem that incorporates both channels of risk, and exploit the problem structure to compute the solution (the clearing payment vector) via a partition algorithm. Through sensitivity analysis, we identify two key contributors to financial systemic risk, the network multiplier and the liquidity amplifier, and discern the qualitative difference between the two, confirming that the market liquidity effect has great potential to cause systemwide contagion. We illustrate the network and market liquidity effects, in particular the significance of the latter, in the formation of systemic risk with data from the European banking system. Our results contribute to a better understanding of the effectiveness of certain policy interventions. In addition, our algorithm can be used to pin down the changes in the net worth (marked to market) of each bank in the system as the spillover effect spreads, so as to estimate the extent of contagion and to provide a metric of financial resilience. Our framework can also be easily extended to incorporate the effect of bankruptcy costs.
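The paper's partition algorithm is specific to its model, but the clearing payment vector it solves for can be illustrated with the classic Eisenberg–Noe fixed-point iteration. The sketch below is a minimal stand-in, not the paper's method; the function name, the liability matrix `L`, and the outside-asset vector `e` are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's partition algorithm): the standard
# Eisenberg-Noe fixed-point iteration for a clearing payment vector.
# L[i, j] = nominal liability of bank i to bank j; e[i] = outside assets of bank i.
def clearing_vector(L, e, tol=1e-10, max_iter=1000):
    p_bar = L.sum(axis=1)                       # total nominal obligations per bank
    with np.errstate(invalid="ignore", divide="ignore"):
        Pi = np.where(p_bar[:, None] > 0, L / p_bar[:, None], 0.0)  # relative liabilities
    p = p_bar.copy()                            # start from full payment
    for _ in range(max_iter):
        # each bank pays min(obligation, outside assets + interbank receipts)
        p_new = np.minimum(p_bar, e + Pi.T @ p)
        if np.max(np.abs(p_new - p)) < tol:
            break
        p = p_new
    return p

# Toy two-bank system: bank 0 owes 2 to bank 1, bank 1 owes 1 to bank 0.
L = np.array([[0.0, 2.0], [1.0, 0.0]])
e = np.array([0.5, 3.0])
p = clearing_vector(L, e)
```

In this toy example bank 1 can pay in full, so bank 0's receipts are its outside assets plus that payment, leaving bank 0 in partial default.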
We develop a dynamic model to study the systemic risk of a banking network and the dynamics of bank defaults. In contrast to the existing literature, we show that while the possibility of contagion is determined by the interconnectedness of the financial network, whether a financial crisis actually occurs depends on the profile of the liquid assets of the banks in the system. Based on the dynamic model, we introduce a time-to-crisis index that allows us to predict the occurrence of a financial crisis. We then provide an intuitive measure of systemic risk. To illustrate the potential usefulness of our model, we provide an analysis of a system of twenty-two German banks, showing how many of the banks are fundamentally weak, where the contagion effect may arise, how strong the contagion effect is, and how significant the systemic risk is.
We develop a continuous‐time control approach to optimal trading in a Proof‐of‐Stake (PoS) blockchain, formulated as a consumption‐investment problem that aims to strike the optimal balance between a participant's (or agent's) utility from holding/trading stakes and utility from consumption. We present solutions via dynamic programming and the Hamilton–Jacobi–Bellman (HJB) equations. When the utility functions are linear or convex, we derive closed‐form solutions and show that the bang‐bang strategy is optimal (i.e., always buy or sell at full capacity). Furthermore, we bring out the explicit connection between the rate of return in trading/holding stakes and the participant's risk‐adjusted valuation of the stakes. In particular, we show that when a participant is risk‐neutral or risk‐seeking, corresponding to the risk‐adjusted valuation being a martingale or a sub‐martingale, the optimal strategy must be to either buy all the time, sell all the time, or first buy then sell, with both buying and selling executed at full capacity. We also propose a risk‐control version of the consumption‐investment problem; and for a special case, the "stake‐parity" problem, we show a mean‐reverting strategy is optimal.
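The dynamic-programming structure alluded to above can be sketched in generic consumption-investment form; the notation below is illustrative, not the paper's, with $V$ a value function in stake holding $x$, consumption rate $c$, and trading rate $u$ capped at $M$:

```latex
% Generic consumption-investment HJB (illustrative notation, not the paper's):
% V(t,x) = value function, c = consumption rate, u = trading rate, |u| <= M,
% U_c, U_x = utilities from consumption and from holding stakes.
\[
  \partial_t V + \sup_{c \ge 0,\; |u| \le M}
  \Big\{ U_c(c) + U_x(x) + \big(\mu(x) + u - c\big)\,\partial_x V
  + \tfrac{1}{2}\sigma^2(x)\,\partial_{xx} V \Big\} = 0 .
\]
% When the maximand is linear in u, the supremum over |u| <= M is attained
% at u = +M or u = -M, which is why a bang-bang (full-capacity buy/sell)
% strategy emerges under linear or convex utilities.
```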
A new platform described as the liquid metal/metal oxide (LM/MO) framework is introduced. The constituent spherical structures of these frameworks are made of micro‐ to nanosized liquid metal spheres and nanosized metal oxides, combining the advantages of both materials. It is shown that the diameters of the spheres and the stoichiometry of the structures can be actively controlled. Additionally, the liquid suspension of these spheres demonstrates tuneable plasmon resonances. These spherical structures are assembled to form LM/MO frameworks which are capable of demonstrating high sensitivity towards low concentrations of heavy metal ions, and enhanced solar light driven photocatalytic activities. These demonstrations imply that the LM/MO frameworks are a suitable candidate for the development of future high performance electronic and optical devices.
A new platform described as the liquid metal/metal oxide (LM/MO) framework is introduced. The constituent spherical structures of these frameworks are made of micro‐ to nanosized liquid metal spheres and nanosized metal oxides. These LM/MO frameworks demonstrate high sensitivity towards low concentrations of heavy metal ions and enhanced solar light driven photocatalytic activities.
Tunable plasmon resonances in suspended 2D molybdenum oxide flakes are demonstrated. The 2D configuration generates a large depolarization factor and the presence of ultra-doping produces visible-light plasmon resonances. The ultra-doping process is conducted by reducing the semiconducting 2D MoO₃ flakes using simulated solar irradiation. The generated plasmon resonances can be controlled by the doping levels and the flakes' lateral dimensions, as well as by exposure to a model protein.
Currently, no surveillance guidelines for hepatocellular carcinoma (HCC) recurrence after liver transplantation (LT) exist. In this retrospective, multicenter study, we have investigated the role of surveillance imaging on postrecurrence outcomes.
Patients with recurrent HCC after LT from 2002 to 2016 were reviewed from 3 transplant centers (University of California San Francisco, Mayo Clinic Florida, and University of Toronto). For this study, we proposed the term cumulative exposure to surveillance (CETS) as a way to define the cumulative sum of all the protected intervals that each surveillance test provides. In our analysis, CETS has been treated as a continuous variable in months.
Two hundred twenty-three patients from 3 centers had recurrent HCC post-LT. The median follow-up was 31.3 months, and median time to recurrence was 13.3 months. Increasing CETS was associated with improved postrecurrence survival (hazard ratio, 0.94; P < 0.01) as was treatment of recurrence with resection or ablation (hazard ratio, 0.31; P < 0.001). A receiver operating characteristic (ROC) curve (area under the curve, 0.64) for the CETS covariate showed that 252 days of coverage (or 3 surveillance scans) within the first 24 months provided the highest probability for aggressive postrecurrence treatment.
In this review of 223 patients with post-LT HCC recurrence, we found that increasing CETS leads to improved postrecurrence survival as well as a higher probability of aggressive recurrence treatment. We found that 252 days of monitoring (ie, 3 surveillance scans) in the first 24 months was associated with the ability to offer potentially curative treatment.
The objective of this study is to develop a majorization-based tool to compare financial networks with a focus on the implications of liability concentration. Specifically, we quantify liability concentration by applying the majorization order to the liability matrix that captures the interconnectedness of banks in a financial network. We develop notions of balancing and unbalancing networks to bring out the qualitatively different implications of liability concentration on the system’s loss profile. We illustrate how to identify networks that are balancing or unbalancing, and we make connections to interbank structures identified by empirical research, such as perfect and imperfect tiering schemes. An empirical analysis of the network formed by the banking sectors of eight representative European countries suggests that the system is either unbalancing or close to it, persistently over time. This empirical finding, along with the majorization results, supports regulatory policies aiming at limiting the size of gross exposures to individual counterparties.
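The majorization order underlying this comparison can be checked mechanically. The sketch below is a generic illustration, not the paper's tool: it tests whether one liability vector majorizes another of equal total, which is one way to say its exposures are more concentrated. The function name and the toy vectors are assumptions.

```python
import numpy as np

# Minimal sketch: x majorizes y (for equal sums) iff every partial sum of the
# descending-sorted x dominates the corresponding partial sum of sorted y.
def majorizes(x, y, tol=1e-12):
    x, y = np.sort(x)[::-1], np.sort(y)[::-1]
    if abs(x.sum() - y.sum()) > tol:
        return False            # majorization is only defined for equal totals
    return bool(np.all(np.cumsum(x) >= np.cumsum(y) - tol))

# A concentrated liability profile majorizes a more evenly spread one of the
# same total, but not vice versa.
concentrated = [3.0, 1.0, 0.0]
spread = [2.0, 1.0, 1.0]
a = majorizes(concentrated, spread)
b = majorizes(spread, concentrated)
```

Applying this row-by-row (or to suitable aggregates) of two liability matrices is the kind of comparison the majorization order enables.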
Risk Hedging for Production Planning
Wang, Liao; Yao, David D.
Production and Operations Management, 06/2021, Volume 30, Issue 6
Journal Article · Peer-reviewed · Open Access
Traditional production planning is primarily a quantity or capacity decision, which must be made at the beginning of a planning horizon before production starts. Adding a real‐time control to this decision yields a risk‐hedging strategy, carried out throughout the horizon, that can better mitigate the risk involved in demand volatility. We demonstrate how this can be done in terms of jointly optimizing the capacity and the hedging decisions, addressing both the mean‐variance and the shortfall objectives. Solution techniques, results, and insights are highlighted. In particular, we illustrate that our approach readily accommodates data analytics and explicitly quantifies the improvement to the efficient frontier contributed by hedging.
Accurate prediction of a patient's length-of-stay (LOS) in the hospital enables an efficient and effective management of hospital beds. This paper studies LOS prediction for pediatric patients with respiratory diseases using three decision tree methods: Bagging, AdaBoost, and Random forest. A data set of 11,206 records retrieved from the hospital information system is used for analysis after preprocessing and transformation through a computation and an expansion method. Two tests, namely bisection test and periodic test, are designed to assess the performance of the prediction methods. Bagging shows the best result on the bisection test (0.296 RMSE, 0.831 R², and 0.723 Acc ± 1) for the testing set of the whole data set. The performances of the three methods are similar on the periodic test, with AdaBoost performing slightly better than the other two methods. Results indicate that the three methods are all effective for the LOS prediction. This study also investigates the importance of different data fields to the LOS prediction, and finds that hospital treatment-related data fields contribute more to the LOS prediction than other categories of fields.
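The three-way comparison described above can be sketched with scikit-learn's standard regressors. This is a minimal stand-in using synthetic data, not the hospital records, and the train/test split is only loosely analogous to the paper's bisection test; all variable names and data are assumptions.

```python
# Minimal sketch of comparing Bagging, AdaBoost, and Random forest for a
# regression target (synthetic stand-in data, not the hospital records).
import numpy as np
from sklearn.ensemble import BaggingRegressor, AdaBoostRegressor, RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))                                 # stand-in for preprocessed fields
y = 2.0 * X[:, 0] + X[:, 1] + rng.normal(scale=0.1, size=500) # stand-in for LOS
# 50/50 split, loosely analogous to a bisection-style evaluation
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

results = {}
for name, model in [("Bagging", BaggingRegressor(random_state=0)),
                    ("AdaBoost", AdaBoostRegressor(random_state=0)),
                    ("RandomForest", RandomForestRegressor(random_state=0))]:
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    results[name] = {"rmse": mean_squared_error(y_te, pred) ** 0.5,
                     "r2": r2_score(y_te, pred)}
```

Field-importance analysis of the kind the paper reports is available from the fitted `RandomForestRegressor` via its `feature_importances_` attribute.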
While the search for an efficacious HIV-1 vaccine remains elusive, emergence of a new generation of virus-neutralizing monoclonal antibodies (mAbs) has re-ignited the field of passive immunization for HIV-1 prevention. However, the plasticity of HIV-1 demands additional improvements to these mAbs to better ensure their clinical utility. Here, we report engineered bispecific antibodies that are the most potent and broad HIV-neutralizing antibodies to date. One bispecific antibody, 10E8V2.0/iMab, neutralized 118 HIV-1 pseudotyped viruses tested with a mean 50% inhibitory concentration (IC50) of 0.002 μg/mL. 10E8V2.0/iMab also potently neutralized 99% of viruses in a second panel of 200 HIV-1 isolates belonging to clade C, the dominant subtype accounting for ∼50% of new infections worldwide. Importantly, 10E8V2.0/iMab reduced virus load substantially in HIV-1-infected humanized mice and also provided complete protection when administered prior to virus challenge. These bispecific antibodies hold promise as novel prophylactic and/or therapeutic agents in the fight against HIV-1.
• We engineered bispecific antibodies with exquisite potency and breadth against HIV-1
• Two bispecifics are the most potent and broad HIV-neutralizing antibodies to date
• One bispecific inhibited HIV-1 in HIV treatment and HIV prevention animal models
• These bispecifics are novel clinical candidates for HIV prevention and treatment
Monotherapy with bispecific engineered antibodies, which have the broadest and most potent neutralizing activity against HIV-1 described to date, can neutralize HIV-1 in both prevention and treatment animal models.