Preclinical data have shown that proton pump inhibitors (PPI) can modulate the microbiome, and single-arm studies suggested that antibiotics (ATB) may decrease the efficacy of immune checkpoint inhibitors (ICI), but randomized controlled trial data are lacking. This pooled analysis evaluated the effect of ATB and PPI on outcomes in patients randomized between ICI and chemotherapy.
This retrospective analysis used pooled data from the phase II POPLAR (NCT01903993) and phase III OAK (NCT02008227) trials, which included 1512 patients with previously treated non-small-cell lung cancer (NSCLC) randomly assigned to receive atezolizumab (n = 757) or docetaxel (n = 755). The main objective of this analysis was to assess the impact of ATB and PPI use on overall survival (OS) and progression-free survival (PFS).
A total of 169 (22.3%) patients in the atezolizumab group and 202 (26.8%) in the docetaxel group received ATB, and 234 (30.9%) and 260 (34.4%), respectively, received PPI. Multivariate analysis in all patients revealed that ATB use was associated with shorter OS (hazard ratio (HR) 1.20, 95% confidence interval (CI) 1.04–1.39), as was PPI use (HR 1.26, 95% CI 1.10–1.44). Within the atezolizumab population, OS was significantly shorter in patients who received ATB (8.5 versus 14.1 months, HR 1.32, 95% CI 1.06–1.63, P = 0.01) or PPI (9.6 versus 14.5 months, HR 1.45, 95% CI 1.20–1.75, P = 0.0001). PPI use was associated with shorter PFS in the atezolizumab population (1.9 versus 2.8 months, HR 1.30, 95% CI 1.10–1.53, P = 0.001). There was no association between ATB or PPI use and PFS or OS within the docetaxel population.
In this unplanned analysis from two randomized trials, data suggest that ATB or PPI use in patients with metastatic NSCLC is associated with poor outcome and may influence the efficacy of ICI.
•Use of antibiotics or proton pump inhibitors in patients with non-small-cell lung cancer is associated with poor outcome.
•The effect of antibiotics and proton pump inhibitors on outcome after immunotherapy warrants further investigation.
•Treating physicians should carefully evaluate the need for co-medications such as antibiotics or proton pump inhibitors.
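As a rough illustration of what the reported hazard ratios quantify, the sketch below computes a crude hazard ratio for two groups under a constant-hazard (exponential) assumption: the ratio of incidence rates. The event counts and follow-up times are hypothetical, not the trial data, and the trial's multivariate Cox estimates additionally adjust for covariates.

```python
def crude_hazard_ratio(events_exposed, person_time_exposed,
                       events_unexposed, person_time_unexposed):
    """Crude hazard ratio under a constant-hazard assumption:
    ratio of incidence rates (events per unit of follow-up time)."""
    rate_exposed = events_exposed / person_time_exposed
    rate_unexposed = events_unexposed / person_time_unexposed
    return rate_exposed / rate_unexposed

# Hypothetical numbers: an HR > 1 means the exposed group (e.g. ATB users)
# experiences events at a higher instantaneous rate than the unexposed group.
hr = crude_hazard_ratio(65, 500.0, 50, 500.0)   # ≈ 1.3
```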
Tsunami warning centres face the challenging task of rapidly forecasting tsunami threat immediately after an earthquake, when there is high uncertainty due to data deficiency. Here we introduce Probabilistic Tsunami Forecasting (PTF) for tsunami early warning. PTF explicitly treats data and forecast uncertainties, enabling alert-level definitions according to any predefined level of conservatism, which is connected to the average balance of missed versus false alarms. Impact forecasts and the resulting recommendations become progressively less uncertain as new data become available. Here we report an implementation for near-source early warning and test it systematically by hindcasting the great 2010 M8.8 Maule (Chile) and the well-studied 2003 M6.8 Zemmouri-Boumerdes (Algeria) tsunamis, as well as all the Mediterranean earthquakes that triggered alert messages at the Italian Tsunami Warning Centre since its inception in 2015, demonstrating forecasting accuracy over a wide range of magnitudes and earthquake types.
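The alert-level logic described above can be sketched as a weighted quantile over an ensemble of forecast inundation heights: the more conservative the chosen quantile, the fewer missed alarms at the price of more false alarms. The thresholds, quantile value, and level names below are illustrative assumptions, not the operational PTF settings.

```python
import numpy as np

def weighted_quantile(values, weights, q):
    """q-th quantile of a weighted ensemble (step interpolation)."""
    order = np.argsort(values)
    v = np.asarray(values, dtype=float)[order]
    w = np.asarray(weights, dtype=float)[order]
    cum = np.cumsum(w) / np.sum(w)
    return v[np.searchsorted(cum, q)]

def alert_level(heights, weights, q=0.85, advisory=0.2, watch=1.0):
    """Map the q-th quantile of forecast heights (metres) to an alert level.
    Thresholds and names are hypothetical placeholders."""
    h = weighted_quantile(heights, weights, q)
    if h >= watch:
        return "watch"
    if h >= advisory:
        return "advisory"
    return "information"
```

Raising `q` (more conservatism) can promote the same ensemble from "advisory" to "watch", which is the missed-versus-false-alarm trade-off the text refers to.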
Most real-world networks, from the World-Wide-Web to biological systems, are known to share common structural properties. A remarkable one is fractality, which suggests self-similarity across scales of the network structure of these complex systems. Managing the computational complexity of detecting self-similarity in large systems is a crucial problem. In this paper, a novel algorithm for revealing fractality, which exploits the community structure principle, is proposed and applied to several water distribution systems (WDSs) of different sizes, unveiling a self-similar feature of their layouts. A scaling-law relationship is defined linking the number of clusters necessary to cover the network and their average size, the exponent of which represents the fractal dimension. The self-similarity is then investigated as a proxy of recurrent and specific responses to multiple random pipe failures, as occur during natural disasters, pointing out a specific global vulnerability for each WDS. A novel vulnerability index, called Cut-Vulnerability, is introduced as the ratio between the fractal dimension and the average node degree, and its relationships with the number of randomly removed pipes necessary to disconnect the network and with some topological metrics are investigated. The analysis shows the effectiveness of the novel index in describing the global vulnerability of WDSs.
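The two quantities defined above can be sketched in a few lines, assuming the cluster counts and average cluster sizes come from a covering/clustering procedure run at several scales (the sample numbers in the test are synthetic, chosen to follow an exact power law):

```python
import numpy as np

def fractal_dimension(n_clusters, avg_sizes):
    """Exponent d of the scaling law N_c ~ s^(-d), estimated as the
    (negated) slope of a log-log least-squares fit."""
    slope, _ = np.polyfit(np.log(avg_sizes), np.log(n_clusters), 1)
    return -slope

def cut_vulnerability(d_fractal, avg_degree):
    """Cut-Vulnerability index: fractal dimension over average node degree."""
    return d_fractal / avg_degree
```

A higher index flags networks whose self-similar layout is easier to disconnect relative to their nodes' connectivity.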
In confined spaces such as living environments and workplaces, the concentration levels of radon (²²²Rn) can be very high compared to the external environment. Since radon has been classified as the second leading cause of lung cancer after cigarette smoking, dense maps of indoor radon concentration are needed to apply efficient, locally based risk-reduction actions. These maps would provide information about the areas prone to high radon concentrations and therefore more dangerous to human health. Soil is the primary source of radon; hence, risk assessment and reduction for radon exposure cannot disregard the identification of the local geology. In this regard, we propose an innovative method, based on the Gini index computation, for the realization of interpolated maps (kriging) describing the distribution of radon concentration. To validate the method, a tool that simulates sets of radon concentrations is used, whose variability is, to first order, controlled by a priori imposed lithologies. A systematic comparison is made between the results achieved by a classically used geostatistical method and the proposed Gini-based tool. We show how, using the latter, the kriging solutions are more robust in resolving the different geogenic radon sources independently of the number of available measurements.
•Rn kriging provides information on the areas prone to high radon concentrations and therefore more dangerous to human health.
•We introduce an alternative kriging tool based on the computation of the Lorenz curve and the Gini index.
•We show how the Gini-based kriging maps better reproduce the indoor radon variability due to differing soil geology.
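The Gini index at the heart of the proposed tool can be computed directly from the Lorenz curve of a sample of concentrations; a minimal sketch (the grouping of measurements by lithology is assumed to happen upstream):

```python
import numpy as np

def gini(x):
    """Gini index of a non-negative sample, from its Lorenz curve.
    0 = perfectly uniform values; values approach 1 as the total
    concentrates in fewer and fewer measurements."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    lorenz = np.cumsum(x) / x.sum()      # Lorenz curve ordinates L_1..L_n
    # Gini = 1 - 2 * (area under the Lorenz step curve)
    return 1.0 - 2.0 * np.sum(lorenz) / n + 1.0 / n
```

Homogeneous lithologies yield low Gini values, while mixed geogenic sources show up as high inequality in the measured concentrations.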
Brownian dynamics algorithms integrate Langevin equations numerically and make it possible to probe long time scales in simulations. A common requirement for such algorithms is that interactions in the system should vary little during an integration time step; therefore, computational efficiency worsens as the interactions become steeper. In the extreme case of hard-body interactions, standard numerical integrators become ill-defined. Several approximate schemes have been invented to handle such cases, but little emphasis has been placed on testing the correctness of the integration scheme. Starting from the two-body Smoluchowski equation, the authors discuss a general method for the overdamped Brownian dynamics of hard spheres, recently developed by one of the authors. They test the accuracy of the algorithm and demonstrate its convergence for a number of analytically tractable test cases.
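For smooth potentials, the standard integrator alluded to above is the Euler-Maruyama scheme for the overdamped Langevin equation; a minimal sketch (the hard-sphere corrections that are the paper's subject are beyond this snippet):

```python
import numpy as np

rng = np.random.default_rng(0)

def brownian_step(x, force, dt, D=1.0, kT=1.0):
    """One Euler-Maruyama step of overdamped Brownian dynamics:
    dx = (D/kT) * F(x) * dt + sqrt(2 * D * dt) * N(0, 1)."""
    drift = (D / kT) * force(x) * dt
    noise = np.sqrt(2.0 * D * dt) * rng.standard_normal(np.shape(x))
    return x + drift + noise
```

For a harmonic trap F(x) = -kx, the stationary variance of x should approach kT/k, which is exactly the kind of analytically tractable correctness check the authors advocate.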
The detection of contaminant intrusion into a water-distribution network (WDN) is a difficult issue due to uncertainty related to the type of injected contaminant, the source location, and the intrusion time. The placement of water-quality sensors has received increasing interest in recent years, and it still represents an open problem and a great challenge for researchers and utilities. Efficient numerical techniques are needed to support any contamination warning system (CWS) design. These require a well-calibrated hydraulic model of the WDN and a great deal of information, both of which are often unavailable to water utilities. In addition, as the size of the WDN increases, the choice of an effective sensor placement becomes a computationally intractable problem. This paper introduces a methodology to support water utilities in the design of an effective CWS that makes no use of hydraulic information, exploiting only knowledge of the topology of the WDN. To ensure complete coverage of the network, the method relies on an a priori clustering of the WDN and on the installation of quality sensors at the most central nodes of each cluster, selected according to different topological centrality metrics. The procedure is tested on a benchmark network and on a real WDN serving a town close to Naples, Italy. The solutions obtained with topological criteria are effective in terms of detection time, detection likelihood, redundancy, and population exposed through ingestion.
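The placement step can be sketched with plain breadth-first search: in each cluster, install the sensor at the node with the highest centrality. Closeness centrality is used here as one of the possible topological metrics, and the clustering itself is assumed given.

```python
from collections import deque

def closeness(adj, node):
    """Closeness centrality of `node` in an unweighted graph given as an
    adjacency dict {node: [neighbours]} (assumes a connected graph)."""
    dist = {node: 0}
    queue = deque([node])
    while queue:                      # breadth-first search for distances
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return (len(dist) - 1) / sum(dist.values())

def place_sensors(adj, clusters):
    """One sensor per cluster: the cluster node most central in the network."""
    return [max(cluster, key=lambda n: closeness(adj, n)) for cluster in clusters]
```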
Nowadays, companies are experimenting with novel organizational solutions to operate efficiently in uncertain and highly dynamic scenarios. As a potential solution, this paper proposes a new business model for a multi-echelon Supply Chain inventory management pattern. Specifically, an inventory model with proactive lateral transshipments was developed and subsequently tested in 288 experiments with the aim of assessing the impact of transshipments on the performance of a two-echelon Supply Chain. The final goal was to investigate the potential reduction of the overall cost of the enterprise and, conversely, whether this approach could promote significant improvements in the level of service, achievable through more efficient management of resources. The analyses and simulations indicate that the use of large batches and/or low-cost products did not warrant transshipment events. These preliminary findings could be validated and tested in the future considering more complex networks or multiple products.
We develop a general theory for irreducible homogeneous spaces M = G/H, in relation to the nullity distribution ν of their curvature tensor. We construct natural invariant (different and increasing) distributions associated with the nullity, which give deep insight into such spaces. In particular, there must exist an order-two transvection, not in the nullity, with null Jacobi operator. This fact was very important for finding the first homogeneous examples with non-trivial nullity, i.e., where the nullity distribution is not parallel. Moreover, we construct irreducible examples of conullity k = 3, the smallest possible, in any dimension. None of our examples admit a quotient of finite volume. We also prove that H is trivial and G is solvable if k = 3. Another of our main results is that the leaves, i.e., the integral manifolds, of the nullity are closed (we use a rather delicate argument). This implies that M is a Euclidean affine bundle over the quotient by the leaves of ν. Moreover, we prove that ν⊥ defines a metric connection on this bundle with transitive holonomy or, equivalently, ν⊥ is completely non-integrable (this is not in general true for an arbitrary autoparallel and flat invariant distribution). We also found some general obstructions to the existence of non-trivial nullity: e.g., if G is reductive (in particular, if M is compact), or if G is two-step nilpotent.
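For reference, the nullity distribution discussed above has a standard pointwise definition in terms of the curvature tensor R (this is the textbook definition, not notation specific to this paper):

```latex
% Nullity of the curvature tensor R at a point p of M:
\nu_p \;=\; \{\, X \in T_pM \;:\; R(X, Y)Z = 0 \ \ \text{for all } Y, Z \in T_pM \,\}
% The conullity is its codimension:
k \;=\; \dim M - \dim \nu_p
```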
The complexity of coseismic slip distributions influences the tsunami hazard posed by local and, to a certain extent, distant tsunami sources. Large slip concentrated in shallow patches was observed in recent tsunamigenic earthquakes, possibly due to dynamic amplification near the free surface, variable frictional conditions, or other factors. We propose a method for incorporating enhanced shallow slip for subduction earthquakes while preventing systematic slip excess at shallow depths over one or more seismic cycles. The method uses the classic k⁻² stochastic slip distributions, augmented by shallow slip amplification. Deep events with lower slip must occur more often than shallow ones with amplified slip to balance the long-term cumulative slip. We evaluate the impact of this approach on tsunami hazard in the central and eastern Mediterranean Sea, adopting a realistic 3D geometry for three subduction zones and using it to model ~150,000 earthquakes with Mw from 6.0 to 9.0. We combine earthquake rates, depth-dependent slip distributions, tsunami modeling, and epistemic uncertainty through an ensemble modeling technique. We find that the mean hazard curves obtained with our method show enhanced probabilities for larger inundation heights compared to the curves derived from depth-independent slip distributions. Our approach is completely general and can be applied to any subduction zone in the world.
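A k⁻² stochastic slip distribution of the kind mentioned above can be illustrated in one dimension with random Fourier phases; this is a toy sketch, not the authors' full 2-D implementation (depth-dependent amplification and long-term moment balancing are omitted):

```python
import numpy as np

def k2_slip(n=256, seed=0):
    """1-D stochastic slip profile whose Fourier amplitudes fall off as
    k^-2, with uniformly random phases; shifted to be non-negative and
    rescaled to unit peak slip."""
    rng = np.random.default_rng(seed)
    k = np.arange(n // 2 + 1, dtype=float)
    amp = np.zeros(n // 2 + 1)
    amp[1:] = k[1:] ** -2.0                     # k^-2 amplitude decay
    phase = np.exp(2j * np.pi * rng.random(n // 2 + 1))
    slip = np.fft.irfft(amp * phase, n)
    slip -= slip.min()                          # enforce slip >= 0
    return slip / slip.max()                    # peak slip normalised to 1
```

In the method described above, profiles like this would then be amplified where they intersect the shallow part of the fault, with their occurrence rates adjusted so the cumulative slip budget stays balanced.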