We consider the extension of the Nitsche method to the case of fluid–structure interaction problems on unfitted meshes. We give a stability analysis for the space semi-discretized problem and show how this estimate may be used to derive optimal error estimates for smooth solutions, irrespective of the mesh/interface intersection. We also discuss different strategies for the time discretization, using either fully implicit or explicit coupling (loosely coupled) schemes. Some numerical examples illustrate the theoretical discussion.
•Unfitted finite element method for fluid–structure interaction.
•Proof of stability and accuracy.
•Different coupling schemes for time advancement: fully coupled or loosely coupled.
When estimating the average effect of a binary treatment (or exposure) on an outcome, methods that incorporate propensity scores, the G-formula, or targeted maximum likelihood estimation (TMLE) are preferred over naïve regression approaches, which are biased under misspecification of a parametric outcome model. In contrast, propensity score methods require the correct specification of an exposure model. Double-robust methods only require correct specification of either the outcome or the exposure model. Targeted maximum likelihood estimation is a semiparametric double-robust method that improves the chances of correct model specification by allowing for flexible estimation using (nonparametric) machine-learning methods. It therefore requires weaker assumptions than its competitors. We provide a step-by-step guided implementation of TMLE and illustrate it in a realistic scenario based on cancer epidemiology where the assumptions of correct model specification and positivity (i.e., that no study participant has zero probability of receiving the treatment) are nearly violated. This article provides a concise and reproducible educational introduction to TMLE for a binary outcome and exposure. The reader should gain sufficient understanding of TMLE from this introductory tutorial to be able to apply the method in practice. Extensive R code is provided in easy-to-read boxes throughout the article for replicability. Stata users will find a testing implementation of TMLE and additional material in Appendix S1 and at the following GitHub repository: https://github.com/migariane/SIM-TMLE-tutorial
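To make the targeting idea concrete, here is a minimal, self-contained Python sketch of the TMLE steps for a binary outcome and exposure. It is our own illustration (the tutorial's code is in R and Stata): the simulated data, variable names, and the deliberately misspecified initial outcome model are assumptions for demonstration. With a correctly specified propensity score, the "clever covariate" fluctuation corrects the confounded naïve estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Simulated data: binary confounder W, exposure A, outcome Y (true ATE = 0.3).
W = rng.binomial(1, 0.4, n)
A = rng.binomial(1, 0.2 + 0.4 * W)
Y = rng.binomial(1, 0.1 + 0.3 * A + 0.2 * W)

expit = lambda x: 1.0 / (1.0 + np.exp(-x))
logit = lambda p: np.log(p / (1.0 - p))

# Step 1: initial outcome model -- here deliberately misspecified
# (it ignores W), so the initial estimate is confounded.
Q1 = np.full(n, Y[A == 1].mean())   # initial Q(1, W)
Q0 = np.full(n, Y[A == 0].mean())   # initial Q(0, W)
QA = np.where(A == 1, Q1, Q0)       # initial Q at the observed exposure

# Step 2: correctly specified (saturated) propensity score g(W) = P(A=1 | W).
gW = np.where(W == 1, A[W == 1].mean(), A[W == 0].mean())

# Step 3: targeting step -- fit the one-parameter logistic fluctuation
# logit Q*(A, W) = logit Q(A, W) + eps * H by Newton's method on the score.
H = A / gW - (1 - A) / (1 - gW)     # "clever covariate"
eps = 0.0
for _ in range(100):
    p = expit(logit(QA) + eps * H)
    score = np.sum(H * (Y - p))
    eps += score / np.sum(H**2 * p * (1 - p))
    if abs(score) < 1e-8:
        break

# Step 4: updated counterfactual predictions and the targeted ATE estimate.
Q1s = expit(logit(Q1) + eps / gW)
Q0s = expit(logit(Q0) - eps / (1 - gW))
ate_naive = np.mean(Q1 - Q0)
ate_tmle = np.mean(Q1s - Q0s)
print(f"naive: {ate_naive:.3f}  TMLE: {ate_tmle:.3f}")
```

Because the propensity model is correct, the targeted estimate lands near the true effect of 0.3 even though the initial outcome model is wrong, which is the double-robustness property the abstract refers to.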
It is almost a century since nisin was discovered in fermented milk cultures, coincidentally in the same year that penicillin was first described. Over the last 100 years this small, highly modified pentacyclic peptide has not only found success in the food industry as a preservative but has also served as the paradigm for our understanding of the genetic organization, expression, and regulation of genes involved in lantibiotic biosynthesis—one of the few cases of extensive post-translational modification in prokaryotes. Recent developments in understanding the complex biosynthesis of nisin have shed light on the cellular location of the modification and transport machinery and the co-ordinated series of spatio-temporal events required to produce active nisin and provide resistance and immunity. The continued unearthing of new natural variants from within human and animal gastrointestinal tracts has sparked interest in the potential application of nisin to influence the microbiome, given the growing recognition of the role the gastrointestinal microbiota plays in health and disease. Moreover, interdisciplinary approaches have taken advantage of biotechnological advancements to bioengineer nisin to produce novel variants and expand nisin functionality for applications in the biomedical field. This review will discuss the latest progress in these aspects of nisin research.
The highly post-translationally modified peptide nisin has been studied over the course of the last 100 years. Although it has been employed successfully as a food preservative, its potent activity against multidrug-resistant microbes, long safety record, lack of any significant resistance development, and amenability to bioengineering approaches that improve its antimicrobial and physicochemical properties have meant that the focus of nisin-related research is shifting from food preservation towards therapeutic use for the treatment of bacterial infections.
•Partitioned methods invoking the fluid and solid solvers only once per time-step.
•The implicit treatment of the sole fluid/solid-inertia coupling guarantees stability.
•A priori error estimates guarantee optimal (first-order) accuracy.
•New insights on the partitioned solution of implicit coupling.
•A comprehensive list of numerical tests supports the theory.
We introduce a class of explicit coupling schemes for the numerical solution of fluid–structure interaction problems involving a viscous incompressible fluid and a general thin-walled structure (e.g., including damping and non-linear behavior). The fundamental ingredient in these methods is a (parameter free) explicit Robin interface condition for the fluid, which enables the fluid–solid splitting through appropriate extrapolations of the solid velocity and fluid stress on the interface. The resulting solution procedures are genuinely partitioned. Stability and error estimates are provided for all the variants (depending on the extrapolations), using energy arguments within a representative linear setting. In particular, we show that one of them simultaneously yields added-mass free unconditional stability and optimal (first-order) time accuracy. A comprehensive numerical study, involving different examples from the literature, supports the theory.
Individual and tumour factors only explain part of observed inequalities in colorectal cancer survival in England. This study aims to investigate inequalities in treatment in patients with colorectal cancer.
All patients diagnosed with colorectal cancer in England between 2012 and 2016 were followed up from the date of diagnosis (state 1) to treatment (state 2), death (state 3), or censoring at 1 year after diagnosis. A multistate approach with a flexible parametric model was used to investigate the effect of income deprivation on the probability of remaining alive and treated among patients with colorectal cancer.
Compared to the least deprived quintile, the most deprived patients with stage I–IV colorectal cancer had a lower probability of being alive and treated at all times during follow-up, and a higher probability of being untreated and of dying. The probability differences (most vs. least deprived) of being alive and treated at 6 months ranged between -2.4% (95% CI: -4.3, -1.1) and -7.4% (-9.4, -5.3) for colon cancer, and between -2.0% (-3.5, -0.4) and -6.2% (-8.9, -3.5) for rectal cancer.
Persistent inequalities in treatment were observed in patients with colorectal cancer at every stage, reflecting both delayed access to treatment and premature death.
In this paper we introduce a Nitsche-XFEM method for fluid–structure interaction problems involving a thin-walled elastic structure (Lagrangian formalism) immersed in an incompressible viscous fluid (Eulerian formalism). The fluid domain is discretized with an unstructured mesh not fitted to the solid mid-surface mesh. Weak and strong discontinuities across the interface are allowed for the velocity and pressure, respectively. The fluid–solid coupling is enforced consistently using a variant of Nitsche’s method with cut-elements. Robustness with respect to arbitrary interface intersections is guaranteed through suitable stabilization. Several coupling schemes with different degrees of fluid–solid time splitting (implicit, semi-implicit and explicit) are investigated. A series of numerical tests in 2D, involving static and moving interfaces, illustrates the performance of the different methods proposed.
The main purpose of many medical studies is to estimate the effects of a treatment or exposure on an outcome. However, it is not always possible to randomize the study participants to a particular treatment; therefore, observational study designs may be used. There are major challenges with observational studies, one of which is confounding. Controlling for confounding is commonly performed by direct adjustment of measured confounders; although, sometimes this approach is suboptimal due to modeling assumptions and misspecification. Recent advances in the field of causal inference have dealt with confounding by building on classical standardization methods. However, these recent advances have progressed quickly with a relative paucity of computationally oriented applied tutorials, contributing to some confusion in the use of these methods among applied researchers. In this tutorial, we show the computational implementation of different causal inference estimators from a historical perspective, where new estimators were developed to overcome the limitations of the previous estimators (i.e., nonparametric and parametric g-formula, inverse probability weighting, double-robust, and data-adaptive estimators). We illustrate the implementation of different methods using an empirical example from the Connors study based on intensive care medicine, and most importantly, we provide reproducible and commented code in Stata, R, and Python for researchers to adapt in their own observational study. The code can be accessed at
https://github.com/migariane/Tutorial_Computational_Causal_Inference_Estimators.
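As a minimal sketch of the first two estimators named above (the nonparametric g-formula and inverse probability weighting), the following Python example is our own illustration on simulated data with a single binary confounder, not the tutorial's code. With saturated (nonparametric) models, the two estimators coincide exactly, which is a useful check:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Simulated observational data with one binary confounder W
# (true average treatment effect = 0.2).
W = rng.binomial(1, 0.5, n)
A = rng.binomial(1, 0.3 + 0.3 * W)
Y = rng.binomial(1, 0.2 + 0.2 * A + 0.1 * W)

# Nonparametric g-formula: standardize the stratum-specific means of Y
# over the marginal distribution of W.
m = {(a, w): Y[(A == a) & (W == w)].mean() for a in (0, 1) for w in (0, 1)}
ate_gf = sum(np.mean(W == w) * (m[(1, w)] - m[(0, w)]) for w in (0, 1))

# Inverse probability weighting with a saturated (nonparametric)
# propensity score g(W) = P(A=1 | W).
gW = np.where(W == 1, A[W == 1].mean(), A[W == 0].mean())
ate_ipw = np.mean(A * Y / gW) - np.mean((1 - A) * Y / (1 - gW))

print(f"g-formula: {ate_gf:.3f}  IPW: {ate_ipw:.3f}")
```

In richer settings the two estimators diverge under misspecification (the g-formula relies on the outcome model, IPW on the exposure model), which is precisely the gap the double-robust and data-adaptive estimators discussed in the tutorial are designed to close.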
In this paper we introduce a class of fully decoupled time-marching schemes (velocity–pressure–displacement splitting) for the coupling of an incompressible fluid with a thin-walled elastic or viscoelastic structure. The time splitting combines a projection method in the fluid with a specific Robin–Neumann treatment of the interface coupling. A priori energy estimates guaranteeing unconditional stability are established for some of the schemes. The accuracy and performance of the methods proposed are illustrated by a thorough numerical study.
Global warming mitigation strategies are likely to affect human health and biodiversity through diverse cause-effect mechanisms. To analyze these effects, we implement a methodology to link TIMES energy models with life cycle assessment using open-source software. The proposed method uses a cutoff to identify the most relevant processes. These processes have their efficiencies, fuel mixes, and emission factors updated to be consistent with the TIMES model. The use of a cutoff criterion exponentially reduces the number of connection points between models, facilitating the analysis of scenarios with a large number of technologies involved. The method is used to assess the potential effects of deploying low-carbon technologies to reduce combustion emissions in the province of Quebec (Canada). In the case of Quebec, the reduction of combustion emissions is largely achieved through electrification of energy services. Global warming mitigation efforts reduce the impact on human health and ecosystem quality, mainly because of lower global warming, water scarcity, and metal contamination impacts. The TIMES model alone underestimated the reduction of CO2eq by 21% with respect to a full account of emissions.