The opioid epidemic is an escalating health crisis. We evaluated the impact of opioid prescription rates and socioeconomic determinants on opioid mortality rates, and identified potential differences in prescription patterns by categories of practitioners.
We combined the 2013 and 2014 Medicare Part D data and quantified the opioid prescription rate in a county-level cross-sectional study covering 2710 counties, 468,614 unique prescribers, and 46,665,037 beneficiaries. We used the CDC WONDER database to obtain opioid-related mortality data. Socioeconomic characteristics for each county were acquired from the US Census Bureau.
The average national opioid prescription rate was 3.86 claims per beneficiary that received a prescription for opioids (95% CI 3.86-3.86). At the county level, overall opioid prescription rates (p < 0.001, Coeff = 0.27), and especially those provided by emergency medicine (p < 0.001, Coeff = 0.21), family medicine physicians (p = 0.11, Coeff = 0.008), internal medicine (p = 0.018, Coeff = 0.1), and physician assistants (p = 0.021, Coeff = 0.08), were associated with opioid-related mortality. Demographic factors, such as the proportion of white (p < 0.001, Coeff = 0.22), black (p < 0.001, Coeff = -0.19), and male population (p < 0.001, Coeff = 0.13), were associated with opioid prescription rates, while poverty (p < 0.001, Coeff = 0.41) and the proportion of white population (p < 0.001, Coeff = 0.27) were risk factors for opioid-related mortality (p < 0.001, R² = 0.35). Notably, the impact of prescribers in the upper quartile was associated with opioid mortality (p < 0.001, Coeff = 0.14) and was twice that of the remaining 75% of prescribers together (p < 0.001, Coeff = 0.07) (p = 0.03, R² = 0.03).
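The county-level associations reported above come from multivariable linear regression. A minimal sketch of that kind of analysis, using synthetic data and invented variable names (not the study's actual dataset or model specification), might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
n_counties = 500

# Synthetic county-level predictors (illustrative only, not the study's data)
prescription_rate = rng.normal(3.86, 0.5, n_counties)   # claims per beneficiary
poverty = rng.normal(15.0, 4.0, n_counties)             # % below poverty line
pct_white = rng.normal(75.0, 10.0, n_counties)          # % white population

# Synthetic outcome: opioid-related mortality, generated with known coefficients
mortality = (0.27 * prescription_rate + 0.41 * poverty
             + 0.027 * pct_white + rng.normal(0, 1, n_counties))

# Ordinary least squares fit: intercept plus three predictors
X = np.column_stack([np.ones(n_counties), prescription_rate, poverty, pct_white])
beta, *_ = np.linalg.lstsq(X, mortality, rcond=None)

# R^2: share of outcome variance explained by the model
pred = X @ beta
r2 = 1 - np.sum((mortality - pred) ** 2) / np.sum((mortality - mortality.mean()) ** 2)
print(beta, r2)
```

The fitted coefficients recover the generating values, mirroring how the study's Coeff and R² figures summarise the strength of each county-level association.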
The opioid prescription rate, and especially that of certain categories of prescribers, correlated with opioid-related mortality. Interventions should prioritize providers who have a disproportionate impact and those who care for populations with socioeconomic factors that place them at higher risk.
Apache Mesos is a resource management system for large data centres, initially developed by UC Berkeley and now maintained under the Apache Foundation umbrella. It is widely used in industry by companies like Apple, Twitter, and Airbnb, and it is known to scale to tens of thousands of nodes. Together with other tools of its ecosystem, such as Mesosphere Marathon or Metronome, it provides an end-to-end solution for datacenter operations and a unified way to exploit large distributed systems. We present the experience of the ALICE Experiment Offline & Computing in deploying and using the Apache Mesos ecosystem in production for a variety of tasks on a small 500-core cluster, using hybrid OpenStack and bare-metal resources. We will initially introduce the architecture of our setup and its operation, and we will then describe the tasks it performs, including release building and QA, release validation, and simple Monte Carlo production. We will show how we developed Mesos-enabled components (called "Mesos Frameworks") to address ALICE-specific needs. In particular, we will illustrate our effort to integrate Work Queue, a lightweight batch processing engine developed by the University of Notre Dame, which ALICE uses to orchestrate release validation. Finally, we will give an outlook on how to use Mesos as a resource manager for DDS, a software deployment system developed by GSI which will be the foundation of system deployment for ALICE's next-generation Online-Offline (O2) framework.
The physico-chemical properties of three grafted pullulans (P) having linked poly(3-acrylamidopropyl)trimethylammonium chloride (pAPTAC) as side chains (P-g-pAPTAC1, P-g-pAPTAC2 and P-g-pAPTAC3, with 22.53, 29.05, and 34.51 wt.% pAPTAC content in the polymer, respectively) and possessing polyelectrolyte character were determined by light scattering analysis. All grafted pullulan aqueous solutions were tested in the presence of 0.5 M NaCl, KCl, NaNO3 or KNO3. The largest associations were recorded in 0.5 M NaCl aqueous solutions for P-g-pAPTAC1, P-g-pAPTAC2 and P-g-pAPTAC3, according to the maximum Rg values extracted from MALLS (multiangle laser light scattering) measurements. Also, the dominant conformation of these polyelectrolytes in salt solution was random coil, as Debye plot analysis revealed. Antibacterial activity was tested by the Kirby–Bauer diffusion method, and all grafted pullulans dissolved in 0.5 M NaCl aqueous solution developed an inhibition zone against Staphylococcus aureus (ATCC 25923).
The main purpose of this study is to analyze human evacuation in the case of a fire drill by two different means: experimental research, a "real fire drill", and numerical simulation, a "virtual fire drill". The analyzed building can house a large number of people, and its inner atrium can facilitate the spread of smoke and hot gases in the case of a real fire. The evacuation drill was performed with about 50 students located in two amphitheaters on the 2nd floor of the building. They could evacuate through two closed staircases to the ground floor of the building and then outside, through two exit doors. The second purpose of this study is to establish, for the given case, whether the computer software used can simulate the crowd movement and compute travel times in accordance with the experimental research. Another important question is whether the computer software FDS+Evac, which uses CFD (Computational Fluid Dynamics) to simulate crowd movement, can correctly identify the shortest escape path, because one of the staircases, with its corresponding ground-floor exit, is closer to the two analyzed amphitheaters. Following the numerical analysis, it was concluded that, for the considered scenario, the differences between the "real fire drill" and the "virtual fire drill" are acceptable from an engineering point of view. FDS+Evac can simulate human movement in the case of a fire drill with a high degree of credibility.
This paper aims to determine the influence that water jet cutting parameters have on dimensional accuracy. The input parameters of this study were the workpiece material (a 19 mm thick S235JR steel alloy plate), a medium to high cutting pressure (2000, 2500, and 3000 bar), a variable standoff distance (1, 2, and 3 mm), and a programmed cut quality from Q1 to Q5. A statistical interpretation of the results was conducted, and the experimental plan was obtained using the Response Surface Method. The samples were analysed for entrance and exit width of cut and kerf angle. A total of 45 samples were measured, and the results were interpreted using an ANOVA analysis for statistical significance (p-value) and fit statistics (R²). The results show which input parameter values are optimal for rapid precision cutting.
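The significance testing described above can be illustrated with a minimal one-way ANOVA on synthetic kerf-width measurements grouped by cutting pressure. The values below are invented for illustration; the study's actual design is a multi-factor Response Surface plan, not this single-factor sketch.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)

# Synthetic kerf-width measurements (mm) at the three pressure levels;
# a deliberate mean shift makes the pressure factor significant.
kerf_2000 = rng.normal(1.10, 0.02, 15)
kerf_2500 = rng.normal(1.05, 0.02, 15)
kerf_3000 = rng.normal(1.00, 0.02, 15)

# One-way ANOVA: does mean kerf width differ across pressure levels?
f_stat, p_value = f_oneway(kerf_2000, kerf_2500, kerf_3000)
print(f"F = {f_stat:.1f}, p = {p_value:.2g}")
```

A small p-value here, as in the paper's ANOVA tables, indicates that the factor (cutting pressure) has a statistically significant effect on the measured dimension.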
Malignancies are generally considered a risk factor for deep vein thrombosis and may hamper the recanalisation of thrombosed veins.
We investigate whether the natural course and response to anticoagulant treatment of bland portal vein thrombosis (PVT) in patients with cirrhosis complicated by hepatocellular carcinoma (HCC) differ from those in patients without HCC.
This was a retrospective study in two hepatology referral centres, in Italy and Romania, in which patients with a diagnosis of PVT on cirrhosis and follow-up of at least 3 months with repeated imaging were included.
A total of 162 patients with PVT matching the inclusion and exclusion criteria were identified: 30 with HCC were compared to 132 without HCC. Etiologies, Child-Pugh scores (7 vs 7) and MELD scores (11 vs 12, p=0.3679) did not differ. Anticoagulation was administered to 43% of HCC vs 42% of non-HCC patients. The extension of PVT into the main portal trunk was similar: partial/total involvement was 73.3%/6.7% in HCC vs 67.4%/6.1% in non-HCC patients, p=0.760. The remainder had intrahepatic PVT. Among anticoagulated patients, the recanalisation rate was 61.5% in HCC vs 60.7% in non-HCC patients (p=1). Overall PVT recanalisation, including treated and untreated patients, was observed in 30% of HCC vs 37.9% of non-HCC patients, p=0.530. Major bleeding incidence was almost identical (3.3% vs 3.8%, p=1). Progression of PVT after stopping anticoagulation did not differ (10% vs 15.9% in HCC vs non-HCC patients, p=0.109).
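Comparisons of proportions like the recanalisation rates above are typically tested with Fisher's exact test on a 2×2 contingency table. The counts below are approximate reconstructions from the reported percentages, for illustration only, and may not match the study's exact numbers.

```python
from scipy.stats import fisher_exact

# Approximate counts reconstructed from the reported rates: of the
# anticoagulated patients, roughly 8/13 HCC and 34/56 non-HCC recanalised.
# Illustrative only -- not the study's exact contingency table.
table = [[8, 5],    # HCC: recanalised / not recanalised
         [34, 22]]  # non-HCC: recanalised / not recanalised
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.2f}")
```

With nearly identical proportions (61.5% vs 60.7%), the test returns a p-value near 1, matching the abstract's p=1 for this comparison.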
The course of bland non-malignant PVT in cirrhosis is not affected by the presence of active HCC. Treatment with anticoagulation in patients with active HCC is safe and as effective as in non-HCC patients; this can potentially allow the use of otherwise contraindicated therapies (i.e. TACE) if complete recanalisation is achieved with anticoagulation.
Polydimethylsiloxane-α,ω-diol was used as the matrix for the preparation of polysiloxane-SiO2-TiO2 composites through in situ incorporation of silica and titania using a solvent-free sol-gel procedure. For this purpose, the oxide precursors tetraethyl orthosilicate and tetrabutyl orthotitanate, and a proper condensation catalyst, viz. dibutyltin dilaurate, were added in pre-established amounts to the polymer. The hydrolysis and condensation reactions take place under mild conditions, with the formation of silicon and titanium oxide networks and polymer crosslinking. The effect of the SiO2 to TiO2 mass ratio on the morphology of the composites was investigated by scanning electron microscopy (SEM) and X-ray diffraction (XRD), and interpreted in correlation with differential scanning calorimetry (DSC) and energy-dispersive X-ray spectroscopy (EDX) data. The film samples were tested as active elements in actuation devices.
Segmented poly(ester‐urethanes) containing hard and soft segments were obtained from aromatic diisocyanates, with thiodiglycol or diethylene glycol as chain extenders, and poly(ethylene glycol) adipate, using a multistep polyaddition process. Transition temperatures determined by differential scanning calorimetry and thermo‐optical analysis were employed to characterize the polyurethane materials. Changes in the conformation of these polyurethanes were also analyzed by viscometric measurements in N,N‐dimethylformamide. The obtained data revealed that the thermal curves are influenced by the soft- and hard-segment structures in the temperature range studied.
The increase in the scale of LHC computing during Run 3 and Run 4 (HL-LHC) will certainly require radical changes to the computing models and data processing of the LHC experiments. The working group established by WLCG and the HEP Software Foundation to investigate all aspects of the cost of computing, and how to optimise them, has continued producing results and improving our understanding of this process. In particular, experiments have developed more sophisticated ways to calculate their resource needs, and we have a much more detailed process to calculate infrastructure costs. This includes studies on the impact of HPC- and GPU-based resources on meeting the computing demands. We have also developed and perfected tools to quantitatively study the performance of experiment workloads, and we are actively collaborating with other activities related to data access, benchmarking, and technology cost evolution. In this contribution we present our recent developments and results and outline the directions of future work.