•Presence of antibiotic-resistant bacteria in the wastewater treatment system is reviewed.
•Presence of antibiotic resistance genes in the environment is highlighted.
•Use of various wetland systems to reduce EOCs is shown.
•Combination of wastewater treatment systems with wetland systems is effective.
Emerging organic contaminants (EOCs) include a diverse group of chemical compounds, such as pharmaceuticals and personal care products (PPCPs), pesticides, hormones, surfactants, flame retardants and plasticizers. Many of these compounds are not significantly removed in conventional wastewater treatment plants and are discharged to the environment, presenting an increasing threat to both humans and natural ecosystems. Recently, antibiotics have received considerable attention due to growing microbial antibiotic resistance in the environment. Constructed wetlands (CWs) have proven effective in removing many EOCs, including different antibiotics, before discharge of treated wastewater into the environment. Wastewater treatment systems that couple conventional treatment plants with constructed and natural wetlands offer a strategy to remove EOCs and reduce antibiotic resistant bacteria (ARB) and antibiotic resistance genes (ARGs) far more efficiently than conventional treatment alone. This review presents an overview of the current knowledge on the efficiency of different wetland systems in reducing EOCs and antibiotic resistance.
There is a growing body of evidence that points to an important role for modification of lifestyle factors and promotion of health-related quality of life in the secondary prevention of disease progression in multiple sclerosis (D'Hooghe et al., 2010; Weiland et al., 2014; Hadgkiss et al., 2015). As a clinical psychologist diagnosed with multiple sclerosis in 2012, I have gained a unique insight into ways in which people living with MS and clinicians can usefully integrate evidence-based lifestyle modifications that enhance self-efficacy and self-management to improve wider psychological and physical health. The framework presented here enables clinicians to engage in salutogenic health promotion by placing value upon the importance of healthy, evidence-based behavior change. Furthermore, the framework provides a structure which can empower and provide guidance for people living with MS on what and how to implement and sustain behavior change and emotional wellbeing in the face of this life-changing diagnosis.
Introduction
Despite recommendations that general practitioners (GPs) delay antibiotic prescribing for respiratory tract infections (RTIs), antibiotic prescriptions in primary care in England increased by 4.1% from 2010 to 2013. C-reactive protein (CRP) point-of-care tests (POCT), for example, the Afinion™ Analyzer (Alere Ltd, Stockport, UK) device, are widely used in several countries in the European Union. Studies suggest that CRP POCT use, either alone or in combination with communication training, reduces antibiotic prescribing and improves quality of life for patients presenting with RTI symptoms. The aim of this study is to evaluate the cost-effectiveness of CRP POCT for RTIs in primary care in England over 3 years for three different strategies of care compared to standard practice.
Methods
An economic evaluation was carried out to compare the costs and benefits of three different strategies of CRP testing (GP plus CRP; practice nurse plus CRP; and GP plus CRP and communication training) for patients with RTI symptoms as defined by National Institute for Health and Care Excellence guideline CG69, compared with current standard GP practice without CRP testing. Analysis consisted of a decision tree and Markov model to describe the quality-adjusted life years (QALYs) and cost per 100 patients, together with the number of antibiotic prescriptions and RTIs for each group.
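The abstract does not report the model's parameters, so the sketch below is only a loose illustration of how a Markov cohort model of this kind accumulates costs and QALYs over a fixed horizon. The states, transition probabilities, costs, and utilities are entirely hypothetical, not those of the study.

```python
import numpy as np

# Minimal Markov cohort model sketch (hypothetical parameters, not the
# study's actual model): states are Well, RTI, Dead.
states = ["Well", "RTI", "Dead"]
# Annual transition probabilities (each row sums to 1) -- illustrative only.
P = np.array([
    [0.90, 0.09, 0.01],   # from Well
    [0.70, 0.28, 0.02],   # from RTI
    [0.00, 0.00, 1.00],   # Dead is absorbing
])
cost_per_state = np.array([50.0, 300.0, 0.0])   # annual cost (GBP), illustrative
qaly_per_state = np.array([0.95, 0.70, 0.0])    # annual utility, illustrative

cohort = np.array([100.0, 0.0, 0.0])  # start 100 patients in the Well state
total_cost = total_qaly = 0.0
for year in range(3):                 # 3-year horizon, as in the study
    cohort = cohort @ P               # advance the cohort one cycle
    total_cost += cohort @ cost_per_state
    total_qaly += cohort @ qaly_per_state

print(round(total_cost, 1), round(total_qaly, 1))
```

In a full evaluation these per-100-patient totals would be computed for each strategy arm and compared, which is what the decision tree plus Markov structure described above does.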
Results
Compared with current standard practice, the GP plus CRP and practice nurse plus CRP test strategies result in increased QALYs and reduced costs, while the GP plus CRP testing and communication training strategy is associated with increased costs and reduced QALYs. Additionally, all three CRP arms led to fewer antibiotic prescriptions and infections over 3 years.
Conclusion
The additional cost per patient of the CRP test is outweighed by the associated cost savings and QALY increment associated with a reduction in infections in the long term.
Recruitment and retention challenges are very common in mental health randomised trials. Investigators utilise different methods to improve recruitment or retention. However, evidence of the effectiveness and efficiency of these strategies in mental health has not been synthesised. This systematic review aims to investigate and assess the effectiveness and cost-effectiveness of different strategies to improve recruitment and retention in mental health randomised trials.
MEDLINE, EMBASE, the Cochrane Methodology Register and PsycINFO were searched from beginning of record up to July 2016. Randomised trials involving participants with mental health problems which compared different strategies for recruitment or retention were selected. Two authors independently screened identified studies for eligibility.
A total of 5,157 citations were identified. Thirteen articles were included, 11 on recruitment and 2 on retention. Three randomised controlled trials compared different recruitment strategies, none of which found statistically significant differences between the interventional recruitment strategies and the routine recruitment methods. Retrospective comparisons of recruitment methods showed that non-web-based advertisement and recruitment by clinical research staff each have advantages in efficiency. Web-based adverts had the lowest cost per person recruited (£13.41 per person recruited). Specialised care referral cost £183.24 per person, non-web-based adverts cost £372.03 per patient and recruitment via primary care cost £407.65 for each patient. Financial incentives, abridged questionnaires and pre-notification had a positive effect on retention rates.
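The per-person costs above lend themselves to a quick budget comparison. As a minimal sketch, the snippet below uses the per-person costs reported in the review; the £5,000 budget figure is hypothetical.

```python
# Cost per person recruited, as reported in the review (GBP).
cost_per_recruit = {
    "web-based adverts": 13.41,
    "specialised care referral": 183.24,
    "non-web-based adverts": 372.03,
    "primary care": 407.65,
}

budget = 5000.0  # hypothetical recruitment budget
# Expected recruits per channel if the whole budget went to that channel.
expected = {ch: int(budget // c) for ch, c in cost_per_recruit.items()}
print(expected["web-based adverts"])  # 5000 // 13.41 -> 372
```

As the review notes, raw cost-per-recruit comparisons like this ignore differences in the populations each channel reaches, so they are a starting point rather than a decision rule.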
The recruitment studies included showed differences in strategies, clinical settings, mental health conditions and study design. It is difficult to assess the overall effectiveness of any particular recruitment strategy as some strategies that worked well for a particular population may not work as well for others. Paying attention to the accessibility of information and consent materials may help improve recruitment. More research in this area is needed given its important implications.
This case study discusses the inception and continued delivery of 10-minute micro research skills sessions within two entrepreneurship modules at Coventry University London. The case study starts with an explanation of how and why the project was developed. Its rationale was underpinned by current bite-sized learning research and by established psychological and neural evidence. This paper describes how these practices are used in the workplace to promote continuous professional development and disseminate company information for training purposes. Discussing both the delivery and skills content, this paper explains the methods used by the Information and Skills Development Specialist (ISDS) in each 10-minute session to engage students and embed database searching skills into their routine study practices. It also explains how this practice has been adopted by students and how the skills have been embedded to enhance their final business pitches at the end of their modules.
Previous prospective cohort studies have shown that angiogenic factors have a high diagnostic accuracy in women with suspected pre-eclampsia, but we remain uncertain of the effectiveness of these tests in a real-world setting. We therefore aimed to determine whether knowledge of the circulating concentration of placental growth factor (PlGF), an angiogenic factor, integrated with a clinical management algorithm, decreased the time for clinicians to make a diagnosis in women with suspected pre-eclampsia, and whether this approach reduced subsequent maternal or perinatal adverse outcomes.
We did a multicentre, pragmatic, stepped-wedge cluster-randomised controlled trial in 11 maternity units in the UK, which were each responsible for 3000–9000 deliveries per year. Women aged 18 years and older who presented with suspected pre-eclampsia between 20 weeks and 0 days of gestation and 36 weeks and 6 days of gestation, with a live, singleton fetus were invited to participate by the clinical research team. Suspected pre-eclampsia was defined as new-onset or worsening of existing hypertension, dipstick proteinuria, epigastric or right upper-quadrant pain, headache with visual disturbances, fetal growth restriction, or abnormal maternal blood tests that were suggestive of disease (such as thrombocytopenia or hepatic or renal dysfunction). Women were approached individually, they consented for study inclusion, and they were asked to give blood samples. We randomly allocated the maternity units, representing the clusters, to blocks. Blocks represented an intervention initiation time, which occurred at equally spaced 6-week intervals throughout the trial. At the start of the trial, all units had usual care (in which PlGF measurements were also taken but were concealed from clinicians and women). At the initiation time of each successive block, a site began to use the intervention (in which the circulating PlGF measurement was revealed and a clinical management algorithm was used). Enrolment of women continued for the duration of the blocks either to concealed PlGF testing, or after implementation, to revealed PlGF testing. The primary outcome was the time from presentation with suspected pre-eclampsia to documented pre-eclampsia in women enrolled in the trial who received a diagnosis of pre-eclampsia by their treating clinicians. This trial is registered with ISRCTN, number 16842031.
Between June 13, 2016, and Oct 27, 2017, we enrolled and assessed 1035 women with suspected pre-eclampsia. 12 (1%) women were found to be ineligible. Of the 1023 eligible women, 576 (56%) women were assigned to the intervention (revealed testing) group, and 447 (44%) women were assigned to receive usual care with additional concealed testing (concealed testing group). Three (1%) women in the revealed testing group were lost to follow-up, so 573 (99%) women in this group were included in the analyses. One (<1%) woman in the concealed testing group withdrew consent to follow-up data collection, so 446 (>99%) women in this group were included in the analyses. The median time to pre-eclampsia diagnosis was 4·1 days with concealed testing versus 1·9 days with revealed testing (time ratio 0·36, 95% CI 0·15–0·87; p=0·027). Maternal severe adverse outcomes were reported in 24 (5%) of 447 women in the concealed testing group versus 22 (4%) of 573 women in the revealed testing group (adjusted odds ratio 0·32, 95% CI 0·11–0·96; p=0·043), but there was no evidence of a difference in perinatal adverse outcomes (15% vs 14%, 1·45, 0·73–2·90) or gestation at delivery (36·6 weeks vs 36·8 weeks; mean difference −0·52, 95% CI −0·63 to 0·73).
We found that the availability of PlGF test results substantially reduced the time to clinical confirmation of pre-eclampsia. Where PlGF was implemented, we found a lower incidence of maternal adverse outcomes, consistent with adoption of targeted, enhanced surveillance, as recommended in the clinical management algorithm for clinicians. Adoption of PlGF testing in women with suspected pre-eclampsia is supported by the results of this study.
National Institute for Health Research.
People with dementia living in care homes often experience clinically significant agitation; however, little is known about its economic impact.
To calculate the cost of agitation in people with dementia living in care homes.
We used the baseline data from 1,424 residents with dementia living in care homes (part of the Managing Agitation and Raising QUality of lifE in dementia (MARQUE) study) that had Cohen-Mansfield Agitation Inventory (CMAI) scores recorded. We investigated the relationship between residents' health and social care costs and severity of agitation based on the CMAI total score. In addition, we assessed resource utilisation and compared costs of residents with and without clinically significant symptoms of agitation using the CMAI, over and above the cost of the care home.
Agitation defined by the CMAI was a significant predictor of costs. On average, a one-point increase in the CMAI total score was associated with a 0.5% increase in annual costs (cost ratio 1.005, 95% CI 1.001 to 1.010). The excess annual cost associated with agitation per resident with dementia was £1,125.35. This suggests that, on average, agitation accounts for 44% of the annual health and social care costs of dementia in people living in care homes.
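A cost ratio of this kind scales costs multiplicatively: a k-point rise in the CMAI multiplies annual costs by 1.005 to the power k. The sketch below shows the arithmetic; the 30-point rise used as an example is hypothetical, not a figure from the study.

```python
# The reported cost ratio of 1.005 per CMAI point implies multiplicative
# scaling of annual costs: a k-point increase multiplies costs by 1.005**k.
cost_ratio_per_point = 1.005

def cost_multiplier(points: float) -> float:
    """Multiplicative effect on annual costs of a `points` rise in CMAI."""
    return cost_ratio_per_point ** points

# Illustrative: a hypothetical 30-point rise in CMAI raises annual costs
# by roughly 16%.
print(round(cost_multiplier(30), 3))  # -> 1.161
```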
Agitation in people with dementia living in care homes contributes significantly to overall costs, which increase as the level of agitation increases. Residents with the highest levels of agitation cost nearly twice as much as those with the lowest levels, suggesting that effective strategies to reduce agitation are likely to be cost-effective in this setting.
Trastuzumab improves survival in HER2+ breast cancer patients, with some evidence of adverse cardiac side effects. Current recommendations are to give adjuvant trastuzumab for one year or until recurrence, although trastuzumab treatment for only 9 or 10 weeks has shown similar survival rates to 12-month treatment. We present here a multi-arm joint analysis examining the relative cost-effectiveness of different durations of adjuvant trastuzumab.
Network meta-analysis (NMA) was used to examine which trials' data to include in the cost-effectiveness analysis (CEA). A network using FinHer (9 weeks vs. zero) and BCIRG006 (12 months vs. zero) trials offered the only jointly randomisable network so these trials were used in the CEA. The 3-arm CEA compared costs and quality-adjusted life-years (QALYs) associated with zero, 9-week and 12-month adjuvant trastuzumab durations in early breast cancer, using a decision tree followed by a Markov model that extrapolated the results to a lifetime time horizon. Pairwise incremental cost-effectiveness ratios (ICERs) were also calculated for each pair of regimens and used in budget impact analysis, and the Bucher method was used to check face validity of the findings. Addition of the PHARE trial (6 months vs. 12 months) to the network, in order to create a 4-arm CEA including the 6-month regimen, was not possible as late randomisation in this trial resulted in recruitment of a different patient population as evidenced by the NMA findings. The CEA results suggest that 9 weeks' trastuzumab is cost-saving and leads to more QALYs than 12 months', i.e. the former dominates the latter. The cost-effectiveness acceptability frontier (CEAF) favours zero trastuzumab at willingness-to-pay levels below £2,500/QALY and treatment for 9 weeks above this threshold. The combination of the NMA and Bucher investigations suggests that the 9-week duration is as efficacious as the 12-month duration for distant-disease-free survival and overall survival, and safer in terms of fewer adverse cardiac events.
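Pairwise ICERs and the dominance finding reported above follow the standard definitions: an ICER is the cost difference divided by the QALY difference, and a strategy "dominates" when it is both cheaper and at least as effective. The sketch below illustrates the arithmetic with hypothetical cost and QALY totals, which the abstract does not report.

```python
# Pairwise ICER with a dominance check (illustrative numbers only).
def icer(cost_a, qaly_a, cost_b, qaly_b):
    """ICER of strategy B vs strategy A; returns a dominance verdict when
    one option is both cheaper and more effective, else GBP per QALY."""
    d_cost, d_qaly = cost_b - cost_a, qaly_b - qaly_a
    if d_cost <= 0 and d_qaly >= 0 and (d_cost < 0 or d_qaly > 0):
        return "B dominates"
    if d_cost >= 0 and d_qaly <= 0 and (d_cost > 0 or d_qaly < 0):
        return "A dominates"
    return d_cost / d_qaly  # incremental cost per QALY gained

# Hypothetical totals: the 9-week regimen (B) cheaper and more effective
# than 12 months (A), so it dominates, mirroring the CEA finding.
print(icer(30000, 9.5, 12000, 9.6))  # -> "B dominates"
```

Where neither strategy dominates, the returned cost-per-QALY figure is compared against a willingness-to-pay threshold, as with the £2,500/QALY crossover on the CEAF described above.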
Our CEA results suggest that 9-week trastuzumab dominates 12-month trastuzumab in cost-effectiveness terms at conventional thresholds of willingness to pay for a QALY, and the 9-week regimen is also suggested to be as clinically effective as the 12-month regimen according to the NMA and Bucher analyses. This finding agrees with the results of the E2198 head-to-head study that compared 10 weeks' with 14 months' trastuzumab and found no significant difference. Appropriate trial design and reporting are critical if results are to be synthesisable with existing evidence, as selection bias can lead to recruitment of a different patient population from existing trials. Our analysis was not based on head-to-head trials' data, so the results should be viewed with caution. Short-duration trials would benefit from recruiting larger numbers of participants to reduce uncertainty in the synthesised results.
•Cost for assimilation wetlands averaged $0.60/gallon of treatment capacity.
•Secondary and tertiary treatment costs averaged $4.90 and $6.50/gallon, respectively.
•Wetland assimilation is less sensitive to cost increases than traditional systems.
In recent decades, water quality standards for wastewater treatment have become more stringent, increasing costs and energy required to reduce pollutants. Wetland assimilation is a low-cost and low-energy alternative to traditional tertiary wastewater treatment where secondarily treated and disinfected municipal effluent is discharged primarily into freshwater forested wetlands in coastal Louisiana. In this paper, costs per gallon of treatment capacity for conventional secondary and tertiary treatment were compared to those for assimilation wetlands. Cost analysis reports were used to determine costs per gallon of treatment capacity for conventional wastewater treatment facilities, including costs for conveyance between the collection system and the assimilation wetland site, and between the treatment and disposal sites if they could not be co-located. Capital and operation and maintenance costs were considered. Because all wastewater treatment plants are required to treat at least to secondary standards, costs for primary and secondary treatment were combined. If necessary, these costs were adjusted for inflation to 2017 dollars using an average inflation rate of 2.19 percent and a cumulative inflation rate of 50.84 percent. To determine costs per gallon of treatment capacity for assimilation wetlands, actual costs provided by the project engineer were used when available. To simulate the future costs of facility construction and compare the replacement costs of conventional secondary and tertiary wastewater treatment facilities and treatment wetlands in the context of energy prices, U.S. Bureau of Labor Statistics (BLS) data for the price index for inputs to construction were used, as were Energy Information Administration (EIA) data for the price of crude oil to model future wastewater treatment plant construction and operation costs. The cost for the Mandeville assimilation wetland included $1 million for the price of the land.
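The inflation adjustment described above can be done either by applying the cumulative rate directly or by compounding the average annual rate over the intervening years. The sketch below uses the two rates reported in the paper; the $1.00 of 1998 cost is a hypothetical example, and the choice of 1998 as base year is an assumption for illustration.

```python
# Adjusting a historical cost to 2017 dollars, per the rates in the paper.
cumulative_rate = 0.5084   # cumulative inflation to 2017, per the paper
avg_annual_rate = 0.0219   # average annual inflation, per the paper

def to_2017_dollars(cost: float, base_year: int) -> float:
    """Compound the average annual rate from base_year to 2017 (sketch)."""
    return cost * (1 + avg_annual_rate) ** (2017 - base_year)

nominal_1998 = 1.00  # hypothetical $1.00 of 1998 construction cost
print(round(nominal_1998 * (1 + cumulative_rate), 4))  # -> 1.5084
print(round(to_2017_dollars(nominal_1998, 1998), 2))   # ~1.51 over 19 years
```

The two approaches agree closely here because compounding 2.19 percent over 19 years yields roughly the 50.84 percent cumulative figure.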
Future costs of treatment facility construction and operation were modeled relative to average price of construction inputs between 1998 and 2015 using the projected price of crude oil. When treatment costs were compared among secondary, tertiary, and assimilation wetlands, mean cost for assimilation wetlands was $0.60 per gallon (>1 MGD capacity) compared to $4.90 and $6.50 per gallon for secondary and tertiary treatment, respectively. The lower total costs and energy requirements for assimilation wetlands result in lower variability in the price of construction and operation. Wetland assimilation is more economical than conventional wastewater treatment, especially compared to advanced secondary and tertiary treatment. It is likely that energy costs will increase significantly in coming decades. Because conventional secondary and tertiary treatment are energy intensive, increases in energy costs will significantly increase the costs of these treatment systems. Treatment systems that combine lower technology (e.g., oxidation ponds) secondary treatment with wetland assimilation are less likely to be impacted by rising energy costs than traditional wastewater treatment.
The growing and ageing prison population in England makes accurate cancer data of increasing importance for prison health policies. This study aimed to compare cancer incidence, treatment, and survival between patients diagnosed in prison and the general population.
In this population-based, matched cohort study, we used cancer registration data from the National Cancer Registration and Analysis Service in England to identify primary invasive cancers and cervical cancers in situ diagnosed in adults (aged ≥18 years) in the prison and general populations between Jan 1, 1998, and Dec 31, 2017. Ministry of Justice and Office for National Statistics population data for England were used to calculate age-standardised incidence rates (ASIR) per year and age-standardised incidence rate ratios (ASIRR) for the 20-year period. Patients diagnosed with primary invasive cancers (ie, excluding cervical cancers in situ) in prison between Jan 1, 2012, and Dec 31, 2017 were matched to individuals from the general population and linked to hospital and treatment datasets. Matching was done in a 1:5 ratio according to 5-year age group, gender, diagnosis year, cancer site, and disease stage. Our primary objectives were to compare the incidence of cancer (1998–2017); the receipt of treatment with curative intent (2012–17 matched cohort), using logistic regression adjusted for matching variables (excluding cancer site) and route to diagnosis; and overall survival following cancer diagnosis (2012–17 matched cohort), using a Cox proportional hazards model adjusted for matching variables (excluding cancer site) and route to diagnosis, with stratification for the receipt of any treatment with curative intent.
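Age-standardised incidence rates of the kind described above are computed by weighting age-specific rates by a standard population's age distribution. The sketch below shows the direct-standardisation arithmetic with hypothetical age bands, case counts, person-years, and weights; the real calculation would use the study's actual data and a reference standard population.

```python
# Direct age-standardisation sketch (all figures hypothetical).
age_bands    = ["18-39", "40-59", "60+"]
cases        = [10, 40, 80]          # incident cancers per band
person_years = [50_000, 40_000, 20_000]
std_weights  = [0.45, 0.35, 0.20]    # standard-population weights, sum to 1

# Age-specific rates per 100 000 person-years, then weight and sum.
rates = [c / py * 100_000 for c, py in zip(cases, person_years)]
asir = sum(w * r for w, r in zip(std_weights, rates))
print(round(asir, 1))  # -> 124.0 with these hypothetical inputs
```

An ASIRR, as used in the study, is then simply the ratio of two such standardised rates (for example, prison versus general population), which removes the effect of differing age structures from the comparison.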
We identified 2015 incident cancers among 1964 adults (1556 [77·2%] men and 459 [22·8%] women) in English prisons in the 20-year period up to Dec 31, 2017. The ASIR for cancer for men in prison was initially lower than for men in the general population (in 1998, ASIR 119·33 per 100 000 person-years [95% CI 48·59–219·16] vs 746·97 per 100 000 person-years [742·31–751·66]), but increased to a similar level towards the end of the study period (in 2017, 856·85 per 100 000 person-years [675·12–1060·44] vs 788·59 per 100 000 person-years [784·62–792·57]). For women, the invasive cancer incidence rate was low and so ASIR was not reported for this group. Over the 20-year period, the incidence of invasive cancer for men in prison increased (incidence rate ratio per year 1·05 [95% CI 1·04–1·06], during 1999–2017 compared with 1998). ASIRRs showed that over the 20-year period, overall cancer incidence was lower in men in prison than in men in the general population (ASIRR 0·76 [95% CI 0·73–0·80]). The difference was not statistically significant for women (ASIRR 0·83 [0·68–1·00]). Between Jan 1, 2012, and Dec 31, 2017, patients diagnosed in prison were less likely to undergo curative treatment than matched patients in the general population (274 [32·3%] of 847 patients vs 1728 [41·5%] of 4165; adjusted odds ratio [OR] 0·72 [95% CI 0·60–0·85]). Being diagnosed in prison was associated with a significantly increased risk of death on adjustment for matching variables (347 deaths during 2021·9 person-years in the prison cohort vs 1626 deaths during 10 944·2 person-years in the general population; adjusted HR 1·16 [95% CI 1·03–1·30]); this association was partly explained by stratification by curative treatment and further adjustment for diagnosis route (adjusted HR 1·05 [0·93–1·18]).
Cancer incidence increased in people in prisons in England between 1998 and 2017, with patients in prison less likely to receive curative treatments and having lower overall survival than the general population. The association with survival was partly explained by accounting for differences in receipt of curative treatment and adjustment for diagnosis route. Improved routine cancer surveillance is needed to inform prison cancer policies and decrease inequalities for this under-researched population.
UK National Institute for Health and Care Research, King's College London, and Strategic Priorities Fund 2019/20 of Research England via the University of Surrey.