In this article, we consider the problem of detecting multiple changepoints in large datasets. Our focus is on applications where the number of changepoints will increase as we collect more data: for example, in genetics as we analyze larger regions of the genome, or in finance as we observe time series over longer periods. We consider the common approach of detecting changepoints through minimizing a cost function over possible numbers and locations of changepoints. This includes several established procedures for detecting changepoints, such as penalized likelihood and minimum description length. We introduce a new method for finding the minimum of such cost functions, and hence the optimal number and location of changepoints, whose computational cost is, under mild conditions, linear in the number of observations. This compares favorably with existing methods for the same problem, whose computational cost can be quadratic or even cubic. In simulation studies, we show that our new method can be orders of magnitude faster than these alternative exact methods. We also compare with the binary segmentation algorithm for identifying changepoints, showing that the exactness of our approach can lead to substantial improvements in the accuracy of the inferred segmentation of the data. This article has supplementary materials available online.
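The penalized-cost minimization described in this abstract can be sketched as a dynamic-programming recursion over candidate last-changepoint positions, with pruning of candidates that can never again be optimal. The sketch below handles the change-in-mean case with a Gaussian cost and a fixed penalty `beta`; the function name and cost choice are ours for illustration, not the article's implementation.

```python
import numpy as np

def pelt_mean_change(y, beta):
    """Exact penalized-cost segmentation for changes in mean
    (Gaussian cost), with pruning of the candidate set."""
    n = len(y)
    # prefix sums give O(1) cost of fitting one mean to y[s:t]
    s1 = np.concatenate(([0.0], np.cumsum(y)))
    s2 = np.concatenate(([0.0], np.cumsum(y ** 2)))

    def seg_cost(s, t):  # residual sum of squares of segment y[s:t], t > s
        return s2[t] - s2[s] - (s1[t] - s1[s]) ** 2 / (t - s)

    F = np.full(n + 1, np.inf)    # F[t] = optimal penalized cost of y[:t]
    F[0] = -beta                  # so each changepoint incurs one penalty
    last = np.zeros(n + 1, dtype=int)
    cands = [0]                   # candidate positions of the last changepoint
    for t in range(1, n + 1):
        vals = [F[s] + seg_cost(s, t) + beta for s in cands]
        best = int(np.argmin(vals))
        F[t] = vals[best]
        last[t] = cands[best]
        # prune candidates that can never be optimal at any future t
        cands = [s for s, v in zip(cands, vals) if v - beta <= F[t]] + [t]
    # backtrack the optimal changepoint locations
    cps, t = [], n
    while t > 0:
        t = last[t]
        if t > 0:
            cps.append(t)
    return sorted(cps)
```

On data with a single clear mean shift, e.g. `pelt_mean_change(np.array([0.0]*20 + [5.0]*20), 10.0)`, the recursion recovers the changepoint at position 20; the pruning step is what keeps the expected cost linear in the number of observations.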
One of the largest sources of uncertainty in estimates of global temperature change is that associated with the correction of systematic errors in sea surface temperature (SST) measurements. Despite recent work to quantify and reduce these errors throughout the historical record, differences between analyses remain larger than can be explained by the estimated uncertainties. We revisited the method used to estimate systematic errors and their uncertainties in version 3 of the Met Office Hadley Centre SST data set, HadSST. Using comparisons with oceanographic temperature profiles, we make estimates of biases associated with engine room measurements and insulated buckets and constrain the ranges of two of the more uncertain parameters in the bias estimation: the timing of the transition from uninsulated to insulated buckets in the middle twentieth century and the estimated fractions of different measurement methods used. Here, we present HadSST.4.0.0.0, based on release 3.0.0 and 3.0.1 of the International Comprehensive Ocean‐Atmosphere Data Set supplemented by drifting buoy measurements from the Copernicus Marine Environmental Monitoring Service. HadSST.4.0.0.0 comprises a 200‐member “ensemble” in which uncertain parameters in the SST bias scheme are varied to generate a range of adjustments. The evolution of global average SST in the new data set is similar to that in other SST data sets, and the difference between data sets is reduced during the middle twentieth century. However, the changes also highlight a discrepancy in the global‐average difference between adjusted SST and marine air temperature in the early 1990s, and hence between HadSST.4.0.0.0 and the National Oceanic and Atmospheric Administration SST data set, ERSSTv5.
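The ensemble construction described here, varying uncertain bias-scheme parameters to generate a range of adjustments, can be illustrated with a toy Monte Carlo sketch. The two-parameter bias model, the parameter ranges, and the 1940–1960 window below are all invented for illustration; this is not the HadSST.4.0.0.0 bias scheme.

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1940, 1961)  # illustrative mid-20th-century window

def bucket_adjustment(years, t_switch, cold_bias):
    """Toy adjustment: an uninsulated-bucket cold bias is corrected
    before the (uncertain) transition year, and zero afterwards."""
    return np.where(years < t_switch, cold_bias, 0.0)

# 200-member ensemble: draw the two uncertain parameters from assumed ranges
ensemble = np.stack([
    bucket_adjustment(
        years,
        t_switch=rng.integers(1945, 1956),  # transition timing (assumed range)
        cold_bias=rng.uniform(0.2, 0.4),    # bucket cold bias in K (assumed range)
    )
    for _ in range(200)
])

# the ensemble spread quantifies parametric uncertainty in the adjustments
spread = ensemble.std(axis=0)
```

The spread is largest where the members disagree, i.e. in years that fall before the transition in some members and after it in others; constraining the parameter ranges (as done here with oceanographic profiles) directly narrows that spread.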
Key Points
We describe the construction of HadSST.4.0.0.0, a climate data set of sea surface temperature change from 1850 to 2018
A range of bias adjustments was generated to create an ensemble of SST data sets with the ensemble spread partly constrained by oceanographic profile measurements
New estimates reduce discrepancy between data sets during the middle twentieth century and the recent slowdown in warming but highlight a divergence in the early 1990s
We present a new version of the Met Office Hadley Centre/Climatic Research Unit global surface temperature data set, HadCRUT5. HadCRUT5 presents monthly average near‐surface temperature anomalies, relative to the 1961–1990 period, on a regular 5° latitude by 5° longitude grid from 1850 to 2018. HadCRUT5 is a combination of sea‐surface temperature (SST) measurements over the ocean from ships and buoys and near‐surface air temperature measurements from weather stations over the land surface. These data have been sourced from updated compilations and the adjustments applied to mitigate the impact of changes in SST measurement methods have been revised. Two variants of HadCRUT5 have been produced for use in different applications. The first represents temperature anomaly data on a grid for locations where measurement data are available. The second, more spatially complete, variant uses a Gaussian process based statistical method to make better use of the available observations, extending temperature anomaly estimates into regions for which the underlying measurements are informative. Each is provided as a 200‐member ensemble accompanied by additional uncertainty information. The combination of revised input data sets and statistical analysis results in greater warming of the global average over the course of the whole record. In recent years, increased warming results from an improved representation of Arctic warming and a better understanding of evolving biases in SST measurements from ships. These updates result in greater consistency with other independent global surface temperature data sets, despite their different approaches to data set construction, and further increase confidence in our understanding of changes seen.
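The Gaussian-process-based extension of anomaly estimates into data-sparse regions can be sketched in one dimension: condition a GP on sparse observations and read off the posterior mean and variance everywhere. The squared-exponential kernel, length scale, and noise level below are assumptions for illustration, not the HadCRUT5 statistical model.

```python
import numpy as np

def gp_infill(x_obs, y_obs, x_all, length_scale=10.0, noise=0.1):
    """Gaussian-process (kriging) estimate of anomalies at all locations
    from sparse observations, with a squared-exponential covariance."""
    def k(a, b):
        d = a[:, None] - b[None, :]
        return np.exp(-0.5 * (d / length_scale) ** 2)

    K = k(x_obs, x_obs) + noise ** 2 * np.eye(len(x_obs))
    Ks = k(x_all, x_obs)
    mean = Ks @ np.linalg.solve(K, y_obs)
    # pointwise posterior variance: prior variance minus explained variance
    var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return mean, var

# illustrative 1-D "grid" with three observed locations
x_obs = np.array([0.0, 20.0, 60.0])
y_obs = np.array([0.5, 0.7, 0.2])
x_all = np.arange(0.0, 80.0, 5.0)
mean, var = gp_infill(x_obs, y_obs, x_all)
```

Estimates are extended only where the observations are informative: the posterior variance grows with distance from the nearest observation, which is how such an analysis distinguishes constrained from unconstrained regions.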
Plain Language Summary
We have produced a new version of a data set that measures changes of near‐surface temperature across the globe from 1850 to 2018, called HadCRUT5. We have included an improved data set of sea‐surface temperature, which better accounts for the effects of changes through time in how measurements were made from ships and buoys at sea. We have also included an expanded compilation of measurements made at weather stations on land. There are two variants of HadCRUT5, produced for different uses. The first, the “HadCRUT5 noninfilled data set,” maps temperature changes on a grid for locations close to where we have measurements. The second, the “HadCRUT5 analysis,” extends our estimates to locations further from the available measurements using a statistical technique that makes use of the spatial connectedness of temperature patterns. This improves the representation of less well observed regions in estimates of global, hemispheric and regional temperature change. Together, these updates and improvements reveal a slightly greater rise in near‐surface temperature since the nineteenth century, especially in the Northern Hemisphere, which is more consistent with other data sets. This increases our confidence in our understanding of global surface temperature changes since the mid‐19th century.
Key Points
We have created a new version of the Met Office Hadley Centre and Climatic Research Unit global surface temperature data set for 1850–2018
The new data set better represents sparsely observed regions of the globe and incorporates an improved sea‐surface temperature data set
This data set shows increased global average warming since the mid‐19th century and in recent years, consistent with other analyses
Observational estimates of global ocean heat content (OHC) change are used to assess Earth's energy imbalance over the 20th Century. However, intercomparison studies show that the mapping methods used to interpolate sparse ocean temperature profile data are a key source of uncertainty. We present a new approach to assessing OHC mapping methods using 'synthetic profiles' generated from a state-of-the-art global climate model simulation. Synthetic profiles have the same sampling characteristics as the historical ocean temperature profile data but are based on model simulation data. Mapping methods ingest these data in the same way as they would real observations, but the resultant mapped fields can be compared to a model simulation 'truth'. We use this approach to assess two mapping methods that are used routinely for climate monitoring and initialisation of decadal forecasts. The introduction of the Argo network of autonomous profiling floats during the 2000s drives clear improvements in the ability of these methods to reconstruct the variability and spatial structure of OHC changes. At depths below 2000 m, both methods underestimate the magnitude of the simulated ocean warming signal. Temporal variability and trends in OHC are better captured in the better-observed northern hemisphere than in the southern hemisphere. At all depths, the sampling characteristics of the historical data introduce some spurious variability in the estimates of global OHC on sub-annual to multi-annual timescales. However, many of the large scale spatial anomalies, especially in the upper ocean, are successfully reconstructed even with sparse observations from the 1960s, demonstrating the potential to construct historical ocean analyses for assessing decadal predictions. The value of using accurate global covariances for data-poor periods is clearly seen.
The results of this 'proof-of-concept' study are encouraging for gaining further insights into the capabilities and limitations of different mapping methods and for quantifying uncertainty in global OHC estimates.
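The synthetic-profile evaluation loop, sample a model 'truth' only where historical observations exist, let a mapping method reconstruct the full field, then score the reconstruction against the known truth, can be sketched as follows. The 1-D field, the nearest-neighbour "mapping method", and the noise level are stand-ins for illustration, not the operational methods assessed in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
# model 'truth': a temperature anomaly field on a 1-D grid of 100 columns
truth = np.sin(np.linspace(0, 2 * np.pi, 100)) * 0.5

# synthetic profiles: sample the truth only at historical observation sites
obs_idx = np.sort(rng.choice(100, size=15, replace=False))
synthetic = truth[obs_idx] + rng.normal(0, 0.05, size=15)  # measurement noise

# a stand-in mapping method: nearest-observation infilling onto the full grid
grid = np.arange(100)
nearest = obs_idx[np.argmin(np.abs(grid[:, None] - obs_idx[None, :]), axis=1)]
mapped = synthetic[np.searchsorted(obs_idx, nearest)]

# because the truth is known, the mapping method can be scored directly
rmse = np.sqrt(np.mean((mapped - truth) ** 2))
```

The key property of the design is that the error metric is exact: unlike comparisons between real-world analyses, the difference between `mapped` and `truth` is entirely attributable to sampling and the mapping method.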
Although the mechanism of Aβ action in the pathogenesis of Alzheimer's disease (AD) has remained elusive, it is known to increase the expression of the antagonist of canonical wnt signalling, Dickkopf-1 (Dkk1), whereas the silencing of Dkk1 blocks Aβ neurotoxicity. We asked if clusterin, known to be regulated by wnt, is part of an Aβ/Dkk1 neurotoxic pathway. Knockdown of clusterin in primary neurons reduced Aβ toxicity and DKK1 upregulation and, conversely, Aβ increased intracellular clusterin and decreased clusterin protein secretion, resulting in the p53-dependent induction of DKK1. To further elucidate how the clusterin-dependent induction of Dkk1 by Aβ mediates neurotoxicity, we measured the effects of Aβ and Dkk1 protein on whole-genome expression in primary neurons, finding a common pathway suggestive of activation of wnt-planar cell polarity (PCP)-c-Jun N-terminal kinase (JNK) signalling leading to the induction of genes including EGR1 (early growth response-1), NAB2 (Ngfi-A-binding protein-2) and KLF10 (Krüppel-like factor-10) that, when individually silenced, protected against Aβ neurotoxicity and/or tau phosphorylation. Neuronal overexpression of Dkk1 in transgenic mice mimicked this Aβ-induced pathway and resulted in age-dependent increases in tau phosphorylation in hippocampus and cognitive impairment. Furthermore, we show that this Dkk1/wnt-PCP-JNK pathway is active in an Aβ-based mouse model of AD and in AD brain, but not in a tau-based mouse model or in frontotemporal dementia brain. Thus, we have identified a pathway whereby Aβ induces a clusterin/p53/Dkk1/wnt-PCP-JNK pathway, which drives the upregulation of several genes that mediate the development of AD-like neuropathologies, thereby providing new mechanistic insights into the action of Aβ in neurodegenerative diseases.
Provision of supplementary food for garden birds is practiced on a large scale in multiple countries. While this resource has benefits for wild bird populations, concern has been expressed regarding the potential for contamination of foodstuffs by mycotoxins, and the implications this might have for wildlife health. We investigated whether aflatoxin (AF) and ochratoxin A (OA) residues are present in foodstuffs sold for wild bird consumption at point of sale in Great Britain using high pressure liquid chromatography analyses. The hypothesis that production of these mycotoxins occurs in British climatic conditions, or under storage conditions after the point of sale, was tested under experimental conditions but was not proved by our study. While the majority of peanut samples were negative for AF residues, 10% (10/98) of samples at point of sale and 11% (13/119) of those across the storage and climate exposure treatment replicates contained AFB1 that exceeded the maximum permitted limit of 20 μg/kg. No significant difference was found in the detection of either mycotoxin between branded and non-branded products. The clinical significance, if any, of exposure of wild birds to mycotoxins requires further investigation. Nevertheless, the precautionary principle should be adopted and best practice steps to reduce the likelihood of wild bird exposure to mycotoxins are recommended.
• Provision of contaminated food risks exposure of wild birds to mycotoxins.
• We tested samples of peanuts (whole and granule) and sunflower seed for mycotoxins.
• 10% of peanut samples exceeded the Maximum Permitted Limit for aflatoxin B1.
• Storage and climate exposure treatments could lead to aflatoxin B1 production.
• Best feeding practice will reduce health risks to birds from mycotoxin exposure.
Abstract
Background
Mammary gland tumours are the most frequently diagnosed tumours in female dogs, but only a few studies have analysed their epidemiology. Therefore, we set out to describe the epidemiology of canine mammary cancer in the Canary Archipelago, Spain. We analysed a pathology tumour registry (PTR) and identified 7362 samples obtained from 5240 female dogs resident in the Canary Archipelago during an 18-year period (2003–2020). Using a case–control study design, we compared mammary tumour affected dogs with the Canarian canine population registry in order to elucidate the breed associations for these tumours.
Results
The frequency of a diagnosis of mammary tumours relative to all tumour diagnoses in female dogs decreased during the study period from 62.7% to 48.9%. Contemporaneously, the proportion of dogs diagnosed with mammary tumours who were also neutered increased from 13.6% to 26.9%. There was a negative correlation (R = -0.84) between these changes. Additional findings were that the proportion of female dogs diagnosed with multiple tumours increased by 23.5% and that the proportion of malignant tumours diagnosed (89.2%) remained stable throughout the period. Benign mammary tumours were diagnosed at younger ages (9.2 years old) than carcinomas (9.7 years old) and sarcomas (10.4 years old). Epithelial mammary tumours were diagnosed at younger ages in entire female dogs. Samoyed, Schnauzer, Poodle, German Pinscher and Cocker Spaniel were the breeds with the highest odds ratios (OR) in comparison with the reference (crossbreeds), while Miniature Pinscher, American Staffordshire Terrier, English Pointer and some local breeds such as the Canary Warren Hound and the Majorero had the lowest ORs.
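For context, the breed odds ratios in a case–control design of this kind come from 2×2 tables of breed membership among cases versus population controls. A minimal sketch of the OR and its Woolf 95% confidence interval, with invented counts that are not the study's data:

```python
import math

def odds_ratio(case_exposed, case_unexposed, ctrl_exposed, ctrl_unexposed):
    """Odds ratio and Woolf 95% CI for a 2x2 case-control table."""
    a, b, c, d = case_exposed, case_unexposed, ctrl_exposed, ctrl_unexposed
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# hypothetical counts: breed X among mammary-tumour cases vs population controls
or_, ci = odds_ratio(30, 970, 10, 990)  # OR ≈ 3.06
```

An OR above 1 (with a CI excluding 1) indicates a breed over-represented among cases relative to crossbreeds; the low-OR breeds reported above would show the reverse pattern.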
Conclusions
This study provides a description of the changing epidemiology of canine mammary cancer in the Canary Archipelago over the last two decades. We found high rates of canine mammary tumours (CMT), with a significant predominance of malignant tumours. The exact risk factors are uncertain, but a combination of environmental factors, regional socioeconomic conditions affecting humans and their pets, and animal management factors is likely to play a part. Specifically, neutering was negatively associated with the proportion of epithelial mammary gland tumours, and breeds native to the region were at lower risk of mammary tumours. A deeper analysis of all these factors will facilitate a better understanding of the epidemiology of mammary gland tumours in both the canine and the human population.
This work investigated in Alzheimer's disease dementia (AD), whether the probability of making an error on a task (or a correct response) was influenced by the outcome of the previous trials. We used the antisaccade task (AST) as a model task given the emerging consensus that it provides a promising sensitive and early biological test of cognitive impairment in AD. It can be employed equally well in healthy young and old adults, and in clinical populations. This study examined eye-movements in a sample of 202 participants (42 with dementia due to AD; 65 with mild cognitive impairment (MCI); 95 control participants). The findings revealed an overall increase in the frequency of AST errors in AD and MCI compared to the control group, as predicted. The errors on the current trial increased in proportion to the number of consecutive errors on the previous trials. Interestingly, the probability of errors was reduced on the trials that followed a previously corrected error, compared to the trials where the error remained uncorrected, revealing a level of adaptive control in participants with MCI or AD dementia. There was an earlier peak in the AST distribution of the saccadic reaction times for the inhibitory errors in comparison to the correct saccades. These findings revealed that the inhibitory errors of the past have a negative effect on the future performance of healthy adults as well as people with a neurodegenerative cognitive impairment.
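The trial-history analysis described above amounts to estimating the error probability conditional on the outcome of the preceding trial (corrected versus uncorrected error). A minimal sketch of that tabulation, using a hypothetical three-way outcome coding that is ours, not the paper's:

```python
def conditional_error_rates(trials):
    """P(error on trial t) conditioned on the outcome of trial t-1.

    Each trial is one of: 'correct', 'corrected_error', 'uncorrected_error'
    (hypothetical coding of antisaccade-task outcomes).
    """
    counts = {'corrected_error': [0, 0], 'uncorrected_error': [0, 0]}
    for prev, cur in zip(trials, trials[1:]):
        if prev in counts:
            counts[prev][0] += 1                 # trials following this outcome
            counts[prev][1] += cur != 'correct'  # ... of which were errors
    return {k: n_err / n for k, (n, n_err) in counts.items() if n}

# toy sequence: errors are rarer after a corrected error than an uncorrected one
trials = ['correct', 'corrected_error', 'correct', 'uncorrected_error',
          'uncorrected_error', 'correct', 'corrected_error', 'correct']
rates = conditional_error_rates(trials)
```

A lower rate following corrected errors than uncorrected ones, as in the toy sequence above, is the signature of adaptive post-error control that the study reports in the MCI and AD groups.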
Understanding the climate response of the Antarctic Peninsula ice sheet is vital for accurate predictions of sea-level rise. However, since climate models are typically too coarse to capture spatial variability in local scale meteorological processes, our ability to study specific sectors has been limited by the local fidelity of such models and the (often sparse) availability of observations. We show that a high-resolution (5.5 km × 5.5 km) version of a regional climate model (RACMO2.3) can reproduce observed interannual variability in the Larsen B embayment sufficiently to enable its use in investigating long-term changes in this sector. Using the model, together with automatic weather station data, we confirm previous findings that the year of the Larsen B ice shelf collapse (2001/02) was a strong melt year, but discover that total annual melt production was in fact ~30% lower than 2 years prior. While the year before collapse exhibited the lowest melting and highest snowfall during 1980–2014, the ice shelf was likely pre-conditioned for collapse by a series of strong melt years in the 1990s. Melt energy has since returned to pre-1990s levels, which likely explains the lack of further significant collapse in the region (e.g. of SCAR Inlet).