The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.
In this innovative book Daniel Sage analyses how and why American space exploration reproduced and transformed American cultural and political imaginations by appealing to, and to an extent organizing, the transcendence of spatial and temporal frontiers. In so doing, he traces the development of a seductive and powerful, yet complex and unstable, American geographical imagination: the 'transcendental state'. Historical and indeed contemporary space exploration is, despite some recent notable exceptions, worthy of more attention across the social sciences and humanities. While largely engaging with the historical development of space exploration, the book shows how contemporary cultural, social and indeed geographical research themes, including national identity, critical geopolitics, gender, technocracy, trauma and memory, can be informed by the study of space exploration.
• Comprehensive introduction to 3D deconvolution microscopy.
• Description of standard deconvolution algorithms.
• Presentation of the Java open-source software DeconvolutionLab2.
• Benchmark on open reference datasets.
Images in fluorescence microscopy are inherently blurred due to the limit of diffraction of light. The purpose of deconvolution microscopy is to compensate numerically for this degradation. Deconvolution is widely used to restore fine details of 3D biological samples. Unfortunately, dealing with deconvolution tools is not straightforward. Among others, end users have to select the appropriate algorithm, calibration and parametrization, while potentially facing demanding computational tasks. To make deconvolution more accessible, we have developed a practical platform for deconvolution microscopy called DeconvolutionLab. Freely distributed, DeconvolutionLab hosts standard algorithms for 3D microscopy deconvolution and drives them through a user-oriented interface. In this paper, we take advantage of the release of DeconvolutionLab2 to provide a complete description of the software package and its built-in deconvolution algorithms. We examine several standard algorithms used in deconvolution microscopy, notably: Regularized inverse filter, Tikhonov regularization, Landweber, Tikhonov–Miller, Richardson–Lucy, and fast iterative shrinkage-thresholding. We evaluate these methods over large 3D microscopy images using simulated datasets and real experimental images. We distinguish the algorithms in terms of image quality, performance, usability and computational requirements. Our presentation is completed with a discussion of recent trends in deconvolution, inspired by the results of the Grand Challenge on deconvolution microscopy that was recently organized.
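Of the algorithms named above, Richardson–Lucy is among the most widely used. The following one-dimensional sketch shows the basic multiplicative iteration using FFT-based circular convolution; it is an illustration of the general technique only, and does not reproduce DeconvolutionLab2's actual 3D implementation or its boundary handling.

```python
import numpy as np

def richardson_lucy(blurred, psf, n_iter=100):
    """Minimal 1D Richardson-Lucy deconvolution (circular boundary).

    blurred : observed signal (1D float array)
    psf     : point-spread function, assumed normalized to sum 1,
              centered in its own array
    """
    k = len(psf)
    # Embed the PSF in a full-length array and center it at index 0
    # so that FFT multiplication implements centered circular convolution.
    psf_pad = np.zeros_like(blurred)
    psf_pad[:k] = psf
    psf_pad = np.roll(psf_pad, -(k // 2))
    H = np.fft.rfft(psf_pad)

    n = len(blurred)
    est = np.full_like(blurred, blurred.mean())  # flat positive start
    for _ in range(n_iter):
        conv = np.fft.irfft(np.fft.rfft(est) * H, n)
        ratio = blurred / np.maximum(conv, 1e-12)
        # conj(H) applies the flipped PSF (correlation step of RL)
        est *= np.fft.irfft(np.fft.rfft(ratio) * np.conj(H), n)
    return est
```

On noiseless data the iteration progressively resharpens blurred peaks; in practice the iteration count acts as an implicit regularizer, which is one reason the paper compares it against explicitly regularized methods such as Tikhonov–Miller.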
Circadian clocks operative in pancreatic islets participate in the regulation of insulin secretion in humans and, if compromised, in the development of type 2 diabetes (T2D) in rodents. Here we demonstrate that human islet α- and β-cells that bear attenuated clocks exhibit strongly disrupted insulin and glucagon granule docking and exocytosis. To examine whether compromised clocks play a role in the pathogenesis of T2D in humans, we quantified parameters of molecular clocks operative in human T2D islets at population, single islet, and single islet cell levels. Strikingly, our experiments reveal that islets from T2D patients contain clocks with diminished circadian amplitudes and reduced in vitro synchronization capacity compared to their nondiabetic counterparts. Moreover, our data suggest that islet clocks orchestrate temporal profiles of insulin and glucagon secretion in a physiological context. This regulation was disrupted in T2D subjects, implying a role for the islet cell-autonomous clocks in T2D progression. Finally, Nobiletin, an agonist of the core-clock proteins RORα/γ, boosted both circadian amplitude of T2D islet clocks and insulin secretion by these islets. Our study emphasizes a link between the circadian clockwork and T2D and proposes that clock modulators hold promise as putative therapeutic agents for this frequent disorder.
A special case of the geometric Langlands correspondence is given by the relationship between solutions of the Bethe ansatz equations for the Gaudin model and opers: connections on the projective line with extra structure. In this paper, we describe a deformation of this correspondence for SL(N). We introduce a difference equation version of opers called q-opers and prove a q-Langlands correspondence between nondegenerate solutions of the Bethe ansatz equations for the XXZ model and nondegenerate twisted q-opers with regular singularities on the projective line. We show that the quantum/classical duality between the XXZ spin chain and the trigonometric Ruijsenaars–Schneider model may be viewed as a special case of the q-Langlands correspondence. We also describe an application of q-opers to the equivariant quantum K-theory of the cotangent bundles to partial flag varieties.
With the widespread uptake of two-dimensional (2D) and three-dimensional (3D) single-molecule localization microscopy (SMLM), a large set of different data analysis packages have been developed to generate super-resolution images. In a large community effort, we designed a competition to extensively characterize and rank the performance of 2D and 3D SMLM software packages. We generated realistic simulated datasets for popular imaging modalities (2D, astigmatic 3D, biplane 3D and double-helix 3D) and evaluated 36 participant packages against these data. This provides the first broad assessment of 3D SMLM software and a holistic view of how the latest 2D and 3D SMLM packages perform in realistic conditions. This resource allows researchers to identify optimal analytical software for their experiments, allows 3D SMLM software developers to benchmark new software against the current state of the art, and provides insight into the current limits of the field.
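Detection performance in such benchmarks is typically scored by pairing each ground-truth emitter with at most one reported localization within a tolerance radius, then computing recall, precision and the Jaccard index. The sketch below shows one simple greedy-matching scheme under that general idea; the tolerance value and coordinates are illustrative, and the competition's exact matching protocol may differ.

```python
import math

def match_localizations(truth, found, tol=50.0):
    """Greedy nearest-pair matching within a tolerance radius.

    truth, found : lists of (x, y) coordinates (e.g. in nm)
    Returns (recall, precision, jaccard).
    """
    # Enumerate all candidate pairs closer than the tolerance.
    pairs = []
    for i, t in enumerate(truth):
        for j, f in enumerate(found):
            d = math.dist(t, f)
            if d <= tol:
                pairs.append((d, i, j))
    pairs.sort()  # match closest pairs first

    used_t, used_f = set(), set()
    tp = 0
    for d, i, j in pairs:
        if i not in used_t and j not in used_f:
            used_t.add(i)
            used_f.add(j)
            tp += 1

    fn = len(truth) - tp   # missed emitters
    fp = len(found) - tp   # spurious localizations
    recall = tp / len(truth) if truth else 0.0
    precision = tp / len(found) if found else 0.0
    denom = tp + fn + fp
    jaccard = tp / denom if denom else 0.0
    return recall, precision, jaccard
```

The Jaccard index combines misses and false positives into a single detection score, which is why it is a common headline metric in SMLM software comparisons.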
A new method based on the Young–Laplace equation for measuring contact angles and surface tensions is presented. In this approach, a first-order perturbation technique helps to analytically solve the Young–Laplace equation according to photographic images of axisymmetric sessile drops. When appropriate, the calculated drop contour is extended by mirror symmetry so that the reflection of the drop in the substrate allows detection of the position of the contact points. To keep a wide range of applicability, the drop's profile is not discretised; instead, an optimisation of an advanced image-energy term fits an approximation of the Young–Laplace equation to the drop boundaries. In addition, cubic B-spline interpolation is applied to the image of the drop to reach subpixel resolution. To demonstrate the method's accuracy, simulated drops as well as images of liquid coal ash slags were analysed. Thanks to the high-quality image interpolation model and the image-energy term, the experiments demonstrated robust measurements over a wide variety of image types and qualities. The method was implemented in Java and is freely available (A.F. Stalder, LBADSA, Biomedical Imaging Group, EPFL, http://bigwww.epfl.ch/demo/dropanalysis).
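For context, the simplest baseline that a Young–Laplace fit improves upon is the spherical-cap approximation, which neglects gravitational flattening (valid only for small drops, i.e. small Bond number) and reads the contact angle directly from drop height and contact radius via tan(θ/2) = h/r. The sketch below illustrates this baseline geometry only; it is not the paper's first-order perturbation method.

```python
import math

def contact_angle_spherical_cap(height, base_radius):
    """Contact angle in degrees from the spherical-cap approximation.

    Assumes the drop is a spherical cap (gravity negligible), with
    apex height `height` and contact-line radius `base_radius`.
    Uses the geometric identity tan(theta / 2) = h / r.
    """
    return math.degrees(2.0 * math.atan2(height, base_radius))
```

A hemispherical drop (height equal to base radius) gives 90°, and a very flat drop tends toward 0°; real drops large enough to sag under gravity violate the spherical-cap assumption, which is where Young–Laplace-based fitting becomes necessary.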
Peer review and citation metrics are two means of gauging the value of scientific research, but the lack of publicly available peer review data makes the comparison of these methods difficult. Mathematics can serve as a useful laboratory for considering these questions because, as an exact science, citations are made for a narrow range of reasons. In mathematics, virtually all published articles receive post-publication review by mathematicians in Mathematical Reviews (MathSciNet), so the data set was essentially the Web of Science mathematics publications from 1993 to 2004. For a decade, especially important articles were singled out in Mathematical Reviews for featured reviews. In this study, we analyze the bibliometrics of elite articles selected by peer review and by citation count. We conclude that the two notions of significance, being a featured review article and being highly cited, are distinct. This indicates that peer review and citation counts give largely independent determinations of highly distinguished articles. We also consider whether hiring patterns in subfields and mathematicians' interest in subfields reflect the subfields of featured review or highly cited articles. We re-examine data from two earlier studies in light of our methods for their implications on the peer review/citation count relationship across a diversity of disciplines.
Aging affects elastin, a key determinant of arterial wall integrity and functionality. Elastin degradation in cerebral vessels is associated with cerebrovascular disease. The goal of this study is to assess the biomechanical properties of human cerebral arteries, their composition, and their geometry, with particular focus on the functional alteration of elastin attributable to aging.
Twelve posterior cranial arteries obtained from human cadavers of 2 different age groups were compared morphologically and tested biomechanically before and after enzymatic degradation of elastin. Light, confocal, and scanning electron microscopy were used to analyze and determine structural differences, potentially attributed to aging.
Aging affects the structural morphology and the mechanical properties of intracranial arteries. In contrast to the main systemic arteries, the intima and media thicken while the outer diameter remains relatively constant with age, leading to concentric hypertrophy. The structural morphology of elastin changed from a fiber network oriented primarily in the circumferential direction to a more heterogeneously oriented fiber mesh, especially at the intima. Biomechanically, cerebral arteries stiffen with age and lose compliance in the elastin-dominated regime. Enzymatic degradation of elastin led to loss of compliance and stiffening in the young group but did not affect the structural and material properties in the older group, suggesting that elastin, though present in equal quantities in the old group, becomes dysfunctional with aging.
Elastin loses its functionality in cerebral arteries with aging, leading to stiffer, less compliant arteries. The area fraction of elastin remained, however, fairly constant. The loss of functionality may thus be attributed to fragmentation and structural reorganization of elastin occurring with age.