Background
Studies regarding the adequacy of secondary stroke prevention are limited. We report medication adherence, risk factor control and factors influencing the vascular risk profile following ischaemic stroke.
Methods
A total of 664 home-dwelling participants in the Norwegian Cognitive Impairment After Stroke study, a multicenter observational study, were evaluated 3 and 18 months poststroke. We assessed medication adherence by self-report (4-item Morisky Medication Adherence Scale) and medication persistence (defined as continuation of medication(s) prescribed at discharge); achievement of guideline-defined targets for blood pressure (BP) (<140/90 mmHg), low-density lipoprotein cholesterol (LDL-C) (<2.0 mmol L−1) and haemoglobin A1c (HbA1c) (≤53 mmol mol−1); and determinants of risk factor control.
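For concreteness, the guideline-defined targets above can be encoded as a simple classification rule. A minimal sketch (the function name is illustrative; only the thresholds come from the text, and the HbA1c target applies to diabetic patients only):

```python
def risk_factor_control(bp_sys, bp_dia, ldl_c, hba1c=None):
    """Classify achievement of the guideline-defined targets used in the study:
    BP < 140/90 mmHg, LDL-C < 2.0 mmol/L, HbA1c <= 53 mmol/mol (diabetics only)."""
    control = {
        "bp": bp_sys < 140 and bp_dia < 90,
        "ldl_c": ldl_c < 2.0,
    }
    if hba1c is not None:  # HbA1c target evaluated only when a value is supplied
        control["hba1c"] = hba1c <= 53
    return control

print(risk_factor_control(135, 85, 1.8))            # both targets met
print(risk_factor_control(150, 85, 2.4, hba1c=60))  # no target met
```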
Results
At discharge, 97% were prescribed antithrombotics, 88% lipid-lowering drugs, 68% antihypertensives and 12% antidiabetic drugs. At 18 months, persistence among users was 99%, 88%, 93% and 95%, respectively. After 3 and 18 months, 80% and 73% reported high adherence, 40.7% and 47.0% achieved BP control, 48.4% and 44.6% achieved LDL-C control, and 69.2% and 69.5% of diabetic patients achieved HbA1c control. Advanced age was associated with increased LDL-C control (OR 1.03, 95% CI 1.01 to 1.06) and reduced BP control (OR 0.98, 0.96 to 0.99). Women had poorer LDL-C control (OR 0.60, 0.37 to 0.98). Polypharmacy was associated with increased LDL-C control (OR 1.29, 1.18 to 1.41) and reduced HbA1c control (OR 0.76, 0.60 to 0.98).
Conclusion
Risk factor control is suboptimal despite high medication persistence and adherence. Improved understanding of this complex clinical setting is needed for optimization of secondary preventive strategies.
We develop a framework for joint constraints on the CO luminosity function based on power spectra (PS) and voxel intensity distributions (VID), and apply it to simulations of the CO Mapping Array Pathfinder (COMAP), a CO intensity mapping experiment. This Bayesian framework is based on a Markov chain Monte Carlo (MCMC) sampler coupled to a Gaussian likelihood with a joint PS + VID covariance matrix, computed from a large number of fiducial simulations and re-calibrated with a small number of simulations per MCMC step. The simulations are based on dark matter halos from fast peak patch simulations combined with the LCO(Mhalo) model of Li et al. We find that the relative power to constrain the CO luminosity function depends on the luminosity range of interest: the VID is more sensitive at large luminosities, while the PS and the VID are both competitive at small and intermediate luminosities. The joint analysis is superior to using either observable separately. When averaging over CO luminosities ranging between , and over 10 cosmological realizations of COMAP Phase 2, the uncertainties (in dex) are larger by 58% and 30% for the PS and VID, respectively, when compared to the joint analysis (PS + VID). This method is generally applicable to any other random field with a complicated likelihood, as long as a fast simulation procedure is available.
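As a toy illustration of the sampling machinery described above (not the COMAP pipeline: the concatenated data vector, the linear model, and the fixed covariance are all stand-ins, and the per-step covariance re-calibration is omitted), a Metropolis-Hastings sampler with a Gaussian likelihood over a joint data vector can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "observables": a concatenated PS + VID data vector whose mean depends
# linearly on two luminosity-function parameters theta (purely illustrative).
def model(theta, design):
    return design @ theta

n_bins, n_par = 8, 2
design = rng.normal(size=(n_bins, n_par))
theta_true = np.array([1.0, 0.5])
cov = np.diag(np.full(n_bins, 0.05**2))  # fixed joint covariance (from "fiducial sims")
cov_inv = np.linalg.inv(cov)
data = model(theta_true, design) + rng.multivariate_normal(np.zeros(n_bins), cov)

def log_like(theta):
    # Gaussian likelihood with the joint covariance matrix
    r = data - model(theta, design)
    return -0.5 * r @ cov_inv @ r

# Plain Metropolis-Hastings random-walk sampler
theta = np.zeros(n_par)
lp = log_like(theta)
chain = []
for _ in range(5000):
    prop = theta + 0.05 * rng.normal(size=n_par)
    lp_prop = log_like(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)

post_mean = np.mean(chain[1000:], axis=0)  # discard burn-in
```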
Background and purpose
The aim of this pooled patient-level data analysis was to test whether multidomain interventions, addressing several modifiable vascular risk factors simultaneously, are more effective than usual post-stroke care for the prevention of cognitive decline after stroke.
Methods
This pooled patient-level data analysis included two randomized controlled trials that used a multidomain approach to target vascular risk factors in stroke patients, with cognition as the primary outcome. Changes from baseline to 12 months in the Trail Making Test (TMT)-A, TMT-B and 10-words test were analysed using stepwise backward linear mixed models with study as a random factor. Two analyses were based on the intention-to-treat (ITT) principle using different imputation approaches, and one was based on complete cases.
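The distinction between the complete-case and ITT analyses turns on how missing follow-up data are handled. A minimal numeric illustration (made-up scores, and a simple baseline-carried-forward imputation rather than the study's actual linear mixed models):

```python
import numpy as np

# Toy change-from-baseline scores for six participants; two drop out by month 12.
baseline = np.array([40., 42., 38., 45., 41., 39.])
month12  = np.array([35., np.nan, 33., 44., np.nan, 36.])

# Complete cases: drop participants with a missing follow-up measurement
cc = ~np.isnan(month12)
change_cc = month12[cc] - baseline[cc]

# Simple ITT imputation: carry the baseline observation forward (change = 0)
month12_itt = np.where(np.isnan(month12), baseline, month12)
change_itt = month12_itt - baseline

print(change_cc.mean())   # mean change, complete cases
print(change_itt.mean())  # mean change, ITT with baseline carried forward
```

The two estimates differ because the imputed participants contribute zero change, pulling the ITT mean toward the null; this is why the paper reports both approaches.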
Results
Data from 322 patients (157 assigned to multidomain intervention and 165 to standard care) were analysed. Differences between randomization groups for TMT‐A scores were found in one ITT model (P = 0.014) and approached significance in the second ITT model (P = 0.087) and for complete cases (P = 0.091). No significant intervention effects were found for any of the other cognitive variables.
Conclusion
We found indications that multidomain interventions compared with standard care can improve the scores in TMT‐A at 1 year after stroke but not those for TMT‐B or the 10‐words test. These results have to be interpreted with caution due to the small number of patients.
We present the first application of the Cosmoglobe analysis framework, analyzing the nine-year WMAP time-ordered observations with machinery similar to that applied by BeyondPlanck to the Planck Low Frequency Instrument (LFI). We analyzed only the Q-band (41 GHz) data and report on the low-level analysis process from uncalibrated time-ordered data to calibrated maps. Most of the existing BeyondPlanck pipeline may be reused for WMAP analysis with minimal changes to the existing codebase. The main modification is the implementation of the same preconditioned biconjugate gradient mapmaker used by the WMAP team. Producing a single WMAP Q1-band sample requires 22 CPU-hrs, slightly more than the 17 CPU-hrs of a Planck 44 GHz sample; this demonstrates that a full end-to-end Bayesian processing of the WMAP data is computationally feasible. In general, our recovered maps are very similar to the maps released by the WMAP team, with two notable differences. In temperature, we find a ∼2 μK quadrupole difference that is most likely caused by different gain modeling, while in polarization we find a distinct 2.5 μK signal that the WMAP team has previously referred to as poorly measured modes. In the Cosmoglobe processing, this pattern arises from temperature-to-polarization leakage through the coupling between the CMB Solar dipole, transmission imbalance, and sidelobes. No traces of this pattern are found in either the frequency map or the TOD residual map, suggesting that the current processing has succeeded in modeling these poorly measured modes within the assumed parametric model by using Planck information to break the sky-synchronous degeneracies inherent in the WMAP scanning strategy.
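The mapmaking step referred to above amounts to solving the normal equations (PᵀN⁻¹P) m = PᵀN⁻¹d for a sky map m given time-ordered data d. A hedged sketch (not the Cosmoglobe or WMAP code; simplified to white noise, which makes the system matrix diagonal, with a Jacobi-preconditioned conjugate-gradient solve standing in for the full preconditioned biconjugate gradient solver):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy scan: each time sample hits one pixel of a 12-pixel sky.
n_pix, n_tod = 12, 600
pointing = rng.integers(0, n_pix, size=n_tod)       # pixel hit by each sample
sky = rng.normal(size=n_pix)                        # input sky map
sigma = 0.1
d = sky[pointing] + sigma * rng.normal(size=n_tod)  # TOD = P m + noise

Ninv = 1.0 / sigma**2
# A m = b with A = P^T N^-1 P (diagonal for white noise), b = P^T N^-1 d
A_diag = np.bincount(pointing, minlength=n_pix) * Ninv
b = np.bincount(pointing, weights=Ninv * d, minlength=n_pix)

# Jacobi-preconditioned conjugate gradient
m = np.zeros(n_pix)
r = b - A_diag * m
z = r / A_diag
p = z.copy()
for _ in range(50):
    Ap = A_diag * p
    alpha = (r @ z) / (p @ Ap)
    m += alpha * p
    r_new = r - alpha * Ap
    if np.linalg.norm(r_new) < 1e-10:  # converged
        break
    z_new = r_new / A_diag
    beta = (r_new @ z_new) / (r @ z)
    p = z_new + beta * p
    r, z = r_new, z_new
```

With a diagonal system the preconditioned solver converges in one iteration; the iterative machinery matters for the realistic case with correlated noise, where A is dense.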
We test for foreground residuals in the foreground-cleaned Planck cosmic microwave background (CMB) maps outside and inside the U73 mask commonly used for cosmological analysis. The aim of this paper is to introduce a new method of validating masks by looking at the differences in cleaned maps obtained by different component-separation methods. By analyzing the power spectrum, as well as the mean, rms, and skewness of needlet coefficients on separate equatorial bands running from the poles to the equator outside and inside the U73 mask, we first confirm that the pixels already masked by U73 are highly contaminated and cannot be used for cosmological analysis. We further find that the U73 mask needs extension in order to reduce large-scale foreground residuals to less than 20% of the standard deviation of the CMB fluctuations within the bands closest to the galactic equator. We also find 276 point-like residuals in the foreground-cleaned maps that are currently not masked by U73. About 80 of these are identified as SZ (Sunyaev-Zel'dovich) clusters that have not been properly subtracted by the component-separation methods, and the rest are strongly correlated with the Planck dust map, indicating point-like dust residuals. Our final publicly available extended mask leaves 65.9% of the sky for cosmological analysis. This extended mask may be important for analyses on local sky patches; for the full-sky power spectrum, we have shown that the unmasked residuals have very little impact.
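The band statistics used above can be illustrated schematically (a uniform toy pixelization rather than HEALPix needlet coefficients; all values synthetic, with a fake "galactic" residual injected near the equator):

```python
import numpy as np

rng = np.random.default_rng(4)

# Random points on the sphere standing in for pixelized needlet coefficients
n_pix = 20_000
theta = np.arccos(rng.uniform(-1, 1, n_pix))     # colatitude of each pixel
coeff = rng.normal(size=n_pix)                   # clean "CMB-like" coefficients
coeff[np.abs(theta - np.pi / 2) < 0.2] += 1.5    # injected equatorial residual

# Mean / rms / skewness per equatorial band from pole to pole
bands = np.linspace(0, np.pi, 7)
stats = []
for lo, hi in zip(bands[:-1], bands[1:]):
    c = coeff[(theta >= lo) & (theta < hi)]
    mean, rms = c.mean(), c.std()
    skew = np.mean(((c - mean) / rms) ** 3)
    stats.append((mean, rms, skew))
# Bands crossing the "galactic" equator show a shifted mean; polar bands do not.
```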
To explore the impact of premorbid physical activity on stroke severity and functioning, measured by activities of daily living, gait and balance, during the acute period of first-ever stroke and at one-year follow-up.
Acute phase and one-year follow-up registrations of 183 patients with first-ever stroke or transient ischaemic attack were included in the study. Gender, age, education, living arrangements, body mass index, smoking, hypertension, stroke classification and use of walking aids were recorded. Premorbid physical activity was recorded with the Walking Habits questionnaire. The outcomes post-stroke were the National Institutes of Health Stroke Scale, the Modified Ranking Scale, Barthel ADL Index, Maximal Walking Speed and Berg Balance Scale.
Significant associations (p < 0.05) were found between the participants' pre-stroke "duration of regular walks" and functioning on all outcomes in the acute phase of stroke. Participants who walked for more than 30 min each time achieved significantly better results. The measures of gait and balance showed similar associations (p < 0.05) at one-year follow-up.
There are significant associations between premorbid walking habits and functional status after first-ever stroke. Weekly light-intensity activity, such as walking for more than 30 min, may have a sustained impact on functioning after stroke.
Background
Assessment of carotid atherosclerosis with ultrasound may have an unfilled potential in predicting stroke and cardiovascular events.
Purpose
We aimed to explore the predictive value of the carotid plaque score, compared to the Systematic COronary Risk Evaluation 2 (SCORE2) risk prediction algorithm, for incident stroke and major adverse cardiovascular events (MACE), and to establish a prognostic cut-off for the carotid plaque score.
Methods
In the prospective Akershus Cardiac Examination (ACE) 1950 cohort study, the carotid plaque score was calculated with ultrasound at inclusion in 2012-2015. The largest plaque diameter in each extracranial segment of the carotid artery on both sides was measured and scored from 0 to 3 points, and the sum of the segment scores gave the carotid plaque score. Incident stroke and the composite endpoint MACE (nonfatal stroke, nonfatal myocardial infarction, and cardiovascular death) were assessed by linkage to national registries throughout 2020.
Results
The carotid plaque score was available in 3650 (98.5%) of the participants; mean age at inclusion was 63.9±0.64 years. Only 462 (12.7%) were free of plaque, and the median carotid plaque score was 2 (IQR 1-4). By the end of 2020, 42 (1.2%) subjects had experienced a stroke and 145 (4.0%) a MACE. The carotid plaque score was a predictor of incident stroke (HR 1.25, 95% CI 1.15-1.36) and MACE (HR 1.21, 95% CI 1.14-1.27) after adjustment for SCORE2, and outperformed SCORE2 in predicting stroke (p=0.001). The best cut-off value for the carotid plaque score, determined by the area under the receiver operating characteristic curve, was ≥4, with a positive predictive value of 2.5% and a negative predictive value of 99.3%.
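Choosing a prognostic cut-off from a ROC analysis, as done above, can be sketched with a small computation (Youden's J statistic used here as the selection criterion; the scores and outcomes below are made up, not study data):

```python
import numpy as np

# Toy integer risk scores and binary outcomes (illustrative values only)
scores  = np.array([0, 1, 2, 2, 3, 4, 4, 5, 6, 7, 1, 0, 3, 5, 8, 2])
outcome = np.array([0, 0, 0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1, 1, 0])

# Scan candidate cut-offs and keep the one maximizing Youden's J = TPR - FPR
best_j, best_cut = -1.0, None
for cut in np.unique(scores):
    pred = scores >= cut
    tpr = np.mean(pred[outcome == 1])   # sensitivity
    fpr = np.mean(pred[outcome == 0])   # 1 - specificity
    j = tpr - fpr
    if j > best_j:
        best_j, best_cut = j, cut

# Predictive values at the chosen cut-off
pred = scores >= best_cut
ppv = outcome[pred].mean()              # positive predictive value
npv = 1 - outcome[~pred].mean()         # negative predictive value
```

In a low-incidence cohort like the one studied, a cut-off chosen this way typically yields a modest PPV but a very high NPV, matching the 2.5% / 99.3% pattern reported.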
Conclusion
The carotid plaque score is a strong predictor of incident stroke and MACE and outperforms SCORE2 for risk prediction in a middle-aged cohort recruited from the general population. A cut-off score of ≥4 seems suitable for identifying high-risk subjects.
The BeyondPlanck and Cosmoglobe collaborations have implemented the first integrated Bayesian end-to-end analysis pipeline for CMB experiments. The primary long-term motivation for this work is to develop a common analysis platform that supports efficient global joint analysis of complementary radio, microwave, and sub-millimeter experiments. A strict prerequisite for this to succeed is broad participation from the CMB community, and two foundational aspects of the program are therefore reproducibility and Open Science. In this paper, we discuss our efforts toward this aim, including measures to facilitate easy code and data distribution, community-based code documentation, and user-friendly compilation procedures. This work represents the first publicly released end-to-end CMB analysis pipeline that includes raw data, source code, parameter files, and documentation. We argue that such a complete pipeline release should be a requirement for all major future and publicly funded CMB experiments. A full public release significantly increases data longevity, and thereby taxpayers' value for money, by ensuring that the data quality can be improved whenever better processing techniques, complementary datasets, or more computing power become available; providing only raw data and final products is not sufficient to guarantee full reproducibility in the future.
Cosmoglobe DR1 results. Watts, D. J.; Basyrov, A.; Eskilt, J. R.; et al. Astronomy and Astrophysics (Berlin), 11/2023, Volume 679. Journal article, peer-reviewed, open access.
We present Cosmoglobe Data Release 1, which implements the first joint analysis of WMAP and Planck LFI time-ordered data, processed within a single Bayesian end-to-end framework. This framework builds directly on a similar analysis of the LFI measurements by the BeyondPlanck collaboration, and approaches the cosmic microwave background (CMB) analysis challenge through Gibbs sampling of a global posterior distribution, simultaneously accounting for calibration, mapmaking, and component separation. The computational cost of producing one complete WMAP+LFI Gibbs sample is 812 CPU-h, of which 603 CPU-h are spent on WMAP low-level processing; this demonstrates that end-to-end Bayesian analysis of the WMAP data is computationally feasible. We find that our WMAP posterior mean temperature sky maps and CMB temperature power spectrum are largely consistent with the official WMAP9 results. Perhaps the most notable difference is that our CMB dipole amplitude is 3366.2 ± 1.4 μK, which is 11 μK higher than the WMAP9 estimate and 2.5σ higher than BeyondPlanck; however, it is in perfect agreement with the HFI-dominated Planck PR4 result. In contrast, our WMAP polarization maps differ more notably from the WMAP9 results and in general exhibit significantly lower large-scale residuals, which we attribute to a better constrained gain and transmission imbalance model. It is particularly noteworthy that the W-band polarization sky map, which was excluded from the official WMAP cosmological analysis, for the first time appears visually consistent with the V-band sky map. Similarly, the long-standing discrepancy between the WMAP K-band and LFI 30 GHz maps is finally resolved, and the difference between the two maps appears consistent with instrumental noise at high Galactic latitudes. Relatedly, these updated maps allowed us for the first time to combine WMAP and LFI polarization data into a single coherent model of large-scale polarized synchrotron emission. Still, we identified a few issues that require additional work, including (1) low-level noise modeling; (2) large-scale temperature residuals at the 1-2 μK level; and (3) a strong degeneracy between the absolute K-band calibration and the dipole of the anomalous microwave emission component. We conclude that leveraging the complementary strengths of WMAP and LFI has allowed the mitigation of both experiments' weaknesses and has resulted in new state-of-the-art WMAP sky maps. All maps and the associated code are made publicly available through the Cosmoglobe web page.
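The Gibbs-sampling strategy described above alternates between conditional draws of calibration and sky parameters. A toy sketch (not the actual pipeline: a one-parameter gain g and a single sky amplitude s with data d = g·s + n, where a Gaussian prior on g, with illustrative values mu0 and tau, breaks the g·s degeneracy, loosely analogous to how external information breaks calibration degeneracies in the real analysis):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated data: d = g * s + noise
g_true, s_true, sigma = 1.07, 3.0, 0.5
n_obs = 2000
d = g_true * s_true + sigma * rng.normal(size=n_obs)

mu0, tau = 1.0, 0.1   # Gaussian prior on the gain (illustrative)
g, s = 1.0, 1.0
samples = []
for _ in range(3000):
    # Draw g from P(g | s, d): Gaussian conditional (likelihood x prior)
    prec_g = n_obs * s**2 / sigma**2 + 1.0 / tau**2
    mean_g = (s * d.sum() / sigma**2 + mu0 / tau**2) / prec_g
    g = mean_g + rng.normal() / np.sqrt(prec_g)
    # Draw s from P(s | g, d): Gaussian conditional (flat prior on s)
    prec_s = n_obs * g**2 / sigma**2
    mean_s = g * d.sum() / sigma**2 / prec_s
    s = mean_s + rng.normal() / np.sqrt(prec_s)
    samples.append(g * s)

est = float(np.mean(samples[500:]))  # posterior mean of the product g * s
```

The product g·s is tightly constrained by the data even though g and s are individually degenerate, which is the essence of why global sampling with external priors works.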
BeyondPlanck. Galloway, M.; Andersen, K. J.; Aurlien, R.; et al. Astronomy and Astrophysics (Berlin), 06/2023, Volume 675. Journal article, peer-reviewed, open access.
We describe the computational infrastructure for end-to-end Bayesian cosmic microwave background (CMB) analysis implemented by the BeyondPlanck Collaboration. The code is called Commander3. It provides a statistically consistent framework for global analysis of CMB and microwave observations and may be useful for a wide range of legacy, current, and future experiments. The paper has three main goals. Firstly, we provide a high-level overview of the existing code base, aiming to guide readers who wish to extend and adapt the code according to their own needs or re-implement it from scratch in a different programming language. Secondly, we discuss some critical computational challenges that arise within any global CMB analysis framework, for instance in-memory compression of time-ordered data, fast Fourier transform optimization, and parallelization and load-balancing. Thirdly, we quantify the CPU and RAM requirements for the current BeyondPlanck analysis, finding that a total of 1.5 TB of RAM is required for efficient analysis and that the total cost of a full Gibbs sample for LFI is 170 CPU-hrs, including both low-level processing and high-level component separation, which is well within the capabilities of current low-cost computing facilities. The existing code base is made publicly available under a GNU General Public License (GPL).
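One of the computational challenges listed above is in-memory compression of time-ordered data. A generic sketch of the underlying idea, delta-encoding a slowly varying digitized signal before a general-purpose entropy coder (not the actual Commander3 compression scheme):

```python
import zlib
import numpy as np

rng = np.random.default_rng(3)

# Toy digitized TOD: a slowly varying random walk of small integer steps
tod = np.cumsum(rng.integers(-2, 3, size=100_000)).astype(np.int32)

raw = tod.tobytes()                      # uncompressed int32 samples
delta = np.diff(tod, prepend=0)          # consecutive differences are tiny integers
packed = zlib.compress(delta.astype(np.int8).tobytes(), level=9)

# Lossless round trip: undo the delta encoding
restored = np.cumsum(
    np.frombuffer(zlib.decompress(packed), dtype=np.int8).astype(np.int32)
)
assert np.array_equal(restored, tod)     # compression is exactly reversible
ratio = len(raw) / len(packed)           # achieved compression ratio
```

Because the deltas occupy only a few distinct values, the entropy coder shrinks them far below the raw 4 bytes per sample, which is why such schemes make keeping full TOD in RAM feasible.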