Summary
Determinants of trabecular bone score (TBS) and vertebral fractures assessed semiquantitatively (SQ1–SQ3) were studied in 496 women with fragility fractures. TBS was associated with age, parental hip fracture, alcohol intake, and BMD, but not with SQ1–SQ3 fractures. SQ1–SQ3 fractures were associated with age, prior fractures, and lumbar spine BMD, but not with TBS.
Introduction
Trabecular bone score (TBS) and vertebral fractures assessed by semiquantitative method (SQ1–SQ3) seem to reflect different aspects of bone strength. We therefore sought to explore the determinants of and the associations between TBS and SQ1–SQ3 fractures.
Methods
This cross-sectional sub-study of the Norwegian Capture the Fracture Initiative included 496 women aged ≥ 50 years with fragility fractures. All responded to a questionnaire about risk factors for fracture, had bone mineral density (BMD) of femoral neck and/or lumbar spine assessed, TBS calculated, and 423 had SQ1–SQ3 fracture assessed.
Results
Mean (SD) age was 65.6 (8.6) years, mean TBS was 1.27 (0.10), and 33.3% exhibited SQ1–SQ3 fractures. In multivariable analysis, higher age (β per SD = −0.26, 95% CI: −0.36, −0.15), parental hip fracture (β = −0.29, 95% CI: −0.54, −0.05), and daily alcohol intake (β = −0.43, 95% CI: −0.79, −0.08) were associated with lower TBS. Higher BMD of the femoral neck (β per SD = 0.34, 95% CI: 0.25–0.43) and lumbar spine (β per SD = 0.40, 95% CI: 0.31–0.48) was associated with higher TBS. In multivariable logistic regression analyses, age (OR per SD = 1.94, 95% CI: 1.51–2.46) and prior fragility fractures (OR = 1.71, 95% CI: 1.09–2.71) were positively associated with SQ1–SQ3 fractures, while lumbar spine BMD (OR per SD = 0.75, 95% CI: 0.60–0.95) was negatively associated with SQ1–SQ3 fractures. No association between TBS and SQ1–SQ3 fractures was found.
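For readers unfamiliar with the "per SD" notation in the results above: a logistic-regression coefficient estimated per unit of a predictor is converted to an odds ratio per standard deviation by scaling before exponentiating. A minimal sketch; the 0.077-per-year coefficient below is a hypothetical value back-computed for illustration, not a number reported by the study:

```python
import math

def or_per_sd(coef_per_unit, sd):
    """Convert a logistic-regression coefficient (log-odds per unit of the
    predictor) into an odds ratio per one standard deviation of the predictor."""
    return math.exp(coef_per_unit * sd)

# Hypothetical illustration: a log-odds coefficient of 0.077 per year of age,
# combined with the study's age SD of 8.6 years, reproduces an OR per SD
# close to the quoted 1.94 for age.
example_or = or_per_sd(0.077, 8.6)
```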
Conclusion
Since TBS and SQ1–SQ3 fractures were not associated, they may act as independent risk factors, justifying the use of both in post-fracture risk assessment.
COSMOGLOBE DR1 results
Watts, D. J.; Basyrov, A.; Eskilt, J. R. ...
Astronomy and astrophysics (Berlin), 11/2023, Volume 679
Journal Article
Peer reviewed
Open access
We present COSMOGLOBE Data Release 1, which implements the first joint analysis of WMAP and Planck LFI time-ordered data, processed within a single Bayesian end-to-end framework. This framework directly builds on a similar analysis of the LFI measurements by the BEYONDPLANCK collaboration, and approaches the cosmic microwave background (CMB) analysis challenge through Gibbs sampling of a global posterior distribution, simultaneously accounting for calibration, mapmaking, and component separation. The computational cost of producing one complete WMAP+LFI Gibbs sample is 812 CPU-h, of which 603 CPU-h are spent on WMAP low-level processing; this demonstrates that end-to-end Bayesian analysis of the WMAP data is computationally feasible. We find that our WMAP posterior mean temperature sky maps and CMB temperature power spectrum are largely consistent with the official WMAP9 results. Perhaps the most notable difference is that our CMB dipole amplitude is 3366.2 ± 1.4 μK, which is 11 μK higher than the WMAP9 estimate and 2.5σ higher than BEYONDPLANCK; however, it is in perfect agreement with the HFI-dominated Planck PR4 result. In contrast, our WMAP polarization maps differ more notably from the WMAP9 results, and in general exhibit significantly lower large-scale residuals. We attribute this to a better constrained gain and transmission imbalance model. It is particularly noteworthy that the W-band polarization sky map, which was excluded from the official WMAP cosmological analysis, for the first time appears visually consistent with the V-band sky map. Similarly, the long-standing discrepancy between the WMAP K-band and LFI 30 GHz maps is finally resolved, and the difference between the two maps appears consistent with instrumental noise at high Galactic latitudes. Relatedly, these updated maps allowed us for the first time to combine WMAP and LFI polarization data into a single coherent model of large-scale polarized synchrotron emission. Still, we identified a few issues that require additional work, including (1) low-level noise modeling; (2) large-scale temperature residuals at the 1–2 μK level; and (3) a strong degeneracy between the absolute K-band calibration and the dipole of the anomalous microwave emission component. We conclude that leveraging the complementary strengths of WMAP and LFI has allowed the mitigation of both experiments' weaknesses, and resulted in new state-of-the-art WMAP sky maps. All maps and the associated code are made publicly available through the COSMOGLOBE web page.
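The "Gibbs sampling of a global posterior distribution" described in this abstract can be illustrated with a toy example: each parameter block is drawn in turn from its conditional distribution given the current values of all the others. This is a minimal sketch of that alternating-conditional idea only, using a bivariate Gaussian in place of the actual calibration, mapmaking, and component-separation conditionals; nothing here reflects the real Commander/COSMOGLOBE implementation:

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples, seed=0):
    """Toy Gibbs sampler for a zero-mean, unit-variance bivariate normal
    with correlation rho. Each sweep draws x from p(x | y), then y from
    p(y | x); the chain's stationary distribution is the joint posterior,
    which is the same mechanism (in spirit only) as sampling calibration,
    sky maps, and foreground parameters in turn."""
    rng = random.Random(seed)
    x = y = 0.0
    cond_sd = math.sqrt(1.0 - rho * rho)  # std dev of x|y and of y|x
    samples = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, cond_sd)   # x | y ~ N(rho * y, 1 - rho^2)
        y = rng.gauss(rho * x, cond_sd)   # y | x ~ N(rho * x, 1 - rho^2)
        samples.append((x, y))
    return samples
```

Strongly correlated blocks (rho near 1) produce slowly mixing chains, which is why the 812 CPU-h quoted above buys one sample of the chain, not one independent sample.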
COSMOGLOBE DR1 results
Eskilt, J. R.; Watts, D. J.; Aurlien, R. ...
Astronomy and astrophysics (Berlin), 11/2023, Volume 679
Journal Article
Peer reviewed
Open access
Cosmic birefringence is a parity-violating effect that might have rotated the plane of the linearly polarized light of the cosmic microwave background (CMB) by an angle β since its emission. This angle has recently been measured to be nonzero at a statistical significance of 3.6σ in the official Planck PR4 and 9-year WMAP data. In this work, we constrain β using the reprocessed BEYONDPLANCK LFI and COSMOGLOBE DR1 WMAP polarization maps. These novel maps have both lower systematic residuals and a more complete error description than the corresponding official products. Foreground EB correlations could bias measurements of β, and while thermal dust EB emission has been argued to be statistically nonzero, no evidence for synchrotron EB power has been reported. Unlike the dust-dominated Planck HFI maps, the majority of the LFI and WMAP polarization maps are instead dominated by synchrotron emission. Simultaneously constraining β and the polarization miscalibration angle, α, of each channel, we find a best-fit value of β = 0.35° ± 0.70° with LFI and WMAP data only. When including the Planck HFI PR4 maps, but fitting β separately for dust-dominated channels, β_{>70 GHz}, and synchrotron-dominated channels, β_{≤70 GHz}, we find β_{≤70 GHz} = 0.53° ± 0.28°. This differs from zero with a statistical significance of 1.9σ, and the main contribution to this value comes from the LFI 70 GHz channel. While the statistical significances of these results are low on their own, the measurement derived from the LFI and WMAP synchrotron-dominated maps agrees with the previously reported HFI-dominated constraints, despite the very different astrophysical and instrumental systematics involved in all these experiments.
We implement support for a cosmological parameter estimation algorithm in Commander and quantify its computational efficiency and cost. For a semi-realistic simulation similar to Planck LFI 70 GHz, we find that the computational cost of producing one single sample is about 20 CPU-hours and that the typical Markov chain correlation length is ∼100 samples. The net effective cost per independent sample is ∼2000 CPU-hours, in comparison with all low-level processing costs of 812 CPU-hours for Planck LFI and WMAP in COSMOGLOBE Data Release 1. Thus, although technically possible to run already in its current state, future work should aim to reduce the effective cost per independent sample by one order of magnitude to avoid excessive runtimes, for instance through multi-grid preconditioners and/or derivative-based Markov chain sampling schemes. This work demonstrates the computational feasibility of true Bayesian cosmological parameter estimation with end-to-end error propagation for high-precision CMB experiments without likelihood approximations, but it also highlights the need for additional optimizations before it is ready for full production-level analysis.
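The quoted effective cost follows from simple arithmetic on the two figures the abstract reports, reproduced here as a sanity check:

```python
# Effective cost per independent sample, from the figures in the abstract:
cost_per_sample_cpu_h = 20    # one Gibbs sample (semi-realistic LFI 70 GHz sim)
correlation_length = 100      # typical Markov chain correlation length, in samples

# Correlated samples carry less information; roughly one independent sample
# per correlation length, so the effective cost is the product:
effective_cost_cpu_h = cost_per_sample_cpu_h * correlation_length

# The order-of-magnitude reduction the abstract calls for would bring this to:
target_cost_cpu_h = effective_cost_cpu_h / 10
```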
Objectives
The aim of this RCT was to evaluate whether the added use of a decision board (DB) during shared decision-making improves patients' knowledge of different treatment options and overall satisfaction with the consultation.
Methods
Forty-nine undergraduate students were trained in shared decision-making (SDM) and evaluated by an Objective Structured Clinical Examination (OSCE). According to their test results, all participants were randomly allocated to either the test group (DB) or the control group (Non-DB). Both groups performed SDM with patients showing a defect in a posterior tooth (Class II defect). Prior to the interview, patients of the DB group were given the decision aid for review. In the Non-DB group, patients were consulted without additional aids. After the treatment decision, a questionnaire was completed by all patients to measure knowledge (costs, survival rate, characteristics, and treatment time) and overall satisfaction with the consultation. Fifty DB patients and 31 Non-DB patients completed the questionnaire.
Results
DB patients (n = 50) demonstrated a statistically significant increase in knowledge compared to the Non‐DB group (n = 31) (Mann–Whitney U‐test; DB group = 10.04; Non‐DB group = 4.16; P = 0.004). There was no significant difference between groups regarding satisfaction with the consultation (t‐test; P > 0.05).
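The Mann–Whitney U statistic used for the knowledge comparison can be sketched in a few lines. This is the textbook pairwise definition of U (ties count one half), not the study's actual analysis code, and it omits the normal approximation needed to obtain a P value:

```python
def mann_whitney_u(group_a, group_b):
    """Mann-Whitney U statistic for group_a: the number of (a, b) pairs
    in which a exceeds b, with tied pairs counted as 0.5. Ranges from
    0 (every a below every b) to len(a) * len(b) (every a above every b)."""
    u = 0.0
    for a in group_a:
        for b in group_b:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u
```

Because U depends only on the ordering of scores, the test makes no normality assumption about the knowledge scores, which is a common reason to prefer it for questionnaire data.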
Conclusions
During the shared decision‐making process, the use of a decision board yielding information about Class‐II treatment options leads to a significantly higher patient knowledge compared to knowledge gained through consultation alone. It is therefore desirable to provide DBs for dental diagnoses with several treatment options to increase transparency for the patient.
BEYONDPLANCK
Andersen, K. J.; Herman, D.; Aurlien, R. ...
Astronomy and astrophysics (Berlin), 06/2023, Volume 675
Journal Article
Peer reviewed
Open access
We present the intensity foreground algorithms and model employed within the BEYONDPLANCK analysis framework. The BEYONDPLANCK analysis is aimed at integrating component separation and instrumental parameter sampling within a global framework, leading to complete end-to-end error propagation in the Planck Low Frequency Instrument (LFI) data analysis. Given the scope of the BEYONDPLANCK analysis, a limited set of data is included in the component separation process, leading to foreground parameter degeneracies. In order to properly constrain the Galactic foreground parameters, we improve upon the previous Commander component separation implementation by adding a suite of algorithmic techniques. These algorithms are designed to improve the stability and computational efficiency for weakly constrained posterior distributions. These are: (1) joint foreground spectral parameter and amplitude sampling, building on ideas from MIRAMARE; (2) component-based monopole determination; (3) joint spectral parameter and monopole sampling; and (4) application of informative spatial priors for component amplitude maps. We find that the only spectral parameter with a significant signal-to-noise ratio using the current BEYONDPLANCK data set is the peak frequency of the anomalous microwave emission component, for which we find ν_p = 25.3 ± 0.5 GHz; all others must be constrained through external priors. Future works will be aimed at integrating many more data sets into this analysis, both map and time-ordered based, thereby gradually eliminating the currently observed degeneracies in a controlled manner with respect to both instrumental systematic effects and astrophysical degeneracies. When this happens, the simple LFI-oriented data model employed in the current work will need to be generalized to account for both a richer astrophysical model and additional instrumental effects. This work will be organized within the Open Science-based COSMOGLOBE community effort.
We use data from two satellites and a terrestrial carbon model to quantify the impact of urbanization on the carbon cycle and food production in the US as a result of reduced net primary productivity (NPP). Our results show that urbanization is taking place on the most fertile lands and hence has a disproportionately large overall negative impact on NPP. Urban land transformation in the US has reduced the amount of carbon fixed through photosynthesis by 0.04 Pg per year, or 1.6% of the pre-urban input. The reduction is enough to offset the 1.8% gain made by the conversion of land to agricultural use, even though urbanization covers less than 3% of the US land surface while agricultural lands approach 29% of the total land area. At local and regional scales, urbanization increases NPP in resource-limited regions, and through localized warming ("urban heat") it contributes to the extension of the growing season in cold regions. In terms of biologically available energy, the loss of NPP due to urbanization of agricultural lands alone is equivalent to the caloric requirement of 16.5 million people, or about 6% of the US population.
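The figures quoted in this abstract can be sanity-checked with back-of-the-envelope arithmetic. The implied values below (a 275-million population and a 2.5 Pg/yr pre-urban carbon input) are derived for illustration, not numbers stated in the abstract:

```python
# Check 1: 16.5 million people described as "about 6% of the US population"
people_equivalent = 16.5e6
share_of_population = 0.06
implied_us_population = people_equivalent / share_of_population
# Roughly 275 million, consistent with the US population around 2000.

# Check 2: a 0.04 Pg/yr loss described as "1.6% of the pre-urban input"
npp_loss_pg_per_yr = 0.04
loss_fraction = 0.016
implied_pre_urban_npp = npp_loss_pg_per_yr / loss_fraction
# Roughly 2.5 Pg of carbon fixed per year before urbanization.
```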
This paper provides an alternative behavioral foundation for an investor's use of power utility in the objective function and its particular risk aversion parameter. The foundation is grounded in an investor's desire to minimize the objective probability that the growth rate of invested wealth will not exceed an investor-selected target growth rate. Large deviations theory is used to show that this is equivalent to using power utility, with an argument that depends on the investor's target, and a risk aversion parameter determined by maximization. As a result, an investor's risk aversion parameter is not independent of the investment opportunity set, contrary to the standard model assumption.
Data from two different satellites, a digital land cover map, and digital census data were analyzed and combined in a geographic information system to study the effect of urbanization on photosynthetic productivity in the United States. Results show that urbanization can have a measurable but variable impact on the primary productivity of the land surface. The annual productive period can be shortened by as much as 20 days in some areas, but in resource-limited regions, photosynthetic production can be enhanced by human activity. Overall, urban development reduces the productivity of the land surface, and those areas with the highest productivity are directly in the path of urban sprawl.