Abstract: Less-educated workers have the lowest participation rates in job-related further training across the industrialized world, but the extent of their disadvantage varies. Using data on 28 high- and middle-income countries, we assess different explanations for less-educated workers’ training disadvantage relative to intermediate-educated workers, with a focus on the role of labor market allocation (i.e. job tasks, other job features and firm characteristics). Shapley decompositions reveal a broadly similar pattern for all countries: differences in labor market allocation between less- and intermediate-educated workers are more important for explaining the training gap than differences in individual learning disposition (i.e. cognitive skills and motivation to learn). Our analysis further suggests that the training gap is related to educational and labor market institutions and that labor market allocation processes play a key role in mediating any institutional ‘effects’. Strong conclusions regarding the role of institutions are hampered by the small country-level sample, however.
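The Shapley decomposition logic mentioned above can be illustrated with a toy calculation: each factor group's share of the explained training gap is its marginal contribution averaged over all orders in which the groups enter the model. The factor names and explained-gap values below are purely illustrative stand-ins, not the paper's estimates.

```python
from itertools import permutations

def shapley_shares(factors, explained):
    """Average each factor group's marginal contribution to the
    explained gap over all orderings of inclusion (Shapley values)."""
    shares = {f: 0.0 for f in factors}
    perms = list(permutations(factors))
    for order in perms:
        included = frozenset()
        for f in order:
            shares[f] += explained(included | {f}) - explained(included)
            included = included | {f}
    return {f: s / len(perms) for f, s in shares.items()}

# Stylized values: share of the training gap explained when each
# subset of factor groups enters the model (illustrative numbers).
gap_explained = {
    frozenset(): 0.0,
    frozenset({"allocation"}): 0.55,
    frozenset({"disposition"}): 0.30,
    frozenset({"allocation", "disposition"}): 0.70,
}
shares = shapley_shares(["allocation", "disposition"],
                        lambda s: gap_explained[frozenset(s)])
```

The two shares sum to the jointly explained gap (0.70 here), which is what makes the decomposition exact regardless of the correlation between the factor groups.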
We present MadDM v.3.0, a numerical tool to compute particle dark matter observables in generic new physics models. The new version features a comprehensive and automated framework for dark matter searches at the interface of collider physics, astrophysics and cosmology and is deployed as a plugin of the MadGraph5_aMC@NLO platform, inheriting most of its features. With respect to the previous version, MadDM v.3.0 can now provide predictions for indirect dark matter signatures in astrophysical environments, such as the annihilation cross section at the present time and the energy spectra of prompt photons, cosmic rays and neutrinos resulting from dark matter annihilation. MadDM indirect detection features support both 2→2 and 2→n dark matter annihilation processes. In addition, the ability to compare theoretical predictions with experimental constraints is extended by including the Fermi-LAT likelihood for gamma-ray constraints from dwarf spheroidal galaxies and by providing an interface with the nested sampling algorithm PyMultiNest to perform high dimensional parameter scans efficiently. We validate the code for a wide set of dark matter models by comparing the results from MadDM v.3.0 to existing tools and results in the literature.
SModelS is an automated tool enabling the fast interpretation of simplified model results from the LHC within any model of new physics respecting a Z2 symmetry. With version 1.2 we announce several new features. First, previous versions were restricted to missing energy signatures and assumed prompt decays within each decay chain. SModelS v1.2 considers the lifetime of each Z2-odd particle and appropriately takes into account missing energy, heavy stable charged particle and R-hadron signatures. Second, the current version allows for a combination of signal regions in efficiency map results whenever a covariance matrix is available from the experiment. This is an important step towards fully exploiting the constraining power of efficiency map results. Several other improvements increase the user-friendliness, such as the use of wildcards in the selection of experimental results and a faster database which can be given as a URL. Finally, smodelsTools provides an interactive plot maker to conveniently visualize the results of a model scan.
Program Title: SModelS
Program Files doi:http://dx.doi.org/10.17632/w4nft4459w.2
Licensing provisions: GPLv3
Programming language: Python3
Journal reference of previous version: Comput. Phys. Commun. 227 (2018) 72
Does the new version supersede the previous version?: Yes
Reasons for the new version: Addition of new features.
Summary of revisions: The most important new features in v1.2 are the combination of signal regions in efficiency map results whenever a covariance matrix is available from the experiment, and the implementation of heavy stable charged particle and R-hadron signatures. Moreover, the database of experimental results can now be given as a URL, and the pickling has been improved to make the database faster. Other improvements include that wildcards are allowed when selecting analyses, datasets or topologies, and that the path to the model file, formerly required to be smodels/sparticles.py, can be specified in the parameters card. For the convenience of the user, we also provide a tool to make interactive plots to visualize the results of a model scan. Finally, the whole code now also runs with Python3, which has become the recommended default, and it can now be installed in its source directory.
Nature of problem: The results of searches for new physics beyond the Standard Model (BSM) at the Large Hadron Collider are often communicated by the experimental collaborations in terms of constraints on so-called simplified model spectra (SMS). Understanding how SMS constraints impact a realistic new physics model, where possibly a multitude of production channels and decay modes are relevant, is a non-trivial task.
Solution method: We exploit the notion of simplified models to constrain full models by “decomposing” them into their SMS components. A database of SMS results, obtained from the official results of the ATLAS and CMS collaborations but in part also from ‘recasting’ the experimental analyses, can be matched against the decomposed model, resulting in a statement of the extent to which the model at hand agrees or conflicts with the experimental results. Further useful information, e.g. on the coverage of the model’s signatures, is also provided.
Additional comments including restrictions and unusual features: At present, only models with a Z2-like symmetry can be tested. Each SMS is defined purely by the vertex structure and the final-state particles; initial and intermediate BSM particles are described only by their masses, production cross sections, branching ratios and total widths. Possible differences in signal selection efficiencies arising, e.g., from different production mechanisms or from the spin of the BSM particles, are ignored in this approach. Since only part of the full model can be constrained by SMS results, SModelS will always remain more conservative (though orders of magnitude faster) than “full recasting” approaches.
[1] F. Ambrogi et al., “SModelS v1.1 user manual: Improving simplified model constraints with efficiency maps,” Comput. Phys. Commun. 227 (2018) 72, arXiv:1701.06586 [hep-ph].
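The decomposition-and-matching approach described under “Solution method” above can be sketched in miniature: encode each simplified-model element by its final states and mass vector, look it up in a database of upper limits, and compare the predicted signal weight against the limit. The data structures, topology encoding and limit values below are hypothetical simplifications for illustration; they do not reflect SModelS's actual internal representation or any real experimental limit.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Topology:
    """Toy SMS element: vertex structure and final states only;
    BSM particles enter solely through their masses here."""
    final_states: tuple   # e.g. ("jet", "jet")
    masses: tuple         # BSM masses along the decay chain, in GeV

# Hypothetical limit database: upper limit (pb) on the signal weight
# for a given topology shape (illustrative numbers, not real limits).
limits = {
    (("jet", "jet"), (600.0, 100.0)): 0.05,
}

def r_value(topo, signal_weight_pb):
    """r = predicted signal weight / upper limit; r > 1 flags
    tension with the corresponding search. None if untested."""
    limit = limits.get((topo.final_states, topo.masses))
    return None if limit is None else signal_weight_pb / limit

topo = Topology(("jet", "jet"), (600.0, 100.0))
print(round(r_value(topo, 0.12), 3))  # 0.12 / 0.05 -> 2.4, excluded in this toy
```

Topologies absent from the database return None, which mirrors the point made above that only part of a full model is constrained, making the approach conservative but fast.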
Simplified models for exotic BSM searches
Heisig, Jan; Lessa, Andre; Quertenmont, Loic
Journal of High Energy Physics, Volume 2015, Issue 12 (December 2015)
Journal article; peer reviewed; open access

Abstract
Simplified models are a successful way of interpreting current LHC searches for models beyond the standard model (BSM). So far, simplified models have focused on topologies featuring a missing transverse energy (MET) signature. However, in some BSM theories other, more exotic, signatures occur. If a charged particle becomes long-lived on collider time scales, as is the case in parts of the SUSY parameter space, it leads to a very distinct signature. We present an extension of the computer package SModelS which includes simplified models for heavy stable charged particles (HSCP). As a physical application we investigate the CMSSM stau co-annihilation strip containing long-lived staus, which presents a potential solution to the lithium problem. Applying both MET and HSCP constraints, we show that, for low values of tan β, all of this region of parameter space either violates dark matter constraints or is excluded by LHC searches.
We use PIAAC data to study the relationship between parental education and educational success among adults from 23 advanced economies. We consider educational success in terms of both educational attainment (formal qualifications) and educational achievement (competencies) and in both absolute and relative terms (i.e. as the individual’s rank in the distribution of educational success). Parental education effects are stronger for educational attainment than for achievement in all countries. Cross-national variation in the strength of social background effects follows broadly similar patterns for the different ways of measuring success, but a few countries combine relatively strong achievement with relatively weak attainment effects and vice versa. Tracking in secondary education is associated with stronger background effects for educational attainment but not for achievement. Greater prevalence of formal (non-formal) adult education and training (AET) is associated with stronger (weaker) background effects for both attainment and achievement, while vocational orientation of upper secondary education does not matter much.
Mixed-effects multilevel models are often used to investigate cross-level interactions, a specific type of context effect that may be understood as an upper-level variable moderating the association between a lower-level predictor and the outcome. We argue that multilevel models involving cross-level interactions should always include random slopes on the lower-level components of those interactions. Failure to do so will usually result in severely anti-conservative statistical inference. We illustrate the problem with extensive Monte Carlo simulations and examine its practical relevance by studying 30 prototypical cross-level interactions with European Social Survey data for 28 countries. In these empirical applications, introducing a random slope term reduces the absolute t-ratio of the cross-level interaction term by 31 per cent or more in three quarters of cases, with an average reduction of 42 per cent. Many practitioners seem to be unaware of these issues. Roughly half of the cross-level interaction estimates published in the European Sociological Review between 2011 and 2016 are based on models that omit the crucial random slope term. Detailed analysis of the associated test statistics suggests that many of the estimates would not reach conventional thresholds for statistical significance in correctly specified models that include the random slope. This raises the question of how much robust evidence of cross-level interactions sociology has actually produced over the past decades.
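The specification issue described above can be reproduced in a few lines with any mixed-model package. The sketch below uses Python's statsmodels (chosen for illustration, not the software used in the work summarized here); the variable names and simulation parameters are invented. The random-intercept-only model ignores country-level slope variation and therefore tends to understate the standard error of the cross-level interaction term.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulate 28 "countries" with a country-level moderator z and an
# individual-level predictor x whose slope truly varies by country.
rows = []
for c in range(28):
    z = rng.normal()
    slope = 0.5 + 0.3 * z + rng.normal(scale=0.4)  # residual slope variation
    for _ in range(100):
        x = rng.normal()
        rows.append({"country": c, "z": z, "x": x,
                     "y": 1.0 + slope * x + rng.normal()})
df = pd.DataFrame(rows)

# Random intercept only: omits the slope variance component.
m_ri = smf.mixedlm("y ~ x * z", df, groups="country").fit()

# Adds a random slope on x (re_formula="~x"): the specification
# argued for above when testing the cross-level interaction x:z.
m_rs = smf.mixedlm("y ~ x * z", df, groups="country",
                   re_formula="~x").fit()

# The interaction's standard error is typically larger, and its
# t-ratio smaller, once the random slope is modeled.
print(m_ri.bse["x:z"], m_rs.bse["x:z"])
```

Comparing the two standard errors for "x:z" reproduces the t-ratio deflation discussed in the abstract whenever genuine slope heterogeneity is present in the data.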
In aging societies, more people become vulnerable to experiencing cognitive decline. Simultaneously, the role of grandparenthood is central for older adults and their families. Our study investigates inequalities in the level and trajectories of cognitive functioning among older adults, focusing on possible intersectional effects of social determinants and grandparenthood as a life course transition that may contribute to delaying cognitive decline.
Using longitudinal data from the Survey of Health, Ageing and Retirement in Europe, we analyzed a sample of 19,953 individuals aged 50-85 without grandchildren at baseline. We applied Multilevel Analysis of Individual Heterogeneity and Discriminatory Accuracy (MAIHDA) to investigate variation in cognitive functioning across 48 intersectional strata, defined by sex/gender, migration, education, and occupation. We allowed the impact of becoming a grandparent on cognitive functioning trajectories to vary across strata by including random slopes.
Intersectional strata accounted for 17.43% of the overall variance in cognitive functioning, with most of the stratum-level variation explained by additive effects of the stratum-defining characteristics. Transition to grandparenthood was associated with higher cognitive functioning, showing a stronger effect for women. Stratum-level variation in the grandparenthood effect was modest, especially after accounting for interactions between grandparenthood and the stratum-defining variables.
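The stratum-level share of variance reported above is a variance partition coefficient (VPC): the between-stratum variance divided by total variance. As a minimal sketch, with hypothetical variance components chosen only to reproduce the reported 17.43%:

```python
def vpc(between_var, within_var):
    """Variance partition coefficient: share of the total outcome
    variance lying between intersectional strata."""
    return between_var / (between_var + within_var)

# Illustrative components on a standardized total-variance-1 scale.
share = vpc(0.1743, 0.8257)
print(round(100 * share, 2))  # 17.43
```

In the MAIHDA framework, comparing this VPC before and after adding the stratum-defining main effects shows how much of the between-stratum variation is additive rather than interactional, which is the comparison the result above reports.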
This study highlights the importance of social determinants for understanding heterogeneities in older adults' level of cognitive functioning and its association with the transition to grandparenthood. Cumulative disadvantages negatively affect cognitive functioning; adopting an intersectional lens is therefore useful for decomposing inequalities and deriving tailored interventions to promote equal healthy aging.