To understand and approach the spread of the SARS-CoV-2 epidemic, machine learning offers fundamental tools. This study presents the use of machine learning techniques for projecting COVID-19 infections and deaths in Mexico. The research has three main objectives: first, to identify which function best fits the growth of the infected population in Mexico; second, to determine the feature importance of climate and mobility variables; third, to compare the results of a traditional time series statistical model with a modern machine learning approach. The motivation for this work is to support health care providers in their preparation and planning. The methods compared are linear, polynomial, and generalized logistic regression models to describe the growth of COVID-19 cases in Mexico. Additionally, machine learning and time series techniques are used to identify feature importance and to forecast daily cases and fatalities. The study uses publicly available data sets from Johns Hopkins University of Medicine in conjunction with mobility rates obtained from Google's Mobility Reports and climate variables acquired from the Weather Online API. The results suggest that the logistic growth model best fits the pandemic's behavior, that climate and mobility variables are sufficiently correlated with the disease numbers, and that a long short-term memory (LSTM) network can be exploited for predicting daily cases. Given this, we propose a model to predict daily cases and fatalities for SARS-CoV-2 using time series data, mobility, and weather variables.
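As a minimal sketch of the kind of model this abstract describes (not the authors' implementation), the following Python snippet trains an LSTM on sliding windows of daily case counts augmented with mobility and weather features. The file name and the column names `daily_cases`, `mobility`, and `temperature` are hypothetical placeholders, and TensorFlow/Keras is assumed as the framework.

```python
# Minimal LSTM forecasting sketch; assumes a merged CSV with hypothetical
# columns daily_cases, mobility, temperature (not the paper's actual dataset).
import numpy as np
import pandas as pd
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

def make_windows(features: np.ndarray, target: np.ndarray, lookback: int = 14):
    """Turn a multivariate series into (samples, lookback, n_features) windows."""
    X, y = [], []
    for i in range(lookback, len(features)):
        X.append(features[i - lookback:i])
        y.append(target[i])
    return np.array(X), np.array(y)

df = pd.read_csv("mexico_covid.csv")  # hypothetical merged time series
cols = ["daily_cases", "mobility", "temperature"]  # hypothetical feature names
features = df[cols].to_numpy(dtype="float32")
X, y = make_windows(features, df["daily_cases"].to_numpy(dtype="float32"))

model = Sequential([
    LSTM(64, input_shape=(X.shape[1], X.shape[2])),
    Dense(1),  # next-day case count
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=50, batch_size=32, validation_split=0.2)
```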
Patagonia is an understudied area, especially when it comes to population genomic studies with relevance to fishery management. However, the dynamic and heterogeneous landscape of this area can harbor an important but cryptic genetic population structure. Once such information is revealed, it can be integrated into the management of infrequently investigated species. Eleginops maclovinus is a protandrous hermaphrodite species of economic importance to local communities that is currently managed as a single genetic unit. In this study, we sampled five locations distributed across a salinity cline in Northern Patagonia to investigate the genetic population structure of E. maclovinus. We used restriction site-associated DNA (RAD) sequencing and outlier tests based on FST and genotype-environment association (GEA) approaches to obtain neutral and adaptive loci. We identified a spatial pattern of structure with gene flow and spatial selection by environmental association. Neutral and adaptive loci revealed two and three genetic groups, respectively. The estimated effective population sizes ranged from 572 (Chepu) to 14,454 (Chaitén) and were influenced more by locality than by the salinity cline. We found loci putatively associated with salinity, suggesting that salinity may act as a selective driver in E. maclovinus populations. These results suggest a complex interaction between genetic drift, gene flow, and natural selection in this area. Our findings also point to several evolutionarily significant units in this area, and this information should be integrated into the management of the species. We discuss the significance of these results for fishery management and suggest future directions to improve our understanding of how E. maclovinus has adapted to the dynamic waters of Northern Patagonia.
•Patagonia is an understudied area, especially when it comes to population genomic studies with relevance to fishery management.
•Using RADseq, we found that neutral and adaptive loci reveal fine-scale population structure in Eleginops maclovinus.
•We found loci putatively associated with salinity, suggesting that it may act as a selective driver in E. maclovinus populations.
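To make the FST step concrete, here is a small didactic Python sketch of Hudson's FST estimator (the form recommended by Bhatia et al. 2013) for a single biallelic locus in two populations. The allele frequencies and sample sizes below are illustrative only; the study's actual RADseq outlier pipeline over many loci is not reproduced here.

```python
# Hudson's F_ST estimator for one biallelic locus in two populations.
# Didactic sketch; real outlier scans apply this genome-wide with corrections.

def hudson_fst(p1: float, n1: int, p2: float, n2: int) -> float:
    """F_ST from allele frequencies p1, p2 and sample sizes n1, n2 (Bhatia et al. 2013)."""
    num = (p1 - p2) ** 2 - p1 * (1 - p1) / (n1 - 1) - p2 * (1 - p2) / (n2 - 1)
    den = p1 * (1 - p2) + p2 * (1 - p1)
    return num / den

# Illustrative values: a locus differentiated between two sampling sites.
print(hudson_fst(p1=0.15, n1=40, p2=0.55, n2=40))  # ≈ 0.28
```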
Background
This paper explores machine learning algorithms and approaches for predicting alum income, with the aim of identifying the strongest predictors and characterizing a class of 'high' earners.
Methods
It examines the alum sample data obtained from a survey at a multicampus Mexican private university. The survey yielded 17,898 observations before cleaning and pre-processing and 12,275 afterwards. The dataset comprises income values and a large set of independent demographic attributes of former students. We conduct an in-depth analysis to determine whether the accuracy of traditional algorithms can be improved with a data science approach. Furthermore, we present insights on patterns obtained using explainable artificial intelligence techniques.
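As a hedged sketch of this kind of comparison (not the paper's exact pipeline), the following Python snippet contrasts a parametric baseline with a gradient-boosted ensemble on a binary 'high earner' label using cross-validation and a paired test. The synthetic data stands in for the cleaned survey features; all names below are placeholders.

```python
# Sketch: comparing a parametric baseline with an ensemble model via cross-validation.
# X, y stand in for the pre-processed survey features and a binary "high earner" label.
from scipy import stats
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)  # placeholder data

baseline = LogisticRegression(max_iter=1000)
ensemble = GradientBoostingClassifier(random_state=0)

acc_base = cross_val_score(baseline, X, y, cv=10, scoring="accuracy")
acc_ens = cross_val_score(ensemble, X, y, cv=10, scoring="accuracy")

# Paired t-test over the 10 shared folds (a common, if approximate, comparison).
t, p = stats.ttest_rel(acc_ens, acc_base)
print(f"ensemble {acc_ens.mean():.3f} vs baseline {acc_base.mean():.3f}, p={p:.4f}")
```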
Results
Results show that the machine learning models outperformed the parametric models of linear and logistic regression in predicting the alum's current income, with statistically significant results (p < 0.05) in three different tasks. Moreover, the latter methods were found to be the most accurate in predicting the alum's first income after graduation.
Conclusion
We identified that age, gender, working hours per week, first income and variables related to the alum’s job position and firm contributed to explaining their current income. Findings indicated a gender wage gap, suggesting that further work is needed to enable equality.
•The Glulam beams increased their load capacity by up to 20 % with a section composed of 11.36 % UHPFRC.
•The Glulam beams are less expensive, costing up to 56.16 % less than concrete beams.
•The Glulam beams have less environmental impact than concrete beams, with a reduction of up to 41.90 kgCO2e.
Timber-concrete structures have been developed as sustainable solutions to current environmental challenges. This paper presents a numerical method for analysing the mechanical behaviour of glued laminated timber (Glulam) beams reinforced with ultra-high-performance fibre-reinforced concrete (UHPFRC). For this purpose, three series of the proposed Glulam-UHPFRC (HP-Glulam) beams, with different slenderness ratios, reinforcement percentages, and reinforcement positions, were simulated under 4-point flexural tests in the finite element software Ansys®. An orthotropic linear elastic model was assigned to the Glulam, whilst the tensile plastic behaviour of the UHPFRC was modelled with the Cast Iron Model, which is based on grey cast iron. The numerical model results were validated against experimental tests from the literature. The Glulam beams increased their load capacity by up to 20 % with a cross-section composed of 11.36 % UHPFRC. Series C, reinforced with a layer of UHPFRC at the top, showed the greatest increase in flexural properties; therefore, the slender beams (span/depth = 13.16) were taken for a comparative analysis with concrete and steel beams. With the goal of providing a tool for building designers, the costs and the embodied energy generated during the production of the materials were also estimated. Results show that HP-Glulam beams are a potential substitute for frequently used concrete beams, with a production cost up to 56.16 % lower and up to 41.90 kgCO2e less embodied carbon.
The goal for CMS computing is to maximise the throughput of simulated event generation while also processing event data generated by the detector as quickly and reliably as possible. To sustain this as the quantity of events increases, CMS computing has migrated at the Tier-1 level from its old production framework, ProdAgent, to a new one, WMAgent. The WMAgent framework offers improved processing efficiency and better resource utilisation, as well as a reduction in operational manpower. In addition to the challenges encountered during the design of the WMAgent framework, several operational issues arose during its commissioning. The largest operational challenges were in the usage and monitoring of resources, mainly as a result of a change in the way work is allocated. Instead of work being assigned to operators, all work is centrally injected and managed in the Request Manager system, and the task of the operators has changed from running individual workflows to monitoring the global workload. In this report we present how we tackled some of the operational challenges, and how we benefitted from the lessons learned in the commissioning of the WMAgent framework at the Tier-2 level in late 2011. As case studies, we show how the WMAgent system performed during some of the large data reprocessing and Monte Carlo simulation campaigns.
File-based data flow in the CMS Filter Farm. Andre, J-M; Andronidis, A; Bawej, T; et al. Journal of Physics: Conference Series, 12/2015, Volume 664, Issue 8.
During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms, and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small "documents" in JSON encoding, either by services in the flow of the HLT execution (for rates etc.) or by watchdog processes. These "files" can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for LHC Run 2.
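To illustrate the kind of small JSON bookkeeping "document" the abstract mentions, the Python sketch below emits one such record. All field names and values here are invented for illustration; the actual CMS DAQ schema is not described in this abstract.

```python
# Hypothetical example of a small JSON bookkeeping "document"; every field name
# below is invented, not the actual CMS DAQ schema.
import json

doc = {
    "run": 123456,           # hypothetical run number
    "lumisection": 42,
    "process": "hlt",
    "events_in": 1800,
    "events_accepted": 35,
    "output_file": "run123456_ls0042_streamA.dat",
}

# Written to disk so a watchdog/aggregator in another part of the system can
# pick it up; such documents may also stay memory-resident if used locally.
with open("run123456_ls0042_streamA.jsn", "w") as f:
    json.dump(doc, f)
```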
The CMS Collaboration relies on seven globally distributed Tier-1 computing centres located at large universities and national laboratories for a second custodial copy of the CMS RAW data and the primary copy of the simulated data, for data-serving capacity to Tier-2 centres for analysis, and for the bulk of the reprocessing and event-selection capacity in the experiment. The Tier-1 sites have a challenging role in CMS because they are expected to ingest and archive data from both CERN and regional Tier-2 centres, while exporting data to a global mesh of Tier-2s at rates comparable to the raw export data rate from CERN. The combined capacity of the Tier-1 centres is more than twice the resources located at CERN, and efficiently utilizing this large distributed resource represents a challenge. In this article we discuss the experience of building, operating, and utilizing the CMS Tier-1 computing centres. We summarize the facility challenges at the Tier-1s, including the stable operation of CMS services, the ability to scale to large numbers of processing requests and large volumes of data, and the ability to provide custodial storage and high-performance data serving. We also present the operational experience of utilizing the distributed Tier-1 centres from a distance: transferring data, submitting data-serving requests, and submitting batch processing requests.
Differential rates in the decay Bs0 → J/ψϕ, with ϕ → K+K− and J/ψ → μ+μ−, are sensitive to the CP-violation phase βs = arg((−VtsVtb*)/(VcsVcb*)), predicted to be very small in the standard model. The analysis of Bs0 → J/ψϕ decays is also suitable for measuring the Bs0 lifetime, the decay width difference ΔΓs between the Bs0 mass eigenstates, and the Bs0 oscillation frequency Δm, even if appreciable CP violation does not occur. In this paper we present normalized probability densities useful in maximum likelihood fits, extended to allow for S-wave K+K− contributions on the one hand and for direct CP violation on the other. Our treatment of the S-wave contributions includes the strong variation of the S-wave/P-wave amplitude ratio with m(K+K−) across the ϕ resonance, which was not considered in previous work. We include a scheme for re-normalizing the probability densities after detector sculpting of the angular distributions of the final-state particles, and conclude with an examination of the symmetries of the rate formulae, with and without an S-wave K+K− contribution. All results are obtained with the use of a new compact formalism describing the differential decay rate of Bs0 mesons into J/ψϕ final states.
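For readability, the CP-violation phase quoted inline above can be typeset explicitly; this simply restates the abstract's own definition in LaTeX:

```latex
\beta_s \;=\; \arg\!\left( -\,\frac{V_{ts}\, V_{tb}^{*}}{V_{cs}\, V_{cb}^{*}} \right)
```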
The efficiency of the Data Acquisition (DAQ) system of the Compact Muon Solenoid (CMS) experiment for LHC Run 2 is constantly being improved. A significant factor affecting the data-taking efficiency is the experience of the DAQ operator. One of the main responsibilities of the DAQ operator is to carry out the proper recovery procedure in case of a failure of data-taking. At the start of Run 2, understanding the problem and finding the right remedy could take a considerable amount of time (up to many minutes). Operators relied heavily on the support of on-call experts, also outside working hours. Wrong decisions made under time pressure sometimes led to additional overhead in recovery time. To increase the efficiency of CMS data-taking, we developed a new expert system, DAQExpert, which provides shifters with optimal recovery suggestions instantly when a failure occurs. DAQExpert is a web application that analyses frequently updated monitoring data from all DAQ components and identifies problems based on expert knowledge expressed in small, independent logic modules written in Java. Its results are presented in real time in the control room via a web-based GUI and a sound system, in the form of a short description of the current failure and the steps to recover.
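The production logic modules are written in Java, per the abstract; purely as a language-agnostic illustration of the idea (in Python, for consistency with the other sketches here), a logic module is a small rule that maps a monitoring snapshot to a recovery suggestion. The condition, field names, and message below are all invented.

```python
# Toy illustration of an expert-system "logic module": a small, independent rule
# that maps a monitoring snapshot to a recovery suggestion. The fields and the
# condition are invented; the real DAQExpert modules are written in Java.
from dataclasses import dataclass

@dataclass
class Snapshot:
    """A simplified view of frequently updated DAQ monitoring data."""
    run_ongoing: bool
    event_rate_hz: float
    backpressured_feds: list[str]

class FedBackpressureModule:
    name = "FED backpressure stops data flow"

    def satisfied(self, s: Snapshot) -> bool:
        # Fire when the run is ongoing but the rate is zero and a FED is backpressured.
        return s.run_ongoing and s.event_rate_hz == 0.0 and bool(s.backpressured_feds)

    def recovery(self, s: Snapshot) -> str:
        feds = ", ".join(s.backpressured_feds)
        return f"Stop the run, resync FED(s) {feds}, then restart data-taking."

snap = Snapshot(run_ongoing=True, event_rate_hz=0.0, backpressured_feds=["FED-620"])
module = FedBackpressureModule()
if module.satisfied(snap):
    print(module.name, "->", module.recovery(snap))
```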