Antibody testing is crucial for monitoring the evolution of the pandemic, providing a more complete picture of the total number of people infected with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) than molecular diagnostic testing alone.1 All individuals with SARS-CoV-2-specific antibodies have been exposed to the virus, so antibody testing can highlight differences in past exposure between regions, demographic groups, and occupations.2 Seroprevalence estimates can also be used to estimate the infection fatality rate.3 Dashboards that visualise COVID-19 cases confirmed by diagnostic testing have been pivotal in enabling policy makers and researchers to monitor the pandemic.4 Yet, despite the value of antibody testing, there is no unified resource for seroprevalence estimates. SeroTracker integrates evidence from serosurveillance studies through a live systematic review.5 Each day, a team of doctoral and medical students reviews published articles (MEDLINE, Embase, Web of Science, and Cochrane), preprints (medRxiv and bioRxiv), government reports, and news articles for newly reported SARS-CoV-2 seroprevalence estimates. On the SeroTracker dashboard, users can filter data by geography, study characteristics (source type, study status, overall risk of bias), population demographics (age, sex, general population, health-care workers), and test information (test type, reported isotypes).
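The infection fatality rate estimate mentioned in the abstract divides confirmed deaths by the total number of infections implied by a seroprevalence estimate. A minimal sketch with entirely made-up numbers (the figures below are illustrative, not from SeroTracker):

```python
# Illustrative sketch: estimating an infection fatality rate (IFR)
# from a seroprevalence estimate. All numbers are hypothetical.
def estimate_ifr(deaths: int, seroprevalence: float, population: int) -> float:
    """IFR = confirmed deaths / estimated total infections."""
    estimated_infections = seroprevalence * population
    return deaths / estimated_infections

# Hypothetical region: 5,000 deaths, 10% seroprevalence, 5M people.
print(round(estimate_ifr(5_000, 0.10, 5_000_000), 4))  # 0.01
```

Because seroprevalence counts infections that never reached diagnostic testing, the denominator is larger than the confirmed case count, which is why IFR estimates from serosurveys are typically lower than case fatality rates.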
Full text
Available for:
GEOZS, IJS, IMTLJ, KILJ, KISLJ, NLZOH, NUK, OILJ, PNG, SAZU, SBCE, SBJE, UILJ, UL, UM, UPCLJ, UPUK, ZAGLJ, ZRSKP
The advances in technology to capture and process unprecedented amounts of educational data have boosted interest in Learning Analytics Dashboard (LAD) applications as a way to provide meaningful visual information to administrators, parents, teachers and learners. Despite the frequent argument that LADs are useful to support target users and their goals to monitor and act upon the information provided, little is known about LADs' theoretical underpinnings and the alignment (or lack thereof) between LADs' intended outcomes and the measures used to evaluate their implementation. However, this knowledge is necessary to illuminate more efficient approaches in the development and implementation of LAD tools. Guided by the self‐regulated learning perspective and using the Preferred Reporting Items for Systematic Reviews and Meta‐Analyses (PRISMA) framework, this systematic literature review addressed this gap by examining whether and how learner‐facing LADs' target outcomes align with the domain measures used to evaluate their implementations. Out of the 1297 papers retrieved from 15 databases, 28 were included in the final quantitative and qualitative analysis. Results suggested an intriguing lack of alignment between LADs' intended outcomes (mostly cognitive domain) and their evaluation (mostly affective measures). Based on these results and on the premise that LADs are designed to support learners, a critical recommendation from this study is that LADs' target outcomes should guide the selection of measures used to evaluate the efficacy of these tools. This alignment is critical to enable the construction of more robust guidelines to inform future endeavours in the field.
Practitioner notes
What is already known about this topic
There has been an increased interest and investment in learning analytics dashboards to support learners as end‐users.
Learner‐facing learning analytics dashboards are designed with different purposes, functionalities and types of data in an attempt to influence learners’ behaviour, achievement and skills.
What this paper adds
This paper reports trends and opportunities regarding the design of learner‐facing learning analytics dashboards, contexts of implementation, as well as types and features of learner‐facing learning analytics dashboard studies.
The paper discusses how affect and motivation have been largely overlooked as target outcomes in learner‐facing learning analytics dashboards.
Implications for practice and/or policy
Based on the evidence gathered through the review, this paper makes recommendations for theory (eg, inclusion of motivation as an important target outcome).
The paper makes recommendations related to the design, implementation and evaluation of learning analytics dashboards.
The paper also highlights the need for further integration between learner‐facing learning analytics dashboards and open learner models.
Full text
Available for:
BFBNIB, DOBA, FZAB, GIS, IJS, IZUM, KILJ, NLZOH, NUK, OILJ, PILJ, PNG, SAZU, SBCE, SBMB, SIK, UILJ, UKNU, UL, UM, UPUK
Apache Superset is a modern, open source, enterprise-ready Business Intelligence web application. This book will teach you how Superset integrates with popular databases like Postgres, Google BigQuery, Snowflake, and MySQL. You will learn to create real-time data visualizations and dashboards on modern web browsers for your organization.
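As the book describes, Superset connects to each of these databases through a SQLAlchemy connection URI entered when registering a database. The URIs below are placeholders (hostnames, credentials and project names are invented), shown only to illustrate the driver scheme each engine typically uses:

```python
# Illustrative SQLAlchemy URI formats for the databases named above.
# Hosts, credentials and project names are placeholders, not real endpoints.
SQLALCHEMY_URIS = {
    "Postgres": "postgresql://user:password@db-host:5432/analytics",
    "MySQL": "mysql://user:password@db-host:3306/analytics",
    "Snowflake": "snowflake://user:password@account/database",
    "BigQuery": "bigquery://my-gcp-project",
}

for engine, uri in SQLALCHEMY_URIS.items():
    scheme = uri.split("://", 1)[0]  # the driver scheme Superset dispatches on
    print(f"{engine}: driver scheme '{scheme}'")
```

Each scheme requires the corresponding Python driver package to be installed alongside Superset before the connection can be saved.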
This article introduces learning analytics dashboards that visualize learning traces for learners and teachers. We present a conceptual framework that helps to analyze learning analytics applications for these kinds of users. We then present our own work in this area and compare it with 15 related dashboard applications for learning. Most evaluations cover only part of our conceptual framework and do not assess whether dashboards contribute to behavior change or new understanding, probably also because such assessment requires longitudinal studies.
Full text
Available for:
NUK, OILJ, SAZU, UKNU, UL, UM, UPUK
With the development of a technology‐supported environment, it is plausible to provide rich process‐oriented feedback in a timely manner. In this paper, we developed a learning analytics dashboard (LAD) based on process‐oriented feedback in iTutor to offer learners their final scores, sub‐scale reports, and corresponding suggestions on further learning content. We adopted a quasi‐experimental design to investigate the effectiveness of the report on students' learning. Ninety‐four freshmen from two classes participated in this research. The two classes were divided into the LAD group and an original analytics report (OAR) group, which received product‐oriented feedback. Before the experiment, all the students took the prior knowledge assessment. After a semester's instruction, all the students took the post‐test of the summative assessment. Results indicated that students in the LAD group experienced better learning effectiveness than students in the OAR group. The LAD based on process‐oriented feedback was also effective in improving the skill learning effectiveness of students with low‐level prior knowledge.
Lay Description
What is already known about this topic:
Feedback is crucial and a growing body of research has investigated the effect of types of feedback.
Process‐oriented feedback is continually incorporated in the assessment process to support learning.
A learning analytics dashboard (LAD) is a great tool for delivering feedback to students.
What this paper adds:
Develop a LAD based on process‐oriented feedback and implement it in iTutor.
Analyse and visualize process‐oriented feedback content based on the process data automatically collected by iTutor.
Verify the effectiveness of LAD based on process‐oriented feedback through a quasi‐experiment.
Implications for practice and/or policy:
Process‐oriented feedback is the main embodiment of learning analytics technology.
Students with low prior knowledge benefit more from LAD based on process‐oriented feedback.
Full text
Available for:
BFBNIB, FZAB, GIS, IJS, KILJ, NLZOH, NUK, OILJ, SBCE, SBMB, UL, UM, UPUK
Background
The potential of learning analytics dashboards in virtual reality simulation‐based training environments to influence occupational self‐efficacy via self‐reflection phase processes in the chemical industry is still not fully understood. Learning analytics dashboards provide feedback on learner performance and offer points of comparison (i.e., comparison with one's own past performance or comparison with peer performance) to help learners make sense of their feedback.
Objectives
We present a theoretical framework for describing learning analytics reference frames and investigate the impact of feedback delivered through dashboards with different reference frames on occupational self‐efficacy, while controlling for workplace self‐reflection.
Methods
This experimental study engaged 42 chemical operator employees, aged between 18 and 55 years, each with at least one year of experience. We utilised a two‐group design to ask two research questions, each with three competing hypotheses related to changes in occupational self‐efficacy, employing Bayesian informative hypothesis evaluation.
Results and Conclusions
Results for the primary research question suggest that dashboards with progress reference frames do not elicit greater change to self‐efficacy than those with social reference frames; however, they may elicit equal change. Furthermore, dashboards with social reference frames may elicit greater change to self‐efficacy than those with progress reference frames. Exploratory results found that dashboards with progress reference frames may elicit greater positive directional change than those with social reference frames, and that they may elicit equal directional change.
These findings contribute to the understanding of self‐efficacy beliefs within the chemical industry, with potential impacts on skill development. The research may inform the design of targeted interventions and training programs to influence self‐efficacy. From a practical perspective, this research suggests that careful consideration is needed when choosing reference frames in learning analytics dashboards due to their potential consequences on the formation of learner self‐efficacy.
Lay Description
What is already known about this topic?
Learning analytics dashboards often aim to stimulate self‐regulated learning by providing feedback.
Feedback plays a crucial role in simulation‐based learning and learning analytics dashboards can be a valuable tool to deliver feedback and facilitate the learning process by providing learners with insights into their performance and progress.
Previous research has highlighted the importance of reference frames as critical design features of learning analytics dashboards, as they assist learners in making sense of learning analytics feedback.
Self‐efficacy is a powerful determining factor of workplace performance and is influenced by mastery experiences and social modelling information.
What the paper adds?
A theoretical framework describes the mediating role that self‐reflection phase processes of the self‐regulated learning cycle play in the formation of occupational self‐efficacy.
This paper introduces three learning analytics reference frame components: the performance outcome component, the point of comparison component and the score delta component.
The paper indicates that learning analytics dashboards designed with a progress reference frame do not elicit greater change to occupational self‐efficacy than those with the social reference frame.
Exploratory results suggest dashboards with progress reference frames might produce greater positive directional change compared to social reference frames or elicit equal change.
Implications for practice?
Learning analytics system designers who use reference frames to help learners make sense of their feedback should carefully consider which reference frames they use as these decisions likely have consequences on the formation of learner self‐efficacy beliefs.
For LAD designers and stakeholders, recognising the absolute change and directional shift in self‐efficacy is crucial, as LAD designs significantly influence these dynamics and guide the tailoring of effective tools.
In industries like the chemical sector, where overconfidence can result in severe consequences, understanding the nuance of self‐efficacy changes is especially pertinent.
Full text
Available for:
BFBNIB, FZAB, GIS, IJS, KILJ, NLZOH, NUK, OILJ, SBCE, SBMB, UL, UM, UPUK
Constant monitoring of road surfaces helps to show the urgency of deterioration or problems in the road construction and to improve the safety level of the road surface. Conditional generative adversarial networks (cGAN) are a powerful tool to generate or transform the images used for crack detection. The advantage of this method is the highly accurate results in vector-based images, which are convenient for mathematical analysis of the detected cracks at a later time. However, images taken under established parameters are different from images in real-world contexts. Another potential problem of cGAN is that it is difficult to detect the shape of an object when the resulting accuracy is low, which can seriously affect any further mathematical analysis of the detected crack. To tackle this issue, this paper proposes a method called improved cGAN with attention gate (ICGA) for roadway surface crack detection. To obtain a more accurate shape of the detected target object, ICGA establishes a multi-level model with independent stages. In the first stage, everything except the road is treated as noise and removed from the image. These images are stored in a new dataset. In the second stage, ICGA determines the cracks. Therefore, ICGA focuses on the redistribution of cracks, not the auxiliary elements in the image. ICGA adds two attention gates to a U-net architecture and improves the segmentation capacities of the generator in pix2pix. Extensive experimental results on dashboard camera images of the Unsupervised Llamas dataset show that our method has better performance than other state-of-the-art methods.
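Attention gates of the kind added to the U-net generator typically follow an additive-attention pattern: a gating signal from a coarser decoder layer produces coefficients in (0, 1) that rescale skip-connection features, suppressing background and emphasising crack-like regions. A toy NumPy sketch of that general mechanism (shapes, weights and data are invented for illustration, not the paper's configuration):

```python
import numpy as np

def attention_gate(x, g, w_x, w_g, psi):
    """Gate skip-connection features x with a coarser gating signal g.

    alpha = sigmoid(psi @ relu(w_x @ x + w_g @ g)) lies in (0, 1) and
    scales x element-wise, so uninformative features are damped.
    """
    a = np.maximum(w_x @ x + w_g @ g, 0.0)        # additive attention + ReLU
    alpha = 1.0 / (1.0 + np.exp(-(psi @ a)))      # attention coefficients
    return x * alpha                               # gated skip features

rng = np.random.default_rng(0)
x = rng.standard_normal(8)          # skip-connection features (toy size)
g = rng.standard_normal(8)          # gating signal from the decoder
w_x = rng.standard_normal((8, 8))   # learned projections (random stand-ins)
w_g = rng.standard_normal((8, 8))
psi = rng.standard_normal((8, 8))

gated = attention_gate(x, g, w_x, w_g, psi)
print(gated.shape)
```

Because the coefficients are strictly between 0 and 1, the gate can only attenuate features, never amplify them; in the full network the projections are learned so the attenuation lands on the irrelevant regions.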
Full text
Available for:
IZUM, KILJ, NUK, PILJ, PNG, SAZU, UL, UM, UPUK
Since the beginning of the COVID-19 pandemic, many dashboards have emerged as useful tools to monitor its evolution, inform the public, and assist governments in decision-making. Here, we present a globally applicable method, integrated in a daily updated dashboard that provides an estimate of the trend in the evolution of the number of cases and deaths from reported data of more than 200 countries and territories, as well as 7-d forecasts. One of the significant difficulties in managing a quickly propagating epidemic is that the details of the dynamic needed to forecast its evolution are obscured by the delays in the identification of cases and deaths and by irregular reporting. Our forecasting methodology substantially relies on estimating the underlying trend in the observed time series using robust seasonal trend decomposition techniques. This allows us to obtain forecasts with simple yet effective extrapolation methods in linear or log scale. We present the results of an assessment of our forecasting methodology and discuss its application to the production of global and regional risk maps.
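The core idea, estimating a robust underlying trend and then extrapolating it, can be illustrated with a much-simplified stand-in for the authors' seasonal-trend decomposition: a centred 7-day rolling median (robust to reporting spikes and weekly dips) followed by linear extrapolation. Window sizes and the synthetic series below are invented for illustration:

```python
import numpy as np

def forecast_7d(daily_counts):
    """Toy trend-and-extrapolate forecast: rolling median + linear fit."""
    y = np.asarray(daily_counts, dtype=float)
    # Robust trend estimate: centred rolling median over a 7-day window.
    trend = np.array([np.median(y[max(0, i - 3):i + 4]) for i in range(len(y))])
    # Fit a line to the last 14 trend points and extrapolate 7 days ahead.
    t = np.arange(len(trend))
    slope, intercept = np.polyfit(t[-14:], trend[-14:], 1)
    future_t = np.arange(len(trend), len(trend) + 7)
    return np.maximum(slope * future_t + intercept, 0.0)  # counts stay non-negative

# Synthetic daily case counts: linear growth plus a weekly reporting dip.
days = np.arange(28)
series = 100 + 5 * days - 30 * (days % 7 == 6)
forecast = forecast_7d(series)
print(forecast.round(1))
```

The rolling median absorbs the weekly dip that would bias a naive line fit on the raw counts, which is the same role the robust decomposition plays in the dashboard's method.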
This paper examines how business users can leverage machine learning and data analytics through dashboards to optimize their decision making in demand-side supply chain management. We present a case study of an Austrian B2B hygiene product retailer that needed to provide its top management, sales representatives, and marketing managers with more relevant information to improve business intelligence and to enhance customer acquisition and retention. To generate this information, we utilized various data analysis and machine learning methods, including RFM analysis, market basket analysis, TURF analysis, and demand forecasting, using real-life transaction data. To provide business users with easy access to this information, we developed dashboards that integrate these methods, providing an interactive and visual tool for data exploration and understanding. We conclude that dashboards can enable business users to make better informed and effective decisions on the demand side of supply chains, leading to improved sales performance and increased customer satisfaction.
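Of the methods listed, RFM analysis is the most compact to illustrate: each customer is summarised by how recently they bought (recency), how often (frequency), and how much they spent (monetary value). A sketch over invented transaction data (the column names and snapshot date are assumptions, not the retailer's actual schema):

```python
import pandas as pd

# Toy transaction log; in practice this would be the retailer's sales data.
transactions = pd.DataFrame({
    "customer": ["A", "A", "B", "C", "C", "C"],
    "date": pd.to_datetime(
        ["2023-01-05", "2023-03-01", "2023-02-10",
         "2023-01-20", "2023-02-15", "2023-03-10"]),
    "amount": [120.0, 80.0, 300.0, 40.0, 55.0, 60.0],
})
snapshot = pd.Timestamp("2023-03-15")  # analysis date (illustrative)

rfm = transactions.groupby("customer").agg(
    recency=("date", lambda d: (snapshot - d.max()).days),  # days since last order
    frequency=("date", "count"),                            # number of orders
    monetary=("amount", "sum"),                             # total spend
)
print(rfm)
```

On a dashboard, these three columns are typically binned into scores (e.g. quintiles) so segments such as recent high-spenders or lapsed frequent buyers can be surfaced to sales and marketing staff.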
Full text
Available for:
GEOZS, IJS, IMTLJ, KILJ, KISLJ, NLZOH, NUK, OILJ, PNG, SAZU, SBCE, SBJE, UILJ, UL, UM, UPCLJ, UPUK, ZAGLJ, ZRSKP
Decision makers in organizations strive to improve the quality of their decisions. One way to improve that process is to objectify the decisions with facts. Data-driven Decision Support Systems (data-driven DSS), and more specifically business intelligence (BI), intend to achieve this. Organizations invest massively in the development of BI data-driven DSS and expect them to be adopted and to effectively support decision makers. This raises many technical and methodological challenges, especially regarding the design of BI dashboards, which can be seen as the visible tip of the BI data-driven DSS iceberg and which play a major role in the adoption of the entire system. In this paper, the dashboard content is investigated as one possible root cause for BI data-driven DSS dashboard adoption or rejection through early empirical research. More precisely, this work is composed of three parts. In the first part, the concept of cognitive load is studied in the context of BI dashboards, and the informational, representational and non-informational loads are introduced. In the second part, the effects of these loads on the adoption of BI dashboards are studied through an experiment with 167 respondents and a Structural Equation Modeling (SEM) analysis. The result is a Dashboard Adoption Model, enriching the seminal Technology Acceptance Model with new content-oriented variables to support the design of more supportive BI data-driven DSS dashboards. Finally, in the third part, a set of indicators is proposed to help dashboard designers monitor the loads of their dashboards in practice.
Full text
Available for:
GEOZS, IJS, IMTLJ, KILJ, KISLJ, NLZOH, NUK, OILJ, PNG, SAZU, SBCE, SBJE, UILJ, UL, UM, UPCLJ, UPUK, ZAGLJ, ZRSKP