•This article focuses on prescriptive analytics, the least mature area of business analytics compared with descriptive and predictive analytics.
•Prescriptive analytics is positioned as the next step towards increasing data analytics maturity, leading to optimized decision making ahead of time.
•The existing literature pertaining to prescriptive analytics is reviewed and prominent methods for its implementation are examined.
•The article identifies research challenges and outlines directions for future research in the field of prescriptive analytics.
Business analytics aims to enable organizations to make quicker, better, and more intelligent decisions in order to create business value. To date, the major focus in both the academic and industrial realms has been on descriptive and predictive analytics. Nevertheless, prescriptive analytics, which seeks to find the best course of action for the future, has been attracting increasing research interest. Prescriptive analytics is often considered the next step towards increasing data analytics maturity, leading to optimized decision making ahead of time for improved business performance. This paper investigates the existing literature pertaining to prescriptive analytics and prominent methods for its implementation, provides clarity on the research field of prescriptive analytics, synthesizes the literature review in order to identify the existing research challenges, and outlines directions for future research.
The creation and use of comics is an especially promising tool for enabling students to construct new knowledge. Comics have already been adopted in many applied science disciplines, as the combination of text and images has been recognized as a powerful learning tool. Educational activities and tools, however, must not overload students’ working memory, as this could hinder learning. In the current study, we investigated, through pre-test and post-test performance, the effect of digital comics creation on students’ efforts to construct new knowledge. Furthermore, through the multidimensional NASA-TLX, we assessed the cognitive load imposed on students. The results were in favor of digital comics creation, ranking it as an efficient instructional activity. Specifically, the students’ performance improved after digital comics creation, and the load imposed on students was within normal limits. Moreover, in studying the weighting procedure between the NASA-TLX dimensions, frustration and temporal demand were found to be the most aggravating dimensions. Finally, implications for teachers and recommendations for future research are discussed.
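The weighting procedure mentioned above can be illustrated with a short sketch. In the standard NASA-TLX procedure, each of the six dimensions receives a 0–100 rating, and its weight is the number of times it was chosen across the 15 pairwise comparisons; the overall workload is the weighted mean. The ratings and weights below are hypothetical, chosen only to mirror the reported pattern of frustration and temporal demand carrying the most weight.

```python
# Hedged sketch of the standard weighted NASA-TLX score. The six dimensions
# and the 15-pairwise-comparison weighting follow the published NASA-TLX
# procedure; the specific numbers below are illustrative, not study data.

DIMENSIONS = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def nasa_tlx_score(ratings, weights):
    """Overall workload: weighted mean of the six 0-100 subscale ratings.

    `weights` tallies how often each dimension was selected across the
    15 pairwise comparisons, so the tallies sum to 15.
    """
    assert sum(weights.values()) == 15, "weights must come from 15 pairwise comparisons"
    return sum(ratings[d] * weights[d] for d in DIMENSIONS) / 15.0

# Hypothetical student response: frustration and temporal demand weighted
# highest, echoing the pattern reported in the study.
ratings = {"mental": 55, "physical": 10, "temporal": 60,
           "performance": 40, "effort": 50, "frustration": 65}
weights = {"mental": 3, "physical": 0, "temporal": 4,
           "performance": 2, "effort": 2, "frustration": 4}

print(nasa_tlx_score(ratings, weights))  # prints 56.333...
```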
The rise of Artificial Intelligence (AI) enables enterprises to manage large amounts of data in order to derive predictions about future performance and to gain meaningful insights. In this context, descriptive and predictive analytics have gained significant research attention; however, prescriptive analytics has only just started to emerge as the next step towards increasing data analytics maturity and leading to optimized decision making ahead of time. Although machine learning for decision making has been identified as one of the most important applications of AI, prescriptive analytics is, up to now, mainly addressed with domain-specific optimization models. Moreover, the existing literature lacks generalized prescriptive analytics models capable of being dynamically adapted according to human preferences. Reinforcement Learning, the third machine learning paradigm alongside supervised and unsupervised learning, has the potential to deal with dynamic, uncertain and time-variant environments, the huge state space of sequential decision making processes, and incomplete knowledge. In this paper, we propose a human-augmented prescriptive analytics approach using Interactive Multi-Objective Reinforcement Learning (IMORL) in order to cope with the complexity of real-life environments and the need for optimized human-machine collaboration. The decision making process is modelled in a generalized way in order to ensure scalability and applicability to a wide range of problems and applications. We deployed the proposed approach in a stock market case study in order to evaluate the proactive trading decisions that lead to the maximum return and the minimum risk that the user's experience and the available data can jointly yield.
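A common building block of interactive multi-objective RL is linear scalarization: the vector-valued reward is collapsed into a scalar using a preference weight vector elicited from the user, which a standard RL update can then consume. The sketch below uses return and risk as the two objectives, matching the trading case study; it is only an illustration of the general technique, and the paper's actual IMORL formulation may differ.

```python
import numpy as np

# Hedged sketch: linear scalarization of a multi-objective reward, a common
# building block of interactive multi-objective RL. The weight vector stands
# in for elicited user preferences (here: return vs. risk); the authors'
# actual IMORL formulation may differ.

def scalarize(reward_vector, preference_weights):
    """Collapse a vector reward (e.g. [return, -risk]) into a scalar."""
    w = np.asarray(preference_weights, dtype=float)
    w = w / w.sum()  # normalize preferences into a convex combination
    return float(np.dot(w, reward_vector))

# A risk-averse user weights risk reduction twice as heavily as return.
r = [0.08, -0.03]            # [portfolio return, negative risk]
print(scalarize(r, [1, 2]))  # scalar reward fed to a standard RL update
```

Because the weights are renormalized on every call, the user can adjust preferences between episodes and the agent's reward signal adapts without retraining machinery changes.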
While a multitude of cloud vendors today offer flexible application hosting services, the application adaptation capabilities they provide in terms of autoscaling are rather limited. In most cases, a static adaptation action with a fixed scaling response is used. When a dynamic adaptation action is provided, it is based on a single scaling variable. We propose Severity, a novel algorithmic approach that aids the adaptation of cloud applications. Based on input from the DevOps, our approach detects situations, calculates their Severity, and proposes adaptations that can lead to better application performance. Severity can be calculated for any number and any type of application QoS attributes, whether bounded or unbounded. Evaluation with four distinct workload types and a variety of monitoring attributes shows that QoS is improved for particular application categories. The feasibility of our approach is demonstrated with a prototype implementation of an application adaptation manager, for which the source code is provided.
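One plausible way to combine several QoS attributes into a single severity value, in the spirit described above, is the normalized Euclidean norm of each attribute's relative threshold violation. The sketch below is an illustration of that idea for bounded attributes only; the metric names, thresholds, and the exact formula are assumptions, not the paper's published algorithm.

```python
import math

# Hedged sketch: a single "severity" value from several bounded QoS
# attributes, taken as the Euclidean norm of normalized threshold
# deviations. Illustrative only; the paper's Severity algorithm may differ.

def severity(observations):
    """observations: list of (current_value, threshold) pairs for bounded
    QoS attributes; values at or below their threshold contribute zero."""
    deviations = [max(0.0, (value - limit) / limit) for value, limit in observations]
    # Divide by sqrt(n) so severity stays comparable as attributes are added.
    return math.sqrt(sum(d * d for d in deviations)) / math.sqrt(len(deviations))

# Hypothetical readings: CPU at 90% vs an 80% bound, latency 300 ms vs 250 ms.
print(severity([(90, 80), (300, 250)]))
```

A scalar in this form lets the adaptation manager rank concurrent situations and pick a scaling response proportional to how badly the bounds are exceeded, rather than firing a fixed action on any single-metric breach.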
Big data analytics is rapidly emerging as a key Internet of Things (IoT) initiative aimed at providing meaningful insights and supporting optimal decision making under time constraints. In this direction, prescriptive analytics has just started to emerge. Prescriptive analytics moves beyond descriptive and predictive analytics, aiming to provide adaptive, automated, constrained, time-dependent and optimal decisions. The use of time-dependent parameters in prescriptive analytics models provides a more reliable and realistic representation of the complex and dynamic environment and the associated decision making process; however, their estimation poses significant challenges due to the uncertainty derived from inaccurate user input, noisy data, and the non-stationarity of real-world data streams. Since feedback and learning mechanisms for tracking prescriptive analytics models are crucial enablers of self-configuration and self-optimization, this paper proposes an approach for sensor-driven learning of time-dependent parameters for prescriptive analytics models deployed in streaming computational environments. The proposed approach was validated in an Industry 4.0 use case and further evaluated through extensive simulation experiments. It overcomes challenges related to uncertainty derived from user input, non-stationary data and sensor noise, and provides estimates of time-dependent parameters that lead to more reliable prescriptions.
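The core difficulty described above, tracking a drifting parameter from a noisy stream, can be illustrated with the simplest streaming estimator, exponential smoothing: each new sensor reading nudges the running estimate, trading noise rejection against responsiveness to drift. This is a minimal stand-in for the general idea of sensor-driven parameter learning; the paper's estimator is presumably more sophisticated, and the readings below are invented.

```python
# Hedged sketch: recursively estimating a time-dependent model parameter
# from a noisy sensor stream via exponential smoothing. Illustrative of
# sensor-driven parameter learning in general, not the paper's method.

def update_estimate(estimate, measurement, alpha=0.2):
    """One streaming update: blend the new noisy reading into the running
    estimate. Larger alpha tracks drift faster but admits more noise."""
    return (1 - alpha) * estimate + alpha * measurement

stream = [10.2, 9.8, 10.5, 12.1, 12.4, 12.0]  # drifting, noisy readings
estimate = stream[0]
for m in stream[1:]:
    estimate = update_estimate(estimate, m)
print(round(estimate, 3))  # prints 11.153
```

In a non-stationary setting, alpha itself can be adapted online (e.g. raised when the prediction error grows), which is exactly the kind of feedback loop the abstract argues prescriptive models need.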
The rapid growth of new computing models that exploit the cloud continuum has a significant impact on the adoption of microservices, especially in dynamic environments where the amount of workload varies over time or where Internet of Things (IoT) devices dynamically change their geographic location. In order to exploit the true potential of cloud continuum computing applications, it is essential to combine a comprehensive set of intricate technologies. This complex blend of technologies currently raises data interoperability problems in such modern computing frameworks. A semantic model is therefore required to unambiguously specify the various concepts employed in cloud applications. The goal of the present paper is thus twofold: (i) to offer a new model that allows an easier understanding of microservices within adaptive fog computing frameworks, and (ii) to present the latest open standards and tools that are now widely used to implement each class defined in our proposed model.
The present study focuses on high school students' acceptance of digital comics creation (DCC) in classroom learning and aims at identifying the factors that affect it. DCC is a modern ICT activity that combines comics, a medium popular with and familiar to students, with computers. The research model used to explain the students' preference for DCC is based on the technology acceptance model. Partial least squares structural equation modeling was used to analyze the data and examine our research model and the corresponding hypotheses. The results confirm the acceptance of the model and show that students' preference for DCC is directly influenced by perceived enjoyment, perceived usefulness and perceived ease of use. Among them, perceived enjoyment is the strongest influencing factor. Digital comics creation self-efficacy was a significant indirect factor of students’ preference for DCC through perceived ease of use. It is important that teachers take these findings into consideration and incorporate ICT activities that students enjoy and perceive as useful and easy to use in order to capture their interest. Teachers should also enhance students’ self-efficacy when providing them with ICT systems. Further relationships among the aforementioned factors and future research directions are also discussed.
Decision-making for manufacturing and maintenance operations is benefiting from the advanced sensor infrastructure of Industry 4.0, which enables the use of algorithms that analyze data, predict emerging situations, and recommend mitigating actions. The current paper reviews the literature on data-driven decision-making in maintenance and outlines directions for future research towards data-driven decision-making for Industry 4.0 maintenance applications. The main research directions include: the coupling of decision-making with augmented reality for seamless interfacing that combines the real and virtual worlds of manufacturing operators; methods and techniques for addressing data uncertainty, in view of emerging Internet of Things (IoT) devices; the integration of maintenance decision-making with other operations such as scheduling and planning; the utilization of the cloud continuum for optimal deployment of decision-making services; the capability of decision-making methods to cope with big data; the incorporation of advanced security mechanisms; and the coupling of decision-making with simulation software, autonomous robots, and additive manufacturing initiatives.
Process mining is a research discipline that applies data analysis and computational intelligence techniques to extract knowledge from the event logs of information systems. It aims to provide new means to discover, monitor, and improve processes. Process mining has gained particular attention over recent years, and new process mining software tools, both academic and commercial, have been developed. This paper provides a survey of process mining software tools. It identifies and describes criteria that can be useful for comparing the tools. Furthermore, it introduces a multi-criteria methodology that can be used for the comparative analysis of process mining software tools. The methodology is based on three methods, namely ontology, decision tree, and Analytic Hierarchy Process (AHP), which can be used to help users decide which software tool best suits their needs.
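The AHP step mentioned above derives criterion weights from a matrix of pairwise preference judgments. The sketch below uses the standard geometric-mean approximation of AHP's principal eigenvector; the three tool-selection criteria and the judgment values are hypothetical, not taken from the paper.

```python
import numpy as np

# Hedged sketch: deriving criterion weights in the Analytic Hierarchy
# Process (AHP) from a pairwise comparison matrix via the geometric-mean
# approximation of the principal eigenvector. Criteria and judgments are
# hypothetical examples for a process mining tool choice.

def ahp_weights(pairwise):
    """pairwise[i][j] = how strongly criterion i is preferred over j
    (Saaty's 1-9 scale, with pairwise[j][i] = 1/pairwise[i][j]);
    returns normalized priority weights."""
    A = np.asarray(pairwise, dtype=float)
    geo = A.prod(axis=1) ** (1.0 / A.shape[0])  # row geometric means
    return geo / geo.sum()

# Usability vs. functionality vs. price, judged pairwise by a user.
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
print(ahp_weights(A).round(3))  # weights favor usability
```

Once criterion weights are fixed, each candidate tool is scored per criterion and the weighted scores are summed, giving the ranking that the multi-criteria methodology uses to match tools to user needs.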