Fit-out construction is crucial for the functionality of completed infrastructure. However, the absence of a robust data collection system hinders efficient project management, causing delays and extra costs. This paper introduces a digital twin-enabled platform that utilizes Internet of Things (IoT) and neural network technologies for real-time visibility and traceability of prefabricated materials. The effectiveness of the proposed platform has been validated through a real-life prefabricated construction project in Hong Kong. Leveraging the IoT-captured data, visual and event-driven process models are developed to simulate material flows and workflow executions. Hierarchical finite state machines (HFSMs) facilitate management of the prefabricated construction supply chain. Results show the platform's effectiveness in identifying abnormal event states and notifying stakeholders in real time, thus supporting advanced project management practices. The platform improves information coordination among stakeholders, provides real-time visibility and traceability, and enables hierarchical management at the strategic, tactical, and operational levels throughout the prefabrication process.
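As a rough illustration of the hierarchical finite state machine idea described above, the following sketch tracks one prefabricated component through nested phase/state pairs and records any event with no valid transition as abnormal. All phase, state, and event names are hypothetical, invented for illustration; the paper's actual state model is not reproduced here.

```python
class ComponentHFSM:
    """Toy HFSM tracking one prefabricated component through nested phases."""

    # (phase, state) -> event -> (next_phase, next_state); names are hypothetical
    TRANSITIONS = {
        ("production", "casting"): {"cured": ("production", "quality_check")},
        ("production", "quality_check"): {"passed": ("logistics", "in_transit")},
        ("logistics", "in_transit"): {"arrived": ("on_site", "buffered")},
        ("on_site", "buffered"): {"installed": ("on_site", "installed")},
    }

    def __init__(self):
        self.phase, self.state = "production", "casting"
        self.abnormal = []  # events with no valid transition -> notify stakeholders

    def handle(self, event):
        nxt = self.TRANSITIONS.get((self.phase, self.state), {}).get(event)
        if nxt is None:
            # Out-of-order or unexpected event: flag as an abnormal event state.
            self.abnormal.append((self.phase, self.state, event))
            return False
        self.phase, self.state = nxt
        return True
```

In this toy model, an "arrived" event received while the component is still in quality check would be flagged as abnormal rather than silently advancing the state, which is the behaviour the platform's real-time notification relies on.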
•A service-oriented 5D digital twin model is proposed for real-time visibility and traceability.
•Event-driven virtual–real mapping is based on AutomationML and OPC UA.
•The effectiveness of the proposed model is validated in a practical case study.
•Hierarchical management positively influences the performance of prefabricated fit-out construction projects.
Digital transformation encompasses a variety of socio-technical activities aimed at increasing productivity, safety and quality of execution, sustainable development, collaborative working, and solutions for the sustainable smart city. Major digital trends are changing the building sector and revealing new ways of understanding and integrating information technologies within it. Current smart building management systems incorporate a variety of sensors, actuators, and dedicated networks. Their objective is to observe the condition of specific areas and apply appropriate rules to preserve or improve comfort while saving energy. In this paper, we present a review of work related to IoT and Big Data analytics in smart buildings.
Event processing (EP) is a data processing technology that conducts online processing of event information. In this survey, we summarize the latest cutting-edge work on EP from both industrial and academic research community viewpoints. We divide the field of EP into three subareas: EP system architectures, EP use cases, and EP open research topics, and then examine each in detail. We investigate the system architecture characteristics of novel EP platforms such as Apache Storm, Apache Spark, and Apache Flink. We found significant advancements in novel application areas such as the Internet of Things, streaming machine learning (ML), and the processing of complex data types such as text, video streams, and graphs. Furthermore, a significant body of contributions has been made on event ordering, system scalability, the development of EP languages, and the use of heterogeneous devices for EP, which we investigate in the latter half of this article. Through our study, we identified key areas that require significant attention from the EP community, such as streaming ML, EP system benchmarking, and graph stream processing.
•Combining the CEP and ML paradigms permits detecting IoT security attacks in real time.
•A graphical tool facilitates security attack pattern definition and code generation.
•The proposed architecture has been validated in an E-health IoT network scenario.
•ML makes it possible to create accurate patterns dynamically.
The Internet of Things (IoT) is growing globally at a fast pace: people now find themselves surrounded by a variety of IoT devices such as smartphones and wearables in their everyday lives. Additionally, smart environments, such as smart healthcare systems, smart industries and smart cities, benefit from sensors and actuators interconnected through the IoT. However, the increase in IoT devices has brought with it the challenge of promptly detecting and combating the cybersecurity attacks and threats that target them, including malware, privacy breaches and denial of service attacks, among others. To tackle this challenge, this paper proposes an intelligent architecture that integrates Complex Event Processing (CEP) technology and the Machine Learning (ML) paradigm in order to detect different types of IoT security attacks in real time. In particular, such an architecture is capable of easily managing event patterns whose conditions depend on values obtained by ML algorithms. Additionally, a model-driven graphical tool for security attack pattern definition and automatic code generation is provided, hiding all the complexity derived from implementation details from domain experts. The proposed architecture has been applied in the case of a healthcare IoT network to validate its ability to detect attacks made by malicious devices. The results obtained demonstrate that this architecture satisfactorily fulfils its objectives.
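As a minimal sketch of the idea of event patterns whose conditions depend on values obtained from ML algorithms, the snippet below pairs a stand-in anomaly scorer with a sliding-window pattern check. The event fields, the scorer, the window size, and the threshold are all illustrative assumptions, not the architecture proposed in the paper.

```python
from collections import deque

def anomaly_score(event):
    """Stand-in for a trained ML model: a high message rate looks suspicious."""
    return min(1.0, event["msgs_per_sec"] / 100.0)

def detect_attacks(events, window=3, threshold=0.8):
    """CEP-style pattern: alert when a device's mean ML score over the last
    `window` events exceeds `threshold`."""
    recent = {}   # device id -> sliding window of recent scores
    alerts = []
    for ev in events:
        buf = recent.setdefault(ev["device"], deque(maxlen=window))
        buf.append(anomaly_score(ev))
        if len(buf) == window and sum(buf) / window > threshold:
            alerts.append(ev["device"])
    return alerts
```

The point of the sketch is the coupling: the pattern's condition (`sum(buf) / window > threshold`) is evaluated over model outputs rather than raw event attributes, which is the combination of CEP and ML the abstract describes.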
A large number of distributed applications require continuous and timely processing of information as it flows from the periphery to the center of the system. Examples include intrusion detection systems, which analyze network traffic in real time to identify possible attacks; environmental monitoring applications, which process raw data coming from sensor networks to identify critical situations; and applications performing online analysis of stock prices to identify trends and forecast future values.
•Research issues in complex event processing (CEP), with an emphasis on query optimization.
•Covers deterministic and probabilistic models, and centralized and distributed settings.
•Issues for CEP optimization over Big Data-enabling cloud computing platforms.
•Predictive Analytics and CEP in cloud platforms, even with dispersed resource pools.
Many Big Data technologies were built to enable the processing of human-generated data, setting aside the enormous amount of data generated by Machine-to-Machine (M2M) interactions and Internet of Things (IoT) platforms. Such interactions create real-time data streams that are much more structured, often in the form of series of event occurrences. In this paper, we provide an overview of the main research issues confronted by existing Complex Event Processing (CEP) techniques, with an emphasis on query optimization. Our study covers both deterministic and probabilistic event models and spans centralized to distributed network settings. In doing so, we cover a wide range of approaches in the CEP domain and review the current status of techniques for efficient query processing. These techniques serve as a starting point for developing Big Data-oriented CEP applications. We therefore further study the issues that arise when applying those techniques on Big Data-enabling technologies, as is the case with cloud platforms. Furthermore, we expand on the synergies between Predictive Analytics and CEP, with an emphasis on scalability and elasticity considerations in cloud platforms with potentially dispersed resource pools.
The globalization of manufacturing has increased the risk of counterfeiting as demand grows, production flows increase, and availability expands. Intensifying counterfeit issues are causing concern to companies and putting lives at risk. Companies have ploughed large amounts of money into defensive measures, but their efforts have not slowed counterfeiters. In such complex manufacturing processes, decision-making and real-time reaction to uncertain situations throughout the production process are one way to address these challenges. Detecting uncertain conditions such as counterfeit and missing items in the manufacturing environment requires a specialized set of technologies to deal with a flow of continuously created data. In this paper, we propose an uncertain detection algorithm (UDA), an approach to detect uncertain events such as counterfeit and missing items in a distributed RFID system for a manufacturing environment. The proposed method is based on hashing and thread pool techniques to address the high memory consumption, long processing time, and low event throughput of current detection approaches. The experimental results show that the execution time of the proposed method is reduced by 22% on average across different tests, and that it achieves better processing-time performance on RFID event streams.
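To make the hashing-plus-thread-pool combination concrete, here is a deliberately simplified sketch. The tag/reader event shape and the "same tag id observed at two different readers" counterfeit heuristic are assumptions made for illustration; this is not the UDA itself.

```python
from concurrent.futures import ThreadPoolExecutor
import threading

def scan(events, workers=4):
    """Flag tag ids observed at more than one reader (toy counterfeit check)."""
    seen = {}                 # hash map: tag id -> first reader that reported it
    lock = threading.Lock()   # guards the shared hash map across pool threads

    def check(event):
        tag, reader = event["tag"], event["reader"]
        with lock:
            first = seen.setdefault(tag, reader)
        # The same tag id appearing at a different reader suggests a cloned tag.
        return tag if first != reader else None

    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sorted({t for t in pool.map(check, events) if t})
```

The hash map gives O(1) lookup per event, and the thread pool lets many events be checked concurrently, which is the memory/throughput trade-off the abstract attributes to the hashing and thread pool techniques.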
Fuzzy complex event processing-based decision-making systems have received considerable research interest recently. In particular, well-managed operator distribution is required to improve the performance of such systems. However, the intrinsic uncertainty in dynamic input events increases the difficulty of the operator distribution problem. To address these issues, a cost-aware, fault-tolerant, and reliable strategy called CaFtR is proposed for operator scheduling in fuzzy complex event processing systems, based on the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). The proposed CaFtR method makes adequate use of network resources to achieve continuous and highly available complex event processing regardless of dynamic operator migrations in a fuzzy environment. Finally, a case study illustrates the efficiency of the proposed method, and the utility of our work is demonstrated through an application on the StreamBase system.
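The Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) that such a strategy builds on can be sketched generically as follows: candidate nodes are scored on weighted criteria, and the node closest to the ideal solution and farthest from the anti-ideal ranks highest. The criteria and weights here are hypothetical, and this is plain TOPSIS scoring, not the CaFtR algorithm.

```python
import math

def topsis(matrix, weights, benefit):
    """matrix[i][j]: score of node i on criterion j.
    benefit[j]: True if larger values of criterion j are better."""
    cols = list(zip(*matrix))
    norms = [math.sqrt(sum(v * v for v in c)) or 1.0 for c in cols]
    weighted = [[w * v / n for v, n, w in zip(row, norms, weights)]
                for row in matrix]
    wcols = list(zip(*weighted))
    ideal = [max(c) if b else min(c) for c, b in zip(wcols, benefit)]
    anti = [min(c) if b else max(c) for c, b in zip(wcols, benefit)]
    scores = []
    for row in weighted:
        d_pos = math.dist(row, ideal)   # distance to ideal solution
        d_neg = math.dist(row, anti)    # distance to anti-ideal solution
        scores.append(d_neg / ((d_pos + d_neg) or 1.0))
    return scores
```

For operator placement, each row might describe one candidate node on criteria such as cost (smaller is better) and reliability (larger is better); the node with the highest score is selected as the migration target.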
In social media, human-generated web data from real-world events has become exponentially complex due to the chaotic and spontaneous features of natural language. This can create information overload for information consumers, who cannot easily digest a large amount of information in a limited time. To tackle this issue, we propose to use Complex Event Processing (CEP) and semantic web reasoners to disentangle the human-generated data and present users with only relevant and important data. However, one key obstacle is that human-generated data can sometimes lack structured meaning, even for the speaker, hindering the output of the CEP. Therefore, to adapt such data to CEP inputs, we present two techniques that allow the value of human-generated data to be discriminated and digested. The first relies on the Variable Sharing Property developed for relevance logics, while the second is based on semantic equivalence and natural language processing. The results can be passed to CEP for further semantic reasoning, generating digested information for users.
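The variable-sharing intuition, under which a statement counts as relevant to a query only if the two share at least one content term, can be illustrated with a toy filter. The tokenizer and stopword list below are ad hoc assumptions, and this is only a loose analogue of the logical property, not the paper's technique.

```python
import re

STOPWORDS = {"the", "a", "an", "is", "in", "of", "and", "to", "on"}

def terms(text):
    """Lowercased content terms of a text, with stopwords removed."""
    return set(re.findall(r"[a-z]+", text.lower())) - STOPWORDS

def shares_terms(post, query):
    """Relevance test in the spirit of the Variable Sharing Property:
    a post is admitted to the CEP pipeline only if it shares a term
    with the query event."""
    return bool(terms(post) & terms(query))
```

Posts that pass this filter would then be handed to the CEP engine and semantic reasoners for further processing, while unrelated chatter is discarded before it contributes to information overload.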