There is a large and growing concern about security in information and communication technology among the scientific community, because any attack or anomaly in the network can greatly affect many domains, such as national security, private data storage, social welfare, and economic issues. Anomaly detection is therefore a broad research area, and many different techniques and approaches for this purpose have emerged through the years. The main objective of this study is to review the most important aspects of anomaly detection, covering both background analysis and a core study of the most relevant techniques, methods, and systems in the area. To ease the understanding of this survey’s structure, the anomaly detection domain is reviewed along five dimensions: (1) network traffic anomalies, (2) network data types, (3) intrusion detection system categories, (4) detection methods and systems, and (5) open issues. The paper concludes with a summary of open issues discussing presently unsolved problems, and final remarks.
Recently, modern smartphones equipped with a variety of embedded sensors, such as accelerometers and gyroscopes, have been used as an alternative platform for human activity recognition (HAR), since they are cost-effective, unobtrusive, and facilitate real-time applications. However, the majority of related works have proposed position-dependent HAR, i.e., the target subject has to fix the smartphone in a pre-defined position. Few studies have tackled position-independent HAR, either by using handcrafted features that are less influenced by the position of the smartphone or by building a position-aware HAR system. The performance of these studies still needs improvement to produce a reliable smartphone-based HAR system. Thus, in this paper, we propose a deep convolutional neural network model that provides a robust position-independent HAR system. We build and evaluate the proposed model on the RealWorld HAR public dataset. We find that our deep learning model increases the overall performance for position-independent HAR from 84% to 88% compared to the state-of-the-art traditional machine learning method. In addition, the position detection performance of our model improves markedly, from 89% to 98%. Finally, the recognition time of the proposed model is evaluated in order to validate its applicability for real-time applications.
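The core of such a system is a 1-D convolutional network over windows of raw accelerometer and gyroscope samples. A minimal sketch in PyTorch follows; the layer sizes, window length, and class count are illustrative assumptions, not the paper's actual architecture:

```python
import torch
import torch.nn as nn

class HARConvNet(nn.Module):
    """Illustrative 1-D CNN for smartphone HAR.

    Input: windows of accelerometer + gyroscope samples,
    shaped (batch, 6 sensor channels, window_length).
    All layer sizes here are assumptions for the sketch.
    """
    def __init__(self, n_channels: int = 6, n_classes: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global pooling -> fixed-size feature vector
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).squeeze(-1))

model = HARConvNet()
window = torch.randn(4, 6, 128)  # 4 windows, 6 sensor axes, 128 samples each
logits = model(window)           # one score per activity class, shape (4, 8)
```

Global average pooling keeps the classifier independent of the window length, which is convenient when sampling rates differ across devices.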
The exchange of information among health professionals is common practice among clinics, laboratories, and hospitals. Cloud‐based clinical data exchange platforms make valuable information available in real time, in a secure and private manner. The increasing availability of data in health information systems allows specialists to extract knowledge using pattern recognition techniques for the identification and prediction of risk situations that could lead to severe complications for a patient. Hence, this paper proposes the use of a neuro‐fuzzy machine learning technique for predicting the most complex hypertensive disorder in pregnancy, HELLP syndrome. This classifier serves as an inference mechanism for cloud‐based mobile applications, enabling effective monitoring through the analysis of symptoms presented by pregnant women. The proposed model performs well on several indicators, including precision (0.685), recall (0.756), the F‐measure (0.705), and the area under the receiver operating characteristic curve (0.829). This technique can accurately predict situations that could lead to the death of both mother and fetus, at any location and time.
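A neuro‐fuzzy classifier couples fuzzy membership functions with a trainable inference network. As a minimal sketch of the fuzzification step only, using made-up blood-pressure breakpoints (the paper's actual inputs and thresholds are not reproduced here):

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular fuzzy membership: rises from a to a peak at b, falls to c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Hypothetical fuzzification of one input, systolic blood pressure (mmHg).
# The breakpoints are illustrative values, not clinical thresholds.
bp = 150.0
memberships = {
    "normal":   tri(bp, 90, 110, 130),
    "elevated": tri(bp, 120, 140, 160),
    "severe":   tri(bp, 150, 180, 210),
}
# bp = 150 falls on the descending slope of "elevated": membership 0.5,
# with zero membership in "normal" and "severe".
```

In a full neuro‐fuzzy system, such membership degrees feed rule nodes whose weights are learned from data, which is what lets the model combine symptom readings into a risk prediction.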
Theft of electricity poses a significant risk to the public and is the most costly non-technical loss for an electrical supplier. In addition to degrading the quality of the energy supply and straining the power grid, fraudulent electricity use drives up prices for honest customers and creates a ripple effect on the economy. Using data-analysis tools, smart grids may drastically reduce this waste. Smart-grid technology produces a wealth of information, including consumers’ unique electricity-use patterns; by analyzing this information, machine-learning and deep-learning methods can pinpoint those who engage in energy theft. This study presents an ensemble-learning-based system for detecting energy theft using a hybrid approach: a machine-learning ensemble model based on a majority voting strategy. This work aims to develop a smart-grid information-security decision support system. The study employs a theft-detection dataset, TDD2022, to facilitate automatic theft recognition in a smart-grid environment; the dataset covers six distinct types of electricity theft. Experiments are performed in four scenarios, and the proposed ensemble model obtained significant results in all of them, achieving accuracies of 88%, 87.24%, 94.75%, and 94.70% with, respectively, seven classes including the consumer type, seven classes excluding it, six classes including it, and six classes excluding it. The suggested ensemble model outperforms existing state-of-the-art techniques in terms of accuracy.
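Majority voting of the kind described can be sketched with scikit-learn's VotingClassifier. The synthetic six-class data and the particular base learners below are illustrative stand-ins, not the TDD2022 features or the paper's exact model mix:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Stand-in data: six theft classes, mimicking a multi-class theft-type label.
X, y = make_classification(n_samples=600, n_features=10, n_informative=6,
                           n_classes=6, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# voting="hard" implements the majority-vote strategy: each base learner
# casts one vote per sample and the most common class wins.
ensemble = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("dt", DecisionTreeClassifier(random_state=0)),
                ("rf", RandomForestClassifier(random_state=0))],
    voting="hard",
)
ensemble.fit(X_tr, y_tr)
acc = ensemble.score(X_te, y_te)  # accuracy on the held-out split
```

Hard voting needs no calibrated probabilities from the base learners, which makes it a robust default when the ensemble mixes model families.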
The Internet of Things (IoT) is one of the most promising technologies for the near future, and healthcare and well-being stand to benefit greatly from its evolution. This paper presents a review of IoT-based techniques for healthcare and ambient-assisted living, defined as the Internet of Health Things (IoHT), based on the most recent publications and on products available in the market for this segment. The paper also identifies the technological advances made so far, analyzes the challenges to be overcome, and outlines future trends. The selected works show that further studies are needed to improve current techniques and that novel IoHT concepts and technologies are required to overcome the identified challenges. The presented results aim to serve as a source of information for healthcare providers, researchers, technology specialists, and the general population to improve the IoHT.
The emergence of the Internet of Things (IoT) and its applications has attracted the attention of many researchers. In an effort to provide interoperability and IPv6 support for IoT devices, the Internet Engineering Task Force (IETF) proposed the 6LoWPAN stack. However, the particularities and hardware limitations of networks associated with IoT devices lead to several challenges, mainly for routing protocols. Within this stack, the IETF standardized RPL (IPv6 Routing Protocol for Low-Power and Lossy Networks) as the routing protocol for Low-power and Lossy Networks (LLNs). RPL is a tree-based proactive routing protocol that creates acyclic graphs among the nodes to allow data exchange. Although widely adopted by current applications, recent studies have shown its limitations and drawbacks, notably its weak support for mobility and P2P traffic, restrictions on multicast transmissions, and poor adaptation to dynamic throughput. Motivated by these issues, several new solutions have emerged in recent years, ranging from the consideration of different routing metrics to entirely new solutions inspired by other routing protocols. In this context, this work presents an extensive survey of routing solutions for IoT/LLNs, not limited to RPL enhancements. The paper presents the routing requirements of LLNs, the initial protocols, and the most recent approaches. The IoT routing enhancements are divided according to their main objectives and then studied individually to point out their most important strengths and weaknesses. Furthermore, as its main contribution, this study presents a comprehensive discussion of the considered approaches, identifying the remaining open issues and suggesting future directions to be addressed by new proposals.
Orthogonal frequency division multiplexing (OFDM) is a key technology for keeping up with increasingly high data transmission rates, owing to its robustness against frequency-selective fading and its resistance to intersymbol interference (ISI). This paper aims to study and optimize its performance, namely its probability of communication failure (outage) and its signal‐to‐interference‐plus‐noise ratio (SINR). The mathematical tool used to perform the OFDM evaluation and analysis is the Fourier transform and its properties. The results demonstrate the probability of communication outage and show that the cutoff time occurs after the 10th iteration for values of λ less than or equal to 1; when λ exceeds 1, this period becomes small once the sensitivity factor is considered. Moreover, in terms of bit error rate (BER) and SINR, the influence of the modulation on the error vector magnitude (EVM) shows that the sensitivity factor Ω plays a key role in the transmission chain.
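The central role of the Fourier transform in OFDM can be illustrated with a small NumPy round trip: the transmitter maps QPSK symbols onto subcarriers with an IFFT and prepends a cyclic prefix, and the receiver recovers them with an FFT. The subcarrier count and prefix length are arbitrary example values:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sub, cp = 64, 16  # subcarriers and cyclic-prefix length (illustrative)

# Random QPSK symbols, one per subcarrier.
bits = rng.integers(0, 2, size=(2, n_sub))
symbols = ((2 * bits[0] - 1) + 1j * (2 * bits[1] - 1)) / np.sqrt(2)

# OFDM modulation: the IFFT maps subcarrier symbols to a time-domain block;
# the cyclic prefix guards against ISI from multipath delay spread.
time_block = np.fft.ifft(symbols) * np.sqrt(n_sub)
tx = np.concatenate([time_block[-cp:], time_block])

# Ideal (noiseless, flat) channel: the receiver strips the prefix
# and applies the FFT to recover the per-subcarrier symbols.
rx = np.fft.fft(tx[cp:]) / np.sqrt(n_sub)
assert np.allclose(rx, symbols)  # round trip recovers the QPSK symbols
```

With a real channel, each subcarrier is simply scaled by the channel's frequency response at that bin, which is exactly the property that makes per-subcarrier equalization, and the outage and SINR analysis above, tractable.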
The rapid development of social media and content-sharing platforms has been widely exploited to spread misinformation and fake news, making people believe harmful stories that can influence public opinion and cause panic and chaos among the population. Fake news detection has thus become an important research topic, aiming at flagging specific content as fake or legitimate. Fake news detection solutions can be divided into three main categories: content-based, social context-based, and knowledge-based approaches. In this paper, we propose a novel hybrid fake news detection system that combines linguistic and knowledge-based approaches and inherits their advantages by employing two different sets of features: (1) linguistic features (i.e., title, number of words, reading ease, lexical diversity, and sentiment), and (2) a novel set of knowledge-based features, called fact-verification features, that comprise three types of information: (i) the reputation of the website where the news is published, (ii) coverage, i.e., the number of sources that published the news, and (iii) fact-check, i.e., the opinion of well-known fact-checking websites about the news (true or false). The proposed system employs only eight features, fewer than most state-of-the-art approaches. Evaluation results on a fake news dataset show that the system employing both types of features reaches an accuracy of 94.4%, better than that obtained by separately employing linguistic features (accuracy = 89.4%) or fact-verification features (accuracy = 81.2%).
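The eight-feature design can be sketched as a compact feature vector feeding a standard classifier. The encodings, the two toy articles, and the choice of a random forest below are illustrative assumptions, not the paper's exact pipeline:

```python
from dataclasses import dataclass
from sklearn.ensemble import RandomForestClassifier

@dataclass
class NewsFeatures:
    """The eight features named in the abstract; the numeric encodings
    are assumptions made for this sketch."""
    title_length: int        # linguistic: words in the title
    n_words: int             # linguistic: words in the body
    reading_ease: float      # linguistic: readability score
    lexical_diversity: float # linguistic: unique words / total words
    sentiment: float         # linguistic: polarity in [-1, 1]
    reputation: float        # fact-verification: source reputation score
    coverage: int            # fact-verification: sources that published it
    fact_check: int          # fact-verification: 1 = judged true, 0 = false

    def as_row(self):
        return [self.title_length, self.n_words, self.reading_ease,
                self.lexical_diversity, self.sentiment,
                self.reputation, self.coverage, self.fact_check]

# Toy training rows with fabricated values (label 1 = fake, 0 = legitimate).
X = [NewsFeatures(12, 850, 62.0, 0.48, 0.05, 0.9, 14, 1).as_row(),
     NewsFeatures(18, 230, 81.0, 0.31, -0.7, 0.2, 1, 0).as_row()]
y = [0, 1]

clf = RandomForestClassifier(random_state=0).fit(X, y)
pred = clf.predict([NewsFeatures(15, 400, 70.0, 0.40, -0.3, 0.3, 2, 0).as_row()])
```

Keeping the feature vector this small is what makes the fact-verification signals (reputation, coverage, fact-check) easy to inspect alongside the linguistic ones.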
The Distributed Denial of Service (DDoS) attack is considered one of the most critical threats on the Internet, blocking legitimate users from accessing online services. Botnets have exploited insecure IoT devices and used them to launch DDoS attacks. Providing IoT devices with the ability to detect DDoS attacks will prevent them from becoming contributors to these attacks. This paper presents an efficient solution to defend IoT devices against such attacks. The proposed solution consists of two parts: an IoT node detector and a server detector. The IoT node detector is a lightweight classifier that monitors egress traffic. The server detector is a more accurate classifier that is used by an IoT node when it suspects itself to be a contributor to a DDoS attack. To develop an accurate server detector, this paper proposes ShieldRNN: a novel training and prediction approach for RNN/LSTM models. We compare ShieldRNN with other supervised and unsupervised models on the CIC-IDS2017 dataset and show that it outperforms them. We also set baseline results for DDoS detection on the CIC IoT 2022 dataset.
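The two-tier flow can be sketched as a cheap on-device check that escalates suspicious traffic windows to a server-side LSTM. This is a generic sketch in PyTorch: the threshold, feature count, and model shape are invented for illustration, and it does not reproduce the ShieldRNN training/prediction scheme itself:

```python
import torch
import torch.nn as nn

class ServerDetector(nn.Module):
    """Illustrative server-side LSTM traffic classifier (a generic sketch,
    not the paper's ShieldRNN approach)."""
    def __init__(self, n_features: int = 8, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):  # x: (batch, time_steps, features)
        out, _ = self.lstm(x)
        return torch.sigmoid(self.head(out[:, -1]))  # P(DDoS) from last step

def node_is_suspicious(packet_rate: float, threshold: float = 500.0) -> bool:
    """Stand-in for the lightweight on-device check of egress traffic;
    the packets-per-second threshold is a made-up example value."""
    return packet_rate > threshold

server = ServerDetector()
window = torch.randn(1, 20, 8)  # 20 time steps of 8 flow features
p_ddos = None
if node_is_suspicious(packet_rate=750.0):
    p_ddos = server(window).item()  # escalate to the heavier server model
```

Splitting the work this way keeps per-packet cost on the constrained device near zero while reserving the sequence model for the rare escalated cases.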
Internet of Things (IoT) management systems require scalability, standardized communication, and context-awareness to manage connected devices securely and accurately in real environments. Interoperability and heterogeneity between the hardware and application layers are also critical issues. To meet the network requirements and support different functionalities, a dynamic and context-sensitive configuration management system is required; reference architectures (RAs) provide a basic architecture and define the key characteristics for the construction of IoT environments. Choosing the best IoT management platforms and protocols through comparison and evaluation is a hard task, since their lack of standardization makes them difficult to compare, and no management platform in the literature addresses all IoT issues. For this purpose, this paper surveys the available policies and solutions for managing IoT networks and devices. The available technologies were evaluated on features such as heterogeneity, scalability, supported technologies, and security, and based on this evaluation, the most promising ones were chosen for a detailed performance evaluation study (through simulation and deployment in real environments). In terms of contributions, these protocols and platforms are studied in detail, the main features of each approach are highlighted and discussed, open research issues are identified, and lessons learned on the topic are presented.