The fast development of Internet of Things (IoT) technology in recent years has supported connections of numerous smart things and sensors and established seamless data exchange between them, leading to a stringent requirement for data analysis and data storage platforms such as cloud computing and fog computing. Healthcare is one of the application domains of IoT that draws enormous interest from industry, the research community, and the public sector. The development of IoT and cloud computing is improving patient safety, staff satisfaction, and operational efficiency in the medical industry. This survey analyzes the latest IoT components, applications, and market trends of IoT in healthcare, and studies developments in IoT and cloud computing-based healthcare applications since 2015. We also consider how promising technologies such as cloud computing, ambient assisted living, big data, and wearables are being applied in the healthcare industry, and examine various IoT and e-health regulations and policies worldwide to determine how they support the sustainable development of IoT and cloud computing in the healthcare industry. Moreover, an in-depth review of IoT privacy and security issues, including potential threats, attack types, and security setups from a healthcare viewpoint, is conducted. Finally, this paper analyzes previous well-known security models to deal with security risks and provides trends, highlighted opportunities, and challenges for the future development of IoT-based healthcare.
Driven by the emergence of new compute-intensive applications and the vision of the Internet of Things (IoT), it is foreseen that the emerging fifth-generation (5G) network will face an unprecedented increase in traffic volume and computation demands. However, end users mostly have limited storage capacities and finite processing capabilities, so how to run compute-intensive applications on resource-constrained users has recently become a natural concern. Mobile edge computing (MEC), a key technology in the emerging 5G network, can optimize mobile resources by hosting compute-intensive applications, process large data before sending them to the cloud, provide cloud-computing capabilities within the radio access network (RAN) in close proximity to mobile users, and offer context-aware services with the help of RAN information. Therefore, MEC enables a wide variety of applications where real-time response is strictly required, e.g., driverless vehicles, augmented reality, robotics, and immersive media. Indeed, the paradigm shift from 4G to 5G could become a reality with the advent of new technological concepts. The successful realization of MEC in the 5G network is still in its infancy and demands constant efforts from both the academic and industry communities. In this survey, we first provide a holistic overview of MEC technology and its potential use cases and applications. Then, we outline up-to-date research on the integration of MEC with the new technologies that will be deployed in 5G and beyond. We also summarize testbeds, experimental evaluations, and open-source activities for edge computing. We further summarize lessons learned from state-of-the-art research works as well as discuss challenges and potential future directions for MEC research.
Recently, the Internet of Things (IoT) has been used in several fields such as smart cities, agriculture, weather forecasting, smart grids, and waste management. Even though IoT has huge potential in several applications, there are some areas for improvement. In the current work, we concentrate on minimizing the energy consumption of sensors in the IoT network, which will lead to an increase in the network lifetime. In this work, to optimize the energy consumption, the most appropriate Cluster Head (CH) is chosen in the IoT network. The proposed work makes use of a hybrid metaheuristic algorithm, namely, the Whale Optimization Algorithm (WOA) with Simulated Annealing (SA). To select the optimal CH in the clusters of the IoT network, several performance metrics such as the number of alive nodes, load, temperature, residual energy, and cost function have been used. The proposed approach is then compared with several state-of-the-art optimization algorithms such as the Artificial Bee Colony algorithm, the Genetic Algorithm, the Adaptive Gravitational Search algorithm, and standalone WOA. The results prove the superiority of the proposed hybrid approach over the existing approaches.
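As a concrete illustration of the kind of hybrid the abstract describes (not the paper's actual algorithm), the sketch below couples the standard WOA position updates with an SA refinement step on the best solution. The fitness weights, node attributes, and decoding of continuous positions into CH indices are all illustrative assumptions.

```python
# Hypothetical WOA + SA sketch for cluster-head (CH) selection.
# The cost weights and node attributes are illustrative, not the paper's.
import math
import random

import numpy as np

rng = np.random.default_rng(0)

N_NODES, N_CH, ITERS, POP = 100, 5, 50, 20
# Each node: [residual_energy, load, temperature], normalised to [0, 1].
nodes = rng.random((N_NODES, 3))

def fitness(ch_idx):
    """Lower is better: prefer CHs with high energy, low load, low temperature."""
    e, load, temp = nodes[ch_idx].mean(axis=0)
    return 0.5 * (1.0 - e) + 0.3 * load + 0.2 * temp

def decode(pos):
    """Map a continuous whale position to N_CH distinct node indices."""
    return np.argsort(pos)[:N_CH]

whales = rng.random((POP, N_NODES))
best = min(whales, key=lambda w: fitness(decode(w))).copy()

for t in range(ITERS):
    a = 2.0 - 2.0 * t / ITERS                      # linearly decreasing control parameter
    for i in range(POP):
        r = rng.random(N_NODES)
        A, C = 2 * a * r - a, 2 * rng.random(N_NODES)
        if rng.random() < 0.5:
            if np.all(np.abs(A) < 1):              # exploit: encircle the best whale
                whales[i] = best - A * np.abs(C * best - whales[i])
            else:                                  # explore: move toward a random whale
                x_rand = whales[rng.integers(POP)]
                whales[i] = x_rand - A * np.abs(C * x_rand - whales[i])
        else:                                      # spiral update around the best whale
            l = rng.uniform(-1, 1)
            whales[i] = (np.abs(best - whales[i]) * math.exp(l)
                         * math.cos(2 * math.pi * l) + best)
        if fitness(decode(whales[i])) < fitness(decode(best)):
            best = whales[i].copy()
    # SA refinement: perturb the incumbent and accept worse moves with a
    # temperature-dependent probability to escape local optima.
    T_sa = max(1e-3, 1.0 - t / ITERS)
    cand = best + rng.normal(0, 0.1, N_NODES)
    delta = fitness(decode(cand)) - fitness(decode(best))
    if delta < 0 or random.random() < math.exp(-delta / T_sa):
        best = cand

print("selected cluster heads:", decode(best))
```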
• A comprehensive review of vision-based and sensor-based human activity recognition (HAR).
• Summarize and discuss public datasets used in vision-based HAR and sensor-based HAR.
• Categorize and analyze standard data processing and feature engineering processes used in HAR.
• Categorize and analyze machine learning techniques for HAR, with a focus on current deep learning research in HAR.
• Discuss challenges and show future directions for HAR.
Human activity recognition (HAR) technology that analyzes data acquired from various types of sensing devices, including vision sensors and embedded sensors, has motivated the development of various context-aware applications in emerging domains, e.g., the Internet of Things (IoT) and healthcare. Even though a considerable number of HAR surveys and review articles have been published previously, most focus on particular HAR topics and overlook the field as a whole. Therefore, a comprehensive review paper that covers the major subjects in HAR is imperative. This survey analyzes the latest state-of-the-art HAR research of recent years, introduces a classification of HAR methodologies, and shows the advantages and weaknesses of the methods in each category. Specifically, HAR methods are classified into two main groups, sensor-based HAR and vision-based HAR, based on the generated data type. After that, each group is divided into subgroups covering different procedures, including data collection, pre-processing methods, feature engineering, and the training process. Moreover, an extensive review regarding the utilization of deep learning in HAR is also conducted. Finally, this paper discusses various challenges in the current HAR topic and offers suggestions for future research.
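To make the sensor-based HAR pipeline concrete (data collection, pre-processing, feature engineering, training), here is a minimal sketch, assuming synthetic tri-axial accelerometer data and a simple statistical feature set; window sizes and labels are illustrative, not from any surveyed work.

```python
# Minimal sensor-based HAR sketch: sliding-window segmentation of tri-axial
# accelerometer data, hand-crafted statistical features, classical classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
FS, WIN, STEP = 50, 128, 64          # 50 Hz sampling, ~2.5 s windows, 50% overlap

def synth_activity(label, n=2000):
    """Toy tri-axial signal: 0 = walking (periodic), 1 = standing (noise only)."""
    t = np.arange(n) / FS
    base = np.sin(2 * np.pi * 2 * t) if label == 0 else np.zeros(n)
    return np.stack([base + 0.1 * rng.standard_normal(n) for _ in range(3)], axis=1)

def windows(sig, label):
    for s in range(0, len(sig) - WIN, STEP):
        w = sig[s:s + WIN]
        # Per-axis mean, std, and energy: a common minimal HAR feature set.
        yield np.concatenate([w.mean(0), w.std(0), (w ** 2).mean(0)]), label

X, y = [], []
for lab in (0, 1):
    for f, l in windows(synth_activity(lab), lab):
        X.append(f); y.append(l)

X_tr, X_te, y_tr, y_te = train_test_split(np.array(X), np.array(y), random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("window-level accuracy:", clf.score(X_te, y_te))
```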
Over the last few years, interference has been a major hurdle for successfully implementing various end-user applications in fifth-generation (5G) wireless networks. During this era, several communication protocols and standards have been developed and used by the community. However, interference persists, hindering the provision of a given quality of service (QoS) to end users for different 5G applications. To mitigate the issues mentioned above, in this paper we present an in-depth survey of state-of-the-art non-orthogonal multiple access (NOMA) variants, with the power and code domains as the backbone, for interference mitigation, resource allocation, and QoS management in the 5G environment. These variants target future smart communication and are supported by device-to-device (D2D) communication, cooperative communication (CC), multiple-input and multiple-output (MIMO), and heterogeneous networks (HetNets). From the existing literature, it has been observed that NOMA can resolve most of the issues in the existing proposals by providing contention-based grant-free transmissions between different devices. The key differences between orthogonal multiple access (OMA) and NOMA in 5G are also discussed in detail. Moreover, several open issues and research challenges of NOMA-based applications are analyzed. Finally, a comparative analysis of different existing proposals is also provided to give readers deep insights.
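To make the OMA/NOMA contrast concrete, the following is the standard textbook formulation of two-user downlink power-domain NOMA (generic notation, not this survey's): the base station superposes both users' signals, and the near user applies successive interference cancellation (SIC).

```latex
% Two-user downlink power-domain NOMA. The BS transmits a superposition of
% both users' signals with total power P; user 1 (weaker channel) receives
% the larger power share a_1:
\[
  x = \sqrt{a_1 P}\, s_1 + \sqrt{a_2 P}\, s_2,
  \qquad a_1 + a_2 = 1,\; a_1 > a_2 .
\]
% User 1 decodes s_1 directly, treating s_2 as interference:
\[
  \gamma_1 = \frac{a_1 P \lvert h_1 \rvert^2}{a_2 P \lvert h_1 \rvert^2 + \sigma^2}.
\]
% User 2 first decodes and cancels s_1 (SIC), then decodes s_2 interference-free:
\[
  \gamma_2 = \frac{a_2 P \lvert h_2 \rvert^2}{\sigma^2}.
\]
```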
Cognitive radio (CR) is among the promising solutions for overcoming the spectrum scarcity problem in the forthcoming fifth-generation (5G) cellular networks, where mobile stations are expected to support multimode operations to maintain connectivity to various radio access points. However, particularly for multimedia services, because of the time-varying channel capacity, the random arrivals of legacy users, and the non-negligible delay caused by spectrum handoff, it is challenging to achieve seamless streaming with minimal quality of experience (QoE) degradation. The objective of this paper is to manage spectrum handoff delays by allocating channels based on the users' QoE expectations, minimizing the latency, providing seamless multimedia service, and improving QoE. First, to minimize the handoff delays, we use channel usage statistics to compute the channel quality. Based on this, the cognitive base station maintains a ranking index of the available channels to facilitate the cognitive mobile stations. Second, to enhance channel utilization, we develop a priority-based channel allocation scheme that assigns channels to the mobile stations based on their QoE requirements. Third, to minimize handoff delays, we employ a hidden Markov model (HMM) to predict the state of the future time slot; because of sensing errors, the scheme proactively performs spectrum sensing and reactively performs handoffs. Fourth, we propose a handoff management technique to overcome the interruptions caused by handoff: when a handoff is predicted, we use scalable video coding to extract the base layer and transmit it during a certain interval before the handoff occurs, to be played during the handoff delay, hence providing seamless service. Our simulation results highlight the performance gain of the proposed framework in terms of channel utilization and received video quality.
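A toy sketch of the HMM prediction step described above, assuming a two-state channel (idle/busy) and sensing errors modeled as a noisy emission matrix; all probabilities are illustrative stand-ins for the channel usage statistics the paper refers to.

```python
# Two-state HMM (0 = idle, 1 = busy) predicting the channel state of the
# next time slot from an error-prone sensing history.
import numpy as np

A = np.array([[0.8, 0.2],      # P(next true state | current true state)
              [0.3, 0.7]])
B = np.array([[0.9, 0.1],      # P(sensed observation | true state);
              [0.15, 0.85]])   # off-diagonal entries model sensing errors
pi = np.array([0.5, 0.5])      # initial state distribution

def predict_next(observations):
    """Forward algorithm: filter the current state, then propagate one step."""
    alpha = pi * B[:, observations[0]]
    alpha /= alpha.sum()
    for o in observations[1:]:
        alpha = (alpha @ A) * B[:, o]
        alpha /= alpha.sum()
    return alpha @ A            # predictive distribution over the next slot

sensed = [0, 0, 1, 1, 1]        # sensed idle/busy history for one channel
p_next = predict_next(sensed)
print(f"P(idle next slot) = {p_next[0]:.3f}; trigger handoff if P(busy) is high")
```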
Today, with the worldwide availability of and rapid growth in multimedia applications on the web, users' demand to access them is also increasing prominently. Users in the vehicular environment likewise expect efficient multimedia streaming while travelling on the road. However, the high mobility of vehicles and the limited transmission range of infrastructure components in IP-based networks lead to low performance, with high delay and additional network overhead. To provide a better Quality of Experience (QoE) with high performance, Information Centric Networking (ICN) is blended with the vehicular environment. Caching content inside network nodes is an inherent feature of ICN, with various associated benefits such as low content retrieval delay, less network traffic, and path reduction. However, challenges still exist for caching content in a resource-constrained network environment (e.g., limited cache capacity and node battery) as well as for the secure delivery of cached data. To address these challenges and enhance network performance, we propose a cooperative caching scheme in a hierarchical network architecture that jointly considers the cache location as well as combined content popularity and a predicted future rating score when making caching decisions. The proposed approach uses a two-layer hierarchical architecture where nodes in the edge layer are divided into clusters. The scheme uses a modified Weighted Clustering Algorithm (WCA) for the selection of cluster heads, which are then used to decide the cache location. A probability matrix is used to compute the content caching probability, which considers both the popularity and the predicted future rating of content. The proposed approach dynamically predicts users' preferences using non-negative matrix factorization (NMF), a machine learning technique that provides predictions of future ratings. Based on the selection of both the cache location and the content to cache, the proposed scheme can effectively cache content in the network. Further, to deal with the secure delivery of cached content, this work supports legitimate user authorization at edge nodes. The performance of the proposed scheme is evaluated using the MATLAB Parallel Computing Toolbox. The results show significant caching improvements in terms of cache hit ratio, hop reduction, and average delay with our proposed scheme.
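A minimal sketch of the NMF-based rating prediction step, assuming a toy user-content rating matrix; note that plain NMF treats unobserved (zero) entries as actual zeros, which is a simplification of proper missing-data factorization. The 50/50 popularity/prediction weighting for the caching score is an illustrative assumption, not the paper's probability matrix.

```python
# Factorise a sparse user-content rating matrix with NMF and use the
# reconstruction to score content for caching.
import numpy as np
from sklearn.decomposition import NMF

# Rows = users, columns = content items; 0 marks an unobserved rating.
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)

model = NMF(n_components=2, init="random", random_state=0, max_iter=500)
W = model.fit_transform(R)           # user latent factors
H = model.components_                # content latent factors
R_hat = W @ H                        # reconstructed matrix = predicted ratings

# Combine observed popularity (normalised column sums) with the predicted
# mean rating per item into a caching score.
popularity = R.sum(axis=0) / R.sum()
pred = R_hat.mean(axis=0) / R_hat.mean(axis=0).sum()
cache_score = 0.5 * popularity + 0.5 * pred
print("cache priority (high to low):", np.argsort(-cache_score))
```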
Massive connectivity and limited energy are the main challenges for the beyond-5G (B5G)-enabled massive Internet of Things (IoT) in maintaining the diversified Quality of Service (QoS) of the huge number of IoT device users. Motivated by these challenges, this article studies the performance of cooperative simultaneous wireless information and power transfer (SWIPT) non-orthogonal multiple access (NOMA) for massive IoT systems. Under practical assumptions, residual hardware impairments (RHIs) and channel estimation errors (CEEs) are taken into account. The communication between the base station (BS) and two NOMA IoT device users is realized through a direct link and with the assistance of multiple relays with finite energy storage capability that can harvest energy from the BS. Aiming at improving the system performance, an optimal relay is selected among K relays by using the partial relay selection (PRS) protocol to forward the received signal to the two NOMA IoT device users, namely, the far user (FU) and the near user (NU). To evaluate the system performance, exact analytical expressions for the outage probability (OP) are derived in closed form. To gain a better understanding of the overall system performance, we further undertake diversity order analyses by deriving asymptotic expressions for the OP in the high signal-to-noise ratio (SNR) regime. In addition, we investigate the energy efficiency (EE) of the considered system, a crucial performance metric in massive IoT systems, so that the impact of key system parameters on the performance can be quantified. Finally, an optimal power allocation scheme to maximize the sum rate of the considered system in the high-SNR regime is also designed. Numerical results show that: 1) the hardware impairment parameter has a deleterious effect on system performance, while the channel estimation parameter is always beneficial to the OP; 2) the performance improvements obtained by the use of the PRS protocol are enhanced by increasing the number of relays; and 3) the proposed power allocation scheme can optimize the sum-rate performance of the considered system.
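For reference, the standard definitions behind the metrics named above (outage probability, diversity order, and energy efficiency) can be sketched as follows; the symbols gamma_th, rho, R_sum, and P_total are generic, not the paper's exact notation.

```latex
% Outage probability: the end-to-end SINR gamma falls below a threshold.
\[
  P_{\mathrm{out}} = \Pr\{\gamma < \gamma_{\mathrm{th}}\}.
\]
% Diversity order: the high-SNR slope of the outage probability.
\[
  d = -\lim_{\rho \to \infty} \frac{\log P_{\mathrm{out}}(\rho)}{\log \rho}.
\]
% Energy efficiency: achievable sum rate per unit of total consumed power.
\[
  \mathrm{EE} = \frac{R_{\mathrm{sum}}}{P_{\mathrm{total}}} \quad [\text{bits/J}].
\]
```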
This study presents a new nonlinear model-based predictive control scheme using fractional-order calculus and interval type-3 (IT3) fuzzy logic systems (FLSs) for Micro-Electro-Mechanical-System gyroscopes (MEMS-Gs). The dynamics of the MEMS-G are unknown and perturbed by actuator faults and disturbances. Two IT3-FLSs are used for online modeling of uncertainties and prediction of the tracking error. The IT3-FLSs are optimized online by Lyapunov adaptation rules such that stability and robustness are guaranteed. Also, the designed compensators adaptively tackle the effects of perturbations and estimation errors. Under various conditions, such as dynamic perturbations, actuator nonlinearities, tracking of a chaotic system, and tracking of a pulse signal with sharp rising and falling edges, we examine the capability of the suggested controller and compare it with recent controllers and other types of FLSs. We show that good tracking accuracy with the desired transient performance and minimal overshoot is obtained.
Since December 2019, the coronavirus disease (COVID-19) outbreak has caused many deaths and affected all sectors of human life. With the gradual progression of time, COVID-19 was declared by the World Health Organization (WHO) as an outbreak, which has imposed a heavy burden on almost all countries, especially ones with weaker health systems and slower responses. In the field of healthcare, deep learning has been implemented in many applications, e.g., diabetic retinopathy detection, lung nodule classification, fetal localization, and thyroid diagnosis. The numerous sources of medical images (e.g., X-ray, CT, and MRI) make deep learning a great technique to combat the COVID-19 outbreak. Motivated by this fact, a large number of research works were proposed and developed during the initial months of 2020. In this paper, we first focus on summarizing the state-of-the-art research works related to deep learning applications for COVID-19 medical image processing. Then, we provide an overview of deep learning and its applications to healthcare over the last decade. Next, three use cases in China, Korea, and Canada are presented to show deep learning applications for COVID-19 medical image processing. Finally, we discuss several challenges and issues related to deep learning implementations for COVID-19 medical image processing, which are expected to drive further studies in controlling the outbreak and managing the crisis, contributing to smart, healthy cities.
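As a concrete illustration of the kind of pipeline such works typically use (not any specific surveyed system), here is a minimal transfer-learning sketch: fine-tuning an ImageNet-pretrained CNN as a binary COVID/non-COVID X-ray classifier. The dataset path "xray_data", its folder layout, and all hyperparameters are hypothetical; no real data or results are implied.

```python
# Fine-tune a pretrained ResNet-18 for binary chest X-ray classification.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
# Hypothetical folder layout: xray_data/{covid,normal}/*.png
train_set = datasets.ImageFolder("xray_data", transform=tfm)
loader = torch.utils.data.DataLoader(train_set, batch_size=16, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)   # 2 classes: covid / normal
model = model.to(device)

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):                          # a few epochs for illustration
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    print(f"epoch {epoch}: last-batch loss {loss.item():.4f}")
```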