With the proliferation of information and communication technology in every walk of society, including healthcare services, digitization and sophistication have been gaining pace, and digital healthcare alternatives such as the electronic healthcare record (EHR) have gained prominence with increasing volumes of patient data. However, traditional EHR-based systems are plagued by, among other issues, risks of data loss, inadequate security and immutability guarantees for health records, communication gaps among constituent hospitals, and inefficient clinical data retrieval. Blockchain has emerged as a decentralized technology that holds the promise of addressing these deficiencies in EHR-based systems. This article presents a patient-centric design of a decentralized healthcare management system with a Blockchain-based EHR using JavaScript-based smart contracts. A working prototype based on Hyperledger Fabric and Composer technology has also been implemented, which guarantees the security of the proposed model. Experiments with the Hyperledger Caliper benchmarking tool report performance metrics such as latency, throughput, and resource utilization under varied scenarios and control parameters. The results affirm the efficacy of the proposed approach.
Traditional methods of Human Activity Recognition (HAR) do not take into consideration physical attributes of human subjects such as height, weight, and gender. Thus, a particular recognition model does not perform consistently for different subjects with diverse physical attributes. In this paper, we propose a novel Physique-based HAR method that addresses this problem and provides better accuracy in comparatively less time. Raw sensor data are acquired from the inbuilt accelerometer and gyroscope modules of smartphones. After pre-processing, the collected data are partitioned into physique-based datasets according to the similarity of the subjects' physiques. Both the physique-based datasets and the traditional dataset are analysed with various machine learning algorithms. The work not only identifies suitable learning algorithms for HAR but also shows that the proposed physique-based method outperforms the traditional HAR approach on both a publicly available dataset and our own generated dataset, with an accuracy of 99.88%. Individual activity-wise accuracy results are also compared with several recent benchmarks to show the efficiency of the proposed Physique-based HAR method.
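The partition-then-train idea above can be sketched in a few lines: group subjects by a coarse physique measure and train one classifier per group, routing a new subject's sensor windows to the matching model. Everything here is illustrative, not the paper's pipeline: the BMI bands, the synthetic six-feature windows, and the random-forest choice are all assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

def physique_group(height_cm, weight_kg):
    """Assign a subject to a coarse physique group by BMI (illustrative bands)."""
    bmi = weight_kg / (height_cm / 100) ** 2
    return "light" if bmi < 25.0 else "heavy"

def make_windows(n, activity, group_offset):
    """Synthetic feature windows (e.g., accelerometer statistics); each
    physique group gets its own sensor-signature offset."""
    centre = {"walking": 1.0, "sitting": -1.0}[activity]
    X = rng.normal(centre + group_offset, 0.3, size=(n, 6))
    y = np.full(n, activity)
    return X, y

# Train one classifier per physique group instead of a single global model.
models = {}
for group, offset in [("light", 0.0), ("heavy", 0.5)]:
    Xs, ys = zip(*(make_windows(200, a, offset) for a in ("walking", "sitting")))
    X, y = np.vstack(Xs), np.concatenate(ys)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)
    models[group] = (clf, clf.score(X_te, y_te))

# Route a new subject's windows to the model matching their physique.
group = physique_group(height_cm=180, weight_kg=90)
clf, acc = models[group]
```

The key design point is the routing step: at inference time a subject is first mapped to a physique group, and only that group's model is consulted.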
With the proliferation of the smart grid and the deregulation of the energy market, a wide variety of peer-to-peer (P2P) energy trading systems have emerged. Common challenges in designing such systems include prosumers' privacy and security threats. To this end, Blockchain-based solutions have gained a lot of attention, though most existing solutions have either employed permissionless blockchains, which are far from pragmatic for a P2P energy trading system whose peers may join or leave the network at will, or relatively secure yet inefficient permissioned blockchains. Hence, this article presents a flexible permissioned ascription (FPA) scheme that uses on-chain and off-chain permissioning via the Orion and MetaMask wallets. It also employs contract permissioning through a JavaScript-based chaincode deployed over Hyperledger Besu (an Ethereum-based permissioned Blockchain network) with the Istanbul Byzantine Fault Tolerant (IBFT) 2.0 consensus algorithm. Additionally, the proposed framework is emulated to develop a working prototype of a P2P energy trading system. Its performance has been evaluated and monitored with Grafana, Prometheus, Hyperledger Caliper, and Kibana for parameters such as latency, throughput, success rate, CPU time, block time, block-behind time, memory usage, garbage collection (GC) time, and the performance of the validator nodes. The latency of IBFT 2.0 was found to be five times lower than that of Ethereum and two times lower than that of Hyperledger Fabric with RAFT and Kafka ordering under varying conditions. The measured throughput was 1.5 times higher than with RAFT and Kafka and three times higher than that of Ethereum. The average measured block confirmation time is 5-6 s, and GC usage with the proposed framework is very low, i.e., 0.5-0.8%.
It has been observed that the proposed energy-trading framework provides efficient performance for deploying, transferring, and querying energy transactions on a P2P energy-trading Blockchain network when compared with other consensus mechanisms.
We attempt to predict accidental falls of human beings due to sudden abnormal changes in their health parameters such as blood pressure, heart rate, and sugar level. In medical terminology, this problem is known as syncope. The primary motivation is to prevent such falls by predicting the abnormal changes in these health parameters that might trigger a sudden fall. We apply various machine learning algorithms, such as logistic regression, a decision tree classifier, a random forest classifier, K-Nearest Neighbours (KNN), a support vector machine, and a naive Bayes classifier, on a relevant dataset and verify our results with cross-validation. We observe that the KNN algorithm provides the best accuracy in predicting such a fall. However, the accuracy results of several other algorithms are very close. Thus, we move one step further and propose an ensemble model, Majority Voting, which aggregates the prediction results of multiple machine learning algorithms and finally indicates the probability of a fall for a particular human being. The proposed ensemble algorithm yields 87.42% accuracy, which is greater than that of the KNN algorithm.
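A majority-voting ensemble of the kind described above can be sketched with scikit-learn's `VotingClassifier`, which takes the class predicted by most base learners. The synthetic health-parameter data below (blood pressure, heart rate, sugar level) and the labeling rule are illustrative assumptions, not the paper's dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 400
X = np.column_stack([
    rng.normal(120, 15, n),   # systolic blood pressure (illustrative)
    rng.normal(75, 10, n),    # heart rate
    rng.normal(100, 20, n),   # blood sugar
])
# Illustrative label: fall risk when two parameters deviate jointly.
y = ((np.abs(X[:, 0] - 120) > 15) & (np.abs(X[:, 2] - 100) > 15)).astype(int)

ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("nb", GaussianNB()),
    ],
    voting="hard",  # hard voting = majority vote over class predictions
)
scores = cross_val_score(ensemble, X, y, cv=5)
```

`voting="hard"` matches the majority-vote aggregation described in the abstract; `voting="soft"` would instead average predicted probabilities.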
In wireless rechargeable sensor networks (WRSNs), target coverage is an important issue. However, most studies assumed that the sensors were battery-powered with fixed sensing radii. Furthermore, most existing studies applied the Boolean Sensing Model (BSM), which cannot reflect the physical characteristics of sensing; as a result, detection accuracy decreases in real applications. This paper proposes a target coverage mechanism, called TCSAR, which aims to maximize the surveillance quality of the points of interest (POIs) under the constraint of perpetual network lifetime. The proposed TCSAR applies the Probabilistic Sensing Model (PSM) and assumes that the sensing radius of each sensor is adjustable. The proposed mechanism consists of two phases. In the scheduling phase, TCSAR first evaluates the surveillance contribution of each sensor and then schedules the sensor with the maximal contribution to the bottleneck POIs. In the space-time transformation phase, the surveillance quality of the POIs is further improved by shifting the sensing radii of some sensors from the space dimension to the time dimension. The experimental results show that the proposed mechanism outperforms existing mechanisms in terms of worst-case surveillance quality and quality of monitoring.
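The scheduling phase can be illustrated with a small greedy sketch: under a probabilistic sensing model, repeatedly assign the sensor with the largest contribution to whichever POI is currently worst covered (the bottleneck). The exponential-decay detection model, the decay constant, and the coordinates below are assumptions for illustration, not TCSAR's exact formulation.

```python
import math

def detect_prob(sensor, poi, radius, alpha=0.5):
    """PSM: detection probability decays exponentially with distance,
    and is zero beyond the sensing radius."""
    d = math.dist(sensor, poi)
    return math.exp(-alpha * d) if d <= radius else 0.0

def miss_prob(poi, assigned, radius):
    """Probability that every assigned sensor misses an event at this POI."""
    p = 1.0
    for s in assigned:
        p *= 1.0 - detect_prob(s, poi, radius)
    return p

pois = [(0.0, 0.0), (4.0, 0.0)]
sensors = [(1.0, 0.0), (3.0, 0.0), (2.0, 2.0)]
radius = 3.0
assignment = {p: [] for p in pois}

# Greedy pass: strongest sensors first, each going to the bottleneck POI.
for s in sorted(sensors, key=lambda s: -max(detect_prob(s, p, radius) for p in pois)):
    bottleneck = max(pois, key=lambda p: miss_prob(p, assignment[p], radius))
    if detect_prob(s, bottleneck, radius) > 0:
        assignment[bottleneck].append(s)

# Surveillance quality of a POI = probability at least one sensor detects.
quality = {p: 1.0 - miss_prob(p, assignment[p], radius) for p in pois}
```

The `1 - ∏(1 - p_i)` form is the standard way to combine independent probabilistic detections, which is what makes the PSM richer than a Boolean disk model.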
As the Internet of Things (IoT) paradigm matures, innovative and novel services are being envisioned. An upcoming trend is services enacted through the seamless integration of multiple vertical IoT services, termed cross-vertical or unified IoT services in this paper. Traditional Cloud-based centralized network architectures cannot cater to the real-time responses demanded by such unified IoT applications. Moreover, introducing Fog nodes within the network architecture, though a promising alternative, cannot sustain the burden of the huge number of applications that culminates in massive data handling. In this paper, we employ lessons learned from context-aware computing, specifically context sharing among interdependent vertical IoT applications, by enacting context sharing among Fog nodes to minimize system delay and meet the delay requirements of such unified IoT applications. A detailed network model and context-sharing mechanism are presented, and the service-time minimization is framed as an optimization problem. Algorithms for context sharing and delay-tolerant load balancing are presented, and simulation results demonstrate the efficacy of the proposed methodology.
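The delay benefit of context sharing can be conveyed with a toy model: a Fog node resolves a context instance locally, from a neighbouring Fog node's cache, or, failing both, from the Cloud. The node names, context keys, and delay figures below are illustrative assumptions, not the paper's measured values or protocol.

```python
# Illustrative (assumed) one-way resolution delays, in milliseconds.
CLOUD_RTT_MS = 100.0
FOG_NEIGHBOUR_RTT_MS = 10.0
LOCAL_LOOKUP_MS = 1.0

class FogNode:
    def __init__(self, name, context_cache=None, neighbours=()):
        self.name = name
        self.cache = set(context_cache or [])
        self.neighbours = list(neighbours)

    def serve(self, context_key):
        """Return (source, delay_ms) for resolving one context instance."""
        if context_key in self.cache:
            return "local", LOCAL_LOOKUP_MS
        for nb in self.neighbours:           # context sharing among Fog nodes
            if context_key in nb.cache:
                self.cache.add(context_key)  # keep a copy for future requests
                return nb.name, FOG_NEIGHBOUR_RTT_MS
        self.cache.add(context_key)          # last resort: fetch from Cloud
        return "cloud", CLOUD_RTT_MS

# A health-monitoring Fog node reuses context produced by a traffic Fog node.
traffic = FogNode("traffic-fog", context_cache={"road:A1:congestion"})
health = FogNode("health-fog", neighbours=[traffic])

first = health.serve("road:A1:congestion")   # served by the neighbour
second = health.serve("road:A1:congestion")  # now cached locally
```

The point of the sketch is the ordering of lookups: local cache, then peer Fog nodes, then Cloud, which is where the delay savings for interdependent vertical applications come from.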
Using a mobile sink for data collection from sensors has been considered a good way to prolong the network lifetime of wireless sensor networks (WSNs). To avoid the problems of long delay and data staleness, some previous studies selected a set of data collection points (CPs), with all other sensors transmitting their data to the closest CP in a multi-hop manner. The mobile sink then only needs to visit and collect data from the selected CPs, reducing its path length and the time required to collect data from all sensors. This paper proposes a data collection mechanism, called DDCF, which uses a mobile sink to collect data in a heterogeneous WSN (HWSN), aiming to prolong the network lifetime while improving surveillance coverage. The proposed DDCF mainly consists of a CP selection phase and a tree topology construction phase in each round. The CP selection phase takes into consideration multiple parameters, including remaining energy, coverage contribution, and data fusion degree, and then dynamically selects CPs in each round to balance the lifetime of CPs and improve surveillance quality. In the tree topology construction phase, each sensor selects its parent by considering remaining energy, data fusion degree, and transmission success ratio, which aims to dynamically construct a tree topology that further reduces and balances the energy consumption of sensor nodes. The performance study shows that the proposed DDCF outperforms existing studies in terms of network lifetime, fairness, and surveillance quality.
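The round-wise CP selection can be sketched as a weighted ranking over the three parameters named above. The linear scoring form, the weights, and the sensor values below are illustrative assumptions; the paper's exact formulation may differ.

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    sid: int
    energy: float     # remaining energy, normalised to [0, 1]
    coverage: float   # coverage contribution, normalised to [0, 1]
    fusion: float     # data fusion degree, normalised to [0, 1]

def cp_score(s, w_e=0.5, w_c=0.3, w_f=0.2):
    """Weighted CP suitability score (weights are illustrative)."""
    return w_e * s.energy + w_c * s.coverage + w_f * s.fusion

def select_cps(sensors, k):
    """Dynamically (re)select the k best collection points for this round."""
    return sorted(sensors, key=cp_score, reverse=True)[:k]

sensors = [
    Sensor(1, energy=0.9, coverage=0.4, fusion=0.5),
    Sensor(2, energy=0.3, coverage=0.9, fusion=0.9),
    Sensor(3, energy=0.8, coverage=0.8, fusion=0.2),
    Sensor(4, energy=0.2, coverage=0.2, fusion=0.3),
]
cps = select_cps(sensors, k=2)
```

Because the score is recomputed every round from remaining energy, the role of CP rotates away from depleted nodes, which is what balances CP lifetimes.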
Wireless charging is one of the most important issues in wireless sensor networks (WSNs), aiming to cope with the energy limitation of sensors. Many existing studies applied a mobile charger to recharge the sensors and maintain a perpetual network lifetime. However, when choosing the sensors to be recharged, the coverage contribution of the sensors was ignored. This paper proposes a recharging scheme, called CAERM, which considers the coverage contribution of each requesting sensor and constructs a recharging path that maximizes the coverage of the whole network. Two algorithms, the Simple Recharging Coverage Benefit (S-RCB) and Chain-Effect Recharging Coverage Benefit (CE-RCB) algorithms, are proposed to evaluate the coverage contribution of the requesting sensors effectively. Extensive simulations show that the proposed scheme improves recharging efficiency while maximizing the coverage contribution and monitoring quality of the given sensor networks.
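A coverage-aware recharging tour in this spirit can be sketched greedily: among the sensors requesting a recharge, the charger repeatedly visits the one offering the best coverage benefit per unit of travel distance. This is only a plain greedy illustration of the idea, not S-RCB or CE-RCB themselves; the positions and benefit values are made-up inputs.

```python
import math

def plan_tour(requests, start=(0.0, 0.0)):
    """requests: {sensor_id: (position, coverage_benefit)} -> visit order.
    Greedily picks the pending request with the highest benefit-per-distance
    from the charger's current position."""
    pending, pos, tour = dict(requests), start, []
    while pending:
        nxt = max(
            pending,
            key=lambda sid: pending[sid][1] / (math.dist(pos, pending[sid][0]) + 1e-9),
        )
        tour.append(nxt)
        pos = pending.pop(nxt)[0]
    return tour

requests = {
    "s1": ((1.0, 0.0), 3.0),   # close, high coverage benefit
    "s2": ((5.0, 0.0), 10.0),  # far, very high benefit
    "s3": ((1.0, 1.0), 1.0),   # close, low benefit
}
tour = plan_tour(requests)
```

Dividing benefit by travel distance is what distinguishes a coverage-aware tour from a pure nearest-first or first-come-first-served charging policy.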
As a result of the proliferation of digital and network technologies in all facets of modern society, including healthcare systems, the widespread adoption of Electronic Healthcare Records (EHRs) has become the norm. At the same time, Blockchain has been widely accepted as a potent solution for addressing security issues in untrusted, distributed, decentralized applications and has thus seen a slew of works on Blockchain-enabled EHRs. However, most such prototypes ignore the performance aspects of the proposed designs. In this paper, a prototype for a Blockchain-based EHR is presented that employs smart contracts with Hyperledger Fabric 2.0 and provides a unified performance analysis with Hyperledger Caliper 0.4.2. An additional contribution of this paper lies in the use of a multi-hosted testbed for the performance analysis, along with a far more realistic Gossip-based traffic scenario analysis with the tcpdump tool. Moreover, the prototype is tested for performance with superior transaction-ordering schemes such as Kafka and RAFT, unlike most other literature, which uses SOLO for the purpose; this accounts for superior fault tolerance. These additional features make the performance evaluation presented herein much more realistic and hence add greatly to the credibility of the results obtained. The proposed framework within the multi-host instances performs successfully, with high throughput, low latency, and low resource utilization for opening, querying, and transferring transactions in a healthcare Blockchain network. The results obtained in various rounds of evaluation demonstrate the superiority of the proposed framework.
Smart cities aim to use technology and data to utilize city resources efficiently, improving residents' quality of life and advancing sustainability. Efficient management of these resources, with the Internet of Things (IoT) playing a crucial role, is essential for enhancing smart cities' functionality and sustainability. As the number of users grows, providing efficient and timely services becomes increasingly challenging, and intelligent management of resources is key to delivering effective IoT services. This paper introduces a novel approach, the Ensemble Fog Layered Service Management (EFLSM) model, designed to manage and optimize resource allocation and usage within a smart city's fog computing layer. The model addresses challenges such as unnecessary application migrations, efficient context sharing, and fog node migration. The proposed EFLSM model operates in three stages: first, it assigns service requests to appropriate fog nodes based on the available computing resources and context history through a multi-channel queuing model; second, it ensures efficient context sharing and migration to address context-instance shortages; and third, it guides service-request migration to selected fog nodes using ensemble machine learning predictions. Experimental results from a hardware test-bed illustrate the deployment of the EFLSM model and demonstrate its superior performance over existing methods such as SMCQ, greedy strategies, and random allocation.
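The first stage, request-to-node assignment, can be given a minimal multi-channel-queue flavour: each request goes to an eligible fog node (one with enough capacity) that currently has the shortest queue, overflowing to the Cloud tier otherwise. Node capacities, request demands, and the shortest-queue rule are illustrative assumptions, not EFLSM's exact policy.

```python
class Fog:
    def __init__(self, name, capacity):
        self.name, self.capacity, self.queue = name, capacity, 0

def assign(requests, nodes):
    """Place each (request_id, resource_demand) pair on the eligible fog node
    with the shortest waiting queue; fall back to the Cloud on overflow."""
    placement = {}
    for req, demand in requests:
        eligible = [n for n in nodes if n.capacity >= demand]
        if eligible:
            target = min(eligible, key=lambda n: n.queue)
            target.queue += 1
            placement[req] = target.name
        else:
            placement[req] = "cloud"  # no fog node can host this demand
    return placement

nodes = [Fog("fog-1", capacity=4), Fog("fog-2", capacity=2)]
requests = [("r1", 1), ("r2", 3), ("r3", 1), ("r4", 5)]
placement = assign(requests, nodes)
```

In the full model this placement decision would also consult context history and the ensemble predictor; the sketch isolates only the capacity-and-queue check.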