Recently, green computing has received significant attention in Internet of Things (IoT) environments due to the growing computing demands of tiny-sensor-enabled smart services. The related literature on green computing focuses mainly on the cover set approach, which works efficiently for target coverage but is not applicable to area coverage. In this paper, we present a new variant of the cover set approach, called the grouping and sponsoring aware IoT framework (GS-IoT), that is suitable for area coverage. We achieve non-overlapping coverage of the entire sensing region by employing sectorial sensing. Non-overlapping coverage not only guarantees sufficiently good coverage when a large number of sensors is deployed randomly, but also maximizes the life span of the whole network with appropriate scheduling of sensors. A deployment model for the distribution of sensors is developed to ensure a minimum threshold density of sensors in the sensing region. In particular, a fast converging grouping (FCG) algorithm is developed to group sensors so as to ensure minimal overlap, and a sponsoring aware sectorial coverage (SSC) algorithm is developed to switch off redundant sensors and balance the overall network energy consumption. The GS-IoT framework effectively combines both algorithms for smart services. Simulation results attest to the benefit of the proposed framework compared to state-of-the-art techniques across various metrics for smart IoT environments, including rate of overlap, response time, coverage, number of active sensors, and life span of the overall network.
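As a rough illustration of sectorial sensing, the sketch below tests whether a point falls inside a sensor's sensing sector (a wedge of the sensing disk). The geometry, parameter names, and angle convention are illustrative assumptions of ours; the abstract does not specify GS-IoT's actual sensing model.

```python
import math

def in_sector(sensor, point, radius, heading_rad, width_rad):
    """Return True if `point` lies inside the sensor's sensing sector:
    a wedge of the disk of the given radius, centered on `heading_rad`
    with angular width `width_rad`. Illustrative geometry only."""
    dx, dy = point[0] - sensor[0], point[1] - sensor[1]
    # Outside the sensing radius: not covered regardless of angle.
    if math.hypot(dx, dy) > radius:
        return False
    # Signed angular difference to the sector heading, wrapped to (-pi, pi].
    ang = math.atan2(dy, dx)
    diff = (ang - heading_rad + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= width_rad / 2

# A point straight ahead within range is covered; one at 90 degrees is not.
covered = in_sector((0, 0), (1, 0), 2.0, 0.0, math.pi / 2)
```

Non-overlap between two sensors' sectors could then be checked by sampling points in one sector and verifying none fall in the other, though the paper's FCG algorithm presumably does this analytically.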
Postquantum cryptography for elevating security against attacks by quantum computers in the Internet of Everything (IoE) is still in its infancy. Most postquantum cryptosystems have larger key and signature sizes and require more computation, spanning several orders of magnitude in energy consumption and computation time; hence, key and signature sizes are considered another aspect of security by green design. To address these issues, security solutions should migrate to advanced and potent methods that protect against quantum attacks while offering energy-efficient and faster cryptographic computations. In this context, a novel security framework, the Lightweight Postquantum ID-based Signature (LPQS), for secure communication in the IoE environment is presented. The proposed LPQS framework incorporates a supersingular isogeny curve to provide a quantum-resistant digital signature with small key sizes. To reduce the size of the keys, compressed curves are used, and validation of the signature depends on the commutative property of the curves. The unforgeability of LPQS under an adaptively chosen message attack is proved. Security analysis and experimental validation of LPQS are performed in a realistic software simulation environment to assess its lightweight performance on embedded nodes. The key and signature sizes of LPQS are smaller than those of existing signature-based postquantum security techniques for IoE, and LPQS is robust in the postquantum environment and efficient in terms of energy and computation.
This paper presents a comprehensive review of moth–flame optimization (MFO) and analyzes its main characteristics. MFO is considered one of the most promising metaheuristic algorithms and has been successfully applied to optimization problems in a wide range of fields, such as power and energy systems, economic dispatch, engineering design, image processing, and medical applications. This manuscript surveys the available literature on MFO, including its variants and hybridizations, the growth of MFO publications, MFO application areas, theoretical analyses, and comparisons of MFO with other algorithms. The conclusions summarize current work on MFO, highlight its weaknesses, and suggest possible future research directions. Researchers and practitioners in fields such as optimization, medicine, engineering, clustering, and data mining will benefit from this study.
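The spiral update at the heart of canonical MFO can be sketched as follows: each moth M_i spirals around an assigned flame F_j via M_i = D_i * e^(bt) * cos(2*pi*t) + F_j, where D_i = |F_j - M_i| and the number of flames shrinks over iterations. The minimal NumPy sketch below (population size, iteration count, and the sphere test function are our own assumptions) illustrates the base algorithm, not any specific variant covered by the review.

```python
import numpy as np

def mfo(obj, dim, n_moths=30, iters=200, lb=-10.0, ub=10.0, seed=0):
    """Minimal moth-flame optimization sketch for bound-constrained minimization."""
    rng = np.random.default_rng(seed)
    moths = rng.uniform(lb, ub, (n_moths, dim))
    flames, flame_fit = None, None
    for it in range(iters):
        fit = np.apply_along_axis(obj, 1, moths)
        if flames is None:
            order = np.argsort(fit)
            flames, flame_fit = moths[order].copy(), fit[order].copy()
        else:
            # Merge current moths with previous flames; keep the best n_moths.
            pool = np.vstack([flames, moths])
            pool_fit = np.concatenate([flame_fit, fit])
            order = np.argsort(pool_fit)[:n_moths]
            flames, flame_fit = pool[order].copy(), pool_fit[order].copy()
        # Flame count decreases linearly so late iterations exploit the best flames.
        n_flames = max(1, round(n_moths - it * (n_moths - 1) / iters))
        a = -1.0 - it / iters  # spiral parameter shrinks from -1 toward -2
        b = 1.0                # logarithmic-spiral shape constant
        for i in range(n_moths):
            j = min(i, n_flames - 1)        # surplus moths orbit the last flame
            d = np.abs(flames[j] - moths[i])  # per-dimension distance to flame
            t = (a - 1.0) * rng.random(dim) + 1.0  # t drawn from [a, 1]
            moths[i] = d * np.exp(b * t) * np.cos(2 * np.pi * t) + flames[j]
        moths = np.clip(moths, lb, ub)
    return flames[0], flame_fit[0]

sphere = lambda x: float(np.sum(x * x))
x_best, f_best = mfo(sphere, dim=5)
```

The merge step provides elitism (the best solution found so far is never lost), which is one of the characteristics the review attributes to MFO's flame-update mechanism.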
In this paper, we propose a non-localization routing protocol for underwater wireless sensor networks (UWSNs), namely the triangle metric based multi-layered routing protocol (TM2RP). The main idea of the proposed TM2RP is to utilize supernodes along with depth information and residual energy to balance energy consumption among sensors. Moreover, TM2RP is the first multi-layered and multi-metric pressure routing protocol that considers link quality together with residual energy to improve the selection of next forwarding nodes, yielding more reliable and energy-efficient links. The Aqua-Sim package, based on the ns-2 simulator, was used to evaluate the performance of the proposed TM2RP, and the results were compared to similar methods such as depth-based routing (DBR) and the multi-layered routing protocol (MRP). Simulation results showed that TM2RP achieves better outcomes in terms of energy consumption, network lifetime, packet delivery ratio, and end-to-end delay.
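The abstract names three forwarding criteria (depth advance, residual energy, link quality) but not TM2RP's exact formula, so the sketch below combines them in a hypothetical weighted score to illustrate multi-metric next-hop selection in pressure routing. All names, weights, and value ranges are assumptions of ours, not the paper's triangle metric.

```python
from dataclasses import dataclass

@dataclass
class Neighbor:
    node_id: str
    depth_m: float          # shallower means closer to the surface sink
    residual_energy: float  # fraction of battery remaining, in [0, 1]
    link_quality: float     # e.g. normalized packet reception ratio, in [0, 1]

def pick_next_hop(sender_depth_m, neighbors, w_depth=0.4, w_energy=0.3, w_link=0.3):
    """Select the neighbor maximizing a weighted score of depth advance,
    residual energy, and link quality; only shallower neighbors qualify."""
    candidates = [n for n in neighbors if n.depth_m < sender_depth_m]
    if not candidates:
        return None  # no shallower neighbor: the sender sits in a void region
    max_advance = max(sender_depth_m - n.depth_m for n in candidates)
    def score(n):
        advance = (sender_depth_m - n.depth_m) / max_advance  # normalize to [0, 1]
        return w_depth * advance + w_energy * n.residual_energy + w_link * n.link_quality
    return max(candidates, key=score)

# A nearer hop with good energy and link quality can beat a farther, weaker one.
best = pick_next_hop(100.0, [Neighbor("a", 80.0, 0.9, 0.8),
                             Neighbor("b", 60.0, 0.2, 0.5)])
```

Weighting energy and link quality alongside depth is what lets such a protocol avoid repeatedly draining the same shallow nodes, which is the load-balancing behavior the abstract claims for TM2RP.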
Real-world engineering design problems are widespread across research disciplines in both industry and academia. Many optimization algorithms have been employed to address these kinds of problems. However, an algorithm's performance degrades substantially as the scale and difficulty of the problems increase, and various versions of optimization methods have been proposed in the literature to address engineering design problems efficiently. In this paper, a comprehensive review of the meta-heuristic optimization methods that have been used to solve engineering design problems is presented. We used six main keywords in collecting the data: meta-heuristic, optimization, algorithm, engineering, design, and problems. To the best of our knowledge, no survey or comparative analysis on this topic is available in the literature. The state-of-the-art methods are presented in detail in several categories, including basic, modified, and hybrid methods. Moreover, we present the results of the state-of-the-art methods in this domain to determine which versions of the optimization methods perform better on the problems studied. Finally, we provide notable future research directions for promising methods. This work covers the most important topics at the intersection of engineering and artificial intelligence, and it surveys a large number of published works on meta-heuristic optimization methods for solving various engineering design problems. Future research can build on this review to explore the literature on meta-heuristic optimization methods and engineering design problems.
Image encryption algorithms based on chaotic approaches are becoming increasingly popular for remotely sensed images processed with parallel techniques, and it has been demonstrated that the most efficient image encryption algorithms are based on chaos. Previous research on chaos-based cryptosystems achieved poor performance when run on a single computer, compromising privacy, security, and reliability, and problems arose when vulnerable satellite images were processed. This paper describes a novel chaos-based encryption technique that employs an external secret key together with Henon, Logistic, and Gauss iterated maps. The proposed encryption algorithm can efficiently encrypt a large number of images; as the number of images grows, however, a single-machine approach becomes inefficient or impractical. This paper therefore investigates a parallel method for encrypting large numbers of remotely sensed images in Hadoop. Hadoop's file access mechanism is enhanced so that an entire TIFF file can be treated as a single unit, and the supported file formats are extended so that Hadoop can handle GeoTIFF. Experimental results show that the proposed parallel encryption method is effective and scales to a large number of images compared to other well-known methods.
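To illustrate the chaos-based keystream idea, the sketch below derives bytes from the logistic map x -> r*x*(1-x) and XORs them with the data, so applying the same operation twice decrypts. This is a toy sketch of just one of the three maps (the paper combines Henon, Logistic, and Gauss iterated maps with an external secret key); the seed and parameter values are arbitrary assumptions, and this sketch is not cryptographically secure.

```python
def logistic_keystream(x0, r, n):
    """Generate n keystream bytes by iterating the logistic map x -> r*x*(1-x).

    For r close to 4 and x0 in (0, 1), the orbit is chaotic: tiny changes
    to the key (x0, r) produce a completely different byte sequence."""
    x, out = x0, bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)  # quantize the state to one byte
    return bytes(out)

def chaos_xor(data, x0=0.3141, r=3.99):
    """XOR data with a chaos-derived keystream; the operation is its own inverse."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

ciphertext = chaos_xor(b"satellite image bytes")
recovered = chaos_xor(ciphertext)  # same key parameters decrypt
```

In the parallel setting described by the paper, each Hadoop task would apply such a per-image transformation independently, which is why treating a whole TIFF as a single record matters.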
In the past few years, big data related to healthcare has become more important due to the abundance of data, the increasing cost of healthcare, and privacy concerns in healthcare. Big data techniques create, analyze, and process large and complex data that cannot be handled by traditional methods. The proposed method is based on classifying data into several classes using data weights derived from features extracted from the big data. Three important criteria were used to evaluate the study and to benchmark it against previous studies on a standard dataset.
The Internet of Underwater Things (IoUT) is an emerging area in marine science and engineering. It has received significant research and development attention from both academia and industry due to its growing underwater use cases in oceanographic data collection, pollution monitoring, seismic monitoring, tactical surveillance, and assisted navigation for waterway transport. Information dissemination in the underwater network environment is critical given network dynamism, unreachable nodes, and the limited resources of tiny IoUT devices. Existing techniques rely mainly on location-centric beacon messages, which results in higher energy consumption and wasted computing resources on tiny IoUT devices. To this end, this paper presents an efficient void aware (EVA) framework for information dissemination in the IoUT environment. The network architecture is modeled with potential void region identification in mind, and the framework focuses on detecting void network regions and on intelligent void-aware data forwarding. A comparative performance evaluation attests to the benefits of the proposed framework in terms of energy consumption, network lifetime, packet delivery ratio, and end-to-end delay for information dissemination in IoUT.
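A common way to detect a void node in greedy or pressure-based underwater routing is to check whether any in-range neighbor is strictly closer to the sink. The sketch below uses that rule purely as an illustrative assumption; the abstract does not specify EVA's actual void test.

```python
import math

def is_void_node(node, sink, neighbors, comm_range):
    """Return True if the node is in a void region under a greedy-forwarding
    rule: no neighbor within communication range is strictly closer to the
    sink than the node itself (an illustrative assumption, not EVA's rule)."""
    d_self = math.dist(node, sink)
    return not any(
        math.dist(node, n) <= comm_range and math.dist(n, sink) < d_self
        for n in neighbors
    )

sink = (0.0, 0.0, 0.0)
# A neighbor 5 m from the sink rescues a node 10 m away; a neighbor that is
# both in range and farther from the sink does not.
stuck = is_void_node((10.0, 0.0, 0.0), sink, [(12.0, 0.0, 0.0)], comm_range=6.0)
```

Identifying such nodes before data forwarding begins is what lets a void-aware scheme route around the region instead of wasting transmissions into it.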
We develop effective medical image classification techniques, with an emphasis on histopathology and magnetic resonance imaging (MRI). Pretrained models were used as a starting point for datasets with a restricted number of samples. Because calibrating a machine learning model from scratch is difficult, we used pretrained networks as unsupervised feature extractors or as weight initializations for identifying pathological histology images. Pretrained models, however, are trained on 3-channel RGB images, whereas an MRI sample has more slices, so the convolutional neural network (CNN) must be fine-tuned to adapt the model to the MRI data; the pretrained models are then used as feature extractors. Well-annotated medical images are scarce, which makes training machine learning models a difficult endeavor to begin with. Data augmentation helps generate sufficient training samples, but it is unclear whether it aids in predicting unseen data samples, so we fine-tuned the machine learning models without using any additional data. Furthermore, rather than utilizing a standard machine learning classifier for the MRI data, we created a custom CNN that takes both 3D shear descriptors and deep features as input; this network identifies the MRI sample after processing our feature representation end to end. On the held-out MRI dataset, our custom CNN outperforms traditional machine learning and is less prone to overfitting. Overall, we achieved state-of-the-art results with machine learning.
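The feature-fusion step described above can be illustrated by concatenating the two feature views per sample before a classifier head. The dimensions (512 deep features, 128 shear descriptors) and the per-view z-scoring are our own illustrative assumptions; the paper's custom CNN presumably learns the fusion end to end.

```python
import numpy as np

def fuse_features(deep_feats, shear_feats):
    """Concatenate two feature views into one vector per sample,
    z-scoring each view so neither dominates the fused representation."""
    def zscore(x):
        return (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-8)
    return np.hstack([zscore(deep_feats), zscore(shear_feats)])

rng = np.random.default_rng(0)
deep = rng.normal(size=(16, 512))   # stand-in: deep features from a pretrained CNN
shear = rng.normal(size=(16, 128))  # stand-in: handcrafted 3D shear descriptors
fused = fuse_features(deep, shear)  # shape (16, 640): input to the classifier head
```

Normalizing each view before concatenation is a standard precaution when one feature source has a much larger numeric range than the other.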