Cloud computing infrastructure is well suited to meeting the computational needs of large-scale tasks. Optimal scheduling of tasks in a cloud computing environment has been proved to be an NP-complete problem, hence the need for heuristic methods. Several heuristic algorithms have been developed to address this problem, but choosing the appropriate algorithm for a task assignment problem of a particular nature is difficult, since the methods are developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments, with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.
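To make the rule-based family concrete, the following is a minimal sketch of the Min-min heuristic named above, assuming an ETC (expected time to compute) matrix `etc[task][machine]`; the function and variable names are illustrative, not taken from the paper.

```python
def min_min(etc):
    """Assign each task to a machine using the Min-min rule.

    etc[i][j] is the expected execution time of task i on machine j.
    Returns (assignment, makespan), where assignment[i] is the machine
    chosen for task i.
    """
    n_tasks, n_machines = len(etc), len(etc[0])
    ready = [0.0] * n_machines          # current ready time of each machine
    assignment = [None] * n_tasks
    unscheduled = set(range(n_tasks))

    while unscheduled:
        # For every unscheduled task, find its minimum completion time.
        best_task, best_machine, best_ct = None, None, float("inf")
        for i in unscheduled:
            for j in range(n_machines):
                ct = ready[j] + etc[i][j]  # completion time of task i on machine j
                if ct < best_ct:
                    best_task, best_machine, best_ct = i, j, ct
        # Schedule the task with the overall smallest completion time first.
        assignment[best_task] = best_machine
        ready[best_machine] = best_ct
        unscheduled.remove(best_task)

    return assignment, max(ready)
```

Max-min differs only in the selection step (it schedules the task whose minimum completion time is largest), which is why the two heuristics behave so differently on mixes of long and short tasks.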
A cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on demand via a front-end interface. Scheduling scientific applications in the cloud computing environment is an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of application scheduling in the cloud, without much emphasis on the issue of secure global scheduling. In this paper, a scientific application scheduling technique using the Global League Championship Algorithm (GBLCA) is presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produced a remarkable performance improvement in makespan, ranging from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, measured in terms of response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution for scientific application task execution in the cloud computing environment than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques.
In cloud computing, resources are dynamically provisioned and delivered to users transparently and automatically on demand. Task execution failure is no longer accidental but a common characteristic of the cloud computing environment. In recent times, a number of intelligent scheduling techniques have been used to address task scheduling issues in the cloud without much attention to fault tolerance. In this research article, we propose a dynamic clustering league championship algorithm (DCLCA) scheduling technique for fault-tolerance-aware cloud task execution, which reflects the currently available resources and reduces the untimely failure of autonomous tasks. Experimental results show that our proposed technique produces a remarkable reduction in task failure as measured by failure rate. They also show that DCLCA outperformed the MTCT, MAXMIN, ant colony optimization and genetic-algorithm-based NSGA-II techniques, producing lower makespan with improvements of 57.8%, 53.6%, 24.3% and 13.4% in the first scenario and 60.0%, 38.9%, 31.5% and 31.2% in the second scenario, respectively. Considering the experimental results, DCLCA provides better-quality fault-tolerance-aware scheduling that will help to improve the overall performance of the cloud environment.
The realm of the Internet of Things (IoT), while continually transforming as a novel paradigm in the nexus of technology and education, still contends with numerous obstacles that hinder its incorporation into higher education institutions’ (HEIs) e-learning platforms. Despite substantial strides in IoT utilization in industrialized nations—the United States, the United Kingdom, Japan, and China serving as prime exemplars—the scope of its implementation in developing countries, notably Saudi Arabia, Malaysia, Pakistan, and Bangladesh, lags behind. A significant gap exists in research centered on the trajectory of IoT integration within e-learning systems of economically disadvantaged nations. Specifically, this study centers on Saudi Arabia to illuminate the main factors catalyzing or encumbering IoT uptake within its HEIs’ e-learning sector. As a preliminary step, this research has embarked on an exhaustive dissection of prior studies to unearth critical variables implicated in the IoT adoption process. Subsequently, we employed an inferential methodology, amassing data from 384 respondents in Saudi Arabian HEIs. Our examination divulges that usability, accessibility, technical support, and individual proficiencies considerably contribute to the rate of IoT incorporation. Furthermore, our data indicate that financial obstacles, self-efficacy, interactive capability, online surveillance, automated attendance tracking, training programs, network and data safeguarding measures, and relevant tools significantly influence IoT adoption. Contrarily, factors such as accessibility, internet quality, infrastructure preparedness, usability, privacy concerns, and faculty support appeared to have a negligible impact on the adoption rates within HEIs.
This research culminates in offering concrete recommendations to bolster IoT integration within Saudi Arabian HEIs, presenting valuable insights for government entities, policy architects, and HEIs to address the hurdles associated with IoT implementation in the higher education sector.
The Internet of Things (IoT) is an emerging paradigm of educational applications and innovative technology in the current era. While capabilities are increasing day by day, there are still many limitations and challenges to utilizing these technologies within E-Learning in higher educational institutes (HEIs). The IoT is well implemented in the United States of America (USA), the United Kingdom (UK), Japan, and China, but not in developing countries, including Saudi Arabia, Malaysia, Pakistan, Bangladesh, etc. Few studies have investigated the adoption of IoT in E-Learning within developing countries. Therefore, this research aims to examine the factors influencing IoT adoption for E-Learning in HEIs. Further, an adoption model is proposed for IoT-based E-Learning in the contexts of developing countries, and recommendations are provided for enhancing IoT adoption for E-Learning in HEIs. The IoT-based E-Learning model categorizes these influencing factors into four groups: individual, organizational, environmental, and technological. Influencing factors are compared along with a detailed description in order to determine which factors should be prioritized for efficient IoT-based E-Learning in HEIs. We identify privacy (27%), infrastructure readiness (24%), financial constraints (24%), ease of use (20%), support of faculty (18%), interaction (15%), attitude (14%), and network and data security (14%) as the significant E-Learning factors influencing IoT adoption in HEIs. From the researchers' perspective, these findings show that national culture has a significant role in individual, organizational, technological, and environmental behavior toward using new technology in developing countries.
Small and Medium Enterprises (SMEs) are steadily moving in the direction of implementing digital and smart technologies, including the Industrial Internet of Things (IIoT), for improving their products and services. The adoption of IIoT allows manufacturers and producers to make quick decisions for improving productivity and quality in real time. For this purpose, the era of the digital industrial revolution from IR 1.0 to IR 5.0 is briefly explained. In this research study, the authors have reviewed and analysed the existing reviews, surveys and technical research studies on IIoT technologies for manufacturing and production SMEs to highlight the concerns raised. Forty-seven (47) influencing factors are identified and classified into four groups based on the TOEI framework. Based on the identified influencing factors, an IIoT adoption model is proposed for manufacturing and production SMEs to adopt new IIoT technologies in their business environments. Furthermore, a comparative analysis of the influencing factors has been done for the adoption of IIoT to increase efficiency, productivity and competitiveness for manufacturing and production SMEs in developing countries. The proposed IIoT adoption model will help future policymakers and stakeholders to develop policies and strategies for the successful adoption and implementation of IIoT in manufacturing and production SMEs in developing countries. Also, recommendations are suggested to encourage IIoT adoption in production and manufacturing environments so that manufacturers and producers can respond easily and quickly to rapidly changing demands, product trends, skills gaps and other unexpected challenges in the future.
Factors Influencing the Adoption of Industrial IoT for the Manufacturing and Production SMEs in Developing Countries. This work identifies the influencing factors and proposes an IIoT adoption model for manufacturing and production SMEs for the efficient and successful adoption of IIoT, with suggestions for future policymakers and governments.
Mobile Cloud Computing: Taxonomy and Challenges. Aliyu, Ahmed; Abdullah, Abdul Hanan; Kaiwartya, Omprakash. Journal of Computer Networks and Communications, 2020. Journal article, peer reviewed, open access.
Mobile cloud computing (MCC) heralds a new dawn of computing, where cloud users are attracted to multiple services through the Internet. MCC offers a qualitative, flexible, and cost-effective delivery platform for providing services to mobile cloud users with the aid of the Internet. Owing to the advantages of this delivery platform, several studies have been conducted on how to address different issues in MCC. The issues include energy efficiency in MCC, secured MCC, user-satisfying applications and Quality-of-Service-aware (QoS-aware) MCC. In this context, this paper qualitatively reviews different proposed MCC solutions. A taxonomy for MCC is presented covering the major themes of research, including energy-aware, security, application, and QoS-aware developments. Each of these themes is critically investigated with comparative assessments considering recent advancements. An analysis of the metrics and implementation environments used for evaluating the performance of existing techniques is presented. Finally, some open research issues and future challenges are identified based on the critical and qualitative assessment of the literature for researchers in this field.
Blockchain-based reliable, resilient, and secure communication for Distributed Energy Resources (DERs) is essential in the Smart Grid (SG). The Solana blockchain, due to its high stability, scalability, and throughput, along with low latency, is envisioned to enhance the reliability, resilience, and security of DERs in SGs. This paper presents big datasets focusing on SQL Injection, Spoofing, and Man-in-the-Middle (MitM) cyberattacks, which have been collected from Solana blockchain-based Industrial Wireless Sensor Networks (IWSNs) for event monitoring and control in DERs. The datasets provided include both raw (unprocessed) and refined (processed) data, which highlight distinct trends in cyberattacks on DERs. These distinctive patterns demonstrate problems like superfluous mass data generation, transmission of invalid packets, sending of deceptive data packets, heavy use of network bandwidth, rerouting, memory overflow, overheads, and high latency. These issues result in ineffective real-time event monitoring and control of DERs in SGs. The thorough nature of these datasets is expected to play a crucial role in identifying and mitigating a wide range of cyberattacks across different smart grid applications.
Scheduling problems in the cloud computing environment are inherently multi-objective but are frequently addressed with single-objective algorithms. Resolving multi-objective problems requires procedures significantly different from the techniques used for single-objective optimization. For this purpose, meta-heuristic algorithms consistently show their strength in dealing with multi-objective optimization problems. In this research article, we present an innovative Multi-objective Cuckoo Search Optimization (MOCSO) algorithm for the resource scheduling problem in cloud computing. The main objective of the resource scheduling problem is to reduce the cloud user's cost and enhance performance by minimizing makespan time, which helps to increase the revenue or profit for cloud providers with maximum resource utilization. Therefore, the proposed MOCSO algorithm is a new method for solving multi-objective resource scheduling problems in the IaaS cloud computing environment. Moreover, the effects of the proposed algorithm are analyzed and evaluated by comparison with state-of-the-art multi-objective resource scheduling algorithms in a simulation framework. Results obtained from the simulation show that the proposed MOCSO algorithm performs better than MOACO, MOGA, MOMM and MOPSO, and balances multiple objectives in terms of the expected time to completion and expected cost to completion matrices for resource scheduling in the IaaS cloud computing environment.
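The two objectives named above can be illustrated with a hedged sketch of how a schedule might be evaluated and compared by Pareto dominance; the function names and the simple time/cost model are assumptions for illustration, not the paper's actual MOCSO formulation.

```python
def makespan_and_cost(assignment, etc, price):
    """Evaluate a schedule: assignment[i] is the VM chosen for task i.

    etc[i][j] - expected execution time of task i on VM j
    price[j]  - cost per unit time of VM j
    Returns (makespan, total_cost), both to be minimized.
    """
    busy = [0.0] * len(price)
    cost = 0.0
    for i, j in enumerate(assignment):
        busy[j] += etc[i][j]
        cost += etc[i][j] * price[j]
    return max(busy), cost

def dominates(a, b):
    """Pareto dominance: a dominates b if it is no worse in every
    objective and strictly better in at least one (both minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
```

A multi-objective metaheuristic such as MOCSO keeps the set of mutually non-dominated schedules rather than collapsing the two objectives into a single weighted score.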
Wireless sensor networks (WSNs) require accurate localization of sensor nodes for various applications. In this article, we propose the distance vector hop localization method (DVHLM) to address the node dislocation issue in real-time networks. The proposed method combines trilateration and Particle Swarm Optimization techniques to estimate the location of unknown or dislocated nodes. Our methodology includes four steps: coordinate calculation, distance calculation, unknown node position estimation, and estimation correction. To evaluate the proposed method, we conducted simulation experiments and compared its performance with state-of-the-art methods in terms of localization accuracy with known nodes, dislocated nodes, and shadowing effects. Our results demonstrate that DVHLM outperforms the existing methods and achieves better localization accuracy with reduced error. This article provides a valuable contribution to the field of WSNs by proposing a new method with a detailed methodology and superior performance.
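The trilateration step in the position-estimation stage can be sketched as a linear least-squares solve, assuming known anchor coordinates and hop-estimated distances; the PSO refinement stage is omitted, and the names here are illustrative rather than the paper's.

```python
import numpy as np

def trilaterate(anchors, dists):
    """Least-squares 2-D position from >= 3 anchors and estimated distances.

    anchors: (n, 2) array-like of anchor coordinates
    dists:   length-n array-like of estimated distances to the unknown node
    """
    anchors = np.asarray(anchors, dtype=float)
    dists = np.asarray(dists, dtype=float)
    # Linearize by subtracting the first range equation from the rest:
    # (x - xi)^2 + (y - yi)^2 = di^2  becomes  A [x, y]^T = b
    A = 2 * (anchors[1:] - anchors[0])
    b = (dists[0] ** 2 - dists[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

With noisy hop-based distance estimates the least-squares result is only approximate, which is why a refinement stage such as PSO is layered on top in the proposed method.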