This paper presents a novel metaheuristic algorithm called the White Shark Optimizer (WSO) for solving optimization problems over a continuous search space. The core ideas and underpinnings of WSO are inspired by the behaviors of great white sharks, including their exceptional senses of hearing and smell while navigating and foraging. These behaviors are mathematically modeled to strike an adequate balance between exploration and exploitation and to help the search agents explore and exploit every promising region of the search space. The search agents of WSO randomly update their positions with respect to the best-so-far solutions to eventually arrive at the optimal outcome. The performance of WSO was comprehensively benchmarked on a set of 29 test functions from the CEC-2017 test suite across several dimensions. WSO was further applied to the benchmark problems of the CEC-2011 evolutionary algorithm competition to demonstrate its reliability and applicability to real-world problems. A thorough analysis of the computational and convergence results is presented to shed light on the efficacy and stability of WSO. The performance of WSO was compared with 9 well-established metaheuristics using several statistical methods applied to the solutions generated. Friedman's and Holm's tests showed that WSO produced reasonable solutions, in terms of global optimality, avoidance of local minima, and solution quality, compared to the other existing metaheuristics.
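The position-update idea summarized above, agents stepping toward best-so-far solutions with some randomness, can be illustrated with a generic population search. This is a hedged sketch, not the published WSO update equations: the sphere test function, the step-decay schedule, and all parameter values are illustrative assumptions.

```python
import numpy as np

def sphere(x):
    """Illustrative continuous test function (global minimum 0 at the origin)."""
    return float(np.sum(x ** 2))

def best_so_far_search(f, dim=5, agents=20, iters=200, lb=-10.0, ub=10.0, seed=0):
    """Generic population search: each agent takes a random step toward the
    best-so-far solution, loosely mirroring how WSO agents update positions.
    The step sizes and decay schedule are assumptions for this sketch, not
    the actual WSO model."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(lb, ub, size=(agents, dim))
    fit = np.array([f(p) for p in pos])
    best = pos[fit.argmin()].copy()
    best_f = fit.min()
    for t in range(iters):
        step = 1.0 - t / iters                  # exploration shrinks over time
        for i in range(agents):
            r = rng.random(dim)                 # per-dimension attraction strength
            cand = pos[i] + r * (best - pos[i]) + step * rng.normal(0, 0.1, dim)
            cand = np.clip(cand, lb, ub)
            cf = f(cand)
            if cf < fit[i]:                     # greedy acceptance per agent
                pos[i], fit[i] = cand, cf
                if cf < best_f:
                    best, best_f = cand.copy(), cf
    return best, best_f
```

A run on the sphere function converges toward the origin as the exploration term decays, which is the exploration-to-exploitation transition the abstract describes.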
COVID-19 is the disease caused by a novel coronavirus, severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). COVID-19 has become a pandemic, infecting more than 152 million people in over 216 countries and territories. The exponential increase in the number of infections has rendered traditional diagnosis techniques inefficient. Therefore, many researchers have developed intelligent techniques, such as deep learning (DL) and machine learning (ML), which can assist the healthcare sector in providing quick and precise COVID-19 diagnosis. This paper provides a comprehensive review of the most recent DL and ML techniques for COVID-19 diagnosis, covering studies published from December 2019 to April 2021. In total, it includes more than 200 studies carefully selected from several publishers, such as IEEE, Springer, and Elsevier. We classify the research into two tracks, DL and ML, and present public COVID-19 datasets established and extracted in different countries. The measures used to evaluate diagnosis methods are comparatively analysed, and a proper discussion is provided. In conclusion, for COVID-19 diagnosis and outbreak prediction, SVM is the most widely used machine learning mechanism and CNN the most widely used deep learning mechanism, while accuracy, sensitivity, and specificity are the most widely used evaluation measures in previous studies. Finally, this review will guide the research community on the upcoming development of ML and DL for COVID-19 and inspire their future work.
Gene expression data are expected to make a great contribution to efficient cancer diagnosis and prognosis. Such data comprise a large number of measured genes, only a few of which carry information valuable for discriminating between classes of samples. Recently, several researchers have proposed gene selection methods based on metaheuristic algorithms for analysing and interpreting gene expression data. However, due to the large number of genes, the limited number of patient samples, and the complex interactions between genes, many gene selection methods struggle to identify the most relevant and reliable genes. Hence, in this paper, a hybrid filter/wrapper method called rMRMR-MBA is proposed for the gene selection problem. It uses robust Minimum Redundancy Maximum Relevancy (rMRMR) as a filter to select the most promising genes and a modified bat algorithm (MBA) as the search engine of the wrapper to identify a small set of informative genes. The performance of the proposed method has been evaluated using ten gene expression datasets. For performance evaluation, MBA is assessed by studying its convergence behaviour with and without the TRIZ optimisation operators. For comparative evaluation, the results of rMRMR-MBA were compared against ten state-of-the-art methods on the same datasets. The comparison demonstrates that the proposed method produced better results in terms of classification accuracy and number of selected genes on two of the ten datasets, and competitive results on the remainder. In a nutshell, the proposed method produces very promising results with high classification accuracy, which can be considered a valuable contribution to the gene selection domain.
•This paper proposes a TRIZ-inspired Bat Algorithm (BA) for the gene selection problem.
•New operators inspired by TRIZ are added to the improvement loop of BA.
•Ten cancer benchmark datasets are used in the evaluation process.
•The results prove that the proposed algorithm is viable for the gene selection domain.
•The comparative evaluation proves the superiority of the proposed method.
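The filter stage of rMRMR-MBA described above ranks genes by relevance to the class label while penalising redundancy among the genes already chosen. The sketch below is a simplified stand-in for that idea: it substitutes absolute Pearson correlation for the robust information-theoretic measures the method actually uses, an assumption made here to keep the example dependency-free.

```python
import numpy as np

def mrmr_filter(X, y, k=5):
    """Greedy minimum-redundancy maximum-relevance gene ranking.
    Simplified stand-in for an rMRMR-style filter: relevance and
    redundancy are measured with absolute Pearson correlation, which is
    an assumption of this sketch, not the paper's actual criterion."""
    n_genes = X.shape[1]
    # relevance of each gene to the class label
    rel = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_genes)])
    selected = [int(rel.argmax())]       # start from the most relevant gene
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(n_genes):
            if j in selected:
                continue
            # redundancy: mean correlation with already-selected genes
            red = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                           for s in selected])
            score = rel[j] - red         # reward relevance, punish redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected
```

The filter output would then seed the wrapper search (the MBA stage in the paper), which evaluates candidate subsets with a classifier.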
Feature selection (FS) is a crucial area of cognitive computation that demands further study. It has recently received much attention from researchers working in machine learning and data mining, and it is broadly employed in many different applications. Many enhanced strategies have been created to boost the performance of FS methods in cognitive computation. The goal of this paper is to present three adaptive versions of the capuchin search algorithm (CSA), each featuring better search ability than the parent CSA. These versions select an optimal feature subset using a binary form of each adapted algorithm together with the k-Nearest Neighbor (k-NN) classifier. They were developed by applying several strategies, including automated control of inertia weight, acceleration coefficients, and other computational factors, to improve the search potency and convergence speed of CSA. In the velocity model of CSA, growth functions known as exponential, power, and S-shaped functions were adopted to evolve three versions of CSA, referred to as exponential CSA (ECSA), power CSA (PCSA), and S-shaped CSA (SCSA), respectively. The results of the proposed FS methods on 24 benchmark datasets of various dimensions from several repositories were compared with those of other k-NN-based FS methods from the literature. The results revealed that the proposed methods significantly outperformed CSA and other well-established FS methods on several relevant criteria. In particular, among the 24 datasets considered, the proposed binary ECSA, which yielded the best overall results of all the proposed versions, surpassed the others on 18 datasets in terms of classification accuracy, 13 in terms of specificity, 10 in terms of sensitivity, and 14 in terms of fitness values.
Simply put, the results on 15, 9, and 5 of the 24 datasets studied showed that the binary ECSA, PCSA, and SCSA exceed 90% in terms of specificity, sensitivity, and accuracy, respectively. The thorough comparative results reveal the efficiency of the proposed methods in improving classification accuracy relative to other methods, confirming their ability to explore the feature space and select the most useful features for classification studies.
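The wrapper setup described above, a binary feature mask scored with a k-NN classifier, can be sketched compactly. The search loop below is plain bit-flip local search standing in for the binary CSA variants; the leave-one-out k-NN fitness and the small subset-size penalty are assumptions of this sketch, not the paper's exact formulation.

```python
import numpy as np

def knn_accuracy(X, y, k=3):
    """Leave-one-out k-NN accuracy, used as the wrapper fitness."""
    n = len(y)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)              # exclude each point from its own vote
    correct = 0
    for i in range(n):
        nn = np.argsort(d[i])[:k]
        pred = np.bincount(y[nn].astype(int)).argmax()
        correct += pred == y[i]
    return correct / n

def binary_fs_search(X, y, iters=50, seed=0):
    """Bit-flip local search over binary feature masks.  A minimal stand-in
    for a binary CSA wrapper: the k-NN fitness with a size penalty mirrors
    the general setup, but the search dynamics here are not CSA's."""
    rng = np.random.default_rng(seed)
    n_feat = X.shape[1]

    def fitness(mask):
        if mask.sum() == 0:
            return 0.0
        # accuracy rewarded, subset size lightly penalised (assumed weighting)
        return knn_accuracy(X[:, mask], y) - 0.01 * mask.sum() / n_feat

    mask = rng.random(n_feat) < 0.5
    best_f = fitness(mask)
    for _ in range(iters):
        cand = mask.copy()
        j = rng.integers(n_feat)
        cand[j] = ~cand[j]                   # flip one feature in or out
        cf = fitness(cand)
        if cf >= best_f:
            mask, best_f = cand, cf
    return mask, best_f
```

On synthetic data with one strongly informative feature, the search keeps that feature and prunes noise features, which is the behaviour the comparative results above quantify at scale.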
Flash memory-based solid-state drives (SSDs) offer several attractive features and benefits compared to hard disk drives (HDDs), such as shock resistance and better performance, especially for random data access. Depending on the number of bits per cell, Flash memory can be designed as single-, multi-, or triple-level cell (SLC/MLC/TLC), which differ in performance, density, cost, and write endurance. To bring the best of these together, several researchers have proposed designing SSDs with hybrid SLC/MLC/TLC Flash memory. However, such SSDs also present several challenges, such as buffer management, placement of hot/cold data in the suitable portion, and intelligent garbage collection, which several recent techniques aim to address. In this paper, we present a survey of techniques for managing SSDs designed with SLC/MLC/TLC Flash memory. We classify the works along several axes to bring out their similarities and differences. We aim to synthesize the state-of-the-art progress in hybrid SSD management and to spark further research in this area.
In this study, Mg-Ti-SiC composite powders with varied micron- and nano-sized silicon carbide (SiC) particles were fabricated by ball milling at various milling times. The effect of reinforcement particle size and milling time on the morphology and microstructure of the magnesium composite powders was characterized. A machine learning model based on an Adaptive Neuro-Fuzzy Inference System (ANFIS) modified with the termite life cycle optimizer was then developed to predict the crystallite size of the produced composites. The average particle size in all composites, whether containing micron SiC (µSiC) or nano SiC (nSiC), decreased with increasing milling time and SiC content; the greatest reduction in particle size was achieved after 16 h of milling for both configurations, reaching 5.12 µm and 1.96 µm, respectively. Changing the reinforcement particle size from micron to nano caused the peak intensities of Mg and Ti to decrease further, and the Ti5Si3 and TiC phases were observed after 16 h of milling in the ND composite powder. With increasing milling time in Mg-25 wt% Ti-5 wt% µSiC, the crystallite size decreased from 31 nm after 1 h to 13.62 nm after 32 h of milling. The greatest reduction in crystallite size occurred in the composite powders containing nSiC, where the crystallite size decreased to 13.35 nm. The developed machine learning model was able to predict the evolution of the crystallite size of the produced composites with very good accuracy.
Background. The most common and successful technique for denoising nonstationary signals, such as the electroencephalogram (EEG) and electrocardiogram (ECG), is the wavelet transform (WT). The success of WT depends on the optimal configuration of its control parameters, which are often set experimentally. Fortunately, the optimality of a combination of these parameters can be measured in advance using the mean squared error (MSE) function. Method. In this paper, five powerful metaheuristic algorithms are proposed to find the optimal WT parameters for EEG signal denoising: harmony search (HS), β-hill climbing (β-hc), particle swarm optimization (PSO), the genetic algorithm (GA), and the flower pollination algorithm (FPA). It is worth mentioning that this is the initial investigation of using optimization methods for WT parameter configuration. The paper then examines which algorithm obtains the minimum MSE and the best WT parameter configurations. Result. The performance of the proposed algorithms is tested on two standard EEG datasets, namely Kiern's EEG dataset and the EEG Motor Movement/Imagery dataset. The results are evaluated using five common criteria: signal-to-noise ratio (SNR), SNR improvement, mean squared error (MSE), root mean squared error (RMSE), and percentage root mean square difference (PRD). Interestingly, for almost all criteria, FPA achieves the best parameter configuration for WT and enables it to efficiently denoise the EEG signals on almost all the datasets used. To further validate the FPA results, a comparative study against the results of two previous studies is conducted, and the findings favor FPA. Conclusion. The results show that the proposed methods for EEG signal denoising produce better results than manual configurations based on ad hoc strategies. Therefore, using metaheuristic approaches to optimize the WT parameters positively affects the denoising performance of the WT method.
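The selection criterion described above, judging a WT parameter configuration by the MSE of the denoised signal against a reference, can be shown on a toy scale. The single-level Haar transform, soft thresholding, and the grid search standing in for the metaheuristic are all simplifying assumptions of this sketch; the paper tunes several WT parameters (wavelet family, level, thresholding rule) with full metaheuristics.

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar wavelet soft-threshold denoiser (simplified stand-in
    for the multi-parameter WT pipeline tuned in the paper).
    Assumes an even-length signal."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)     # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)     # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)   # soft threshold
    out = np.empty_like(x)
    out[0::2] = (a + d) / np.sqrt(2)         # inverse Haar transform
    out[1::2] = (a - d) / np.sqrt(2)
    return out

def tune_threshold(noisy, clean, candidates):
    """Pick the threshold minimising MSE against the clean reference:
    the same selection criterion the paper optimises with HS, PSO, FPA,
    etc., here with a simple grid standing in for the metaheuristic."""
    mses = [np.mean((haar_denoise(noisy, t) - clean) ** 2) for t in candidates]
    best = int(np.argmin(mses))
    return candidates[best], mses[best]
```

With a smooth signal plus noise, a nonzero threshold beats the do-nothing configuration (threshold 0), which is exactly why an automated search over parameters pays off.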
The mechanical properties of fine-grain nanocomposites differ from those of conventional composites due to the in situ effect caused by the addition of nanoparticle reinforcement and the complexity of the strengthening mechanisms, which make their prediction with conventional analytical and numerical models relatively difficult. Therefore, this work presents a fast, reliable machine learning model based on a long short-term memory network modified with the beluga whale optimizer to predict the mechanical properties of an ultrafine-grain Al-TiO2 nanocomposite manufactured by accumulative roll bonding (ARB). The mechanical properties were evaluated using tensile tests and correlated with the composite microstructure and hardness. Experimentally, the tensile strength was shown to increase with the number of ARB passes until a plateau was reached, owing to the uniform distribution of TiO2 nanoparticles in the composite and the saturation of grain refinement in the Al matrix. The maximum tensile strength achieved was 270 MPa for the composite containing 3% TiO2 nanoparticles after 5 ARB passes, compared to 90.5 MPa for raw Al. The proposed model predicted the yield and ultimate strengths, elongation, and hardness of all the produced composites with excellent accuracy, reaching R2 values of 0.9955, 0.9952, 0.9859, and 0.9975, respectively, clearly outperforming the other models.
The hill climbing method is an optimization technique that builds a search trajectory in the search space until reaching a local optimum. It accepts only uphill moves, which makes it prone to getting stuck in local optima. Several extensions of hill climbing, such as simulated annealing and tabu search, have been proposed to overcome this problem. In this paper, an extended version of the hill climbing method called β-hill climbing is proposed. A stochastic operator called the β-operator is utilized in hill climbing to control the balance between exploration and exploitation during the search. The proposed method has been evaluated using the IEEE CEC2005 global optimization functions. The results show that the proposed method is a very efficient enhancement of hill climbing, providing powerful results when compared with other advanced methods on the same global optimization functions.
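The β-operator idea can be sketched as follows: after the usual neighbourhood move, each dimension of the candidate is reset, with probability β, to a random value in its range, which reintroduces exploration into an otherwise uphill-only climber. The parameter values and the sphere objective below are illustrative assumptions, not the paper's experimental settings.

```python
import numpy as np

def beta_hill_climbing(f, dim=5, lb=-5.0, ub=5.0, beta=0.05, bw=0.1,
                       iters=2000, seed=0):
    """Sketch of beta-hill climbing for minimisation.  A neighbourhood move
    (small random walk of bandwidth `bw`) is followed by the beta-operator,
    which with probability `beta` resets each dimension to a random value in
    [lb, ub].  Only improving candidates are accepted, as in hill climbing."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, dim)
    fx = f(x)
    for _ in range(iters):
        # neighbourhood operator: small random walk around the current solution
        cand = x + bw * rng.uniform(-1, 1, dim)
        # beta-operator: random reset of each dimension with probability beta
        reset = rng.random(dim) < beta
        cand[reset] = rng.uniform(lb, ub, reset.sum())
        cand = np.clip(cand, lb, ub)
        fc = f(cand)
        if fc < fx:                      # uphill-only acceptance (minimisation)
            x, fx = cand, fc
    return x, fx
```

With β = 0 this reduces to plain hill climbing; raising β trades some exploitation for escape chances from local optima, which is the exploration/exploitation balance the abstract refers to.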
This paper offers a summary of the latest studies on healthcare scheduling problems, including the patient admission scheduling problem, the nurse scheduling problem, the operating room scheduling problem, the surgery scheduling problem, and other healthcare scheduling problems. It provides a comprehensive survey of healthcare scheduling with a focus on the recent literature. The development of healthcare scheduling research plays a critical role in optimizing costs, improving patient flow, providing prompt administration of treatment, and making optimal use of the resources available in hospitals. In recent decades, healthcare scheduling methods that automate the search for optimal resource management in hospitals using metaheuristics have proliferated. However, the reported results are fragmented, since each specific problem has been solved independently, and many versions of the problem definitions and various datasets exist for each of these problems. Therefore, this paper integrates the existing results by performing a comprehensive review and analysis of 190 articles based on four essential components of solving optimization problems: problem definition, formulations, datasets, and methods. It concentrates on patient admission, nurse, and operating room scheduling problems, as these are the most common issues found in the literature. Furthermore, this review aims to help researchers identify developments in the most recent papers and grasp new trends for future directions.