Feature selection is an important data-preprocessing technique for classification problems in fields such as bioinformatics and signal processing. In many situations, a user is interested not only in maximizing the classification performance but also in minimizing the cost that may be associated with features; this kind of problem is called cost-based feature selection. However, most existing feature selection approaches treat the task as a single-objective optimization problem. This paper presents the first study of multi-objective particle swarm optimization (PSO) for cost-based feature selection problems. The goal of this paper is to generate a Pareto front of nondominated solutions, that is, feature subsets, to meet the different requirements of decision-makers in real-world applications. To enhance the search capability of the proposed algorithm, a probability-based encoding scheme and an effective hybrid operator, together with the ideas of the crowding distance, the external archive, and the Pareto domination relationship, are applied to PSO. The proposed PSO-based multi-objective feature selection algorithm is compared with several multi-objective feature selection algorithms on five benchmark datasets. Experimental results show that the proposed algorithm can automatically evolve a set of nondominated solutions and is a highly competitive method for solving cost-based feature selection problems.
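As a rough illustration of the Pareto machinery this abstract relies on (not the paper's actual implementation), a candidate feature subset can be scored as a pair of minimized objectives, classification error and total feature cost, and a nondominated set extracted by pairwise dominance checks. All names below are illustrative.

```python
# Minimal sketch of Pareto dominance for cost-based feature selection.
# Each solution is an (error, cost) tuple; both objectives are minimized.

def dominates(a, b):
    """True if solution a dominates b: no worse in all objectives,
    strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(solutions):
    """Filter a list of (error, cost) tuples down to the Pareto front."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# (0.15, 3.5) is dominated by (0.12, 3.0); (0.10, 6.0) by (0.10, 5.0).
front = nondominated([(0.10, 5.0), (0.12, 3.0), (0.15, 3.5), (0.10, 6.0)])
```

A multi-objective PSO maintains exactly such a front in its external archive, so that decision-makers can trade error against feature cost after the run.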
Various real-world applications can be formulated as feature selection problems, which are known to be NP-hard. In this paper, we propose an effective feature selection method based on the firefly algorithm (FFA), called return-cost-based binary FFA (Rc-BBFA). The proposed method prevents premature convergence and owes its efficiency to the following three aspects. First, an indicator based on return and cost is defined to measure a firefly's attractiveness to other fireflies. Then, a Pareto dominance-based strategy is presented to seek the most attractive firefly for each firefly. Finally, a binary movement operator based on the return-cost attractiveness and an adaptive jump is developed to update the position of a firefly. Experimental results on a series of public datasets show that the proposed method is competitive with other feature selection algorithms, including traditional algorithms and GA-, PSO-, and FFA-based algorithms.
•A two-archive-guided multiobjective artificial bee colony algorithm was designed.
•The algorithm’s convergence and exploitation abilities are enhanced.
•Two archives are employed to enhance the search capability of the algorithm.
•Results have shown that TMABC-FS is an efficient and robust optimization method.
Since different features may incur different costs, the cost-sensitive feature selection problem becomes increasingly important in real-world applications. Generally, it involves two conflicting objectives, i.e., maximizing the classification performance and minimizing the feature cost. However, most existing approaches treat this task as a single-objective optimization problem. To satisfy the various requirements of decision-makers, this paper studies a multi-objective feature selection approach, called the two-archive multi-objective artificial bee colony algorithm (TMABC-FS). Two new operators, i.e., convergence-guiding search for employed bees and diversity-guiding search for onlooker bees, are proposed to obtain a group of non-dominated feature subsets with good distribution and convergence. Two archives, i.e., the leader archive and the external archive, are employed to enhance the search capability of the different kinds of bees. The proposed TMABC-FS is validated on several datasets from UCI and compared with two traditional algorithms and three multi-objective methods. Results show that TMABC-FS is an efficient and robust optimization method for solving cost-sensitive feature selection problems.
•Proposing a novel PSO-based feature selection algorithm with mutual information.
•Presenting an effective swarm initialization strategy based on label correlation.
•Designing two local search operators, the supplementary and deletion operators.
•Giving an adaptive flip mutation to help particles jump out of local extrema.
Feature selection (FS) is an important data processing method in pattern recognition and data mining. Because they ignore characteristics of the FS problem itself, the particle update mechanisms and swarm initialization strategies adopted in most particle swarm optimization (PSO) variants perform poorly on high-dimensional FS problems. To address this, this paper proposes a novel feature selection algorithm based on bare bones PSO (BBPSO) with mutual information. Firstly, an effective swarm initialization strategy based on label correlation is developed, making full use of the correlation between features and class labels to accelerate the convergence of the swarm. Then, in order to enhance the exploitation performance of the algorithm, two local search operators, i.e., the supplementary operator and the deletion operator, are developed based on feature relevance and redundancy. Furthermore, an adaptive flip mutation operator is designed to help particles escape local optima. We apply the proposed algorithm to typical datasets with the K-Nearest Neighbor (K-NN) classifier and compare it with eleven state-of-the-art algorithms: SFS, PTA, SGA, BPSO, PSO(4-2), HPSO-LS, Binary BPSO, NaFA, IBFA, KPLS-mRMR and SMBA-CSFS. The experimental results show that the proposed algorithm can achieve a feature subset with better performance, and is a highly competitive FS algorithm.
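For readers unfamiliar with the bare bones variant mentioned above, Kennedy's BBPSO replaces the velocity equation with Gaussian resampling around the personal and global bests, which is why it needs no inertia or acceleration parameters. The sketch below shows that generic update plus one illustrative (assumed, not the paper's) threshold decoding into a feature subset.

```python
import random

def bbpso_update(pbest, gbest):
    """One bare bones PSO position update: each dimension is resampled
    from a Gaussian centred midway between the personal and global best,
    with spread equal to their distance (parameter-free update)."""
    return [random.gauss((p + g) / 2.0, abs(p - g)) for p, g in zip(pbest, gbest)]

def to_subset(position, threshold=0.5):
    """Illustrative decoding: keep feature j when its coordinate exceeds
    a threshold. The paper's actual encoding may differ."""
    return [j for j, x in enumerate(position) if x > threshold]

new_pos = bbpso_update([0.9, 0.1, 0.7], [0.8, 0.2, 0.6])
```

When pbest and gbest agree on a dimension, the Gaussian collapses onto that value, so the swarm naturally concentrates around consensus features; the paper's mutation operator then counteracts the resulting risk of premature convergence.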
Previous methods of designing a bolt supporting network, which depend on engineering experience, seek optimal bolt supporting schemes in terms of supporting quality alone. The supporting cost and time, however, have not been considered, which restricts their application in real-world situations. We formulate the design of a bolt supporting network as a three-objective optimization model that simultaneously considers quality, economy, and efficiency. In particular, two surrogate models are constructed by support vector regression, for roof-to-floor convergence and two-sided displacement, respectively, so as to rapidly evaluate supporting quality during optimization. To solve the formulated model, a novel interactive preference-based multiobjective evolutionary algorithm is proposed. In contrast to generic methods that interactively articulate preferences, it systematically manages the regions of interest in three steps, that is, "partitioning-updating-tracking," in accordance with the human cognition process. The preference regions of a decision-maker (DM) are first articulated and employed to narrow down the feasible objective space before the evolution, in terms of the nadir point rather than the commonly used ideal point. Then, the DM's preferences are tracked by dynamically updating these preference regions based on satisfactory candidates during the evolution. Finally, individuals in the population are evaluated based on the preference regions. We apply the proposed model and algorithm to design the bolt supporting network of a practical roadway. The experimental results show that the proposed method can generate an optimal bolt supporting scheme that balances supporting quality against the other demands, while also converging faster.
In this paper, we propose a new bare-bones multi-objective particle swarm optimization algorithm to solve environmental/economic dispatch problems. The algorithm has three distinctive features: a particle updating strategy that does not require tuning control parameters; a mutation operator whose action range varies over time to expand the search capability; and an approach based on particle diversity to update the global particle leaders. Several trials have been carried out on the IEEE 30-bus test system. Comparisons with seven existing multi-objective optimization algorithms and three well-known multi-objective particle swarm optimization techniques show that our algorithm generates an excellent approximation of the true Pareto front and can be used to solve other types of multi-objective optimization problems.
The "curse of dimensionality" and high computational cost still limit the application of evolutionary algorithms to high-dimensional feature selection (FS) problems. This article proposes a new three-phase hybrid FS algorithm based on correlation-guided clustering and particle swarm optimization (PSO), called HFS-C-P, to tackle both problems at the same time. To this end, three kinds of FS methods are effectively integrated into the proposed algorithm based on their respective advantages. In the first and second phases, a filter FS method and a feature clustering-based method with low computational cost are designed to reduce the search space used by the third phase. The third phase then searches for an optimal feature subset using an evolutionary algorithm with global search ability. Moreover, a symmetric uncertainty-based feature deletion method, a fast correlation-guided feature clustering strategy, and an improved integer PSO are developed to improve the performance of the three phases, respectively. Finally, the proposed algorithm is validated on 18 publicly available real-world datasets in comparison with nine FS algorithms. Experimental results show that the proposed algorithm can obtain a good feature subset with the lowest computational cost.
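The symmetric uncertainty measure underpinning the first-phase filter is standard: SU(X, Y) = 2 I(X; Y) / (H(X) + H(Y)), a mutual information normalized into [0, 1] so features of different cardinalities are comparable. A minimal sketch for discrete variables (the paper's exact thresholding and data handling are not reproduced here):

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy H(X) of a discrete sample, in bits."""
    n = len(xs)
    return -sum(c / n * log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X; Y) = H(X) + H(Y) - H(X, Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def symmetric_uncertainty(xs, ys):
    """SU(X, Y) = 2 * I(X; Y) / (H(X) + H(Y)), ranging over [0, 1]."""
    h = entropy(xs) + entropy(ys)
    return 2.0 * mutual_information(xs, ys) / h if h > 0 else 0.0
```

A filter phase would drop features whose SU with the class label falls below a threshold, which shrinks the search space seen by the later clustering and PSO phases.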
•Proposing a binary differential evolution algorithm with self-learning strategy, called MOFS-BDE, to solve multi-objective feature selection problems.
•Proposing a new binary mutation operator based on probability difference to quickly guide individuals toward potentially optimal areas.
•Proposing a new one-bit purifying search operator (OPS) for improving the self-learning capability of elite individuals.
•Proposing an efficient non-dominated sorting operator with crowding distance to reduce the time consumption of the selection operator in differential evolution.
Feature selection is an important data preprocessing method. This paper studies a new multi-objective feature selection approach based on binary differential evolution with self-learning (MOFS-BDE). Three new operators are proposed and embedded into MOFS-BDE to improve its performance: a binary mutation operator based on probability difference that guides individuals to rapidly locate potentially optimal areas, a one-bit purifying search operator (OPS) that improves the self-learning capability of the elite individuals located in those areas, and an efficient non-dominated sorting operator with crowding distance that reduces the computational complexity of the selection operator in differential evolution. Experimental results on a series of public datasets show that the effective combination of the binary mutation and OPS enables MOFS-BDE to achieve a trade-off between local exploitation and global exploration. The proposed method is competitive with representative genetic algorithm-, particle swarm-, differential evolution-, and artificial bee colony-based feature selection algorithms.
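To make the idea of binary DE mutation concrete: classical DE adds a scaled difference of two vectors to a base vector, which is meaningless on bit strings, so binary variants typically turn disagreement between vectors into a flip probability. The sketch below is one generic probability-difference scheme, labeled clearly as an assumption; the actual MOFS-BDE operator may differ in detail.

```python
import random

def binary_de_mutation(x1, x2, x3, f=0.5):
    """Illustrative binary DE mutation (assumed scheme, not the paper's
    exact operator): where difference vectors x2 and x3 disagree on a bit,
    adopt x2's bit with probability f; where they agree, keep the base
    vector x1's bit, mimicking v = x1 + F * (x2 - x3)."""
    return [(b2 if random.random() < f else b1) if b2 != b3 else b1
            for b1, b2, b3 in zip(x1, x2, x3)]

mutant = binary_de_mutation([1, 0, 1, 0], [0, 1, 1, 0], [0, 1, 0, 1])
```

The key property is that consensus bits among the difference vectors are preserved while disputed bits are perturbed, which is what lets the mutation steer individuals toward promising regions without a real-valued relaxation.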
In many real-world applications, the workspace of a robot contains danger sources that the robot must evade, such as fire in a rescue mission, or landmines and enemies on a battlefield. Since it is either impossible or too expensive to obtain their precise positions, decision-makers usually know only their action ranges. This paper proposes a multi-objective path planning algorithm based on particle swarm optimization for robot navigation in such an environment. First, a membership function is defined to evaluate the risk degree of a path. Considering two performance criteria, the risk degree and the length of a path, the path planning problem with uncertain danger sources is formulated as a constrained bi-objective optimization problem with uncertain coefficients. Then, a constrained multi-objective particle swarm optimization algorithm is developed to tackle this problem. Several new operations and improvements, such as a particle update method based on random sampling and uniform mutation, an infeasible archive, and a constrained domination relationship based on the number of collisions with obstacles, are incorporated into the proposed algorithm to improve its effectiveness. Finally, simulation results demonstrate the capability of our method to generate high-quality Pareto optimal paths.
Feature selection is an important data preprocessing technique in multi-label classification. Although a large number of studies have tackled the feature selection problem, few of them address multi-label data. This paper studies a multi-label feature selection algorithm using an improved multi-objective particle swarm optimization (PSO), with the purpose of searching for a Pareto set of non-dominated solutions (feature subsets). Two new operators are employed to improve the performance of the proposed PSO-based algorithm. One is an adaptive uniform mutation with action range varying over time, used to extend the exploration capability of the swarm; the other is a local learning strategy, designed to exploit areas with sparse solutions in the search space. Moreover, the ideas of the archive and the crowding distance are applied to PSO for finding the Pareto set. Finally, experiments verify that the proposed algorithm is a useful feature selection approach for multi-label classification problems.
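The crowding distance recurring throughout these abstracts is the NSGA-II diversity measure: boundary solutions of a front get infinite distance, and each interior solution accumulates the normalized gap between its neighbours along every objective. A minimal, self-contained sketch (generic formula, not any one paper's code):

```python
def crowding_distance(front):
    """NSGA-II-style crowding distance for a list of objective vectors.
    Boundary points get infinity; interior points sum, per objective,
    the normalised spread between their two sorted neighbours."""
    n = len(front)
    dist = [0.0] * n
    for k in range(len(front[0])):
        order = sorted(range(n), key=lambda i: front[i][k])
        lo, hi = front[order[0]][k], front[order[-1]][k]
        dist[order[0]] = dist[order[-1]] = float('inf')
        if hi == lo:
            continue  # objective k is constant on this front
        for a in range(1, n - 1):
            dist[order[a]] += (front[order[a + 1]][k]
                               - front[order[a - 1]][k]) / (hi - lo)
    return dist
```

Archive maintenance then prefers solutions with larger crowding distance, which spreads the retained feature subsets evenly along the Pareto front instead of letting them cluster.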