Convolutional autoencoders (CAEs) have shown remarkable performance when stacked into deep convolutional neural networks (CNNs) for classifying image data during the past several years. However, they are unable to construct the state-of-the-art CNNs due to their intrinsic architectures. In this regard, we propose a flexible CAE (FCAE) by eliminating the constraints on the numbers of convolutional layers and pooling layers in the traditional CAE. We also design an architecture discovery method based on particle swarm optimization, which is capable of automatically searching for the optimal architectures of the proposed FCAE with much less computational resource and without any manual intervention. We test the proposed approach on four widely used image classification data sets. Experimental results show that the proposed approach significantly outperforms its peer competitors, including state-of-the-art algorithms.
The vibration signals of faulty rotating machinery are typically nonstationary, nonlinear, and mixed with abundant compounded background noise. To extract the potential excitations from the observed vibration signals of rotating machinery, signal demodulation and time-frequency analysis are indispensable. This work proposes a novel particle swarm optimization-based variational mode decomposition method, which adopts the minimum mean envelope entropy to optimize the parameters (α and K) of the existing variational mode decomposition. The proposed fault-detection framework separates the observed vibration signals into a series of intrinsic modes. A certain number of the intrinsic modes are then selected by means of the Hilbert transform-based square envelope spectral kurtosis. Subsequently, the feature representations are reconstructed from the selected intrinsic modes, and the envelope spectra of the real faulty conditions of the rotating machinery are generated. To verify the performance of the proposed method, a testbed platform of a gearbox with a combination of different faults was implemented. The experimental results demonstrate that the proposed method represents the patterns of the fault frequency more explicitly than the available empirical mode decomposition, local mean decomposition, and wavelet packet transform methods.
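The mean envelope entropy that this method minimizes when searching over (α, K) has a standard form; the sketch below is our own illustration of that fitness function only (the variational mode decomposition itself, e.g. from a VMD library, is assumed to supply the modes):

```python
import numpy as np
from scipy.signal import hilbert

def envelope_entropy(mode):
    """Shannon entropy of the normalized Hilbert envelope of one mode."""
    env = np.abs(hilbert(mode))          # analytic-signal envelope
    p = env / env.sum()                  # normalize to a probability-like vector
    return -np.sum(p * np.log(p + 1e-12))

def mean_envelope_entropy(modes):
    """PSO fitness over the candidate (alpha, K): lower is better, since a
    concentrated (impulsive) envelope carries clearer fault signatures."""
    return np.mean([envelope_entropy(m) for m in modes])

# toy check: an impulsive burst signal has a more concentrated envelope,
# hence lower entropy than broadband noise
t = np.linspace(0, 1, 1024, endpoint=False)
impulsive = np.sin(2 * np.pi * 50 * t) * (np.sin(2 * np.pi * 5 * t) > 0.9)
noisy = np.random.default_rng(0).standard_normal(1024)
```

The entropy comparison is what lets the PSO prefer decompositions whose modes isolate impulsive fault content.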
• SVM and other methods are combined to build a prediction model of dam deformation.
• Three key problems of SVM are investigated.
• The proposed approach can reduce the human impact on the modeling process.
Considering the strong nonlinear dynamic characteristics of dam deformation, a prediction model of dam deformation is investigated. Support vector machine (SVM) is combined with other methods, such as phase space reconstruction, wavelet analysis, and particle swarm optimization (PSO), to build the prediction model. Firstly, the chaotic characteristics and the predictable time scale of dam deformation are identified by applying phase space reconstruction to the observed deformation series. Secondly, an SVM-based prediction model of dam deformation is proposed: the reconstructed phase space of the observed deformation and the Morlet wavelet basis function are selected as the input vector and the kernel function of the SVM, respectively. Thirdly, the PSO algorithm is improved to optimize the parameters of the SVM-based prediction model. Finally, the displacement of an actual dam is taken as an example. The results demonstrate that both the modeling efficiency and the forecasting accuracy can be improved.
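The Morlet wavelet kernel mentioned above has a standard closed form; the sketch below is our own illustration, pairing it with kernel ridge regression as a lightweight stand-in for full SVM training (the 1.75 center frequency and the scale a are the usual choices, assumed here):

```python
import numpy as np

def morlet_kernel(A, B, a=1.0):
    """Morlet wavelet kernel: product over input dimensions of
    cos(1.75*d/a) * exp(-d**2 / (2*a**2)), where d is the coordinate gap."""
    D = A[:, None, :] - B[None, :, :]              # pairwise differences
    return np.prod(np.cos(1.75 * D / a) * np.exp(-D**2 / (2 * a**2)), axis=-1)

# toy regression on a 1-D nonlinear series (a stand-in for the reconstructed
# phase space input vectors of the abstract)
X = np.linspace(0.0, 2 * np.pi, 40)[:, None]
y = np.sin(X).ravel()
K = morlet_kernel(X, X)
coef = np.linalg.solve(K + 1e-3 * np.eye(len(X)), y)   # regularized dual fit
y_hat = K.dot(coef)
```

The wavelet kernel is admissible (its spectrum is a sum of shifted Gaussians), so the Gram matrix is positive definite and the dual solve is well posed.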
Differential evolution (DE) is arguably one of the most powerful stochastic real-parameter optimization algorithms in current use. DE operates through computational steps similar to those employed by a standard evolutionary algorithm (EA). However, unlike traditional EAs, the DE variants perturb the current-generation population members with the scaled differences of randomly selected, distinct population members, so no separate probability distribution has to be used for generating the offspring. Since its inception in 1995, DE has drawn the attention of many researchers all over the world, resulting in many variants of the basic algorithm with improved performance. This paper presents a detailed review of the basic concepts of DE and a survey of its major variants, its application to multiobjective, constrained, large-scale, and uncertain optimization problems, and the theoretical studies conducted on DE so far. It also provides an overview of the significant engineering applications that have benefited from the powerful nature of DE.
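The difference-vector perturbation described above is easiest to see in the classic DE/rand/1/bin variant; the following is our own minimal sketch (parameter defaults F = 0.5, CR = 0.9 are conventional assumptions, not values from the survey):

```python
import numpy as np

def de_rand_1_bin(f, bounds, pop=20, F=0.5, CR=0.9, gens=200, seed=0):
    """DE/rand/1/bin: mutate with a scaled difference of two random members,
    apply binomial crossover, then keep the better of parent and trial."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    d = len(lo)
    X = rng.uniform(lo, hi, (pop, d))
    fit = np.array([f(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            r1, r2, r3 = rng.choice([j for j in range(pop) if j != i], 3,
                                    replace=False)
            v = np.clip(X[r1] + F * (X[r2] - X[r3]), lo, hi)  # mutation
            mask = rng.random(d) < CR
            mask[rng.integers(d)] = True       # guarantee one mutated gene
            u = np.where(mask, v, X[i])        # binomial crossover
            fu = f(u)
            if fu <= fit[i]:                   # greedy one-to-one selection
                X[i], fit[i] = u, fu
    return X[fit.argmin()], fit.min()

best_x, best_f = de_rand_1_bin(lambda x: float(np.sum(x**2)), [(-5, 5)] * 5)
```

Note that the offspring distribution is shaped entirely by the population's own spread, which is the self-adaptation property the survey highlights.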
Particle swarm optimization (PSO) has two salient components: 1) a dynamical rule governing particle motion and 2) an interparticle communication topology. Recent practice has focused on the fully connected topology (Gbest) despite earlier indications of the superiority of local particle neighborhoods. This paper seeks to address the controversy with empirical trials of canonical PSO on a large benchmark of functions, categorized into 14 properties. This paper confirms the early lore that Gbest is the overall better algorithm for unimodal and separable problems and that a ring neighborhood of connectivity two (Lbest) is the preferred choice for multimodal, nonseparable, and composition functions. Topologies of intermediate particle connectivity were also tested, and the difference in global/local performance was found to be even more marked. A measure of significant improvement is introduced in order to distinguish major improvements from refinements. Lbest, according to the experiments on the 84 test functions and a bimodal problem of adjustable severity, is found to have significant improvements later in the run and to be more diverse at termination. A mobility study shows that Lbest is better able to jump between optimum basins; indeed, Gbest was unable to switch basins in the bimodal trial. The implication is that Lbest's larger terminal diversity, its better ability to basin hop, and its later significant improvement account for the performance enhancement. In several cases where Lbest was not the better algorithm, the trials show that Lbest was not stuck but would have continued to improve with an extended evaluation budget. Canonical PSO is a baseline algorithm and the ancestor of all contemporary PSO variants. These variants build on the basic structure of baseline PSO, and the broad conclusions of this paper are expected to carry over. In particular, research that fails to consider local topologies risks underplaying the success of the promoted algorithm.
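The Gbest/Lbest distinction studied above amounts to one line in the velocity update: which neighborhood supplies the "social" attractor. The sketch below is our own minimal canonical-PSO illustration (the coefficients w ≈ 0.73 and c ≈ 1.50 are the common constriction-style defaults, assumed here):

```python
import numpy as np

def pso(f, dim, n=30, iters=300, topology="lbest", w=0.7298, c=1.4962, seed=1):
    """Canonical PSO with either the fully connected topology (Gbest) or a
    ring of connectivity two (Lbest): each particle is pulled toward the
    best position known within its neighborhood only."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, (n, dim))
    V = np.zeros((n, dim))
    P, pf = X.copy(), np.array([f(x) for x in X])
    for _ in range(iters):
        if topology == "gbest":
            nbest = P[np.full(n, pf.argmin())]     # all see the swarm best
        else:                                      # ring: i-1, i, i+1 (wrap)
            idx = np.stack([np.arange(-1, n - 1), np.arange(n),
                            np.arange(1, n + 1) % n])
            nbest = P[idx[pf[idx].argmin(axis=0), np.arange(n)]]
        V = (w * V + c * rng.random((n, dim)) * (P - X)
                   + c * rng.random((n, dim)) * (nbest - X))
        X = X + V
        fx = np.array([f(x) for x in X])
        better = fx < pf
        P[better], pf[better] = X[better], fx[better]
    return pf.min()

sphere = lambda x: float(np.sum(x**2))
```

On a unimodal sphere both topologies converge, matching the paper's finding that Gbest's advantage is largest there; Lbest's slower information flow is what preserves diversity on multimodal landscapes.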
In this paper, the optimal sizing problem of a micro-grid's resources in two different modes, in the presence of an electric vehicle, is investigated using the multi-objective particle swarm optimization algorithm. In this regard, the uncertain behavior of the electric vehicle is modeled using Monte Carlo simulation. In the first case, named PV/wind/battery, the optimum number of components and the cost at different levels of reliability are determined. Then, the electric vehicle is added to the system and the loss of power supply probability (LPSP) is recalculated in both the deterministic and stochastic states. The results show that the electric vehicle increases the system reliability. In the second system, named PV/wind/battery/EV, the effect of the deterministic and stochastic behavior of the electric vehicle on the number of components and the loss of power supply probability is investigated for the first time. The results demonstrate that the design of both systems is feasible, but the first system is more efficient than the second, because the latter uses more wind turbines at identical LPSP levels. Moreover, a sensitivity analysis is performed to show the effect of wind speed and load parameters on the decision variables.
• Optimal sizing of a HRES in the presence of an electric vehicle.
• Solving the HRES optimal sizing problem using the MOPSO algorithm.
• Modeling the uncertain behavior of EVs in the proposed model using the MCS approach.
• Modeling the problem using the LCC and LPSP indices.
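The LPSP index and the Monte Carlo treatment of the EV combine naturally into one estimator; the sketch below is our own illustration, and the normal model for the EV's daily charging demand (its mean and spread) is purely an assumption for demonstration:

```python
import numpy as np

def lpsp(generation, load, rng, n_samples=1000, ev_kwh_mean=8.0, ev_kwh_sd=2.0):
    """Monte Carlo estimate of the loss of power supply probability: the
    fraction of hours in which supply falls short of demand, with the EV's
    daily charging energy drawn at random (illustrative normal model) and
    spread uniformly over the 24 hours."""
    hours = len(load)
    shortfall_hours = 0
    for _ in range(n_samples):
        ev = np.maximum(rng.normal(ev_kwh_mean, ev_kwh_sd, hours), 0) / 24.0
        demand = load + ev
        shortfall_hours += np.count_nonzero(generation < demand)
    return shortfall_hours / (n_samples * hours)
```

A sizing optimizer such as MOPSO would call this estimator for each candidate component count, trading LPSP against life-cycle cost.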
The Mahalanobis–Taguchi system is considered a promising and powerful tool for handling binary classification cases. However, the Mahalanobis–Taguchi system has several restrictions in screening useful features and determining the decision boundary in an optimal manner. In this article, an integrated Mahalanobis classification system is proposed which builds on the concept of the Mahalanobis distance and its space. The integrated Mahalanobis classification system integrates the decision boundary searching process, based on a particle swarm optimizer, directly into the feature selection phase for constructing the Mahalanobis distance space. This integration (a) avoids the need for user-dependent input parameters and (b) improves the classification performance. For the feature selection phase, both a binary particle swarm optimizer and a binary gravitational search algorithm are investigated. To deal with possible overfitting problems in the case of sparse data sets, k-fold cross-validation is considered. The integrated Mahalanobis classification system is benchmarked against the classical Mahalanobis–Taguchi system as well as the recently proposed two-stage Mahalanobis classification system in terms of classification performance. Results are presented on both an experimental case study of complex-shaped metallic turbine blades with various damage types and a synthetic case study of cylindrical dogbone samples with creep and microstructural damage. The results indicate that the proposed integrated Mahalanobis classification system shows good and robust classification performance.
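The scaled Mahalanobis distance underlying all three systems compared above can be sketched as follows (our own illustration; the feature screening and the PSO-driven decision-boundary search are omitted):

```python
import numpy as np

def mahalanobis_distances(X_normal, X_test):
    """Scaled Mahalanobis distance of test samples from the 'normal' group,
    the quantity at the heart of the Mahalanobis-Taguchi system: it averages
    about 1 for samples drawn from the normal group itself."""
    mu = X_normal.mean(axis=0)
    C_inv = np.linalg.inv(np.cov(X_normal, rowvar=False))
    d = X_test - mu
    k = X_normal.shape[1]                      # number of features
    return np.einsum('ij,jk,ik->i', d, C_inv, d) / k

rng = np.random.default_rng(0)
X_normal = rng.standard_normal((500, 4))       # healthy reference samples
md_healthy = mahalanobis_distances(X_normal, X_normal)
md_damaged = mahalanobis_distances(X_normal, X_normal + 5.0)  # shifted group
```

Classification then reduces to thresholding this distance, which is exactly the decision boundary the particle swarm optimizer searches for in the proposed system.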
This paper proposes a deterministic particle swarm optimization (PSO) to improve the maximum power point tracking (MPPT) capability of a photovoltaic system under partial shading conditions. The main idea is to remove the random numbers from the acceleration factors of the conventional PSO velocity equation. Additionally, the maximum change in velocity is restricted to a particular value, which is determined based on a critical study of the P–V characteristics during partial shading. Advantages of the method include: 1) a consistent solution is achieved despite a small number of particles; 2) only one parameter, namely the inertia weight, needs to be tuned; and 3) the MPPT structure is much simpler than that of the conventional PSO. To evaluate the idea, the algorithm is implemented on a buck-boost converter and compared to the conventional hill climbing (HC) MPPT method. Simulation results indicate that the proposed method outperforms the HC method in terms of global peak tracking speed and accuracy under various partial shading conditions. Furthermore, it is tested using measured data from a tropical cloudy day, which includes rapid movement of passing clouds and partial shading. Despite the wide fluctuations in array power, the average efficiency over the 10-h test profile reaches 99.5%.
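The two modifications described above, dropping the random acceleration factors and clamping the velocity change, can be sketched in a single update rule; this is our own illustration, and the surrounding duty-cycle MPPT loop on the converter is omitted:

```python
def dpso_velocity(w, v, pbest, gbest, x, v_max):
    """Deterministic PSO velocity update: the random coefficients of the
    conventional cognitive and social terms are removed, and the resulting
    velocity is clamped to [-v_max, v_max] (v_max would be chosen from the
    P-V characteristics under partial shading)."""
    v_new = w * v + (pbest - x) + (gbest - x)   # no random numbers
    return max(-v_max, min(v_max, v_new))
```

With the randomness gone, repeated runs from the same duty-cycle positions trace identical trajectories, which is what yields the consistent solution claimed with few particles.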
Due to the development of modern information technology, the emergence of fog computing enhances equipment computational power and provides new solutions for traditional industrial applications. Generally, however, it is impossible to establish a quantitative energy-aware model with a smart meter alone for load balancing and scheduling optimization in a smart factory. Focusing on the complex energy consumption problems of manufacturing clusters, this paper proposes an energy-aware load balancing and scheduling (ELBS) method based on fog computing. First, an energy consumption model related to the workload is established on the fog node, and an optimization function aiming at load balancing of the manufacturing cluster is formulated. Then, an improved particle swarm optimization algorithm is used to obtain an optimal solution, and task priorities are established for the manufacturing cluster. Finally, a multiagent system is introduced to achieve distributed scheduling of the manufacturing cluster. The proposed ELBS method is verified by experiments on a candy packing line, and the experimental results show that the proposed method provides optimal scheduling and load balancing for the work robots.
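One plausible form of the load-balancing objective described above is an imbalance penalty over per-node energy loads; this is our own assumption for illustration, as the abstract does not specify the exact energy model:

```python
import numpy as np

def load_balance_cost(assignment, task_energy, n_nodes):
    """Illustrative load-balancing objective (an assumed stand-in for the
    paper's workload-based energy model): sum each task's energy cost onto
    its assigned fog node and penalize imbalance via the standard deviation
    of the per-node totals. A PSO particle would encode `assignment`."""
    loads = np.zeros(n_nodes)
    for task, node in enumerate(assignment):
        loads[node] += task_energy[task]
    return float(loads.std())
```

A particle swarm minimizing this cost drives all nodes toward equal energy load, after which task priorities can be derived from the resulting schedule.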
Model management plays an essential role in surrogate-assisted evolutionary optimization of expensive problems, since the strategy for selecting individuals for fitness evaluation using the real objective function has a substantial influence on the final performance. Among many others, infill-criterion-driven Gaussian process (GP)-assisted evolutionary algorithms have been demonstrated to be competitive for optimization of problems with up to 50 decision variables. In this paper, a multiobjective infill criterion (MIC) that considers the approximated fitness and the approximation uncertainty as two objectives is proposed for a GP-assisted social learning particle swarm optimization algorithm. The MIC uses nondominated sorting for model management, thereby avoiding combining the approximated fitness and the approximation uncertainty into a scalar function, which is shown to be particularly important for high-dimensional problems, where the estimated uncertainty becomes less reliable. Empirical studies on 50-D and 100-D benchmark problems and a synthetic problem constructed from four real-world optimization problems demonstrate that the proposed MIC is more effective than existing scalar infill criteria for GP-assisted optimization given a limited computational budget.
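The core of the MIC, nondominated sorting over the two objectives (approximated fitness to minimize, approximation uncertainty to maximize) rather than a weighted scalar, can be sketched as follows (our own illustration of the first front only):

```python
import numpy as np

def mic_front(pred_fitness, uncertainty):
    """Return indices of candidates on the first nondominated front when
    minimizing predicted fitness and maximizing predictive uncertainty.
    No scalar weighting of the two objectives is needed."""
    f1 = np.asarray(pred_fitness, float)
    f2 = -np.asarray(uncertainty, float)   # maximize -> minimize the negation
    n = len(f1)
    front = []
    for i in range(n):
        dominated = any(
            f1[j] <= f1[i] and f2[j] <= f2[i]
            and (f1[j] < f1[i] or f2[j] < f2[i])
            for j in range(n)
        )
        if not dominated:
            front.append(i)
    return front
```

Candidates on this front (promising and/or poorly modeled) are the ones submitted for expensive real evaluations, which is why an unreliable uncertainty estimate in high dimensions cannot corrupt the ranking the way a bad scalar weighting could.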