•Involvement of lot streaming, part sharing and sequence-dependent transition/setup times.•Involvement of products with multi-level assembly structures.•Design of a distributed ant colony system meta-heuristic.•Comprehensive experiment design using different scales of testbeds.•A case study on a practical ball valve production scheduling problem.
This paper studies the production scheduling problem in a flexible manufacturing system with two adjacent working areas, whose products involve flexible non-linear process plans and assembly operations. The basic parts are produced in one area before they are transported to the other area for assembly. The assembly structures of products are either flat or multi-level. Sequence-dependent setup times of operations and transition times of jobs between machines are considered separately from processing times. Lot streaming is decided beforehand, such that each job represents a basic part instead of a batch of identical parts. Identical subassemblies are shared by all possible assembly operations instead of being pre-associated with any product. Makespan, total tardiness and total workload are taken as the objectives to be optimised. We propose a distributed ant colony system to solve the problem and explore the Pareto front. The approach is first compared with other methods using several sets of hypothetical test cases of different sizes and complexities; it is then applied to a ball valve production scheduling problem under different scenarios. We show that the proposed approach outperforms most of the other methods on the tested problems, especially on large-scale instances, making it a valuable and competitive approach for solving practical production scheduling problems.
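The abstract does not detail the ant colony system itself. As a rough, hypothetical illustration of the meta-heuristic family it belongs to, the following is a minimal single-colony ACS sketch that sequences jobs on one machine with sequence-dependent setup times; the job data, the pheromone initialisation `tau0` and all parameter values are assumptions, not the paper's.

```python
import random

# Hypothetical data: processing times and sequence-dependent setup times.
proc = [4, 3, 6, 2]                      # processing time of each job
setup = [[0, 2, 3, 1],                   # setup[i][j]: setup incurred when
         [2, 0, 1, 4],                   #   job j directly follows job i
         [3, 1, 0, 2],
         [1, 4, 2, 0]]
n, tau0, alpha, rho, q0 = 4, 1.0, 0.1, 0.1, 0.9
tau = [[tau0] * n for _ in range(n)]     # pheromone on job transitions

def makespan(seq):
    t, prev = proc[seq[0]], seq[0]
    for j in seq[1:]:
        t += setup[prev][j] + proc[j]
        prev = j
    return t

def build_sequence(rng):
    seq = [rng.randrange(n)]
    while len(seq) < n:
        i = seq[-1]
        cand = [j for j in range(n) if j not in seq]
        weights = [tau[i][j] / (1 + setup[i][j]) for j in cand]
        if rng.random() < q0:            # ACS pseudo-random proportional rule:
            j = cand[weights.index(max(weights))]   # exploit best transition
        else:
            j = rng.choices(cand, weights=weights)[0]  # biased exploration
        tau[i][j] = (1 - alpha) * tau[i][j] + alpha * tau0  # local update
        seq.append(j)
    return seq

def acs(iters=200, ants=5, seed=0):
    rng, best, best_cost = random.Random(seed), None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            seq = build_sequence(rng)
            c = makespan(seq)
            if c < best_cost:
                best, best_cost = seq, c
        for a, b in zip(best, best[1:]):  # global update along best sequence
            tau[a][b] = (1 - rho) * tau[a][b] + rho / best_cost
    return best, best_cost
```

The actual study handles multiple machines, assembly precedence and three objectives; this toy only shows the pheromone mechanics.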
In this study, a three-stage optimisation method for distribution system load transfer schemes that considers the participation of distributed generations (DGs) is proposed. The algorithm aims to recover loads after the large-scale power outages caused by substation-complete-shutdown accidents (SCSAs), thus improving power supply reliability. First, the principle of the SCSA load transfer strategy considering DGs and the secondary dispatch is introduced. The capacities of DGs are used to provide additional power supply in grid-connected or islanded operation. Meanwhile, to relieve the stress on the backup substation for further load transfer, a secondary dispatch strategy is proposed to dispatch loads among the directly connected and secondarily connected substations. Afterwards, a three-stage optimisation model is built to obtain the optimal load transfer scheme. The first stage determines the island partitions in the unrecovered areas. The second stage comprises a tie-line transfer model that considers load importance and recovery time. The third stage determines the optimal secondary dispatch to achieve a better overall recovery rate. Case study results show that the proposed three-stage optimisation method can improve the overall load recovery rate while ensuring the reliability of important loads and the operational safety of the system.
Recently, various quantum computing and communication tasks have been implemented using IBM's superconductivity-based quantum computers. Here, we show that the circuits used in most of those works were not optimized, and we obtain the corresponding optimized circuits. Optimized circuits implementable on IBM quantum computers are also obtained for a set of reversible benchmark circuits. With a clear example, it is shown that the reduction in circuit cost enhances the fidelity of the output state (with respect to the theoretically expected state in the absence of noise), as fewer gates and a smaller circuit depth introduce fewer errors during the evolution of the state. Further, taking the Mermin inequality as an example, it is shown that the violation of the classical limit is enhanced when optimized circuits are used. Thus, the present approach can be used to identify a relatively weaker signature of quantumness and to establish quantum supremacy in a stronger manner.
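The abstract does not state which optimisation rules are applied. One elementary rule of this kind, sketched below purely as an illustration (it is not claimed to be the paper's method), is a peephole pass that cancels adjacent identical self-inverse gates, directly reducing gate count and depth.

```python
# Toy peephole optimisation: repeatedly cancel adjacent identical
# self-inverse gates (two X, H, Z or CNOT gates in a row on the same
# qubits compose to the identity). Circuits are flat gate lists here;
# a real transpiler also reasons about commutation across qubit wires.
SELF_INVERSE = {"x", "h", "z", "cx"}

def cancel_pairs(circuit):
    """circuit: list of (gate_name, qubit_tuple). Returns reduced list."""
    changed = True
    while changed:                       # loop until a fixed point is reached
        changed, out, i = False, [], 0
        while i < len(circuit):
            if (i + 1 < len(circuit)
                    and circuit[i] == circuit[i + 1]
                    and circuit[i][0] in SELF_INVERSE):
                i += 2                   # the pair is the identity: drop it
                changed = True
            else:
                out.append(circuit[i])
                i += 1
        circuit = out
    return circuit
```

For example, `[("x",(0,)), ("h",(0,)), ("h",(0,)), ("x",(0,))]` collapses to an empty circuit, since removing the inner H pair exposes the X pair.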
Today, finding a viable solution to any real-world combinatorial problem is a crucial task. Using optimisation techniques, however, a viable solution to a specific problem can be obtained despite the limitations of the implemented technique. Furthermore, population-based optimisation techniques are now of considerable interest and have spawned many new and improved techniques for rectifying engineering problems. One of these methods is the Grey Wolf Optimiser (GWO), which mimics the grey wolf's leadership hierarchy and hunting behaviour in nature. The GWO adopts the hierarchical nature of grey wolves and ranks the best solutions as alpha, beta and delta in descending order. Additionally, the wolves' hunting techniques of tracking, encircling and attacking are modelled mathematically to find the best optimised solution. This paper presents the results of an extensive study of 83 published papers related to GWO in various applications such as parameter tuning, the economic dispatch problem and cost estimation, to name a few. A discussion of the properties of the GWO algorithm and of how it minimises different problems in different applications is presented, as well as an analysis of the research trend of the GWO optimisation technique in various applications from 2014 to 2017. Based on the literature, it was observed that GWO can solve single- and multi-objective problems efficiently owing to its good local search behaviour, which performs exceptionally well across different problems and solutions.
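The alpha/beta/delta hierarchy and encircling behaviour described above can be sketched as a minimal GWO for box-constrained minimisation (the test function, bounds and parameter values below are illustrative assumptions, not taken from any of the surveyed papers):

```python
import random

def gwo(f, dim, bounds, wolves=20, iters=200, seed=1):
    """Minimal Grey Wolf Optimiser sketch (minimisation over a box).
    Each wolf moves towards positions estimated from the three best
    wolves (alpha, beta, delta); the exploration coefficient `a`
    shrinks linearly from 2 to 0 over the run."""
    rng = random.Random(seed)
    lo, hi = bounds
    pack = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(wolves)]
    best = min(pack, key=f)[:]            # best-ever position (elitism)
    for t in range(iters):
        pack.sort(key=f)                  # rank wolves by fitness
        if f(pack[0]) < f(best):
            best = pack[0][:]
        alpha, beta, delta = pack[0][:], pack[1][:], pack[2][:]
        a = 2.0 * (1 - t / iters)
        for w in range(wolves):
            new = []
            for d in range(dim):
                est = []
                for leader in (alpha, beta, delta):
                    A = 2 * a * rng.random() - a   # |A|>1 explores, else exploits
                    C = 2 * rng.random()
                    D = abs(C * leader[d] - pack[w][d])
                    est.append(leader[d] - A * D)  # encircling-prey step
                new.append(min(hi, max(lo, sum(est) / 3)))
            pack[w] = new
    return min(pack + [best], key=f)
```

On a simple convex function such as the sphere, the pack collapses onto the leaders as `a` approaches zero, which is the exploitation phase the abstract refers to.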
This text focuses on the practitioner, demonstrating how to apply heuristic optimization methods and offering a systematic theory of how to apply and adapt them.
This work proposes a novel framework able to optimise topology and fibre angle concomitantly to minimise the compliance of a structure. Two different materials are considered, one with isotropic properties (nylon) and one with orthotropic properties (onyx, which is nylon reinforced with chopped carbon fibres). Within the same sub-step, the framework optimises first the topology and then the fibre angle at every element throughout the domain. For the isotropic material, only topology optimisation takes place, whereas for the orthotropic solid both topology and fibre orientation are considered. The objective is to minimise compliance, and this is done for three volume fractions of material inside the design domain: 30%, 40% and 50%. Two classical benchmark cases are considered: the 3-point and 4-point bending loading cases. The optimum topologies are further treated and manufactured using the fused filament fabrication (FFF) 3D printing method. Key results reveal that the absolute, density-normalised and volume-normalised stiffness values within each admissible volume are higher for onyx than for nylon, which demonstrates the efficiency of the proposed concurrent optimisation framework. Moreover, although the objective was to minimise compliance, the optimisation also improved the strength of all parts. The excellent quality and geometric tolerance of the 3D-printed parts are also worth noting.
•A topology optimisation framework for isotropic and orthotropic materials.•Incorporation of fibre orientation into the optimisation framework.•Additive manufacturing of chopped-carbon-fibre-reinforced nylon composites.•Experimental validation: manufacturing and testing.
Most real-world optimisation problems are subject to different types of constraints and are known as constrained optimisation problems. Reactive power dispatch (RPD) in electrical power systems is also a non-linear, multi-objective or single-objective constrained optimisation problem. In this study, a hybrid multi-swarm particle swarm optimisation (HMPSO) algorithm is proposed to solve the RPD problem. HMPSO is a recently proposed population-based search algorithm in which the existing swarm is partitioned into several sub-swarms. Particle swarm optimisation is applied as the search engine for each sub-swarm. In addition, to explore more promising regions of the search space, the differential evolution (DE) algorithm is implemented to improve the personal best of each particle. The RPD problem is formulated as a non-linear, constrained multi-objective optimisation problem with equality and inequality constraints for minimising power losses and improving the voltage profile simultaneously. To find the Pareto-optimal set for the RPD problem, the weighted-sum method is applied. Afterwards, a fuzzy membership function is used to select the preferred solution from the Pareto-optimal set. The effectiveness of the HMPSO algorithm is verified on the standard IEEE 30-bus system and a practical 75-bus Indian system.
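The fuzzy compromise-selection step mentioned above can be sketched as follows; the linear membership form used here is a common choice in the RPD literature and is assumed, since the abstract does not give the exact function:

```python
def fuzzy_best(pareto):
    """pareto: list of objective vectors (all objectives to be minimised).
    Assign each solution a linear fuzzy membership per objective,
    mu_k = (f_max - f) / (f_max - f_min), then pick the solution with
    the largest normalised membership sum (the compromise solution)."""
    m = len(pareto[0])
    fmin = [min(p[k] for p in pareto) for k in range(m)]
    fmax = [max(p[k] for p in pareto) for k in range(m)]

    def mu(p):
        # small epsilon guards against a degenerate objective with
        # identical values across the whole front
        return sum((fmax[k] - p[k]) / (fmax[k] - fmin[k] + 1e-12)
                   for k in range(m))

    total = sum(mu(p) for p in pareto)
    return max(pareto, key=lambda p: mu(p) / total)
```

For example, on the front `[(1, 9), (2, 3), (9, 1)]` the method selects `(2, 3)`, the point that sacrifices little on either objective.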
The set-based particle swarm optimisation algorithm is a swarm-based meta-heuristic that has gained popularity in recent years. In contrast to the original particle swarm optimisation algorithm, the set-based particle swarm optimisation algorithm is used to solve discrete and combinatorial optimisation problems. The main objective of this paper is to review the set-based particle swarm optimisation algorithm and to provide an overview of the problems to which the algorithm has been applied. This paper starts with an examination of previous attempts to create a set-based particle swarm optimisation algorithm and discusses their shortcomings. The set-based particle swarm optimisation algorithm is established as the only suitable particle swarm variant that is both based on true set theory and does not require problem-specific modifications. In-depth explanations are given regarding the general position and velocity update equations, the mechanisms used to control the exploration–exploitation trade-off, and the quantifiers of swarm diversity. After the various existing applications of set-based particle swarm optimisation are presented, this paper concludes with a discussion of potential future research.
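To convey the flavour of set-based position and velocity updates, here is a deliberately simplified single update step in which a particle's position is a set and the "velocity" is a collection of probabilistic add/remove operations drawn from the personal and global best sets. This is only an illustration of the idea; it does not reproduce the exact operators of the published set-based PSO.

```python
import random

def sbpso_step(position, pbest, gbest, c1=0.5, c2=0.5, rng=random):
    """One toy set-based update: move `position` (a set) towards the
    personal-best and global-best sets by probabilistically adding
    elements they contain and removing elements they lack.
    c1, c2 play the role of the cognitive and social coefficients."""
    new = set(position)
    for best, c in ((pbest, c1), (gbest, c2)):
        for e in best - position:          # candidate additions
            if rng.random() < c:
                new.add(e)
        for e in position - best:          # candidate removals
            if rng.random() < c:
                new.discard(e)
    return new
```

With `c1 = c2 = 1` the particle jumps all the way to the union-and-prune result; smaller coefficients make the move partial and stochastic, which is where the exploration–exploitation trade-off discussed in the review enters.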
In many recent studies, the value of forest inventory information in harvest scheduling has been examined. In a previous paper, we demonstrated that making measurement decisions simultaneously with harvest decisions, for stands whose harvest decision is uncertain, may be highly profitable. In that study, the quality of additional measurements was not a decision variable, and the only options were making no measurements or measuring perfect information. In this study, we introduce data quality into the decision problem, i.e., the decision-maker can choose between making imperfect or perfect measurements. The imperfect information is obtained with a specific scenario tree formulation. Our decision problem includes three types of decisions: harvest decisions, measurement decisions, and decisions about measurement quality. In addition, the timing of the harvests and measurements must be decided. These decisions are evaluated against two objectives: discounted aggregate income over the planning periods and the end value of the forest at the end of the planning horizon. Solving the bi-objective optimization problem with the ε-constraint method showed that imperfect information was mostly sufficient for the harvest timing decisions during the planning horizon, but perfect information was required to meet the end-value constraint. The relative importance of the two objectives affects the measurements indirectly by increasing or decreasing the number of certain decisions (i.e., situations in which the optimal decision is identical in all scenarios).
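The ε-constraint method mentioned above turns a bi-objective problem into a sweep of single-objective problems: optimise the first objective subject to the second meeting a threshold ε. A minimal sketch over an explicit finite solution set (the example data are invented, and the real study optimises over harvest schedules, not tuples):

```python
def epsilon_constraint(solutions, f1, f2, epsilons):
    """Trace a Pareto front by maximising f1 subject to f2 >= eps,
    for a sweep of eps values, over a finite candidate set.
    f1: primary objective (e.g. discounted income), maximised.
    f2: secondary objective (e.g. forest end value), constrained."""
    front = []
    for eps in epsilons:
        feasible = [s for s in solutions if f2(s) >= eps]
        if feasible:
            best = max(feasible, key=f1)
            if best not in front:        # avoid duplicate front points
                front.append(best)
    return front
```

Tightening ε trades income for end value, which is exactly the trade-off the study reports: loose ε values are met with imperfect information, while a tight end-value constraint forces perfect measurements.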
Demand response (DR) is the response of electricity consumers to time-varying tariffs or incentives awarded by the utility. Home energy management systems are systems whose role is to control the consumption of appliances under DR programs such that the electricity bill is minimised. While most researchers have performed optimal scheduling only for non-interruptible appliances, in this paper interruptible appliances such as electric water heaters are also considered. In optimal scheduling of non-interruptible appliances, the problem is commonly formulated as an optimisation problem with integer decision variables. However, the consideration of interruptible appliances leads to a binary optimisation problem, which is more difficult than integer optimisation problems. Since the basic version of binary particle swarm optimisation (PSO) does not perform well on binary engineering optimisation problems, in this paper a new binary particle swarm optimisation with a quadratic transfer function, named quadratic binary PSO (QBPSO), is proposed for scheduling shiftable appliances in smart homes. The proposed methodology is applied to optimal scheduling in a smart home with 10 appliances, where the number of decision variables is as high as 264. Optimal scheduling is performed for both real-time pricing (RTP) and time-of-use (TOU) tariffs, both with and without consideration of consumers' comfort. The achieved results indicate the drastic effect of optimal scheduling on the reduction of the electricity bill, while consumers' comfort is not much affected. The results testify that the proposed QBPSO outperforms the basic binary PSO variant and 9 other binary PSO variants with different transfer functions.
•Optimal scheduling of appliances is formulated as a binary optimisation problem.•QBPSO is proposed for scheduling shiftable appliances in smart homes.•The results show the drastic effect of optimal scheduling on the electricity bill.•The results show the strong effect of the transfer function on binary PSO performance.•Results show the superiority of QBPSO over the basic binary PSO and other binary PSO variants.
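The transfer-function mechanism the highlights refer to can be sketched as a minimal binary PSO: real-valued velocities are mapped to bit-flip probabilities through a transfer function. The classic sigmoid is used below because the paper's quadratic transfer function is not specified here; it would simply replace `sigmoid`. The objective, dimensions and parameter values are illustrative assumptions.

```python
import math
import random

def sigmoid(v):
    # S-shaped transfer function: maps velocity to a probability in (0, 1)
    return 1.0 / (1.0 + math.exp(-v))

def binary_pso(f, dim, particles=20, iters=100,
               w=0.7, c1=1.5, c2=1.5, seed=3):
    """Minimal binary PSO sketch minimising f over bit vectors of
    length dim. Velocities stay real-valued; the transfer function
    turns them into the probability that a bit is set to 1."""
    rng = random.Random(seed)
    X = [[rng.randrange(2) for _ in range(dim)] for _ in range(particles)]
    V = [[0.0] * dim for _ in range(particles)]
    P = [x[:] for x in X]                 # personal-best positions
    g = min(P, key=f)[:]                  # global-best position
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (g[d] - X[i][d]))
                X[i][d] = 1 if rng.random() < sigmoid(V[i][d]) else 0
            if f(X[i]) < f(P[i]):
                P[i] = X[i][:]
                if f(P[i]) < f(g):
                    g = P[i][:]
    return g
```

The shape of the transfer function controls how aggressively velocities are converted into bit flips, which is why the highlights report such a strong effect of the transfer function on binary PSO performance.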