A microgrid is a group of interconnected loads and distributed energy resources that can bridge the gap between dependence on the bulk power grid and the transition to renewable energy. The islanded mode is the most interesting scenario, in which local controllers must maintain power quality standards based on several parameters. This paper proposes a tool specifically focused on tuning the parameters of the secondary consensus-based control for inverter-based islanded microgrids. One often-cited drawback of this process is the large number of parameters that must be tuned, even for a very simple microgrid structure. To manage such a large number of parameters, design of experiments was used in this study. The main motivation for this work was to present an optimized way to define the correct parameters for the secondary consensus control of inverter-based islanded microgrids. The study shows how experimental design methodology can be an efficient tool for tuning microgrid parameters, which typically involve multi-objective experiments. The results show that design of experiments can reach the optimal setting with a minimal number of experiments, which would be almost impossible to obtain by trial and error.
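The secondary control described above rests on a consensus protocol among local controllers. A minimal sketch of discrete-time consensus averaging, the building block of such control, is shown below; the topology, gain `epsilon`, iteration count, and initial values are illustrative assumptions, not the paper's actual tuning.

```python
# Sketch of discrete-time consensus averaging: each node nudges its state
# toward its neighbours' states until all agree on the network average.

def consensus_step(x, adjacency, epsilon):
    """One consensus update over an undirected communication graph."""
    n = len(x)
    return [
        x[i] + epsilon * sum(adjacency[i][j] * (x[j] - x[i]) for j in range(n))
        for i in range(n)
    ]

# Four inverters in a ring topology (hypothetical example).
A = [[0, 1, 0, 1],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [1, 0, 1, 0]]
x = [1.0, 0.5, 0.8, 0.2]  # e.g. per-unit frequency deviations
for _ in range(200):
    x = consensus_step(x, A, epsilon=0.2)  # 0.2 < 2/lambda_max keeps it stable
# x now holds four nearly identical values close to the initial average, 0.625
```

The gain `epsilon` is exactly the kind of parameter whose value the abstract's DOE-based tuning would determine.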
Electric power systems have experienced the rapid insertion of distributed renewable generating sources and, as a result, are facing planning and operational challenges as new grid connections are made. The complexity of this management and the degree of uncertainty increase significantly and need to be better estimated. Considering the high volatility of photovoltaic generation and its impacts on agents in the electricity sector, this work proposes a multivariate strategy based on design of experiments (DOE), principal component analysis (PCA), and artificial neural networks (ANN), which combines the resulting outputs using Mixture DOE (MDOE) for day-ahead photovoltaic generation prediction. The approach separates the data into seasons of the year and considers multiple climatic variables for each period. The dimensionality reduction of the climate variables is performed through PCA. Through DOE, the possible combinations of prediction parameters, such as those of the ANN, were reduced without compromising the statistical reliability of the results. Seventeen generation plants distributed across Brazilian territory were tested. A one-day-ahead PV generation forecast was produced for each generation plant in each season of the year, reaching mean percentage errors of 10.45% for summer, 9.29% for autumn, 9.11% for winter, and 6.75% for spring. The versatility of the proposed approach allows the choice of parameters in a systematic way and reduces the computational cost, since both the dimensionality and the number of experimental simulations are reduced.
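The PCA step above can be sketched with a plain SVD-based reduction that keeps enough components to explain a chosen fraction of variance; the toy data and the 95% threshold are illustrative assumptions, not the paper's dataset or criterion.

```python
import numpy as np

# Sketch of PCA dimensionality reduction via SVD: project the (centered)
# climate variables onto the leading components that retain at least
# `var_ratio` of the total variance.

def pca_reduce(X, var_ratio=0.95):
    Xc = X - X.mean(axis=0)                 # center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = (s ** 2) / np.sum(s ** 2)   # variance fraction per component
    k = int(np.searchsorted(np.cumsum(explained), var_ratio) + 1)
    return Xc @ Vt[:k].T                    # scores on the first k components

# Toy data: 3 variables, but the third is a linear mix of the first two,
# so two components capture essentially all variance.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
X = np.column_stack([X, X[:, 0] + 0.5 * X[:, 1]])
scores = pca_reduce(X, var_ratio=0.95)      # shape (100, 2)
```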
For some time, renewable solar energy generation using photovoltaic panels has stood out among the options, especially in the segment of micro and small companies, where the return on investment is usually higher. In this context, when micro and small companies do not have the capital for such enterprises, several other companies, mainly small ones, have emerged to finance them. However, financiers face significant difficulties in selecting investment portfolios, especially when considering the trade-off between return and risk and the covariations of return on investment, which are very common. In this type of selection, Capital Asset Pricing Model criteria using the Gini risk can help significantly, because the Gini coefficient is a more robust risk measure for assessing non-normal probability distributions. However, searches for methods that meet these selection needs using the adjacent criteria are unsuccessful. Thus, this work seeks to help close that gap by presenting a new selection method based on these criteria. Stochastic evaluations using historical and simulated data indicate that the portfolios selected by the method are attractive options for implementation. These portfolios have reasonable probabilistic expectations and satisfactory protection against mistakes caused by ignoring covariations in return on investment, which indicates a significant advance on the current knowledge frontier and will likely allow wider use of the concept. The method also presents theoretical contributions in adaptations of the benchmark models, which help to fill the adjacent literature gap of similar methods.
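The Gini risk mentioned above is commonly computed as Gini's mean difference, the average absolute difference between all pairs of return observations, which remains meaningful for non-normal distributions where variance-based risk can mislead. A minimal sketch, with illustrative (hypothetical) return data:

```python
# Sketch of Gini's mean difference as a risk measure for a return series.

def gini_mean_difference(returns):
    """Average absolute difference over all pairs of observations."""
    n = len(returns)
    total = sum(abs(a - b) for i, a in enumerate(returns)
                           for b in returns[i + 1:])
    return 2.0 * total / (n * (n - 1))

monthly_returns = [0.02, -0.01, 0.03, 0.00, -0.02, 0.04]  # hypothetical
risk = gini_mean_difference(monthly_returns)
```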
Radio Frequency Identification (RFID) and related technologies have been touted to allow exponential improvements in supply chain logistics and management. Accurate location of packages, cargo containers, and truck trailers saves fuel and reduces pollution and over-production. However, many industrial users have indicated that these technologies have not provided the anticipated benefits. Two complementary strategies are required to address RFID reliability: improving the reliability of RFID technology and/or designing packaging-related infrastructure that enables RFID. This paper focuses on designing RFID Ready facilities (RRF) and an RFID-enabling packaging infrastructure that helps avoid unnecessary transportation, thereby reducing pollution. The design guidelines were developed from a set of experiments conducted in the RFID Supply Chain Laboratory at the University of Tennessee (UT) using Design of Experiments (DOE), to help determine the operational and facility factors that impact RFID reliability. Three different packaging strategies were tested on packages, boxes, and their various combinations. The key factors considered in the experiments were the following: Package Orientation (PO), Tag Placement (TP), Package Placement (PP), Reader Location (RL), Box Orientation (BO), Tag Placement on Box (TPB), and Tag Placement on Package (TPP).
•Optimized RFID deployment is implemented by providing the guidelines for RFID Ready facilities (RRF).
•The guidelines are developed from experiments using Design of Experiments.
•Packaging strategies are tested on packages, boxes, and their various combinations.
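A DOE over factors like those listed above starts from an experimental design matrix; the sketch below enumerates a two-level full factorial over four of the named factors. The levels are illustrative placeholders, not the levels actually used in the UT laboratory experiments.

```python
from itertools import product

# Sketch of a two-level full factorial design: every combination of factor
# levels becomes one experimental run (2^4 = 16 runs for four factors).

factors = {
    "PO": ["horizontal", "vertical"],  # Package Orientation
    "TP": ["top", "side"],             # Tag Placement
    "RL": ["fixed", "portal"],         # Reader Location
    "BO": ["upright", "rotated"],      # Box Orientation
}
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
# design[0] == {"PO": "horizontal", "TP": "top", "RL": "fixed", "BO": "upright"}
```

A fractional factorial would cut the run count further, which matters as the number of factors grows.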
Recently, different methods have been proposed for portfolio optimization and decision making on investment issues. This article presents a new method for portfolio formation based on Data Envelopment Analysis (DEA) and the Entropy function. This new portfolio optimization method applies DEA in association with a model that results from inserting the Entropy function directly into the optimization procedure. First, the DEA model was applied to perform a pre-selection of the assets. Then, the assets rated as efficient were submitted to the proposed model, obtained by inserting the Entropy function into the simplified Sharpe portfolio optimization model. As a result, improved asset participation in the portfolio was achieved. In the DEA model, several variables were evaluated and a low value of beta was achieved, guaranteeing greater robustness of the portfolio. The Entropy function provided not only greater diversity but also a more feasible asset allocation. Additionally, the proposed method obtained better portfolio performance, measured by the Sharpe Ratio, than the comparative methods.
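The diversification role of the Entropy function can be illustrated directly: the Shannon entropy of the weight vector is maximal when capital is spread evenly and small when it is concentrated. The weights below are illustrative, not portfolios from the article.

```python
import math

# Sketch of Shannon entropy of portfolio weights, the diversification term
# inserted into the optimization model: higher entropy = more even allocation.

def portfolio_entropy(weights):
    return -sum(w * math.log(w) for w in weights if w > 0)

concentrated = [0.97, 0.01, 0.01, 0.01]   # almost all capital in one asset
balanced = [0.25, 0.25, 0.25, 0.25]       # evenly spread capital
# portfolio_entropy(balanced) == ln(4), the maximum for four assets,
# while portfolio_entropy(concentrated) is much smaller.
```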
One of the main goals in flux-cored arc welding processes is the optimization of bead geometry, in which multiple geometric characteristics of the welding bead matter; therefore, multiobjective optimization programming is often applied. However, several optimization problems that use stochastic programming do not consider the impact of the correlation between the output variables on their probabilistic constraints. In this context, this paper presents a multiobjective optimization method based on multivariate stochastic programming. To demonstrate the applicability of the proposal, we conducted a design of experiments to optimize a flux-cored arc welding process for stainless-steel claddings. The weighted-sum method was applied to formulate the multiobjective optimization problem. It was possible to formulate a multivariate probability distribution for penetration and dilution, and a 95% probability of meeting the predefined specification limits of the geometric characteristics was achieved.
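The weighted-sum formulation collapses the competing bead-geometry objectives into a single scalar to minimize. The sketch below uses hypothetical normalized objective values at two candidate parameter settings; the actual fitted response models from the experiment are not reproduced here.

```python
# Sketch of weighted-sum scalarization for two minimization objectives,
# e.g. (negated) penetration and dilution after normalization.

def weighted_sum(objectives, weights):
    """Scalarize multiple minimization objectives into one value."""
    return sum(w * f for w, f in zip(weights, objectives))

# Hypothetical normalized objective values at two candidate settings.
setting_a = (0.20, 0.70)
setting_b = (0.50, 0.30)
w = (0.5, 0.5)
best = min((setting_a, setting_b), key=lambda s: weighted_sum(s, w))
# With equal weights, setting_b (0.40) beats setting_a (0.45).
```

Sweeping the weight vector traces out different trade-offs between the geometric characteristics.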
In light of the Brazilian energy regulatory context, cluster strategies are required to classify groups of substations for voltage sag purposes. Tuning cluster algorithms is not a trivial task, because these methods are sensitive to small errors. Therefore, this study proposes a new methodology based on principal component analysis (PCA), attribute agreement analysis, and analysis of covariance to verify the consistency and sensitivity of linkage methods in cluster formation for voltage sag studies. To validate this methodology, real data from power quality indices of distribution substations are used. Four distinct scenarios with disturbances are evaluated. PCA is applied for dimensionality reduction of the data. Then, grouping is performed with eight different linkage methods and agreement analysis is applied. The Ward method was the only one that presented 100% consistency in all scenarios and was considered the most robust method, whereas k-means showed a consistency of 94.11%, with inversion of the clusters. However, when evaluating the groupings, it was found that k-means was unable to adequately separate the groups for this dataset. Finally, the proposed methodology is adequate for choosing cluster methods for extensive data and can be extended to applications in different areas.
Vibrations in CDFW. Soares de Alcantara, Daniel; Balestrassi, Pedro Paulo; Freitas Gomes, José Henrique. Entropy, 06/2020, Volume 22, Issue 6. Journal Article; Peer reviewed; Open access.
Continuous drive friction welding is a solid-state welding process that has been experimentally proven to be a fast and reliable method. It is a complex process: deformations and changes in the viscosity of the material alter the friction between the surfaces of the pieces. All these dynamics cause changes in the vibration signals, and the interpretation of these signals can reveal important information. The vibration signals generated during the friction and forging stages are measured on the stationary part of the structure to determine the influence of the manipulated variables on the time-domain statistical characteristics (root mean square, peak value, crest factor, and kurtosis). In the frequency domain, empirical mode decomposition is used to characterize frequencies. It was observed that the effects of the manipulated variables on the calculated statistical characteristics can be identified. The results also indicate that the effect of the manipulated variables is stronger on low-frequency signals.
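The four time-domain statistics named above have standard definitions; a minimal sketch computing them from a signal is shown below, using a synthetic sine rather than the measured welding vibrations.

```python
import math

# Sketch of the time-domain features used above: RMS, peak value,
# crest factor (peak/RMS), and kurtosis (normalized fourth moment).

def time_domain_features(signal):
    n = len(signal)
    mean = sum(signal) / n
    rms = math.sqrt(sum(x * x for x in signal) / n)
    peak = max(abs(x) for x in signal)
    crest = peak / rms
    var = sum((x - mean) ** 2 for x in signal) / n
    kurt = (sum((x - mean) ** 4 for x in signal) / n) / var ** 2
    return {"rms": rms, "peak": peak, "crest": crest, "kurtosis": kurt}

# Synthetic test signal: 10 full periods of a unit-amplitude sine, for which
# RMS = 1/sqrt(2), crest factor = sqrt(2), and kurtosis = 1.5.
sine = [math.sin(2 * math.pi * k / 100) for k in range(1000)]
feats = time_domain_features(sine)
```

Impulsive events in a real vibration signal push the crest factor and kurtosis well above these sine-wave baselines, which is what makes them useful process indicators.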
Cluster analysis is a multivariate data mining technique that is widely used in several areas. It aims to automatically group the n elements of a database into k clusters, using only the information in the variables of each case. However, the accuracy of the final clusters depends on the clustering method used. In this paper, we present an evaluation of the performance of the main methods for cluster analysis, such as Ward, K-means, and Self-Organizing Maps. Unlike many studies published in the area, we generated the datasets using the Design of Experiments (DOE) technique, in order to reach reliable conclusions about the methods through the generalization of the different possible data structures. We considered the number of variables and clusters, dataset size, sample size, cluster overlapping, and the presence of outliers as the DOE factors. The datasets were analyzed by each clustering method and the clustering partitions were compared by Attribute Agreement Analysis, providing invaluable information about the effects of the considered factors individually and about their interactions. The results showed that the number of clusters, overlapping, and the interaction between sample size and number of variables significantly affect all the studied methods. Moreover, the methods have similar performance at a significance level of 5%, and it is not possible to affirm that one outperforms the others.
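Comparing clustering partitions requires a measure that ignores arbitrary label names. One simple such measure, the Rand index (pairwise agreement), is sketched below in the spirit of the partition comparison described above; it is not the Attribute Agreement Analysis the paper uses, and the label vectors are illustrative.

```python
from itertools import combinations

# Sketch of the Rand index: the fraction of element pairs on which two
# partitions agree (both put the pair together, or both separate it).
# Label-permutation-invariant, so relabeled clusters compare correctly.

def rand_index(labels_a, labels_b):
    agree = total = 0
    for i, j in combinations(range(len(labels_a)), 2):
        same_a = labels_a[i] == labels_a[j]
        same_b = labels_b[i] == labels_b[j]
        agree += same_a == same_b
        total += 1
    return agree / total

ward = [0, 0, 0, 1, 1, 2]    # hypothetical partition from one method
kmeans = [1, 1, 1, 0, 0, 0]  # same grouping except the last element
score = rand_index(ward, kmeans)  # 13 of 15 pairs agree
```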
•Cost and tool life of a hardened steel turning process are properly characterized as typical Poisson random variables.
•Objective functions are simultaneously optimized together with their respective variances.
•The proposed method reduces the dimension of the multi-objective optimization problem.
•A proposed confidence ellipse for Pareto points supports decision-making based on variability and mean shift.
•A multivariate robust setup for the steel turning process is obtained according to the fuzzy decision-maker.
This paper presents a multi-objective optimization algorithm that combines the Normal Boundary Intersection method with response surface models of equimax-rotated factor scores in order to simultaneously optimize multiple sets of means and variances of manufacturing process characteristics. The algorithm uses equimax factor rotation to separate means and variances into individual, uncorrelated functions and afterwards combines them in a mean squared error function. These functions are then optimized using the Normal Boundary Intersection method, generating a Pareto frontier. The optimal solutions found are filtered according to 95% non-overlapping confidence ellipses for the predicted values of the responses, and they are subsequently assessed by a Fuzzy decision-maker index established between the volume of each confidence ellipsoid and the Mahalanobis distance between each Pareto point and its individual optimum for a given weight. To illustrate the practical implementation of this approach, two cases involving the multi-objective optimization of hardened steel turning were considered: (a) AISI 52100 hardened steel turning with CC6050 mixed ceramic inserts and (b) AISI H13 hardened steel turning with CC 670 mixed ceramic tools. For both cases, the best setup for cutting speed (V), feed rate (f), and depth of cut (d) was found to minimize process cost (Kp) and maximize tool life (T), both responses with minimal variance. The suitable results achieved in these case studies indicate that the proposal may be useful for similar manufacturing processes.
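The Mahalanobis distance used in the Fuzzy decision-maker index above measures how far a Pareto point sits from an individual optimum, scaled by the covariance of the responses. A minimal sketch, with an illustrative point, optimum, and covariance matrix rather than values from the case studies:

```python
import numpy as np

# Sketch of the Mahalanobis distance between a Pareto point and an
# individual optimum, weighted by the inverse covariance of the responses.

def mahalanobis(point, optimum, cov):
    d = np.asarray(point, float) - np.asarray(optimum, float)
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

cov = np.array([[4.0, 0.0],
                [0.0, 1.0]])  # hypothetical covariance of the two responses
dist = mahalanobis([2.0, 1.0], [0.0, 0.0], cov)
# Each axis is scaled by its own variance: sqrt(4/4 + 1/1) = sqrt(2)
```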