•We introduce the possibility distribution based hesitant fuzzy element (PDHFE) to represent cluster preferences.
•We present an interactive consensus framework for large-scale group decision making (LGDM) problems.
•A new distance measure between two PDHFEs is defined based on their possibility information.
•The clusters in each interactive round are allowed to change, so the evolution of the consensus process is captured.
•A fine-grained local feedback strategy is designed to help decision makers modify their preferences toward a higher consensus level.
In the large-scale group decision making (LGDM) consensus process, it is usually assumed that the obtained clusters do not change. However, as the individual preferences change during the decision process, this is generally not the case. The aim of this paper, therefore, is to propose an LGDM consensus model in which the clusters are allowed to change and the decision makers provide their preferences using fuzzy preference relations. The most commonly used clustering method, k-means, is employed to identify the subgroups, and a possibility distribution based hesitant fuzzy element (PDHFE) is used to represent each cluster preference. A novel distance measure over PDHFEs is given to compute the various consensus measures, after which a local feedback strategy with four identification rules and two direction rules is designed to guide the consensus reaching process. The proposed model is distinguished from previous studies in that the obtained clusters are allowed to change and the feedback mechanism acts directly on the decision makers in the identified clusters. Further, as the clusters (virtual or nominal) change in every interactive consensus round, the evolution of the consensus process can be captured. Finally, an emergency decision problem of choosing a rescue plan is used to validate the proposed method and to demonstrate its distinctive characteristics compared with existing approaches.
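The subgroup identification step above relies on plain k-means over the decision makers' preference information. As a reference point only, here is a minimal pure-Python sketch; the six preference vectors are hypothetical flattenings of fuzzy preference relations, and the paper's own PDHFE representation and distance measure are not reproduced:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: partition preference vectors into k subgroups."""
    random.seed(seed)
    centers = random.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            clusters[d.index(min(d))].append(p)
        # Recompute each nonempty center as the mean of its cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = [sum(x) / len(cl) for x in zip(*cl)]
    return clusters, centers

# Hypothetical flattened preferences of six decision makers.
prefs = [[0.7, 0.6], [0.8, 0.65], [0.2, 0.3],
         [0.25, 0.2], [0.75, 0.55], [0.3, 0.25]]
clusters, centers = kmeans(prefs, k=2)
```

With these well-separated preferences the two subgroups recover the "high" and "low" preference camps, which is the starting point for the cluster-level consensus measures.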
In this paper we consider approaches for combining, separately, possibilistic uncertainty, probabilistic uncertainty, and situations in which both forms of uncertainty appear. An approach to probability aggregation using rational consensus with equal weighting is developed. This aggregation is analyzed with information measures as one way to assess combinations and understand the impact on uncertainty. The analysis is based on combinations of bounding cases of probability distributions. Measures of conflict and of the effect on information are developed. Next, possibility transformations are used and illustrated with three representative possibility cases. The resulting transformed probabilities are aggregated with general probability distributions, and the result is evaluated with information measures as before. Finally, a general approach to combining possibility distributions directly using quality criteria is described. An example is provided to illustrate the basic possibility distribution aggregation developed.
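Equi-weighted rational-consensus aggregation of discrete probability distributions amounts to a linear opinion pool, and Shannon entropy is one standard information measure for assessing the result. A small illustrative sketch with two bounding-case (maximally conflicting) distributions; all values are hypothetical:

```python
import math

def aggregate(dists):
    """Equi-weighted (linear opinion pool) aggregation of discrete distributions."""
    n = len(dists)
    return [sum(d[i] for d in dists) / n for i in range(len(dists[0]))]

def entropy(p):
    """Shannon entropy in bits, used here as an information measure."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Two maximally conflicting certainties over three outcomes (bounding case).
p1 = [1.0, 0.0, 0.0]
p2 = [0.0, 0.0, 1.0]
agg = aggregate([p1, p2])   # [0.5, 0.0, 0.5]
# Each input has zero entropy; the pooled result has entropy 1 bit, so
# aggregating conflicting certainties manufactures uncertainty, which is
# exactly what conflict/information measures are meant to expose.
```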
Hesitant fuzzy linguistic term sets (HFLTSs) with additional possibility distributions can represent a much broader range of linguistic data. This paper develops compromise solutions for multiple attribute group decision making (MAGDM) using HFLTSs. Geodesic distance and possibility distribution based distance measures are introduced to calculate the consensus degrees and to determine the participant importance weights for the aggregation. Two models are then proposed to derive a compromise solution for MAGDM problems. The first model is based on the VIKOR method and is used to determine the compromise between group utility maximization and individual regret minimization. The second model is based on the TOPSIS method and seeks to identify the compromise between the distances from the ideal and anti-ideal solutions. For both models, several ideal solutions and separation measures are presented, and some of the properties of the two models are proven. An example based on an assessment of a health-care waste disposal management system illustrates the feasibility and practicability of the two proposed models. A further comparative study highlights the distinct features and potential uses of the presented models.
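The TOPSIS-based model described above rests on the standard closeness coefficient: the ratio of an alternative's distance from the anti-ideal solution to the sum of its distances from both solutions. A generic weighted-Euclidean sketch (not the paper's HFLTS-specific separation measures; the ratings and weights are hypothetical):

```python
def topsis_closeness(matrix, weights):
    """Relative closeness of each alternative to the ideal solution (TOPSIS).
    matrix[i][j] is the normalized benefit rating of alternative i on attribute j."""
    cols = list(zip(*matrix))
    ideal = [max(c) for c in cols]   # positive-ideal solution
    anti = [min(c) for c in cols]    # negative-ideal (anti-ideal) solution

    def dist(row, ref):
        return sum(w * (a - b) ** 2 for w, a, b in zip(weights, row, ref)) ** 0.5

    out = []
    for row in matrix:
        dp, dm = dist(row, ideal), dist(row, anti)
        out.append(dm / (dp + dm))   # closer to 1 means closer to the ideal
    return out

# Hypothetical normalized ratings of three alternatives on two attributes.
ratings = [[0.9, 0.8], [0.4, 0.6], [0.7, 0.9]]
cc = topsis_closeness(ratings, weights=[0.5, 0.5])
best = cc.index(max(cc))   # index of the compromise alternative
```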
The possibility-based design optimization (PBDO) model under fuzzy uncertainty can provide optimal design parameters by trading off performance against safety. In order to efficiently solve PBDO models with implicit failure possibility constraints in engineering, a sequential optimization method based on an adaptive Kriging model (AK-SOM) is proposed in this paper. AK-SOM first constructs a Kriging model of the performance function corresponding to each failure possibility constraint. The optimized design parameters of the PBDO based on the current Kriging models (referred to as the current K-PBDO) can then be obtained without any additional model evaluations. Next, the effective possibility constraints in the current K-PBDO are identified by the effective constraint criterion proposed in this paper, and only the Kriging models of the effective constraints are updated. The sequential optimization process continues until the relative error stopping criterion and the U-function stopping criterion are both satisfied. The effectiveness and superiority of AK-SOM are verified with five examples; the results indicate that the proposed method imposes no restrictions on the form of the performance function and can substantially improve the efficiency of solving the PBDO model while preserving accuracy. Additionally, the proposed method is applied to the PBDO of a GH4169 aero-engine turbine disk, where the optimal design both reduces the failure possibility of the disk and greatly reduces the maximum stress in its critical region.
This paper marks the 50th anniversary of the publication of my first paper on fuzzy sets, “Fuzzy sets,” Information and Control, 1965. What is of historical interest is that initially, and for some time thereafter, my paper was an object of indifference, skepticism, and derision. A prominent school of thought claimed that fuzzy set theory is probability theory in disguise. Positive comments were few and far between. In contrast, my ideas were welcomed with open arms in Japan. In the 1970s and 1980s, fuzzy set theory and fuzzy logic began to gain acceptance in Europe and, more particularly, in Eastern Europe and the Soviet Union. In part, many negative reactions to my papers reflected the fact that the word “fuzzy” has pejorative connotations. In large measure, science is based on classical, Aristotelian, bivalent logic. Binarization, that is, drawing a sharply defined boundary between two classes, is a deeply entrenched Cartesian tradition. What is not widely recognized is that this tradition has outlived its usefulness. One of the principal contributions of fuzzy logic is providing a basis for a progression from binarization to graduation, from binarism to pluralism, from black and white to shades of gray. Graduation involves the association of a class with unsharp (fuzzy) boundaries with degrees/grades of membership. Classes with unsharp boundaries are pervasive in human cognition. Most words in natural language are labels of such classes. This paper is a concise exposition of what I consider to be my principal contributions to the development of fuzzy set theory and fuzzy logic.
Among the contributions discussed are: the introduction of the concept of a fuzzy set, FL-generalization, the concept of a linguistic variable, information granulation, precisiation of meaning, the generalized theory of uncertainty (GTU), the concept of a restriction, the restriction-centered theory of truth and meaning, the information principle, and similarity-based definitions of possibility and probability.
This paper proposes an axiomatic framework from which we develop the theory of type-2 (T2) fuzziness, called fuzzy possibility theory. First, we introduce the concept of a fuzzy possibility measure in a fuzzy possibility space (FPS). The fuzzy possibility measure takes on regular fuzzy variable (RFV) values, so it generalizes the scalar possibility measure in the literature. One of the interesting consequences of the FPS is that it leads to a new definition of a T2 fuzzy set on the Euclidean space, which we call a T2 fuzzy vector, as a map to the space instead of on the space. More precisely, we define a T2 fuzzy vector as a measurable map from an FPS to the space of real vectors. In the current development, we suggest that the T2 fuzzy vector is a more appropriate definition for a T2 fuzzy set on the Euclidean space. In the literature, a T2 fuzzy set is usually defined via its T2 membership function, whereas in this paper we obtain the T2 possibility distribution function as the transformation of a fuzzy possibility measure from a universe to the space of real vectors via the T2 fuzzy vector. Second, we develop the product fuzzy possibility theory. In this part, we give a general extension theorem for the product fuzzy possibility measure from a class of measurable atom-rectangles to a product ample field, and we discuss the relationship between a T2 fuzzy vector and T2 fuzzy variables. We also prove two useful theorems about the existence of an FPS and a T2 fuzzy vector based on the information from a finite number of RFV-valued maps. These two results provide possible interpretations for the concepts of the FPS and the T2 fuzzy vector, and thus reinforce the credibility of the approach developed in this paper. Finally, we deal with the arithmetic of T2 fuzzy variables in fuzzy possibility theory. We divide the discussion into two cases according to whether the T2 fuzzy variables are defined on a single FPS or on different FPSs, and we obtain two theorems about T2 fuzzy arithmetic.
Model checking of possibilistic linear-time properties was investigated by Li (2017). However, nondeterminism of the system is absent in previous studies. Therefore, in order to permit both possibilistic and nondeterministic choices, we use the generalized possibilistic decision process (GPDP) as a model of the system. First, the definition of a GPDP describing the behavior of a nondeterministic system is given in detail, the resolution of nondeterminism is performed using the notion of schedulers, and the semantics of generalized possibilistic linear-temporal logic (GPoLTL) with schedulers is defined. Second, we study possibilistic model checking of some fuzzy linear-time properties over GPDPs. Since there may be many (even infinitely many) schedulers satisfying a given linear-time property in a given state of a GPDP, it is particularly important to study the optimal scheduler and its corresponding possibility measure, which is called extremal possibility model checking. For some special fuzzy linear-time properties, such as constrained reachability, step-bounded constrained reachability, reachability, always reachability, repeated reachability, and persistence, we present a complete solution to the optimal (both maximum and minimum) possibilistic model checking problem using fixpoint techniques. We also introduce fuzzy ω-regular properties in GPDPs and show that their model checking can be reduced to repeated reachability. Algorithms for model checking are also provided. Additionally, an example is presented to illustrate the methods described in the paper.
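For intuition about the fixpoint techniques mentioned above, the simplest, scheduler-free case can be sketched directly: the maximal possibility of eventually reaching a target set satisfies x(s) = max_t min(trans(s, t), x(t)), with x = 1 on targets, and is obtained by iterating from below. The full GPDP semantics with schedulers is more involved; the 3-state transition possibilities here are hypothetical:

```python
def max_reach_possibility(trans, targets, tol=1e-9):
    """Maximal possibility of eventually reaching a target state, computed as
    the least fixpoint of x(s) = max_t min(trans[s][t], x(t)), x = 1 on targets."""
    n = len(trans)
    x = [1.0 if s in targets else 0.0 for s in range(n)]
    while True:
        nxt = [1.0 if s in targets else
               max(min(trans[s][t], x[t]) for t in range(n)) for s in range(n)]
        if all(abs(a - b) <= tol for a, b in zip(x, nxt)):
            return nxt
        x = nxt

# Hypothetical fuzzy transition possibilities; state 2 is the target.
trans = [[0.0, 0.9, 0.3],
         [0.0, 0.0, 0.6],
         [0.0, 0.0, 1.0]]
poss = max_reach_possibility(trans, targets={2})
# From state 0, the best path 0 -> 1 -> 2 has possibility min(0.9, 0.6) = 0.6,
# beating the direct step of possibility 0.3, so poss[0] = 0.6.
```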
This paper deals with a new transformation, the so-called two-sided normalized (TSN) transformation, of continuous unimodal asymmetric probability distributions into possibility distributions. Many properties are derived and their interpretations are discussed. A comparison with the optimal transformation is provided. In particular, the respective positions of the right and left branches relative to the resulting optimal and TSN possibility distributions are given. It is also shown that the TSN transformation coincides with the optimal transformation for the particular family of two-piece skewed distributions. The preservation of the asymmetry property is then analyzed and illustrated for several conventional asymmetric distributions, and counter-examples of asymmetry preservation are provided. A multilinear approximation of the TSN transformation is finally proposed.
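As background for the comparison with the optimal transformation: in the discrete case, the Dubois–Prade optimal probability-to-possibility transformation assigns each outcome the total probability of all outcomes no more probable than it, yielding the tightest possibility distribution that dominates the probability distribution while preserving its ordering. A minimal sketch (ties are broken arbitrarily here, whereas the exact definition treats equal probabilities jointly):

```python
def optimal_transform(p):
    """Discrete Dubois-Prade optimal probability-to-possibility transformation:
    if p is sorted so that p[0] >= p[1] >= ..., then pi[i] = sum(p[i:])."""
    order = sorted(range(len(p)), key=lambda i: -p[i])
    pi = [0.0] * len(p)
    tail = sum(p)
    for i in order:      # visit outcomes from most to least probable
        pi[i] = tail     # possibility = remaining (no-more-probable) mass
        tail -= p[i]
    return pi

p = [0.5, 0.3, 0.2]
pi = optimal_transform(p)   # [1.0, 0.5, 0.2]
```

Note that the most probable outcome always receives possibility 1 (normalization) and the ordering of p is preserved, which are exactly the consistency and preference-preservation requirements behind the continuous optimal transformation discussed in the abstract.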
The latter half of the twentieth century witnessed an ‘intensional revolution’, a great collective effort to analyse notions which are absolutely fundamental to our understanding of the world and of ourselves—from meaning and information to knowledge, belief, causation, essence, supervenience, conditionality, as well as nomological, metaphysical, and logical necessity—in terms of a single concept. This was the concept of a possible world: a way things could have been. Possible worlds found applications in logic, metaphysics, semantics, game theory, information theory, artificial intelligence, and the philosophy of mind and cognition. However, possible worlds analyses have been facing numerous problems. This book traces them all back to hyperintensionality: the need for distinctions more fine-grained than the possible worlds apparatus can easily represent. It then introduces impossible worlds—ways things could not have been—as a general tool for modelling hyperintensional phenomena. The book discusses the metaphysics of impossible worlds and applies them to a range of central topics and open issues in logic, semantics, and philosophy: from the problem of logical omniscience in epistemic logic, to the semantics of non-classical logics, the modelling of imagination and mental simulation, the analysis of information and informative inference, truth in fiction, and counterpossible reasoning.