As a generalization of fuzzy sets, intuitionistic fuzzy sets (IFSs) have a more powerful ability to represent and handle uncertain information, and they have therefore been used in many areas. However, the distance measure between IFSs, which indicates their difference or discrepancy grade, is still an open question that has attracted considerable attention over the past few decades. Although various measurement methods have been developed, some still fail to satisfy the axioms of a distance measure, or lack discernment and produce counterintuitive cases. To address these issues, in this article we propose a new distance measure between IFSs based on the Jensen-Shannon divergence. The new measure not only satisfies the axiomatic definition of a distance measure but also has nonlinear characteristics. As a result, it better discriminates the discrepancies between IFSs and generates more reasonable results than other existing measures; these advantages are illustrated by several numerical examples. Based on these qualities, an algorithm for pattern classification is designed that provides a promising solution for inference problems.
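As a minimal sketch of the construction described above (a hedged reading, not necessarily the paper's exact formula), each IFS element (μ, ν) can be expanded to the triple (μ, ν, π) with hesitancy degree π = 1 − μ − ν; the triple sums to 1 and can be treated as a probability-like vector, so the distance can be taken as the square root of the averaged Jensen-Shannon divergence. The function names are illustrative:

```python
import math

def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2) between two discrete distributions."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    def kl(a, b):
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def ifs_distance(A, B):
    """Sketch of a JS-based distance between two IFSs.

    A and B are lists of (mu, nu) pairs; each element is expanded to
    (mu, nu, pi) with pi = 1 - mu - nu and treated as a probability-like
    vector. The per-element divergences are averaged, then square-rooted.
    """
    total = 0.0
    for (mu_a, nu_a), (mu_b, nu_b) in zip(A, B):
        pa = (mu_a, nu_a, 1 - mu_a - nu_a)
        pb = (mu_b, nu_b, 1 - mu_b - nu_b)
        total += js_divergence(pa, pb)
    return math.sqrt(total / len(A))
```

For identical IFSs the distance is 0, and for the maximally different elements (1, 0) and (0, 1) it reaches 1, consistent with the boundedness required of a distance measure.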
In this paper, we propose a novel image representation method that characterizes an image as a spatiogram (a generalized histogram) of colors quantized by Gaussian mixture models (GMMs). First, we quantize the color space using a GMM, which is learned by the Expectation-Maximization (EM) algorithm from the training images. The number of Gaussian components (i.e., the number of quantized color bins) is determined automatically according to the Bayesian Information Criterion (BIC). Second, we combine the spatiogram representation with the quantized Gaussian mixture color model. Intuitively, a spatiogram is a histogram in which the distribution of colors is spatially weighted by the locations of the pixels contributing to each color bin. We have modified the spatiogram representation to fit our framework, which employs Gaussian color components instead of discrete color bins. Finally, the comparison between two images is achieved by measuring the similarity between two spatiograms, for which we propose a new measure based on the Jensen–Shannon divergence (JSD). We applied the new image representation and comparison method to the image retrieval task. Experiments on several publicly available COREL image datasets demonstrate the effectiveness of the proposed image representation for image retrieval.
The Pythagorean fuzzy set (PFS), an extension of the intuitionistic fuzzy set, is more capable of expressing and handling uncertainty in uncertain environments and has therefore been broadly applied in a variety of fields. However, how to measure the distance between PFSs appropriately is still an open issue. It is well known that the square root of the Jensen–Shannon divergence is a true metric on the space of probability distributions, which makes it a useful measure of distance. On this basis, a novel divergence measure between PFSs, called the PFSJS distance, is proposed in this paper by taking advantage of the Jensen–Shannon divergence. This is the first work to consider the divergence of PFSs for measuring the discrepancy of data from the perspective of relative entropy. The new PFSJS distance measure has desirable merits: it satisfies the axioms of a distance measure and better indicates the discrimination degree of PFSs. Numerical examples demonstrate that the PFSJS distance avoids generating counterintuitive results and is more feasible and reasonable than existing distance measures. Additionally, a new algorithm based on the PFSJS distance measure is designed to solve medical diagnosis problems. By comparing different methods in the medical diagnosis application, it is found that the new algorithm is as efficient as the other methods. These results show that the proposed method is practical for dealing with medical diagnosis problems.
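The Pythagorean case differs from the intuitionistic one in that the squared degrees satisfy μ² + ν² + π² = 1, so it is the squared memberships that form a probability-like vector. A minimal sketch under that reading (illustrative, not the paper's verbatim definition):

```python
import math

def _kl(p, q):
    """Kullback-Leibler divergence (base 2) between discrete distributions."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def pfsjs_distance(A, B):
    """Sketch of a JS-based distance between two single-element PFSs.

    A and B are (mu, nu) pairs with mu**2 + nu**2 <= 1. The squared
    degrees (mu^2, nu^2, pi^2) sum to 1 and are treated as a
    probability-like vector; the distance is sqrt of the JS divergence.
    """
    (mu_a, nu_a), (mu_b, nu_b) = A, B
    pa = (mu_a**2, nu_a**2, 1 - mu_a**2 - nu_a**2)
    pb = (mu_b**2, nu_b**2, 1 - mu_b**2 - nu_b**2)
    m = [(x + y) / 2 for x, y in zip(pa, pb)]
    return math.sqrt(0.5 * _kl(pa, m) + 0.5 * _kl(pb, m))
```

Taking the square root matters here: it is the square root of the Jensen–Shannon divergence, not the divergence itself, that is known to satisfy the triangle inequality.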
The Jensen-Shannon divergence is a renowned bounded symmetrization of the Kullback-Leibler divergence which does not require the probability densities to have matching supports. In this paper, we introduce a vector-skew generalization of the scalar α-Jensen-Bregman divergences and derive from it the vector-skew α-Jensen-Shannon divergences. We prove that the vector-skew α-Jensen-Shannon divergences are f-divergences and study the properties of these novel divergences. Finally, we report an iterative algorithm to numerically compute the Jensen-Shannon-type centroids for a set of probability densities belonging to a mixture family; this includes the case of the Jensen-Shannon centroid of a set of categorical distributions or normalized histograms.
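A sketch of the vector-skew construction as I read it (hedged; the paper's exact weighting conventions may differ): given a skew vector α and weights w summing to 1, each skewed mixture (1 − αᵢ)p + αᵢq is compared by KL divergence to the mixture skewed by the weighted average skew ᾱ = Σ wᵢαᵢ:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence (nats) between discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mix(p, q, a):
    """Skewed mixture (1 - a) * p + a * q."""
    return [(1 - a) * pi + a * qi for pi, qi in zip(p, q)]

def vector_skew_js(p, q, alphas, weights):
    """Sketch of a vector-skew Jensen-Shannon divergence: a weighted sum
    of KL divergences from the skewed mixtures to the mean-skew mixture."""
    abar = sum(w * a for w, a in zip(weights, alphas))
    m = mix(p, q, abar)
    return sum(w * kl(mix(p, q, a), m) for w, a in zip(weights, alphas))
```

With α = (0, 1) and w = (1/2, 1/2) this recovers the ordinary Jensen-Shannon divergence, ½KL(p : m) + ½KL(q : m) with m = (p + q)/2, which is a useful sanity check on the construction.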
•A new Belief Jensen–Shannon (BJS) divergence is devised in this paper.
•BJS is proposed to measure the discrepancy and conflict degree between evidences.
•A new method is proposed for multi-sensor data fusion.
•The method is based on the belief divergence measure of evidences and belief entropy.
•The method outperforms related works in effectiveness.
Multi-sensor data fusion technology plays an important role in real applications. Because of its flexibility and effectiveness in modeling and processing uncertain information regardless of prior probabilities, Dempster–Shafer evidence theory is widely applied in a variety of information fusion fields. However, counterintuitive results may arise when fusing highly conflicting evidences. To deal with this problem, a novel method for multi-sensor data fusion based on a new belief divergence measure of evidences and the belief entropy is proposed. First, a new Belief Jensen–Shannon divergence is devised to measure the discrepancy and conflict degree between the evidences; the credibility degree can then be obtained to represent the reliability of the evidences. Next, considering the uncertainties of the evidences, the information volume of each evidence is measured using the belief entropy to indicate its relative importance. Afterwards, the credibility degree of each evidence is modified by the quantified information volume, which is utilized to obtain an appropriate weight for each evidence. Ultimately, the final weights are applied to adjust the bodies of evidence before applying Dempster's combination rule. A numerical example illustrates that the proposed method is feasible and effective in handling conflicting evidences, where the belief value of the target increases to 99.05%. Furthermore, an application in fault diagnosis demonstrates the validity of the proposed method. The results show that the proposed method outperforms other related methods, with a basic belief assignment (BBA) of 89.73% for the true target.
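The core of the divergence step can be sketched as a Jensen-Shannon divergence applied to basic belief assignments over the union of their focal elements (a minimal illustration of the idea; the surrounding credibility-weighting and entropy steps of the full method are omitted):

```python
import math

def bjs(m1, m2):
    """Sketch of a Belief Jensen-Shannon divergence between two BBAs.

    m1 and m2 are dicts mapping focal elements (e.g. strings or
    frozensets) to mass values summing to 1. The mass functions are
    treated as discrete distributions over the union of focal elements.
    """
    keys = set(m1) | set(m2)
    def half_kl(a, b):
        # sum over focal elements of a(A) * log2( 2*a(A) / (a(A) + b(A)) )
        return sum(a.get(k, 0.0) *
                   math.log2(2 * a.get(k, 0.0) / (a.get(k, 0.0) + b.get(k, 0.0)))
                   for k in keys if a.get(k, 0.0) > 0)
    return 0.5 * (half_kl(m1, m2) + half_kl(m2, m1))
```

The measure is 0 for identical BBAs and reaches its maximum of 1 (base 2) for BBAs with disjoint focal elements, which is what makes it suitable as a conflict indicator between evidences.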
The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler divergence which measures the total Kullback-Leibler divergence to the average mixture distribution. However, the Jensen-Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen-Shannon (JS) divergence using abstract means which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any distance using parameter mixtures derived from abstract means. In particular, we first show that the geometric mean is well suited for exponential families, and report two closed-form formulas for (i) the geometric Jensen-Shannon divergence between probability densities of the same exponential family and (ii) the geometric JS-symmetrization of the reverse Kullback-Leibler divergence between probability densities of the same exponential family. As a second illustrating example, we show that the harmonic mean is well suited for the scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen-Shannon divergence between scale Cauchy distributions. Applications to clustering with respect to these novel Jensen-Shannon divergences are touched upon.
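A concrete instance of the closed-form claim, sketched for univariate Gaussians (a hedged illustration under the standard facts that the normalized geometric mean of two Gaussians is itself Gaussian, with precision equal to the average of the precisions, and that the Gaussian KL divergence is known in closed form):

```python
import math

def kl_gauss(mu0, var0, mu1, var1):
    """Closed-form KL divergence between univariate Gaussians
    N(mu0, var0) and N(mu1, var1)."""
    return (0.5 * math.log(var1 / var0)
            + (var0 + (mu0 - mu1) ** 2) / (2 * var1) - 0.5)

def geometric_js_gauss(mu0, var0, mu1, var1):
    """Sketch of the geometric Jensen-Shannon divergence between two
    univariate Gaussians, using the normalized geometric mean (weight 1/2)
    in place of the arithmetic mixture."""
    # The geometric mean of two Gaussians is Gaussian: its precision is
    # the average precision, and its mean is the precision-weighted mean.
    tau = 0.5 / var0 + 0.5 / var1
    var_g = 1.0 / tau
    mu_g = var_g * (0.5 * mu0 / var0 + 0.5 * mu1 / var1)
    return (0.5 * kl_gauss(mu0, var0, mu_g, var_g)
            + 0.5 * kl_gauss(mu1, var1, mu_g, var_g))
```

Unlike the ordinary JS divergence between Gaussians, every term here is a Gaussian-to-Gaussian KL divergence, so the whole expression stays in closed form.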
In this paper, we use the quantum Jensen–Shannon divergence as a means of measuring the information-theoretic dissimilarity of graphs and thus develop a novel graph kernel. In quantum mechanics, the quantum Jensen–Shannon divergence can be used to measure the dissimilarity of quantum systems specified in terms of their density matrices. We commence by computing the density matrix associated with a continuous-time quantum walk over each graph being compared. In particular, we adopt the closed-form solution of the density matrix introduced in Rossi et al. (2013) to reduce the computational complexity and to avoid the cumbersome task of simulating the quantum walk evolution explicitly. Next, we compare the mixed states represented by the density matrices using the quantum Jensen–Shannon divergence. With the quantum states for a pair of graphs described by their density matrices to hand, the quantum graph kernel between the pair of graphs is defined using the quantum Jensen–Shannon divergence between the graph density matrices. We evaluate the performance of our kernel on several standard graph datasets from both bioinformatics and computer vision. The experimental results demonstrate the effectiveness of the proposed quantum graph kernel.
•We compute a density matrix for a graph using the continuous-time quantum walk.
•We compute the quantum Jensen–Shannon divergence between graph density matrices.
•We define a quantum Jensen–Shannon graph kernel using the quantum divergence.
•We evaluate the performance of our quantum kernel on standard graph datasets.
•We demonstrate the effectiveness of the proposed quantum kernel.
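The divergence at the heart of the kernel can be sketched directly from the standard definition, QJSD(ρ, σ) = S((ρ + σ)/2) − ½S(ρ) − ½S(σ), where S is the von Neumann entropy. The walk-derived density matrices and the paper's exact kernel construction are omitted; `qjsd_kernel` below is one plausible similarity, not necessarily the authors' choice:

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -tr(rho log2 rho), via eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # by convention 0 * log 0 = 0
    return float(-np.sum(evals * np.log2(evals)))

def qjsd(rho, sigma):
    """Quantum Jensen-Shannon divergence between two density matrices."""
    return (von_neumann_entropy((rho + sigma) / 2)
            - 0.5 * von_neumann_entropy(rho)
            - 0.5 * von_neumann_entropy(sigma))

def qjsd_kernel(rho, sigma):
    """A simple bounded similarity derived from QJSD (illustrative choice)."""
    return 1.0 - qjsd(rho, sigma)
```

Because QJSD is bounded in [0, 1] (base 2), the resulting similarity is 1 for identical states and 0 for orthogonal ones.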
In this short note, we prove that the square root of the quantum Jensen-Shannon divergence is a true metric on the cone of positive matrices, and hence in particular on the quantum state space.
Stochastic neighbour embedding (SNE) and its variants are methods of nonlinear dimensionality reduction that use soft Gaussian neighbourhoods to measure similarities between all pairs of data points. In order to build a suitable embedding, these methods try to reproduce in a low-dimensional space the neighbourhoods that are observed in the high-dimensional data space. Previous works have investigated the immunity of such similarities to norm concentration, as well as enhanced cost functions, such as sums of Jensen–Shannon divergences. This paper proposes an additional refinement, namely multi-scale similarities, which are averages of soft Gaussian neighbourhoods with exponentially growing bandwidths. Such multi-scale similarities can replace the regular, single-scale neighbourhoods in SNE-like methods. Their objective is then to maximise the embedding quality on all scales, with the best preservation of both local and global neighbourhoods, and also to exempt the user from having to fix a scale arbitrarily. Experiments with several data sets show that the proposed multi-scale approach better captures the structure of the data and significantly improves the quality of the dimensionality reduction.
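The multi-scale idea reduces to a simple average, sketched here for a single pair of points (a toy illustration: the per-point normalization over neighbours used in actual SNE-like methods is omitted, and the doubling schedule and parameter names are assumptions):

```python
import math

def gaussian_affinity(d2, bandwidth):
    """Unnormalized soft Gaussian neighbourhood weight for squared distance d2."""
    return math.exp(-d2 / (2 * bandwidth ** 2))

def multiscale_similarity(d2, base_bw=1.0, n_scales=4):
    """Sketch of a multi-scale similarity: the average of soft Gaussian
    neighbourhoods whose bandwidths grow exponentially (powers of 2 here)."""
    return sum(gaussian_affinity(d2, base_bw * 2 ** s)
               for s in range(n_scales)) / n_scales
```

Averaging over exponentially spaced bandwidths means no single scale has to be chosen: small bandwidths dominate the contrast between close neighbours, while large bandwidths keep distant points from being assigned negligible similarity.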
In decision-making systems, how to address uncertainty plays an important role in improving system performance in uncertainty reasoning. Dempster-Shafer evidence (DSE) theory is an effective method for addressing uncertainty in decision-making problems by means of basic belief assignments (BBAs) and Dempster's combination rule. In DSE theory, the divergence measure between BBAs, which is beneficial for conflict information management in decision making, remains an open issue. In this paper, several generalized evidential divergences (EDs) are proposed and studied to measure the difference and discrepancy between BBAs in DSE theory, which have more universal applicability in decision theory. On this basis, a uniform BJS divergence-based decision-making algorithm is devised to improve the decision level. Furthermore, extensions of the weighted BJS divergence to decision-making algorithms are discussed by considering not only subjective weights but also objective weights. Notably, this is the first work to propose the weighted BJS divergence in DSE theory, providing a promising way to analyze decision-making problems from different perspectives. Besides, experiments demonstrate the effectiveness and superiority of the proposed methods.
Finally, the proposed BJS-based decision-making algorithm is applied to pattern classification. The results validate that the proposed algorithm is beneficial on diverse real-world datasets, outperforms several well-known related works, and demonstrates higher classification accuracy as well as robustness.