Peer reviewed Open access
  • Homogenizing Entropy Across...
    Peck, Joel R.; Waxman, David

    IEEE transactions on information theory, 03/2023, Volume: 69, Issue: 3
    Journal Article

In classical information theory, a causal relationship between two variables is typically modelled by assuming that, for every possible state of one of the variables, there exists a particular distribution of states of the second variable. Let us call these two variables the causal and caused variables, respectively. We shall assume that both variables are continuous and one-dimensional. In this work we consider a procedure to transform each variable, using transformations that are differentiable and strictly increasing; we call these increasing transformations. Any causal relationship (as defined here) is associated with a channel capacity, which is the maximum rate at which information could be sent if the causal relationship were used as a signalling system. Channel capacity is unaffected when the two variables are changed by use of increasing transformations. For any causal relationship we show that there is always a way to transform the caused variable such that the entropy associated with the caused variable is independent of the value of the causal variable. Furthermore, the resulting universal entropy has an absolute value that is equal to the channel capacity associated with the causal relationship. This observation may be useful in statistical applications. Also, for any causal relationship, it implies that there is a 'natural' way to transform a continuous caused variable. We also show that, with additional constraints on the causal relationship, a natural increasing transformation of both variables leads to a transformed causal relationship that has properties that might be expected from a well-engineered measuring device.
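The abstract's central idea, that an increasing transformation of the caused variable can make its conditional entropy the same for every value of the causal variable, can be illustrated with a simple hypothetical case that is not the paper's own construction: multiplicative noise, Y = x·Z with Z log-normal. Here h(Y | X = x) = h(Z) + log x varies with x, but the strictly increasing map T(y) = log y turns the relationship into additive noise, log Y = log x + log Z, whose conditional entropy is constant in x. A minimal Monte Carlo sketch, assuming SciPy's `differential_entropy` estimator:

```python
import numpy as np
from scipy.stats import differential_entropy

# Hypothetical example (not the paper's construction): multiplicative noise
# Y = x * Z with Z = exp(N(0, 1)), so that h(Y | X = x) = h(Z) + log x
# depends on x, while the increasing transformation T(y) = log y yields
# T(Y) = log x + log Z, with conditional entropy h(N(0, 1)) for every x.

rng = np.random.default_rng(0)

def conditional_entropies(xs, transform, n=200_000):
    """Monte Carlo estimates of h(transform(Y) | X = x) for each x in xs."""
    ents = []
    for x in xs:
        z = np.exp(rng.standard_normal(n))   # log-normal noise Z
        y = x * z                            # caused variable given X = x
        ents.append(differential_entropy(transform(y)))
    return ents

xs = [1.0, 5.0, 20.0]
raw = conditional_entropies(xs, lambda y: y)   # grows roughly like log x
homog = conditional_entropies(xs, np.log)      # roughly equal across all x
print(raw, homog)
```

In this toy case the homogenizing transformation is obvious from the functional form; the paper's claim is that such an increasing transformation exists for any causal relationship of the kind defined in the abstract.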