A challenge for physiologists and neuroscientists is to map information transfer between components of the systems that they study at different scales, in order to derive important knowledge on structure and function from the analysis of the recorded dynamics. The components of physiological networks often interact in a nonlinear way and through mechanisms which are, in general, not completely known. It is therefore desirable that the method of choice for analyzing these interactions not rely on any model or assumption about the nature of the data and their interactions. Transfer entropy has emerged as a powerful tool to quantify directed dynamical interactions. In this paper we compare different approaches to evaluating transfer entropy, some of them already proposed and some novel, and present their implementation in a freeware MATLAB toolbox. Applications to simulated and real data are presented.
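As a minimal sketch of the simplest of the approaches compared here, the following plug-in (binning) transfer entropy estimator uses uniform quantization and a lag-1 embedding; the function names and parameter choices are illustrative and do not reproduce the toolbox interface:

```python
import numpy as np

def quantize(x, n_bins=4):
    # uniform quantization of a series into n_bins symbols
    edges = np.linspace(x.min(), x.max(), n_bins + 1)[1:-1]
    return np.digitize(x, edges)

def transfer_entropy(source, target, n_bins=4):
    """Plug-in estimate (in nats) of TE(source -> target) with lag-1 embedding:
    TE = I(target_future ; source_past | target_past)."""
    s = quantize(np.asarray(source, float), n_bins)
    t = quantize(np.asarray(target, float), n_bins)
    xf, xp, yp = t[1:], t[:-1], s[:-1]  # target future, target past, source past

    def H(*cols):
        # joint Shannon entropy of the symbol columns
        _, counts = np.unique(np.stack(cols), axis=1, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))

    # conditional mutual information written as a sum of four joint entropies
    return H(xf, xp) - H(xp) - H(xf, xp, yp) + H(xp, yp)
```

The estimate is biased upward for short series, which is one of the reasons several competing estimators are compared in the paper.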
We present an approach, framed in information theory, to assess nonlinear causality between the subsystems of a whole stochastic or deterministic dynamical system. The approach follows a sequential procedure for nonuniform embedding of multivariate time series, whereby embedding vectors are built progressively on the basis of a minimization criterion applied to the entropy of the present state of the system conditioned on its past states. A corrected conditional entropy estimator, compensating for the biasing effect of single points in the quantized hyperspace, is used to guarantee the existence of a minimum of the entropy rate at which the procedure is terminated. Causal coupling is detected according to the Granger notion of predictability improvement, and is quantified in terms of information transfer. We apply the approach to simulations of deterministic and stochastic systems, showing its superiority over standard uniform embedding. The effects of quantization, data length, and noise contamination are investigated. As practical applications, we consider the assessment of cardiovascular regulatory mechanisms from the analysis of heart period, arterial pressure, and respiration time series, and the investigation of the information flow across brain areas from multichannel scalp electroencephalographic recordings.
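The greedy candidate-selection loop at the core of nonuniform embedding can be sketched as follows. This toy version uses the plain plug-in conditional entropy and a fixed number of terms; it deliberately omits the paper's corrected estimator, which is precisely what provides a data-driven stopping criterion (the plug-in estimate never increases when terms are added). All names are illustrative:

```python
import numpy as np

def quantize(x, n_bins=3):
    edges = np.linspace(x.min(), x.max(), n_bins + 1)[1:-1]
    return np.digitize(x, edges)

def joint_entropy(cols):
    _, counts = np.unique(np.stack(cols), axis=1, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def cond_entropy(y, conditioning):
    # H(y | conditioning) = H(y, conditioning) - H(conditioning)
    if not conditioning:
        return joint_entropy([y])
    return joint_entropy([y] + conditioning) - joint_entropy(conditioning)

def nonuniform_embedding(y, candidates, max_terms=3):
    """Greedy sketch: at each step, add the candidate past variable that
    minimizes the conditional entropy of the present sample y."""
    remaining = list(candidates.items())   # (name, lagged series) pairs
    selected = []
    while remaining and len(selected) < max_terms:
        scores = [cond_entropy(y, [v for _, v in selected] + [v])
                  for _, v in remaining]
        selected.append(remaining.pop(int(np.argmin(scores))))
    return [name for name, _ in selected]
```

On a process where the present depends on its own lag 1 and on a source at lag 2, the procedure should pick those two terms first, skipping uninformative lags that a uniform embedding would be forced to include.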
Exploiting the theory of state space models, we derive the exact expressions of the information transfer, as well as of the redundant and synergistic transfer, for coupled Gaussian processes observed at multiple temporal scales. All of the terms constituting the frameworks known as interaction information decomposition and partial information decomposition can thus be obtained analytically, for different time scales, from the parameters of the vector autoregressive (VAR) model that fits the processes. We first apply the proposed methodology to benchmark Gaussian systems, showing that this class of systems may generate patterns of information decomposition characterized by predominantly redundant or synergistic information transfer persisting across multiple time scales, or even by an alternating prevalence of redundant and synergistic source interaction depending on the time scale. Then, we apply our method to an important topic in neuroscience, i.e., the detection of causal interactions in human epilepsy networks, for which we show the relevance of partial information decomposition to the detection of multiscale information transfer spreading from the seizure onset zone.
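To illustrate the idea of obtaining information measures analytically from VAR parameters (in a single-scale, bivariate setting; the paper's multiscale state-space machinery and the redundancy/synergy terms are beyond this sketch), the transfer entropy of a Gaussian VAR(1) can be computed exactly from the stationary covariance, with no data at all. The example system is assumed for illustration:

```python
import numpy as np

# Bivariate VAR(1): z[t] = A z[t-1] + e[t], e ~ N(0, Q); z = [x, y], y drives x
A = np.array([[0.5, 0.7],
              [0.0, 0.5]])
Q = np.eye(2)

# Stationary covariance solves S0 = A S0 A' + Q; fixed-point iteration converges
# because the spectral radius of A is below 1
S0 = Q.copy()
for _ in range(200):
    S0 = A @ S0 @ A.T + Q
S1 = A @ S0                      # lag-1 cross-covariance Cov(z[t], z[t-1])

def cond_var(i, past):
    """Partial variance of z_i[t] given {z_j[t-1] : j in past} (Schur complement)."""
    c = S1[i, past]                  # Cov(z_i[t], conditioning variables)
    Sp = S0[np.ix_(past, past)]      # covariance of the conditioning variables
    return S0[i, i] - c @ np.linalg.solve(Sp, c)

# Gaussian TE: half the log ratio of restricted vs. full partial variances
te_y_to_x = 0.5 * np.log(cond_var(0, [0]) / cond_var(0, [0, 1]))
te_x_to_y = 0.5 * np.log(cond_var(1, [1]) / cond_var(1, [0, 1]))
```

With this unidirectional coupling, te_x_to_y is exactly zero while te_y_to_x is strictly positive, mirroring the coupling structure encoded in A.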
In the framework of information dynamics, the temporal evolution of coupled systems can be studied by decomposing the predictive information about an assigned target system into amounts quantifying the information stored inside the system and the information transferred to it. While information storage and transfer are computed through the well-known self-entropy (SE) and transfer entropy (TE), an alternative decomposition evidences the so-called cross entropy (CE) and conditional SE (cSE), quantifying the cross information and internal information of the target system, respectively. This study presents a thorough evaluation of SE, TE, CE, and cSE as quantities related to the causal statistical structure of coupled dynamic processes. First, we investigate the theoretical properties of these measures, providing the conditions for their existence and assessing the meaning of the information-theoretic quantity that each of them reflects. Then, we present an approach for the exact computation of information dynamics based on the linear Gaussian approximation, and exploit this approach to characterize the behavior of SE, TE, CE, and cSE in benchmark systems with known dynamics. Finally, we exploit these measures to study cardiorespiratory dynamics measured from healthy subjects during head-up tilt and paced breathing protocols. Our main result is that the combined evaluation of the measures of information dynamics makes it possible to infer the causal effects associated with the observed dynamics and to interpret the alteration of these effects under changing experimental conditions.
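Under the linear Gaussian approximation the four measures reduce to log ratios of residual variances of linear regressions, and the two decompositions (SE + TE and CE + cSE) sum to the same predictive information by construction. A rough sketch with lag-1 pasts (the paper uses full multivariate embeddings; names are illustrative):

```python
import numpy as np

def resvar(yf, *past):
    """Residual variance of the linear regression of yf on the given past
    blocks (no past blocks -> plain variance around the mean)."""
    X = np.column_stack([np.ones(len(yf))] + list(past))
    beta, *_ = np.linalg.lstsq(X, yf, rcond=None)
    return np.mean((yf - X @ beta) ** 2)

def decompose(x, y):
    """Gaussian estimates of SE, TE, CE, cSE for target x and source y."""
    xf, xp, yp = x[1:], x[:-1], y[:-1]
    v   = resvar(xf)            # unconditional variance of the target
    vx  = resvar(xf, xp)        # given the target's own past
    vy  = resvar(xf, yp)        # given the source's past
    vxy = resvar(xf, xp, yp)    # given both pasts
    se,  te  = 0.5 * np.log(v / vx), 0.5 * np.log(vx / vxy)
    ce,  cse = 0.5 * np.log(v / vy), 0.5 * np.log(vy / vxy)
    return se, te, ce, cse
```

Both sums telescope to 0.5 * log(v / vxy), so SE + TE = CE + cSE holds exactly, even on finite data, as long as the same residual variances are used throughout.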
The continuously growing framework of information dynamics encompasses a set of tools, rooted in information theory and statistical physics, which allow the quantification of different aspects of the statistical structure of multivariate processes reflecting the temporal dynamics of complex networks. Building on the most recent developments in this field, this work designs a complete approach to dissect the information carried by the target of a network of multiple interacting systems into the new information produced by the system, the information stored in the system, and the information transferred to it from the other systems; information storage and transfer are then further decomposed into amounts eliciting the specific contribution of assigned source systems to the target dynamics, and amounts reflecting information modification through the balance between redundant and synergistic interaction between systems. These decompositions are formulated quantifying information either as the variance or as the entropy of the investigated processes, and their exact computation for the case of linear Gaussian processes is presented. The theoretical properties of the resulting measures are first investigated in simulations of vector autoregressive processes. Then, the measures are applied to assess information dynamics in cardiovascular networks from the variability series of heart period, systolic arterial pressure, and respiratory activity measured in healthy subjects during supine rest, orthostatic stress, and mental stress.
Our results document the importance of combining the assessment of information storage, transfer, and modification to investigate common and complementary aspects of network dynamics; suggest that the measures derived from the decompositions are more specific to alterations in the network properties; and indicate that measures of information transfer and information modification are better assessed, respectively, through entropy-based and variance-based implementations of the framework.
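The balance between redundant and synergistic interaction mentioned above can be sketched, in the variance-based formulation with lag-1 regressions, as an interaction information of the transfer: the two individual transfer entropies minus the joint one. The system used below (two noisy copies of a common driver) is an illustrative choice, not one of the paper's benchmarks:

```python
import numpy as np

def resvar(yf, *past):
    # residual variance of the linear regression of yf on the past blocks
    X = np.column_stack([np.ones(len(yf))] + list(past))
    beta, *_ = np.linalg.lstsq(X, yf, rcond=None)
    return np.mean((yf - X @ beta) ** 2)

def interaction_transfer(target, s1, s2):
    """II = TE(s1->t) + TE(s2->t) - TE({s1,s2}->t), variance-based, lag 1.
    II > 0 flags prevalently redundant transfer, II < 0 synergistic transfer."""
    tf, tp, p1, p2 = target[1:], target[:-1], s1[:-1], s2[:-1]
    te1 = 0.5 * np.log(resvar(tf, tp) / resvar(tf, tp, p1))
    te2 = 0.5 * np.log(resvar(tf, tp) / resvar(tf, tp, p2))
    tej = 0.5 * np.log(resvar(tf, tp) / resvar(tf, tp, p1, p2))
    return te1 + te2 - tej
```

When the two sources are noisy copies of the same driver of the target, each source transfers largely the same information, so the joint transfer is smaller than the sum of the individual ones and II comes out positive (redundancy).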
Understanding how different areas of the human brain communicate with each other is a crucial issue in neuroscience. The concepts of structural, functional, and effective connectivity have been widely exploited to describe the human connectome, consisting of brain networks, their structural connections, and functional interactions. While high-spatial-resolution imaging techniques such as functional magnetic resonance imaging (fMRI) are widely used to map this complex network of multiple interactions, electroencephalographic (EEG) recordings offer high temporal resolution and are thus perfectly suited to describe both spatially distributed and temporally dynamic patterns of neural activation and connectivity. In this work, we provide a technical account and a categorization of the most-used data-driven approaches to assess brain functional connectivity, intended as the study of the statistical dependencies between the recorded EEG signals. Different pairwise and multivariate, as well as directed and non-directed, connectivity metrics are discussed with a pros-and-cons approach, in the time, frequency, and information-theoretic domains. The establishment of conceptual and mathematical relationships between metrics from these three frameworks, and the discussion of novel methodological approaches, will allow the reader to delve into the problem of inferring functional connectivity in complex networks. Furthermore, emerging trends for the description of extended forms of connectivity (e.g., high-order interactions) are also discussed, along with graph-theory tools exploring the topological properties of the network of connections provided by the proposed metrics. Applications to EEG data are reviewed. In addition, the importance of source localization, and the impacts of signal acquisition and pre-processing techniques (e.g., filtering and artifact rejection) on the connectivity estimates, are recognized and discussed.
By going through this review, the reader can delve into the entire process of EEG pre-processing and analysis for the study of brain functional connectivity, learning to exploit novel methodologies and approaches to the problem of inferring connectivity within complex networks.
The framework of information dynamics allows the dissection of the information processed in a network of multiple interacting dynamical systems into meaningful elements of computation that quantify the information generated in a target system, stored in it, transferred to it from one or more source systems, and modified in a synergistic or redundant way. The concepts of information transfer and modification have recently been formulated in the context of linear parametric modeling of vector stochastic processes, linking them to the notion of Granger causality and providing efficient tools for their computation based on the state-space (SS) representation of vector autoregressive (VAR) models. Despite their high computational reliability, these tools still suffer from estimation problems which emerge, when the ratio between the number of available data points and the number of time series is low, if VAR identification is performed via standard ordinary least squares (OLS). In this work we propose to replace OLS with penalized regression performed through the Least Absolute Shrinkage and Selection Operator (LASSO), prior to the computation of the measures of information transfer and information modification. First, simulating networks of several coupled Gaussian systems with complex interactions, we show that LASSO regression allows, even in conditions of data paucity, accurate reconstruction of both the underlying network topology and the expected patterns of information transfer. Then we apply the proposed VAR-SS-LASSO approach to a challenging application context, i.e., the study of the physiological network of brain and peripheral interactions probed in humans under different conditions of rest and mental stress.
Our results, which document the possibility of extracting physiologically plausible patterns of interaction between the cardiovascular, respiratory, and brain wave amplitudes, open the way to the use of our new analysis tools to explore the emerging field of Network Physiology in several practical applications.
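The LASSO-based VAR identification step can be sketched as one penalized regression per target series, whose zero/nonzero coefficient pattern directly yields the network topology. The solver below is a plain proximal-gradient (ISTA) stand-in written with numpy only; the paper does not prescribe this particular solver, and the network and penalty used here are illustrative:

```python
import numpy as np

def lasso(X, y, lam, n_iter=2000):
    """LASSO via proximal gradient (ISTA):
    minimizes ||y - X b||^2 / (2n) + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    step = n / np.linalg.norm(X, 2) ** 2       # 1 / Lipschitz constant
    for _ in range(n_iter):
        b = b - step * (X.T @ (X @ b - y)) / n              # gradient step
        b = np.sign(b) * np.maximum(np.abs(b) - lam * step, 0.0)  # soft threshold
    return b

def var_lasso(Z, lam):
    """Sparse VAR(1) identification: row i of the returned matrix holds the
    penalized coefficients of z_i[t] regressed on the full lagged vector z[t-1]."""
    past, present = Z[:-1], Z[1:]
    return np.vstack([lasso(past, present[:, i], lam)
                      for i in range(Z.shape[1])])
```

On a sparse simulated chain (node 0 -> node 1 -> node 2), the true couplings survive the shrinkage while the absent links are pushed toward zero, which is the property exploited to recover the topology before computing the SS-based measures.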
Despite the widespread diffusion of nonlinear methods for heart rate variability (HRV) analysis, the presence and the extent to which nonlinear dynamics contribute to short-term HRV are still controversial. This work aims at testing the hypothesis that different types of nonlinearity can be observed in HRV depending on the method adopted and on the physiopathological state. Two entropy-based measures of time series complexity (normalized complexity index, NCI) and regularity (information storage, IS), and a measure quantifying deviations from linear correlations in a time series (Gaussian linear contrast, GLC), are applied to short HRV recordings obtained in young (Y) and old (O) healthy subjects and in myocardial infarction (MI) patients monitored in the resting supine position and in the upright position reached through head-up tilt. The method of surrogate data is employed to detect the presence and quantify the contribution of nonlinear dynamics to HRV. We find that the three measures differ both in their variations across groups and conditions and in the detected percentage and strength of nonlinear HRV dynamics. NCI and IS displayed opposite variations, suggesting more complex dynamics in O and MI compared to Y and less complex dynamics during tilt. The strength of nonlinear dynamics is reduced by tilt according to all measures in Y, while only GLC detects a significant strengthening of such dynamics in MI. A large percentage of nonlinear dynamics is revealed only by the IS measure in the Y group at rest, with a decrease in O and MI and during tilt, while NCI and GLC detect lower percentages in all groups and conditions. While these results suggest that distinct dynamic structures may lie beneath short-term HRV in different physiological states and pathological conditions, the strong dependence on the adopted measures and on their implementation suggests that physiological interpretations should be provided with caution.
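The method of surrogate data mentioned above can be illustrated with one common recipe: Fourier-transform surrogates, which preserve the linear autocorrelation of a series while destroying any nonlinear structure, combined here with a simple time-reversal asymmetry statistic. This is a generic sketch, not the specific statistics (NCI, IS, GLC) used in the study:

```python
import numpy as np

def ft_surrogate(x, rng):
    """Fourier-transform surrogate: random phases with the original amplitude
    spectrum (keeps linear correlations, destroys nonlinear structure)."""
    n = len(x)
    amp = np.abs(np.fft.rfft(x)).astype(complex)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(amp))
    phases[0] = 0.0                    # keep the mean component real
    if n % 2 == 0:
        phases[-1] = 0.0               # Nyquist component must stay real
    return np.fft.irfft(amp * np.exp(1j * phases), n)

def asymmetry(x):
    """Time-reversal asymmetry: zero in distribution for linear Gaussian
    processes, typically nonzero for nonlinear dynamics."""
    d = np.diff(x)
    return np.mean(d ** 3)

def nonlinearity_test(x, n_surr=99, seed=0):
    """Rank-based surrogate test; returns an estimated p-value."""
    rng = np.random.default_rng(seed)
    s0 = abs(asymmetry(x))
    exceed = sum(abs(asymmetry(ft_surrogate(x, rng))) >= s0
                 for _ in range(n_surr))
    return (exceed + 1) / (n_surr + 1)
```

A strongly nonlinear series such as the logistic map is rejected (small p-value), whereas a linear Gaussian process would typically not be.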
This Correspondence article is a comment which directly relates to the paper "A study of problems encountered in Granger causality analysis from a neuroscience perspective" (Stokes and Purdon, 2017). We agree that interpretation issues of Granger causality (GC) in neuroscience exist, partially due to the historically unfortunate use of the name "causality", as described in previous literature. On the other hand, we think that Stokes and Purdon use a formulation of GC which is outdated (albeit still used) and do not fully account for the potential of the different frequency-domain versions of GC; in doing so, their paper dismisses GC measures on the basis of a suboptimal use of them. Furthermore, since data from simulated systems are used, the pitfalls found with that formulation are presented as general, and not limited to neuroscience. It would be a pity if this paper, even if written in good faith, became a wildcard against all possible applications of GC, regardless of the large body of recently published work which aims to address faults in methodology and interpretation. In order to provide a balanced view, we replicate the simulations of Stokes and Purdon using an updated GC implementation and exploiting the combination of spectral and causal information, showing that in this way the pitfalls are mitigated or directly solved.
This work presents a comparison between different approaches for the model-free estimation of information-theoretic measures of the dynamic coupling between short realizations of random processes. The measures considered are the mutual information rate (MIR) between two random processes X and Y and the terms of its decomposition, evidencing either the individual entropy rates of X and Y and their joint entropy rate, or the transfer entropies from X to Y and from Y to X and the instantaneous information shared by X and Y. All measures are estimated through discretization of the random variables forming the processes, performed either via uniform quantization (binning approach) or rank ordering (permutation approach). The binning and permutation approaches are compared on simulations of two coupled non-identical Hénon systems and on three datasets, including short realizations of cardiorespiratory (CR, heart period and respiration flow), cardiovascular (CV, heart period and systolic arterial pressure), and cerebrovascular (CB, mean arterial pressure and cerebral blood flow velocity) time series measured in different physiological conditions, i.e., spontaneous vs. paced breathing or supine vs. upright positions. Our results show that, with careful selection of the estimation parameters (i.e., the embedding dimension and, for the binning approach, the number of quantization levels), meaningful patterns of the MIR and of its components can be achieved in the analyzed systems. On physiological time series, we found that paced breathing at slow breathing rates induces less complex and more coupled CR dynamics, while postural stress leads to an unbalancing of CV interactions with prevalent baroreflex coupling, and to less complex pressure dynamics with preserved CB interactions.
These results are better highlighted by the permutation approach, thanks to its more parsimonious representation of the discretized dynamic patterns, which allows one to explore interactions with longer memory while limiting the curse of dimensionality.
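The rank-ordering (permutation) discretization at the heart of this comparison can be sketched as follows: each length-m window is mapped to one of the m! ordinal patterns, so the alphabet grows as m! rather than as (number of bins)^m, which is the parsimony referred to above. Names and the demonstration signals are illustrative:

```python
import numpy as np
from itertools import permutations

def ordinal_symbols(x, m=3):
    """Permutation discretization: map each length-m window of the series
    to the index of its rank (ordinal) pattern."""
    lut = {p: i for i, p in enumerate(permutations(range(m)))}
    return np.array([lut[tuple(np.argsort(x[i:i + m]))]
                     for i in range(len(x) - m + 1)])

def plugin_entropy(symbols):
    # plug-in Shannon entropy of the symbol sequence, in nats
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))
```

For white noise the six patterns (m = 3) are equiprobable and the entropy approaches ln 6, while a slowly oscillating signal is dominated by the two monotone patterns and scores much lower, illustrating how ordinal symbols capture dynamic regularity with a very small alphabet.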