The Centrum Zastosowań Matematyki project was completed in 2015. Between 2012 and 2015 we organized 5 conferences, 6 thematic workshops, and 3 competitions...
Luca Faes: Measuring Information Dynamics in Complex Physiological Networks
Ulrich Parlitz: Nonlinear Dynamics of the Heart
In the emerging field of network physiology [1], the human organism is viewed as an integrated network in which the cardiac, circulatory, respiratory, and cerebral systems, each with its own internal dynamics, continuously interact with each other to preserve overall physiological function. Being able to describe the joint system behavior and the contribution of the different observed parts to it may yield fundamental insight into the functioning of the networks underlying the regulation of physiological rhythms.
The proposed lecture will introduce a unifying approach for the quantitative description of physiological networks, framed in the novel research field of information dynamics [2]. The approach is based on interpreting the physiological systems under analysis as dynamical systems, mapping the system behavior into a set of variables, and describing the time evolution of these variables (collected in the form of time series data) using information-theoretic analysis tools. These tools are developed essentially by incorporating dynamical and directional information into classic information-theoretic measures such as (conditional) entropy and mutual information. For instance, in a network of dynamical systems formed by a target system Z and two (possibly multivariate) source systems X and Y, the predictive entropy (PE) about Z is the amount of information carried by the present state of Z that can be predicted from the past states of X, Y and Z. The PE can be decomposed as the sum of the self entropy (SE) of Z and the transfer entropy (TE) from {X,Y} to Z, reflecting information storage and information transfer in the network, respectively. The TE can be further expanded as the sum of the TE from X to Z conditioned on Y, the TE from Y to Z conditioned on X, and the redundant entropy (RE) associated with the interaction between X and Y while transferring information to Z. The main appeal of these measures is that, taken together, they allow the general concept of 'information processing' to be dissected into essential sub-components with a meaningful interpretation: the information produced by a dynamical system, the information stored in the system, the information transferred to it from the other connected systems, and the informational character (synergistic or redundant) of the information transferred from multiple source systems to a destination system.
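The two-level decomposition described above can be written compactly as follows (the symbols are ours, chosen to mirror the abbreviations in the text):

```latex
P_Z = S_Z + T_{XY \to Z}, \qquad
T_{XY \to Z} = T_{X \to Z \mid Y} + T_{Y \to Z \mid X} + R_{XY \to Z}
```

where P, S, T and R denote the predictive, self, transfer and redundant entropies, respectively, with subscripts indicating the target system and the direction of information flow.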
In the lecture, I will first provide theoretical definitions of information-dynamic measures for the study of networks of interacting dynamical systems, drawing a connection between dynamical systems theory and information theory via the time-delay embedding procedure, and showing how the system states can be described in terms of random processes and then characterized by entropy-based functionals. Then, I will present two main approaches for the practical computation of these measures, dealing with the theoretical and practical issues that hamper their estimation from real-world time series: the linear model-based approach provides an efficient and compact representation of the system interactions and is closely related to the frequency-domain representation of such interactions; the model-free approach, implemented using different entropy estimators, allows the detection of any type of linear or nonlinear dynamical interaction. The advantages and limitations of the two approaches will be pinpointed by showing their application in simulations of networks of stochastic and deterministic coupled systems. Finally, practical applications of the framework to the analysis of physiological networks will be discussed, including the study of the mechanisms underlying physiological cardiovascular and cardiorespiratory interactions, of cerebrovascular regulation in neurally mediated syncope, and of brain-brain and brain-heart interactions during sleep.
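To make the linear model-based approach concrete, the following sketch estimates transfer entropy for jointly Gaussian processes, where it reduces to half the log-ratio of the residual variances of two autoregressive fits of the target (with and without the source's past). This is our illustrative implementation, not code from the lecture; the embedding length and variable names are arbitrary choices.

```python
# Sketch of a linear (Gaussian) transfer-entropy estimator.
# Under Gaussianity, TE from x to z equals half the log-ratio of the
# residual variances of two least-squares regressions of the present of z:
# one on the past of z only, one on the past of z and x together.
import numpy as np

def lagged(series, lags):
    """Matrix whose columns are the series delayed by 1..lags samples."""
    n = len(series)
    return np.column_stack([series[lags - k:n - k] for k in range(1, lags + 1)])

def residual_variance(target, regressors):
    """Variance of the residuals of an ordinary least-squares fit."""
    X = np.column_stack([np.ones(len(target)), regressors])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.var(target - X @ beta)

def linear_te(x, z, lags=2):
    """Transfer entropy x -> z for jointly Gaussian processes (in nats)."""
    zt = z[lags:]                                    # present of the target
    z_past = lagged(z, lags)                         # restricted model
    xz_past = np.hstack([z_past, lagged(x, lags)])   # full model
    return 0.5 * np.log(residual_variance(zt, z_past) /
                        residual_variance(zt, xz_past))

# Toy check: z is driven by the past of x, so information should flow x -> z.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
z = np.empty_like(x)
z[0] = 0.0
for t in range(1, len(x)):
    z[t] = 0.8 * x[t - 1] + 0.3 * rng.standard_normal()

print(linear_te(x, z) > linear_te(z, x))  # expect True: coupling is x -> z
```

The model-free estimators mentioned in the abstract (e.g. nearest-neighbor or binning entropy estimators) would replace the regression step while keeping the same conditioning logic.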
[1] Bashan A, Bartsch RP, Kantelhardt JW, Havlin S, Ivanov PC. Network physiology reveals relations between network topology and physiological function. Nature Communications 3, 2012.
[2] Faes L, Porta A. Conditional entropy-based evaluation of information dynamics in physiological systems. In: Vicente R, Wibral M, Lizier JT, editors. Directed Information Measures in Neuroscience. Berlin: Springer-Verlag, 2014: 61-86.
In our lectures we shall present and discuss the complex spatial-temporal dynamics underlying physiological and pathological states of the heart. We shall show how nonlinear dynamics and statistical physics provide novel analytical concepts to enhance understanding of cardiac dynamics and arrhythmias, including experimental and theoretical approaches towards modeling, analysis and control of electrical forms of heart disease.
The theory of dynamical systems plays a central role in integrating biological experiments with mathematical developments. This approach will be applied to cardiac arrhythmias, a highly significant cause of mortality and morbidity worldwide. The term 'dynamical disease' was coined for cardiac arrhythmias, suggesting that they are best understood from a dynamical-systems perspective, integrating multidisciplinary research on all relevant spatial and temporal scales.
In the lecture we shall explain how cardiac arrhythmias result from complex spatial-temporal electrical excitation patterns arising from fast-developing electro-mechanical instabilities. These dynamical states can be detected and classified using time series analysis. Furthermore, mathematical models of (collective) cell activities will be introduced and evaluated. Finally, a novel approach (LEAP) for terminating cardiac arrhythmias using low-energy pulses will be presented.
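As a minimal illustration of the kind of cell model evaluated in such studies, the following sketch integrates the FitzHugh-Nagumo equations, a standard two-variable caricature of excitable (cardiac) dynamics. The parameter values and stimulus are illustrative choices of ours, not taken from the lecture.

```python
# Minimal sketch: a single excitable cell modeled by the FitzHugh-Nagumo
# equations, a two-variable caricature of action-potential dynamics.
# Parameter values (a, b, eps) are the textbook defaults; the stimulus
# level is chosen to put the cell in its oscillatory (spiking) regime.
import numpy as np

def fhn_step(v, w, I_ext, dt=0.01, a=0.7, b=0.8, eps=0.08):
    """One explicit-Euler step of the FitzHugh-Nagumo model."""
    dv = v - v**3 / 3 - w + I_ext      # fast, voltage-like variable
    dw = eps * (v + a - b * w)         # slow recovery variable
    return v + dt * dv, w + dt * dw

def simulate(n_steps=50000, stim=0.5):
    v, w = -1.2, -0.6                  # start near the resting state
    trace = np.empty(n_steps)
    for t in range(n_steps):
        v, w = fhn_step(v, w, stim)    # sustained stimulus -> repetitive firing
        trace[t] = v
    return trace

v_trace = simulate()
# With this stimulus the cell fires repeatedly: v swings above +1 during
# each spike and below -1 during recovery.
print((v_trace > 1.0).any() and (v_trace < -1.0).any())  # expect True
```

Spatially extended versions of such models, with diffusive coupling between cells, produce the spiral and scroll-wave excitation patterns associated with arrhythmias that the lecture addresses.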