Information Processing in Complex Systems (IPCS) Session 1


Time and Date: 10:00 - 12:30 on 20th Sep 2016

Room: B - Berlage zaal

Chair: Rick Quax

45000 Entropy for complex systems
Abstract: Complex systems are often inherently non-ergodic and non-Markovian, and Shannon entropy then loses its applicability. It will be shown that the entropy of non-ergodic systems can still be derived from three of the Shannon–Khinchin axioms while violating the fourth, the so-called composition axiom. The corresponding entropy is of the form S_{c,d} \sim \sum_i \Gamma(1+d, 1 - c \ln p_i) and depends on two system-specific scaling exponents, c and d. This entropy contains many recently proposed entropy functionals as special cases, including the Shannon and Tsallis entropies. It will also be shown that this entropy is relevant for a special class of non-Markovian, path-dependent random walks. We show for the first time that a generalised maximum-entropy formalism can be devised that allows one to predict the distribution functions of evolving path-dependent processes.
Stefan Thurner
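As a numerical aside (an illustrative sketch, not part of the talk's material), the core sum of the (c,d)-entropy can be evaluated with SciPy's incomplete gamma function. For c = d = 1 and a normalized distribution it reduces to an affine function of the Shannon entropy H, since Γ(2, 1 − ln p) = p(2 − ln p)/e; the example distribution below is arbitrary:

```python
import numpy as np
from scipy.special import gammaincc, gamma

def s_cd_sum(p, c, d):
    """Core sum of the (c,d)-entropy: sum_i Gamma(1+d, 1 - c*ln p_i)."""
    p = np.asarray(p, dtype=float)
    # gammaincc is the regularized upper incomplete gamma, so rescale by gamma(1+d)
    return np.sum(gammaincc(1.0 + d, 1.0 - c * np.log(p)) * gamma(1.0 + d))

p = np.array([0.5, 0.25, 0.125, 0.125])   # any normalized distribution
H = -np.sum(p * np.log(p))                # Shannon entropy in nats
S = s_cd_sum(p, c=1.0, d=1.0)
# Shannon limit: e * S equals 2 + H, up to floating-point error
```

Other (c, d) pairs interpolate between the known special cases, e.g. d = 0 corresponds to the Tsallis family.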
45001 Integrated Information Theory Applied to Complex Systems Analysis
Abstract: Integrated information theory (IIT) was originally proposed by Balduzzi and Tononi in 2008 and has since been extended by multiple authors, predominantly in a theoretical neuroscience context. The main object of IIT, Phi, is an informational measure that attempts to quantify the extent to which the parts of a dynamical system are simultaneously segregated (acting independently of each other) and integrated (acting as a whole). In our work, we have taken the broad concepts behind IIT and Phi and used them to gain further insight into the behaviour of complex systems. Among other systems, we have applied IIT to spiking neural networks and coupled Kuramoto oscillators. In both cases, we find that Phi-like measures can reliably identify phase transitions in the system and are related to other dynamical properties such as criticality, metastability and synchronisation onset. Furthermore, IIT can offer an illustrative picture of the interdependence between parts of the system and their evolution at different timescales. To this end, we have proposed novel estimators of Phi-like measures for time series data of any kind, and studied their behaviour in terms of stability, robustness, finite-sample effects and stationarity. Overall, we push IIT forward not as a fundamental neuroscientific theory, but as a useful tool for interpreting and analysing information processing in complex systems.
Pedro A.M. Mediano and Murray Shanahan
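For readers unfamiliar with such estimators, here is a minimal sketch of one "whole-minus-sum" Phi-like quantity, computed from Gaussian mutual information on a two-node linear AR(1) system. The system, coupling matrix and noise level are hypothetical illustrations, not the spiking or Kuramoto networks of the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two linearly coupled Gaussian AR(1) nodes (illustrative toy system)
A = np.array([[0.5, 0.4],
              [0.4, 0.5]])
T = 200_000
X = np.zeros((T, 2))
for t in range(T - 1):
    X[t + 1] = A @ X[t] + 0.1 * rng.standard_normal(2)

def gaussian_mi(a, b):
    """I(a; b) in nats, assuming jointly Gaussian samples (rows = time)."""
    def logdet(m):
        return np.linalg.slogdet(np.atleast_2d(m))[1]
    cj = np.cov(np.hstack([a, b]).T)
    return 0.5 * (logdet(np.cov(a.T)) + logdet(np.cov(b.T)) - logdet(cj))

whole = gaussian_mi(X[:-1], X[1:])   # time-delayed MI of the whole system
parts = sum(gaussian_mi(X[:-1, [i]], X[1:, [i]]) for i in range(2))
phi = whole - parts                  # "whole minus sum of parts"
```

Negative values of this whole-minus-sum quantity signal redundancy between the parts, one of the known subtleties that more refined Phi estimators try to address.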
45002 Information-Theoretic Distance between Complex Networks
Abstract: Entropy and other information-theoretic measures have been applied successfully in a range of disciplines, revealing time-scale dependence in neural coding, quantifying the complexity of genetic sequences and playing a central role in the quantification of quantum information, to cite just a few representative achievements. When it comes to complex networks, however, an appropriate definition of entropy has remained elusive, with applicability often limited to the probability distribution of some network descriptor (such as the normalized distribution of node degrees). Here, inspired by how entropy is calculated in quantum systems, we define an interconnectivity-based density matrix to calculate the von Neumann entropy directly from a network. We prove that our definition satisfies the expected additivity properties of (quantum) thermodynamic entropy, in contrast to past approaches. We exploit this entropy to define network-based information-theoretic measures, such as the Renyi q-entropy and generalized Kullback-Leibler and Jensen-Shannon divergences, as complexity indicators, and, importantly, to define a distance measure between complex networks. Using our mathematical framework, we are thus able to numerically probe contemporary problems in complex network science, recovering results related to model selection and to clustering the layers of multilayer networks. We find that both synthetic and empirical networks exhibit distinctive information-theoretic properties, indicating that the approach offers a way to quantify complexity and to distinguish networks by means of a distance measure, providing a backbone for an information-theoretic approach to complex network science.
Manlio De Domenico and Jacob Biamonte
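One concrete construction along these lines can be sketched as follows, assuming the Laplacian-based density matrix rho = exp(-beta L)/Tr exp(-beta L); beta and the two example graphs (a 4-node path and a 4-node star) are illustrative choices, not the paper's experiments:

```python
import numpy as np

def density_matrix(adj, beta=1.0):
    """Network density matrix rho = exp(-beta*L)/Z from the graph Laplacian L."""
    L = np.diag(adj.sum(axis=1)) - adj
    w, V = np.linalg.eigh(L)
    e = np.exp(-beta * w)
    return (V * (e / e.sum())) @ V.T        # V diag(e/Z) V^T

def vn_entropy(rho):
    """Von Neumann entropy -Tr(rho ln rho), in nats."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log(w)))

def js_distance(adj1, adj2, beta=1.0):
    """Square root of the quantum Jensen-Shannon divergence between two networks."""
    r1, r2 = density_matrix(adj1, beta), density_matrix(adj2, beta)
    js = vn_entropy(0.5 * (r1 + r2)) - 0.5 * (vn_entropy(r1) + vn_entropy(r2))
    return float(np.sqrt(max(js, 0.0)))

path = np.array([[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]], dtype=float)  # P4
star = np.array([[0,1,1,1],[1,0,0,0],[1,0,0,0],[1,0,0,0]], dtype=float)  # K1,3
```

The square root of the Jensen-Shannon divergence is a bona fide distance, which is what makes model selection and layer clustering possible in this framework.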
45003 Information Processing in Biomolecular Regulatory Networks
Abstract: While living systems seem distinctive in their ability to process information, and useful tools for quantifying information structure exist in complex systems research, these tools have not been widely applied to biological networks. Hence, information processing in living systems is yet to be rigorously quantified. In our work we investigate the informational architecture of Boolean models of various biomolecular regulatory networks. We also compare their informational properties to those of two classes of null models, random and scale-free, that share commonalities in their causal structure [1]. We report patterns in information processing that distinguish biological networks from random networks in terms of scaling relations, the total amount of information processed and the structure of causal interactions. Based on these results, we suggest that previously unidentified information-based organizational principles, which go beyond topological considerations such as scale-free structure, may be critical to biological function [2].
[1] H. Kim, P.C.W. Davies and S.I. Walker (2015) New Scaling Relation for Information Transfer in Biological Networks. J. R. Soc. Interface 12: 20150944. DOI: 10.1098/rsif.2015.0944
[2] S.I. Walker, H. Kim and P.C.W. Davies (2016) The Informational Architecture of the Cell. Phil. Trans. R. Soc. A 374: 20150057. DOI: 10.1098/rsta.2015.0057
Hyunju Kim, Paul Davies and Sara Imari Walker
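As a small, self-contained illustration of the kind of measure involved in such analyses, here is a plug-in transfer-entropy estimator on a toy Boolean network. The three-node update rules are hypothetical and are not taken from the biological models of [1]:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)

# Toy 3-node Boolean network (hypothetical rules):
#   x0' = x1 AND x2,   x1' = x0 OR x2,   x2' = NOT x0
def step(x0, x1, x2):
    return x1 & x2, x0 | x2, 1 - x0

T = 50_000
traj = np.empty((T, 3), dtype=int)
s = tuple(rng.integers(0, 2, 3))
for t in range(T):
    traj[t] = s
    s = step(*s)
    if rng.random() < 0.05:          # occasional random reset to sample all attractors
        s = tuple(rng.integers(0, 2, 3))

def transfer_entropy(src, dst):
    """Plug-in transfer entropy TE(src -> dst) with history length 1, in nats."""
    yn, y, x = dst[1:], dst[:-1], src[:-1]
    n = len(yn)
    c_ynyx = Counter(zip(yn, y, x))
    c_yx = Counter(zip(y, x))
    c_yny = Counter(zip(yn, y))
    c_y = Counter(y)
    te = 0.0
    for (a, b, c), cnt in c_ynyx.items():
        # p(y'|y,x) / p(y'|y) expressed through raw counts
        te += (cnt / n) * np.log(cnt * c_y[b] / (c_yx[(b, c)] * c_yny[(a, b)]))
    return te

te_02 = transfer_entropy(traj[:, 0], traj[:, 2])   # x0 directly drives x2
te_20 = transfer_entropy(traj[:, 2], traj[:, 0])
```

Because x2' = NOT x0, the estimator attributes substantial information transfer from x0 to x2; summing such pairwise terms over a network is one way to quantify the total information processed.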
45004 Thermodynamic Cost of Information Processing in Bio-Chemical Networks
Abstract: Life is a non-equilibrium process that involves information processing through biochemistry. Understanding the thermodynamic cost of these processes is important. For processes described by linear chemical networks, Stochastic Thermodynamics provides powerful tools to do so. We illustrate this statement by characterizing the trade-offs between dissipation, speed and accuracy in kinetic proofreading [1]. However, for more complicated networks stochastic descriptions rapidly become prohibitive. We show that similar concepts can be extended to study chemical reaction networks (CNs) described by deterministic rate equations. In particular, we derive a Landauer principle characterizing the amount of chemical work necessary to modify the population of a CN operating far from equilibrium [2].
[1] R. Rao and L. Peliti (2015) Thermodynamics of accuracy in kinetic proofreading: dissipation and efficiency trade-offs. J. Stat. Mech. Theor. Exp., P06001.
[2] R. Rao and M. Esposito (2016) Nonequilibrium Thermodynamics of Chemical Reaction Networks: Wisdom from Stochastic Thermodynamics. arXiv:1602.07257.
Riccardo Rao, Massimiliano Esposito and Luca Peliti
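The dissipation bookkeeping for deterministic rate equations can be made concrete in the simplest possible setting. The sketch below (illustrative rate constants, k_B = 1, and not the general CN formalism of [2]) integrates the nonnegative entropy production sigma = (J+ - J-) ln(J+/J-) for a reversible reaction A <=> B relaxing to equilibrium:

```python
import numpy as np

# Mass-action rate equation for A <=> B (illustrative parameters, k_B = 1)
kp, km = 2.0, 1.0        # forward / backward rate constants
a, b = 1.0, 0.2          # concentrations of A and B
dt, steps = 1e-3, 10_000

sigma_total = 0.0
for _ in range(steps):
    jp, jm = kp * a, km * b              # forward / backward fluxes
    sigma = (jp - jm) * np.log(jp / jm)  # entropy production rate, always >= 0
    a -= (jp - jm) * dt                  # explicit Euler step of the rate equation
    b += (jp - jm) * dt
    sigma_total += sigma * dt
# At equilibrium detailed balance holds: kp * a == km * b, and sigma -> 0
```

Both factors of sigma change sign together, so entropy production is nonnegative along the whole trajectory and vanishes only at detailed balance, which is the starting point for Landauer-type bounds on chemical work.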
45006 Applying Fisher Information to real data: Electric Vehicle charging behavior
Omri Har Shemesh, Rick Quax, Alfons Hoekstra and Peter Sloot