Information Processing in Complex Systems  (IPCS) Session 1


Time and Date: 10:00 - 12:30 on 20th Sep 2016

Room: B - Berlage zaal

Chair: Rick Quax

45000 Entropy for complex systems
Abstract: Complex systems are often inherently non-ergodic and non-Markovian, so that Shannon entropy loses its applicability. It will be shown that the entropy of non-ergodic systems can still be derived from three of the Shannon–Khinchin axioms by violating the fourth, the so-called composition axiom. The corresponding entropy is of the form $S_{c,d} \sim \sum_i \Gamma(1+d,\, 1-c\ln p_i)$ and depends on two system-specific scaling exponents, c and d. This entropy contains many recently proposed entropy functionals as special cases, including the Shannon and Tsallis entropies. It will also be shown that this entropy is relevant for a special class of non-Markovian, path-dependent random walks. We show for the first time that a generalised maximum-entropy formalism can be devised that allows one to predict the distribution functions of evolving path-dependent processes.
Stefan Thurner
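
As a point of reference for the formula in the abstract above, the (c,d)-entropy and the two best-known limits it contains can be written out as follows (a LaTeX sketch; proportionality and normalisation constants are omitted):

    S_{c,d}[p] \sim \sum_i \Gamma\!\left(1+d,\; 1 - c\ln p_i\right)
    % Special cases, up to additive and multiplicative constants:
    % (c,d) = (1,1):  S_{1,1} \to -\sum_i p_i \ln p_i            (Shannon)
    % (c,d) = (q,0):  S_{q,0} \to \frac{1-\sum_i p_i^q}{q-1}     (Tsallis)
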
45001 Integrated Information Theory Applied to Complex Systems Analysis
Abstract: Integrated information theory (IIT) was originally proposed by Balduzzi and Tononi in 2008 and has been expanded by multiple authors since, predominantly in a theoretical neuroscience context. The main object of IIT, Phi, is an informational measure that attempts to quantify to what extent the parts of a dynamical system are simultaneously segregated (acting independently of each other) and integrated (acting as a whole). In our work, we have taken the broad concepts behind IIT and Phi and used them to gain further insight into the behaviour of complex systems. Among other systems, we have applied IIT to spiking neural networks and coupled Kuramoto oscillators. In both cases, we find that Phi-like measures can reliably identify phase transitions in the system and are related to other dynamical properties like criticality, metastability or synchronisation onset. Furthermore, IIT can offer an illustrative picture of the interdependence between parts of the system and their evolution at different timescales. To do this, we have proposed novel kinds of estimators for Phi-like measures on time series data of any kind, and studied their behaviour in terms of stability, robustness, finite sample effects and stationarity. Overall, we push IIT forward not as a fundamental neuroscientific theory, but as a useful tool to interpret and analyse information processing in complex systems.
Pedro A.M. Mediano and Murray Shanahan
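
To make the "Phi-like measures on time series data" mentioned above concrete, here is a minimal sketch of one member of that family: the whole-minus-sum of time-delayed mutual informations across a bipartition, estimated under a Gaussian assumption. The function names, the single fixed bipartition, and the coupled AR(1) toy data are illustrative assumptions, not the authors' actual estimators.

    import numpy as np

    def logdet(c):
        """Log-determinant of a covariance matrix (scalar covariances included)."""
        return np.linalg.slogdet(np.atleast_2d(c))[1]

    def gaussian_mi(x, y):
        """Mutual information (nats) between column-stacked Gaussian vectors."""
        cx, cy = np.cov(x, rowvar=False), np.cov(y, rowvar=False)
        cxy = np.cov(np.hstack([x, y]), rowvar=False)
        return 0.5 * (logdet(cx) + logdet(cy) - logdet(cxy))

    def phi_whole_minus_sum(X, part_a, part_b, tau=1):
        """Phi-like measure: MI of the whole system across a delay tau,
        minus the MI of each part with its own future (one bipartition)."""
        past, future = X[:-tau], X[tau:]
        whole = gaussian_mi(past, future)
        parts = (gaussian_mi(past[:, part_a], future[:, part_a]) +
                 gaussian_mi(past[:, part_b], future[:, part_b]))
        return whole - parts

    # Toy data: two coupled AR(1) variables; the coupling makes Phi positive.
    rng = np.random.default_rng(0)
    T = 50000
    X = np.zeros((T, 2))
    for t in range(1, T):
        X[t, 0] = 0.5 * X[t-1, 0] + 0.4 * X[t-1, 1] + rng.normal()
        X[t, 1] = 0.4 * X[t-1, 0] + 0.5 * X[t-1, 1] + rng.normal()
    print(phi_whole_minus_sum(X, [0], [1]))
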
45002 Information-Theoretic Distance between Complex Networks
Abstract: Entropy and information-theoretic measures derived from it have been applied successfully across a range of disciplines: revealing time-scale dependence in neural coding, quantifying the complexity of genetic sequences, and playing a central role in the quantification of quantum information, to cite just a few representative achievements. However, when it comes to complex networks, an appropriate definition of entropy has remained elusive, with applicability often limited to the probability distribution of some network descriptor (such as the normalized distribution of node degrees). Here, inspired by how entropy is calculated in quantum systems, we define an interconnectivity-based density matrix to calculate the von Neumann entropy directly from a network. We prove that our definition satisfies the expected additivity properties of (quantum) thermodynamic entropy, in contrast to past approaches. We exploit this entropy to define network-based information-theoretic measures such as the Rényi q-entropy and generalized Kullback-Leibler and Jensen-Shannon divergences---as complexity indicators---and, importantly, to define a distance measure between complex networks. Using our mathematical framework, we are thus able to numerically probe contemporary problems in complex network science, recovering results related to model selection and to clustering the layers of multilayer networks. We find that both synthetic and empirical networks exhibit distinctive information-theoretic properties, indicating that the approach can quantify complexity and distinguish networks by means of a distance measure, providing a backbone for an information-theoretic approach to complex network science.
Manlio De Domenico and Jacob Biamonte
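
As an illustration of the pipeline described above, the sketch below builds a density matrix from a graph, computes its von Neumann entropy, and turns the Jensen-Shannon divergence into a network distance. The rho = exp(-beta*L)/Z construction and the parameter beta are assumptions borrowed from the spectral-entropy literature (the talk's exact "interconnectivity based" definition may differ), and the two random graphs are arbitrary examples.

    import numpy as np
    import networkx as nx
    from scipy.linalg import expm

    def density_matrix(G, beta=1.0):
        """One candidate construction: rho = exp(-beta L) / Z, with L the
        graph Laplacian (an assumption, not necessarily the talk's)."""
        L = nx.laplacian_matrix(G).toarray().astype(float)
        rho = expm(-beta * L)
        return rho / np.trace(rho)

    def von_neumann_entropy(rho):
        """S(rho) = -Tr rho ln rho, from the eigenvalues of rho."""
        w = np.linalg.eigvalsh(rho)
        w = w[w > 1e-12]
        return float(-(w * np.log(w)).sum())

    def js_distance(G1, G2, beta=1.0):
        """Square root of the Jensen-Shannon divergence between the two
        density matrices: a symmetric network-comparison measure."""
        r1, r2 = density_matrix(G1, beta), density_matrix(G2, beta)
        m = 0.5 * (r1 + r2)
        jsd = von_neumann_entropy(m) - 0.5 * (von_neumann_entropy(r1) +
                                              von_neumann_entropy(r2))
        return np.sqrt(max(jsd, 0.0))

    G1 = nx.erdos_renyi_graph(50, 0.1, seed=1)     # arbitrary example graphs
    G2 = nx.barabasi_albert_graph(50, 3, seed=1)
    print(js_distance(G1, G2))
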
45003 Information Processing in Biomolecular Regulatory Networks
Abstract: While living systems seem distinctive in their ability to process information, and useful tools for quantifying information structure exist in complex systems research, these tools have not been widely applied to biological networks. Hence, information processing in living systems is yet to be rigorously quantified. In our work we investigate the informational architecture of Boolean models of various biomolecular regulatory networks. We also compare their informational properties to those of two classes of null models, random and scale-free, that share commonalities in their causal structure [1]. We report patterns in information processing that distinguish biological networks from random networks in terms of scaling relations, the total amount of information processed, and causal interactions. Based on these results, we suggest previously unidentified information-based organizational principles that go beyond topological considerations such as scale-free structure and that may be critical to biological function [2]. [1] H. Kim, P.C.W. Davies and S.I. Walker (2015) New Scaling Relation for Information Transfer in Biological Networks. J. Roy. Soc. Interface 12 20150944; DOI: 10.1098/rsif.2015.0944 [2] S.I. Walker, H. Kim and P.C.W. Davies (2016) The Informational Architecture of the Cell. Phil. Trans. R. Soc. A 374 20150057; DOI: 10.1098/rsta.2015.0057
Hyunju Kim, Paul Davies and Sara Imari Walker
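
As a schematic stand-in for the kind of information-transfer analysis cited in [1] above, the sketch below simulates one update step of a small random Boolean network over an ensemble of initial states and computes a plug-in transfer entropy between two nodes. The 5-node network, the 2-input wiring, the AND-gate override, and the ensemble sampling are all illustrative choices, not the authors' pipeline.

    import numpy as np
    from collections import Counter

    rng = np.random.default_rng(1)
    N = 5  # illustrative network size

    # Random Boolean network: each node reads 2 random inputs via a random truth table.
    wiring = [rng.choice(N, size=2, replace=False) for _ in range(N)]
    tables = [rng.integers(0, 2, size=4) for _ in range(N)]
    wiring[1] = np.array([0, 2])        # demo choice: node 0 feeds node 1
    tables[1] = np.array([0, 0, 0, 1])  # demo choice: node 1 computes AND of its inputs

    def step(s):
        return tuple(tables[i][2 * s[wiring[i][0]] + s[wiring[i][1]]] for i in range(N))

    def transfer_entropy(samples):
        """TE(X -> Y) in bits from (y_next, y, x) triples:
        sum p(y2,y1,x) log2 [ p(y2 | y1,x) / p(y2 | y1) ]."""
        pxyz = Counter(samples)
        pyx = Counter((y1, x) for _, y1, x in samples)
        pyy = Counter((y2, y1) for y2, y1, _ in samples)
        py = Counter(y1 for _, y1, _ in samples)
        n = len(samples)
        te = 0.0
        for (y2, y1, x), c in pxyz.items():
            te += (c / n) * np.log2((c / pyx[(y1, x)]) / (pyy[(y2, y1)] / py[y1]))
        return te

    # Pool one update step from many random initial states (an ensemble average).
    src, dst = 0, 1
    samples = []
    for _ in range(20000):
        s = tuple(rng.integers(0, 2, size=N))
        s2 = step(s)
        samples.append((s2[dst], s[dst], s[src]))
    print(transfer_entropy(samples))  # about 0.31 bits for the AND-gate demo
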
45004 Thermodynamic Cost of Information Processing in Bio-Chemical Networks
Abstract: Life is a non-equilibrium process involving information processing through biochemistry, and understanding the thermodynamic cost of these processes is important. For processes described by linear chemical networks, Stochastic Thermodynamics provides powerful tools to do so. We illustrate this statement by characterizing the trade-offs between dissipation, speed and accuracy in kinetic proofreading [1]. However, for more complicated networks stochastic descriptions rapidly become prohibitive. We show that similar concepts can be extended to study chemical reaction networks (CNs) described by deterministic rate equations. In particular, we derive a Landauer principle characterizing the amount of chemical work necessary to modify the population of a CN operating far from equilibrium [2]. [1] Rao, R. & Peliti, L. “Thermodynamics of accuracy in kinetic proofreading: dissipation and efficiency trade-offs”. J. Stat. Mech. Theor. Exp., 2015, P06001. [2] Rao, R. & Esposito, M. “Nonequilibrium Thermodynamics of Chemical Reaction Networks: Wisdom from Stochastic Thermodynamics”. arXiv:1602.07257.
Riccardo Rao, Massimiliano Esposito and Luca Peliti
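
The "powerful tools" mentioned above for linear networks can be illustrated with the textbook Schnakenberg entropy-production rate of a Markov jump process, which vanishes exactly at equilibrium. The three-state driven cycle and the names steady_state and entropy_production below are generic illustrative choices, not models from the cited papers.

    import numpy as np

    def steady_state(W):
        """Stationary distribution of rate matrix W (W[i,j] = rate j -> i,
        columns summing to zero)."""
        vals, vecs = np.linalg.eig(W)
        p = np.real(vecs[:, np.argmin(np.abs(vals))])
        return p / p.sum()

    def entropy_production(W, p):
        """Schnakenberg entropy production rate (k_B = 1), summed over edges."""
        sigma = 0.0
        n = len(p)
        for i in range(n):
            for j in range(i + 1, n):
                jf = W[i, j] * p[j]   # probability flux j -> i
                jb = W[j, i] * p[i]   # probability flux i -> j
                if jf > 0 and jb > 0:
                    sigma += (jf - jb) * np.log(jf / jb)
        return sigma

    def cycle_rates(drive):
        """3-state cycle; 'drive' biases clockwise rates away from equilibrium."""
        k = np.exp(drive / 3.0)
        W = np.zeros((3, 3))
        for i in range(3):
            W[(i + 1) % 3, i] += k      # i -> i+1 (clockwise)
            W[(i - 1) % 3, i] += 1.0    # i -> i-1 (counter-clockwise)
        np.fill_diagonal(W, -W.sum(axis=0))
        return W

    for drive in (0.0, 1.0, 3.0):
        W = cycle_rates(drive)
        p = steady_state(W)
        print(drive, entropy_production(W, p))  # zero only at drive = 0
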
45006 Applying Fisher Information to real data: Electric Vehicle charging behavior
Omri Har Shemesh, Rick Quax, Alfons Hoekstra and Peter Sloot
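
No abstract is listed for this talk, so the sketch below only illustrates the general idea of estimating Fisher information directly from data: a finite-difference estimate via the squared Hellinger distance between histograms of samples drawn at nearby parameter values, using the standard small-delta relation d_H^2 ~ F(theta) * delta^2 / 8. The function name, bin count and Gaussian sanity check are illustrative assumptions, not the authors' method.

    import numpy as np

    def fisher_information(samples_a, samples_b, delta, bins=30):
        """Finite-difference Fisher information from two sample sets taken
        at parameter values theta and theta + delta."""
        lo = min(samples_a.min(), samples_b.min())
        hi = max(samples_a.max(), samples_b.max())
        pa, _ = np.histogram(samples_a, bins=bins, range=(lo, hi))
        pb, _ = np.histogram(samples_b, bins=bins, range=(lo, hi))
        pa = pa / pa.sum()
        pb = pb / pb.sum()
        h2 = 0.5 * np.sum((np.sqrt(pa) - np.sqrt(pb)) ** 2)  # squared Hellinger distance
        return 8.0 * h2 / delta ** 2

    # Sanity check: N(theta, 1) has F = 1 for the mean parameter,
    # so this should print a value close to 1 (up to binning/sampling bias).
    rng = np.random.default_rng(0)
    print(fisher_information(rng.normal(0.0, 1.0, 200000),
                             rng.normal(0.1, 1.0, 200000), delta=0.1))
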

Information Processing in Complex Systems  (IPCS) Session 2


Time and Date: 14:15 - 18:00 on 20th Sep 2016

Room: B - Berlage zaal

Chair: Rick Quax

45010 Quantum Complexity
Abstract: I intend to review several notions of classical complexity and discuss how they might be quantized. The standard technique from quantum information is to measure a given quantum system whose complexity we want to know, apply the classical measures to the resulting classical statistics, and then, to obtain the quantum complexity, optimize (minimize or maximize) over all classically obtained results. However, it might be more appropriate to measure quantum complexity directly, i.e. without going via classical statistics. How might that be done? I will also discuss complexity from the perspective of how difficult it is to make entangled states. This will lead me to introduce the concept of quantum macroscopicity.
Vlatko Vedral
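
As a toy illustration of the "measure, then optimize over classical statistics" recipe described above, the sketch below scans projective measurement bases of a single qubit and reports the minimum and maximum Shannon entropy of the outcome statistics. The particular mixed state rho, the restriction to real bases, and the grid search are all illustrative assumptions, not definitions from the talk.

    import numpy as np

    def outcome_entropy(rho, theta):
        """Shannon entropy (bits) of measuring rho in the basis rotated by theta."""
        v = np.array([np.cos(theta), np.sin(theta)])
        p = float(v @ rho @ v)                    # probability of the first outcome
        p = min(max(p, 1e-12), 1 - 1e-12)
        return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

    # A slightly mixed single-qubit state (real density matrix for simplicity).
    rho = np.array([[0.85, 0.20],
                    [0.20, 0.15]])

    thetas = np.linspace(0, np.pi, 2001)
    ents = [outcome_entropy(rho, t) for t in thetas]
    # The minimum is attained (approximately) in the eigenbasis of rho.
    print("min over bases:", min(ents), " max over bases:", max(ents))
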
45009 The classical-quantum divergence of complexity in the Ising spin chain
Abstract: Most interesting systems in nature are complex, lying in the region between order and randomness. Even though the idea of complexity is understood intuitively, it lacks a formal mathematical description. The statistical complexity, defined by Crutchfield as the minimum information required by a classical model to statistically simulate a process, serves to quantify complexity. Here, we show that the same idea of complexity behaves differently in the quantum information regime. We introduce the quantum statistical complexity as the minimum information required by a quantum model to statistically simulate a process, and show that it exhibits drastically different qualitative behavior when applied to the same system, the 1D Ising spin chain. Thus, we illustrate that the notion of what is complex depends on the information theory used to describe it.
Whei Yeap Suen, Jayne Thompson, Andrew Garner, Vlatko Vedral and Mile Gu
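
The classical/quantum gap above can be made concrete for a generic two-state Markov process (the 1D nearest-neighbour Ising chain induces such a process over successive spins): classically the memory cost is the entropy of the causal states, while the quantum construction of Gu et al. stores overlapping, non-orthogonal memory states. The function name complexities and the transition probabilities (0.3, 0.7) below are illustrative, not the talk's results.

    import numpy as np

    def shannon(p):
        p = np.array([q for q in p if q > 0])
        return float(-(p * np.log2(p)).sum())

    def complexities(p01, p11):
        """Classical vs quantum statistical complexity for a binary Markov
        process with P(next=1 | last=0) = p01 and P(next=1 | last=1) = p11."""
        # Stationary distribution over the two causal states (= last symbol).
        pi0 = (1 - p11) / ((1 - p11) + p01)
        pi = np.array([pi0, 1 - pi0])
        c_mu = shannon(pi)                      # classical: entropy of causal states
        # Quantum memory states: the emitted symbol doubles as the next causal
        # state, so each state is a real 2-vector of amplitudes.
        s0 = np.array([np.sqrt(1 - p01), np.sqrt(p01)])
        s1 = np.array([np.sqrt(1 - p11), np.sqrt(p11)])
        rho = pi[0] * np.outer(s0, s0) + pi[1] * np.outer(s1, s1)
        c_q = shannon(np.linalg.eigvalsh(rho))  # quantum: von Neumann entropy
        return c_mu, c_q

    print(complexities(0.3, 0.7))  # Cq < C_mu whenever the memory states overlap
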
45011 Occam's Quantum Strop
Abstract: A stochastic process's statistical complexity stands out as a fundamental property: the minimum information required to synchronize one process generator to another. How much information is required, though, when synchronizing over a quantum channel? Recent work demonstrated that representing causal similarity as quantum state-indistinguishability provides a quantum advantage. We generalize this to synchronization and offer a sequence of constructions that exploit extended causal structures, finding a substantial increase of the quantum advantage. We demonstrate that maximum compression is determined by the process's cryptic order---a classical, topological property closely allied to Markov order, itself a measure of historical dependence. We introduce an efficient algorithm that computes the quantum advantage and close by noting that the advantage comes at a cost---one trades off prediction for generation complexity.
John Mahoney, James Crutchfield and Cina Aghamohammadi
45012 Classification of time-symmetry breaking in quantum walks on graphs
Abstract: Quantum walks on graphs represent an established model capturing essential physics behind a host of natural and synthetic phenomena. Quantum walks have been proven to provide a universal model of quantum computation and have been shown to capture the core underlying physics of several biological processes in which quantum effects play a central role. A 'single-particle quantum walker' moves on a graph, with dynamics governed by Schrödinger's equation, and quantum theory predicts the probability of the walker transitioning between the graph's nodes. Any quantum process in finite dimensions can be viewed as a single-particle quantum walk. Until recently, quantum walks implicitly modeled only probability transition rates between nodes that were symmetric under time inversion. Breaking this time-reversal symmetry provides a new arena in which to consider applications of the symmetry breaking and to better understand its foundations. The main application discovered so far is that this symmetry breaking can be utilized as a passive means to control and direct quantum transport. A subtle interplay between the assignment of complex Hamiltonian edge weights and the geometry of the underlying network has emerged in a sequence of studies. This interplay has been central to several works, but in the absence of definitive statements, past work has only produced criteria for a process on a graph to be time-symmetric, leaving the classification problem and its implications open. Here we provide a full classification of the Hamiltonians which enable the breaking of time-reversal symmetry in their induced transition probabilities. Our results are furthermore proven in terms of the geometry of the corresponding Hamiltonian support graph. We find that the effect can only be present if the underlying support graph is not bipartite, whereas certain bipartite graphs give rise to transition-probability suppression, but not broken time-reversal symmetry. These results fill an important gap in understanding the role this omnipresent effect plays in quantum information science.
Jacob Turner and Jacob Biamonte
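
The bipartite/non-bipartite dichotomy above can be probed numerically: give edges complex phases and compare the site-to-site transition probabilities in both directions. In the sketch below, a triangle (non-bipartite) with a complex edge phase shows direction-dependent probabilities, while the same phase on a 3-node path (bipartite, so the phase can be gauged away) does not; the specific graphs, phase and time are illustrative choices.

    import numpy as np
    from scipy.linalg import expm

    def transition_probs(H, t, i, j):
        """(P_{i->j}, P_{j->i}) for a continuous-time quantum walk exp(-iHt)."""
        U = expm(-1j * H * t)
        return abs(U[j, i]) ** 2, abs(U[i, j]) ** 2

    phi = np.pi / 2
    # Triangle (odd cycle, non-bipartite) with a complex edge phase.
    tri = np.array([[0, np.exp(1j * phi), 1],
                    [np.exp(-1j * phi), 0, 1],
                    [1, 1, 0]], dtype=complex)
    # Path on 3 nodes (bipartite) with the same phase on one edge.
    path = np.array([[0, np.exp(1j * phi), 0],
                     [np.exp(-1j * phi), 0, 1],
                     [0, 1, 0]], dtype=complex)

    print(transition_probs(tri, 1.0, 0, 1))   # unequal: time-reversal symmetry broken
    print(transition_probs(path, 1.0, 0, 1))  # equal: phase is gauged away on a tree
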
45008 The Ambiguity of Simplicity
Abstract: A system’s apparent simplicity depends on whether it is represented classically or quantally. This is not so surprising, as classical and quantum physics are descriptive frameworks built on different assumptions that capture, emphasize, and express different properties and mechanisms. What is surprising is that, as we demonstrate, simplicity is ambiguous: the relative simplicity between two systems can change sign when moving between classical and quantum descriptions. Thus, notions of absolute physical simplicity—minimal structure or memory—at best form a partial, not a total, order. This suggests that appeals to principles of physical simplicity, via Ockham’s Razor or to the “elegance” of competing theories, may be fundamentally subjective, perhaps even beyond the purview of physics itself. It also presents a challenge to using quantum computers for statistical inference. Fortunately, experiments are now beginning to probe measures of simplicity, creating the potential to directly test for ambiguity.
James Crutchfield, Cina Aghamohammadi and John Mahoney
45007 Increasing excess entropy in the approach towards equilibrium in a reversible Ising dynamics model
Abstract: Dynamic models of spin systems, based on microscopic reversibility and conserved energy, have been used for simulation of the Ising model and the approach towards equilibrium. Equilibrium is here characterised by the set of distributions over configurations of finite blocks of spins, of increasing size, that maximises the entropy density under the given energy constraint. Since the entropy density is also conserved under such dynamics, a natural question is whether, and how, full equilibrium is actually reached. We investigate this question in detail through an information-theoretic analysis of the one-dimensional, infinite-lattice version of the Q2R model, whose two-dimensional version has been extensively used for simulation of the two-dimensional Ising model. Starting from a low-entropy state with appropriate statistics, we show that, despite the conserved entropy, entropy estimated from finite-size block statistics appears to increase, and the equilibrium state for the given energy is approached. By showing how the excess entropy increases during this process, we clarify how local order is transformed into correlation information over increasing distances, explaining the apparent entropy increase and the approach towards equilibrium. The findings are discussed in the broader context of reversible microscopic dynamics, macroscopic irreversibility, and the second law of thermodynamics.
Kristian Lindgren and Eckehard Olbrich
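
The block-entropy machinery used above can be sketched in a few lines: estimate block entropies H(L) from a symbol sequence, read off the finite-L entropy-rate estimates h(L) = H(L) - H(L-1), and sum their excess over the asymptotic rate to get the excess entropy. The function names, the cutoff Lmax=12 and the noisy period-2 toy sequence (standing in for Q2R spin configurations) are illustrative assumptions.

    import numpy as np
    from collections import Counter

    def block_entropy(seq, L):
        """Shannon entropy (bits) of the empirical length-L block distribution."""
        counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
        n = sum(counts.values())
        p = np.array(list(counts.values())) / n
        return float(-(p * np.log2(p)).sum())

    def excess_entropy(seq, Lmax=12):
        """E = sum_L (h(L) - h), with h(L) = H(L) - H(L-1) and h approximated
        by h(Lmax)."""
        H = [0.0] + [block_entropy(seq, L) for L in range(1, Lmax + 1)]
        hL = np.diff(H)
        h = hL[-1]
        return float(np.sum(hL - h)), float(h)

    # Example: a noisy period-2 sequence has h > 0 and nonzero excess entropy.
    rng = np.random.default_rng(0)
    seq = [(i % 2) ^ (rng.random() < 0.05) for i in range(200000)]
    E, h = excess_entropy(seq)
    print(E, h)
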
45005 Mutual information reveals lower-level mechanisms, aiding higher-level predictability
Abstract: I will present some recent work on Shannon information theory applied to natural complex systems. As part of this we have developed a new correlation length function based on mutual information. I will discuss how it aids predictability of complex systems by detecting underlying mechanisms of change. One example I will present is a glass former, and I will discuss how this far-from-equilibrium many-particle system is representative of many other complex systems.
Karoline Wiesner
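
A minimal version of a mutual-information correlation function, in the spirit of the correlation length mentioned above: compute I(r) between symbols r sites apart and read an information-based correlation length off its decay. The function name mi_at_distance, the flip probability eps, and the binary Markov chain (whose MI decays roughly as (1-2*eps)^(2r)) are stand-ins, not the glass-former data from the talk.

    import numpy as np

    def mi_at_distance(s, r):
        """Mutual information (bits) between binary symbols r sites apart."""
        a, b = s[:-r], s[r:]
        joint = np.histogram2d(a, b, bins=[2, 2])[0]
        p = joint / joint.sum()
        px, py = p.sum(1), p.sum(0)
        nz = p > 0
        return float((p[nz] * np.log2(p[nz] / np.outer(px, py)[nz])).sum())

    # Binary Markov chain with flip probability eps.
    eps = 0.1
    rng = np.random.default_rng(0)
    flips = rng.random(500000) < eps
    s = np.cumsum(flips) % 2
    for r in (1, 2, 4, 8, 16):
        print(r, mi_at_distance(s, r))  # decays with r; fit gives the correlation length
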
45013 Unbounded memory advantage in stochastic simulation using quantum mechanics
Abstract: Simulations using real quantities on a digital computer require a trade-off between the precision to which these quantities are stored and the memory required to store them. The achievable precision of a simulation is hence limited by the internal memory available to the simulator. In this presentation, I shall demonstrate, using tools from statistical complexity theory and its quantum extensions, that quantum information processing allows the simulation of stochastic processes to arbitrarily high precision at a finite memory cost. This demonstrates the unbounded memory advantage that a quantum computer can exhibit over its best possible classical counterpart when used for stochastic simulations.
Andrew Garner, Qing Liu, Jayne Thompson, Vlatko Vedral and Mile Gu