Foundations & Economics (FE) Session 1

Time and Date: 14:15 - 15:45 on 19th Sep 2016

Room: H - Ontvangkamer

Chair: Jose Mateos

372 Collaborations Across Economic and Scientific Domains
Abstract: In this work we develop an agent-based model of strategic link formation in collaboration networks. In our representation, such networks are composed of agents (e.g. scientists or companies) connected by collaborative links that allow knowledge flow. Previous research has found that the growth of collaboration networks is driven by mechanisms that are both dependent on (endogenous) and independent of (exogenous) existing collaboration patterns. Our model includes both mechanisms and allows their relative importance to be tuned, permitting an investigation of their effects and of the relation between them. In addition, the model contains ingredients derived from previous empirical observations, such as the heterogeneous propensity of agents to initiate collaborations and the presence of circles of influence, i.e. clusters of agents sharing a given membership attribute. We first test our model against a dataset listing around 15,000 inter-firm Research and Development (R&D) alliances (Thomson Reuters SDC Platinum), and then extend our validation to a large dataset of co-authored studies in scientific disciplines, listing over 73,000 papers. Remarkably, by fitting only three macroscopic network properties through a Maximum Likelihood approach, our model reproduces a number of microscopic measures characterizing the network topology. Furthermore, by quantifying the link probabilities from/to new-entrant and established agents, we find that the endogenous mechanisms predominate over the exogenous ones in the formation of new collaborations. This result holds for both R&D and co-authorship networks, highlighting a strong similarity between the two. At the same time, we detect a notable difference: in the R&D domain, newcomers tend to enter the network by forming links with established agents, while in the co-authorship domain newcomers prefer to link with other newcomers. We argue that this difference is due to higher entry barriers in economic systems than in academia.
Mario V. Tomasello, Giacomo Vaccario and Frank Schweitzer
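
The abstract does not spell out the model's update rules, so the following is only a minimal sketch of the general mechanism mix under stated assumptions: an "endogenous" move picks a partner in proportion to current degree (network-dependent), an "exogenous" move picks a partner sharing a circle-of-influence attribute (network-independent), and alpha tunes their relative importance. All names and default values are illustrative, not the authors' calibrated model.

```python
import random
from collections import defaultdict

def grow_collaboration_network(n_agents=500, n_links=2000, alpha=0.7,
                               n_circles=10, seed=42):
    """Toy growth process mixing an endogenous rule (degree-proportional
    partner choice) with an exogenous one (partner sharing a 'circle of
    influence' attribute). alpha tunes their relative importance."""
    rng = random.Random(seed)
    circle = [rng.randrange(n_circles) for _ in range(n_agents)]  # membership attribute
    degree = defaultdict(int)
    edges = set()
    while len(edges) < n_links:
        i = rng.randrange(n_agents)  # initiator (heterogeneous propensities omitted here)
        if rng.random() < alpha and degree:
            # endogenous: partner chosen proportionally to current degree
            nodes, weights = zip(*degree.items())
            j = rng.choices(nodes, weights=weights)[0]
        else:
            # exogenous: partner drawn from the initiator's circle, ignoring the network
            candidates = [k for k in range(n_agents) if circle[k] == circle[i] and k != i]
            if not candidates:
                continue
            j = rng.choice(candidates)
        if i != j and frozenset((i, j)) not in edges:
            edges.add(frozenset((i, j)))
            degree[i] += 1
            degree[j] += 1
    return edges
```
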
356 Predicting link dissolution in an interfirm buyer-seller network
Abstract: The evolution of complex networks is driven by nodes constantly forming and dissolving links. While link formation prediction has attracted a lot of attention, its counterpart, link dissolution prediction, is a less studied problem. In particular, link dissolution prediction lacks a versatile method: many previous studies rely on domain-specific information that cannot be obtained in other settings. In this presentation we describe a versatile prediction method that focuses on the dynamics of groups (i.e. blocks, communities) instead of the dynamics of individual nodes. Our method is tested on interfirm buyer-seller network data, where link dissolution is extremely important: losing a link means losing a customer, and is hence directly related to the survival of a node (i.e. a firm). We show that our method improves predictive accuracy over state-of-the-art methods. We also confirm a previous finding that link dissolution prediction is a more difficult task than link formation prediction.
Ryohei Hisano
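
The talk describes its method only at the level of group dynamics, so the sketch below is a hedged stand-in for the core idea rather than the authors' estimator: measure historical dissolution rates per block pair from two snapshots, then rank current links by the rate of their block pair. Block assignments are assumed given (e.g. from a stochastic block model).

```python
from collections import Counter

def block_dissolution_rates(edges_t0, edges_t1, block):
    """For each (block, block) pair, the fraction of directed links present
    at t0 that have disappeared by t1. edges_* are sets of (u, v) pairs."""
    present, dissolved = Counter(), Counter()
    for (u, v) in edges_t0:
        key = (block[u], block[v])
        present[key] += 1
        if (u, v) not in edges_t1:
            dissolved[key] += 1
    return {key: dissolved[key] / present[key] for key in present}

def rank_links_by_risk(edges_now, block, rates, default=0.0):
    """Score each current link by the historical dissolution rate of its
    block pair: group-level dynamics stand in for node-level features."""
    return sorted(edges_now,
                  key=lambda e: rates.get((block[e[0]], block[e[1]]), default),
                  reverse=True)
```
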
182 Systemic Risk Assessment and Endogenous Valuation in Financial Networks
Abstract: The financial crisis has shown how interconnections among financial institutions can lead to large amplifications of small shocks. After the crisis, consensus has emerged among practitioners and policymakers that a large fraction of losses in the banking system was due to the endogenous revaluation of inter-financial claims. This accounting principle, known as Credit Valuation Adjustment (CVA), implies that banks revalue their assets to reflect the deterioration in creditworthiness of counterparties. In contrast, most current work on stress-testing routinely claims that default contagion through interbank exposures is unlikely and negligible. In order to address this paradox, we develop a general network-based stress-testing framework to compare, both analytically and empirically, the five most popular financial distress propagation models. Analytically, we show that the widely adopted Eisenberg-Noe model (and subsequent extensions) fails to account for CVA and leads to severe underestimation of financial distress, as we prove by exhibiting upper bounds on contagion losses. Furthermore, we prove that in the Eisenberg-Noe model the banking system, despite being modeled as a network, is essentially treated as a single aggregate entity, thus ruling out the possibility for complexity to emerge. On the other hand, models that implement CVA, such as the DebtRank algorithm, are shown to account for larger systemic vulnerabilities. These relations among the models are illustrated empirically on a dataset of the top 50 EU banks between January 1st, 2006 and September 30th, 2015, by comparing the models' behavior across several dimensions, i.e. magnitude of shocks, type of shocked asset and recovery rates. Furthermore, our results indicate that the DebtRank algorithm makes it possible to identify build-ups of vulnerability in the periods immediately preceding crises. Exploiting this feature, we use daily market capitalization data to implement a quasi real-time assessment of systemic vulnerability in the EU banking system.
Stefano Battiston, Marco D'Errico and Gabriele Visentin
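
DebtRank itself is published (Battiston et al., 2012); the sketch below is a compact reading of its basic iteration, in which each distressed bank propagates exactly once and distress is capped at full equity loss. The exposure orientation and shock convention are my assumptions, not a definitive implementation of the talk's framework.

```python
import numpy as np

def debtrank(A, E, v, shocked, psi=1.0):
    """Basic DebtRank-style iteration. A[i, j]: exposure of bank i toward
    bank j; E[i]: equity of i; v[i]: relative economic value of i;
    shocked: indices of initially distressed banks; psi: initial shock as a
    fraction of equity. Returns the value-weighted distress generated
    beyond the initial shock."""
    A, E, v = (np.asarray(x, dtype=float) for x in (A, E, v))
    n = len(E)
    W = np.minimum(1.0, A / E[:, None])   # distress of j hits i via i's exposure to j
    h = np.zeros(n)                       # distress levels in [0, 1]
    h[list(shocked)] = psi
    active, propagated = set(shocked), set()
    while active:
        impact = np.zeros(n)
        for j in active:
            impact += W[:, j] * h[j]      # each node propagates exactly once (no echoes)
        propagated |= active
        h = np.minimum(1.0, h + impact)
        active = {i for i in range(n) if h[i] > 0 and i not in propagated}
    return float(h @ v - psi * v[list(shocked)].sum())
```
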
197 Enhanced extraction of weighted network backbones
Abstract: The problem of information filtering in weighted complex networks is crucial for a better understanding of the relevant features of a system. Indeed, a network characterized by high edge density may be unmanageable with the usual tools, such as community detection algorithms or visualization techniques. A common solution to this issue is the application of a threshold on the edge weights; however, this procedure introduces a typical scale into a heterogeneous system, thereby destroying its multiscale nature. In this context, we have developed a technique to prune non-significant links in a weighted network. The filter assigns a p-value to each edge of the graph, based on the compatibility of the observed link with its expected value under a given null model. Here we consider the so-called Enhanced Configuration Model (ECM). This null model preserves both the degree and strength sequences of the empirical weighted network, and the resulting filtering technique (henceforth the ECM filter) is therefore more refined than other edge sparsification methods based on strengths only. We have applied our filter to the International Trade Network, i.e. the network of import-export relations between countries, from both a temporal and a multiplex perspective. We have compared the outcome with the results obtained from previously defined pruning techniques and studied how various structural properties change when different levels of significance (p-values) are considered. Furthermore, we have analysed other infrastructural and biological systems, showing that the ECM filter is suitable for a broad range of networks displaying different levels of weight heterogeneity and edge density. We have shown that our method correctly extracts the backbone of the considered weighted networks and is also computationally efficient.
Valerio Gemmetto, Alessio Cardillo and Diego Garlaschelli
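
Solving the ECM for its Lagrange multipliers is beyond a short example, so the sketch below shows only the generic filtering step the abstract describes: each edge receives a p-value under a null model and survives only if significant. The function toy_exponential_null is a deliberately crude stand-in for the ECM null, included just to make the sketch runnable.

```python
import math

def filter_backbone(edges, pvalue, alpha=0.01):
    """Keep edge (u, v, w) only if its weight is statistically incompatible
    with the null model: pvalue(u, v, w) = P[weight >= w] under the null.
    For the ECM filter this probability would come from the Enhanced
    Configuration Model, which preserves both degrees and strengths."""
    return [(u, v, w) for (u, v, w) in edges if pvalue(u, v, w) <= alpha]

def toy_exponential_null(mean_w):
    """Toy stand-in null, NOT the ECM: weights exponential with fixed mean,
    so P[weight >= w] = exp(-w / mean_w)."""
    return lambda u, v, w: math.exp(-w / mean_w)

# keeps the 9.0 edge (p ~ 1e-4), prunes the 0.5 edge (p ~ 0.61)
backbone = filter_backbone([("a", "b", 9.0), ("b", "c", 0.5)],
                           toy_exponential_null(mean_w=1.0), alpha=0.01)
```
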
573 Modeling the price dynamics of a stock market: stability, market efficiency, and information processing
Abstract: The question of what drives stock price movements is fundamental in the theory of financial markets, and has profound implications for forecasting and managing crises, as well as for foundational economic issues. We describe a theoretical and empirical study of stock market returns based on linear response theory, and propose a general model of return dynamics based on agent behavior that accurately describes the return responses observed in real-world markets. We assume a simple set of dynamical equations and use methods from statistical physics to uncover distinct “calm” regimes, where profits and losses are relatively small, and “frantic” regimes, where bubbles or crashes may occur. These regimes are separated by a phase transition, controlled by an empirically accessible model parameter that we introduce. We observe evidence of this transition in the returns of a wide range of financial institutions across multiple time periods. Moreover, we introduce several techniques for modeling and understanding the mechanisms by which a market may digest external information, and identify key parameters which encode both the response to outside signals and the large-scale stability of the market as a whole. This allows us to directly address, theoretically and empirically, important questions such as market efficiency, and to propose a connection between efficient or inefficient market operation and overall systemic stability. To test the influence of news on market behavior, we use the large English-language news dataset underlying the NewStream platform (http://newstream.ijs.si/), and develop machine learning methods to identify exactly which subsets of the news data influence market operation. Our work offers a novel theoretical framework for probing how markets respond to and digest external news and how this process may result in instabilities, and it identifies specific parameters which may serve as early-warning tools for detecting system-wide dynamics that lead to crashes.
Asher Mullokandov, Nima Dehmamy, Igor Mozetic, Petra Kralj Novak, Irena Vodenska, Gene Stanley, Sebastian Gemsheim and Jose Morales
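
The abstract gives no equations, so the following is not the authors' model; it is a standard linear-response diagnostic from the market microstructure literature, included only to illustrate the kind of response quantity such a study measures.

```python
import numpy as np

def response_function(returns, max_lag):
    """Empirical response R(tau) = E[sign(r_t) * r_{t+tau}]: how an initial
    up/down move is followed, on average, tau steps later. A slowly decaying
    R is one simple symptom of inefficient digestion of information."""
    r = np.asarray(returns, dtype=float)
    s = np.sign(r)
    return np.array([np.mean(s[: len(r) - tau] * r[tau:])
                     for tau in range(1, max_lag + 1)])
```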

Foundations & Economics (FE) Session 2

Time and Date: 16:15 - 18:00 on 19th Sep 2016

Room: D - Verwey kamer

Chair: Silvia Bartolucci

325 Systemic risk in multiplex networks with asymmetric coupling and threshold feedback
Abstract: We study cascades on a two-layer multiplex network, with asymmetric feedback that depends on the coupling strength between the layers. Based on an analytical branching-process approximation, we calculate the systemic risk, measured by the final fraction of failed nodes on a reference layer. The results are compared with the case of a single-layer network that is an aggregated representation of the two layers. We find that systemic risk in the two-layer network is smaller than in the aggregated one only if the coupling strength between the two layers is small. Above a critical coupling strength, systemic risk is increased because of the mutual amplification of cascades in the two layers. We even observe sharp phase transitions in the cascade size that are less pronounced on the aggregated layer. Our insights can be applied to a scenario where firms decide whether to split their business into a less risky core business (layer A) and a more risky subsidiary business (layer B). In such a setting, a failure (or bankruptcy) on the core layer implies a failure on the subsidiary layer as well, since the failed firm is out of business. On the other hand, a failure on the subsidiary layer only lowers a firm's failure threshold on the core layer and thus increases its absolute failure probability in the core business. We show that in most cases this kind of business diversification may lead to a drastic increase in systemic risk, which is underestimated in an aggregated approach.
Rebekka Burkholz, Matt V. Leduc, Antonios Garas and Frank Schweitzer
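
A minimal simulation sketch of the scenario just described, under simple assumptions: fractional-threshold failures on each layer, core failure forcing subsidiary failure, and subsidiary failure lowering the core threshold by a fixed `coupling`. The authors' branching-process analysis is not reproduced here.

```python
def two_layer_cascade(neigh_A, neigh_B, theta_A, theta_B, seeds, coupling=0.5):
    """Threshold cascade on a two-layer multiplex (core A, subsidiary B) with
    asymmetric coupling. neigh_*[i]: list of i's neighbors on that layer;
    theta_*[i]: failure thresholds; seeds: initially failed firms.
    A node fails on a layer when the fraction of failed neighbors on that
    layer exceeds its (current) threshold. Returns the failed fraction on A."""
    n = len(theta_A)
    failed_A, failed_B = set(seeds), set(seeds)   # core failure kills subsidiary too
    th_A = list(theta_A)                          # core thresholds, lowered by feedback
    changed = True
    while changed:
        changed = False
        for i in range(n):
            if i not in failed_B and neigh_B[i]:
                if sum(j in failed_B for j in neigh_B[i]) / len(neigh_B[i]) > theta_B[i]:
                    failed_B.add(i)
                    th_A[i] -= coupling           # feedback: B distress weakens core business
                    changed = True
            if i not in failed_A and neigh_A[i]:
                if sum(j in failed_A for j in neigh_A[i]) / len(neigh_A[i]) > th_A[i]:
                    failed_A.add(i)
                    failed_B.add(i)               # core failure implies subsidiary failure
                    changed = True
    return len(failed_A) / n
```
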
215 Detecting early signs of the 2007 crisis in world trade
Abstract: Since 2007, several contributions have tried to identify early-warning signals of the financial crisis. However, the vast majority of analyses have focused on financial systems, and little theoretical work has been done on their economic counterpart. In the present paper we fill this gap and employ the tools of network theory to shed light on the response of world trade to the financial crisis of 2007 and the economic recession of 2008-2009. We have explored the evolution of the bipartite World Trade Web (WTW) across the years 1995-2010, monitoring the behavior of the system both before and after 2007. Our analysis shows early structural changes in the WTW topology: since 2003, the WTW has become more and more compatible with the picture of a network in which correlations between countries and products are progressively lost. Moreover, the structural modification of the WTW can be considered concluded in 2010, after a seemingly stationary phase of three years. We have also refined our analysis by considering specific subsets of countries and products: the most statistically significant early-warning signals are provided by the most volatile macrosectors, especially when measured on emerging economies, suggesting the latter as the most sensitive indicators of the WTW's health.
Fabio Saracco, Riccardo Di Clemente, Andrea Gabrielli and Tiziano Squartini
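
The authors' analysis relies on null models of the bipartite WTW; the sketch below is only a simplified proxy for the statement that country-product correlations are progressively lost: it compares the mean correlation between country export profiles with the same quantity after row shuffling, which preserves each country's diversification while destroying correlations.

```python
import numpy as np

def correlation_signal(M, rng=np.random.default_rng(0)):
    """Simplified early-warning proxy for a binary country x product matrix M
    of one year. Values near zero mean the year resembles a correlation-free
    random network; tracking this over years shows when structure is lost."""
    def mean_corr(X):
        X = X[X.std(axis=1) > 0]          # drop countries with flat export profiles
        C = np.corrcoef(X)
        iu = np.triu_indices_from(C, k=1)
        return C[iu].mean()
    M_null = np.array([rng.permutation(row) for row in M])  # row sums preserved
    return mean_corr(M) - mean_corr(M_null)
```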
77 The hidden relationship between Zipf’s law and segregation
Abstract: Modern societies often face segregation along lines of race, religion, social status or income. Understanding the rise of this phenomenon has thus attracted a lot of attention from economists, politicians and sociologists. We first introduce a metapopulation version of the Schelling model, showing that for low tolerances a hidden relationship emerges between segregation patterns and the population variability across urban areas. In particular, we observe that the population frequencies of the nodes emerging from the model, once ordered according to the population ranking, follow Zipf’s law. Motivated by this theoretical result, we analyze the internal composition of several metropolitan areas in the US. We show that a universal Zipf’s law is present at the urban scale as well, and that a correlation between population heterogeneity and segregation patterns can be observed. Moreover, we analyze the internal urban preferences of different ethnic groups, using the z-score to identify the over-representation of a given ethnic group in each zip code. We show that density “preferences” can be observed and that the ethnic composition strongly depends on density.
Floriana Gargiulo and Timoteo Carletti
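
The rank-size check the abstract relies on fits in a few lines; the slope convention (close to -1 on log-log axes for Zipf's law) is standard.

```python
import numpy as np

def zipf_exponent(populations):
    """Rank-size check: if populations follow Zipf's law, log(size) versus
    log(rank) is linear with slope ~ -1. Returns the fitted slope."""
    sizes = np.sort(np.asarray(populations, dtype=float))[::-1]  # descending
    ranks = np.arange(1, len(sizes) + 1)
    slope, _ = np.polyfit(np.log(ranks), np.log(sizes), 1)
    return slope
```
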
74 Emergence of giant strongly connected components in continuum disk-spin percolation
Abstract: We propose a continuum model of percolation in two dimensions for overlapping disks with spin. In this model the existence of bonds is determined by the distance between the centers of the disks, and by the scalar product of the (randomly) directed spin with the direction of the vector connecting the centers of neighboring disks. The direction of a single spin is controlled by a “temperature”, representing the amount of polarization of the spins in the direction of an external field. Our model is inspired by biological neuronal networks and aims to characterize their topological properties when axonal guidance plays a major role. We numerically study the phase diagram of the model observing the emergence of a giant strongly connected component, representing the portion of neurons that are causally connected. We provide strong evidence that the critical exponents depend on the temperature.
Francesco Caravelli, Marco Bardoscia and Fabio Caccioli
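
A minimal simulation sketch of the model class described above. The precise bond rule and spin distribution are assumptions (von Mises spins polarized by an inverse-temperature parameter; a bond i -> j when disks overlap and spin_i points toward j), so this reproduces the spirit of the model, not the authors' exact definitions.

```python
import numpy as np
import networkx as nx

def disk_spin_gscc(n=2000, box=1.0, radius=0.05, beta=2.0,
                   rng=np.random.default_rng(1)):
    """Disks at random positions carry spins drawn around an external-field
    direction (polarization controlled by beta, an inverse 'temperature').
    Returns the fraction of nodes in the giant strongly connected component,
    i.e. the causally connected portion of the network."""
    pos = rng.uniform(0, box, size=(n, 2))
    angles = rng.vonmises(mu=0.0, kappa=beta, size=n)   # polarized spin directions
    spins = np.column_stack([np.cos(angles), np.sin(angles)])
    G = nx.DiGraph()
    G.add_nodes_from(range(n))
    for i in range(n):
        d = pos - pos[i]
        dist = np.hypot(d[:, 0], d[:, 1])
        for j in np.nonzero((dist < 2 * radius) & (dist > 0))[0]:
            if spins[i] @ (d[j] / dist[j]) > 0:         # spin aligned with i -> j
                G.add_edge(i, int(j))
    gscc = max(nx.strongly_connected_components(G), key=len)
    return len(gscc) / n
```
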
97 Input-output relationship in social communication datasets characterized by neuronal spike train analysis
Abstract: As human communication datasets become increasingly rich, various approaches have been employed to improve network modeling and to uncover hidden aspects of human dynamics. In particular, records of communication events between individuals at high temporal resolution have enabled us to study the dynamical properties of networks rather than static ones, in the emerging field of temporal networks. We study the dynamical properties of human communication through different channels, i.e. short messages, phone calls, and emails, adopting techniques from neuronal spike train analysis to characterize the temporal fluctuations of successive inter-event times. We measured the so-called local variation (LV) of the incoming and outgoing event sequences of users, a quantity originally designed to characterize temporal fluctuations in spike train data. We found that these in- and out-LV values are positively correlated for short messages, and uncorrelated for phone calls and emails. An important originality of our work is its focus on the relationship between the incoming and outgoing events of social agents and its impact on temporal fluctuations. Similarly to neurons, which receive inputs and integrate them to produce outputs, social agents are subject to incoming messages that may, or may not, trigger reactions. In order to test this idea and to understand the observed LV correlations, we analyzed the response-time distribution in empirical datasets and developed a generalized Hawkes process to model the observed dynamical properties. Numerical simulations of the model indicate that a quick response to incoming events and a refractory effect after outgoing events are key factors in reproducing the positive LV correlations. This investigation of the input-output relationship in human messaging processes may provide important insights into how information flows in human communication.
Takaaki Aoki, Taro Takaguchi, Ryoya Kobayashi and Renaud Lambiotte
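
The local variation is a published statistic from spike train analysis (Shinomoto et al.), computed over consecutive inter-event intervals; in this study it would be evaluated separately on each user's incoming and outgoing event sequences.

```python
def local_variation(event_times):
    """Local variation LV = 3/(n-1) * sum_i ((T_i - T_{i+1}) / (T_i + T_{i+1}))^2
    over the n consecutive inter-event intervals T_i of a sorted event sequence.
    LV ~ 1 for a Poisson process, < 1 for regular trains, > 1 for bursty ones."""
    T = [t2 - t1 for t1, t2 in zip(event_times, event_times[1:])]
    if len(T) < 2:
        raise ValueError("need at least three events")
    terms = [((a - b) / (a + b)) ** 2 for a, b in zip(T, T[1:])]
    return 3.0 * sum(terms) / len(terms)
```
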
418 Natural hazards, individual behavior and non-linearities in market flood risk assessments
Abstract: Floods are among the most frequently occurring disasters worldwide, and cost-benefit analysis is widely applied to assess management strategies. The central number that influences the balance between costs and benefits is the price of flood risk. It is usually assessed either by analyzing market-level data using hedonic pricing or by eliciting individual willingness to pay to avoid risk through surveys. The two approaches constitute two different paradigms (micro vs. macro) and often produce contrasting results in pricing flood risks. The assessment is also sensitive to the timing relative to the shock. At times of natural disasters, price trends in flood-prone areas shift significantly and abruptly, implying systemic changes in property markets. On the one hand, this means that past transactions may no longer be representative when making current price assessments or projections for the future. On the other hand, it is essential to trace the link between individual risk perceptions and macro-level market outcomes, as the former fuel these structural market shifts. This calls for new computational methods for assessing capital-at-risk and its fluctuations as shocks occur and markets aggregate individual reactions to these natural hazards. We present an agent-based model of a housing market covering flood-prone areas, in which we not only use the most recent sales in market price predictions, but also explicitly examine the evolution of housing prices (and consequently the price of flood risk) emerging from the interactions of heterogeneous household agents with various individual representations of risk perception. We compare market outcomes under three common behavioral models: expected utility, prospect theory and risk negligence. Our results demonstrate a non-linearity between agents' individual risk perceptions and the aggregated price discount, which uncovers the nature of the gap between the two measurement approaches.
Tatiana Filatova and Koen de Koning
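
A stylized sketch of how the three behavioral models named above could price the same hazard. The functional forms and the classic Tversky-Kahneman parameter values are placeholders; the valuation rules inside the authors' agent-based model are not reproduced here.

```python
def flood_risk_discount(p, loss, model="expected_utility",
                        alpha=0.88, lam=2.25, gamma=0.69):
    """Per-period price discount a household attaches to flood risk.
    p: annual flood probability; loss: damage if flooded. Parameters are
    Tversky-Kahneman estimates, used only as illustrative defaults."""
    if model == "risk_negligence":
        return 0.0                         # hazard ignored entirely
    if model == "expected_utility":
        return p * loss                    # risk-neutral EU reduces to expected damage
    if model == "prospect_theory":
        # overweighting of small probabilities plus loss aversion
        w = p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)
        return lam * w * loss**alpha
    raise ValueError(model)
```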