Foundations & Economics (FE) Session 1
Time and Date: 14:15 - 15:45 on 19th Sep 2016
Room: H - Ontvangkamer
Chair: Jose Mateos
|372|| Collaborations Across Economic and Scientific Domains
Abstract: In this work we develop an agent-based model of strategic link formation in collaboration networks. In our representation, such networks are composed of agents (e.g. scientists or companies) connected by collaborative links that allow knowledge to flow. Previous research has found that the growth of collaboration networks is driven by mechanisms that are both dependent on (endogenous) and independent of (exogenous) collaboration patterns. Our model includes both mechanisms and allows their relative importance to be tuned, permitting an investigation of their effects and interplay. In addition, the model contains ingredients derived from previous empirical observations, such as the heterogeneous propensity of agents to initiate collaborations and the presence of circles of influence, i.e. clusters of agents sharing a given membership attribute. We first test our model against a dataset listing around 15,000 inter-firm Research and Development (R&D) alliances (Thomson Reuters SDC Platinum), and then extend our validation to a large dataset of co-authored studies across scientific disciplines, listing over 73,000 papers. Remarkably, by fitting only three macroscopic network properties through a Maximum Likelihood approach, our model reproduces a number of microscopic measures characterizing the network topology. Furthermore, by quantifying the link probabilities from/to new-entrant and established agents, we find that the endogenous mechanisms are predominant over the exogenous ones in the formation of new collaborations. This result holds for both R&D and co-authorship networks, highlighting a strong similarity between them. At the same time, we detect a notable difference between the two systems: in the R&D domain, newcomers tend to enter the network by forming links with established agents, while in the co-authorship domain newcomers prefer to link with other newcomers.
We argue that this difference is due to higher entry barriers in economic systems than in academia.
|Mario V. Tomasello, Giacomo Vaccario and Frank Schweitzer|
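The competition between collaboration-driven (endogenous) and collaboration-independent (exogenous) growth described in this abstract can be illustrated with a minimal toy simulation. This is a sketch under stated assumptions, not the authors' calibrated model: the mixing parameter `alpha`, the use of preferential attachment as the endogenous rule, and uniform partner choice as the exogenous rule are all simplifications introduced here.

```python
import random

def grow_network(n_steps, alpha, seed=42):
    """Toy collaboration-network growth.

    At each step a new agent joins and forms one link: with
    probability alpha it attaches preferentially to high-degree
    agents (endogenous mechanism, sampled via a uniform edge end);
    otherwise it picks a partner uniformly at random (exogenous
    mechanism). Returns the final degree sequence as a dict.
    """
    rng = random.Random(seed)
    edges = [(0, 1)]            # seed network: two agents, one link
    degree = {0: 1, 1: 1}
    for new in range(2, n_steps + 2):
        if rng.random() < alpha:
            # endogenous: choosing a uniform end of a uniform edge
            # is proportional to degree (preferential attachment)
            partner = rng.choice(rng.choice(edges))
        else:
            # exogenous: uniform choice among existing agents
            partner = rng.choice(list(degree))
        edges.append((new, partner))
        degree[new] = 1
        degree[partner] += 1
    return degree

deg = grow_network(1000, alpha=0.9)
print(max(deg.values()))  # large hubs emerge when alpha is high
```

Sweeping `alpha` between 0 and 1 interpolates between a narrow, random-recruitment degree distribution and a heavy-tailed one, which is the kind of relative-importance tuning the abstract refers to.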
|356|| Predicting link dissolution in an interfirm buyer-seller network
Abstract: The evolution of complex networks is driven by nodes constantly forming and dissolving links. While link formation prediction has attracted much attention, its counterpart, link dissolution prediction, is a less studied problem. In particular, link dissolution prediction lacks a versatile prediction method: many previous studies rely on domain-specific information that is unavailable in other settings. In this presentation we present a versatile prediction method that focuses on the dynamics of groups (i.e. blocks, communities) instead of the dynamics of individual nodes. Our method is tested on an interfirm buyer-seller network, where link dissolution is extremely important: losing a link implies losing a customer and is hence directly related to the survival of a node (i.e. a firm). We show that our method improves predictive accuracy compared to state-of-the-art methods. We also confirm a previous finding that link dissolution prediction is a more difficult task than link formation prediction.
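The group-level idea above can be sketched as follows: estimate, for each pair of blocks, the historical fraction of links that dissolved between two snapshots, then score surviving links by their block pair's rate. This is a hypothetical simplification for illustration, not the authors' actual predictor; the two-snapshot setup and the block labels are assumptions.

```python
from collections import defaultdict

def block_dissolution_scores(edges_t0, edges_t1, block):
    """Toy block-level link-dissolution scorer.

    edges_t0, edges_t1: lists of (u, v) links at two snapshots.
    block: dict mapping each node to its group label.
    Estimates, per unordered block pair, the fraction of links
    present at t0 that disappeared by t1, then scores every link
    still present at t1 by that historical dissolution rate.
    """
    seen = defaultdict(int)
    gone = defaultdict(int)
    t1 = {frozenset(e) for e in edges_t1}
    for u, v in edges_t0:
        pair = frozenset((block[u], block[v]))
        seen[pair] += 1
        if frozenset((u, v)) not in t1:
            gone[pair] += 1
    rate = {p: gone[p] / seen[p] for p in seen}
    return {(u, v): rate.get(frozenset((block[u], block[v])), 0.0)
            for u, v in edges_t1}
```

Links in block pairs with historically high churn get high scores, which is the sense in which group dynamics, rather than per-node features, drive the prediction.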
|182|| Systemic Risk Assessment and Endogenous Valuation in Financial Networks
Abstract: The financial crisis has shown how interconnections among financial institutions can lead to large amplifications of small shocks. After the crisis, consensus emerged among practitioners and policymakers that a large fraction of the losses in the banking system was due to the endogenous revaluation of inter-financial claims. This accounting principle, known as Credit Valuation Adjustment (CVA), implies that banks revalue their assets to reflect the deterioration in the creditworthiness of counterparties. In contrast, most current work on stress-testing routinely claims that default contagion through interbank exposures is unlikely and negligible. To address this paradox, we develop a general network-based stress-testing framework to compare, both analytically and empirically, the five most popular financial distress propagation models. Analytically, we show that the widely adopted Eisenberg-Noe model (and subsequent extensions) fails to account for CVA and leads to severe underestimation of financial distress, as we prove by exhibiting upper bounds on contagion losses. Furthermore, we prove that in the Eisenberg-Noe model the banking system, despite being modeled as a network, is essentially treated as a single aggregate entity, thus ruling out the possibility for complexity to emerge. On the other hand, models that implement CVA, such as the DebtRank algorithm, are shown to account for larger systemic vulnerabilities. These relations among the models are illustrated empirically on a dataset of the top 50 EU banks, between January 1st, 2006 and September 30th, 2015, by comparing the models' behavior across several dimensions, i.e. the magnitude of shocks, the type of shocked asset and the recovery rates. Furthermore, our results indicate that the DebtRank algorithm makes it possible to identify build-ups of vulnerability in periods immediately preceding crises.
Exploiting this feature, we use daily market capitalization data to implement a quasi real-time assessment of systemic vulnerability in the EU banking system.
|Stefano Battiston, Marco D'Errico and Gabriele Visentin|
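The CVA-style propagation that DebtRank implements can be sketched in a few lines: distress increments travel once along each exposure, revaluing claims before any default occurs. This is a minimal sketch of the linear DebtRank-style update, not the authors' full five-model framework; the exposure matrix convention and iteration count are assumptions.

```python
def debtrank_style(W, h0, n_iter=50):
    """Minimal DebtRank-style linear distress propagation (sketch).

    W[i][j]: fraction of bank i's equity exposed to bank j.
    h0: initial relative equity losses in [0, 1] per bank.
    Each round, bank i's distress grows with the *increment* of its
    counterparties' distress, so shocks propagate once per link and
    losses are capped at full equity depletion (h = 1).
    """
    n = len(h0)
    h_prev = [0.0] * n
    h = list(h0)
    for _ in range(n_iter):
        h_next = [min(1.0, h[i] + sum(W[i][j] * (h[j] - h_prev[j])
                                      for j in range(n)))
                  for i in range(n)]
        h_prev, h = h, h_next
    return h
```

Because counterparty distress is marked down immediately rather than only at default, this kind of update yields strictly larger loss estimates than an Eisenberg-Noe-style clearing computation on the same network, which is the underestimation the abstract highlights.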
|197|| Enhanced extraction of weighted network backbones
Abstract: The problem of information filtering in weighted complex networks is crucial for better understanding the relevant features of a system. Indeed, a network characterized by a high edge density may be unmanageable with the usual tools, such as community detection algorithms or visualization techniques. A common solution to this issue is to apply a threshold on the edge weights; however, this procedure introduces a typical scale in a heterogeneous system, thereby destroying its multiscale nature. In this context, we have developed a technique to prune non-significant links in a weighted network. This filter assigns a p-value to each edge of the graph, based on the compatibility of the observed link with its expected value under a given null model. Here we consider the so-called Enhanced Configuration Model (ECM). This null model preserves both the degree and strength sequences of the empirical weighted network, and the resulting filtering technique (henceforth the ECM filter) is therefore more refined than other edge-sparsification methods based on the strengths only. We have applied our filter to the International Trade Network, i.e. the network of import-export relations between countries, from both a temporal and a multiplex perspective. We have compared the outcome with the results obtained from previously defined pruning techniques and studied how various structural properties change when different levels of significance (p-values) are considered. Furthermore, we have analysed other infrastructural and biological systems, showing that the ECM filter is suitable for a broad range of networks displaying different levels of weight heterogeneity and edge density. We have shown that our method correctly extracts the backbone of the considered weighted networks and is also computationally efficient.
|Valerio Gemmetto, Alessio Cardillo and Diego Garlaschelli|
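To make the p-value-per-edge idea concrete, here is the classic disparity filter (Serrano et al.), an example of the strength-only baseline that the ECM filter refines; the ECM itself requires fitting a maximum-entropy null model and is not reproduced here. For a node of degree k and strength s, an edge of weight w receives p = (1 - w/s)^(k-1), and small p marks a locally significant edge.

```python
def disparity_pvalues(adj):
    """Disparity-filter p-values for an undirected weighted graph.

    adj: dict node -> dict neighbour -> weight.
    Each edge is tested from both endpoints; we keep the smaller
    (more significant) p-value, keyed by the sorted node pair.
    Degree-1 nodes give no evidence, so their test returns p = 1.
    """
    pvals = {}
    for u, nbrs in adj.items():
        k = len(nbrs)
        s = sum(nbrs.values())
        for v, w in nbrs.items():
            p = (1.0 - w / s) ** (k - 1) if k > 1 else 1.0
            key = tuple(sorted((u, v)))
            pvals[key] = min(pvals.get(key, 1.0), p)
    return pvals

adj = {'u': {'v': 9.0, 'x': 1.0}, 'v': {'u': 9.0}, 'x': {'u': 1.0}}
print(disparity_pvalues(adj))  # the heavy edge (u, v) gets the low p-value
```

The backbone is then the set of edges with p below a chosen significance level; varying that level recovers the multiscale structure that a global weight threshold would destroy.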
|573|| Modeling the price dynamics of a stock market: stability, market efficiency, and information processing
Abstract: The question of what drives stock price movements is fundamental to the theory of financial markets, and has profound implications for forecasting and managing crises, as well as for foundational economic issues. We describe a theoretical and empirical study of stock market returns based on linear response theory, and propose a general model of return dynamics, based on agent behavior, that accurately describes the return responses observed in real-world markets. We assume a simple set of dynamical equations and use methods from statistical physics to uncover distinct “calm” regimes, where profits and losses are relatively small, and “frantic” regimes, where bubbles or crashes may occur. These regimes are separated by a phase transition, controlled by an empirically accessible model parameter that we introduce. We observe evidence of this transition in the returns of a wide range of financial institutions across multiple time periods. Moreover, we introduce several techniques for modeling and understanding the mechanisms by which a market may digest external information, and identify key parameters which encode both the response to outside signals and the large-scale stability of the market as a whole. This allows us to address directly, both theoretically and empirically, important questions such as market efficiency, and to propose a connection between efficient or inefficient market operation and overall systemic stability. To test the influence of news on market behavior we use the large English-language news dataset underlying the NewStream platform (http://newstream.ijs.si/), and develop machine learning methods to identify exactly which subsets of the news data influence market operation.
Our work offers a novel theoretical framework for probing how markets respond to and digest external news and how this process may result in instabilities, and it identifies specific parameters which may serve as an early-warning tool for detecting the system-wide dynamics that lead to crashes.
|Asher Mullokandov, Nima Dehmamy, Igor Mozetic, Petra Kralj Novak, Irena Vodenska, Gene Stanley, Sebastian Gemsheim and Jose Morales|
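A calm/frantic transition controlled by a single parameter can be illustrated with a toy feedback map for returns. This is a hypothetical one-line model chosen for illustration only, not the authors' dynamical equations: the update r_{t+1} = g*tanh(r_t) + noise and the control parameter g are assumptions. For g < 1 the zero-return state is stable and returns fluctuate mildly around it; for g > 1 it loses stability and returns lock into large excursions, a caricature of bubbles and crashes.

```python
import math
import random

def simulate_returns(g, n=5000, noise=0.05, seed=0):
    """Toy return dynamics r_{t+1} = g * tanh(r_t) + Gaussian noise.

    g acts as the control parameter: the fixed point r = 0 is
    linearly stable for g < 1 (calm regime) and unstable for g > 1,
    where the dynamics settle near a large nonzero fixed point
    (frantic regime). Returns the simulated return series.
    """
    rng = random.Random(seed)
    r = 0.0
    out = []
    for _ in range(n):
        r = g * math.tanh(r) + rng.gauss(0.0, noise)
        out.append(r)
    return out

calm = simulate_returns(0.5)
frantic = simulate_returns(1.5)
```

Comparing the typical magnitude of `calm` and `frantic` series shows the qualitative change across g = 1, the kind of empirically accessible transition parameter the abstract describes.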