Economics  (E) Session 1

Time and Date: 14:15 - 15:45 on 19th Sep 2016

Room: A - Administratiezaal

Chair: Marco Alberto Javarone

112 Forgiveness evolves to ensure cooperation in long-term agreements [abstract]
Abstract: Commitments for enhancing cooperation are widespread in human societies. They offer an alternative to punishment and rewards. Within the context of social dilemmas, commitments are defined as agreements to cooperate, with posterior compensations when any of the parties involved defects. Commitment has been shown to be an evolutionarily viable strategy in one-shot social dilemmas. However, in many situations agreements aim to establish long-term mutually beneficial interactions. Our analytical and numerical results reveal under which conditions revenge, apology, forgiveness and ostracism can evolve and deal with mistakes within ongoing agreements in the context of the Iterated Prisoner's Dilemma. We show that, when the agreement fails, participants prefer to take revenge by defecting in the subsisting encounters. Incorporating costly apology and forgiveness reveals that, even when mistakes are frequent, there exists a sincerity threshold for which mistakes do not lead to the destruction of the agreement, inducing even higher levels of cooperation. The apology cost should be high enough to deter fake committers who intend to take advantage of the system by defecting and apologizing continuously, yet not so high that apologizing is no longer worthwhile. We also show that, when interactions take place among groups of individuals, reinserting individuals that were expelled after defecting is more efficient than keeping them ostracized, notwithstanding the cost that may come with it. Forgiveness is, in its different forms, an evolutionarily viable strategy which plays a fundamental role in inducing cooperation in repeated dilemmas.
Luis A. Martinez-Vaquero, The Anh Han, Luís Moniz Pereira and Tom Lenaerts
263 Discontinuity and convergence in global economies [abstract]
Abstract: Economies are complex adaptive systems, and investigation of their dynamics within a complex systems framework may provide a deeper understanding of their behavior and response to perturbation. We borrow methodologies from ecology to test whether global economies have discontinuous size distributions, a signature of multi-scale processes in complex adaptive systems, and we contrast the theoretical assumptions underpinning our methodology with those of the economic convergence club literature. Discontinuous distributions in complex systems consist of groupings of similarly-sized entities, such as animal body mass, firm size, or, perhaps, economies, separated by gaps, in a pattern of non-random departures from a continuous or power law distribution. We analyse constant per capita GDP for all countries of the world from 1970 to 2012. We test each yearly distribution for discontinuities and then compare the distributions over time using multivariate modelling. We find that the size distributions of economies are discontinuous and that there are persistent patterns of aggregations and gaps over time, suggesting that there are scale domains of structuring processes that act as basins of attraction. These size classes are outwardly similar to convergence clubs, but are derived from theory that is a more appropriate fit to economic dynamics because it adequately incorporates realistic expectations of economies as complex adaptive systems. We argue that the underlying mechanisms, rather than emerging from conditions of initial equivalence, evolve and operate in multiple scale domains that can be objectively identified and assessed. Understanding the patterns within and across scale domains may provide insight into the processes that structure wealth over time.
Shana Sundstrom, Craig Allen and David Angeler
124 Systemic distortions in digital reputation [abstract]
Abstract: The digital economy is self-organizing into a “platform society”, where individuals exchange knowledge and goods on a P2P basis. P2P platforms rely on trust, which is typically established by requiring users to develop a digital reputation through peer-review mechanisms. Given that revenues from P2P businesses such as the Sharing Economy are expected to increase more than twentyfold over the next ten years, digital reputation will increasingly become a key commodity in our online lives, as it will determine access to substantial economic opportunities. P2P systems are often thought to promote more economic freedom and democratization. Yet, their current lack of regulation exposes them to malicious behavior. Indeed, users are often incentivized to reciprocate ratings in order to mutually boost reputation or to retaliate. Such practices have distorting effects, as they mask “true reputation” and prevent users from making informed decisions about their peers. In this work we investigate the impact of reciprocity on reputation in three platforms (Slashdot, Epinions, Wikipedia) where users exchange binary ratings, and which can be conveniently cast as signed networks. We find that reciprocity is markedly over-expressed in all three systems with respect to a wide range of null hypotheses, and we provide evidence of the biases it introduces on reputation. At the macro level, we observe that the contribution to reputation from reciprocated ratings is systematically over-expressed and larger than the contribution from non-reciprocated ones. We identify the root causes of such bias at the micro level: we show that the patterns of interactions between users, and the relationship between a user’s reputation and that of her nearest neighbors in the network, are markedly different from those observed under several null hypotheses. In our conclusions we highlight possible policy guidelines that would mitigate the distorting effects of the aforementioned biases.
Giacomo Livan, Fabio Caccioli and Tomaso Aste
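A minimal sketch of the kind of reciprocity measurement this abstract describes, on a toy directed signed network; the rating matrix, the extra reciprocated links and the uniform-placement null model are illustrative assumptions, not the authors' data or null hypotheses:

```python
import numpy as np

rng = np.random.default_rng(0)

def reciprocity(adj):
    """Fraction of directed links that are reciprocated (rating signs ignored)."""
    a = (adj != 0).astype(int)
    np.fill_diagonal(a, 0)
    links = a.sum()
    return (a * a.T).sum() / links if links else 0.0

# Toy signed rating matrix: +1 / -1 ratings, 0 = no rating.
n, density = 200, 0.02
A = rng.choice([-1, 1], size=(n, n)) * (rng.random((n, n)) < density)
np.fill_diagonal(A, 0)
# Add extra reciprocated ratings so reciprocity is over-expressed in the toy data.
recip = (A.T != 0) & (rng.random((n, n)) < 0.3)
A = np.where(recip & (A == 0), A.T, A)

obs = reciprocity(A)

# Null model: same number of directed links, placed uniformly at random off the diagonal.
m = (A != 0).sum()
offdiag = np.flatnonzero(~np.eye(n, dtype=bool))
null = []
for _ in range(200):
    B = np.zeros(n * n, dtype=int)
    B[rng.choice(offdiag, size=m, replace=False)] = 1
    null.append(reciprocity(B.reshape(n, n)))

z = (obs - np.mean(null)) / np.std(null)
print(f"observed reciprocity {obs:.4f}, null mean {np.mean(null):.4f}, z-score {z:.2f}")
```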
203 Liquidity crises in the limit order book: a tale of two time scales [abstract]
Abstract: We present an empirical analysis of the microstructure of financial markets and, in particular, of the static and dynamic properties of liquidity. We find that on relatively large time scales (15 min) large price fluctuations are connected to the failure of the subtle mechanism of compensation between the flows of market and limit orders: in other words, the missed revelation of the latent order book breaks the dynamical equilibrium between the flows, triggering the large price jumps. This behavior naturally leads to a dynamical definition of liquidity. On smaller time scales (30 s), instead, the static depletion of the limit order book is an indicator of an intrinsic fragility of the system, which leads to a strongly nonlinear enhancement of the response, in terms of price impact, to incoming orders, even if their volume is small. In order to quantify this phenomenon, we introduce a static measure of the liquidity imbalance present in the book and we show that this quantity is correlated with both the sign and the magnitude of the next price movement. These findings show that large price fluctuations are due to different mechanisms that act at different time scales and, as a consequence, the effective liquidity should be defined in relation to the time interval one wants to consider. Reference: Corradi, F., Zaccaria, A., and Pietronero, L. Liquidity crises on different time scales. Physical Review E, 92(6), 062802 (2015).
Andrea Zaccaria, Francesco Corradi and Luciano Pietronero
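A sketch of how a static liquidity-imbalance indicator of this kind can be computed and related to the next price move; the order-book snapshots, the specific imbalance formula and the linear relation used to generate the toy data are assumptions for illustration, not the measure defined in the referenced paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy book snapshots: volume resting within a few ticks of the best bid and best ask.
T = 5000
v_bid = rng.exponential(100.0, T)
v_ask = rng.exponential(100.0, T)

# A simple static imbalance in [-1, 1]: positive when the bid side is deeper.
imbalance = (v_bid - v_ask) / (v_bid + v_ask)

# Toy next price move: partly driven by the imbalance, plus noise, so both the
# sign agreement and the correlation should come out positive in this synthetic example.
next_move = 0.5 * imbalance + rng.normal(0.0, 1.0, T)

sign_agreement = np.mean(np.sign(next_move) == np.sign(imbalance))
corr = np.corrcoef(imbalance, next_move)[0, 1]
print(f"sign agreement {sign_agreement:.2f}, correlation {corr:.2f}")
```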

Economics  (E) Session 2

Time and Date: 14:15 - 15:45 on 19th Sep 2016

Room: B - Berlage zaal

Chair: Roland Kupers

14 Vulnerability of Banking Networks Against Financial Contagion: Measures, Evaluations and Implications [abstract]
Abstract: Instabilities of major financial institutions during the recent financial crisis of 2007 and later have generated renewed interest in evaluating the stabilities (or lack thereof) of banking networks among economists, regulatory authorities and other relevant segments of the population. In particular, one reason for such vulnerabilities is the so-called financial contagion process, in which failures of a few individual banks propagate through the "web of banking dependencies" to affect a significant part of the entire global banking system. Motivated by such observations, we consider the problem of defining and evaluating stabilities of both homogeneous and heterogeneous banking networks against propagation of synchronous idiosyncratic shocks given to a subset of banks. We formalize an extension of a financial network model originally proposed by Nier et al. for scenarios such as the over-the-counter derivatives market and its corresponding heterogeneous version, formalize the synchronous shock propagation procedures, define two appropriate stability measures and investigate the computational complexities of evaluating these measures for various network topologies and parameters of interest. We next perform a comprehensive evaluation of these stability measures over more than 700,000 combinations of network types and parameter values. Based on our evaluations, we derive many interesting implications, including topological properties and parameter combinations that may be used to flag a network as possibly fragile. An interactive software tool, FIN-STAB, for computing the stability is available from the website www2.cs.uic.edu/~dasgupta/financial-simulator-files.
Piotr Berman, Bhaskar Dasgupta, Lakshmi Kaligounder and Marek Karpinski
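A toy version of the kind of shock-propagation experiment described above, on a random exposure network; the balance-sheet construction, the capital levels and the surviving-fraction stability measure are illustrative assumptions rather than the measures defined in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def cascade(exposure, capital, shocked):
    """Propagate defaults: a defaulted bank repays nothing, its interbank creditors
    absorb the loss and default in turn once accumulated losses exceed their capital."""
    n = len(capital)
    defaulted = np.zeros(n, dtype=bool)
    defaulted[shocked] = True
    losses = np.zeros(n)
    frontier = list(np.flatnonzero(defaulted))
    while frontier:
        for j in frontier:
            losses += exposure[:, j]          # creditors of j lose their exposure to j
        newly = (~defaulted) & (losses > capital)
        defaulted |= newly
        frontier = list(np.flatnonzero(newly))
    return defaulted

# Homogeneous toy banking network: exposure[i, j] = amount bank i has lent to bank j.
n, p, w = 50, 0.1, 10.0
exposure = (rng.random((n, n)) < p) * w
np.fill_diagonal(exposure, 0.0)
capital = 0.05 * (exposure.sum(axis=1) + 100.0)      # toy net worth per bank

# A simple stability measure: expected surviving fraction after one idiosyncratic default.
surviving = [1.0 - cascade(exposure, capital, [i]).mean() for i in range(n)]
print(f"mean surviving fraction: {np.mean(surviving):.2f}")
```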
437 Financial complexity: network reconstruction, systemic risk, and early-warning signals [abstract]
Abstract: The global financial crisis shifted the interest from traditional measures of “risk” of individual banks to new measures of “systemic risk”, defined as the risk of collapse of an entire interbank system. In principle, estimating systemic risk requires the knowledge of the whole network of exposures among banks. However, due to confidentiality issues, banks only disclose their total exposure towards the aggregate of all other banks, rather than their individual exposures towards each bank. Is it possible to statistically reconstruct the hidden structure of a network in such a way that privacy is protected, but at the same time higher-order properties are correctly predicted? In this talk, I will present a general maximum-entropy approach to the problem of network reconstruction and systemic risk estimation. I will illustrate the power of the method when applied to various economic, social, and biological systems. Then, as a counter-example, I will show how the Dutch interbank network started to depart from its reconstructed counterpart in the three years preceding the 2008 crisis. Over this period, many topological properties of the network showed a gradual transition to the crisis, suggesting their usefulness as early-warning signals of the upcoming crisis. By definition, these early warnings are undetectable if the network is reconstructed from partial bank-specific information.
Diego Garlaschelli
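A compressed sketch of one common maximum-entropy reconstruction recipe (a fitness-induced configuration model calibrated to a known link count); the lognormal toy margins, the use of total exposures as fitnesses and the assumed number of links are illustrative assumptions, not the speaker's exact method or data:

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(3)

# Publicly disclosed margins: each bank's total interbank exposure (used as a "fitness").
strength = rng.lognormal(mean=2.0, sigma=1.0, size=100)
L_obs = 600                                  # assumed known (or separately estimated) link count

x = strength / strength.sum()
xy = np.outer(x, x)
np.fill_diagonal(xy, 0.0)

def expected_links(z):
    p = z * xy / (1.0 + z * xy)
    return p.sum()

# Calibrate the single free parameter z so the expected number of links matches L_obs.
z = brentq(lambda v: expected_links(v) - L_obs, 1e-6, 1e12)
p = z * xy / (1.0 + z * xy)

# One sampled reconstruction; ensemble properties come from averaging over many samples.
sample = rng.random(p.shape) < p
print(f"z = {z:.3g}, expected links = {expected_links(z):.1f}, links in one sample = {sample.sum()}")
```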
581 Dynamics in two networks based on stocks of the US stock market [abstract]
Abstract: We follow the main stocks belonging to the New York Stock Exchange and to Nasdaq from 2003 to 2012, through years of normality and of crisis, and study the dynamics of networks built on two measures expressing relations between those stocks: correlation, which is symmetric and measures how similarly two stocks behave, and Transfer Entropy, which is non-symmetric and measures the influence of one stock on another in terms of the information that the time series of the first transmits to the time series of the second. The two measures are used to create two networks that evolve in time, revealing how the relations between stocks and between industrial sectors changed in times of crisis. The two networks are also used in conjunction with a dynamic model of the spreading of volatility in order to detect which stocks are most likely to spread crises, according to the model. This information may be used in building policies aimed at reducing the effects of financial crises.
Leonidas Sandoval
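A minimal plug-in estimate of Transfer Entropy between two binarized return series (signs of returns), illustrating the asymmetry the abstract exploits; the binarization, the single lag and the toy lead-lag data are simplifying assumptions rather than the estimator used in the study:

```python
import numpy as np

rng = np.random.default_rng(4)

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE_{X->Y} in bits for binary sequences, one lag of history."""
    xt, yt, y1 = x[:-1], y[:-1], y[1:]
    te = 0.0
    for a in (0, 1):            # y_{t+1}
        for b in (0, 1):        # y_t
            for c in (0, 1):    # x_t
                p_abc = np.mean((y1 == a) & (yt == b) & (xt == c))
                p_bc = np.mean((yt == b) & (xt == c))
                p_ab = np.mean((y1 == a) & (yt == b))
                p_b = np.mean(yt == b)
                if min(p_abc, p_bc, p_ab, p_b) > 0:
                    te += p_abc * np.log2((p_abc / p_bc) / (p_ab / p_b))
    return te

# Toy returns: stock y follows stock x with a one-step delay, so TE(x->y) > TE(y->x).
T = 20000
x = rng.normal(size=T)
y = np.empty(T)
y[0] = rng.normal()
y[1:] = 0.6 * x[:-1] + rng.normal(size=T - 1)

bx, by = (x > 0).astype(int), (y > 0).astype(int)
print(f"TE x->y = {transfer_entropy(bx, by):.4f} bits, TE y->x = {transfer_entropy(by, bx):.4f} bits")
```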
186 Entangling credit and funding shocks in interbank markets [abstract]
Abstract: Credit and liquidity risks represent the main channels of financial contagion for interbank lending markets. On one hand, banks face potential losses whenever their counterparties are under distress and thus unable to fulfill their obligations. On the other hand, solvency constraints may force banks to recover lost funding by selling their illiquid assets, resulting in effective losses in the presence of fire sales---that is, when funding shortcomings are widespread over the market. Because of the complex structure of the network of interbank exposures, these losses reverberate among banks and eventually get amplified, with potentially catastrophic consequences for the whole financial system. Building on Debt Rank [Battiston et al., 2012], in this work we define a systemic risk metric that estimates the potential amplification of losses in interbank markets accounting for both credit and liquidity contagion channels: the Debt-Solvency Rank. We implement this framework on a dataset of 183 European banks that were publicly traded between 2004 and 2013, showing indeed that liquidity spillovers substantially increase systemic risk, and thus cannot be neglected in stress-test scenarios. We also provide additional evidence that the interbank market was extremely fragile up to the 2008 financial crisis, becoming slightly more robust only afterwards.
Giulio Cimini and Matteo Serri
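A sketch of the credit channel only, in the spirit of the DebtRank propagation that the metric builds on; the exposure matrix, the equity levels and the single-bank initial shock are toy assumptions, and the liquidity/fire-sale channel that defines the Debt-Solvency Rank is not included here:

```python
import numpy as np

rng = np.random.default_rng(5)

def debtrank(exposure, equity, shock):
    """DebtRank-style propagation of relative equity losses (credit channel only).
    exposure[i, j] is the amount bank i has lent to bank j."""
    W = np.minimum(1.0, exposure / equity[:, None])      # impact of j's distress on i
    h = shock.copy()                                      # relative equity loss in [0, 1]
    state = np.where(shock > 0, 1, 0)                     # 0 undistressed, 1 distressed, 2 inactive
    while (state == 1).any():
        distressed = (state == 1)
        h_new = np.minimum(1.0, h + W @ (h * distressed))
        state = np.where(distressed, 2, state)            # each bank propagates only once
        state = np.where((state == 0) & (h_new > h), 1, state)
        h = h_new
    return h

n = 30
exposure = rng.exponential(1.0, (n, n)) * (rng.random((n, n)) < 0.2)
np.fill_diagonal(exposure, 0.0)
equity = rng.uniform(2.0, 6.0, n)

shock = np.zeros(n)
shock[0] = 0.5                                            # 50% equity loss at bank 0
h = debtrank(exposure, equity, shock)
print(f"equity-weighted systemic loss: {np.average(h, weights=equity):.3f}")
```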
423 Exploring the Counterparty-Liquidity Risk Nexus Using a Multi-Agent Network Model of the Interbank Market [abstract]
Abstract: We simulate a bilayer network comprising the two channels mentioned above (direct interbank exposures and overlapping portfolios). To generate the direct exposure network, we draw a sample of bank sizes (reflecting the size of their balance sheets) from a truncated power-law distribution. Following the literature finding that interbank networks typically exhibit a core-periphery architecture, we apply a fitness-based model that connects the nodes/banks according to their sizes. Using this algorithm, we obtain an undirected, unweighted network with a densely connected core of large banks and a sparsely connected periphery whose connections are mainly to the core. The network of overlapping portfolios is simulated using a simple random-graph generation algorithm based on calibrations for the average diversification of banks vis-a-vis their securities portfolios. This results in a bipartite graph representation. The Agent-Based Model (ABM) begins with an idiosyncratic deposit shock whose sign determines banks' roles as borrowers or lenders on the interbank market. Following this, borrowers distribute their aggregate liquidity requirement across their local network of counterparties. After borrower liquidity requests have been transmitted to lenders, the next step is to determine final loan volumes as well as interbank interest rates. In the next step, borrowers repay their loans following an asset price shock. In order to meet their contractual obligations and comply with policy constraints (a capital ratio and a minimum reserve requirement), banks sell off a fraction of their assets. The total volume of fire sales then puts further downward pressure on asset prices and thus impacts all banks holding the distressed assets in their portfolios.
Nicolas K. Scholtes

Economics  (E) Session 3

Time and Date: 16:15 - 18:00 on 19th Sep 2016

Room: A - Administratiezaal

Chair: Marco Alberto Javarone

202 Which past should be used to forecast the future of China? [abstract]
Abstract: Forecasting the evolution of economic systems is one of the key challenges for Economics and a natural consequence of the ultimate mission of this discipline: explaining why an economic system grows. A natural way to tackle such forecasts is to use the past, but the state of an economic system is typically specified by a very large set of indicators and variables. This prevents reliable forecasts, especially over the long term. We will discuss how Economic Complexity, a data-driven approach to growth, can provide a scientific grounding for this challenge. The starting point will be a recent empirical framework, the Selective Predictability Scheme (hereinafter SPS), which provides a practical guide on how to use the past to forecast the future and, in particular, which past should be used to forecast which future. It also provides an estimate of the goodness of this strategy, introducing the concept of heterogeneity of the degree of predictability, which challenges regressive approaches. In practice, the SPS acts as a feature selector in a suitable space which allows us to find events that are candidates to be predictors for the growth of a country and to simultaneously estimate their goodness. This space is defined by GDP per capita and by a synthetic measure of the competitiveness of a country named Fitness. The dynamics in this plane permits us to interpret several facts concerning country growth developments as well, including the exit from the poverty trap and why some emerging countries catch up with developed economies. We will also discuss how this search for analogues to forecast a country’s future can be refined and extended even at a micro level by measuring the distance between countries’ productive systems at the product and sector level.
Matthieu Cristelli, Andrea Tacchella and Luciano Pietronero
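A compact sketch of the non-linear Fitness-Complexity iteration commonly used to compute the country Fitness mentioned above; the toy country-product matrix and the number of iterations are illustrative assumptions, not the authors' data:

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy binary country-product matrix: M[c, p] = 1 if country c competitively exports product p.
n_c, n_p = 40, 120
M = (rng.random((n_c, n_p)) < 0.15).astype(float)
M = M[M.sum(axis=1) > 0][:, M.sum(axis=0) > 0]     # drop empty rows/columns

F = np.ones(M.shape[0])                            # country Fitness
Q = np.ones(M.shape[1])                            # product Complexity
for _ in range(100):
    F_new = M @ Q                                  # fit countries export many complex products
    Q_new = 1.0 / (M.T @ (1.0 / F))                # complex products are exported by fit countries
    F = F_new / F_new.mean()                       # normalise at every step
    Q = Q_new / Q_new.mean()

print("five fittest (toy) countries:", np.argsort(-F)[:5])
```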
89 Can Twitter sentiment predict Earning Announcements returns? [abstract]
Abstract: Social media increasingly reflect and influence the behavior of other complex systems. We investigate the relations between Twitter and the stock market, the Dow Jones 30 in particular. In our previous work we adapted the well-known "event study" from economics [1] to the analysis of Twitter data. We defined "events" as peaks of Twitter activity, and automatically classified sentiment in Twitter posts. During the Twitter peaks, we found significant dependence between the Twitter sentiment and stock returns [2]. Can these results be used to devise a trading strategy? We focus on Earning Announcements (EA) only, for which the dates are known well in advance. We compare the effects of Twitter sentiment on Cumulative Abnormal Returns (CAR) on the day of the EA and on the day before. Twitter sentiment is an indicator of the direction of the CAR movement when measured on the day of the EA, consistent with our previous results [2]. Somewhat surprisingly, the same relation holds even if we consider the Twitter sentiment on the day immediately _before_ the EA. The amount of CAR is lower (about 1%), but the dependence is still statistically significant at the 5% level. These results suggest a simple trading strategy, ignoring transaction costs: classify the Twitter sentiment before the EA, take a long position for positive sentiment, and a short position for negative sentiment. Currently we consider Twitter and trading data over a period of 15 months. We plan to extend the study to the period of the last three years. We will also investigate whether there are higher abnormal returns for specific classes of stocks. [1] MacKinlay. Event studies in economics and finance. Journal of Economic Literature: 13--39, 1997. [2] Ranco, Aleksovski, Caldarelli, Grcar, Mozetic. The effects of Twitter sentiment on stock price returns. PLoS ONE 10(9): e0138441, 2015.
Igor Mozetic, Peter Gabrovsek, Darko Aleksovski and Miha Grcar
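A bare-bones version of the market-model event study underlying the CAR comparison, on synthetic data; the estimation window, the event window and the sentiment-aligned jump are illustrative assumptions, not the authors' specification:

```python
import numpy as np

rng = np.random.default_rng(7)

def car(stock, market, event_day, est_win=120, event_win=(0, 1)):
    """Cumulative abnormal return around an event, with a market-model benchmark
    (alpha, beta) fitted on the preceding estimation window."""
    s = slice(event_day - est_win, event_day)
    beta, alpha = np.polyfit(market[s], stock[s], 1)
    days = range(event_day + event_win[0], event_day + event_win[1] + 1)
    return sum(stock[t] - (alpha + beta * market[t]) for t in days)

# Toy data: market returns and one stock, with a sentiment-aligned jump on the event day.
T, event_day = 400, 300
market = rng.normal(0.0, 0.01, T)
sentiment = +1                               # assumed Twitter sentiment label before the EA
stock = 0.8 * market + rng.normal(0.0, 0.01, T)
stock[event_day] += 0.02 * sentiment         # earnings-announcement surprise

print(f"CAR on the EA day: {car(stock, market, event_day):.4f} (sentiment {sentiment:+d})")
```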
147 Architectures of Power: The Evolution of the Global Ownership Network [abstract]
Abstract: The global financial crisis has had a profound disruptive effect on the world economy. Indeed, the repercussions can be observed to this day. However, how has this upheaval impacted the structure of economic power worldwide? To answer this question, we present the first exhaustive analysis of the evolution of the entire global ownership network, comprised of tens of millions of nodes and links. By utilizing an efficient algorithm to detect the influence of individual (or groups of) economic actors in the network and analyzing the network's topology, we track the transformation of the global economic system in time. We observe resilience both in the network structure overall and specifically in the emergent power-structures harboring a few highly influential actors. In detail, the network approximately doubles in size from 2007 to 2012, starting with 16,636,351 nodes and 14,724,489 links. The network forms one dominant, largest connected component (LCC) displaying a bow-tie topology. The main feature of this structure comprises 715,629 nodes (+/-8.6%) and represents 70.5% (+/-2.1%) of the total economic value in the system (proxied by operating revenue). Our algorithm, for the first time, is able to compute the monetary value of the economic influence of shareholders in USD, meaning the value of the portion of the network an actor can recursively influence. Not only can we now rank all the shareholders in the network, but we can also, crucially, compute the cumulative influence value for a set of economic actors. We uncover that the tiny core of the LCC is comprised of the most influential actors. Nested within this core is a "super-entity" made up of approximately 100 to 160 nodes, capable of wielding disproportionate influence corresponding to 16 to 20 trillion USD. Tracking the evolution of the cumulative influence value of the various power-structures in the network reveals their resilience.
James Glattfelder and Stefano Battiston
503 Higher-order correlations of consumption patterns in social-economic networks [abstract]
Abstract: We analyze a multi-modal dataset collecting the mobile phone communication and bank transactions of a large number of individuals living in Mexico [1]. This corpus allows for an innovative global analysis both in terms of the social network and its relation to the economic status and merchant habits of individuals. We introduce several measures to estimate the socioeconomic status of each individual together with their purchasing habits. Using this information we identify distinct socioeconomic classes, which reflect a strongly imbalanced distribution of purchasing power in the population. After mapping the social network of egos from mobile phone interactions, we show that typical consumption patterns are strongly correlated with the socioeconomic classes and the social network behind them. We observe these correlations on the individual and the social-class level. In the second half of our study we detect correlations between merchant categories commonly purchased together and introduce a correlation network, which in turn emerges with communities grouping homogeneous sets of purchase categories. We further analyze some multivariate relations between merchant categories and average demographic and socioeconomic features, and find meaningful patterns of correlations giving insights into higher-order correlations in the purchasing habits of individuals. We identify several new directions to explore in the future. One possible track would be to better understand the role of the social structure and interpersonal influence on individual purchasing habits, while the exploration of correlated patterns between commonly purchased brands is another promising direction. Beyond our overall goal to better understand the relation between social and consumption behaviour, we believe that these results may enhance applications to better design marketing, advertising, and recommendation strategies. [1] C. Sarraute, P. Blanc, and J. Burroni. A study of age and gender seen through mobile phone usage patterns in Mexico. In Advances in Social Networks Analysis and Mining (ASONAM), 2014 IEEE/ACM, 836–843 (2014).
Yannick Léo, Márton Karsai, Carlos Sarraute and Eric Fleury
223 Credit Risk Valuation in Financial Networks [abstract]
Abstract: In this paper, we develop a new valuation model to carry out an ex-ante valuation of claims in a network context, in the presence of uncertainty deriving from shocks on the external assets of banks, while at the same time providing an endogenous and consistent recovery rate. To our knowledge, our valuation model is the first to provide such a comprehensive and consistent way to carry out asset valuation in a network of liabilities. The new valuation model encompasses both the ex-post approaches of Eisenberg and Noe and of Rogers and Veraart and the ex-ante approaches of Merton and DebtRank, in the sense that each of these models can be recovered with the appropriate parameter set. We characterize the existence and maximality of the solution of the valuation problem. Further, we define an algorithm to carry out the network-adjusted claim valuation and provide sufficient conditions for convergence to the maximal solution, to a given precision, in finite time. Finally, we show that the valuation given by the new model converges towards the Eisenberg and Noe valuation as maturity approaches, thus providing a consistent valuation procedure.
Paolo Barucca, Marco Bardoscia, Fabio Caccioli, Gabriele Visentin, Marco D'Errico, Guido Caldarelli and Stefano Battiston
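For reference, a minimal sketch of the Eisenberg-Noe clearing fixed point that the new valuation model encompasses as a special case; the three-bank liability matrix and external assets are toy numbers:

```python
import numpy as np

def eisenberg_noe(L, external_assets, tol=1e-10, max_iter=10000):
    """Eisenberg-Noe clearing payments by fixed-point (Picard) iteration.
    L[i, j] is the nominal liability of bank i towards bank j."""
    p_bar = L.sum(axis=1)                                   # total nominal obligations
    with np.errstate(invalid="ignore", divide="ignore"):
        Pi = np.where(p_bar[:, None] > 0, L / p_bar[:, None], 0.0)   # relative liabilities
    p = p_bar.copy()
    for _ in range(max_iter):
        inflow = Pi.T @ p                                   # payments received from debtors
        p_new = np.minimum(p_bar, external_assets + inflow) # pay in full, or pay what you have
        if np.max(np.abs(p_new - p)) < tol:
            break
        p = p_new
    return p

# Toy 3-bank example.
L = np.array([[0.0, 2.0, 1.0],
              [1.0, 0.0, 2.0],
              [1.0, 1.0, 0.0]])
e = np.array([0.5, 0.5, 3.0])
p = eisenberg_noe(L, e)
print("clearing payment vector:", np.round(p, 4))
print("defaulting banks:", np.where(p < L.sum(axis=1) - 1e-9)[0])
```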
61 Foods, fuels or finances: Which prices matter for biofuels? [abstract]
Abstract: We examine co-movements between biofuels and a wide range of commodities and assets in the US, Europe, and Brazil. We analyze a unique dataset of 32 commodities and relevant assets (between 2003 and 2015) which is unprecedented in the biofuels literature. We combine minimum-spanning-tree correlation filtering, to detect the most important connections of the broad system under analysis, with continuous wavelet analysis, which allows us to study dynamic connections between biofuels and relevant commodities and assets as well as their frequency characteristics. We confirm that for the Brazilian and US ethanol, their respective feedstock commodities lead the prices of biofuels, and not vice versa. These dynamics remain qualitatively unchanged when controlling for the influence of crude oil prices. As opposed to the Brazilian and US ethanol, the European biodiesel exhibits only moderate ties to its production factors. We show that financial factors do not significantly interact with biofuel prices.
Ondrej Filip, Karel Janda, Ladislav Kristoufek and David Zilberman
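A small sketch of the minimum-spanning-tree filtering step (the wavelet part is omitted); the asset list, the common-factor toy returns and the use of Mantegna's correlation distance are illustrative assumptions, not the paper's data or exact procedure:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(8)

# Toy return series for a handful of commodities/assets sharing a common factor.
names = ["ethanol_US", "corn", "crude_oil", "sugar", "biodiesel_EU", "rapeseed"]
T, k = 1000, len(names)
common = rng.normal(size=T)
returns = np.column_stack([0.4 * common + rng.normal(size=T) for _ in range(k)])

rho = np.corrcoef(returns, rowvar=False)
dist = np.sqrt(2.0 * (1.0 - rho))          # correlation distance: small when co-movement is strong
np.fill_diagonal(dist, 0.0)

mst = minimum_spanning_tree(dist).tocoo()  # keeps the k-1 most important connections
for i, j, d in zip(mst.row, mst.col, mst.data):
    print(f"{names[i]:12s} -- {names[j]:12s}  distance {d:.3f}")
```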

Economics  (E) Session 4

Time and Date: 10:45 - 12:45 on 22nd Sep 2016

Room: N - Graanbeurszaal

Chair: Siew Ann Cheong

168 A taxonomy of learning dynamics in 2 x 2 games [abstract]
Abstract: Learning is a convincing method to achieve coordination on a Nash Equilibrium (NE). But does learning converge, and to what? We answer this question in generic 2-player, 2-strategy games, using Experience-Weighted Attraction (EWA), which encompasses most extensively studied learning algorithms. We exhaustively characterize the parameter space of EWA learning, for any payoff matrix, and we understand the generic properties that imply convergent or non-convergent behaviour. Irrational choice and lack of incentives imply convergence to a mixed strategy in the centre of the simplex, possibly far from the NE. In the opposite limit, where the players quickly modify their strategies, the behaviour depends on the payoff matrix: (i) a strong discrepancy between the pure strategies is associated with dominance solvable games, which show convergence all the time; (ii) a substantial difference between the diagonal and the antidiagonal elements relates to coordination games, with multiple fixed points corresponding to the NE; (iii) a cycle in beliefs defines discoordination games, which commonly yield limit cycles or low-dimensional chaos. While it is well known that mixed strategy equilibria may be unstable, our approach is novel from several perspectives: we fully analyse EWA and provide explicit thresholds that define the onset of instability; we find an emerging taxonomy of the learning dynamics, without focusing on specific classes of games ex-ante; we show that chaos can occur even in the simplest games; we make a precise theoretical prediction that can be tested against data on experimental learning of discoordination games.
Marco Pangallo, James Sanders, Tobias Galla and Doyne Farmer
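A simplified deterministic (expected-motion) version of EWA-style learning in a 2x2 game; the logit choice rule, the parameter values and the matching-pennies payoffs are illustrative assumptions, not the full EWA parameterisation analysed in the paper:

```python
import numpy as np

def ewa_map(x, y, A, B, phi=0.9, beta=5.0, delta=1.0, steps=500):
    """Deterministic EWA-like learning for a 2x2 game: attractions are exponentially
    discounted expected payoffs, and choices follow a logit (softmax) rule.
    x, y: probabilities that players 1 and 2 play their first strategy.
    A, B: payoff matrices of players 1 and 2."""
    Qx, Qy = np.zeros(2), np.zeros(2)
    path = []
    for _ in range(steps):
        px, py = np.array([x, 1 - x]), np.array([y, 1 - y])
        Qx = phi * Qx + delta * (A @ py)      # expected payoff of each pure strategy
        Qy = phi * Qy + delta * (B.T @ px)
        ex, ey = np.exp(beta * Qx), np.exp(beta * Qy)
        x, y = ex[0] / ex.sum(), ey[0] / ey.sum()
        path.append((x, y))
    return np.array(path)

# Matching-pennies-style (discoordination) game: learning need not settle on the mixed NE.
A = np.array([[1.0, -1.0], [-1.0, 1.0]])
B = -A
path = ewa_map(0.3, 0.7, A, B)
print("last few steps of (x, y):")
print(np.round(path[-5:], 3))
```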
268 Causal Inference Using Multi-Channel Regime Switching Information Transfer Estimation [abstract]
Abstract: The past decade has seen the development of new methods to infer causal relationships in biological and socio-economic complex systems, following the expansion of network theory. Nevertheless, the standard estimation of causality still involves a single pair of time-dependent variables which could be conditioned, in some instances, on their close environment. However, interactions may appear at a higher level between parts of the considered systems represented by more than one variable. We propose to study these types of relationships and develop a multi-channel framework, in the vein of Barrett and Barnett (Phys. Rev. E, 81 (2010)), allowing the inference of causal relationships between two sets of variables. Each channel represents the possible interaction between a variable of each sub-system. Based on this new framework, we develop two different multi-channel causality measures derived from the usual Granger causality, to account for linear interactions, and from the concept of transfer entropy, for nonlinear contributions. Our measures provide different information about the inferred causal links: the strength of the global interaction between the two sub-systems, the average frequency of the channel switches, and the channel contributing the most to the information transfer process for each time step. After having demonstrated the ability of our measures to infer linear as well as nonlinear interactions, we propose an application looking at the U.S. financial sector in order to better understand the interactions between individual financial institutions, as well as parts of the financial system. At the individual level, the considered channels between financial institutions are expressed both in terms of spectral representation using wavelet transforms and probability distributions using quantile regressions. Beyond the application presented in the paper, this new multi-channel framework should be easy to implement in other fields of complex systems science such as neuroscience, biology or physics.
Carl-Henrik Dahlqvist
79 The geography of sleeping beauties in patenting: a country-level analysis [abstract]
Abstract: This study explores sleeping beauties, i.e. breakthrough inventions that experienced delayed recognition, by means of patent data. References in a patent signal the state of the art on which the patent is based, and they can limit the property rights established by its claims. A patent that is cited by many others thus includes some technology central to further developments. Patent citations can be used to study patented breakthrough inventions, identifying them as highly cited patents (Singh and Fleming, 2010; Castaldi et al., 2015). We add to this literature by analysing geographical determinants of the occurrence of sleeping beauties. A sleeping beauty is defined as a patent family that is both a sleeper (it is not cited for at least x years after its priority date) and highly cited (it receives at least x citations). Using this definition, with x=13, the database contains over 3,000 sleeping beauties. We hypothesize that the share of sleeping beauties in the output of a country, and the share of sleeping beauties in the total of highly cited patents in a country, are higher the more geographically isolated the country is, reasoning that isolation renders diffusion and acceptance of new (radical) ideas more difficult. Geographical isolation is proxied by the mean geographical distance to all foreign patent inventors, measured both generally and specifically for each general technology class. We take into account the presence of international airports, average travel times, each country’s general proficiency in world languages, and the number of patents within each country, while controlling for technological and author variations across patents. Castaldi C, Frenken K, Los B (2015) Related Variety, Unrelated Variety and Technological Breakthroughs: An analysis of US State-Level Patenting. Regional Studies 49(5):767–781. Singh J, Fleming L (2010) Lone Inventors as Sources of Breakthroughs: Myth or Reality? Management Science 56(1):41–56.
Mignon Wuestman, Koen Frenken, Jarno Hoekman and Elena Mas Tur
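A tiny sketch of the classification rule described in the abstract (sleeper for at least x years, then at least x citations, with x = 13); the toy citation-year lists are made up:

```python
import numpy as np

def is_sleeping_beauty(priority_year, citation_years, sleep=13, min_cites=13):
    """Flag a patent family as a 'sleeping beauty': no citations for at least `sleep`
    years after its priority date, and at least `min_cites` citations overall."""
    citation_years = np.asarray(citation_years)
    asleep = (citation_years < priority_year + sleep).sum() == 0
    return asleep and len(citation_years) >= min_cites

# Toy examples (years in which citing patents appear).
print(is_sleeping_beauty(1980, list(range(1995, 2010))))   # True: dormant, then highly cited
print(is_sleeping_beauty(1980, list(range(1982, 1990))))   # False: cited early on
```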
388 Comparing Density Forecasts in a Risk Management Context [abstract]
Abstract: In this paper we develop a testing framework for comparing the accuracy of competing density forecasts of a portfolio return in the downside part of the support. Three proper scoring rules including conditional likelihood, censored likelihood and penalized weighted likelihood are used for assessing the predictive ability of out-of-sample density forecasts, all closely related to the Kullback-Leibler information criterion (KLIC). We consider forecast distributions from the skew-elliptical family of distributions, as these are analytically tractable under affine transformations and projections onto linear combinations. We argue that the common practice to do forecast comparison in high-dimensional space can be problematic in the context of assessing portfolio risk, because a better multivariate forecast does not necessarily correspond to a better aggregate portfolio return forecast. This is illustrated by examples. An application to daily returns of a number of US stock prices suggests that the Student-t forecast distribution outperforms the Normal, Skew t and Skew Normal distributions in the left tail of the portfolio return. Additionally, the visualized dynamics of our test statistic provides empirical evidence for regime changes over the last thirty years. In a second application, techniques for forecast selection based on scoring rules are applied, and it turns out that the one-step-ahead Value-at-Risk (VaR) estimates from those dynamically selected time-varying distributions are more accurate than those based on a fixed distribution.
Cees Diks and Hao Fang
433 Investigating Open Innovation Collaboration Strategies between Organizations using Multi-level Networks and Dimensions of Similarity [abstract]
Abstract: Open innovation is a set of practices that enable organizations to make direct use of external R&D to augment their internal research. Open innovation has received a lot of attention in the last decade, so it is of considerable interest to understand how widespread these practices are and how they affect the innovation process. Joint application for patents by multiple organizations is a form of open innovation that may result from joint R&D or other knowledge exchange between organizations. Interactions and collaborations affect the external knowledge potentially accessible to an organization, but they may also reduce the organization’s ability to appropriate the value of its internal knowledge. An optimal innovation strategy will balance these factors. We find that joint patent applications are relatively widespread and that organizations utilise a range of strategies. To better understand some of the factors underpinning partner selection, we investigate the role of similarity between organizations and its impact on collaboration. We consider three dimensions of homophily, namely: technological proximity, geographical proximity, and organization type (e.g. company, university, government agency). Here we construct a multi-level network in order to quantify these similarities between organizations. We define the layers of the network as dimensions of homophily. These dimensions (layers) can be viewed as node attributes of bipartite networks. We use European Patent Office data dating back to 1978 for 40 countries with harmonized applicant names (OECD REGPAT and HAN databases) to construct four related bipartite networks relating organizations to patents, technological codes, geographic regions, and organization types. The respective one-mode projections can be combined into a co-organization network related by the different edge types: patents, technologies, geography and organization type. The resulting network shows the structure of connections between organizations and the correlations between patent collaborations and the different dimensions of similarity under consideration.
Catriona Sissons, Demival Vasques, Dion O'Neale and Shaun Hendy
20 Bubbles in the Singapore and Taiwan Housing Markets are Dragon Kings [abstract]
Abstract: Asia is experiencing an unprecedented region-wide housing bubble right now. Should this bubble collapse, the economic and social fallout would be mind-boggling. As Asian governments race against time to defuse these ‘ticking bombs’, a deeper understanding of housing bubbles becomes necessary. By plotting the cumulative distribution functions (CDFs) of home prices per unit area in Singapore between 1995 and 2014, we found that these CDFs are stable over non-bubble years and consist universally of an exponentially decaying body crossing over to a power-law tail. We also found that in bubble years dragon kings (positive deviations from the equilibrium distribution) develop near where the exponential body crosses over to the power-law tail. The same was found in the home price distribution of the Greater Taipei Area between Aug 2012 and Jul 2014, even though the two housing markets are structurally different. For the Singapore housing market, we also investigated the spatio-temporal dynamics of the bubble, and found that price surges always start in a prestigious investment district before propagating outwards to the rest of the island.
Darrell Jiajie Tay, Chung-I Chou, Sai-Ping Li, Shang-You Tee and Siew Ann Cheong

Economics  (E) Session 5

Time and Date: 10:45 - 12:45 on 22nd Sep 2016

Room: A - Administratiezaal

Chair: Sumit Sourabh

231 Pathways towards instability in financial networks [abstract]
Abstract: Following the financial crisis of 2007-2008, a deep analogy between the origins of instability in financial systems and in complex ecosystems has been pointed out - in both cases, topological features of network structures influence how easy it is for distress to spread within the system. However, in financial network models, the intricate details of how financial institutions interact typically play a decisive role. Hence, a general understanding of precisely how network topology creates instability remains lacking. Here we show how processes that are widely believed to stabilise the financial system, i.e. market integration and diversification, can actually drive it towards instability, as they contribute to create cyclical structures which tend to amplify financial distress, thereby undermining systemic stability and making large crises more likely. This result holds irrespective of the precise details of how institutions interact, and demonstrates that policy-relevant analysis of the factors affecting financial stability can be carried out while abstracting away from such details.
Marco Bardoscia, Stefano Battiston, Fabio Caccioli and Guido Caldarelli
67 The role of networks in firms’ multi-characteristics competition and market-share inequality [abstract]
Abstract: We develop a location-analysis spatial model of firms’ competition in multi-characteristics space, where consumers’ opinions about the firms’ products are distributed on multilayered networks. Firms do not compete on price but only on location in the products’ multi-characteristics space, and they aim to attract the maximum number of consumers. Boundedly rational consumers have distinct ideal points/tastes over the available firm locations but, crucially, they are affected by the opinions of their neighbors. Our central argument is that the consolidation of a dense underlying consumers’ opinion network is the key for a firm to enlarge its market share. Proposing a dynamic agent-based analysis of firms’ location choice, we characterize multi-dimensional product differentiation competition as adaptive learning by firms’ managers, and we argue that such a complex systems approach advances the analysis in alternative ways, beyond game-theoretic calculations.
Athanasios Lapatinas and Antonios Garas
569 Concentration and systemic risk in banking networks [abstract]
Abstract: Since the 2007–2009 financial crisis, mounting evidence suggests that failures of large banks represent a major risk for the resilience of banking networks. This finding is widely used to link the increasing concentration of financial markets with an increase in their fragility. However, the same argument can easily result in the mistaken idea that any market change associated with an increase in concentration also amplifies systemic risk. In this study we applied stress tests to both hypothetical and empirically calibrated banking networks to observe how various bank-size distributions affect systemic risk. We found that analogous to the resilience of ecosystems, no single property of banking networks could explain the probability of systemic failure. We quantified concentration in terms of the Herfindahl–Hirschman index and also identified an additional indicator, inequality, measured by Rao’s quadratic entropy, which is important for understanding the concentration–resilience relationship. We found, counterintuitively, that an increase in concentration was beneficial when it was not followed by an increase in inequality. Similarly, a decrease in concentration became harmful when it was not followed by a decrease in inequality. Mergers of large banks increased, whereas mergers of small banks decreased systemic risk. Splitting of large banks was also effective in reducing systemic risk if splitting was not overdone to the extent that it resulted in too many small banks. Our results provide a guideline that can be applied to frequent issues that regulators face, such as bank mergers.
Stojan Davidovic, Amit Kothiyal, Konstantinos Katsikopoulos and Mirta Galesic
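A small sketch of the two indicators, concentration (Herfindahl-Hirschman index) and inequality (Rao's quadratic entropy); the use of normalised absolute size differences as the pairwise dissimilarity is an illustrative assumption, not necessarily the dissimilarity used by the authors:

```python
import numpy as np

def hhi(assets):
    """Herfindahl-Hirschman index of concentration from bank sizes."""
    s = np.asarray(assets, float)
    p = s / s.sum()
    return np.sum(p ** 2)

def rao_q(assets):
    """Rao's quadratic entropy; pairwise dissimilarity taken here as the normalised
    absolute difference in bank size (an illustrative choice)."""
    s = np.asarray(assets, float)
    p = s / s.sum()
    d = np.abs(s[:, None] - s[None, :]) / s.max()
    return p @ d @ p

equal = np.full(10, 100.0)                       # ten equally sized banks
skewed = np.array([1000.0] + [10.0] * 9)         # one dominant bank

for name, a in [("equal", equal), ("skewed", skewed)]:
    print(f"{name:6s}  HHI = {hhi(a):.3f}  Rao Q = {rao_q(a):.3f}")
```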
92 The asymptotic dynamics of wealth inequality and income-wealth interactions [abstract]
Abstract: The rapid increase of wealth inequality in the past few decades is one of the most disturbing social and economic issues of our time. Studying its origin and underlying mechanisms is essential for policy aiming to control and even reverse this trend. In this talk, I will describe a novel theoretical approach using interacting multi-agent master equations from which the dynamics of wealth inequality emerge. Taking into account the growth rate, the return on capital and personal savings, it is possible to capture the historical dynamics of wealth inequality in the United States during the course of the 20th century. The personal savings rate is shown to be the single most important factor governing the wealth inequality dynamics. Notably, its major decrease in the past 30 years can be associated with the current surge in wealth inequality. The asymptotic dynamic behavior of wealth inequality, on the other hand, emphasizes the importance of the economic output growth rate. Furthermore, we discuss the effects that changes in the income distribution have on the distribution of wealth. Plausible changes in income tax are found to have an insignificant effect on wealth inequality in the short run. The results imply, therefore, that controlling income inequality is an impractical tool for regulating wealth inequality.
Yonatan Berman, Eshel Ben-Jacob and Yoash Shapira
286 The climate-finance macro-network: mapping the exposures of the financial system to climate policy risks in the Euro Area [abstract]
Abstract: In the wake of the recent international climate negotiations there is growing interest in the impact of climate policies on the financial system and the possible role of financial institutions in facilitating the decarbonization pathway of the global economy. However, there are no established methodologies to assess gains and losses, and data is scarce and scattered. Further, while it is now understood that the interlinkages among financial institutions can amplify both positive and negative shocks to the financial system, this is seldom investigated. Here, we take a complex-systems perspective on climate policies and we develop a methodology to map the macro-network of financial exposures among the institutional sectors (e.g. non-financial corporations, investment funds, banks, insurance and pension funds, governments and households). This macro-network can be regarded as a multiplex weighted network in which multiple types of links correspond to different financial instruments: equity holdings (ownership shares), corporate and sovereign bonds (tradable debt obligations) and loans (non-tradable debt obligations). We illustrate the approach on recently available data by investigating the evolution of the macro-network of institutional sectors in the Euro Area. In particular, we estimate the exposures of the financial sectors to climate-policy risks, building on our previously developed climate-stress approach (Battiston 2016, ssrn 2726076). We find that while direct exposures to the fossil sector are limited, the combined exposures to other climate-relevant sectors are large for insurance and pension funds as well as investment funds (about 30%-40%). Further, these sectors bear large indirect exposures, via banks, to the housing sector. As a result of climate policies supporting green technologies and discouraging brown technologies, large portions of the assets on the balance sheets of financial institutions are potentially subject to positive or negative revaluation. Our work contributes to a better understanding of the governance of the climate-finance arena.
Veronika Stolbova and Stefano Battiston
238 Circadian Rhythm of Cancellation Rates in Foreign Currency Market [abstract]
Abstract: The recent proliferation of automated traders has significantly increased the reaction speed of financial markets. As a consequence, new detailed databases have become available that help researchers precisely describe the impact of certain characteristics on the market as a whole. For example, the field of Econophysics emerged to focus on market microstructure using physics methodology, especially to describe order book fluctuations [1]. Market participants annihilate their orders either via cancellation or transaction. Cancellation of orders therefore plays a major role in market fluctuations, and an increasing number of studies address this subject [2]. Our research mainly focuses on the rate at which limit orders are cancelled in the foreign currency market according to the time of the day and the distance from the market price. We use a precise database of the Electronic Broking System (EBS) which contains the identification of every order. The database is composed of three weeks: two of them are from March 06 21:00 (GMT) to March 18 21:00 (GMT), 2011, and one week is from October 30 21:00 (GMT) to November 04 21:00 (GMT), 2011. It contains information about injected and annihilated orders with a minimal tick time of one millisecond. Our study mainly focuses on the USD/JPY, EUR/USD and EUR/JPY currency pairs. In investigating the cancellation rate of limit orders, we discover that it follows a circadian rhythm similar to that of transactions. In addition, the cancellation rate of limit orders is high when the distance from the market price is small, and vice versa. In other words, market participants are less patient when their orders have a small distance from the market price. [1] M. Takayasu, T. Watanabe and H. Takayasu, Approaches to Large-Scale Business Data and Financial Crisis, Springer, 2010. [2] X-H Ni, Z-Q Jiang, G-F Gu, F. Ren, W. Chen and W-X Zhou, Phys. A 389, 2751–2761 (2010).
Jean-François Boilard, Misako Takayasu and Hideki Takayasu

Economics  (E) Session 6

Time and Date: 13:45 - 15:30 on 22nd Sep 2016

Room: B - Berlage zaal

Chair: Dexter Drupsteen

491 Agent Based Model Exploration and Calibration using Machine Learning Surrogates [abstract]
Abstract: Bringing Agent-Based Models closer to the data is an open challenge. While facilitating the comparison to more standard approaches, getting closer to the data promotes Agent-Based Models as a methodology. In this paper, we treat parameter space exploration from the machine learning problem setting of supervised learning and introduce machine learning surrogates as a fast and efficient means to explore positive calibrations from the parameter space. Three steps are involved: adaptively sampling a small number of simulations from the Agent-Based Model through the "active" learning problem setting and measuring the calibration quality of parameter combinations against real data with a chosen statistical hypothesis test; learning a powerful machine learning surrogate or "meta-model" on these "training" or modeling samples; and rapidly filtering positive calibrations out of the parameter space for evaluation. Dramatic time savings are demonstrated by replacing the expensive Agent-Based Model with the machine learning surrogate. Though surrogates can potentially replace the agent-based model, we pursue the simpler objective of filtering positive calibrations. Our aim is to provide a fast and efficient tool to explore the parameter space, while enabling policy-makers to evaluate and choose the particular parameterizations of interest. Finally, parameterizations of interest can be directly studied via the agent-based model. Ultimately, we do not wish to replace the agent-based model, but to help accelerate the turn-around time from real data to agent-based model calibrations that respect economic intuition and convey economic insight. We illustrate our approach by filtering positive calibrations (using the standard Kolmogorov-Smirnov two-sample test against the daily Standard and Poor's 500 Index) for the simple agent-based asset pricing model (introduced in "Heterogeneous beliefs and routes to chaos in a simple asset pricing model" by Brock and Hommes, 1998) over ten parameters with generous ranges.
Francesco Lamperti, Antoine Mandel, Andrea Roventini and Amir Sani
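A minimal end-to-end sketch of the surrogate-filtering loop, using a cheap stand-in "model" (a Student-t return generator) instead of the Brock-Hommes ABM; the toy model, its two parameters, the pseudo "real" data and the non-adaptive sampling are illustrative assumptions rather than the authors' setup:

```python
import numpy as np
from scipy.stats import ks_2samp
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(9)

def toy_abm(theta, n=500):
    """Stand-in for an expensive agent-based model: a return series whose volatility
    and tail weight depend on the two parameters (vol, degrees of freedom)."""
    vol, df = theta
    return vol * rng.standard_t(df, size=n)

real_data = 0.02 * rng.standard_t(4.0, size=500)   # stands in for the empirical index returns

def positive_calibration(theta, alpha=0.05):
    """1 if the KS two-sample test cannot reject equality of distributions at level alpha."""
    return int(ks_2samp(toy_abm(theta), real_data).pvalue > alpha)

# 1) run the expensive model on a small training sample of parameter points
train_theta = np.column_stack([rng.uniform(0.001, 0.1, 200), rng.uniform(2.5, 30, 200)])
train_label = np.array([positive_calibration(t) for t in train_theta])

# 2) fit a fast surrogate ("meta-model") on those labelled runs
surrogate = RandomForestClassifier(n_estimators=200, random_state=0).fit(train_theta, train_label)

# 3) use the surrogate to filter a large candidate pool without running the model again
pool = np.column_stack([rng.uniform(0.001, 0.1, 20000), rng.uniform(2.5, 30, 20000)])
candidates = pool[surrogate.predict(pool) == 1]
print(f"{len(candidates)} of {len(pool)} candidate parameter points flagged for evaluation")
```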
558 The Echoes of Bandwagon Through a Complex System of Innovation and Development [abstract]
Abstract: Dating back to Schumpeter, the literature on innovation has evolved to the point of elevating its object of study to the status of one of the main forces driving economic growth and development. The fact that Solow's TFP black box is not so black anymore probably has something to do with understanding how the engine of innovation is greased. In this paper, we investigate whether one of the cogwheels of this engine might be the bandwagon behaviour of consumers and its impact on the firm’s decision to engage in a certain type of innovative process. In order to do so, we introduce a new framework for complex agent-based models that is different from the commonly used small-world networks, which we call the Spatial Dynamic Awareness Model. Consumers have heterogeneous stochastic thresholds with respect to what we call their "profile" towards new products, following the distribution proposed by Moore (2005) as a baseline. They also have spatial mobility and bounded rationality (awareness), acquiring information and interacting only with agents inside their awareness radius to evaluate how many others are using a given product or technology and, ultimately, to decide whether to change their product of choice at each point in time. Firms, on the other hand, cannot see individual preferences, but analyse market saturation and concentration to decide on the amount of R&D investment and between process and product innovation. Simulations suggest that a society with a greater share of "crazy for technology" individuals yields faster saturation and de-concentration of the relevant market, generating more product than process innovations, and higher mean prices and profits. We hope to reward the attendees of our presentation with new insights on network modelling and the importance of behavioural economics in better understanding the micro-macro process of innovation and economic development.
João Basilio Pereima and Pedro Einloft
233 Emergence of social networks due to human mobility [abstract]
Abstract: There has been a recent burst of work on human mobility and social networks. However, the connection between these two important fields is still in its infancy, or missing altogether. It is clear that both are closely related: people tend to visit popular places in a city with some frequency, meeting other people there. If this occurs often, there is a chance that a friendship or acquaintance emerges, linking people together. On the other hand, once a social network is established, people tend to go together to the same places. In this way, there is feedback between human mobility in space and the structure of the social network. Mobility generates friends, and friends move together. We model the above situation with random walkers that visit places in space following a strategy akin to Lévy flights. We measure the encounters or coincidences in space and time and establish a link between walkers after they coincide several times. This generates a temporal network that is characterized by global quantities. We compare these dynamics with real data for two big cities: New York City and Tokyo. We use data from the location-based social network Foursquare and obtain the emergent temporal encounter network for New York City and Tokyo, which we analyze in detail and compare with our model. Even though there are differences between the two cities, there are some common features: for instance, a long-range (Lévy-like) distribution of distances that characterizes the emergent social network due to mobility in cities. This study contributes to the unification of two important fields: social networks and human mobility. Applications and implications for several fields like epidemics, social influence, voting, contagion models, behavioral adoption and diffusion of ideas will be discussed.
Jose L. Mateos and Alejandro P. Riascos
118 Using statistical symmetries to characterize binary time series of the foreign exchange market [abstract]
Abstract: We use the concept of statistical symmetry, i.e. the invariance of a probability distribution under a transformation, to analyze the sign dynamics of price differences in the foreign exchange market. Using a local hypothesis test with a stationary Markov process as the model, we characterize different intervals of the sign time series of price differences as symmetric or not with respect to the symmetries of independence and space odd reversion. For the test, we derive the probability that a binary Markov process generates a given set of numbers of symbol pairs. As a particular result, we find that the foreign exchange market is essentially space odd reversible - interpreted as time reversible - but this symmetry is broken when there is a strong external influence. We also find that above a resolution of 90 s the intervals of the sign time series are considered statistically symmetric, implying that the direction of price movements in the market can be described by an independent random process.
Arthur Matsuo Yamashita Rios de Sousa, Hideki Takayasu and Misako Takayasu
218 Analysis, prediction and control of technological progress [abstract]
Abstract: Technological evolution is one of the main drivers of social and economic change, with transformative effects on most aspects of human life. How do technologies evolve? How can we predict and influence technological progress? To answer these questions, we looked at the historical records of the performance of multiple technologies. We first evaluate simple predictions based on a generalised version of Moore’s law: all technologies have a unit cost decreasing exponentially, but at a technology-specific rate. We then look at a more explanatory theory which posits that experience, measured as cumulative production, drives technological progress. These experience curves work relatively well in terms of forecasting, but in reality technological progress is a very complex process. To clarify the role of different causal mechanisms, we also study military production during World War II, where it can be argued that demand and other factors were exogenous. Finally, we analyse how to best allocate investment between competing technologies. A decision maker faces a trade-off between specialisation and diversification which is influenced by technology characteristics, risk aversion, demand and the planning horizon. Our methods are used to provide distributional forecasts for the cost of photovoltaic modules at different horizons, making it possible to evaluate their potential to provide an inexpensive source of energy over a relatively short horizon.
Francois Lafond
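The two functional forms mentioned in this abstract can be fitted by ordinary least squares on logarithms, as in the hedged sketch below; the synthetic cost series and all parameter values are placeholders, not the authors' data.

import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1980, 2016)
# synthetic cumulative production and a noisy cost series that follows an experience curve
production = np.cumsum(np.exp(0.15 * (years - years[0])))
cost = 100 * production ** -0.4 * np.exp(rng.normal(0, 0.05, years.size))

# generalised Moore's law: log(cost) = a - g * t
slope_t, _ = np.polyfit(years - years[0], np.log(cost), 1)
moore_rate = -slope_t            # exponential rate of cost decline per year

# experience (Wright) curve: log(cost) = c - b * log(cumulative production)
slope_q, _ = np.polyfit(np.log(production), np.log(cost), 1)
wright_exponent = -slope_q       # elasticity of cost w.r.t. cumulative production

print(f"Moore rate: {moore_rate:.3f} per year; Wright exponent: {wright_exponent:.3f}")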
373 Portfolio Optimization under Expected Shortfall: Contour Maps of Estimation Error [abstract]
Abstract: The contour maps of the error of historical estimates for large random portfolios optimized under the Expected Shortfall (ES) risk measure are constructed. Similar maps for the sensitivity of the portfolio weights to small changes in the returns are also presented. The contour maps allow one to quantitatively determine the sample size (the length of the time series) required by the optimization for a given number of different assets in the portfolio, at a given confidence level and a given level of relative estimation error. The necessary sample sizes turn out to be unrealistically large for reasonable choices of the number of assets and the confidence level. These results are obtained via analytical calculations based on methods borrowed from the statistical physics of random systems, supported by numerical simulations.
Fabio Caccioli, Imre Kondor and Gábor Papp
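The estimation-error question can be probed numerically along the following lines (a sketch of my own under stated assumptions, not the authors' analytical calculation): historical Expected Shortfall is minimized with the Rockafellar-Uryasev linear program on a finite sample of i.i.d. returns, for which the true optimum is the equal-weight portfolio, so the deviation from equal weights is a crude proxy for estimation error. The asset number, sample length and confidence level below are illustrative.

import numpy as np
from scipy.optimize import linprog

def es_optimal_weights(returns, alpha=0.975):
    """Fully invested long-short portfolio minimizing historical ES."""
    T, N = returns.shape
    # decision vector x = [w_1..w_N, t, u_1..u_T] (Rockafellar-Uryasev formulation)
    c = np.concatenate([np.zeros(N), [1.0], np.full(T, 1.0 / ((1 - alpha) * T))])
    A_ub = np.hstack([-returns, -np.ones((T, 1)), -np.eye(T)])  # loss_j - t - u_j <= 0
    A_eq = np.concatenate([np.ones(N), [0.0], np.zeros(T)]).reshape(1, -1)
    bounds = [(None, None)] * (N + 1) + [(0, None)] * T
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(T), A_eq=A_eq, b_eq=[1.0],
                  bounds=bounds, method="highs")
    return res.x[:N]

rng = np.random.default_rng(3)
N, T = 20, 200                      # aspect ratio N/T = 0.1
w = es_optimal_weights(rng.normal(size=(T, N)))
print("deviation from the true (equal-weight) optimum:", np.linalg.norm(w - 1.0 / N))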

Economics  (E) Session 7

Schedule Top Page

Time and Date: 13:45 - 15:30 on 22nd Sep 2016

Room: C - Veilingzaal

Chair: Francesca Lipari

494 Taming the leverage cycle [abstract]
Abstract: This paper focuses on the dynamical aspects of systemic risk in financial markets resulting from positive feedback loops in the interaction of risk management and asset markets, and thereby highlights the importance of non-equilibrium approaches to understanding and tackling systemic risk. We investigate a simple dynamical model for the systemic risk caused by the use of Value-at-Risk (VaR). The model consists of a bank with a leverage target and an unleveraged fundamentalist investor subject to exogenous noise with clustered volatility. The parameter space has three regions: (i) a stable region, where the system has a fixed-point equilibrium; (ii) a locally unstable region, characterized by cycles with chaotic behavior; and (iii) a globally unstable region. A calibration of parameters to data puts the model in region (ii). In this region there is a slowly building price bubble, resembling the period prior to the Global Financial Crisis, followed by a crash resembling the crisis itself, with a period of approximately 10–15 years. While our model does not show that the financial crisis and the period leading up to it were due to VaR risk management policies, it does suggest that they could have been caused by VaR risk management, and that the housing bubble may have been merely the spark that triggered the crisis. We also explore alternative leverage control policies based on their ability to minimize risk for a given average leverage. We find that the best policy depends on the bank's market impact. VaR is optimal when the exogenous noise is high, the bank is small and leverage is low; in the opposite limit, where the bank is large and leverage is high, the optimal policy is closer to constant leverage.
Christoph Aymanns, Fabio Caccioli, J Doyne Farmer and Vincent Tan
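A stylized toy version of such a feedback loop, written only to illustrate the mechanism and not reproducing the paper's equations or calibration: a bank targets a VaR-style leverage that shrinks when perceived volatility rises, trades against a noisy fundamentalist, and its trading feeds back into the price and hence into perceived volatility. All parameter values below are invented for the sketch.

import numpy as np

rng = np.random.default_rng(4)
steps, mu, impact = 5000, 0.1, 0.05        # fundamentalist aggressiveness, price impact
price, fundamental = 1.0, 1.0
sigma2, band, max_lev = 0.02 ** 2, 0.1, 20.0
n_bank, equity = 1.0, 0.2                  # bank's asset holdings (units) and equity

prices = []
for t in range(steps):
    target_lev = min(band / np.sqrt(sigma2), max_lev)      # VaR-style leverage rule
    bank_order = (target_lev * equity - n_bank * price) / price
    fund_order = mu * (fundamental - price) + 0.01 * rng.normal()
    # linear price impact, capped per step to keep the toy numerically tame
    ret = np.clip(impact * (bank_order + fund_order), -0.1, 0.1)
    new_price = price * np.exp(ret)
    equity += n_bank * (new_price - price)                 # mark existing holdings to market
    n_bank += bank_order
    sigma2 = max(0.95 * sigma2 + 0.05 * ret ** 2, 1e-8)    # perceived (EWMA) variance
    price = new_price
    prices.append(price)
    if equity <= 0:
        print(f"bank wiped out at step {t}")
        break

print(f"price range over the run: {min(prices):.3f} - {max(prices):.3f}")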
520 Why Do Banks Default Overnight? Modeling Endogenous Contagion on the O/N Interbank Market [abstract]
Abstract: On September 15, 2008, the Lehman Brothers bank announced its bankruptcy. This triggered a panic on the US stock exchange and a mortgage crisis that spread throughout the world; the consequences of these events are still visible today. Since the events of 2008, concepts such as systemic risk and financial contagion have entered common language, and at the same time the development of models of the interbank market has gained tremendous momentum. We present a new model of the banking system, focusing on the daily time scale and short-term activities, mainly overnight loans. In our model we take into account three possible channels of financial contagion. The first and most direct channel of propagation is a collapsing bank failing to pay its obligations; banks that granted it loans bear this loss, which worsens their financial situation. Second, and perhaps less obviously, a failing bank, in order to pay its obligations, must sell its external assets in significant amounts, which results in an immediate and significant decrease in their value. Not only does the bank fail to recover the full value of its assets, and therefore repay its liabilities only in part, the sale also depresses the value of the same assets held in the portfolios of other banks, worsening their situation. Third, interbank loans become less available as trust declines, leaving banks less resistant to a deterioration of their financial situation. Most previous models tested the system's reaction to an external shock, e.g. the collapse of one or more banks. In contrast, in our dynamical model of the entire banking system, crashes can occur as an internal feature of the system. We will present results for artificial data as well as for empirical data from the Polish interbank market.
Tomasz Gubiec and Mateusz Wilinski
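The first two contagion channels listed in this abstract can be illustrated with a schematic default cascade like the one below (my own toy construction, not the authors' model, and without the trust channel): a defaulting bank imposes credit losses on its lenders and fire-sells its external assets, the resulting price drop hits every balance sheet, and newly insolvent banks repeat the process until the system settles. The network, recovery rate and impact parameter are invented for the sketch.

import numpy as np

rng = np.random.default_rng(5)
n = 20
interbank = rng.uniform(0, 1, (n, n)) * (rng.random((n, n)) < 0.2)  # loan from i to j
np.fill_diagonal(interbank, 0.0)
external = rng.uniform(5, 10, n)         # holdings of a common external asset (units)
price = 1.0                              # price of the external asset
equity = 0.1 * (interbank.sum(axis=1) + external * price)
recovery, impact = 0.4, 0.02             # recovery rate on defaulted loans, fire-sale impact

defaulted = np.zeros(n, dtype=bool)
defaulted[0] = True
frontier = np.array([0])                 # exogenous initial failure
while frontier.size:
    old_price = price
    price *= np.exp(-impact * external[frontier].sum())            # fire sales
    equity -= (1 - recovery) * interbank[:, frontier].sum(axis=1)  # credit losses for lenders
    equity -= external * (old_price - price)                       # mark-to-market losses
    frontier = np.where((equity <= 0) & ~defaulted)[0]
    defaulted[frontier] = True

print(f"{defaulted.sum()} of {n} banks default; final asset price {price:.3f}")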
316 Relaxation Analysis for the Layered Structure on the basis of the Order Book Data of FX Market [abstract]
Abstract: The amount of available data has been increasing radically with the development of electronic devices, and such data sets, so-called big data, have lately attracted attention among econophysicists. One of the fields where big data have become available is the foreign exchange (FX) market. The big data of the FX market are the order book data, which include: (i) transaction prices over the whole trading period of the FX market; (ii) order volumes and order prices of traders; (iii) the times at which traders place and cancel orders. It has been reported that there is a correlation between transaction price movements and the behavior of traders, and that its sign changes depending on the price range in which traders place their orders (Refs. [1,2]). This correlation implies that price movements and trader behavior are closely related, and the relation enables us to understand various properties of price movements, including sudden price jumps, from traders' behavior. There have, however, been few studies of the correlation between price movements and trader behavior. We study the statistical properties of traders' behavior in order to understand that relation. We focus on the relaxation process of traders' orders and report that there is a typical pattern in the relaxation timescale, which depends on the price range in which the orders are placed. This result is consistent with the one reported in Ref. [2]. References: [1] Y. Yura, H. Takayasu, D. Sornette, M. Takayasu, Phys. Rev. E 92, 042811 (2015). [2] Y. Yura, H. Takayasu, D. Sornette, M. Takayasu, Phys. Rev. Lett. 112, 098703 (2015).
Takumi Sueshige, Kiyoshi Kanazawa, Hideki Takayasu and Misako Takayasu
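Estimating a relaxation timescale of the kind discussed above can be done, for example, with a log-linear fit of an exponential decay, as in the sketch below; the data here are synthetic placeholders rather than order book records, and the baseline-subtraction step is an assumption of the sketch.

import numpy as np

rng = np.random.default_rng(6)
t = np.arange(0, 300)                                    # seconds after a price move
true_tau = 45.0
volume = 500 * np.exp(-t / true_tau) + 20 + rng.normal(0, 5, t.size)  # decaying order volume

baseline = volume[-50:].mean()                           # residual level at long times
signal = np.clip(volume - baseline, 1e-6, None)
fit = t < 150                                            # fit only where the decay dominates
slope, _ = np.polyfit(t[fit], np.log(signal[fit]), 1)
print(f"estimated relaxation time: {-1 / slope:.1f} s (true value {true_tau} s)")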
187 Statistically similar portfolios and systemic risk [abstract]
Abstract: We propose a network-based similarity measure between portfolios with possibly very different numbers of assets and apply it to a historical database of institutional holdings ranging from 1999 to the end of 2013. The resulting portfolio similarity measure increased steadily before the 2008 financial crisis and reached a maximum when the crisis occurred. We argue that the nature of this measure implies that liquidation risk from fire sales was maximal at that time. After a sharp drop in 2008, portfolio similarity resumed its growth in 2009, with a notable acceleration in 2013, reaching levels not seen since 2007.
Stanislao Gualdi, Giulio Cimini, Kevin Primicerio, Riccardo Di Clemente and Damien Challet
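The abstract does not spell out the similarity measure, so the snippet below is only a generic baseline for comparing two portfolios with different asset sets (cosine similarity of holdings over the union of assets); the paper's network-based measure is more elaborate. The ticker names and positions are made up.

def portfolio_similarity(holdings_a, holdings_b):
    """Cosine similarity of two portfolios given as dicts: asset -> dollar position."""
    assets = set(holdings_a) | set(holdings_b)
    a = [holdings_a.get(k, 0.0) for k in assets]
    b = [holdings_b.get(k, 0.0) for k in assets]
    norm = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return sum(x * y for x, y in zip(a, b)) / norm if norm else 0.0

print(portfolio_similarity({"AAPL": 3.0, "XOM": 1.0}, {"AAPL": 2.0, "GE": 2.0}))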
212 From innovation to diversification: a simple competitive model [abstract]
Abstract: Few attempts have been made to describe the statistical features and historical evolution of the bipartite country-product export matrix. An important starting point is the introduction of a product network, namely a hierarchical forest of products that models the formation and evolution of commodities. In the present article, we propose a simple dynamical model in which countries compete with each other to acquire the ability to produce and export new products. Countries have two ways to expand their exports: innovating, i.e. introducing new goods, namely new nodes in the product network, or copying the productive processes of others, i.e. occupying a node already present in the same network. In this way, the topology of the product network and the country-product matrix evolve simultaneously, driven by the countries' push toward innovation.
Fabio Saracco, Riccardo Di Clemente, Andrea Gabrielli and Luciano Pietronero
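A stripped-down sketch of such innovate-or-copy dynamics, with the branching rule, probabilities and initial conditions invented for illustration rather than taken from the paper:

import random

random.seed(7)
n_countries, steps, p_innovate = 10, 500, 0.2
parent = {0: None}                              # product network: product id -> parent node
exports = {c: {0} for c in range(n_countries)}  # country-product matrix as sets

for _ in range(steps):
    c = random.randrange(n_countries)
    if random.random() < p_innovate:
        # innovation: a new product branches off something the country already makes
        new_id = len(parent)
        parent[new_id] = random.choice(sorted(exports[c]))
        exports[c].add(new_id)
    else:
        # imitation: occupy a node already produced by another country
        other = random.randrange(n_countries)
        candidates = exports[other] - exports[c]
        if candidates:
            exports[c].add(random.choice(sorted(candidates)))

print("products:", len(parent), "| diversification:", {c: len(p) for c, p in exports.items()})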

Economics  (E) Session 8

Schedule Top Page

Time and Date: 16:00 - 17:20 on 22nd Sep 2016

Room: B - Berlage zaal

Chair: Dexter Drupsteen

306 The Interdependence of Trade Networks of Value Added Goods and Services: Networks of Networks [abstract]
Abstract: There are two fundamentally different networks in play in global trade: the trade in goods and the trade in services. These two complementary networks describe the way in which economic resources are dynamically allocated within countries, within local regions and across the globe. It is interesting to observe that these networks are the macroscopic consequences of the detailed microstructure of the trade in goods and services between individual market sectors. It is only recently that analysis of such detailed microstructure, and of the network-to-network exchange of economic resources at multiple scales, has become possible, as econometrically consistent ‘trade in value added’ data have become readily available to the broader research community. In an exploratory analysis of global economic trade, we combine well established complex network theory with newly emerging methods from the ‘networks of networks’ field to uncover a rich diversity of interactions at multiple scales, both within the networks and between them. While it is now well understood that individual networks carry specific risks associated with specific network topologies (for example, single-node vulnerabilities pose different risks for scale-free versus Erdos-Renyi networks), the issue is more complex when considering interactions between networks. In this study we use the OECD’s trade in value added data set to study the capital flows exchanged between networks, which give rise to specific risks that are not immediately apparent when the networks are considered in isolation. As the long-term goal of such analysis is to inform the debate on global economic risks, we conclude by discussing some of the practical consequences for our understanding of global economic trade.
Michael Harre, Alexandra Vandeness and Alex Li-Kim-Mui
41 Enhanced Gravity Model of trade: reconciling macroeconomic and network models [abstract]
Abstract: The International Trade Network (ITN) is involved in an increasing number of processes of relevance for the world economy, including globalization, integration, competitiveness, and the propagation of shocks and instabilities. Characterizing the ITN via a simple yet accurate model is an open problem. The traditional Gravity Model successfully reproduces the volume of trade between two connected countries, using macroeconomic properties such as GDP and geographic distance. However, it generates a network with a complete or homogeneous topology, thus failing to reproduce the highly heterogeneous structure of the real ITN. On the other hand, recent maximum-entropy network models successfully reproduce the complex topology of the ITN, but provide no information about trade volumes. Here we integrate these two currently incompatible approaches via the introduction of an Enhanced Gravity Model (EGM) of trade. The EGM is the simplest model combining the maximum-entropy network approach with the Gravity Model, while at the same time enforcing a novel ingredient that we denote as `topological invariance', i.e. the invariance of the expected topology under an arbitrary change of units of trade volumes. Via this unified and principled mechanism that is transparent enough to be generalized to any economic network, the EGM provides a new econometric framework wherein trade probabilities and trade volumes can be separately controlled by any combination of dyadic and country-specific macroeconomic variables. We show that the EGM successfully reproduces both the topology and the weights of the ITN, finally reconciling the conflicting approaches. Moreover, it provides a general and simple theoretical explanation for the failure of economic models that do not explicitly focus on network topology: namely, their lack of topological invariance.
Assaf Almog, Rhys Bird and Diego Garlaschelli
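Schematically, the idea of controlling link probabilities and trade volumes separately can be written as below; the functional forms, coefficients and example figures are placeholders invented for illustration, not the calibrated Enhanced Gravity Model.

import numpy as np

def expected_trade(gdp_i, gdp_j, dist_ij, theta=(-50.0, 1.0, -0.5), gamma=1e-6):
    """Return (link probability, expected trade volume conditional on a link)."""
    x = theta[0] + theta[1] * np.log(gdp_i * gdp_j) + theta[2] * np.log(dist_ij)
    p_link = 1.0 / (1.0 + np.exp(-x))                    # topology: logit on covariates
    volume_if_linked = gamma * gdp_i * gdp_j / dist_ij   # weights: classic gravity form
    return p_link, volume_if_linked

p, w = expected_trade(gdp_i=1.5e12, gdp_j=3.0e11, dist_ij=6000.0)
print(f"P(link) = {p:.2f}, expected volume if linked = {w:.3e}")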
578 The International Mergers & Acquisitions Web: A Network Approach [abstract]
Abstract: This paper analyzes the world web of mergers and acquisitions (M&As) using a complex network approach. We aggregate M&A data to build a temporal sequence of binary and weighted directed networks for the 1995-2010 period and 224 countries. We study different geographical and temporal aspects of the M&A web, building sequences of filtered sub-networks whose links belong to specific intervals of distance or time. We then derive directed-network statistics to see how topological properties of the network change over space and time. The M&A web is a low-density network characterized by a persistent giant component with many external nodes and few reciprocated links. Clustering patterns are very heterogeneous and dynamic. High-income economies are characterized by high connectivity; these countries mainly merge with several high- and middle-income economies, implying that most countries might act as targets of a few acquirers. We find that distance strongly shapes the structure of the network: link weights and node degrees vary non-linearly with distance.
Rossana Mastrandrea, Marco Duenas, Matteo Barigozzi and Giorgio Fagiolo
206 Measuring the Coherence of Financial Markets [abstract]
Abstract: Financial agent-based models (ABMs) have been developed with the aim of understanding the stylized facts (SF) observed in financial time series. ABMs have made it possible to move beyond the mainstream vision and concepts such as the rational representative agent. They can explain the role of elements such as heterogeneity of strategies and time horizons, contagion dynamics and intrinsic (endogenous) large fluctuations, but they are still not concretely useful in policy-making processes. In a series of papers on a minimal ABM [see Alfi V. et al., Eur. Phys. J. B 67 (2009) 385] it is shown that a key element for measuring systemic risk and financial distress is the effective number of agents or, in other words, the number of effectively independent strategies in the market. In order to verify this insight, we developed strategies to empirically estimate this coherence. We discuss some preliminary results of a novel measure of stock market coherence, together with a reference community for this research. A market becomes coherent when agents (i.e. investing subjects) tend to behave similarly and consequently perform the same actions. In such a scenario, markets are maximally exposed to large positive feedbacks and self-reinforcing dynamics, which can dramatically amplify even small, local financial shocks, turning them into systemic and global crashes. Here we propose a simple stochastic model which provides a daily estimate of market coherence, starting from a model of the correlation network among stocks. The parameters of the model are estimated via a Monte Carlo procedure applied to daily price time series. This measure is a promising index for assessing the systemic risk of financial systems. It does not simply reproduce standard risk measures such as realized and implied volatility, and it appears to be especially informative about the build-up dynamics preceding financial crises.
Matthieu Cristelli, Fabrizio Piasini and Andrea Tacchella
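The paper's coherence estimate comes from a stochastic model of the stock correlation network fitted by Monte Carlo; as a much cruder stand-in, the sketch below tracks the fraction of correlation-matrix variance captured by the leading eigenvector (the "market mode") in a rolling window, on synthetic one-factor returns. The window length, factor strength and sample sizes are arbitrary.

import numpy as np

def coherence_proxy(returns, window=60):
    """returns: (days, stocks) array of daily returns."""
    days, n = returns.shape
    out = []
    for end in range(window, days + 1):
        corr = np.corrcoef(returns[end - window:end].T)
        out.append(np.linalg.eigvalsh(corr)[-1] / n)   # share of variance in the market mode
    return np.array(out)

rng = np.random.default_rng(8)
market = rng.normal(size=500)                          # common driver of all stocks
returns = 0.5 * market[:, None] + rng.normal(size=(500, 40))
print("mean coherence proxy:", coherence_proxy(returns).mean())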