Economics (E) Session 6
Time and Date: 13:45 - 15:30 on 22nd Sep 2016
Room: B - Berlage zaal
Chair: Dexter Drupsteen
|491|| Agent Based Model Exploration and Calibration using Machine Learning Surrogates
Abstract: Bringing Agent-Based Models closer to the data is an open challenge. Getting closer to the data facilitates comparison with more standard approaches and promotes Agent-Based Models as a methodology. In this paper, we treat parameter space exploration as a supervised machine learning problem and introduce machine learning surrogates as a fast and efficient means to identify positive calibrations in the parameter space. The procedure involves adaptively sampling a small number of simulations from the Agent-Based Model in the "active" learning setting, measuring the calibration quality of parameter combinations against real data with a chosen statistical hypothesis test, learning a powerful machine learning surrogate or "meta-model" on these "training" samples, and rapidly filtering positive calibrations out of the parameter space for evaluation. Dramatic time savings are demonstrated by replacing the expensive Agent-Based Model with the machine learning surrogate. Though surrogates could in principle replace the agent-based model, we pursue the simpler objective of filtering positive calibrations. Our aim is to provide a fast and efficient tool to explore the parameter space, while enabling policy-makers to evaluate and choose particular parameterizations of interest, which can then be studied directly via the agent-based model. Ultimately, we do not wish to replace the agent-based model, but to help accelerate the turn-around time from real data to agent-based model calibrations that respect economic intuition and convey economic insight.
We illustrate our approach by filtering positive calibrations (using the standard two-sample Kolmogorov-Smirnov test against the daily Standard and Poor's 500 Index) for the simple agent-based asset pricing model introduced in "Heterogeneous beliefs and routes to chaos in a simple asset pricing model" by Brock and Hommes (1998), over ten parameters with generous ranges.
|Francesco Lamperti, Antoine Mandel, Andrea Roventini and Amir Sani|
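A minimal sketch of the surrogate-filtering idea in the abstract above, under loud assumptions: the "ABM" is a one-parameter toy simulator, the surrogate is a plain k-nearest-neighbour average rather than the authors' meta-model, and the sampling is random rather than active. It only illustrates the workflow of scoring a few expensive simulations with a KS statistic and then filtering a dense parameter grid cheaply.

```python
import numpy as np

rng = np.random.default_rng(0)

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic (max ECDF distance)."""
    a, b = np.sort(a), np.sort(b)
    data = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, data, side="right") / len(a)
    cdf_b = np.searchsorted(b, data, side="right") / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

def toy_abm(theta, n=500):
    """Stand-in for the expensive ABM: returns whose volatility
    depends on the single parameter theta."""
    return rng.normal(0.0, 0.5 + theta, size=n)

# "Real" data the calibration targets (synthetic here, vol ~ 1.0).
real = rng.normal(0.0, 1.0, size=500)

# Step 1: run a small number of expensive simulations and score them.
train_theta = rng.uniform(0.0, 2.0, size=30)
train_score = np.array([ks_statistic(toy_abm(t), real) for t in train_theta])

def surrogate(theta, k=3):
    """k-nearest-neighbour surrogate for the KS score (cheap to query)."""
    idx = np.argsort(np.abs(train_theta - theta))[:k]
    return train_score[idx].mean()

# Step 2: rapidly filter a dense parameter grid with the surrogate.
grid = np.linspace(0.0, 2.0, 201)
predicted = np.array([surrogate(t) for t in grid])
candidates = grid[predicted < 0.1]   # "positive" calibrations to inspect
print(candidates.size)
```

Only the filtered candidates would then be re-run through the real (expensive) model, which is the time saving the abstract describes.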
|558|| The Echoes of Bandwagon Through a Complex System of Innovation and Development
Abstract: Dating back to Schumpeter, the literature on innovation has evolved to the point of elevating its object of study to the status of one of the main forces driving economic growth and development. The fact that Solow's TFP black box is not so black anymore probably has something to do with understanding how the engine of innovation is greased. In this paper, we investigate whether one of the cogwheels of this engine might be the bandwagon behaviour of consumers and its impact on the firm's decision to engage in a certain type of innovative process. To do so, we introduce a new framework for complex agent-based models, different from the commonly used small-world network, which we call the Spatial Dynamic Awareness Model. Consumers have heterogeneous stochastic thresholds with respect to what we call their "profile" towards new products, following the distribution proposed by Moore (2005) as a baseline. They also have spatial mobility and bounded rationality (awareness), acquiring information and interacting only with agents inside their awareness radius to evaluate how many others are using a given product or technology and, ultimately, to decide at each point in time whether to change their product of choice. Firms, on the other hand, cannot see individual preferences, but analyse market saturation and concentration to decide on the amount of R&D investment and between process and product innovation. Simulations suggest that a society with a greater share of technology-enthusiast individuals yields faster saturation and de-concentration of the relevant market, generating more product than process innovations, and higher mean prices and profits. We hope to reward the attendants of our presentation with new insights on network modelling and the importance of behavioural economics in better understanding the micro-macro process of innovation and economic development.
|João Basilio Pereima and Pedro Einloft|
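The consumer side of the abstract above can be sketched as a threshold-adoption model with an awareness radius. This is not the authors' Spatial Dynamic Awareness Model; all numbers (200 agents, radius 0.15, Beta-distributed thresholds, a clustered set of early adopters) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

N, R, STEPS = 200, 0.15, 40
pos = rng.uniform(0, 1, size=(N, 2))     # agent positions in the unit square
threshold = rng.beta(2, 5, size=N)       # heterogeneous adoption thresholds
adopted = np.zeros(N, dtype=bool)
# early adopters: a spatially clustered group near one corner
seed_idx = np.argsort(pos.sum(axis=1))[:10]
adopted[seed_idx] = True

for _ in range(STEPS):
    # bounded rationality: agents only see others inside the awareness radius
    d = np.linalg.norm(pos[:, None] - pos[None, :], axis=2)
    visible = (d < R) & ~np.eye(N, dtype=bool)
    n_seen = visible.sum(axis=1)
    frac = np.where(n_seen > 0,
                    (visible & adopted).sum(axis=1) / np.maximum(n_seen, 1),
                    0.0)
    adopted |= frac > threshold          # bandwagon: adopt if enough neighbours did
    # spatial mobility: small random moves, clipped to the square
    pos = np.clip(pos + rng.normal(0, 0.02, size=(N, 2)), 0, 1)

print(adopted.mean())
```

The `adopted |= ...` update makes adoption absorbing; a richer model would let agents switch back, as the abstract's "change their product of choice" suggests.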
|233|| Emergence of social networks due to human mobility
Abstract: There is a recent burst of work on human mobility and social networks. However, the connection between these two important fields is still in its infancy. It is clear that both are closely related: people tend to visit popular places in a city with some frequency, meeting other people there. If this occurs often, a friendship or acquaintance may emerge, linking people together. On the other hand, once a social network is established, people tend to go together to the same places. In this way, there is feedback between human mobility in space and the structure of the social network: mobility generates friends, and friends move together. We model this situation with random walkers that visit places in space following a strategy akin to Lévy flights. We measure the encounters or coincidences in space and time and establish a link between walkers after they coincide several times. This generates a temporal network that is characterized by global quantities. We compare these dynamics with real data for two big cities: New York City and Tokyo. We use data from the location-based social network Foursquare and obtain the emergent temporal encounter network for both cities, which we analyze in detail and compare with our model. Even though there are differences between the two cities, there are some common features: for instance, a long-range (Lévy-like) distribution of distances that characterizes the emergent social network due to mobility in cities. This study contributes to the unification of two important fields: social networks and human mobility. Applications and implications for several fields such as epidemics, social influence, voting, contagion models, behavioral adoption and diffusion of ideas will be discussed.
|Jose L. Mateos and Alejandro P. Riascos|
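The mechanism in the abstract above can be sketched in a few lines: walkers take Lévy-like jumps on a grid, coincidences in space and time are counted, and a link forms once a pair has coincided enough times. The grid size, walker count, Pareto exponent and link threshold are all illustrative assumptions, not the authors' calibrated values.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)

N, L, STEPS, K = 30, 20, 300, 3   # walkers, grid side, time steps, link threshold

pos = rng.integers(0, L, size=(N, 2))
meetings = np.zeros((N, N), dtype=int)

def levy_step(alpha=1.5):
    """Power-law (Levy-like) jump length in a random direction."""
    r = int(rng.pareto(alpha)) + 1
    angle = rng.uniform(0, 2 * np.pi)
    return np.array([round(r * np.cos(angle)), round(r * np.sin(angle))], dtype=int)

for _ in range(STEPS):
    for i in range(N):
        pos[i] = (pos[i] + levy_step()) % L        # periodic boundaries
    # record coincidences in space and time
    for i, j in combinations(range(N), 2):
        if np.array_equal(pos[i], pos[j]):
            meetings[i, j] += 1

# link walkers that coincided at least K times -> emergent encounter network
edges = [(i, j) for i, j in combinations(range(N), 2) if meetings[i, j] >= K]
print(len(edges))
```

Time-stamping each coincidence instead of only counting it would give the temporal network the abstract analyzes.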
|118|| Using statistical symmetries to characterize binary time series of the foreign exchange market
Abstract: We use the concept of statistical symmetry as the invariance of a probability distribution under a transformation to analyze the sign dynamics of price differences in the foreign exchange market. Using a local hypothesis test with a stationary Markov process as the model, we characterize different intervals of the sign time series of price differences as symmetric or not with respect to the symmetries of independence and space odd reversion. For the test, we derive the probability that a binary Markov process generates a given set of counts of symbol pairs. As a particular result, we find that the foreign exchange market is essentially space odd reversible - interpreted as time reversible - but that this symmetry is broken when there is a strong external influence. We also find that above a resolution of 90 s the intervals of the sign time series are statistically symmetric, implying that the direction of price movements in the market can be described by an independent random process.
|Arthur Matsuo Yamashita Rios de Sousa, Hideki Takayasu and Misako Takayasu|
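A simplified sketch of a symbol-pair symmetry test in the spirit of the abstract above. Space odd reversion maps a pair (a, b) to (-b, -a), which exchanges the (+,+) and (-,-) counts, so under the symmetry those two counts are exchangeable; here this is checked with an exact binomial test rather than the authors' Markov-process likelihood, which is an assumption of this sketch.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(3)

def pair_counts(s):
    """Count the four symbol pairs (+1,+1), (+1,-1), (-1,+1), (-1,-1)."""
    a, b = s[:-1], s[1:]
    return {(x, y): int(np.sum((a == x) & (b == y)))
            for x in (1, -1) for y in (1, -1)}

def odd_reversion_pvalue(s):
    """Two-sided exact binomial test (p = 1/2) on n(+,+) vs n(-,-),
    which space odd reversion exchanges."""
    c = pair_counts(s)
    n_pp, n_mm = c[(1, 1)], c[(-1, -1)]
    n = n_pp + n_mm
    if n == 0:
        return 1.0
    # sum probabilities no larger than that of the observed count
    tail = sum(comb(n, k) for k in range(n + 1)
               if comb(n, k) <= comb(n, n_pp)) / 2 ** n
    return min(1.0, tail)

signs = rng.choice([1, -1], size=2000)                  # i.i.d. signs: symmetric
p_sym = odd_reversion_pvalue(signs)

biased = rng.choice([1, -1], size=2000, p=[0.7, 0.3])   # asymmetric sign dynamics
p_asym = odd_reversion_pvalue(biased)
print(p_sym > 0.05, p_asym < 0.05)
```

A local version of the test, as in the abstract, would apply this to sliding windows of the sign series rather than to the whole sample.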
|218|| Analysis, prediction and control of technological progress
Abstract: Technological evolution is one of the main drivers of social and economic change, with transformative effects on most aspects of human life. How do technologies evolve? How can we predict and influence technological progress? To answer these questions, we looked at the historical records of the performance of multiple technologies. We first evaluate simple predictions based on a generalised version of Moore's law: all technologies have unit costs that decrease exponentially, but at technology-specific rates. We then look at a more explanatory theory which posits that experience, measured as cumulative production, drives technological progress. These experience curves work relatively well in terms of forecasting, but in reality technological progress is a very complex process. To clarify the role of different causal mechanisms, we also study military production during World War II, where it can be argued that demand and other factors were exogenous. Finally, we analyse how best to allocate investment between competing technologies. A decision maker faces a trade-off between specialisation and diversification which is influenced by technology characteristics, risk aversion, demand and the planning horizon. Our methods are used to provide distributional forecasts for the cost of photovoltaic modules at different horizons, making it possible to evaluate their potential to provide an inexpensive source of energy within a relatively short horizon.
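The two fits named in the abstract above can be sketched on synthetic data: a generalised Moore's law is a log-linear regression of unit cost on time, and an experience (Wright's law) curve is a log-log regression of unit cost on cumulative production. The decline rate, noise level and production growth are made-up numbers for illustration only.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic unit-cost history: exponential decline plus noise.
years = np.arange(1990, 2016)
t = years - years[0]
log_cost = 5.0 - 0.10 * t + rng.normal(0, 0.1, len(years))   # ~10%/yr drop

# Generalised Moore's law: fit log(cost) = a + b * t by least squares.
b, a = np.polyfit(t, log_cost, 1)
forecast_2020 = np.exp(a + b * (2020 - years[0]))            # point forecast

# Wright's law / experience curve: cost falls as a power of cumulative production.
production = np.exp(0.15 * t)                                # growing annual output
cum_prod = np.cumsum(production)
w, c = np.polyfit(np.log(cum_prod), log_cost, 1)             # learning exponent
progress_ratio = 2 ** w                                      # cost multiplier per doubling of experience

print(round(b, 3), round(progress_ratio, 3))
```

The paper's distributional forecasts would add an error model around these point fits; here only the central trend is recovered.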
|373|| Portfolio Optimization under Expected Shortfall: Contour Maps of Estimation Error
Abstract: The contour maps of the error of historical estimates for large random portfolios optimized under the Expected Shortfall (ES) risk measure are constructed. Similar maps for the sensitivity of the portfolio weights to small changes in the returns are also presented. The contour maps allow one to quantitatively determine the sample size (the length of the time series) required by the optimization for a given number of different assets in the portfolio, at a given confidence level and a given level of relative estimation error. The necessary sample sizes turn out to be unrealistically large for reasonable choices of the number of assets and the confidence level. These results are obtained via analytical calculations based on methods borrowed from the statistical physics of random systems, supported by numerical simulations.
|Fabio Caccioli, Imre Kondor and Gábor Papp|
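A toy numerical illustration of the estimation-error theme in the abstract above: the historical Expected Shortfall estimate of a single Gaussian return series is compared to its exact value, and its relative error is seen to shrink as the sample length grows. This is far simpler than the paper's setting (no portfolio optimization, no contour maps); the confidence level and sample lengths are illustrative choices.

```python
import numpy as np
from math import exp, pi, sqrt

rng = np.random.default_rng(5)

def historical_es(returns, beta=0.975):
    """Historical Expected Shortfall: mean loss beyond the beta-quantile."""
    losses = np.sort(-returns)                  # losses, ascending
    k = int(np.ceil((1 - beta) * len(losses)))  # number of tail observations
    return losses[-k:].mean()

# Exact ES of a standard normal at beta = 0.975: phi(z) / (1 - beta).
z = 1.959963984540054                           # 97.5% normal quantile
true_es = exp(-z * z / 2) / (sqrt(2 * pi) * (1 - 0.975))

# Relative estimation error of the historical estimate vs sample length T.
rel_errs = []
for T in (250, 1000, 10000):
    estimates = [historical_es(rng.normal(size=T)) for _ in range(200)]
    rel_errs.append(np.std(estimates) / true_es)

print([round(e, 3) for e in rel_errs])
```

Inverting this kind of error-vs-sample-size curve, for optimized portfolios of many assets, is what the paper's contour maps do analytically.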