Economics & Socio-Ecology (ES) Session 1

Time and Date: 14:15 - 15:45 on 19th Sep 2016

Room: J - Derkinderen kamer

Chair: Dexter Drupsteen

153 Analysis of collaboration networks enhanced through NRF early career funding program in Singapore [abstract]
Abstract: The objective of our research is to investigate how early career research funding enhances the output of National Research Foundation (NRF) Fellows, by studying their co-authorship networks and visualizing emerging research topics of the NRF 2008 cohort through topic modeling of their paper abstracts. The NRF Fellowship offers substantial grants worth up to S$3 million (~ € 1.9 M) over 5 years and is open to international applicants without restriction on nationality. Since 2008, the NRF Fellowship has attracted over a thousand top scientists and awarded nearly a hundred fellowships to date. We analyze collaboration patterns by performing network analysis on the 2008 NRF cohort. In addition, we study network influence and robustness by examining the graph density and degree distribution of the co-authorship networks. Collaboration patterns are also examined at a geographic level to understand whether research collaborations are local in nature or distributed globally. Lastly, topic modeling with Latent Dirichlet Allocation (LDA) is used as a potential tool for identifying emerging research topics so that research funding agencies can better support these areas. Increased density of collaborations during the Fellowship period can be clearly observed for NRF Fellows in the co-authorship network graphs. The geographical diversity of collaboration for Fellows is also higher than that of Finalists (who did not receive the award), as evidenced through Circos visualization. Lastly, a visualization of topic models shows that NRF Fellows are conducting research on emerging topics ranging from genetic engineering to graphene. Traditionally, bibliometric methods have been used to measure the output of researchers. We have used alternative methods from network theory and computer science to analyze the dividends of an early career research funding program, which leads to greater cross-border academic collaborations and research on emerging topics that may produce critical breakthroughs for future industries.
Anand Gautam, Giovanni Ko, Walter Theseira and Michael Khor
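The graph density and degree-distribution measures used in this abstract can be computed directly from lists of each paper's authors. A minimal stdlib-Python sketch (the function and variable names are ours, not the authors'):

```python
from collections import defaultdict
from itertools import combinations

def coauthorship_graph(papers):
    """Build an undirected co-authorship graph from a list of author lists."""
    adj = defaultdict(set)
    for authors in papers:
        for a, b in combinations(sorted(set(authors)), 2):
            adj[a].add(b)
            adj[b].add(a)
    return adj

def density(adj):
    """Fraction of possible author pairs that actually co-authored."""
    n = len(adj)
    if n < 2:
        return 0.0
    edges = sum(len(nbrs) for nbrs in adj.values()) // 2
    return 2 * edges / (n * (n - 1))

def degree_distribution(adj):
    """Map degree -> number of authors with that degree."""
    dist = defaultdict(int)
    for nbrs in adj.values():
        dist[len(nbrs)] += 1
    return dict(dist)

# Toy example: four authors, three papers
papers = [["A", "B"], ["A", "C"], ["B", "C", "D"]]
g = coauthorship_graph(papers)
print(density(g))                # 5 of 6 possible author pairs co-authored
print(degree_distribution(g))
```

An increase in density during the Fellowship period, as reported in the abstract, would show up as this fraction growing when the graph is rebuilt over successive time windows.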
183 A co-evolutionary approach to sustainability transitions in a two-sector economic growth model [abstract]
Abstract: The Anthropocene is thought of as the age of humans, in which, in the context of a World-Earth [or social-ecological Earth] system, man-made processes and environmental dynamics can no longer be treated separately. It is at the core of human agency to keep the trajectory of this coupled system within the boundaries of a safe and just operating space, to ensure prosperity for future generations. In all the common business-as-usual scenarios of future economic development and greenhouse gas emissions, this is not likely to happen. Therefore, we aim to investigate sustainability transitions towards independence from fossil resources. The German Energiewende has shown that besides economic realities, social dynamics such as opinion spreading can play a significant role in a society's choice of energy sources and thereby its resource dependence. Consequently, we study the co-dependence and co-evolution of social, economic and resource dynamics. Since we are interested in the qualitative behavior of this complex system, such as transient behavior and phase transitions, rather than quantitative predictions, we use a conceptual model for our studies. This model combines a resource-dependent two-sector economic growth model with heterogeneous households, fast-and-frugal heuristics for household decision making, and an adaptive network approach to opinion spreading among households. We use analytic and numeric tools to analyze the model's state space, to gain insight into its topological structure and phase transitions depending on parameter choices, and to point out trajectories that lead to favorable resource dependencies.
Jakob Kolb and Jobst Heitzig
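The adaptive network approach to opinion spreading mentioned in the abstract is commonly implemented as a rewire-or-adopt voter model; the sketch below is a generic version of that mechanism, not the authors' exact household model, and all parameter names are ours:

```python
import random

def adaptive_voter(n, edges, phi, steps, seed=0):
    """Adaptive-network opinion dynamics: at each step pick a random
    discordant link; with probability phi its first endpoint rewires it to a
    like-minded node, otherwise the endpoint adopts the neighbor's opinion."""
    rng = random.Random(seed)
    opinion = [rng.randrange(2) for _ in range(n)]
    adj = {i: set() for i in range(n)}
    while sum(len(s) for s in adj.values()) // 2 < edges:   # random initial graph
        a, b = rng.randrange(n), rng.randrange(n)
        if a != b:
            adj[a].add(b)
            adj[b].add(a)
    for _ in range(steps):
        discordant = [(i, j) for i in adj for j in adj[i]
                      if opinion[i] != opinion[j]]
        if not discordant:
            break                          # consensus within every component
        i, j = rng.choice(discordant)
        if rng.random() < phi:             # rewire to a like-minded node
            candidates = [l for l in range(n)
                          if l != i and opinion[l] == opinion[i]
                          and l not in adj[i]]
            if candidates:
                adj[i].discard(j)
                adj[j].discard(i)
                l = rng.choice(candidates)
                adj[i].add(l)
                adj[l].add(i)
        else:                              # adopt the neighbor's opinion
            opinion[i] = opinion[j]
    return opinion, adj

op, g = adaptive_voter(n=50, edges=100, phi=0.3, steps=2000)
```

Depending on the rewiring rate phi, such models either reach consensus or fragment into internally homogeneous components, which is the kind of phase transition the abstract refers to.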
386 Inference of phylogenetic structure from the interaction matrices of mutualistic ecosystems [abstract]
Abstract: Nestedness is a feature of many bipartite interaction networks found in nature and the social sciences. In nested bipartite networks, with two sets of nodes and edges only between nodes belonging to different sets, specialist nodes, namely those with fewer interactions, interact mainly with generalist nodes. Similar structures arise in trade networks as well (e.g. country-product bipartite networks), and their study is the main focus of the relatively new field of Economic Complexity. Both in ecosystems and in economics, such interaction structures result from an evolutionary process, and here we show to what extent it is possible to leverage them to infer phylogenetic relations among species, with methods devised in the context of economics. We project the bipartite network into two monopartite ones based on similarity of interactions and subsequently filter the network structure to keep only the most important links. We use independently collected phylogenetic data to assess the accuracy of the proposed methods in identifying phylogenetically related species using only information on their interactions. We compare the proposed methods against standard network approaches in their ability to isolate communities and topological structures of phylogenetically close species. Previous works on mutualistic interaction networks have already pointed out correlations between degree sequences and phylogenetic information. Here we show that in most cases the methods we use are able to extract information that is lost in randomized copies of the interaction networks where only the degree sequence is conserved.
Andrea Tacchella and Giacomo Banti
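The projection-and-filtering step described in the abstract can be illustrated as follows; the abstract does not specify the similarity measure or the filter, so Jaccard similarity with a fixed threshold is an illustrative stand-in, not the paper's method:

```python
from itertools import combinations

def project_and_filter(bipartite, threshold):
    """Project a bipartite interaction dict {species: set(partners)} onto a
    monopartite similarity network, keeping only links whose Jaccard
    similarity of interaction sets exceeds `threshold`."""
    edges = {}
    for u, v in combinations(bipartite, 2):
        a, b = bipartite[u], bipartite[v]
        sim = len(a & b) / len(a | b) if a | b else 0.0
        if sim > threshold:
            edges[(u, v)] = sim
    return edges

# Toy plant-pollinator matrix: plants and the insects they interact with
plants = {
    "P1": {"i1", "i2", "i3"},
    "P2": {"i1", "i2"},
    "P3": {"i4"},
}
print(project_and_filter(plants, 0.5))   # only the (P1, P2) link survives
```

Species that retain a link after filtering share most of their interaction partners, which is the signal the authors compare against independently collected phylogenetic data.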
435 A heterogeneous agent model of transition towards a circular economy [abstract]
Abstract: In this paper we analyse the transition towards a circular economy as a complex adaptive system, focusing on the contribution of underlying demand-side factors. To do this, a heterogeneous agent model, HAM (Brock and Hommes, 1997), is developed with a population of boundedly rational heterogeneous agents choosing between two varieties of a consumer good or service – a 'circular' and a 'non-circular' type. The model is further extended to include the effect of word of mouth by linking the HAM to a percolation model (Frenken, 2012). The model is highly nonlinear due to evolutionary switching between strategies and can exhibit a wide range of dynamical behavior, ranging from a unique stable steady state to complex dynamics with multiple equilibria, for which varying the parameters can change the qualitative structure of the set of solutions, with completely different economic outcomes. To classify these outcomes based on different parameter values, we use bifurcation analysis. Using this analysis we seek to answer the following question: what are the necessary behavioral and market conditions for obtaining a stable market share of the 'circular' type? An Environmentally Extended Input-Output database is used to calculate resource-efficiency indicators for these outcomes. The results and methods developed in this paper are applied to a case study of the second life of tires in the Netherlands.
Saeed Moghayer, Trond Husby and Hettie Boonman
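The evolutionary switching between strategies in a Brock-Hommes HAM follows a discrete-choice (logit) rule. A minimal sketch of that rule; the utility values below are placeholders, not calibrated to the paper's circular-economy model:

```python
import math

def switching_fractions(utilities, beta):
    """Brock-Hommes discrete-choice rule: the fraction of agents using each
    strategy is proportional to exp(beta * past utility), where beta is the
    intensity of choice (beta = 0: purely random choice; large beta: nearly
    all agents pick the best-performing strategy)."""
    weights = [math.exp(beta * u) for u in utilities]
    total = sum(weights)
    return [w / total for w in weights]

# Fractions choosing the 'circular' vs. the 'non-circular' variety, given
# hypothetical past utilities of 1.0 and 0.5
print(switching_fractions([1.0, 0.5], beta=2.0))
```

The intensity of choice beta is one of the parameters whose variation drives the bifurcations the abstract classifies.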
240 An evolving network model for the structure of visitors and services in a tourism destination [abstract]
Abstract: We present a growing network model to explain visitors' behavior in a tourism destination. Specifically, we build an evolving bipartite network with two categories of nodes, lodgings (H) and services/attractions (S). For every lodging, we assume a single representative tourist whose behavior is the average of the behavior of all tourists hosted in the lodging. A link between a lodging and a service appears if the representative tourist visits/enjoys the service during his/her stay in the destination. We assume that links are unweighted, undirected and permanent over time. The latter assumption means that, once a service is visited by a high enough number of tourists staying in a certain lodging, the preference for this service is maintained by successive guests. The bipartite network grows similarly to previous models for collaboration networks (e.g. Ramasco et al., Phys. Rev. E, 70, 036106, 2004). At any time, one new lodging and m new services are created in the destination. We assume that the representative tourist of every new lodging visits c different services, including the old and the m new ones, choosing some of them by preferential attachment and the rest by a random rule. We show analytically that the long-term degree distribution of services in the bipartite network follows a shifted power-law distribution. This is also the case for the degree of the one-mode projections. We have also tested the model with real data. Specifically, we have collected recommendations of lodgings and services in the destination of Maspalomas-Gran Canaria (Spain) published by tourists during the period 2005-2016, with a sample size of around 78,000 opinions on 222 hotels and 768 services/attractions. To the best of our knowledge, this is the first growing network model to represent the structure of supply and demand in a tourism destination.
Juan M Hernandez and Christian González
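The growth rule described in the abstract can be simulated directly; this is a minimal sketch in which a single probability p splits the c visits between preferential attachment and the random rule (the paper's exact split may differ, and the parameter names are ours):

```python
import random

def grow_destination(T, m, c, p, seed=0):
    """Grow a bipartite lodging-service network for T time steps.  Each step
    adds one lodging and m services; the new lodging's representative tourist
    links to c distinct services, chosen with probability p by preferential
    attachment (proportional to current service degree) and otherwise
    uniformly at random.  Returns the final service degree sequence."""
    rng = random.Random(seed)
    service_degree = []     # degree of each service, indexed by service id
    link_ends = []          # one entry per link: degree-proportional sampling
    for _ in range(T):
        service_degree.extend([0] * m)      # m new services, degree 0
        chosen = set()
        while len(chosen) < min(c, len(service_degree)):
            if link_ends and rng.random() < p:
                s = rng.choice(link_ends)               # preferential attachment
            else:
                s = rng.randrange(len(service_degree))  # uniform random rule
            chosen.add(s)
        for s in chosen:                    # add the new lodging's links
            service_degree[s] += 1
            link_ends.append(s)
    return service_degree

degrees = grow_destination(T=200, m=2, c=3, p=0.7)
```

Plotting the tail of `degrees` on log-log axes is the standard way to check the shifted power law derived analytically in the paper.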

Economics & Socio-Ecology (ES) Session 2

Time and Date: 16:15 - 18:00 on 19th Sep 2016

Room: R - Raadzaal

Chair: Neil Huynh Hoai Nguyen

445 Stock-dependent discrete-time dynamic pollution games [abstract]
Abstract: In this paper we address the dynamics of strategic actions in a pollution game model where the payoff matrix of the game depends endogenously on the accumulation or depletion of an environmental stock variable described by a non-linear discrete-time dynamical system. The model has the structure of a stock-dependent dynamic game, that is, a game where the payoffs to the players change with the evolution of an external state variable, which in turn depends on the actions of the players. More specifically, we analyze a two-player stock-dependent dynamic game in which the stage game is a normal-form game with myopic players who use Markov stationary strategies, that is, they play a stage-game Nash equilibrium in each period. A dynamic consequence of stock-dependent payoffs is that the qualitative structure of the solution of the stock-dependent stage game might change as the stock variable evolves. The method is applied to discrete-time dynamic ecological-economic interest conflicts in the lake pollution problem: by varying the economic parameters, we study the pattern of possible changes in the Nash equilibrium structure of the stage game played by the farmers at each time period as a result of changes in the payoff ranking, and classify the corresponding economic outcomes and environmental impacts. The qualitative changes in the game dynamics are represented geometrically in payoff space using the notion of a bifurcation diagram. The diagram is partitioned into twelve different regions corresponding to different types of two-by-two games. We also vary the stiffness, or responsiveness, of the lake and analyze the consequences for the dynamics of the game.
Saeed Moghayer, Florian Wagener and Paolo Zeppini
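The core mechanism, a stage-game equilibrium that switches as the stock evolves, can be shown on a toy example. This is in no way the paper's lake model; the payoffs and the linear stock dynamics below are our own illustrative choices:

```python
def stage_nash(P1, P2):
    """Pure-strategy Nash equilibria of a 2x2 normal-form game, given each
    player's payoff matrix P[i][j] (i = row strategy, j = column strategy)."""
    eqs = []
    for i in (0, 1):
        for j in (0, 1):
            if P1[i][j] >= P1[1 - i][j] and P2[i][j] >= P2[i][1 - j]:
                eqs.append((i, j))
    return eqs

def simulate(x0, T, b=0.6, emit=0.5, gain=1.0, k=0.5):
    """Myopic play of a toy stock-dependent pollution game: strategy 1
    (pollute) yields `gain` minus a damage k*x that grows with the pollution
    stock x, so the stage-game equilibrium can switch as the stock evolves."""
    x, history = x0, []
    for _ in range(T):
        net = gain - k * x                # net payoff of polluting at stock x
        P1 = [[0.0, 0.0], [net, net]]     # row player's payoffs
        P2 = [[0.0, net], [0.0, net]]     # column player's payoffs
        i, j = stage_nash(P1, P2)[0]      # play a stage-game Nash equilibrium
        history.append((i, j))
        x = b * x + emit * (i + j)        # linear stock accumulation/decay
    return history

h = simulate(0.0, 6)
```

Starting from a clean lake, both players pollute until the accumulated stock makes damage exceed the gain, at which point the stage-game equilibrium flips to mutual abatement; the stock then decays and polluting becomes attractive again.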
201 A Dynamic Network Of Physical Contacts Derived From A Multiplex Social Network [abstract]
Abstract: We present our work on a multiplex block-structured social network among a statistically sampled population, which we use to induce a time-varying network of physical contacts. We generate up to 8 million nodes with attributes such as age, sex and place of residence, and allocate them into blocks on different layers/regimes, such as households and workplaces, based on statistical and demographic data. From survey data we sample social relations among the nodes, which result in physical contacts with a predefined periodicity. Sampling of social relations happens either on a per-block and per-layer basis or based on additional assortative (wealth and geographic position) and clustering principles. It is clear that sampling of such "high-dimensional" structures involves a great number of heuristics and assumptions (e.g. commuting area for workplaces, ...). First, we try to assess the influence of some assumptions and identify steps and concepts in the sampling approach that are irrelevant or negligible to the resulting social network structure. Second, we try to validate the resulting structure against other contact models by investigating classical measures as well as simple epidemic spread simulations on the secondary quasi-dynamic contact network (which is also a main application scenario). Furthermore, we try to find network generation algorithms and parameters that deliver the same key characteristics as the data-driven network generation approach.
Günter Schneckenreither and Niki Popper
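The per-block allocation step can be illustrated with the simplest possible case, drawing household sizes from a statistical distribution; the distribution and function names below are hypothetical, not the authors' demographic data:

```python
import random

def sample_households(n, size_dist, seed=0):
    """Allocate n individuals (ids 0..n-1) into household blocks whose sizes
    are drawn from size_dist = {household_size: probability}.  The last
    household is truncated so that exactly n individuals are placed."""
    rng = random.Random(seed)
    sizes = sorted(size_dist)
    weights = [size_dist[s] for s in sizes]
    households, placed = [], 0
    while placed < n:
        s = min(rng.choices(sizes, weights)[0], n - placed)
        households.append(list(range(placed, placed + s)))
        placed += s
    return households

# Toy size distribution: 30% singles, 40% couples, 30% three-person households
hh = sample_households(100, {1: 0.3, 2: 0.4, 3: 0.3})
```

Workplace, school, or leisure layers would be sampled analogously, each with its own size distribution and assortativity constraints, and contact links would then be drawn within blocks.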
337 Complex economies have a lateral escape from the poverty trap [abstract]
Abstract: In the standard view of the industrialization process, countries face a barrier to escaping the poverty trap: a monetary threshold defined in terms of average wage (as measured, for example, by GDP per capita, GDPpc in the following) or physical capital. When such a threshold is reached, a self-feeding process quickly brings the country from one state of equilibrium (the poverty trap) to another, catching up with the fully developed countries. During this transition, growth is mainly due to inputs, that is, to capital investments and increases in the labor force. This view, if correct, should lead to a positive relation between input-driven growth and GDPpc for low values of GDPpc, and a negative relation for high values of GDPpc, i.e. to an upside-down parabola. Only the first relation is supported by the empirical evidence. In this work we use a non-monetary measure of the economic complexity of a country, called Fitness, and we find that complex economies start to industrialize at a lower threshold. Conversely, if the Fitness is low, sustainable growth can be reached only if a higher monetary threshold is attained. As a consequence, we can introduce the concept of a two-dimensional poverty trap: a country will start the industrialization process if it is not complex but rich (following the neo-classical economic theories), if it is poor but very complex (exploiting this new dimension and laterally escaping from the poverty trap), or through a linear combination of the two. This naturally leads to our proposal of a Development Index that, when studied as a function of input-driven growth, shows the upside-down parabola expected from the standard economic theories. Reference: arXiv:1511.08622.
Emanuele Pugliese, Andrea Zaccaria, Guido Chiarotti and Luciano Pietronero
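The Fitness measure used here comes from the non-linear Fitness-Complexity iteration introduced by the same group (Tacchella et al., 2012). A minimal sketch of that iteration on a toy nested matrix (the normalization-to-mean-1 convention follows the original algorithm; every product must have at least one exporter):

```python
def fitness_complexity(M, iters=50):
    """Fitness-Complexity iteration on a binary country x product export
    matrix M (list of rows).  Country Fitness sums the complexities of its
    products; product Complexity is dominated by the least fit exporter."""
    nc, npr = len(M), len(M[0])
    F, Q = [1.0] * nc, [1.0] * npr
    for _ in range(iters):
        Ft = [sum(M[c][p] * Q[p] for p in range(npr)) for c in range(nc)]
        Qt = [1.0 / sum(M[c][p] / F[c] for c in range(nc))
              for p in range(npr)]
        F = [f * nc / sum(Ft) for f in Ft]       # renormalize to mean 1
        Q = [q * npr / sum(Qt) for q in Qt]
    return F, Q

# Perfectly nested toy matrix: country 0 exports everything, while
# product 2 is exported only by the most diversified country
M = [[1, 1, 1],
     [1, 1, 0],
     [1, 0, 0]]
F, Q = fitness_complexity(M)
```

On the nested matrix, the diversified country ends up with the highest Fitness and the exclusive product with the highest Complexity, which is the ranking behaviour the two-dimensional poverty-trap argument relies on.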
360 Games of corruption in preventing the overuse of common-pool resources [abstract]
Abstract: Overuse and corruption are major obstacles to managing common-pool resources. Although corruption has traditionally been studied in economics, evolutionary game theory offers certain advantages when analyzing corruption in the context of managing common-pool resources. By incorporating the selection process, evolutionary game theory explicitly accounts for the temporal dynamics of the investigated system. This inclusion of the selection process permits the coupling of evolutionary and resource dynamics to create a more general and powerful analytical framework for the management of common-pool resources. Within such a framework, it is possible to study (i) the influence of ecological parameters (e.g. the resource growth rate) on the prospects that a resource will be overused and (ii) the efficiency of decision making in situations where the resource is still in a transient state due to the slow convergence of the resource dynamics. The main features of our analytical framework are: (i) an arbitrary number of harvesters share the responsibility to sustainably exploit a specific part of an ecosystem, (ii) harvesters face three strategic choices for exploiting the resource, (iii) a delegated enforcement system is available if called upon, (iv) enforcers are either honest or corrupt, and (v) the resource abundance reflects the choice of harvesting strategies. The resulting dynamical system is bistable; depending on the initial conditions, it evolves either to cooperative or to defecting equilibria. Using the domain of attraction of the cooperative equilibria as an indicator of successful management, we find that the more resilient the resource (i.e. the higher the growth rate), the more likely corruption is to dominate and suppress the cooperative outcome. A qualitatively similar result arises when slow resource dynamics relative to the dynamics of decision making mask the benefit of cooperation. We discuss the implications of these results in the context of managing common-pool resources.
Marko Jusup, Joung-Hun Lee and Yoh Iwasa
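The selection process underlying this framework is replicator dynamics; a generic single-population sketch with a two-strategy coordination game shows the bistability mechanism (the paper's three-strategy, resource-coupled model is richer, and the payoff matrix here is ours):

```python
def replicator_step(x, A, dt=0.01):
    """One Euler step of the replicator dynamics x_i' = x_i * (f_i - fbar),
    with fitness f = A x, for strategy frequencies x and payoff matrix A."""
    f = [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(x))]
    fbar = sum(xi * fi for xi, fi in zip(x, f))
    return [xi + dt * xi * (fi - fbar) for xi, fi in zip(x, f)]

# Bistability in a toy 2-strategy coordination game: the initial condition
# decides which equilibrium the selection process reaches.
A = [[2.0, 0.0], [0.0, 1.0]]
x = [0.6, 0.4]          # starts inside the basin of the first strategy
y = [0.2, 0.8]          # starts inside the basin of the second strategy
for _ in range(3000):
    x = replicator_step(x, A)
    y = replicator_step(y, A)
```

Measuring the size of each basin of attraction, as the authors do with the domain of attraction of the cooperative equilibria, then amounts to scanning initial conditions and recording where each trajectory ends up.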
576 Nonlinear analysis of climate data by techniques of complex networks - ClimNet [abstract]
Abstract: Climate analysis is a fast-growing field of study with many economic, political and social implications. In recent years many techniques of complex network analysis have found application in climate research. This big-data approach consists of embedding patterns found in climate variables, such as pseudo-periodic changes in temperature, pressure or rainfall, into the topology of complex networks by means of appropriate linear and nonlinear measures. These patterns are also found to act on various time scales, such as synoptic atmospheric waves in the extra-tropics or longer time-scale events in the tropics. The analysis of this topology can yield insight into the actual behaviour of regional or global climate, taking advantage of the well-developed branch of network analysis. In this paper we present ClimNet, a software toolkit for the construction of complex networks' adjacency matrices from climate time series. Apart from finding linear and nonlinear relationships, ClimNet also provides ways to tune relationships to different time scales by means of symbolic ordinal analysis. The functions available in the software and their performance on multi-core platforms are introduced in this work.
J. Ignacio Deza and Hisham Ihshaish
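Symbolic ordinal analysis usually refers to the Bandt-Pompe scheme: each short window of a time series is mapped to the permutation that sorts it. A minimal sketch of that scheme (ClimNet's actual interface is not described in the abstract):

```python
from collections import Counter

def ordinal_patterns(series, m):
    """Bandt-Pompe symbolic ordinal analysis: map each length-m window of a
    time series to the permutation of indices that sorts its values, and
    count how often each pattern occurs."""
    counts = Counter()
    for i in range(len(series) - m + 1):
        window = series[i:i + m]
        pattern = tuple(sorted(range(m), key=lambda idx: window[idx]))
        counts[pattern] += 1
    return counts

# Rising, peaked, and falling windows map to distinct ordinal patterns
counts = ordinal_patterns([1, 2, 3, 2, 1], m=3)
```

Comparing the pattern distributions of two climate time series (e.g. via mutual information between their symbol sequences) then gives a nonlinear, time-scale-tunable similarity with which network links can be defined.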
500 Scaling laws in Cities - a statistical mechanics approach [abstract]
Abstract: The main function and essence of the city is to generate interactions in space and time between different individuals. Human settlements leverage interactions by spatially joining their inhabitants. According to Bettencourt, cities are organized as networks: the mere fact that individuals are spatially close increases the number of potential encounters between them. Theoretically, the probability of encounters and interactions grows with the population and produces reductions in transaction and communication costs, among other co-location advantages. Socioeconomic properties of a city seem to grow faster than linearly with population, which is called superlinear scaling. Conversely, the larger a city, the more efficient it is in the use of its infrastructure, leading to sublinear scaling. To build a quantitative theory of cities we must take into account the city's geometry to calculate the aggregate quantities that generate social and infrastructure indices. In the present work we propose a simple explanation for the scaling laws in cities based on the fractal properties of cities as well as on the behavior of individuals. We follow a statistical mechanics approach, and the result is a very simple model that yields the scaling of social and infrastructure indicators. To this end, we introduce a measure of social potential energy which captures the influence of social interaction on economic performance and, in the case of infrastructure, the benefits of the facilities offered by the city. We assume that the population density depends on the fractal dimension and that the intensity of individual interactions decays with distance. As a result we obtain power-law scaling for social and infrastructure indices. These scalings are consistent with empirical data. An agent-based model was developed to run computational experiments and test the hypothesis.
Fabiano Ribeiro, Joao Meirelles, Camilo Rodrigues Neto and Fernando Fagundes Ferreira
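The empirical consistency check for a scaling law Y ∝ N^β is an ordinary least-squares fit in log-log coordinates across cities. A self-contained sketch with synthetic data (the exponent 1.15 below is only a typical superlinear value used for illustration):

```python
import math

def scaling_exponent(pop, indicator):
    """OLS fit of log(indicator) = log(a) + beta * log(pop) across cities;
    returns the scaling exponent beta (> 1 superlinear, < 1 sublinear)."""
    xs = [math.log(p) for p in pop]
    ys = [math.log(y) for y in indicator]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic superlinear socioeconomic indicator: Y = 3.2 * N^1.15
pop = [1e4, 1e5, 1e6, 1e7]
gdp = [3.2 * p ** 1.15 for p in pop]
beta = scaling_exponent(pop, gdp)
```

Running the same fit on an infrastructure indicator would, by the abstract's argument, return an exponent below 1.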

Economics & Socio-Ecology (ES) Session 3

Time and Date: 16:00 - 17:20 on 22nd Sep 2016

Room: E - Mendes da Costa kamer

Chair: Andrew Schauf

193 The development of countries on the product progression network [abstract]
Abstract: Is there a common path of development for different countries, or must each one follow its own way? To produce cars, must one first learn how to produce wheels? Let us represent countries as walkers in a network made of goods, defined such that if a country steps on a product, it will export it. Obviously, paths can be very different: while Germany has already explored much of the available space, underdeveloped countries have a long road ahead. Which are the best paths in the product network? To answer these questions we build a network of products starting from the UN-Comtrade data on international trade flows over time. A possible approach is to connect two products if many countries produce both of them. Since we want to study countries' dynamics, we also want our links to indicate whether one product is necessary to produce the other, like transistors for smartphones and wheels for cars. So our network is directed: a country usually goes from one product to another, but not vice versa. We introduce an algorithm that, starting from the empirical bipartite country-product network, is able to extract this kind of information. In particular, we project the bipartite network onto a filtered monopartite one in which a suitable normalization takes into account the nested structure of the system. We find that countries follow the direction of the links during industrialization. In other words, we are able to spot which products are helpful for starting to export new products. These results suggest paths in the product network that are easier to follow, and so can guide countries' policies in the industrialization process and in exiting the poverty trap. Reference: Zaccaria, A., Cristelli, M., Tacchella, A., and Pietronero, L., PloS one, 9(12), e113770 (2014).
Andrea Zaccaria, Matthieu Cristelli, Andrea Tacchella and Luciano Pietronero
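The key to obtaining a directed product network is an asymmetric projection of the bipartite country-product matrix. A simple stand-in for the paper's normalization (which is more refined; see the cited reference): weight the link from p to q by the fraction of p's exporters that also export q.

```python
def progression_weights(M):
    """Directed link weight from product p to product q: the fraction of
    countries exporting p that also export q.  Asymmetric, hence a directed
    network.  M is a binary country x product matrix (list of rows)."""
    npr = len(M[0])
    ubiquity = [sum(row[p] for row in M) for p in range(npr)]
    W = {}
    for p in range(npr):
        if not ubiquity[p]:
            continue                       # product exported by nobody
        for q in range(npr):
            if q == p:
                continue
            co_exporters = sum(row[p] * row[q] for row in M)
            W[(p, q)] = co_exporters / ubiquity[p]
    return W

# Toy matrix: product 0 is ubiquitous, product 2 is rare
M = [[1, 1, 0],
     [1, 0, 0],
     [1, 1, 1]]
W = progression_weights(M)
```

The asymmetry captures the intended direction: every exporter of the rare product also exports the ubiquitous one, but not vice versa, suggesting the ubiquitous product is stepped on first.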
397 The Effect of Marketing Strategies on the Percolation of Innovations in Social Networks with Negative Word-of-Mouth [abstract]
Abstract: Because real-world marketing experiments are costly, firms make use of diffusion models to decrease uncertainty. Over the last few years, agent-based models of percolation have received increased attention in the literature; in these models, information about the existence of an innovation propagates through neutral word-of-mouth (WOM) between adopters and their susceptible neighbors, and product (e.g. price or quality) and promotion (seeding) strategies can be experimented with (cf. Solomon et al., 2000). A limitation of basic percolation models such as Solomon et al. (2000) is that actors only receive WOM, while their attitude towards adoption remains unaffected by positive and negative word-of-mouth (PWOM and NWOM). Although the effects of PWOM and NWOM have been studied empirically, only few extensions of the basic percolation model capture these effects (e.g. models of NWOM by Erez et al. (2004), and of social reinforcement by Mas Tur (2016)). Addressing this gap, I will extend the standard percolation model by including the effect of NWOM (from neighboring rejecters) in an actor's decision process. With this model I will simulate percolation on small-world networks and test the effectiveness of price and seeding strategies in overcoming the effects of NWOM on percolation size. As the relationship between price and diffusion size is highly non-linear, at some price (the percolation threshold) a small change causes the network to shift from almost no diffusion to almost full diffusion. However, NWOM may hamper percolation, and an increase in seeds may prove more effective than lowering the price. A further contribution will be a model where awareness propagates not only from adopters but from rejecters as well, as it can be assumed that 'negative' information might also inform actors. Although the network will be fully informed, the effect of NWOM on percolation size may be substantially larger.
Daan Edelkoort
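The baseline mechanism (before the NWOM extension) can be sketched as site percolation of adoption on a small-world network: informed agents adopt if their reservation price exceeds the posted price, and only adopters pass information on. All parameters below are illustrative, not the paper's calibration:

```python
import random
from collections import deque

def small_world(n, k, beta, rng):
    """Watts-Strogatz-style ring: n nodes, k neighbors on each side, each
    edge rewired to a random endpoint with probability beta."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k + 1):
            j = (i + d) % n
            if rng.random() < beta:
                j = rng.randrange(n)      # rewire to a random node
            if j != i:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def percolate(adj, price, seeds, rng):
    """Adoption percolation: an informed agent adopts if its reservation
    price (uniform in [0, 1]) is at least `price`; only adopters inform
    their neighbors (neutral word-of-mouth).  Returns the diffusion size."""
    pref = {i: rng.random() for i in adj}
    informed = set(seeds)
    frontier = deque(seeds)
    adopted = set()
    while frontier:
        i = frontier.popleft()
        if pref[i] >= price and i not in adopted:
            adopted.add(i)
            for j in adj[i]:
                if j not in informed:
                    informed.add(j)
                    frontier.append(j)
    return len(adopted)

rng = random.Random(42)
g = small_world(500, 3, 0.1, rng)
low = percolate(g, 0.2, range(5), rng)    # price well below threshold
high = percolate(g, 0.9, range(5), rng)   # price well above threshold
```

The sharp drop in diffusion size between the two prices is the percolation threshold the abstract refers to; the proposed extension would additionally let neighboring rejecters lower an agent's effective reservation price.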
249 Early identification of high-quality papers [abstract]
Abstract: Seminal papers are usually recognized as such only many years after publication. Citation-based indicators of paper impact share this lag and often implicitly penalize recent papers, which have had less time to attract citations and thus cannot score well. Using insights from complex network analysis, we introduce a new article-level metric which allows us to identify, early on, the papers that later become highly regarded. This metric – called the rescaled PageRank score – is based on combining the classical PageRank centrality metric with the explicit requirement that a paper's score is not biased by its age. We analyze the network of citations among the 449,935 papers published in the American Physical Society journals between 1893 and 2009, and focus on a group of papers labeled as Milestone Letters by the editors of Physical Review Letters, a leading physics journal. We compare various metrics with respect to their ability to identify the milestone papers and show that the rescaled PageRank score outperforms the other metrics. The performance gap between rescaled PageRank and PageRank is particularly wide in the first years after publication, and it takes 15 years for PageRank to reach the identification level of the rescaled score. Due to its ability to recognize high-quality papers earlier than other metrics, the rescaled PageRank score could prove particularly useful for the evaluation of young researchers, who are disadvantaged by age-biased indices and may be forced to leave academia if their potential is not appreciated promptly enough. The score proposed here may find further applications in other contexts, such as the early identification of viral content or high-quality websites on the World Wide Web.
Manuel Sebastian Mariani, Matus Medo and Yi-Cheng Zhang
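The rescaling idea can be sketched as: compute PageRank on the citation graph, then z-score each paper against papers published closest in time. The window size and the toy citation graph below are our illustrative choices, not the paper's exact procedure:

```python
def pagerank(out_links, n, d=0.85, iters=60):
    """Power-iteration PageRank on a citation graph with nodes 0..n-1;
    out_links[i] lists the papers that paper i cites."""
    pr = [1.0 / n] * n
    for _ in range(iters):
        nxt = [(1 - d) / n] * n
        for i, targets in out_links.items():
            if targets:
                share = d * pr[i] / len(targets)
                for j in targets:
                    nxt[j] += share
            else:                          # dangling node: spread uniformly
                for j in range(n):
                    nxt[j] += d * pr[i] / n
        pr = nxt
    return pr

def rescaled(pr, order, window=3):
    """z-score each paper's PageRank against the papers published closest
    in time; `order` lists papers sorted by publication date."""
    rank = {p: t for t, p in enumerate(order)}
    out = {}
    for p in order:
        peers = [pr[q] for q in order if abs(rank[q] - rank[p]) <= window]
        mu = sum(peers) / len(peers)
        var = sum((x - mu) ** 2 for x in peers) / len(peers)
        out[p] = (pr[p] - mu) / (var ** 0.5 if var else 1.0)
    return out

# Toy citation network: paper 0 is oldest and most cited
out_links = {0: [], 1: [0], 2: [0], 3: [0, 1], 4: [0]}
pr = pagerank(out_links, n=5)
z = rescaled(pr, order=[0, 1, 2, 3, 4])
```

Because each score is compared only with same-age peers, a recent paper no longer needs decades of accumulated citations to stand out, which is the bias removal the abstract describes.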