Retrospectives: Engel Curves
Chai, Andreas; Moneta, Alessio
The Journal of Economic Perspectives, 01/2010, Volume 24, Issue 1
Journal Article
Peer reviewed
Open access
Engel curves describe how household expenditure on particular goods or services depends on household income. German statistician Ernst Engel (1821–1896) was the first to investigate this relationship systematically, in an article published about 150 years ago. The best-known single result from the article is “Engel's law,” which states that the poorer a family is, the larger the budget share it spends on nourishment. We revisit Engel's article, including its context and the mechanics of the argument. Because the article was completed a few decades before linear regression techniques were established and income effects were incorporated into standard consumer theory, Engel was forced to develop his own approach to analyzing household expenditure patterns. We find his work contains some interesting features in juxtaposition to both the modern and classical literature. For example, Engel's way of estimating the expenditure–income relationship resembles a data-fitting technique called the “regressogram” that is nonparametric, in that no functional form is specified before the estimation. Moreover, Engel introduced a way of categorizing household expenditures in which expenditures on commodities that served the same purpose by satisfying the same underlying “want” were grouped together. This procedure enabled Engel to discuss the welfare implications of his results in terms of the Smithian notion that individual welfare is related to the satisfaction of wants. At the same time, he avoided making a priori assumptions about which specific goods were necessities, assumptions which were made by many classical economists like Adam Smith. Finally, we offer a few thoughts about some modern literature that builds on Engel's research.
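Engel's binning procedure can be illustrated with a short sketch: partition the income range into intervals and take the mean expenditure within each interval, which is exactly a regressogram. The data, bin count, and functional relationship below are illustrative assumptions, not Engel's figures.

```python
# Minimal regressogram sketch: partition income into fixed-width bins
# and average the expenditure observations falling in each bin.
# No functional form is assumed before estimation.
import numpy as np

def regressogram(x, y, n_bins):
    """Return bin edges and the mean of y within each bin of x."""
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    # np.digitize assigns each x to a bin index in 1..n_bins
    idx = np.clip(np.digitize(x, edges), 1, n_bins)
    means = np.array([y[idx == b].mean() if np.any(idx == b) else np.nan
                      for b in range(1, n_bins + 1)])
    return edges, means

# Simulated households: food expenditure rises with income,
# though the budget share falls (consistent with Engel's law).
rng = np.random.default_rng(0)
income = rng.uniform(500, 5000, 200)
food = 6.0 * np.sqrt(income) + rng.normal(0, 50, 200)
edges, means = regressogram(income, food, 5)
```

Plotting `means` against the bin midpoints gives the step-function estimate of the Engel curve.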
Structural vector-autoregressive models are potentially very useful tools for guiding both macro- and microeconomic policy. In this study, we present a recently developed method for estimating such models, which uses non-normality to recover the causal structure underlying the observations. We show how the method can be applied to both microeconomic data (to study the processes of firm growth and firm performance) and macroeconomic data (to analyse the effects of monetary policy).
The ongoing economic crisis has revealed fundamental problems both in our economic system and the discipline which analyses it. This book presents a series of contrasting but complementary approaches in economic theory in order to offer a critical toolkit for examining the modern capitalist economy. The global economic crisis may have changed the world in which we live, but not the fundamental tenets of the discipline.
This book is a critical assessment of the relation between economic theory and economic crises: how intellectual thinking impacts on real economic events and vice versa. It aims at challenging the conventional way in which economics is taught in universities and later adopted by public officials in the policymaking process. The contributions, all written by distinguished academics and researchers, offer a heterodox perspective on economic thinking and analysis. Each chapter is inspired by alternative theoretical approaches which have been mostly side-lined from current academic teaching programmes. A major suggestion of the book is that the recent economic crisis can be better understood by recovering such theoretical analyses and turning them into a useful framework for economic policymaking.
Economic Crisis and Economic Thought is intended as a companion to economics students at the Master’s and PhD level, in order for them to confront issues related to the labour market, the financial sector, macroeconomics, industrial economics, etc. with an alternative and complementary perspective. It challenges the way in which economic theory is currently taught and offered via alternatives for the future.
This paper develops a structural VAR methodology based on graphical models to identify monetary policy shocks and to measure their macroeconomic effects. The advantage of this procedure is that it works with testable overidentified models, whose restrictions are derived from the partial correlations among residuals plus some institutional knowledge. This permits testing some restrictions on the reserve market used in several approaches existing in the literature. The main findings are that neither VAR innovations to the federal funds rate nor innovations to nonborrowed reserves are good indicators of monetary policy shocks.
•An interpretation of U.S. Federal Reserve decisions based on a comparison between competing monetary policy “rules” that include nonperforming loans as indices of instability and insolvency.
•The empirical investigation is carried out following a structural vector autoregressive approach that exploits a statistical identification procedure.
•Our empirical findings provide very limited and incomplete support for the standard “Taylor rule” in its various forms, while giving comprehensive evidence in favor of an alternative “Solvency rule”.
We assess to what extent decisions taken by the Federal Reserve in setting interest rates can be interpreted in the light of monetary policy rules that are either built on standard objectives of output and price stabilization or based on alternative objectives of financial stability and regulation of the solvency conditions in the economic system. This goal is pursued through a comparison between the “Taylor rule” in its “original” and “augmented” versions, and an alternative “Solvency rule”. We use nonperforming loans as a proxy for the conditions of financial stability and solvency in the system. The empirical investigation is carried out following a structural vector autoregressive approach that exploits a statistical identification procedure. In this way, we are able to identify the causal structure among variables without imposing theoretical restrictions on the model. Our empirical findings provide very limited and incomplete support for the Taylor rule in its various forms, while giving comprehensive evidence in favor of the alternative Solvency rule.
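For reference, the “original” Taylor rule mentioned above has a standard textbook form. The sketch below uses Taylor's conventional 1993 coefficients (0.5 on both the inflation and output gaps, a 2 percent equilibrium real rate and inflation target); these are illustrative defaults, not estimates from this study.

```python
# Textbook Taylor (1993) rule: i = r* + pi + 0.5*(pi - pi*) + 0.5*gap
# Coefficients and targets are the conventional values, not this
# paper's estimates.
def taylor_rate(inflation, output_gap, r_star=2.0, pi_star=2.0):
    """Suggested nominal policy rate; all arguments in percent."""
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap

# At target inflation with a closed output gap, the rule returns
# the equilibrium nominal rate r* + pi*.
rate = taylor_rate(2.0, 0.0)  # -> 4.0
```

An “augmented” rule adds further terms (e.g., a lagged interest rate or, as in the Solvency rule discussed above, a measure of nonperforming loans) to the same linear structure.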
This paper addresses the methodological problems of empirical validation in agent-based (AB) models in economics and how these are currently being tackled. We first identify a set of issues that are common to all modelers engaged in empirical validation. We then propose a novel taxonomy, which captures the relevant dimensions along which AB economics models differ. We argue that these dimensions affect the way in which empirical validation is carried out by AB modelers, and we critically discuss the main alternative approaches to empirical validation being developed in AB economics. We conclude by focusing on a set of (as yet) unresolved issues for empirical validation that require future research.
A method for agent-based models validation
Guerini, Mattia; Moneta, Alessio
Journal of Economic Dynamics & Control, September 2017, Volume 82
Journal Article
Peer reviewed
Open access
This paper proposes a new method to empirically validate simulation models that generate artificial time series data comparable with real-world data. The approach is based on comparing the structures of vector autoregression models which are estimated from both artificial and real-world data by means of causal search algorithms. This relatively simple procedure is able to tackle both the problem of confronting theoretical simulation models with the data and the problem of comparing different models in terms of their empirical reliability. Moreover, the paper provides an application of the validation procedure to the agent-based macroeconomic model proposed by Dosi et al. (2015).
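The comparison step can be shown with a deliberately simplified sketch: estimate a VAR(1) on real and on model-generated series and compare the estimated coefficient structures. The paper compares causal structures recovered by causal search algorithms; the plain Frobenius distance and simulated series below are stand-ins chosen only to illustrate the idea of ranking models by structural similarity to the data.

```python
# Simplified sketch of the validation idea: a model whose generated
# series share the real data-generating process should yield a VAR
# structure closer to the one estimated on real data.
import numpy as np

def var1_coefficients(y):
    """OLS estimate of A in y_t = A y_{t-1} + u_t."""
    X, Y = y[:-1], y[1:]
    return np.linalg.lstsq(X, Y, rcond=None)[0].T

def structure_distance(y_real, y_sim):
    """Frobenius distance between estimated VAR(1) matrices
    (a crude proxy for comparing causal structures)."""
    return np.linalg.norm(var1_coefficients(y_real) - var1_coefficients(y_sim))

rng = np.random.default_rng(2)

def simulate(A, T=400):
    y = np.zeros((T, 2))
    for t in range(1, T):
        y[t] = A @ y[t - 1] + rng.normal(size=2)
    return y

A_true = np.array([[0.6, 0.2], [0.0, 0.5]])
real = simulate(A_true)
good_model = simulate(A_true)                             # same process
bad_model = simulate(np.array([[0.1, 0.0], [0.8, 0.1]]))  # different process

d_good = structure_distance(real, good_model)
d_bad = structure_distance(real, bad_model)
```

Ranking candidate models by such a distance mirrors the paper's use of the comparison to assess empirical reliability.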
Independent Component Analysis (ICA) is a statistical method that linearly transforms a random vector. Under the assumption that the observed data are mixtures of non-Gaussian and independent processes, ICA is able to recover the underlying components, but with a scale and order indeterminacy. Its application to structural vector autoregressive (SVAR) models allows the researcher to recover the impact of independent structural shocks on the observed series from estimated residuals. We analyze different ICA estimators, recently proposed within the field of SVAR analysis, and compare their performance in recovering structural coefficients. Moreover, we assess the size distortions of the estimators in hypothesis testing. We conduct our analysis by focusing on non-Gaussian distributional scenarios that get gradually close to the Gaussian case, the case in which ICA methods fail to recover the independent components. Although the ICA estimators that we analyze show similar patterns of performance, two of them, the fastICA algorithm and the pseudo-maximum likelihood estimator, tend to perform relatively better in terms of variability, stability across sub- and super-Gaussian settings, and size distortion. We finally present an empirical illustration using US data to identify the effects of government spending and tax cuts on economic activity, thus providing an example where ICA techniques can be used for hypothesis testing.
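A minimal sketch of the ICA-SVAR idea: estimate the reduced-form VAR by least squares, then run FastICA on the residuals to recover independent non-Gaussian shocks, up to the scale and order indeterminacy noted above. The simulated VAR(1), Laplace shocks, and mixing matrix are illustrative assumptions, and scikit-learn's FastICA stands in here for the estimators compared in the paper.

```python
# ICA-based SVAR sketch: OLS for the reduced form, FastICA on the
# residuals to unmix independent structural shocks. Super-Gaussian
# (Laplace) shocks make the components identifiable.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
T, k = 500, 2
shocks = rng.laplace(size=(T, k))                # independent, non-Gaussian
B = np.array([[1.0, 0.5], [0.2, 1.0]])           # contemporaneous mixing
A = np.array([[0.5, 0.1], [0.0, 0.4]])           # VAR(1) coefficients

y = np.zeros((T, k))
for t in range(1, T):
    y[t] = y[t - 1] @ A.T + shocks[t] @ B.T

# Step 1: OLS estimate of the reduced-form VAR(1)
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
resid = Y - X @ A_hat.T

# Step 2: FastICA recovers the structural shocks from the residuals,
# up to scale, sign, and ordering
ica = FastICA(n_components=k, random_state=0)
recovered = ica.fit_transform(resid)
```

Matching each recovered component to the true shock it best correlates with (in absolute value) handles the sign and order indeterminacy.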
The tendency of sectoral demand to satiate has long been argued to be a key driver of structural change in an economy (Pasinetti 1981; Saviotti 2001). This literature raises the question as to what extent cross-sectional patterns of household expenditure can be used to make inferences about how the demand for goods and services will grow over time. Moreover, if indeed satiation does take place, then firms and entrepreneurs could react to this situation by innovating goods and services in order to overcome stagnation in demand growth (Witt 2001). We empirically investigate this 'satiation-escape' hypothesis by examining the inter-temporal dynamics of Engel curves and their derivatives, which reflect how household spending on a good changes with income. Taking into account changes in the price level, we investigate whether Engel curves that exhibit cross-section satiation tend to exhibit over time an upward shift in the satiation level, jointly with a shift in position and shape. Evidence suggests that this is actually the case.
Economic growth stimulates fundamental changes in consumption patterns, as consumers who get rich tend to spread their spending more evenly across a wider variety of goods and services. Comparing cross-sectional spending patterns across rich and poor countries, we investigate how this diversification process enables more niche patterns of spending to emerge across the global population of consumers. We use entropy measures to quantify the dispersion of household spending across goods and study how it unfolds as GDP rises. Using a gravity model to study international differences in the relative order of income elasticities, i.e., expenditure hierarchies, we show how this diversification process on the national level is correlated with cultural norms, GDP, and income inequality. We find that national expenditure hierarchies are relatively similar across countries among necessities, while they are increasingly unique among luxuries. We further consider how rising affluence tends to generate more niche consumption patterns by examining how rising income is positively correlated with demand heterogeneity and how income inequality is negatively correlated with market depth.
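The entropy measure of spending dispersion can be sketched directly: compute budget shares and take their Shannon entropy, which is maximal when spending is spread evenly across categories. The household expenditure vectors below are invented for illustration, not drawn from the study's data.

```python
# Shannon entropy of budget shares as a dispersion measure: spending
# spread evenly across categories yields higher entropy than spending
# concentrated on a few necessities.
import numpy as np

def spending_entropy(expenditures):
    """Shannon entropy (in nats) of the budget shares."""
    shares = np.asarray(expenditures, dtype=float)
    shares = shares / shares.sum()
    shares = shares[shares > 0]          # treat 0 * log(0) as 0
    return float(-(shares * np.log(shares)).sum())

poor = [80, 10, 5, 5]      # spending concentrated on one necessity
rich = [30, 25, 25, 20]    # spending spread across categories
h_poor = spending_entropy(poor)
h_rich = spending_entropy(rich)
```

The maximum attainable value with n categories is log(n), reached when all budget shares are equal, so entropy rising with GDP captures the diversification described above.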