Most countries, including the United States, have an array of greenhouse gas mitigation policies, which provide subsidies or restrictions typically aimed at specific technologies or sectors. Such climate policies range from automobile fuel economy standards, to gasoline taxes, to mandating that a certain amount of electricity in a state comes from renewables, to subsidizing solar and wind electrical generation, to mandates requiring the blending of biofuels into the surface transportation fuel supply, to supply-side restrictions on fossil fuel extraction. This paper reviews the costs of various technologies and actions aimed at reducing greenhouse gas emissions. Our aim is twofold. First, we seek to provide an up-to-date summary of the costs of actions that can be taken now using currently available technology. These costs focus on expenditures and emissions reductions over the life of a project compared to some business-as-usual benchmark—for example, replacing coal-fired electricity generation with wind, or weatherizing a home. We refer to these costs as static because they are costs over the life of a specific project undertaken now, and they ignore spillovers. Our second aim is to distinguish between dynamic and static costs and to argue that some actions taken today with seemingly high static costs can have low dynamic costs, and vice versa. We make this argument at a general level and through two case studies, of solar panels and of electric vehicles, technologies whose costs have fallen sharply. Under the right circumstances, dynamic effects can justify policies that have high costs according to a myopic, static calculation.
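As a concrete illustration of the static cost calculation described above, the sketch below computes a levelized cost per tonne of CO2 avoided for a single project relative to a business-as-usual benchmark. All parameter values, and the choice to discount both dollars and tonnes, are our own illustrative assumptions rather than figures from the paper.

```python
# Hedged sketch: static (levelized) cost per tonne of CO2 avoided for one project
# versus a business-as-usual benchmark. All inputs below are hypothetical.

def cost_per_tonne_avoided(upfront_cost, annual_cost, annual_tonnes_avoided,
                           lifetime_years, discount_rate):
    """Present value of incremental costs divided by discounted tonnes avoided."""
    pv_costs = float(upfront_cost)
    pv_tonnes = 0.0
    for t in range(1, lifetime_years + 1):
        df = 1.0 / (1.0 + discount_rate) ** t
        pv_costs += annual_cost * df
        pv_tonnes += annual_tonnes_avoided * df
    return pv_costs / pv_tonnes

# Example: a hypothetical wind-for-coal replacement (invented numbers)
print(cost_per_tonne_avoided(upfront_cost=1_000_000, annual_cost=20_000,
                             annual_tonnes_avoided=2_000, lifetime_years=25,
                             discount_rate=0.03))
```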
When instruments are weakly correlated with endogenous regressors, conventional methods for instrumental variables (IV) estimation and inference become unreliable. A large literature in econometrics has developed procedures for detecting weak instruments and constructing robust confidence sets, but many of the results in this literature are limited to settings with independent and homoskedastic data, while data encountered in practice frequently violate these assumptions. We review the literature on weak instruments in linear IV regression with an emphasis on results for nonhomoskedastic (heteroskedastic, serially correlated, or clustered) data. To assess the practical importance of weak instruments, we also report tabulations and simulations based on a survey of papers published in the American Economic Review from 2014 to 2018 that use IV. These results suggest that weak instruments remain an important issue for empirical practice, and that there are simple steps that researchers can take to better handle weak instruments in applications.
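To make the "simple steps" mentioned above concrete, the sketch below shows two of the standard tools in the single-instrument case: a heteroskedasticity-robust first-stage F statistic and an Anderson-Rubin style confidence set obtained by test inversion. The simulated data, variable names, and the statsmodels-based implementation are illustrative assumptions, not the survey's code.

```python
# Hedged sketch: weak-instrument diagnostics with one endogenous regressor x,
# one instrument z, and outcome y, using heteroskedasticity-robust (HC1) inference.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=n)
x = 0.2 * z + rng.normal(size=n)              # deliberately weak first stage (toy data)
y = 1.0 * x + rng.normal(size=n)

# Robust first-stage F: with a single instrument it equals the squared robust t-stat.
first_stage = sm.OLS(x, sm.add_constant(z)).fit(cov_type="HC1")
print("robust first-stage F:", first_stage.tvalues[1] ** 2)

# Anderson-Rubin 95% set by test inversion: keep each beta0 for which the instrument
# has no robustly significant explanatory power for y - beta0 * x.
grid = np.linspace(-2.0, 4.0, 601)
ar_set = [b0 for b0 in grid
          if abs(sm.OLS(y - b0 * x, sm.add_constant(z)).fit(cov_type="HC1").tvalues[1]) < 1.96]
if ar_set:
    print("AR 95% set roughly spans", round(min(ar_set), 2), "to", round(max(ar_set), 2))
```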
External sources of as-if randomness — that is, external instruments — can be used to identify the dynamic causal effects of macroeconomic shocks. One method is a one-step instrumental variables regression (local projections-IV); a more efficient two-step method involves a vector autoregression. We show that, under a restrictive instrument validity condition, the one-step method is valid even if the vector autoregression is not invertible, so comparing the two estimates provides a test of invertibility. If, however, lagged endogenous variables are needed as control variables in the one-step method, then the conditions for validity of the two methods are the same.
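A minimal sketch of the one-step (local projections-IV) estimator described above: at each horizon h, the impulse response is the two-stage least squares coefficient from regressing y_{t+h} on the endogenous variable x_t, using the external instrument z_t. The toy data, the omission of lagged controls, and the use of plain OLS rather than HAC standard errors are simplifying assumptions for illustration.

```python
# Hedged sketch of LP-IV: the IRF at horizon h is the 2SLS coefficient from a
# regression of y_{t+h} on x_t, instrumented by the external instrument z_t.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
T = 400
z = rng.normal(size=T)                        # external instrument (toy)
shock = z + 0.5 * rng.normal(size=T)          # structural shock, partly captured by z
x = shock + 0.3 * rng.normal(size=T)          # endogenous policy variable
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.7 * y[t - 1] + shock[t]          # outcome responds to the shock with persistence

def lp_iv(y, x, z, h):
    yh, xt, zt = y[h:], x[:len(x) - h], z[:len(z) - h]
    x_hat = sm.OLS(xt, sm.add_constant(zt)).fit().fittedvalues   # first stage
    return sm.OLS(yh, sm.add_constant(x_hat)).fit().params[1]    # second stage

irf = [lp_iv(y, x, z, h) for h in range(9)]
print(np.round(irf, 2))
```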
We review the main identification strategies and empirical evidence on the role of expectations in the New Keynesian Phillips curve, paying particular attention to the issue of weak identification. Our goal is to provide a clear understanding of the role of expectations that integrates across the different papers and specifications in the literature. We discuss the properties of the various limited-information econometric methods used in the literature and provide explanations of why they produce conflicting results. Using a common dataset and a flexible empirical approach, we find that researchers are faced with substantial specification uncertainty, as different combinations of various a priori reasonable specification choices give rise to a vast set of point estimates. Moreover, given a specification, estimation is subject to considerable sampling uncertainty due to weak identification. We highlight the assumptions that seem to matter most for identification and the configuration of point estimates. We conclude that the literature has reached a limit on how much can be learned about the New Keynesian Phillips curve from aggregate macroeconomic time series. New identification approaches and new datasets are needed to reach an empirical consensus.
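For reference, the hybrid specification and limited-information (GMM) moment condition underlying most of the estimates surveyed can be written as follows; the notation is ours and the exact specifications vary across the papers discussed.

```latex
% Hybrid New Keynesian Phillips curve and GMM moment condition (our notation)
\pi_t = \gamma_b \pi_{t-1} + \gamma_f \,\mathbb{E}_t[\pi_{t+1}] + \lambda x_t + u_t,
\qquad
\mathbb{E}\big[(\pi_t - \gamma_b \pi_{t-1} - \gamma_f \pi_{t+1} - \lambda x_t)\, z_{t-1}\big] = 0,
```

where x_t is a forcing variable such as marginal cost or the output gap and z_{t-1} is a vector of lagged instruments. Weak identification arises when those instruments are only weakly correlated with future inflation and the forcing variable.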
This paper examines the macroeconomic dynamics of the 2007–09 recession in the United States and the subsequent slow recovery. Using a dynamic factor model with 200 variables, we reach three main conclusions. First, although many of the events of the 2007–09 collapse were unprecedented, their net effect was to produce macro shocks that were larger versions of shocks previously experienced, to which the economy responded in a historically predictable way. Second, the shocks that produced the recession were primarily associated with financial disruptions and heightened uncertainty, although oil shocks played a role in the initial slowdown, and subsequent drag was added by effectively tight conventional monetary policy arising from the zero lower bound. Third, although the slow nature of the recovery is partly due to the shocks of this recession, most of the slow recovery in employment, and nearly all of the slow recovery in output, is due to a secular slowdown in trend labor force growth.
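The dynamic factor model estimates referenced above are typically built on principal components of a large standardized panel; the sketch below shows that first step only, with a placeholder panel rather than the paper's 200-variable dataset.

```python
# Hedged sketch: extract r common factors from a T x N panel by principal components,
# the standard first step in estimating a large dynamic factor model.
import numpy as np

def estimate_factors(X, r):
    """X: T x N data matrix; returns T x r factor estimates and N x r loadings."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize each series
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
    return U[:, :r] * S[:r], Vt[:r].T                  # scores and loadings

X = np.random.default_rng(2).normal(size=(200, 50))    # placeholder panel, not real data
factors, loadings = estimate_factors(X, r=6)
print(factors.shape, loadings.shape)
```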
The social cost of carbon dioxide (SC-CO₂) measures the monetized value of the damages to society caused by an incremental metric tonne of CO₂ emissions and is a key metric informing climate policy. Used by governments and other decision-makers in benefit–cost analysis for over a decade, SC-CO₂ estimates draw on climate science, economics, demography and other disciplines. However, a 2017 report by the US National Academies of Sciences, Engineering, and Medicine (NASEM) highlighted that current SC-CO₂ estimates no longer reflect the latest research. The report provided a series of recommendations for improving the scientific basis, transparency and uncertainty characterization of SC-CO₂ estimates. Here we show that improved probabilistic socioeconomic projections, climate models, damage functions, and discounting methods that collectively reflect theoretically consistent valuation of risk, substantially increase estimates of the SC-CO₂. Our preferred mean SC-CO₂ estimate is $185 per tonne of CO₂ ($44–$413 per tCO₂: 5%–95% range, 2020 US dollars) at a near-term risk-free discount rate of 2%, a value 3.6 times higher than the US government’s current value of $51 per tCO₂. Our estimates incorporate updated scientific understanding throughout all components of SC-CO₂ estimation in the new open-source Greenhouse Gas Impact Value Estimator (GIVE) model, in a manner fully responsive to the near-term NASEM recommendations. Our higher SC-CO₂ values, compared with estimates currently used in policy evaluation, substantially increase the estimated benefits of greenhouse gas mitigation and thereby increase the expected net benefits of more stringent climate policies.
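The accounting step at the core of any SC-CO₂ estimate is a discounted sum of the marginal damages caused by one additional tonne of emissions. The sketch below isolates that step with an invented marginal-damage path and a constant 2% discount rate, standing in for the GIVE model's full probabilistic treatment of socioeconomics, climate, damages, and discounting.

```python
# Hedged sketch: SC-CO2 as the present value of marginal damages from one extra
# tonne of CO2. The damage path and constant discount rate are illustrative only.
import numpy as np

years = np.arange(2020, 2301)
marginal_damages = np.linspace(0.2, 3.0, len(years))    # $/tonne per year, invented path
discount_rate = 0.02                                     # near-term risk-free rate

discount_factors = 1.0 / (1.0 + discount_rate) ** (years - years[0])
sc_co2 = float(np.sum(marginal_damages * discount_factors))
print(f"illustrative SC-CO2: ${sc_co2:.0f} per tonne")
```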
This article provides a simple shrinkage representation that describes the operational characteristics of various forecasting methods designed for a large number of orthogonal predictors (such as principal components). These methods include pretest methods, Bayesian model averaging, empirical Bayes, and bagging. We empirically compare forecasts from these methods with dynamic factor model (DFM) forecasts using a U.S. macroeconomic dataset with 143 quarterly variables spanning 1960-2008. For most series, including measures of real economic activity, the shrinkage forecasts are inferior to the DFM forecasts. This article has online supplementary material.
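A minimal sketch of the shrinkage representation described above: with orthonormal predictors, each method's forecast can be written as the OLS coefficients scaled by a shrinkage function of their t-statistics, and the pretest rule is the hard-thresholding special case shown below. The simulated data and threshold are illustrative assumptions.

```python
# Hedged sketch: forecasting with orthonormal predictors, where each OLS coefficient
# delta_j is scaled by a shrinkage function psi(t_j); pretest = hard thresholding.
import numpy as np

rng = np.random.default_rng(3)
T, N = 200, 40
Q, _ = np.linalg.qr(rng.normal(size=(T, N)))     # orthonormal predictors (e.g., PCs)
beta = np.r_[np.full(5, 4.0), np.zeros(N - 5)]   # only 5 predictors matter (toy setup)
y = Q @ beta + rng.normal(size=T)

delta = Q.T @ y                                  # OLS coefficients (orthonormal columns)
sigma = np.sqrt(np.sum((y - Q @ delta) ** 2) / (T - N))
tstat = delta / sigma                            # each delta_j has standard error sigma

def psi_pretest(t, c=1.96):
    """Hard-threshold shrinkage: keep a coefficient only if |t| exceeds c."""
    return (np.abs(t) > c).astype(float)

shrunk_coefs = delta * psi_pretest(tstat)        # coefficients used for forecasting
print("retained predictors:", int(psi_pretest(tstat).sum()))
```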
We examine whether the U.S. rate of price inflation has become harder to forecast and, to the extent that it has, what changes in the inflation process have made it so. The main finding is that the univariate inflation process is well described by an unobserved component trend-cycle model with stochastic volatility or, equivalently, an integrated moving average process with time-varying parameters. This model explains a variety of recent univariate inflation forecasting puzzles and begins to explain some multivariate inflation forecasting puzzles as well.
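For concreteness, the trend-cycle model with stochastic volatility referred to above is commonly written as follows; this is a sketch of the standard formulation rather than a transcription from the paper.

```latex
% Unobserved components trend-cycle model for inflation with stochastic volatility
\pi_t = \tau_t + \eta_t, \qquad \eta_t \sim N\!\left(0,\ \sigma^2_{\eta,t}\right), \\
\tau_t = \tau_{t-1} + \varepsilon_t, \qquad \varepsilon_t \sim N\!\left(0,\ \sigma^2_{\varepsilon,t}\right), \\
\ln \sigma^2_{\eta,t} = \ln \sigma^2_{\eta,t-1} + \nu_{\eta,t}, \qquad
\ln \sigma^2_{\varepsilon,t} = \ln \sigma^2_{\varepsilon,t-1} + \nu_{\varepsilon,t}.
```

With constant variances this reduces to an IMA(1,1) process for inflation; letting the variances drift delivers the time-varying moving-average coefficient mentioned in the abstract.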
The conventional heteroskedasticity-robust (HR) variance matrix estimator for cross-sectional regression (with or without a degrees-of-freedom adjustment), applied to the fixed-effects estimator for panel data with serially uncorrelated errors, is inconsistent if the number of time periods T is fixed (and greater than 2) as the number of entities n increases. We provide a bias-adjusted HR estimator that is $\sqrt{nT}$-consistent under any sequences (n, T) in which n and/or T increase to ∞. This estimator can be extended to handle serial correlation of fixed order.
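For context on the result above, the sketch below computes the fixed-effects (within) estimator together with the conventional heteroskedasticity-robust variance whose fixed-T inconsistency the paper establishes; the paper's bias-adjusted estimator is not reproduced here, and the data are simulated placeholders.

```python
# Hedged sketch: within (fixed-effects) estimator with the conventional HR (White)
# variance. This is the variance estimator the paper shows is inconsistent for
# fixed T; the paper's bias-adjusted version is not implemented here.
import numpy as np

rng = np.random.default_rng(4)
n, T = 300, 4
alpha = rng.normal(size=n)                       # entity fixed effects
x = rng.normal(size=(n, T)) + alpha[:, None]
y = 2.0 * x + alpha[:, None] + rng.normal(size=(n, T))

xd = x - x.mean(axis=1, keepdims=True)           # within transformation
yd = y - y.mean(axis=1, keepdims=True)
beta_fe = np.sum(xd * yd) / np.sum(xd ** 2)

u = yd - beta_fe * xd                            # within residuals
V_hr = np.sum((xd * u) ** 2) / np.sum(xd ** 2) ** 2
print("beta_FE:", round(beta_fe, 3), " conventional HR s.e.:", round(np.sqrt(V_hr), 4))
```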
Least-squares estimates of the response of gasoline consumption to a change in the gasoline price are biased toward zero, given the endogeneity of gasoline prices. A seemingly natural solution to this problem is to instrument for gasoline prices using gasoline taxes, but this approach tends to yield implausibly large price elasticities. We demonstrate that anticipatory behavior provides an important explanation for this result. Gasoline buyers increase purchases before tax increases and delay purchases before tax decreases, rendering the tax instrument endogenous. Including suitable leads and lags in the regression restores the validity of the IV estimator, resulting in much lower elasticity estimates.
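A minimal sketch of the fix described above: instrument the gasoline price with the tax while including leads and lags of the tax change as controls, so that anticipatory purchase timing is absorbed rather than loaded onto the instrument. The variable names, lag lengths, toy data (which contain no actual anticipation), and the manual two-stage implementation are illustrative assumptions, not the authors' specification.

```python
# Hedged sketch: IV for gasoline demand with the tax as instrument, adding leads and
# lags of the tax change as controls. Toy data; mechanics only, standard errors omitted.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
T = 300
tax = np.cumsum(rng.normal(scale=0.02, size=T))          # toy log tax series
price = tax + rng.normal(scale=0.05, size=T)             # toy log gasoline price
q = -0.3 * price + rng.normal(scale=0.05, size=T)        # toy log gasoline consumption

df = pd.DataFrame({"q": q, "price": price, "tax": tax})
df["dtax"] = df["tax"].diff()
for k in (1, 2):
    df[f"dtax_lead{k}"] = df["dtax"].shift(-k)            # leads of the tax change
    df[f"dtax_lag{k}"] = df["dtax"].shift(k)              # lags of the tax change
df = df.dropna()

controls = df[[c for c in df.columns if c.startswith("dtax_")]]
stage1_X = sm.add_constant(pd.concat([df["tax"], controls], axis=1))
price_hat = sm.OLS(df["price"], stage1_X).fit().fittedvalues.rename("price_hat")

stage2_X = sm.add_constant(pd.concat([price_hat, controls], axis=1))
elasticity = sm.OLS(df["q"], stage2_X).fit().params["price_hat"]
print("IV price elasticity (toy data):", round(elasticity, 2))
```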