Difference-in-differences (DID) analyses are used in a variety of research areas as a strategy for estimating the causal effect of a policy, program, intervention, or environmental hazard (hereafter, treatment). The approach offers a strategy for estimating the causal effect of a treatment using observational (i.e., nonrandomized) data in which outcomes on each study unit have been measured both before and after treatment. To identify a causal effect, a DID analysis relies on the assumption that confounding of the treatment effect in the pretreatment period is equivalent to confounding of the treatment effect in the posttreatment period. We propose an alternative approach that can yield identification of causal effects under different identifying conditions than those usually required for DID. The proposed approach, which we refer to as generalized DID, has the potential to be used in routine policy evaluation across many disciplines, as it essentially combines two popular quasi-experimental designs, leveraging their strengths while relaxing their usual assumptions. We provide a formal description of the conditions for identification of causal effects, illustrate the method using simulations, and provide an empirical example based on Card and Krueger's landmark study of the impact of an increase in minimum wage in New Jersey on employment.
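The basic DID contrast the abstract describes can be sketched in a few lines of Python. This is an illustrative 2x2 example only, not the authors' generalized DID method, and the group means below are hypothetical values loosely inspired by the New Jersey/Pennsylvania minimum-wage setting, not actual study data.

```python
def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """Classic 2x2 difference-in-differences:
    (change in treated group) minus (change in control group).

    Under the usual parallel-trends / equal-confounding assumption,
    the control group's change proxies what the treated group's
    change would have been absent treatment.
    """
    return (treated_post - treated_pre) - (control_post - control_pre)


# Hypothetical mean employment per store before/after a wage increase
# (illustrative numbers, not Card and Krueger's actual estimates).
effect = did_estimate(
    treated_pre=20.4, treated_post=21.0,   # treated state (e.g., NJ)
    control_pre=23.3, control_post=21.2,   # control state (e.g., PA)
)
print(round(effect, 1))  # treated rose 0.6 while control fell 2.1
```

The treated group's raw change alone would understate the effect here; differencing out the control group's decline is what the DID design contributes.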