Title
Identification and Estimation of Weakly Separable Models Without Monotonicity
Authors
Abstract
We study the identification and estimation of treatment effect parameters in weakly separable models. In their seminal work, Vytlacil and Yildiz (2007) showed how to identify and estimate the average treatment effect of a dummy endogenous variable when the outcome is weakly separable in a single index. Their identification result builds on a monotonicity condition with respect to this single index. In comparison, we consider similar weakly separable models with multiple indices, and relax the monotonicity condition for identification. Unlike Vytlacil and Yildiz (2007), we exploit the full information in the distribution of the outcome variable, instead of just its mean. Indeed, when the outcome distribution function is more informative than the mean, our method is applicable to more general settings than theirs; in particular, we do not rely on their monotonicity assumption while also allowing for multiple indices. To illustrate the advantage of our approach, we provide examples of models where our approach can identify parameters of interest whereas existing methods would fail. These examples include models with multiple unobserved disturbance terms, such as the Roy model and multinomial choice models with dummy endogenous variables, as well as potential outcome models with endogenous random coefficients. Our method is easy to implement and can be applied to a wide class of models. We establish standard asymptotic properties such as consistency and asymptotic normality.
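The contrast between the single-index and multiple-index settings described in the abstract can be sketched schematically; the notation below is illustrative only and is not taken from the paper:

```latex
% Single-index weakly separable outcome in the spirit of
% Vytlacil and Yildiz (2007): the dummy endogenous variable D
% enters the outcome only through one scalar index \nu(X, D),
% with a first-stage selection equation for D.
Y = g\bigl(\nu(X, D),\, \epsilon\bigr),
\qquad
D = \mathbb{1}\{\vartheta(Z) \geq \eta\}.

% Multiple-index weakly separable outcome of the kind considered
% here: D may shift several indices, and multiple unobserved
% disturbances may enter, so g need not be monotone in any
% single index.
Y = g\bigl(\nu_1(X, D), \ldots, \nu_K(X, D),\, \epsilon_1, \ldots, \epsilon_J\bigr).
```

In the single-index case, monotonicity of the expected outcome in the index is what delivers identification; with several indices and disturbances (as in the Roy model or multinomial choice with a dummy endogenous variable), no such scalar ordering is available, which is why the approach instead exploits the full outcome distribution rather than its mean.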