We study the dynamics of consumption growth in a series of countries over a time span of 200 years. We ask whether long-run risk or disasters are features of models that fit consumption data well. To answer this question, we develop a new methodology for the filtering and estimation of multivariate jump-diffusion processes in the presence of incomplete data and measurement errors. Our methodology is both statistically and computationally efficient, and it enables the empirical analysis of previously intractable multidimensional models. Our estimates suggest that small and frequent disasters arriving at a time-varying rate are a predominant feature of consumption data. Persistent stochastic volatility is also found to be a significant driver of consumption growth, especially in the United States.
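The dynamics described above — diffusive growth with persistent stochastic volatility and small, frequent disasters arriving at a time-varying rate — can be sketched with a simple Euler discretization. This is an illustrative simulation only: the function name and every parameter value below are placeholders, not estimates or specifications from the paper.

```python
import numpy as np

def simulate_consumption_growth(T=200, dt=1/12, seed=0,
                                mu=0.02, kappa_v=0.5, v_bar=4e-4, sigma_v=0.01,
                                kappa_l=0.3, l_bar=0.4, sigma_l=0.3,
                                jump_mean=-0.02, jump_sd=0.01):
    """Euler scheme for log consumption with persistent stochastic volatility
    and disasters arriving at a time-varying (square-root) intensity.
    All parameter values are illustrative placeholders, not estimates."""
    rng = np.random.default_rng(seed)
    n = int(round(T / dt))
    c = np.zeros(n + 1)            # log consumption
    v = np.full(n + 1, v_bar)      # spot variance of the diffusive part
    lam = np.full(n + 1, l_bar)    # disaster intensity (jumps per year)
    for t in range(n):
        # diffusive consumption growth
        dc = mu * dt + np.sqrt(max(v[t], 0.0) * dt) * rng.standard_normal()
        # small, frequent disasters arriving at rate lam[t]
        n_jumps = rng.poisson(max(lam[t], 0.0) * dt)
        dc += rng.normal(jump_mean, jump_sd, size=n_jumps).sum()
        c[t + 1] = c[t] + dc
        # mean-reverting square-root dynamics for variance and intensity
        v[t + 1] = v[t] + kappa_v * (v_bar - v[t]) * dt \
            + sigma_v * np.sqrt(max(v[t], 0.0) * dt) * rng.standard_normal()
        lam[t + 1] = lam[t] + kappa_l * (l_bar - lam[t]) * dt \
            + sigma_l * np.sqrt(max(lam[t], 0.0) * dt) * rng.standard_normal()
    return c, v, lam
```

Filtering and estimation for such a model are the hard part — the simulation only shows the state variables the methodology must recover from incomplete, error-ridden data.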
This paper develops estimators of the transition density, filters, and parameters of multivariate jump-diffusion models. The drift, volatility, jump intensity, and jump magnitude are allowed to be state-dependent and non-affine. It is not necessary to diagonalize the volatility matrix. Our density and filter estimators converge at the canonical rate typically associated with exact Monte Carlo estimation. Our parameter estimators have the same asymptotic distribution as maximum likelihood estimators, which are often intractable for the class of models we consider. The results of this paper enable the empirical analysis of previously intractable models of asset prices and economic time series.

"The Systemic Effects of Benchmarking" (with D. Duarte and K. Lee)

We show that the competitive pressure to beat a benchmark may induce institutional trading behavior that exposes retail investors to tail risk. In our model, institutional investors differ from the retail investor in that they derive higher utility when their benchmark outperforms. This incentive forces institutional investors to take on leverage and overinvest in the benchmark. When the benchmark experiences a shock, institutional investors execute fire sales. This behavior increases market volatility, raising the tail risk exposure of the retail investor. Ex post, tail risk is only short-lived: all investors survive in the long run under standard conditions, and the most patient investor dominates. Ex ante, however, benchmarking reduces the welfare of the retail investor and benefits only the impatient institutional investor.

"Inference for Large Financial Systems" (with K. Giesecke and J. Sirignano), revise and resubmit at Mathematical Finance.

We consider the problem of parameter estimation for large interacting stochastic systems when data are available only on the aggregate state of the system. Parameter inference is computationally challenging due to the scale and complexity of such systems. Weak convergence results, similar in spirit to a law of large numbers and a central limit theorem, can be used to approximate large systems in distribution. We exploit these weak convergence results to develop approximate maximum likelihood estimators for such systems. The approximate estimators are shown to converge to the true parameters and to be asymptotically normal as the number of observations and the size of the system grow large. Numerical studies demonstrate the computational efficiency and accuracy of the approximate MLEs. Although our approach applies broadly to large systems in many fields, we are particularly motivated by examples arising in finance, such as systemic risk in banking systems and large portfolios of loans.

This paper develops an unbiased Monte Carlo approximation to the transition density of a jump-diffusion process with state-dependent drift, volatility, jump intensity, and jump magnitude. The approximation is used to construct a likelihood estimator of the parameters of a jump-diffusion observed at fixed time intervals that need not be short. The estimator is asymptotically unbiased for any sample size. It has the same large-sample asymptotic properties as the true but uncomputable likelihood estimator. Numerical results illustrate its advantages.

We study the sources of corporate default clustering in the United States. We reject the hypothesis that firms’ default times are correlated only because their conditional default rates depend on observable and latent systematic factors. By contrast, we find strong evidence that contagion, through which the default by one firm has a direct impact on the health of other firms, is a significant source of clustering. The amount of clustering that cannot be explained by contagion and by firms’ exposure to observable and latent systematic factors is insignificant. Our results have important implications for the pricing and management of correlated default risk.
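Contagion of the kind described above — one firm's default directly raising other firms' default rates — is commonly captured with a self-exciting intensity. The sketch below simulates such a process by Ogata's thinning algorithm; it is a stylized Hawkes-type illustration, not the specification estimated in the paper.

```python
import numpy as np

def intensity(t, events, mu, alpha, beta):
    """Self-exciting intensity: baseline mu plus exponentially decaying
    excitation alpha * exp(-beta * (t - t_i)) from each past event t_i."""
    past = np.asarray(events, dtype=float)
    past = past[past <= t]
    return mu + alpha * np.exp(-beta * (t - past)).sum()

def simulate_hawkes(T=10.0, mu=2.0, alpha=0.8, beta=1.5, seed=0):
    """Simulate event times by Ogata's thinning. Between events the
    intensity only decays, so its current value is a valid upper bound."""
    rng = np.random.default_rng(seed)
    t, events = 0.0, []
    while True:
        bound = intensity(t, events, mu, alpha, beta)
        t += rng.exponential(1.0 / bound)
        if t >= T:
            break
        if rng.uniform() * bound <= intensity(t, events, mu, alpha, beta):
            events.append(t)  # accepted: a default occurs and excites the intensity
    return np.array(events)
```

With alpha/beta < 1 the process is stationary; each event transiently raises the arrival rate of further events, producing the clustering that a pure factor model cannot reproduce.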
Point processes are used to model the timing of defaults, market transactions, births, unemployment, and many other events. We develop and study likelihood estimators of the parameters of a marked point process and of incompletely observed explanatory factors that influence the arrival intensity and mark distribution. We establish an approximation to the likelihood and analyze the convergence and large-sample properties of the associated estimators. Numerical results highlight the computational efficiency of our estimators and show that they can outperform EM algorithm estimators.
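For intuition, a marked point process log-likelihood combines three pieces: the log-intensity at each event time, minus the integrated intensity over the observation window (the compensator), plus the log mark density at each observed mark. The sketch below evaluates this for the textbook special case of a fully observed intensity and i.i.d. Gaussian marks; the estimators in the paper address the harder case of incompletely observed factors.

```python
import numpy as np

def marked_poisson_loglik(times, marks, T, rate, mark_mu, mark_sd):
    """Log-likelihood of an inhomogeneous Poisson process on [0, T] with
    i.i.d. Gaussian marks:
        sum_i log rate(t_i) - int_0^T rate(s) ds + sum_i log f(m_i).
    `rate` must be a vectorized intensity function; the compensator
    integral is computed with the trapezoidal rule."""
    times = np.asarray(times, dtype=float)
    marks = np.asarray(marks, dtype=float)
    grid = np.linspace(0.0, T, 1001)
    vals = rate(grid)
    step = grid[1] - grid[0]
    compensator = step * (vals.sum() - 0.5 * (vals[0] + vals[-1]))
    log_rate = np.log(rate(times)).sum()
    log_marks = (-0.5 * ((marks - mark_mu) / mark_sd) ** 2
                 - np.log(mark_sd * np.sqrt(2.0 * np.pi))).sum()
    return log_rate - compensator + log_marks
```

When the intensity depends on latent factors, this quantity is no longer computable directly and must itself be approximated, which is the problem the paper's estimators solve.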

