Gustavo Schwenkler

Assistant Professor of Finance

"Estimating the Dynamics of Consumption Growth"

We estimate models of consumption growth that allow for long-run risks and disasters using data for a panel of countries spanning 200 years. Our estimates indicate that a model with small and frequent disasters arriving at a mean-reverting rate best fits international consumption data. The implied posterior disaster intensity in such a model predicts equity returns without compromising the unpredictability of consumption growth. It also generates time-varying excess stock volatility, empirically validating key economic mechanisms often assumed in consumption-based asset pricing models.
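To illustrate the preferred model class, the following sketch simulates annual consumption growth subject to small, frequent disasters whose arrival intensity mean-reverts over time. All parameter values are hypothetical and chosen purely for illustration; they are not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (illustration only, not the paper's estimates)
mu, sigma = 0.02, 0.02             # mean and volatility of annual growth
kappa, lbar, eta = 0.3, 0.5, 0.4   # mean reversion, long-run level, and
                                   # volatility of the disaster intensity
jump_mean = -0.03                  # average size of a (small) disaster
dt, T = 1.0, 200                   # annual observations over 200 years

lam = lbar
growth = np.empty(T)
for t in range(T):
    # CIR-style mean-reverting disaster intensity, kept non-negative
    lam = max(lam + kappa * (lbar - lam) * dt
              + eta * np.sqrt(max(lam, 0.0) * dt) * rng.standard_normal(), 0.0)
    n_disasters = rng.poisson(lam * dt)            # small, frequent disasters
    growth[t] = (mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
                 + n_disasters * jump_mean)
```

Because the intensity is persistent, disaster arrivals cluster in time, which is the mechanism behind the time-varying excess volatility mentioned above.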

"The Systemic Effects of Benchmarking" (with D. Duarte and K. Lee)

We show that the competitive pressure to beat a benchmark may induce institutional trading behavior that exposes retail investors to tail risk. In our model, institutional investors differ from a retail investor in that they derive higher utility when their benchmark outperforms. This pushes institutional investors to take on leverage and overinvest in the benchmark. When the benchmark experiences a shock, institutional investors execute fire sales. This behavior increases market volatility, raising the tail risk exposure of the retail investor. Ex post, tail risk is only short lived: all investors survive in the long run under standard conditions, and the most patient investor dominates. Ex ante, however, benchmarking reduces the welfare of the retail investor and benefits only the impatient institutional investor.

"Efficient Inference and Filtering for Multivariate Jump-Diffusions" (with F. Guay), submitted. Codes

This paper develops estimators of the transition density, filters, and parameters of multivariate jump-diffusions with latent components. The drift, volatility, jump intensity, and jump magnitude are allowed to be general functions of the state. Our density and filter estimators converge at the canonical square-root rate, implying computational efficiency. Our parameter estimators have the same asymptotic properties as true maximum likelihood estimators, implying statistical efficiency. Numerical experiments highlight the superior performance of our estimators.

"Inference for Large Financial Systems" (with K. Giesecke and J. Sirignano), second round at Mathematical Finance.

We consider the problem of parameter estimation for large interacting stochastic systems where data are available on the aggregate state of the system. Parameter inference is computationally challenging due to the scale and complexity of such systems. Weak convergence results, similar in spirit to a law of large numbers and a central limit theorem, can be used to approximate large systems in distribution. We exploit these weak convergence results to develop approximate maximum likelihood estimators for such systems. The approximate estimators are shown to converge to the true parameters and are asymptotically normal as the number of observations and the size of the system become large. Numerical studies demonstrate the computational efficiency and accuracy of the approximate MLEs. Although our approach is widely applicable to large systems in many fields, we are particularly motivated by examples arising in finance such as systemic risk in banking systems and large portfolios of loans.
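The estimation idea can be illustrated with a toy large-pool model: each surviving loan defaults with a hazard that depends on the aggregate defaulted fraction, the aggregate is approximated by its law-of-large-numbers limit (an ODE), and the parameter is fit by a Gaussian quasi-likelihood around that limit, which with constant error variance reduces to nonlinear least squares. The hazard specification and all values below are hypothetical, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy sketch: each surviving loan defaults with hazard theta * (1 + x_t),
# where x_t is the defaulted fraction, so the law-of-large-numbers limit
# of x_t solves dx/dt = theta * (1 + x) * (1 - x).

def lln_path(theta, times):
    # Euler solution of the limiting ODE on an equally spaced grid
    x, dt, path = 0.0, times[1] - times[0], []
    for _ in times:
        path.append(x)
        x += theta * (1 + x) * (1 - x) * dt
    return np.array(path)

def approx_mle(observed, times):
    # Gaussian quasi-likelihood around the LLN limit (CLT-scale errors);
    # with constant error variance this is nonlinear least squares
    obj = lambda th: np.sum((observed - lln_path(th, times)) ** 2)
    return minimize_scalar(obj, bounds=(1e-4, 2.0), method="bounded").x

times = np.linspace(0.0, 5.0, 51)
rng = np.random.default_rng(1)
observed = lln_path(0.3, times) + 0.005 * rng.standard_normal(51)
theta_hat = approx_mle(observed, times)    # recovers theta near 0.3
```

The payoff is computational: the limiting ODE is solved once per candidate parameter, regardless of how many loans are in the pool.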

"Simulated Likelihood Estimators for Discretely-Observed Jump-Diffusions" (with K. Giesecke), second round at Journal of Econometrics. Codes

This paper develops an unbiased Monte Carlo approximation to the transition density of a jump-diffusion process with state-dependent drift, volatility, jump intensity, and jump magnitude. The approximation is used to construct a likelihood estimator of the parameters of a jump-diffusion observed at fixed time intervals that need not be short. The estimator is asymptotically unbiased for any sample size. It has the same large-sample asymptotic properties as the true but uncomputable likelihood estimator. Numerical results illustrate its properties.
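A simpler, well-known member of this family of estimators is Pedersen's simulated likelihood, sketched below for a scalar diffusion without jumps: simulate Euler paths to just before the observation time, then average the one-step Gaussian density of the final sub-interval. This is a stand-in for illustration and is not the unbiased estimator developed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def pedersen_density(x0, x1, delta, drift, vol, n_sub=10, n_paths=2000):
    """Simulated transition density p(x1 | x0, delta) for a scalar diffusion:
    Euler-simulate to time delta * (n_sub - 1) / n_sub, then average the
    one-step Gaussian density over the last sub-interval."""
    dt = delta / n_sub
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_sub - 1):
        x += drift(x) * dt + vol(x) * np.sqrt(dt) * rng.standard_normal(n_paths)
    m, s = x + drift(x) * dt, vol(x) * np.sqrt(dt)
    return np.mean(np.exp(-0.5 * ((x1 - m) / s) ** 2) / (s * np.sqrt(2 * np.pi)))

# Ornstein-Uhlenbeck example: dX = -X dt + 0.5 dW. The exact transition
# density is Gaussian, so the Monte Carlo estimate can be checked against it.
drift, vol = lambda x: -x, lambda x: 0.5 * np.ones_like(x)
est = pedersen_density(1.0, 0.8, 0.25, drift, vol)
```

Summing the log of such density estimates across observation intervals yields a simulated likelihood that can be maximized over the model parameters.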

"Exploring the Sources of Default Clustering" (with K. Giesecke and S. Azizpour), Journal of Financial Economics 129 (2018), 154-183.

We study the sources of corporate default clustering in the United States. We reject the hypothesis that firms’ default times are correlated only because their conditional default rates depend on observable and latent systematic factors. By contrast, we find strong evidence that contagion, through which the default by one firm has a direct impact on the health of other firms, is a significant clustering source. The amount of clustering that cannot be explained by contagion and firms’ exposure to observable and latent systematic factors is insignificant. Our results have important implications for the pricing and management of correlated default risk.

"Filtered Likelihood for Point Processes" (with K. Giesecke), Journal of Econometrics 204 (2018), 33-53. Codes

Point processes are used to model the timing of defaults, market transactions, births, unemployment, and many other events. We develop and study likelihood estimators of the parameters of a marked point process and of incompletely observed explanatory factors that influence the arrival intensity and mark distribution. We establish an approximation to the likelihood and analyze the convergence and large-sample properties of the associated estimators. Numerical results highlight the computational efficiency of our estimators and show that they can outperform EM algorithm estimators.
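For the simplest self-exciting case, a Hawkes process with exponential kernel, the likelihood is available in closed form; the sketch below computes it using the standard recursion for the self-exciting term. This illustrates a point-process likelihood with fully observed intensity and is not the filtered estimator developed in the paper.

```python
import numpy as np

def hawkes_loglik(times, T, mu, alpha, beta):
    """Log-likelihood of a Hawkes process on [0, T] with intensity
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))."""
    ll, A, prev = 0.0, 0.0, None
    for t in times:
        if prev is not None:
            # recursion: A_i = exp(-beta (t_i - t_{i-1})) * (A_{i-1} + alpha)
            A = np.exp(-beta * (t - prev)) * (A + alpha)
        ll += np.log(mu + A)
        prev = t
    # compensator: integral of the intensity over [0, T]
    comp = mu * T + (alpha / beta) * np.sum(
        1.0 - np.exp(-beta * (T - np.asarray(times))))
    return ll - comp

ll = hawkes_loglik([0.5, 1.2, 1.3, 3.0], T=4.0, mu=0.8, alpha=0.5, beta=2.0)
```

With alpha = 0 the formula collapses to the homogeneous Poisson log-likelihood n log(mu) - mu T, a useful sanity check; latent factors in the intensity are what necessitate the filtered likelihood studied in the paper.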