Bayesian Predictive Synthesis

Bayesian predictive synthesis defines a coherent theoretical basis for combining multiple forecast densities, whether from models, individuals, or other sources, and extends existing forecast pooling, ensemble learning, and Bayesian model mixing methods.
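At its core, and in notation assumed here rather than taken verbatim from the papers, BPS synthesizes forecast densities h_1, ..., h_J from J agents through a synthesis function alpha specified by the decision maker:

    \[
      p(y \mid \mathcal{H}) = \int \alpha(y \mid \boldsymbol{x}) \prod_{j=1}^{J} h_j(x_j) \, d\boldsymbol{x},
    \]

where x = (x_1, ..., x_J)' are latent states representing the agents' forecasts. Different choices of alpha recover linear pools and regression-style combinations as special cases.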

Motivated by questions about the foundational underpinnings of recently introduced algorithmic and empirical models for forecast density combination, we developed a new framework called Bayesian predictive synthesis (BPS). The framework interprets traditional and recently introduced pooling methods as special cases. More importantly from a practical time series forecasting perspective, developing BPS for sequential forecasting of time series enables the use of flexible, adaptive Bayesian dynamic models that can respond to changes in the characteristics of sets of models and forecasters over time. BPS has the potential to define fully Bayesian, interpretable models that adapt to time-varying biases and miscalibration of multiple models or forecasters, and to generate useful insights into patterns of relationships and dependencies among them while also improving forecast accuracy.
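To make the dynamic case concrete, here is a minimal, deliberately simplified Python sketch of one step of a dynamic linear synthesis model: each agent reports a Gaussian forecast density, the latent agent states are integrated out by crude Monte Carlo, and the time-varying synthesis coefficients follow a random walk updated by a Kalman-style step. Everything here (function names, the Gaussian and Kalman simplifications, parameter values) is an illustrative assumption, not the MCMC scheme developed in the papers.

    import numpy as np

    rng = np.random.default_rng(0)

    def bps_step(agent_means, agent_sds, theta, P, y_obs, v=0.1, w=0.01, n_mc=2000):
        # One simplified step of dynamic linear BPS with J Gaussian agents.
        # theta, P: mean and covariance of the synthesis coefficients
        # (intercept + one per agent); v, w: observation and evolution
        # variances (illustrative values).
        J = len(agent_means)
        P = P + w * np.eye(J + 1)                      # random-walk evolution
        x = rng.normal(agent_means, agent_sds, size=(n_mc, J))  # latent agent states
        X = np.column_stack([np.ones(n_mc), x])        # intercept + agent draws
        y_pred = X @ theta + rng.normal(0.0, np.sqrt(v), n_mc)  # predictive draws
        xbar = X.mean(axis=0)                          # crude MC integration
        S = xbar @ P @ xbar + v                        # forecast variance of y_t
        K = P @ xbar / S                               # Kalman gain
        theta = theta + K * (y_obs - xbar @ theta)     # update after observing y_t
        P = P - np.outer(K, K) * S
        return y_pred, theta, P

    # Example: two agents forecasting quarterly inflation (made-up numbers)
    theta, P = np.zeros(3), np.eye(3)
    y_pred, theta, P = bps_step(np.array([1.8, 2.2]), np.array([0.4, 0.6]),
                                theta, P, y_obs=2.0)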

[Figures: BPS illustrated; 4-step-ahead MSE of quarterly inflation; 24-step-ahead MSE of monthly inflation (multivariate).]

Papers: Dynamic Bayesian predictive synthesis for time series forecasting. Code can be found here.

Multivariate Bayesian predictive synthesis in macroeconomic forecasting

Dynamic mixed frequency synthesis for economic nowcasting.

Dynamic Shrinkage

For dynamic linear modeling with many predictors, the assumption of a static generative model with a fixed subset of regressors may be misleadingly restrictive.
Confinement to a single inferential model obscures variable selection uncertainty over time and may lead to poorer prediction and inference, especially when the effective subset of predictors at each time is sparse.

Motivated by such contexts, we develop a new dynamic shrinkage approach for time series models that exploits time-varying predictive subset sparsity, when it in fact exists. The proposed Dynamic Spike-and-Slab (DSS) priors are constructed as mixtures of two processes: a spike process for the irrelevant coefficients and a slab autoregressive process for the active coefficients. The mixing weights are themselves time-varying. 
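The construction is easy to simulate. Below is a hedged Python sketch of a single coefficient path under one possible DSS parameterization (a two-state Markov indicator with a stationary AR(1) slab; all names and values are illustrative assumptions, not the paper's exact specification):

    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_dss_path(T=200, p_stay=0.95, phi=0.98, slab_sd=0.5, spike_sd=0.01):
        # gamma_t: 0/1 Markov indicator (inactive/active) with persistence p_stay.
        # Active coefficients track a stationary AR(1) slab; inactive ones
        # collapse to a tight spike around zero.
        gamma = np.zeros(T, dtype=int)
        beta = np.zeros(T)
        slab = 0.0
        for t in range(1, T):
            stay = rng.random() < p_stay
            gamma[t] = gamma[t - 1] if stay else 1 - gamma[t - 1]
            slab = phi * slab + rng.normal(0.0, slab_sd * np.sqrt(1 - phi**2))
            beta[t] = slab if gamma[t] == 1 else rng.normal(0.0, spike_sd)
        return beta, gamma

Paths simulated this way alternate between long stretches near zero and persistent, smoothly varying active periods, which is the behavior the prior is designed to encode.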

Extending this, we develop a novel methodology for sparse dynamic factor analysis using DSS priors. The proposed method (1) captures time-varying patterns of sparsity in the latent structure, (2) infers the number of factors, and (3) accounts for structural instabilities over time through time-varying loadings and/or factors. Because the EM algorithm identifies a likely sparse structure, the method does not require the strong identification constraints that would typically be needed.
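Schematically, and with notation assumed here rather than copied from the paper, the model couples a time-varying factor structure with DSS priors on the loadings:

    \[
      \boldsymbol{y}_t = \boldsymbol{B}_t \boldsymbol{f}_t + \boldsymbol{\varepsilon}_t, \qquad
      \beta_{ijt} \mid \gamma_{ijt} \sim \gamma_{ijt}\, \pi_{\mathrm{slab}}(\beta_{ijt} \mid \beta_{ij,t-1})
      + (1 - \gamma_{ijt})\, \pi_{\mathrm{spike}}(\beta_{ijt}),
    \]

where gamma_{ijt} in {0, 1} indicates whether loading (i, j) is active at time t. Loadings whose indicators switch off are shrunk toward zero, producing the time-varying sparsity patterns, and a column of B_t that is entirely inactive effectively removes a factor, giving a route to learning the number of factors.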

[Figures: dynamic active coefficients for monthly inflation; active factor loadings on the latent factors at different periods.]

Papers: Dynamic variable selection with spike-and-slab process priors.

Dynamic sparse factor analysis.

Dynamic causal effects in time series: Temporal impacts of U.S. minimum wage.

Decouple-Recouple Synthesis

The increasing availability of large datasets, combined with recent advances in econometrics, statistics, and machine learning, has spurred interest in predictive models with many explanatory variables. Confronted with a large set of predictors, two main classes of models have become popular, even standard, within the regression framework: sparse modeling (shrinkage) and dense modeling (factor analysis). Despite their popularity, both approaches have shortcomings for economic and financial forecasting and decision making. In particular, these dimension reduction techniques either impose dogmatic shrinkage priors or sacrifice interpretability and decision insight, which can be critical for policy makers, analysts, and investors.

We propose a novel class of data-rich predictive synthesis techniques, contributing to the literature on predictive modeling and decision making with large datasets. The proposed method decouples a large predictive regression into a set of smaller regressions, constructed by clustering the regressors according to their economic meaning. Unlike other high-dimensional methods, this approach retains all of the available information: the smaller regressions yield multiple predictive densities, estimated separately, which are then recoupled into an aggregate predictive density. Decoupling keeps the aggregate model variance low, while the recoupling step sequentially learns and corrects for the misspecification bias that characterizes each group. The approach improves out-of-sample predictive performance while maintaining critical economic interpretability.
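A stylized Python sketch of the two steps, using ordinary and discounted least squares point forecasts as stand-ins for the dynamic Bayesian predictive densities of the paper (the group structure, burn-in lengths, and discount-factor recoupling rule are all illustrative assumptions):

    import numpy as np

    def decouple_recouple_forecast(y, X_groups, lam=0.99):
        # y: (T,) target series; X_groups: list of (T, p_g) regressor blocks,
        # one per economically meaningful cluster of predictors.
        T, G = len(y), len(X_groups)
        group_fc = np.zeros((T, G))
        # Decouple: one-step-ahead forecast from each small regression.
        for g, X in enumerate(X_groups):
            for t in range(10, T):                 # 10-obs training burn-in
                coef, *_ = np.linalg.lstsq(X[:t], y[:t], rcond=None)
                group_fc[t, g] = X[t] @ coef
        # Recouple: discounted regression of y on the group forecasts, so the
        # combination weights adapt and correct each group's misspecification.
        combined = np.zeros(T)
        for t in range(20, T):
            w = lam ** np.arange(t - 1, -1, -1)    # heavier weight on recent obs
            Z = group_fc[:t] * np.sqrt(w)[:, None]
            coef, *_ = np.linalg.lstsq(Z, y[:t] * np.sqrt(w), rcond=None)
            combined[t] = group_fc[t] @ coef
        return combined, group_fc

Each group regression stays small and interpretable, while the recoupling step carries the sequential learning of time-varying combination weights.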

[Figures: dynamic coefficients in the recouple step; certainty equivalent returns on the non-durable industry.]

Papers: Large-scale dynamic predictive regressions.

© Kenichiro McAlinn, 2016.