WHOA-PSI 3
September 8-10, 2018

Workshop Description

The Third Workshop on Higher-Order Asymptotics and Post-Selection Inference (WHOA-PSI) seeks to build on the success of the first two workshops by presenting the latest developments in post-selection inference, and by discussing how tools from higher-order asymptotics can both elucidate important properties of post-selection inference procedures and suggest new directions which may ultimately yield more accurate small-sample performance. The workshop format is intended to encourage collaboration and lively discussion, and to give a voice to all participants through online discussion forums (a successful experiment from the first two workshops). More specific details will be posted below soon. Contact: Todd Kuffner, email: kuffner@wustl.edu

This conference supports the Non-Discrimination Statement of the Association for Women in Mathematics (AWM).


Organizing Committee

Todd Kuffner (lead organizer)
Washington University in St. Louis
John Kolassa
Rutgers University
Dalia Ghanem
UC Davis

Speakers


Local Information

For those arriving early or thinking about staying longer, St. Louis is a lovely place to visit. Besides the iconic Gateway Arch and the nearby Old Courthouse which houses exhibits on the Dred Scott case, St. Louis has a stunning botanical garden, a high density of good restaurants (BBQ is a specialty), and is close to many rivers (Missouri, Mississippi and Meramec) which are great for float trips. There are many nearby parks and nature reserves which are excellent for hiking, as well as a wolf sanctuary. Mark Twain's boyhood home lies an hour north of the city. Anheuser-Busch is headquartered in St. Louis and offers tours of the brewery (requires advance booking due to popularity). For those unfamiliar with the institution, Washington University in St. Louis is a leading national research university, ranked 23rd in the world in the 2016 Academic Ranking of World Universities. Our statistics presence is concentrated in the Dept. of Mathematics. You are encouraged to look around this beautiful campus on the western edge of St. Louis, which faces Forest Park, the site of the 1904 World's Fair and home to the Saint Louis Zoo and Saint Louis Art Museum (both free admission, walking distance from campus).

Dates and Times

The workshop runs for three full days. Talks will begin around 8:30am on Saturday, September 8th, and will end by 5pm on Monday, September 10th, 2018.

Registration Information

Registration details will be available soon. The registration fee is expected to be around $320-340, which will include breakfasts, lunches, coffee breaks, and a banquet dinner.


Lodging Information

All workshop participants will have to make their own lodging arrangements. Details will be available soon.

Potential topics include (but are certainly not limited to) the following. Participants: feel free to send me updates!
Principles and general views of post-selection inference, for example
Benjamini (2010). `Simultaneous and selective inference: current successes and future challenges', Biometrical Journal 52, 708-721.
Taylor & Tibshirani (2015), `Statistical learning and selective inference', Proceedings of the National Academy of Sciences 112, 7629-7634.
Leeb & Potscher (2005), `Model selection and inference: facts and fiction', Econometric Theory 21, 21-59.

Foundations (general, not necessarily post-selection),
Keli Liu and Xiao-Li Meng (2016). There is individualized treatment. Why not individualized inference? Annual Review of Statistics and Its Application 3, 79-111.
Suzanne Thornton and Min-ge Xie (2017). Approximate confidence distribution computing: an effective likelihood-free method with statistical guarantees, arXiv:1705.10347.
Ryan Martin and Chuanhai Liu (2016). Validity and the foundations of statistical inference, arXiv: 1607.05051.

Comparisons of naive intervals and post-selection inference (a short simulation sketch follows this list), for example
Zhao, Shojaie & Witten (2017), `In defense of the indefensible: a very naive approach to high-dimensional inference', arXiv: 1705.05543.
Leeb, Potscher & Ewald (2015), `On various confidence intervals post-model-selection', Statistical Science 30, 216-227.
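
To make the contrast concrete, here is a minimal simulation sketch (an illustration of the phenomenon, not a method from the papers above): we observe Z_i ~ N(0, 1), keep only the coordinates with |Z_i| > 1, and check how often the naive 95% interval Z_i +/- 1.96 covers the true mean (zero for every coordinate) among the selected coordinates. The threshold and dimensions are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
m, reps = 200, 500
covered, selected = 0, 0
for _ in range(reps):
    z = rng.standard_normal(m)        # true means are all zero
    keep = np.abs(z) > 1.0            # data-driven selection step
    selected += keep.sum()
    # the naive interval z +/- 1.96 covers the true mean 0 iff |z| < 1.96
    covered += np.sum(np.abs(z[keep]) < 1.96)
print(f"conditional coverage of naive 95% intervals: {covered / selected:.3f}")

The printed coverage is roughly 0.84 rather than the nominal 0.95; quantifying and repairing exactly this kind of gap is what the comparisons above are about.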

Incorporating resampling and asymptotic refinements into inference procedures relevant for this workshop (a baseline bootstrap sketch follows this list), for example
Stephen M.S. Lee and Yilei Wu (2017). Resampling-based post-model-selection inference for linear regression models.
Andreas Buja and Werner Stuetzle (2017). Smoothing effects of bagging: von Mises expansions of bagged statistical functionals, arXiv: 1612.02528.
Noureddine El Karoui and Elizabeth Purdom (2015). Can we trust the bootstrap in high dimensions? Submitted.
Noureddine El Karoui and Elizabeth Purdom (2016). The bootstrap, covariance matrices, and PCA in moderate and high-dimensions. Submitted.
McCarthy, Zhang, Brown, Berk, Buja, George & Zhao (2017). Calibrated Percentile Double Bootstrap for Robust Linear Regression Inference, Statistica Sinica, accepted.
Mayya Zhilova (2016). Non-classical Berry-Esseen inequality and accuracy of the weighted bootstrap, arXiv: 1611.02686.
Mayya Zhilova (2015). Simultaneous likelihood-based bootstrap confidence sets for a large number of models, arXiv: 1506.05779.
Ian McKeague and Min Qian (2015). An adaptive resampling test for detecting the presence of significant predictors (with discussion). J. Amer. Statist. Assoc. 110, 1422-1433.
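
As a baseline for the refinements above, here is a minimal pairs-bootstrap sketch: a plain percentile interval for a regression slope (the calibrated and double-bootstrap methods in the references improve on precisely this kind of first-order interval). The data-generating model and B = 2000 resamples are illustrative.

import numpy as np

rng = np.random.default_rng(1)
n = 100
x = rng.standard_normal(n)
y = 2.0 * x + rng.standard_normal(n)        # true slope is 2

def slope(x, y):
    # least-squares slope of y on x (np.polyfit returns [slope, intercept])
    return np.polyfit(x, y, 1)[0]

boot = np.empty(2000)
for b in range(boot.size):
    idx = rng.integers(0, n, n)             # resample (x_i, y_i) pairs
    boot[b] = slope(x[idx], y[idx])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% percentile bootstrap interval for the slope: [{lo:.2f}, {hi:.2f}]")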

Cross-Validation, AIC, inference and prediction post-selection (a leave-one-out prediction sketch follows this list), for example
Jing Lei (2017). Cross-validation with confidence, arXiv: 1703.07904.
Ali Charkhi & Gerda Claeskens (2017). Asymptotic post-selection inference for Akaike's information criterion.
Lukas Steinberger and Hannes Leeb (2016). Leave-one-out prediction intervals in linear regression models with many variables, arXiv: 1602.05801.
Liang Hong, Todd Kuffner & Ryan Martin (2017). On overfitting and post-selection uncertainty assessments. Submitted.
Liang Hong, Todd Kuffner & Ryan Martin (2017). On prediction of future insurance claims when the model is uncertain. Submitted.
Francois Bachoc, Hannes Leeb & Benedikt Potscher (2017). Valid confidence intervals for post-model-selection predictors, arXiv: 1412.4605.
Hannes Leeb (2009). Conditional predictive inference post model selection. Annals of Statistics 37(5B), 2838-2876.
Hannes Leeb (2008). Evaluation and selection of models for out-of-sample prediction when the sample size is small relative to the complexity of the data-generating process. Bernoulli 14(3), 661-690.
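
In the spirit of the leave-one-out prediction intervals studied by Steinberger & Leeb above, here is a minimal jackknife-style sketch; the symmetric absolute-residual quantile and the toy data are illustrative assumptions, not their exact construction.

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n, p = 100, 5
X = rng.standard_normal((n, p))
y = X @ np.array([1.0, -1.0, 0.5, 0.0, 0.0]) + rng.standard_normal(n)

loo_abs_res = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i                # hold out observation i
    fit = LinearRegression().fit(X[mask], y[mask])
    loo_abs_res[i] = abs(y[i] - fit.predict(X[i:i + 1])[0])

q = np.quantile(loo_abs_res, 0.95)          # 95% quantile of held-out errors
x_new = rng.standard_normal((1, p))
pred = LinearRegression().fit(X, y).predict(x_new)[0]
print(f"approximate 95% prediction interval: [{pred - q:.2f}, {pred + q:.2f}]")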

Assumption-lean and distribution-free inference, conformal prediction and robustness (a split-conformal sketch follows this list), for example
Buja, Berk, Brown, George, Kuchibhotla & Zhao. Models as Approximations II: A General Theory of Model-Robust Regression. arXiv: 1612.03257.
Anru Zhang, Larry Brown & Tony Cai (2016). Semi-supervised inference: general theory and estimation of means, arXiv: 1606.07268.
Lei, G'Sell, Rinaldo, Tibshirani & Wasserman (2017). Distribution-free predictive inference for regression, J. Amer. Statist. Assoc., to appear.
Fan Yang and Rina Foygel Barber (2017). Contraction and uniform convergence of isotonic regression. arXiv: 1706.01852.
Azriel, Brown, Sklar, Berk, Buja & Zhao (2016). Semi-supervised linear regression, arXiv: 1612.02391.
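
Here is a minimal split-conformal sketch along the lines of the Lei et al. paper above: fit on one half of the data, calibrate with absolute residuals on the other half, and use the finite-sample quantile adjustment to get a distribution-free prediction interval. The linear model and alpha = 0.1 are illustrative choices.

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n, p, alpha = 200, 5, 0.1
X = rng.standard_normal((n, p))
y = X[:, 0] - X[:, 1] + rng.standard_normal(n)

train, cal = np.arange(n // 2), np.arange(n // 2, n)
fit = LinearRegression().fit(X[train], y[train])
scores = np.abs(y[cal] - fit.predict(X[cal]))    # calibration residuals
# finite-sample-valid quantile level: ceil((n_cal + 1)(1 - alpha)) / n_cal
level = np.ceil((len(cal) + 1) * (1 - alpha)) / len(cal)
q = np.quantile(scores, min(level, 1.0))

x_new = rng.standard_normal((1, p))
pred = fit.predict(x_new)[0]
print(f"90% conformal prediction interval: [{pred - q:.2f}, {pred + q:.2f}]")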

Statistical efficiency and inference in machine learning, for example
Susan Athey & Stefan Wager (2017). Efficient policy learning, arXiv: 1702.02896.
Qingyuan Zhao & Trevor Hastie (2017). Causal interpretations of black-box models, J. of Business & Economic Statistics, to appear.

Model-based clustering and cluster-based models and inference, for example
Bunea, Eisenbach, Ning, Dinicu and Liu (2017). Inference in cluster-based graphical models.
Bunea, Ning and Wegkamp (2017). Overlapping clustering with statistical guarantees, arXiv: 1704.06977.

Selective inference (conditional approaches; a toy truncated-Gaussian sketch follows this list), for example
Yuval Benjamini, Jonathan Taylor & Rafael Irizarry (2016). Selection corrected statistical inference for region detection with high-dimensional throughput assays, bioRxiv preprint.
Hyun, G'Sell & Tibshirani (2016), `Exact post-selection inference for changepoint detection and other generalized lasso problems', arXiv: 1606.03552
Taylor & Tibshirani (2016), `Post-selection inference for L1-penalized likelihood models', arXiv: 1602.07358
Fithian, Taylor, Tibshirani & Tibshirani (2015+), `Selective sequential model selection', arXiv: 1512.02565
Tibshirani, Taylor, Lockhart & Tibshirani (2015+), `Exact post-selection inference for sequential regression procedures', J. Amer. Statist. Assoc., to appear.
Lockhart, Taylor, Tibshirani & Tibshirani (2014), `A significance test for the lasso', Annals of Statistics 42, 413-468.
Tibshirani, Rinaldo, Tibshirani & Wasserman (2015), `Uniform asymptotic inference and the bootstrap after model selection', arXiv: 1506.06266
Tian & Taylor (2015), `Asymptotics of selective inference', arXiv: 1501.03588
Lee, Sun, Sun & Taylor (2016), `Exact post-selection inference with the lasso', to appear in the Annals of Statistics.
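
The simplest instance of the conditional approach is a single truncated Gaussian: if Z ~ N(mu, 1) is only reported when Z > c, then valid inference for the selected mu should use the law of Z truncated to (c, infinity), so the selective p-value for H0: mu = 0 is P(Z > z_obs | Z > c) = Phi(-z_obs) / Phi(-c). The toy sketch below (the threshold and observed value are arbitrary) shows the naive and selective p-values side by side.

from scipy.stats import norm

c, z_obs = 1.0, 2.2                          # selection threshold, observed Z
naive_p = norm.sf(z_obs)                     # ignores the selection event
selective_p = norm.sf(z_obs) / norm.sf(c)    # conditions on Z > c
print(f"naive p-value: {naive_p:.4f}   selective p-value: {selective_p:.4f}")

The selective p-value (about 0.088 here, versus a naive 0.014) is larger because large values of Z are partly an artifact of looking only at coordinates that survived the screen; the papers above extend this conditioning idea to lasso and sequential selection events.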

Simultaneous inference, false discovery rates (FDR), false coverage statement rates (FCR), family-wise error rates (FWER), for example (a Benjamini-Hochberg sketch follows this list)
Berk, Brown, Buja, Zhang & Zhao (2013), `Valid post-selection inference', Annals of Statistics 41, 802-837.
Benjamini (2010), `Discovering the false discovery rate', J. Roy. Statist. Soc. Ser. B 72, 405-416.
Benjamini & Yekutieli (2005), `False discovery rate-adjusted multiple confidence intervals for selected parameters', J. Amer. Statist. Assoc. 100, 71-93.
G'Sell, Wager, Chouldechova & Tibshirani (2015+), `Sequential selection procedures and false discovery rate control', J. Roy. Statist. Soc. Ser. B, to appear.
Barber & Candes (2015), `Controlling the false discovery rate via knockoffs', Annals of Statistics 43, 2055-2085.
Su, Bogdan & Candes (2016+), `False discoveries occur early on the Lasso path', arXiv: 1511.01957.
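
For orientation, here is a minimal implementation of the classical Benjamini-Hochberg step-up rule, the starting point the papers above build on; the p-values and level q = 0.10 are made up for illustration.

import numpy as np

def bh_reject(pvals, q=0.10):
    """Boolean mask of rejections at FDR level q (Benjamini-Hochberg)."""
    p = np.asarray(pvals)
    m = p.size
    order = np.argsort(p)
    below = p[order] <= q * np.arange(1, m + 1) / m
    k = below.nonzero()[0].max() + 1 if below.any() else 0   # step-up cutoff
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True
    return reject

print(bh_reject([0.001, 0.008, 0.039, 0.041, 0.22, 0.51, 0.74]))
# rejects the four smallest p-values at q = 0.10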

Principled statistical inference, for example
Todd Kuffner and Alastair Young (2017). Principled statistical inference in data science. Submitted.
Todd Kuffner and Alastair Young (2017). Philosophy of science, principled statistical inference, and data science.

Bayesian post-selection inference, for example
Panigrahi, Taylor & Weinstein (2016). `Bayesian post-selection inference in the linear model', arXiv: 1605.08824
Yekutieli (2012). `Adjusted Bayesian inference for selected parameters', J. Roy. Statist. Soc. Ser. B, 74(3), 515-541.

Bagging and Boosting (a bagging sketch follows this list), for example
Bradic (2016). `Randomized maximum-contrast selection: subagging for large-scale regression', Elec. J. Statist. 10(1), 121-170.
Li & Bradic (2015). `Boosting in the presence of outliers: adaptive classification with non-convex loss functions', arXiv: 1510.01064.
Efron (2014), `Estimation and accuracy after model selection', J. Amer. Statist. Assoc. 109, 991-1007.
Buhlmann & Yu (2002), `Analyzing bagging', Annals of Statistics 30, 927-961.
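
A minimal bagging sketch, in the Breiman style that the Buhlmann & Yu paper above analyzes: average the predictions of trees grown on bootstrap resamples and compare with a single tree. The sine-curve data and B = 50 resamples are illustrative.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(4)
n = 200
x = rng.uniform(-2, 2, (n, 1))
truth = np.sin(3 * x[:, 0])
y = truth + 0.3 * rng.standard_normal(n)

B = 50
preds = np.zeros((B, n))
for b in range(B):
    idx = rng.integers(0, n, n)              # bootstrap resample
    preds[b] = DecisionTreeRegressor().fit(x[idx], y[idx]).predict(x)
bagged = preds.mean(axis=0)                  # bagged (smoothed) predictor

single = DecisionTreeRegressor().fit(x, y).predict(x)
print(f"MSE against the true curve -- single tree: "
      f"{np.mean((single - truth) ** 2):.3f}, bagged: "
      f"{np.mean((bagged - truth) ** 2):.3f}")

Bagging smooths out the hard splits of the individual trees, which is exactly the smoothing / von Mises perspective taken in the Buja & Stuetzle paper above.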

High-dimensional inference (a stability-selection sketch follows this list), for example
Po-Ling Loh (2017). Statistical consistency and asymptotic normality for high-dimensional robust M-estimators, Annals of Statistics 45(2), 866-896.
Fan, Shao & Zhou (2015), `Are discoveries spurious? Distributions of Maximum Spurious Correlations and their applications', arXiv: 1502.04237
Cai & Guo (2015), `Confidence intervals for high-dimensional linear regression: minimax rates and adaptivity', arXiv: 1506.05539
Ning & Liu (2015), `A general theory of hypothesis tests and confidence regions for sparse high dimensional models', arXiv: 1412.8765
Ning, Zhao & Liu (2015), `A likelihood ratio framework for high dimensional semiparametric regression', arXiv: 1412.2295
Shah & Samworth (2013), `Variable selection with error control: another look at stability selection', J. Roy. Statist. Soc. B 75, 55-80.
Meinshausen & Buhlmann (2010), `Stability selection', J. Roy. Statist. Soc. Ser. B 72, 417-473.
van de Geer, Buhlmann, Ritov & Dezeure (2014), `On asymptotically optimal confidence regions and tests for high-dimensional models', Annals of Statistics 42, 1166-1202.
Javanmard & Montanari (2015+), `Hypothesis testing in high-dimensional regression under the Gaussian random design model: asymptotic theory', IEEE Trans. Inform. Theory, to appear.
Liu & Yu (2013), `Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression', Electronic J. Statist. 7, 3124-3169.
Zhang & Zhang (2014), `Confidence intervals for low-dimensional parameters in high-dimensional linear models', J. Roy. Statist. Soc. Ser. B 76, 217-242.
Belloni, Chernozhukov & Hansen, `Inference methods for high-dimensional sparse econometric models', Advances in Economics & Econometrics, Econometric Society World Congress 2010.
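
As a small taste of the stability-selection idea from the Meinshausen & Buhlmann and Shah & Samworth papers above, here is a sketch that reruns the lasso on random half-samples and records how often each variable is selected; the fixed penalty alpha = 0.2 and the toy design are illustrative simplifications.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(5)
n, p = 100, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = 2.0                                   # three true signals
y = X @ beta + rng.standard_normal(n)

B = 100
freq = np.zeros(p)
for _ in range(B):
    idx = rng.choice(n, n // 2, replace=False)   # random half-sample
    freq += Lasso(alpha=0.2).fit(X[idx], y[idx]).coef_ != 0
freq /= B
print("selection frequency of the 3 true signals:", np.round(freq[:3], 2))
print("highest frequency among the 47 noise variables:", freq[3:].max())

Variables whose selection frequency stays high across subsamples are the stable ones; thresholding these frequencies is what yields the error control studied in the papers above.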

Selection and inference for weak signals, for example
Shi & Qu (2016). `Weak signal identification and inference in penalized model selection', Annals of Statistics, to appear.
Jeng (2016). `Detecting weak signals in high dimensions', J. Multivariate Anal. 147, 234-246.


The workshop will emphasize those aspects of the above topics, and of other post-selection inference procedures, that relate to higher-order asymptotics, including both analytic and resampling-based tools and refinements (a one-formula illustration follows the references below). Some recent references for post-selection inference include:
Chapter 3 of Fithian (2015), Topics in Adaptive Inference, Ph.D. thesis, Stanford University.
Chapter 6 of Hastie, Tibshirani & Wainwright (2015), Statistical Learning with Sparsity: The Lasso and Generalizations, Chapman & Hall.
Chapters 10-11 of Buhlmann & van de Geer (2011), Statistics for High-Dimensional Data, Springer.
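
As a one-formula illustration of the higher-order refinements in question: for a standardized mean of n i.i.d. observations with skewness \gamma, a standard Edgeworth expansion gives

P\left( \frac{\sqrt{n}(\bar{X}_n - \mu)}{\sigma} \le x \right) = \Phi(x) - \frac{\gamma\,(x^2 - 1)\,\phi(x)}{6\sqrt{n}} + O(n^{-1}),

and analytic corrections or bootstrap schemes that capture the n^{-1/2} term are what upgrade first-order procedures to second-order accuracy. The workshop asks how far such refinements can be carried into the post-selection setting.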