Alastair Young's
Contributions to Statistics



Updated on 13th August, 2019
George Alastair Young originally hails from Scotland and is presently Professor of Statistics at Imperial College London. He was previously at the University of Cambridge, where he was a student from 1981 to 1987 and a faculty member from 1987 to 2004. Alastair completed his B.Sc. in Mathematics and Statistics at the University of Edinburgh in 1981, and his Ph.D. at Cambridge in 1987 under David George Kendall. Alastair has an outstanding record of service to the profession, especially to the Royal Statistical Society and the Institute of Mathematical Statistics (IMS). His many contributions include serving as Joint Editor of JRSS B from 1994 to 1998, as an Associate Editor of Biometrika continuously since 1999, and as an Elected Member of the IMS Council (2019-2022).

Alastair was elected as an IMS Fellow in 2000 with the citation: "For contributions to contemporary non-parametric methods, particularly algorithms for the bootstrap (e.g. saddlepoint methods), bootstrap methods for confidence procedures, and applications of computer algebra to resampling; and for service to the profession, for example in the roles of editor and associate editor of scholarly journals."

A CV prepared by T. Kuffner is available by clicking here.

Alastair's research is on the theoretical side of statistics, with particular focus on theory motivated by practice and on the implications of theory for methodology and applications. Much of Alastair's work is driven by the desire to achieve highly accurate inferences, such as those based on interval estimates, p-values, or related inferential tools. Since the late 1980s, Alastair has been among the leading figures in the use of higher-order asymptotic analysis to (i) understand how and when bootstrap methodology provides accurate inferences; (ii) understand the sources of inaccuracy in first-order asymptotic approximations of distributions; and (iii) seek refinements to standard procedures that are more accurate in small- to moderate-sized sample settings. His work spans a diverse collection of topics, covering the full spectrum of the bootstrap (parametric, nonparametric, independent and dependent data, smooth and nonsmooth functionals), both Fisherian and Bayesian likelihood-based inference, nonparametric methodology, comparative analysis of different routes to accurate and powerful small-sample inference, higher-order asymptotic methods, and post-selection inference. Below are some topics emphasized in his career up to his 60th birthday, along with some relevant publications; a full list of publications appears further down the page.
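
As a purely illustrative sketch, not drawn from any of the papers listed on this page, the short Python example below contrasts a first-order normal-approximation confidence interval for a mean with a studentized bootstrap (bootstrap-t) interval, the kind of refinement whose improved small-sample accuracy higher-order asymptotic analysis explains. The exponential sample, the sample size of 20 and the 5000 resamples are arbitrary choices for illustration only.

import numpy as np

# Small, skewed sample: a setting where first-order approximations can be poor.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=20)
n = len(x)
alpha = 0.05

mean_hat = x.mean()
se_hat = x.std(ddof=1) / np.sqrt(n)

# First-order interval from the normal approximation to the studentized mean.
z = 1.959963984540054  # standard normal 97.5% point
normal_ci = (mean_hat - z * se_hat, mean_hat + z * se_hat)

# Bootstrap-t refinement: resample, studentize each replicate, and use the
# bootstrap quantiles of the studentized statistic in place of +/- z.  Under
# smoothness conditions, one-sided coverage error improves from O(n^{-1/2})
# to O(n^{-1}).
B = 5000
t_star = np.empty(B)
for b in range(B):
    xb = rng.choice(x, size=n, replace=True)
    t_star[b] = (xb.mean() - mean_hat) / (xb.std(ddof=1) / np.sqrt(n))

q_lo, q_hi = np.quantile(t_star, [alpha / 2, 1 - alpha / 2])
boot_t_ci = (mean_hat - q_hi * se_hat, mean_hat - q_lo * se_hat)

print("normal approximation CI:", normal_ci)
print("bootstrap-t CI:", boot_t_ci)

The same construction, with the resampling step replaced by draws from a fitted parametric model, gives the parametric bootstrap considered in several of the papers listed below.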

Google Scholar Profile
MathSciNet Profile
Math Genealogy
Alastair's Department Webpage

Bootstrap Smoothing

Saddlepoint and Laplace Approximations

Analytic and Bootstrap Inference / Distribution Estimation

Fisherian Conditional Inference

Likelihood and Bootstrap Inference with Nuisance Parameters

Objective Bayes / Probability Matching Priors

Kriging Prediction

Reviews and Monographs

Full List of Publications
  1. Kendall, D.G. and Young, G.A.  (1984). Indirectional statistics and the significance of an asymmetry discovered by Birch. Monthly Notices of the Royal Astronomical Society 207, 637-647. [pdf]
  2. Young, G.A. (1986). Data-based statistical methods. Ph.D. thesis, University of Cambridge. [pdf by request & approval only]
  3. Young, G.A. (1986). Conditioned data-based simulations: some examples from geometrical statistics. International Statistical Review 54, 1-13. [pdf]
  4. Silverman, B.W. and Young, G.A. (1987). The bootstrap: to smooth or not to smooth? Biometrika 74(3), 469-479. [pdf]
  5. Banks, D.L. and Young, G.A. (1987). Discussion of the paper by Jones and Sibson. Journal of the Royal Statistical Society Series A 150, 23-24. [pdf]
  6. Young, G.A. (1987). Contributed conference paper: Non-parametric smoothing of the bootstrap. In Proceedings of the 1st World Congress of the Bernoulli Society, Yu. Prohorov, V.V. Sazonov (eds.), Volume 2, 105-108. VNU Science Press, Utrecht. [pdf]
  7. Young, G.A. (1988). Contributed conference paper: Resampling tests of statistical hypotheses. In Compstat 88: Proceedings in Computational Statistics, D. Edwards, N.E. Raun (eds.), 233-238. Physica-Verlag, Heidelberg. [pdf]
  8. Young, G.A. (1988). A note on bootstrapping the correlation coefficient. Biometrika 75, 370-373. [pdf]
  9. Young, G.A. (1988). Invited conference paper: Data-smoothing and bootstrap resampling. In Model-oriented Data Analysis, V. Fedorov, H. Läuter (eds.), 144-149. Springer-Verlag, Berlin. [pdf]
  10. Young, G.A. (1988). Discussion of the papers by Hinkley and by DiCiccio and Romano. Journal of the Royal Statistical Society Series B 50, 358. [pdf]
  11. Young, G.A. (1990). Alternative smoothed bootstraps. Journal of the Royal Statistical Society Series B 52, 477-484. [pdf]
  12. Young, G.A. and Daniels, H.E. (1990). Bootstrap bias. Biometrika 77, 179-185. [pdf]
  13. De Angelis, D. and Young, G.A. (1990). Un esempio di perequazione empirica del bootstrap [An example of empirical smoothing of the bootstrap]. Quaderni di Statistica e Econometria 12, 163-170. [pdf not available]
  14. Daniels, H.E. and Young, G.A. (1991). Saddlepoint approximation for the studentized mean, with an application to the bootstrap. Biometrika 78, 169-179. [pdf]
  15. DiCiccio, T.J., Martin, M.A. and Young, G.A. (1991). An invariance property of marginal density and tail probability approximations for smooth functions. Statistics and Probability Letters 12, 249-255. [pdf]
  16. De Angelis, D. and Young, G.A. (1992). Bootstrapping the correlation coefficient: a comparison of smoothing strategies. Journal of Statistical Computation and Simulation 40, 167-176. [pdf]
  17. De Angelis, D. and Young, G.A.  (1992). Smoothing the bootstrap. International Statistical Review 60, 45-56. [pdf]
  18. DiCiccio, T.J., Martin, M.A. and Young, G.A. (1992). Analytic approximations for iterated bootstrap confidence intervals. Statistics and Computing 2, 161-171. [pdf]
  19. DiCiccio, T.J., Martin, M.A. and Young, G.A. (1992). Fast and accurate approximate double bootstrap confidence intervals. Biometrika 79, 285-295. [pdf]
  20. Young, G.A. (1992). Discussion of the paper by Efron. Journal of the Royal Statistical Society Series B 54, 113-114. [pdf]
  21. De Angelis, D., Hall, P.G. and Young, G.A. (1993). A note on coverage error of bootstrap confidence intervals for quantiles. Mathematical Proceedings of the Cambridge Philosophical Society 114, 517-531. [pdf]
  22. De Angelis, D., Hall, P.G. and Young, G.A. (1993). Analytic and bootstrap approximations to estimator distributions in L1 regression. Journal of the American Statistical Association 88, 1310-1322. [pdf]
  23. DiCiccio, T.J., Martin, M.A. and Young, G.A. (1993). Analytical approximations to conditional distribution functions. Biometrika 80, 781-790. [pdf]
  24. DiCiccio, T.J., Martin, M.A. and Young, G.A. (1994). Analytic approximations to bootstrap distribution functions using saddlepoint methods. Statistica Sinica 4, 281-295. [pdf]
  25. Lee, S.M.S. and Young, G.A. (1994). Practical higher-order smoothing of the bootstrap. Statistica Sinica 4, 445-459. [pdf]
  26. Young, G.A. (1994). Bootstrap: more than a stab in the dark? (with discussion). Statistical Science 9, 382-415. [pdf]
  27. Lee, S.M.S. and Young, G.A. (1994). Invited conference paper: Approximate iterated bootstrap confidence intervals. Computing Science and Statistics 26, 464-471. [pdf]
  28. Lee, S.M.S. and Young, G.A. (1995). Asymptotic iterated bootstrap confidence intervals. Annals of Statistics 23, 1301-1330. [pdf]
  29. DiCiccio, T.J., Martin, M.A., Stern, S.E. and Young, G.A. (1996). Information bias and adjusted profile likelihoods. Journal of the Royal Statistical Society Series B 58, 189-203. [pdf]
  30. Lee, S.M.S. and Young, G.A. (1996). Sequential iterated bootstrap confidence intervals. Journal of the Royal Statistical Society Series B 58, 235-251. [pdf]
  31. Lee, S.M.S. and Young, G.A. (1996). On the use of bootstrap calibration: invited discussion of a paper by DiCiccio and Efron. Statistical Science 11, 221-223. [pdf]
  32. Lee, S.M.S. and Young, G.A. (1997). Estimation of the distribution function of a standardized statistic. Journal of the Royal Statistical Society Series B 59, 383-400. [pdf]
  33. Brown, B.M., Hall, P.G. and Young, G.A. (1997). On the effect of inliers on the spatial median. Journal of Multivariate Analysis 63, 88-104. [pdf]
  34. De Angelis, D., Fachin, S. and Young, G.A. (1997). Bootstrapping unit root tests. Applied Economics 29, 1155-1161. [pdf]
  35. Lee, S.M.S. and Young, G.A. (1997). Invited conference paper: Asymptotics and resampling methods. Computing Science and Statistics 28, 221-227. [pdf]
  36. Lee, S.M.S. and Young, G.A. (1997). Invited conference paper: Bootstrapping and improved nonparametric likelihood ratio confidence intervals. Proceedings of the Statistical Computing Section, ASA, 18-23. [pdf not available]
  37. De Angelis, D. and Young, G.A. (1998). Bootstrap method. In Encyclopedia of Biostatistics, Volume 1, P. Armitage, T. Colton (eds.), 426-433. Wiley, New York. [Reproduced in Elston, Olson and Palmer (eds.) Encyclopedia of Human Genetics and Genetic Epidemiology (2002).] [pdf not available]
  38. Lee, S.M.S. and Young, G.A. (1999). The effect of Monte Carlo approximation on coverage error of double bootstrap confidence intervals. Journal of the Royal Statistical Society Series B 61, 353-366. [pdf]
  39. Lee, S.M.S. and Young, G.A. (1999). Nonparametric likelihood ratio confidence intervals. Biometrika 86, 107-118. [pdf]
  40. Hall, P.G., Lee, S.M.S. and Young, G.A. (2000). Importance of interpolation when constructing double bootstrap confidence intervals. Journal of the Royal Statistical Society Series B 62, 479-491. [pdf]
  41. Lee, S.M.S. and Young, G.A. (2000). Comment on 'Hybrid resampling methods for confidence intervals' by C.S. Chuang and T.L. Lai. Statistica Sinica 10, 43-46. [pdf]
  42. Brown, B.M., Hall, P.G. and Young, G.A. (2001). The smoothed median and the bootstrap. Biometrika 88, 519-534. [pdf]
  43. Putter, H. and Young, G.A. (2001). On the effect of covariance function estimation on the accuracy of kriging predictors. Bernoulli 7, 421-438. [pdf]
  44. Sjöstedt-de Luna, S. and Young, G.A. (2003). The bootstrap and kriging prediction intervals. Scandinavian Journal of Statistics 30, 175-192. [pdf]
  45. Lee, S.M.S. and Young, G.A. (2003). Prepivoting by weighted bootstrap iterations. Biometrika 90, 393-410. [pdf]
  46. Robinson, J., Ronchetti, E. and Young, G.A. (2003). Saddlepoint approximations and tests based on multivariate M-estimates. Annals of Statistics 31, 1154-1169. [pdf]
  47. Young, G.A. (2003). Better bootstrapping by constrained prepivoting. Metron 61, 227-242. [pdf]
  48. Davison, A.C., Hinkley, D.V. and Young, G.A. (2003). Recent developments in bootstrap methodology. Statistical Science 18, 141-157. [pdf]
  49. Young, G.A. and Smith, R.L. (2005). Essentials of Statistical Inference, Cambridge University Press, 225 pages. [pdf not available]
  50. Lee, S.M.S. and Young, G.A. (2005). Parametric bootstrapping with nuisance parameters. Statistics and Probability Letters 71, 143-153. [pdf]
  51. Cheung, K.Y., Lee, S.M.S. and Young, G.A. (2005). Iterating the m out of n bootstrap in nonregular smooth function models. Statistica Sinica 15, 945-967. [pdf]
  52. Cheung, K.Y., Lee, S.M.S. and Young, G.A. (2006). Stein confidence sets based on non-iterated and iterated parametric bootstraps. Statistica Sinica 16, 45-75. [pdf]
  53. DiCiccio, T.J., Monti, A.C. and Young, G.A. (2006). Variance stabilization for a scalar parameter. Journal of the Royal Statistical Society Series B 68, 281-303. [pdf]
  54. DiCiccio, T.J. and Young, G.A. (2008). Conditional properties of unconditional parametric bootstrap procedures for inference in exponential families. Biometrika 95, 747-758. [pdf]
  55. Young, G.A. (2009). Routes to higher-order accuracy in parametric inference. Australian & New Zealand Journal of Statistics 51, 115-126. [pdf]
  56. DiCiccio, T.J. and Young, G.A. (2010). Objective Bayes and conditional inference in exponential families. Biometrika 97, 497-504. [pdf]
  57. DiCiccio, T.J. and Young, G.A. (2010). Computer-intensive conditional inference. In Complex Data Modeling and Computationally Intensive Statistical Methods, P. Mantovan, P. Secchi (eds.), 137-150. Springer-Verlag Italia, Milan. [pdf]
  58. Lu, K. and Young, G.A. (2010). Discussion of the paper by Cule, Samworth and Stewart. Journal of the Royal Statistical Society Series B 72, 582-584. [pdf]
  59. DiCiccio, T.J. and Young, G.A. (2011). Conditional inference by estimation of a marginal distribution. In Selected works of Debabrata Basu, A. Dasgupta (ed.), 9-14. Springer-Verlag, New York. [pdf]
  60. Lu, K. and Young, G.A. (2012). Parametric bootstrap under model mis-specification. Computational Statistics and Data Analysis 56, 2410-2420. [pdf]
  61. DiCiccio, T.J., Kuffner, T.A. and Young, G.A. (2012). Objective Bayes, conditional inference and the signed root likelihood ratio statistic. Biometrika 99, 675-686. [pdf]
  62. DiCiccio, T.J., Kuffner, T.A. and Young, G.A. (2015). Quantifying nuisance parameter effects via decompositions of asymptotic refinements for likelihood-based statistics. Journal of Statistical Planning and Inference 165, 1-12. [pdf]
  63. Ruan, D., Young, G.A. and Montana, G. (2015). Differential analysis of biological networks. BMC Bioinformatics 16(1), 1-13. [pdf]
  64. DiCiccio, T.J., Kuffner, T.A., Young, G.A. and Zaretzki, R. (2015). Stability and uniqueness of p-values for likelihood-based inference. Statistica Sinica 25(4), 1355-1376. [pdf]
  65. Lee, S.M.S. and Young, G.A. (2016). Distribution of likelihood-based p-values under a local alternative hypothesis. Biometrika 103(3), 641-653. [pdf]
  66. DiCiccio, T.J., Kuffner, T.A. and Young, G.A. (2017). The formal relationship between analytic and bootstrap approaches to parametric inference. Journal of Statistical Planning and Inference 191, 81-87. [pdf]
  67. DiCiccio, T.J., Kuffner, T.A. and Young, G.A. (2017). A simple analysis of the exact probability matching prior in the location-scale model. American Statistician 71(4), 302-304. [pdf]
  68. Kuffner, T.A., Lee, S.M.S. and Young, G.A. (2018). Consistency of a hybrid block bootstrap for distribution and variance estimation for sample quantiles of weakly dependent sequences. Australian & New Zealand Journal of Statistics 60(1), 103-114. [pdf]
  69. Kuffner, T.A. and Young, G.A. (2018). Principled statistical inference in data science. In Statistical Data Science, N. Adams, E. Cohen, Y.K. Guo (eds.), 21-36. World Scientific. [pdf]
Ph.D. Students Supervised
  1. Iain Stemp; University of Cambridge, 1993
  2. Stephen M.S. Lee; University of Cambridge, 1994
  3. Richard Samworth; University of Cambridge, 2004
  4. Kevin Lu; Imperial College London, 2011
  5. Todd Kuffner; Imperial College London, 2011
  6. Daniel Garcia Rasines; Imperial College London (TBD, co-advised with Axel Gandy)
Miscellanea