Peer reviewed Open access
  • NONPARAMETRIC STOCHASTIC AP...
    Dieuleveut, Aymeric; Bach, Francis

    The Annals of Statistics, 08/2016, Volume: 44, Issue: 4
    Journal Article

    We consider the random-design least-squares regression problem within the reproducing kernel Hilbert space (RKHS) framework. Given a stream of independent and identically distributed input/output data, we aim to learn a regression function within an RKHS ℋ, even if the optimal predictor (i.e., the conditional expectation) is not in ℋ. In a stochastic approximation framework where the estimator is updated after each observation, we show that the averaged unregularized least-mean-square algorithm (a form of stochastic gradient descent), given a sufficiently large step-size, attains optimal rates of convergence for a variety of regimes for the smoothness of the optimal prediction function and of the functions in ℋ. Our results apply as well in the usual finite-dimensional setting of parametric least-squares regression, showing adaptivity of our estimator to the spectral decay of the covariance matrix of the covariates.
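
    The averaged unregularized least-mean-square algorithm described above can be sketched in the finite-dimensional (parametric) setting. This is an illustrative assumption-laden sketch, not the authors' code: the synthetic data, dimension, step-size, and variable names are all invented for the example; the paper's nonparametric RKHS setting replaces the vector x with a feature map.

    ```python
    import numpy as np

    # Hedged sketch: constant-step-size least-mean-squares (a form of SGD)
    # with Polyak-Ruppert averaging, on synthetic streaming data.
    # All concrete values (d, n, gamma, noise level) are assumptions.

    rng = np.random.default_rng(0)
    d, n = 5, 20000
    theta_star = rng.standard_normal(d)  # unknown optimal predictor

    def stream(n):
        # i.i.d. input/output pairs, as in the random-design setting
        for _ in range(n):
            x = rng.standard_normal(d)
            y = x @ theta_star + 0.1 * rng.standard_normal()
            yield x, y

    gamma = 0.05             # constant ("sufficiently large") step-size
    theta = np.zeros(d)      # current iterate
    theta_bar = np.zeros(d)  # running average of iterates

    for t, (x, y) in enumerate(stream(n), start=1):
        # unregularized LMS update after each observation:
        # theta <- theta - gamma * (<theta, x> - y) * x
        theta -= gamma * (x @ theta - y) * x
        # online Polyak-Ruppert average: theta_bar_t = mean of theta_1..theta_t
        theta_bar += (theta - theta_bar) / t

    err = np.linalg.norm(theta_bar - theta_star)
    ```

    Averaging is what allows the constant step-size: the last iterate keeps fluctuating at a scale set by gamma and the noise, while the average of the iterates converges to the optimal predictor.
    
    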