Mathematical Statistics (2023/2024)

Course code
DT001090
Name of lecturer
Catia Scricciolo
Academic year 2023/2024, PhD programme (Dottorato di Ricerca), from Oct 1, 2023 to Sep 30, 2024.


Learning outcomes

Introduce students to the theory of nonparametric estimation through models and examples.


Program

Introduction to the problem of nonparametric estimation and overview of the course topics:
a. methods of construction of estimators,
b. statistical properties of estimators (convergence and rates of convergence),
c. study of optimality of estimators.
Examples of nonparametric problems and models:
- estimation of a probability density,
- nonparametric regression,
- Gaussian white noise model.
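As a sketch in standard notation (the symbols p, f, ξ and W below are the usual conventions in this literature, not fixed by the syllabus), the three models can be written as:

```latex
% Density estimation: X_1, ..., X_n i.i.d. from an unknown density p
X_1, \dots, X_n \overset{\text{i.i.d.}}{\sim} p

% Nonparametric regression: unknown regression function f, centred noise
Y_i = f(X_i) + \xi_i, \qquad \mathbb{E}[\xi_i] = 0, \quad i = 1, \dots, n

% Gaussian white noise model: W a standard Wiener process on [0, 1]
\mathrm{d}Y(t) = f(t)\,\mathrm{d}t + \tfrac{1}{\sqrt{n}}\,\mathrm{d}W(t),
\qquad t \in [0, 1]
```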
Distances/divergences between probability measures:
- Hellinger and total variation distances,
- Scheffé’s theorem and Le Cam’s inequalities,
- Kullback-Leibler and χ2-divergences,
- inequalities linking the distances and divergences.
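As an illustrative sketch (not part of the course materials), these quantities and some of the link inequalities can be checked numerically on a pair of toy discrete distributions; the Hellinger distance below follows the convention H²(P,Q) = Σ(√p_k − √q_k)²:

```python
import numpy as np

# Two toy discrete distributions on the same three-point support.
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])

tv = 0.5 * np.abs(p - q).sum()                        # total variation distance
H = np.sqrt(((np.sqrt(p) - np.sqrt(q)) ** 2).sum())   # Hellinger distance
kl = (p * np.log(p / q)).sum()                        # Kullback-Leibler divergence
chi2 = ((p - q) ** 2 / q).sum()                       # chi-squared divergence

# Le Cam's inequalities: H^2/2 <= TV <= H * sqrt(1 - H^2/4)
assert H**2 / 2 <= tv <= H * np.sqrt(1 - H**2 / 4)
# Pinsker's inequality: TV <= sqrt(KL/2)
assert tv <= np.sqrt(kl / 2)
# KL is bounded through the chi-squared divergence: KL <= log(1 + chi^2)
assert kl <= np.log(1 + chi2)
```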
Estimation of the distribution function: definition of the empirical distribution function and consistency.
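A minimal sketch of the empirical distribution function and of its consistency (the names `ecdf` and `sample`, and the uniform test case, are illustrative choices, not from the course):

```python
import numpy as np

def ecdf(sample, t):
    """Empirical distribution function F_n(t) = (1/n) #{i : X_i <= t}."""
    sample = np.asarray(sample)
    t = np.atleast_1d(t)
    return (sample[None, :] <= t[:, None]).mean(axis=1)

# Consistency in action: F_n(t) -> F(t) for an i.i.d. sample, here U[0, 1],
# whose true distribution function is F(t) = t on [0, 1].
rng = np.random.default_rng(0)
x = rng.uniform(size=10_000)
t = np.array([0.25, 0.5, 0.75])
max_dev = np.max(np.abs(ecdf(x, t) - t))  # small for large n (Glivenko-Cantelli)
```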
Estimation of a probability density:
- definition of the Parzen–Rosenblatt kernel density estimator in the uni- and multidimensional cases, examples of kernels,
- definition of the mean squared error (MSE) of kernel estimators at a point and decomposition into the sum of the variance and the squared bias,
- upper bound on the point-wise variance,
- upper bound on the point-wise bias under regularity conditions on the density and the kernel: definitions of Hölder classes and higher order kernels,
- upper bound on the supremum point-wise MSE of kernel estimators,
- mean integrated squared error (MISE): decomposition into the sum of the integrated variance and the integrated squared bias,
- control of the variance term,
- control of the bias term on Nikol’skii and Sobolev classes of regular densities; upper bound on the MISE for densities in Sobolev classes.
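A minimal sketch of the Parzen–Rosenblatt estimator with a Gaussian kernel (the function name `kde`, the sample size, and the bandwidth value are illustrative assumptions, not prescribed by the course):

```python
import numpy as np

def kde(x, sample, h):
    """Parzen-Rosenblatt estimator p_n(x) = (1/(n h)) sum_i K((x - X_i)/h)
    with the Gaussian kernel K(u) = exp(-u^2/2)/sqrt(2 pi)."""
    x = np.atleast_1d(x)
    u = (x[:, None] - sample[None, :]) / h
    K = np.exp(-u**2 / 2) / np.sqrt(2 * np.pi)
    return K.mean(axis=1) / h

rng = np.random.default_rng(1)
sample = rng.normal(size=2000)            # i.i.d. draws from N(0, 1)
grid = np.linspace(-6, 6, 601)
p_hat = kde(grid, sample, h=0.3)          # density estimate on a grid
mass = p_hat.sum() * (grid[1] - grid[0])  # Riemann sum, should be close to 1
```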
Fourier analysis of kernel density estimators:
- preliminary facts on Fourier transforms (FTs),
- the empirical characteristic function: unbiasedness as an estimator of the characteristic function (the FT of the distribution), expression of its variance,
- expression of the exact MISE of kernel density estimators,
- control of the bias term over Sobolev classes of densities,
- discussion of the local condition around zero on the FT of the kernel.
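As an illustrative simulation (an assumed setup, not from the lectures), the unbiasedness of the empirical characteristic function makes it concentrate around the true characteristic function:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=5000)                 # i.i.d. sample from N(0, 1)
t = np.linspace(-3, 3, 13)

# Empirical characteristic function: phi_n(t) = (1/n) sum_j exp(i t X_j).
phi_n = np.exp(1j * t[:, None] * x[None, :]).mean(axis=1)

# Characteristic function of N(0, 1): phi(t) = exp(-t^2/2).
phi = np.exp(-t**2 / 2)
err = np.max(np.abs(phi_n - phi))         # small, since E[phi_n(t)] = phi(t)
```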
Nonparametric regression:
- nonparametric regression with fixed or random design,
- nonparametric regression with random design and the Nadaraya-Watson (N-W) estimator,
- derivation of the expression of the N-W estimator from kernel density estimators,
- the N-W estimator as a linear nonparametric regression estimator,
- asymptotic analysis of the N-W estimator,
- nonparametric regression with fixed (regular) design,
- definition of projection (or orthogonal series) estimators,
- the trigonometric basis as an example of orthonormal basis,
- Sobolev classes and ellipsoids,
- bias and MSE of the coefficient estimators,
- control of the residuals via the condition that the vector of coefficients belongs to a Sobolev ellipsoid; decomposition of the MISE of the projection estimator and optimal choice of the cut-off point,
- upper bound on the MISE for the projection estimator,
- connection between the Gaussian white noise model and nonparametric regression.
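A minimal sketch of the Nadaraya-Watson estimator under an assumed toy model Y_i = sin(2πX_i) + ξ_i (all names, the Gaussian kernel choice, and the parameter values here are illustrative):

```python
import numpy as np

def nadaraya_watson(x, X, Y, h):
    """N-W estimator: f_n(x) = sum_i Y_i K((x - X_i)/h) / sum_i K((x - X_i)/h),
    a locally weighted average, i.e. a linear estimator of the regression function."""
    w = np.exp(-((np.atleast_1d(x)[:, None] - X[None, :]) / h) ** 2 / 2)
    return (w * Y[None, :]).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, 500)                              # random design
Y = np.sin(2 * np.pi * X) + 0.3 * rng.normal(size=500)  # noisy responses
f_hat = nadaraya_watson(np.array([0.25, 0.75]), X, Y, h=0.05)
# f(0.25) = 1 and f(0.75) = -1, so f_hat should be close to (1, -1).
```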
Lower bounds on the minimax risk:
- minimax risk associated with a statistical model and a semi-metric,
- definition of an optimal rate of convergence,
- a general reduction scheme for proving lower bounds,
- main theorem on lower bounds based on many hypotheses using the Kullback-Leibler divergence,
- example of a lower bound on the minimax L2-risk for the Hölder class in nonparametric regression estimation with fixed design.
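In standard notation (a sketch, not the course's exact statement), the minimax risk and the notion of optimal rate from this block read:

```latex
% Minimax risk over a class \Theta, for a semi-metric d:
\mathcal{R}_n(\Theta) = \inf_{\hat\theta_n} \sup_{\theta \in \Theta}
  \mathbb{E}_\theta\!\left[ d^2(\hat\theta_n, \theta) \right]

% A sequence \psi_n is an optimal rate of convergence on (\Theta, d) if
0 < \liminf_{n \to \infty} \psi_n^{-2}\,\mathcal{R}_n(\Theta)
  \le \limsup_{n \to \infty} \psi_n^{-2}\,\mathcal{R}_n(\Theta) < \infty
```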

Reference books

See the teaching bibliography

Assessment methods and criteria

Students may either take a written assessment in classical form, with questions on topics covered in the lectures, or write a report on findings from the recent literature on nonparametric statistical inference.