On Kolmogorov asymptotics of estimators of the misclassification error rate in linear discriminant analysis

Amin Zollanvari, Marc G. Genton

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

We provide a fundamental theorem that can be used in conjunction with Kolmogorov asymptotic conditions to derive the first moments of well-known estimators of the actual error rate in linear discriminant analysis of a multivariate Gaussian model under the assumption of a common known covariance matrix. The estimators studied in this paper are the plug-in and smoothed resubstitution error estimators, neither of which has been studied before under Kolmogorov asymptotic conditions. As a result of this work, we present an optimal smoothing parameter that makes the smoothed resubstitution an unbiased estimator of the true error. For the sake of completeness, we further show how to utilize the presented fundamental theorem to obtain several previously reported results, namely the first moment of the resubstitution estimator and the actual error rate. We provide numerical examples to show the accuracy of the resulting finite-sample approximations in situations where the number of dimensions is comparable to or even larger than the sample size.
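To make the estimators named in the abstract concrete, the sketch below simulates a two-class Gaussian model with a known common covariance and compares the actual (conditional) error rate of Anderson's linear discriminant with its plug-in, resubstitution, and smoothed resubstitution estimates. This is a minimal illustration under assumptions made here, not code from the paper: the identity covariance, equal priors and class sizes, the Mahalanobis-scaled smoothing kernel, and the constant `bandwidth` are all illustrative choices; in particular, `bandwidth` is a placeholder rather than the optimal smoothing parameter derived in the paper.

```python
# Minimal simulation sketch (illustrative assumptions, not the paper's setup):
# plug-in, resubstitution, and a generic smoothed-resubstitution estimator for
# LDA with a known common covariance, with dimension comparable to sample size.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

p, n0, n1 = 40, 50, 50                 # dimension comparable to per-class sample sizes
mu0 = np.zeros(p)
mu1 = np.full(p, 1.5 / np.sqrt(p))     # Mahalanobis separation fixed at 1.5
Sigma = np.eye(p)                      # known common covariance (identity here)
Sigma_inv = np.linalg.inv(Sigma)

X0 = rng.multivariate_normal(mu0, Sigma, n0)
X1 = rng.multivariate_normal(mu1, Sigma, n1)
xbar0, xbar1 = X0.mean(axis=0), X1.mean(axis=0)

a = Sigma_inv @ (xbar0 - xbar1)        # discriminant direction
m = 0.5 * (xbar0 + xbar1)              # midpoint between sample means
D_hat = np.sqrt((xbar0 - xbar1) @ a)   # estimated Mahalanobis distance

def W(X):
    """Anderson's discriminant: classify to class 0 when W(x) > 0."""
    return (X - m) @ a

# Actual error rate, conditional on the training data; computable because the
# true means and covariance are known in the simulation.
eps0 = norm.cdf(-((mu0 - m) @ a) / D_hat)   # P(W <= 0 | class 0)
eps1 = norm.cdf(((mu1 - m) @ a) / D_hat)    # P(W > 0  | class 1)
actual_error = 0.5 * (eps0 + eps1)

# Plug-in estimator: substitute the sample means for the true means,
# which reduces to Phi(-D_hat / 2) for both classes.
plug_in = norm.cdf(-D_hat / 2)

# Resubstitution: misclassification rate on the training data itself.
resub = 0.5 * (np.mean(W(X0) <= 0) + np.mean(W(X1) > 0))

# Smoothed resubstitution: replace the 0/1 indicator with a Gaussian CDF of the
# scaled discriminant score; `bandwidth` is a placeholder for the smoothing
# parameter whose optimal (unbiasedness-achieving) value the paper derives.
bandwidth = 0.5
smoothed = 0.5 * (np.mean(norm.cdf(-W(X0) / (bandwidth * D_hat)))
                  + np.mean(norm.cdf(W(X1) / (bandwidth * D_hat))))

print(f"actual error            : {actual_error:.4f}")
print(f"plug-in estimate        : {plug_in:.4f}")
print(f"resubstitution estimate : {resub:.4f}")
print(f"smoothed resubstitution : {smoothed:.4f}")
```

With p = 40 and n0 = n1 = 50, the dimension is of the same order as the sample size, which is the regime addressed by the Kolmogorov (double) asymptotics of the paper, where p and n grow together rather than p staying fixed.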

Original language: English
Pages (from-to): 300-326
Number of pages: 27
Journal: Sankhya: The Indian Journal of Statistics
Volume: 75 A
Issue number: PART2
Publication status: Published - 2013
Externally published: Yes


Keywords

  • Double asymptotics
  • Error estimation
  • Kolmogorov asymptotic analysis
  • Plug-in error
  • Resubstitution
  • Smoothed resubstitution
  • True error

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Cite this

On Kolmogorov asymptotics of estimators of the misclassification error rate in linear discriminant analysis. / Zollanvari, Amin; Genton, Marc G.

In: Sankhya: The Indian Journal of Statistics, Vol. 75 A, No. PART2, 2013, p. 300-326.

Research output: Contribution to journal › Article

@article{e59fed0bb8c8418b87680c7fbca8af99,
title = "On Kolmogorov asymptotics of estimators of the misclassification error rate in linear discriminant analysis",
abstract = "We provide a fundamental theorem that can be used in conjunction with Kolmogorov asymptotic conditions to derive the first moments of well-known estimators of the actual error rate in linear discriminant analysis of a multi-variate Gaussian model under the assumption of a common known covariance matrix. The estimators studied in this paper are plug-in and smoothed resub-stitution error estimators, both of which have not been studied before under Kolmogorov asymptotic conditions. As a result of this work, we present an optimal smoothing parameter that makes the smoothed resubstitution an unbiased estimator of the true error. For the sake of completeness, we further show how to utilize the presented fundamental theorem to achieve several previously reported results, namely the first moment of the resubsti-tution estimator and the actual error rate. We provide numerical examples to show the accuracy of the succeeding finite sample approximations in situations where the number of dimensions is comparable or even larger than the sample size.",
keywords = "Double asymptotics, Error estimation, Kolmogorov asymptotic analysis, Plug-in error, Resubstitution, Smoothed resubstitution, True error",
author = "Amin Zollanvari and Genton, {Marc G.}",
year = "2013",
language = "English",
volume = "75 A",
pages = "300--326",
journal = "Sankhya: The Indian Journal of Statistics",
issn = "0972-7671",
publisher = "Indian Statistical Institute",
number = "PART2",

}
