TY - JOUR
T1 - Optimal Bayesian Classification With Vector Autoregressive Data Dependency
AU - Zollanvari, Amin
AU - Dougherty, Edward R.
N1 - Funding Information:
Manuscript received November 8, 2018; revised February 4, 2019, March 5, 2019, and April 8, 2019; accepted April 8, 2019. Date of publication April 18, 2019; date of current version May 2, 2019. The associate editor coordinating the review of this manuscript and approving it for publication was Dr. Sotirios Chatzis. The work of A. Zollanvari was supported by the Nazarbayev University Faculty Development Competitive Research Grant under Award SOE2018008. The work of E. R. Dougherty was supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, Mathematical Multifaceted Integrated Capability Centers program under Award DE-SC0019393. (Corresponding author: Amin Zollanvari.) A. Zollanvari is with the Department of Electrical and Computer Engineering, Nazarbayev University, Nur-Sultan 010000, Kazakhstan (e-mail: amin.zollanvari@nu.edu.kz).
Publisher Copyright:
© 1991-2012 IEEE.
Copyright:
Copyright 2019 Elsevier B.V., All rights reserved.
PY - 2019/6/15
Y1 - 2019/6/15
N2 - In classification theory, it is generally assumed that the data are independent and identically distributed. However, in many practical applications, we face a set of observations that are collected sequentially with a dependence structure among samples. The primary focus of this investigation is to construct the optimal Bayesian classifier (OBC) when the training observations are serially dependent. To model the effect of dependency, we assume the training observations are generated from a VAR(p), which is a multidimensional vector autoregressive process of order p. At the same time, we assume there exists uncertainty about parameters governing the VAR(p) model. To model this uncertainty, we assume that model parameters (coefficient matrices) are random variables with a prior distribution, and find the resulting OBC under the assumption of known covariance matrices of white-noise processes. We employ simulations using both synthetic and real data to demonstrate the efficacy of the constructed OBC.
AB - In classification theory, it is generally assumed that the data are independent and identically distributed. However, in many practical applications, we face a set of observations that are collected sequentially with a dependence structure among samples. The primary focus of this investigation is to construct the optimal Bayesian classifier (OBC) when the training observations are serially dependent. To model the effect of dependency, we assume the training observations are generated from a VAR(p), which is a multidimensional vector autoregressive process of order p. At the same time, we assume there exists uncertainty about parameters governing the VAR(p) model. To model this uncertainty, we assume that model parameters (coefficient matrices) are random variables with a prior distribution, and find the resulting OBC under the assumption of known covariance matrices of white-noise processes. We employ simulations using both synthetic and real data to demonstrate the efficacy of the constructed OBC.
KW - Optimal Bayesian classification
KW - serially dependent training data
KW - vector autoregressive processes
UR - http://www.scopus.com/inward/record.url?scp=85065448185&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85065448185&partnerID=8YFLogxK
U2 - 10.1109/TSP.2019.2912131
DO - 10.1109/TSP.2019.2912131
M3 - Article
AN - SCOPUS:85065448185
VL - 67
SP - 3073
EP - 3086
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
SN - 1053-587X
IS - 12
M1 - 8693865
ER -