TY - JOUR
T1 - Autoencoders for a manifold learning problem with a Jacobian rank constraint
AU - Takhanov, Rustem
AU - Abylkairov, Y. Sultan
AU - Tezekbayev, Maxat
N1 - Funding Information:
This research has been funded by Nazarbayev University under the Faculty-development competitive research grants program for 2023-2025, Grant #20122022FD4131, PI R. Takhanov. The authors would like to thank Atakan Varol and Makat Tlebaliyev for providing computational resources of the Institute of Smart Systems and Artificial Intelligence (ISSAI).
Publisher Copyright:
© 2023 Elsevier Ltd
PY - 2023/11
Y1 - 2023/11
N2 - We formulate the manifold learning problem as the problem of finding an operator that maps any point to a close neighbor lying on a “hidden” k-dimensional manifold. We call this operator the correcting function. Under this formulation, autoencoders can be viewed as a tool for approximating the correcting function. Given an autoencoder whose Jacobian has rank k, we deduce from the classical Constant Rank Theorem that its range has the structure of a k-dimensional manifold. The k-dimensionality of the range can be enforced by the architecture of the autoencoder (by fixing the dimension of the code space) or, alternatively, by an additional constraint requiring that the rank of the autoencoder mapping be at most k. This constraint enters the objective function as a new term, namely the squared Ky Fan k-antinorm of the Jacobian. We argue that this constraint effectively reduces the dimension of the range of the autoencoder beyond the reduction already imposed by the architecture. We also add a new curvature term to the objective. Finally, we experimentally compare our approach with the CAE+H method on synthetic and real-world datasets.
KW - Alternating algorithm
KW - Autoencoders
KW - Dimensionality reduction
KW - Ky Fan antinorm
KW - Manifold learning
KW - Rank constraints
UR - http://www.scopus.com/inward/record.url?scp=85163861313&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85163861313&partnerID=8YFLogxK
U2 - 10.1016/j.patcog.2023.109777
DO - 10.1016/j.patcog.2023.109777
M3 - Article
AN - SCOPUS:85163861313
SN - 0031-3203
VL - 143
JO - Pattern Recognition
JF - Pattern Recognition
M1 - 109777
ER -