TY - JOUR

T1 - Inverse eigenvalue problems for extended Hessenberg and extended tridiagonal matrices

AU - Mach, Thomas

AU - Van Barel, Marc

AU - Vandebril, Raf

N1 - Funding Information:
This research was supported in part by the Research Council KU Leuven, projects OT/11/055 (Spectral Properties of Perturbed Normal Matrices and their Applications), OT/10/038 (Multi-Parameter Model Order Reduction and its Applications), CoE EF/05/006 Optimization in Engineering (OPTEC); by the Fund for Scientific Research - Flanders (Belgium), project G034212N (Reestablishing Smoothness for Matrix Manifold Optimization via Resolution of Singularities); by the Interuniversity Attraction Poles Programme, initiated by the Belgian State, Science Policy Office, Belgian Network DYSCO (Dynamical Systems, Control, and Optimization); and by a DFG research stipend MA 5852/1-1.

PY - 2014/12/15

Y1 - 2014/12/15

N2 - In inverse eigenvalue problems one tries to reconstruct a matrix, satisfying some constraints, given some spectral information. Here, two inverse eigenvalue problems are solved. First, given the eigenvalues and the first components of the associated eigenvectors (called the weight vector), an extended Hessenberg matrix with prescribed poles is computed that possesses these eigenvalues and satisfies the eigenvector constraints. The extended Hessenberg matrix is retrieved by executing specially designed unitary similarity transformations on the diagonal matrix containing the eigenvalues. This inverse problem is closely linked to orthogonal rational functions: the extended Hessenberg matrix contains the recurrence coefficients, given the nodes (eigenvalues), the poles (poles of the extended Hessenberg matrix), and a weight vector (first eigenvector components) determining the discrete inner product. Moreover, it can be viewed as the inverse of the (rational) Arnoldi algorithm: instead of using the (rational) Arnoldi method to compute a Krylov basis to approximate the spectrum, we reconstruct the orthogonal Krylov basis given the spectral information. In the second inverse eigenvalue problem we do the same, but refrain from unitarity. As a result, we execute possibly non-unitary similarity transformations on the diagonal matrix of eigenvalues to retrieve a (non)symmetric extended tridiagonal matrix. This algorithm is less stable but faster, as the extended tridiagonal matrix admits a low-cost O(n) factorization (n being the number of eigenvalues), whereas the extended Hessenberg matrix does not. Again there is a close link with orthogonal function theory: the extended tridiagonal matrix captures the recurrence coefficients of bi-orthogonal rational functions. Moreover, it can again be viewed as the inverse of the nonsymmetric Lanczos algorithm: given the spectral properties, we reconstruct the two Krylov basis matrices linked to the nonsymmetric Lanczos algorithm.

AB - In inverse eigenvalue problems one tries to reconstruct a matrix, satisfying some constraints, given some spectral information. Here, two inverse eigenvalue problems are solved. First, given the eigenvalues and the first components of the associated eigenvectors (called the weight vector), an extended Hessenberg matrix with prescribed poles is computed that possesses these eigenvalues and satisfies the eigenvector constraints. The extended Hessenberg matrix is retrieved by executing specially designed unitary similarity transformations on the diagonal matrix containing the eigenvalues. This inverse problem is closely linked to orthogonal rational functions: the extended Hessenberg matrix contains the recurrence coefficients, given the nodes (eigenvalues), the poles (poles of the extended Hessenberg matrix), and a weight vector (first eigenvector components) determining the discrete inner product. Moreover, it can be viewed as the inverse of the (rational) Arnoldi algorithm: instead of using the (rational) Arnoldi method to compute a Krylov basis to approximate the spectrum, we reconstruct the orthogonal Krylov basis given the spectral information. In the second inverse eigenvalue problem we do the same, but refrain from unitarity. As a result, we execute possibly non-unitary similarity transformations on the diagonal matrix of eigenvalues to retrieve a (non)symmetric extended tridiagonal matrix. This algorithm is less stable but faster, as the extended tridiagonal matrix admits a low-cost O(n) factorization (n being the number of eigenvalues), whereas the extended Hessenberg matrix does not. Again there is a close link with orthogonal function theory: the extended tridiagonal matrix captures the recurrence coefficients of bi-orthogonal rational functions. Moreover, it can again be viewed as the inverse of the nonsymmetric Lanczos algorithm: given the spectral properties, we reconstruct the two Krylov basis matrices linked to the nonsymmetric Lanczos algorithm.

KW - Data sparse factorizations

KW - Extended Hessenberg matrix

KW - Inverse eigenvalue problem

KW - Rational Arnoldi

KW - Rational nonsymmetric Lanczos

UR - http://www.scopus.com/inward/record.url?scp=84903893830&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84903893830&partnerID=8YFLogxK

U2 - 10.1016/j.cam.2014.03.015

DO - 10.1016/j.cam.2014.03.015

M3 - Article

AN - SCOPUS:84903893830

VL - 272

SP - 377

EP - 398

JO - Journal of Computational and Applied Mathematics

JF - Journal of Computational and Applied Mathematics

SN - 0377-0427

ER -