2021

Journal article
Open Access

Structured backward errors in linearizations

Noferini V., Robol L., Vandebril R.

Keywords: Structured QR; Comrade matrix; Colleague matrix; Companion matrix; Linearization; Backward error; Numerical Analysis (math.NA); Mathematics - Numerical Analysis; FOS: Mathematics; Analysis

A standard approach to calculate the roots of a univariate polynomial is to compute the eigenvalues of an associated confederate matrix instead, such as, for instance, the companion or comrade matrix. The eigenvalues of the confederate matrix can be computed by Francis's QR algorithm. Unfortunately, even though the QR algorithm is provably backward stable, mapping the errors back to the original polynomial coefficients can still lead to huge errors. However, the latter statement assumes the use of a non-structure-exploiting QR algorithm. In [J. L. Aurentz et al., Fast and backward stable computation of roots of polynomials, SIAM J. Matrix Anal. Appl., 36 (2015), pp. 942-973] it was shown that a structure-exploiting QR algorithm for companion matrices leads to a structured backward error in the companion matrix. The proof relied on decomposing the error into two parts: a part related to the recurrence coefficients of the basis (a monomial basis in that case) and a part linked to the coefficients of the original polynomial. In this article we prove that the analysis can be extended to other classes of comrade matrices. We first provide an alternative backward stability proof in the monomial basis using structured QR algorithms; our new point of view shows more explicitly how a structured, decoupled error in the confederate matrix gets mapped to the associated polynomial coefficients. This insight reveals which properties have to be preserved by a structure-exploiting QR algorithm to end up with a backward stable algorithm. We will show that the previously formulated companion analysis fits into this framework, and we analyze in more detail Jacobi polynomials (comrade matrices) and Chebyshev polynomials (colleague matrices).
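The standard approach described above can be sketched numerically. The snippet below is an illustration only: it uses a plain dense eigensolver (`numpy.linalg.eigvals`) rather than the structured QR algorithms the paper analyzes, and the cubic test polynomial is an arbitrary choice. It computes the roots of p(x) = x³ − 6x² + 11x − 6 = (x − 1)(x − 2)(x − 3) twice: from the companion matrix (monomial basis) and from NumPy's scaled colleague matrix in the Chebyshev basis.

```python
import numpy as np
from numpy.polynomial import chebyshev as Ch

# Monomial-basis coefficients of p, leading coefficient first.
coeffs = np.array([1.0, -6.0, 11.0, -6.0])
n = len(coeffs) - 1

# Companion matrix: ones on the subdiagonal, last column -a_0, ..., -a_{n-1}
# (coefficients of the monic polynomial, ascending order).
C = np.zeros((n, n))
C[1:, :-1] = np.eye(n - 1)
C[:, -1] = -coeffs[:0:-1] / coeffs[0]

companion_roots = np.sort(np.linalg.eigvals(C).real)
print(companion_roots)  # approximately [1. 2. 3.]

# Colleague-matrix analogue: convert to the Chebyshev basis and take the
# eigenvalues of chebcompanion, NumPy's (scaled) colleague matrix.
cheb_coeffs = Ch.poly2cheb([-6.0, 11.0, -6.0, 1.0])  # ascending monomial order
colleague_roots = np.sort(np.linalg.eigvals(Ch.chebcompanion(cheb_coeffs)).real)
print(colleague_roots)  # approximately [1. 2. 3.]
```

Both matrices are linearizations of the same polynomial, so their eigenvalues agree; the paper's point is that what matters for backward stability is how eigensolver errors in these matrices map back to perturbations of the polynomial coefficients in the respective basis.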

**Source:** Electronic Transactions on Numerical Analysis 54 (2021): 420–442. doi:10.1553/ETNA_VOL54S420

**Publisher:** Kent State University, Kent, OH, United States of America

[1] Milton Abramowitz and Irene A. Stegun. Handbook of mathematical functions: with formulas, graphs, and mathematical tables, volume 55. Courier Corporation, 1965.

[2] Jared L. Aurentz, Thomas Mach, Leonardo Robol, Raf Vandebril, and David S. Watkins. Core-chasing algorithms for the eigenvalue problem, volume 13. SIAM, 2018.

[3] Jared L. Aurentz, Thomas Mach, Leonardo Robol, Raf Vandebril, and David S. Watkins. Fast and backward stable computation of roots of polynomials, part II: backward error analysis; companion matrix and companion pencil. SIAM J. Matrix Anal. A., 2018.

[4] Jared L. Aurentz, Thomas Mach, Leonardo Robol, Raf Vandebril, and David S. Watkins. Fast and backward stable computation of the eigenvalues of matrix polynomials. Math. Comp., 2018.

[5] Jared L. Aurentz, Thomas Mach, Raf Vandebril, and David S. Watkins. Fast and backward stable computation of roots of polynomials. SIAM J. Matrix Anal. A., 36(3):942-973, 2015.

[6] Stephen Barnett. Polynomials and Linear Control Systems. Marcel Dekker Inc., 1983.

[7] Fernando De Terán, Froilán M. Dopico, and Javier Pérez. Backward stability of polynomial root-finding using Fiedler companion matrices. IMA J. Numer. Anal., 36(1):133-173, 2016.

[8] Tobin A. Driscoll, Nicholas Hale, and Lloyd N. Trefethen. Chebfun guide, 2014.

[9] Alan Edelman and H. Murakami. Polynomial roots from companion matrix eigenvalues. Math. Comp., 64(210):763-776, 1995.

[10] Yuli Eidelman, Luca Gemignani, and Israel Gohberg. Efficient eigenvalue computation for quasiseparable Hermitian matrices under low rank perturbations. Numerical Algorithms, 47(3):253-273, 2008.

[11] Miroslav Fiedler. A note on companion matrices. Linear Algebra and its Applications, 375:325-332, 2003.

[12] Luigi Gatteschi. On the zeros of Jacobi polynomials and Bessel functions. In International conference on special functions: theory and computation (Turin, 1984). Rend. Sem. Mat. Univ. Politec. Torino (Special Issue), pages 149-177, 1985.

[13] Walter Gautschi and Carla Giordano. Luigi Gatteschi's work on asymptotics of special functions and their zeros. Numerical Algorithms, 49(1-4):11-31, 2008.

[14] Luca Gemignani and Leonardo Robol. Fast Hessenberg reduction of some rank structured matrices. SIAM J. Matrix Anal. A., 38(2):574-598, 2017.

[15] Piers W. Lawrence and Rob Corless. Stability of rootfinding for barycentric Lagrange interpolants. Numer. Algorithms, 65(3):447-464, 2014.

[16] Piers W. Lawrence, Marc Van Barel, and Paul Van Dooren. Backward error analysis of polynomial eigenvalue problems solved by linearizations. SIAM J. Matrix Anal. A., 37(1):123-144, 2016.

[17] D. Steven Mackey, Niloufer Mackey, Christina Mehl, and Volker Mehrmann. Vector spaces of linearizations for matrix polynomials. SIAM J. Matrix Anal. A., 28:971-1004, 2006.

[18] Yuji Nakatsukasa and Vanni Noferini. On the stability of computing polynomial roots via confederate linearizations. Math. Comp., 85(301):2391-2425, 2016.

[19] Yuji Nakatsukasa, Vanni Noferini, and Alex Townsend. Vector spaces of linearizations for matrix polynomials: a bivariate polynomial approach. SIAM J. Matrix Anal. A., 38(1):1-29, 2017.

[20] Vanni Noferini and Javier Pérez. Chebyshev rootfinding via computing eigenvalues of colleague matrices: when is it stable? Math. Comp., 86(306):1741-1767, 2017.

[21] Peter Opsomer. Asymptotics for orthogonal polynomials and high-frequency scattering problems. PhD thesis, Department of Computer Science, KU Leuven, 2018.

[22] Gábor Szegő. Orthogonal Polynomials. AMS Colloquium Publications, 1992.

[23] Lloyd N. Trefethen et al. Chebfun version 6, 2017.

[24] Paul Van Dooren and Patrick Dewilde. The eigenstructure of an arbitrary polynomial matrix: computational aspects. Linear Algebra Appl., 50:545-579, 1983.

[25] D. S. Watkins. The Matrix Eigenvalue Problem: GR and Krylov Subspace Methods. SIAM, Philadelphia, Pennsylvania, USA, 2007.



@article{oai:it.cnr:prodotti:468672,
  title     = {Structured backward errors in linearizations},
  author    = {Noferini V. and Robol L. and Vandebril R.},
  publisher = {Kent State University, Kent, OH, Stati Uniti d'America},
  doi       = {10.1553/etna_vol54s420 and 10.48550/arxiv.1912.04157},
  journal   = {Electronic transactions on numerical analysis},
  volume    = {54},
  pages     = {420--442},
  year      = {2021}
}