Conjecture: A square matrix [imath]A[/imath] whose eigenvalues [imath]\lambda_i[/imath] all satisfy [imath]\operatorname{Re}(\lambda_i) \le 0[/imath] is normal; that is, [imath]A^* A = A A^*[/imath], and its real Schur decomposition is block diagonal, with the diagonal entries equal to the eigenvalues of [imath]A[/imath], or, in the case of complex-conjugate pairs, [imath]2 \times 2[/imath] blocks with the real part on the diagonal and the imaginary part on the off-diagonals.

This is not homework. It is important for me to know whether this property holds for some code I'm implementing.

EDIT: Shit. I found a counter-example.

## Matrices Matrices Matrices...


### Re: Matrices Matrices Matrices...

What are you coding, if you don't mind my asking? I'll be doing the same, coding stuff with matrices etc. sooner or later in school.

- NathanielJ

### Re: Matrices Matrices Matrices...

> gorcee wrote: EDIT: Shit. I found a counter-example.

Yep, there is absolutely no connection between eigenvalues and whether or not the matrix is normal. Take whatever eigenvalues you're curious about, stick them down the diagonal of a matrix -- there's a normal matrix with those eigenvalues. Throw a 1 into the upper-triangular part of the matrix somewhere and you now have a non-normal matrix with the same eigenvalues.
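The diagonal-plus-one-entry construction above is easy to check numerically. A minimal sketch in NumPy (the eigenvalues and the `is_normal` helper are mine, just for illustration):

```python
import numpy as np

# Some hypothetical eigenvalues, all with non-positive real part
eigs = [-1.0, -2.0 + 1.0j, -2.0 - 1.0j]

D = np.diag(eigs)   # diagonal matrix with those eigenvalues: normal
N = D.copy()
N[0, 1] = 1.0       # same eigenvalues (still triangular), but no longer normal

def is_normal(M, tol=1e-12):
    """Check the defining property A* A = A A* up to a tolerance."""
    return np.allclose(M.conj().T @ M, M @ M.conj().T, atol=tol)

print(is_normal(D))  # True
print(is_normal(N))  # False
```

Both matrices are upper triangular, so both have exactly the eigenvalues in `eigs`; only the second fails normality.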

Matrices of the type you described are exactly the matrices whose Hermitian part (i.e. [imath]A + A^*[/imath]) is negative semidefinite. Thus, the matrix [imath]A + A^*[/imath] satisfies your conjecture, but [imath]A[/imath] itself doesn't need to.

### Re: Matrices Matrices Matrices...

I'm writing an implementation of Hammarling's method for the direct solution of the discrete-time non-negative definite Lyapunov equations designed to run on embedded controllers. I don't have access to a SLICOT license to use those routines commercially, and I've already developed the routine in MATLAB.

I had scared myself earlier because I thought maybe I hadn't tested my MATLAB code against some edge cases. However, it turns out the cases I was concerned about couldn't actually arise or have solutions for the discrete-time Lyapunov equation: the eigenvalues were outside the stability region.

Either way, the thing that initially led me down this path was the observation that the Schur decomposition need not be block diagonal. In MATLAB I kind of cheated and just used the backslash operator to solve a linear system within the routine. I had to code that in C, and I didn't really feel like implementing a full linear system solver, or invoking a LAPACK routine to do so. It turns out the structure of the Schur decomposition naturally let me implement a backsolve by doing nothing more than hard-coding the solution of at most a 2x2 block, and back-propagating the solutions through the right-hand side as they were found.
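The block backsolve idea can be sketched as follows. This is my own minimal illustration in NumPy, not the author's C routine: a quasi-upper-triangular system [imath]Tx = b[/imath] (real Schur form, 1x1 and 2x2 diagonal blocks) is solved bottom-up, with each 2x2 block handled by a hard-coded Cramer's-rule solve and the solved entries propagated into the remaining right-hand side.

```python
import numpy as np

def quasi_triangular_solve(T, b, tol=1e-12):
    """Solve T x = b for quasi-upper-triangular T (real Schur form).

    Works block by block from the bottom: a nonzero subdiagonal entry
    signals a 2x2 block (complex-conjugate eigenvalue pair); otherwise
    the block is 1x1. Solved entries are back-propagated into b.
    """
    n = T.shape[0]
    x = np.zeros(n)
    b = np.asarray(b, dtype=float).copy()
    i = n - 1
    while i >= 0:
        if i > 0 and abs(T[i, i - 1]) > tol:
            # 2x2 block: hard-coded solve via Cramer's rule.
            blk = T[i - 1:i + 1, i - 1:i + 1]
            det = blk[0, 0] * blk[1, 1] - blk[0, 1] * blk[1, 0]
            x[i - 1] = (b[i - 1] * blk[1, 1] - blk[0, 1] * b[i]) / det
            x[i]     = (blk[0, 0] * b[i] - b[i - 1] * blk[1, 0]) / det
            # Propagate both solved entries into the remaining RHS.
            b[:i - 1] -= T[:i - 1, i - 1] * x[i - 1] + T[:i - 1, i] * x[i]
            i -= 2
        else:
            # 1x1 block: plain back-substitution step.
            x[i] = b[i] / T[i, i]
            b[:i] -= T[:i, i] * x[i]
            i -= 1
    return x
```

So the only "linear solver" ever needed is a closed-form 2x2 inverse, which is presumably why the quasi-triangular structure made a full solver (or a LAPACK call) unnecessary.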

