# sandbox/easystab/david/LinearSystems.md

(Lecture notes for the M2R-DET course, D. Fabre, nov. 2018)

Work in progress!

This document gives mathematical support for the study of linear dynamical systems. The notions will be used throughout the course, especially in lecture 1 and lecture 9.

Paragraphs in green are mostly useful for lectures 9 and 10, and can be skipped on a first reading.

# Linear dynamical systems

In this section we investigate the behaviour of an exactly linear dynamical system possessing a fixed point X_0 = 0, written in the form:

\displaystyle \frac{d X}{dt} = A X

where X(t) = [x_1(t); x_2(t); \dots; x_N(t)] is a column vector of dimension N and A is a square matrix of dimension N.

The matrix A (which coincides with the gradient of the flow of the underlying dynamical system) defines a linear map from \mathbb{R}^N to \mathbb{R}^N (if the matrix is real) or from \mathbb{C}^N to \mathbb{C}^N (if the matrix is complex). However, even when the matrix is real, the solution of the dynamical system may have to be expressed in terms of complex solutions, so it is better to consider it as a complex map in all cases.
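As a quick numerical illustration of the system above, the sketch below (using a hypothetical 2x2 matrix chosen for this example, not taken from the notes) compares the exact solution X(t) = e^{At} X_0, computed with `scipy.linalg.expm`, against a crude forward-Euler time integration of dX/dt = A X:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical example matrix (an assumption for illustration only)
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
X0 = np.array([1.0, 0.0])

# Exact solution of dX/dt = A X at time t: X(t) = expm(A t) X0
t = 0.5
Xt = expm(A * t) @ X0

# Crude forward-Euler integration of the same system
dt = 1e-4
X = X0.copy()
for _ in range(int(t / dt)):
    X = X + dt * (A @ X)

print(np.allclose(X, Xt, atol=1e-3))  # the two solutions agree
```

The matrix exponential gives the exact evolution of a linear system, which makes it a convenient reference when checking a time-stepping scheme.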

## A few definitions

In the sequel we will need a scalar product <X,Y> and a norm ||X|| on \mathbb{C}^N. The simplest (canonical) choice is the following, where the overbar denotes the complex conjugate:

\displaystyle <X,Y> = \overline{X}^T \cdot Y = \sum_{i=1}^N \overline{x}_i y_i, \qquad ||X|| = {\left(<X,X>\right)}^{1/2} = {\left( \sum_{i=1}^N |{x}_i|^2 \right)}^{1/2}.

We call A^\dag the adjoint of the matrix A, defined by the property:

\displaystyle \forall (X,Y), \quad < A^\dag X, Y> = <X, A Y>.

It follows that A^\dag = \overline{A}^T, namely the adjoint matrix is the conjugate transpose of the matrix (also called the Hermitian transpose).
A matrix is said to be self-adjoint (or Hermitian) if A^\dag = A.
For real matrices this notion is equivalent to saying that the matrix is symmetric.
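The defining property of the adjoint can be checked numerically. The sketch below uses a random complex matrix (a hypothetical example, not from the notes) and the canonical scalar product, implemented with `np.vdot`, which conjugates its first argument:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical complex 4x4 matrix (an assumption for illustration)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A_dag = A.conj().T  # the adjoint = conjugate (Hermitian) transpose

X = rng.standard_normal(4) + 1j * rng.standard_normal(4)
Y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

def inner(U, V):
    # canonical scalar product <U, V> = conj(U)^T V
    return np.vdot(U, V)

lhs = inner(A_dag @ X, Y)   # <A^dag X, Y>
rhs = inner(X, A @ Y)       # <X, A Y>
print(np.isclose(lhs, rhs))  # True: the defining property holds
```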

## Eigenvalues and eigenmodes

We look for solutions under the “eigenmode” ansatz: \displaystyle X(t) = \hat{X} e^{\lambda t}

The dynamical system thus reduces to an eigenvalue problem:

\displaystyle \lambda \hat{X} = A \hat{X}

The solutions \lambda are the eigenvalues and the corresponding \hat{X} are the eigenvectors. The full set of eigenvalues is called the spectrum of the matrix.
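In practice the spectrum is computed numerically, e.g. with `numpy.linalg.eig`. A minimal sketch, using a hypothetical 2x2 matrix chosen so that its eigenvalues are easy to verify by hand:

```python
import numpy as np

# Hypothetical example matrix; characteristic polynomial
# lambda^2 + 3 lambda + 2 = 0, so the spectrum is {-1, -2}
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
lam, V = np.linalg.eig(A)   # eigenvalues lam[n], eigenvectors = columns of V

# Each pair satisfies the eigenvalue problem  lambda X = A X
for n in range(len(lam)):
    assert np.allclose(A @ V[:, n], lam[n] * V[:, n])

assert np.allclose(np.sort(lam.real), [-2.0, -1.0])
```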

We suppose in this section and in the next one that the eigenvalues are all distinct. The exceptional case where two (or more) eigenvalues are identical requires specific treatment, but can be set aside for the moment.

There are N eigenvalue/eigenvector pairs, which can be numbered (\lambda_n; \hat X_n) for n = 1..N.

The set of eigenvectors {\{ \hat X_n \}}_{n=1..N} forms a basis of \mathbb{R}^N (or \mathbb{C}^N). They can be normalised by ||\hat{X}_n||=1, leading to a normalised basis (not necessarily orthogonal).
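Since the eigenvectors form a basis, any vector can be expanded on it. Because the basis is in general not orthogonal, the coefficients are obtained by solving a linear system rather than by projection. A small sketch with a hypothetical random matrix (note that `numpy.linalg.eig` already returns eigenvectors normalised to unit norm):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))   # hypothetical real 3x3 matrix
lam, V = np.linalg.eig(A)         # columns of V: the eigenvectors X_n
assert np.allclose(np.linalg.norm(V, axis=0), 1.0)  # already normalised

# Expand an arbitrary vector X0 on the eigenvector basis: X0 = sum_n c_n X_n.
# The basis is not orthogonal, so c_n is NOT simply <X_n, X0>; instead the
# coefficients solve the linear system V c = X0.
X0 = rng.standard_normal(3)
c = np.linalg.solve(V, X0)
assert np.allclose(V @ c, X0)
```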

• If the matrix A is symmetric (or Hermitian in the complex case), the eigenvalues are all real and the eigenmodes form an orthonormal basis. Namely, they satisfy the orthogonality condition: \displaystyle <\hat X_n, \hat X_m> = \delta_{mn}

• If A is real and nonsymmetric, the eigenvalues may be either real (noted \lambda_n) or complex conjugate pairs (noted \lambda_n = \sigma_n \pm i \omega_n). The eigenmodes associated with complex eigenvalues can be written in the form \hat{X}_n = U_n \pm i V_n, where U_n and V_n are real vectors which can be chosen orthogonal (but not necessarily of the same norm).
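The two cases above can be illustrated numerically. The sketch below uses two hypothetical 2x2 matrices: a symmetric one (real eigenvalues, orthonormal eigenvectors) and a rotation-like one whose eigenvalues are the conjugate pair \pm i:

```python
import numpy as np

# Symmetric case: real eigenvalues, orthonormal eigenvectors
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam_s, V_s = np.linalg.eigh(S)
assert np.all(lam_s.imag == 0)              # eigenvalues are real
assert np.allclose(V_s.T @ V_s, np.eye(2))  # <X_n, X_m> = delta_mn

# Real nonsymmetric case: rotation-like matrix with eigenvalues +/- i
R = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
lam, V = np.linalg.eig(R)
assert np.allclose(np.sort_complex(lam), [-1j, 1j])
# The eigenvectors come as U_n +/- i V_n with real U_n, V_n
```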

# Going further

• Charru, chapter 11
• Charru, Iooss & Léger
• Glendinning
• Bender & Orszag (for differential eigenvalue problems)