Revision as of 11:45, 24 November 2013
In mathematics, particularly linear algebra, the Schur product theorem, named after Issai Schur (Schur 1911, p. 14, Theorem VII; note that Schur signed as J. Schur in the Journal für die reine und angewandte Mathematik), states that the Hadamard product of two positive definite matrices is also a positive definite matrix.
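As a quick numerical illustration of the statement (not part of any of the proofs below; the matrices here are randomly generated examples), one can build two positive definite matrices and check that their Hadamard product has strictly positive eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Build two random symmetric positive definite matrices as A A^T + I
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
M = A @ A.T + np.eye(n)  # positive definite
N = B @ B.T + np.eye(n)  # positive definite

# Hadamard (entrywise) product
H = M * N

# By the Schur product theorem, all eigenvalues of H should be strictly positive
min_eig = np.linalg.eigvalsh(H).min()
```

Here `*` is NumPy's entrywise product, which is exactly the Hadamard product for two arrays of the same shape.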
Proof
Proof using eigendecomposition
Let <math>M = \sum \mu_i m_i m_i^T</math> and <math>N = \sum \nu_j n_j n_j^T</math> be the eigendecompositions of <math>M</math> and <math>N</math>, with all <math>\mu_i > 0</math> and <math>\nu_j > 0</math>. Then

: <math>M \circ N = \sum_{ij} \mu_i \nu_j (m_i m_i^T) \circ (n_j n_j^T) = \sum_{ij} \mu_i \nu_j (m_i \circ n_j) (m_i \circ n_j)^T</math>

Each <math>(m_i \circ n_j) (m_i \circ n_j)^T</math> is positive semidefinite (it is an outer product of a vector with itself) and <math>\mu_i \nu_j > 0</math>, thus the sum giving <math>M \circ N</math> is positive semidefinite. That <math>M \circ N</math> is in fact positive definite when <math>M</math> and <math>N</math> are follows, for example, from the trace-formula proof below.
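The rank-one expansion used above can be checked numerically; this sketch (with randomly generated example matrices) rebuilds <math>M \circ N</math> from the terms <math>\mu_i \nu_j (m_i \circ n_j)(m_i \circ n_j)^T</math>:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
M = A @ A.T + np.eye(n)
N = B @ B.T + np.eye(n)

# Eigendecompositions: M = sum_i mu_i m_i m_i^T, N = sum_j nu_j n_j n_j^T
mu, mvecs = np.linalg.eigh(M)
nu, nvecs = np.linalg.eigh(N)

# Rebuild M∘N from the rank-one terms (m_i ∘ n_j)(m_i ∘ n_j)^T
H = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        v = mvecs[:, i] * nvecs[:, j]  # entrywise product m_i ∘ n_j
        H += mu[i] * nu[j] * np.outer(v, v)

expansion_matches = np.allclose(H, M * N)
```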
Proof using the trace formula
It is easy to show that for matrices <math>M</math> and <math>N</math>, the Hadamard product <math>M \circ N</math> considered as a bilinear form acts on vectors <math>a, b</math> as

: <math>a^T (M \circ N) b = \operatorname{tr}\left(M^T \operatorname{diag}(a) \, N \operatorname{diag}(b)\right)</math>

where <math>\operatorname{tr}</math> is the matrix trace and <math>\operatorname{diag}(a)</math> is the diagonal matrix having as diagonal entries the elements of <math>a</math>.

Since <math>M</math> and <math>N</math> are positive definite, we can consider their square-roots <math>M^{1/2}</math> and <math>N^{1/2}</math> and write

: <math>\operatorname{tr}\left(M^T \operatorname{diag}(a) \, N \operatorname{diag}(b)\right) = \operatorname{tr}\left(M^{1/2} \operatorname{diag}(a) \, N^{1/2} N^{1/2} \operatorname{diag}(b) \, M^{1/2}\right)</math>

Then, for <math>a = b</math>, this is written as <math>\operatorname{tr}(A^T A)</math> for <math>A = N^{1/2} \operatorname{diag}(a) M^{1/2}</math> and thus is strictly positive for <math>a \neq 0</math>. This shows that <math>M \circ N</math> is a positive definite matrix.
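Both identities in this proof can be verified numerically; the sketch below (with randomly generated example matrices, and a small helper `sym_sqrt` for the symmetric square root) checks the bilinear-form identity and the <math>\operatorname{tr}(A^T A)</math> rewriting:

```python
import numpy as np

def sym_sqrt(P):
    """Symmetric square root of a symmetric positive definite matrix."""
    w, V = np.linalg.eigh(P)
    return V @ np.diag(np.sqrt(w)) @ V.T

rng = np.random.default_rng(2)
n = 4

C = rng.standard_normal((n, n))
D = rng.standard_normal((n, n))
M = C @ C.T + np.eye(n)
N = D @ D.T + np.eye(n)
a = rng.standard_normal(n)
b = rng.standard_normal(n)

# Bilinear-form identity: a^T (M∘N) b = tr(M^T diag(a) N diag(b))
lhs = a @ (M * N) @ b
rhs = np.trace(M.T @ np.diag(a) @ N @ np.diag(b))
bilinear_ok = np.allclose(lhs, rhs)

# For a = b: a^T (M∘N) a = tr(A^T A) > 0 with A = N^{1/2} diag(a) M^{1/2}
A = sym_sqrt(N) @ np.diag(a) @ sym_sqrt(M)
quad = a @ (M * N) @ a
frobenius_sq = np.trace(A.T @ A)  # squared Frobenius norm of A
quad_ok = np.allclose(quad, frobenius_sq) and quad > 0
```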
Proof using Gaussian integration
Case of M=N
Let <math>X</math> be an <math>n</math>-dimensional centered Gaussian random variable with covariance <math>\langle X_i X_j \rangle = M_{ij}</math>. Then the covariance matrix of <math>X_i^2</math> and <math>X_j^2</math> is

: <math>\operatorname{Cov}(X_i^2, X_j^2) = \langle X_i^2 X_j^2 \rangle - \langle X_i^2 \rangle \langle X_j^2 \rangle</math>

Using Wick's theorem to develop <math>\langle X_i^2 X_j^2 \rangle = 2 \langle X_i X_j \rangle^2 + \langle X_i^2 \rangle \langle X_j^2 \rangle</math> we have

: <math>\operatorname{Cov}(X_i^2, X_j^2) = 2 \langle X_i X_j \rangle^2 = 2 M_{ij}^2</math>

Since a covariance matrix is positive definite, this proves that the matrix with elements <math>M_{ij}^2</math> is a positive definite matrix.
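The Wick identity <math>\operatorname{Cov}(X_i^2, X_j^2) = 2 M_{ij}^2</math> can be checked by Monte Carlo; this sketch uses an illustrative 2×2 covariance <math>M</math> and a generous tolerance for sampling error:

```python
import numpy as np

rng = np.random.default_rng(3)

# An illustrative positive definite covariance for X
M = np.array([[1.0, 0.5],
              [0.5, 2.0]])

# Sample centered Gaussians X with covariance M
X = rng.multivariate_normal(np.zeros(2), M, size=400_000)

# Empirical covariance of (X_1^2, X_2^2); theory: Cov(X_i^2, X_j^2) = 2 M_ij^2
S = np.cov(X[:, 0] ** 2, X[:, 1] ** 2)
theory = 2.0 * M ** 2
max_abs_err = np.abs(S - theory).max()
```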
General case
Let <math>X</math> and <math>Y</math> be <math>n</math>-dimensional centered Gaussian random variables with covariances <math>\langle X_i X_j \rangle = M_{ij}</math>, <math>\langle Y_i Y_j \rangle = N_{ij}</math>, and independent from each other so that we have

: <math>\langle X_i Y_j \rangle = 0</math> for any <math>i, j</math>

Then the covariance matrix of <math>X_i Y_i</math> and <math>X_j Y_j</math> is

: <math>\operatorname{Cov}(X_i Y_i, X_j Y_j) = \langle X_i Y_i X_j Y_j \rangle - \langle X_i Y_i \rangle \langle X_j Y_j \rangle</math>

Using Wick's theorem to develop

: <math>\langle X_i Y_i X_j Y_j \rangle = \langle X_i X_j \rangle \langle Y_i Y_j \rangle + \langle X_i Y_i \rangle \langle X_j Y_j \rangle + \langle X_i Y_j \rangle \langle X_j Y_i \rangle</math>

and also using the independence of <math>X</math> and <math>Y</math>, we have

: <math>\operatorname{Cov}(X_i Y_i, X_j Y_j) = \langle X_i X_j \rangle \langle Y_i Y_j \rangle = M_{ij} N_{ij}</math>

Since a covariance matrix is positive definite, this proves that the matrix with elements <math>M_{ij} N_{ij}</math> is a positive definite matrix.
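The general-case identity <math>\operatorname{Cov}(X_i Y_i, X_j Y_j) = M_{ij} N_{ij}</math> can likewise be checked by Monte Carlo; the 2×2 covariances below are illustrative choices, and the tolerance is generous to absorb sampling error:

```python
import numpy as np

rng = np.random.default_rng(4)

M = np.array([[1.0, 0.5],
              [0.5, 2.0]])   # covariance of X (positive definite)
N = np.array([[1.0, -0.3],
              [-0.3, 1.5]])  # covariance of Y (positive definite)

# Independent centered Gaussian samples for X and Y
X = rng.multivariate_normal(np.zeros(2), M, size=400_000)
Y = rng.multivariate_normal(np.zeros(2), N, size=400_000)

Z = X * Y            # Z_i = X_i Y_i
S = np.cov(Z.T)      # empirical covariance of Z

# Theory: Cov(Z_i, Z_j) = M_ij N_ij, i.e. the Hadamard product M∘N
theory = M * N
max_abs_err = np.abs(S - theory).max()
min_eig = np.linalg.eigvalsh(S).min()
```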
References

- Schur, J. (1911). "Bemerkungen zur Theorie der beschränkten Bilinearformen mit unendlich vielen Veränderlichen". Journal für die reine und angewandte Mathematik. 140: 1–28. doi:10.1515/crll.1911.140.1.
- Studies in Memory of Issai Schur (2003). Progress in Mathematics. Birkhäuser. doi:10.1007/b105056. Page 9, Ch. 0.6: publication under J. Schur.
- Ledermann, W. (1983). "Issai Schur and his school in Berlin". Bulletin of the London Mathematical Society. 15 (2): 97–106. doi:10.1112/blms/15.2.97.