Recently, a friend asked me about some equations in Pattern Recognition and Machine Learning, so I'll share a couple of them here :) In this article, we work through the following two equations from page 80. (Let $\Sigma$ be the variance-covariance matrix, $\vec{u}_i$ the eigenvectors, and $\lambda_i$ the eigenvalues.)
$$\Sigma = \sum_{i=1}^{D} \lambda_i \vec{u}_i \vec{u}_i^T \qquad (2.48)$$

$$\Sigma^{-1} = \sum_{i=1}^{D} \frac{1}{\lambda_i} \vec{u}_i \vec{u}_i^T \qquad (2.49)$$
1. Prerequisite Knowledge
To understand the two equations above, we need two pieces of prerequisite knowledge.
- Inverse matrix of diagonal matrix
Suppose we have an $n \times n$ diagonal matrix as follows:

$$D = \begin{pmatrix} \lambda_1 & & & 0 \\ & \lambda_2 & & \\ & & \ddots & \\ 0 & & & \lambda_n \end{pmatrix}$$

This matrix is *invertible* if all of the elements on the main diagonal are non-zero.
Then the *inverse matrix* of D has the *reciprocals* of those elements on its main diagonal, as below:

$$D^{-1} = \begin{pmatrix} \frac{1}{\lambda_1} & & & 0 \\ & \frac{1}{\lambda_2} & & \\ & & \ddots & \\ 0 & & & \frac{1}{\lambda_n} \end{pmatrix}$$
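As a quick numerical sanity check, here is a short NumPy sketch (the diagonal values are made up for illustration):

```python
import numpy as np

# Hypothetical diagonal matrix with non-zero diagonal entries
D = np.diag([2.0, 4.0, 5.0])

# The inverse of a diagonal matrix just takes reciprocals of the diagonal
D_inv = np.diag(1.0 / np.diag(D))

# Verify against NumPy's general-purpose inverse
assert np.allclose(D_inv, np.linalg.inv(D))
assert np.allclose(D @ D_inv, np.eye(3))
```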
- Inverse matrix of orthogonal matrix
If the matrix we want to invert is an orthogonal matrix, we can skip the usual inverse computation entirely. Say a matrix $U$ is orthogonal. Then the inverse of $U$ is simply its transpose: $U^{-1} = U^T$.
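We can also check this property numerically. The sketch below builds an orthogonal matrix from a QR decomposition of a random matrix (any orthogonal matrix would do):

```python
import numpy as np

# Build an orthogonal matrix Q via QR decomposition of a random matrix
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# For an orthogonal matrix, the transpose is the inverse: Q^T Q = I
assert np.allclose(Q.T @ Q, np.eye(4))
assert np.allclose(Q.T, np.linalg.inv(Q))
```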
2. Derivation of the equations
From the definition of an eigenvector,

$$\Sigma \vec{u}_i = \lambda_i \vec{u}_i \qquad (1)$$

Therefore,
$$\Sigma (\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_D) = (\lambda_1\vec{u}_1, \lambda_2\vec{u}_2, \ldots, \lambda_D\vec{u}_D)$$

Since $(\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_D)$ is an orthogonal matrix, we can apply the "inverse matrix of an orthogonal matrix" fact discussed above.
Multiplying both sides from the right by $(\vec{u}_1, \ldots, \vec{u}_D)^{-1} = (\vec{u}_1, \ldots, \vec{u}_D)^T$,

$$\Sigma = (\lambda_1\vec{u}_1, \lambda_2\vec{u}_2, \ldots, \lambda_D\vec{u}_D)(\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_D)^T = (\lambda_1\vec{u}_1, \lambda_2\vec{u}_2, \ldots, \lambda_D\vec{u}_D)\begin{pmatrix} \vec{u}_1^T \\ \vec{u}_2^T \\ \vdots \\ \vec{u}_D^T \end{pmatrix}$$

As a result we get equation (2.48):

$$\Sigma = \sum_{i=1}^{D} \lambda_i \vec{u}_i \vec{u}_i^T$$
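Equation (2.48) is easy to verify numerically. The sketch below uses a small made-up symmetric positive-definite matrix as $\Sigma$ and `numpy.linalg.eigh`, which returns orthonormal eigenvectors as the columns of $U$:

```python
import numpy as np

# Hypothetical symmetric positive-definite covariance matrix for illustration
Sigma = np.array([[4.0, 1.0, 0.5],
                  [1.0, 3.0, 0.2],
                  [0.5, 0.2, 2.0]])

# eigh returns eigenvalues and orthonormal eigenvectors (columns of U)
lam, U = np.linalg.eigh(Sigma)

# Equation (2.48): Sigma = sum_i lambda_i * u_i * u_i^T
reconstructed = sum(lam[i] * np.outer(U[:, i], U[:, i]) for i in range(3))
assert np.allclose(Sigma, reconstructed)
```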
Equation (1) can also be expressed as

$$\Sigma (\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_D) = (\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_D)\begin{pmatrix} \lambda_1 & & & 0 \\ & \lambda_2 & & \\ & & \ddots & \\ 0 & & & \lambda_D \end{pmatrix}$$
Multiplying both sides from the left by the inverse of the covariance matrix,

$$\Sigma^{-1} (\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_D)\begin{pmatrix} \lambda_1 & & & 0 \\ & \lambda_2 & & \\ & & \ddots & \\ 0 & & & \lambda_D \end{pmatrix} = (\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_D)$$

Now we can apply the "inverse matrix of a diagonal matrix" fact discussed above: multiply both sides from the right by the diagonal matrix's inverse, and then by $(\vec{u}_1, \ldots, \vec{u}_D)^{-1} = (\vec{u}_1, \ldots, \vec{u}_D)^T$,
$$\Sigma^{-1} = (\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_D)\begin{pmatrix} \frac{1}{\lambda_1} & & & 0 \\ & \frac{1}{\lambda_2} & & \\ & & \ddots & \\ 0 & & & \frac{1}{\lambda_D} \end{pmatrix}\begin{pmatrix} \vec{u}_1^T \\ \vec{u}_2^T \\ \vdots \\ \vec{u}_D^T \end{pmatrix}$$
Finally, we get (2.49) :)

$$\Sigma^{-1} = \sum_{i=1}^{D} \frac{1}{\lambda_i} \vec{u}_i \vec{u}_i^T$$
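Equation (2.49) can be checked the same way; the sketch below reuses the same made-up covariance matrix and compares the eigen-expansion against NumPy's direct inverse:

```python
import numpy as np

# Same hypothetical covariance matrix as before
Sigma = np.array([[4.0, 1.0, 0.5],
                  [1.0, 3.0, 0.2],
                  [0.5, 0.2, 2.0]])

lam, U = np.linalg.eigh(Sigma)

# Equation (2.49): Sigma^{-1} = sum_i (1/lambda_i) * u_i * u_i^T
Sigma_inv = sum((1.0 / lam[i]) * np.outer(U[:, i], U[:, i]) for i in range(3))
assert np.allclose(Sigma_inv, np.linalg.inv(Sigma))
```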