About Eigenvalues and Eigenvectors


If we calculate the eigenvalues and eigenvectors of data, the eigenvectors represent the basis directions along which the data is spread, and each eigenvalue tells us how much information the data carries along its corresponding basis direction (eigenvector).


Basics of eigenvalues and eigenvectors:

  • Suppose A is a matrix of size N x N.
  • V is a vector of size N.
  • V is called an eigenvector of A if multiplication by A only scales the vector V:

A V  =  λ V         ……. (1)

In equation (1), λ is called the eigenvalue of the matrix.

The function eig is used in MATLAB to get the eigenvectors and eigenvalues:

                                            [V, D]  =  eig(A)

where D is a diagonal matrix containing the eigenvalues and the columns of V are the corresponding eigenvectors, so that

                                             A V = V D           …..(2)

From equation (1), (A − λ I) V = 0. A nonzero eigenvector V exists only when

det(A − λ I) = 0

which is called the characteristic polynomial; the eigenvalues λ are its roots. Here I is the identity matrix and A is the given input matrix.

  • The determinant of matrix A (|A|) equals the product of all its eigenvalues.
  • The set of eigenvalues is called the spectrum of the matrix.
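For readers without MATLAB, the two properties above can be checked with a small NumPy sketch (the 2 x 2 matrix here is my own illustration, not from the post):

```python
import numpy as np

# a small example matrix (hypothetical, chosen for illustration)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# eigvals holds the eigenvalues; the columns of eigvecs are the eigenvectors
eigvals, eigvecs = np.linalg.eig(A)

# check A V = lambda V for every eigenpair
for i in range(len(eigvals)):
    v = eigvecs[:, i]
    assert np.allclose(A @ v, eigvals[i] * v)

# the determinant equals the product of the eigenvalues
assert np.isclose(np.linalg.det(A), np.prod(eigvals))
```

`np.linalg.eig` plays the role of MATLAB's `eig`, except that it returns the eigenvalues as a vector rather than a diagonal matrix.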


Symmetric matrix:

If the transpose of a matrix does not change the matrix, it is called a symmetric matrix:

S = Sᵀ

where Sᵀ is the transpose of S.

One example of a symmetric matrix is the covariance matrix.

A key property of a symmetric matrix is that all of its eigenvalues are real.
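Both facts can be verified in a short NumPy sketch (random data standing in for a real dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 4))   # 100 samples of 4 variables (synthetic data)

S = np.cov(X, rowvar=False)         # 4 x 4 covariance matrix

# a covariance matrix is symmetric: S equals its transpose
assert np.allclose(S, S.T)

# eigenvalues of a symmetric matrix are real
# (eigvalsh is NumPy's routine for symmetric/Hermitian matrices)
eigvals = np.linalg.eigvalsh(S)
assert np.all(np.isreal(eigvals))
```

Note that `np.cov` treats rows as variables by default, so `rowvar=False` is needed when samples are stored in rows; MATLAB's `cov` treats columns as variables, as in the code further below.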

Eigen Value Decomposition:

If V1 and V2 are eigenvectors of a symmetric matrix corresponding to eigenvalues λ1 and λ2, and λ1 is not equal to λ2, then V1 and V2 are orthogonal.

Let V = (V1, V2, V3, …) be the matrix whose columns are the eigenvectors; for a symmetric matrix this is an orthogonal matrix.

The columns are orthonormal when Vᵀ V = I, i.e. the dot product of any two distinct columns is zero.

Let Λ = diag(λ1, λ2, λ3, …) be the diagonal matrix of eigenvalues.

Then we can write

S  =  V Λ Vᵀ      …….  (3)

where Vᵀ is the transpose of V.

Equation (3) is called the eigenvalue decomposition.
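Equation (3) can be checked numerically with a small NumPy sketch (a random symmetrized matrix stands in for S):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
S = (B + B.T) / 2                  # symmetrize to obtain a symmetric matrix

# eigh handles symmetric matrices: real eigenvalues (ascending)
# and orthonormal eigenvectors in the columns of V
lam, V = np.linalg.eigh(S)

# columns of V are orthonormal: V^T V = I
assert np.allclose(V.T @ V, np.eye(5))

# eigenvalue decomposition: S = V Lambda V^T
assert np.allclose(S, V @ np.diag(lam) @ V.T)
```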

Let's write the code for eigenvalue decomposition in MATLAB.

For the eigenvalue decomposition to make sense, the input matrix should be symmetric so that the eigenvalues are real. We will therefore compute the covariance matrix of the image, which is symmetric.


%%%%%%%%%%%%%%%%%%%%%%%% Code Start %%%%%%%%%%%%%%%%

%% Read image

% set path to read image from system
ImagePath = 'D:\DSUsers\uidp6927\image_processingCode\lena.jpg';

img_RGB = imread(ImagePath);   % read image from path

img_RGB = im2double(img_RGB);  % convert image to double precision

img_gray = rgb2gray(img_RGB);  % convert image to grayscale

%% Calculate the covariance matrix, which is symmetric

Cov_I = cov(img_gray);

% show Cov_I matrix

figure, imshow(Cov_I,[])

%% Decompose the symmetric matrix and reconstruct it

%% Eigenvalue decomposition

[E,D] = eig(Cov_I);   % columns of E are eigenvectors, D is the diagonal matrix of
                      % eigenvalues; for a symmetric input they come out in ascending
                      % order, so the largest eigenvalues sit at the end of D

%%%%%%% Reconstruction %%%%%%%%%%%%%%%%

% reconstruction loss is the information lost by removing eigenvectors and eigenvalues

% reconstruction using only the top 5 eigenvalues and their corresponding eigenvectors

ReCov_I5 = E(:,295:end)*D(295:end,295:end)*E(:,295:end)';

figure, imshow(ReCov_I5,[])

title('reconstructed symmetric matrix with 5 eigenvectors')

reconstruction_loss5 = 100 - (sum(sum(D(295:end,295:end)))/sum(D(:)))*100;

% reconstruction using only the top 1 eigenvalue and its corresponding eigenvector

ReCov_I1 = E(:,300:end)*D(300:end,300:end)*E(:,300:end)';

figure, imshow(ReCov_I1,[])

title('reconstructed symmetric matrix with 1 eigenvector')

reconstruction_loss1 = 100 - (sum(sum(D(300:end,300:end)))/sum(D(:)))*100

%%%%%%%%%%%%%%%%%%%%%%%%% Code end %%%%%%%%%%%%%%%%%%%
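The top-k reconstruction in the MATLAB code above can be sketched in NumPy as well (random data stands in for the Lena covariance matrix; like MATLAB's eig on a symmetric matrix, np.linalg.eigh returns eigenvalues in ascending order, so the largest are at the end):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((200, 50))  # synthetic data in place of the image
S = np.cov(X, rowvar=False)         # 50 x 50 symmetric covariance matrix

lam, V = np.linalg.eigh(S)          # eigenvalues in ascending order

k = 5                               # keep the top-5 eigenpairs (largest eigenvalues)
Vk, lam_k = V[:, -k:], lam[-k:]
S_k = Vk @ np.diag(lam_k) @ Vk.T    # rank-k reconstruction of S

# reconstruction loss, as in the MATLAB code:
# the share of total eigenvalue mass that was discarded
loss = 100.0 - 100.0 * lam_k.sum() / lam.sum()
assert 0.0 <= loss < 100.0
```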


Input image:

Input Lena image


Symmetric covariance image generated from the Lena image:

Covariance / symmetry image


Covariance image reconstructed from the top 5 eigenvalues and their corresponding eigenvectors:

Reconstruction with top 5 eigenvectors

Loss of information because we used only the top 5 eigenvectors = 27.84 percent.

The top 5 eigenvectors contain 72.16 percent of the information in the input image.


Covariance image reconstructed from the top 1 eigenvector:

Reconstruction with top 1 eigenvector

Loss of information = 72.40 percent.

The top eigenvector contains 27.60 percent of the information in the input image.


Eigenvalue decomposition is the basis of Singular Value Decomposition and Principal Component Analysis. With the help of PCA we will be able to reconstruct the original Lena image, which we will discuss in a further post.




Happy Learning







