As we have seen, any $n$-dimensional vector space over a field $F$ with an ordered basis is isomorphic to $F^n$. Now suppose we have two vector spaces $V$ and $W$ over the same field $F$, with $\dim V = n$ and $\dim W = m$, and a linear transformation $T : V \to W$ between them. If we fix an ordered basis $B$ for $V$ and $C$ for $W$, then we have the following situation: the coordinate maps $[\,\cdot\,]_B : V \to F^n$ and $[\,\cdot\,]_C : W \to F^m$ are isomorphisms, and the function $\widetilde{T} : F^n \to F^m$ is given by the composition of these functions as follows:
$$\widetilde{T} = [\,\cdot\,]_C \circ T \circ \big([\,\cdot\,]_B\big)^{-1}.$$
So, for any linear transformation $T : V \to W$, we can find a unique linear transformation $\widetilde{T} : F^n \to F^m$.
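As a concrete sketch of the coordinate map (a hypothetical example, assuming $V = \mathbb{R}^2$ with the non-standard basis $B = \{(1,1),\,(1,-1)\}$, not taken from the text above), $[\,v\,]_B$ can be computed by solving a linear system whose coefficient matrix has the basis vectors as columns:

```python
import numpy as np

# Hypothetical example: V = R^2 with ordered basis B = {(1, 1), (1, -1)}.
# The coordinate map [.]_B sends v to the unique x with v = x1*b1 + x2*b2,
# i.e. it solves the linear system P x = v, where the columns of P are b1, b2.
P = np.array([[1.0,  1.0],
              [1.0, -1.0]])   # columns are the basis vectors b1, b2

def coords_B(v):
    """Return [v]_B, the coordinate vector of v with respect to B."""
    return np.linalg.solve(P, v)

v = np.array([3.0, 1.0])
x = coords_B(v)                 # v = 2*(1,1) + 1*(1,-1), so [v]_B = (2, 1)
assert np.allclose(P @ x, v)    # reconstructing v from its coordinates
```

Because $B$ is a basis, $P$ is invertible, so the system always has a unique solution and $[\,\cdot\,]_B$ is well defined.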
One important property of vector spaces like $F^n$ is that any linear transformation between such spaces is given by a matrix, as in the following theorem.
Theorem: For any field $F$, any linear transformation $T : F^n \to F^m$ is given by a matrix $A$ of size $m \times n$, such that $T(x) = Ax$ for all $x \in F^n$.
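A quick numerical sketch of this theorem (with a hypothetical map $T : \mathbb{R}^2 \to \mathbb{R}^3$ chosen for illustration): the matrix is recovered by letting the $j$-th column be $T(e_j)$, the image of the $j$-th standard basis vector.

```python
import numpy as np

# Hypothetical linear map T : R^2 -> R^3, T(x, y) = (x + y, x - y, 2x).
def T(v):
    x, y = v
    return np.array([x + y, x - y, 2 * x])

# The j-th column of A is T(e_j); A has size m x n = 3 x 2.
e1, e2 = np.eye(2)
A = np.column_stack([T(e1), T(e2)])

v = np.array([5.0, -2.0])
assert np.allclose(T(v), A @ v)   # T(x) = Ax holds for every x
```

The assertion holds for any choice of `v` because both sides are linear and agree on the standard basis.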
Definition: Let $V$ and $W$ be two vector spaces over a field $F$ with ordered bases $B$ and $C$ respectively. Suppose $T : V \to W$ is a linear transformation. Then the matrix associated with the linear transformation $\widetilde{T} : F^n \to F^m$ is called the matrix of $T$ with respect to $B$ and $C$, and it is denoted by $[T]_B^C$.
The following theorem provides a method to calculate the matrix of a linear transformation with given bases.
Theorem: Let $T : V \to W$ be a linear transformation from a vector space $V$ to $W$ over a field $F$. Let $B = \{v_1, v_2, \dots, v_n\}$ and $C = \{w_1, w_2, \dots, w_m\}$ be ordered bases of $V$ and $W$ respectively. Then the matrix $A = [T]_B^C$ of the linear transformation $T$ with respect to $B$ and $C$ is given by
$$[T]_B^C = \big[\,[T(v_1)]_C \;\; [T(v_2)]_C \;\; \cdots \;\; [T(v_n)]_C\,\big],$$
that is, the $j$-th column of $[T]_B^C$ is the coordinate vector of $T(v_j)$ with respect to $C$.
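As an illustration of this column-by-column recipe (using the standard differentiation example, which is an assumption here, not taken from the text above): take $T = d/dx : P_2 \to P_1$ on polynomials, with bases $B = \{1, x, x^2\}$ and $C = \{1, x\}$.

```python
import numpy as np

# Represent p(x) = c0 + c1*x + c2*x^2 by its coefficient vector (c0, c1, c2);
# these coefficients are exactly the coordinates with respect to B = {1, x, x^2}.
def deriv(p):
    """T = d/dx : P_2 -> P_1, returned in coordinates w.r.t. C = {1, x}."""
    c0, c1, c2 = p
    return np.array([c1, 2 * c2])      # (c0 + c1 x + c2 x^2)' = c1 + 2 c2 x

# The j-th column of the matrix is [T(v_j)]_C for the basis vectors v_j of B.
basis_B = np.eye(3)                    # coordinate vectors of 1, x, x^2
A = np.column_stack([deriv(v) for v in basis_B])
# A = [[0, 1, 0],
#      [0, 0, 2]]

p = np.array([7.0, 3.0, 5.0])          # p(x) = 7 + 3x + 5x^2
assert np.allclose(A @ p, deriv(p))    # [T(p)]_C = A [p]_B
```

Note how the columns read off the theorem directly: $T(1) = 0$, $T(x) = 1$, and $T(x^2) = 2x$, expressed in the basis $C$.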
Proof: For any $v \in V$, we have $v = x_1 v_1 + x_2 v_2 + \cdots + x_n v_n$ for some scalars $x_1, \dots, x_n \in F$, so that $[v]_B = (x_1, \dots, x_n)$.
Now applying $T$ on both sides, we get $T(v) = x_1 T(v_1) + x_2 T(v_2) + \cdots + x_n T(v_n)$.
Since $T(v_j) \in W$, we can find scalars $a_{ij} \in F$ such that $T(v_j) = a_{1j} w_1 + a_{2j} w_2 + \cdots + a_{mj} w_m$, that is, $[T(v_j)]_C = (a_{1j}, a_{2j}, \dots, a_{mj})$.
Now assume $A = (a_{ij})$ is the $m \times n$ matrix whose $j$-th column is $[T(v_j)]_C$. Substituting, we get
$$T(v) = \sum_{j=1}^{n} x_j \sum_{i=1}^{m} a_{ij} w_i = \sum_{i=1}^{m} \Big(\sum_{j=1}^{n} a_{ij} x_j\Big) w_i,$$
so $[T(v)]_C = A[v]_B$, and hence $A$ is the matrix of $T$ with respect to $B$ and $C$. $\blacksquare$
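The identity $[T(v)]_C = A\,[v]_B$ from the proof can also be checked numerically. Here is a sketch with hypothetical bases of $\mathbb{R}^2$ (the map $T$ and both bases are illustrative choices, not from the text above):

```python
import numpy as np

# Hypothetical setup: T : R^2 -> R^2, T(x, y) = (2x + y, x),
# with ordered bases B = {(1, 1), (1, -1)} and C = {(1, 0), (1, 1)}.
T_std = np.array([[2.0, 1.0],
                  [1.0, 0.0]])               # T in the standard bases
PB = np.array([[1.0, 1.0], [1.0, -1.0]])     # columns: basis B
PC = np.array([[1.0, 1.0], [0.0, 1.0]])      # columns: basis C

# j-th column of A is [T(b_j)]_C, found by solving PC * col = T(b_j).
A = np.column_stack([np.linalg.solve(PC, T_std @ b) for b in PB.T])

v = np.array([4.0, -1.0])
lhs = np.linalg.solve(PC, T_std @ v)   # [T(v)]_C
rhs = A @ np.linalg.solve(PB, v)       # A [v]_B
assert np.allclose(lhs, rhs)           # [T(v)]_C = A [v]_B
```

Each `np.linalg.solve` call is one application of a coordinate map, mirroring how the proof expresses $T(v)$ in the basis $C$.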
After watching this lecture, you can try Question 1 from Assignment 1.