3.1 Introduction
As we mentioned in the first seminar, the main idea behind the notion of a matrix is to keep track of several pieces of data simultaneously. Now we ask the following question: how can we recognize whether there is a relationship between the data we are measuring?
3.2 Vectors
A vector is either a row-matrix or a column-matrix. Hence, a row-vector is just another name for a row-matrix, and a column-vector is just another name for a column-matrix. Because vectors are just matrices with a single row or a single column, we already know how to add vectors and how to multiply a vector by a scalar.
Notice that a vector is determined by the number of its elements. For example, the vector $\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$ has three elements. We will be referring to the elements of a vector as its components. Hence, the vector above has three components. The set of all vectors with $n$ components is denoted by $\mathbb{R}^n$.
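Since vectors are just one-row or one-column matrices, addition and scalar multiplication work component-wise, exactly as for matrices. A minimal sketch (the example vectors are chosen for illustration):

```python
import numpy as np

# A column-vector in R^3 is a 3x1 matrix; these example entries are arbitrary.
u = np.array([[1], [2], [3]])
v = np.array([[4], [5], [6]])

# Vector addition and scalar multiplication work exactly as for matrices.
total = u + v    # component-wise sum
scaled = 2 * u   # each component doubled
print(total.tolist(), scaled.tolist())
```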
In the lectures, you have learnt that if the vectors are linearly dependent, then one of those vectors can be represented as a linear combination of the others.
Common mistake on the Midterm/Exam
In the sentence above, notice what is the defining property of linear dependence and what is a consequence of the definition. By definition, vectors are linearly dependent if there is a non-trivial linear combination of those vectors that equals the zero-vector $\mathbf{0}$. That is the defining property of linear dependence. A consequence of this definition is that if the vectors are linearly dependent, then one of them is a linear combination of the others.
Many students confuse these two statements when questioned about them on the Midterms or the Exam, so keep in mind which one is the definition and which one is its consequence.
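To make the distinction concrete, here is a small sketch with two hypothetical vectors where $v_2 = 2v_1$: the defining property is the non-trivial combination $2v_1 - v_2 = \mathbf{0}$, and the consequence is that $v_2$ can be written as a combination of $v_1$.

```python
import numpy as np

# Hypothetical example: v2 = 2 * v1, so the two vectors are linearly dependent.
v1 = np.array([1, 2, 3])
v2 = np.array([2, 4, 6])

# Defining property: the non-trivial combination 2*v1 + (-1)*v2
# equals the zero-vector.
combo = 2 * v1 - v2
print(combo)  # [0 0 0]

# Consequence: one vector is a linear combination of the others (v2 = 2*v1).
print(np.array_equal(v2, 2 * v1))  # True
```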
3.3 Rank of a matrix
As we have seen, checking whether given vectors are linearly dependent or independent directly from the definition can be quite messy - it boils down to solving a system of equations. So, we would like an easier method for checking the linear (in)dependence of vectors.
One of the reasons we introduced the notion of the rank of a matrix is to check the linear (in)dependence of vectors. Here’s how we do it:
1. Given the vectors, we form a matrix whose rows are the given vectors.
2. If the rank of that matrix equals the number of vectors we started with, the vectors are independent.
3. Otherwise, they are dependent.
3.4 Inverse of a matrix (remastered)
As we have seen in the previous chapter, we can use the determinant to calculate the inverse of a regular matrix:

$$A^{-1} = \frac{1}{\det A}\,\operatorname{adj}(A),$$

where $\operatorname{adj}(A)$ denotes the adjugate of $A$ (the transpose of the matrix of cofactors).
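As a concrete check of the determinant-based approach, here is a sketch for a $2 \times 2$ matrix, where the adjugate has the familiar closed form $\operatorname{adj}\begin{bmatrix} a & b \\ c & d \end{bmatrix} = \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$ (the example entries are chosen for illustration):

```python
import numpy as np

# Example regular 2x2 matrix: det = 4*6 - 7*5 = -11, which is non-zero.
A = np.array([[4.0, 7.0], [5.0, 6.0]])

det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
adj = np.array([[ A[1, 1], -A[0, 1]],
                [-A[1, 0],  A[0, 0]]])   # adjugate of a 2x2 matrix
A_inv = adj / det

# Sanity check: A times its inverse should be the identity matrix.
print(np.allclose(A @ A_inv, np.eye(2)))  # True
```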
We will now learn how to use the Gauss-Jordan algorithm to find the inverse of a regular matrix in a much easier way.
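The idea of the Gauss-Jordan method is to row-reduce the augmented matrix $[A \mid I]$ until the left half becomes the identity; the right half is then $A^{-1}$. A minimal sketch of this idea (not the course's exact algorithm statement; real code should prefer `numpy.linalg.inv`):

```python
import numpy as np

def inverse_gauss_jordan(A):
    """Invert a regular matrix by row-reducing [A | I] to [I | A^-1]."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])   # augmented matrix [A | I]
    for i in range(n):
        # Swap in the row with the largest pivot (avoids dividing by zero).
        pivot_row = i + np.argmax(np.abs(M[i:, i]))
        M[[i, pivot_row]] = M[[pivot_row, i]]
        M[i] /= M[i, i]                   # scale the pivot row so the pivot is 1
        for j in range(n):
            if j != i:
                M[j] -= M[j, i] * M[i]    # eliminate column i in every other row
    return M[:, n:]                       # the right half is now A^-1

# Example: the inverse of [[2, 1], [1, 1]] is [[1, -1], [-1, 2]].
A = np.array([[2.0, 1.0], [1.0, 1.0]])
print(inverse_gauss_jordan(A))
```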