Vectors and Matrices: Homogeneous Systems
More on matrices
Associated with every square matrix $A$ is a number, written $|A|$ or $\det A$, called the determinant of $A$. For these notes, it will be enough if you can calculate the determinant of $2 \times 2$ matrices, which is done as follows:
$$\det A = \begin{vmatrix} a & b \\ c & d \end{vmatrix} = ad - bc.$$
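For instance, applying the formula to one illustrative matrix:
$$\begin{vmatrix} 1 & 2 \\ 3 & 4 \end{vmatrix} = 1 \cdot 4 - 2 \cdot 3 = -2.$$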
The trace of a square matrix $A$ is the sum of the elements on the main diagonal; it is denoted $\operatorname{tr} A$:
$$\operatorname{tr} \begin{pmatrix} a & b \\ c & d \end{pmatrix} = a + d.$$
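For the same illustrative matrix as above,
$$\operatorname{tr} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = 1 + 4 = 5.$$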
Remark. Theoretically, the determinant should not be confused with the matrix itself. The determinant is a number, the matrix is the square array. But, everyone puts vertical lines on either side of the matrix to indicate its determinant, and then uses phrases like "the first row of the determinant," meaning the first row of the corresponding matrix.
An important formula which everyone uses and no one can prove is
$$|AB| = |A|\,|B|.$$
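As a quick check on an illustrative pair of matrices: with
$$A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}, \qquad B = \begin{pmatrix} 2 & 0 \\ 1 & 1 \end{pmatrix}, \qquad AB = \begin{pmatrix} 4 & 2 \\ 10 & 4 \end{pmatrix},$$
we get $|AB| = 16 - 20 = -4$, which agrees with $|A|\,|B| = (-2)(2) = -4$.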
Homogeneous systems
Matrices and determinants were originally invented to handle, in an efficient way, the solution of a system of simultaneous linear equations. This is still one of their most important uses. We give a brief account of what you need to know for now. We will restrict ourselves to square homogeneous systems of two equations in two variables (or "unknowns", as they are frequently called). Our notation will be:
$$A = (a_{ij}), \text{ a square matrix of constants;} \qquad \boldsymbol{x} = (x_1, x_2)^T, \text{ a column vector of unknowns;}$$
$$\left.\begin{aligned} a_{11}x_1 + a_{12}x_2 &= 0 \\ a_{21}x_1 + a_{22}x_2 &= 0 \end{aligned}\right\} \quad\text{or, in matrix form,}\quad A\boldsymbol{x} = \boldsymbol{0}. \tag{2}$$
This always has the solution $\boldsymbol{x} = \boldsymbol{0}$, which we call the trivial solution. The question is: when does it have a nontrivial solution?
Theorem. Let $A$ be a square matrix. The equation $A\boldsymbol{x} = \boldsymbol{0}$ has a nontrivial solution if and only if $\det A = 0$.
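For instance, consider the system
$$\begin{aligned} x_1 + 2x_2 &= 0 \\ 2x_1 + 4x_2 &= 0. \end{aligned}$$
The coefficient matrix has determinant $1 \cdot 4 - 2 \cdot 2 = 0$, and indeed $(x_1, x_2) = (2, -1)$ is a nontrivial solution: $2 + 2(-1) = 0$ and $2 \cdot 2 + 4(-1) = 0$.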
Linear independence of vectors
Conceptually, linear independence of vectors means each one provides something new to the mix. For two vectors this just means they are not zero and not multiples of each other.
Example 1. $(1, 2)$ and $(3, 4)$ are linearly independent.
Example 2. $(1, 2)$ and $(2, 4)$ are linearly dependent because $(2, 4)$ is a multiple of $(1, 2)$. Notice that if we take linear combinations, then $(2, 4)$ doesn't add anything to the set of vectors we can get from $(1, 2)$ alone.
Example 3. $(1, 2)$ and $(0, 0)$ are linearly dependent because $(0, 0)$ is a multiple of $(1, 2)$, i.e., $(0, 0) = 0 \cdot (1, 2)$.
Determinantal criterion for linear independence
Let $\boldsymbol{a}$ and $\boldsymbol{b}$ be 2-vectors, and $A$ the square matrix having these vectors for its rows (or columns). Then
$$\boldsymbol{a},\ \boldsymbol{b} \text{ are linearly independent} \iff \det A \neq 0.$$
Let us revisit our previous examples.
Examples.
1. $\begin{vmatrix} 1 & 2 \\ 3 & 4 \end{vmatrix} = 4 - 6 = -2 \neq 0$. Therefore, $(1, 2)$ and $(3, 4)$ are linearly independent.
2. $\begin{vmatrix} 1 & 2 \\ 2 & 4 \end{vmatrix} = 4 - 4 = 0$. Therefore, $(1, 2)$ and $(2, 4)$ are linearly dependent.
3. $\begin{vmatrix} 1 & 2 \\ 0 & 0 \end{vmatrix} = 0 - 0 = 0$. Therefore, $(1, 2)$ and $(0, 0)$ are linearly dependent.
Remark. The theorem on square homogeneous systems follows from this criterion. We will prove neither.
Two linearly independent 2-vectors $\boldsymbol{v}_1$ and $\boldsymbol{v}_2$ form a basis for the plane: every 2-vector $\boldsymbol{w}$ can be written as a linear combination of $\boldsymbol{v}_1$ and $\boldsymbol{v}_2$. That is, there are scalars $c_1$ and $c_2$ such that
$$\boldsymbol{w} = c_1 \boldsymbol{v}_1 + c_2 \boldsymbol{v}_2.$$
Remark. All of the notions and theorems mentioned in this section generalize to higher dimensions (and larger collections of vectors), though we will not need them.
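For instance, using the linearly independent vectors from Example 1: to write $\boldsymbol{w} = (1, 0)$ in terms of $\boldsymbol{v}_1 = (1, 2)$ and $\boldsymbol{v}_2 = (3, 4)$, solve
$$c_1 + 3c_2 = 1, \qquad 2c_1 + 4c_2 = 0,$$
which gives $c_1 = -2$, $c_2 = 1$; indeed $-2(1, 2) + 1 \cdot (3, 4) = (1, 0)$.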