### Linear Algebra

##### Description of Chapter 1

We introduce the concept of linear equations, systems of linear equations, and the related notation and terminology.

The first step in solving a problem is to try to simplify it. For a system of linear equations, simplification means eliminating variables in an orderly way. This can be done by three operations that do not change the solutions of the system: exchanging two equations, multiplying an equation by a nonzero number, and adding a multiple of one equation to another.

Note also that a system is completely encoded by its augmented matrix. Therefore the three operations have counterparts for matrices, called the row operations.
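As a concrete sketch of the three row operations acting on an augmented matrix (the function names `swap`, `scale`, and `add_multiple` are ours, not standard; exact fractions avoid rounding):

```python
from fractions import Fraction

# Augmented matrix of the system  x + 2y = 5,  3x + 4y = 6
# (the last column holds the right side).
A = [[Fraction(1), Fraction(2), Fraction(5)],
     [Fraction(3), Fraction(4), Fraction(6)]]

def swap(M, i, j):
    """Exchange rows i and j."""
    M[i], M[j] = M[j], M[i]

def scale(M, i, c):
    """Multiply row i by a nonzero scalar c."""
    M[i] = [c * x for x in M[i]]

def add_multiple(M, i, j, c):
    """Add c times row j to row i."""
    M[i] = [a + c * b for a, b in zip(M[i], M[j])]

# Eliminate x from the second equation: R2 <- R2 + (-3) * R1.
add_multiple(A, 1, 0, Fraction(-3))
# A is now [[1, 2, 5], [0, -2, -9]]; the solutions are unchanged.
```

Each of the three functions is invertible (swap again, scale by 1/c, add the opposite multiple), which is exactly why the operations preserve the solution set.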

Because they originate in solving linear equations, row operations are useful for any problem related to linear equations. As a matter of fact, some 80% of linear algebra problems reduce to systems of linear equations and are ultimately solved by row operations.

Since row operations were developed as a method for simplifying matrices, it is natural to ask how simple a matrix can become under row operations. The concept of (reduced) row echelon form arises naturally as the answer to this question.
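How far the simplification can go is easiest to see in code. Below is a minimal Gauss–Jordan sketch (the name `rref` and its interface are ours) that applies the three row operations until the reduced row echelon form is reached, and also reports the pivot columns:

```python
from fractions import Fraction

def rref(M):
    """Return the reduced row echelon form of M together with the
    list of pivot column indices, using only the three row operations."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    pivots = []
    r = 0
    for c in range(cols):
        # Find a row at or below r with a nonzero entry in column c.
        pivot_row = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot_row is None:
            continue  # no pivot in this column
        M[r], M[pivot_row] = M[pivot_row], M[r]      # operation 1: swap
        M[r] = [x / M[r][c] for x in M[r]]           # operation 2: scale pivot to 1
        for i in range(rows):                        # operation 3: clear the column
            if i != r and M[i][c] != 0:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
        if r == rows:
            break
    return M, pivots
```

For the augmented matrix of x + 2y = 5, 3x + 4y = 6, this yields [[1, 0, -4], [0, 1, 9/2]], from which the unique solution x = -4, y = 9/2 is read off directly.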

In the next two sections, the concept of row echelon form is applied to systems of linear equations (which, we recall, is where row echelon forms originally come from). The result is a theory for the existence and the uniqueness of solutions.

For a system of linear equations, the existence of solutions can be read off directly from the shape of the row echelon form of the augmented matrix.
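Concretely, the system is inconsistent exactly when the last column of the augmented matrix is a pivot column, i.e. some row of the echelon form reads [0 ... 0 | b] with b nonzero. A minimal sketch (the helper name `is_consistent` is ours):

```python
def is_consistent(echelon_aug):
    """Given an augmented matrix already in row echelon form, the system
    has a solution exactly when no row reads [0 ... 0 | b] with b != 0,
    i.e. when the last column is not a pivot column."""
    return not any(
        all(x == 0 for x in row[:-1]) and row[-1] != 0
        for row in echelon_aug
    )

# x + y = 1, 2x + 2y = 3 reduces to the echelon form below: no solution,
# since the second row asserts 0 = 1.
assert not is_consistent([[1, 1, 1], [0, 0, 1]])
# Changing the right side to 2x + 2y = 2 makes the system solvable.
assert is_consistent([[1, 1, 1], [0, 0, 0]])
```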

We may also consider systems of linear equations with a fixed coefficient matrix and an arbitrary right side. The existence of solutions for every right side (which we call always existence) can be read off directly from the shape of the row echelon form of the coefficient matrix.
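The shape condition amounts to this: solutions exist for every right side exactly when every row of the echelon form of the coefficient matrix contains a pivot, i.e. there is no zero row. A sketch (the helper name `always_solvable` is ours):

```python
def always_solvable(echelon_coeff):
    """Given a coefficient matrix in row echelon form: the system
    A x = b has a solution for every right side b exactly when every
    row contains a pivot. A zero row in the coefficient matrix can
    meet a nonzero right-side entry and produce a contradiction."""
    return all(any(x != 0 for x in row) for row in echelon_coeff)

# Pivot in every row: A x = b is solvable for every b.
assert always_solvable([[1, 2], [0, 3]])
# Zero row: the right side (0, b) with b != 0 has no solution.
assert not always_solvable([[1, 2], [0, 0]])
```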

Suppose a system of linear equations has solutions. Then the shape of the row echelon form of the augmented matrix further tells us the structure of the general solution. Basically, we may express the nonfree variables (corresponding to the pivot columns of the coefficient matrix) in terms of the free variables (corresponding to the nonpivot columns).
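A sketch of reading the general solution off a reduced row echelon form (the helper `particular_solution` and the example matrix are ours): assign values to the free variables, then solve each pivot row for its pivot variable.

```python
from fractions import Fraction

# Reduced row echelon form of an augmented matrix in variables x1, x2, x3:
#   x1 + 2 x2 = 3,   x3 = 4.
# Pivot columns 0 and 2 correspond to x1 and x3; column 1 (x2) is free.
R = [[1, 2, 0, 3],
     [0, 0, 1, 4]]
pivots = [0, 2]

def particular_solution(R, pivots, free_values):
    """Build one solution from the RREF by assigning the given values
    to the free variables, then solving each pivot row for its pivot
    variable (the pivot entry is 1, so this is direct substitution)."""
    n = len(R[0]) - 1                       # number of variables
    x = [Fraction(0)] * n
    free_cols = [c for c in range(n) if c not in pivots]
    for c, v in zip(free_cols, free_values):
        x[c] = Fraction(v)
    for row, c in zip(R, pivots):
        x[c] = Fraction(row[-1]) - sum(row[j] * x[j] for j in free_cols)
    return x

# Setting the free variable x2 = 1 gives the solution (1, 1, 4).
assert particular_solution(R, pivots, [1]) == [1, 1, 4]
```

Letting the free variable range over all values traces out the entire solution set; here it is (3 - 2t, t, 4) for arbitrary t.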

The study of the structure of solutions leads us to a criterion for the uniqueness of the solution. The criterion also implies that, for systems of linear equations, uniqueness is independent of the right side.

Since the existence and the uniqueness can be determined by the shape of the row echelon form, they have consequences for the number of equations and the number of variables. Conversely, if the number of equations and the number of variables are equal, then always existence and uniqueness are equivalent. This striking conclusion is what I call a basic principle of linear algebra.
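The square case can be sketched by a single count (the helper name `pivot_count` is ours): for an n-by-n coefficient matrix, having n pivots means a pivot in every row (always existence) and, since pivots move strictly rightward, a pivot in every column (no free variables, hence uniqueness) — so the two properties stand or fall together.

```python
def pivot_count(echelon):
    """Number of pivots = number of nonzero rows in a row echelon form."""
    return sum(1 for row in echelon if any(x != 0 for x in row))

n = 2
# n pivots: a pivot in every row (always existence) and in every
# column (no free variables, hence uniqueness) -- both properties hold.
assert pivot_count([[1, 2], [0, 3]]) == n
# Fewer than n pivots: a zero row (existence fails for some right
# side) and a free variable (uniqueness fails) -- both fail together.
assert pivot_count([[1, 2], [0, 0]]) < n
```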