Let H_{1}, H_{2}, ..., H_{k} be subspaces of V. Then
S(v_{1}, v_{2}, ..., v_{k}) = v_{1} + v_{2} + ... + v_{k}: H_{1}⊕H_{2}⊕...⊕H_{k} → V
is a linear transformation with the subspace H_{1} + H_{2} + ... + H_{k} as the range.
The sum H_{1} + H_{2} + ... + H_{k} of subspaces of a vector space V is a direct sum if the natural linear transformation S is one-to-one. If we also have V = H_{1} + H_{2} + ... + H_{k}, then we say V is a direct sum of the subspaces and denote V = H_{1}⊕H_{2}⊕...⊕H_{k}.
Thus, saying that a sum of subspaces is direct simply means that H_{1} + H_{2} + ... + H_{k} is isomorphic to H_{1}⊕H_{2}⊕...⊕H_{k} in a natural way. By the definition, the condition for a sum to be direct is
v_{1} + v_{2} + ... + v_{k} = 0, v_{i} ∈ H_{i} ⇒ v_{1} = v_{2} = ... = v_{k} = 0.
In the special case where each H_{i} = span{u_{i}} is a 1-dimensional line (u_{i} ≠ 0), we have v_{i} = c_{i}u_{i}, and the condition above becomes exactly the linear independence of u_{1}, u_{2}, ..., u_{k}. Thus the direct sum generalizes the concept of linear independence. Moreover, in the case k = 2, we have Ker S = {(v, -v): v ∈ H_{1}∩H_{2}}. Therefore
H_{1} + H_{2} is a direct sum ⇔ H_{1}∩H_{2} = {0}.
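The criterion above can be checked computationally: H_{1} + H_{2} is direct exactly when dim(H_{1} + H_{2}) = dim H_{1} + dim H_{2}, i.e. when stacking bases of the two subspaces gives a matrix whose rank equals the sum of the dimensions. A minimal sketch (the subspaces and the `rank` helper are illustrative, not taken from the text):

```python
from fractions import Fraction

def rank(rows):
    """Rank via exact Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Direct: the stacked bases have rank dim H1 + dim H2 = 3.
H1 = [(1, 0, 0)]
H2 = [(0, 1, 0), (0, 0, 1)]
print(rank(H1 + H2) == len(H1) + len(H2))  # True

# Not direct: (1, 0, 0) = (1, 1, 0) - (0, 1, 0) lies in both subspaces.
K2 = [(1, 1, 0), (0, 1, 0)]
print(rank(H1 + K2) == len(H1) + len(K2))  # False
```

The rank test is equivalent to H_{1}∩H_{2} = {0} by the dimension formula dim(H_{1} + H_{2}) = dim H_{1} + dim H_{2} − dim(H_{1}∩H_{2}).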
Example Consider the subspaces
H_{1} = span{(1, -1, 0, 1)}, H_{2} = span{(3, -1, 4, 3), (0, 1, 4, -2)}, H_{3} = span{(1, 1, 3, 0)}
of R^{4}. For any vectors
v_{1} = c_{1}(1, -1, 0, 1), v_{2} = c_{2}(3, -1, 4, 3) + c_{3}(0, 1, 4, -2), v_{3} = c_{4}(1, 1, 3, 0)
in the three subspaces, we have
v_{1} + v_{2} + v_{3} = c_{1}(1, -1, 0, 1) + c_{2}(3, -1, 4, 3) + c_{3}(0, 1, 4, -2) + c_{4}(1, 1, 3, 0).
By row operations, the four vectors are seen to be linearly independent. Therefore
v_{1} + v_{2} + v_{3} = 0 ⇒ c_{1} = c_{2} = c_{3} = c_{4} = 0 ⇒ v_{1} = v_{2} = v_{3} = 0,
and we conclude H_{1} + H_{2} + H_{3} is a direct sum.
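The independence claim in this example can be verified directly: the four spanning vectors, placed as the rows of a 4×4 matrix, have a nonzero determinant. A quick sketch (the recursive `det` helper is illustrative):

```python
def det(m):
    """Determinant by cofactor expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

# Spanning vectors of H1, H2, H3 as the rows of a matrix.
vectors = [
    [1, -1, 0, 1],   # H1
    [3, -1, 4, 3],   # H2
    [0, 1, 4, -2],   # H2
    [1, 1, 3, 0],    # H3
]
print(det(vectors))  # -8, nonzero, so the vectors are independent
```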
In general, if a basis of V is divided into a disjoint union of subsets, then V is the direct sum of the subspaces spanned by the subsets.
Example Consider the matrix
A = [  1   3   0
      -1  -1   1
       0   4   4
       1   3  -2 ].
Using the first two rows as the coefficient matrix, we have a kernel subspace
H = {(x_{1}, x_{2}, x_{3}): x_{1} + 3x_{2} = 0, -x_{1} - x_{2} + x_{3} = 0}.
Using the last two rows, we also have
K = {(x_{1}, x_{2}, x_{3}): 4x_{2} + 4x_{3} = 0, x_{1} + 3x_{2} - 2x_{3} = 0}.
The intersection H∩K consists of the solutions of the system Ax = 0. Since the columns of A are linearly independent, the only solution is the trivial one. Therefore H∩K = {0}, and H + K is a direct sum.
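The claim H∩K = {0} amounts to Ax = 0 having only the trivial solution, i.e. the 4×3 matrix A having rank 3. A small sketch checking this (the `rank` helper is illustrative):

```python
from fractions import Fraction

def rank(rows):
    """Rank via exact Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

A = [[1, 3, 0], [-1, -1, 1], [0, 4, 4], [1, 3, -2]]
print(rank(A))  # 3 = number of unknowns, so Ax = 0 forces x = 0
```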
Example Since the only matrix that is both symmetric and skew-symmetric is the zero matrix, the sum of the subspace of symmetric matrices and the subspace of skew-symmetric matrices is direct. In fact, since any square matrix M can be written as M = (M + M^{T})/2 + (M − M^{T})/2, where the first summand is symmetric and the second is skew-symmetric, the space of all n by n matrices is the direct sum of the subspaces of symmetric and skew-symmetric matrices.
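The decomposition behind this example is explicit: M = (M + M^{T})/2 + (M − M^{T})/2. A sketch of the splitting (the helper names are illustrative):

```python
from fractions import Fraction

def transpose(m):
    return [list(col) for col in zip(*m)]

def sym_skew_split(m):
    """Split a square matrix into its symmetric and skew-symmetric parts."""
    t = transpose(m)
    n = len(m)
    half = Fraction(1, 2)
    s = [[half * (m[i][j] + t[i][j]) for j in range(n)] for i in range(n)]
    k = [[half * (m[i][j] - t[i][j]) for j in range(n)] for i in range(n)]
    return s, k

M = [[1, 2], [4, 3]]
S, K = sym_skew_split(M)
print(S == transpose(S))                                               # True: symmetric
print(all(K[i][j] == -K[j][i] for i in range(2) for j in range(2)))    # True: skew
print(all(S[i][j] + K[i][j] == M[i][j] for i in range(2) for j in range(2)))  # True: S + K = M
```

Uniqueness of the splitting is exactly the directness of the sum: if M = S + K in two ways, the two symmetric parts differ by a matrix that is both symmetric and skew-symmetric, hence zero.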
Example The even functions form a subspace H = {f ∈ F(R): f(t) = f(-t)} of F(R). For a similar reason, the odd functions also form a subspace K = {f ∈ F(R): f(t) = -f(-t)}. Now for any function f(t) ∈ F(R), we construct
f_{E}(t) = (f(t) + f(-t))/2,  f_{O}(t) = (f(t) - f(-t))/2.
Then it is easy to verify that f_{E} is an even function, f_{O} is an odd function, and f = f_{E} + f_{O}. This means H + K = F(R). Moreover, if f is both even and odd, then we will have f(t) = f(-t) = -f(t), which implies f(t) = 0. Thus we conclude H⊕K = F(R).
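The even/odd splitting can be checked numerically on a sample function; for f(t) = e^{t}, the even part is cosh t and the odd part is sinh t. A minimal sketch (the function choice and helper names are illustrative):

```python
import math

def even_part(f):
    return lambda t: (f(t) + f(-t)) / 2

def odd_part(f):
    return lambda t: (f(t) - f(-t)) / 2

f = math.exp
fE, fO = even_part(f), odd_part(f)

t = 1.7
print(math.isclose(fE(t), fE(-t)))          # True: fE is even
print(math.isclose(fO(t), -fO(-t)))         # True: fO is odd
print(math.isclose(fE(t) + fO(t), f(t)))    # True: f = fE + fO
print(math.isclose(fE(t), math.cosh(t)))    # True: even part of exp is cosh
```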