
 f is a linear function (map, transformation, ...) iff
 f(v + w) = f(v) + f(w),
 f(k v) = k f(v)
 (consequently
 f(0) = 0).
 f is often called a linear transformation
if the input and output spaces of f are the same.
 A linear transformation on a finite-dimensional vector space
can be represented by a matrix.
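As a quick sketch of this (using NumPy, which the notes do not assume), any map v → A v given by a matrix A satisfies both defining conditions; the matrix values here are arbitrary examples:

```python
import numpy as np

# A hypothetical 2x2 matrix; any matrix gives a linear map v -> A v.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

def f(v):
    return A @ v

v = np.array([1.0, 2.0])
w = np.array([-3.0, 0.5])
k = 4.0

# f(v + w) = f(v) + f(w)
assert np.allclose(f(v + w), f(v) + f(w))
# f(k v) = k f(v)
assert np.allclose(f(k * v), k * f(v))
# consequently f(0) = 0
assert np.allclose(f(np.zeros(2)), np.zeros(2))
```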
Examples
 Identity, I:

1  0
0  1

 Projection of <x, y>,
onto the line through the origin whose unit normal
is n = <n_{x}, n_{y}>:

1 - n_{x}^{2}    -n_{x}n_{y}
-n_{x}n_{y}    1 - n_{y}^{2}



 (<x, y> → <x, y> - (<x, y> . n) n
= <x, y> - (x n_{x} + y n_{y}) n
= <x - (x n_{x} + y n_{y}) n_{x}, y - (x n_{x} + y n_{y}) n_{y}>.)
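As a sketch (NumPy assumed, with an arbitrary example normal), the projection matrix built from a unit normal n agrees with the formula v - (v . n) n:

```python
import numpy as np

# Unit normal to the line (hypothetical example values).
theta = 0.3
n = np.array([np.cos(theta), np.sin(theta)])  # n_x^2 + n_y^2 = 1

# Projection matrix with entries 1 - n_x^2, -n_x n_y, -n_x n_y, 1 - n_y^2.
P = np.array([[1 - n[0]**2, -n[0]*n[1]],
              [-n[0]*n[1],  1 - n[1]**2]])

v = np.array([2.0, -1.0])

# Direct formula: v - (v . n) n.
direct = v - (v @ n) * n

assert np.allclose(P @ v, direct)
# The projected vector lies on the line, i.e. is orthogonal to n.
assert abs((P @ v) @ n) < 1e-12
```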

 Reflection of <x, y>,
in the line through the origin whose unit normal
is n = <n_{x}, n_{y}>:

1 - 2n_{x}^{2}    -2n_{x}n_{y}
-2n_{x}n_{y}    1 - 2n_{y}^{2}
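Again as a NumPy sketch (an assumption) with an arbitrary unit normal: reflection is v - 2(v . n) n, and the matrix with entries 1 - 2n_x^2, -2n_x n_y, -2n_x n_y, 1 - 2n_y^2 reproduces it:

```python
import numpy as np

theta = 0.3
n = np.array([np.cos(theta), np.sin(theta)])  # unit normal

R = np.array([[1 - 2*n[0]**2, -2*n[0]*n[1]],
              [-2*n[0]*n[1],  1 - 2*n[1]**2]])

v = np.array([2.0, -1.0])
# Direct formula: reflect by removing twice the normal component.
direct = v - 2 * (v @ n) * n
assert np.allclose(R @ v, direct)
# Reflecting twice gives back the original vector: R R = I.
assert np.allclose(R @ R, np.eye(2))
```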




 Anticlockwise rotation of <x, y>,
by angle θ, about the origin:

cos θ    -sin θ
sin θ     cos θ

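A NumPy check (an assumed library) of the anticlockwise rotation matrix, with rows (cos θ, -sin θ) and (sin θ, cos θ):

```python
import numpy as np

theta = np.pi / 6  # hypothetical example angle

R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# <1, 0> rotates anticlockwise to <cos θ, sin θ>.
assert np.allclose(R @ np.array([1.0, 0.0]),
                   [np.cos(theta), np.sin(theta)])
# Rotation preserves length.
v = np.array([2.0, -1.0])
assert np.isclose(np.linalg.norm(R @ v), np.linalg.norm(v))
```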
Equations
 Given constants a, b, c, d, p, q, r, and s, and
variables w, x, y, and z, in matrices

a  b    w  x      p  q
c  d    y  z   =  r  s

 i.e.,
 a w + b y = p,
 a x + b z = q,
 c w + d y = r,
 c x + d z = s.
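A quick NumPy sketch (assumed library, arbitrary example values) confirming that the matrix product encodes exactly these four equations:

```python
import numpy as np

# Hypothetical values for the constants and variables.
a, b, c, d = 1.0, 2.0, 3.0, 4.0
w, x, y, z = 5.0, 6.0, 7.0, 8.0

M = np.array([[a, b], [c, d]])
X = np.array([[w, x], [y, z]])
P = M @ X

# Each entry of M X matches the corresponding equation.
assert np.isclose(P[0, 0], a*w + b*y)  # p
assert np.isclose(P[0, 1], a*x + b*z)  # q
assert np.isclose(P[1, 0], c*w + d*y)  # r
assert np.isclose(P[1, 1], c*x + d*z)  # s
```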

 The following are all 2×2 but
generalize ...

a  b    w  x      p  q
c  d    y  z   =  r  s

 is equivalent (wrt solving for w, x, y, and z) to

row swap

c  d    w  x      r  s
a  b    y  z   =  p  q

to

column swap (which also swaps the rows of the unknown matrix)

b  a    y  z      p  q
d  c    w  x   =  r  s

to

row multiplication by constant k (k ≠ 0)

ka  kb    w  x      kp  kq
 c   d    y  z   =   r   s

and to

row addition (or subtraction)

a+c  b+d    w  x      p+r  q+s
 c    d     y  z   =   r    s

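A NumPy sketch (assumed library and example values) showing that each row operation on M and P leaves the solution X unchanged, while a column swap of M swaps the rows of the solution:

```python
import numpy as np

a, b, c, d = 1.0, 2.0, 3.0, 4.0
M = np.array([[a, b], [c, d]])
X = np.array([[5.0, 6.0], [7.0, 8.0]])   # the "unknown" we hope to recover
P = M @ X

def solve(M, P):
    return np.linalg.solve(M, P)

# Row swap of M and P.
assert np.allclose(solve(M[::-1], P[::-1]), X)

# Row multiplication by k (k != 0).
k = 2.5
M2, P2 = M.copy(), P.copy()
M2[0] *= k; P2[0] *= k
assert np.allclose(solve(M2, P2), X)

# Row addition: add row 1 to row 0 in both M and P.
M3, P3 = M.copy(), P.copy()
M3[0] += M3[1]; P3[0] += P3[1]
assert np.allclose(solve(M3, P3), X)

# Column swap of M swaps the rows of the solution.
assert np.allclose(solve(M[:, ::-1], P), X[::-1])
```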

 If we can work on M and P to reduce
 M X = P
 to an equivalent
 I X = P',
 using the relations above,
then we can just read off the solution, P', for X.
 X and P can be n×1 column vectors, or n×n matrices, etc.
 For example, let P=I, the identity, and reduce
 M X = I
 to
 I X = M^{-1}
 giving the
matrix inverse
of M.
 (Note that any column swaps cause row swaps, in X, which must
be undone to get the final answer.)
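The inverse-by-reduction idea can be sketched (NumPy assumed, arbitrary example M) by row-reducing the augmented pair (M | I) until the left half is I:

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
n = M.shape[0]

# Augment: reduce M X = I to I X = M^{-1} using row operations.
aug = np.hstack([M, np.eye(n)])

for col in range(n):
    # Scale the pivot row so the pivot becomes 1
    # (row multiplication by a constant).
    aug[col] /= aug[col, col]
    # Eliminate the other entries in this column
    # (row addition/subtraction).
    for row in range(n):
        if row != col:
            aug[row] -= aug[row, col] * aug[col]

M_inv = aug[:, n:]
assert np.allclose(M @ M_inv, np.eye(n))
assert np.allclose(M_inv, np.linalg.inv(M))
```

No pivoting is done here, so the sketch assumes each pivot is nonzero; a column swap would handle a zero pivot, at the cost of the compensating row swap in X noted above.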

