The obvious matrix multiplication algorithm for an n_{1}×n_{2} matrix by an n_{2}×n_{3} matrix has O(n_{1} n_{2} n_{3}) time complexity, i.e., O(n^{3}) if n_{1} = n_{2} = n_{3} = n. Strassen (1969) showed how to reduce this from n^{3} to O(n^{log_{2}(7)}) by recursively performing seven multiplications of n/2×n/2 matrices, e.g., the sketch below. Note that log_{2}(7) ≈ 2.8 < 3.
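As a minimal sketch of Strassen's recursion (my own illustration in Python/NumPy; it assumes n is a power of two and omits the usual fallback to the naive algorithm for small blocks):

```python
import numpy as np

def strassen(a, b):
    """Multiply two n x n matrices by Strassen's algorithm.
    Assumes n is a power of two."""
    n = a.shape[0]
    if n == 1:
        return a * b
    h = n // 2
    a11, a12, a21, a22 = a[:h, :h], a[:h, h:], a[h:, :h], a[h:, h:]
    b11, b12, b21, b22 = b[:h, :h], b[:h, h:], b[h:, :h], b[h:, h:]
    # Seven recursive half-size products instead of the obvious eight.
    m1 = strassen(a11 + a22, b11 + b22)
    m2 = strassen(a21 + a22, b11)
    m3 = strassen(a11, b12 - b22)
    m4 = strassen(a22, b21 - b11)
    m5 = strassen(a11 + a12, b22)
    m6 = strassen(a21 - a11, b11 + b12)
    m7 = strassen(a12 - a22, b21 + b22)
    # Combine the seven products into the four quadrants of the result.
    c11 = m1 + m4 - m5 + m7
    c12 = m3 + m5
    c21 = m2 + m4
    c22 = m1 - m2 + m3 + m6
    return np.block([[c11, c12], [c21, c22]])
```

Each level of the recursion trades eight half-size multiplications for seven, at the cost of O(n^{2}) extra additions per level, which is where the O(n^{log_{2}(7)}) bound comes from.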
The improvement is not dramatic, but
it sparked a cottage industry looking for further improvements --
search for 'fast matrix multiplication' in
the [bib].
Matrix Inversion
The inverse of a square k×k matrix, m,
is a k×k matrix, m^{-1}, s.t.
m m^{-1} = m^{-1} m = id,
where id is the k×k identity matrix.
(Note that a singular matrix does not have an inverse.)
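As a quick numerical check of the definition (a sketch using NumPy; the particular matrix is my own example):

```python
import numpy as np

m = np.array([[4.0, 7.0],
              [2.0, 6.0]])
m_inv = np.linalg.inv(m)
# Both products should be (numerically close to) the 2x2 identity.
print(np.allclose(m @ m_inv, np.eye(2)))  # True
print(np.allclose(m_inv @ m, np.eye(2)))  # True
```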
The matrix inversion problem can be solved by
Gauss-Jordan elimination.
The "naive" Gauss-Jordan elimination algorithm, which pivots on the diagonal entries in order, is numerically unstable: a small (or zero) pivot amplifies rounding error.
Gauss-Jordan elimination with pivoting is more stable.
There seems to be some debate in the literature over whether
(i) partial pivoting (row pivoting) is adequate or
(ii) full pivoting (row and column pivoting)
is needed in general.
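Here is a minimal sketch of Gauss-Jordan inversion with partial pivoting (my own illustration in Python/NumPy; the function name gauss_jordan_inverse and the singularity test are assumptions, not from the original text):

```python
import numpy as np

def gauss_jordan_inverse(m):
    """Invert a square matrix by Gauss-Jordan elimination with
    partial (row) pivoting. Raises on a singular input."""
    k = m.shape[0]
    # Augment m with the identity: [m | id]. Row-reducing the left
    # half to the identity turns the right half into m^{-1}.
    aug = np.hstack([m.astype(float), np.eye(k)])
    for col in range(k):
        # Partial pivoting: choose the row (at or below the diagonal)
        # with the largest entry in this column, and swap it up.
        pivot_row = col + np.argmax(np.abs(aug[col:, col]))
        if np.isclose(aug[pivot_row, col], 0.0):
            raise ValueError("matrix is singular")
        aug[[col, pivot_row]] = aug[[pivot_row, col]]
        # Scale the pivot row so the pivot becomes 1.
        aug[col] /= aug[col, col]
        # Eliminate this column from every other row.
        for row in range(k):
            if row != col:
                aug[row] -= aug[row, col] * aug[col]
    return aug[:, k:]
```

Full pivoting would instead search the entire remaining submatrix for the largest entry, permuting columns as well as rows and tracking the column permutation so it can be undone at the end.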