Matrices

Also see maths: matrices, least squares.

Matrix Multiplication

The obvious matrix multiplication algorithm for an n1×n2 matrix by an n2×n3 matrix has O(n1·n2·n3)-time complexity, i.e., O(n^3) if n1 = n2 = n3 = n. Strassen (1969) showed how to reduce this from O(n^3) to O(n^log2(7)) by recursively performing seven multiplications of n/2×n/2 matrices, e.g., [code]. Note that log2(7) ≈ 2.8 < 3. The improvement is not dramatic, but it sparked a cottage industry looking for further improvements -- search for 'fast matrix multiplication' in the [bib].
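
As an illustration, the following is a minimal Python sketch of Strassen's scheme for square matrices whose size is a power of two, falling back to the obvious O(n^3) method below a small cutoff; the function names (strassen, mat_add, mat_sub, mat_mul_naive) and the cutoff parameter are choices of this sketch, not taken from the page's [code] link.

def mat_add(A, B):
    # element-wise sum of two equal-sized matrices (lists of lists)
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_sub(A, B):
    # element-wise difference of two equal-sized matrices
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_mul_naive(A, B):
    # the obvious O(n1.n2.n3) algorithm
    n1, n2, n3 = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(n2)) for j in range(n3)]
            for i in range(n1)]

def strassen(A, B, cutoff=2):
    # A and B are n x n with n a power of two (an assumption of this sketch)
    n = len(A)
    if n <= cutoff:
        return mat_mul_naive(A, B)
    h = n // 2
    # split each matrix into four h x h quadrants
    A11 = [row[:h] for row in A[:h]]; A12 = [row[h:] for row in A[:h]]
    A21 = [row[:h] for row in A[h:]]; A22 = [row[h:] for row in A[h:]]
    B11 = [row[:h] for row in B[:h]]; B12 = [row[h:] for row in B[:h]]
    B21 = [row[:h] for row in B[h:]]; B22 = [row[h:] for row in B[h:]]
    # the seven recursive half-size products
    M1 = strassen(mat_add(A11, A22), mat_add(B11, B22), cutoff)
    M2 = strassen(mat_add(A21, A22), B11, cutoff)
    M3 = strassen(A11, mat_sub(B12, B22), cutoff)
    M4 = strassen(A22, mat_sub(B21, B11), cutoff)
    M5 = strassen(mat_add(A11, A12), B22, cutoff)
    M6 = strassen(mat_sub(A21, A11), mat_add(B11, B12), cutoff)
    M7 = strassen(mat_sub(A12, A22), mat_add(B21, B22), cutoff)
    # recombine into the quadrants of the product
    C11 = mat_add(mat_sub(mat_add(M1, M4), M5), M7)
    C12 = mat_add(M3, M5)
    C21 = mat_add(M2, M4)
    C22 = mat_add(mat_sub(mat_add(M1, M3), M2), M6)
    return ([r1 + r2 for r1, r2 in zip(C11, C12)] +
            [r1 + r2 for r1, r2 in zip(C21, C22)])

For example, strassen([[1, 2], [3, 4]], [[5, 6], [7, 8]]) gives [[19, 22], [43, 50]], the same result as the obvious algorithm.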

Matrix Inversion

The inverse of a square k×k matrix, m, is a k×k matrix, m^-1, s.t.
m m^-1 = m^-1 m = id,
where id is the k×k identity matrix.
(Note, a singular matrix does not have an inverse.)
The matrix inversion problem can be solved by Gauss-Jordan elimination.
The "naive" Gauss-Jordan elimination algorithm is numerically unstable; Gauss-Jordan elimination with pivoting is more stable.
There is some debate in the literature over whether (i) partial pivoting (row pivoting) is adequate or (ii) full pivoting (row and column pivoting) is needed in general.
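
The following is a minimal Python sketch of inversion by Gauss-Jordan elimination with partial (row) pivoting: the k×k matrix is augmented to [m | id] and reduced until the left half becomes id, at which point the right half is m^-1. The function name invert and the eps tolerance used to detect a (numerically) singular matrix are assumptions of this sketch.

def invert(M, eps=1e-12):
    k = len(M)
    # augment M with the k x k identity matrix: [M | I]
    aug = [list(map(float, M[i])) + [1.0 if j == i else 0.0 for j in range(k)]
           for i in range(k)]
    for col in range(k):
        # partial pivoting: choose the row (from col down) with the largest |entry|
        piv = max(range(col, k), key=lambda r: abs(aug[r][col]))
        if abs(aug[piv][col]) < eps:
            raise ValueError("matrix is singular, or nearly so")
        aug[col], aug[piv] = aug[piv], aug[col]
        # scale the pivot row so that the pivot entry becomes 1
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        # eliminate this column from every other row
        for r in range(k):
            if r != col:
                f = aug[r][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    # the left half is now the identity; the right half is the inverse
    return [row[k:] for row in aug]

For example, invert([[4.0, 7.0], [2.0, 6.0]]) gives (approximately) [[0.6, -0.7], [-0.2, 0.4]]; multiplying the result back by the original matrix reproduces the identity matrix up to rounding error, which is a convenient check.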

Eigenvalues and Eigenvectors

See maths/Eigen-values and the algorithm.

-- L.A., Tues. 21/6/2011
© L. Allison, www.allisons.org/ll/ (or as otherwise indicated).