Science Fair Project Encyclopedia
System of linear equations
In mathematics, a system of linear equations is a collection of linear equations involving the same set of unknowns, for example
- 3x1 + 2x2 − x3 = 1
- 2x1 − 2x2 + 4x3 = −2
- −x1 + ½x2 − x3 = 0.
The problem is to find those values for the unknowns x1, x2 and x3 which satisfy all three equations simultaneously.
Systems of linear equations are among the oldest problems in mathematics and have many applications, such as in digital signal processing, estimation and forecasting, and more generally in linear programming and in the approximation of non-linear problems in numerical analysis. Efficient ways to solve systems of linear equations include Gauss-Jordan elimination and the Cholesky decomposition.
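As a concrete illustration of the elimination idea, a minimal Gauss-Jordan solver in pure Python might look like the following sketch (the function name, the partial-pivoting strategy, and the tolerance are choices of this example, not specified in the article):

```python
def gauss_jordan_solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting.

    A is a list of n rows of n floats; b is a list of n floats.
    """
    n = len(A)
    # build the augmented matrix [A | b]
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        # partial pivoting: pick the row with the largest entry in this column
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        if abs(M[pivot][col]) < 1e-12:
            raise ValueError("matrix is singular or nearly singular")
        M[col], M[pivot] = M[pivot], M[col]
        # scale the pivot row so the pivot entry becomes 1
        p = M[col][col]
        M[col] = [v / p for v in M[col]]
        # eliminate this column from every other row
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [v - f * w for v, w in zip(M[r], M[col])]
    # the last column of the reduced augmented matrix is the solution
    return [M[r][n] for r in range(n)]

# the opening example: solution is x1 = 1, x2 = -2, x3 = -2
x = gauss_jordan_solve([[3.0, 2.0, -1.0],
                        [2.0, -2.0, 4.0],
                        [-1.0, 0.5, -1.0]],
                       [1.0, -2.0, 0.0])
```

Reducing the augmented matrix all the way to reduced row echelon form (rather than stopping at triangular form, as plain Gaussian elimination does) is what distinguishes the Gauss-Jordan variant.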
In general, a system with m linear equations and n unknowns can be written as
- a11x1 + a12x2 + ... + a1nxn = b1
- a21x1 + a22x2 + ... + a2nxn = b2
- ⋮
- am1x1 + am2x2 + ... + amnxn = bm,
where x1, ..., xn are the unknowns and the numbers aij are the coefficients of the system. We can collect the coefficients aij into an m-by-n matrix, the unknowns into a column vector, and the right-hand sides into another column vector. If we represent each of these by a single letter, the system becomes
- Ax = b,
where A is an m-by-n matrix above, x is a column vector with n entries and b is a column vector with m entries. The above mentioned Gauss-Jordan elimination applies to all these systems, even if the coefficients come from an arbitrary field.
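In this matrix form the opening example can be solved numerically. The sketch below assumes NumPy, which is not mentioned in the article but is a standard tool for this task:

```python
import numpy as np

# coefficient matrix and right-hand side of the opening example
A = np.array([[3.0, 2.0, -1.0],
              [2.0, -2.0, 4.0],
              [-1.0, 0.5, -1.0]])
b = np.array([1.0, -2.0, 0.0])

# solves Ax = b; for this system x = [1, -2, -2]
x = np.linalg.solve(A, b)
```

`np.linalg.solve` requires A to be square and non-singular; it raises `LinAlgError` otherwise.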
For a given system, exactly one of three cases occurs:
- the system has no solution
- the system has a single solution
- the system has infinitely many solutions.
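One standard way to distinguish the three cases compares the rank of A with the rank of the augmented matrix [A | b] and with the number of unknowns (the Rouché-Capelli theorem). The sketch below uses NumPy and a hypothetical helper name:

```python
import numpy as np

def classify(A, b):
    """Classify Ax = b as having no, one, or infinitely many solutions.

    Uses the Rouché-Capelli theorem: the system is consistent iff
    rank(A) == rank([A | b]); a consistent system has a unique solution
    iff that rank equals the number of unknowns.
    """
    aug = np.column_stack([A, b])
    r_A = np.linalg.matrix_rank(A)
    r_aug = np.linalg.matrix_rank(aug)
    n = A.shape[1]           # number of unknowns
    if r_A < r_aug:
        return "no solution"
    return "single solution" if r_A == n else "infinitely many solutions"

# two parallel lines: inconsistent
print(classify(np.array([[1.0, 1.0], [1.0, 1.0]]), np.array([0.0, 1.0])))
# the same line twice: a whole line of solutions
print(classify(np.array([[1.0, 1.0], [1.0, 1.0]]), np.array([1.0, 1.0])))
```

Note that `matrix_rank` decides rank numerically via a tolerance on singular values, so near-singular systems may be classified differently than exact arithmetic would suggest.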
A system of the form
- Ax = 0
is called a homogeneous system of linear equations. The set of all solutions of such a homogeneous system is called the null space of the matrix A and is written Nul A.
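A basis for Nul A can be computed numerically from the singular-value decomposition: the right singular vectors belonging to (numerically) zero singular values span the null space. The sketch below uses NumPy; the helper name and tolerance are illustrative:

```python
import numpy as np

def null_space_basis(A, tol=1e-12):
    """Return a matrix whose columns form an orthonormal basis of Nul A."""
    _, s, Vt = np.linalg.svd(A)
    # rows of Vt paired with near-zero singular values (and any extra
    # rows beyond len(s) for a wide matrix) span the null space
    rank = int(np.sum(s > tol))
    return Vt[rank:].T

# a rank-1 matrix with 3 unknowns: the null space has dimension 2
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
N = null_space_basis(A)      # shape (3, 2); A @ N is (numerically) zero
```

Every solution of Ax = 0 is then a linear combination of the columns of N.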
Especially in view of the above applications, several more efficient alternatives to Gauss-Jordan elimination have been developed for a wide variety of special cases. Many of these specialized algorithms have complexity O(n²), compared with O(n³) for general elimination. Some of the most common special cases are:
- For problems of the form Ax = b, where A is a symmetric Toeplitz matrix, we can use Levinson recursion or one of its derivatives. One commonly used Levinson-like derivative is Schur recursion, which appears in many digital signal processing applications.
- For problems of the form Ax = b, where A is singular or nearly singular, the matrix A is decomposed into the product of three matrices in a process called singular-value decomposition (SVD). The left-hand and right-hand factors contain the left and right singular vectors; the middle factor is a diagonal matrix containing the singular values. The matrix can then be (pseudo-)inverted by reversing the order of the three factors, transposing the singular-vector matrices, and taking the reciprocal of each diagonal element of the middle matrix. Any singular value too close to zero, which is what makes the matrix nearly singular, is set to zero in the pseudo-inverse rather than inverted.
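The SVD recipe in the last bullet can be sketched as follows with NumPy (the function name and the zero-threshold `tol` are choices of this example):

```python
import numpy as np

def svd_pinv(A, tol=1e-10):
    """Pseudo-inverse of A via singular-value decomposition."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # reciprocal of each singular value; values too close to zero
    # are set to zero instead of being inverted
    s_inv = np.array([1.0 / v if v > tol else 0.0 for v in s])
    # reverse the order of the factors and transpose the
    # singular-vector matrices
    return Vt.T @ np.diag(s_inv) @ U.T

# a singular (rank-deficient) matrix: ordinary inversion would fail,
# but the pseudo-inverse is well defined
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
A_pinv = svd_pinv(A)
```

For a singular A, the pseudo-inverse yields the minimum-norm least-squares solution x = A_pinv @ b rather than an exact solution; NumPy's own `np.linalg.pinv` implements the same idea.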
The contents of this article are licensed from www.wikipedia.org under the GNU Free Documentation License.