# THE ELIMINATION FORM OF THE INVERSE AND ITS APPLICATION TO LINEAR PROGRAMMING

Original Publication Date: 1955-Apr-08

Included in the Prior Art Database: 2005-Sep-19

## Publishing Venue

Software Patent Institute

## Related People

Markowitz, Harry: AUTHOR [+2]

## Abstract

The inverse (A^{-1}) of a matrix (A) is valuable when a number of sets of equations AX = b are to be solved using different b's and the same A. A^{-1}, like any matrix, may be expressed as the product of other matrices A^{-1} = M_{m} M_{m-1} ... M_{1} in an infinite number of ways. E.g. (2) = (1/2)(4) = (1/8)(16) etc. If we have such M_{1}, . . . , M_{m} we can solve AX = b, X = A^{-1}b in a series of steps: X^{(1)} = M_{1} b, X^{(2)} = M_{2} X^{(1)}, . . . , X = M_{m} X^{(m-1)}. The expression M_{m} . . . M_{1} is referred to as a "product form" of inverse. In some problems there may be M_{i} which are easier to obtain and apply than A^{-1} itself. This paper will discuss a particular product form of inverse which is closely related to the Gaussian elimination method of solving a set of simultaneous equations. This "elimination form of the inverse," as we shall call it, is especially valuable when A has a large number of zero coefficients. If A has no zero coefficients, on the other hand, the elimination form of inverse is still generally as convenient as the conventional A^{-1}.

**This text was extracted from a PDF file.**

**This is the abbreviated version, containing approximately 18% of the total text.**

**This document is an approximate representation of the original.**

**THE ELIMINATION FORM OF THE INVERSE AND ITS
APPLICATION TO LINEAR PROGRAMMING**

Harry Markowitz

P-680

8 April 1955

The RAND CORPORATION 1700 MAIN ST. SANTA MONICA CALIFORNIA

**ACKNOWLEDGEMENT**

The writer is indebted to George Dantzig and Alan Manne for valuable suggestions.

**THE ELIMINATION FORM OF THE INVERSE AND ITS
APPLICATION TO LINEAR PROGRAMMING**

The inverse (A^{-1}) of a matrix (A) is valuable when a number of sets of equations AX = b are to
be solved using different b's and the same A. A^{-1}, like any matrix, may be expressed as the
product of other matrices
A^{-1} = M_{m} M_{m-1} ...M_{1}
in an infinite number of ways. E.g. (2) = (1/2) (4) = (1/8) (16) etc. If we have such M_{1} , . . . , M_{m}
we can solve AX = b, X = A^{-1}b in a series of steps:
X^{(1)} = M_{1} b

X^{(2)} = M_{2}X^{(1)}

. . .

X = M_{m} X^{(m-1)}

The expression M_{m} . . . M_{1} is referred to as a "product form" of inverse. In some problems there
may be M_{i} which are easier to obtain and apply than A^{-1} itself.
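The stepwise application above can be sketched in plain Python. The 2x2 matrix A and its two factors below are an invented example (not from the paper), chosen so that M_{2} M_{1} = A^{-1}; once the factors are in hand, the same loop serves for every new right-hand side b.

```python
def matvec(M, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

# Invented example: A = [[4, 0], [2, 1]], whose inverse is [[0.25, 0], [-0.5, 1]].
A = [[4.0, 0.0],
     [2.0, 1.0]]

# Two elementary factors with M2 @ M1 = A^{-1}:
M1 = [[0.25, 0.0],   # scale row 1 by 1/4
      [0.0,  1.0]]
M2 = [[1.0,  0.0],   # subtract 2 * (row 1) from row 2
      [-2.0, 1.0]]

def solve_product_form(factors, b):
    """X^{(1)} = M_1 b, X^{(2)} = M_2 X^{(1)}, ..., X = M_m X^{(m-1)}."""
    x = b
    for M in factors:
        x = matvec(M, x)
    return x

b = [8.0, 7.0]
x = solve_product_form([M1, M2], b)  # solves AX = b without ever forming A^{-1}
```

The factors are applied one at a time, so A^{-1} itself is never assembled; this is the point of the product form when the individual M_{i} are cheap.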

This paper will discuss a particular product form of inverse which is closely related to the
Gaussian elimination method of solving a set of simultaneous equations. This "elimination form
of the inverse," as we shall call it, is especially valuable when A has a large number of zero
coefficients. If A has no zero coefficients, on the other hand, the elimination form of inverse is
still generally as convenient as the conventional A^{-1}.

The elimination form of inverse can be illustrated in terms of the solution of three equations in three unknowns:



1) a_{11} X_{1} + a_{12} X_{2} + a_{13} X_{3} = r_{1}

2) a_{21} X_{1} + a_{22} X_{2} + a_{23} X_{3} = r_{2}

3) a_{31} X_{1} + a_{32} X_{2} + a_{33} X_{3} = r_{3}

For the moment we will let the k^{th} diagonal element be the k^{th} pivotal element. From equation 1)
we get the first equation of our back solution

B1) X_{1} = (r_{1} - a_{12} X_{2} - a_{13} X_{3}) / a_{11}

We eliminate X_{1} from equations 2) and 3) by adding -a_{i1}/a_{11} times the first equation
to the i^{th} equation, thus obtaining

2') b_{22} X_{2} + b_{23} X_{3} = r^{*}_{2}
3') b_{32} X_{2} + b_{33} X_{3} = r^{*}_{3}

where b_{ij} = a_{ij} - (a_{i1}/a_{11}) a_{1j} and r^{*}_{i} = r_{i} - (a_{i1}/a_{11}) r_{1}.

Similarly we get

B2) X_{2} = (r^{*}_{2} - b_{23} X_{3}) / b_{22}

and

3'') c_{33} X_{3} = r^{**}_{3}

where c_{33} = b_{33} - (b_{32}/b_{22}) b_{23} and r^{**}_{3} = r^{*}_{3} - (b_{32}/b_{22}) r^{*}_{2}. Finally

B3) X_{3} = r^{**}_{3} / c_{33}

B3) gives us X_{3}; X_{3} and B2) give X_{2}; X_{2}, X_{3} and B1) give X_{1}. Consider the transformations which
occurred to our original right hand side. First we formed r^{*}_{2} and r^{*}_{3}, then r^{**}_{3}, then X_{3},
then X_{2}, and finally X_{1}. Thus

X = B_{1} B_{2} B_{3} R_{2} R_{1} r

Since the a_{ij}, b_{ij}, c_{ij} do not depend on the r_{i} we have

A^{-1} = B_{1} B_{2} B_{3} R_{2} R_{1}

Similarly if A is any m x m non-singular matrix we have A^{-1} = B_{1} B_{2} ... B_{m} R_{m-1} ... R_{1}
where R_{k} is the identity matrix except for column k below the diagonal, which holds the elimination multipliers for the k^{th} pivot, and B_{k} is the identity matrix except for row k, which holds the coefficients of the k^{th} back-solution equation Bk).
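As a sketch of the three-equation case, the factors R_{1}, R_{2}, B_{1}, B_{2}, B_{3} can be built from the pivot formulas in the text and applied in the order R_{1}, R_{2}, B_{3}, B_{2}, B_{1}. The numeric matrix and right-hand side here are an invented example, not from the paper:

```python
def matvec(M, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

# Invented 3x3 example with nonzero diagonal pivots.
A = [[2.0, 1.0, 1.0],
     [4.0, 3.0, 1.0],
     [2.0, 1.0, 3.0]]
a11, a12, a13 = A[0]

# First elimination step: b_ij = a_ij - (a_i1/a11) a_1j
b22 = A[1][1] - A[1][0] / a11 * a12
b23 = A[1][2] - A[1][0] / a11 * a13
b32 = A[2][1] - A[2][0] / a11 * a12
b33 = A[2][2] - A[2][0] / a11 * a13
# Second step: c33 = b33 - (b32/b22) b23
c33 = b33 - b32 / b22 * b23

# R_k: identity except column k below the diagonal (elimination multipliers).
R1 = [[1.0, 0.0, 0.0],
      [-A[1][0] / a11, 1.0, 0.0],
      [-A[2][0] / a11, 0.0, 1.0]]
R2 = [[1.0, 0.0, 0.0],
      [0.0, 1.0, 0.0],
      [0.0, -b32 / b22, 1.0]]
# B_k: identity except row k (the back-solution equations B1), B2), B3)).
B1 = [[1.0 / a11, -a12 / a11, -a13 / a11],
      [0.0, 1.0, 0.0],
      [0.0, 0.0, 1.0]]
B2 = [[1.0, 0.0, 0.0],
      [0.0, 1.0 / b22, -b23 / b22],
      [0.0, 0.0, 1.0]]
B3 = [[1.0, 0.0, 0.0],
      [0.0, 1.0, 0.0],
      [0.0, 0.0, 1.0 / c33]]

def apply_inverse(r):
    """X = B1 B2 B3 R2 R1 r -- the elimination form of A^{-1}."""
    x = r
    for M in (R1, R2, B3, B2, B1):
        x = matvec(M, x)
    return x

x = apply_inverse([4.0, 8.0, 6.0])  # A @ x recovers the right-hand side
```

Note that each factor differs from the identity in a single row or column, which is what keeps the elimination form cheap to store and apply when A is sparse.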

Although this elimination form of inverse consists of 2m-1 matrices it contains only m^{2} numbers
which are not known a priori...