LU decomposition and its special case

Matrices are fundamental objects in modern mathematics and appear in virtually every branch of science. An introductory use case for matrices is solving linear systems of the form $Ax = b$. While such a system may feel like a toy example, linear systems of this kind are at the center of most engineering problems.

Formulating a problem as a linear matrix expression ($Ax = b$) is beyond the scope of this post (it may be covered in the future). Here, we will talk about a matrix factorization technique called the $LU$ decomposition and its special case, the Cholesky decomposition. These decomposition techniques transform the original matrix expression into a simpler one from which the solution $x$ of the linear system $Ax = b$ is easy to obtain. Consider the following two options for matrix $A$:


In the case of $A_2$, owing to the upper triangular structure of the matrix, solving for $x$ is straightforward via back substitution. The LU decomposition provides us with such an upper triangular matrix.
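To make the substitution idea concrete, here is a minimal NumPy sketch of back substitution for an upper triangular system. The matrix and vector values below are hypothetical, chosen only for illustration:

```python
import numpy as np

def back_substitution(U, b):
    """Solve U x = b by substitution, assuming U is upper triangular."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # x_i = (b_i - sum_{j>i} u_ij * x_j) / u_ii
        x[i] = (b[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

# Hypothetical upper triangular system (illustrative values)
U = np.array([[2., 1., 1.],
              [0., 3., 2.],
              [0., 0., 4.]])
b = np.array([5., 7., 8.])
x = back_substitution(U, b)
```

Starting from the last row, each unknown is obtained from one division and a short dot product, so no matrix inverse is ever formed.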

 


For the case of $A_1$, without the LU decomposition, solving for $x$ requires computing the inverse of $A$ and then multiplying that inverse with the vector $b$. Computing the inverse of a matrix via determinants and minors is tedious as well as compute-intensive, especially for big matrices. In practice, computer algebra systems (CAS) use techniques such as LU factorization to solve such linear systems efficiently. The LU factorization transforms the matrix $A$ into two factors $L$ and $U$, where $L$ is a unit lower triangular matrix and $U$ is an upper triangular matrix, as shown below:
$$PA = LU = \begin{bmatrix} 1 & 0 & 0 \\ l_{21} & 1 & 0 \\ l_{31} & l_{32} & 1 \end{bmatrix} \begin{bmatrix} u_{11} & u_{12} & u_{13} \\ 0 & u_{22} & u_{23} \\ 0 & 0 & u_{33} \end{bmatrix}$$
The permutation matrix $P$ can be ignored for the time being. Obtaining the LU decomposition requires finding the 6 $u_{ij}$ terms and 3 $l_{ij}$ terms. The computation is as follows:
$$u_{ij} = a_{ij} - \sum_{k=1}^{i-1} l_{ik}\, u_{kj}, \qquad l_{ij} = \frac{1}{u_{jj}}\left(a_{ij} - \sum_{k=1}^{j-1} l_{ik}\, u_{kj}\right)$$
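As a sketch, the $l_{ij}$ and $u_{ij}$ terms can be computed by Doolittle's method, which is plain Gaussian elimination while recording the multipliers. This assumes no pivoting is needed (no zero pivots); the example matrix is hypothetical:

```python
import numpy as np

def lu_doolittle(A):
    """Doolittle LU: A = L U with unit lower triangular L (no pivoting)."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    L = np.eye(n)
    U = A.copy()
    for k in range(n):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]      # elimination multiplier l_ik
            U[i, k:] -= L[i, k] * U[k, k:]   # zero out entry below the pivot
    return L, U

# Hypothetical matrix with nonzero pivots (illustrative values)
A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
L, U = lu_doolittle(A)
```

The multipliers stored in `L` are exactly the row operations that Gaussian elimination would perform, which is why the text later calls LU decomposition "Gaussian elimination in disguise".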
Once we have the $L$ and $U$ factors, we can reuse the same factorization to solve the system for various $b$ vectors. For example, to compute the inverse of $A$, we can solve against the columns of the identity matrix $I$ to obtain the columns of $A^{-1}$. Let's see an example of this inverse computation:
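One way to carry this out is with SciPy's `lu_factor`/`lu_solve`, which factor once and then cheaply re-solve for each new right-hand side. The matrix values here are hypothetical:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Hypothetical invertible matrix (illustrative values)
A = np.array([[4., 3., 2.],
              [2., 4., 1.],
              [3., 2., 5.]])

lu, piv = lu_factor(A)  # factor A once

# Solve A x = e_k for each column e_k of the identity;
# the solutions are the columns of A^{-1}
A_inv = np.column_stack([lu_solve((lu, piv), e) for e in np.eye(3).T])
```

Each extra right-hand side costs only a pair of triangular solves, which is far cheaper than refactoring or explicitly inverting $A$ from scratch.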

While implementing the LU decomposition on a computer, we could possibly encounter 'divide by zero' scenarios. This is where the permutation matrix $P$ comes into play: $P$ reorders the rows of $A$ so that the 'divide by zero' situation is averted. A very important fact to observe is that LU decomposition is just the Gaussian elimination technique in disguise. The $L$ matrix records the scaling and subtraction operations that are performed to convert $A$ into the upper triangular matrix $U$.
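The following small sketch shows pivoting in action with SciPy's `lu`, using a hypothetical matrix whose top-left entry is zero, so naive elimination would divide by zero:

```python
import numpy as np
from scipy.linalg import lu

# a_11 = 0: without row reordering, the first elimination step
# would divide by zero
A = np.array([[0., 2.],
              [1., 3.]])

# SciPy returns matrices satisfying A = P @ L @ U;
# P swaps the rows so every pivot is nonzero
P, L, U = lu(A)
```

After the swap encoded in $P$, the pivot $u_{11}$ is nonzero and elimination proceeds safely.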

Now let's look at a special case: when $A$ is a positive definite (more on this later) and symmetric matrix, we obtain $A = U^*U$ or $A = LL^*$, i.e. the factorization involves a single factor $L$ (or, equivalently, $U^*$). Here $^*$ denotes the conjugate transpose of the matrix.

For a real symmetric positive definite $A$, the entries of $L$ can be computed column by column:

$$l_{jj} = \sqrt{a_{jj} - \sum_{k=1}^{j-1} l_{jk}^2}, \qquad l_{ij} = \frac{1}{l_{jj}}\left(a_{ij} - \sum_{k=1}^{j-1} l_{ik}\, l_{jk}\right), \quad i > j$$
This factorization is popularly known as the Cholesky decomposition and is useful in many applications that involve positive definite, symmetric matrices. Computing the inverse of a positive definite symmetric matrix requires far fewer computations, as shown in this paper.
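In NumPy, the Cholesky factor is available directly via `np.linalg.cholesky`, which returns the lower triangular $L$ with $A = LL^T$ (for real matrices). The symmetric positive definite matrix below is hypothetical:

```python
import numpy as np

# Hypothetical symmetric positive definite matrix (illustrative values)
A = np.array([[4., 2., 2.],
              [2., 5., 3.],
              [2., 3., 6.]])

# Single lower triangular factor: A = L @ L.T
L = np.linalg.cholesky(A)
```

Compared with a general LU factorization, Cholesky needs only one triangular factor and roughly half the arithmetic, and it fails with an error if $A$ is not positive definite, which doubles as a practical positive-definiteness test.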

