Matrix Transposes and Symmetric Matrices

Everything you need to know about transposing matrices and symmetric matrices

adam dhalla
7 min read · Jan 8, 2021


Matrix transposes and symmetric matrices are closely linked. In fact, the definition of a symmetric matrix is that transposing it gives back the same matrix: a symmetric matrix A’s transpose is A itself.

This is a continuation of my linear algebra series, tied with the 18.06 MIT OCW Gilbert Strang course on introductory linear algebra. Let’s jump right in with the basics on transposes.

Transposing a matrix

We transpose a two-dimensional matrix by using its rows as its columns, or, equivalently, its columns as its rows. That is all it takes to do this simple operation.
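For instance (the letters below are placeholders I’ve chosen just to track where each entry goes):

\begin{bmatrix} a & b \\ c & d \\ e & f \end{bmatrix}^T = \begin{bmatrix} a & c & e \\ b & d & f \end{bmatrix}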

More exactly, the entry in the aij position becomes the entry in the aji position. Concretely, look at how e changes from being in the (3, 1) position to being in the (1, 3) position. That’s all fine, but what about transposing a sum or product of matrices?

Transposing a sum of matrices

Transposing the sum (and, by extension, the difference) of matrices is quite easy. We can just “distribute” the transposition to both of the matrices.
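In symbols:

(A + B)^T = A^T + B^T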

This, again, makes logical sense and hardly needs a proof to be seen as true. On the left side, we add two matrices and then transpose the sum. On the right, we transpose each matrix individually and then add them. The same numbers are still being added to each other, so the end result is the same. A concrete example:
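(The numbers here are my own illustration, since the original figure isn’t reproduced.)

\left( \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} + \begin{bmatrix} 0 & 5 \\ 6 & 7 \end{bmatrix} \right)^T = \begin{bmatrix} 1 & 7 \\ 9 & 11 \end{bmatrix}^T = \begin{bmatrix} 1 & 9 \\ 7 & 11 \end{bmatrix} = \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix} + \begin{bmatrix} 0 & 6 \\ 5 & 7 \end{bmatrix}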

Both ways of adding are equivalent. We often have to simplify transposes in matrix equations, so keep note of this.

Transposing a product of matrices

This is harder to understand intuitively than the sum. Let’s see what happens when we transpose a product AB.
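The rule turns out to be:

(AB)^T = B^T A^T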

If you’re familiar with the inverse of a product, we factor it identically: we transpose each of the factors but reverse the order. It’s harder to look at this one and see why it works, so let’s go into the details of matrix multiplication to understand why this happens.

Let’s look at this same system with a concrete example.
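The entries below are my own (the original figure isn’t shown), chosen small so the bookkeeping that follows is easy to trace:

A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}, \quad B = \begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix}, \quad AB = \begin{bmatrix} 19 & 22 \\ 43 & 50 \end{bmatrix}, \quad (AB)^T = \begin{bmatrix} 19 & 43 \\ 22 & 50 \end{bmatrix}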

Let’s look at where the first row of our outcome matrix comes from. Going backwards from the end, we see that this row is actually a column before it is transposed. This column comes from the first and the second rows of A multiplied by the first column of B. We can say that the first row of our outcome is the product of the rows of A by the first column of B.

Say we transposed A and B before multiplying them. How would we find a way to preserve this same calculation (the rows of A by the first column of B)? Let’s do this and see how.

The same rows of A (now the columns of A transpose) and the first column of B (now the first row of B transpose) still need to be multiplied to get the first row of the product. What order would we have to multiply these in to get that first row? Well, it would be, of course:
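We need the first row of B transpose (the old first column of B) on the left, multiplying the columns of A transpose (the old rows of A) on the right. That is exactly the first row of B^T A^T. With the numbers above:

\begin{bmatrix} 5 & 7 \end{bmatrix} \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix} = \begin{bmatrix} 19 & 43 \end{bmatrix}

which matches the first row of (AB)^T.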

We preserve the identical multiplication, and thus get exactly the same product for the first row. Thus our rule, stated earlier, is confirmed.

Symmetric Matrices

Symmetric matrices, as stated earlier, are matrices that are identical after being transposed. Officially:
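A^T = A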

Going by our previous definition of a transpose, an item in row i and column j (aij) becomes an item in row j and column i (aji) when we transpose. Thus, in a symmetric matrix, aij must equal aji for every i and j. Looking at an example of a symmetric matrix, this is obvious.
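The entries below are my own example (the original isn’t reproduced), chosen so that a 13 appears in both the (1, 3) and (3, 1) positions:

\begin{bmatrix} 2 & 4 & 13 \\ 4 & 1 & 5 \\ 13 & 5 & 8 \end{bmatrix}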

The pairs have swapped indexes. For example, the 13 in the first row is at index (1, 3), and the 13 in the first column is at index (3, 1).

A good trick for spotting symmetric matrices is that they look mirrored along the diagonal. Symmetric matrices pop up quite a lot, actually. They’re nice and neat for a whole variety of calculations. Let’s explore a few more properties of symmetric matrices.

The Inverse of a Symmetric Matrix is also Symmetric

This property might seem a bit out of the blue at first, but we can very, very quickly prove this by slightly altering the formula for a symmetric matrix.

If A is already symmetric, so A = A(T), its inverse must be as well, because:
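A = A^T

A^{-1} = (A^T)^{-1} = (A^{-1})^T

(The last equality uses the general rule that the inverse of a transpose is the transpose of the inverse.)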

Taking the inverse of both sides (both sides, to keep the equality), we get the second line, which basically says that the transpose of the inverse is equal to the inverse itself: the inverse is symmetric. This property comes in handy often.

The Product of a Matrix and its Transpose is Symmetric

The product of any matrix (square or rectangular) and its transpose is always symmetric. In notation, that’s:
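(A^T A)^T = A^T A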

It’s easy to prove, but hard to believe until you actually do it, at which point it becomes quite obvious. Let’s do a concrete case first and then prove it with matrix notation (easily) afterwards.
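The specific matrix below is my reconstruction from the numbers discussed in the next paragraph, since the original figure isn’t shown:

A = \begin{bmatrix} 1 & 3 \\ 2 & 2 \end{bmatrix}, \quad A^T = \begin{bmatrix} 1 & 2 \\ 3 & 2 \end{bmatrix}, \quad A^T A = \begin{bmatrix} 5 & 7 \\ 7 & 13 \end{bmatrix}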

Do this multiplication by hand to really understand why we’re getting a seven in both off-diagonal corners. To put it into words: for the top 7, we’re multiplying (1, 2) with (3, 2), and for the bottom 7, we’re multiplying (3, 2) with (1, 2), which gives the same result. This idea extends to every pair of symmetric entries: one entry multiplies some row of A transpose, say (a, b, c, …), by some column of A, say (x, y, z, …), and its mirror entry multiplies the row (x, y, z, …) of A transpose by the column (a, b, c, …) of A, which gives two identical answers.

Let’s quickly prove this with the help of the ‘transpose of products’ rule we learned. Remember, the test for symmetry is to take the transpose and see if it gives back the same thing.
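(A^T A)^T = A^T (A^T)^T = A^T A

The first equality applies the product rule (each factor is transposed and the order reverses); the second uses the fact that transposing twice gives back the original matrix.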

Here, if we take the transpose of our product, we get back the same product, meaning A transpose times A is symmetric.
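If you want to check this numerically, here is a minimal NumPy sketch; the matrix entries are arbitrary, and any matrix (square or rectangular) will do:

import numpy as np

# An arbitrary rectangular matrix.
A = np.array([[1.0, 3.0],
              [2.0, 2.0],
              [4.0, 0.0]])

product = A.T @ A                        # A^T A, a 2 x 2 matrix
print(product)
print(np.allclose(product, product.T))   # True: the product equals its own transpose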

As a little caveat, we get a symmetric, but different, matrix if we swap the order of the transpose and the original. For example, with a nonsquare matrix:
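The vector below is my own choice, picked so that the scalar result is the 10 mentioned in a moment:

A = \begin{bmatrix} 1 \\ 3 \end{bmatrix}, \quad A^T A = \begin{bmatrix} 10 \end{bmatrix}, \quad A A^T = \begin{bmatrix} 1 & 3 \\ 3 & 9 \end{bmatrix}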

We see that our products are different — heck, they’re different dimensions! But, they’re both symmetrical. Our single element result on the left is still symmetric, since the transpose of the scalar 10 is just 10.

Gaussian Elimination: What if A in A = LDU is Symmetric?

If you’re not familiar with gaussian elimination, please feel free to skip this part. If you’re familiar with elimination but not A = LU or A = LDU factorization, check out my last article in my Gaussian Elimination miniseries.

Here, we’re dealing with the possibility that our coefficient matrix A is symmetric. Is there a way we can come up with the lower and upper triangular matrices L and U in A = LDU in a quicker way?

Well, if A is symmetric or A(T) = A, then
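A = LDL^T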

So finding our matrix U is even easier, and we don’t have to worry about the pesky step where we divide the pivots out of U so that we end up with a diagonal matrix D containing the pivots and a U with only 1’s along the diagonal: for a symmetric A, that unit-diagonal upper factor is simply L transpose. As an example, let’s take a 2 x 2 matrix A and factorize it into LDL(T).
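The matrix below is my own example, since the original isn’t reproduced. One elimination step, subtracting 2 times row 1 from row 2, gives the pivots 2 and 3 and the single multiplier 2:

A = \begin{bmatrix} 2 & 4 \\ 4 & 11 \end{bmatrix}, \quad L = \begin{bmatrix} 1 & 0 \\ 2 & 1 \end{bmatrix}, \quad D = \begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix}, \quad L^T = \begin{bmatrix} 1 & 2 \\ 0 & 1 \end{bmatrix}

Multiplying back, LDL^T = \begin{bmatrix} 2 & 4 \\ 4 & 11 \end{bmatrix} = A: the pivots sit in D, and the multiplier 2 appears in both L and L^T.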

So there we go — that’s all that needs to be known about transposes and symmetric matrices in a basic starting course on linear algebra.

Thanks for reading!

Adam Dhalla is a high school student out of Vancouver, British Columbia. He is fascinated with the outdoor world, and is currently learning about emerging technologies for an environmental purpose. To keep up, follow his Instagram and his LinkedIn. For more similar content, subscribe to his newsletter here.
