Implications of Symmetry
A real n by n matrix A is symmetric if, for every pair of
indices i and j between 1 and n, we have A_{i,j} = A_{j,i}.
To give you some practice with this concept, and with manipulating
matrices as abstract quantities, we offer a list of tasks. In each case,
we state a fact about a matrix A, or a formula for its entries,
and your problem is to demonstrate
that this information forces A to be symmetric.
The notation I indicates the identity matrix; x' is the
transpose of the vector x; similarly, A' is the transpose
of the matrix A. We assume all constants, vectors and matrices
in the following discussion have real entries.

A_{i,j} = i * ( n - j + 1 ) / ( n + 1 )
if i <= j, or else
j * ( n - i + 1 ) / ( n + 1 ) if j < i.
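As a quick numerical sanity check (not a substitute for a proof), the following Python sketch builds this matrix for a small n, using exact fractions to avoid rounding, and confirms that it equals its transpose. The helper names are our own:

```python
from fractions import Fraction

def build_a(n):
    """Entry (i,j), 1-based: i*(n-j+1)/(n+1) if i <= j, else j*(n-i+1)/(n+1)."""
    def entry(i, j):
        if i <= j:
            return Fraction(i * (n - j + 1), n + 1)
        return Fraction(j * (n - i + 1), n + 1)
    return [[entry(i, j) for j in range(1, n + 1)] for i in range(1, n + 1)]

def is_symmetric(a):
    n = len(a)
    return all(a[i][j] == a[j][i] for i in range(n) for j in range(n))

print(is_symmetric(build_a(5)))  # True: swapping i and j swaps the two branches
```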

A = c_{1} * x_{1} * x'_{1}
+ c_{2} * x_{2} * x'_{2}
+ ... + c_{n} * x_{n} * x'_{n}
for some real constants c_{i} and n-vectors
x_{i}.
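A small numerical illustration (not a proof): each term c * x * x' is an outer product, and we can sum a few of them for made-up constants and vectors and observe symmetry. The data below are arbitrary:

```python
def outer(x):
    """Outer product x * x' of a vector with itself, as a list of lists."""
    return [[xi * xj for xj in x] for xi in x]

def add_scaled(a, c, b):
    """Return a + c * b for square matrices a, b."""
    n = len(a)
    return [[a[i][j] + c * b[i][j] for j in range(n)] for i in range(n)]

# illustrative constants c_i and vectors x_i (any real choices work)
cs = [2.0, -1.5, 0.5]
xs = [[1.0, 0.0, 2.0], [0.5, 1.0, -1.0], [3.0, -2.0, 0.0]]

n = 3
a = [[0.0] * n for _ in range(n)]
for c, x in zip(cs, xs):
    a = add_scaled(a, c, outer(x))

print(all(a[i][j] == a[j][i] for i in range(n) for j in range(n)))  # True
```

Note that each outer product is symmetric on its own, since x_i * x_j = x_j * x_i.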

(u,v) = u' * (A*v) defines an inner product for
all nvectors u and v.

A = Q * D * inverse(Q) where D is a
diagonal matrix and Q is an orthogonal matrix.
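For a numerical illustration (not a proof), recall that an orthogonal Q satisfies inverse(Q) = Q'. The sketch below builds a 2 by 2 orthogonal matrix from the 3-4-5 triangle, picks an arbitrary diagonal D, and checks the product with exact fractions:

```python
from fractions import Fraction

# a 2x2 orthogonal matrix from the 3-4-5 triangle, so inverse(Q) = Q'
c, s = Fraction(3, 5), Fraction(4, 5)
Q = [[c, -s], [s, c]]
D = [[2, 0], [0, 5]]  # an arbitrary diagonal matrix

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(a):
    return [list(row) for row in zip(*a)]

A = matmul(matmul(Q, D), transpose(Q))  # Q * D * inverse(Q), using inverse(Q) = Q'
print(A[0][1] == A[1][0])  # True
```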

A = I - 2 * x * x' for some n-vector x.
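A quick check in Python for an arbitrary x (when x is a unit vector, this A is the familiar Householder reflector, though symmetry holds for any real x):

```python
n = 4
x = [0.5, -1.0, 2.0, 0.25]  # any real n-vector

# A = I - 2 * x * x', entry by entry
A = [[(1 if i == j else 0) - 2 * x[i] * x[j] for j in range(n)] for i in range(n)]
print(all(A[i][j] == A[j][i] for i in range(n) for j in range(n)))  # True
```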

The inverse of A exists, and is a symmetric matrix.
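As a numerical companion to this task (illustrating that inversion preserves symmetry, not proving the claim itself), here is a 2 by 2 example using the adjugate formula and exact fractions; the matrix B is arbitrary:

```python
from fractions import Fraction

def inv2(m):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = m
    det = Fraction(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

B = [[4, 1], [1, 3]]                   # a symmetric, invertible matrix
Binv = inv2(B)
print(Binv[0][1] == Binv[1][0])        # True: the inverse is symmetric too
print(inv2(Binv) == [[4, 1], [1, 3]])  # True: inverting back recovers B
```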

A_{i,j} = 1 if i+j=n+1, 0 otherwise.

A_{i,j} = | x(i) - x(j) | for some given
n-vector x.
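Reading the formula as the absolute difference | x(i) - x(j) |, a quick check in Python with an arbitrary vector:

```python
x = [3.0, -1.0, 4.0, 1.5, 5.0]  # any real n-vector
n = len(x)

A = [[abs(x[i] - x[j]) for j in range(n)] for i in range(n)]
print(all(A[i][j] == A[j][i] for i in range(n) for j in range(n)))  # True
```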

A_{i,j} = 2 * min ( i, j ) - 1.

A_{i,j} = alpha^{|i-j|}
for some real number alpha.
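Taking the exponent to be the absolute difference |i-j|, a quick numerical check for one arbitrary choice of alpha:

```python
alpha = 0.5  # an arbitrary real number
n = 5

A = [[alpha ** abs(i - j) for j in range(n)] for i in range(n)]
print(all(A[i][j] == A[j][i] for i in range(n) for j in range(n)))  # True
```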

A_{i,j} = min ( i, j ) / max ( i, j ).
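A quick check with exact fractions (the same observation, that min and max do not care about the order of their arguments, also handles the 2 * min ( i, j ) - 1 matrix above):

```python
from fractions import Fraction

n = 6
A = [[Fraction(min(i, j), max(i, j)) for j in range(1, n + 1)]
     for i in range(1, n + 1)]
print(all(A[i][j] == A[j][i] for i in range(n) for j in range(n)))  # True
```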

A_{1,1} = 1,
A_{i,j} = A_{i-1,j} + A_{i,j-1}
with "out-of-range" entries taken to be 0.
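A sanity check (not a proof): build the matrix from the recurrence for a small n and compare it with its transpose.

```python
def pascal(n):
    """A[0][0] = 1; A[i][j] = A[i-1][j] + A[i][j-1], out-of-range entries 0."""
    a = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == 0 and j == 0:
                a[i][j] = 1
            else:
                up = a[i - 1][j] if i > 0 else 0
                left = a[i][j - 1] if j > 0 else 0
                a[i][j] = up + left
    return a

a = pascal(5)
print(all(a[i][j] == a[j][i] for i in range(5) for j in range(5)))  # True
print(a[4][4])  # 70, a binomial coefficient: this is the symmetric Pascal matrix
```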

The matrix exponential e^(At) is a symmetric matrix.

A * x = A' * x for every n-vector x.
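One useful observation: applying the hypothesis to a standard basis vector e_j compares column j of A with column j of A', which is row j of A. The sketch below (with a deliberately unsymmetric example matrix of our own choosing) shows the check failing for exactly that reason:

```python
def matvec(a, x):
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in a]

def transpose(a):
    return [list(row) for row in zip(*a)]

A = [[1, 7], [2, 3]]  # deliberately unsymmetric
n = len(A)
for j in range(n):
    e = [1 if k == j else 0 for k in range(n)]
    # A * e_j is column j of A; A' * e_j is row j of A
    print(matvec(A, e) == matvec(transpose(A), e))
# prints False twice: the hypothesis fails because A_{1,2} != A_{2,1}
```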

A is the Hessian matrix of a real-valued function f(x),
where x is a real n-vector argument,
so that A_{i,j} = d^2 f(x) / dx_i dx_j.
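Assuming f is twice continuously differentiable (so that the equality of mixed partial derivatives applies), the two orders of differentiation agree. A numerical sketch, using central differences and a test function of our own invention:

```python
def f(x):
    # an arbitrary smooth test function; its mixed partial d^2 f / dx dy is 2x + 3y^2
    return x[0] ** 2 * x[1] + x[1] ** 3 * x[0]

def partial(g, i, h=1e-4):
    """Central-difference approximation to dg/dx_i."""
    def dg(x):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        return (g(xp) - g(xm)) / (2 * h)
    return dg

x0 = [1.2, 0.7]
dxy = partial(partial(f, 0), 1)(x0)  # differentiate in x_1, then in x_2
dyx = partial(partial(f, 1), 0)(x0)  # the opposite order
print(abs(dxy - dyx) < 1e-6)  # True: the two orders agree to rounding error
```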

The singular value decomposition of a matrix A is defined
to have the form A = U * S * V', where U and V are
orthogonal, and S is diagonal. For our particular matrix A,
the SVD happens to have the property that
U = V.
Last revised on 29 September 2010.