Implications of Symmetry
A real n by n matrix A is defined to be symmetric
if, for every possible pair of indices i and j between 1 and
n, it is the case that A(i,j) = A(j,i).
To give you some practice with this concept, and with manipulating
matrices as abstract quantities, we offer a list of tasks. In each case,
we state a fact about a matrix A or a formula for its entries,
and your task is to demonstrate
that this information implies that A must be symmetric.
The notation I indicates the identity matrix; x' is the
transpose of the vector x; similarly, A' is the transpose
of the matrix A. We assume all constants, vectors and matrices
in the following discussion have real entries.
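None of these tasks calls for computation, but a numerical spot-check can be
a useful companion to a proof. The following is a minimal sketch, assuming
Python with NumPy (neither of which is part of the original exercise); the
helper name is_symmetric is our own, and the matrix built here is the one
from the first task below. A few more item-specific spot-checks are collected
after the list.

  import numpy as np

  def is_symmetric(A, tol=1.0e-12):
      # Quick check that A agrees with its transpose, up to rounding.
      return np.allclose(A, A.T, atol=tol)

  # The first matrix in the list: A(i,j) = i*(n-j+1)/(n+1) if i <= j,
  # and j*(n-i+1)/(n+1) if j < i.
  n = 5
  A = np.array([[(i * (n - j + 1) if i <= j else j * (n - i + 1)) / (n + 1)
                 for j in range(1, n + 1)]
                for i in range(1, n + 1)])
  print(is_symmetric(A))   # expected: True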
-
A(i,j) = i * ( n - j + 1 ) / ( n + 1 )
if i <= j, or else
j * ( n - i + 1 ) / ( n + 1 ) if j < i.
-
A = c1 * x1 * x1'
+ c2 * x2 * x2'
+ ... + cn * xn * xn'
for some real constants c1, ..., cn and n-vectors
x1, ..., xn.
-
(u,v) = u' * (A*v) defines an inner product for
all n-vectors u and v.
-
A = Q * D * inverse(Q) where D is a
diagonal matrix and Q is an orthogonal matrix.
-
A = I - 2 * x * x' for some n-vector x.
-
The inverse of A exists, and is a symmetric matrix.
-
A(i,j) = 1 if i + j = n + 1, 0 otherwise.
-
A(i,j) = | x(i) - x(j) | for some given
n-vector x.
-
A(i,j) = 2 * min ( i, j ) - 1.
-
A(i,j) = alpha^|i-j|
for some real number alpha.
-
A(i,j) = min ( i, j ) / max ( i, j ).
-
A(1,1) = 1,
A(i,j) = A(i-1,j) + A(i,j-1),
with "out-of-range" entries taken to be 0.
-
The matrix exponential e^(At) is a symmetric matrix for every value of t.
-
A * x = A' * x for every n-vector x.
-
A is the Hessian matrix of a twice continuously differentiable
real-valued function f(x), where x is a real n-vector argument,
so that A(i,j) = d^2 f(x) / dx(i) dx(j).
-
The singular value decomposition of a matrix A is defined
to have the form A = U * S * V', where U and V are
orthogonal, and S is diagonal. For our particular matrix A,
the SVD happens to have the property that
U = V.
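As a rough companion to the tasks defined by explicit entry formulas (the
matrix with ones where i + j = n + 1, | x(i) - x(j) |, 2 * min(i,j) - 1,
alpha^|i-j|, and min(i,j)/max(i,j)), here is a numerical spot-check in the
same assumed NumPy setting as the sketch before the list; it illustrates,
but does not prove, the claims.

  import numpy as np

  n = 6
  alpha = 0.5
  x = np.random.rand(n)

  examples = {
      "i+j=n+1":      np.array([[1.0 if i + j == n + 1 else 0.0
                                 for j in range(1, n + 1)] for i in range(1, n + 1)]),
      "|x(i)-x(j)|":  np.abs(x[:, None] - x[None, :]),
      "2*min(i,j)-1": np.array([[2.0 * min(i, j) - 1.0
                                 for j in range(1, n + 1)] for i in range(1, n + 1)]),
      "alpha^|i-j|":  np.array([[alpha ** abs(i - j)
                                 for j in range(1, n + 1)] for i in range(1, n + 1)]),
      "min/max":      np.array([[min(i, j) / max(i, j)
                                 for j in range(1, n + 1)] for i in range(1, n + 1)]),
  }

  for name, A in examples.items():
      print(name, np.allclose(A, A.T))   # each line should report True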
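For the matrix defined by the recurrence A(1,1) = 1,
A(i,j) = A(i-1,j) + A(i,j-1), a direct way to spot-check symmetry is to build
the table with an extra row and column of zeros standing in for the
out-of-range entries. Again, a sketch under the same NumPy assumption, not a
proof.

  import numpy as np

  n = 6
  T = np.zeros((n + 1, n + 1))   # row 0 and column 0 hold the out-of-range zeros
  T[1, 1] = 1.0
  for i in range(1, n + 1):
      for j in range(1, n + 1):
          if (i, j) != (1, 1):
              T[i, j] = T[i - 1, j] + T[i, j - 1]
  A = T[1:, 1:]                  # the n by n matrix A itself
  print(np.allclose(A, A.T))     # expected: True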
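Finally, the constructions built from vectors and orthogonal matrices (the sum
of terms ci * xi * xi', the matrix I - 2 * x * x', and A = Q * D * inverse(Q)
with Q orthogonal) can be spot-checked with random data. The vectors and
matrices below are arbitrary choices made for illustration; they are not part
of the exercise.

  import numpy as np

  n = 5

  # A = c1 * x1 * x1' + ... + cn * xn * xn' with random constants and vectors
  c = np.random.rand(n)
  X = np.random.rand(n, n)       # column i plays the role of the vector xi
  A1 = sum(c[i] * np.outer(X[:, i], X[:, i]) for i in range(n))

  # A = I - 2 * x * x' for a random n-vector x
  x = np.random.rand(n)
  A2 = np.eye(n) - 2.0 * np.outer(x, x)

  # A = Q * D * inverse(Q) with Q orthogonal (from a QR factorization), D diagonal
  Q, _ = np.linalg.qr(np.random.rand(n, n))
  D = np.diag(np.random.rand(n))
  A3 = Q @ D @ np.linalg.inv(Q)

  for A in (A1, A2, A3):
      print(np.allclose(A, A.T))  # each should report True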
Last revised on 29 September 2010.