An elegant algebraic identity says

\[(a^2 + b^2)(c^2 + d^2) - (ac + bd)^2 = (ad - bc)^2\]
If x is the vector [a, b] and y is the vector [c, d], then this identity can be written

\[(ad - bc)^2 = \det\begin{pmatrix} x \cdot x & x \cdot y \\ x \cdot y & y \cdot y \end{pmatrix}\]
where the dot indicates the usual dot product. I posted this on Twitter the other day.
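As a quick sanity check, here is a small NumPy sketch of the identity; the particular values of a, b, c, and d are arbitrary illustrations.

```python
import numpy as np

# Check that (a^2 + b^2)(c^2 + d^2) - (ac + bd)^2 = (ad - bc)^2
# by comparing (ad - bc)^2 against the 2x2 determinant of dot products.
a, b, c, d = 3.0, 1.0, 2.0, 5.0
x = np.array([a, b])
y = np.array([c, d])

lhs = (a*d - b*c)**2
G = np.array([[x @ x, x @ y],
              [x @ y, y @ y]])   # matrix of dot products
rhs = np.linalg.det(G)

print(lhs, rhs)  # the two sides agree
```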
Gram matrix
Now suppose that x and y are vectors of any length n. The matrix on the right hand side above is called the Gram matrix of x and y, and its determinant is called the Gram determinant or Gramian.
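The same computation works for vectors of any length. A sketch with n = 5 (the vectors are random, just for illustration) also shows that the Gramian of two real vectors is nonnegative, which is the Cauchy–Schwarz inequality in disguise.

```python
import numpy as np

# Gram matrix and Gramian of two vectors of length n = 5.
rng = np.random.default_rng(42)
x = rng.standard_normal(5)
y = rng.standard_normal(5)

G = np.array([[x @ x, x @ y],
              [x @ y, y @ y]])
gramian = np.linalg.det(G)

# The Gramian (x.x)(y.y) - (x.y)^2 is nonnegative: Cauchy-Schwarz.
print(gramian > 0)
```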
Correlation
Let θ be the angle between x and y in ℝⁿ. Then

\[\cos\theta = \frac{\langle x, y \rangle}{\|x\| \, \|y\|}\]
where ⟨x, y⟩ is the dot product of x and y.
If x and y are mean-centered data vectors, then cos θ is their correlation.
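A sketch of that connection: center two small data vectors (arbitrary sample values), compute the cosine of the angle between them, and compare with NumPy's Pearson correlation.

```python
import numpy as np

# For mean-centered data vectors, cos(theta) is the Pearson correlation.
x = np.array([1.0, 2.0, 4.0, 7.0])
y = np.array([2.0, 1.0, 5.0, 9.0])

xc = x - x.mean()
yc = y - y.mean()
cos_theta = (xc @ yc) / (np.linalg.norm(xc) * np.linalg.norm(yc))

r = np.corrcoef(x, y)[0, 1]
print(cos_theta, r)  # the two values agree
```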
The Gram matrix G can be written in terms of cos θ:

\[G = \begin{pmatrix} \|x\|^2 & \|x\| \, \|y\| \cos\theta \\ \|x\| \, \|y\| \cos\theta & \|y\|^2 \end{pmatrix}\]
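One consequence of this form is that the Gramian equals ‖x‖² ‖y‖² sin² θ. Here is a numerical check, with arbitrary sample vectors:

```python
import numpy as np

# Rebuild the Gram matrix from the norms and the angle between x and y,
# and check that det G = |x|^2 |y|^2 (1 - cos^2 theta) = |x|^2 |y|^2 sin^2 theta.
x = np.array([3.0, 1.0, 2.0])
y = np.array([1.0, 4.0, 0.5])

nx, ny = np.linalg.norm(x), np.linalg.norm(y)
cos_t = (x @ y) / (nx * ny)

G = np.array([[nx**2,           nx * ny * cos_t],
              [nx * ny * cos_t, ny**2          ]])

det_G = np.linalg.det(G)
print(det_G, nx**2 * ny**2 * (1 - cos_t**2))  # equal
```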
Generalization
The idea of the Gram matrix generalizes to more than two vectors. If we have m vectors, the Gram matrix is the m × m matrix whose (i, j) entry is the dot product of the ith and jth vectors.
Note that the dimension n of the vectors does not have to equal the dimension m of the Gram matrix.
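In code, if the rows of a matrix X are the vectors, the entrywise definition above is exactly X Xᵀ. A sketch with m = 3 vectors in ℝ⁵ (random values, for illustration):

```python
import numpy as np

# Gram matrix of m vectors in R^n: with the vectors as the rows of X,
# the (i, j) entry of X @ X.T is the dot product of rows i and j,
# so the Gram matrix is m x m even though the vectors live in R^n.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 5))   # m = 3 vectors of length n = 5

G = X @ X.T
print(G.shape)  # (3, 3)
```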
If m does equal n, then we have the theorem that the square of the determinant of the matrix whose rows are the vectors x_i equals the Gram determinant. If

\[X = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}\]

then we can state this as

\[(\det X)^2 = \det G\]

If m does not equal n, then X is not square and its determinant is not defined, but the geometric content of the theorem survives: the square of the m-dimensional volume of the parallelepiped spanned by the x's inside ℝⁿ equals their Gram determinant.
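Both cases are easy to check numerically. The sketch below uses random vectors; in the rectangular case it compares √(det G) for two vectors in ℝ⁵ against the elementary area formula ‖x‖ ‖y‖ sin θ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Square case (m = n): the Gram determinant is the square of det X.
X = rng.standard_normal((4, 4))
G = X @ X.T
print(np.allclose(np.linalg.det(G), np.linalg.det(X)**2))  # True

# Rectangular case (m < n): det X is undefined, but sqrt(det G) is the
# m-dimensional volume of the parallelepiped spanned by the rows.
X = rng.standard_normal((2, 5))
G = X @ X.T
volume = np.sqrt(np.linalg.det(G))

# For two vectors that volume is |x| |y| sin(theta).
x, y = X
cos_t = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
alt = np.linalg.norm(x) * np.linalg.norm(y) * np.sqrt(1 - cos_t**2)
print(np.allclose(volume, alt))  # True
```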
The Gram determinant can be defined for more general inner products than the dot product on ℝⁿ. The inner product could be, for example, the integral of the product of two functions.
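For a concrete instance (my choice of functions, not from the post): take the functions 1 and t on [0, 1] with ⟨f, g⟩ = ∫₀¹ f(t) g(t) dt. The entries of the Gram matrix are ⟨1, 1⟩ = 1, ⟨1, t⟩ = 1/2, and ⟨t, t⟩ = 1/3, and exact rational arithmetic gives the Gramian.

```python
from fractions import Fraction

# Gram matrix of the functions 1 and t on [0, 1] under the inner product
# <f, g> = integral of f(t) g(t) dt. The integrals of 1, t, and t^2 give
# the entries exactly.
G = [[Fraction(1),    Fraction(1, 2)],
     [Fraction(1, 2), Fraction(1, 3)]]

gramian = G[0][0] * G[1][1] - G[0][1] * G[1][0]
print(gramian)  # 1/12
```

The Gramian 1/12 is positive, as it must be, since 1 and t are linearly independent functions.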
The post Gram matrix first appeared on John D. Cook.