# A determinantal formula

I once saw a similar formula, one that I wished to be true and for which I merely had a proof in mind, appear as an MO problem:

In my research, I encountered the following formula, which I believe is correct (I have checked it for $n\le3$). Is it classical?

I am given a real symmetric matrix
$S:=\int_\Omega Y(t)Y(t)^T\,d\mu(t),$
where $\mu$ is a probability measure on $\Omega$ and $Y:\Omega\rightarrow{\mathbb R}^n$.

Let $\sigma_k(S)$ denote the $k$-th elementary symmetric polynomial in the eigenvalues of $S$; for instance, $\sigma_1(S)$ is the trace and $\sigma_n(S)$ the determinant. The following formula expresses $\sigma_k(S)$ in terms of the Gram matrix $G_k(s_1,\ldots,s_k)$, whose $(i,j)$ entry is the scalar product $Y(s_i)\cdot Y(s_j)$.

$\sigma_k(S)=\frac1{k!}\int_{\Omega^k}\det G_k(s_1,\ldots,s_k)\,d\mu(s_1)\cdots d\mu(s_k).$
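When $\mu$ is supported on finitely many atoms, the integrals become finite sums and the formula can be checked exactly. Here is a small numerical sketch along those lines (my own verification, not part of the problem), with random atoms $Y(t_1),\ldots,Y(t_m)$ and random weights:

```python
import itertools
import math
import numpy as np

rng = np.random.default_rng(0)

n, m = 3, 5                          # ambient dimension, number of atoms
Y = rng.standard_normal((m, n))      # row i holds Y(t_i)
p = rng.random(m)
p /= p.sum()                         # probability weights p_i

# S = sum_i p_i Y(t_i) Y(t_i)^T  (the discrete form of the integral)
S = sum(p[i] * np.outer(Y[i], Y[i]) for i in range(m))

# sigma_k(S): k-th elementary symmetric polynomial in the eigenvalues
eig = np.linalg.eigvalsh(S)
def sigma(k):
    return sum(math.prod(c) for c in itertools.combinations(eig, k))

# right-hand side: (1/k!) * sum over k-tuples of atoms of det G_k, weighted
def rhs(k):
    total = 0.0
    for idx in itertools.product(range(m), repeat=k):
        G = Y[list(idx)] @ Y[list(idx)].T    # Gram matrix G_k(s_1,...,s_k)
        total += math.prod(p[i] for i in idx) * np.linalg.det(G)
    return total / math.factorial(k)

for k in range(1, n + 1):
    assert abs(sigma(k) - rhs(k)) < 1e-10
```

The check passes for all $k\le n$; the case $k=1$ is just $\mathrm{tr}\,S=\sum_i p_i |Y(t_i)|^2$.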

Note that $S$ is positive semi-definite. The integrand is non-negative, and so is $\sigma_k(S)$. The integrand vanishes identically if and only if $Y$ takes its values in a subspace of dimension $<k$, which is exactly the condition under which $\sigma_k(S)$ vanishes. It follows that if the formula above failed, it would fail as an inequality between strictly positive numbers.
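The degenerate case is easy to see concretely. A minimal sketch (my own, with made-up numbers): if every $Y(t)$ is a multiple of one fixed vector $v$, then $S$ has rank at most $1$, so $\sigma_2(S)=0$, and every $2\times2$ Gram determinant vanishes too.

```python
import numpy as np

v = np.array([1.0, 2.0, -1.0])
c = np.array([0.5, -1.0, 3.0])       # coefficients at three atoms
Y = np.outer(c, v)                   # Y(t_i) = c_i * v: all in a 1-dim subspace
S = (Y.T @ Y) / 3                    # uniform weights 1/3 on the atoms

eig = np.linalg.eigvalsh(S)
sigma2 = eig[0]*eig[1] + eig[0]*eig[2] + eig[1]*eig[2]
G = Y @ Y.T                          # Gram matrix of the three atoms

assert abs(sigma2) < 1e-10                    # sigma_2(S) vanishes
assert abs(np.linalg.det(G[:2, :2])) < 1e-9   # so does each 2x2 Gram minor
```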

The case $k=n$ is a consequence of the identity

$\int_{\Omega^N} \det\bigl(f_j(s_k)\bigr)_{j,k}\,\det\bigl(g_j(s_k)\bigr)_{j,k}\,\prod_{j=1}^N d\mu(s_j) = N!\,\det\left(\int_\Omega f_j(t)g_k(t)\,d\mu(t)\right)_{j,k}$

which I have seen under the names “Andréief identity” and “Gram identity”. The proof is elementary, using the Leibniz formula for the determinant.
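This identity can likewise be verified exactly for a discrete measure, where the functions $f_j$ and $g_j$ are just their values at the atoms. A small sketch (my own check, with random data):

```python
import itertools
import math
import numpy as np

rng = np.random.default_rng(1)

N, m = 3, 4                          # determinant size, number of atoms
p = rng.random(m)
p /= p.sum()                         # probability weights
F = rng.standard_normal((N, m))      # F[j, i] = f_j(t_i)
H = rng.standard_normal((N, m))      # H[j, i] = g_j(t_i)

# LHS: integrate det(f_j(s_k)) det(g_j(s_k)) over (s_1,...,s_N)
lhs = 0.0
for idx in itertools.product(range(m), repeat=N):
    w = math.prod(p[i] for i in idx)
    lhs += w * np.linalg.det(F[:, idx]) * np.linalg.det(H[:, idx])

# RHS: N! times det of the matrix of pairwise "integrals" of f_j * g_k
M = np.array([[sum(p[i] * F[j, i] * H[k, i] for i in range(m))
               for k in range(N)] for j in range(N)])
rhs = math.factorial(N) * np.linalg.det(M)

assert abs(lhs - rhs) < 1e-10
```

Note that tuples with a repeated atom contribute zero to the left-hand side, since the corresponding determinants have two equal columns; this is the same mechanism that powers the Leibniz-formula proof.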
