I have worked out Tao’s geometric proof in detail and have rewritten it so that anybody can follow what is going on.

WARNING: I go very slowly in order to be clear and explicit (sometimes pedantic). Skip the parts you feel are too slow.

NOTE1: read Tao’s post first https://terrytao.wordpress.com/2019/08/13/eigenvectors-from-eigenvalues/ and then read this extended version of the proof.

NOTE2: If you find errors please point them out (especially with signs). Do the same for anything that is unclear.

First, I think it’s much simpler to work on a small sample. I am going to work with 4 \times 4 matrices, the vector space R^4, and the geometric algebra with signature (4,0,0).

Let A be a full-rank real symmetric 4 \times 4 matrix. Assume further that A has 4 distinct real eigenvalues, denoted by \lambda_i, with 4 corresponding orthonormal eigenvectors denoted by v_i.

The eigenvalue equation is (A - \lambda_i I) v_i = 0 and the characteristic equation is \det(A - \lambda_i I) = 0. In geometric algebra terms the determinant of A appears when the outermorphism of A is applied to the pseudoscalar: a_1 \wedge a_2 \wedge a_3 \wedge a_4 = \det(A)\, e_1 \wedge e_2 \wedge e_3 \wedge e_4, where a_i are the columns of matrix A. Throughout this post I will write \|b_1 \wedge b_2 \wedge b_3 \wedge b_4\| for the (signed) coefficient of e_1 \wedge e_2 \wedge e_3 \wedge e_4, so that \det(A) = \|a_1 \wedge a_2 \wedge a_3 \wedge a_4\|. The characteristic outer product is:

(a_1 - \lambda_i e_1) \wedge (a_2 - \lambda_i e_2) \wedge (a_3 - \lambda_i e_3) \wedge (a_4 - \lambda_i e_4) = 0

The parallel of the characteristic outer product and the characteristic equation continue as follows:

\|(a_1 - \lambda_i e_1) \wedge (a_2 - \lambda_i e_2) \wedge (a_3 - \lambda_i e_3) \wedge (a_4 - \lambda_i e_4)\| = \det(A - \lambda_i I)

which will be used in Tao’s proof.
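The equality between the characteristic outer product and the characteristic polynomial is easy to sanity-check numerically. A minimal sketch, assuming numpy is available (the random seed and variable names are mine, not part of the proof): in R^4 the e_1 \wedge e_2 \wedge e_3 \wedge e_4 coefficient of a wedge of four column vectors is just the determinant of the matrix built from those columns, so the characteristic outer product must vanish at every eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random full-rank symmetric 4x4 matrix; generically its eigenvalues are distinct.
S = rng.standard_normal((4, 4))
A = (S + S.T) / 2

lam = np.linalg.eigvalsh(A)  # the four real eigenvalues, ascending

# The e1^e2^e3^e4 coefficient of (a1 - lam*e1)^(a2 - lam*e2)^(a3 - lam*e3)^(a4 - lam*e4)
# is det(A - lam*I), which must vanish whenever lam is an eigenvalue of A.
for lam_i in lam:
    wedge_coeff = np.linalg.det(A - lam_i * np.eye(4))
    assert abs(wedge_coeff) < 1e-9
```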

Since the eigenvectors form an orthonormal basis we can define (n-1)-vectors on the eigenvector basis of the form w_i = v_1 \wedge \cdots \wedge v_{i-1} \wedge v_{i+1} \wedge \cdots \wedge v_n, the outer product of all eigenvectors except v_i. In our case those are:

w_4 = v_1 \wedge v_2 \wedge v_3

w_3 = v_1 \wedge v_2 \wedge v_4

w_2 = v_1 \wedge v_3 \wedge v_4

w_1 = v_2 \wedge v_3 \wedge v_4

Applying the function (A - \lambda_4 I) to the 3-vector w_4 we get:

(A - \lambda_4 I) w_4 = (A - \lambda_4 I) v_1 \wedge (A - \lambda_4 I) v_2 \wedge (A - \lambda_4 I)v_3

(A - \lambda_4 I) w_4 = (\lambda_1 v_1 - \lambda_4 v_1) \wedge (\lambda_2 v_2 - \lambda_4 v_2) \wedge (\lambda_3 v_3 - \lambda_4 v_3)

(A - \lambda_4 I) w_4 = (\lambda_1 - \lambda_4)v_1 \wedge (\lambda_2 - \lambda_4)v_2 \wedge (\lambda_3 - \lambda_4)v_3

(A - \lambda_4 I) w_4 = \left[\prod_{k=1, k\ne4}^4 (\lambda_k - \lambda_4)\right] v_1 \wedge v_2 \wedge v_3

(A - \lambda_4 I) w_4 = \left[\prod_{k=1, k\ne4}^4 (\lambda_k - \lambda_4)\right] w_4

So w_4 is an eigenblade (by outermorphism) as expected. Now, if we apply the same function (A - \lambda_4 I) to w_1, w_2 or w_3 we get:

(A - \lambda_4 I) w_1 = \left[\prod_{k=1, k\ne1}^4 (\lambda_k - \lambda_4)\right] w_1 = 0

(A - \lambda_4 I) w_2 = \left[\prod_{k=1, k\ne2}^4 (\lambda_k - \lambda_4)\right] w_2 = 0

(A - \lambda_4 I) w_3 = \left[\prod_{k=1, k\ne3}^4 (\lambda_k - \lambda_4)\right] w_3 = 0

That is why Tao said this is “rank one”: of the four basis 3-vectors, only w_4 survives the map. So, generalizing:

(A - \lambda_i I) w_i = \left[\prod_{k=1, k\ne i}^4 (\lambda_k - \lambda_i)\right] w_i
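If you prefer matrices to 3-vectors, the induced action of (A - \lambda_4 I) on 3-vectors is encoded, up to the duality between 3-vectors and vectors and up to sign conventions, by its adjugate (transposed cofactor) matrix, and the “rank one” statement becomes concrete. A sketch assuming numpy (which has no built-in adjugate, so the helper below is hand-rolled):

```python
import numpy as np

def adjugate(B):
    """adj(B)[i, j] = (-1)**(i + j) * det(B with row j and column i deleted)."""
    n = B.shape[0]
    adj = np.empty_like(B)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(B, j, axis=0), i, axis=1)
            adj[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return adj

rng = np.random.default_rng(1)
S = rng.standard_normal((4, 4))
A = (S + S.T) / 2
lam, V = np.linalg.eigh(A)

i = 3                              # play the role of lambda_4
B = A - lam[i] * np.eye(4)

# Because B kills v_4, the adjugate collapses to rank one:
# adj(B) = [prod_{k != i} (lam_k - lam_i)] * v_i v_i^T.
prod = np.prod([lam[k] - lam[i] for k in range(4) if k != i])
assert np.allclose(adjugate(B), prod * np.outer(V[:, i], V[:, i]), atol=1e-8)
```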

The inner product (Euclidean inner product) of any of the 3-vectors w_i with itself is <w_i, w_i> = 1, by orthonormality of the eigenbasis v_i, so:

<(A - \lambda_i I)w_i, w_i> = \prod_{k=1, k\ne i}^4 (\lambda_k - \lambda_i)

We can use the following expression to agree with Tao’s definition:

<(A - \lambda_i I)w_i, w_i> = \left[\prod_{k=1, k\ne i}^4 (\lambda_k - \lambda_i)\right] |w_i \wedge v_i|^2

since |w_i \wedge v_i|^2 = 1. In fact, the above expression holds even if you replace w_i with any 3-vector w expanded on the orthonormal eigenbasis v_i: only the w_i component of w survives, on both sides.
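The inner product of two 3-blades can be computed as the determinant of the 3\times3 Gram matrix of pairwise dot products of their factors, which gives a direct numerical check of the formula above. A sketch assuming numpy; the helper name is mine:

```python
import numpy as np

def blade3_inner(U, Y):
    """<u1^u2^u3, y1^y2^y3> = det of the 3x3 Gram matrix of pairwise dot products."""
    return np.linalg.det(U.T @ Y)

rng = np.random.default_rng(2)
S = rng.standard_normal((4, 4))
A = (S + S.T) / 2
lam, V = np.linalg.eigh(A)

for i in range(4):
    others = [k for k in range(4) if k != i]
    W = V[:, others]                  # factors of w_i: all eigenvectors except v_i
    B = A - lam[i] * np.eye(4)
    lhs = blade3_inner(B @ W, W)      # <(A - lam_i I) w_i, w_i>
    rhs = np.prod([lam[k] - lam[i] for k in others])
    assert np.isclose(lhs, rhs, atol=1e-9)
```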

Now, let’s define the analogous 3-vectors on the standard basis of R^4:

x_4 = e_1 \wedge e_2 \wedge e_3

x_3 = e_1 \wedge e_2 \wedge e_4

x_2 = e_1 \wedge e_3 \wedge e_4

x_1 = e_2 \wedge e_3 \wedge e_4

And compute x_4 \wedge v_i:

x_4 \wedge v_i = x_4 \wedge (v_{i1} e_1 + v_{i2} e_2 + v_{i3} e_3 + v_{i4} e_4)

x_4 \wedge v_i = v_{i4} e_1 \wedge e_2 \wedge e_3 \wedge e_4

\|x_4 \wedge v_i\| = v_{i4}

Generalizing: \|x_j \wedge v_i\| = \pm v_{ij}. The sign depends on j (reordering the basis vectors to e_1 \wedge e_2 \wedge e_3 \wedge e_4 costs transpositions), but it is immaterial here because only the square v_{ij}^2 enters the final identity.
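This generalization can be checked numerically: wedging x_j with v_i and reading off the e_1 \wedge e_2 \wedge e_3 \wedge e_4 coefficient is a 4 \times 4 determinant. A sketch assuming numpy, comparing absolute values since the sign varies with j:

```python
import numpy as np

rng = np.random.default_rng(3)
S = rng.standard_normal((4, 4))
A = (S + S.T) / 2
lam, V = np.linalg.eigh(A)

I4 = np.eye(4)
for j in range(4):
    # x_j is the wedge of all basis vectors except e_j, so the coefficient of
    # x_j ^ v_i is the determinant of those three basis columns plus v_i.
    cols = [I4[:, k] for k in range(4) if k != j]
    for i in range(4):
        coeff = np.linalg.det(np.column_stack(cols + [V[:, i]]))
        # v_{ij} is the j-th component of eigenvector v_i, i.e. V[j, i] here.
        assert np.isclose(abs(coeff), abs(V[j, i]), atol=1e-12)
```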

Now, let’s apply the function (A - \lambda_4 I) to the 3-vector x_4:

(A - \lambda_4 I) x_4 = (A - \lambda_4 I) e_1 \wedge (A - \lambda_4 I) e_2 \wedge (A - \lambda_4 I)e_3

(A - \lambda_4 I) x_4 = (a_1 - \lambda_4 e_1) \wedge (a_2 - \lambda_4 e_2) \wedge (a_3 - \lambda_4 e_3)

Remember that a_i = a_{i1} e_1 + a_{i2} e_2 + a_{i3} e_3 + a_{i4} e_4 (the order of the indices does not matter since A is symmetric). Since each a_i contains a component in the e_4 direction, the above outer product produces terms beyond e_{123}, for some coefficients \alpha, \beta, \gamma:

(A - \lambda_4 I) x_4 = \det(M_4 - \lambda_4 I) e_{123} + \alpha e_{124} + \beta e_{134} + \gamma e_{234}

We still need to prove (and explain) that the coefficient of the first term, corresponding to the 3-vector e_{123}, is \det(M_4 - \lambda_4 I).
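Before the algebraic argument, the claim can be sanity-checked numerically by expanding the wedge coefficient by coefficient: the coefficient on e_R, for R a 3-element set of row indices, is the determinant of those three rows of the column matrix [b_1\; b_2\; b_3]. A sketch assuming numpy:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(4)
S = rng.standard_normal((4, 4))
A = (S + S.T) / 2
lam = np.linalg.eigvalsh(A)

lam4 = lam[3]
B = A - lam4 * np.eye(4)

# Expand (A - lam4 I) x_4 = b1 ^ b2 ^ b3 (b_k = k-th column of B) in the basis
# {e_123, e_124, e_134, e_234}: the coefficient on e_R is the determinant of
# rows R of the 4x3 matrix [b1 b2 b3].
coeffs = {R: np.linalg.det(B[list(R), :3]) for R in combinations(range(4), 3)}

# The e_123 coefficient equals det(M_4 - lam4 I), M_4 = A with row/col 4 deleted.
M4 = A[:3, :3]
assert np.isclose(coeffs[(0, 1, 2)], np.linalg.det(M4 - lam4 * np.eye(3)), atol=1e-9)
```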

To do that, let’s define the column vectors \hat a^4_i:

\hat a^4_i = a_{i1} e_1 + a_{i2} e_2 + a_{i3} e_3

So the vector \hat a^4_i is the column vector a_i with the e_4 coordinate suppressed.

If we define the 3-vector \hat m_4 = \hat a^4_1 \wedge \hat a^4_2 \wedge \hat a^4_3, it corresponds to the outer product of the columns of the matrix M_4, the submatrix of A obtained by deleting row 4 and column 4 (also known as a minor of A).

If we apply the function (M_4 - \lambda_4 I) to the 3-vector x_4 (viewing M_4 as acting on the span of e_1, e_2, e_3) we get:

(M_4 - \lambda_4 I) x_4 = (M_4 - \lambda_4 I) e_1 \wedge (M_4 - \lambda_4 I) e_2 \wedge (M_4 - \lambda_4 I)e_3

(M_4 - \lambda_4 I) x_4 = (\hat a^4_1 - \lambda_4 e_1) \wedge (\hat a^4_2 - \lambda_4 e_2) \wedge (\hat a^4_3 - \lambda_4 e_3)

Since (\hat a^4_1 - \lambda_4 e_1) \wedge (\hat a^4_2 - \lambda_4 e_2) \wedge (\hat a^4_3 - \lambda_4 e_3) is just the characteristic outer product of M_4 evaluated at \lambda_4 (see the discussion of the characteristic outer product above), we can simply write:

(M_4 - \lambda_4 I) x_4 = \det(M_4 - \lambda_4I) x_4

Finally, since M_4 is (generically) a full-rank real symmetric matrix, it also has an orthonormal eigenbasis, denoted \hat v^4_i, with eigenvalues \lambda_k(M_4). Consider the 3-vector \hat w^4 = \hat v^4_1 \wedge \hat v^4_2 \wedge \hat v^4_3.

(M_4 - \lambda_4 I) \hat w^4 = \left[\prod^3_{k=1} (\lambda_k(M_4) - \lambda_4)\right] \hat w^4

where \lambda_k(M_4) are the eigenvalues of M_4 (not of A). Since the space of 3-vectors of R^3 is one-dimensional and \|x_4\| = \|\hat w^4\| = 1, both x_4 and \hat w^4 are scaled by the same factor, so we can equate:

\det(M_4 - \lambda_4 I) = \prod^3_{k=1} (\lambda_k(M_4) - \lambda_4)

Generalizing:

\det(M_i - \lambda_i I) = \prod^{n-1}_{k=1} (\lambda_k(M_i) - \lambda_i)
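This determinant-product relation can be confirmed directly for every index i (a sketch assuming numpy; the seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
S = rng.standard_normal((4, 4))
A = (S + S.T) / 2
lam = np.linalg.eigvalsh(A)

for i in range(4):
    Mi = np.delete(np.delete(A, i, axis=0), i, axis=1)  # minor: drop row/col i
    mu = np.linalg.eigvalsh(Mi)                         # eigenvalues of M_i
    lhs = np.linalg.det(Mi - lam[i] * np.eye(3))
    rhs = np.prod(mu - lam[i])
    assert np.isclose(lhs, rhs, atol=1e-9)
```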

This determinant computation is the last ingredient. To finish, expand x_j on the eigenblade basis and compute <(A - \lambda_i I) x_j, x_j> in two ways. On the eigenbasis side, only the w_i component of x_j survives (every other term carries the zero factor \lambda_i - \lambda_i), giving v_{ij}^2 \prod_{k \ne i} (\lambda_k - \lambda_i), by \|x_j \wedge v_i\| = \pm v_{ij}. On the standard-basis side, the x_j coefficient of (A - \lambda_i I) x_j is \det(M_j - \lambda_i I) = \prod^{n-1}_{k=1} (\lambda_k(M_j) - \lambda_i), by the same computation as above for i = j = 4, where \lambda_k(M_j) are the eigenvalues of M_j. Equating the two sides gives Tao’s eigenvector-eigenvalue identity:

v_{ij}^2 \prod_{k=1, k \ne i}^{n} (\lambda_k - \lambda_i) = \prod^{n-1}_{k=1} (\lambda_k(M_j) - \lambda_i)

which concludes Tao’s proof.
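As a final check, Tao’s identity, |v_{ij}|^2 \prod_{k \ne i} (\lambda_k - \lambda_i) = \prod_{k} (\lambda_k(M_j) - \lambda_i), can be verified numerically for every pair (i, j). A sketch assuming numpy; note both products have n - 1 = 3 factors, so the sign arrangement used in this post is consistent with Tao’s:

```python
import numpy as np

rng = np.random.default_rng(6)
S = rng.standard_normal((4, 4))
A = (S + S.T) / 2
lam, V = np.linalg.eigh(A)

for i in range(4):                 # which eigenvalue/eigenvector of A
    for j in range(4):             # which coordinate / which minor M_j
        Mj = np.delete(np.delete(A, j, axis=0), j, axis=1)
        mu = np.linalg.eigvalsh(Mj)   # eigenvalues of M_j
        lhs = V[j, i] ** 2 * np.prod([lam[k] - lam[i] for k in range(4) if k != i])
        rhs = np.prod(mu - lam[i])
        assert np.isclose(lhs, rhs, atol=1e-8)
```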