Covariance of multivectors

I’m building a neural network that is invariant to rotations and translations. To that end I map points to null points in CGA, then use the geometric product between them, together with linear combinations of each of the grades, to build equivariant features; finally I take the scalar product between the features to obtain invariant features.

My problem now is defining a batch normalization layer such that the invariant features keep their invariance.
There are two operations that I can apply to all my points, linear combination of grades and orthogonal conformal transformations, i.e. combining those together:

x\rightarrow \sum_k \alpha_k\langle UxU^\dagger\rangle_k

where the spinor U satisfies U^\dagger U = UU^\dagger = \beta, with \beta a scalar.

What I’m now trying to find is what kind of assumptions I must impose on my covariance matrix \mathbf{C} so that whitening my multivector x results in the transformation

\mathbf{C}^{-1/2} \mathbf{x} \rightarrow \sum_k \alpha_k\langle UxU^\dagger\rangle_k
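One way to make the requirement precise (my notation, not something established above): collect the combined operation into a single matrix R acting on the coefficient vector of x. Then the sample covariance transforms by congruence,

\mathbf{x} \rightarrow R\mathbf{x}, \qquad \mathbf{C} \rightarrow R\mathbf{C}R^{\mathsf T},

and whitening is equivariant exactly when

\left(R\mathbf{C}R^{\mathsf T}\right)^{-1/2} R = R\,\mathbf{C}^{-1/2},

which holds, for instance, whenever R is orthogonal (R^{\mathsf T} = R^{-1}), since then \left(R\mathbf{C}R^{\mathsf T}\right)^{-1/2} = R\,\mathbf{C}^{-1/2}R^{\mathsf T}.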

In Augmented Quaternion Statistics they find an invertible linear mapping from \mathbb{R}^4 to \mathbb{H}^4, expressed as

q^a = Aq^r

where A is a 4x4 matrix.
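For reference (recalled from the augmented quaternion statistics literature, so worth double-checking against your source): with q = a + bi + cj + dk and the involutions q^\alpha = -\alpha q \alpha for \alpha \in \{i,j,k\}, the augmented vector and the matrix A are

q^a = \begin{pmatrix} q \\ q^i \\ q^j \\ q^k \end{pmatrix} = \begin{pmatrix} 1 & i & j & k \\ 1 & i & -j & -k \\ 1 & -i & j & -k \\ 1 & -i & -j & k \end{pmatrix} \begin{pmatrix} a \\ b \\ c \\ d \end{pmatrix},

so the entries of A are quaternion-valued, not real.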
I’m trying to do the same. I can map a multivector x\in\mathbb{G}_{p,q} to a vector using the scalar product, choosing a basis e_J = \{1,e_1,e_2,\dots,e_{12},e_{13},\dots, e_{12\dots n}\} and finding its respective reciprocal basis e^J, i.e.

y_J = x * e^J = x_J

and the inverse mapping is simply

x = \sum_J y_J e_J.

Maybe if I can find a way to express this mapping using the geometric product instead of the scalar product, I might be able to find an inverse mapping for multivectors.
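To make the coordinate map concrete, here is a from-scratch sketch (in Python, since the Julia snippets later in the thread rely on Grassmann.jl) for the tiny Euclidean algebra \mathbb{G}_{2,0}. With an orthonormal basis and Euclidean signature the reciprocal blade e^J is just the reverse of e_J, so the coefficients come out of the scalar part of a geometric product; all names here are my own:

```python
import numpy as np

# Basis order: [1, e1, e2, e12] for G(2,0), with e1*e1 = e2*e2 = +1.
# TABLE[(i, j)] = (sign, k) means basis_i * basis_j = sign * basis_k,
# worked out by hand from the anticommutation rule e1*e2 = -e2*e1 = e12.
TABLE = {
    (0, 0): (+1, 0), (0, 1): (+1, 1), (0, 2): (+1, 2), (0, 3): (+1, 3),
    (1, 0): (+1, 1), (1, 1): (+1, 0), (1, 2): (+1, 3), (1, 3): (+1, 2),
    (2, 0): (+1, 2), (2, 1): (-1, 3), (2, 2): (+1, 0), (2, 3): (-1, 1),
    (3, 0): (+1, 3), (3, 1): (-1, 2), (3, 2): (+1, 1), (3, 3): (-1, 0),
}

def gp(x, y):
    """Geometric product of two coefficient vectors."""
    out = np.zeros(4)
    for i in range(4):
        for j in range(4):
            s, k = TABLE[i, j]
            out[k] += s * x[i] * y[j]
    return out

# Reverse: a grade-g blade picks up (-1)^(g(g-1)/2); grades are [0, 1, 1, 2].
REV = np.array([1.0, 1.0, 1.0, -1.0])

E = np.eye(4)                          # the basis blades e_J as coefficient vectors

x = np.array([0.5, -2.0, 3.0, 1.25])   # an arbitrary multivector
y = np.array([gp(x, REV * E[J])[0] for J in range(4)])  # y_J = scalar part of x * ~e_J
x_back = sum(y[J] * E[J] for J in range(4))             # inverse map: x = sum_J y_J e_J
assert np.allclose(x_back, x)          # the round trip recovers x
```

In this orthonormal Euclidean case the extracted coefficients y_J coincide with the stored components of x, which is exactly why the inverse mapping is just the sum over y_J e_J.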

Quaternions are a skew field, so regular matrix algebra doesn’t directly apply, while multivectors aren’t even a skew field. So right off the bat, you can’t assume the same kind of matrix properties as for quaternions, unless you make some sort of assumption about your multivector (such as it being a quaternion, a.k.a. a spinor in R3). The key property here is the invertibility of multivectors, which is essential to a skew field and therefore a requirement for similar matrix algebra.
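A minimal numeric illustration of that failure of invertibility (my own example, sketched in Python): in \mathbb{G}_{1,0}, where e_1^2 = 1, the element 1 + e_1 is a zero divisor, so no construction relying on inverses can treat a general multivector like a quaternion.

```python
# Elements of G(1,0) as pairs (a, b) representing a + b*e1, with e1*e1 = +1.
def gp(x, y):
    a1, b1 = x
    a2, b2 = y
    # (a1 + b1 e1)(a2 + b2 e1) = (a1 a2 + b1 b2) + (a1 b2 + b1 a2) e1
    return (a1 * a2 + b1 * b2, a1 * b2 + b1 * a2)

u = (1.0, 1.0)                   # 1 + e1
v = (1.0, -1.0)                  # 1 - e1
assert gp(u, v) == (0.0, 0.0)    # two nonzero elements multiplying to zero
```

Since u and v are both nonzero yet their product vanishes, neither can have an inverse.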

Ok, so then what do you propose as the invertible mapping between a multivector and the real vector in linear algebra?

Based on my brief look at this, I don’t believe what you’re trying to do is going to work.

Let’s start by thinking about how quaternions would be represented as a sub-matrix within your multivector matrix. To start off, you’d need a way to translate from multivectors to quaternions, so let’s do that first. In Grassmann.jl this is:

julia> using Grassmann, AbstractTensors; basis"3"
(⟨×××⟩, v, v₁, v₂, v₃, v₁₂, v₁₃, v₂₃, v₁₂₃)

julia> 𝕚,𝕛,𝕜
(-1v₂₃, 1v₁₃, -1v₁₂)

julia> hyperplanes(ℝ3)
3-element Vector{Single{⟨×××⟩, 2, B, Int64} where B}:
 -1v₂₃
  1v₁₃
 -1v₁₂

julia> 𝕚*𝕛 == 𝕜, 𝕛*𝕜 == 𝕚, 𝕜*𝕚 == 𝕛, 𝕚*𝕛*𝕜 == -1
(true, true, true, true)

The generalization of this to all grades is to use the transformation \omega \rightarrow -I\omega:

julia> basis = Ref(-I).*Values(v,v1,v2,v3,v12,v13,v23,v123)
(-1v₁₂₃, -1v₂₃, 1v₁₃, -1v₁₂, 1v₃, -1v₂, 1v₁, 1v)

Now we have an actual multivector basis containing the quaternion basis, unlike what you’ve been doing in your posts (the ordering is different, but that doesn’t matter, since matrix algebra is invariant under reordering the basis). Now let’s define the multivector “involutions” required for the matrix definition, using concepts similar to those in the paper you referenced. First, we need a few constructors:

# keep the basis element b when it matches a, otherwise take its reverse ~b
fun(a,b) = a==b ? b : ~b
# compare x against every basis element, collecting the results into a Chain
create(x,b) = Chain(fun.(Ref(x),b))
# allow constructing a Single from Zero, so zero entries don't error
(Single{V,G,B,T} where {G,B})(::Zero{V}) where {V,T} = zero(T)*One{V}()

Then, evaluating this:

julia> A,B,C,D,E,F,G,H = create.(basis,Ref(basis))
8-element Values{8, Chain{⟨××××××××⟩, 1, Single{⟨×××⟩, G, B, Int64} where {G, B}, 8}} with indices SOneTo(8):
 -1v₁₂₃*v₁ + 1v₂₃*v₂ + -1v₁₃*v₃ + 1v₁₂*v₄ + 1v₃*v₅ + -1v₂*v₆ + 1v₁*v₇ + 1v*v₈
 1v₁₂₃*v₁ + -1v₂₃*v₂ + -1v₁₃*v₃ + 1v₁₂*v₄ + 1v₃*v₅ + -1v₂*v₆ + 1v₁*v₇ + 1v*v₈
   1v₁₂₃*v₁ + 1v₂₃*v₂ + 1v₁₃*v₃ + 1v₁₂*v₄ + 1v₃*v₅ + -1v₂*v₆ + 1v₁*v₇ + 1v*v₈
 1v₁₂₃*v₁ + 1v₂₃*v₂ + -1v₁₃*v₃ + -1v₁₂*v₄ + 1v₃*v₅ + -1v₂*v₆ + 1v₁*v₇ + 1v*v₈
  1v₁₂₃*v₁ + 1v₂₃*v₂ + -1v₁₃*v₃ + 1v₁₂*v₄ + 1v₃*v₅ + -1v₂*v₆ + 1v₁*v₇ + 1v*v₈
  1v₁₂₃*v₁ + 1v₂₃*v₂ + -1v₁₃*v₃ + 1v₁₂*v₄ + 1v₃*v₅ + -1v₂*v₆ + 1v₁*v₇ + 1v*v₈
  1v₁₂₃*v₁ + 1v₂₃*v₂ + -1v₁₃*v₃ + 1v₁₂*v₄ + 1v₃*v₅ + -1v₂*v₆ + 1v₁*v₇ + 1v*v₈
  1v₁₂₃*v₁ + 1v₂₃*v₂ + -1v₁₃*v₃ + 1v₁₂*v₄ + 1v₃*v₅ + -1v₂*v₆ + 1v₁*v₇ + 1v*v₈

Finally, we take the determinant of this in 8 dimensions (the matrix will be 8x8) to determine whether it can lead to an invertible matrix definition:

julia> A∧B∧C∧D∧E∧F∧G∧H
0v*v₁₂₃₄₅₆₇₈

We get a 0 determinant, which means that if we attempted to define a matrix for multivectors in a way similar to the one for quaternions you mentioned, we could not get an invertible matrix, unlike for quaternions:

julia> qbasis = Ref(-I).*Values(v1,v2,v3,v123)
4-element Values{4, Single{⟨×××⟩, G, B, Int64} where {G, B}} with indices SOneTo(4):
 -1v₂₃
  1v₁₃
 -1v₁₂
    1v

julia> W,X,Y,Z = create.(qbasis,Ref(qbasis))
4-element Values{4, Chain{⟨××××⟩, 1, Single{⟨×××⟩, G, B, Int64} where {G, B}, 4}} with indices SOneTo(4):
 -1v₂₃*v₁ + -1v₁₃*v₂ + 1v₁₂*v₃ + 1v*v₄
   1v₂₃*v₁ + 1v₁₃*v₂ + 1v₁₂*v₃ + 1v*v₄
 1v₂₃*v₁ + -1v₁₃*v₂ + -1v₁₂*v₃ + 1v*v₄
  1v₂₃*v₁ + -1v₁₃*v₂ + 1v₁₂*v₃ + 1v*v₄

julia> W∧X∧Y∧Z
-4v*v₁₂₃₄

As we can see, my calculation comes out compatible with the references: the determinant is nonzero, meaning the resulting matrix would be invertible for quaternions, but for multivectors the same calculation leads to a 0 determinant.
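The same conclusion can be cross-checked with ordinary linear algebra by reading off only the signs of the entries printed above (ignoring the blade-valued coefficients, so the magnitudes differ from the wedge results, but singular versus nonsingular is the same question). A sketch in Python:

```python
import numpy as np

# Sign patterns of the eight rows A..H printed above, in the fixed basis
# order (v123, v23, v13, v12, v3, v2, v1, v); rows E..H are identical.
M8 = np.array([
    [-1,  1, -1,  1,  1, -1,  1,  1],   # A
    [ 1, -1, -1,  1,  1, -1,  1,  1],   # B
    [ 1,  1,  1,  1,  1, -1,  1,  1],   # C
    [ 1,  1, -1, -1,  1, -1,  1,  1],   # D
    [ 1,  1, -1,  1,  1, -1,  1,  1],   # E
    [ 1,  1, -1,  1,  1, -1,  1,  1],   # F
    [ 1,  1, -1,  1,  1, -1,  1,  1],   # G
    [ 1,  1, -1,  1,  1, -1,  1,  1],   # H
], dtype=float)

# Sign patterns of the quaternion rows W..Z, in the order (v23, v13, v12, v).
M4 = np.array([
    [-1, -1,  1,  1],   # W
    [ 1,  1,  1,  1],   # X
    [ 1, -1, -1,  1],   # Y
    [ 1, -1,  1,  1],   # Z
], dtype=float)

assert abs(np.linalg.det(M8)) < 1e-9    # repeated rows E..H force singularity
assert abs(np.linalg.det(M4)) > 1e-9    # the quaternion analogue is invertible
```

The repeated rows E through H make the 8x8 case singular no matter what the blade coefficients are, while the 4x4 quaternion pattern has full rank.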

In conclusion, I have demonstrated that what you’re trying to do is not really possible. A matrix for multivectors constructed in the same way as the quaternion one you referenced cannot be invertible, at least under my interpretation. The technique works for quaternions, but does not appear to carry over to multivectors.

I have to point out that the quaternions aren’t only isomorphic to \mathbb{G}^+_{3,0} but also isomorphic to \mathbb{G}_{0,2}, which means that the mapping they defined must work for \mathbb{G}_{0,2}. So if I can define a map in \mathbb{G}_{0,2}, shouldn’t I be able to define a map in any algebra \mathbb{G}_{p,q}?

Quaternions are always spinors, so it will always work for those, but I just showed it can’t work for all geometric algebras, because I demonstrated a counterexample.

I’m thankful for your example with Grassmann.jl, but your counterexample wasn’t enough to convince me. Maybe that’s because it feels like you’re multiplying elements of the same algebra together, since there is no distinction between v_1, v_2, v_3 defined from the VGA basis and the first three elements of the basis \{v_1,v_2,v_3,v_4,v_5,v_6,v_7,v_8\}.

That statement is incorrect. The 8-dimensional exterior algebra \Lambda(\mathbb{R}^8) and its basis are not the same thing as the \Lambda(\mathbb{R}^3) basis. It’s understandable that you don’t yet follow the source of my thinking and don’t find it convincing. However, what I have demonstrated is valid, as far as trying to do something similar to the referenced paper goes. The correspondence is the following:

julia> Λ(8).b[2:9] .=> basis
8-element Vector{Pair}:
 v₁ => -1v₁₂₃
 v₂ => -1v₂₃
 v₃ => 1v₁₃
 v₄ => -1v₁₂
 v₅ => 1v₃
 v₆ => -1v₂
 v₇ => 1v₁
 v₈ => 1v

Or in reverse:

julia> [v1=>Λ(8).v7,v2=>-Λ(8).v6,v3=>Λ(8).v5]
3-element Vector{Pair{Submanifold{⟨×××⟩, 1}, TensorTerm{⟨××××××××⟩, 1}}}:
 v₁ => v₇
 v₂ => -1v₆
 v₃ => v₅

Similar to how it works for quaternions:

julia> Λ(4).b[2:5] .=> qbasis
4-element Vector{Pair}:
 v₁ => -1v₂₃
 v₂ => 1v₁₃
 v₃ => -1v₁₂
 v₄ => 1v

Also, the order of the 8-dimensional basis is not important, since you can change the order of the variables in matrix algebra without affecting properties such as invertibility.

As you can see, the relationship you are assuming is incorrect, and you have not yet grasped the reasoning behind my demonstration.