Hi Dan,

Welcome to the forum!

I happen to think that’s actually a great question to have out in the open. Plenty of introductions to Geometric Algebra take that road, and it’s easy to end up with the assumption that this decomposition into inner and outer products is a general thing.

It most certainly is not - as Leo pointed out, it is in fact only generally true for vectors. In that sense it is probably not doing GA any favors, since one of its core ideas is to treat elements of arbitrary grade (i.e. dimension) on an equal footing.

I personally prefer the so-called ‘contraction axiom’ approach to understanding the geometric product (and the ‘lazy’ approach to defining the inner and outer products). In a nutshell:

- start from \mathbb R
- add some orthogonal basis vectors \mathbf e_i, called generators
- the \mathbf e_i satisfy the contraction axiom: {\mathbf e_i}^2 \in \mathbb R, usually normalized so that {\mathbf e_i}^2 \in \{-1, 0, +1\}

Think of it as just extending the reals with some new *numbers* that are not themselves in \mathbb R, but whose squares are:

\mathbf e_i \notin \mathbb R, {\mathbf e_i}^2 \in \mathbb R

As in linear algebra, arbitrary vectors are written as linear combinations of these \mathbf e_i:

\vec x = x_1\mathbf e_1 + x_2\mathbf e_2

Now, since you could’ve picked any vector as a basis vector, the contraction axiom must also hold for a general vector \vec x (2D example because I’m lazy):

\begin {aligned}\color{green}{\vec x}^2\color{black} &= (x_1\mathbf e_1 + x_2\mathbf e_2) (x_1\mathbf e_1 + x_2\mathbf e_2) \\ &= \color{green} x_1^2\mathbf e_1^2 \color{black}+ x_1\mathbf e_1x_2\mathbf e_2 + x_2\mathbf e_2 x_1\mathbf e_1 + \color{green} x_2^2\mathbf e_2^2\end{aligned}

Now, since everything in green is by definition \in \mathbb R, the black part of that last equality needs to be zero for the contraction axiom to hold. **Anti-commutativity** of different basis vectors follows:

\begin{aligned}
x_1x_2\mathbf e_1\mathbf e_2 + x_2x_1\mathbf e_2\mathbf e_1 &= 0 \\
x_1x_2\mathbf e_1\mathbf e_2&= - x_2x_1\mathbf e_2\mathbf e_1\\
\mathbf e_1\mathbf e_2 &= -\mathbf e_2\mathbf e_1
\end{aligned}

Now, the entire geometric product can be seen as applying these two rules:

\text{Same basis vectors \bf{contract}}\\
\boxed{\mathbf e_i^2 \in \{-1,0,+1\}}

\text{Different basis vectors \bf{anticommute}}\\
\boxed{\mathbf e_i\mathbf e_j= -\mathbf e_j\mathbf e_i}
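These two rules are enough to compute any geometric product mechanically. Here’s a minimal Python sketch (the names `blade_product` and `gp` are my own, not from any GA library, and a purely Euclidean signature with every basis vector squaring to +1 is assumed):

```python
# A blade is a tuple of sorted generator indices:
# () = scalar, (1,) = e1, (1, 2) = e12, ...
# Euclidean signature assumed: every e_i^2 = +1.

def blade_product(a, b):
    """Multiply two basis blades using only the two rules above:
    same generators contract, different generators anticommute."""
    seq, sign = list(a) + list(b), 1
    changed = True
    while changed:
        changed = False
        for k in range(len(seq) - 1):
            if seq[k] == seq[k + 1]:          # same basis vectors contract
                del seq[k:k + 2]
                changed = True
                break
            if seq[k] > seq[k + 1]:           # different basis vectors anticommute
                seq[k], seq[k + 1] = seq[k + 1], seq[k]
                sign, changed = -sign, True
    return sign, tuple(seq)

def gp(x, y):
    """Geometric product of multivectors stored as {blade: coefficient}."""
    out = {}
    for ba, ca in x.items():
        for bb, cb in y.items():
            s, blade = blade_product(ba, bb)
            out[blade] = out.get(blade, 0) + s * ca * cb
    return {b: c for b, c in out.items() if c != 0}

e1, e2 = {(1,): 1}, {(2,): 1}
print(gp(e1, e1))   # {(): 1}        contraction
print(gp(e2, e1))   # {(1, 2): -1}   anticommutation
```

For example, `gp({(1,): 1, (2,): 2}, {(1,): 3, (2,): 4})` (the product of \vec x = \mathbf e_1 + 2\mathbf e_2 and \vec y = 3\mathbf e_1 + 4\mathbf e_2) gives `{(): 11, (1, 2): -2}`: a scalar part plus a bivector part.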

For two vectors it now becomes easy to see what happens: both are linear combinations of the \mathbf e_i, so each term in their product will either involve \mathbf e_i\mathbf e_i, and contract into a scalar (part of the inner product), or involve \mathbf e_i\mathbf e_j with i \neq j, producing bivector components in the result (part of the outer product).

\begin{aligned} \vec x \vec y &= \langle \vec x \vec y \rangle_0 + \langle \vec x \vec y \rangle_2\\ &= \vec x \cdot \vec y + \vec x \wedge \vec y\end{aligned}

For two bivectors b and c, proceeding similarly, you’ll find that their product contains scalar (both generators contract, e.g. \mathbf e_{12}\mathbf e_{12}), bivector (one generator contracts, e.g. \mathbf e_{12}\mathbf e_{23}), and quadvector (no generators contract, e.g. \mathbf e_{12}\mathbf e_{34}) parts:

\begin {aligned} bc &= \langle bc \rangle_0 + \langle bc \rangle_2 + \langle bc \rangle_4\\ &= b \cdot c + b \times c + b \wedge c \end {aligned}
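The same little sketch shows all three parts directly, working in 4D so a quadvector can appear (again, `blade_product` and `gp` are my own hypothetical helper names, Euclidean signature assumed; they are repeated here so the snippet stands alone):

```python
def blade_product(a, b):
    # blades = tuples of sorted generator indices; Euclidean: e_i^2 = +1
    seq, sign = list(a) + list(b), 1
    changed = True
    while changed:
        changed = False
        for k in range(len(seq) - 1):
            if seq[k] == seq[k + 1]:          # contract
                del seq[k:k + 2]
                changed = True
                break
            if seq[k] > seq[k + 1]:           # anticommute
                seq[k], seq[k + 1] = seq[k + 1], seq[k]
                sign, changed = -sign, True
    return sign, tuple(seq)

def gp(x, y):
    # geometric product of {blade: coefficient} multivectors
    out = {}
    for ba, ca in x.items():
        for bb, cb in y.items():
            s, blade = blade_product(ba, bb)
            out[blade] = out.get(blade, 0) + s * ca * cb
    return {b: c for b, c in out.items() if c != 0}

b = {(1, 2): 1}                              # e12
c = {(1, 2): 1, (2, 3): 1, (3, 4): 1}        # e12 + e23 + e34
print(gp(b, c))   # {(): -1, (1, 3): 1, (1, 2, 3, 4): 1}
```

The result has a scalar part (from \mathbf e_{12}\mathbf e_{12} = -1), a bivector part (\mathbf e_{12}\mathbf e_{23} = \mathbf e_{13}), and a quadvector part (\mathbf e_{12}\mathbf e_{34} = \mathbf e_{1234}).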

The grade 2 part here is called the **commutator** (as Leo mentioned above), and is more generally defined as a \times b = \frac 1 2 (ab - ba).

In the case of the product between bivectors, you should notice that the result contains only elements of even grade. This makes the elements of even grade closed under the geometric product, and they form what is commonly called **the even subalgebra** (written e.g. \mathbb R^+_{2,0} in 2D).

The *inner* and *outer* products can now be defined using the **lazy** approach … they’re really just the *geometric product*, but you’re too lazy to calculate all of it. Using s for the grade of a and t for the grade of b (both of a single grade):

\boxed{a \cdot b = \langle ab \rangle_{|s-t|}}

\boxed{a \wedge b = \langle ab \rangle_{s+t}}

(where the \langle x \rangle_y notation denotes the grade-y part of x)
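In code, this lazy approach really is just a grade projection after the full geometric product. A sketch under the same assumptions as before (hypothetical helper names, Euclidean signature, homogeneous single-grade arguments):

```python
def blade_product(a, b):
    # blades = tuples of sorted generator indices; Euclidean: e_i^2 = +1
    seq, sign = list(a) + list(b), 1
    changed = True
    while changed:
        changed = False
        for k in range(len(seq) - 1):
            if seq[k] == seq[k + 1]:          # contract
                del seq[k:k + 2]
                changed = True
                break
            if seq[k] > seq[k + 1]:           # anticommute
                seq[k], seq[k + 1] = seq[k + 1], seq[k]
                sign, changed = -sign, True
    return sign, tuple(seq)

def gp(x, y):
    # geometric product of {blade: coefficient} multivectors
    out = {}
    for ba, ca in x.items():
        for bb, cb in y.items():
            s, blade = blade_product(ba, bb)
            out[blade] = out.get(blade, 0) + s * ca * cb
    return {b: c for b, c in out.items() if c != 0}

def grade(x, k):
    """Grade projection <x>_k."""
    return {b: c for b, c in x.items() if len(b) == k}

def grade_of(x):
    """Grade of a homogeneous (single-grade) multivector."""
    return len(next(iter(x)))

def inner(a, b):   # <ab>_{|s-t|}
    return grade(gp(a, b), abs(grade_of(a) - grade_of(b)))

def outer(a, b):   # <ab>_{s+t}
    return grade(gp(a, b), grade_of(a) + grade_of(b))

x = {(1,): 1, (2,): 2}     # e1 + 2 e2
y = {(1,): 3, (2,): 4}     # 3 e1 + 4 e2
print(inner(x, y))   # {(): 11}       = x . y
print(outer(x, y))   # {(1, 2): -2}   = x ^ y
```

Note that `inner` and `outer` never compute anything the geometric product didn’t already contain; they just keep one grade and throw the rest away.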

From this perspective, there’s really no special product, and even the geometric product is reduced to the ordinary product, with its special properties transferred to special properties of the \mathbf e_i generators themselves.

There are two other products that are also worth your attention. One is called the **regressive** product \vee, defined as a \vee b = (a^* \wedge b^*)^*, and the other is called the **sandwich** product, which does not have its own symbol but takes its name from its form: x y \tilde x, as one of the arguments sandwiches the other.

The wedge \wedge and vee \vee products are your partners for **incidence** relations, and can be used to **join** and **meet** geometric elements. (which one is which depends on your choice of geometry, hyper-plane based (e.g. PGA) or point based (e.g. CGA))

The sandwich product of a 1-vector \mathbf x_a with a general element \mathbf Y will turn out to encode a reflection :

\mathbf x_a \mathbf Y \mathbf x_a

And the combination of two reflections forms a rotation :

\mathbf x_b\mathbf x_a \mathbf Y \mathbf x_a \mathbf x_b = \mathbf x_{ba} \mathbf Y \tilde{\mathbf x_{ba}}
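A quick numeric check of both statements in 2D, using the same minimal sketch as above (my own helper names, Euclidean signature): reflecting the vector (1, 1) in \mathbf e_1 flips its second component, and a second reflection in \mathbf e_2 turns the pair into a rotation by 180°.

```python
def blade_product(a, b):
    # blades = tuples of sorted generator indices; Euclidean: e_i^2 = +1
    seq, sign = list(a) + list(b), 1
    changed = True
    while changed:
        changed = False
        for k in range(len(seq) - 1):
            if seq[k] == seq[k + 1]:          # contract
                del seq[k:k + 2]
                changed = True
                break
            if seq[k] > seq[k + 1]:           # anticommute
                seq[k], seq[k + 1] = seq[k + 1], seq[k]
                sign, changed = -sign, True
    return sign, tuple(seq)

def gp(x, y):
    # geometric product of {blade: coefficient} multivectors
    out = {}
    for ba, ca in x.items():
        for bb, cb in y.items():
            s, blade = blade_product(ba, bb)
            out[blade] = out.get(blade, 0) + s * ca * cb
    return {b: c for b, c in out.items() if c != 0}

a = {(1,): 1}               # unit vector e1
b = {(2,): 1}               # unit vector e2
y = {(1,): 1, (2,): 1}      # the vector (1, 1)

refl = gp(gp(a, y), a)      # a y a : reflect y in the line along a
print(refl)                 # {(1,): 1, (2,): -1}   i.e. (1, -1)

rot = gp(gp(b, refl), b)    # second reflection: b (a y a) b
print(rot)                  # {(1,): -1, (2,): -1}  i.e. (-1, -1)
```

Since \mathbf e_1 and \mathbf e_2 are 90° apart, the two reflections compose into a rotation by twice that angle: (1, 1) ends up at (-1, -1).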

Where the tilde denotes reversal. This explains why the sandwich product also shows up for quaternions (which are nothing more than the product of two vectors in 3D, just as complex numbers are the product of two vectors in 2D; both are simply a **number** and a **plane**, and writing that plane takes just one number in 2D and three in 3D …). By projectivizing (as in PGA), everything stays the same, but the planes you reflect in are no longer attached to the origin, so you can use parallel planes to represent translations (arriving at the dual quaternions). By conformalizing (as in CGA), you can reflect in spheres, greatly extending the transformations and elements you can represent, while still building on the exact same ideas …
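To make the quaternion connection concrete, here is the same sketch in 3D (same hypothetical helpers, Euclidean signature): the product of two vectors, used as a sandwich with the reversal, rotates in the plane of those two vectors and leaves the perpendicular axis untouched, exactly like a unit quaternion.

```python
def blade_product(a, b):
    # blades = tuples of sorted generator indices; Euclidean: e_i^2 = +1
    seq, sign = list(a) + list(b), 1
    changed = True
    while changed:
        changed = False
        for k in range(len(seq) - 1):
            if seq[k] == seq[k + 1]:          # contract
                del seq[k:k + 2]
                changed = True
                break
            if seq[k] > seq[k + 1]:           # anticommute
                seq[k], seq[k + 1] = seq[k + 1], seq[k]
                sign, changed = -sign, True
    return sign, tuple(seq)

def gp(x, y):
    # geometric product of {blade: coefficient} multivectors
    out = {}
    for ba, ca in x.items():
        for bb, cb in y.items():
            s, blade = blade_product(ba, bb)
            out[blade] = out.get(blade, 0) + s * ca * cb
    return {b: c for b, c in out.items() if c != 0}

def reverse(x):
    # reversal: flip the generator order in each blade, sign (-1)^(k(k-1)/2)
    return {b: c * (-1) ** (len(b) * (len(b) - 1) // 2) for b, c in x.items()}

e1, e2, e3 = {(1,): 1}, {(2,): 1}, {(3,): 1}
R = gp(e2, e1)      # rotor = product of two 3D vectors; here the vectors are
print(R)            # {(1, 2): -1}  perpendicular, so only the plane part survives

def rotate(v):
    return gp(gp(R, v), reverse(R))   # R v R~

print(rotate(e1))   # {(1,): -1}  rotated by 180 deg in the e12 plane
print(rotate(e3))   # {(3,): 1}   the perpendicular axis is untouched
```

Here \mathbf e_1 and \mathbf e_2 are 90° apart, so the rotor rotates by 180° in their plane, just as a quaternion built from two such directions would.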

Hope that helps you and other readers,

Steven.