I understand that there are methods to solve non-linear optimization problems, such as “Geometric Algebra Levenberg-Marquardt”. But this still uses a vector-valued function f(x) where the elements x_k are just parameterizations. For example, a bivector in G_3 is parameterized as B(b) = b_1e_{12}+b_2e_{13}+b_3e_{23}, and then different differentiation tools are used to take the derivative with respect to each element of b. This way you can use the traditional optimization techniques that rely on coordinates.
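To make the coordinate-based approach concrete, here is a minimal numpy sketch of the kind of thing I mean (the cost function and target coefficients are made up for illustration): the bivector is stored as its coefficient vector b on the basis (e_{12}, e_{13}, e_{23}), and every derivative is taken per coordinate b_k.

```python
import numpy as np

# Made-up target coefficients of a bivector on the basis (e12, e13, e23).
b_target = np.array([1.0, -2.0, 0.5])

def f(b):
    # Example cost: squared norm of B(b) - B(b_target), in coordinates.
    return np.sum((b - b_target) ** 2)

def grad(b, h=1e-6):
    # Central finite differences, one coordinate b_k at a time.
    g = np.empty_like(b)
    for k in range(b.size):
        e = np.zeros_like(b)
        e[k] = h
        g[k] = (f(b + e) - f(b - e)) / (2 * h)
    return g

b = np.zeros(3)
for _ in range(200):          # plain gradient descent on the coordinates
    b -= 0.1 * grad(b)
```

Everything here happens at the level of the coefficient array b, which is exactly the coordinate dependence I would like to avoid.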

What I was trying to achieve was writing an algorithm that only uses the standard derivatives of geometric calculus. If, for example, I want to minimize a multivector-valued function F(X) of a multivector variable X, I would take its derivative and set the derivative to 0:

G(X)\equiv\partial_XF(X) = 0

The next step would be to use the Taylor series expansion

G(X+A) = G(X) + \sum_{k=1}^{\infty}(A*\partial_X)^kG(X)/k!

to obtain a first order approximation for G(X):

G(X+A) \approx G(X) + A*\partial_XG(X).

This way the second term is a multilinear function of A, and since it is multilinear, the equation

A*\partial_XG(X) = -G(X)

should be easier to solve for A than G(X) = 0 is to solve for X.
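The linearized step can at least be tried numerically. A minimal sketch, with the multivector flattened to a coefficient array purely for the numerics (the toy G and the finite-difference probing are my own illustration, not a coordinate-free solver): each column of J is the linear map A ↦ A*∂_X G(X) probed along one basis direction, and the update solves the linearized equation above for the step A.

```python
import numpy as np

def newton_step_solver(G, X, tol=1e-10, h=1e-6, max_iter=50):
    """Iterate X <- X + A, where A solves the linearized G(X + A) ~ 0."""
    X = X.astype(float).copy()
    for _ in range(max_iter):
        g = G(X)
        if np.linalg.norm(g) < tol:
            break
        # Probe the linear map A -> A * d_X G(X) one basis direction at a
        # time by finite differences: column k ~ (G(X + h e_k) - G(X)) / h.
        J = np.empty((g.size, X.size))
        for k in range(X.size):
            e = np.zeros(X.size)
            e[k] = h
            J[:, k] = (G(X + e) - g) / h
        # First-order condition: solve  A * d_X G(X) = -G(X)  for the step A.
        X += np.linalg.solve(J, -g)
    return X

# Toy G: componentwise X^2 - c, whose roots are the componentwise sqrt of c.
G = lambda X: X**2 - np.array([4.0, 9.0])
root = newton_step_solver(G, np.array([1.0, 1.0]))
```

This is of course still coordinates under the hood; the question is precisely how to invert the multilinear map A ↦ A*∂_X G(X) without assembling a matrix like J.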

But I’ve been looking for ways to solve this multilinear form equation for A in general.

I know that an outermorphism is the simplest multilinear function that actually has a nice inverse.

Could someone point me in the right direction as to where I should look to be able to solve this kind of multilinear equation in geometric/Clifford algebra?

It’s unlikely that this will be answered on this site as it’s focused more on applied aspects.

It’s a shame though, because GA is in my opinion most significant in pure math as it allows for clear understanding of abstract structures without the prism of random objects like coordinates.

But most math people are formalists and think GA is just pure notation, so they are not that interested.

So it’s hard to find someone who understands GA and is experienced enough in math to expand on the mathematical foundations of GA.

Then where or to whom do you think I should ask this question?

That’s also a good question, I wish I knew the answer to that also.

On the first one, maybe the Jacobian could tell you something more about a function; you can play with adjoints and induced differential transformations and see what that reveals.

From a practical (i.e., numerical) point of view, you can always formulate the Gauss-Newton optimization to find the “best” multivector X by switching temporarily to tensors, as shown by Perwass and Sommer in the paper entitled “Parameter Estimation from Uncertain Data in Geometric Algebra” and also in their “Numerical Evaluation of Versors with Clifford Algebra” paper.

The vector derivative w.r.t. X can be found with ease in tensor notation, no matter how involved the function of X is, i.e., you don’t need to find the adjoint of anything, and geometric constraints can be added as Lagrange multipliers (also in tensor notation). Then you can “throw” those Jacobians and gradients at a black-box non-linear optimization algorithm such as LM and get the best approximating tensor, which can be cast back to a Clifford algebra multivector.

So, provided the optimization problem is well posed, you formulate it in Geometric Algebra, then you switch to tensor algebra temporarily to get the Jacobian matrix and perform Gauss-Newton or whatever other optimization algorithm, then you cast the resulting tensor back to a GA multivector. That is a very generic framework that “just works”. Of course, the inconvenience is that it is slow (Euclidean geometric algebra G^{n,0,0} needs tensors of size n^3) and naive tensor usage doesn’t scale beyond toy problems or proofs of concept. You would need to exploit the particular function’s structure or symmetries in order to optimize your numerical algorithms. You also need to master (multi)linear algebra to use the classic matrix optimization algorithms effectively (there is no real “black box” algorithm outside of convex optimization). More importantly, you lose geometric intuition in all of that process.
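To make the “switch to tensors” step concrete, here is a minimal sketch in the 4-dimensional algebra G(2,0) (the basis ordering [1, e1, e2, e12] and the helper names are my own convention, not from the papers): the geometric product is encoded as a rank-3 tensor T with (xy)^k = T[k,i,j] x^i y^j, and the Jacobian of the linear function X ↦ Xy with respect to the coefficients of X falls out of a single contraction.

```python
import numpy as np

def g2_product_tensor():
    """Rank-3 tensor of the geometric product of G(2,0), basis [1, e1, e2, e12]."""
    T = np.zeros((4, 4, 4))
    # Multiplication table: (i, j) -> (k, sign) means e_i e_j = sign * e_k.
    table = {
        (0, 0): (0, 1), (0, 1): (1, 1), (0, 2): (2, 1), (0, 3): (3, 1),
        (1, 0): (1, 1), (1, 1): (0, 1), (1, 2): (3, 1), (1, 3): (2, 1),
        (2, 0): (2, 1), (2, 1): (3, -1), (2, 2): (0, 1), (2, 3): (1, -1),
        (3, 0): (3, 1), (3, 1): (2, -1), (3, 2): (1, 1), (3, 3): (0, -1),
    }
    for (i, j), (k, s) in table.items():
        T[k, i, j] = s
    return T

T = g2_product_tensor()
x = np.array([1.0, 2.0, 3.0, 4.0])   # 1 + 2 e1 + 3 e2 + 4 e12
y = np.array([0.0, 1.0, 0.0, 0.0])   # e1
xy = np.einsum('kij,i,j->k', T, x, y)          # geometric product x y
J = np.einsum('kij,j->ki', T, y)               # Jacobian of X -> X y
```

Here `J @ x` equals `xy`, which is the sense in which the Jacobian of any product expression can be read off mechanically from the multiplication tensor.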

My 50 cents. Hope it helps.

I think the OP really wanted to emphasize a coordinate-free approach here.

But I saw an interesting point: tensor notation and manipulation, whether in abstract-index (“coordinate-free”) form or not, is intrinsically a “graph”, and thus a better way of handling complex transformations, unlike the bracket-heavy notation used in GA, which is more akin to language sentences and can get messy with high-order tensors. This can apparently be proven rigorously, and I think it can clearly be seen in general relativity.

But then again, the differential can push functions and fields from the base to the tangent bundle, which can in theory be quite revealing, but this still needs a lot of research.

Though this is a nice comment, as I am curious about the synergy between TC and GA.

I think the best way to develop “nice” algorithms is to first study linear multivector-valued functions of multivectors. But I wasn’t able to find any kind of documentation on that topic, either in geometric algebra or in Clifford algebra. I’ve tried Hestenes’ book “Clifford Algebra to Geometric Calculus”, which has some things about multilinear functions, but it doesn’t help to develop some sort of general theory for solving multilinear equations. What I’m basically looking for is a way to determine solutions to multilinear equations without the use of coordinates.

For example, for an outermorphism f(x) = \underbar f(x) there is a nice inverse without the use of coordinates, \underbar f^{-1}(A)={\bar f(AI)I^{-1}}/\text{det}(f), which doesn’t involve any coordinates. So what I wanted is to develop some sort of general theory to solve the harder, more general case, where F(A) is a multivector-valued function of a multivector A.
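That inverse formula can be sanity-checked numerically. A sketch in G_2, where everything reduces to 2×2 matrices and I = e1 e2: right-multiplication by I acts on vector coordinates as the matrix R below, the adjoint \bar f is the matrix transpose, and chaining \bar f(aI) I^{-1} / det(f) reproduces f^{-1}(a) (the test matrix F is arbitrary).

```python
import numpy as np

# An arbitrary invertible linear map f on R^2, as a matrix (det = 5).
F = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# In G_2 with I = e1 e2: e1 I = e2 and e2 I = -e1, so the map a -> a I
# acts on coordinates as R, and a -> a I^{-1} as R^{-1}.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# f^{-1}(a) = \bar f(a I) I^{-1} / det(f), read right to left:
# first multiply by I (matrix R), then apply the adjoint \bar f (F.T),
# then multiply by I^{-1} (matrix R^{-1}), then divide by det(f).
f_inv = np.linalg.inv(R) @ F.T @ R / np.linalg.det(F)

print(np.allclose(f_inv, np.linalg.inv(F)))   # True
```

In matrix language this is just the adjugate formula inv(F) = adj(F)/det(F); the point of the GA version is that it is stated without ever picking a basis.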

The problem is that mathematicians are usually formalists, and as David’s father said to him, very few understand the difference between a math concept and its symbolic representation.

Such people are rare, and usually the best of the best, the likes of Gödel, Hermite, Clifford, Grassmann…

But this way of thinking is fundamentally different from the standard one, and that is the basic reason why math people don’t care about GA or about being completely coordinate-free, even though they like to say they are.

There are many questions like this one, and there should be a systematic approach to math research based on the theoretical scientific method, where one studies the evolution of a given set of axioms objectively and rigorously. That means working directly from first principles, without adding any arbitrary structure like a basis or coordinates, and just following what the axioms themselves predict, if they even have enough predictive power.

I wrote about this in one topic, and as it turns out, GA shows it is indeed possible to have a theory powerful beyond belief that includes geometry. It is the answer to fundamental problems of linear algebra and differential geometry, but it requires collaboration across many math areas, in a “scientific” way.

This is something that David Hestenes set out to initiate, and was somewhat successful but still far from completely.


I’ve been studying calculus on manifolds, mostly using the book CAGC by Hestenes, and I’ve got some insights related to this topic that I’d like to share.

So I’ve discovered that you can somehow embed a multivector manifold in a vector manifold. If I want to do calculus on a multivector manifold, I just use a 1-vector representation of that manifold (the algebra might need to be extended to do so). And that is actually the approach Hestenes wants us to take.

So instead of trying to develop calculus on multivector manifolds, I’ve just been studying geometric calculus on vector manifolds. This has enabled me to start thinking about the fundamental theorem of calculus and its special cases. I’ve also been trying to link monogenic (complex-analytic) functions to cost functions with associated constraints.

So now what I want to achieve is to link calculus on manifolds with constrained optimization theory, so that I might find some new insights into constrained optimization and new algorithms to solve these optimization problems.
