When talking about differential equations, one distinguishes between ordinary and partial DEs (one or several independent variables). The theory also lives in its own right in the discrete case, where it is sometimes treated in the context of recursions.
Mathematical Physics is mostly differential equations (see Chapter 17, Mathematical Physics), so we can expect quite a lot of significant applications.
What _is_ a vector field? Arrows all over the place, each with a length and a direction? More precisely, a vector field associates a vector to every point in space.
You know, those holonomy groups actually act as operators on functions that take their values on the directed edges. You can add such functions, and under this addition they form a group. You can also multiply the functions by scalars, and one can multiply functions pointwise. Think, for example, of values in GF(2) with XOR as the addition. It seems this is called a finite function field, another realisation (besides Galois fields) of finite fields. That is our current opinion.
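To make this concrete, here is a minimal sketch in Python (our own illustration; the edge names and the particular values are made up for the example): functions on a handful of directed edges with values in GF(2), where addition is XOR, together with scalar and pointwise multiplication.

```python
# A minimal sketch: GF(2)-valued functions on a small set of directed edges.
# Addition is XOR, which makes these functions an abelian group; scalar
# multiplication by 0/1 and pointwise multiplication are also available.

edges = [("e1", "a", "b"), ("e2", "b", "c"), ("e3", "c", "a")]  # hypothetical directed edges

def f(edge):           # one GF(2)-valued function on the edges
    return {"e1": 1, "e2": 0, "e3": 1}[edge[0]]

def g(edge):           # another one
    return {"e1": 1, "e2": 1, "e3": 0}[edge[0]]

def add(u, v):         # addition of functions = pointwise XOR
    return lambda e: u(e) ^ v(e)

def scale(c, u):       # scalar multiplication by c in {0, 1}
    return lambda e: (c * u(e)) % 2

def mul(u, v):         # pointwise multiplication
    return lambda e: u(e) & v(e)

print([add(f, g)(e) for e in edges])   # [0, 1, 1]
print([mul(f, g)(e) for e in edges])   # [1, 0, 0]
print([scale(1, f)(e) for e in edges]) # [1, 0, 1]
print([add(f, f)(e) for e in edges])   # [0, 0, 0] -> every function is its own inverse
```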
Now let's say you use the rational numbers as values instead.
We might ask how such functions can be developed explicitly, like Taylor series, with coefficients taken from some number ring, and we could set up functional equations defining exponential, logarithm and so on. There should be algebraic and transcendental functions (just as Euler's e is a transcendental number). And when we have an operator acting on a product or composition of functions, is there something like a Leibniz rule and a chain rule analogous to standard analysis? And can our operators be brought into Jordan normal form?
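As a tiny illustration of the Leibniz-rule question (our own sketch, with arbitrarily chosen rational-valued sequences): the forward difference operator satisfies a shifted product rule, the discrete analogue of the Leibniz rule of standard analysis.

```python
# The forward difference operator D f(n) = f(n+1) - f(n), acting on sequences
# with rational values, satisfies the "shifted" Leibniz rule
#   D(f*g)(n) = f(n+1) * D g(n) + D f(n) * g(n).
# We check it numerically on a few points with exact rational arithmetic.

from fractions import Fraction

def D(f):
    return lambda n: f(n + 1) - f(n)

f = lambda n: Fraction(1, n + 1)    # an arbitrary rational-valued sequence
g = lambda n: Fraction(n * n, 3)    # another one
prod = lambda n: f(n) * g(n)

for n in range(5):
    lhs = D(prod)(n)
    rhs = f(n + 1) * D(g)(n) + D(f)(n) * g(n)
    assert lhs == rhs
print("discrete Leibniz rule holds on the sampled points")
```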
Anyway, there is the question of the invariant rings of our holonomy groups (see Chapter 14, Algebra). A difference equation is actually invariant under the subgroup belonging to evolution in time, and other conserved quantities simply share the same eigenfunctions and can therefore be measured simultaneously with H. More precisely, the (unitary) Hamiltonian H is an invariant of that subgroup, and other (Hermitian, self-adjoint) observables O (such as the squared angular momentum) evolve in the Heisenberg picture like dO/dt = i[H, O], or so. This reminds us of CSA (Cartan subalgebra) and the adjoint representation, by the way! For the orbits (periods), the Hamiltonian will simply have some roots of unity as eigenvalues.
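The last remark can be checked directly in the simplest case (our own illustration): the operator that advances a periodic orbit by one step is a cyclic shift, and its eigenvalues are exactly the N-th roots of unity.

```python
# The step operator of a periodic orbit of length N is a cyclic shift
# (a unitary permutation matrix); its eigenvalues are the N-th roots of unity.

import numpy as np

N = 6
H = np.roll(np.eye(N), 1, axis=0)   # cyclic shift on an orbit of length 6
eigvals = np.linalg.eigvals(H)

print(np.allclose(np.abs(eigvals), 1.0))  # True: all eigenvalues lie on the unit circle
print(np.allclose(eigvals**N, 1.0))       # True: each eigenvalue is an N-th root of unity
```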
You might also say that, instead of looking for some x with f(x) = 0 as in the case of polynomials, we are now after functions f such that Af = 0 for some operator A. So there is an analogy to Galois theory, now with function field extensions and so on, and the Galois group permutes the functions which are solutions of partial difference equations.
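A concrete toy instance of "find f with Af = 0" (our own illustration): take A to be the difference operator on functions over a cycle of length 4 and compute its kernel exactly; the solutions are the constant functions, the discrete analogue of functions with vanishing derivative.

```python
# A as a matrix: (A f)(k) = f(k+1 mod 4) - f(k) on functions over a 4-cycle.
# Its kernel, computed with exact rational arithmetic, consists of the
# constant functions.

from sympy import Matrix

N = 4
A = Matrix(N, N, lambda i, j: (1 if j == (i + 1) % N else 0) - (1 if j == i else 0))

kernel = A.nullspace()
print(kernel)   # [Matrix([[1], [1], [1], [1]])] -> the constant functions
```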
Perhaps investigating the Lie algebra structure is all you need for moving, and the full group need not be constructed at all? Or it may be a polynomial task to find the optimal moves, whereas it remains unknown, and possibly undecidable, at this point whether the given position is won, drawn or lost. So this would not be a perfect solution, but it would be sufficient for a strong engine.
Last but not least, let us cite a remark from [Now96] (which, by the way, describes a certain model for the movement of Chess pieces that differs slightly, but nevertheless quite substantially, from ours):
[…]. The Chess move operators can be encoded by a group-equivariant matrix; rapid multiplication of a group-equivariant matrix by a vector, in general, relies on (decomposability:) the algebra-isomorphism between a group algebra and a (direct, in the context of using integers for example) sum of matrix algebras […].
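To see the quoted decomposition at work in the simplest possible case (our own illustration, not the model of [Now96]): for the cyclic group Z/N the group algebra is diagonalised by the discrete Fourier transform, so a Z/N-equivariant (circulant) matrix can be applied to a vector in O(N log N) instead of O(N^2).

```python
# For Z/N the group algebra decomposes under the DFT, so multiplying a
# circulant (Z/N-equivariant) matrix by a vector reduces to pointwise
# multiplication in the Fourier domain.

import numpy as np

N = 8
c = np.random.rand(N)                                 # first column of a circulant matrix
C = np.array([np.roll(c, k) for k in range(N)]).T     # the full circulant matrix
v = np.random.rand(N)

direct = C @ v                                              # ordinary product, O(N^2)
fast = np.fft.ifft(np.fft.fft(c) * np.fft.fft(v)).real      # via the decomposition, O(N log N)

print(np.allclose(direct, fast))   # True
```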
Yes, now we have reached a quite comfortable understanding. At least it looks as if the mathematical tools could offer serious advantages in terms of complexity over plain game tree search.