Grad, curl and div in differential geometry
Another example can be derived from differential geometry, especially relevant for work on the Maxwell equations.
Consider the Hilbert space $L^2(\mathbb{R}^3)$ of scalar-valued square-integrable functions on three-dimensional space. Taking the gradient of a function $f$ moves us to a subset of $(L^2(\mathbb{R}^3))^3$, the space of vector-valued, still square-integrable functions on the same domain; specifically, to the set of such functions that represent conservative vector fields. (The generalized Stokes' theorem has preserved integrability.)
First, note that the curl of all such fields is zero, since

$$\nabla \times (\nabla f) = 0$$
for all such $f$. However, this only proves that the image of the gradient is a subset of the kernel of the curl. To prove that they are in fact the same set, prove the converse: that if the curl of a vector field $\vec{v}$ is $0$, then $\vec{v}$ is the gradient of some scalar function. This follows almost immediately from Stokes' theorem (see the proof at conservative force). The image of the gradient is then precisely the kernel of the curl, and so we can then take the curl to be our next morphism, taking us again to a (different) subset of $(L^2(\mathbb{R}^3))^3$.
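Both vanishing identities in this section can be checked symbolically. The following sketch (assuming sympy is available; the `curl` and `div` helpers are written out componentwise for this example) verifies that the curl of a gradient and the divergence of a curl vanish for arbitrary smooth functions:

```python
# A sketch verifying the two vanishing identities with sympy; the curl
# and div helpers are spelled out componentwise for this illustration.
import sympy as sp

x, y, z = sp.symbols('x y z')

def curl(F):
    Fx, Fy, Fz = F
    return (sp.diff(Fz, y) - sp.diff(Fy, z),
            sp.diff(Fx, z) - sp.diff(Fz, x),
            sp.diff(Fy, x) - sp.diff(Fx, y))

def div(F):
    return sp.diff(F[0], x) + sp.diff(F[1], y) + sp.diff(F[2], z)

# Arbitrary smooth scalar function f and vector field v.
f = sp.Function('f')(x, y, z)
v = tuple(sp.Function(n)(x, y, z) for n in ('vx', 'vy', 'vz'))

grad_f = (sp.diff(f, x), sp.diff(f, y), sp.diff(f, z))

print(curl(grad_f))               # (0, 0, 0): mixed partials commute
print(sp.simplify(div(curl(v))))  # 0
```

Both results reduce to zero because sympy treats mixed partial derivatives of an unspecified smooth function as equal, which is exactly the mathematical content of the two identities.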
Similarly, we note that

$$\nabla \cdot (\nabla \times \vec{v}) = 0,$$
so the image of the curl is a subset of the kernel of the divergence. The converse is somewhat involved (for the general case see Poincaré lemma):
Proof that $\nabla \cdot \vec{v} = 0$ implies $\vec{v} = \nabla \times \vec{A}$ for some $\vec{A}$
We shall proceed by construction: given a vector field $\vec{v}$ such that $\nabla \cdot \vec{v} = 0$, we produce a field $\vec{A}$ such that

$$\nabla \times \vec{A} = \vec{v}.$$
First, note that since, as proved above, $\nabla \times (\nabla f) = 0$, we can add the gradient of any scalar function to $\vec{A}$ without changing its curl. We can use this gauge freedom to set any one component of $\vec{A}$ to zero; choosing arbitrarily the $z$-component, we thus require simply that $A_z = 0$ and

$$\nabla \times \vec{A} = \left( -\frac{\partial A_y}{\partial z},\; \frac{\partial A_x}{\partial z},\; \frac{\partial A_y}{\partial x} - \frac{\partial A_x}{\partial y} \right) = (v_x, v_y, v_z).$$
Then by simply integrating the first two components, and noting that the 'constant' of integration may still depend on any variable not integrated over, we find that

$$A_x = \int_0^z v_y(x,y,z')\,dz' + C_x(x,y), \qquad A_y = -\int_0^z v_x(x,y,z')\,dz' + C_y(x,y), \qquad A_z = 0.$$
Note that since the two integration 'constants' $C_x$ and $C_y$ both depend only on $x$ and $y$ and not on $z$, we can add another gradient of some function that also does not depend on $z$. This permits us to eliminate either of the terms in favor of the other, without spoiling our earlier work that set $A_z$ to zero. Choosing to eliminate $C_x$ and applying the last component as a constraint, we have

$$\frac{\partial A_y}{\partial x} - \frac{\partial A_x}{\partial y} = -\int_0^z \frac{\partial v_x}{\partial x}\,dz' + \frac{\partial C_y}{\partial x} - \int_0^z \frac{\partial v_y}{\partial y}\,dz' = v_z.$$
By assumption, $\nabla \cdot \vec{v} = 0$, so $\frac{\partial v_x}{\partial x} + \frac{\partial v_y}{\partial y} = -\frac{\partial v_z}{\partial z}$, and so

$$\int_0^z \frac{\partial v_z}{\partial z'}\,dz' + \frac{\partial C_y}{\partial x} = v_z.$$
Since the fundamental theorem of calculus requires that the first term above be precisely $v_z(x,y,z) - v_z(x,y,0)$, a solution to the above system of equations is guaranteed to exist: it suffices to take $C_y(x,y) = \int_0^x v_z(x',y,0)\,dx'$.
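The construction in the proof can be carried out mechanically. In the sketch below (assuming sympy; the example field $\vec{v} = (y, z, x)$ and the helper names are choices of this illustration, not part of the proof), the integrals above produce a vector potential $\vec{A}$ with $A_z = 0$, and the curl of the result recovers $\vec{v}$:

```python
# Sketch of the proof's construction with sympy: given a divergence-free
# field v, build A with A_z = 0 via the integrals above, then check
# that curl A = v.
import sympy as sp

x, y, z = sp.symbols('x y z')
xp, zp = sp.symbols('xp zp')  # dummy integration variables x', z'

def curl(F):
    Fx, Fy, Fz = F
    return (sp.diff(Fz, y) - sp.diff(Fy, z),
            sp.diff(Fx, z) - sp.diff(Fz, x),
            sp.diff(Fy, x) - sp.diff(Fx, y))

def build_potential(v):
    vx, vy, vz = v
    # C_y(x, y): integral of v_z(x', y, 0) from 0 to x (last step of the proof)
    Cy = sp.integrate(vz.subs({x: xp, z: 0}), (xp, 0, x))
    Ax = sp.integrate(vy.subs(z, zp), (zp, 0, z))
    Ay = -sp.integrate(vx.subs(z, zp), (zp, 0, z)) + Cy
    return (Ax, Ay, sp.Integer(0))

v = (y, z, x)  # an example divergence-free field
assert sp.diff(v[0], x) + sp.diff(v[1], y) + sp.diff(v[2], z) == 0

A = build_potential(v)
print(A)        # the constructed vector potential, with A_z = 0
print(curl(A))  # (y, z, x), recovering v
```

The gauge freedom noted in the proof means this $\vec{A}$ is only one of infinitely many valid potentials; adding any gradient leaves the check unchanged.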
Having thus proved that the image of the curl is precisely the kernel of the divergence, this morphism in turn takes us back to the space we started from, $L^2(\mathbb{R}^3)$. Since definitionally we have landed on a space of integrable functions, any such function can (at least formally) be integrated in order to produce a vector field whose divergence is that function, so the image of the divergence is the entirety of $L^2(\mathbb{R}^3)$, and we can complete our sequence:

$$0 \to L^2(\mathbb{R}^3) \xrightarrow{\operatorname{grad}} (L^2(\mathbb{R}^3))^3 \xrightarrow{\operatorname{curl}} (L^2(\mathbb{R}^3))^3 \xrightarrow{\operatorname{div}} L^2(\mathbb{R}^3) \to 0$$
Equivalently, we could have reasoned in reverse: in a simply connected space, a curl-free vector field (a field in the kernel of the curl) can always be written as a gradient of a scalar function (and thus is in the image of the gradient). Similarly, a solenoidal vector field can be written as a curl of another field.[1] (Reasoning in this direction thus makes use of the fact that 3-dimensional space is topologically trivial.)
This short exact sequence also permits a much shorter proof of the validity of the Helmholtz decomposition that does not rely on brute-force vector calculus. Consider the subsequence

$$0 \to L^2(\mathbb{R}^3) \xrightarrow{\operatorname{grad}} (L^2(\mathbb{R}^3))^3 \xrightarrow{\operatorname{curl}} \operatorname{im}(\operatorname{curl}) \to 0.$$
Since the divergence of the gradient is the Laplacian, and since the Hilbert space of square-integrable functions can be spanned by the eigenfunctions of the Laplacian, we already see that some inverse mapping $\nabla^{-1}$ must exist. To explicitly construct such an inverse, we can start from the definition of the vector Laplacian

$$\nabla^2 \vec{A} = \nabla(\nabla \cdot \vec{A}) - \nabla \times (\nabla \times \vec{A}).$$
Since we are trying to construct an identity mapping by composing some function with the gradient, we know that in our case $\nabla \times \vec{A} = \nabla \times (\nabla f) = 0$. Then if we take the divergence of both sides,

$$\nabla \cdot (\nabla^2 \vec{A}) = \nabla \cdot \nabla(\nabla \cdot \vec{A}) = \nabla^2 (\nabla \cdot \vec{A}),$$
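This step, too, can be checked symbolically. A small sketch (again assuming sympy, with the vector Laplacian taken componentwise) confirms that the divergence of the vector Laplacian equals the scalar Laplacian of the divergence for an arbitrary smooth field:

```python
# Sketch: for arbitrary smooth A, div(vector Laplacian of A) equals the
# scalar Laplacian of div A, since partial derivatives commute.
import sympy as sp

x, y, z = sp.symbols('x y z')
A = tuple(sp.Function(n)(x, y, z) for n in ('Ax', 'Ay', 'Az'))

def lap(g):  # scalar Laplacian
    return sp.diff(g, x, 2) + sp.diff(g, y, 2) + sp.diff(g, z, 2)

def div(F):
    return sp.diff(F[0], x) + sp.diff(F[1], y) + sp.diff(F[2], z)

lhs = div(tuple(lap(c) for c in A))  # componentwise vector Laplacian, then div
rhs = lap(div(A))
print(sp.simplify(lhs - rhs))  # 0
```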
we see that if a function is an eigenfunction of the vector Laplacian, its divergence must be an eigenfunction of the scalar Laplacian with the same eigenvalue. Then we can build our inverse function $\nabla^{-1}$ simply by breaking any function in $(L^2(\mathbb{R}^3))^3$ into the vector-Laplacian eigenbasis, scaling each by the inverse of its eigenvalue, and taking the divergence; the action of $\nabla^{-1} \circ \nabla$ is thus clearly the identity. Thus by the splitting lemma,

$$(L^2(\mathbb{R}^3))^3 \cong L^2(\mathbb{R}^3) \oplus \operatorname{im}(\operatorname{curl}),$$

or equivalently, any square-integrable vector field on $\mathbb{R}^3$ can be broken into the sum of a gradient and a curl, which is what we set out to prove.
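The decomposition can also be illustrated numerically. The sketch below is a toy under stated assumptions (a periodic box rather than all of $\mathbb{R}^3$, a random field, and arbitrary grid size and names): in Fourier space, the gradient part of each mode is its projection onto the wavevector $\vec{k}$, and the remainder is divergence-free:

```python
# Numerical sketch of the Helmholtz decomposition on a periodic grid:
# each Fourier mode splits into a part parallel to k (a gradient) and a
# part perpendicular to k (divergence-free, hence a curl).
import numpy as np

n = 16
k1 = 2 * np.pi * np.fft.fftfreq(n, d=1.0 / n)
kx, ky, kz = np.meshgrid(k1, k1, k1, indexing='ij')
k2 = kx**2 + ky**2 + kz**2
k2[0, 0, 0] = 1.0  # avoid 0/0; the k = 0 (constant) mode has no gradient part

rng = np.random.default_rng(0)
v = rng.standard_normal((3, n, n, n))  # an arbitrary vector field on the grid
vh = np.fft.fftn(v, axes=(1, 2, 3))

kdotv = kx * vh[0] + ky * vh[1] + kz * vh[2]
grad_part_h = np.stack([kx, ky, kz]) * kdotv / k2  # longitudinal projection
grad_part_h[:, 0, 0, 0] = 0.0
curl_part_h = vh - grad_part_h                     # transverse remainder

grad_part = np.fft.ifftn(grad_part_h, axes=(1, 2, 3)).real
curl_part = np.fft.ifftn(curl_part_h, axes=(1, 2, 3)).real
```

Here `grad_part + curl_part` reproduces `v`, the transverse part has zero spectral divergence, and the longitudinal part has zero spectral curl, mirroring the splitting above.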