Adept (C++ library)

Automatic differentiation and array software library

Adept is a combined automatic differentiation and array software library for the C++ programming language. Its automatic differentiation capability facilitates the development of applications involving mathematical optimization. Adept is notable for applying the template metaprogramming technique of expression templates to speed up the differentiation of mathematical statements.[1][2] Together with the efficient way it stores the differential information, this makes it significantly faster than most other C++ tools that provide similar functionality (e.g. ADOL-C, CppAD and FADBAD),[1][3][4][5][6] although comparable performance has been reported for Stan and, in some cases, Sacado.[3] Differentiation may be performed in forward mode or reverse mode (for use with a quasi-Newton minimization scheme), or the full Jacobian matrix may be computed (for use with the Levenberg–Marquardt or Gauss–Newton minimization schemes).

Adept C++ Library
Developer: Robin Hogan
Stable release: 2.1.2 / 3 October 2023
Written in: C++
Operating system: Cross-platform
Type: Library
License: Apache 2.0 (open source)
Website: www.met.reading.ac.uk/clouds/adept/

Applications of Adept have included computer functionality in the financial field,[6][7] computational fluid dynamics,[8] physical chemistry,[9] parameter estimation[10] and meteorology.[11] Adept is free software distributed under the Apache License.

Example

Adept implements automatic differentiation using an operator-overloading approach: scalars to be differentiated are declared as adouble, the "active" counterpart of the built-in double, and vectors to be differentiated are declared as aVector. The following simple example uses these types to differentiate the 3-norm of a small vector:

#include <iostream>
#include <adept_arrays.h>

int main() {
    using namespace adept;
    Stack stack;                          // Object to store differential statements
    aVector x(3);                         // Independent variables: active vector with 3 elements
    x << 1.0, 2.0, 3.0;                   // Fill vector x
    stack.new_recording();                // Clear any existing differential statements
    adouble J = cbrt(sum(abs(x*x*x)));    // Compute dependent variable: the 3-norm in this case
    J.set_gradient(1.0);                  // Seed the dependent variable
    stack.reverse();                      // Reverse-mode differentiation
    std::cout << "dJ/dx = " << x.get_gradient() << "\n"; // Print the vector of partial derivatives dJ/dx
    return 0;
}

When compiled and executed, this program reports the derivative as:

dJ/dx = {0.0917202, 0.366881, 0.825482}
