# pyADiff: A simple, pure python algorithmic differentiation package

pyADiff is a basic algorithmic differentiation package that implements forward and adjoint/reverse mode differentiation. If you are looking for a fully featured and faster library, have a look at google/jax, autograd, or dco/c++ (among many others), but if you are interested in a package where you can quickly “look under the hood”, you may be in the right place.

## Contents

## Basic Usage

Suppose we want to compute the gradient of the function \(f(x_0, x_1) = 2 x_0 x_1^2\). This is a rather trivial task, because by simple calculus the gradient is:

\(\nabla f(x_0, x_1) = \left(2 x_1^2,\; 4 x_0 x_1\right)\)
Nevertheless, we use this example to illustrate the use of pyADiff.

```
import pyADiff as ad

# Define the function f
def f(x):
    return 2.*x[0]*x[1]**2.

# Obtain the gradient function of f from pyADiff
df = ad.gradient(f)

x = [0.5, 2.0]
# Call the function f and the gradient function df
y = f(x)
dy = df(x)

print("f({}) = {}".format(x, y))    # prints f([0.5, 2.0]) = 4.0
print("f'({}) = {}".format(x, dy))  # prints f'([0.5, 2.0]) = [8. 4.]
```

This matches the evaluation of the analytic gradient at \(x = (0.5, 2.0)\).
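As a quick sanity check, the result can also be verified without pyADiff by approximating the gradient with central finite differences. The sketch below is independent of the package and only assumes the same function `f`; the helper `fd_gradient` is a hypothetical name introduced here for illustration.

```python
# Approximate the gradient of f(x0, x1) = 2*x0*x1**2 with
# central finite differences, as an independent check of the
# analytic gradient (2*x1**2, 4*x0*x1).
def f(x):
    return 2.*x[0]*x[1]**2.

def fd_gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    grad = []
    for i in range(len(x)):
        xp = list(x)  # point shifted forward in component i
        xm = list(x)  # point shifted backward in component i
        xp[i] += h
        xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2.*h))
    return grad

print(fd_gradient(f, [0.5, 2.0]))  # approximately [8.0, 4.0]
```

Unlike algorithmic differentiation, the finite-difference approximation is only accurate up to the step size \(h\), but it is a handy cross-check for small examples like this one.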

## Motivation

My motivation to start this project arose from curiosity while listening to the lecture Computational Differentiation by Uwe Naumann at RWTH Aachen University. Basically, I tried to understand the concepts from the lecture by implementing them myself. In the end I was (positively) surprised by the outcome and decided to bundle it into a python package. Additionally, this gave me the chance to learn about python packaging, distribution, documentation, …