Recursive custom gradients in TensorFlow

Most autodifferentiation libraries, such as PyTorch, TensorFlow, and Autograd (and even PennyLane in the quantum case), allow you to define new functions and register custom gradients that the autodiff framework uses during backpropagation. This is useful in several situations; perhaps you have a composite function with a gradient that …
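
To make the registration mechanism concrete, here is a minimal sketch using TensorFlow's `tf.custom_gradient` decorator. The `log1pexp` function is the illustrative example from the TensorFlow documentation, chosen here only as an assumption to show the pattern; it is not necessarily the function discussed later in the post.

```python
import tensorflow as tf

@tf.custom_gradient
def log1pexp(x):
    # Forward pass: log(1 + e^x).
    e = tf.exp(x)

    def grad(upstream):
        # Hand-written backward pass: the analytic derivative
        # 1 - 1/(1 + e^x) stays finite even when e^x overflows,
        # whereas the autodiff expression e^x / (1 + e^x) would
        # produce NaN for large x.
        return upstream * (1 - 1 / (1 + e))

    # Return the forward value and the gradient function to register.
    return tf.math.log(1 + e), grad

x = tf.constant(100.0)
with tf.GradientTape() as tape:
    tape.watch(x)
    y = log1pexp(x)
print(tape.gradient(y, x))  # 1.0 rather than NaN
```

During backpropagation, TensorFlow calls the returned `grad` closure with the upstream gradient instead of differentiating through the forward ops, which is exactly the hook the rest of this post builds on.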