_            _    _        _         _
      /\ \         /\ \ /\ \     /\_\      / /\
      \_\ \       /  \ \\ \ \   / / /     / /  \
      /\__ \     / /\ \ \\ \ \_/ / /     / / /\ \__
     / /_ \ \   / / /\ \ \\ \___/ /     / / /\ \___\
    / / /\ \ \ / / /  \ \_\\ \ \_/      \ \ \ \/___/
   / / /  \/_// / /   / / / \ \ \        \ \ \
  / / /      / / /   / / /   \ \ \   _    \ \ \
 / / /      / / /___/ / /     \ \ \ /_/\__/ / /
/_/ /      / / /____\/ /       \ \_\\ \/___/ /
\_\/       \/_________/         \/_/ \_____\/
python-autograd 1.5-0.c6d81ce
Propagated dependencies: python-future@0.18.2 python-numpy@1.24.4
Channel: guix
Location: gnu/packages/machine-learning.scm (gnu packages machine-learning)
Home page: https://github.com/HIPS/autograd
Licenses: Expat
Synopsis: Efficiently computes derivatives of NumPy code
Description:

Autograd can automatically differentiate native Python and NumPy code. It can handle a large subset of Python's features, including loops, ifs, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments. It also supports forward-mode differentiation, and the two can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization.

Total results: 2
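
As a minimal sketch of the behaviour described in the listing above (assuming python-autograd and python-numpy are installed, e.g. with guix install python-autograd python-numpy), the snippet below uses Autograd's grad to differentiate an ordinary NumPy function and composes grad with itself for a higher-order derivative. The tanh helper and the finite-difference check are illustrative choices, not part of the package output.

    import autograd.numpy as np   # thin wrapper around NumPy that records operations
    from autograd import grad     # reverse-mode derivative of a scalar-valued function

    def tanh(x):
        # Ordinary Python/NumPy code; no special annotations are needed.
        return (1.0 - np.exp(-x)) / (1.0 + np.exp(-x))

    grad_tanh = grad(tanh)                          # d(tanh)/dx, computed by backpropagation
    print(grad_tanh(1.0))                           # approx. 0.3932
    print((tanh(1.0001) - tanh(0.9999)) / 0.0002)   # finite-difference check; should agree

    # "Derivatives of derivatives": grad composes with itself.
    print(grad(grad(grad(tanh)))(1.0))              # third derivative at x = 1.0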