        _            _    _        _         _
      /\ \         /\ \ /\ \     /\_\      / /\
      \_\ \       /  \ \\ \ \   / / /     / /  \
      /\__ \     / /\ \ \\ \ \_/ / /     / / /\ \__
     / /_ \ \   / / /\ \ \\ \___/ /     / / /\ \___\
    / / /\ \ \ / / /  \ \_\\ \ \_/      \ \ \ \/___/
   / / /  \/_// / /   / / / \ \ \        \ \ \
  / / /      / / /   / / /   \ \ \   _    \ \ \
 / / /      / / /___/ / /     \ \ \ /_/\__/ / /
/_/ /      / / /____\/ /       \ \_\\ \/___/ /
\_\/       \/_________/         \/_/ \_____\/
python-jaxopt 0.8.3
Propagated dependencies: python-dm-tree@0.1.9 python-jax@0.4.28 python-jaxlib@0.4.28 python-optax@0.1.5 python-numpy@1.26.4 python-scipy@1.12.0
Channel: guix-science
Location: guix-science/packages/python.scm (guix-science packages python)
Home page: https://github.com/google/jaxopt
Licenses: ASL 2.0
Synopsis: Hardware accelerated, batchable and differentiable optimizers in JAX
Description:

JAXopt provides hardware accelerated, batchable and differentiable optimizers in JAX.

  1. Hardware accelerated: the implementations run on GPU and TPU, in addition to CPU.

  2. Batchable: multiple instances of the same optimization problem can be automatically vectorized using JAX’s vmap.

  3. Differentiable: optimization problem solutions can be differentiated with respect to their inputs either implicitly or via autodiff of unrolled algorithm iterations.

Total results: 1