
JAXopt

Status | Installation | Documentation | Examples | Cite us

Hardware accelerated, batchable and differentiable optimizers in JAX.

  • Hardware accelerated: our implementations run on GPU and TPU, in addition to CPU.
  • Batchable: multiple instances of the same optimization problem can be automatically vectorized using JAX's vmap.
  • Differentiable: optimization problem solutions can be differentiated with respect to their inputs, either implicitly or via autodiff of unrolled algorithm iterations (a short sketch illustrating all three properties follows below).
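
The following minimal sketch (not taken from this README) illustrates all three properties using jaxopt.GradientDescent on a toy ridge-regression loss; the data shapes and hyperparameter values are illustrative only.

import jax
import jax.numpy as jnp
from jaxopt import GradientDescent

def ridge_loss(w, lam, X, y):
    # Illustrative inner objective: mean squared error plus L2 penalty.
    residuals = X @ w - y
    return jnp.mean(residuals ** 2) + lam * jnp.sum(w ** 2)

solver = GradientDescent(fun=ridge_loss, maxiter=100)

X = jnp.ones((8, 3))
y = jnp.zeros(8)
w0 = jnp.zeros(3)

# Hardware accelerated: run() executes on CPU, GPU, or TPU like any JAX code.
w_opt, state = solver.run(w0, lam=1e-2, X=X, y=y)

# Batchable: vmap solves one instance per regularization strength.
lams = jnp.array([1e-3, 1e-2, 1e-1])
w_batch = jax.vmap(lambda lam: solver.run(w0, lam=lam, X=X, y=y).params)(lams)

# Differentiable: gradient of (a scalar function of) the solution with
# respect to lam, computed by implicit differentiation (the default).
def solution_sq_norm(lam):
    return jnp.sum(solver.run(w0, lam=lam, X=X, y=y).params ** 2)

print(jax.grad(solution_sq_norm)(1e-2))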

Status

JAXopt is no longer maintained or developed. Alternatives may be found on the JAX website. Some of its features (such as losses, projections, and the LBFGS optimizer) have been ported to optax. We are sincerely grateful for all the community contributions the project has garnered over the years.

Installation

To install the latest release of JAXopt, use the following command:

$ pip install jaxopt

To install the development version, use the following command instead:

$ pip install git+https://github.com/google/jaxopt

Alternatively, clone the repository and install it from source with the following command:

$ python setup.py install

Cite us

Our implicit differentiation framework is described in Efficient and Modular Implicit Differentiation (arXiv:2105.15183). To cite it:

@article{jaxopt_implicit_diff,
  title={Efficient and Modular Implicit Differentiation},
  author={Blondel, Mathieu and Berthet, Quentin and Cuturi, Marco and Frostig, Roy 
    and Hoyer, Stephan and Llinares-L{\'o}pez, Felipe and Pedregosa, Fabian 
    and Vert, Jean-Philippe},
  journal={arXiv preprint arXiv:2105.15183},
  year={2021}
}
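
For a sense of what the framework provides, here is a toy sketch (not from the paper) of JAXopt's custom_root decorator: given an optimality condition, it makes a solver differentiable via the implicit function theorem, even when the solver itself is a black box. The objective and solver below are illustrative.

import jax
from jaxopt.implicit_diff import custom_root

def optimality(x, theta):
    # Stationarity condition for the inner objective f(x, theta) = (x - theta)**2.
    return 2.0 * (x - theta)

@custom_root(optimality)
def toy_solver(init_x, theta):
    del init_x  # closed-form minimizer; the initialization is unused here
    return theta

# d(solution)/d(theta) = 1.0, obtained by implicit differentiation of the
# optimality condition rather than by unrolling solver iterations.
print(jax.grad(toy_solver, argnums=1)(0.0, 2.0))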

Disclaimer

JAXopt was an open source project maintained by a dedicated team in Google Research. It is not an official Google product.