       _            _    _        _         _
      /\ \         /\ \ /\ \     /\_\      / /\
      \_\ \       /  \ \\ \ \   / / /     / /  \
      /\__ \     / /\ \ \\ \ \_/ / /     / / /\ \__
     / /_ \ \   / / /\ \ \\ \___/ /     / / /\ \___\
    / / /\ \ \ / / /  \ \_\\ \ \_/      \ \ \ \/___/
   / / /  \/_// / /   / / / \ \ \        \ \ \
  / / /      / / /   / / /   \ \ \   _    \ \ \
 / / /      / / /___/ / /     \ \ \ /_/\__/ / /
/_/ /      / / /____\/ /       \ \_\\ \/___/ /
\_\/       \/_________/         \/_/ \_____\/
python-jaxlib 0.4.20
Dependencies: curl@8.6.0 double-conversion@3.1.5 flatbuffers@2.0.0 giflib@5.2.1 grpc@1.34.0 hwloc@2.11.1 icu4c@71.1 jsoncpp@1.9.5 libjpeg-turbo@2.1.4 openssl@3.0.8 pybind11@2.8.1 python-absl-py@1.4.0 python-numpy@1.23.2 python-scipy@1.12.0 python-six@1.16.0 python-wrapper@3.10.7 zlib@1.3
Propagated dependencies: python-absl-py@1.4.0 python-importlib-metadata@5.2.0 python-gast@0.5.3 python-ml-dtypes@0.3.1 python-numpy@1.23.2 python-opt-einsum@3.3.0 python-protobuf-for-tensorflow-2@4.21.9 python-scipy@1.12.0
Channel: guix-science
Location: guix-science/packages/python.scm (guix-science packages python)
Home page: https://github.com/google/jax
Licenses: ASL 2.0
Synopsis: Differentiate, compile, and transform Numpy code.
Description:

JAX is Autograd and XLA, brought together for high-performance numerical computing, including large-scale machine learning research. With its updated version of Autograd, JAX can automatically differentiate native Python and NumPy functions. It can differentiate through loops, branches, recursion, and closures, and it can take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation) via grad as well as forward-mode differentiation, and the two can be composed arbitrarily to any order.

Total results: 1
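
The description above is concrete enough to demonstrate in a few lines. Below is a minimal sketch of grad, higher-order derivatives, and forward-mode differentiation, assuming the matching python-jax frontend from the same channel is installed alongside this package (jaxlib itself only provides the compiled XLA support library):

import jax
import jax.numpy as jnp

def f(x):
    # A scalar function with a plain Python loop; JAX traces and
    # differentiates straight through it.
    for _ in range(3):
        x = jnp.sin(x) * x
    return x

df = jax.grad(f)                        # reverse-mode (backpropagation)
d3f = jax.grad(jax.grad(jax.grad(f)))   # derivatives of derivatives of derivatives
print(df(1.0), d3f(1.0))

# Forward-mode: a Jacobian-vector product of f at x = 1.0 with tangent 1.0.
primal_out, tangent_out = jax.jvp(f, (1.0,), (1.0,))

# The two modes compose arbitrarily, e.g. forward-over-reverse:
print(jax.jacfwd(jax.grad(f))(1.0))

# Outside of jit, grad also traces through Python branches:
g = lambda x: x * x if x > 0 else -x
print(jax.grad(g)(3.0))                 # 6.0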