JAX is a Python library for accelerator-oriented array computation and program transformation, designed for high-performance numerical computing and large-scale machine learning. It is developed by Google with contributions from Nvidia and other community contributors.

It is described as bringing together a modified version of autograd (automatic differentiation of a function to obtain its gradient function) and OpenXLA's XLA (Accelerated Linear Algebra). It is designed to follow the structure and workflow of NumPy as closely as possible and works with various existing frameworks such as TensorFlow and PyTorch. The primary features of JAX are:

  1. Providing a unified NumPy-like interface to computations that run on CPU, GPU, or TPU, in local or distributed settings (see the sketch after this list).
  2. Built-in Just-In-Time (JIT) compilation via OpenXLA, an open-source machine learning compiler ecosystem.
  3. Efficient evaluation of gradients via its automatic differentiation transformations.
  4. Automatic vectorization of functions, efficiently mapping them over arrays representing batches of inputs.
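
A minimal sketch of the first feature, assuming only the public jax.numpy API: jax.numpy mirrors NumPy's interface, and the same array code runs unchanged on CPU, GPU, or TPU.

```python
import numpy as np
import jax.numpy as jnp

x = np.linspace(0.0, 1.0, 4)   # an ordinary NumPy array on the host

# the familiar NumPy-style API, executed on the default device (CPU, GPU, or TPU)
y = jnp.sin(x) ** 2 + jnp.cos(x) ** 2
print(y)   # [1. 1. 1. 1.] up to floating-point rounding
```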

grad

The code below demonstrates the grad function's automatic differentiation, as a minimal sketch applying jax.grad to the logistic function.
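
```python
from jax import grad
import jax.numpy as jnp

def logistic(x):
    # the logistic (sigmoid) function
    return jnp.exp(x) / (jnp.exp(x) + 1)

# grad returns a new function that computes the derivative of logistic
grad_logistic = grad(logistic)

# evaluate the derivative at x = 1
print(grad_logistic(1.0))
```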

The final line should output approximately 0.19661194, the value of the derivative of the logistic function at x = 1.

jit

The code below demonstrates the jit function's optimization through fusion, as a minimal sketch that JIT-compiles an element-wise cube function via jax.jit.
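
```python
from jax import jit
import jax.numpy as jnp

def cube(x):
    return x * x * x

# compile the function: XLA can fuse the two multiplications into a single kernel
jit_cube = jit(cube)

# a large input makes the timing difference visible
x = jnp.ones((10000, 10000))

# JAX dispatches work asynchronously, so block_until_ready() is needed for fair timing
cube(x).block_until_ready()
jit_cube(x).block_until_ready()
```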

After a warm-up call of jit_cube to trigger compilation, its computation time should be noticeably shorter than that of cube. Increasing the dimensions of x will further widen the difference.

vmap

The code below demonstrates the vmap function's automatic vectorization, as a minimal sketch that uses jax.vmap to batch a dot product written for single vectors.
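
```python
from jax import vmap
import jax.numpy as jnp

# a function written for a single pair of vectors
def dot(v, w):
    return jnp.dot(v, w)

# vmap maps dot over the leading (batch) axis of both arguments
batched_dot = vmap(dot)

vs = jnp.arange(12.0).reshape(4, 3)   # a batch of 4 vectors
ws = jnp.ones((4, 3))
print(batched_dot(vs, ws))            # [ 3. 12. 21. 30.], with no Python loop
```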

pmap

The code below demonstrates the pmap function's parallelization for matrix multiplication, as a sketch that assumes at least two accelerator devices are available.
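
```python
from jax import pmap, random
import jax.numpy as jnp

# one PRNG key per device (assumes two devices are attached)
keys = random.split(random.PRNGKey(0), 2)

# generate one random 5000 x 6000 matrix on each device, in parallel
matrices = pmap(lambda key: random.normal(key, (5000, 6000)))(keys)

# multiply each matrix by its transpose locally on its device, with no data transfer
outputs = pmap(lambda x: jnp.dot(x, x.T))(matrices)

# reduce each product to its per-device mean
means = pmap(jnp.mean)(outputs)
print(means)
```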

The final line should print one mean value per device, i.e. an array of two numbers.

See also

  • NumPy
  • TensorFlow
  • PyTorch
  • CUDA
  • Accelerated Linear Algebra

External links

  • Documentation: jax.readthedocs.io
  • Colab (Jupyter/IPython) Quickstart Guide: colab.research.google.com/github/google/jax/blob/main/docs/notebooks/quickstart.ipynb
  • TensorFlow's XLA: www.tensorflow.org/xla (Accelerated Linear Algebra)
  • YouTube TensorFlow Channel "Intro to JAX: Accelerating Machine Learning research": www.youtube.com/watch?v=WdTeDXsOSj4
  • Original paper: mlsys.org/Conferences/doc/2018/146.pdf
