JAX is an open-source Python library that brings together Autograd and XLA to enable high-performance machine learning research. In this episode of AI Adventures, Yufeng Guo shows how you can use JAX to compile and run your NumPy programs on GPUs and TPUs, and how its composable function transformations work.
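A minimal sketch of what "composable function transformations" means in practice: `jax.grad` differentiates an ordinary NumPy-style function, and `jax.jit` compiles the result through XLA. The function names (`predict`, `loss`) and data below are illustrative, not from the video.

```python
import jax
import jax.numpy as jnp

# An ordinary NumPy-style function (illustrative example, not from the video).
def predict(w, x):
    return jnp.tanh(jnp.dot(x, w))

def loss(w, x, y):
    return jnp.mean((predict(w, x) - y) ** 2)

# Transformations compose: grad differentiates loss with respect to w,
# and jit compiles the gradient function via XLA for GPU/TPU execution.
grad_loss = jax.jit(jax.grad(loss))

w = jnp.ones(3)
x = jnp.arange(6.0).reshape(2, 3)
y = jnp.array([0.5, -0.5])
g = grad_loss(w, x, y)
print(g.shape)  # gradient has the same shape as w
```

The same code runs unchanged on CPU, GPU, or TPU; JAX dispatches to whichever accelerator backend is available.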
GitHub for JAX → https://goo.gle/2L2C4Kv
Try out the notebook used in this video → https://goo.gle/2L5Kqkh
XLA: Optimizing compiler for machine learning → https://goo.gle/35Acdml
Check out the rest of the Cloud AI Adventures playlist → https://goo.gl/UC5usG
Subscribe to get all the episodes as they come out → https://goo.gl/S0AS51
Product: AI Platform Training; Presenter: Yufeng Guo
Publisher: Google Cloud