🚀 Accelerate inference and training of 🤗 Transformers, Diffusers, TIMM and Sentence Transformers with easy-to-use hardware optimization tools
Example code and applications for machine learning on Graphcore IPUs
PyTorch interface for the IPU
Blazing fast training of 🤗 Transformers on Graphcore IPUs
TensorFlow for the IPU
JAX for Graphcore IPU (experimental)
TessellateIPU: low level Poplar tile programming from Python
Poplar Advanced Runtime for the IPU
Code for the CoNLL BabyLM workshop paper "Mini Minds: Exploring Bebeshka and Zlata Baby Models"
A PyTorch library for knowledge graph embedding on Graphcore IPUs, implementing the BESS distribution framework
Poplar implementation of FlashAttention for IPU
Track reconstruction on the Graphcore IPU.
An implementation of the Search by Triplet track reconstruction algorithm on the Graphcore IPU.
Basic installation steps for the Poplar SDK on a Graphcore IPU. In future, I plan to implement and add basic example code for parallel computing algorithms.