Daniël de Kok d383fdd4b4 Add support for XPU layer repositories (#142)
This change adds support for XPU layer repositories, e.g.:

```
kernel_mapping = {
    "LigerRMSNorm": {
        "xpu": LayerRepository(
            repo_id="kernels-community/liger_kernels",
            layer_name="LigerRMSNorm",
        )
    },
}
```
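
For context, here is a minimal sketch of how a mapping like this might be applied with the layer API. The `use_kernel_forward_from_hub`, `register_kernel_mapping`, `kernelize`, and `Mode` names are assumptions based on the kernels layer API rather than part of this change, and the `LigerRMSNorm`/`TinyModel` reference modules are hypothetical:

```
import torch
from torch import nn

# These imports are assumptions based on the kernels layer API; check the
# library documentation for the exact entry points and signatures.
from kernels import (
    LayerRepository,
    Mode,
    kernelize,
    register_kernel_mapping,
    use_kernel_forward_from_hub,
)


# Reference implementation; kernelize() can replace its forward with the
# Hub kernel registered for the layer name and the model's device type.
@use_kernel_forward_from_hub("LigerRMSNorm")
class LigerRMSNorm(nn.Module):
    def __init__(self, hidden_size: int, eps: float = 1e-6):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(hidden_size))
        self.variance_epsilon = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        variance = x.pow(2).mean(-1, keepdim=True)
        return self.weight * x * torch.rsqrt(variance + self.variance_epsilon)


# Hypothetical model that uses the layer above as a submodule.
class TinyModel(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        self.norm = LigerRMSNorm(hidden_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.norm(x)


# The mapping from the commit message: route LigerRMSNorm to the XPU layer
# repository on Intel GPUs.
register_kernel_mapping(
    {
        "LigerRMSNorm": {
            "xpu": LayerRepository(
                repo_id="kernels-community/liger_kernels",
                layer_name="LigerRMSNorm",
            )
        },
    }
)

# Requires a PyTorch build with XPU support; the device type is assumed to
# be taken from the model's parameters.
model = TinyModel(1024).to("xpu")
model = kernelize(model, mode=Mode.INFERENCE)
```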

Co-authored-by: YangKai0616 <kai.yang@intel.com>

kernels



The Kernel Hub allows Python libraries and applications to load compute kernels directly from the Hub. To support this kind of dynamic loading, Hub kernels differ from traditional Python kernel packages in that they are made to be:

  • Portable: a kernel can be loaded from paths outside PYTHONPATH.
  • Unique: multiple versions of the same kernel can be loaded in the same Python process.
  • Compatible: kernels must support all recent versions of Python and the different PyTorch build configurations (various CUDA versions and C++ ABIs). Furthermore, older C library versions must be supported.

🚀 Quick Start

Install the kernels package with pip (requires torch>=2.5 and CUDA):

```
pip install kernels
```

Here is how you would use the activation kernels from the Hugging Face Hub:

```
import torch

from kernels import get_kernel

# Download optimized kernels from the Hugging Face hub
activation = get_kernel("kernels-community/activation")

# Random tensor
x = torch.randn((10, 10), dtype=torch.float16, device="cuda")

# Run the kernel
y = torch.empty_like(x)
activation.gelu_fast(y, x)

print(y)
```

You can search for kernels on the Hub.
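
Since a loaded kernel is an ordinary Python object, it can also be wrapped in a regular torch.nn.Module. The sketch below shows that pattern; the FastGELU wrapper is hypothetical, and the gelu_fast(out, input) calling convention is taken from the example above:

```
import torch
from torch import nn

from kernels import get_kernel

activation = get_kernel("kernels-community/activation")


class FastGELU(nn.Module):
    """Hypothetical wrapper around the Hub kernel's gelu_fast function."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # gelu_fast writes its result into a preallocated output tensor.
        y = torch.empty_like(x)
        activation.gelu_fast(y, x)
        return y


x = torch.randn((10, 10), dtype=torch.float16, device="cuda")
print(FastGELU()(x))
```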

📚 Documentation
