# Basic Usage

## Loading Kernels
Here is how you would use the activation kernels from the Hugging Face Hub:
```python
import torch

from kernels import get_kernel

# Download optimized kernels from the Hugging Face hub
activation = get_kernel("kernels-community/activation")

# Create a random tensor
x = torch.randn((10, 10), dtype=torch.float16, device="cuda")

# Run the kernel
y = torch.empty_like(x)
activation.gelu_fast(y, x)

print(y)
```
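Note that `gelu_fast` follows an out-parameter convention: it writes its result into the pre-allocated tensor `y` rather than returning a new tensor. Continuing the example above, here is a minimal sketch that wraps the kernel in a returning-style helper and, assuming `gelu_fast` computes the tanh approximation of GELU, sanity-checks it against PyTorch's built-in implementation (the tolerances are illustrative):

```python
def gelu(x: torch.Tensor) -> torch.Tensor:
    # Allocate the output tensor and let the kernel fill it in place
    out = torch.empty_like(x)
    activation.gelu_fast(out, x)
    return out

# Compare against PyTorch's built-in GELU (tanh approximation);
# loose tolerances because the inputs are float16
reference = torch.nn.functional.gelu(x, approximate="tanh")
torch.testing.assert_close(gelu(x), reference, rtol=1e-2, atol=1e-2)
```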
## Checking Kernel Availability
You can check if a specific kernel is available for your environment:
```python
from kernels import has_kernel

# Check if kernel is available for current environment
is_available = has_kernel("kernels-community/activation")
print(f"Kernel available: {is_available}")
```