Default Branch

e1e8491b31 · [1/N] Change C-style casts to static_cast or reinterpret_cast (#165750) · Updated 2025-10-20 12:36:19 +08:00

Branches

863f1fdfa5 · Update on "[ATen] Fix CUDA reduction warp shuffle order" · Updated 2025-10-20 11:45:45 +08:00 · 2 behind, 29 ahead

afa8f7136b · Fix eager reduction warp shuffle order to start from offset=16 · Updated 2025-10-20 11:45:45 +08:00 · 2 behind, 1 ahead

bd32d09668 · tc · Updated 2025-10-20 09:02:49 +08:00 · 65 behind, 6 ahead

449b5f9222 · update vllm commit hash · Updated 2025-10-20 08:28:43 +08:00 · 6 behind, 1 ahead

68ef8e35af · update triton commit hash · Updated 2025-10-20 08:28:17 +08:00 · 6 behind, 1 ahead

b87d3ddd0d · Update base for Update on "[WIP] Support python slicing with tensor inputs." · Updated 2025-10-20 08:27:06 +08:00 · 31 behind, 14 ahead

6e61ce45fe · Update on "[WIP] Support python slicing with tensor inputs." · Updated 2025-10-20 08:27:06 +08:00 · 31 behind, 23 ahead

a2c2c3295c · WIP Support python slicing with data depedennt inptu tensors maybe · Updated 2025-10-20 08:27:06 +08:00 · 31 behind, 1 ahead

61d9a5180e · [Fix XPU CI] [Inductor UT] Fix test cases broken by community. (#165714) · Updated 2025-10-20 07:59:04 +08:00 · 10 behind, 0 ahead · Included

23cba2d68c · fix typo and lint · Updated 2025-10-20 07:52:56 +08:00 · 941 behind, 24 ahead

86279c6f25 · Merge remote-tracking branch 'origin/main' into attention_benchmark · Updated 2025-10-20 07:08:23 +08:00 · 9 behind, 7 ahead

f194b740f7 · resolve comments:fix input_gen_fns API; tensor shape infer with siz_hints instead of ir_node_to_tensor; seperated non_tensor_args for Functools.partial for kwargs ; added Single output - layout assertion; lint · Updated 2025-10-20 05:43:38 +08:00 · 941 behind, 22 ahead

159131321b · [rfc] add debug mode to print meta in fx graphs · Updated 2025-10-20 05:00:58 +08:00 · 647 behind, 1 ahead

d5a5c7c470 · [rfc] add debug mode to print meta in fx graphs · Updated 2025-10-20 05:00:54 +08:00 · 647 behind, 1 ahead

cc3852ada5 · Update (base update) · Updated 2025-10-20 01:58:34 +08:00 · 206 behind, 2 ahead

5ad32e5c17 · Update · Updated 2025-10-20 01:58:34 +08:00 · 206 behind, 4 ahead

1db4025783 · [FlexAttention] Add mechanism to get optimal autotune decision · Updated 2025-10-20 01:58:34 +08:00 · 206 behind, 3 ahead

a317caf67e · [FlexAttention] Fix dynamic shaped heads flex_flash check · Updated 2025-10-20 01:58:33 +08:00 · 206 behind, 2 ahead

22841d772d · Update (base update) · Updated 2025-10-20 01:58:31 +08:00 · 206 behind, 1 ahead

89eca67ddc · Update · Updated 2025-10-20 01:58:31 +08:00 · 206 behind, 2 ahead