Optimization Arena
Attention Kernel Challenge
Build the fastest numerically faithful block-sparse attention backend for H100.
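For reference, here is a minimal, unoptimized NumPy sketch of what a block-sparse attention backend computes; fast submissions must match this kind of result numerically. The `block_mask` layout and `block_size` parameter are illustrative assumptions, not the contest's actual harness.

```python
import numpy as np

def block_sparse_attention(q, k, v, block_mask, block_size):
    """Naive reference: softmax(Q K^T / sqrt(d)) V, restricted to the
    key blocks each query block is allowed to see.
    block_mask[i, j] is True when query block i may attend to key block j
    (hypothetical layout, for illustration only)."""
    seq_len, d = q.shape
    n_blocks = seq_len // block_size
    scale = 1.0 / np.sqrt(d)
    out = np.zeros_like(q)
    for i in range(n_blocks):
        q_blk = q[i * block_size:(i + 1) * block_size]
        # Gather only the key/value rows this query block may attend to.
        cols = [j for j in range(n_blocks) if block_mask[i, j]]
        if not cols:
            continue
        k_rows = np.concatenate(
            [k[j * block_size:(j + 1) * block_size] for j in cols])
        v_rows = np.concatenate(
            [v[j * block_size:(j + 1) * block_size] for j in cols])
        scores = q_blk @ k_rows.T * scale
        scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
        w = np.exp(scores)
        w /= w.sum(axis=-1, keepdims=True)
        out[i * block_size:(i + 1) * block_size] = w @ v_rows
    return out
```

With an all-True mask this reduces to dense attention, which makes a convenient correctness check before sparsifying.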
Leaderboard
Current standings
Submissions are ranked by score; lower is better.
After submissions close at 4 PM PT, the top 20 scorers will each be rerun three times on a fresh hidden seed, and the median time will determine final placement.
| Author | Submission | Attempts | Score |
|---|---|---|---|
| @swar_ja | big balls | 2 | 1.63 ms |
| @shiyuSQ | surf2 | 5 | 8.91 ms |
| @compusophy | son of a batch | 2 | 10.69 ms |
| @ryanli | AskSurf | 3 | 10.73 ms |
| @ordiboysxwg | XWELL Flash V3 | 2 | 11.99 ms |
| @ajinkya_km | run5 | 7 | 12.52 ms |
| @cairoeth | divine run | 1 | 12.76 ms |
| @yelaryss | Attention is All I Need | 6 | 13.04 ms |
| @0xpangea | test | 1 | 13.07 ms |
| @handsomeblob | Oracul | 2 | 14.54 ms |
| @0xethcall | horsed | 1 | 16.00 ms |
| @danrobinson | Attention Seeking | 2 | 20.84 ms |
| @0xhellno | Too cute to stay focused | 2 | 28.47 ms |
| @cyansding | default | 1 | 35.66 ms |
| @malik672_ | idk | 1 | 37.47 ms |
| @RSNTinker | Tempest Flash v0.11 | 2 | 41.48 ms |
| @kyan_novoyd | Streaming Online Softmax v2 Safe | 1 | 251.44 ms |
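The final-placement rule described above (three reruns on a fresh hidden seed, median taken) can be sketched as follows; the function name is illustrative, not part of the contest harness.

```python
from statistics import median

def final_score(rerun_times_ms):
    """Final placement score: the median of the three rerun timings,
    which discards a single lucky or unlucky run in either direction."""
    assert len(rerun_times_ms) == 3
    return median(rerun_times_ms)
```

Using the median rather than the mean keeps one anomalous run (e.g. a cold cache or a transient clock throttle) from moving a submission's final placement.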