Paper Detail
Terry Chen, Zhifan Ye, Bing Xu, Zihao Ye, Timmy Liu, Ali Hassani, Tianqi Chen, Andrew Kerr, Haicheng Wu, Yang Xu, Yu-Jung Chen, Hanfeng Chen, Aditya Kane, Ronny Krashinsky, Ming-Yu Liu, Vinod Grover, Luis Ceze, Roger Bringmann, John Tran, Wei Liu, Fung Xie, Michael Lightstone, Humphrey Shi
Agentic Variation Operators (AVO) are a new family of evolutionary variation operators that replace the fixed mutation, crossover, and hand-designed heuristics of classical evolutionary search with autonomous coding agents. Rather than confining a language model to candidate generation within a prescribed pipeline, AVO instantiates variation as a self-directed agent loop that can consult the current lineage, a domain-specific knowledge base, and execution feedback to propose, repair, critique, and verify implementation edits. We evaluate AVO on attention, one of the most aggressively optimized kernel targets in AI, on NVIDIA Blackwell (B200) GPUs. Over 7 days of continuous autonomous evolution on multi-head attention, AVO discovers kernels that outperform cuDNN by up to 3.5% and FlashAttention-4 by up to 10.5% across the evaluated configurations. The discovered optimizations transfer readily to grouped-query attention, requiring only 30 minutes of additional autonomous adaptation and yielding gains of up to 7.0% over cuDNN and 9.3% over FlashAttention-4. Together, these results show that agentic variation operators move beyond prior LLM-in-the-loop evolutionary pipelines, elevating the agent from candidate generator to variation operator and discovering performance-critical micro-architectural optimizations that yield kernels surpassing state-of-the-art expert-engineered attention implementations on today's most advanced GPU hardware.
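A minimal sketch of the control flow the abstract describes, to make the operator concrete: the evolutionary outer loop stays classical, while variation is an agent loop that proposes an edit, verifies it by compiling and running, and repairs it using execution feedback, consulting the lineage and a knowledge base along the way. All names here (`Candidate`, `compile_and_benchmark`, `agent_propose_edit`, `agentic_variation`) are hypothetical illustrations, not the paper's API; the harness and the coding agent are stubbed out.

```python
import random
from dataclasses import dataclass, field

@dataclass
class Candidate:
    source: str                                  # kernel source text
    fitness: float = float("-inf")               # measured performance (higher is better)
    lineage: list = field(default_factory=list)  # edit history the agent can consult

def compile_and_benchmark(source: str) -> tuple[bool, float, str]:
    """Stand-in for a real build-and-profile harness.
    Returns (compiled_ok, fitness, execution_feedback)."""
    ok = "syntax error" not in source
    return ok, (random.random() if ok else float("-inf")), "profiler log (placeholder)"

def agent_propose_edit(source: str, lineage: list, knowledge_base: list,
                       feedback: str) -> str:
    """Stand-in for the coding agent: a real implementation would prompt an
    LLM with the source, its lineage, domain notes, and execution feedback."""
    return source + f"\n// agent edit informed by: {feedback[:40]}"

def agentic_variation(parent: Candidate, knowledge_base: list,
                      max_repair_rounds: int = 3) -> Candidate:
    """One AVO step: propose an edit, verify it by compiling and running,
    and repair from execution feedback until the child works."""
    source, feedback = parent.source, "profile of parent kernel"
    for _ in range(max_repair_rounds):
        source = agent_propose_edit(source, parent.lineage, knowledge_base, feedback)
        ok, fitness, feedback = compile_and_benchmark(source)
        if ok:  # verified: keep the edit and record it in the lineage
            return Candidate(source, fitness, parent.lineage + [source])
    return parent  # every repair round failed; keep the parent unchanged

def evolve(seed_source: str, knowledge_base: list,
           generations: int = 10, population_size: int = 4) -> Candidate:
    """Classical evolutionary outer loop with the agent as variation operator."""
    population = [Candidate(seed_source)]
    for _ in range(generations):
        parent = max(population, key=lambda c: c.fitness)  # greedy parent selection
        population.append(agentic_variation(parent, knowledge_base))
        population.sort(key=lambda c: c.fitness, reverse=True)
        population = population[:population_size]          # truncation survival
    return population[0]

if __name__ == "__main__":
    best = evolve("/* baseline attention kernel */", ["swizzle shared-memory layouts"])
    print(f"best fitness: {best.fitness:.3f}")
```

The key structural point of the sketch is that selection and survival are ordinary evolutionary bookkeeping, while everything inside `agentic_variation` (propose, verify, repair) is delegated to the agent, which is where AVO departs from fixed mutation and crossover operators.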
@misc{chen2026avo,
title = {AVO: Agentic Variation Operators for Autonomous Evolutionary Search},
author = {Terry Chen and Zhifan Ye and Bing Xu and Zihao Ye and Timmy Liu and Ali Hassani and Tianqi Chen and Andrew Kerr and Haicheng Wu and Yang Xu and Yu-Jung Chen and Hanfeng Chen and Aditya Kane and Ronny Krashinsky and Ming-Yu Liu and Vinod Grover and Luis Ceze and Roger Bringmann and John Tran and Wei Liu and Fung Xie and Michael Lightstone and Humphrey Shi},
year = {2026},
abstract = {Agentic Variation Operators (AVO) are a new family of evolutionary variation operators that replace the fixed mutation, crossover, and hand-designed heuristics of classical evolutionary search with autonomous coding agents. Rather than confining a language model to candidate generation within a prescribed pipeline, AVO instantiates variation as a self-directed agent loop that can consult the current lineage, a domain-specific knowledge base, and execution feedback to propose, repair, critique, and verify implementation edits. We evaluate AVO on attention, one of the most aggressively optimized kernel targets in AI, on NVIDIA Blackwell (B200) GPUs. Over 7 days of continuous autonomous evolution on multi-head attention, AVO discovers kernels that outperform cuDNN by up to 3.5% and FlashAttention-4 by up to 10.5% across the evaluated configurations. The discovered optimizations transfer readily to grouped-query attention, requiring only 30 minutes of additional autonomous adaptation and yielding gains of up to 7.0% over cuDNN and 9.3% over FlashAttention-4. Together, these results show that agentic variation operators move beyond prior LLM-in-the-loop evolutionary pipelines, elevating the agent from candidate generator to variation operator and discovering performance-critical micro-architectural optimizations that yield kernels surpassing state-of-the-art expert-engineered attention implementations on today's most advanced GPU hardware.},
url = {https://huggingface.co/papers/2603.24517},
keywords = {evolutionary variation operators, autonomous coding agents, language model, candidate generation, domain-specific knowledge base, execution feedback, attention kernels, NVIDIA Blackwell, cuDNN, FlashAttention-4, grouped-query attention, micro-architectural optimizations, huggingface daily},
eprint = {2603.24517},
archiveprefix = {arXiv},
}