AI lab engineers are favoring Rust, as highlighted by Andrej Karpathy's recent note
Rust is slipping into the daily toolkit of the people who actually ship the models that power today’s chatbots. Across research groups and product teams, engineers are swapping out familiar Python‑centric stacks for a language that promises tighter memory control and predictable latency. The shift isn’t just hype from hobbyists; it’s showing up in the codebases that underpin the most widely deployed language models.
When senior developers start to rebuild foundational components—like the piece that breaks raw text into tokens—using Rust instead of the usual high‑level libraries, it signals a deeper appetite for performance‑first engineering. That move matters because tokenisation sits at the very front of the inference pipeline, where any slowdown multiplies across billions of requests. One high‑profile AI practitioner highlighted this trend in a late‑last‑year post, noting his own decision to craft a tokeniser in Rust rather than lean on existing tools.
The observation offers a concrete glimpse into how major labs are rethinking their software choices.
---
The same pattern shows up inside major AI labs. Andrej Karpathy provided one of the clearest signals late last year. While documenting experiments with AI-assisted coding, he mentioned building a tokeniser in Rust instead of relying on existing libraries.
Tokenisers sit directly on the performance path of every model. Choosing Rust there is a quiet admission that Python's convenience ends where correctness and speed begin. Even Elon Musk has publicly agreed that Rust is the language for AGI.
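To make the stakes concrete, here is a minimal sketch of what a tokeniser's hot path looks like in Rust. This is an illustrative whitespace tokeniser only, not Karpathy's actual implementation; production LLM tokenisers use byte-pair encoding over a learned vocabulary, but the point stands: this function runs on every request, so it is exactly where Rust's zero-cost abstractions pay off.

```rust
// Illustrative sketch: a minimal whitespace tokeniser.
// Real LLM tokenisers (e.g. BPE-based ones) are far more involved,
// but they sit on the same per-request hot path.
fn tokenize(text: &str) -> Vec<String> {
    // split_whitespace allocates no intermediate strings until collect,
    // and the borrow checker guarantees `text` outlives the iteration.
    text.split_whitespace().map(|s| s.to_string()).collect()
}

fn main() {
    let tokens = tokenize("Rust on the inference path");
    println!("{:?}", tokens); // ["Rust", "on", "the", "inference", "path"]
}
```

Because this code path executes billions of times, even small constant-factor wins compound, which is the argument for dropping a high-level interpreted layer here.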
At OpenAI, the shift is explicit rather than implied. The Codex command line interface was rewritten entirely in Rust and released publicly. OpenAI said the Rust version is faster, more stable and easier to reason about as it scales.
Engineers pointed directly to Rust's memory-safety guarantees as a reason. For autonomous agents that read and write files, manage tools and run unattended, that property matters far more than syntactic comfort. A similar learning curve is visible at Anthropic.
Engineers there have discussed learning Rust while building tooling around Claude, often with AI systems helping them write the code itself. Rust is no longer treated as too difficult for fast-moving teams. AI assistance has flattened the learning curve enough that teams can get Rust's benefits without years of accumulated muscle memory.
Popular Meta apps, including Instagram and Facebook Messenger, relied on a decade-old C codebase for their messaging library, which led to memory-management problems and a subpar developer experience. In July last year, the company finally switched to Rust. On an episode of the Meta Tech Podcast, three of the company's engineers, Eliane W, Buping W and Pascal Hartig, shared their experience with C, discussing the challenges they faced, the migration process and their success with Rust.
"I think one of the biggest things about Rust is the compile-time memory safety enforcement."
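A small sketch of what that compile-time enforcement looks like in practice. The function and variable names here are hypothetical, chosen only to illustrate the guarantee: the borrow checker ties every reference to the lifetime of the data it points at, so the class of dangling-pointer bugs that plagued the old C codebase cannot compile in the first place.

```rust
// Hypothetical example of Rust's compile-time memory safety.
// The lifetime annotation 'a ties the returned reference to both inputs.
fn longest<'a>(a: &'a str, b: &'a str) -> &'a str {
    if a.len() >= b.len() { a } else { b }
}

fn main() {
    let s1 = String::from("inference");
    let result;
    {
        let s2 = String::from("tokeniser");
        // Copying the result out is fine; holding a reference into `s2`
        // past this scope would be rejected at compile time, because the
        // borrow checker sees that `s2` is dropped when the scope ends.
        result = longest(&s1, &s2).to_string();
    }
    println!("longer word: {}", result);
}
```

In C, the equivalent dangling-reference mistake compiles cleanly and fails at runtime; in Rust it is a build error, which is the guarantee the Meta engineers are pointing to.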
Rust is clearly gaining traction among AI engineers. Greg Brockman's remark that “Rust is a perfect language for agents” sparked an immediate chorus of agreement across the community. Is this enthusiasm enough to reshape tooling?
Within hours, developers echoed the sentiment, suggesting a shared confidence in the language's suitability. The 2025 Stack Overflow Developer Survey backs this perception, naming Rust the most admired language with a 72% rating. Karpathy's tokeniser experiment, described earlier, illustrates the same appeal for performance-critical components.
With tokenisers sitting directly on the performance path, language choice there is consequential. Yet whether Rust will become the default across all AI lab tooling remains uncertain. The evidence points to strong enthusiasm, but broader adoption metrics are still limited.
As more labs report similar preferences, the pattern may solidify, though it is unclear if the momentum will sustain beyond current projects. For now, Rust’s growing profile reflects a notable shift in engineering preferences within the AI sector.
Further Reading
- 2025 LLM Year in Review from Andrej Karpathy - MLOps Newsletter
- Andrej Karpathy on Software 3.0: Software in the Age of AI - Latent Space
- 2025 LLM Year in Review - karpathy.bearblog.dev
Common Questions Answered
Why are AI labs like those mentioned by Andrej Karpathy switching from Python to Rust for building tokenisers?
Karpathy highlighted that tokenisers sit directly on the performance path of every model, making speed and correctness critical. Rust offers tighter memory control and predictable latency, which Python cannot guarantee, leading engineers to prefer it for core components.
What did Greg Brockman say about Rust's suitability for AI agents, and how did the community react?
Greg Brockman remarked that "Rust is a perfect language for agents," sparking immediate agreement among developers. Within hours, many echoed his sentiment, indicating a shared confidence in Rust's fit for building reliable AI agents.
How does the 2025 Stack Overflow Developer Survey reflect the growing admiration for Rust among AI engineers?
The survey named Rust the most admired language with a 72% rating, underscoring its rising reputation. This high admiration aligns with AI engineers' increasing adoption of Rust for performance-critical workloads.
What advantages does Rust provide over Python for engineers working on foundational AI components?
Rust delivers tighter memory management and predictable latency, essential for low‑level tasks like tokenisation. These advantages address the limitations of Python's convenience when correctness and speed become paramount.