You run mytool --help and wait. Two seconds. Three. No network requests, no error, no disk thrashing. Just Python dutifully loading PyTorch, NumPy, pandas, and a dozen other heavy libraries it will never touch — all so it can print a usage message and exit. This isn’t a hypothetical scenario from a conference slide deck. This is what Instagram engineers were dealing with every day in production. It’s what Hudson River Trading’s researchers were enduring across hundreds of CLI tools in their monorepo. And it’s the reason Python now has a lazy keyword coming in version 3.15 — though getting there took three years, two PEPs, a Steering Council rejection, a Language Summit showdown, and production evidence from some of the largest Python codebases on the planet.
The Companies That Couldn’t Wait
Long before the Python Steering Council had any consensus on how to solve the import problem, the companies running the biggest Python codebases had already solved it themselves. They had to. Waiting wasn’t an option.
Meta built Cinder, a performance-oriented fork of CPython that included lazy imports alongside a JIT compiler and a handful of other aggressive optimizations. Instagram’s backend ran on Cinder. The team documented their results: startup time improved by up to 70%, and memory usage dropped by up to 40% on real-world CLI tools. Germán Méndez Bravo, who implemented the lazy imports feature inside Instagram’s codebase, later described how the transition was surprisingly smooth for most internal code — the overwhelming majority of modules just worked when laziness was enabled globally.
Hudson River Trading (HRT), the quantitative trading firm, did something similar. Their Python ecosystem lives in a monorepo where internal modules are importable everywhere — convenient for collaboration, painful for performance. In the most tangled portions of their codebase, a single script’s imports alone could take over thirty seconds. A small volunteer team built a prototype during HRT’s 2023 internal hackathon (they call it “Surge”), forking CPython 3.10 and cherry-picking lazy import commits from Cinder. The prototype worked well enough to get greenlit for full-time investment. By Q2 2025, HRT had migrated the entire firm to lazy-by-default Python. Their August 2025 blog post is unusually candid: tools that previously cost users several minutes just to start up now launched in seconds.
The point here isn’t that these companies are clever. The point is that the need for lazy imports was real enough, and urgent enough, that sophisticated engineering organizations were willing to fork CPython and maintain their own interpreters to get it. That’s not something anyone does for a nice-to-have feature. That’s the kind of signal that a language’s governance body can’t easily ignore.
PEP 690: The First Attempt
In April 2022, Germán Méndez Bravo and Carl Meyer — both at Meta — wrote PEP 690. Barry Warsaw, a longtime Python core developer then at LinkedIn, sponsored the proposal. The design was straightforward and practical: add a -L flag (and a corresponding importlib.set_lazy_imports() API) to make all imports lazy by default. Application developers could flip the switch once and get the gains across their entire codebase without annotating thousands of individual import lines.
The workaround PEP 690 was trying to replace looked like this:
# Python 3.x — common workaround pattern (pre-PEP 810)
def get_numpy():
    import numpy as np  # deferred inside function
    return np
This pattern works in isolation, but it’s deeply unsatisfying at scale. It forces every module to restructure its code around deferred imports. It kills static analysis — tools like mypy and pyright can’t see the imports at module level. It breaks the from module import name idiom that Python developers use thousands of times a day. And it’s fragile: one accidental top-level import of a heavy dependency anywhere in the chain undoes the entire effort.
Analysis of CPython’s own standard library showed that roughly 17% of all imports outside test files — nearly 3,500 imports across 730 files — were already placed inside functions specifically to defer execution. Developers were already doing lazy imports by hand. They just didn’t have language-level support for it.
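The standard library has long shipped a halfway house for exactly this: importlib.util.LazyLoader defers a module's execution until first attribute access. Here is a minimal helper based on the recipe in the importlib documentation (the lazy_import wrapper name is our own, not a stdlib function):

```python
import importlib.util
import sys

def lazy_import(name):
    """Return a module whose top-level code runs only on first attribute access."""
    spec = importlib.util.find_spec(name)
    spec.loader = importlib.util.LazyLoader(spec.loader)
    module = importlib.util.module_from_spec(spec)
    # Register the not-yet-executed module so repeated imports share it.
    sys.modules[name] = module
    spec.loader.exec_module(module)  # with LazyLoader, this defers execution
    return module

json = lazy_import("json")  # json's module body has not run yet
print(json.dumps([1, 2]))   # first attribute access triggers the real load
```

It works, but it only helps code that is willing to call a helper function instead of writing an import statement, which is part of why so much hand-rolled deferral exists instead.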
PEP 690 proposed fixing this at the interpreter level. But the Steering Council said no.
In December 2022, Gregory P. Smith posted the rejection on behalf of the council. They acknowledged the problem: faster startup time is desirable, they wrote, and large CLI tools in particular suffer because that’s a human user experience. But they identified a fundamental problem with the -L flag approach. It would create two Pythons — one where imports are eager, one where they’re lazy. Libraries would need to be tested in both modes. Code that ran fine under eager imports could fail silently under lazy imports, with exceptions popping up at first use rather than at import time. The Steering Council described this as creating “a split in the community over how imports work.”
They went further. They noted that a world where Python only supported lazy imports would probably be great — but that world can’t exist now. Python has decades of code that relies on import-time side effects. Introducing a global lazy mode wouldn’t just add a feature; it would add complexity to the entire ecosystem.
The council also flagged implementation concerns. PEP 690 had modified Python’s core dict internals to support lazy loading. The PyDict_Next function, used throughout CPython’s C API, would need to trigger deferred imports — a fragile, performance-sensitive change that would bleed into every part of the runtime that iterates over dictionaries.
PEP 690 was dead. But the problem it addressed wasn’t going anywhere.
The Language Summit Moment
Carl Meyer didn’t give up. At PyCon US 2023’s Language Summit in Salt Lake City, he raised the question again in a lightning talk: is lazy imports dead, or is there a path forward?
He brought receipts. The Instagram team had seen startup time improvements of 50–80% and memory reductions of 40–90% by adopting lazy imports in their Cinder fork. These weren’t projections or theoretical calculations. These were production numbers from one of the most-used Python applications in the world.
Meyer floated several possible modifications to the rejected proposal. He asked the room to weigh in on each one. Should lazy imports use explicit opt-in syntax — something like lazy import inspect — instead of a global flag? Should the PEP include a clear roadmap for eventually making laziness the default? Should the implementation avoid modifying the dict data structure? Should the feature support generalized “lazy names” beyond just imports?
The room unanimously agreed that a revised proposal would be easier to support if it avoided changes to dict internals. They were split on whether explicit syntax or a default-lazy approach was the right path. But one response stood out. Only a single attendee said they could never support any form of lazy imports in Python. That attendee was Thomas Wouters — a sitting member of the Steering Council.
Meyer noted the irony. The room was mostly open to trying again, but the one person who said “never” happened to be in a position of governance authority. It wasn’t a hostile exchange. It was a genuine disagreement about whether the feature could be added without fracturing the ecosystem. The kind of disagreement that doesn’t get resolved in a thirty-minute lightning talk.
PEP 810: The Right Design
Three years after PEP 690, a new proposal emerged — and this time, the authorship told a story. PEP 810 was published on October 2, 2025, co-authored by Pablo Galindo Salgado, Germán Méndez Bravo, Thomas Wouters, Dino Viehland, Brittany Reynoso, Noah Kim, and Tim Stumbaugh. Galindo Salgado was a sitting Steering Council member. Thomas Wouters — the “never” vote from the Language Summit — was also a co-author. The people who had been most cautious about lazy imports were now helping design the solution.
The design inversion is the heart of the story. Instead of opt-out (everything lazy by default, mark exceptions as eager), PEP 810 is opt-in. Instead of a global flag, it introduces a keyword on individual import statements:
# Python 3.15+ (PEP 810 — not yet released)
lazy import json
lazy import numpy as np
lazy from pathlib import Path
The lazy keyword is soft: it is treated as a keyword only when it appears directly before an import statement. Everywhere else, lazy remains available as a variable name, a function name, or a class name. No existing code breaks.
What happens at runtime is elegant in its simplicity. When the interpreter encounters lazy import json, it doesn’t load the json module. It doesn’t execute any of json’s top-level code. It doesn’t add json to sys.modules. Instead, it binds the name json in the current module’s namespace to a lightweight proxy object. That proxy sits there, dormant, taking up almost no memory. The moment your code actually uses json — calls json.dumps(), accesses json.JSONEncoder, anything — the proxy intercepts the access, performs the real import, replaces itself with the actual module object, and forwards the operation. From that point on, json behaves identically to a normal import. The switch is transparent.
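The mechanism can be approximated in pure Python. To be clear, this is not CPython's implementation — the real proxy lives inside the interpreter and rebinds the name in place — but the hypothetical LazyProxy class below sketches the intercept-then-forward idea:

```python
import importlib

class LazyProxy:
    """Sketch of a deferred-import proxy: hold only the module name,
    and perform the real import on first attribute access."""

    def __init__(self, name):
        self._name = name
        self._module = None  # the real module, once loaded

    def __getattr__(self, attr):
        # __getattr__ fires only for names NOT found by normal lookup,
        # so the _name/_module attributes above are reached without recursion.
        if self._module is None:
            self._module = importlib.import_module(self._name)
        return getattr(self._module, attr)

mod = LazyProxy("json")     # no import has happened yet
assert mod._module is None
print(mod.dumps({"a": 1}))  # first use loads json and forwards the call
```

Unlike the real feature, this toy proxy never replaces itself in the namespace, so every access pays a small forwarding cost — one reason the actual swap happens inside the interpreter.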
This proxy-based approach is a deliberate departure from PEP 690’s implementation strategy. PEP 690 had modified CPython’s internal dictionary type to support lazy resolution — meaning every dictionary operation throughout the entire interpreter had to account for the possibility of lazy objects. PEP 810 confines the laziness to the proxy objects themselves. Cleaner boundary. Easier to reason about. No impact on unrelated dictionary operations.
The practical impact for CLI tools is immediate. Consider this common pattern:
# Before (Python 3.x) — eager loading
import argparse
import numpy as np # loaded even for --help
import torch # loaded even for --help
import my_heavy_module # loaded even for --help
def main():
    parser = argparse.ArgumentParser()
    # ...
Every one of those imports executes immediately when the module loads, even if the user just wants to see usage information. With PEP 810:
# After (Python 3.15+, PEP 810) — only load what you use
import argparse
lazy import numpy as np
lazy import torch
lazy import my_heavy_module
def main():
    parser = argparse.ArgumentParser()
    # numpy, torch, my_heavy_module never load if unused
If the user runs the tool with --help, argparse does its thing and the program exits. NumPy, PyTorch, and any other heavy dependencies never load. The startup cost drops from seconds to milliseconds. One keyword per line. No restructuring. No function-level import hacks.
PEP 810 also includes a global lazy imports flag and a filter API for scenarios where teams want to experiment with broader laziness — similar to what HRT and Meta were doing internally. But the baseline model is granular, explicit, and opt-in. One import at a time.
What Didn’t Change (And Why That Matters)
PEP 810 is not magic, and pretending otherwise would be a disservice to anyone planning to adopt it. There are real constraints and genuine edge cases.
Wildcard imports — from foo import * — cannot be lazy. This makes sense: a wildcard import requires the interpreter to resolve the module immediately so it knows which names to bring into scope. There’s no way to defer that without changing what the wildcard means. If you try to write lazy from foo import *, it’s a syntax error.
Import errors behave differently under lazy imports. Normally, a ModuleNotFoundError fires at the import line. With a lazy import, that error is deferred to first use. If your code has a lazy import nonexistent_module at the top of a file but never actually touches nonexistent_module, the error never fires. This is fine for short-lived scripts. For long-running processes, it means an import failure might surface minutes or hours into execution, in whatever thread first happens to touch the name. PEP 810 is explicit about this tradeoff: lazy imports shift when errors occur, which is exactly why the feature requires an explicit keyword instead of a silent global flag.
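Python 3.15 is not out yet, but the same error-shifting behavior can be reproduced today with importlib.util.LazyLoader. The demo_broken module below is a throwaway file created on disk purely for the demonstration:

```python
import importlib
import importlib.util
import os
import sys
import tempfile

# A throwaway module whose top-level code fails outright.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "demo_broken.py"), "w") as f:
    f.write("raise RuntimeError('boom at import time')\n")
sys.path.insert(0, tmp)
importlib.invalidate_caches()

spec = importlib.util.find_spec("demo_broken")
spec.loader = importlib.util.LazyLoader(spec.loader)
demo_broken = importlib.util.module_from_spec(spec)
sys.modules["demo_broken"] = demo_broken
spec.loader.exec_module(demo_broken)  # no exception here: execution is deferred

deferred_error = None
try:
    demo_broken.anything  # first touch runs the module body -> it raises now
except RuntimeError as exc:
    deferred_error = str(exc)
print("error surfaced at first use:", deferred_error)
```

The failure point has moved from the import line to the first attribute access — precisely the tradeoff the PEP asks adopters to make consciously.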
Thread safety is preserved — the import lock discipline that CPython already uses is maintained. But because deferred loading can now happen in any thread that first touches a lazy name, developers working with multi-threaded code need to be aware that an import (and all its side effects) might execute in a thread they didn’t expect.
HRT’s migration blog documents exactly these failure modes. They hit them during their rollout. One common pattern: module foo imports bar.baz internally, and other code relies on bar.baz being available as a side effect of importing foo. Under lazy imports, foo hasn’t loaded yet, so bar.baz isn’t available either. HRT solved this by maintaining an exclusion list — modules that must always import eagerly. PEP 810 provides a filter API that serves the same purpose.
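That coupling is easy to reproduce. The hrt_bar and hrt_foo modules below are throwaway files invented for this demo (not anything from HRT's codebase); the pattern is what matters:

```python
import importlib
import os
import sys
import tempfile

# hrt_foo.py imports hrt_bar.baz at top level; other code then relies on
# hrt_bar.baz existing purely as a side effect of importing hrt_foo.
tmp = tempfile.mkdtemp()
os.mkdir(os.path.join(tmp, "hrt_bar"))
open(os.path.join(tmp, "hrt_bar", "__init__.py"), "w").close()
with open(os.path.join(tmp, "hrt_bar", "baz.py"), "w") as f:
    f.write("VALUE = 42\n")
with open(os.path.join(tmp, "hrt_foo.py"), "w") as f:
    f.write("import hrt_bar.baz\n")
sys.path.insert(0, tmp)
importlib.invalidate_caches()

import hrt_bar
before = hasattr(hrt_bar, "baz")  # False: the submodule was never imported
import hrt_foo                    # eager import runs hrt_foo's body now...
after = hasattr(hrt_bar, "baz")   # True: ...attaching baz as a side effect
print(before, after)
```

If the import of hrt_foo were lazy and nothing ever touched it, hrt_bar.baz would simply never appear — which is exactly the breakage an eager-import exclusion list guards against.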
None of these constraints are dealbreakers. But they’re real, and anyone adopting lazy imports should understand them rather than discovering them in production.
Three Years Was the Right Amount of Time
Here’s what actually happened over those three years. The wrong design was proposed. It was rejected for legitimate reasons. The right design needed time to be formulated by people who understood the rejection — and crucially, by some of the same people who issued it.
Pablo Galindo Salgado co-authoring PEP 810 wasn’t incidental. Having a Steering Council member as an author meant the proposal was shaped with the council’s concerns already internalized. Thomas Wouters going from “never” to co-author tells you the design genuinely addressed his objections rather than steamrolling them. The explicit opt-in syntax, the proxy-based implementation, the preservation of eager behavior as the default — every major design choice in PEP 810 maps directly to a specific concern raised during PEP 690’s rejection.
The firms running internal forks — Meta, HRT, and others — provided three years of real-world evidence. The need was real. The gains were real. The failure modes were documented, categorized, and solved. That corpus of production experience from organizations with millions of lines of Python code made PEP 810 a much easier case to argue. HRT’s blog post explicitly stated they supported the Steering Council’s rejection of PEP 690, agreeing that implicit lazy imports weren’t right for upstream — while simultaneously demonstrating that the underlying feature was transformational.
On November 3, 2025, the Python Steering Council unanimously accepted PEP 810. Barry Warsaw, writing on behalf of the council, acknowledged that this had been a feature the Python community had wanted for a long time, and that the proposal struck the right balance. The four eligible council members all voted yes. Galindo Salgado recused himself as a co-author.
The PEP drew over 450 comments during its discussion period. People debated whether defer sounded more professional than lazy. They argued about whether from . lazy import bar should be valid syntax (it won’t be — it parses as from .lazy import bar). They raised edge cases around context managers, class bodies, and exception handling. The volume of discussion was, by the authors’ own admission, “quite challenging.” But the core design held up.
By October 2026, when Python 3.15 ships, a lazy import before your heavy CLI dependency will be a one-line fix for a problem that used to require restructuring your entire module. For the Instagram engineers, the HRT researchers, and every developer who’s ever stared at a terminal waiting for --help to respond — the wait is almost over.