Compiling Dinner
When you read a recipe, you’re already programming. Ingredients are inputs. Actions—chop, stir, simmer—are instructions. The kitchen is your runtime environment, and you, the cook, are the processor. If you follow the recipe to the letter, you get the expected output: a finished dish. Miss a step, and you’ve introduced a bug. Burn the onions, and you’ve hit a runtime error.
Seen this way, recipes are languages, and cooking is compilation.
⸻
Recipes as Grammar
A recipe might say: “Sauté onions in butter until golden.” Even without thinking, you break it down:
• Action: sauté
• Ingredient: onions
• Resource: butter
• Condition: until golden
That’s parsing. You’ve taken natural language and structured it into a sequence the kitchen can execute. If you were to formalize it, you could even describe recipes in a simple grammar:
Recipe ::= Ingredients Steps
Step ::= Action Ingredient (Condition)?
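To make that concrete, here is a rough Python sketch of what parsing one such step might look like. The field names, the regular expression, and the assumption that every step follows an “action, ingredient, optional resource, optional condition” shape are illustrative choices, not a standard:

import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class Step:
    action: str                 # e.g. "sauté"
    ingredient: str             # e.g. "onions"
    resource: Optional[str]     # e.g. "butter"
    condition: Optional[str]    # e.g. "golden"

# Hypothetical pattern for steps shaped like
# "<Action> <Ingredient> [in <Resource>] [until <Condition>]".
STEP_PATTERN = re.compile(
    r"(?P<action>\w+)\s+(?P<ingredient>\w+)"
    r"(?:\s+in\s+(?P<resource>\w+))?"
    r"(?:\s+until\s+(?P<condition>.+))?",
    re.IGNORECASE,
)

def parse_step(text: str) -> Step:
    match = STEP_PATTERN.match(text.strip().rstrip("."))
    if match is None:
        raise ValueError(f"could not parse step: {text!r}")
    return Step(
        action=match["action"].lower(),
        ingredient=match["ingredient"].lower(),
        resource=match["resource"],
        condition=match["condition"],
    )

print(parse_step("Sauté onions in butter until golden"))
# Step(action='sauté', ingredient='onions', resource='butter', condition='golden')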
The idea isn’t to make cooking robotic. It’s to notice that the same mental steps compilers use to translate source code are happening every time you follow instructions in the real world.
⸻
What LLMs Add
Traditionally, building a compiler—even for toy programming languages—required deep expertise and a lot of patience. You had to design the grammar, write the parser, build analyzers, and construct an interpreter. Most people never tried.
But large language models change that. You can describe the rules in plain English—“I want a recipe language where steps look like ‘Bake chicken at 350°F for 40 minutes’”—and the model will sketch out the code to tokenize, parse, and execute it. You can ask for a shopping list generator, and it will show you how to transform recipe text into structured data. You can ask for optimizations, and it will suggest ways to reorder steps so multiple dishes finish at the same time.
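Asked for a shopping list generator, for instance, a model might hand back a scaffold along these lines. The sample steps, the regular expression, and the assumption that each step names one main ingredient plus an optional “in <resource>” clause are all invented for illustration:

import re

# Hypothetical recipe in the imagined step format; every name here is illustrative.
RECIPE = [
    "Bake chicken at 350°F for 40 minutes",
    "Sauté onions in butter until golden",
    "Simmer rice in broth for 20 minutes",
]

# Assume each step opens with an action verb and its main ingredient, with an
# optional "in <resource>" clause naming a second item to buy.
STEP = re.compile(
    r"^(?P<action>\w+)\s+(?P<ingredient>\w+)(?:.*?\bin\s+(?P<resource>\w+))?",
    re.IGNORECASE,
)

def shopping_list(steps: list[str]) -> list[str]:
    items = []
    for step in steps:
        match = STEP.match(step)
        if match:
            items.append(match["ingredient"].lower())
            if match["resource"]:
                items.append(match["resource"].lower())
    # Deduplicate while keeping first-seen order.
    return list(dict.fromkeys(items))

print(shopping_list(RECIPE))
# ['chicken', 'onions', 'butter', 'rice', 'broth']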
It won’t give you a Michelin-starred compiler. But it will give you a sketch, a scaffold. Enough to start playing.
⸻
Compilers Everywhere
Once you see cooking this way, you notice the pattern elsewhere.
• A workout routine is a program: sets, reps, rest periods—compiled into physical execution.
• A business process is a program: approvals, conditions, escalations—compiled into organizational action.
• Even music can be compiled: notes into MIDI, instructions into sound.
BNF, the formal notation once reserved for programming languages, becomes a way to articulate structure in any symbolic domain. And LLMs make it easier than ever to draft those grammars and sketch the pipelines that run them.
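A workout routine, for example, might be sketched in the same notation, with every rule name invented purely for illustration:

Routine ::= Exercise+
Exercise ::= Sets "x" Reps Movement (Rest)?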
⸻
Why It Matters
This shift matters because it changes who gets to experiment. Before, compilers were the province of specialists. Now, anyone can sketch one for their domain of interest—food, fitness, finance, music—and immediately see the connections between intent and execution.
It’s not about building perfect systems. It’s about learning to see the world through a compiler’s lens: as layers of grammar, structure, transformation, and action. That literacy makes everyday systems—from recipes to workflows—feel less like mysteries and more like programmable environments.
⸻
The Engineer’s Role
LLMs don’t replace rigor. They hand us outlines. It’s still up to humans to decide what values to encode in the compiler. A recipe compiler could optimize for speed, nutrition, or flavor—each choice shaping the output in a different way. A workflow compiler could prioritize efficiency, fairness, or resilience.
But the horizon has shifted. For the first time, you don’t need to be a specialist to imagine a compiler for your corner of the world. You just need to ask: what’s the grammar of this domain, and how do I want to execute it?
⸻
Compilers are no longer just for code. They’re for cooking, for coordination, for any place where structured intent becomes action. And with LLMs, sketching them has never been easier.