Notes on building a personal AI operating environment.
I write about what I have built around Anthropic's Claude Code: the memory system, hooks, safety layers, infrastructure, and the specific patterns that have earned their place. Every post is grounded in something concrete rather than in speculation about the future of AI.
Techniques
All techniques →

Five skills that have shaped how I work, each with a short page explaining what it does and why it earned a permanent slot in my own setup.

- A persistent task runner with a retry loop and an email fallback, so "email me when done" actually means that. Read →
- Claude, Gemini and GPT-5.4 on the same question in parallel, with a blind round followed by informed rounds; the receipts come from several months of use. Read →
- Nightly consolidation that promotes useful session insights into the canonical topic files, which is the memory layer that stops rotting. Read →
- Build a task-specific context file before the task starts; the sub-session then reads only that file, which stops the usual context-window guessing. Read →
- Twenty-three numbered patterns of how I have broken my own system, checked against every non-trivial change before it ships. Read →

Featured essays
A personal AI operating environment: worked example and receipts
What happens when one person uses an AI coding assistant as the primary interface to a real physical and operational life, and systematically fixes every failure that occurs along the way.
Six layers of defence for an AI agent over a 3D printer
The printer-safety architecture I now run, the specific incidents that produced each layer, and why the pattern generalises beyond 3D printing.
Lessons as code: turning postmortems into pre-flight checks
A file I read at the start of every session, twenty-three numbered patterns of how I have broken my own system, and the pre-flight skill that checks proposed work against them. The pattern is the most portable thing on this site.
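The pre-flight idea above can be sketched in a few lines. Everything here is illustrative: the `Lesson` structure, the keyword triggers, and the three example patterns are assumptions for the sketch, not the actual file format or matching logic behind the skill.

```python
# A minimal sketch of "lessons as code": numbered failure patterns checked
# against a proposed change before work starts. The LESSONS list and the
# keyword matching are illustrative placeholders, not the real file.
from dataclasses import dataclass

@dataclass
class Lesson:
    number: int
    summary: str
    triggers: tuple  # keywords that suggest this pattern applies

LESSONS = [
    Lesson(1, "Untested cron changes fail silently overnight", ("cron", "schedule")),
    Lesson(2, "Hooks that write to shared state race each other", ("hook", "shared")),
    Lesson(3, "Deleting before backing up is irreversible", ("delete", "remove")),
]

def preflight(proposed_change: str) -> list[Lesson]:
    """Return every lesson whose trigger keywords appear in the proposal."""
    text = proposed_change.lower()
    return [l for l in LESSONS if any(t in text for t in l.triggers)]

for lesson in preflight("Delete the old hook and reschedule the cron job"):
    print(f"Pattern {lesson.number}: {lesson.summary}")
```

The point of the pattern is the shape, not the matcher: postmortem lessons live in one place, and every proposal passes through them before any work ships.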
Recent writing
All writing →

- Essay: Context as a first-class artifact: the /deep-context pipeline. Stop hoping that relevant information will fit in the context window. Manufacture a task-specific context file before the task starts. The architecture, the real numbers, and the 24 hours of dead ends that produced them.
- Story: 918MB, an Ofsted inspection, and a governor who is not a developer. One of the schools in my kids' federation was rated Requires Improvement and is waiting on a re-inspection. The evidence base is 1,650 files and 918MB, which no governor was realistically going to read end to end, so a few of us built something that could.
- Story: Building from my phone while watching the kids. The five-step evolution of how I reach the development environment on the Mac Mini from an iPhone in a playground. The useful insight is that where you build shapes what you build.
- Essay: "Email me when done": a persistent task runner with a delivery guarantee. Long-running tasks fail silently if the session dies before the result is ready. This is the runner I built to make "email me when done" actually mean that: a retry loop, fallback email paths, and a last-ditch file.
- Story: From model to agent: what changed when I stopped predicting and started investigating. Why the regression models that came out of the hackathon were replaced within weeks by three agentic tools. The short version: probability scores without narrative are not what analysts need.
- Essay: Memory that sleeps: a tiered memory architecture with daily consolidation. A two-tier retrieval system (semantic plus keyword), canonical topic files as curated truth, and a nightly consolidation pass that promotes session insights into the canonical tier. Why each piece exists and what fails without it.
- Essay: One hour, one command: disaster recovery for a solo AI shop. What is backed up, what is intentionally excluded, and a sequence that reconstitutes the whole personal AI operating environment in under an hour. The honest version, including the accepted gaps.
- Story: One hour, one marketing list. A vague ask ("give me a list of prospects that look like X") turned into a working pipeline across three data sources in under sixty minutes. A small build, but the speed is the point.
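The delivery-guarantee pattern behind the task runner (retry loop, fallback delivery channels, last-ditch file) can be sketched roughly like this. The function names and the channel interface are assumptions for the sketch; the real runner's transport and retry policy are described in the essay, not here.

```python
# Sketch of "email me when done" with a delivery guarantee: retry the task,
# then try each delivery channel in order, and if every channel fails,
# write the result to a file on disk so it is never silently lost.
import time
from pathlib import Path

def run_with_retries(task, attempts=3, delay=5):
    """Run task(), retrying on failure with a fixed delay between attempts."""
    last_error = None
    for _ in range(attempts):
        try:
            return task()
        except Exception as exc:  # deliberately broad: a runner must survive anything
            last_error = exc
            time.sleep(delay)
    raise RuntimeError(f"task failed after {attempts} attempts") from last_error

def deliver(result: str, channels, fallback_path="result.txt"):
    """Try each delivery channel in order; write to disk if all of them fail."""
    for send in channels:
        try:
            send(result)
            return "delivered"
        except Exception:
            continue  # this channel is down; try the next one
    Path(fallback_path).write_text(result)  # the last-ditch file
    return "written to disk"
```

The design choice worth noting is that the fallback chain ends in the filesystem rather than in a log line: a file survives a dead session, which is the failure mode the runner exists to cover.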