Every Choice Has a Reason

I have changed my development environment many times over forty years. Every time I thought I had found the definitive setup, something better came along. Today, I have stopped searching. Here is what I use — and more importantly, why.

From Docker Desktop and Devcontainers to WSL with Native Docker

My starting point was Docker Desktop on Windows with devcontainers — specifically Microsoft's Bookworm-based image, which came pre-configured with Node.js and TypeScript. A solid starting point. Fast to get running, reproducible, with everything in a container.

The problem is not what devcontainers do. The problem is the layer they add. Docker Desktop on Windows runs Linux containers inside a virtual machine it manages internally. Every file operation crosses multiple abstraction layers: the Windows filesystem, the VM, the container. The result is visible in practice: slow builds, unpredictable volume permission behaviour, and a constant low-level friction that accumulates over a working day.

The solution is surgical: remove Docker Desktop entirely and install Docker Engine natively inside a WSL2 distro. With this setup, the filesystem is real Linux, the container runtime is native, and the performance is real performance. No translation layer, no VM overhead, no permission surprises.
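As a sketch, installing the engine inside the distro follows Docker's standard apt-repository instructions for Debian. This assumes a Debian-based WSL2 distro with systemd enabled (set "systemd=true" under [boot] in /etc/wsl.conf, then run `wsl --shutdown` from Windows); on Trixie, Docker may not yet publish a matching suite, in which case pinning "bookworm" in the sources line works:

```shell
# Add Docker's official apt repository and signing key
sudo apt-get update
sudo apt-get install -y ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/debian/gpg \
  -o /etc/apt/keyrings/docker.asc

# Use the distro's codename; substitute "bookworm" if your codename
# has no published suite yet
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] \
  https://download.docker.com/linux/debian $(. /etc/os-release && echo "$VERSION_CODENAME") stable" \
  | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

# Install Docker Engine and the CLI plugins
sudo apt-get update
sudo apt-get install -y docker-ce docker-ce-cli containerd.io \
  docker-buildx-plugin docker-compose-plugin

# Allow running docker without sudo (takes effect on next login)
sudo usermod -aG docker "$USER"
```

After this, `docker run hello-world` inside the distro exercises the native engine directly — no VM boundary between the filesystem and the container runtime.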

This reasoning applies equally to Linux users running devcontainers: the extra containerisation layer adds complexity without equivalent benefit once your distro is already stable and well-configured.

Why Debian — and Why Trixie Specifically

When I moved to a native WSL environment I had to choose a base distribution. The obvious candidate was Ubuntu — it is popular, well documented, and Microsoft recommends it. I tried it. It worked, until it did not.

The problem surfaced when I was building my first NestJS application with JWT authentication. I wanted to deploy the backend on an Ubuntu LTS image because it was lighter. The bcrypt library — which compiles native bindings via node-gyp — failed: the GLIBC version and build toolchain on that Ubuntu release did not match what the compiled binaries expected. The fix required recompiling from source and adjusting the build environment, and the result was fragile. I switched to a Bookworm slim image and the problem disappeared.
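When this class of failure appears, two quick checks narrow it down: what glibc the system ships, and what glibc symbols the compiled addon actually requires. A diagnostic sketch — the binding path shown is illustrative and varies by bcrypt version:

```shell
# 1. Which glibc does this system ship?
ldd --version | head -n 1

# 2. Which glibc symbol versions does the compiled addon require?
#    (hypothetical path; locate the .node file for your bcrypt version)
BINDING=node_modules/bcrypt/lib/binding/napi-v3/bcrypt_lib.node
if [ -f "$BINDING" ]; then
  objdump -T "$BINDING" | grep -o 'GLIBC_[0-9.]*' | sort -Vu | tail -n 1
fi
```

If the highest symbol version the addon demands exceeds what the target system provides, no amount of reinstalling will help — the binary must be rebuilt against the older toolchain, which is exactly the fragility described above.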

For WSL I use Trixie, the Debian release that follows Bookworm. There is no official Bookworm image in the Microsoft Store, but Trixie is available and has behaved identically for my purposes. Debian gives me a minimal, stable base with predictable system library versions. Native Node.js addons — bcrypt, argon2, canvas, anything that compiles with node-gyp — build consistently on the first attempt, without adjusting the build environment.

The principle is simple: Ubuntu is opinionated and moves fast. Debian is minimal and predictable. For a development environment you depend on daily, predictability is worth more than convenience.

Why VSCode

I have used many editors over the years. Today I use VSCode — not because it is the best editor in theory, but because it has the best integration with everything I actually use.

The WSL Remote extension is the key piece. It allows VSCode running on Windows to connect directly to the Linux filesystem inside WSL, transparently. The terminal, the file explorer, the debugger, the Git integration — all of it runs in the Linux environment while the interface stays on Windows. The distance between the two operating systems disappears entirely.
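In daily use the workflow is one command from inside the distro. On first run, VSCode downloads a small server component into the distro (under ~/.vscode-server); the Windows UI then attaches to that server, which is why the terminal, debugger, and Git all operate on the real Linux filesystem. The project path below is hypothetical:

```shell
# From a WSL shell: open the current project in VSCode on Windows.
# Everything the editor does from here on runs inside the distro.
cd ~/projects/my-app   # hypothetical path
code .
```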

Beyond WSL, VSCode's extension ecosystem is unmatched for practical day-to-day work. Formatting, linting, Docker management, database clients, REST clients, Git visualisation — everything is there, maintained, and consistent. I stopped evaluating alternatives when I realised I was more productive spending that time building things.

Why an AI Coding Assistant — and How I Use One

I currently use GitHub Copilot. The choice of which assistant is secondary — the reasoning applies equally to Claude Code or any similar tool. What matters is not the tool. What matters is how you use it.

A coding assistant used without context is a generic tool. It generates plausible code for the average project. Your project is not average — it has a specific stack, specific architectural decisions, specific conventions that took time to establish. If the assistant does not know these things, every suggestion needs correction. The back-and-forth costs more time than it saves.

The solution is the .github/instructions/ directory: markdown files that tell Copilot — explicitly, in structured natural language — what your project is, how it is organised, what rules apply, and what patterns to follow. I maintain separate instruction files for different concerns: one for content structure, one for asset conventions (CSS, SCSS, images), one for language conventions (PHP, Angular, TypeScript, Python, Java), and one for general project rules.
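As a sketch of what one such file can look like — the file name, the applyTo glob, and every rule below are illustrative, not a prescription:

```markdown
---
applyTo: "**/*.ts"
---

# TypeScript conventions

- Strict mode everywhere; never use `any` — prefer `unknown` plus a type guard.
- One class per file; services are injectable classes named `*.service.ts`.
- Validation lives in DTO classes, never inline in controllers.
- Prefer the project's existing utilities before adding a new dependency.
```

The applyTo frontmatter scopes the rules to matching files, so language-specific conventions only fire where they are relevant.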

The difference is not subtle. A Copilot that knows your project generates output that looks like code you would have written. It follows your naming conventions, respects your architecture, uses the libraries you use. It stops being a generic assistant and starts being a collaborator that understands the context.

Building and maintaining these instruction files is work. It is also the most leveraged work I do on a project — because every session that follows benefits from it.