A 7-week project in 1.5 hours: The full story

· 3 min read

This is a story that sounds exaggerated. But it happened — and it’s my own project.

I had a project I estimated at seven weeks for one person. A classic estimate, built on experience. Reasonable, careful, realistic.

The result? 1.5 hours of my own work. Claude Code then ran autonomously for several more hours, but I didn’t need to be there.


The 7-week estimate, deconstructed

Seven weeks for one developer. The estimate broke down like this:

  • 2 weeks: Analysis, research, architecture. Understanding the problem space, choosing technologies, designing the structure.
  • 2 weeks: Backend implementation. API, data model, business logic.
  • 1.5 weeks: Frontend. Components, state management, API integration.
  • 1 week: Testing, debugging, deployment.
  • 0.5 weeks: Buffer.

Each line item made sense. I wasn’t padding. I was estimating based on how I’d always worked.


What happened

I brought in Claude Code. Not as "better autocomplete" — as an autonomous partner: I gave it a goal and let it work.

First 30 minutes: I defined the project — what the app should do, what structure it should have, which technologies to use. Claude Code proposed the architecture, I refined it. No hours of googling, no reading documentation. A conversation.

Next 30 minutes: Course corrections. Claude Code generated code; I reviewed the output and refined the brief. Not line by line — entire blocks of functionality at a time.

Last 30 minutes: Fine-tuning, final adjustments, review. Then I let Claude Code continue running on its own.

1.5 hours of my work. The rest was Claude Code running autonomously. Result: a functional application, not a prototype.


Where the real bottleneck was

Seven weeks was based on the classic model: first understand, then plan, then implement, then test. A serial process. Each phase waits for the previous one.

With AI, this model collapses. Understanding and implementation happen simultaneously. Architecture iterates in real time. Testing starts with the first line of code. And crucially — AI doesn’t need time to “get up to speed,” doesn’t need to read documentation, doesn’t need a lunch break.


The takeaway

This isn’t about AI replacing the developer. I was there. I steered the direction, made decisions, checked quality. But instead of seven weeks of routine work, I spent 90 minutes on what actually requires a human brain — decision-making and direction.

This isn’t a story about faster code. It’s a story about a fundamental shift in the human role in software development.

If you’re estimating projects in weeks, it’s very likely your estimate is based on assumptions that no longer hold. If you want a handy reference for working with AI tools, grab the free AI Cheat Sheet for Developers.


Curious what this would look like for your project? Let’s talk.

