About

I've been making engineering teams ship better for a decade. Now I do it with AI.

Ten years at DoorDash, Square, and Mudflap, mostly working on one problem: how engineering teams actually ship faster — not just look busier.

Why DomeWorks exists

Recently, that work shifted to AI. Not the hype cycle version, but the engineering version. How do you take a team that has AI tools and make those tools actually change how the team ships?

I watched the same pattern play out at every company: leadership buys Cursor licenses, rolls them out in a Slack announcement, maybe runs a lunch-and-learn. Three months later, adoption is at 15% and nobody can explain why.

I can explain why. There's no infrastructure connecting the tools to how the team actually works: no shared context feeding the AI tools, no standard developer workflows automated around them, no agent layer linking your codebase, tickets, and docs to how engineers actually ship. The tools sit next to the work instead of inside it.

Over 80% of organizations report no measurable impact on EBIT despite their AI investments (BCG, 2024). Nobody is building the layer between the tools and the work. That's where DomeWorks operates.

How I work

01

Embedded, not advisory

I join your standups. I pair with your engineers. I sit in on planning. I write code, build systems, and ship alongside your team. You can't transfer capability from a slide deck.

02

Designed to end

Every engagement ships a handoff package: documented systems, runbooks, and a knowledge transfer session. Your team owns what I built and can maintain it without me. If you need ongoing support after that, that's what the Fractional engagement is for — but you're never locked in.

03

Working systems over opinions

I deliver running systems, not recommendations. Configured tools, documented workflows, measured before-and-after. Things your team can point to and build on.

Current engagement

Case study — Q2 2026

Autonomous coding agent for a national professional services organization

An engineering organization with a backlog of well-scoped maintenance tasks — lint fixes, dependency updates, test coverage gaps — each taking 2–3 days to complete including queue time and context-switching, even when the actual work was under an hour.

I built a one-shot autonomous coding agent system: receives a task, executes in an isolated environment, validates against CI, delivers a pull request. No human intervention during execution.
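The pipeline described above can be sketched roughly as follows. This is a minimal illustration of the one-shot flow, not the actual system; `execute_in_sandbox`, `run_ci`, and `open_pull_request` are hypothetical stand-ins for the isolation, CI-validation, and PR-delivery stages:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Task:
    repo: str
    description: str

@dataclass
class Result:
    pr_url: Optional[str]
    status: str

def execute_in_sandbox(task: Task) -> str:
    # Hypothetical: run the coding agent against the task in an
    # isolated environment and return the branch it produced.
    return "agent/" + task.description.replace(" ", "-")

def run_ci(branch: str) -> bool:
    # Hypothetical: run the repo's CI suite against the branch.
    return True

def open_pull_request(task: Task, branch: str) -> str:
    # Hypothetical: open a PR for the branch and return its URL.
    return f"https://example.com/{task.repo}/pulls/{branch}"

def one_shot(task: Task) -> Result:
    """One task in, one PR out. No human in the loop during execution."""
    branch = execute_in_sandbox(task)
    if not run_ci(branch):
        # Fail closed: a branch that doesn't pass CI never becomes a PR.
        return Result(pr_url=None, status="ci-failed")
    return Result(pr_url=open_pull_request(task, branch), status="ready-for-review")
```

The substance of a system like this lives in the two stages the sketch only stubs: genuine sandbox isolation and validating against the team's real CI before anything reaches review.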

Target: median task completion of 15 minutes from dispatch to PR, down from 2–3 days. Publishing the full case study when the engagement closes.

Want to talk about your team?

30 minutes, no obligation. I'll tell you whether this is something I can help with.

Book a call