News flash: I have a tendency to be abstract.
I like to think about things. Then think about thinking about them. It... spirals pretty fast 😆
So today I thought I'd share something that brings me back to practicality for a while: we're going to talk about a Developer Experience Audit.
Developer Experience is one of those topics you can skate by on and stay abstract about for your whole career. Toss a few buzzwords in a meeting, gesture at a diagram, and you’re good. Everybody thinks you've got it.
But that doesn’t build trust with your team... and it sure doesn’t fix flaky tests or bring them peace of mind. I’d rather get to the part where we do something. Wouldn't you?
Why Audit our DevEx?
If you’ve been following The Adventures of Blink, you know how much we value DevEx around here. (If not, hey... go check those out later!) Accelerating your developer team means accelerating your revenue generation capabilities.
So you might think the answer is to form a team. A task force. A strike squad.
Avengers... assemble?
I mean, sure—it makes sense to dedicate people to improving DevEx, but there's one thing you should establish before doing anything like that:
👀 VISIBILITY 👀
This is where the DevEx Audit comes in! Imagine you did build that team of folks. You bring everybody onto a call together, and it goes like this:
You: Ok team, we've assembled you to be the DevEx Improvement Task Force.
Them: ...
You: So... go improve it! Why are you just sitting there?
This is why you need a DevEx Audit. No matter how experienced your team is, if you present the improvement process to them like this, you're just going to get vacant stares. Having a DevEx Audit helps them understand in quantifiable terms what's broken and what you're focusing on fixing.
So let's make a list...
The DevEx Audit Checklist
Note: this is not a comprehensive list, but it should be enough to get you started. If you think of other DevEx topics and measurements you'd like to know about, drop 'em in the comments to share with the rest of us!
🧪 Onboarding
- [ ] Can a new hire get our code running locally in under 30 minutes?
- [ ] Is environment setup automated or documented clearly?
- [ ] Are secrets and credentials managed safely?
⚙️ CI/CD Feedback
- [ ] Do our tests run in under 5 minutes?
- [ ] Do we fix all test failures (rather than ignoring them and pushing the pipeline forward anyway)?
- [ ] Does the pipeline tell me what broke and why?
💥 Error Surfaces
- [ ] Are our error messages actually helpful?
- [ ] Do logs help me debug without digging?
💻 Local Dev
- [ ] Is there a single startup command (`make dev`, `npm run dev`, etc.)?
- [ ] Can I develop offline?
- [ ] Is hot reload reliable?
- [ ] If I need a new tool on my laptop, can I have it up and running in less than a day?
⏱️ Feedback Loops
- [ ] How long from "I write code" to "I see the result"?
- [ ] How fast is code review?
- [ ] Do I hear from the end users of my code frequently?
📝 Documentation (idea credit for this section: @anchildress1)
- [ ] Is our documentation accessible?
- [ ] Is our documentation kept up-to-date?
- [ ] Is our documentation easy to locate?
Concerns for Scoring the Checklist
Obviously we want this to be something easy to use. Note that in my checklist, getting a check on an item is always an improvement in the process. This is important because we want to be able to give ourselves a score, and use the checklist periodically to evaluate our progress.
It might also be worthwhile to use a gradient for each question rather than a binary ✅ / ❌ - maybe answer each question on a scale from 0 to 5, with 0 being the worst and 5 being the best. This builds in a little room for subjective opinion while keeping things mostly objective.
We also want to preserve the sections / topics so that we can gather the scores of related questions into these categories. This means that we can start to look at where our biggest problems are.
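Here's a minimal sketch of what that grouping could look like, just to make it concrete. The category names come from the checklist above, but the scores are made-up examples - swap in your own numbers (a spreadsheet works just as well as code here):

```python
# A toy tally for a DevEx Audit, assuming each question is scored 0 (worst) to 5 (best).
# The scores below are hypothetical placeholders, not real audit results.
from statistics import mean

audit_scores = {
    "Onboarding":     [1, 2, 0],      # one score per checklist question
    "CI/CD Feedback": [4, 3, 5],
    "Error Surfaces": [2, 3],
    "Local Dev":      [5, 4, 4, 3],
    "Feedback Loops": [3, 2, 4],
    "Documentation":  [1, 1, 2],
}

# Average each category, then print weakest-first so the biggest problem areas float to the top.
category_averages = {topic: mean(scores) for topic, scores in audit_scores.items()}
for topic, avg in sorted(category_averages.items(), key=lambda item: item[1]):
    print(f"{topic}: {avg:.1f} / 5")
```

Run it and the lowest-scoring categories land at the top of the output, which is exactly the visibility we were after.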
An Example
Perhaps management has decided that they want to hire a new Platform Engineer, but the audit shows that we struggle with onboarding.
Does it make sense to hire someone new before we put any work into fixing our onboarding experience? The new person, fighting through our onboarding problems, is going to take longer to become effective, and they'll probably suffer a morale drop in that time too. So even when they're fully "on"... we're not getting their best, because we'll have ground it out of them as they struggled through their newbie phase.
Having the visibility of a DevEx Audit shows us that we need to prioritize some onboarding work before expanding our team, so that we can be more effective.
Get To Work!
I told you earlier, I like to think about thinking... but this is the point where we are ready to quit thinking about it and start doing. So what will we do?
Run the audit. Find a low-scoring entry. Fix it. Repeat.
It's important to realize that this will become a maintenance process. You're not going to reach a finish line and never think about DevEx again; change is constant in an organization, so you'll need to run a DevEx audit periodically to make sure you don't slip!