Your Team Is Working Harder Than You Think
A few years ago, I sat down to review the work my team had done at the end of a busy quarter. It was a shockingly short list. My first reaction was irritation. What had we been doing the whole time?!
But we’d been busy. Our updates had been packed with tasks people were working on; we’d pushed deadlines because we didn’t have the capacity to finish things. We’d worked our tails off for… very little tangible output.
What I’d discovered was one of the more insidious things about managing a data team: busy and productive can look identical from the outside, and sometimes from the inside too. The team is at their desks, responding to messages, showing up to meetings, handling things… but nothing consequential is moving.
The five-minute question
Every data team I’ve been part of has had some version of this. A stakeholder sends a message with something framed as a quick ask. Can you check what that number was last month? Can you pull the breakdown by region? Won’t take long.
And they’re right that it doesn’t take long. Ten minutes, maybe. The problem is that it doesn’t happen once. A busy analyst might field five or six of these in a day, each arriving at a random point in whatever they were actually working on. Research from UC Irvine found it takes around 23 minutes to return to deep focus after an interruption. Five of those in a day is nearly two hours of refocusing time alone, and you’ve lost the better part of your productive hours regardless of how fast or easy each answer was.
The five-minute question is rarely malicious. Stakeholders aren’t trying to wreck the week. They genuinely think it’s a small ask. And because it is small, it feels difficult to push back on. So the analyst answers it, switches back to their actual work, gets another ping, and the cycle repeats. At the end of the week, they have happy stakeholders but have shipped almost nothing of substance.
Meetings that don’t need everyone in them
Data teams attract meeting invites the way a light attracts moths. A business review happens and someone figures the data team should probably be there. A cross-functional planning session gets going and the data team is looped in just in case numbers come up. Before long, multiple analysts are sitting on a call where they’re useful for about five minutes and present for an hour.
A lot of this is well-intentioned. Managers add analysts to meetings because they don’t want them blindsided by decisions that affect their work. Analysts accept invites because declining feels like opting out. Meeting FOMO is real, and in organizations where presence gets read as contribution, it’s not an irrational instinct.
But the cost isn’t just the meeting itself. It’s what happens around it. An analyst who has a meeting at 10am can’t really start deep work at 9:30. After the meeting they need time to reload whatever context they’d dropped. Stack two or three of these across the day and you’ve created a schedule that looks full on the calendar and produces almost nothing requiring sustained thought. Research puts genuine deep focus time for the average knowledge worker at under three hours in a typical eight-hour day. Meetings compress that further.
The question worth asking about any meeting with an analyst isn’t “could they contribute here?” Most people can contribute to most meetings. It’s whether their live presence is worth what it costs. Often a short debrief afterward would have done the same job.
The data quality rabbit hole
An analyst sits down to answer a business question. Midway through, something looks off — a number that doesn’t match expectations, a gap in records for a period that should have data. So they stop and dig. One hour becomes three. The original analysis sits unfinished. The stakeholder follows up, and the analyst explains they hit a data issue.
Add to this the time that can be wasted convincing IT teams there’s a real problem, explaining the context, revalidating fixes, etc. There’s a whole rabbit warren underneath some of these issues.
Data quality problems are real and important, and some of them genuinely need to be resolved before you can trust an output. But not every anomaly is material to the question at hand. The habit of treating every irregularity as a full stop, rather than asking whether it actually changes the conclusion, is one of the more reliable ways to turn a half-day analysis into a multi-day investigation.
The question to ask is: does this affect the answer? If the gap represents 0.3% of records and the business question is directional, probably not. Flag it, document what you excluded and why, deliver the analysis with a note. If the issue is in a core field the entire conclusion rests on, stop and fix it. Those are genuinely different situations and they warrant different responses.
Teaching analysts to make that materiality call is one of the more underrated things a manager can work on. It’s teaching good judgement.
A little assignment
Look at the last five working days across your team and pull three numbers. How many ad hoc requests came in framed as quick asks, and how many of those interrupted something already in progress? How many meetings had more than one analyst present — and for each one, was that actually necessary? How many analyses stalled or slipped for reasons that had nothing to do with the analysis itself?
You’re looking for a rough read on how much of the week was genuinely available for focused, consequential work versus going to things that felt productive but weren’t moving anything important.
Next week: what you can do about it.
Some of what I write about here I’ve also turned into proper field guides — a bit more structured, a bit more actionable. You can find them at the Penguin Analytics store.