Do More by Doing Less
Growth should create leverage, not more administrative work.
If revenue doubles and your finance team is just reviewing twice as many receipts, that’s not scale. That’s linear workload growth.
That’s why I use Brex.
Brex built an Intelligent Finance platform with AI agents that automate repetitive work — receipts, categorization, policy enforcement, reconciliation — so that finance output improves as the business grows, and I can focus on scaling the MM empire.
If you care about operating leverage, your finance stack should reflect that.
See why 35,000 companies like Anthropic, DoorDash (and me!) use Brex to spend smarter and move faster.
How Superhuman Structures Its Analytics Team
The analytics function is increasingly landing in finance's lap.
Yes, the people we used to joke wore both a belt and suspenders to work are being thrown the keys to a 600-horsepower data warehouse.
They're being asked to own the entire lifecycle of compound, ever-changing metrics they've never had to produce before, like gross margin at the customer and cohort levels, product usage by segment, and retention by marketing acquisition channel. And I'm not talking about setting up dashboards in Looker or Sigma. Or building a pivot table for your CMO who can't use Excel.
I'm talking about building a high-performing feedback loop that spans every type of data the company touches. It's the driving force behind goal setting and forecasting.
Chris Byington is the Head of Data at Superhuman Mail. His team owns:
Analytics,
OKRs, and
Forecasting.
I talked to him on Run the Numbers about how he structures the team, how they partner with the business, and how they prove the team is actually worth having.
Here's what's worth stealing.

Steal this analytics framework, and listen to System while you work
Where analytics should sit
There's no universal answer here. And Chris is honest about that. But there are three archetypes worth knowing.
The first is finance. Analytics reports to the CFO.
You get authority and budget control.
The tradeoff is perception.
Other teams start to see the data function as an extension of finance, which means they interact with it like they interact with finance (carefully, defensively, and only when they have to).
The product team isn't coming to you with half-baked ideas.
The sales team isn't asking you to poke holes in their pipeline assumptions.
You become (perceived as) the people who audit, not the people who help.
Chris saw this firsthand at Heroku. The data team was embedded in finance, doing good work, but the product team kept their distance.
They felt like "data-leaning finance people."
When you're perceived as a watchdog, nobody invites you into the room early enough to actually change anything. They just call you after they break a window.

The second is operations. Analytics reports to a COO-type. This works best when that person is focused on how the company runs (planning, goal-setting, accountability) rather than just go-to-market. If the COO is really a CRO in disguise, you end up with the same problem as the finance archetype, just with a sales bias instead. You also get a bit of whistling past the graveyard. Leading indicators of a sales slowdown may not surface immediately, because nobody wants to be the team that owns the data and delivers the bad news about themselves.
The third is engineering. Analytics reports to the CTO. This makes the most sense when data is core to the product itself. At Grammarly, 30% to 40% of the data team's energy goes toward end-user features (i.e., the suggestions that power the product). For that reason, the Grammarly data team sits in engineering.
The tradeoff is that you're now routing financial data through people who didn't grow up reading a P&L. You also run the risk of chasing things that are interesting, rather than things that are impactful to the business. Engineers are wired to explore. On the other hand, the upside is speed. Nobody iterates on a technical setup faster than an engineering-led team.
If we're being honest, the real variable is where exec support lives. Where do you have someone at the table motivated to apply data to daily decisions, and with the backbone to hold the rest of the company to that standard? Finance, ops, engineering… these are all different fonts of the same mission (Chris' analogy). The org chart matters less than finding the one executive who will fight for it.
There's a fourth option Chris doesn't name but lives every day. Analytics as a standalone function with its own seat at the exec table. Kill the dotted lines and intermediaries. At Superhuman Mail, that's where his team sits.
The hub and spoke model

Once you know where analytics lives, you have to figure out how it works. Chris runs a hub and spoke model with three layers.
The first two are horizontal. They serve everyone.
The first is the data platform. This is the warehouse: pulling all of your business data into one place. Product usage, support tickets, billing data, CRM data.
The goal is really straightforward: when someone asks a question about a customer or a cohort, the answer exists. Chris gave me a basic example: how many support tickets did this account submit before their renewal? At most companies, that question lives across Zendesk, Amplitude, and Salesforce. So three systems, with zero connective tissue. The data platform is what makes it answerable.
Sidenote: Chris was clear that this is the part most companies underestimate. He realizes that getting it right is harder and takes longer than anyone budgets for.
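To make the "tickets before renewal" example concrete, here's a minimal sketch of what the warehouse unlocks once the Salesforce and Zendesk extracts live in one place. All the table shapes and field names here are made up for illustration; real pipelines would use actual ELT tables, not Python lists.

```python
from datetime import date

# Toy extracts standing in for two source systems (all names hypothetical).
# In a real warehouse these would be tables loaded by your ELT pipeline.
salesforce_accounts = [
    {"account_id": "a1", "name": "Acme", "renewal_date": date(2024, 6, 1)},
    {"account_id": "a2", "name": "Globex", "renewal_date": date(2024, 9, 15)},
]
zendesk_tickets = [
    {"account_id": "a1", "created": date(2024, 3, 10)},
    {"account_id": "a1", "created": date(2024, 7, 2)},   # after renewal: excluded
    {"account_id": "a2", "created": date(2024, 5, 20)},
]

def tickets_before_renewal(accounts, tickets):
    """For each account, count support tickets filed before its renewal date."""
    renewal_by_id = {a["account_id"]: a["renewal_date"] for a in accounts}
    counts = {a["account_id"]: 0 for a in accounts}
    for t in tickets:
        renewal = renewal_by_id.get(t["account_id"])
        if renewal and t["created"] < renewal:
            counts[t["account_id"]] += 1
    return counts

print(tickets_before_renewal(salesforce_accounts, zendesk_tickets))
# {'a1': 1, 'a2': 1}
```

The logic is a ten-line join. The hard part (the part Chris says everyone underbudgets) is getting both systems to agree on what an `account_id` even is.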
The second is BI and data enablement. This is the last mile. The goal is making sure people across the company can find answers themselves without filing a ticket and putting the onus back on you.
Chris has a rough heuristic:
70% of data questions should be answerable without the data team's involvement.
If you're at 20% or 30%, you need to invest in your self-serve muscle. If you're at 70%, your team can focus on the work only they can do.
The line between what belongs in self-serve and what belongs with the data team comes down to three things:
how repeatable the question is,
how well-defined the metric is, and
how much judgment the answer requires.
If someone has asked it before, the definition is agreed upon, and the answer lives in the BI tool, that's self-serve. If the question is being asked for the first time, the metric doesn't exist yet, or the answer requires someone to figure out what's actually being asked before they can answer it, that's data team work. Most requests that come in as "can you build me this dashboard" are actually the second kind in disguise.
Chris's approach is to respond to every request with the same question: what are you actually trying to decide? It sounds simple. It isn't.
Most people are wired to skip straight to the solution. They come to the data team with a fully formed output in mind (a dashboard, a report, a model) because that feels like a concrete ask. But the output they're picturing is usually a proxy for a decision they haven't fully articulated yet, even to themselves. Chris quoted a line he likes:
“Few problems survive their thorough articulation.”
Once you've fully described what you're trying to solve, the solution gets obvious pretty fast. Sometimes it's a dashboard. Sometimes it's a single report. Sometimes it's a pivot table. And sometimes (Chris was serious about this) it's just a number in a Slack message. And that’s cool too, because the format matters less than the decision.
The third layer is vertical. These are data scientists embedded directly in individual business units: one for marketing, one for product, one for sales. This is the spoke part.
And Chris has a specific definition of embedded. It's not "available to." It's not "supports." Embedded means 80% of your energy goes to that team. You go to their offsites. You live in their goals. You know their problems before they bring them to you. You’re actually a part of their team.
The payoff is twofold.
First, you stop being a ticket-taker. No great BI person wants to be a dry cleaner clerk, just processing whatever comes over the counter.

Source: Ten of the funniest data memes
Second, you stop getting invited to meetings after the decision is made. Leaders start pulling you in before anything is decided.
Chris joked: you know the embedded model is working when a business unit leader gets annoyed that their data scientist is out sick.
When the same team owns goals and data
Most companies have a version of the following problem.
The analytics team finds something real in the data… Something that should change what everyone is working on.
They bring it to the right people.
The answer comes back: it's not on the roadmap.
We're already committed to these goals.
Come back next planning cycle.

By the time next planning cycle arrives, the moment is gone.
Chris's fix was structural. At Superhuman Mail, the same team that owns the data also owns the OKRs and the forecast. Yes. The analytics team owns the OKRs and the forecast. For a lot of people reading this, that's a jarring sentence. FP&A has owned the forecast since the beginning of time.
Chris is proposing something different, and at Superhuman Mail, it's working.
When an insight surfaces, there's no negotiation about whether it's relevant. The people who found it are the same people who set the goals. They can act on it.
The second thing that changes is planning. Most goal-setting processes go like this:
The CEO throws out a number that sounds good in a board meeting.
The CFO anchors to last year plus a reasonable growth rate.
They negotiate until everyone is mildly unhappy and call it a plan.
When the analytics team owns the goals, that conversation gets replaced with a question:
What does the data say is actually achievable?
The target and the evidence behind it come from the same place.
And because the same team owns the forecast, the model is internally consistent from the start.
The third thing is more human.
OKRs break down when individual contributors can't see themselves in the goals. It happens at almost every company past a certain size. The objectives get set at the top, they cascade down, and somewhere between the executive team and the person actually doing the work, the thread snaps. When the analytics team owns the OKRs and lives in the data, they can build goals that connect all the way down. The engineer, the salesperson, the customer success rep… they should all be able to point to a number and say that's mine.
Chris kept coming back to one thing. Most companies look at data after the fact…
to explain what happened,
to justify a decision already made, or
to build a case for something they already believe.
Chris calls the opposite approach calling your shot.
Before you run the analysis, you decide what you'll do based on what you find. If activation drop-off is biggest at the payment stage, we focus there. If it's not, we change our minds. That pre-commitment also pressure tests whether the analysis is worth doing at all.
If you're not willing to change your behavior based on what you find, why are you running it?
That discipline is most powerful when the team enforcing it is the same team setting the goals and owning the forecast. The people calling the shot are the same people who have to live with the outcome.
Owning all three (data, goals, forecast) closes the loop.
The hero metric
Most companies have too many metrics. Not because anyone planned it that way, but because every department builds its own scoreboard. So they proliferate.
Chris's answer is the hero metric. One number that the whole company orients around for the year.
At Superhuman Mail, the hero metric for 2024 was NRR for teams.
Simple reason: the business was transitioning from individual users to team-based accounts, and NRR is what tells you whether that transition is working. When you land a team and they grow on their own (adding seats, expanding usage) NRR goes above 100. When they churn or contract, it drops below.
Superhuman's NRR was below 100 in 2024. Not a surprise in isolation: most of the base was still individual users, and individuals can't expand. You can retain them or lose them, but there's no seat to add, no tier to upgrade into.
Teams are different. Land five seats at a hundred-person company and you have ninety-five seats of runway. That's why the transition to teams mattered so much, and why NRR was the right metric to anchor it.
They spent the entire year fixing the bucket. That meant breaking down NRR into its components (retention, expansion, contraction, churn) and finding the input metric that drove each one. They didn't home in on NRR as a single output to stare at, but on the specific levers underneath it. They went fully upstream. And this is where having the analytics team own the goals paid off directly. They were the ones who could deconstruct the metric, track the inputs, and hold people accountable to moving it.
Once they got there, the hero metric changed. In 2025 it became year over year growth of teams. They got to it through the same logic: one number, one direction, one year. Everything else is subordinate to it.
The hero metric also does something less obvious. It makes saying no easier. If a project doesn't contribute to the hero metric, it goes to the back of the line. You don't have to relitigate priorities every time something new comes up. The metric does it for you.
How to say no without losing friends
The hardest thing about running a high-performing analytics team isn't the technical work. OK, that's still really hard (I personally don't know Python). But what's even harder is the prioritization. Left unmanaged, the data team becomes a help desk.

Chris has three ways to manage it.
The first is the goal system. If a request doesn't connect to a company goal, it's an automatic no, and the burden of proof flips back on the person asking. The question (awkwardly) becomes: why are you working on something that doesn't drive the company's goals?
Most people stop asking.
The second is self-serve. If 70% of questions are answerable without the data team's involvement, 70% of requests never become requests. They get answered before they arrive. That only works if the self-serve infrastructure is good (i.e., the right tools, the right definitions, the right training). But when it works, it's the most efficient no you'll ever say, because you never have to say it.

The third is how you say no when you have to. Chris's rule is that the answer always starts with yes. Not yes to the request, but yes to the person. Yes, happy to help, here's what I'm working on, here's why it needs to stay in the queue, here's what we can do instead.
A no with an alternative is a different conversation than a flat no. It also shows you took the time to think about what they brought to you. That you care. That you're searching for the ultimate outcome. People love to be heard. His data team has office hours once a week: a time-boxed hour where anyone can bring questions, get quick answers, and learn to self-serve. It's a pressure release valve. And it keeps the relationship intact without blowing up the roadmap.
How do you know the team is working out?
OK, enough of the woo woo stuff. Chris has three concrete ways to measure it.
The first is the self-serve survey. Once a year, send the company a single question:
Can you confidently make good decisions with data in your day-to-day work?
It’s measured on a simple one to five scale. You can track it over time. It's a slow metric to move, which is exactly why it's worth tracking.
The second is strategic decisions. A term I hate because everyone uses it for anything that requires more than five minutes of thought. But let's be specific. There are only two, maybe three decisions per year that actually shape the company's trajectory. Do you go upmarket or stay mid-market? Do you build a sales team or stay product-led? Do you focus on individuals or teams? If the analytics team is doing its job, they're in the room before those decisions get made, and not after. You can point to the decisions that changed because of what the data showed.
The third is the pull test. Go to each business unit leader and tell them you're pulling their embedded data scientist back to the central team. What happens? If the answer is okay, thanks for everything, your embedded data scientist wasn't all that embedded. As they say in Ocean's Eleven, you're either in or out.

If the answer is wait, can we fund the role from our own budget, you have something that's actually working. Chris was direct about this: you can't fully measure it.
And, ironically, this isn’t the most analytical answer. But you'll know.
Quote I’ve Been Pondering
“No" is a complete sentence.”
Wishing you an analytics team with a strong self serve function,
CJ








