Alex Halliday
July 15, 2024

How to run a great Weekly Business Review and build a data-driven culture: A conversation with MasterClass COO Mark Williamson

If you’ve read AirOps COO Matt Hammel’s thoughts on leadership meetings and Weekly Business Reviews, you already know that there are a few truths we hold dear here at AirOps, such as... 

  • WBRs shouldn’t be a nightmare
  • WBRs are (very) expensive, so it’s worth your while to make them worthwhile
  • WBRs should set the stage for a frank, but friendly, discussion that sucks all the BS out of the room and encourages your teams to do more of their best work

What is a Weekly Business Review (WBR)?

A WBR is a weekly meeting between your executive team and the directly responsible individuals (DRIs) who own your company’s most important goals.

Team AirOps has spent a lot of time thinking about how to help teams build better WBRs. That’s why I asked Mark Williamson, COO of MasterClass, if he’d be willing to share his thoughts, best practices, shoulds, and shouldn’ts. He agreed and a transcript of our conversation is below.

So, enjoy! I think you’ll learn a lot. For me, our chat was a great reminder about why WBRs are so critically important for growing companies. I’ll definitely keep these takeaways in mind as we build AirOps 🚀.

Alex Halliday: Let’s start with a high-level overview: Why should organizations invest in Weekly Business Reviews? Why are they such a high ROI activity for growing companies?

Mark Williamson: Here's how I think about it: Businesses are complex systems. By their nature, they're subject to the classic “elephant problem,” where everyone has a different view of what they *think* might be happening. A well-run Weekly Business Review helps you get the right people together in one room to pull all the data together and do diagnostics. 

I believe that this is the most important thing a business can do – figure out what causes what. And that's a really hard thing to figure out and a hard thing to learn. 

But the reason it's become so important is that we are living in a more dynamic business environment than anything we've seen in the last decade and a half. Covid created a situation where you had the sudden collapse or expansion of markets. We now have inflation that's changing consumer spending habits and patterns. We've got consumer fatigue in many different segments of the economy, and it’s imperative to understand what is actually driving the business forward or backward in terms of its growth. 

It's harder to figure out now than it was in 2019 when we were in a low-interest rate, low inflation, “economy doing well” type of market. In that environment, determining the causes of different things in a complex business system was much more straightforward.

Now it's a lot more complicated and dynamic, therefore you need much more sophisticated diagnostic capabilities to figure out what's driving a business and how to influence it.

Yeah, it sounds like you're also referring to the business's ability to react quickly and translate those market signals into business decisions in a compressed time frame. In your mind, what are some of the really important pillars that should be present in a successful WBR?

There's so much art involved here, but the most important thing is that you’ve got to start with the foundational elements… which should always be data. And it's got to be data you trust. If you're making decisions based on bad data, you’re not going to end up in a good place. Data integrity and data quality are incredibly important. 

Then, you should make sure that you've got somebody in the business whose job it is to understand and question the data by asking questions like, “What are the primary variables that we should be looking at here?” 

And I think it's a combination of input and output metrics. You want things like sales metrics. That's an output of a bunch of other activities. But you also want to be looking at key projects and the inputs that you're optimizing for, including big key strategic initiatives.

One mistake I often see is Weekly Business Reviews that focus entirely on output metrics, not input metrics or the inputs that tie into key big initiatives.
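
To make Mark's input/output distinction concrete, here's a minimal sketch of how a WBR scorecard might pair the two. The metric names and values are hypothetical, not MasterClass's actual scorecard.

```python
# Hypothetical WBR scorecard: every output metric sits next to the
# input metrics and initiatives the team believes drive it.
wbr_scorecard = {
    "outputs": {
        "new_subscriptions": 4_120,   # what happened
        "revenue": 1_250_000,
    },
    "inputs": {
        "paid_media_spend": 310_000,  # what the team actually controls
        "landing_page_tests_launched": 3,
        "content_pieces_shipped": 12,
    },
    "key_initiatives": [
        {"name": "Annual plan pricing test", "status": "in_flight", "owner": "Growth"},
    ],
}

# In the review, the question is always: which inputs moved, and did the
# outputs respond the way our mental model predicted?
for name, value in wbr_scorecard["inputs"].items():
    print(f"Input: {name} = {value}")
```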

Are there any other core pieces that you think should be present on top of the data, like additional narrative or check-ins on previous commitments?

One thing that I think is really important is being really thoughtful about which time series of data you’re presenting. Are you marking critical events that happened? 

Because oftentimes, you might go back and ask, “Why did our CAC drop there?” when in fact you ran a promotion and forgot to mark that in the data somewhere. You want to be hyper-attuned to the different events that may have caused different things to happen. Overlaying a timeline of business changes on your time series data can so often uncover important relationships that would otherwise go unnoticed, and not having that overlay is a wasted opportunity to learn.

Basically, getting maniacally obsessed about the clarity of the data, and the narrative that it’s telling you about what happened in the past is really, really important.
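
As a quick illustration of Mark's point about overlaying business events on a time series, here's a small sketch using pandas and matplotlib. The events and CAC values below are made up for the example.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical daily CAC series and a log of business events to overlay on it.
cac = pd.Series(
    [52, 54, 51, 44, 43, 45, 55, 57],
    index=pd.date_range("2024-03-01", periods=8, freq="D"),
)
events = {
    "2024-03-04": "Spring promo launched",
    "2024-03-07": "Promo ended",
}

fig, ax = plt.subplots()
ax.plot(cac.index, cac.values, marker="o")
for day, label in events.items():
    ts = pd.Timestamp(day)
    ax.axvline(ts, linestyle="--", color="gray")   # mark the event on the chart
    ax.annotate(label, (ts, cac.max()), rotation=90, va="top", fontsize=8)
ax.set_ylabel("CAC ($)")
ax.set_title("CAC with business events annotated")
plt.show()
```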

When you've seen these things be rolled out before, what are some of the common mistakes that organizations make when implementing a wide-scale WBR process?

I think one big mistake is everybody just comes in and looks at the data and goes… “Okay, cool.”

The best way to run these meetings is to encourage truth seekers. The people you want in the room for a WBR are people who aren’t defensive about their functional area. You need a set of people whose highest order is wanting to know what the truth is about this business, whether the news is bad or good. 

Oftentimes, you end up in a world where you get people who aren't truth-seeking or people who aren't able to think in systems. I'm a big believer in systems thinking as a critical skill throughout the business. When you don't have systems thinkers who can see the whole picture, you can burn a lot of time in these meetings and make no progress. 

How do you go about coaching new leaders to help level them up so that they can perform in the WBR effectively?

What I tend to find effective here is building their understanding from the ground up, so that they know what we're trying to achieve in the meeting. Basically, I take them through how we’re going to solve the “elephant problem” I talked about earlier: We're going to get all the data from all the different departments in one place. We're going to all look at it together. 

Our jobs are to do diagnostics. We're truth seekers, rather than coming in with an agenda. 

If you're pounding your pet peeve on another team in that meeting, you lose credibility instantly. Because it shouldn't be about driving an agenda. It should be about driving an understanding and making sure that leaders know you want a vigorous debate about hypotheses around what might be driving the business. 

From there, you can say, “Let's leave this meeting with an understanding of what things we're going to test to see how they change the business.”

The process of getting a WBR organized is pretty complicated. How do you think about the core responsibilities in that process and what are two to three core roles that you need to really set up for success to make sure everyone can go into the meeting properly prepared?

Every company will have different functional structures that might run this. Sometimes there'll be a CFO, sometimes it’ll be the COO or someone from the business operations group. 

But the most foundational one is definitely data, and trust in that data. So you've got the data team heavily involved in pulling that information together. 

Then, you need somebody who does a little bit of annotating and editorializing on top of the data to say, “This week we ran a campaign.” Typically, you need each functional area to provide a broad-level overview of what transpired. 

So if I'm on the performance marketing team, I might say, “CAC increased and we think it's because of iOS 14.5.” The growth team might say, “We're seeing a drop in conversions.” 

From there, if I start to look at those two data points, I might start to assume that what we have is a traffic quality problem, which would be consistent with a particular OS privacy update that’s affecting our ability to target customers, right? Then you say, “Okay, I've got two data points that might prove this hypothesis. Do I have anything else here that might confirm or refute that?” One classic mistake I see is that people won't ask for evidence that might refute their hypothesis, but it’s an important step and practice to encourage.

A WBR is an expensive meeting with lots of senior team members involved. What's your mental playbook for making sure those sixty to ninety minutes are really high-value?

I think for a lot of the people in the business who attend that meeting, the highest priority is just understanding what is actually going on in the business. 

I think what you tend to see though, is that there’s a subset of that group that can do the diagnostics, the hypothesis generation, and then figure that out. So there are almost multiple levels of what you want to achieve in this meeting: What is the alignment around what’s actually even happening? You want everybody to walk away with an alignment of what is going on in the business and an understanding about what we’re testing next and why.

But there's also a subset of people who are trying to tune their mental models of the universe. Here’s a really quick example of this: Covid happens and the adoption of ecommerce explodes. What does Amazon do? They start building out massive infrastructure: warehouses, distribution facilities, and everything else, because their thesis was that once you start using ecommerce, you never stop. Amazon has never had a cohort of customers who come in and collectively buy less a year later. It just doesn't happen. But what we've all seen in the press recently is that Amazon is now shedding all of these distribution facilities. Turns out, they overbuilt because that growth reverted back to the mean.

Maybe Amazon did debate this before building, I don’t know – they’re such a well-run company. I don't want to pick on them. It just turns out that the data is public so I can talk about it. I'm going to bet there maybe wasn't someone who questioned, “Well, what happens if this cohort of Covid people doesn’t stick around?” 

Even if they did have that debate, they went in the direction of “We believe these people are going to stick around.” But the reason I'm mentioning this is to highlight the importance of debate. Is this a short-term market anomaly or a long-term market trend? What is driving these numbers? Because the world went from clear skies with a steady breeze, to all of a sudden having a hurricane and a tsunami at the same time.

Totally agree: any business that experienced truly unprecedented, pandemic-accelerated demand (Shopify, Peloton, Zoom) would have had this challenge; obviously, those needing physical capacity faced the hardest strategic challenge. The need to stay close to trusted numbers and be responsive to market signals would have been critical to successfully navigating the ups and downs. 

Moving onto actioning the WBR outputs, how do you make sure that there’s a feedback loop between decisions being made in the WBR and action being taken in the business?

The best practice, in my opinion, is taking note of action items in the doc. You're highlighting who has action items, what they are, and when they’re due, and then reviewing those action items and their status at the start of every WBR. 
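
A lightweight sketch of that feedback loop, with hypothetical owners and dates, might look something like this: track each action item as a structured record and open every WBR by flagging anything still open or overdue.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ActionItem:
    owner: str            # the DRI for this item
    description: str
    due: date
    status: str = "open"  # open | done | blocked

# Hypothetical action items captured in last week's WBR doc.
action_items = [
    ActionItem("Growth", "Launch pricing test on annual plans", date(2024, 7, 19)),
    ActionItem("Data", "Add promo events to the CAC dashboard", date(2024, 7, 12), "done"),
]

# Start of every WBR: review what is still open and what is overdue.
today = date(2024, 7, 15)
for item in action_items:
    if item.status != "done":
        flag = "OVERDUE" if item.due < today else "open"
        print(f"[{flag}] {item.owner}: {item.description} (due {item.due})")
```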

Also, when you’re reviewing the business and working with different teams, look for things that say, “Maybe we're not doing a good enough job here” or, “Maybe this is an indication that we're doing a great job.” We all look at the output metrics, and when things are up and to the right, it's like, “That team is crushing it!” For a solid decade or so, that's kind of what everybody expected. 

In today's environment, I'm more interested in the explanatory understanding of what’s happening in the business. If your numbers are up and to the right but you don't understand why… that’s not good. If your numbers are down but you understand why, that's better. 

In the long run, teams that understand the causal mechanisms that drive output will win. 

You've talked to me many times about the importance of understanding the levers of the business system and investing in that understanding, either through tests or via studying cause and effect along the way. It becomes an accumulating asset for that team and allows it to control its own outcomes, which was an incredibly powerful takeaway.

I was thinking about the data evolution that you've been on at MasterClass. I’d love for you to talk a little bit about the journey to getting to the point you are now, where presumably you can walk in on a weekly basis and have confidence in the numbers. What did it take to get there and what were some of the big pieces that you had to get in place to be set up for success?

I think what's interesting about data quality and integrity is that progress on that dimension isn't linear. It's not always getting better over time. You can have external events that shake that stuff to the core. 

Many months ago, we probably all felt more comfortable in our understanding of performance marketing data. Then iOS 14.5 came out, and it's like we got dragged back to the late '90s. Well, it’s not quite that bad, but it highlights an important point: You don't ever want to become complacent in your assumptions about the quality of your data. If something looks off or different, even if you think you have good data, you still need to dig in and try to understand what's really happening. 

But generally speaking, you're right, data quality will improve over time within an organization and you'll typically start with no data at all, or log files, or some dashboard that an engineer has built. From there, you start to realize, “We need an actual system because if we keep running reports on production, we're going to crash our production system.” 

So, you get Looker or Tableau or something else to start looking at that data. And then you tend to realize “I need a CDP.” So, you get Segment in place. Then you say, “I need Amplitude so I can look at funnel reports.”

And now, all of a sudden, you've got multiple “sources of truth” that are never perfectly aligned. That is, to me, an instant inflection point for a business in its relationship to data. Does the team suddenly say, “I no longer trust any data, because none of it agrees”? Or are you able to come to grips with the fact that the data will never fully agree, pick a few sources of truth, and move forward with those? 

I think every company eventually gets to a point where they need to pick their source of truth and what they’re going to make decisions on. Otherwise, they’re going to be rudderless from lack of data. 

So you pick your thing and you say, “Okay, we'll use Looker and Amplitude and they won't always agree. But, we generally understand why they're going to disagree at different times, and this is why we're going to use each one for a different purpose. And we're going to march forward, hand in hand.”
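
One way to make that agreement explicit is a simple source-of-truth registry. The mapping below is hypothetical (it just reuses the tools Mark mentions), not a recommendation of which system should own which metric.

```python
# Hypothetical "source of truth" registry: each metric points at exactly one
# system, with a note on why the systems are expected to disagree.
SOURCE_OF_TRUTH = {
    "revenue":           {"system": "Looker",    "note": "tied to the finance warehouse"},
    "funnel_conversion": {"system": "Amplitude", "note": "event-based; excludes bot traffic"},
    "email_engagement":  {"system": "Segment",   "note": "closest to the send pipeline"},
}

def lookup(metric: str) -> str:
    """Return which system to trust for a given metric, and why."""
    entry = SOURCE_OF_TRUTH[metric]
    return f"{metric}: use {entry['system']} ({entry['note']})"

print(lookup("funnel_conversion"))
```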

Tactically speaking, who in the organization should be driving the process to agree on data sources and then metric definitions?

I'm glad you brought that up, Alex.

For example, you’d imagine that the definition of LTV is simple. It’s the lifetime value of a customer. But, if you’re in an early business, how do you know the lifetime value of your customer when you've only been around for a year and a half? 

Then, there are lots of finance people who only do three-year value and call that LTV.

Soon enough, you end up debating things like what discount rate to use for future cash flows: is it your weighted average cost of capital as a startup? Or should it be the weighted average cost of capital of a late-stage company? These are meaningful debates. If you shift those numbers a little bit, your LTV can radically change, which then changes your allowable CAC, because you're probably running off of an LTV-to-CAC ratio. 

So, the definition of metrics, in my opinion, needs to be done in large part by the biz ops organization in partnership with the finance team. Definitions need to be agreed upon by the key operating executives, down to the nitty-gritty of, “What is our weighted average cost of capital? What rate are we using to discount future cash flows? How are we handling this stuff?” Because it really does matter.
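
To see why the discount-rate debate matters, here's a small worked example. The per-customer cash flows, the two rates, and the 3:1 ratio are made-up assumptions, not MasterClass figures: discounting the same cash flows at a startup-like rate versus a late-stage rate produces meaningfully different LTVs, and therefore a different allowable CAC.

```python
def ltv(annual_cash_flows, discount_rate):
    """Present value of a customer's future cash flows."""
    return sum(cf / (1 + discount_rate) ** year
               for year, cf in enumerate(annual_cash_flows, start=1))

cash_flows = [120, 100, 80]  # hypothetical per-customer contribution, years 1-3

ltv_late_stage = ltv(cash_flows, 0.08)  # late-stage cost of capital
ltv_startup = ltv(cash_flows, 0.25)     # startup cost of capital

for label, value in [("late-stage rate", ltv_late_stage), ("startup rate", ltv_startup)]:
    allowable_cac = value / 3  # assuming a 3:1 LTV-to-CAC target
    print(f"{label}: LTV = ${value:,.0f}, allowable CAC = ${allowable_cac:,.0f}")
```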

How do you think about sharing a common understanding of metrics definitions, and also where to go for different types of data needs and which reports to look at? Is it documentation, or is it “go to the point person in the company”? How do you think about data democratization at scale for a company like MasterClass?

This is an interesting challenge. There are really several inflection points in businesses. There's a classic one when you get beyond 150 people – once you pass Dunbar's number things kind of get different. 

From a data transparency point of view, I've always held the belief that you want to be as transparent as humanly possible for as much of the business as possible. I'm increasingly interested in the notion of leading by context and trying to make sure that everybody else has as much context as I have. They can weigh that information when they're making decisions because the context of how the business is doing really matters. 

When thinking about a multi-layered communication strategy within the organization to talk about how this business is doing, think about: How often are you getting up at the All Hands and talking about the key performance metrics within the business? 

This will vary from company to company, but I think that too many companies default to being more secretive about metrics than they should be. If you really want to empower people to make the best decisions, remember: the impact you have on the business will be proportional to the quality of the decisions you make… and what improves decisions is better information and better context.

I think the key thing here is that sometimes data points alone aren't sufficient for people without the additional context and narrative, especially if you're a cyclical business. Communication and data access, whether that's through a deck or dashboards or Google Sheets, needs to put the data in context.

Is there anything else about the WBR process that we haven’t covered that you’d like to add?

My answer is this book: Working Backwards. It's a really great book. There's an entire chapter devoted to metrics and Weekly Business Reviews. Basically, it’s the Amazon operating model and they do a great job of breaking down their Weekly Business Review process: What does that look like? How do they do diagnostics? How do they think about time series data? And a lot more.

Thanks so much for taking the time to talk about WBRs with me today, Mark.  

The Weekly Business Review is the primary opportunity to understand important shifts in the business, educate and align the wider team, and make important decisions. 

Growing companies are rightfully focused on improving their metrics, but what’s often missed is also growing the collective understanding of the business system and levers that govern those metrics. Mastery of the levers of performance allows teams to control their own destiny and, among other things, deliver increasingly efficient growth. 

As the macro environment for startups becomes increasingly volatile and challenging, there has never been a more important time to nail systems understanding, information sharing, alignment, and decision making. Importantly, underpinning these critical workflows is a foundation of trusted, accessible data.

AirOps makes it easy for business teams to get trusted data into the tools and operating docs they use to run the company, including Weekly Business Reviews. We help free up your team’s precious time to focus on truth-seeking, not squabbling about data.

Are you ready to uplevel the ROI of your WBR? Sign up for early access here.
