NPS Won't Tell You Who's About to Churn
Every SaaS company tracks NPS. It’s the retention metric everyone agrees on: simple, comparable, trusted by executives. If your score is climbing, retention must be improving.
Except it isn’t.
A customer success team recently celebrated an NPS of 67. Less than two weeks later, a customer who had rated them a 9 churned, taking five figures of ARR with them. Behind the scenes, that customer’s budget had tightened, their internal champion had left, and new stakeholders had started cutting tools they considered unnecessary.
And none of that showed up in the survey.
This article covers the behavioral signals that can forecast churn weeks in advance, why teams miss those signals even when they know usage matters, and what early intervention actually buys you.
Usage Drops Before Sentiment Does
The pattern is always the same: usage decreases, engagement drops, and finally, the customer admits they’re leaving. By the time they’re willing to tell you they’re unhappy, they’ve already mentally moved on.
Someone who used to log in daily suddenly reduces their visits to twice a week, then once a week, and eventually disappears for a month. When you finally check in, they tell you they’ve been “too busy” to use the product. What they mean is: it’s no longer important to them.
Or you have a customer who uses only one feature. They’re happy with that feature and might even give you a high NPS score. But they’re vulnerable. The day they find a tool that does that one thing better, cheaper, or bundled with something else they need, they’ll leave.
NPS is Easy, Behavioral Tracking is Difficult (That’s Why Nobody Does It)
Everyone understands that usage matters more than survey scores. So why do teams still depend on NPS?
Because NPS is simple. One survey, one number, one trend line for the leadership team. Tracking behavioral signals across hundreds of accounts is complicated. You need a solid data infrastructure, clear thresholds, and someone who will actually respond to the alerts.
Most companies lack the systems to track usage at the account level. They can see aggregate metrics (total logins this month, overall feature adoption) but can’t identify which specific accounts are declining. Their analytics tools show what’s happening across the whole customer base, not what’s happening with the account that’s up for renewal next month.
Even when they have the data, they don’t know which threshold matters. Is a 20% drop in usage over two weeks concerning? What about 30% over four weeks? Without testing it against your own churn data, you’re just guessing.
And here’s the real issue: behavioral tracking creates extra work. When an account’s usage drops, someone has to call them. That conversation can be uncomfortable. You’re basically saying, “I noticed you’re not using us much anymore,” which forces them to either lie about why or admit they’re considering other options. Most customer success teams would rather wait until the scheduled quarterly check-in than have that talk.
NPS helps you avoid all of this. The survey sends automatically, and the score updates. Everyone stays informed. No awkward calls needed.
How to Set Up Tracking Without Drowning in Alerts
Here’s how to set this up without drowning in noise:
Start with your own churn data instead of industry benchmarks. Pull everyone who churned in the past year and examine their usage patterns 60-90 days before cancellation. Identify the pattern that consistently emerges; that’s your threshold. For a typical B2B SaaS product, usage declines 30-40% over the 3-4 weeks before cancellation, but your number might differ.
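As a sketch of that analysis: the threshold-finding step can be a few lines of Python. The account names, weekly counts, and the 4-week window below are all invented for illustration; the point is measuring each churned account’s drop from its own baseline, then taking a typical value as your threshold.

```python
from statistics import median

# Hypothetical weekly core-action counts for the 8 weeks before
# each churned account cancelled (all names and numbers invented).
churned_usage = {
    "acct_a": [50, 48, 46, 40, 33, 27, 20, 12],
    "acct_b": [20, 21, 19, 17, 14, 11, 9, 6],
    "acct_c": [80, 82, 78, 75, 60, 48, 40, 30],
}

def decline_over_window(weekly_counts, window=4):
    """Percent drop from the baseline weeks to the final `window` weeks."""
    baseline = sum(weekly_counts[:-window]) / len(weekly_counts[:-window])
    recent = sum(weekly_counts[-window:]) / window
    return round(100 * (baseline - recent) / baseline, 1)

declines = {acct: decline_over_window(u) for acct, u in churned_usage.items()}

# The median drop across churned accounts is a reasonable first threshold.
threshold = median(declines.values())
```

On this toy data the churned accounts dropped 40-50% over their final month, so the median lands in that range; run the same computation on your real churn list to get a number you can defend.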
Track the appropriate usage metric for your product. Daily active users don’t matter if your product is meant for weekly use. Find the action that shows real engagement. In project management, it’s about creating or completing tasks. For analytics platforms, it’s running reports. For communication tools, it’s sending messages. Track that action, not just logins.
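To make that concrete, here is a minimal sketch of counting the engagement action instead of logins. The event log, account names, and the “task_completed” event type are placeholders for whatever your product actually emits.

```python
from collections import Counter

# Invented event log; "task_completed" stands in for your product's
# real engagement action (running a report, sending a message, etc.).
events = [
    {"account": "acct_a", "type": "login"},
    {"account": "acct_a", "type": "task_completed"},
    {"account": "acct_a", "type": "login"},
    {"account": "acct_b", "type": "login"},
]

core_actions = Counter(e["account"] for e in events if e["type"] == "task_completed")
logins = Counter(e["account"] for e in events if e["type"] == "login")
```

Note that acct_b logs in but never completes a task: a login-based metric would call that account engaged, while the core-action count correctly shows zero.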
Set up a simple flag system, not a complex score. You don’t need another scoring algorithm. You need three categories: healthy (usage stable or growing), watch (usage down 15-25%), act now (usage down 30%+). Route “act now” accounts to whoever owns retention. Route “watch” accounts to a weekly review.
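Sketched in Python, that flag logic is just a couple of comparisons. The cutoffs are the ones named above; note the original leaves the 25-30% band unassigned, so this sketch lumps it into “watch”. Tune both numbers against your own churn data.

```python
def flag_account(baseline_usage: float, recent_usage: float) -> str:
    """Bucket an account by its usage drop from baseline.

    Thresholds from the article: "watch" at a 15%+ drop,
    "act now" at a 30%+ drop. The 25-30% gap falls into
    "watch" here as a judgment call.
    """
    if baseline_usage <= 0:
        return "act now"  # no baseline activity is itself a red flag
    drop = (baseline_usage - recent_usage) / baseline_usage
    if drop >= 0.30:
        return "act now"   # route to whoever owns retention
    if drop >= 0.15:
        return "watch"     # route to the weekly review
    return "healthy"
```

An account that completed 100 core actions in its baseline period and 65 recently lands in “act now”; 80 lands in “watch”; 98 is “healthy”.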
The Intervention Call (And What Each Response Actually Means)
When someone’s usage decreases by 30% over three weeks, you probably have about two weeks before they start exploring other options if they haven’t already.
The approach is straightforward: “I pulled up your account and noticed your team’s been completing fewer projects than usual. Everything okay on your end?”
Here’s what each response actually means and what to do:
“Yeah, we’ve been struggling with [specific feature]” → This is your best-case scenario. They face a specific problem that’s blocking them. You have 48 hours to either fix it or provide a workaround. If you can’t resolve it within 2 days, they’ll assume the product doesn’t suit their needs and start looking elsewhere. Don’t promise a fix that takes three weeks. They won’t wait.
“We’ve been slammed with [other project/initiative]” → They’re telling you you’re not important. Now you need to figure out if that’s temporary or permanent. Ask: “When does that wrap up?” If they give you a specific date, set a reminder to check back. If they’re vague (“Oh, you know how it is, always busy”), they’re deprioritizing you forever. Your next question should be: “What would need to change for this to become a priority again?” If they can’t answer that, they’re gone.
“We had some people leave, and the new folks haven’t been trained.” → Offer immediate onboarding for new team members. Their response reveals everything. If they say, “Yes, that would be great. Can we schedule something this week?” they’re salvageable. If they say “We’ll get back to you on that” or “Let me check with the team,” they’re already evaluating options and don’t want to waste time training someone on a tool they’re about to replace.
“We’re looking at all our tools right now.” / “Budget’s tight.” → Stop avoiding the issue. Ask directly: “Are you evaluating alternatives to us?” If they say yes, at least you know where you stand and can try to influence the decision. If they say no, they’re lying. Assume you have 30 days max. The question becomes: is it worth fighting for this account, or should you let them go quietly?
The truth nobody tells you is that most of these conversations don’t save the account. Maybe 30-40% of the time, you catch a real problem early enough to fix it. The rest of the time, you’re just getting the bad news three weeks earlier than you would have otherwise.
But that’s still valuable. Three weeks is enough time to:
Stop wasting CS resources on accounts that are already gone.
Get a head start on replacing that revenue.
Truly understand why customers leave, rather than just hearing the sanitized version they provide at cancellation.
The customers will tell you exactly what’s wrong if you ask them directly. Those who are already gone will dodge the question or give vague answers. Knowing how to tell the difference is the whole game.
Your Churn Data Already Proves This
Pull your churn list from the last six months. Check what their NPS scores were 60 days before they canceled. Most of them probably looked fine. Not promoters, maybe, but not detractors either. Somewhere in the passive range, giving you 7s and 8s while they quietly evaluated alternatives.
Now review their usage data for those same 60 days: login frequency, feature adoption. The warning signs were there all along: declining usage, stagnant engagement. You just weren’t looking at the right data.
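This retrospective is easy to script once the two datasets are joined. The records below are invented, and the field names are placeholders, but the shape of the check is the whole exercise: how many churned accounts looked fine on NPS, and how many a usage flag would have caught.

```python
# Invented records for churned accounts: NPS 60 days before
# cancellation and usage change over that same window.
churned = [
    {"account": "acct_a", "nps_60d_prior": 8, "usage_change_pct": -42},
    {"account": "acct_b", "nps_60d_prior": 7, "usage_change_pct": -35},
    {"account": "acct_c", "nps_60d_prior": 9, "usage_change_pct": -12},
]

# Accounts NPS would have called fine (passives and promoters, 7+).
looked_fine_on_nps = [c["account"] for c in churned if c["nps_60d_prior"] >= 7]

# Accounts a 30%-drop usage flag would have caught in time.
caught_by_usage = [c["account"] for c in churned if c["usage_change_pct"] <= -30]
```

In this toy data every churned account looked fine on NPS, while the usage flag would have caught two of the three with weeks to spare; your real numbers are the argument you take to leadership.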
NPS is useful for board decks and executive dashboards. It’s a clean, numerical indicator that shows trends over time. But if you actually want to retain revenue, you need to pay attention to what customers do, not just what they say when you interrupt them with a survey.
The customers who are about to leave are already showing you. They do it through their login patterns, not their survey responses.


