Product Adoption · EdTech · User Engagement

5 Signs Your Users Are Stuck in the 'Trying' Phase

6 min read

There is a crucial gap in the EdTech adoption journey that does not get enough attention. It happens after a teacher creates an account but before they actually integrate your product into their practice. Call it the trying phase: users who have access, who may have even logged in a few times, but who have not crossed the threshold into real adoption.

This phase is deceptively dangerous because the metrics can look okay on the surface. You have sign-ups. You have logins. You might even have some initial engagement. But none of it is translating into the sustained usage that drives retention and renewals.

Research on SaaS product adoption distinguishes between activation, a single moment when users experience your product's value, and adoption, when users fully integrate your product into their routine (Business of Apps, 2025). Many EdTech users activate but never adopt. They try, but they do not convert. Here are five signs this is happening in your product.

1. High Day 1 Engagement That Crashes by Day 7

The pattern looks like this: a teacher signs up, explores the product enthusiastically on day one, maybe completes an onboarding sequence, and then disappears. If you are seeing strong initial engagement followed by a steep drop-off within the first week, your users are trying but not forming habits.

Industry benchmarks put this in stark relief. Education apps see Day 1 retention around 14 to 15%, but by Day 30, that number drops to roughly 2 to 3% (Business of Apps, 2025). After onboarding 5,000 new users, many EdTech platforms see fewer than 15% still active after two weeks.

The gap between Day 1 and Day 7 is where the trying phase lives. Users who make it to Day 7 with meaningful engagement are far more likely to become long-term adopters. Users who drop off before then are signaling that your product did not become part of their workflow quickly enough.

The critical window is Days 3 to 30, when most churn happens. This is where onboarding, content pacing, and the habit-formation process all converge. If you are not tracking this window specifically, you are missing where adoption breaks down.
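As a rough sketch of how to track this window, assuming you can export login events as (user_id, days_since_signup) pairs (the field names here are illustrative, not from any particular analytics tool):

```python
# Hypothetical event log: one (user_id, days_since_signup) pair per login.
# Every user in the cohort appears at least once, on day 0 (their signup).
def retention(events, cohort_size, day):
    """Share of the cohort still logging in on or after `day` (rolling retention)."""
    active = {uid for uid, d in events if d >= day}
    return len(active) / cohort_size

events = [
    ("t1", 0), ("t1", 1), ("t1", 7), ("t1", 30),  # long-term adopter
    ("t2", 0), ("t2", 1),                          # tried, then vanished
    ("t3", 0),                                     # signed up, never returned
    ("t4", 0), ("t4", 1), ("t4", 6),               # dropped off inside week one
]
cohort = 4
print(retention(events, cohort, 1))   # 0.75 — Day 1 looks healthy
print(retention(events, cohort, 7))   # 0.25 — the trying phase claimed half the cohort
print(retention(events, cohort, 30))  # 0.25
```

The interesting number is the gap between the Day 1 and Day 7 figures: that is the size of your trying-phase leak.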

2. Users Complete Onboarding but Never Return to Core Features

Onboarding completion is a common success metric, but it can be misleading. A teacher might dutifully click through your setup wizard, watch your intro videos, and check all the onboarding boxes without ever experiencing the actual value your product delivers.

The better question is: what do users do after onboarding? If your onboarding completion rate is high but your feature adoption rate is low, users are going through the motions without reaching the aha moment that turns trying into using.

Time to value is the metric that matters here. It measures how long it takes a new user to realize tangible benefits after they start using your product (Business of Apps, 2025). If teachers are completing onboarding in 20 minutes but not experiencing value until session three or four, you have a trying-phase problem. The onboarding taught them how the product works but did not show them why it matters for their classroom.
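A minimal way to measure this, assuming you log a signup timestamp and a timestamp for each user's first "value event" (what counts as a value event is product-specific and is left abstract here):

```python
from datetime import datetime, timedelta
from statistics import median

# Hypothetical records: (signup time, first value-event time or None if never reached).
def time_to_value_days(signup, first_value):
    if first_value is None:
        return None  # completed onboarding, perhaps, but never reached value
    return (first_value - signup) / timedelta(days=1)

users = [
    (datetime(2025, 9, 1), datetime(2025, 9, 1, 12)),  # value within half a day
    (datetime(2025, 9, 1), datetime(2025, 9, 5)),      # value on day 4
    (datetime(2025, 9, 1), None),                      # stuck in the trying phase
]
ttv = [time_to_value_days(s, v) for s, v in users]
reached = [t for t in ttv if t is not None]
print(f"median TTV: {median(reached)} days; reached value: {len(reached)}/{len(ttv)}")
```

Tracking the share of users who never reach a value event at all is just as important as the median for those who do: that share is your trying-phase population.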

3. Usage Spikes After Training, Then Flatlines

Professional development sessions often create temporary engagement surges. Teachers attend a training, get excited about the possibilities, log in with fresh enthusiasm, and then... nothing. The usage spike decays back to baseline within a week or two.

This pattern indicates that training is generating interest but not building capability or habit. Research from the EdWeek Research Center found that more than half of educators report their EdTech professional development experiences are mostly one-time events with little or no follow-up coaching or training (EdWeek, 2022). Without reinforcement, the trying phase never converts to adoption.

If your usage data shows consistent post-training spikes that do not sustain, the problem is not that teachers are uninterested. It is that your training is creating motivation without creating the conditions for that motivation to translate into changed behavior.
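One crude way to detect the spike-and-decay pattern, assuming a daily active-teacher count and a known training date (the 1.2x threshold and one-week windows are illustrative choices, not benchmarks):

```python
from statistics import mean

def spike_sustained(daily_active, baseline_days=7, window=7, threshold=1.2):
    """True if usage in the week after a training stays above 1.2x the pre-training baseline."""
    baseline = mean(daily_active[:baseline_days])
    post = mean(daily_active[baseline_days + 1 : baseline_days + 1 + window])
    return post >= threshold * baseline

# Seven baseline days, the training day itself, then a spike that decays right back.
counts = [10, 11, 9, 10, 10, 11, 9,  40,  20, 12, 10, 10, 9, 10, 9]
print(spike_sustained(counts))  # False — motivation without changed behavior
```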

4. Users Engage with Simple Features but Avoid the Core Value

Watch where your users actually spend their time. In the trying phase, teachers often gravitate toward low-friction features: browsing content libraries, adjusting settings, exploring dashboards. They engage with your product at the periphery while avoiding the core actions that deliver real value.

This is not laziness. It is risk avoidance. Teachers know that truly integrating a new tool into their classroom practice requires effort, and that effort might not pay off. Trying the edges of your product feels safer than committing to the center.

Feature adoption rate is the key diagnostic here. If users log in regularly but engage with only 20 to 30% of your product's capabilities, they are in trying mode; they have not committed. Use of multiple features is a strong proxy for users getting value (Business of Apps, 2025). When teachers start using your product's interconnected features, that is the signal they have moved from trying to adopting.
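A sketch of this diagnostic, assuming per-user feature-usage logs and a product-specific split between peripheral and core features (the feature names here are invented for illustration):

```python
# Hypothetical feature catalog for an EdTech product.
ALL_FEATURES = {"library", "settings", "dashboard", "assign", "grade", "report", "share", "differentiate"}
CORE_FEATURES = {"assign", "grade", "report"}  # where the real classroom value lives (illustrative)

def adoption_profile(used):
    """Return (overall breadth, core-feature coverage) for one user's feature set."""
    breadth = len(used & ALL_FEATURES) / len(ALL_FEATURES)
    core = len(used & CORE_FEATURES) / len(CORE_FEATURES)
    return breadth, core

trying = {"library", "settings", "dashboard"}              # periphery only
adopting = {"library", "assign", "grade", "report", "share"}
print(adoption_profile(trying))    # (0.375, 0.0) — logs in, avoids the core
print(adoption_profile(adopting))  # (0.625, 1.0) — committed to the center
```

The pair matters more than either number alone: decent breadth with zero core coverage is the signature of a user circling the edges of your product.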

5. Support Requests Are About Navigation, Not Outcomes

The nature of support questions reveals where users are in their journey. Users stuck in the trying phase ask questions like: Where do I find this feature? How do I export this report? What does this button do? They are still figuring out the mechanics.

Users who have moved into adoption ask different questions: How do I use this to address my struggling readers? Can I customize this for my AP class? What is the best way to share results with parents? These questions are about outcomes, not navigation. They indicate a teacher who is thinking about your product in terms of their actual classroom needs.

If your support tickets are dominated by how-do-I-find questions rather than how-do-I-use questions, your users are still trying to figure out what your product does rather than trying to do something meaningful with it.

The shift from navigation questions to outcome questions is one of the clearest signals that a user has crossed from trying to adopting. Track it.
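Even a crude keyword heuristic, run over your ticket history, can surface this shift; the phrase lists below are invented for illustration and no substitute for a real classifier:

```python
# Illustrative phrase lists: locating mechanics vs. achieving classroom outcomes.
NAVIGATION = ("where do i find", "how do i export", "what does this button", "can't find")
OUTCOME = ("struggling", "customize", "share results", "my students")

def classify(ticket):
    t = ticket.lower()
    if any(phrase in t for phrase in NAVIGATION):
        return "navigation"
    if any(phrase in t for phrase in OUTCOME):
        return "outcome"
    return "other"

tickets = [
    "Where do I find the fluency report?",
    "How do I use this to address my struggling readers?",
    "Can I customize this for my AP class?",
]
print([classify(t) for t in tickets])  # ['navigation', 'outcome', 'outcome']
```

The ratio of navigation tickets to outcome tickets, tracked per cohort over time, gives you a rough trying-to-adopting gauge.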

Moving Users from Trying to Adopting

If these signs sound familiar, the path forward is not to add more features or run more trainings. It is to redesign the early user experience around a faster path to value.

Research suggests that a 10 to 15 minute first session that achieves a specific goal tends to lift both Day 1 and Day 7 retention (Business of Apps, 2025). This means identifying what success looks like for a teacher in their first real use of your product and then ruthlessly optimizing the path to that success. Everything else can come later.

It also means building reinforcement loops that sustain momentum through the critical Days 3 to 30 window. In-app messages can boost retention by approximately 30% when implemented well (Business of Apps, 2025). But the messages need to be about progress toward meaningful outcomes, not just product announcements.

Most importantly, it means tracking the right metrics. Time to first value. Feature adoption rate. The gap between Day 1 and Day 7 engagement. These numbers tell you whether your users are becoming adopters or remaining perpetual tryers.

If your product has users stuck in the trying phase and you are not sure how to move them forward, that is the kind of adoption challenge I help teams diagnose and solve.

Free Assessment

How strong is your adoption strategy?

Take the free 3-minute Adoption Diagnostic. Score your product across five critical dimensions and get personalized recommendations.

Your Product Is Strong. Let’s Make Sure Educators See It.

Most product adoption challenges aren’t about the product itself. They’re about how it’s positioned, introduced, and supported. Let’s figure out what yours needs.

Start the Conversation