Product Metrics

YeboLearn's product metrics track how schools, teachers, and students engage with our platform. They inform product decisions, guide roadmap prioritization, and surface opportunities for improvement.

Engagement Metrics

Daily Active Schools (DAS)

Definition: Unique schools with at least one user login AND feature usage in a 24-hour period

Current Performance: 109 schools/day (75% of active customer base)

Target: 75%+ of total active schools (maintaining current level)

DAS Calculation:

Daily Active School criteria:
1. At least 1 teacher or student login
2. At least 1 feature interaction (not just dashboard view)
3. Activity timestamp within 24-hour UTC period

Excludes: API-only activity, automated tasks, admin-only logins
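
As a concrete illustration, here is a minimal Python sketch of this count over a raw activity log. The event fields (school_id, role, event_type, ts) and the exclusion rules shown are illustrative, not the production pipeline's actual schema:

```python
from datetime import timedelta

# Illustrative event shape:
# {"school_id": str, "role": "teacher" | "student" | "admin",
#  "event_type": str, "ts": datetime in UTC}

def daily_active_schools(events, day_start):
    """Count schools meeting all three DAS criteria in one 24-hour UTC period."""
    day_end = day_start + timedelta(days=1)
    logins, interactions = set(), set()
    for e in events:
        if not (day_start <= e["ts"] < day_end):
            continue
        if e["event_type"] in ("api_call", "automated_task"):
            continue  # excluded: API-only activity and automated tasks
        if e["event_type"] == "login":
            if e["role"] in ("teacher", "student"):  # admin-only logins don't count
                logins.add(e["school_id"])
        elif e["event_type"] != "dashboard_view":
            interactions.add(e["school_id"])  # a real feature interaction
    return len(logins & interactions)  # both criteria must hold
```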

DAS Trends (Last 30 Days):

| Week | DAS | Active Schools | DAS % | Week-over-Week |
|---|---|---|---|---|
| Week 1 (Nov 1-7) | 106 | 143 | 74% | - |
| Week 2 (Nov 8-14) | 108 | 144 | 75% | +1.9% |
| Week 3 (Nov 15-21) | 109 | 145 | 75% | +0.9% |
| Week 4 (Nov 22-28) | 111 | 145 | 77% | +1.8% |
| Average | 109 | 144 | 75% | +1.5% |

DAS by Day of Week:

| Day | Avg DAS | % of Schools | Pattern |
|---|---|---|---|
| Monday | 118 | 81% | Highest (week planning) |
| Tuesday | 115 | 79% | High (full teaching) |
| Wednesday | 112 | 77% | High (mid-week) |
| Thursday | 110 | 76% | Moderate |
| Friday | 98 | 68% | Lower (week winding down) |
| Saturday | 42 | 29% | Low (weekend) |
| Sunday | 38 | 26% | Lowest (weekend prep) |

Insights:

  • Strong weekday engagement (Monday-Thursday)
  • Weekend usage primarily from teachers planning next week
  • Opportunity: Increase Friday engagement with weekly recap features

DAS by Subscription Tier:

| Tier | Schools | Avg DAS | DAS % | Engagement Level |
|---|---|---|---|---|
| Enterprise | 12 | 12 | 100% | Exceptional |
| Professional | 85 | 72 | 85% | Strong |
| Essentials | 48 | 25 | 52% | Moderate |
| Total | 145 | 109 | 75% | Good |

Strategic Priority: Increase Essentials tier engagement through better onboarding and feature discovery

Weekly Active Users (WAU)

Definition: Unique teachers and students with platform interaction in 7-day period

Current Performance: 11,240 WAU across all schools

Target: 10,000+ WAU, growing with customer acquisition

WAU Composition:

| User Type | Weekly Active | Total Users | Activity Rate |
|---|---|---|---|
| Teachers | 1,860 | 2,320 | 80% |
| Students | 9,380 | 14,850 | 63% |
| Total | 11,240 | 17,170 | 65% |

WAU Growth Trends:

| Month | WAU | Total Users | Activity Rate | MoM Growth |
|---|---|---|---|---|
| Aug 2025 | 9,850 | 15,200 | 65% | - |
| Sep 2025 | 10,420 | 16,150 | 65% | +5.8% |
| Oct 2025 | 10,880 | 16,800 | 65% | +4.4% |
| Nov 2025 | 11,240 | 17,170 | 65% | +3.3% |

Target: Maintain 65%+ activity rate while growing total users

Teacher vs Student Engagement Patterns:

| Metric | Teachers | Students |
|---|---|---|
| Avg sessions/week | 8.5 | 4.2 |
| Avg time per session | 24 min | 18 min |
| Features used/week | 4.8 | 2.1 |
| Peak usage time | 8am-10am | 2pm-4pm |
| Most used feature | Lesson Planner | Quiz Generator |

Monthly Active Users (MAU)

Definition: Unique users with platform interaction in 30-day period

Current Performance: 13,680 MAU

Target: 80%+ of total registered users

MAU/WAU Ratio: 1.22 (indicates healthy weekly engagement)

  • Ratio >1.5: Low engagement, users only active occasionally
  • Ratio 1.2-1.5: Good engagement, users active multiple times per month
  • Ratio <1.2: Exceptional engagement, most users active weekly

Stickiness (DAU/MAU Ratio)

Definition: Daily Active Users / Monthly Active Users (measures habit formation)

Current Performance: 42% stickiness

Target: >40% (current performance is strong)

Stickiness Calculation:

Average DAU (monthly): 5,746 users
MAU: 13,680 users

Stickiness = (5,746 / 13,680) × 100 = 42%
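
Both ratios are straightforward to compute; a minimal sketch using the figures above:

```python
def stickiness_pct(avg_dau: float, mau: float) -> float:
    """DAU/MAU expressed as a percentage (habit formation)."""
    return avg_dau / mau * 100

def mau_wau_ratio(mau: float, wau: float) -> float:
    """MAU/WAU: lower values mean more of the monthly base returns weekly."""
    return mau / wau

print(round(stickiness_pct(5_746, 13_680)))     # 42
print(round(mau_wau_ratio(13_680, 11_240), 2))  # 1.22
```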

Industry Benchmarks:

  • <20%: Low stickiness, users engage occasionally
  • 20-40%: Moderate stickiness, regular usage
  • 40%+: High stickiness, daily habit formed (YeboLearn is here)

Stickiness by User Type:

  • Teachers: 58% (use platform almost daily)
  • Students: 38% (use 2-3 times per week)

Feature Adoption Metrics

Overall Feature Adoption Rate

Definition: Percentage of schools using 5 or more features monthly

Current Performance: 87 schools (60% of active schools)

Target: 60%+ schools using ≥5 features

Feature Adoption Distribution:

| Features Used | Schools | % of Total | Engagement Level |
|---|---|---|---|
| 1-2 features | 18 | 12% | Low (at-risk) |
| 3-4 features | 40 | 28% | Moderate |
| 5-7 features | 52 | 36% | Good |
| 8-10 features | 25 | 17% | High |
| 11+ features | 10 | 7% | Power users |
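
A minimal sketch of how this distribution and the headline ≥5-feature rate could be derived, assuming a mapping from school ID to the set of features that school used this month (the data shape is illustrative):

```python
from collections import Counter

def adoption_distribution(features_by_school):
    """Bucket schools by the number of distinct features used this month."""
    def bucket(n):
        if n <= 2: return "1-2 features"
        if n <= 4: return "3-4 features"
        if n <= 7: return "5-7 features"
        if n <= 10: return "8-10 features"
        return "11+ features"
    return Counter(bucket(len(used)) for used in features_by_school.values())

def adoption_rate_pct(features_by_school, threshold=5):
    """Share of schools using >= threshold features (the headline metric)."""
    qualifying = sum(1 for used in features_by_school.values()
                     if len(used) >= threshold)
    return qualifying / len(features_by_school) * 100
```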

Correlation with Retention:

  • 1-2 features: 45% annual retention
  • 3-4 features: 72% annual retention
  • 5-7 features: 91% annual retention
  • 8+ features: 98% annual retention

Insight: Schools using 5+ features retain at roughly twice the rate of schools using only 1-2 features (91% vs 45% annually)

Feature Usage Breakdown

Top 10 Features by Adoption (30-day active usage):

| Feature | Schools Using | % Adoption | Avg Uses/Week | Power Users |
|---|---|---|---|---|
| 1. AI Lesson Planner | 128 | 88% | 12.4 | 85 schools |
| 2. Smart Quiz Generator | 118 | 81% | 8.7 | 72 schools |
| 3. Auto-Grading | 112 | 77% | 15.2 | 68 schools |
| 4. Curriculum Alignment | 106 | 73% | 6.3 | 58 schools |
| 5. Resource Library | 98 | 68% | 4.8 | 42 schools |
| 6. Progress Tracking | 95 | 66% | 18.6 | 65 schools |
| 7. Plagiarism Checker | 89 | 61% | 5.2 | 38 schools |
| 8. Parent Portal | 78 | 54% | 3.1 | 28 schools |
| 9. Student Analytics | 72 | 50% | 7.8 | 35 schools |
| 10. Live Collaboration | 68 | 47% | 2.9 | 22 schools |

Power User Definition: A school using a feature at twice the average weekly rate or more

Feature Launch Performance (Last 3 Launches):

| Feature | Launch Date | 30-Day Adoption | 60-Day Adoption | Target | Status |
|---|---|---|---|---|---|
| Live Collaboration | Sep 1, 2025 | 35% | 47% | 40% | Exceeded |
| Student Analytics | Jul 15, 2025 | 28% | 50% | 45% | Exceeded |
| Parent Portal | Jun 1, 2025 | 42% | 54% | 50% | Exceeded |

Target: >40% adoption within 60 days of feature launch

AI Feature Usage Metrics

Definition: Percentage of schools using AI-powered features weekly

Current Performance: 102 schools (70% of active schools)

Target: 70%+ schools using AI features weekly

AI Feature Breakdown:

| AI Feature | Schools Using Weekly | % of Schools | Avg Uses/User/Week |
|---|---|---|---|
| AI Lesson Planner | 118 | 81% | 3.8 |
| Smart Quiz Generator | 108 | 75% | 2.4 |
| Auto-Grading | 95 | 66% | 8.2 |
| Plagiarism Checker | 78 | 54% | 1.6 |
| Writing Assistant | 72 | 50% | 2.1 |
| Content Recommendations | 68 | 47% | 5.3 |
| Any AI Feature | 102 | 70% | 23.4 |

AI Usage Intensity (schools using AI features):

| Intensity Level | Schools | % of AI Users | Avg AI Uses/Week |
|---|---|---|---|
| Light (1-10 uses) | 28 | 27% | 5.2 |
| Medium (11-25 uses) | 45 | 44% | 17.8 |
| Heavy (26-50 uses) | 22 | 22% | 36.4 |
| Power (51+ uses) | 7 | 7% | 72.6 |

AI Usage Growth:

| Month | Schools Using AI | % Adoption | Total AI Uses | MoM Growth |
|---|---|---|---|---|
| Aug 2025 | 88 | 62% | 78,400 | - |
| Sep 2025 | 94 | 65% | 85,200 | +8.7% |
| Oct 2025 | 98 | 68% | 91,800 | +7.7% |
| Nov 2025 | 102 | 70% | 98,500 | +7.3% |

Target: 80% AI adoption by Q2 2026

Feature Engagement Score

Definition: Composite score measuring breadth and depth of feature usage

Calculation:

Feature Engagement Score =
  (# of features used × 10) +
  (total feature interactions / 10) +
  (AI feature usage × 5)

Example for a high-engagement school:
  (12 features × 10) + (840 interactions / 10) + (65 AI uses × 5)
  = 120 + 84 + 325 = 529 points
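
The same formula transcribed as a function, verified against the worked example:

```python
def feature_engagement_score(features_used: int,
                             total_interactions: int,
                             ai_uses: int) -> float:
    """Composite engagement: breadth (features), depth (interactions), AI usage."""
    return features_used * 10 + total_interactions / 10 + ai_uses * 5

# The worked example above:
assert feature_engagement_score(12, 840, 65) == 529
```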

School Segmentation by Engagement Score:

| Segment | Score Range | Schools | Avg MRR | Retention |
|---|---|---|---|---|
| Power Users | 400+ | 18 | $3,200 | 98% |
| High Engagement | 250-399 | 45 | $2,100 | 94% |
| Medium Engagement | 150-249 | 52 | $1,650 | 88% |
| Low Engagement | 50-149 | 22 | $1,100 | 68% |
| At-Risk | <50 | 8 | $850 | 42% |

Strategic Action: Focus on moving "Low Engagement" schools to "Medium" (targeting 150+ score)

Platform Performance Metrics

Platform Uptime

Definition: Percentage of time platform is available and responsive

Current Performance: 99.94% uptime (last 30 days)

Target: 99.9%+ uptime (SLA commitment)

Uptime Tracking:

| Month | Uptime % | Downtime (min) | Incidents | MTTR |
|---|---|---|---|---|
| Aug 2025 | 99.89% | 48 min | 2 | 24 min |
| Sep 2025 | 99.96% | 17 min | 1 | 17 min |
| Oct 2025 | 99.92% | 35 min | 2 | 18 min |
| Nov 2025 | 99.94% | 26 min | 1 | 26 min |

MTTR = Mean Time to Recovery
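
Both columns follow directly from the incident log; a minimal sketch, assuming a 30-day month:

```python
def uptime_pct(downtime_min: float, days_in_month: int = 30) -> float:
    """Uptime as a percentage of total minutes in the month."""
    return (1 - downtime_min / (days_in_month * 24 * 60)) * 100

def mttr_min(incident_durations_min: list) -> float:
    """Mean Time to Recovery: average incident duration in minutes."""
    return sum(incident_durations_min) / len(incident_durations_min)

print(round(uptime_pct(26), 2))  # 99.94 (November: one 26-minute incident)
print(mttr_min([26]))            # 26.0
```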

SLA Tiers:

  • Enterprise: 99.95% uptime guarantee (5 schools impacted by Nov incident)
  • Professional: 99.9% uptime target
  • Essentials: 99.5% uptime (best effort)

Incident Response:

  • <15 min: Excellent recovery
  • 15-30 min: Acceptable
  • >30 min: Needs improvement

Page Load Times

Definition: Time from request to fully loaded page

Current Performance: 1.8s average load time

Target: <2.0s average, <3.0s for 95th percentile

Load Time Distribution (November):

| Percentile | Load Time | Target | Status |
|---|---|---|---|
| 50th (Median) | 1.4s | <1.5s | Good |
| 75th | 2.1s | <2.5s | Good |
| 95th | 2.9s | <3.0s | Good |
| 99th | 4.2s | <5.0s | Good |
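
For reference, these percentiles can be computed from raw load-time samples with the standard library; a sketch (the actual telemetry pipeline may differ):

```python
import statistics

def load_time_percentiles(samples_s: list) -> dict:
    """50th/75th/95th/99th percentile load times from raw samples (seconds)."""
    # quantiles(n=100) returns 99 cut points; index p-1 is the p-th percentile
    cuts = statistics.quantiles(samples_s, n=100, method="inclusive")
    return {p: round(cuts[p - 1], 1) for p in (50, 75, 95, 99)}
```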

Load Time by Feature:

| Feature | Avg Load | 95th Percentile | Status |
|---|---|---|---|
| Dashboard | 1.2s | 2.1s | Excellent |
| Lesson Planner | 2.4s | 3.8s | Good |
| Quiz Generator | 3.1s | 4.5s | Needs improvement |
| Resource Library | 1.8s | 2.6s | Good |
| Student Analytics | 2.9s | 4.2s | Acceptable |

Action Item: Optimize Quiz Generator load time (target: <2.5s average)

API Response Times

Definition: Time for API to respond to requests

Current Performance: 280ms average response time

Target: <300ms average, <500ms for 95th percentile

API Performance:

| Endpoint Category | Avg Response | 95th Percentile | Requests/Day |
|---|---|---|---|
| Authentication | 120ms | 185ms | 45,000 |
| Content Retrieval | 220ms | 380ms | 128,000 |
| AI Generation | 1,850ms | 3,200ms | 18,500 |
| Data Analytics | 450ms | 680ms | 12,400 |
| File Upload | 680ms | 1,100ms | 8,200 |

Note: AI generation endpoints have higher latency due to calls to external AI providers (e.g., OpenAI)

Error Rates

Definition: Percentage of requests resulting in errors

Current Performance: 0.12% error rate

Target: <0.5% error rate

Error Breakdown (November):

| Error Type | Count | % of Total | Impact Level |
|---|---|---|---|
| 5xx Server Errors | 1,240 | 58% | High |
| 4xx Client Errors | 720 | 34% | Medium |
| Timeout Errors | 180 | 8% | High |
| Total Errors | 2,140 | 100% | - |

Error Rate Trend:

| Week | Total Requests | Errors | Error Rate |
|---|---|---|---|
| Week 1 | 4.2M | 580 | 0.14% |
| Week 2 | 4.5M | 520 | 0.12% |
| Week 3 | 4.4M | 480 | 0.11% |
| Week 4 | 4.6M | 560 | 0.12% |

Top Error Sources:

  1. AI API timeouts (OpenAI rate limits) - 32% of errors
  2. Database connection pool exhaustion - 18% of errors
  3. File upload size limits exceeded - 12% of errors
  4. Invalid authentication tokens - 11% of errors

User Behavior Metrics

Session Duration

Definition: Average time users spend in platform per session

Current Performance:

  • Teachers: 24 minutes average
  • Students: 18 minutes average
  • Overall: 19.8 minutes average

Target: Maintain >15 minutes average (indicates meaningful engagement)

Session Duration by Feature:

| Feature | Avg Session Duration | Sessions/Week | Total Time/Week |
|---|---|---|---|
| Lesson Planner | 32 min | 4,850 | 2,587 hours |
| Auto-Grading | 18 min | 6,200 | 1,860 hours |
| Quiz Generator | 22 min | 3,840 | 1,408 hours |
| Progress Tracking | 12 min | 5,100 | 1,020 hours |
| Resource Library | 15 min | 2,900 | 725 hours |

Total Weekly Active Learning Hours: 12,840 hours across 145 schools = 88.6 hours/school/week

Monthly Average: 385 hours/school/month (target: 800 hours/school/month)

Opportunity: Increase session frequency and duration to reach target learning hours

Features Per Session

Definition: Average number of distinct features used in single session

Current Performance: 2.4 features per session

Target: >2.0 features per session (indicates feature discovery and cross-usage)

Session Patterns:

| Features Used | % of Sessions | Avg Duration | User Satisfaction |
|---|---|---|---|
| 1 feature | 42% | 12 min | 3.2/5 |
| 2 features | 31% | 18 min | 3.8/5 |
| 3 features | 18% | 26 min | 4.2/5 |
| 4+ features | 9% | 38 min | 4.6/5 |

Insight: Multi-feature sessions have higher satisfaction and longer duration

Feature Discovery Rate

Definition: Percentage of users discovering new features each month

Current Performance: 28% of users try a new feature monthly

Target: >25% monthly feature discovery

Discovery Methods:

| Method | % of Discoveries | Effectiveness |
|---|---|---|
| In-app prompts | 42% | High |
| Email campaigns | 25% | Medium |
| Onboarding wizard | 18% | High |
| User exploration | 12% | Low |
| Support team | 3% | Medium |

Strategic Focus: Increase in-app feature prompts and improve onboarding wizard

Retention Metrics

User Retention Cohorts

7-Day Retention (users active 7 days after first session):

| Signup Month | New Users | Day 7 Active | Retention % |
|---|---|---|---|
| Aug 2025 | 1,840 | 1,288 | 70% |
| Sep 2025 | 2,120 | 1,590 | 75% |
| Oct 2025 | 2,280 | 1,789 | 78% |
| Nov 2025 | 1,920 | 1,536 | 80% |

30-Day Retention (users active 30 days after first session):

| Signup Month | New Users | Day 30 Active | Retention % |
|---|---|---|---|
| Jul 2025 | 1,680 | 1,025 | 61% |
| Aug 2025 | 1,840 | 1,195 | 65% |
| Sep 2025 | 2,120 | 1,421 | 67% |
| Oct 2025 | 2,280 | 1,573 | 69% |

Target: 75%+ 7-day retention, 65%+ 30-day retention
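
A minimal cohort-retention sketch, assuming per-user first-session dates and sets of active dates; it counts users active on the exact day-N mark, whereas the production definition may use a window:

```python
from datetime import timedelta

def cohort_retention_pct(first_sessions: dict, active_days: dict, n: int) -> float:
    """Share of a signup cohort active n days after their first session.

    first_sessions: user_id -> date of first session
    active_days:    user_id -> set of dates with any platform activity
    """
    retained = sum(
        1 for user, start in first_sessions.items()
        if start + timedelta(days=n) in active_days.get(user, set())
    )
    return retained / len(first_sessions) * 100
```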

Feature-Specific Retention

Retention Rate by First Feature Used:

| First Feature | 7-Day Retention | 30-Day Retention |
|---|---|---|
| AI Lesson Planner | 82% | 72% |
| Quiz Generator | 76% | 68% |
| Resource Library | 68% | 58% |
| Progress Tracking | 72% | 65% |
| Curriculum Alignment | 79% | 71% |

Insight: Users starting with AI features have higher retention

Monitoring and Alerts

Product Health Alerts

Critical (Immediate Action):

  • Platform uptime <99.5% for >15 minutes
  • Error rate >1% for >5 minutes
  • DAS decline >15% day-over-day
  • Major feature completely unavailable

Warning (Review Within 2 Hours):

  • Load time >5s for 95th percentile
  • Error rate 0.5-1% sustained for >30 minutes
  • DAS decline 10-15% day-over-day
  • Feature adoption decline >10% week-over-week

Opportunity (Weekly Review):

  • Feature adoption increase >20% week-over-week
  • New feature usage pattern detected
  • Power user segment growth >10%
  • Session duration increase >15%
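
A sketch of how these thresholds might be evaluated against a daily metrics snapshot; the snapshot field names are hypothetical:

```python
def classify_alerts(m: dict) -> list:
    """Map a metrics snapshot onto the alert levels defined above."""
    alerts = []
    # Critical thresholds
    if m["uptime_pct"] < 99.5 and m["low_uptime_streak_min"] > 15:
        alerts.append(("CRITICAL", "uptime <99.5% for >15 minutes"))
    if m["error_rate_pct"] > 1.0 and m["error_streak_min"] > 5:
        alerts.append(("CRITICAL", "error rate >1% for >5 minutes"))
    if m["das_change_dod_pct"] < -15:
        alerts.append(("CRITICAL", "DAS decline >15% day-over-day"))
    # Warning thresholds
    elif m["das_change_dod_pct"] < -10:
        alerts.append(("WARNING", "DAS decline 10-15% day-over-day"))
    if m["p95_load_time_s"] > 5:
        alerts.append(("WARNING", "95th percentile load time >5s"))
    if 0.5 <= m["error_rate_pct"] <= 1.0 and m["error_streak_min"] > 30:
        alerts.append(("WARNING", "error rate 0.5-1% sustained >30 minutes"))
    return alerts
```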

Product Metric Goals (2026)

| Metric | Current | Q1 2026 | Q2 2026 | Q4 2026 |
|---|---|---|---|---|
| DAS % | 75% | 77% | 80% | 82% |
| WAU | 11,240 | 13,500 | 16,000 | 20,000 |
| Feature Adoption (5+) | 60% | 65% | 70% | 75% |
| AI Usage Rate | 70% | 75% | 80% | 85% |
| Platform Uptime | 99.94% | 99.95% | 99.96% | 99.97% |
| Avg Session Duration | 19.8 min | 21 min | 23 min | 25 min |
| Features/Session | 2.4 | 2.6 | 2.8 | 3.0 |

YeboLearn - Empowering African Education