Product Metrics
YeboLearn's product metrics track how schools, teachers, and students engage with our platform. They inform product decisions, help prioritize the roadmap, and surface opportunities for improvement.
Engagement Metrics
Daily Active Schools (DAS)
Definition: Unique schools with at least one user login AND feature usage in a 24-hour period
Current Performance: 109 schools/day (75% of active customer base)
Target: 75%+ of total active schools (maintaining current level)
DAS Calculation:
Daily Active School criteria:
1. At least 1 teacher or student login
2. At least 1 feature interaction (not just dashboard view)
3. Activity timestamp within 24-hour UTC period
Excludes: API-only activity, automated tasks, and admin-only logins
DAS Trends (Last 30 Days):
| Week | DAS | Active Schools | DAS % | Week-over-Week |
|---|---|---|---|---|
| Week 1 (Nov 1-7) | 106 | 143 | 74% | - |
| Week 2 (Nov 8-14) | 108 | 144 | 75% | +1.9% |
| Week 3 (Nov 15-21) | 109 | 145 | 75% | +0.9% |
| Week 4 (Nov 22-28) | 111 | 145 | 77% | +1.8% |
| Average | 109 | 144 | 75% | +1.5% |
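The DAS criteria defined above can be evaluated directly from an event log. A minimal sketch, assuming a hypothetical list of event records with `school_id`, `user_role`, `event_type`, and `timestamp` fields (names are illustrative, not our production schema):

```python
from datetime import date, datetime, timezone

# Hypothetical event records; field names are illustrative only.
events = [
    {"school_id": "sch_001", "user_role": "teacher", "event_type": "login",
     "timestamp": datetime(2025, 11, 24, 8, 5, tzinfo=timezone.utc)},
    {"school_id": "sch_001", "user_role": "teacher", "event_type": "feature_use",
     "timestamp": datetime(2025, 11, 24, 8, 12, tzinfo=timezone.utc)},
]

# Activity that never counts toward DAS (dashboard views are criterion 2's exclusion).
EXCLUDED_EVENTS = {"api_call", "automated_task", "admin_login", "dashboard_view"}

def daily_active_schools(events, day):
    """Schools with >=1 teacher/student login AND >=1 feature interaction
    within the given UTC day, ignoring excluded activity."""
    logins, feature_uses = set(), set()
    for e in events:
        if e["timestamp"].date() != day or e["event_type"] in EXCLUDED_EVENTS:
            continue
        if e["event_type"] == "login" and e["user_role"] in ("teacher", "student"):
            logins.add(e["school_id"])
        elif e["event_type"] == "feature_use":
            feature_uses.add(e["school_id"])
    return logins & feature_uses  # both criteria must hold for the same school

das = daily_active_schools(events, date(2025, 11, 24))
print(len(das))  # -> 1
```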
DAS by Day of Week:
| Day | Avg DAS | % of Schools | Pattern |
|---|---|---|---|
| Monday | 118 | 81% | Highest (week planning) |
| Tuesday | 115 | 79% | High (full teaching) |
| Wednesday | 112 | 77% | High (mid-week) |
| Thursday | 110 | 76% | Moderate |
| Friday | 98 | 68% | Lower (week winding down) |
| Saturday | 42 | 29% | Low (weekend) |
| Sunday | 38 | 26% | Lowest (weekend prep) |
Insights:
- Strong weekday engagement (Monday-Thursday)
- Weekend usage primarily from teachers planning next week
- Opportunity: Increase Friday engagement with weekly recap features
DAS by Subscription Tier:
| Tier | Schools | Avg DAS | DAS % | Engagement Level |
|---|---|---|---|---|
| Enterprise | 12 | 12 | 100% | Exceptional |
| Professional | 85 | 72 | 85% | Strong |
| Essentials | 48 | 25 | 52% | Moderate |
| Total | 145 | 109 | 75% | Good |
Strategic Priority: Increase Essentials tier engagement through better onboarding and feature discovery
Weekly Active Users (WAU)
Definition: Unique teachers and students with platform interaction in 7-day period
Current Performance: 11,240 WAU (total across all schools)
Target: 10,000+ WAU, growing with customer acquisition
WAU Composition:
| User Type | Weekly Active | Total Users | Activity Rate |
|---|---|---|---|
| Teachers | 1,860 | 2,320 | 80% |
| Students | 9,380 | 14,850 | 63% |
| Total | 11,240 | 17,170 | 65% |
WAU Growth Trends:
| Month | WAU | Total Users | Activity Rate | MoM Growth |
|---|---|---|---|---|
| Aug 2025 | 9,850 | 15,200 | 65% | - |
| Sep 2025 | 10,420 | 16,150 | 65% | +5.8% |
| Oct 2025 | 10,880 | 16,800 | 65% | +4.4% |
| Nov 2025 | 11,240 | 17,170 | 65% | +3.3% |
Target: Maintain 65%+ activity rate while growing total users
Teacher vs Student Engagement Patterns:
| Metric | Teachers | Students |
|---|---|---|
| Avg sessions/week | 8.5 | 4.2 |
| Avg time per session | 24 min | 18 min |
| Features used/week | 4.8 | 2.1 |
| Peak usage time | 8am-10am | 2pm-4pm |
| Most used feature | Lesson Planner | Quiz Generator |
Monthly Active Users (MAU)
Definition: Unique users with platform interaction in 30-day period
Current Performance: 13,680 MAU
Target: 80%+ of total registered users
MAU/WAU Ratio: 1.22 (indicates healthy weekly engagement)
- Ratio >1.5: Low engagement, users only active occasionally
- Ratio 1.2-1.5: Good engagement, users active multiple times per month
- Ratio <1.2: Exceptional engagement, most users active weekly
Stickiness (DAU/MAU Ratio)
Definition: Daily Active Users / Monthly Active Users (measures habit formation)
Current Performance: 42% stickiness
Target: >40% (current performance is strong)
Stickiness Calculation:
Average DAU (monthly): 5,746 users
MAU: 13,680 users
Stickiness = (5,746 / 13,680) × 100 = 42%
Industry Benchmarks:
- <20%: Low stickiness, users engage occasionally
- 20-40%: Moderate stickiness, regular usage
- 40%+: High stickiness, daily habit formed (YeboLearn is here)
Stickiness by User Type:
- Teachers: 58% (use platform almost daily)
- Students: 38% (use 2-3 times per week)
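Both ratio metrics in this section reduce to simple divisions over the published figures. A minimal sketch:

```python
# Inputs are the figures reported above.
avg_dau = 5_746      # average daily active users over the month
wau = 11_240         # weekly active users
mau = 13_680         # monthly active users

stickiness = avg_dau / mau     # DAU/MAU: measures habit formation
mau_wau_ratio = mau / wau      # MAU/WAU: monthly vs weekly reach

print(f"Stickiness: {stickiness:.0%}")        # -> 42%
print(f"MAU/WAU ratio: {mau_wau_ratio:.2f}")  # -> 1.22
```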
Feature Adoption Metrics
Overall Feature Adoption Rate
Definition: Percentage of schools using 5 or more features monthly
Current Performance: 87 schools (60% of active schools)
Target: 60%+ schools using ≥5 features
Feature Adoption Distribution:
| Features Used | Schools | % of Total | Engagement Level |
|---|---|---|---|
| 1-2 features | 18 | 12% | Low (at-risk) |
| 3-4 features | 40 | 28% | Moderate |
| 5-7 features | 52 | 36% | Good |
| 8-10 features | 25 | 17% | High |
| 11+ features | 10 | 7% | Power users |
Correlation with Retention:
- 1-2 features: 45% annual retention
- 3-4 features: 72% annual retention
- 5-7 features: 91% annual retention
- 8+ features: 98% annual retention
Insight: Schools using 5+ features retain at roughly twice the rate of schools using only 1-2 features (91% vs 45%)
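A minimal sketch of the bucketing above, assuming a hypothetical mapping of schools to the distinct features they used this month (IDs and feature names are illustrative):

```python
# Hypothetical per-school monthly feature usage.
features_by_school = {
    "sch_001": {"lesson_planner", "quiz_generator", "auto_grading",
                "progress_tracking", "resource_library"},
    "sch_002": {"lesson_planner", "quiz_generator"},
}

# Buckets mirror the distribution table above.
BUCKETS = [(1, 2, "Low (at-risk)"), (3, 4, "Moderate"), (5, 7, "Good"),
           (8, 10, "High"), (11, float("inf"), "Power users")]

def adoption_bucket(feature_count):
    for lo, hi, label in BUCKETS:
        if lo <= feature_count <= hi:
            return label
    return "Inactive"

for school, feats in features_by_school.items():
    print(school, len(feats), adoption_bucket(len(feats)))
# sch_001 -> Good; sch_002 -> Low (at-risk)

# Headline metric: share of schools using 5+ features
adopters = sum(len(f) >= 5 for f in features_by_school.values())
print(f"5+ feature adoption: {adopters / len(features_by_school):.0%}")
```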
Feature Usage Breakdown
Top 10 Features by Adoption (30-day active usage):
| Feature | Schools Using | % Adoption | Avg Uses/Week | Power Users |
|---|---|---|---|---|
| 1. AI Lesson Planner | 128 | 88% | 12.4 | 85 schools |
| 2. Smart Quiz Generator | 118 | 81% | 8.7 | 72 schools |
| 3. Auto-Grading | 112 | 77% | 15.2 | 68 schools |
| 4. Curriculum Alignment | 106 | 73% | 6.3 | 58 schools |
| 5. Resource Library | 98 | 68% | 4.8 | 42 schools |
| 6. Progress Tracking | 95 | 66% | 18.6 | 65 schools |
| 7. Plagiarism Checker | 89 | 61% | 5.2 | 38 schools |
| 8. Parent Portal | 78 | 54% | 3.1 | 28 schools |
| 9. Student Analytics | 72 | 50% | 7.8 | 35 schools |
| 10. Live Collaboration | 68 | 47% | 2.9 | 22 schools |
Power User Definition: A school using the feature at least twice as often as the average adopting school
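A minimal sketch of that definition, assuming hypothetical weekly use counts for a single feature:

```python
# Hypothetical weekly use counts per school for one feature.
weekly_uses = {"sch_001": 28, "sch_002": 9, "sch_003": 12, "sch_004": 5}

avg_uses = sum(weekly_uses.values()) / len(weekly_uses)  # 13.5 uses/week
power_users = [s for s, n in weekly_uses.items() if n >= 2 * avg_uses]
print(power_users)  # -> ['sch_001'] (28 >= 2 * 13.5)
```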
Feature Launch Performance (Last 3 Launches):
| Feature | Launch Date | 30-Day Adoption | 60-Day Adoption | Target | Status |
|---|---|---|---|---|---|
| Live Collaboration | Sep 1, 2025 | 35% | 47% | 40% | Exceeded |
| Student Analytics | Jul 15, 2025 | 28% | 50% | 45% | Exceeded |
| Parent Portal | Jun 1, 2025 | 42% | 54% | 50% | Exceeded |
Target: >40% adoption within 60 days of feature launch
AI Feature Usage Metrics
Definition: Percentage of schools using AI-powered features weekly
Current Performance: 102 schools (70% of active schools)
Target: 70%+ schools using AI features weekly
AI Feature Breakdown:
| AI Feature | Weekly Users | % of Schools | Avg Uses/User/Week |
|---|---|---|---|
| AI Lesson Planner | 118 | 81% | 3.8 |
| Smart Quiz Generator | 108 | 75% | 2.4 |
| Auto-Grading | 95 | 66% | 8.2 |
| Plagiarism Checker | 78 | 54% | 1.6 |
| Writing Assistant | 72 | 50% | 2.1 |
| Content Recommendations | 68 | 47% | 5.3 |
| Any AI Feature | 102 | 70% | 23.4 |
AI Usage Intensity (schools using AI features):
| Intensity Level | Schools | % of AI Users | AI Uses/Week |
|---|---|---|---|
| Light (1-10 uses) | 28 | 27% | 5.2 |
| Medium (11-25 uses) | 45 | 44% | 17.8 |
| Heavy (26-50 uses) | 22 | 22% | 36.4 |
| Power (51+ uses) | 7 | 7% | 72.6 |
AI Usage Growth:
| Month | Schools Using AI | % Adoption | Total AI Uses | MoM Growth |
|---|---|---|---|---|
| Aug 2025 | 88 | 62% | 78,400 | - |
| Sep 2025 | 94 | 65% | 85,200 | +8.7% |
| Oct 2025 | 98 | 68% | 91,800 | +7.7% |
| Nov 2025 | 102 | 70% | 98,500 | +7.3% |
Target: 80% AI adoption by Q2 2026
Feature Engagement Score
Definition: Composite score measuring breadth and depth of feature usage
Calculation:
Feature Engagement Score =
(# of features used × 10) +
(total feature interactions / 10) +
(AI feature usage × 5)
Example for a high-engagement school:
(12 features × 10) + (840 interactions / 10) + (65 AI uses × 5)
= 120 + 84 + 325 = 529 points
School Segmentation by Engagement Score:
| Segment | Score Range | Schools | Avg MRR | Retention |
|---|---|---|---|---|
| Power Users | 400+ | 18 | $3,200 | 98% |
| High Engagement | 250-399 | 45 | $2,100 | 94% |
| Medium Engagement | 150-249 | 52 | $1,650 | 88% |
| Low Engagement | 50-149 | 22 | $1,100 | 68% |
| At-Risk | <50 | 8 | $850 | 42% |
Strategic Action: Focus on moving "Low Engagement" schools to "Medium" (targeting 150+ score)
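A minimal sketch of the scoring formula and segmentation above; the thresholds mirror the segmentation table:

```python
def engagement_score(features_used, interactions, ai_uses):
    """Composite score from the formula above."""
    return features_used * 10 + interactions / 10 + ai_uses * 5

def segment(score):
    if score >= 400: return "Power Users"
    if score >= 250: return "High Engagement"
    if score >= 150: return "Medium Engagement"
    if score >= 50:  return "Low Engagement"
    return "At-Risk"

# Reproduces the worked example above.
score = engagement_score(features_used=12, interactions=840, ai_uses=65)
print(score, segment(score))  # -> 529.0 Power Users
```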
Platform Performance Metrics
Platform Uptime
Definition: Percentage of time platform is available and responsive
Current Performance: 99.94% uptime (last 30 days)
Target: 99.9%+ uptime (SLA commitment)
Uptime Tracking:
| Month | Uptime % | Downtime (min) | Incidents | MTTR |
|---|---|---|---|---|
| Aug 2025 | 99.89% | 48 min | 2 | 24 min |
| Sep 2025 | 99.96% | 17 min | 1 | 17 min |
| Oct 2025 | 99.92% | 35 min | 2 | 18 min |
| Nov 2025 | 99.94% | 26 min | 1 | 26 min |
MTTR = Mean Time to Recovery
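A minimal sketch reproducing the November row from the raw figures:

```python
# November figures from the table above: one incident, 26 minutes of downtime.
days_in_month = 30
total_minutes = days_in_month * 24 * 60      # 43,200 minutes
downtime_minutes = 26
incidents = 1

uptime = (total_minutes - downtime_minutes) / total_minutes
mttr = downtime_minutes / incidents          # mean time to recovery

print(f"Uptime: {uptime:.2%}")   # -> 99.94%
print(f"MTTR: {mttr:.0f} min")   # -> 26 min
```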
SLA Tiers:
- Enterprise: 99.95% uptime guarantee (5 schools impacted by Nov incident)
- Professional: 99.9% uptime target
- Essentials: 99.5% uptime (best effort)
Incident Response:
- <15 min: Excellent recovery
- 15-30 min: Acceptable
- >30 min: Needs improvement
Page Load Times
Definition: Time from request to fully loaded page
Current Performance: 1.8s average load time
Target: <2.0s average, <3.0s for 95th percentile
Load Time Distribution (November):
| Percentile | Load Time | Target | Status |
|---|---|---|---|
| 50th (Median) | 1.4s | <1.5s | Good |
| 75th | 2.1s | <2.5s | Good |
| 95th | 2.9s | <3.0s | Good |
| 99th | 4.2s | <5.0s | Good |
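Percentile figures like these are computed from raw timing samples rather than averages. A minimal sketch using synthetic data (real inputs would come from browser telemetry):

```python
import random
import statistics

# Synthetic load-time samples in seconds; distribution parameters are illustrative.
random.seed(42)
samples = [random.lognormvariate(0.3, 0.4) for _ in range(10_000)]

# statistics.quantiles with n=100 yields the 1st..99th percentile cut points.
pct = statistics.quantiles(samples, n=100)
print(f"p50={statistics.median(samples):.2f}s  "
      f"p75={pct[74]:.2f}s  p95={pct[94]:.2f}s  p99={pct[98]:.2f}s")
```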
Load Time by Feature:
| Feature | Avg Load | 95th Percentile | Status |
|---|---|---|---|
| Dashboard | 1.2s | 2.1s | Excellent |
| Lesson Planner | 2.4s | 3.8s | Good |
| Quiz Generator | 3.1s | 4.5s | Needs improvement |
| Resource Library | 1.8s | 2.6s | Good |
| Student Analytics | 2.9s | 4.2s | Acceptable |
Action Item: Optimize Quiz Generator load time (target: <2.5s average)
API Response Times
Definition: Time for API to respond to requests
Current Performance: 280ms average response time
Target: <300ms average, <500ms for 95th percentile
API Performance:
| Endpoint Category | Avg Response | 95th Percentile | Requests/Day |
|---|---|---|---|
| Authentication | 120ms | 185ms | 45,000 |
| Content Retrieval | 220ms | 380ms | 128,000 |
| AI Generation | 1,850ms | 3,200ms | 18,500 |
| Data Analytics | 450ms | 680ms | 12,400 |
| File Upload | 680ms | 1,100ms | 8,200 |
Note: AI generation endpoints have higher latency because they depend on external API calls (e.g., OpenAI)
Error Rates
Definition: Percentage of requests resulting in errors
Current Performance: 0.12% error rate
Target: <0.5% error rate
Error Breakdown (November):
| Error Type | Count | % of Total | Impact Level |
|---|---|---|---|
| 5xx Server Errors | 1,240 | 58% | High |
| 4xx Client Errors | 720 | 34% | Medium |
| Timeout Errors | 180 | 8% | High |
| Total Errors | 2,140 | 100% | - |
Error Rate Trend:
| Week | Total Requests | Errors | Error Rate |
|---|---|---|---|
| Week 1 | 4.2M | 580 | 0.14% |
| Week 2 | 4.5M | 520 | 0.12% |
| Week 3 | 4.4M | 480 | 0.11% |
| Week 4 | 4.6M | 560 | 0.12% |
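The rate itself is a simple division, usually paired with a threshold check against the target. A minimal sketch with hypothetical tallies:

```python
# Hypothetical request/error tallies for one reporting window.
errors, requests = 1_200, 1_000_000

TARGET = 0.005  # <0.5% error-rate target from above
rate = errors / requests
print(f"Error rate: {rate:.2%} ({'within' if rate < TARGET else 'over'} target)")
# -> Error rate: 0.12% (within target)
```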
Top Error Sources:
- AI API timeouts (OpenAI rate limits) - 32% of errors
- Database connection pool exhaustion - 18% of errors
- File upload size limits exceeded - 12% of errors
- Invalid authentication tokens - 11% of errors
User Behavior Metrics
Session Duration
Definition: Average time users spend in platform per session
Current Performance:
- Teachers: 24 minutes average
- Students: 18 minutes average
- Overall: 19.8 minutes average
Target: Maintain >15 minutes average (indicates meaningful engagement)
Session Duration by Feature:
| Feature | Avg Session Duration | Sessions/Week | Total Time/Week |
|---|---|---|---|
| Lesson Planner | 32 min | 4,850 | 2,587 hours |
| Auto-Grading | 18 min | 6,200 | 1,860 hours |
| Quiz Generator | 22 min | 3,840 | 1,408 hours |
| Progress Tracking | 12 min | 5,100 | 1,020 hours |
| Resource Library | 15 min | 2,900 | 725 hours |
Total Weekly Active Learning Hours: 12,840 hours across 145 schools = 88.6 hours/school/week
Monthly Average: 385 hours/school/month (target: 800 hours/school/month)
Opportunity: Increase session frequency and duration to reach target learning hours
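The per-school figures above can be reproduced from the published totals; the monthly figure assumes an average-weeks-per-month conversion (365/12/7 ≈ 4.35), which matches the stated 385 hours:

```python
weekly_hours_total = 12_840
active_schools = 145

per_school_weekly = weekly_hours_total / active_schools   # ~88.6 h/school/week
weeks_per_month = 365 / 12 / 7                            # ~4.35 weeks/month on average
per_school_monthly = per_school_weekly * weeks_per_month  # ~385 h/school/month

print(f"{per_school_weekly:.1f} h/week, {per_school_monthly:.0f} h/month")
```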
Features Per Session
Definition: Average number of distinct features used in single session
Current Performance: 2.4 features per session
Target: >2.0 features per session (indicates feature discovery and cross-usage)
Session Patterns:
| Features Used | % of Sessions | Avg Duration | User Satisfaction |
|---|---|---|---|
| 1 feature | 42% | 12 min | 3.2/5 |
| 2 features | 31% | 18 min | 3.8/5 |
| 3 features | 18% | 26 min | 4.2/5 |
| 4+ features | 9% | 38 min | 4.6/5 |
Insight: Multi-feature sessions have higher satisfaction and longer duration
Feature Discovery Rate
Definition: Percentage of users discovering new features each month
Current Performance: 28% of users try a new feature monthly
Target: >25% monthly feature discovery
Discovery Methods:
| Method | % of Discoveries | Effectiveness |
|---|---|---|
| In-app prompts | 42% | High |
| Email campaigns | 25% | Medium |
| Onboarding wizard | 18% | High |
| User exploration | 12% | Low |
| Support team | 3% | Medium |
Strategic Focus: Increase in-app feature prompts and improve onboarding wizard
Retention Metrics
User Retention Cohorts
7-Day Retention (users active 7 days after first session):
| Signup Month | New Users | Day 7 Active | Retention % |
|---|---|---|---|
| Aug 2025 | 1,840 | 1,288 | 70% |
| Sep 2025 | 2,120 | 1,590 | 75% |
| Oct 2025 | 2,280 | 1,789 | 78% |
| Nov 2025 | 1,920 | 1,536 | 80% |
30-Day Retention (users active 30 days after first session):
| Signup Month | New Users | Day 30 Active | Retention % |
|---|---|---|---|
| Jul 2025 | 1,680 | 1,025 | 61% |
| Aug 2025 | 1,840 | 1,195 | 65% |
| Sep 2025 | 2,120 | 1,421 | 67% |
| Oct 2025 | 2,280 | 1,573 | 69% |
Target: 75%+ 7-day retention, 65%+ 30-day retention
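A minimal sketch of the cohort computation, assuming hypothetical per-user records of signup date and the dates they were active:

```python
from datetime import date, timedelta

# Hypothetical per-user records; dates are illustrative only.
users = [
    {"signup": date(2025, 11, 3), "active": {date(2025, 11, 3), date(2025, 11, 10)}},
    {"signup": date(2025, 11, 3), "active": {date(2025, 11, 3)}},
]

def retention(users, offset_days, window=1):
    """Share of users active within `window` days starting `offset_days` after signup."""
    retained = 0
    for u in users:
        start = u["signup"] + timedelta(days=offset_days)
        if any(start <= d < start + timedelta(days=window) for d in u["active"]):
            retained += 1
    return retained / len(users)

print(f"Day-7 retention: {retention(users, 7):.0%}")  # -> 50%
```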
Feature-Specific Retention
Retention Rate by First Feature Used:
| First Feature | 7-Day Retention | 30-Day Retention |
|---|---|---|
| AI Lesson Planner | 82% | 72% |
| Quiz Generator | 76% | 68% |
| Resource Library | 68% | 58% |
| Progress Tracking | 72% | 65% |
| Curriculum Alignment | 79% | 71% |
Insight: Users starting with AI features have higher retention
Monitoring and Alerts
Product Health Alerts
Critical (Immediate Action):
- Platform uptime <99.5% for >15 minutes
- Error rate >1% for >5 minutes
- DAS decline >15% day-over-day
- Major feature completely unavailable
Warning (Review Within 2 Hours):
- Load time >5s for 95th percentile
- Error rate 0.5-1% sustained for >30 minutes
- DAS decline 10-15% day-over-day
- Feature adoption decline >10% week-over-week
Opportunity (Weekly Review):
- Feature adoption increase >20% week-over-week
- New feature usage pattern detected
- Power user segment growth >10%
- Session duration increase >15%
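These thresholds lend themselves to simple rule evaluation. A minimal sketch, assuming a hypothetical metrics snapshot and omitting the sustained-duration conditions for brevity:

```python
# Hypothetical snapshot of current values; rules mirror the threshold lists above.
metrics = {"uptime_pct": 99.97, "error_rate_pct": 0.12, "das_dod_change_pct": -4.0}

CRITICAL_RULES = [
    ("uptime_pct", lambda v: v < 99.5, "Platform uptime below 99.5%"),
    ("error_rate_pct", lambda v: v > 1.0, "Error rate above 1%"),
    ("das_dod_change_pct", lambda v: v < -15.0, "DAS down >15% day-over-day"),
]
WARNING_RULES = [
    ("error_rate_pct", lambda v: 0.5 <= v <= 1.0, "Error rate 0.5-1%"),
    ("das_dod_change_pct", lambda v: -15.0 <= v < -10.0, "DAS down 10-15% day-over-day"),
]

def evaluate(metrics):
    alerts = []
    for severity, rules in (("CRITICAL", CRITICAL_RULES), ("WARNING", WARNING_RULES)):
        for key, check, msg in rules:
            if check(metrics[key]):
                alerts.append((severity, msg))
    return alerts

print(evaluate(metrics))  # -> [] (all healthy in this snapshot)
```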
Product Metric Goals (2026)
| Metric | Current | Q1 2026 | Q2 2026 | Q4 2026 |
|---|---|---|---|---|
| DAS % | 75% | 77% | 80% | 82% |
| WAU | 11,240 | 13,500 | 16,000 | 20,000 |
| Feature Adoption (5+) | 60% | 65% | 70% | 75% |
| AI Usage Rate | 70% | 75% | 80% | 85% |
| Platform Uptime | 99.94% | 99.95% | 99.96% | 99.97% |
| Avg Session Duration | 19.8 min | 21 min | 23 min | 25 min |
| Features/Session | 2.4 | 2.6 | 2.8 | 3.0 |
Next Steps
- Growth Metrics - Acquisition, activation, and expansion metrics
- Product Analytics - Deep-dive into user behavior and feature performance
- User Analytics - User segmentation and journey analysis
- Feature Analytics - Feature-specific performance tracking