Product Analytics Overview
YeboLearn's product analytics framework provides deep insights into how schools, teachers, and students interact with our platform. This data drives product decisions, feature prioritization, and user experience improvements.
Product Analytics Philosophy
Our product analytics approach is built on three core principles:
- User-Centric: Understand behavior, not just metrics
- Actionable: Every insight should inform product decisions
- Holistic: Track the complete user journey from signup to power user
Analytics Framework
Data Collection Strategy
Event Tracking Architecture:
User Actions → Segment → Data Warehouse (BigQuery) → Analytics Tools
Downstream analytics tools fed by the warehouse:
- Mixpanel (Real-time Product Analytics)
- Metabase (Business Intelligence)
- Custom Dashboards (Executive Views)

Event Categories:
| Category | Events Tracked | Purpose | Volume/Day |
|---|---|---|---|
| Authentication | Login, logout, session | User access patterns | 28,000 |
| Feature Usage | Feature opens, interactions, completions | Adoption tracking | 145,000 |
| Content Creation | Lessons, quizzes, resources created | Value delivery | 38,000 |
| AI Interactions | AI requests, completions, ratings | AI usage patterns | 52,000 |
| Engagement | Page views, clicks, time on page | Platform navigation | 285,000 |
| Collaboration | Shares, comments, invites | Social features | 12,000 |
| Administrative | Settings changes, permissions | Platform management | 8,500 |
Total Events Tracked Daily: ~568,500 events
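For a concrete view of the pipeline above, the sketch below shows how a single feature-usage event might be sent into Segment from application code using Segment's official Python library. The write key, event name, and property names are illustrative placeholders, not YeboLearn's actual tracking plan.

```python
# Minimal sketch: emitting a feature-usage event into the Segment pipeline.
# Uses Segment's official Python library ("analytics-python"); the write key,
# event name, and property names are illustrative placeholders.
import analytics

analytics.write_key = "YOUR_SEGMENT_WRITE_KEY"  # hypothetical key

def track_feature_opened(user_id: str, school_id: str, feature: str) -> None:
    """Send a feature-open event; Segment forwards it to BigQuery and Mixpanel."""
    analytics.track(
        user_id,
        "Feature Opened",            # hypothetical event name
        {
            "school_id": school_id,  # supports school-level segmentation
            "feature": feature,      # e.g. "quiz_generator"
            "platform": "desktop_web",
        },
    )

track_feature_opened("user_123", "school_456", "quiz_generator")
analytics.flush()  # deliver queued events before the process exits
```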
User Segmentation
Primary Segments:
| Segment | Definition | Size | Key Metrics |
|---|---|---|---|
| Power Users | 8+ features, daily usage, high AI adoption | 18 schools | 98% retention, $3,200 ARPU |
| High Engagement | 5-7 features, 4+ days/week usage | 45 schools | 94% retention, $2,100 ARPU |
| Medium Engagement | 3-4 features, 2-3 days/week usage | 52 schools | 88% retention, $1,650 ARPU |
| Low Engagement | 1-2 features, <2 days/week usage | 22 schools | 68% retention, $1,100 ARPU |
| At-Risk | Declining usage, <1 feature/week | 8 schools | 42% retention, $850 ARPU |
Engagement Correlation: Direct relationship between engagement level and retention/ARPU
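The primary segment definitions above reduce to a small set of rules. The sketch below encodes them as an illustrative Python function; the thresholds mirror the table, while the function and argument names are hypothetical, not production code.

```python
# Illustrative encoding of the primary segment rules above. Thresholds mirror
# the table; the function and argument names are hypothetical.
def classify_school(features_used: int, usage_days_per_week: float,
                    usage_declining: bool) -> str:
    """Map a school's usage profile to a primary engagement segment."""
    if usage_declining and features_used < 1:
        return "At-Risk"              # declining usage, <1 feature/week
    if features_used >= 8 and usage_days_per_week >= 5:
        return "Power Users"          # 8+ features, daily usage
    if features_used >= 5 and usage_days_per_week >= 4:
        return "High Engagement"      # 5-7 features, 4+ days/week
    if features_used >= 3 and usage_days_per_week >= 2:
        return "Medium Engagement"    # 3-4 features, 2-3 days/week
    return "Low Engagement"           # 1-2 features, <2 days/week

print(classify_school(9, 5, False))   # -> "Power Users"
```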
Secondary Segments:
- By Role: Teachers, Students, Administrators, Parents
- By School Type: Private, Public, Charter, Religious
- By Region: Gauteng, Western Cape, KZN, Eastern Cape, Other
- By Tier: Enterprise, Professional, Essentials
- By Tenure: New (<3 months), Growing (3-12 months), Mature (12+ months)
Behavior Tracking
User Journey Tracking:
Signup → Onboarding → First Value → Regular Usage → Power User
Share of signups reaching each stage: 100% → 95% → 80% → 65% → 12%

Key Conversion Points:
- Signup → Onboarding: 95% (5% never complete setup)
- Onboarding → First Value: 84% (reach 5+ features in 14 days)
- First Value → Regular Usage: 81% (become weekly active)
- Regular Usage → Power User: 18% (adopt 8+ features, daily usage)
Journey Optimization Focus: Improve "First Value → Regular Usage" (currently 81%, target 85%)
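The conversion rates above are simple stage-to-stage ratios of the users reaching each journey stage. A minimal sketch of the calculation, with illustrative user counts chosen to reproduce the published percentages:

```python
# Sketch: stage-to-stage conversion through the journey funnel.
# Stage order matches the journey above; user counts are illustrative.
funnel = {
    "Signup": 10_000,
    "Onboarding": 9_500,
    "First Value": 8_000,
    "Regular Usage": 6_500,
    "Power User": 1_200,
}

stages = list(funnel)
for prev, curr in zip(stages, stages[1:]):
    rate = funnel[curr] / funnel[prev]
    print(f"{prev} → {curr}: {rate:.0%}")
# Signup → Onboarding: 95%
# Onboarding → First Value: 84%
# First Value → Regular Usage: 81%
# Regular Usage → Power User: 18%
```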
Feature Performance Tracking
Feature Health Metrics:
| Metric | Definition | Measurement |
|---|---|---|
| Adoption Rate | % of schools using feature monthly | Monthly calculation |
| Engagement Rate | % of adopters using feature weekly | Weekly calculation |
| Stickiness | DAU/MAU ratio for feature | Daily tracking |
| Retention Impact | Retention difference with/without feature | Cohort analysis |
| Time to Adoption | Days from signup to first use | User journey tracking |
| Depth of Use | Actions per session with feature | Session analysis |
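As an example of how one of these metrics is computed, the sketch below derives feature stickiness (DAU/MAU) from a raw event log with pandas; the column names (`feature`, `timestamp`, `user_id`) are assumed names, not the real warehouse schema.

```python
# Sketch: feature stickiness (DAU/MAU) from a raw event log.
# Assumes one row per feature event; column names are illustrative.
import pandas as pd

def feature_stickiness(events: pd.DataFrame, feature: str, month: str) -> float:
    """Average DAU divided by MAU for one feature in a calendar month, e.g. "2025-05"."""
    f = events[(events["feature"] == feature)
               & (events["timestamp"].dt.strftime("%Y-%m") == month)]
    if f.empty:
        return 0.0
    daily_active = f.groupby(f["timestamp"].dt.date)["user_id"].nunique()
    monthly_active = f["user_id"].nunique()
    return daily_active.mean() / monthly_active
```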
Feature Lifecycle Stages:
- Launch (0-30 days): Track initial adoption, identify early issues
- Growth (30-90 days): Monitor adoption curve, optimize onboarding
- Maturity (90+ days): Measure steady-state usage, plan enhancements
- Decline (usage decreasing): Identify causes, plan refresh or sunset
Analytics Tools and Platforms
Mixpanel (Product Analytics)
Primary Use Cases:
- Real-time feature usage tracking
- User funnel analysis
- Cohort retention analysis
- A/B test result tracking
- User path analysis
Key Dashboards:
- Daily Product Health (engagement, adoption, errors)
- Feature Performance (individual feature deep-dives)
- User Cohort Analysis (retention, behavior patterns)
- Conversion Funnels (signup, onboarding, feature adoption)
Data Retention: 5 years of event data
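Mixpanel produces cohort retention reports natively; for readers who want to reproduce the calculation against the warehouse, the sketch below shows the equivalent weekly retention matrix in pandas. The DataFrame columns (`user_id`, `timestamp`) are assumptions.

```python
# Sketch: weekly cohort retention from a usage event log, mirroring the
# calculation behind Mixpanel's retention reports. Column names are illustrative.
import pandas as pd

def weekly_retention(events: pd.DataFrame) -> pd.DataFrame:
    """Rows = signup cohort week, columns = weeks since signup, values = share retained."""
    events = events.copy()
    events["week"] = events["timestamp"].dt.to_period("W")
    cohort = events.groupby("user_id")["week"].min().rename("cohort_week")
    events = events.join(cohort, on="user_id")
    events["weeks_since"] = (events["week"] - events["cohort_week"]).apply(lambda d: d.n)
    counts = (events.groupby(["cohort_week", "weeks_since"])["user_id"]
                    .nunique().unstack(fill_value=0))
    return counts.div(counts[0], axis=0)  # normalise by cohort size (week 0)
```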
Metabase (Business Intelligence)
Primary Use Cases:
- Custom SQL queries on data warehouse
- Business metric dashboards
- Cross-functional reporting
- Executive summaries
- Financial analytics integration
Key Dashboards:
- Executive Dashboard (high-level KPIs)
- Revenue Analytics (MRR, churn, expansion)
- School Health Scores (engagement + financial metrics)
- Product Roadmap Impact (feature ROI analysis)
Update Frequency: Real-time for critical metrics, hourly for standard reports
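Metabase dashboards are backed by SQL run directly against the warehouse. For ad-hoc analysis outside Metabase, the same kind of query can be run from Python with the `google-cloud-bigquery` client, as sketched below; the project, dataset, table, and column names are hypothetical placeholders for the real schema.

```python
# Sketch: the kind of ad-hoc warehouse query Metabase dashboards are built on,
# run here via the google-cloud-bigquery client. All identifiers are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="yebolearn-analytics")  # hypothetical project id

query = """
    SELECT school_id,
           COUNT(DISTINCT user_id) AS weekly_active_users,
           COUNT(*)                AS events
    FROM `yebolearn-analytics.events.feature_usage`
    WHERE event_timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    GROUP BY school_id
    ORDER BY weekly_active_users DESC
"""

df = client.query(query).to_dataframe()  # dataframe export requires the db-dtypes package
print(df.head())
```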
Google Analytics (Website Analytics)
Primary Use Cases:
- Marketing website traffic
- Lead generation tracking
- Content performance
- SEO monitoring
- Conversion rate optimization
Not Used For: In-app product analytics (handled by Mixpanel)
Custom Dashboards (Internal Tools)
Built For:
- Real-time operational monitoring
- Customer Success team workflows
- Sales pipeline integration
- Support ticket correlation with usage
- Engineering performance monitoring
A/B Testing Framework
Testing Philosophy
When to A/B Test:
- New feature releases (test adoption approaches)
- UI/UX changes (test interface variations)
- Onboarding flows (test completion rates)
- Pricing changes (test conversion impact)
- Marketing copy (test messaging effectiveness)
When Not to Test:
- Critical bug fixes (ship immediately)
- Obvious improvements (don't delay value)
- Insufficient traffic (can't reach statistical significance)
- Legal/compliance requirements (no choice)
Testing Process
Standard A/B Test Workflow:
Hypothesis Definition (Day 0)
- Problem statement
- Proposed solution
- Success metrics
- Minimum detectable effect
Test Design (Days 1-2)
- Control vs variant definition
- Sample size calculation
- Test duration estimate
- Randomization strategy
Implementation (Days 3-5)
- Build variant
- Implement tracking
- QA test both versions
- Set up monitoring
Test Execution (Days 6-20)
- Launch to percentage of users
- Monitor for errors
- Track metrics daily
- Ensure sample size reached
Analysis (Days 21-22)
- Statistical significance check
- Secondary metric review
- Segment analysis
- Decision recommendation
Rollout (Days 23-30)
- Winner to 100% of users
- Monitor for issues
- Document learnings
- Archive test results
Statistical Requirements:
- Minimum Sample Size: 1,000 users per variant
- Confidence Level: 95%
- Minimum Test Duration: 14 days (capture weekly patterns)
- Maximum Test Duration: 30 days (avoid metric drift)
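A minimal sketch of how these requirements translate into a sample-size calculation and a post-test significance check, using `statsmodels`. The 80% power assumption and the baseline/lift numbers are illustrative, not taken from this document.

```python
# Sketch: sample-size and significance checks matching the requirements above
# (95% confidence, >=1,000 users per variant). Baseline and lift are illustrative.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize, proportions_ztest

# Required users per variant to detect a lift from 60% -> 65% completion
# at alpha = 0.05 (95% confidence) and 80% power (assumed).
effect = proportion_effectsize(0.60, 0.65)
n_per_variant = NormalIndPower().solve_power(effect_size=effect, alpha=0.05,
                                             power=0.80, ratio=1.0)
print(f"Required users per variant: {n_per_variant:.0f}")

# Significance check once the test has run: two-proportion z-test.
conversions = [650, 610]   # variant, control (illustrative counts)
exposed = [1_000, 1_000]
z_stat, p_value = proportions_ztest(conversions, exposed)
print(f"p-value: {p_value:.3f}  (significant at 95% confidence if < 0.05)")
```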
Active Tests and Results
Recent Test Results (Last 90 Days):
| Test | Variants | Winner | Improvement | Impact |
|---|---|---|---|---|
| Onboarding Flow | 3-step vs 5-step | 3-step | +15% completion | Launched |
| AI Feature Prompts | In-context vs Modal | In-context | +22% adoption | Launched |
| Dashboard Layout | Cards vs List | Cards | +8% engagement | Launched |
| Quiz Generator UI | Wizard vs Form | Wizard | +18% completion | Launched |
| Pricing Page | Feature table vs Comparison | Comparison | +12% signups | Launched |
Current Active Tests:
- Parent Portal Onboarding: Email invite vs In-app notification
- Lesson Planner Templates: 10 templates vs 25 templates
- Upgrade CTA Placement: Dashboard banner vs Feature limit modal
User Behavior Patterns
Session Analysis
Average Session Metrics:
| User Type | Sessions/Week | Avg Duration | Pages/Session | Features/Session |
|---|---|---|---|---|
| Teachers | 8.5 | 24 min | 12.4 | 3.2 |
| Students | 4.2 | 18 min | 6.8 | 1.8 |
| Admins | 3.1 | 32 min | 18.6 | 5.4 |
| Parents | 1.8 | 8 min | 4.2 | 1.2 |
Session Depth Distribution:
| Depth Level | % of Sessions | Avg Duration | Conversion to Next Session |
|---|---|---|---|
| Shallow (1-2 pages) | 28% | 3 min | 42% |
| Medium (3-5 pages) | 38% | 12 min | 68% |
| Deep (6-10 pages) | 24% | 24 min | 85% |
| Very Deep (11+ pages) | 10% | 42 min | 92% |
Insight: Deeper sessions have higher return rates. Focus on encouraging exploration.
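A small sketch of how sessions can be bucketed into these depth bands from a page-view log with pandas; the column names (`session_id`, `page`) are assumptions.

```python
# Sketch: bucketing sessions into the depth bands above from a page-view log.
# Assumes one row per page view; column names are illustrative.
import pandas as pd

def session_depth_distribution(page_views: pd.DataFrame) -> pd.Series:
    """Share of sessions falling into each depth band."""
    pages_per_session = page_views.groupby("session_id")["page"].count()
    bands = pd.cut(pages_per_session,
                   bins=[0, 2, 5, 10, float("inf")],
                   labels=["Shallow (1-2)", "Medium (3-5)",
                           "Deep (6-10)", "Very Deep (11+)"])
    return bands.value_counts(normalize=True).sort_index()
```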
Feature Navigation Patterns
Most Common Feature Paths (Top 5):
Dashboard → Lesson Planner → Quiz Generator (28% of sessions)
- Use case: Complete lesson planning workflow
- Avg completion time: 32 minutes
- Satisfaction: 4.5/5
Dashboard → Auto-Grading → Progress Tracking (18% of sessions)
- Use case: Grading and student monitoring
- Avg completion time: 22 minutes
- Satisfaction: 4.6/5
Dashboard → Resource Library → Lesson Planner (15% of sessions)
- Use case: Resource discovery and lesson creation
- Avg completion time: 28 minutes
- Satisfaction: 4.3/5
Dashboard → Student Analytics → Parent Portal (8% of sessions)
- Use case: Student performance review and parent communication
- Avg completion time: 18 minutes
- Satisfaction: 4.2/5
Dashboard → AI Lesson Planner → Curriculum Alignment (7% of sessions)
- Use case: Standards-aligned lesson creation
- Avg completion time: 26 minutes
- Satisfaction: 4.7/5
Drop-off Points:
- 22% drop-off at Quiz Generator (complexity barrier)
- 18% drop-off at Parent Portal (unclear value)
- 12% drop-off at Student Analytics (information overload)
Time-Based Usage Patterns
Peak Usage Times (UTC+2):
| Time Block | Teacher Usage | Student Usage | Total Events |
|---|---|---|---|
| 6am-8am | High (prep) | Low | 38,000 |
| 8am-10am | Peak (teaching) | Medium | 95,000 |
| 10am-12pm | High (teaching) | Medium | 82,000 |
| 12pm-2pm | Medium (lunch) | Peak (learning) | 68,000 |
| 2pm-4pm | Medium (teaching) | Peak (learning) | 112,000 |
| 4pm-6pm | High (grading) | Medium | 78,000 |
| 6pm-8pm | Medium (planning) | Low | 42,000 |
| 8pm-10pm | Low | Low | 18,000 |
Infrastructure Implications:
- Scale up servers 1:30pm-4:30pm (peak student usage)
- Optimize for mobile during 6am-8am (teachers at home)
- Schedule maintenance 10pm-6am (minimal impact)
Device and Platform Distribution
Access Methods:
| Platform | % of Sessions | Avg Session Duration | Primary User |
|---|---|---|---|
| Desktop Web | 58% | 28 min | Teachers |
| Mobile Web | 28% | 12 min | Teachers (planning) |
| Tablet | 12% | 22 min | Students |
| Mobile App | 2% | 8 min | Parents |
Browser Distribution (Desktop):
- Chrome: 72%
- Safari: 18%
- Firefox: 7%
- Edge: 3%
OS Distribution (Mobile):
- Android: 78%
- iOS: 22%
Platform Optimization Priorities:
- Desktop web (highest usage, teachers)
- Mobile web (growing, important for access)
- Tablet (student learning experience)
- Mobile app (lower priority, limited use case)
User Feedback Integration
In-App Feedback Collection
Feedback Mechanisms:
| Mechanism | Response Rate | Monthly Responses | Quality Score |
|---|---|---|---|
| Feature rating (1-5 stars) | 12% | 4,200 | High |
| NPS survey (quarterly) | 38% | 1,850 | High |
| Bug report button | 2% | 680 | Medium |
| Feature request form | 5% | 1,420 | High |
| Live chat feedback | 8% | 2,240 | Very High |
Feedback Analysis:
- Automated sentiment analysis on all text feedback
- Weekly product team review of themes
- Monthly executive summary of user voice
- Quarterly deep-dive on feature requests
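The sentiment tooling itself isn't covered here; purely as an illustration of automated sentiment scoring on free-text feedback, the sketch below rates sample feedback strings with NLTK's VADER analyzer.

```python
# Sketch: automated sentiment scoring of free-text feedback. NLTK's VADER is
# shown only as an illustration; the feedback strings are invented examples.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
analyzer = SentimentIntensityAnalyzer()

feedback = [
    "The quiz generator saves me hours every week!",
    "Parent portal setup was confusing and slow.",
]
for text in feedback:
    score = analyzer.polarity_scores(text)["compound"]  # -1 (negative) .. +1 (positive)
    label = "positive" if score > 0.05 else "negative" if score < -0.05 else "neutral"
    print(f"{label:>8}  {score:+.2f}  {text}")
```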
User Research Program
Research Activities:
| Activity | Frequency | Participants | Purpose |
|---|---|---|---|
| User interviews | Weekly | 4-6 users | Deep problem understanding |
| Usability testing | Bi-weekly | 6-8 users | Feature validation |
| Survey campaigns | Monthly | 200+ users | Quantitative insights |
| Beta testing | Per feature | 15-20 schools | Early feedback |
| Advisory board | Quarterly | 12 schools | Strategic input |
Research Impact: 75% of major features informed by user research
Data Privacy and Security
Data Collection Principles
What We Track:
- Feature usage and interaction patterns
- Performance metrics (load times, errors)
- Aggregate behavior (anonymized where possible)
- User-initiated feedback
What We Don't Track:
- Student content or work (POPIA/GDPR compliant)
- Personal student information beyond usage
- Third-party tool usage outside our platform
- Teacher grading decisions or comments
Data Retention:
- Usage events: 5 years
- Personal data: Retained while customer active + 90 days
- Aggregated analytics: Indefinite (anonymized)
Compliance
Regulatory Compliance:
- POPIA (Protection of Personal Information Act, South Africa)
- GDPR (for any European users)
- FERPA principles (student privacy, US standard)
- ISO 27001 (information security)
User Controls:
- Opt-out of non-essential analytics
- Data export on request
- Account deletion with data purge
- Usage report access for administrators
Analytics Team Structure
Roles and Responsibilities
Product Analytics Lead:
- Define analytics strategy
- Ensure data quality and governance
- Train team on analytics tools
- Present insights to executive team
Data Analysts (2):
- Build dashboards and reports
- Conduct ad-hoc analysis
- Support A/B test design and analysis
- Create weekly/monthly analytics summaries
Data Engineers (2):
- Maintain data pipeline
- Ensure event tracking accuracy
- Optimize data warehouse performance
- Build custom integrations
Product Managers (All):
- Define feature success metrics
- Review analytics weekly
- Use data to inform roadmap
- Share insights with stakeholders
Success Metrics for Analytics Function
Analytics Team Goals:
| Metric | Current | Target |
|---|---|---|
| Dashboard uptime | 99.8% | 99.9% |
| Data freshness | <15 min | <5 min |
| Metric accuracy | 99.2% | 99.5% |
| Self-service usage | 68% | 80% |
| Time to insight | 2.5 days | 1 day |
| PM satisfaction | 8.2/10 | 9/10 |
Impact Metrics:
- Product decisions backed by data: 85% (target: 90%)
- Features validated with A/B tests: 62% (target: 75%)
- User research participants recruited via analytics: 78%
- Revenue attributed to data-driven decisions: $1.2M ARR
Next Steps
For detailed analytics deep-dives:
- Product Dashboards - Real-time monitoring and executive views
- Feature Analytics - Feature-specific performance tracking
- User Analytics - User segmentation and behavior analysis
- Product Metrics - Core product KPIs and targets