The Coders Guild

Metrics Tree

What gets measured, how it's tracked, where the gaps are, and what should be measured to connect leading indicators to business outcomes.

What success looks like

Business Goals

The outcomes the business is driving towards. Every metric below should connect back to one of these.

Growth
60 apprentices enrolled by end of June 2026
Then 100 ("self-fulfilling machine"), then 200 by Christmas
Current: ~8 live apprentices, ~20 pipeline slots filled (March 2026)
Quality at Scale
Maintain achievement rates as cohort sizes grow
85%+ achievement rate at 60+ apprentices. 100% pass rate sustained.
Current: 85.71% achievement at 25 learners. Quality improved as cohorts got smaller.
Operational Efficiency
Scale without proportional headcount increase
60-80 apprentices before needing new FTE hire. Automate the "paper element".
Current: Almost everything manual. Google workbook (matrix) is the bottleneck.
Headline target: 60 apprentices enrolled by end of June 2026. Cohort economics: 16 per cohort is optimal and 12 is workable; below that, unit economics suffer while delivery effort stays roughly the same.
Outcomes

Lagging Indicators

Results metrics. These tell you how the business performed, but by the time they move it's too late to change course. Every lagging indicator should have leading indicators feeding into it.

85.7%
Achievement Rate (25-26)
100%
Pass Rate (25-26)
85.7%
Retention Rate (25-26)
4%
Withdrawal Rate
?
NPS (not shared)
Quality & Compliance Outcomes
Metric
Current
Target
How Tracked
Status
Achievement Rate (QAR)
% of leavers who achieve qualification
85.71% (25-26)
85%+
PICS/LMI Achievement Rates dashboard. Reviewed monthly at quality meeting. Shelley only.
Tracked
Retention Rate
% of starters who complete (don't withdraw)
85.71% (25-26)
90%+
PICS/LMI QAR dashboard. Year-on-year trend: 89.47% in 24-25.
Tracked
Pass Rate
% of achievers who pass end-point assessment
100% (25-26)
95%+
PICS/LMI QAR dashboard. Shelley monitors and feeds back to quality meeting.
Tracked
Withdrawal Rate
% and count of apprentices who withdraw
4% (1 of 25)
<5%
PICS/LMI AAF Dashboard. Shown as % and absolute number.
Tracked
Past Planned End Date
Apprentices who haven't completed by planned end
8% past 180+ days (2 of 25)
0%
PICS/LMI AAF Dashboard. Split by 90-180 days and 180+ days.
Tracked
NPS (Net Promoter Score)
Aggregate satisfaction, splittable by learner and employer
Tracked but value not shared
-
Monday.com feedback surveys. Reported at quality meeting. Aggregate and by segment.
Tracked
OTJ Hours Compliance
Planned vs actual off-the-job hours
0 issues (0 of 25)
100% compliant
PICS/LMI AAF Dashboard + Laravel App (apprentice self-logging). ~420 hours standard. Shelley also monitors government sites monthly beyond just PICS data.
Tracked
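The NPS row above notes the score is tracked via Monday.com surveys but not shared. For reference, the standard calculation is promoters (9-10) minus detractors (0-6) as a percentage of all responses; a minimal sketch with hypothetical scores (nothing below is real survey data):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    on the standard 0-10 'how likely are you to recommend us' scale."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical responses, split by segment as the quality meeting reports them
learner_scores = [10, 9, 8, 7, 9, 10, 6]
employer_scores = [9, 10, 8]
print("learner NPS:", nps(learner_scores))
print("employer NPS:", nps(employer_scores))
```

Splitting by segment, as the quality meeting already does, only requires running the same function over each group of responses.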
Growth & Revenue Outcomes
Metric
Current
Target
How Tracked
Status
Total Apprentices Enrolled
Active apprentices on programme
~8 (March 2026)
60 by June 2026
PICS enrolment records + Monday.com CRM deals board. No single view.
Partial
Pipeline Slots Filled
Committed but not yet enrolled
~20 of 60
60
Monday.com CRM deals board. Not visible to delivery team.
Partial
Revenue per Apprentice
Funding amount per enrolment
Varies by SGA/duration
-
Funding calculator (spreadsheet). Per-apprentice, not aggregated anywhere.
Partial
Employer Retention / Cross-sell
Repeat business from existing employers
Not tracked
-
Not tracked anywhere. Recognised as important, but there is no system or process for it. Shelley proactively reaches out with information updates (awards, NMW changes); the balance is keeping employers informed and supported without bombarding them.
Gap
Government Payment Submissions
Monthly funding claims to government
Monthly
Monthly
PICS data submission. Shelley runs additional reports. Noted as potentially out of scope.
Tracked
Predictive

Leading Indicators

Metrics that predict future outcomes. If these move in the wrong direction, lagging indicators will follow. Most of these are either not tracked or partially tracked.

Sales & Pipeline (drives: Total Apprentices Enrolled, Pipeline Slots Filled)
Metric
Current
Target
How Tracked
Status
Email Open Rate
Outbound campaign performance
60-70% (10KSB list)
-
Apollo (cold) + Monday.com (CRM sequences). Different systems per channel.
Partial
Conversion Rate by Stage
Prospect > Lead > Qualified > Deal > Enrolled
Not tracked
-
Not tracked. Stage transitions happen in Monday.com but no reporting on conversion between stages.
Gap
Conversion Rate by Channel
Which channels produce enrolled apprentices
Not tracked
-
Lead source captured in Monday.com but not connected to enrolment outcomes in PICS.
Gap
Speed Through Pipeline
Time from first contact to enrolled
~12 weeks (estimated)
2 weeks (aspiration)
Not measured. Dates exist in Monday.com but nobody calculates the deltas.
Gap
Pipeline Value
Number of apprentices in pipeline by stage
~20 slots filled
60
Monday.com deals board. Visible to Francesca/Crispin only. Rob has no sight of this.
Partial
Prospect List Size
ICP-matched companies in Apollo
1,500 (10KSB, exhaustible)
-
Apollo ICP filtering. Francesca manages. One-time list, not replenishing.
Partial
Onboarding Speed (drives: Total Enrolled, Employer Satisfaction (NPS), Compliance)
Metric
Current
Target
How Tracked
Status
Time in Compliance Documentation
Days from conversion email to all docs received
Not measured
-
Not tracked. Documents chased manually via email. No dates recorded for request vs receipt.
Gap
DAS Connection Time
Days from request to confirmed connection
Not measured (weeks for levy transfer)
-
Not tracked. Levy transfers are the known bottleneck. Payment type has only recently been surfaced earlier in the process.
Gap
Assessment Turnaround
Days from SGA/E&M sent to reviewed
Not measured (weeks)
3-5 days
Google Forms (SGA) + BKSB/PICS (E&M). Sent manually, reviewed manually. No timestamps compared.
Gap
Document Completion Rate
% of required docs received vs outstanding
Not measured
100% before start
Shelley manually tracks in Google Drive folders. No dashboard or aggregate view.
Gap
Delivery Quality (drives: Achievement Rate, Retention, NPS, Pass Rate)
Metric
Current
Target
How Tracked
Status
Session Feedback Scores
Learner and trainer feedback after each session
Tracked (threshold: 3 and below flagged)
No negative flags
Monday.com feedback forms. Automations flag negatives. Reviewed at monthly quality meeting.
Tracked
Session Feedback Response Rate
% of surveys completed vs sent
Not measured
-
Monday.com surveys sent after sessions. "Not everybody completes it" (Mar 17). Nobody measures the completion rate - only the scores of those who respond.
Gap
Trainer Feedback Completion Rate
% of trainers submitting post-session feedback
Tied to invoices
100%
Trainers must submit feedback on every session before invoices are paid (Mar 17). Financial incentive mechanism. Tracked informally - no dashboard showing compliance rate.
Partial
Dip Testing Completion
Recorded sessions reviewed for quality
Done but no coverage metric
-
Coaching/training meetings recorded. Dip tested by Shelley/Crispin. No tracking of % reviewed or frequency.
Partial
Coaching Session Completion Rate
% of planned coaching sessions that actually happened
Not measured
100%
Sessions scheduled on Monday.com and Google Calendar. Nobody checks whether they all happened.
Gap
Learner Progress (on track / behind)
Whether apprentice is hitting milestones
Checked manually per learner
Real-time visibility
Google workbook (matrix) reviewed by coach in meetings. No aggregate view. Shelley checks individually.
Partial
Content Readiness
Is material ready before each session?
Not measured
100% ready 1 week before
Rob manages via Monday.com delivery board. "Edits needed" status exists but no reporting on it.
Gap
Capacity & Resource (drives: Quality at Scale, Operational Efficiency, Delivery Risk)
Metric
Current
Target
How Tracked
Status
Coach Caseload
Learners per coach (max 22 before burnout)
Known informally
Max 22 per FT coach
Managed through conversations. Not stored in any system. Shelley has it in her head. Explicitly flagged as needing a visual dashboard for scaling (Mar 17).
Gap
Trainer Availability
Hours/month each trainer can deliver
15-30 hrs/month (varies)
-
Rob asks trainers ad hoc. Not stored. No system. 4-5 months advance notice needed.
Gap
Capacity Utilisation
% of available trainer/coach capacity in use
Not calculated
70-85%
Cannot be calculated today because neither available hours nor booked hours are recorded anywhere.
Gap
Cohort Size vs Optimal
Actual cohort enrolment vs 16 target
Varies
16 per cohort
Monday.com delivery board. Visible to Rob. Not connected to pipeline data.
Partial
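Once available and booked hours are recorded, the Capacity Utilisation gap above reduces to a one-line ratio against the 70-85% target band. A sketch using hypothetical per-trainer figures (no such records exist in any system today):

```python
def capacity_utilisation(trainers):
    """% of available trainer hours currently booked.
    trainers: {name: (available_hours, booked_hours)} for one month."""
    available = sum(a for a, _ in trainers.values())
    booked = sum(b for _, b in trainers.values())
    if available == 0:
        raise ValueError("no recorded capacity")
    return 100 * booked / available

# Hypothetical month, using the 15-30 hrs/month range from the trainer table
pool = {"trainer_a": (30, 24), "trainer_b": (15, 12), "trainer_c": (20, 10)}
util = capacity_utilisation(pool)
print(f"{util:.0f}% of capacity in use (target band 70-85%)")
```

The same ratio works per trainer or per coach; the prerequisite is simply that availability and bookings are stored rather than held in people's heads.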
Operational cadence

Operational Metrics

The rhythm of the business. These aren't KPIs in the traditional sense - they're the operational constants that define how TCG runs day-to-day.

Delivery Cadence
1-2 group training sessions/month per cohort
1 group coaching session/month per cohort
12-13 months programme duration
~420 off-the-job hours standard
Management Rhythm
Monthly - Shelley meets each coach (all learners reviewed)
Monthly - Quality meeting (all feedback aggregated)
Monthly - Government payment submission via PICS
Quarterly - Standardisation meetings
Stakeholder Touchpoints
~8 weeks - Employer progress reviews
~8 weeks - Shelley pastoral meeting with each apprentice
Immediate - Coach escalation via email/Slack
Ad hoc - Employer updates (awards, NMW changes)
Capacity Constraints
22 max learners per full-time coach
60-80 apprentices with current pool
40-50 triggers FTE recruitment
6 weeks minimum trainer booking lead time
Sales Activity
~1 hr/day LinkedIn engagement (Francesca)
Monthly webinars via Eventbrite
2.5 people doing sales (capacity constraint)
1,500 prospect list (one-time, exhaustible)
Transformation Targets
Assessment turnaround: weeks → 3-5 days
Onboarding speed: weeks → 1 day
Sales cycle: ~12 weeks → 2 weeks
Content creation: 3-6 months lead time
Recommended

Proposed Metrics

Metrics that aren't tracked today but would connect leading indicators to business outcomes. Ordered by impact - the first few would make the biggest difference at the current stage.

High Impact
Pipeline Velocity
Days from first contact to enrolled, broken down by stage. The single most important metric for hitting 60 by June: if the pipeline takes ~12 weeks and it's already March, you can count backwards to see whether the maths works.
Drives:
Total apprentices enrolled, cohort fill rates
Source data:
Monday.com stage dates (already captured, not measured)
Owner:
Francesca / Crispin
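The stage dates already sit in Monday.com, so the calculation is just date arithmetic plus a backwards count from the June deadline. A sketch with hypothetical stage names and dates (the real board's columns may differ):

```python
from datetime import date, timedelta

# Hypothetical Monday.com export: the date one deal entered each stage
deal = {
    "first_contact": date(2026, 1, 5),
    "qualified": date(2026, 2, 2),
    "deal_agreed": date(2026, 2, 20),
    "enrolled": date(2026, 3, 30),
}

# Per-stage deltas: the dates exist, nobody calculates these today
stages = ["first_contact", "qualified", "deal_agreed", "enrolled"]
for earlier, later in zip(stages, stages[1:]):
    print(f"{earlier} -> {later}: {(deal[later] - deal[earlier]).days} days")

total = (deal["enrolled"] - deal["first_contact"]).days
print(f"total: {total} days (~{total // 7} weeks)")

# Counting backwards: at a ~12-week cycle, the last first-contact date
# that can still produce an enrolment by the June deadline
deadline = date(2026, 6, 30)
cycle_weeks = 12
print("last viable first contact:", deadline - timedelta(weeks=cycle_weeks))
```

Averaging the per-stage deltas across all deals would also show which stage eats most of the 12 weeks, which is where the compression effort should go.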
High Impact
Coach Caseload Dashboard
Learners per coach, visualised. Shelley said "having a visual plan of people's capacity is key as they scale." Without this, the 22-learner burnout limit is invisible until someone breaks.
Drives:
Retention rate, achievement rate, quality at scale
Source data:
Needs creating - coach assignments not stored centrally
Owner:
Shelley
High Impact
Conversion Rate by Stage
What % of prospects become leads, leads become deals, deals become enrolled? Without this, you can't diagnose whether the problem is top-of-funnel volume or mid-funnel drop-off.
Drives:
Total enrolled, pipeline planning, channel investment decisions
Source data:
Monday.com stage transitions (captured, not reported)
Owner:
Francesca / Crispin
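Stage transitions are captured in Monday.com but never reported on. A sketch of the funnel calculation, assuming each deal record shows the furthest stage it has reached (the stage names and counts below are hypothetical):

```python
# Hypothetical export: each deal's furthest stage reached
deals = [
    "prospect", "prospect", "prospect", "lead", "lead",
    "qualified", "qualified", "deal", "enrolled", "enrolled",
]

funnel = ["prospect", "lead", "qualified", "deal", "enrolled"]
rank = {s: i for i, s in enumerate(funnel)}

# A deal that reached stage i has passed through every earlier stage
reached = [sum(1 for d in deals if rank[d] >= i) for i in range(len(funnel))]

# Stage-to-stage conversion: diagnoses top-of-funnel vs mid-funnel drop-off
for (a, n_a), (b, n_b) in zip(zip(funnel, reached), zip(funnel[1:], reached[1:])):
    pct = 100 * n_b / n_a if n_a else 0
    print(f"{a} ({n_a}) -> {b} ({n_b}): {pct:.0f}%")
```

A sharp drop at one transition points to a specific fix (messaging, qualification criteria, paperwork), rather than a generic "get more leads".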
High Impact
Compliance Document Completion Rate
% of required documents received per employer vs outstanding. The gap between "deal committed" and "enrolled" is largely compliance paperwork. If you can see the bottleneck you can chase it.
Drives:
Pipeline velocity, enrolment speed, employer NPS
Source data:
Google Drive folders (documents exist, no tracking of completeness)
Owner:
Francesca / Shelley
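Once the required document list is written down, completeness per employer folder is a set difference. A sketch with hypothetical document names and folder contents (the actual compliance checklist would come from Shelley):

```python
# Hypothetical required-document checklist - the real list is Shelley's
REQUIRED = {"contract", "commitment_statement", "id_check", "eligibility_evidence"}

# Hypothetical: document names present in each employer's Drive folder
folders = {
    "acme": {"contract", "id_check"},
    "globex": {"contract", "commitment_statement", "id_check", "eligibility_evidence"},
}

completion = {}
for employer, docs in folders.items():
    missing = REQUIRED - docs
    completion[employer] = 100 * len(REQUIRED & docs) / len(REQUIRED)
    note = "complete" if not missing else f"missing: {sorted(missing)}"
    print(f"{employer}: {completion[employer]:.0f}% ({note})")
```

Even this crude per-folder percentage would turn manual chasing into a prioritised list: chase the employers with the lowest completion first.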
Medium Impact
Learner On-Track Rate
% of active apprentices who are on track vs behind. Currently checked per-learner in the matrix. An aggregate number in a dashboard would give Shelley, Crispin, and employers instant visibility.
Drives:
Achievement rate, past-end-date %, employer satisfaction
Source data:
Google workbook (matrix) - would need automation or Laravel app integration
Owner:
Shelley / Coaches
Medium Impact
Coaching Session Completion Rate
Did every planned coaching and training session actually happen? If sessions get missed or rescheduled, learners fall behind. This is a leading indicator of the achievement rate.
Drives:
Achievement rate, learner progress, OTJ hours
Source data:
Monday.com delivery board + Google Calendar (scheduled vs happened)
Owner:
Rob
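The comparison is scheduled-versus-happened set arithmetic once both lists exist. A sketch with hypothetical session identifiers (in practice the delivery board and Google Calendar would need a shared ID or some matching logic):

```python
# Hypothetical data: sessions scheduled on the delivery board vs sessions
# with evidence they actually ran (recording, register, or similar)
scheduled = {"c1-coaching-jan", "c1-coaching-feb", "c1-training-feb",
             "c2-coaching-jan", "c2-coaching-feb"}
happened = {"c1-coaching-jan", "c1-training-feb",
            "c2-coaching-jan", "c2-coaching-feb"}

missed = scheduled - happened
rate = 100 * len(scheduled & happened) / len(scheduled)
print(f"completion: {rate:.0f}%  missed: {sorted(missed)}")
```

The `missed` set is the actionable part: each entry is a specific session to chase or reschedule before the learner falls behind.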
Medium Impact
Channel Attribution to Enrolment
Which lead sources actually produce enrolled apprentices? The 10KSB list has 60-70% open rates but that means nothing if they don't convert. With the list exhaustible, knowing which channels work long-term is critical.
Drives:
Marketing spend decisions, long-term pipeline health
Source data:
Monday.com (lead source) + PICS (enrolment) - currently disconnected
Owner:
Francesca / Crispin
Medium Impact
Trainer Capacity Dashboard
Available hours per trainer, booked hours, and remaining capacity. Rob currently manages this through conversations. At 3+ concurrent cohorts, it becomes unmanageable without a visual dashboard.
Drives:
Cohort scheduling confidence, delivery risk, quality
Source data:
Needs creating - trainer availability not stored anywhere
Owner:
Rob
Future
Employer Lifetime Value
Revenue per employer over time including repeat cohorts, cross-selling to different standards, and CPD. Currently 0% commercial retention post-apprenticeship. Even tracking "has this employer come back?" would be a start.
Drives:
Revenue growth, sales efficiency, relationship strategy
Source data:
Monday.com CRM (deals by org) - exists but nobody looks at it this way
Owner:
Crispin / Francesca
Future
Dip Testing Coverage Rate
% of coaches/trainers dip tested in last quarter. Currently "done" but with no coverage tracking. As the team grows, this becomes an audit risk if some coaches are never reviewed.
Drives:
Delivery quality, audit readiness, coach development
Source data:
Needs creating - no record of which sessions reviewed
Owner:
Shelley
How it fits together

Metric Connections

How leading indicators drive lagging outcomes. This is the logic for the metrics tree - if you can move the leading indicators, the lagging indicators follow.

Goal: 60 by June
Growth Chain
Prospect list size → determines outreach volume
Email open/response rate → prospects become leads
Conversion rate by stage → leads become deals
Pipeline velocity → deals become enrolled in time
Compliance completion rate → enrolled means paperwork done
= Total apprentices enrolled
Goal: Quality at Scale
Quality Chain
Coach caseload → overloaded coaches miss things
Session completion rate → missed sessions = learners behind
Learner on-track rate → early warning of problems
Feedback scores → session quality drives engagement
Dip testing coverage → quality assurance at scale
= Achievement rate + retention + NPS
Goal: Operational Efficiency
Efficiency Chain
Capacity utilisation → are trainers optimally loaded?
Cohort size vs optimal → unit economics per cohort
Pipeline velocity → faster = less admin per apprentice
Content readiness → no scrambling = no wasted time
Trainer availability (stored) → scheduling without phone tag
= Scale without proportional headcount
The core insight: TCG tracks lagging indicators well through PICS. What's almost entirely missing is the leading indicators that predict those outcomes. By the time achievement rate drops, it's 12 months too late. The proposed metrics above close that gap - they give Crispin, Shelley, and Rob early warning signals connected to the outcomes they care about.
Working document

March 2026. Sources: Feb discovery, Mar 4 coordination, Mar 10 and Mar 17 process mapping workshops, Rob's ops notes, Shelley's Mar 18 PICS dashboard screenshots, and product strategy discovery questions.

Created by Specs for The Coders Guild