WS1.5 — Candidate Evaluation

Purpose: Systematic evaluation of 3 navigation candidates by walking each through 6 real admin workflows, scoring on 7 criteria, and producing a data-driven recommendation with sensitivity analysis and risk assessment.

Inputs: WS1.2 (workflow maps), WS1.3 (pre-filter), WS1.4 (candidate structures), Interview Report

Date: March 2026


Part A: Workflow Walkthroughs (6 Workflows x 3 Candidates)

Each walkthrough traces the exact screen path an admin would follow using the candidate's navigation tree from WS1.4, then measures screen count, click estimate, information co-location, external tool dependencies, billing accessibility, and improvement over the current state documented in WS1.2.


Workflow 1: Set Up a New Semester

Current state (WS1.2): 19+ screens | 50-80+ clicks | 3 external tools (Vimeo, Sheets, InfusionSoft) | Pain: Critical

Candidates: A = Domain-First, B = Hub-and-Spoke, C = Entity-Centric

Screen sequence
  A: Dashboard (check phase prompt) → Semester Mgmt > Semesters → Semester Hub Page [Overview tab: create semester, configure dates] → [Setup Checklist tab: review content status per type/level] → Content > Video Lessons (create/clone per level) → Content > Resources → Content > Recordings → Content > Tutorials → Content > MCQ Questions → Content > Quizzes → return to Semester Hub [Setup Checklist tab: verify] → Semester Mgmt > Welcome Package → Semester Mgmt > Email Management (update templates) → Semester Mgmt > Tags → Teacher Mgmt > Teacher Assignment Criteria → Scheduling > TA Schedules → Semester Hub [Overview tab: activate semester]
  B: Dashboard (phase tile: "Semester setup: 0 of N items configured" — click tile) → Semesters > Semester Setup Checklist (see all content types and config status) → Content > Video Lessons → Content > Resources → Content > Recordings → Content > Tutorials → Content > MCQ Questions → Content > Quizzes → Semesters > Welcome Package → Semesters > Email Management → Semesters > Tags → Teachers > Teacher Assignment Criteria → Scheduling > TA Schedules → Semesters > Semester Setup Checklist (verify all green) → Semesters > Semesters (activate)
  C: Dashboard → Semesters > [Semester Entity Page] [Overview tab: create semester] → [Content Status tab: review all 7 content types inline, create/clone per type per level — Video Lessons, Submissions, Resources, Recordings, Tutorials, MCQ Questions, Quizzes all managed within this tab] → [Welcome Package tab: upload/clone] → [Automation tab: Email Management + Tags] → [Scheduling tab: Live Sessions] → Teachers > [TA Entity Page] > [Assignment Rules tab] → Teachers > [TA Entity Page] > [Schedule tab] → Semesters > [Semester Entity Page] [Overview tab: activate]

Screen count
  A: 10-12 distinct screens (Semester Hub + 6 Content screens + Welcome Package + Email Mgmt + Tags + Teacher Assignment + TA Schedules + Dashboard)
  B: 10-13 distinct screens (Dashboard + Setup Checklist + 6 Content screens + Welcome Package + Email Mgmt + Tags + Teacher Assignment + TA Schedules + Semesters detail)
  C: 4-5 distinct screens (Semester Entity Page with 5 tabs + TA Entity Page with 2 tabs; content screens are embedded within the Semester Entity Content Status tab)

Click estimate
  A: 25-40 clicks (sidebar nav between 4 domains: Semester Mgmt, Content, Teacher Mgmt, Scheduling; plus tab switches within Semester Hub; plus per-item create/clone clicks within each content screen)
  B: 25-40 clicks (similar cross-spoke navigation as A; setup checklist provides overview but admin still visits individual content screens in the Content spoke)
  C: 15-25 clicks (most navigation is tab switches within Semester Entity Page; only leave for TA Entity Page; content CRUD happens inline within Content Status tab)

Info co-location
  A: Partial — Setup Checklist tab on Semester Hub shows completion status, but content creation happens in separate Content section screens. Admin must navigate away from the hub to do the actual work, then return to verify.
  B: Partial — Setup Checklist is a dedicated screen that tracks status, but the actual content creation happens in separate spoke screens. The checklist is a monitoring tool, not a workspace.
  C: Yes — Content Status tab on the Semester Entity Page shows status AND provides inline access to content creation/editing for all 7 content types. Admin works within the semester context throughout. Welcome Package and Email Management are also tabs on the same entity page.

External tools still needed
  A: 1 (Vimeo for video URLs — structural, cannot be eliminated by nav redesign). InfusionSoft sync is automated. Google Sheets eliminated by Calendar View and TA Detail Pages.
  B: 1 (Vimeo). Same rationale as A.
  C: 1 (Vimeo). Same rationale as A.

Billing accessible in context
  A: No — Billing is not directly relevant during semester setup, but payment plan configuration for the upcoming semester would require visiting Billing & Payments domain separately.
  B: No — Same as A. Payment Plans live in the Billing spoke.
  C: No — Same. Payment Plans live in Cross-Entity: Billing.

Improvement over current
  A: Screen count: 19+ → 10-12 (37-47% reduction). Clicks: 50-80+ → 25-40 (50% reduction). External tools: 3 → 1 (67% reduction). Setup Checklist tab eliminates the manual verification pass (Step 16). But content creation still requires visiting 6 separate screens in the Content domain.
  B: Screen count: 19+ → 10-13 (32-47% reduction). Clicks: 50-80+ → 25-40 (50% reduction). External tools: 3 → 1 (67% reduction). Dashboard tile provides entry point, but spoke structure mirrors A's improvement profile.
  C: Screen count: 19+ → 4-5 (74-79% reduction). Clicks: 50-80+ → 15-25 (55-69% reduction). External tools: 3 → 1 (67% reduction). The Semester Entity Page absorbs 7 content types + Welcome Package + Email Mgmt + Tags into tabs, producing the deepest consolidation. This is the largest single improvement of any candidate on any workflow.

Workflow 2: Handle a Student Support Issue

Current state (WS1.2): 5-7 screens | 15-25 clicks | 3 external tools (Stripe, Sheets, Email) | Pain: High

Candidates: A = Domain-First, B = Hub-and-Spoke, C = Entity-Centric

Screen sequence
  A: Student Mgmt > Students (search) → [Student Detail Page] [Profile tab: check basics] → [Submissions tab: check submission history/grades] → [Payments tab: check Stripe data, family plan, scholarship] → [Appointments tab: check scheduling] → [Semester History tab: check cross-semester context] → [Actions tab: take action — deactivate+cancel, reset, promote]
  B: Dashboard (if alert-driven: click failed payment alert or student-behind alert → arrives at filtered Students list) → Students > Students (search) → Student detail page (enhanced with payment status field and submission count) → Billing > Payments or student billing record (for detailed billing investigation) → Students > Submissions (if submission detail needed) → Students > Student Report (if report needed)
  C: Dashboard (search student name in global search) → Students > [Student Entity Page] [Profile tab] → [Submissions tab] → [Payments tab] → [Appointments tab] → [Semester History tab] → [Actions tab]

Screen count
  A: 1-2 (Student Detail Page with 6 tabs covers almost everything; only leave if bulk submission view needed)
  B: 3-5 (Student detail + Billing spoke for detailed payment investigation + Submissions list for detail + Student Report for cross-referencing)
  C: 1-2 (Student Entity Page with 6 tabs; deepest integration — full billing data inline, no need to visit Billing section for individual student issues)

Click estimate
  A: 5-8 clicks (navigate to Student Mgmt: 1, search: 1, open student: 1, tab switches: 2-4)
  B: 7-12 clicks (dashboard or nav to Students: 1-2, search: 1, open student: 1, click through to Billing spoke for details: 2-3, return for other views: 2-4)
  C: 4-7 clicks (global search: 1, select student: 1, tab switches: 2-5)

Info co-location
  A: Yes — Student Detail Page has Payments tab (Stripe data), Submissions tab, Appointments tab, and Semester History tab. All diagnostic information is on one page across tabs.
  B: Partial — Student detail page has payment status field and submission count, but detailed billing investigation requires navigating to the Billing spoke. The information is spread across the student detail page and the Billing spoke.
  C: Yes — Student Entity Page has the deepest Payments tab (full Stripe integration, family plan, scholarship/deferment), Submissions, Appointments, and History. Everything is on one entity page.

External tools still needed
  A: 0 (Stripe integrated via Payments tab, Google Sheets data absorbed into Family Plans/Scholarships, email remains for communication but issue queue replaces inbound reporting)
  B: 0 (same integrations, though payment detail requires spoke navigation rather than inline)
  C: 0 (same integrations, deepest inline integration)

Billing accessible in context
  A: Yes — Payments tab on Student Detail Page shows subscription status, payment history, family plan, scholarship/deferment, and action buttons. Admin never leaves student context.
  B: Partial — Student detail shows payment status summary, but full billing detail (history, plan adjustment, family plan investigation) requires clicking through to Billing spoke. The admin must leave the student context for detailed billing work.
  C: Yes — Payments tab on Student Entity Page is the deepest of any candidate: full payment history, subscription management, family plan details, scholarship/deferment terms, and inline actions (cancel, adjust, apply coupon).

Improvement over current
  A: Screen count: 5-7 → 1-2 (71-86% reduction). Clicks: 15-25 → 5-8 (60-68% reduction). External tools: 3 → 0 (100% reduction). The rich Student Detail Page directly addresses Interview Q2's consolidated profile request.
  B: Screen count: 5-7 → 3-5 (29-43% reduction). Clicks: 15-25 → 7-12 (40-53% reduction). External tools: 3 → 0 (100% reduction). Dashboard may speed up issue detection (alert-driven), but resolution requires more navigation than A or C because billing detail lives in a separate spoke.
  C: Screen count: 5-7 → 1-2 (71-86% reduction). Clicks: 15-25 → 4-7 (65-81% reduction). External tools: 3 → 0 (100% reduction). Global search provides the fastest path to the student. Deepest entity page means the most complete diagnosis without navigation.

Workflow 3: Close a Semester / Promote Students

Current state (WS1.2): 8+ screens | 40-70+ clicks | 3 external tools (Stripe, Sheets, Email) | Pain: Critical

Candidates: A = Domain-First, B = Hub-and-Spoke, C = Entity-Centric

Screen sequence
  A: Semester Mgmt > Semesters → Semester Hub [Close Workflow tab: Phase 1 — EOC Review] → Student Mgmt > Students → [Student Detail Pages for edge cases] → Semester Hub [Close Workflow tab: Phase 2 — Promotions — bulk promote] → Semester Hub [Close Workflow tab: Phase 3 — Payment Setup — auto-generate Stripe subscriptions] → Billing & Payments > Payment Overview (verify) → Semester Hub [Close Workflow tab: Phase 4 — Deactivation — bulk deactivate + cancel subscriptions] → Student Mgmt > Promoted Students (verify)
  B: Dashboard (phase tile: "Semester close: Phase 1 of 4" — click tile) → Semesters > Semester Close Tracker [Phase 1: EOC Review] → Students > Student Report (cross-reference) → Students > Students (individual edge case profiles) → Semester Close Tracker [Phase 2: Promotions] → Semester Close Tracker [Phase 3: Payment Setup] → Billing > Payment Overview (verify) → Semester Close Tracker [Phase 4: Deactivation] → Students > Promoted Students (verify)
  C: Semesters > [Semester Entity Page] [Close Workflow tab: Phase 1 — EOC Review, with inline student data] → Students > [Student Entity Pages for edge cases, with Payments tab visible] → Semester Entity Page [Close Workflow tab: Phase 2 — Promotions] → Semester Entity Page [Close Workflow tab: Phase 3 — Payment Setup] → Cross-Entity: Billing > Payment Overview (verify) → Semester Entity Page [Close Workflow tab: Phase 4 — Deactivation] → Cross-Entity: Reports > Promoted Students (verify)

Screen count
  A: 4-6 (Semester Hub with Close Workflow tab + Student Detail Pages for edge cases + Payment Overview + Promoted Students)
  B: 5-7 (Dashboard + Close Tracker + Student Report + individual student profiles + Payment Overview + Promoted Students)
  C: 4-6 (Semester Entity Page with Close Workflow tab + Student Entity Pages for edge cases + Payment Overview + Promoted Students report)

Click estimate
  A: 15-25 clicks (Semester Hub is the orchestration center; tab switches within hub: 4-5; edge case student visits: 2-3 per student; verification: 3-5)
  B: 18-30 clicks (Dashboard to Close Tracker: 2; phase navigation within tracker: 4-5; cross-spoke navigation to Students: 3-5; Billing verification: 2-3; Promoted Students: 2)
  C: 12-22 clicks (Semester Entity Page is the orchestration center; tab/phase switches: 4-5; edge case student visits via inline links: 2-3 per student; verification: 3-5)

Info co-location
  A: Partial — Close Workflow tab orchestrates the sequence, but edge-case investigation requires navigating to Student Detail Pages (in a different domain). Payment verification requires visiting Billing domain. The gated sequence is well-orchestrated, but resolution touches 3 domains.
  B: Partial — Close Tracker is a standalone page that orchestrates the sequence, but student investigation and billing verification require navigating to separate spokes. More spoke-hopping than A because student profiles lack payment depth.
  C: Partial — Close Workflow tab orchestrates from the Semester Entity Page, and edge-case investigation uses rich Student Entity Pages (with inline billing). But the workflow still spans two entity types (Semester and Student). Slightly better co-location than A or B because Student Entity Pages have full billing inline, reducing the need to visit a separate Billing section.

External tools still needed
  A: 0 (Stripe integrated into payment setup phase; EOC assessment in-system; Sheets eliminated)
  B: 0 (same)
  C: 0 (same)

Billing accessible in context
  A: Yes — Payment Setup is Phase 3 of the Close Workflow tab, directly on the Semester Hub. Student-level billing is on Student Detail Pages' Payments tab. Payment Overview is in the Billing domain for aggregate verification.
  B: Partial — Payment Setup is Phase 3 of the Close Tracker. But detailed per-student billing investigation requires navigating to the Billing spoke. The Tracker links to billing but doesn't embed it.
  C: Yes — Payment Setup is Phase 3 on the Semester Entity Page. Per-student billing is fully inline on Student Entity Pages (Payments tab). The deepest billing context during close operations.

Improvement over current
  A: Screen count: 8+ → 4-6 (38-50% reduction). Clicks: 40-70+ → 15-25 (55-64% reduction). External tools: 3 → 0 (100% reduction). The gated Close Workflow tab directly solves the premature payment setup problem [Q3]. Linked deactivation+cancellation on Student Detail Pages eliminates the most dangerous error.
  B: Screen count: 8+ → 5-7 (13-38% reduction). Clicks: 40-70+ → 18-30 (55-57% reduction). External tools: 3 → 0 (100% reduction). Close Tracker provides gating, but the admin must navigate across spokes more than A or C for edge-case investigation.
  C: Screen count: 8+ → 4-6 (38-50% reduction). Clicks: 40-70+ → 12-22 (63-69% reduction). External tools: 3 → 0 (100% reduction). Entity pages provide the richest context during edge-case investigation. Close Workflow tab on the Semester Entity Page is the orchestration center with the most integrated data.

Workflow 4: Daily Monitoring / Operations

Current state (WS1.2): 4-5 screens | 12-18 clicks | 3 external tools (Stripe, Sheets, Email) | Pain: High

Candidates: A = Domain-First, B = Hub-and-Spoke, C = Entity-Centric

Screen sequence
  A: Dashboard (check alerts: failed payments, TA response times, students behind, appointment utilization) → click alert to navigate to relevant domain screen (e.g., Billing > Payment Overview for failed payments, Teacher Mgmt > TA Detail for response time issues, Student Mgmt > Students for submission issues)
  B: Dashboard (HUB — all alerts on one screen: failed payments tile, TA response time tile, students-behind tile, appointment utilization tile, issue queue tile, semester phase tile, student body snapshot) → click specific alert → navigate to filtered spoke view → resolve → return to Dashboard
  C: Dashboard (lightweight alerts + global search) → click alert → navigate to relevant entity page or cross-entity section

Screen count
  A: 2-4 (Dashboard + 1-3 domain screens depending on how many alerts need action)
  B: 1-3 (Dashboard is the primary workspace; only leave for resolution actions in spokes — may resolve some issues directly from dashboard quick actions)
  C: 2-4 (Dashboard + entity pages or cross-entity sections for resolution)

Click estimate
  A: 6-12 clicks (Dashboard: 0 — landing page; per alert investigation: 2-3 clicks to navigate to domain screen)
  B: 3-8 clicks (Dashboard: 0 — landing page; alerts link directly to pre-filtered spoke views; quick actions handle simple tasks without leaving; resolution clicks: 1-3 per alert)
  C: 5-10 clicks (Dashboard: 0 — landing page; alerts link to entity pages; per alert: 2-3 clicks)

Info co-location
  A: Partial — Dashboard shows alert counts and summaries, but detailed investigation requires navigating to domain screens. The dashboard surfaces what needs attention; the domains provide the resolution context.
  B: Yes — Dashboard shows alert details with context (student name, failure reason, TA name, overdue count). Deep links go to pre-filtered views. The hub is designed for monitoring — this is its primary use case. Quick actions (search student, view today's appointments) reduce the need to leave.
  C: Partial — Dashboard is lightweight (alert counts + search), so detailed monitoring requires navigating to entity pages or cross-entity sections. Less monitoring-oriented than B's hub.

External tools still needed
  A: 0 (Stripe alerts integrated into Dashboard, Google Sheets schedule eliminated by Calendar View, issue queue replaces email for student-reported issues)
  B: 0 (same, plus the most comprehensive alert integration of any candidate — Dashboard is the definitive monitoring surface)
  C: 0 (same integrations, but Dashboard is tertiary — monitoring is distributed across entity pages and cross-entity sections)

Billing accessible in context
  A: Partial — Dashboard shows failed payment count; clicking navigates to Billing domain. Payment detail is one click away from the alert.
  B: Yes — Dashboard's failed payment tile shows student names, failure reasons, and retry status. Clicking goes to the Billing spoke with the failed payments pre-filtered. The admin sees billing context directly on the dashboard.
  C: Partial — Dashboard shows alert count; clicking navigates to either a Student Entity Page (Payments tab) or Cross-Entity Billing.

Improvement over current
  A: Screen count: 4-5 → 2-4 (20-56% reduction). Clicks: 12-18 → 6-12 (33-50% reduction). External tools: 3 → 0 (100% reduction). Dashboard provides actionable alerts — a qualitative leap from the current "decorative" dashboard [Q6].
  B: Screen count: 4-5 → 1-3 (40-78% reduction). Clicks: 12-18 → 3-8 (56-83% reduction). External tools: 3 → 0 (100% reduction). The hub is purpose-built for daily monitoring. This is the single strongest workflow outcome of any candidate on any workflow — the dashboard replaces 4-5 screens and 3 external tools with 1 screen.
  C: Screen count: 4-5 → 2-4 (20-56% reduction). Clicks: 12-18 → 5-10 (33-44% reduction). External tools: 3 → 0 (100% reduction). Lightweight dashboard with alerts provides a starting point, but monitoring is not the entity model's strength. Resolution requires navigating to entity pages.

Workflow 5: Manage Appointments / Scheduling

Current state (WS1.2): 4-6 screens | 12-20 clicks | 2 external tools (Sheets, Email) | Pain: Medium

Candidates: A = Domain-First, B = Hub-and-Spoke, C = Entity-Centric

Screen sequence
  A: Scheduling > Calendar View (see all TAs, all sessions, all appointments) → click appointment to reschedule → inline panel shows TA availability → confirm reschedule → Scheduling > Holidays (if holiday, unified system — add once, select scope)
  B: Dashboard (if alert: "TA X unavailable tomorrow" or appointment conflict) → Scheduling > Calendar View → reschedule inline → Scheduling > Holidays (if needed)
  C: Cross-Entity: Scheduling > Calendar View → reschedule inline → Teachers > [TA Entity Page] > [Schedule tab] (if TA availability change needed) → Cross-Entity: Scheduling > Holidays (if needed)

Screen count
  A: 1-2 (Calendar View for most rescheduling + Holidays if applicable)
  B: 2-3 (Dashboard + Calendar View + Holidays if applicable)
  C: 2-3 (Calendar View + TA Entity Page for schedule changes + Holidays if applicable)

Click estimate
  A: 5-8 clicks (navigate to Scheduling: 1, Calendar View: 1, click appointment: 1, select new slot: 1, confirm: 1, holiday if needed: 2-3)
  B: 5-10 clicks (Dashboard: 0, navigate to Scheduling spoke: 1-2, Calendar View interaction: 3-5, holidays: 2-3)
  C: 5-10 clicks (navigate to Cross-Entity Scheduling: 1-2, Calendar View: 3-5, TA Entity Page for schedule update: 2-3)

Info co-location
  A: Yes — Calendar View in the Scheduling domain shows all TAs, all sessions, all appointments. Reschedule happens inline. Holidays are in the same domain section.
  B: Yes — Calendar View in the Scheduling spoke has the same consolidation. Dashboard may surface scheduling issues proactively.
  C: Partial — Calendar View shows the aggregate picture, but TA schedule modification lives on the TA Entity Page (different section). The scheduling work is split between a cross-entity view and individual entity pages.

External tools still needed
  A: 0 (Calendar View replaces Google Sheets schedule; automated notifications replace email)
  B: 0 (same)
  C: 0 (same)

Billing accessible in context
  A: No — Billing is not relevant to scheduling workflows.
  B: No — Same.
  C: No — Same.

Improvement over current
  A: Screen count: 4-6 → 1-2 (58-75% reduction). Clicks: 12-20 → 5-8 (50-60% reduction). External tools: 2 → 0 (100% reduction). Calendar View + unified holidays is the key improvement. All scheduling operations in one domain section.
  B: Screen count: 4-6 → 2-3 (42-58% reduction). Clicks: 12-20 → 5-10 (42-50% reduction). External tools: 2 → 0 (100% reduction). Dashboard adds proactive alerting, but scheduling resolution is similar to A.
  C: Screen count: 4-6 → 2-3 (42-58% reduction). Clicks: 12-20 → 5-10 (42-50% reduction). External tools: 2 → 0 (100% reduction). Calendar View provides the aggregate picture, but TA-specific schedule changes require navigating to a different section (TA Entity Page), adding a context switch that A avoids.

Workflow 6: Onboard / Manage a TA

Current state (WS1.2): 6-8 screens | 20-35 clicks | 1 external tool (Sheets) | Pain: Medium

Candidates: A = Domain-First, B = Hub-and-Spoke, C = Entity-Centric

Screen sequence
  A: Teacher Mgmt > Teaching Assistants (create TA account) → [TA Detail Page] [Profile tab: enter details] → [Schedule tab: configure weekly slots, clone from previous semester] → [Students & Groups tab: create groups, assign students] → [Performance tab: verify baseline] → Teacher Mgmt > Teacher Assignment Criteria (configure auto-assignment rules)
  B: Teachers > Teaching Assistants (create TA) → TA detail page (enhanced — enter details, see student count and response time) → Scheduling > TA Schedules (configure weekly slots) → Teachers > Student Groups (create groups) → Teachers > Teacher Assignment Criteria (configure rules) → Teachers > TA Reports (verify baseline)
  C: Teachers > [TA Entity Page] [Profile tab: create account, enter details] → [Schedule tab: configure weekly slots, clone from previous semester] → [Students & Groups tab: create groups, assign students] → [Assignment Rules tab: configure auto-assignment] → [Performance tab: verify baseline]

Screen count
  A: 2-3 (TA Detail Page with 4 tabs + Teacher Assignment Criteria as a separate screen in the same domain section)
  B: 5-6 (TA detail + TA Schedules + Student Groups + Teacher Assignment Criteria + TA Reports — separate screens spanning the Teachers and Scheduling spokes)
  C: 1-2 (TA Entity Page with 5 tabs — Assignment Rules absorbed as a tab; almost everything is on one page)

Click estimate
  A: 8-14 clicks (navigate to Teacher Mgmt: 1, create TA: 2, tab switches on TA Detail: 3-4, Teacher Assignment Criteria: 2-3, per-slot schedule entries: varies)
  B: 15-25 clicks (navigate to Teachers spoke: 1, create TA: 2, navigate to TA Schedules: 2-3, navigate to Student Groups: 2-3, navigate to Assignment Criteria: 2-3, navigate to TA Reports: 2-3)
  C: 6-12 clicks (navigate to Teachers: 1, create TA: 2, tab switches: 3-5, per-slot schedule entries: varies)

Info co-location
  A: Yes — TA Detail Page shows profile, schedule, students/groups, and performance on one page. Only Teacher Assignment Criteria requires visiting a separate screen (but it is in the same domain section).
  B: No — TA-related information is distributed across 5-6 separate screens. The admin must navigate between list, schedules, groups, criteria, and reports as separate pages.
  C: Yes — TA Entity Page absorbs everything including Assignment Rules as a tab. The most complete single-page TA view of any candidate.

External tools still needed
  A: 0 (Calendar View replaces Google Sheets schedule)
  B: 0 (same)
  C: 0 (same)

Billing accessible in context
  A: No — Billing is not relevant to TA onboarding.
  B: No — Same.
  C: No — Same.

Improvement over current
  A: Screen count: 6-8 → 2-3 (58-67% reduction). Clicks: 20-35 → 8-14 (53-60% reduction). External tools: 1 → 0 (100% reduction). TA Detail Page consolidates 4 of the current 6-8 screens into tabs. Schedule cloning eliminates repetitive manual entry [Q1].
  B: Screen count: 6-8 → 5-6 (17-25% reduction). Clicks: 20-35 → 15-25 (25-29% reduction). External tools: 1 → 0 (100% reduction). Standard list/detail in the Teachers spoke means minimal screen consolidation. Navigation is cleaner (mostly in one spoke), but the admin still visits 5-6 separate screens. This is Candidate B's weakest workflow.
  C: Screen count: 6-8 → 1-2 (75-83% reduction). Clicks: 20-35 → 6-12 (60-71% reduction). External tools: 1 → 0 (100% reduction). TA Entity Page provides the deepest consolidation. Assignment Rules absorbed as a tab means zero cross-screen navigation. This is Candidate C's strongest individual workflow outcome relative to current state.

Part A Summary: Comparison Tables

Screen Count Comparison

Workflow Current Candidate A Candidate B Candidate C
WF1: Semester Setup 19+ 10-12 10-13 4-5
WF2: Student Support 5-7 1-2 3-5 1-2
WF3: Semester Close 8+ 4-6 5-7 4-6
WF4: Daily Monitoring 4-5 2-4 1-3 2-4
WF5: Scheduling 4-6 1-2 2-3 2-3
WF6: TA Onboarding 6-8 2-3 5-6 1-2
Average (midpoint) ~9.3 ~4.8 ~5.8 ~3.3
Average % reduction (vs. current) ~48% ~38% ~65%

Click Estimate Comparison

Workflow Current Candidate A Candidate B Candidate C
WF1: Semester Setup 50-80+ 25-40 25-40 15-25
WF2: Student Support 15-25 5-8 7-12 4-7
WF3: Semester Close 40-70+ 15-25 18-30 12-22
WF4: Daily Monitoring 12-18 6-12 3-8 5-10
WF5: Scheduling 12-20 5-8 5-10 5-10
WF6: TA Onboarding 20-35 8-14 15-25 6-12
Average (midpoint) ~31.4 ~12.5 ~14.5 ~9.3

External Tool Comparison

Workflow Current Candidate A Candidate B Candidate C
WF1: Semester Setup 3 1 1 1
WF2: Student Support 3 0 0 0
WF3: Semester Close 3 0 0 0
WF4: Daily Monitoring 3 0 0 0
WF5: Scheduling 2 0 0 0
WF6: TA Onboarding 1 0 0 0
Total 15 1 1 1

All three candidates achieve identical external tool elimination (15 total dependencies reduced to 1, Vimeo). The differentiation is in screen consolidation and click reduction.
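The "Average % reduction" figures follow directly from the average midpoints in the screen-count table. A minimal Python check (values copied from the table above; the midpoint averages are taken as given, since the treatment of open-ended ranges like "19+" is the report's own assumption):

```python
# Average screen-count midpoints from the Part A summary table.
avg_screens = {"current": 9.3, "A": 4.8, "B": 5.8, "C": 3.3}

def pct_reduction(current: float, candidate: float) -> int:
    """Percent reduction from the current state, rounded to a whole percent."""
    return round((current - candidate) / current * 100)

reductions = {
    cand: pct_reduction(avg_screens["current"], avg_screens[cand])
    for cand in ("A", "B", "C")
}
print(reductions)  # {'A': 48, 'B': 38, 'C': 65}
```

The same calculation applied to the click table's midpoints (~31.4 current vs. ~12.5 / ~14.5 / ~9.3) reproduces the per-candidate click improvements discussed in the walkthroughs.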


Part B: 7-Criterion Scoring Matrix

Each criterion is scored 1-5 (5 = best). Every score includes evidence from the walkthroughs and source documents.


Criterion 1: Screen-Hop Reduction

Metric: Average screen count across all 6 workflows compared to current state. Lower screen count = higher score.

Candidate Average Screens (midpoint) Current Average Reduction Score
A 4.8 9.3 48% 4
B 5.8 9.3 38% 3
C 3.3 9.3 65% 5

Rationale:

Candidate C's entity pages absorb entire workflows into tabs (WF1 drops to 4-5 screens, WF6 to 1-2), producing the largest average reduction at 65%. Candidate A's rich detail pages deliver a solid 48% reduction. Candidate B's hub-and-spoke model keeps spoke screens separate, so it trails at 38% despite its strong WF4 result.


Criterion 2: Mental Model Match

Metric: Does the candidate's top-level navigation match the admin's Interview Q4 description? Her proposed structure: (1) Student Management, (2) Semester Management, (3) Content, (4) Scheduling, (5) Teacher Management, (6) Reporting, (7) Misc/Admin.

Candidate Match Level Score
A High 5
B Medium-High 4
C Medium 3

Rationale:

Candidate A's top-level domains map almost one-to-one onto the admin's own Q4 proposal, so the score is a full match. Candidate B keeps recognizable domain spokes but reframes the structure around the dashboard hub, a modest departure. Candidate C reorganizes around entities plus cross-entity sections, which departs furthest from her stated model.


Criterion 3: Secondary User Learnability

Metric: Simulate "I'm a TA, I need to check a student's submission status." Score the discoverability path.

Candidate Discoverability Score
A Obvious 5
B Obvious 5
C Obvious 5

Rationale:

All three candidates achieve high learnability for this specific task because all three use recognizable noun labels ("Students," "Submissions") and because the TA view is role-filtered to show only relevant items. Differentiation on this criterion is negligible.


Criterion 4: Billing Accommodation

Metric: Does billing fit naturally into the navigation model? Assessment considers (a) student-level billing accessibility during workflows, (b) aggregate billing placement, and (c) whether billing feels native or bolted-on.

Candidate Fit Score
A Natural fit 5
B Workable 3
C Natural fit 5

Rationale:

Candidates A and C both surface student-level billing inline (the Payments tab on the Student Detail/Entity Page) while keeping an aggregate billing area, so billing feels native in both. Candidate B holds billing detail in a dedicated spoke: workable, but detailed investigation forces the admin out of student context, as the WF2 and WF3 walkthroughs show.


Criterion 5: Scalability

Metric: Test by adding 3 hypothetical features: (1) Student App Issue Reporting, (2) TA Performance Reviews, (3) Content Versioning. Assess where each feature goes and whether the placement is clean.

Candidate Feature Placement Score
A Clean 4
B Clean 5
C Messy 3

Rationale:

Student App Issue Reporting:

TA Performance Reviews:

Content Versioning:

Scoring:


Criterion 6: Catch-All Risk

Metric: Count orphan items — items that do not fit cleanly into the candidate's primary organizing framework and must be placed in a residual "other" section.

Candidate Orphan Items Score
A 2-3 4
B 1-2 5
C 4-5 3

Rationale:


Criterion 7: Implementation Effort

Metric: How much development effort is required beyond sidebar reorganization? Categories: Nav-only (lowest), Moderate restructuring, Significant rearchitecture (highest).

Candidate Effort Level Score
A Moderate restructuring 3
B Moderate restructuring (different profile) 4
C Significant rearchitecture 2

Rationale:


Scoring Matrix Summary

Criterion Weight Candidate A Candidate B Candidate C
1. Screen-hop reduction 1x 4 3 5
2. Mental model match 1x 5 4 3
3. Secondary user learnability 1x 5 5 5
4. Billing accommodation 1x 5 3 5
5. Scalability 1x 4 5 3
6. Catch-all risk 1x 4 5 3
7. Implementation effort 1x 3 4 2
Total (equal weight) 30 29 26

Part C: Preliminary Recommendation

Recommended Candidate: A — Domain-First with Rich Entity Pages

Candidate A scores highest overall (30/35) and — critically — achieves the highest or tied-highest score on the two criteria that most directly address the admin's stated needs: Mental Model Match (5/5) and Billing Accommodation (5/5).

Where Each Candidate Has a Decisive Advantage

A: Mental model match. The only candidate whose top-level structure was proposed by the admin herself [Q4]. Zero learning curve for the primary user. The sidebar labels match her vocabulary.
B: Daily monitoring (WF4) and scalability. The hub is purpose-built for the most frequent workflow. Screen count for WF4 drops to 1-3 (best of any candidate). New features plug in as spoke items + dashboard tiles without disruption.
C: Screen-hop reduction and entity depth. Achieves the lowest average screen count (3.3 vs. A's 4.8 and B's 5.8). WF1 (semester setup) is the most dramatic single-workflow improvement: 19+ → 4-5 screens. WF6 (TA onboarding) drops to 1-2 screens.

Trade-Offs of the Recommended Candidate (A)

  1. WF1 (semester setup) is not as consolidated as C: Candidate A reduces WF1 from 19+ to 10-12 screens, but Candidate C achieves 4-5 screens by absorbing content into the Semester Entity Page. A's Content domain keeps content screens standalone, meaning the admin still navigates between Semester Management and Content during setup. Mitigation: The Setup Checklist tab on the Semester Hub Page provides a monitoring surface that tracks progress across content types, reducing the cognitive overhead of visiting separate screens even if the screen count is higher.

  2. WF4 (daily monitoring) is not as strong as B: Candidate A's dashboard is secondary — alerts and phase prompts exist but the dashboard is not the operational nerve center B achieves. A's monitoring circuit requires 2-4 screens vs. B's 1-3. Mitigation: A's dashboard can be iteratively enhanced toward B's vision. The structural difference is emphasis, not architecture — A has all the same alert capabilities, just with less prominence on the dashboard.

  3. Implementation effort is moderate (3/5), not the lightest: Three multi-tab entity pages (Student, Semester, TA) are substantial development investments. Mitigation: Entity pages can be built incrementally — Student Detail Page first (highest daily-use impact), then Semester Hub Page (highest setup/close impact), then TA Detail Page. The 70% nav-only changes can ship immediately, delivering value before entity pages are complete.

Elements to Incorporate from Other Candidates

From Candidate B, incorporate into A:

  - Dashboard alert depth (failed payments, TA response times, students behind on submissions, semester phase prompts) and the Issue Queue.

From Candidate C, incorporate into A:

  - Global search.
  - Deeper content status integration on the Semester Hub Page's Setup Checklist tab.


Part D: Sensitivity Analysis

Testing whether the recommendation changes when individual criteria are double-weighted.
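The weighted totals in these scenarios are simple arithmetic, and can be reproduced with a short script. This is a sketch for checking the tables; scores follow the criterion order 1-7 from the Part B scoring table.

```python
# Scores per candidate, in criterion order 1-7 from the Part B table.
SCORES = {
    "A": [4, 5, 5, 5, 4, 4, 3],
    "B": [3, 4, 5, 3, 5, 5, 4],
    "C": [5, 3, 5, 5, 3, 3, 2],
}

def weighted_total(scores, weights):
    """Sum of score x weight across the 7 criteria."""
    return sum(s * w for s, w in zip(scores, weights))

def scenario(double_weighted):
    """Totals with one criterion (0-based index) weighted 2x, others 1x."""
    weights = [2 if i == double_weighted else 1 for i in range(7)]
    return {c: weighted_total(s, weights) for c, s in SCORES.items()}

baseline = {c: weighted_total(s, [1] * 7) for c, s in SCORES.items()}
# baseline    -> {'A': 30, 'B': 29, 'C': 26}
# scenario(1) -> {'A': 35, 'B': 33, 'C': 29}  (mental model match 2x)
# scenario(6) -> {'A': 33, 'B': 33, 'C': 28}  (implementation effort 2x)
# scenario(0) -> {'A': 34, 'B': 32, 'C': 31}  (screen-hop reduction 2x)
```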

Scenario 1: Mental Model Match weighted 2x

| Criterion | Weight | A | B | C |
| --- | --- | --- | --- | --- |
| 1. Screen-hop reduction | 1x | 4 | 3 | 5 |
| 2. Mental model match | 2x | 10 | 8 | 6 |
| 3. Secondary user learnability | 1x | 5 | 5 | 5 |
| 4. Billing accommodation | 1x | 5 | 3 | 5 |
| 5. Scalability | 1x | 4 | 5 | 3 |
| 6. Catch-all risk | 1x | 4 | 5 | 3 |
| 7. Implementation effort | 1x | 3 | 4 | 2 |
| Total | | 35 | 33 | 29 |

Result: Recommendation unchanged. A's lead widens from 1 point to 2 points. This is expected — mental model match is A's strongest criterion. C falls further behind because its entity-centric structure departs the most from the admin's Q4 description.

Scenario 2: Implementation Effort weighted 2x

| Criterion | Weight | A | B | C |
| --- | --- | --- | --- | --- |
| 1. Screen-hop reduction | 1x | 4 | 3 | 5 |
| 2. Mental model match | 1x | 5 | 4 | 3 |
| 3. Secondary user learnability | 1x | 5 | 5 | 5 |
| 4. Billing accommodation | 1x | 5 | 3 | 5 |
| 5. Scalability | 1x | 4 | 5 | 3 |
| 6. Catch-all risk | 1x | 4 | 5 | 3 |
| 7. Implementation effort | 2x | 6 | 8 | 4 |
| Total | | 33 | 33 | 28 |

Result: Recommendation changes to a tie between A and B. If implementation speed is the dominant concern, B's lighter entity pages (enhanced list/detail vs. multi-tab rebuilds) and focused investment profile (one dashboard system vs. three entity page rebuilds) make it equally attractive. C falls to a distant third — its significant rearchitecture requirement is penalized heavily. If implementation effort is the deciding factor, B should be selected because it delivers the same nav-only benefits as A with lower custom development per screen.

Scenario 3: Screen-Hop Reduction weighted 2x

| Criterion | Weight | A | B | C |
| --- | --- | --- | --- | --- |
| 1. Screen-hop reduction | 2x | 8 | 6 | 10 |
| 2. Mental model match | 1x | 5 | 4 | 3 |
| 3. Secondary user learnability | 1x | 5 | 5 | 5 |
| 4. Billing accommodation | 1x | 5 | 3 | 5 |
| 5. Scalability | 1x | 4 | 5 | 3 |
| 6. Catch-all risk | 1x | 4 | 5 | 3 |
| 7. Implementation effort | 1x | 3 | 4 | 2 |
| Total | | 34 | 32 | 31 |

Result: Recommendation unchanged — A still wins, though the margin narrows to 2 points over B and 3 over C. Even with screen-hop reduction double-weighted, C's penalties on mental model match (3), scalability (3), catch-all risk (3), and implementation effort (2) outweigh its perfect screen-hop score. A's balanced profile holds up under this weighting.

Sensitivity Summary

| Scenario | Winner | Runner-up | Margin |
| --- | --- | --- | --- |
| Equal weight (baseline) | A (30) | B (29) | 1 |
| Mental model match 2x | A (35) | B (33) | 2 |
| Implementation effort 2x | A/B tie (33) | C (28) | 0 (A vs. B) / 5 (over C) |
| Screen-hop reduction 2x | A (34) | B (32) | 2 |

Candidate A is robust — it wins or ties in all four scenarios. The only scenario where B catches A is when implementation effort is double-weighted, producing a tie. Candidate C never wins under any weighting scenario, because its penalties on mental model, scalability, catch-all risk, and implementation effort accumulate across all weightings.


Part E: Risks of the Recommendation

Risk 1: Content Domain Friction During Semester Setup

Failure mode: The admin finds it tedious to navigate between the Semester Hub Page (Setup Checklist) and the standalone Content section (6 separate screens) during semester setup. She compares the experience unfavorably to Candidate C's integrated approach, where all content is managed within the semester context. Over time, she avoids using the Setup Checklist and reverts to her current approach of manually tracking what has been configured.

Likelihood: Medium. The Setup Checklist reduces cognitive load (she can see what is missing), but the physical navigation to Content screens remains a source of friction — 10-12 screens vs. C's 4-5 for WF1.

Mitigation: Enhance the Setup Checklist tab to include direct deep links to each content type's create/edit screen with the semester pre-filtered. Add a "return to checklist" affordance that returns the admin to the Setup Checklist after saving a content item. Consider allowing inline content previewing (thumbnail + title) within the checklist tab so the admin can spot-check without navigating away. These enhancements bring A closer to C's consolidation without restructuring the Content domain.
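The deep-link enhancement can be sketched as a small URL builder. The route names and query parameters here are illustrative assumptions, not the admin app's actual paths: each checklist link carries the semester as a pre-applied filter plus a `returnTo` parameter that implements the "return to checklist" affordance.

```python
from urllib.parse import urlencode

def checklist_deep_link(content_type: str, semester_id: str) -> str:
    """Build a link from the Setup Checklist to a content screen,
    pre-filtered to the semester and carrying a return-to-checklist
    target. Route shapes are hypothetical placeholders."""
    params = urlencode({
        "semester": semester_id,                          # pre-applied filter
        "returnTo": f"/semesters/{semester_id}/checklist",  # post-save redirect
    })
    return f"/content/{content_type}?{params}"

# checklist_deep_link("video-lessons", "2026-spring")
# -> "/content/video-lessons?semester=2026-spring&returnTo=%2Fsemesters%2F2026-spring%2Fchecklist"
```

The design intent is that saving a content item reads `returnTo` and redirects, so the admin never manually re-navigates to the checklist between content types.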

Risk 2: Dashboard Remains Underinvested

Failure mode: Because A treats the dashboard as secondary (not the hub), it receives lower development priority and ships as a minimal implementation — alert counts without context, no deep links, no configurable thresholds. The admin continues using the dashboard as she does today: ignoring it. Daily monitoring (WF4) does not materially improve despite the structural capability.

Likelihood: Medium-High. Development teams naturally prioritize the primary organizing element. If A's primary investment goes to entity pages and domain reorganization, the dashboard may be deprioritized.

Mitigation: Explicitly include dashboard alert logic in the MVP scope, not as a Phase 2 enhancement. Define the minimum viable dashboard as: (1) failed payment alerts from Stripe, (2) TA response time flags, (3) students-behind-on-submissions count, (4) semester phase prompt. Each alert must have a deep link to the resolution context. Borrowing B's dashboard specification as the target ensures the dashboard delivers value from launch.
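The MVP dashboard scope above can be pinned down as data, which makes the "every alert must deep-link to its resolution context" rule checkable. Alert IDs and link paths here are illustrative assumptions; the four alert types come from the mitigation text.

```python
# Minimum viable dashboard spec, expressed as data. The deep_link
# values are placeholder routes, not the real app's paths.
MVP_DASHBOARD_ALERTS = [
    {"id": "failed-payments",  "source": "Stripe",   "deep_link": "/billing/payments?status=failed"},
    {"id": "ta-response-time", "source": "internal", "deep_link": "/teachers?flag=slow-response"},
    {"id": "students-behind",  "source": "internal", "deep_link": "/students?filter=behind-on-submissions"},
    {"id": "semester-phase",   "source": "internal", "deep_link": "/semesters/current"},
]

def alerts_missing_deep_links(alerts):
    """Guardrail for the MVP rule: every alert must resolve somewhere."""
    return [a["id"] for a in alerts if not a.get("deep_link")]
```

Treating the spec as data lets a unit test fail the build if a new alert tile ships without a resolution link.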

Risk 3: TA/Support Staff Navigation Confusion During Transition

Failure mode: TAs and view-only admins accustomed to the current navigation struggle to find things in the new 8-domain structure. The domain labels (Student Management, Semester Management) are meaningful to the primary admin but may not be immediately clear to occasional users accustomed to thinking in terms of "People" and "Program." Without training materials, support tickets increase.

Likelihood: Low-Medium. WS1.4 specifies role-based sidebar filtering — TAs see only 4 items (My Students, My Schedule, My Groups, Recordings), which is simpler than the current full sidebar. View-only admins see all 8 domains but with action buttons disabled. The simplification for TAs actually reduces confusion. The risk is primarily for view-only admins who see the full structure.

Mitigation: The Criterion 3 analysis showed that all three candidates score 5/5 on secondary user learnability for the tested task. The screen labels within each domain use familiar nouns (Students, Appointments, etc.), which supports discovery. Provide a one-page orientation guide mapping old locations to new ones for view-only admins. Consider inline tooltips ("formerly under Program") during the first 30 days after launch.
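The WS1.4 role-based filtering that limits this risk can be sketched as a simple lookup. The TA item labels are quoted from the text; the full 8-domain sidebar is reconstructed from domains named elsewhere in this document, with "Dashboard" and "Settings" as assumed entries.

```python
# Role-based sidebar filtering per WS1.4. Two of the eight top-level
# labels (Dashboard, Settings) are assumptions; the rest appear in
# the candidate descriptions.
FULL_SIDEBAR = [
    "Dashboard", "Student Management", "Semester Management", "Content",
    "Teacher Management", "Scheduling", "Billing & Payments", "Settings",
]
TA_SIDEBAR = ["My Students", "My Schedule", "My Groups", "Recordings"]

def sidebar_for(role: str) -> list[str]:
    """TAs see a 4-item sidebar; admins and view-only admins see all 8 domains."""
    return TA_SIDEBAR if role == "ta" else FULL_SIDEBAR

def can_act(role: str) -> bool:
    """View-only admins see the full structure with action buttons disabled."""
    return role == "admin"
```

Because TAs get a shorter menu than today's full sidebar, the transition risk concentrates on view-only admins, exactly as the likelihood assessment states.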


Part F: Convergence Preview — Billing Integration (Previewing Workstream 2)

Workstream 2 will focus on billing integration — bringing Stripe data, family plans, scholarships, and deferments into the admin backend. This section previews how well each candidate's structure accommodates that integration.

Candidate A: Natural Billing Home

Billing in Candidate A has a dedicated domain (Billing & Payments) with 6 screens: Payment Overview, Payments history, Payment Plans, Coupons, Family Plans, and Scholarships & Deferments. Student-level billing is embedded on the Student Detail Page's Payments tab.

Convergence strength: The Billing & Payments domain provides a natural top-level home for all aggregate billing features, including future additions like revenue dashboards, payment analytics, refund management, and dunning workflows. The Student Detail Page's Payments tab provides the student-level integration point. The dual-level design (domain for aggregate, entity tab for individual) means Workstream 2 development has clear architectural targets — aggregate features go to the Billing domain, student-level features go to the Payments tab.
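The dual-level routing rule can be made concrete as a small decision table. Feature names are taken from the examples in this section; the target strings are descriptive, not real routes.

```python
# Workstream 2 architectural targets under Candidate A:
# aggregate features -> Billing & Payments domain,
# student-level features -> Payments tab on the Student Detail Page.
AGGREGATE_FEATURES = {
    "revenue dashboards", "payment analytics",
    "refund management", "dunning workflows",
}
STUDENT_FEATURES = {
    "payment history", "subscription status changes",
    "payment plan adjustments",
}

def ws2_target(feature: str) -> str:
    """Return the structural home for a Workstream 2 billing feature."""
    if feature in AGGREGATE_FEATURES:
        return "Billing & Payments domain"
    if feature in STUDENT_FEATURES:
        return "Student Detail Page > Payments tab"
    raise ValueError(f"unclassified feature: {feature}")
```

The point of the sketch is the clean two-way split: every planned billing feature classifies into exactly one of the two homes, which is the "clear architectural targets" claim above.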

Convergence risk: The Billing domain is a new section that does not exist today. Workstream 2 must build both the domain screens and the Stripe API integration simultaneously. If the Billing domain is deprioritized or descoped, the structure degrades gracefully — Payment Plans and Coupons can remain in Settings temporarily, and the Student Detail Page's Payments tab can be released without the full domain.

Score: 5/5 — The strongest billing convergence because billing has a first-class structural home at both the aggregate and individual levels.

Candidate B: Dashboard-First Billing Detection, Spoke-Based Resolution

Billing in Candidate B lives in a standalone Billing spoke (aggregate) with the same 6 screens as A. Student-level billing is an enhanced payment-status field on the student detail page that links to the Billing spoke for detail.

Convergence strength: The Dashboard's failed payment tile and billing alert integration means billing exceptions are immediately visible. Workstream 2's alert-related features (failed payment detection, expiring subscription warnings, dunning triggers) plug directly into the hub as new tiles — the easiest integration path for billing monitoring.

Convergence risk: Student-level billing is shallow. When a support staff member needs to investigate a student's billing issue, they must navigate from the student detail page to the Billing spoke, losing student context. Workstream 2 would need to either (a) deepen the student detail page with a billing tab (moving toward A's model) or (b) accept the spoke-hop for billing investigation. Option (a) is the likely outcome, which means B naturally evolves toward A's entity page depth over time.

Score: 3/5 — Billing detection is strong (dashboard tiles), but billing resolution is structurally separated from the student context. Workstream 2 will likely need to add entity-page-level billing tabs, partially negating B's lighter entity page design.

Candidate C: Deepest Student-Level Billing, Cross-Entity Aggregate

Billing in Candidate C is embedded most deeply at the student level — the Student Entity Page's Payments tab includes full payment history, subscription management, family plan details, and inline actions. Aggregate billing lives in the Cross-Entity Billing section.

Convergence strength: The Student Entity Page's Payments tab is the most complete billing integration of any candidate at the individual student level. For Workstream 2 features that are student-centric (payment plan adjustments, refund processing, subscription status changes), the entity page provides the richest context.

Convergence risk: The "Cross-Entity" label signals that billing is a second-class citizen in the entity model — it does not have its own entity, so it lives in a cross-cutting section. If Workstream 2 introduces complex billing features (revenue analytics, payment gateway management, billing rules engine), they must go into the cross-entity section, which is already the model's weakest structural area. The cross-entity sections are C's catch-all, and adding more billing complexity there increases the orphan/catch-all pressure documented in Criterion 6.

Score: 4/5 — Deepest student-level integration but weaker aggregate-level structural home. Workstream 2's complex billing features would strain the cross-entity architecture.

Convergence Summary

| Dimension | Candidate A | Candidate B | Candidate C |
| --- | --- | --- | --- |
| Student-level billing depth | Deep (entity tab) | Shallow (enhanced field + link) | Deepest (entity tab with full CRUD) |
| Aggregate billing home | First-class domain | Spoke (same structure as A) | Cross-entity section (second-class) |
| Billing monitoring/alerts | Secondary (Dashboard) | Primary (Hub tiles) | Tertiary (lightweight Dashboard) |
| Workstream 2 architectural targets | Clear (domain + entity tab) | Clear for alerts, unclear for depth | Clear for student-level, strained for aggregate |
| Future billing feature scalability | High | Medium (evolves toward A) | Medium (cross-entity strain) |
| Convergence Score | 5/5 | 3/5 | 4/5 |

Candidate A provides the cleanest path for Workstream 2 because billing has a first-class structural home at both the aggregate level (Billing & Payments domain) and the individual level (Student Detail Page Payments tab). Development teams can build aggregate billing features into the domain and student-level billing features into the entity tab, with the dashboard surfacing exceptions — a clear separation of concerns that maps directly to the billing integration work.


Enhanced Candidate A: The Final Recommendation

After incorporating the best elements from B and C, the recommended "Enhanced Candidate A" has these characteristics:

From Candidate A (core):

  - Domain-first sidebar using the admin's own vocabulary, with a first-class Billing & Payments domain.
  - Three multi-tab entity pages: Student Detail, Semester Hub, and TA Detail.

Incorporated from Candidate B:

  - Dashboard alert depth (failed payments, TA response times, students behind, semester phase prompts) and the Issue Queue.

Incorporated from Candidate C:

  - Global search.
  - Deeper content status integration on the Semester Hub Page's Setup Checklist tab.

Estimated workflow improvements (Enhanced A vs. current state):

| Workflow | Current | Enhanced A | Reduction |
| --- | --- | --- | --- |
| WF1: Semester Setup | 19+ screens, 50-80 clicks, 3 ext tools | 8-10 screens, 20-35 clicks, 1 ext tool | 47-58% screens, 56-75% clicks, 67% ext tools |
| WF2: Student Support | 5-7 screens, 15-25 clicks, 3 ext tools | 1-2 screens, 4-7 clicks, 0 ext tools | 71-86% screens, 65-81% clicks, 100% ext tools |
| WF3: Semester Close | 8+ screens, 40-70 clicks, 3 ext tools | 4-6 screens, 15-25 clicks, 0 ext tools | 38-50% screens, 55-64% clicks, 100% ext tools |
| WF4: Daily Monitoring | 4-5 screens, 12-18 clicks, 3 ext tools | 1-3 screens, 3-8 clicks, 0 ext tools | 40-78% screens, 56-83% clicks, 100% ext tools |
| WF5: Scheduling | 4-6 screens, 12-20 clicks, 2 ext tools | 1-2 screens, 5-8 clicks, 0 ext tools | 58-75% screens, 50-60% clicks, 100% ext tools |
| WF6: TA Onboarding | 6-8 screens, 20-35 clicks, 1 ext tool | 2-3 screens, 8-12 clicks, 0 ext tools | 58-75% screens, 60-66% clicks, 100% ext tools |

External tool reduction: From 15 external tool dependencies across 6 workflows to 1 (Vimeo for video URLs, which is structural and cannot be eliminated by nav redesign). Google Sheets eliminated entirely. Stripe dashboard eliminated entirely.
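The reduction percentages above follow from (current − new) / current, evaluated at both ends of each range. A small helper makes the arithmetic explicit; the WF1 example uses 19 as the current screen count (the "19+" floor).

```python
def reduction_range(cur_lo: int, cur_hi: int, new_lo: int, new_hi: int):
    """Worst- and best-case percentage reduction when both the current
    and redesigned values are ranges. Rounded to whole percent."""
    worst = round((cur_lo - new_hi) / cur_lo * 100)  # least improvement
    best = round((cur_hi - new_lo) / cur_hi * 100)   # most improvement
    return worst, best

# WF1 screens: current 19+, Enhanced A 8-10
# reduction_range(19, 19, 8, 10) -> (47, 58), i.e. the 47-58% in the table
```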


Summary

Candidate A (Domain-First with Rich Entity Pages) is recommended based on the following evidence:

  1. Highest overall score (30/35 at equal weight), winning or tying on 3 of 7 criteria.
  2. Robust under sensitivity analysis — wins or ties in all four weighting scenarios.
  3. Best mental model match — the only candidate whose structure was proposed by the primary user.
  4. Strongest billing convergence for Workstream 2 — billing has a first-class home at both aggregate and individual levels.
  5. Balanced improvement across all 6 workflows (roughly 50% screen reduction even at the conservative end of each range) without the implementation risk of Candidate C's rearchitecture or the monitoring overemphasis of Candidate B.

Key elements to incorporate from the other candidates: B's dashboard alert depth and Issue Queue; C's global search and deeper content status integration on the Semester Hub.

The primary risks — Content domain friction during semester setup and dashboard underinvestment — are mitigable through targeted enhancements that do not require structural changes to the recommended navigation architecture.