How It Works

The Course Factory

The Course Factory is the operating model that turns stalled catalogs into finished ones. We scope the catalog, lock standards, prove the workflow on a pilot, and ship verified batches so fixed launch dates survive real-world constraints.

What ships with the engagement

  • Course inventory + complexity scoring (Light / Standard / Heavy)
  • 1 to 2 templates + standards rules (navigation, naming, module structure)
  • Batch delivery (25 to 50 courses at a time)
  • QA checklist + exception log per batch
  • QA Evidence Pack for sign-off
  • Delivery inside your LMS (Canvas, Blackboard, Moodle, and others: confirm your platform on the fit call)
  • 30-day warranty window after each batch is accepted, covering defects within the acceptance criteria

Best fit when

  • 20+ courses
  • Hard deadline / launch window
  • Small internal team
  • Inconsistent standards, messy source content, or SME availability constraints

Not a fit when

  • Open-ended redesign with no scope boundaries
  • Projects where subject matter experts are entirely unavailable for review
  • Heavy net-new media production (video, animation, or interactive simulations) unless separately scoped

Delivery Flow

How a catalog moves from inventory to sign-off

Each stage reduces risk before the next one starts: scope first, pilot next, then production batches with QA evidence.

01

Inventory + Complexity Scoring

We walk your entire catalog and build a course-by-course inventory: what exists, what is missing, what has integrations or dependencies, and how much work each course actually requires.

  • Light: primarily text-based, single-template, minimal transformation (policy acknowledgments, simple content pages)
  • Standard: multiple content types, embedded media, and structured assessments, including clinical and didactic program structures (the bulk of most catalogs)
  • Heavy: interactive elements, LTI integrations, branching logic, or complex assessment structures requiring custom build and extended QA

The deliverable is a scored inventory spreadsheet with a delivery schedule and fixed price.

  • Inventory every course, asset, and integration
  • Score complexity (Light / Standard / Heavy)
  • Define batch size, delivery schedule, and fixed price

02

Lock Standards

Most stalled catalogs drift because every builder interprets the standards differently. Locking standards before production starts eliminates that drift.

The standards document specifies templates, naming conventions, navigation rules, and the written Definition of Done that every course is verified against. Without it, quality is a matter of opinion. With it, quality is a checklist.

  • 1 to 2 templates selected and configured
  • Naming, navigation, and module structure rules defined
  • Written Definition of Done (acceptance criteria) locked
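"Quality is a checklist" can be made concrete: a Definition of Done is a set of named, verifiable criteria that a course either meets or does not. The criteria and field names below are illustrative, not Tamitu's actual acceptance criteria.

```python
# Each criterion is a named predicate over a course record (illustrative only).
DEFINITION_OF_DONE = {
    "no_critical_broken_links": lambda c: c["broken_links"] == 0,
    "template_applied":         lambda c: c["template"] == "approved-v1",
    "publish_state_correct":    lambda c: c["published"] is True,
}

def verify(course: dict) -> dict:
    """Return pass/fail per criterion; sign-off is objective, not a matter of opinion."""
    return {name: check(course) for name, check in DEFINITION_OF_DONE.items()}

result = verify({"broken_links": 0, "template": "approved-v1", "published": True})
print(result)
# {'no_critical_broken_links': True, 'template_applied': True, 'publish_state_correct': True}
```

Because every criterion is written down before production starts, a failed check is either fixed or logged as an exception; it is never renegotiated mid-batch.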

03

Pilot: Prove the Process

We build 3 to 5 courses first: not a demo, but a live test of the entire workflow before committing to full production.

After the pilot, your team has confirmed:

  • Actual build velocity
  • Review cadence requirements
  • Where automation rules need tuning

  • 3 to 5 course pilot batch built and delivered
  • Workflow validated end-to-end before scale
  • Automation rules and QA checks calibrated to real content

04

Batch Delivery + QA Evidence

Production runs in batches of 25 to 50 courses, each verified against the Definition of Done.

Foundry: Tamitu's proprietary automated production pipeline handles course assembly, template application, and LMS publishing programmatically across Canvas LMS, Blackboard, Moodle, MHE Connect, Cengage MindTap, and Elsevier, eliminating manual rebuild steps for each course.

Delivery: Each batch is delivered into your LMS with a Tamitu Verified QA Evidence Pack: checklist, exception log, and sign-off record produced against the Tamitu Verified framework.

Cadence: Weekly review keeps the work visible and confirms direction before the next batch starts.

Every exception is categorized, assigned an owner, and given a next step.

  • Integration: depends on a third-party system or client IT action
  • Content: requires SME input or source material
  • Scope: falls outside the agreed Definition of Done and requires a change order
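The exception log is structured data, not a notes field: every entry carries a category, an owner, and a next step, so nothing stalls without a name attached. A minimal sketch, with hypothetical course IDs and owners:

```python
from collections import Counter
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    INTEGRATION = "Integration"  # depends on a third-party system or client IT
    CONTENT = "Content"          # requires SME input or source material
    SCOPE = "Scope"              # outside the Definition of Done; needs a change order

@dataclass
class ExceptionEntry:
    course_id: str
    description: str
    category: Category
    owner: str       # who is accountable for resolution
    next_step: str   # the concrete action that moves it forward

# Illustrative batch log (IDs and owners are made up)
log = [
    ExceptionEntry("NUR-101", "Library database link requires proxy configuration",
                   Category.INTEGRATION, "Client IT", "Whitelist the proxy URL"),
    ExceptionEntry("BIO-210", "Week 4 lab content missing from source files",
                   Category.CONTENT, "SME reviewer", "Provide the lab handout"),
]

# Category counts feed the batch report
print(Counter(e.category.value for e in log))
```

Categorizing exceptions this way keeps the three failure modes separate: integration items route to IT, content items route to SMEs, and scope items become change requests rather than silent scope creep.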

After sign-off, any defect within the acceptance criteria that surfaces within 30 days is covered under warranty.

  • Batches of 25 to 50 courses with weekly review cadence
  • QA checklist + categorized exception log per batch
  • QA Evidence Pack + sign-off + 30-day warranty

Sample Timeline

Representative timeline for 100 courses

This is a representative 100-course engagement: scope first, pilot next, then production batches with QA evidence and sign-off.

Weeks 1 to 2

Inventory, complexity scoring, and standards lock. You know the scope, price, and timeline before production starts.

Week 3

Pilot batch (3 to 5 courses). Proves the process works and calibrates expectations before full production begins.

Weeks 4 to 10

Production batches of 25 to 50 courses. QA evidence + exception log with each batch. Weekly review cadence.

Week 11+

Final sign-off, 30-day warranty window for defects within acceptance criteria, and clean handoff documentation.

Actual timelines depend on catalog size, complexity mix, and SME availability.

Blueprint Sprint starts at $2,500. Course Factory Sprints start at $9,500. Exact pricing is scoped to catalog size and complexity, and both scope and price are confirmed in writing before work starts.

What Your Team Provides

What needs to be ready before production starts

The most common reasons catalog projects miss deadlines are late source content, unclear LMS access, and a review contact who is unavailable when sign-off is needed.

Source Content

When content is late: The batch sequence shifts, and downstream batches shift with it.
What we need: Raw drafts, slide decks, and rough outlines are all workable.

  • Existing course files, slide decks, PDFs, or documents
  • Raw drafts are fine: we work with what you have
  • Clear indication of which content is current vs. outdated

LMS Access

We work inside your LMS on accounts your IT team provisions and controls.

  • A sub-account with course editor or designer-level permissions
  • No admin access required unless the scope demands it
  • Access is revoked at project close

Review Availability

Review delays are the single most common source of missed catalog deadlines. We need a named contact who can review a batch within the agreed window (typically 2 to 3 business days) and either sign off or return specific feedback.

  • A designated point of contact who can review and approve each batch
  • Subject matter expert time for content accuracy checks (typically 1 to 2 hours per batch)
  • Timely sign-off at each wave: delays on approval delay the next batch

Acceptance

How “done” is defined before work starts

The Definition of Done is agreed before production starts and does not change without a documented change order. It is Tamitu Verified-compliant, meaning it maps to Quality Matters (QM) standards, not just internal checklists. The framework supports healthcare career colleges, higher education institutions, and nonprofit training organizations.

Sign-off is objective: a course either meets the criteria or it does not. Exceptions are tracked: anything that fails is either fixed or logged with an assigned owner. Scope is protected: anything outside the DoD is a change request, not a defect.

Core

  • 0 critical broken links in student-facing areas (or documented exceptions)
  • Templates + standards applied consistently
  • Publish states correct
  • Files organized (no obvious duplicates or stray content)

Accessibility (baseline)

  • Heading structure is properly sequenced
  • Key images have alt text
  • Template elements pass contrast
  • Accessibility checks target conformance to WCAG 2.1 Level AA and Section 508 standards
  • Anything out of scope is documented in the exception log
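Baseline checks like heading sequencing and alt text are exactly the kind of criteria that can be verified programmatically. This is a minimal stdlib sketch of the idea, not Tamitu's actual QA tooling; real conformance checking against WCAG 2.1 AA covers far more than these two rules.

```python
from html.parser import HTMLParser

class BaselineA11yCheck(HTMLParser):
    """Flag skipped heading levels and images missing alt text (baseline only)."""

    def __init__(self):
        super().__init__()
        self.last_heading = 0
        self.issues = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            level = int(tag[1])
            # A jump of more than one level (e.g. h1 -> h3) is a sequencing issue
            if self.last_heading and level > self.last_heading + 1:
                self.issues.append(f"Heading skip: h{self.last_heading} -> {tag}")
            self.last_heading = level
        elif tag == "img":
            attr_map = dict(attrs)
            if not attr_map.get("alt"):
                self.issues.append(f"Image missing alt text: {attr_map.get('src', '?')}")

# Illustrative page fragment with two baseline failures
page = "<h1>Module 1</h1><h3>Reading</h3><img src='dna.png'>"
checker = BaselineA11yCheck()
checker.feed(page)
print(checker.issues)
# ['Heading skip: h1 -> h3', 'Image missing alt text: dna.png']
```

Automating checks like these per course is what lets a batch QA pass scale to 25 to 50 courses while leaving human review time for the judgment calls the checklist cannot make.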

Evidence

  • QA checklist completed
  • Exception log delivered (owner + next step for each item)
  • QA Evidence Pack generated per batch
  • Sign-off captured from client contact

Talk Through Your Catalog