Cheat Sheet · Instructional Design

ID Models & Frameworks

3 are true design models — they give you a process. 4 are frameworks, taxonomies, or principles. Both matter, but they operate at different levels of the design process.

7 tools.
3 models.
4 frameworks.
ID Model — gives you a design process
Framework / Taxonomy / Principles — shapes your thinking
The 3 True ID Models
ID Model · Process
ADDIE
The classic sequential design process
Analyze Design Develop Implement Evaluate
A linear scaffold. You don't have to follow it strictly, but the Analysis phase alone is worth the whole model — it forces you to define the problem before building the solution.
Use when: Clear scope, waterfall delivery, defined stakeholder handoffs.
ID Model · Process
SAM
Successive Approximation Model — agile ID
Preparation Iterative Design Iterative Dev
Built for iteration. Design → prototype → review → repeat. SMEs give feedback on real prototypes, not abstract storyboards. Less useful when you need a clear deliverable timeline upfront.
Use when: SMEs available for iterative feedback rounds.
ID Model · Process
4C/ID
Four-Component Instructional Design — complex skills
Learning Tasks Supportive Info Procedural Info Part-task Practice
The most demanding of the three. Designed for genuinely complex skills where part-task practice alone isn't enough. Overkill for most corporate eLearning. Essential when the skill demands it.
Use when: Complex whole-task performance (clinical, technical, managerial).
The 4 Frameworks, Taxonomies & Principles
Taxonomy · Cognitive Levels
Bloom's
6-level classification of learning
Remember Understand Apply Analyze Evaluate Create
Not a design process. A reference for writing objectives at the right cognitive level. If you're not using Bloom's to write objectives, you're probably measuring the wrong thing.
Use for: Writing measurable learning objectives.
Principles · Instruction
Gagné's 9
9 events that define effective instruction
Gain Attention Inform Objectives Recall Prior Present Content Guide Learning Elicit Performance Provide Feedback Assess Performance Enhance Transfer
Chronically underrated. Most IDs know the name but haven't sequenced against all 9. When you do, gaps appear — especially the missing Elicit Performance step before feedback.
Use for: Sequencing and pacing a course correctly.
Principles · Instruction
Merrill's
First Principles of Instruction
Problem-centred Activation Demonstration Application Integration
Demanding to apply properly, but courses built on it feel noticeably different. Learners do more, discuss more, apply more. It's not a process — it's a set of conditions for effective learning.
Use for: Designing applied, problem-centred learning experiences.
Framework · Evaluation
Kirkpatrick
4-level training evaluation framework
L1 Reaction L2 Learning L3 Behavior L4 Results
Not a design model at all — it's an evaluation framework. Everyone cites it. Very few orgs measure beyond Level 1 (smile sheets). The model is sound. The implementation gap is the real problem.
Use for: Measuring training effectiveness after delivery.
Same Topic · 7 Lenses
Topic: "How to handle an angry customer"
A classic corporate soft-skill scenario — see how each model/framework shapes the output differently.
ADDIE
ID Model
How it shapes the design
Analyze: Survey frontline staff to identify top 5 angry-customer scenarios. Define gap between current and desired behavior.
Design: Map learning objectives (Apply de-escalation steps), choose branching scenario format.
Develop: Build Storyline module with 3 branching paths.
Implement: Deploy to LMS, notify managers.
Evaluate: Run post-training observation checks at 30 days.
SAM
ID Model
How it shapes the design
Week 1: Rapid prototype — a rough 3-slide branching scenario. Share with 2 SMEs (team lead + customer service manager).
Week 2: Revise based on feedback. "The escalation path felt too easy." Add a harder consequence branch.
Week 3: Second prototype. SME signs off. Begin full development.
Result: The module reflects real edge cases because you iterated with real feedback, not assumptions.
4C/ID
ID Model
How it shapes the design
You don't teach "Step 1: Listen. Step 2: Apologise. Step 3: Solve." as separate lessons. Throw the learner straight into a full, realistic angry-customer call and let them figure it out — just like real life. The component pieces are still there, but as reference support, not the main event.
Bloom's
Taxonomy
How it shapes the design
✗ "Understand how to handle angry customers." Nobody can test that.

✓ "Handle a live angry customer call and resolve it without escalating." Now you know exactly what practice looks like, what the test looks like, and what success looks like.
Gagné's 9
Principles
How it shapes the design
Most courses: watch a video of someone handling an angry customer → take a quiz. Gagné says: handle a simulated angry customer yourself first → then watch the expert version and see where you went wrong. You learn more from failing the attempt than from watching the answer.
Merrill's
Principles
How it shapes the design
Don't open with a slide that says "Module 3: De-escalation Techniques." Open with a recording of a real call that went badly — then ask: "What would you have done?" Now the learner has a reason to care about everything that comes next.
Kirkpatrick
Framework
How it shapes the design
Ask the business: "How will we know it worked in 3 months?" If they say "fewer calls escalating to a manager" — great, that's your finish line. Now build backwards toward it. If they say "we'll check course completions" — that tells you nothing about whether anyone actually got better.
The Core Mantras
ADDIE
"A blueprint before you build."
Think architect, not developer. The value is in the plan. Without Analysis, you're building a solution to an undefined problem.
↳ Angry customer — in practice
You talk to the actual reps before building anything. Turns out they already know how to stay calm — they just aren't allowed to offer refunds without a manager's approval. No amount of training fixes a process problem. You just saved 3 weeks building the wrong course.
SAM
"Show, don't tell — then show again."
Get something in front of stakeholders fast. Feedback on a rough prototype beats feedback on a 10-page storyboard document nobody reads properly.
↳ Angry customer — in practice
You throw together a rough 3-screen demo in one day and show it to the manager. She says: "This customer sounds too nice — ours actually swear at us." You fix it on day two. If you'd spent a month building first, that feedback would have cost weeks. Show early, fix cheap.
4C/ID
"Whole tasks, not just parts."
If the job requires doing the whole thing under pressure, you have to train the whole thing under pressure — not individual steps in isolation.
↳ Angry customer — in practice
You don't teach "Step 1: Listen. Step 2: Apologise. Step 3: Solve." in separate lessons. You throw the learner straight into a full angry customer call and let them figure it out — just like real life. The steps are still there, but as support, not the main event.
Bloom's
"What will they DO with this knowledge?"
If the verb in your objective is "understand," rewrite it. Bloom's forces you to be specific about what cognitive level you're actually targeting — and assessing.
↳ Angry customer — in practice
✗ "Understand how to handle angry customers." Nobody can test that.

✓ "Handle a live angry customer call and resolve it without escalating." Now you know exactly what practice looks like, what the test looks like, and what good looks like.
Gagné's 9
"Did they actually practice before you told them how they did?"
Most courses skip the practice step entirely. They show content, then immediately give feedback on a quiz. That's not training — that's testing knowledge they haven't used yet.
↳ Angry customer — in practice
Most courses: watch a video of someone handling an angry customer → take a quiz. Gagné says: handle a simulated angry customer yourself first → then watch the expert version and see where you went wrong. You learn more from failing the attempt than from watching the answer.
Merrill's
"Start with a real problem, end with real application."
If the learner never does anything with the content, it's not instruction — it's a presentation. Merrill's keeps you honest about whether the design actually demands performance.
↳ Angry customer — in practice
Don't open with a slide that says "Module 3: De-escalation Techniques." Open with a recording of a real call that went badly — then ask: "What would you have done?" Now the learner has a reason to care about everything that comes next.
Kirkpatrick
"Agree on what success looks like before you build."
Work backwards from the real-world result. If you can't say what will be different in people's behaviour after training, you probably don't need training — you need a process fix.
↳ Angry customer — in practice
Ask the business: "How will we know it worked in 3 months?" If they say "fewer calls escalating to a manager" — great, that's your finish line. Now build backwards toward it. If they say "we'll check course completions" — that tells you nothing about whether anyone actually got better at handling angry customers.
Which one do I reach for?
"What's my design process for this project?"
ADDIE (waterfall) or SAM (iterative)
↳ Angry customer
Got a fixed deadline and a manager who needs to approve every stage? → ADDIE. Manager happy to look at rough drafts every week and give notes? → SAM.
"How complex is the skill I'm designing for?"
→ If genuinely complex whole-task: 4C/ID
↳ Angry customer
A rep who handles angry customers all day, every day → the skill is complex enough to need full simulation. A cashier who gets one rude customer a month → a short scenario is plenty. Don't use a sledgehammer to crack a nut.
"Are my learning objectives measurable?"
→ Check against Bloom's — fix the verb
↳ Angry customer
"Understand angry customers" tells you nothing about how to test it. Rewrite it as "Handle a difficult customer call without escalating it." Now you know exactly what the practice and the test look like.
"Does my course sequence feel thin or rushed?"
→ Map against Gagné's 9 Events — find the gap
↳ Angry customer
Go through your slides one by one. Did you ever ask the learner to actually try handling an angry customer before telling them if they got it right? If not — that's the missing step. Add it before the feedback, not after.
"Are learners doing enough in this course?"
→ Apply Merrill's Principles — add real problems
↳ Angry customer
If your course is mostly slides with a quiz at the end, learners are watching — not doing. Flip it: put the angry customer scenario at the start, let them try to handle it, then teach the technique. They'll remember it because they felt the problem first.
"How will I prove this training worked?"
→ Define Kirkpatrick levels before you build
↳ Angry customer
Ask your stakeholder before you build: "In 3 months, what will be different if this training works?" If they say "fewer escalations to supervisors" — great, build toward that. If they say "everyone will have completed it" — that just means people clicked through. It says nothing about whether anything actually changed.

Not sure which model fits your project?

Try the ID Model Selector ↗