ADDIE Model

A five-phase instructional design framework consisting of Analysis, Design, Development, Implementation, and Evaluation, used to create structured training programs and learning experiences.

What Is the ADDIE Model?

Key Takeaways

  • ADDIE stands for Analysis, Design, Development, Implementation, and Evaluation. It's the most widely used instructional design framework in corporate training and education.
  • The model was originally created in 1975 for the US Army by Florida State University. It has since been adapted and used across industries worldwide.
  • Each phase builds on the previous one: you analyze the problem before designing the solution, design before developing content, develop before implementing, and evaluate throughout.
  • ADDIE works best for large-scale, high-stakes training programs where upfront planning justifies the time investment: compliance training, onboarding programs, certification courses, and technical skill-building.
  • Modern variations like SAM (Successive Approximation Model) and agile instructional design have emerged as faster alternatives, but ADDIE remains the foundational framework most L&D professionals learn first.

ADDIE is how training professionals turn a business problem into a learning solution. It's a step-by-step process: figure out what employees need to learn, design a program to teach it, build the content, roll it out, and check whether it worked.

The model provides structure to what could otherwise be a chaotic process. Without a framework, training development tends to jump straight from "we need a course" to "let's build slides," skipping the critical analysis that determines whether a course is even the right solution. ADDIE forces you to slow down and ask the right questions before investing in content creation.

That said, ADDIE isn't perfect. Its linear nature can make it slow. In fast-moving organizations, spending weeks in the analysis phase while the business need evolves isn't practical. That's why many teams use modified versions of ADDIE that overlap phases or build in rapid prototyping. The framework matters less than the discipline of systematic thinking about learning design.

  • 5 sequential phases: Analysis, Design, Development, Implementation, Evaluation
  • 1975: year the original model was created for the US Army by Florida State University's Center for Educational Technology
  • 67% of instructional designers report using ADDIE or an ADDIE-based model in their work (ATD, 2023)
  • ~200 hrs: average development time for one hour of interactive e-learning using ADDIE (Chapman Alliance, 2023)

Phase 1: Analysis

The analysis phase answers the fundamental question: what problem are we trying to solve, and is training the right solution?

Performance gap analysis

Start by defining the gap between current performance and desired performance. A sales team closing 15% of leads when the target is 25% has a measurable performance gap. But is it a skill gap? Or is it a process problem, a tool problem, a motivation problem, or a hiring problem? Training only works when the root cause is a lack of knowledge or skill. If salespeople know how to close but their CRM makes it hard to follow up, no sales training course will fix that.

Learner analysis

Who are the learners? What do they already know? What's their experience level? How do they prefer to learn? What constraints do they face (time, location, technology access)? A training program for entry-level warehouse workers looks very different from one for senior financial analysts. Understanding the audience prevents the most common design mistake: building content that's too basic for experts or too advanced for beginners.

Context analysis

What resources are available? What's the budget? What technology do learners have access to? What's the timeline? Are there regulatory requirements that dictate content or format? Context analysis sets realistic boundaries for design decisions. There's no point designing a VR-based simulation if the budget is $5,000 and learners work in areas without reliable internet.

Phase 2: Design

The design phase translates analysis findings into a learning blueprint. This is where you plan what to teach, how to teach it, and how to assess it.

Learning objectives

Every effective training program starts with clear, measurable learning objectives. Use Bloom's Taxonomy to write objectives at the right cognitive level. "Understand conflict resolution" is vague. "Apply the STAR feedback model to resolve a team disagreement within 48 hours" is specific, measurable, and testable. Write 3-5 objectives per module. Each objective should describe what the learner will be able to do, under what conditions, and to what standard.

Assessment strategy

Design assessments before designing content. This backwards design approach ensures content serves the learning objectives rather than the other way around. Decide how you'll measure whether learners achieved each objective: knowledge checks, skill demonstrations, case study analyses, role plays, or on-the-job observations. Include both formative assessments (during learning, to guide progress) and summative assessments (after learning, to verify achievement).

Content structure and sequencing

Organize content in a logical sequence. Move from simple to complex. Group related concepts. Build prerequisite knowledge before introducing advanced topics. Create a course map or storyboard showing the flow of modules, lessons, activities, and assessments. Most designers create a detailed design document at this stage that stakeholders review before development begins.

Delivery method selection

Choose the delivery format based on learning objectives, audience, and constraints. Instructor-led training for complex interpersonal skills. E-learning for compliance and knowledge-based content. Blended approaches for programs that need both. On-the-job training for procedural skills. Each delivery method has cost, scalability, and effectiveness trade-offs documented during the analysis phase.

Phase 3: Development

Development is the production phase where you build the actual training materials based on the design document.

Content creation

Write scripts, build slides, record videos, code e-learning modules, create job aids, and develop participant workbooks. Development is typically the most time-consuming phase. Industry benchmarks suggest 40-200+ hours of development time per hour of finished training, depending on the delivery method and interactivity level (Chapman Alliance, 2023). E-learning with custom interactions takes the longest. Instructor-led training with existing materials takes the least.

Prototyping and review cycles

Build a prototype of key modules and test with a small group of subject matter experts and representative learners before developing the full program. Catch design flaws early when they're cheap to fix. Most programs go through 2-3 review cycles: SME accuracy review, instructional quality review, and pilot test with actual learners.

Development time benchmarks

Budget your timeline realistically. A one-hour instructor-led session needs 40-80 development hours. A one-hour e-learning module with basic interactivity needs 90-150 hours. A one-hour simulation or game-based module needs 200-400 hours. These numbers include all analysis, design, development, and revision time. Rushing development is the most common reason training programs fail.

Delivery Format | Dev Hours per 1 Hour of Training | Typical Cost Range | Shelf Life
Instructor-Led (basic) | 40-80 hrs | $5,000-$15,000 | 2-3 years
Virtual Instructor-Led | 50-90 hrs | $6,000-$18,000 | 1-2 years
E-Learning (Level 1: page-turner) | 50-100 hrs | $8,000-$25,000 | 2-3 years
E-Learning (Level 2: interactive) | 120-180 hrs | $20,000-$50,000 | 2-3 years
E-Learning (Level 3: simulation) | 200-400 hrs | $40,000-$100,000+ | 1-2 years
Video-Based Training | 80-150 hrs | $10,000-$40,000 | 1-2 years
Blended Program (multi-format) | 150-300 hrs | $25,000-$75,000 | 2-3 years
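As a rough planning aid, the benchmark ranges above can be turned into a simple estimator. The hour ranges are copied from the table; the function and its names are a hypothetical sketch for budgeting, not a standard tool.

```python
# Rough development-effort estimator based on the benchmark table above.
# Each entry is (low, high) development hours per finished training hour.
DEV_HOURS = {
    "ilt_basic": (40, 80),
    "virtual_ilt": (50, 90),
    "elearning_l1": (50, 100),
    "elearning_l2": (120, 180),
    "elearning_l3": (200, 400),
    "video": (80, 150),
    "blended": (150, 300),
}

def estimate_dev_hours(fmt: str, training_hours: float) -> tuple[float, float]:
    """Return (low, high) total development hours for a program."""
    low, high = DEV_HOURS[fmt]
    return (low * training_hours, high * training_hours)

# Example: a 2-hour interactive (Level 2) e-learning program.
low, high = estimate_dev_hours("elearning_l2", 2)
print(f"{low:.0f}-{high:.0f} development hours")  # 240-360 development hours
```

Even a back-of-the-envelope estimate like this helps set stakeholder expectations before the design phase locks in a delivery format.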

Phase 4: Implementation

Implementation is the rollout. This phase covers everything needed to deliver the training to learners at scale.

Facilitator preparation

For instructor-led programs, train-the-trainer sessions are critical. Facilitators need to understand not just the content but the design intent behind each activity. Provide facilitator guides with timing notes, discussion questions, common participant questions, and troubleshooting tips. A great program with a poor facilitator delivers worse results than a decent program with a skilled one.

Technology setup

Load courses into the LMS, test access from different devices and locations, configure enrollment and notification workflows, and verify reporting is capturing the right data. Technical issues during launch undermine learner confidence and create support burden. Test everything before going live. If the LMS crashes on day one, you lose the audience.

Communication and enrollment

Market the training internally. Explain why it matters, what learners will gain, and what's expected of them. Manager communication is especially important: if a manager says "this training is mandatory but not important," attendance will be high but engagement will be low. Send pre-work, schedule reminders, and make enrollment frictionless.

Pilot before full rollout

Run the program with a small pilot group first (10-20 people). Collect detailed feedback on content clarity, pacing, technology, relevance, and practical application. Fix issues before scaling to the full audience. A pilot catches problems that review cycles miss because real learners interact with material differently than SMEs do.

Phase 5: Evaluation

Evaluation determines whether the training achieved its objectives and how it can be improved. This is where most organizations fall short.

Formative evaluation (during development)

Formative evaluation happens throughout the ADDIE process, not just at the end. SME reviews during design, prototype testing during development, and pilot feedback during implementation are all formative. They catch problems early and improve quality before full deployment. Think of it as quality control built into the production line.

Summative evaluation (after delivery)

Summative evaluation measures the final impact. Use the Kirkpatrick Model: Level 1 (Reaction: did learners find it useful and engaging?), Level 2 (Learning: did they acquire the target knowledge and skills?), Level 3 (Behavior: are they applying it on the job?), Level 4 (Results: did business outcomes improve?). Most organizations stop at Levels 1 and 2. The real value is in Levels 3 and 4, but they require more effort and longer time horizons.

Continuous improvement

Evaluation feeds back into analysis. If evaluation shows learners understand the content but don't apply it, the problem might be in the work environment (no opportunity to practice), manager support (no reinforcement), or design (insufficient practice activities). Each delivery cycle should produce data that improves the next version. Training is never "done" because the business context it serves keeps changing.

ADDIE vs SAM: Choosing the Right Approach

The SAM (Successive Approximation Model) is the most popular alternative to ADDIE. Here's when each approach works best.

Dimension | ADDIE | SAM (Successive Approximation)
Approach | Linear, phase-gate (complete each phase before moving to the next) | Iterative, agile (rapid prototyping with repeated cycles of design-develop-evaluate)
Speed | Slower (weeks to months of upfront analysis and design) | Faster (working prototype in days, refined through iterations)
Stakeholder Involvement | Reviews at phase gates | Continuous involvement throughout cycles
Risk Profile | Risk of over-engineering or analysis paralysis | Risk of scope creep or insufficient analysis
Best For | High-stakes, regulated, complex programs where getting it right the first time matters | Fast-moving environments where requirements change, SMEs have limited availability, or speed-to-market is critical
Cost Profile | Higher upfront investment, lower revision costs | Lower initial cost, potentially higher total if iterations aren't managed
Typical Timeline | 3-6 months for a major program | 4-8 weeks for an initial release, refined over time

Instructional Design Statistics [2026]

Data reflecting the current state of instructional design practices in corporate L&D.

  • 67% of instructional designers use ADDIE or ADDIE-based models (ATD, 2023)
  • 200 hrs: average development time for one hour of interactive e-learning (Chapman Alliance, 2023)
  • $1,252: average cost to develop one hour of e-learning, basic level (Training Industry, 2024)
  • 42% of L&D teams now use agile or iterative design approaches alongside or instead of ADDIE (LinkedIn Learning, 2024)

Frequently Asked Questions

Is ADDIE outdated?

ADDIE isn't outdated, but it has evolved. The original linear, phase-gate approach is too slow for many modern L&D needs. However, the five phases themselves (analysis, design, development, implementation, and evaluation) remain the fundamental activities of good instructional design regardless of the methodology. Most practitioners today use modified ADDIE with overlapping phases, rapid prototyping, and iterative feedback loops rather than the strict sequential model.

How long does an ADDIE project take?

It depends on complexity and scope. A simple one-hour e-learning module might take 4-8 weeks. A multi-day leadership development program could take 4-6 months. A full certification curriculum might take 9-12 months. The analysis and design phases typically account for 30-40% of the total timeline, development 40-50%, and implementation and evaluation 10-20%. Cutting the analysis phase to save time almost always increases development time because you're building the wrong thing.
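The phase split above (30-40% for analysis and design, 40-50% for development, 10-20% for implementation and evaluation) can be applied to a total timeline as a quick sanity check. The percentages come from the answer above; the helper function itself is purely illustrative.

```python
# Split a total project timeline into ADDIE phase ranges using the
# rule-of-thumb shares above (expressed as fractions of total weeks).
PHASE_SHARE = {
    "analysis_design": (0.30, 0.40),
    "development": (0.40, 0.50),
    "implementation_evaluation": (0.10, 0.20),
}

def phase_timeline(total_weeks: float) -> dict[str, tuple[float, float]]:
    """Return per-phase (low, high) week estimates for a project."""
    return {
        phase: (lo * total_weeks, hi * total_weeks)
        for phase, (lo, hi) in PHASE_SHARE.items()
    }

# Example: a 12-week e-learning project.
for phase, (lo, hi) in phase_timeline(12).items():
    print(f"{phase}: {lo:.1f}-{hi:.1f} weeks")
```

Running this for a 12-week project allocates roughly 3.6-4.8 weeks to analysis and design, a useful check against the temptation to compress analysis into a few days.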

Can ADDIE work for small, fast projects?

Yes, if you scale it appropriately. For a 30-minute e-learning module, the analysis phase might be a single conversation with the requesting manager and one SME interview. The design phase might be a one-page outline with three learning objectives. The key is to do each phase, even briefly, rather than skipping them entirely. The discipline of asking "what's the actual problem?" before building content is valuable at any scale.

What's the most important phase of ADDIE?

Analysis. Without a clear understanding of the performance gap, learner needs, and success criteria, every subsequent phase is guesswork. The most common reason training programs fail isn't poor content or bad technology. It's solving the wrong problem. An outstanding course on a topic nobody needs is still a waste of resources. Invest the most critical thinking in analysis, even if it's not the longest phase in terms of calendar time.

Do I need to be a certified instructional designer to use ADDIE?

No. ADDIE is a framework, not a credentialed practice. Subject matter experts, HR generalists, and managers can all apply ADDIE principles when creating training. The framework provides a structured thinking process that anyone can follow. That said, professional instructional designers bring expertise in learning science, assessment design, media production, and accessibility that improves quality. For high-stakes or large-scale programs, involving a trained ID is worth the investment.

How does ADDIE handle changes in requirements mid-project?

Traditional ADDIE handles change poorly because each phase is designed to be completed before the next begins. If business requirements change during development, you may need to revisit design or even analysis, which creates rework. This is ADDIE's biggest weakness. Modern adaptations address this by building review checkpoints into each phase, using rapid prototyping to validate direction early, and accepting that some iteration between phases is normal. If your organization's requirements change frequently, consider SAM or agile ID approaches.
Written by Adithyan RK
Fact-checked by Surya N
Published on: 25 Mar 2026