Company Name:
Program Name:
Instructional Designer:
Target Launch Date:
Analysis Phase
Identify the specific gap between current and desired performance that training is intended to close. ADDIE begins with rigorous analysis to ensure training is the right solution to that gap. Not all performance problems are training problems — some are caused by process failures, resource constraints, or motivation issues that training alone cannot fix.
Profile the intended learners including their current knowledge level, learning preferences, technology access, language requirements, job context, and any constraints (shift patterns, travel, remote work). Learner analysis ensures the instructional design is appropriate for the actual audience rather than an assumed one.
Assess the environment in which learning will take place — whether it is a classroom, virtual platform, on-the-job setting, or self-paced digital format. Consider constraints such as available technology, maximum session length, facilitator availability, and budget limitations that will shape design decisions.
Write clear, specific learning objectives that describe what learners will be able to do after the training, at what level of proficiency, and under what conditions. Use Bloom's taxonomy (Remember, Understand, Apply, Analyse, Evaluate, Create) to ensure objectives target the appropriate cognitive level for the performance need.
Define how training effectiveness will be measured before beginning design. Specify criteria at each Kirkpatrick level: satisfaction targets (Level 1), knowledge assessment pass rates (Level 2), on-the-job behavior change indicators (Level 3), and business impact metrics (Level 4). Pre-defining evaluation criteria prevents post-hoc rationalisation of results.
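Pre-defined criteria can be captured in a simple structure before design work starts. The sketch below is one possible shape in Python; every metric name and target value is a hypothetical placeholder, not a recommended threshold:

```python
# Illustrative sketch: pre-defining Kirkpatrick evaluation criteria
# before design begins. All metrics and targets are hypothetical.
KIRKPATRICK_CRITERIA = {
    1: {"metric": "average satisfaction score (1-5 survey)", "target": 4.0},
    2: {"metric": "post-test pass rate", "target": 0.80},
    3: {"metric": "share of managers reporting behavior change at 90 days", "target": 0.60},
    4: {"metric": "reduction in error rate vs. baseline", "target": 0.15},
}

def meets_target(level: int, observed: float) -> bool:
    """Return True if an observed result meets the pre-defined target."""
    return observed >= KIRKPATRICK_CRITERIA[level]["target"]
```

Writing the targets down in this form before design begins makes the later evaluation a lookup rather than a debate.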
Design Phase
Map the overall structure of the learning experience including module sequence, estimated durations, prerequisite relationships, and the blend of delivery methods. A well-designed architecture ensures logical progression from foundational concepts to advanced application, following principles of scaffolded learning.
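Prerequisite relationships between modules form a directed acyclic graph, and any valid module sequence is a topological order of that graph. A minimal Python sketch using the standard library's graphlib, with invented module names:

```python
from graphlib import TopologicalSorter

# Hypothetical module map: each module lists its prerequisites.
prerequisites = {
    "Foundations": [],
    "Core Concepts": ["Foundations"],
    "Applied Practice": ["Core Concepts"],
    "Capstone": ["Applied Practice", "Core Concepts"],
}

# static_order() yields modules so that every prerequisite comes first.
sequence = list(TopologicalSorter(prerequisites).static_order())
print(sequence)
```

A cycle in the prerequisites (two modules each requiring the other) raises an error here, which is a useful early check on the curriculum architecture.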
For each module, create a storyboard that specifies the learning objectives, content outline, instructional strategies, activities, media requirements, and assessment methods. Storyboards serve as the blueprint for development and should be reviewed and approved by subject matter experts and stakeholders before production begins.
Choose pedagogical approaches that match the type of learning required. Use case studies and simulations for application-level objectives, group discussions and debates for analysis and evaluation, worked examples for procedural knowledge, and scenario-based practice for behavioral skill development. The strategy must serve the objective, not the other way around.
Create formative assessments (knowledge checks, practice exercises, peer feedback) distributed throughout the learning experience and summative assessments (final tests, performance demonstrations, capstone projects) at the end. Effective assessments test the learning objectives directly — not peripheral knowledge — and provide constructive feedback to learners.
Apply Web Content Accessibility Guidelines (WCAG 2.1) for digital content, provide captions and transcripts for audio and video, ensure color contrast ratios meet standards, and design for screen reader compatibility. Universal Design for Learning (UDL) principles — multiple means of engagement, representation, and action — ensure the program is accessible to all learners.
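Contrast compliance can be checked programmatically. The sketch below implements WCAG 2.1's relative-luminance and contrast-ratio formulas; the AA threshold for normal-size text is 4.5:1:

```python
def relative_luminance(rgb):
    """WCAG 2.1 relative luminance of an sRGB color given as 0-255 ints."""
    def channel(c):
        cs = c / 255
        return cs / 12.92 if cs <= 0.03928 else ((cs + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio per WCAG 2.1; AA requires >= 4.5:1 for normal text."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background gives the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))
```

Running candidate brand colors through a check like this during design avoids accessibility rework during development.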
Build a working prototype of one representative module to demonstrate the look, feel, interactivity, and instructional approach before investing in full development. Prototyping reduces the risk of costly rework by validating design decisions early with stakeholders, subject matter experts, and a sample of target learners.
Development Phase
Develop the full suite of learning assets including slide decks, facilitator guides, participant workbooks, e-learning modules, videos, job aids, and assessment instruments. Follow the approved storyboards precisely and maintain consistency in visual design, tone of voice, and branding across all materials.
Use industry-standard authoring tools such as Articulate Storyline, Rise 360, Adobe Captivate, or similar to create engaging digital learning experiences. Ensure modules follow SCORM or xAPI standards for LMS compatibility and include interactive elements such as branching scenarios, drag-and-drop activities, and knowledge checks to maintain learner engagement.
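For reference, an xAPI statement is a small JSON document with an actor, a verb, and an object, plus an optional result. The sketch below builds a minimal completion statement; the learner email and activity URL are invented placeholders:

```python
import json

# Minimal xAPI ("Tin Can") statement recording module completion.
# The actor mbox and activity id below are hypothetical placeholders.
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Example Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/onboarding/module-1",
        "definition": {"name": {"en-US": "Module 1: Foundations"}},
    },
    "result": {"score": {"scaled": 0.92}, "success": True, "completion": True},
}

# A Learning Record Store (LRS) accepts statements like this as JSON.
print(json.dumps(statement, indent=2))
```

Unlike SCORM, which reports completion and score to the hosting LMS, xAPI statements can record learning activity from any system that can reach the LRS.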
Create comprehensive facilitator guides that include session timings, discussion prompts, activity instructions, debrief questions, anticipated learner questions with suggested responses, and guidance for adapting content to different group sizes and experience levels. A strong facilitator guide ensures consistent delivery quality regardless of who facilitates.
Implement a structured review process where content is checked for accuracy by subject matter experts, instructional soundness by peer instructional designers, and technical functionality by QA testers. Use a standardised review checklist covering content accuracy, objective alignment, interactivity, accessibility, and technical performance.
Run a full pilot with 10–20 learners representative of the target audience. Observe the learning experience, collect detailed feedback on content clarity, pacing, engagement, and practical relevance, and test all technical components. Document all issues and feedback for revision before the production rollout.
Implementation Phase
Configure the LMS, virtual classroom platform, or physical training facilities well in advance of launch. Upload and test all content, set up enrolment workflows, configure completion tracking, and verify that all technical integrations (single sign-on, reporting, certificates) function correctly in the production environment.
Conduct train-the-trainer sessions for all facilitators, covering the program objectives, instructional approach, activity facilitation techniques, and assessment administration. Provide facilitators with practice opportunities and feedback to ensure they can deliver the program as designed.
Deploy a structured communication campaign that explains the program purpose, target audience, time commitment, enrolment process, and completion expectations. Manager endorsement is critical — equip managers with talking points to encourage their team members to participate and apply learning on the job.
Track enrolment, attendance, completion, and learner satisfaction data throughout the initial rollout. Attend or observe early sessions (in person or virtually) to assess facilitation quality, content reception, and pacing. Address any issues immediately to prevent poor early experiences from undermining program reputation.
Deploy follow-up activities such as application assignments, peer discussion forums, manager debriefs, and refresher microlearning modules in the weeks following the program. Research on the forgetting curve, originating with Ebbinghaus, suggests that without reinforcement learners forget roughly 70 per cent of new information within 24 hours, and most of what remains within a week.
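The forgetting curve is commonly modeled as exponential decay, R = e^(−t/S), where S is a stability constant that reinforcement effectively increases. A sketch with an assumed, purely illustrative stability value:

```python
import math

def retention(t_hours: float, stability: float = 20.0) -> float:
    """Exponential forgetting-curve model R = e^(-t/S).

    The default stability constant is an illustrative assumption,
    not a measured value; reinforcement effectively increases S.
    """
    return math.exp(-t_hours / stability)

# Without reinforcement, modeled retention decays rapidly:
for hours in (1, 24, 168):  # one hour, one day, one week
    print(f"after {hours:>3} h: {retention(hours):.0%} retained")
```

The practical point is the shape of the curve, not the exact constants: spaced follow-up activities reset and flatten the decay.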
Evaluation Phase
Administer learner satisfaction surveys within 24 hours of program completion to capture impressions of content relevance, facilitator effectiveness, pacing, and overall experience. While reaction data alone does not prove learning effectiveness, consistently low satisfaction scores indicate design or delivery issues that must be investigated.
Measure whether learners acquired the intended knowledge and skills using pre-test/post-test comparisons, scenario-based assessments, or practical demonstrations. Level 2 evaluation confirms that the instructional design effectively transferred knowledge, independent of whether learners subsequently apply it on the job.
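One common way to summarize pre-test/post-test comparisons is Hake's normalized gain: the fraction of the available improvement a learner actually achieved. A minimal sketch for scores on a 0-100 scale:

```python
def normalized_gain(pre: float, post: float) -> float:
    """Hake's normalized gain: (post - pre) / (100 - pre), i.e. the
    fraction of the possible improvement actually achieved."""
    if pre >= 100:
        return 0.0  # no room to improve
    return (post - pre) / (100 - pre)

# Example: a learner moves from 40% to 85% on a 0-100 assessment.
print(f"normalized gain: {normalized_gain(40, 85):.2f}")
```

Normalized gain is fairer than a raw difference because a learner starting at 80% has far less room to improve than one starting at 20%.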
Assess whether learners are applying new knowledge and skills in their work 60–90 days after the program. Use manager surveys, direct observation, self-assessments, and performance data to determine the degree of learning transfer. Level 3 is where most training evaluation fails — yet it is the most critical indicator of real-world impact.
Quantify the business impact of the training by measuring changes in the performance metrics identified during the analysis phase — such as sales revenue, error rates, customer satisfaction, safety incidents, or employee retention. Isolate the training effect from other variables using control groups, trend analysis, or expert estimation methods.
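With a control group, a simple way to isolate the training effect is a difference-in-differences estimate: the trained group's change minus the control group's change, which nets out background trends affecting both groups. The figures below are invented for illustration:

```python
def diff_in_diff(trained_before, trained_after, control_before, control_after):
    """Difference-in-differences estimate of the training effect:
    the trained group's change minus the control group's change."""
    return (trained_after - trained_before) - (control_after - control_before)

# Hypothetical error rates per 1,000 transactions (lower is better).
effect = diff_in_diff(trained_before=50, trained_after=30,
                      control_before=48, control_after=44)
print(effect)  # negative = errors fell more than the background trend
```

Here the trained group improved by 20 while the control group improved by 4 on its own, so the estimate attributes a reduction of 16 to the training rather than the full 20.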
Document all evaluation data, learner feedback, facilitator observations, and business impact findings in a comprehensive report. Identify specific improvements for the next iteration of the program. ADDIE is an iterative model — evaluation findings directly inform the analysis phase of the next design cycle, creating a continuous improvement loop.
ADDIE is the most widely used instructional design model in the world, providing a five-phase systematic approach to creating effective training programs. The acronym stands for Analyse, Design, Develop, Implement, and Evaluate — five sequential phases that guide your team from identifying a learning need to delivering and measuring an effective learning solution.
The ADDIE training development model originated in the 1970s, created for the U.S. military by Florida State University’s Center for Educational Technology. It was initially designed to ensure consistent, high-quality instructional systems design across a massive, distributed organization. Since then, it has become the foundational course development methodology taught in virtually every instructional design program worldwide.
What makes ADDIE enduring is its flexibility. While the five phases follow a logical sequence, the learning design framework can be applied iteratively. Many modern adaptations — including rapid prototyping and agile instructional design — use the ADDIE structure as their backbone, allowing designers to test and refine content throughout the curriculum development process rather than waiting until final delivery.
If your organization creates any kind of training — onboarding programs, compliance modules, skills development courses, leadership curricula — ADDIE gives you a repeatable instructional design process for doing it consistently well. Without a structured training development approach, learning programs tend to be inconsistent in quality, unclear in their objectives, and difficult to measure.
The systematic course design framework saves time and money in the long run. By front-loading analysis and design before any content is created, you avoid the costly mistake of building training that doesn’t address the actual performance problem. Research from ATD suggests it costs 5–10 times more to fix a learning program after launch than to design it correctly upfront using a structured instructional systems methodology.
For your L&D team, ADDIE also provides a common language and workflow for curriculum development. When everyone follows the same training design process, handoffs between team members are smoother, stakeholder reviews are more productive, and quality becomes predictable rather than dependent on individual designer talent.
The framework provides detailed guidance for each of the five ADDIE phases. The Analyse phase covers needs assessment, learner analysis, and writing measurable learning objectives using Bloom’s taxonomy. The Design phase walks you through mapping the learning architecture, selecting instructional strategies, sequencing content, and planning assessments aligned to outcomes.
The Develop phase of this training development model covers content creation, from storyboarding and scripting to media production, SME review cycles, and pilot testing. The Implement phase addresses delivery logistics, facilitator preparation, LMS configuration, and learner communication plans. Each phase includes detailed checklists and templates to keep your instructional design project on track and on budget.
The Evaluate phase covers both formative evaluation (during development) and summative evaluation (after delivery). You’ll find guidance on Kirkpatrick’s four levels of learning program evaluation — reaction, learning, behavior, and results — plus Phillips’ ROI methodology for calculating the financial return of your course development investment.
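Phillips' ROI expresses net monetary benefits as a percentage of fully loaded program costs. A sketch of the calculation with hypothetical figures:

```python
def training_roi(monetary_benefits: float, program_costs: float) -> float:
    """Phillips ROI: net benefits as a percentage of fully loaded costs.
    ROI (%) = (benefits - costs) / costs * 100."""
    return (monetary_benefits - program_costs) / program_costs * 100

# Hypothetical figures: $150,000 in isolated, monetized benefits
# against $60,000 in fully loaded program costs.
print(f"ROI: {training_roi(150_000, 60_000):.0f}%")
```

The hard part is not the arithmetic but the inputs: benefits must first be isolated from other variables (Level 4) and converted to money, and costs must include design, development, delivery, and learner time.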
Pick the Brief version for a concise phase-by-phase training design checklist or the Detailed version for a comprehensive guide with templates, worked examples, and instructional design best practices. Both are available as instant downloads in PDF or DOCX format.
Customize the framework to fit your team’s curriculum development workflow. Modify the phase checklists, adapt the templates to your authoring tools and learning platforms, and adjust the evaluation criteria to match your organization’s priorities. The editable fields make it easy to integrate this systematic course design process into your existing L&D operations.
Hyring’s free framework generator gives you a professional-grade ADDIE instructional design guide that you can start using on your next training development project. No instructional design degree required — just a structured learning design process and practical tools.