
Rethinking the Iterative Cycle When the Project Has Already Grown Up
April 25, 2026
Stephen Sutherlin
Graduate Candidate, University of New Mexico, Organization, Information, and Learning Sciences
Abstract
The ADDIE model (Analysis, Design, Development, Implementation, Evaluation) has served instructional designers since the 1970s as a foundational process framework. Its five phases are typically presented as a cycle, with the implication that each new iteration restarts at Analysis. This paper proposes that for mature instructional projects (programs that have already completed one or more full design cycles), the return to Analysis is rarely warranted. Instead, the most productive re-entry point is Evaluation. We propose EDDIE (Evaluation, Design, Development, Implementation, Evaluation) as a re-sequenced model for the second and subsequent iterations of an established instructional program. Drawing on cognitive load theory, the expertise reversal effect, human-centered design principles, retrieval practice research, constructionist methodology, and McClusky's theory of margin, we argue that EDDIE better reflects how experienced instructional designers actually work when refining existing instruction, and that it produces more efficient, learner-responsive revision cycles.
The Problem with Perpetual Analysis
ADDIE was developed by the U.S. military in the 1970s as a systematic framework for training design. A standard reference guide describes the model as providing “a structured process for designing effective, learner-centered educational experiences” across five phases: Analysis, Design, Development, Implementation, and Evaluation (ADDIE Model Reference, n.d.). The same guide explicitly describes the model as “iterative and cyclical,” with designers expected to “revisit earlier stages as new information emerges or as improvements are made.”
But here is the practical problem: when an instructional program has been running for two semesters, or a corporate training module has been delivered to three cohorts, does the designer really need to re-conduct a full needs assessment? Do they need to re-analyze learner demographics, re-identify constraints, and re-specify delivery options from scratch? Almost never. What they need is to look at what happened: what worked, what did not, and what the data tells them to change.
The Analysis phase, as defined in the standard model, asks designers to define instructional goals, conduct a needs assessment, analyze learners, identify constraints, and define delivery options. These are founding decisions. Once they are made well, they persist. The goals of a nursing fundamentals course do not change from semester to semester. The technology platform does not typically shift mid-program. The learner population remains broadly consistent.
What does change is performance data, learner feedback, engagement patterns, and emerging misalignments between intended and actual outcomes. These are the outputs of the Evaluation phase, not the Analysis phase.
Introducing EDDIE: Evaluation as the Engine of Iteration
We propose EDDIE as a re-sequenced model for mature instructional projects:
E – Evaluation (of the existing implementation)
D – Design (revised strategy based on evaluation findings)
D – Development (updated materials, assessments, media)
I – Implementation (relaunch with revisions)
E – Evaluation (of the revised implementation, feeding the next cycle)
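To make the sequencing concrete, here is a minimal Python sketch of one EDDIE pass as described above. It is an illustration, not prescribed tooling: the function names mirror the phases, but the dictionary structure and field names (such as observed_gaps) are our own invention.

# Minimal sketch of one EDDIE pass. The entry condition is the point:
# the cycle takes an existing, running implementation as input, where
# a first ADDIE pass would begin with no artifact at all.

def evaluate(implementation):
    # E: mine the last run for performance data, feedback, and gaps.
    return {"gaps": implementation.get("observed_gaps", [])}

def design(findings):
    # D: revise the instructional strategy around the observed gaps.
    return {"changes": ["address: " + gap for gap in findings["gaps"]]}

def develop(plan):
    # D: update materials, assessments, and media to match the plan.
    return {"materials": plan["changes"]}

def implement(build):
    # I: relaunch; whatever gaps surface here seed the next cycle's E.
    return {"observed_gaps": ["gap observed during this run"]}

def eddie_pass(implementation):
    findings = evaluate(implementation)
    plan = design(findings)
    build = develop(plan)
    return implement(build)  # the output is the input to the next pass

# One pass, starting from a program that reported a gap last run:
state = {"observed_gaps": ["quiz 3 scores below target"]}
state = eddie_pass(state)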
The name is intentional. EDDIE is ADDIE’s older brother. Where ADDIE is the model you use when you are building something new, when you are asking “What should we teach, to whom, and how?”, EDDIE is the model you use when the thing exists and you are asking “How well is it working, and what should we change?”
This is not a rejection of Analysis. It is a recognition that Analysis is a first-iteration activity. If a mature project encounters a fundamental change in mission, audience, or technology platform, the designer should indeed return to Analysis. That is starting over. EDDIE is for the more common case: the project is sound, the structure holds, and the work is refinement.
Theoretical Support
Cognitive Load Theory and the Expertise Reversal Effect
Kalyuga, Chandler, and Sweller (1998) demonstrated that the optimal instructional format changes as learner expertise increases. Novice learners benefit from fully integrated text-and-diagram instruction that reduces split attention. But as learners gain expertise, that same integrated format becomes counterproductive. The redundant text imposes unnecessary cognitive load, and a diagram-only format produces better outcomes.
We argue that a parallel principle applies to instructional design processes, not just instructional materials. When a design team is new to a project, the full Analysis phase functions like an integrated instructional format: it provides the scaffolding necessary to understand the problem space. But when the team has been through one or more cycles, they have already acquired schemas for the project’s goals, audience, and constraints. Re-running Analysis is like forcing an experienced electrical engineer to re-read textual descriptions of circuit diagrams they can already interpret from the diagram alone. As Kalyuga et al. observed, “unnecessary detail may be distracting to a far greater extent than is sometimes thought” and “the need to process redundant information… can be a considerable impediment to further schema acquisition” (1998, p. 16).
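The conditional at the heart of the effect is simple enough to sketch. The Python below is ours: the 0-to-1 expertise scale and the 0.5 cutoff are invented for illustration, since Kalyuga et al. (1998) report the reversal empirically rather than prescribing a threshold.

# Illustrative only: the expertise reversal effect stated as a rule.
# The expertise scale and the 0.5 cutoff are assumptions.

def best_format(learner_expertise: float) -> str:
    if learner_expertise < 0.5:
        # Novices: integrated text and diagram reduces split attention.
        return "integrated text and diagram"
    # Experts: the same text is now redundant load; the diagram wins.
    return "diagram only"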
In the EDDIE model, the Evaluation phase serves as the streamlined re-entry point that respects the design team’s existing expertise. It gives them what they need (performance data, gap analysis, learner feedback) without burdening them with what they already know.
Norman’s Iterative Cycle and the Spiral Method
Don Norman’s human-centered design framework describes an iterative cycle of observation, idea generation, prototyping, and testing (Norman, 2013). Norman explicitly calls this the “spiral method” to emphasize that each pass through the cycle makes progress rather than returning to the starting point. The Double-Diamond model he references similarly distinguishes between finding the right problem (divergence to convergence) and finding the right solution (divergence to convergence again).
The critical insight from Norman’s framework is that the first diamond, finding the right problem, corresponds to ADDIE’s Analysis phase. Once you have converged on the problem, subsequent iterations operate within the second diamond: generating, prototyping, and testing solutions. You do not reopen the problem definition unless evidence compels you to.
Norman’s Seven Stages of Action further support this reframing. The action cycle splits into an execution side (plan, specify, perform) and an evaluation side (perceive, interpret, compare), the stages that bridge what Norman calls the Gulfs of Execution and Evaluation. For a mature instructional project, the evaluation side is the productive starting point for each new iteration. You perceive what happened in the last implementation, interpret the patterns, and compare outcomes against goals. Only then do you cross back to execution with revised designs.
EDDIE formalizes this. It says: start with evaluation. Bridge the Gulf of Evaluation first. Then design, develop, and implement accordingly.
Retrieval Practice and Evaluation as a Learning Event
Brown, Roediger, and McDaniel (2014) established that the act of retrieving information strengthens memory more effectively than re-studying the same material. This principle, which drives assessment design in instructional practice, also applies to the design process itself.
When a design team begins a revision cycle with Evaluation, they are performing retrieval practice on their own instructional decisions. They are asking: What did we intend? What actually happened? Where is the gap? This is a cognitively demanding, schema-strengthening activity. It forces the team to confront calibration errors, places where they thought the instruction was effective but the data says otherwise.
Starting with Analysis, by contrast, is closer to re-reading. The team reviews the learner profile, the goals, and the constraints they already know. This is low-effort review that produces little new understanding. EDDIE puts the team in their zone of proximal development as designers (Vygotsky, 1978): challenged by real performance data, but working within a project structure they understand well enough to act on what they find.
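A calibration check of this kind can be sketched as a direct comparison between intended and observed outcomes. Everything below is illustrative: the objective names, mastery targets, and tolerance are invented for the example.

# Illustrative gap analysis: where did we intend mastery, and what
# does the data say? Objectives, targets, and tolerance are invented.

intended = {"dosage_calculation": 0.90, "aseptic_technique": 0.85}
observed = {"dosage_calculation": 0.64, "aseptic_technique": 0.88}

def calibration_gaps(intended, observed, tolerance=0.05):
    gaps = {}
    for objective, target in intended.items():
        shortfall = target - observed.get(objective, 0.0)
        if shortfall > tolerance:
            gaps[objective] = shortfall  # we overestimated effectiveness
    return gaps

print(calibration_gaps(intended, observed))
# dosage_calculation is flagged (roughly a 0.26 shortfall); that
# objective, not a fresh needs assessment, drives the next Design.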
Constructionism and the Maturation of Artifacts
A constructionist approach to instructional design holds that the artifact (the course, the module, the learning experience) is the site of knowledge construction for the designer, not just for the learner. Each iteration of a design cycle deepens the designer’s understanding of the problem space through the act of building and revising.
In this frame, the first ADDIE cycle is where the designer constructs their initial understanding of the problem. The artifact they produce embodies their best current theory of what will work. But the artifact is also a hypothesis. Implementation tests that hypothesis. Evaluation reveals where the hypothesis held and where it broke down.
EDDIE treats subsequent cycles as hypothesis refinement rather than hypothesis generation. The designer already has a working artifact. The question is no longer “What should this be?” (Analysis) but “How well does this work, and what should change?” (Evaluation). This aligns with how constructionist thinkers understand learning through making: the object evolves through successive encounters with the real world, and the maker’s understanding evolves with it.
McClusky’s Theory of Margin and Designer Efficiency
McClusky’s Theory of Margin describes the ratio between the load an adult carries (demands, expectations, obligations) and the power they have available to manage that load (resources, skills, energy) (McClusky, 1963). When load exceeds power, margin shrinks and the person’s capacity for learning and productive work declines.
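The relationship is often summarized in the adult education literature as a load-to-power ratio; the compact statement below is our paraphrase, since McClusky (1963) develops the idea discursively:

M = L / P, where L is load and P is power

Practitioners are commonly advised to keep the ratio well below 1. As L approaches or exceeds P, the surplus capacity available for new learning and productive work disappears.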
Instructional designers are adult professionals who carry significant cognitive load from multiple concurrent projects, stakeholder demands, and institutional pressures. Requiring them to re-execute a full Analysis phase on a mature project consumes margin without proportional return. The Analysis outputs already exist. The needs assessment was already done. Re-running it feels like busywork, and that perception is cognitively accurate: the effort is real, but the new information yield is low.
EDDIE respects designer margin by starting where the highest-value information lives: in the evaluation data from the last implementation. It is not a shortcut. It is an efficiency that honors the reality of professional practice.
When EDDIE Applies and When It Does Not
Use EDDIE when: the instructional program has completed at least one full ADDIE cycle; the target audience, subject matter, and delivery platform remain broadly consistent; evaluation data (learner performance, feedback, engagement metrics, facilitator observations) is available from the previous implementation; and the revision goal is improvement, not reinvention.
Return to ADDIE when: the organization’s mission or strategic priorities have fundamentally changed; the target learner population has shifted dramatically (e.g., a course designed for novices is now serving experienced practitioners); the technology platform is being replaced (e.g., migrating from in-person to fully asynchronous); or a formal compliance or accreditation review requires documented needs analysis.
The distinction is between refinement and reinvention. EDDIE is a refinement model. ADDIE is a founding model. Most real-world instructional work, once a program is established, is refinement.
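The criteria above compress into a small decision rule. The Python sketch below is our restatement of this section's conditions, not a validated instrument; the boolean flags simply name the criteria listed above.

# Our compression of this section's criteria into a decision rule.
# The flags restate the prose conditions; nothing here is validated.

def next_cycle_model(completed_full_cycle: bool,
                     audience_and_platform_stable: bool,
                     evaluation_data_available: bool,
                     reinvention_required: bool) -> str:
    if (completed_full_cycle
            and audience_and_platform_stable
            and evaluation_data_available
            and not reinvention_required):
        return "EDDIE"  # refinement: re-enter at Evaluation
    return "ADDIE"      # founding or re-founding: start at Analysis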
Conclusion
ADDIE is not broken. It remains the right model for starting a new instructional design project from scratch. But the field has treated ADDIE’s cyclical nature as though every iteration demands a full restart, and this is where practice diverges from the model.
Experienced instructional designers do not re-analyze stable projects. They evaluate them. They look at what happened, identify what needs to change, and redesign accordingly. EDDIE names this practice and gives it theoretical grounding.
The re-sequencing is small (moving Evaluation from the end to the beginning), but the conceptual shift is significant. It acknowledges that instructional projects mature, that design teams develop expertise about their own projects, and that the most productive entry point for a revision cycle is data, not assumptions.
ADDIE builds the house. EDDIE renovates it. Most of the real work in instructional design, once the foundation is laid, is renovation.
References
ADDIE Model Reference. (n.d.). The ADDIE model: A guide for instructional design.
Brown, P. C., Roediger, H. L., & McDaniel, M. A. (2014). Make it stick: The science of successful learning. Harvard University Press.
Kalyuga, S., Chandler, P., & Sweller, J. (1998). Levels of expertise and instructional design. Human Factors, 40(1), 1-17.
McClusky, H. Y. (1963). The course of the adult life span. In W. C. Hallenbeck (Ed.), Psychology of adults. Adult Education Association of the U.S.A.
Murtaza, M., et al. (2022). AI-based personalized e-learning systems: Issues, challenges, and solutions. IEEE Access, 10, 81323-81342.
Norman, D. A. (2013). The design of everyday things (Rev. ed.). Basic Books.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Harvard University Press.
