The real cost of scope creep in learning design

There’s an old saying about too many cooks in the kitchen, and it tends to show up very quickly in online education projects.
Most projects don’t start messy. They usually begin with a clear scope, agreed outcomes, and a structure that makes sense. Everyone signs off, the team gets moving, and there’s a shared understanding of what is being built.
Then, as always, things evolve.
Maybe someone raises a compliance requirement that wasn’t mentioned at the start. Or midway through development, a new stakeholder appears and wants to include something they think is important.
Now, none of these inputs are unreasonable on their own. In fact, they are often valuable. The challenge is that they don’t arrive all at once, and they are not always filtered in a consistent way. Over time, they can start to reshape the program, and the team gradually absorbs work that was never part of the original design.
Most of us have been in that situation. You can feel the program drifting, but it is not always clear which inputs should be prioritised and which should be held back.
This is not really a people problem. It is a process issue. Without clear boundaries at the start, and without a way to assess changes before they are accepted, scope expands quietly and continuously.
Why structured organisations carry this risk
It might feel like scope drift is just part of working in complex environments. In some ways, it is — but understanding why it happens in universities and RTOs specifically makes it much easier to design around.
Programs in these environments involve multiple stakeholders with legitimate authority. Academic leads, compliance teams, quality committees, senior sponsors. Each of them will look at the same program and see a different set of priorities. Without a clear governance structure that defines who has decision rights at each stage, every voice becomes a potential source of expansion. And because these are not external stakeholders making unreasonable demands, but colleagues with real responsibilities, declining a request carries social cost.
Subject matter experts present a related challenge. Their depth of knowledge is extraordinary. But knowing the subject and knowing where the instruction of that subject should stop are different skills. Without a signed-off scope, every conversation with an expert risks surfacing content that was never planned, and refusing it feels like undermining the expertise you've relied on to build credibility.
Long timelines compound everything. The longer a program takes to develop, the more the world changes around it. Research evolves. Policy shifts. Software updates. Each change creates a window to reopen what was already settled. In the absence of a clear change process, those windows tend to stay open.
Underneath all of this is a dynamic that rarely gets named directly: the social cost of saying no. When declining a request from a senior academic or a regulator carries real professional risk, the path that feels safest is absorption. PMI research confirms that unmanaged scope change is the leading driver of budget overruns across complex service environments [1]. Framing the problem as a process failure, backed by evidence like this, is useful in conversations where "scope creep" can sound like a personal accusation.
What this costs in practice
What is often underestimated is how quickly small changes turn into much larger ones. On their own, most requests seem reasonable. A tweak to how a module is introduced. A slight shift in tone. An extra element added to improve clarity. None of these feel significant in isolation.
However, we have seen projects where a decision is made partway through to adjust something simple, like the way each module is introduced. On the surface, that sounds like a straightforward change. In reality, it means going back through everything that has already been built to make sure it is consistent. That triggers another round of review, small adjustments, and inevitably a few things get missed along the way because the team is now doubling back while also trying to move forward.
That is where the real cost sits: it is not just the time to make the change, but the ripple effect it creates across the rest of the program. Work that was previously complete is reopened, timelines stretch, and the team’s attention is split between progressing new content and revisiting old content.
Over time, this starts to affect quality. Not because the team is not capable, but because the structure they are working within is constantly shifting. What was a clear, well-designed program becomes harder to hold together.
What sound governance looks like in learning design practice
The decisions that prevent scope drift are not complicated. They are just almost always made too late.
The foundation is a scope document that is specific and signed off. Not a summary, not a slide deck from an early planning session — a clear written account of what the program will and will not include: audience, learning outcomes, content boundaries, assessment approach, and what sits explicitly outside the current build. Every subsequent request gets measured against this document, not against someone's memory of an early conversation.
Change will happen; the question is whether it is assessed before it is absorbed. A functional change process makes trade-offs visible. For example, if a new module is added, the team understands exactly what that means for the timeline, resourcing, and quality of what is already in development. When consequences are clear, decisions improve. Stakeholders who thought they were asking for a modest addition start to weigh whether it is worth what it costs.
And the most reliable anchor for every change request is the learner. What does the learner need to be able to do? Does this addition support that? Where the answer is unclear or contested, the request is most likely about stakeholder preference rather than pedagogical necessity. That distinction, made early and held consistently, changes what the program looks like at launch.
If you're already in it
If your program is already in motion and the drift is visible in the timeline, the team’s energy, or the gap between what was planned and what is now being built, we recommend a formal alignment session with your key stakeholders. That session should surface the assumptions that have accumulated, name the trade-offs that haven’t been made consciously, and create the conditions for cleaner decisions from here.
A program should reflect the outcomes it was designed for. Most of the time, that's still achievable. It just requires a deliberate conversation.
Working through this with Oppida
Oppida's Strategic Program Design engagement exists for this moment — before development begins, when the foundation needs to be established clearly enough to hold under pressure.
That means clear purpose, a defined audience, and scope boundaries that can be communicated and defended. For programs already in motion, the same thinking applies: surface what has shifted, name the trade-offs, and find a path that brings the program back into alignment with its intended outcomes.
If that's where you are, a conversation is a useful place to start.
References & Further Reading
[1] Project Management Institute (PMI) (2024). A Guide to the Project Management Body of Knowledge (PMBOK Guide). Link
[2] TEQSA (2021). Higher Education Standards Framework. Standard 2.1. Link
[3] ASCILITE (2024). Reframing Learning Design for a Changing Sector. Conference Proceedings. Link
Frequently asked questions
What is scope creep in learning design?
Scope creep in learning design is the gradual expansion of a program's content, features, or requirements beyond its original, agreed-upon boundaries — typically without a corresponding adjustment to the timeline or budget. It is a process risk that most commonly occurs when change requests from stakeholders are absorbed without formal assessment of their impact on the project.
Why is scope creep common in higher education and VET program development?
Higher education and VET environments are structurally exposed to scope drift because of multiple stakeholders with legitimate authority (academic leads, compliance teams, quality committees), long development cycles that allow context to shift, and a social dynamic in which declining an addition from a senior colleague carries professional risk. Without explicit decision rights and a signed-off scope document, every stakeholder becomes a potential source of expansion.
How does scope creep affect learning quality?
When scope expands beyond the team's capacity, quality becomes the variable that gives way. The program begins to trade pedagogical depth for coverage — producing content that is broad but lacks the depth required to translate into capability. Programs built under scope pressure also tend to be harder to maintain after launch, introducing recurring operational costs that persist over time.
What governance mechanisms prevent scope creep in course development?
Three mechanisms are essential: a specific, signed-off scope document that defines what is in and out of the current build; a formal change process that makes trade-offs visible before any addition is absorbed; and outcome-anchored decision making that tests every new request against the learner's actual needs rather than stakeholder preference.
What should a learning design team do if scope creep is already occurring mid-project?
A formal scope alignment session with all key stakeholders is the recommended step. This surfaces the assumptions that have accumulated, names the trade-offs that haven't been made consciously, and creates cleaner decision-making conditions going forward. Continuing to build without a reset typically results in a program that is misaligned with its original intent and requires remediation post-launch.