Solving the e-learning costing conundrum

The impossible request

Have you ever been asked, “Can you just give me a quick quote for some e-learning?” with no details about scope, audience, or complexity? If so, you’ll know the frustration of the “piece of string” problem: being asked to estimate something without the information needed to make that estimate meaningful.

"Can you give me a quote for some e-learning? I need it for a research proposal deadline on Friday."
What we don't know:
  • Scope and complexity
  • Learning outcomes
  • Target audience
  • Platform requirements
  • Media needs
  • Compliance requirements
  • Timeline expectations

At the University of Southampton, we’ve developed a systematic approach to transform vague requests into realistic, evidence-based project plans. Here’s how.

The challenge: vague requests and unrealistic expectations

Colleagues often approach us with urgent deadlines and minimal detail. They might know the topic and desired duration, but not the learning outcomes, platform requirements, or compliance needs. Yet they expect a reasonable quote.

Without understanding scope and complexity, any figure is guesswork. A simple text-based module might take around 113 hours, while an immersive simulation could require 750 hours or more. The difference is huge – and stakeholders rarely appreciate this upfront.

Why systematic scoping matters

Whether you’re in higher education, corporate training, or healthcare, the underlying challenge is the same: you need to uncover true project complexity before making estimates. This isn’t just about costing – it’s about resource allocation, timeline planning, and quality assurance.

Beyond monetary costing, different institutional contexts face the same challenge:
  • Research universities: formal costing for grant applications
  • Teaching-focused institutions: capacity planning and workload allocation
  • Commercial training providers: client quotations and project scoping
  • Internal learning teams: resource allocation across competing projects

Whatever the setting, systematic scoping underpins:
  • Appropriate resource allocation
  • Realistic timeline estimation
  • Stakeholder expectation management
  • Quality assurance

Building on research: evidence-based time allocation

Our framework draws on industry benchmarks like Chapman Alliance’s 2010 study and recent data from ATD and Christy Tucker. For example:

  • Basic content: ~113 hours per 60-minute course
  • Intermediate interactive: ~534 hours
  • Advanced simulation: ~750 hours

These figures include compliance, accessibility, SME involvement, and review cycles – factors often overlooked in quick estimates.

Chapman Alliance (2010), industry benchmark study:
  • 249 organisations, 3,947 learning professionals
  • Level 1 (basic): 79 hours per finished hour
  • Level 2 (interactive): 184 hours per finished hour

Our framework, three tiers for a healthcare/HE context:
  • Basic content: 113 hours per 60-minute course
  • Intermediate interactive: 534 hours per 60-minute course
  • Advanced simulation: 750 hours per 60-minute course
  • Includes compliance requirements, SME involvement, accessibility implementation, and review cycles
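A minimal Python sketch of the tier lookup, using the per-finished-hour figures above (the function and names are illustrative, not part of our Excel tool):

```python
# Hours of development time per finished hour of content, per framework tier.
TIER_HOURS_PER_FINISHED_HOUR = {
    "basic": 113,
    "intermediate": 534,
    "advanced": 750,
}

def development_hours(tier: str, duration_minutes: float) -> float:
    """Estimate development hours for a course of the given running time."""
    return TIER_HOURS_PER_FINISHED_HOUR[tier] * (duration_minutes / 60)

# A 30-minute basic module: 113 * 0.5 = 56.5 hours
print(development_hours("basic", 30))
```

Even at the basic tier, a half-hour module represents well over a working week of development time – a useful first reality check in scoping conversations.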

What drives complexity?

Several elements can dramatically increase development time:

  • Scenario-based learning: Each branching scenario adds roughly 42–75 hours.
  • Compliance: WCAG 2.1 AA adds ~35% to development time.
  • SME involvement: Healthcare projects require ~35% of total time for SME review.

These multipliers often surprise stakeholders – and they’re why systematic questioning is essential.

What transforms "simple" projects into complex ones:

Scenario-based learning
  • Moderate scenario (3–4 decision points): +~42 hours
  • Complex scenario (5+ pathways): +~75 hours
  • Multiple scenarios multiply quickly

Compliance requirements
  • PSBAR compliance, including WCAG 2.1 AA: +35% development time
  • EAA compliance: a further +5%

Subject matter expert involvement
  • Healthcare content: 35% of total time
  • General content: 30% of total time
  • Often underestimated or forgotten entirely
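The stacking of these drivers can be sketched as follows – assuming, as the figures above suggest, that scenario hours are additive and compliance uplifts multiplicative (the calculator's exact formula may differ):

```python
# Additional development hours per branching scenario, by complexity.
SCENARIO_HOURS = {"moderate": 42, "complex": 75}

def adjusted_hours(base_hours: float, scenarios=(), psbar: bool = False,
                   eaa: bool = False) -> float:
    """Apply scenario add-ons and compliance uplifts to a base estimate."""
    hours = base_hours + sum(SCENARIO_HOURS[s] for s in scenarios)
    if psbar:
        hours *= 1.35  # PSBAR incl. WCAG 2.1 AA: +35%
    if eaa:
        hours *= 1.05  # European Accessibility Act: a further +5%
    return hours

# A basic 113-hour module with one moderate scenario and PSBAR compliance:
# (113 + 42) * 1.35 ≈ 209 hours – nearly double the unadjusted figure.
print(adjusted_hours(113, scenarios=("moderate",), psbar=True))
```

This is why a "small tweak" such as adding one branching scenario to a compliant module moves the estimate far more than stakeholders expect.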

Our solution: 20 essential questions

We created a structured question set covering:

  1. Project fundamentals: Purpose, audience, governance
  2. Learning design: Outcomes, instructional strategy, scenario needs
  3. Compliance and delivery: Accessibility, platform, media
  4. Resources and risk: Timeline, SME availability, budget

This transforms the conversation from “give me a quote” to “let’s define what’s feasible.”

From vague requests to defined projects – the 20 questions at a glance:

Project fundamentals (Questions 1–2)
  • Purpose, audience, strategic alignment
  • Stakeholder management and governance

Learning design (Questions 3–7)
  • Learning outcomes and instructional strategy
  • Content complexity and structure
  • Scenario and assessment requirements

Compliance and delivery (Questions 8–13)
  • Regulatory requirements and accessibility
  • Platform selection and media specifications

Resources and risk (Questions 14–20)
  • Timeline, SME availability, budget
  • Risk assessment and quality assurance

The calculator: From inputs to insights

To support this process, we built an Excel-based calculator that:

  • Guides scoping conversations with dropdown menus and validation
  • Applies evidence-based multipliers for complexity and compliance
  • Outputs a clear cost and time breakdown for stakeholders

It’s not just a costing tool – it’s an educational tool that helps colleagues understand what quality e-learning really involves.
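As a rough illustration of that input-to-output flow (not the Excel tool itself – the £48 blended hourly rate is a placeholder assumption; substitute your own institutional rate):

```python
def estimate(tier_hours_per_fh: float, duration_minutes: float,
             scenario_hours: float = 0, compliance_uplift: float = 0.35,
             hourly_rate: float = 48.0) -> dict:
    """Turn scoping inputs into a development-hours and cost breakdown."""
    base = tier_hours_per_fh * duration_minutes / 60
    total_hours = (base + scenario_hours) * (1 + compliance_uplift)
    return {
        "development_hours": round(total_hours, 1),
        "estimated_cost": round(total_hours * hourly_rate, 2),
    }

# A 60-minute intermediate course (534 h per finished hour) with one
# complex scenario: (534 + 75) * 1.35 ≈ 822 development hours.
result = estimate(534, 60, scenario_hours=75)
print(result)
```

Showing stakeholders the breakdown, rather than a single headline figure, is what turns the quote into a conversation about trade-offs.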

Benefits we’ve seen so far

  • Improved accuracy in project scoping
  • Reduced scope creep
  • Better stakeholder understanding of complexity
  • Higher confidence in delivery timelines

And perhaps most importantly, better working relationships – because transparency builds trust.

Before and after: how the calculator transforms conversations.

Before the calculator:
  • Vague requests with unrealistic expectations
  • Time estimates pulled from thin air
  • Scope creep during development
  • Frustrated stakeholders and overworked teams
  • Projects that don't deliver quality

After the calculator:
  • Systematic discovery of true requirements
  • Evidence-based, defensible estimates
  • Clear understanding of trade-offs
  • Informed scope decisions
  • Realistic timelines and resource allocation

Early pilot results:
  • Improved accuracy of project scoping
  • Better stakeholder understanding of complexity
  • Reduced scope creep
  • More realistic timeline expectations
  • Higher confidence in delivery

Getting started

You don’t need the full calculator to begin. Start with the 20-question framework. Track actual project time honestly. Compare estimates to actual time taken. Over time, you’ll build your own evidence base and improve accuracy.

Your expertise: what would make this work in your context?

  • What would you need to ask that we haven't included? Are there essential scoping questions we've missed?
  • What institutional factors are critical in your setting that might not be as important in ours? For example, if you work in FE rather than HE, or in commercial training rather than public sector, what changes?
  • Where do you lack reliable data for estimation? We've based our framework on Chapman, ATD, Moore, and our own experience, but are there areas where you simply don't have good benchmark data? This might be where we could collaborate as a sector.
  • How would you need to adapt this approach for your context? Would you need different tiers? Different multipliers? Different outputs?

Key takeaways

  1. Systematic questioning transforms vague requests into defined projects.
  2. Evidence-based estimation is possible and valuable.
  3. Transparency improves relationships rather than complicates them.
  4. Start simple and build over time.

The “piece of string” problem is solvable – not perfectly, but well enough to change how we work and how we’re perceived.

Current limitations:
  • Based primarily on a UK HE context
  • Healthcare/clinical content emphasis
  • Limited data on AI impact (rapidly evolving)
  • SME rates vary widely by discipline
  • Platform-specific to our institutional choices

Ongoing development:
  • Validating with more diverse project types
  • Incorporating feedback from pilot implementations
  • Monitoring AI impact on development time
  • Exploring sector-wide benchmark data collection
  • Developing guidance for different institutional contexts
  • Redeveloping media costings

What we still need:
  • More comprehensive sector benchmark data
  • Evidence on long-term project accuracy
  • Understanding of AI's true impact
  • Validation across different institutional types

Want to learn more?

We’re piloting this framework during 2025/26. We are currently working on redeveloping the media production costing. If you’d like to collaborate on sector-wide benchmarking, get in touch: Tamsyn Smith – T.M.Smith@soton.ac.uk

Associated resources

‘How long is a piece of string?’ presentation delivered at ALTC25

Essential scoping questions for SME discussion

This resource provides a comprehensive framework of 20 scoping questions designed to guide initial discussions between learning designers and subject matter experts (SMEs) when planning eLearning projects, particularly those with clinical or healthcare content.

The document systematically addresses critical project planning areas including strategic alignment, stakeholder management, learning design approaches, assessment strategies, accessibility requirements, technical specifications, and resource allocation. It incorporates evidence-based time estimates for different levels of content complexity (basic to advanced) and includes specific cost projections based on development hours.

Key features include:

  • Structured questions covering project fundamentals through to risk management
  • Specific guidance on scenario-based learning and branching complexity
  • Compliance and accessibility considerations aligned with WCAG 2.1 AA standards
  • Platform-specific development time estimates (Blackboard Ultra, Articulate Rise 360, Storyline 360)
  • Budget frameworks with realistic hour-per-finished-content ratios
  • Healthcare-specific considerations including clinical accuracy validation and SME involvement timeframes

This resource would be particularly valuable for learning designers initiating healthcare or clinically-focused eLearning projects, project managers establishing realistic timelines and budgets, or institutions seeking to standardise their eLearning scoping processes. The framework ensures comprehensive early-stage planning whilst managing scope, compliance requirements, and stakeholder expectations effectively.

Advice for reuse

Contextualisation is essential

This framework has been developed specifically for healthcare and clinical eLearning projects within a university setting. Users should adapt the questions to reflect their own institutional context, regulatory requirements, and content areas. Remove or modify healthcare-specific elements (such as clinical accuracy validation cycles) if working in non-clinical domains.

Adjust time and cost estimates

The development time multipliers and hourly rates provided are indicative and based on specific institutional rates and circumstances. Recalculate these figures to reflect your own staffing costs, institutional overheads, and local market rates for media production. Time estimates may also vary depending on your team’s experience and available tools.

Tailor the scope to project scale

Not all 20 question areas will be relevant for every project. Smaller-scale projects may require only a subset of these questions, whilst larger, more complex initiatives might benefit from the full framework. Use this as a starting point and select the sections most pertinent to your specific project requirements.

Update compliance and accessibility standards

Ensure that referenced standards (such as WCAG 2.1 AA) remain current, as accessibility guidelines and regulatory requirements evolve over time. Replace or supplement these with any additional standards relevant to your jurisdiction or professional context.

Integrate with existing processes

This framework should complement, not replace, existing institutional project management or quality assurance processes. Align the scoping questions with your organisation’s established workflows, approval mechanisms, and documentation requirements to avoid duplication or confusion.

UoS eLearning cost calculator V2

This evidence-based Excel calculator provides comprehensive time and cost estimates for eLearning development projects within higher education and healthcare contexts. The tool generates detailed project costings by incorporating multiple variables including content duration, platform selection, authoring tool requirements, scenario complexity, assessment specifications, compliance levels, and media production needs.

The calculator draws on peer-reviewed research and industry benchmarks (Chapman Alliance, 2010; ATD Research, 2023; Tucker, 2025) to produce realistic development time multipliers for different authoring tools and complexity levels. It automatically calculates platform-specific development hours (ranging from basic platform creation to advanced tools such as Articulate Storyline 360), applies appropriate compliance multipliers for WCAG 2.1 AA, PSBAR, and European Accessibility Act requirements, and generates risk assessments based on project specifications.

Key features include automatic platform selection based on target audience (Blackboard Ultra for students, Totara for staff, Course Catalog for external professionals), dynamic cost breakdowns showing development hours and associated costs, scenario-based learning calculations with complexity ratings, and integrated risk assessment with mitigation strategies.

The calculator is particularly suited for learning designers, project managers, and academic staff planning eLearning projects who require accurate budget estimates and timeline projections. It includes comprehensive maintenance documentation and lookup tables that can be updated to reflect changing institutional rates and industry standards.

Advice for reuse

Update institutional rates and costs

The calculator uses specific hourly rates and cost structures that reflect University of Southampton’s internal pricing model. Before using this tool in another institution, update all financial figures in the ‘Lookup_Tables’ sheet to reflect your own staffing costs, overheads, and media production rates. Annual licence costs for authoring tools should also be verified against current vendor pricing.

Verify platform compatibility

The calculator includes platform-specific configurations (Blackboard Ultra, Totara, Course Catalog) that auto-select based on target audience. Adapt these platform options and their associated development time multipliers to match your institution’s learning management systems and hosting arrangements. Remove or add platforms as appropriate for your context.

Review evidence sources and benchmarks

Whilst the calculator is based on Chapman Alliance (2010), ATD Research (2023), and Tucker (2025) benchmarks, development times vary significantly across institutions depending on team experience, available resources, and quality expectations. Validate the time multipliers against your own institutional data and adjust accordingly. Consider conducting pilot projects to calibrate estimates to your specific context.

Adapt compliance requirements

The compliance levels reflect UK higher education regulatory requirements (PSBAR, WCAG 2.1 AA) and European Accessibility Act standards. Modify these to reflect your jurisdiction’s accessibility legislation and institutional policies. Remove EAA compliance options if not relevant to your geographic context, or add additional regulatory frameworks as needed.

Customise authoring tool options

The dropdown menu of authoring tools reflects tools available at University of Southampton. Update this list to include only tools available within your institution, and adjust the associated development time multipliers based on your team’s proficiency with each tool. Remove tools your institution does not license to avoid confusion.

Test all formula dependencies

The calculator uses named ranges and complex formula dependencies across multiple sheets. After making any modifications to the ‘Lookup_Tables’ or ‘Calculations’ sheets, thoroughly test all formula outputs to ensure calculations remain accurate. The ‘Maintenance_Guide’ sheet provides detailed documentation of the calculator’s architecture to support this process.

Adjust scenario complexity definitions

The scenario complexity levels (simple, moderate, complex) are defined by decision point counts that may not align with your institutional definitions. Review these thresholds and adjust both the categorical definitions and their associated development time multipliers to reflect your team’s typical scenario development workflows.

Align with existing project management processes

This calculator focuses specifically on development time and cost estimation. Integrate its outputs with your institution’s broader project management frameworks, quality assurance processes, and approval workflows. The tool should complement, not replace, existing governance structures for eLearning development projects.