AI in Instructional Design

It’s hard to miss the froth around the latest advances in AI, particularly in the area of Large Language Models (LLMs) such as ChatGPT or Claude. I’ve been using these tools for a few months now, initially just to try things out but more recently to help me work more efficiently. As an instructional designer, I’m used to working with content of all different shapes and sizes, so a tool like ChatGPT can be enormously useful in helping to explore a topic area when I’m not actually an expert in it.

In this post I’m going to look at how AI can help an instructional designer work more effectively. Later I’ll consider the question – ‘Can AI replace the core skills of an instructional designer?’

From my experience so far, there are several areas where AI can act as a useful digital assistant in the instructional design process:

Initial exploration of the topic area
Generally, as IDs our starting point is a set of content provided by the client’s subject matter expert(s) (SMEs). The format of this initial content dump is highly variable – usually there is too much material and it is often poorly organised. It’s rarely fit for purpose, but if it were, the ID’s task would be much less valuable. Using an LLM to summarise, outline, highlight key topics and fill the gaps can be really insightful at this stage in the design process. Care must be taken, however, as SMEs will be alert to any errors, and trust will be eroded quickly if you pitch the LLM against their expertise.
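If you prefer to script this step rather than paste everything into the chat window, the same kind of request can go through the API. Here’s a minimal sketch assuming the official OpenAI Python client; the function name, model choice and prompt wording are my own, not a fixed recipe:

```python
# A minimal sketch, assuming the official OpenAI Python client (openai >= 1.0)
# and an illustrative model name. The prompt wording is mine, not a recipe.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def outline_topic(sme_text: str) -> str:
    """Ask the model to summarise an SME content dump and flag gaps."""
    prompt = (
        "You are helping an instructional designer get oriented in a new topic.\n"
        "Summarise the content below, list the key topics in a logical order, "
        "and note any obvious gaps a learner would need filled.\n\n"
        f"CONTENT:\n{sme_text}"
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Example: print(outline_topic(open("sme_notes.txt").read()))
```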

Exploring the audience
AI can be really useful when exploring what is going to work best for a particular audience. Often SMEs haven’t really nailed down their intended audience – that’s why there is so much material – as experts they feel they need to cover all bases. By adding a suitable qualifier to your prompts, such as ‘…for a non-technical audience of marketing managers in a non-profit’, you can explore what content is likely to work best for your audience.

Creating an outline learning plan
LLMs are good at suggesting a learning plan given the right prompts. For example, you can ask the LLM to include reflection points or mini-assignments, and these can be aimed specifically at the target audience you identified in the step above. Remember you are using the AI to help with your thinking, not to come up with a final design.
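As a rough illustration, here’s the kind of prompt I mean, wrapped in a small helper so the audience and constraints from the earlier steps can be slotted in. The helper, its parameters and the example values are just for illustration:

```python
# Illustrative only: a small helper that assembles a learning-plan prompt
# from the audience and constraints identified earlier. Names are made up.
def learning_plan_prompt(topic: str, audience: str, duration: str) -> str:
    return (
        f"Draft an outline learning plan for a course on '{topic}' "
        f"aimed at {audience}. Total learning time: {duration}.\n"
        "Include a reflection point after each module and one short "
        "mini-assignment the learner can complete on the job.\n"
        "Present the plan as a numbered list of modules with a one-line "
        "objective for each."
    )

print(learning_plan_prompt(
    "data protection basics",
    "non-technical marketing managers in a non-profit",
    "90 minutes",
))
```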

Writing assessment questions
LLMs also do a reasonable first pass at assessment questions. They construct questions well and provide plausible distractors (the incorrect responses to multiple-choice questions), unlike SMEs, who generally don’t write good questions and, even when they do, struggle to think up plausible distractors.
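Here’s a sketch of the sort of prompt that tends to produce a usable first pass. The structure it asks for (stem, correct answer, distractors, rationale) is my own preference, not a standard:

```python
# Illustrative prompt for first-pass multiple-choice questions with
# plausible distractors. The requested structure is an assumption about
# what reviews well with SMEs.
def mcq_prompt(learning_objective: str, n_questions: int = 3) -> str:
    return (
        f"Write {n_questions} multiple-choice questions that test this "
        f"learning objective: '{learning_objective}'.\n"
        "For each question give: the stem, one correct answer, three "
        "plausible distractors based on common misconceptions, and a "
        "one-sentence rationale for the correct answer."
    )
```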

Exploring scenarios
LLMs can be useful for generating different types of scenarios to use in elearning. They can provide outlines quickly and explore different approaches which can then be refined during storyboarding.

Chunking content
Once you start storyboarding, the LLMs can be useful for chunking content. For example, when creating a click-to-reveal that explores a process, the LLMs can be used to provide draft content in equivalent-sized chunks.
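For example, a chunking prompt along these lines keeps the reveals roughly the same length. The step count and word limit are assumptions you would tune per screen:

```python
# Illustrative prompt for chunking a process into a click-to-reveal
# interaction; step count and word limit are assumptions to tune per screen.
def chunk_prompt(process_text: str, steps: int = 5, max_words: int = 40) -> str:
    return (
        f"Rewrite the process below as {steps} steps for a click-to-reveal "
        "interaction. Each step needs a short title (max 5 words) and a "
        f"description of no more than {max_words} words, so the chunks are "
        "roughly the same size.\n\n"
        f"PROCESS:\n{process_text}"
    )
```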

Currently, my use of AI is largely done out of sight of the client. I use it to prepare for design sessions, to create potential outlines and to give me ideas that I can develop in the storyboarding stage. However, I’m planning to use it in a design workshop with a client for a future project. I just need the right client and the right project!

There are some issues, of course. The first: is it acceptable to share SME content with the AI tool? Currently I don’t do this directly, only indirectly using snippets. The second issue, and probably the biggest downside to LLMs, is the ‘generic corporate speak’ that seems to creep into their output. You can overcome this with careful prompt engineering, but out of the box LLMs tend to write text that initially looks insightful yet is stuffed with corporate non-speak. This sort of text is poor in elearning content, so steps must be taken to get the AI to ‘cut the crap’.
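One way to do that, sketched below, is to keep a reusable style instruction and send it as a system message alongside every request. The wording is my own and you would adapt it to your house style:

```python
# One way to keep the tone in check: a reusable style instruction passed as a
# system message. The wording is illustrative; adjust it to your house style.
STYLE_GUARD = (
    "Write in plain, direct English aimed at busy adults. "
    "Avoid corporate jargon, buzzwords and filler phrases such as "
    "'leverage', 'synergy' and 'in today's fast-paced world'. "
    "Prefer short sentences and concrete examples."
)

messages = [
    {"role": "system", "content": STYLE_GUARD},
    {"role": "user", "content": "Summarise our onboarding process for new hires."},
]
# These messages can then be sent with client.chat.completions.create(...)
# as in the earlier sketch.
```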

As an ID with a strong design background, I see visuals as really important in my courses, particularly visuals that help people understand. I try to create learning visuals where possible, or to use analogies or metaphors to strengthen the learning. Visual interpretation is an area where it’s harder to get the AI to give decent results. Even asking the AI to create simple images can be tricky, since there are so many ways in which pictures can be created with different styles, moods, colours and so on.

Here’s a task I gave ChatGPT 4:
“Create a visual metaphor that I can use to explain the ADDIE process to someone learning to be an instructional designer.”

Here’s the output:
This metaphor illustrates the structured and nurturing approach of the ADDIE process, similar to growing and maintaining a garden:

Analysis: A gardener planning the garden, with tools like a notebook and diagrams.
Design: The gardener laying out the garden beds and paths, with markers and measuring tools.
Development: Planting seeds and young plants, with a watering can and planting tools.
Implementation: Tending to the growing plants, with tools for weeding and watering.
Evaluation: Harvesting and assessing the produce, with baskets and scales.

ADDIE as a garden metaphor

So will AI replace instructional designers? 
Several learning authoring tools are promoting the use of AI to help create content, and if you are creating a vanilla course from scratch on a generic topic these tools can genuinely get you off to a flying start. However, when it comes to effective learning design, a good instructional designer still has a key role to play. But IDs will need to become effective prompt engineers and know when and how to use AI as part of the development process to maximise efficiency. As they say, two heads are better than one – even if one head is a bot!

All images in this post were created by ChatGPT4o.
