Lessons From Building Training With GenAI
Generative AI has quickly become a powerful partner in learning design. It can summarize long Subject Matter Expert (SME) interviews, draft learning materials, structure content, and accelerate early-stage design work. In many ways, it acts like a tireless research assistant, helping transform raw expertise into learning experiences.
Yet anyone who has used generative AI in real projects knows the other side of the story: AI is not neutral. When the data is incomplete or the prompt is vague, the system does not simply respond with “I don’t know.” Instead, it fills the gaps, sometimes with plausible but incorrect information. It may invent references, generate unsupported conclusions, or confidently propose ideas that are not aligned with the real context. For Learning and Development (L&D) professionals, this creates an important challenge: How can we use generative AI effectively without losing control over accuracy, authenticity, and accountability in learning content?
In my recent work developing leadership training programs, I found that the answer is not simply better prompting. The key is to build a process that integrates AI responsibly throughout the entire learning design workflow. Here I share several practices that helped me keep AI productive, and under control, while building leadership training.
The Grandma Rule: Always Start With The Objective
My first rule comes from something I learned long before AI existed. During my high school teacher credentialing program, my supervisor used to repeat one simple piece of advice: always start with the objective. People are different, days are different, and the environment is always changing, but the objective remains the cornerstone that keeps the learning experience focused and meaningful. In AI-supported design, this principle becomes even more important.
Before generating any content, I define the learning objectives clearly and explicitly. Every prompt, outline, and content draft is tied back to those objectives. As the project evolves and conversations with SMEs deepen, the objectives may shift slightly, but they always remain the anchor of the process.
This practice helps prevent a common problem with generative AI: content expansion without direction. AI can produce large volumes of polished material, but without a clear objective structure, that material may drift away from the learning goals.
Objectives act as the control system that keeps AI outputs aligned with the purpose of the training.
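One way to make this concrete is to hard-wire the objectives into every generation request. The sketch below is a minimal, hypothetical prompt builder; the function name, template wording, and sample objectives are my own illustrations, not taken from any specific tool.

```python
# Minimal sketch: every prompt sent to a generative model is prefixed
# with the course's learning objectives, so drafts cannot drift silently.
# Objective text, template, and names are illustrative assumptions.

OBJECTIVES = [
    "Give actionable feedback in one-on-one meetings",
    "De-escalate conflict between team members",
]

def build_anchored_prompt(task: str, objectives: list[str] = OBJECTIVES) -> str:
    """Prefix a content-generation task with the learning objectives."""
    objective_block = "\n".join(f"- {o}" for o in objectives)
    return (
        "You are drafting leadership-training content.\n"
        "Every output must serve these learning objectives:\n"
        f"{objective_block}\n\n"
        f"Task: {task}\n"
        "If the task cannot be tied to an objective above, "
        "say so instead of inventing content."
    )
```

Because the objectives travel with every request, a later reviewer can check any draft against the same anchor that produced it.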
Create A Dedicated AI Assistant With Your Sources
Another important practice is creating a project-specific AI assistant rather than relying on a generic chatbot. In my workflow, I upload key materials such as:
- Compliance and policy documents.
- SME notes and summaries.
- Learning frameworks.
- The document that defines course goals and objectives.
These materials become the source base that the AI assistant references when generating content. This approach significantly reduces hallucinations because the system is guided toward verified internal information instead of relying on general internet patterns. It also keeps prompts focused and ensures that generated materials remain connected to the specific learning context. In essence, the assistant becomes a structured knowledge environment rather than a free-floating text generator.
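As a sketch of this idea, the snippet below assembles a grounding context from local project files and instructs the model to answer only from them. The file names and instruction wording are hypothetical; real assistant platforms handle uploads differently, but the grounding principle is the same.

```python
# Sketch of a project-specific assistant context: generation is grounded
# in verified source documents rather than general model knowledge.
# File names and the instruction wording are illustrative assumptions.

from pathlib import Path

SOURCE_FILES = ["compliance_policy.md", "sme_notes.md", "objectives.md"]

def build_assistant_context(source_dir: str) -> str:
    """Concatenate verified project sources into a grounding preamble."""
    sections = []
    for name in SOURCE_FILES:
        path = Path(source_dir) / name
        if path.exists():  # skip missing sources rather than inventing them
            sections.append(f"### Source: {name}\n{path.read_text()}")
    return (
        "Answer using ONLY the sources below. If the sources do not cover "
        "a question, reply that the information is not available.\n\n"
        + "\n\n".join(sections)
    )
```

The explicit fallback instruction ("reply that the information is not available") is what turns the assistant from a free-floating text generator into a bounded knowledge environment.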
Real Practice Comes First
One of the most valuable lessons I learned is that authentic learning experiences must come from real practice, not from AI imagination. Generative AI can create convincing scenarios, but it struggles with the subtle details of local language, tone, and professional nuance. These elements are critical in leadership training and workplace learning.
To address this, I start with real experience.
In my projects, educators and facilitators often record short reflective videos for professional development. These videos capture real conversations, authentic language, and the subtle dynamics of practice. I collect transcripts from these recordings and use them as source material in my AI assistant. Then I prompt the AI to generate scripts or scenarios based on those transcripts while guided by the learning objectives.
This process allows the AI to structure and refine the material while preserving the authentic voice of practitioners. The result is learning content that sounds natural and grounded rather than artificial.
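The transcript-to-scenario step can be sketched as a single prompt that pairs the verbatim transcript with one learning objective and explicitly forbids invention. This is an illustrative template of my own, not the exact prompt used in any given project.

```python
# Sketch: turning a practitioner transcript into a training scenario
# while preserving the authentic voice. Names and wording are assumptions.

def scenario_prompt(transcript: str, objective: str) -> str:
    """Ask the model to restructure a real transcript into a scenario."""
    return (
        f"Learning objective: {objective}\n"
        "Below is a verbatim transcript from a practitioner's reflective "
        "video.\n"
        "Rewrite it as a short training scenario. Preserve the speaker's "
        "original phrasing and tone; do not add facts that are not in the "
        "transcript.\n\n"
        f"Transcript:\n{transcript}"
    )
```

Keeping the transcript verbatim inside the prompt, rather than summarizing it first, is what lets the local language and professional nuance survive the generation step.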
Scaling Learning Without Losing Meaning
One of the most promising uses of AI in learning design is scaling knowledge. Once content is grounded in real experience and aligned with objectives, AI can help refine and expand it. For example, I often prompt the assistant to improve language clarity and apply SEO-oriented phrasing. This helps learning materials become easier to search, discover, and navigate within digital platforms.
However, this step always comes after content alignment, not before. Every revision is checked again against the learning objectives to ensure that clarity improvements or keyword optimization do not distort the intended meaning. AI can amplify language patterns, but Instructional Designers must remain responsible for preserving the integrity of the learning message.
AI As A Language Mirror For Learning
In Language Machines, Leif Weatherby describes how AI surfaces collective language patterns and influences cultural meaning. In many ways, this is exactly what we see when generative AI is used in learning design. AI reflects how people speak, write, and structure ideas across a field. When used responsibly, it can help reveal patterns in organizational knowledge and accelerate the translation of expertise into learning experiences. But this only works when AI is embedded thoughtfully within the learning design process.
For me, this means integrating AI across the stages of the ADDIE model (analysis, design, development, implementation, and evaluation) while maintaining strong collaboration with Subject Matter Experts. AI does not replace the learning designer or the SME. Instead, it becomes a structured partner that helps organize knowledge, refine language, and scale learning experiences. When used this way, generative AI does not dilute authenticity. In fact, it can help protect it if we partner with it wisely.