Training, Enablement & Knowledge Sharing · 4 min read
Building AI Skills Across the Documentation Organization
I designed and led hands-on AI prompting workshops for technical writers, helping dozens of peers use generative and agentic AI to solve real documentation problems—not just toy examples.

Overview
As generative AI tools became more widely available inside AWS, many technical writers were curious about how to use them—but most training examples focused on generic chat use cases, not real documentation problems. Writers needed practical guidance on how to design prompts that could help with their day-to-day work: checking accessibility, spotting gaps in history logs, and cleaning up complex content structures.
To address this, I designed and led two Quantum Leap training sessions focused on AI prompting for technical writers: an introductory session on prompting techniques and a follow-up workshop on writing prompts for specific, documentation-flavored tasks. I delivered the series three times, and each session drew about 45 writers.
These workshops helped turn AI from an abstract idea into a concrete tool that writers could use to improve quality, speed, and consistency in their work.
My Role
I owned the full lifecycle of the workshops, including:
- Defining learning objectives and selecting use cases that reflected real doc work
- Designing the content, slides, and hands-on exercises
- Testing prompts with internal tools before the sessions
- Facilitating live workshops with small-group collaboration
- Iterating based on feedback between sessions
The Problem
Technical writers were under growing pressure to “use AI,” but:
- Most examples they saw were generic (summarize this, fix this email, write a poem)
- There was little guidance on how to use AI for documentation-specific tasks
- People were unsure how to structure prompts for consistent, reliable outcomes
- The risk was that AI would become a novelty instead of a meaningful tool in the writing workflow
Writers needed concrete, relevant examples and a safe space to experiment with prompts that addressed real documentation challenges.
My Approach
1. Design training around real documentation tasks
Instead of teaching prompting in the abstract, I anchored the workshops around three documentation-focused scenarios:
- Finding tables that don’t meet accessibility standards
- Identifying missing entries in documentation history logs
- Detecting and pruning dead branches in a docs repository
These scenarios reflected genuine pain points and made the workshops feel directly applicable to participants’ work.
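To give a flavor of the first scenario, here is a minimal sketch of an accessibility-check prompt. The criteria and sample table are hypothetical illustrations, not the actual workshop material:

```python
# Hypothetical sketch of a documentation-focused prompt for the
# accessibility scenario. The checks and the sample table are invented
# for illustration; the real workshop prompts targeted internal docs.
ACCESSIBILITY_PROMPT = """\
You are reviewing HTML tables in technical documentation for accessibility.

For the table below, report any of these problems:
- No header row (missing <th> cells)
- Header cells missing scope="col" or scope="row"
- Table used purely for visual layout rather than tabular data

Respond with a bulleted list of problems, or "No issues found."

Table:
{table_html}
"""

sample_table = "<table><tr><td>Region</td><td>Endpoint</td></tr></table>"
print(ACCESSIBILITY_PROMPT.format(table_html=sample_table))
```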
2. Build a clear prompting framework
For the Intro to prompting session, I structured the content around:
- Different prompting methods: zero-shot, one-shot, few-shot, chain-of-thought, iterative, prompt chaining, and negative prompting
- Key components of effective prompts (context, role, constraints, examples, and success criteria)
- Common mistakes, like under-specifying tasks or omitting constraints
This gave attendees a mental model they could reuse across tools and use cases.
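As a concrete illustration of that mental model (my own sketch here, not the workshop slides), the snippet below assembles the five components into a single prompt for the history-log scenario. All of the field contents are hypothetical:

```python
# Illustrative only: composes the five prompt components from the
# session (context, role, constraints, examples, success criteria)
# into one prompt string. Field contents are invented examples.
def build_prompt(context, role, constraints, examples, success_criteria, task):
    parts = [
        f"Context: {context}",
        f"Role: {role}",
        "Constraints:\n" + "\n".join(f"- {c}" for c in constraints),
        "Examples:\n" + "\n".join(f"- {e}" for e in examples),
        f"Success criteria: {success_criteria}",
        f"Task: {task}",
    ]
    return "\n\n".join(parts)

print(build_prompt(
    context="You are given the change-history log of a documentation page.",
    role="Act as a meticulous documentation editor.",
    constraints=["Do not invent entries", "Flag gaps longer than 90 days"],
    examples=["2024-03-01 -> 2024-07-15: gap of 136 days (flag)"],
    success_criteria="Every gap is listed with its start and end dates.",
    task="List any missing or suspicious entries in the log below.",
))
```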
3. Create hands-on exercises with internal tools
I designed the workshops so attendees weren’t just watching—they were doing:
- Participants worked directly with internal tools (like Q CLI and other AWS-internal AI assistants)
- Each exercise walked them through designing, testing, and refining prompts for a specific documentation task
- Writers compared outputs, discussed what worked or failed, and iterated in real time
The second workshop, Writing prompts for specific purposes, focused heavily on this hands-on, collaborative work.
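Conceptually, each exercise was a draft, run, compare loop. A rough sketch of that loop follows, assuming a CLI whose chat subcommand accepts a prompt as an argument; the exact invocation here is an assumption, not the documented interface of the internal tools:

```python
import subprocess

# Rough sketch of the draft/run/refine loop from the exercises.
# Assumes a CLI named "q" whose "chat" subcommand takes a prompt
# argument -- treat this interface as an assumption, not fact.
prompt_versions = [
    "List tables in docs/ that have no header row.",
    "List tables in docs/ that have no header row. "
    "Output one file path per line; do not explain.",
]

for i, prompt in enumerate(prompt_versions, start=1):
    result = subprocess.run(
        ["q", "chat", prompt], capture_output=True, text=True
    )
    print(f"--- prompt v{i} ---")
    print(result.stdout)
```

Comparing the two versions side by side is what made the lesson stick: a single added constraint often turned a chatty answer into output writers could act on directly.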
4. Facilitate collaboration and peer learning
To make the sessions more engaging and less lecture-heavy, I:
- Organized participants into small working groups
- Assigned each group a different problem variant to encourage diverse approaches
- Brought everyone back together to share prompts, outcomes, and lessons learned
This format helped writers learn not just from me, but from each other.
The Solution
By the end of the two-part series, participants had:
- A reusable prompting framework tailored to documentation work
- Concrete examples of prompts for accessibility checks, history validation, and repo cleanup
- A better understanding of how to refine prompts iteratively for better results
- Increased confidence using AI tools as part of their writing and editing workflows
The content wasn’t theoretical—it reflected real issues technical writers face and showed how AI could actually help.
Impact
- Each session drew around 45 technical writers, with strong engagement and active participation.
- The Intro to prompting session recording became one of the most-watched videos on the TCX Writer Academy channel.
- Attendees shared follow-up feedback describing significant learning outcomes and immediate application to their work.
- A documentation manager described the workshop as “a great hands-on activity” and called out the effective small-group logistics and camaraderie it created.
Beyond the individual sessions, the workshops helped establish AI prompting as a practical skill for technical writers—not just a buzzword.
Skills & Tools Used
- Training and curriculum design
- Technical writing and example design for AI use cases
- Prompt engineering for documentation workflows
- Live workshop facilitation and small-group logistics
- Hands-on exercises with internal AI tools (including Q CLI)
- Feedback-driven iteration between sessions
Brandi Hopkins