Final lesson, and probably my favorite.
We move from a copilot with tools to a system that also uses agents as composable specialists.
Before you start (self-setup)
If you’re following along on your own, complete lesson 0 and lesson 7 first.
This lesson does not require a brand-new Azure resource, but it does add framework dependencies and orchestrates all prior components.
Self-setup: agent dependencies and readiness checks
- Install the Microsoft Agent Framework packages used by the workshop in your copilot project.
- Keep package versions aligned with the workshop repo to avoid API mismatches.
- Verify your existing resources still work before introducing agents:
  - chat model calls succeed
  - Tavily tool calls succeed
  - MCP server lists and runs tools
  - image generation tool returns URLs
Once these checks pass, add the agent orchestration code. Debugging is much easier when the underlying tools are already healthy.
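A quick pre-flight script makes these checks repeatable. This is a minimal sketch, not workshop code: the four check functions are hypothetical placeholders that you would wire to your actual chat client, Tavily tool, MCP server, and image tool from the earlier lessons.

```python
# Pre-flight readiness sketch. Each check_* function is a hypothetical
# placeholder -- replace its body with a real call to the component
# built in the earlier lessons (e.g. send a one-token chat prompt,
# list MCP tools, request a single test image).

def check_chat_model() -> bool:
    return True  # placeholder: confirm a non-empty chat completion

def check_tavily() -> bool:
    return True  # placeholder: confirm a search call returns results

def check_mcp_server() -> bool:
    return True  # placeholder: confirm tools list and one tool runs

def check_image_tool() -> bool:
    return True  # placeholder: confirm a generated image URL comes back

def run_readiness_checks() -> dict[str, bool]:
    checks = {
        "chat model": check_chat_model,
        "Tavily search": check_tavily,
        "MCP server": check_mcp_server,
        "image generation": check_image_tool,
    }
    results = {name: fn() for name, fn in checks.items()}
    for name, ok in results.items():
        print(f"{'PASS' if ok else 'FAIL'}: {name}")
    return results

if __name__ == "__main__":
    assert all(run_readiness_checks().values()), "fix failures before adding agents"
```

Running this before any agent work means an agent-layer bug can't be confused with a broken underlying tool.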
Copilot vs agent
The workshop frames agents as “LIT”:
- LLM-powered
- Instruction-driven
- Tool-using
Copilot is the user-facing conversational surface.
Agents are focused components that can be called by the copilot (or by other agents) to do bounded jobs.
First agent: story creation
The initial StoryAgent is created with Microsoft Agent Framework and exposed as an AI tool.
This already gives a big capability jump: users can ask for tailored stories while the core copilot remains clean.
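The "agent exposed as a tool" idea can be sketched in a few lines. Note this is an illustrative stand-in, not the Microsoft Agent Framework API: the real workshop code builds `StoryAgent` with the framework and registers it as an AI tool on the copilot's chat client, but the shape of the wrapper is the same.

```python
from dataclasses import dataclass
from typing import Callable

# Illustrative sketch only: StoryAgent and as_tool are hypothetical
# stand-ins for the framework-built agent and tool registration.

@dataclass
class StoryAgent:
    instructions: str = "You write short Star Wars-style stories."

    def run(self, request: str) -> str:
        # In the real agent this sends self.instructions + request to the LLM.
        return f"[story written for: {request}]"

def as_tool(agent: StoryAgent) -> Callable[[str], str]:
    """Expose the agent as a plain callable the copilot can invoke as a tool."""
    def create_story(request: str) -> str:
        return agent.run(request)
    return create_story

# The copilot registers create_story alongside its other tools.
create_story = as_tool(StoryAgent())
```

Because the copilot only sees a tool, the agent's instructions and model choice stay encapsulated and can change without touching the copilot.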
Multi-agent workflow
Then the workshop introduces an “agents as tools” orchestration pattern:
- StoryAgent creates the story
- StorySummaryAgent extracts scene prompts
- ImageGenerationAgent uses the image tool to generate visuals
- StoryGenerationAgent supervises and returns the story + image URLs
This is a practical orchestration pipeline, not just an abstract demo.
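The data flow of that pipeline can be sketched with plain functions as stand-in agents. The names mirror the workshop's agents, but the bodies are illustrative placeholders, not the real Agent Framework implementations; the point is the supervisor chaining the specialists.

```python
# Sketch of the "agents as tools" pipeline with stub agents.
# Each function stands in for an LLM-backed agent from the workshop.

def story_agent(request: str) -> str:
    # Real agent: writes the full story with the chat model.
    return f"Once upon a time... ({request})"

def story_summary_agent(story: str) -> list[str]:
    # Real agent: extracts one image prompt per scene; we fake two scenes.
    return [f"scene 1 of: {story[:30]}", f"scene 2 of: {story[:30]}"]

def image_generation_agent(prompts: list[str]) -> list[str]:
    # Real agent: calls the image generation tool and returns URLs.
    return [f"https://example.com/image/{i}" for i, _ in enumerate(prompts)]

def story_generation_agent(request: str) -> dict:
    """Supervisor: chains the specialist agents and assembles the result."""
    story = story_agent(request)
    prompts = story_summary_agent(story)
    urls = image_generation_agent(prompts)
    return {"story": story, "image_urls": urls}

result = story_generation_agent("a droid who learns to paint")
```

The supervisor owns sequencing and error handling, while each specialist stays a black box behind its interface, which is exactly what makes the agents swappable.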
Why this matters
This pattern scales well because each agent has:
- narrow responsibility
- its own instructions
- reusable interface
You can improve one agent without rewriting the whole system.
It’s the same architectural principle as microservices, just in AI-native form.
Suggested banner prompt
A cinematic command-center scene with three specialized AI holograms (story writer, scene summarizer, image artist) collaborating under a supervising orchestration AI, producing a final illustrated story output. Epic space-opera style, high detail, no text, no logos.
Follow along
Workshop source for this lesson: Lesson 8 README.
And that’s the full 8-part journey: chat, memory, model choice, tools, MCP, RAG, multimodal, and agents.
Note: Original workshop repository: jimbobbennett/StarWarsCopilot.
