LLMs are smart, but they’re also confidently wrong sometimes.
Lesson 4 fixes that by giving the copilot a tool to query Wookieepedia via Tavily, so answers can be grounded in external data.
Before you start (self-setup)
If you’re following along on your own, complete lesson 0 and lesson 1 first.
This lesson adds one new external dependency: a Tavily API key.
Self-setup: Tavily API key
- Create a Tavily account at tavily.com (free tier is enough for this workshop).
- Generate an API key from the Tavily dashboard.
- Save it to user secrets:
dotnet user-secrets set "Tavily:ApiKey" "<your-tavily-api-key>"
Quick verification:
dotnet user-secrets list
Make sure Tavily:ApiKey is present before wiring the tool.
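Once the secret is stored, the app can pick it up through the configuration system. A minimal sketch, assuming a console app with the Microsoft.Extensions.Configuration.UserSecrets package installed (the workshop's actual startup code may wire this differently):

```csharp
using Microsoft.Extensions.Configuration;

// Load user secrets scoped to this project's assembly.
var config = new ConfigurationBuilder()
    .AddUserSecrets<Program>()
    .Build();

// Fail fast if the key is missing rather than at the first tool call.
var tavilyApiKey = config["Tavily:ApiKey"]
    ?? throw new InvalidOperationException("Tavily:ApiKey not set in user secrets.");
```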
Why tools
Without tools, your copilot only knows what the model was trained on.
That leads to classic hallucinations for newer entities and events. In the workshop, asking about Kay Vess is a good example.
With tool calling, the model can:
- request a tool call
- receive tool results
- produce a grounded final answer
Building the tool
The workshop creates a WookiepediaTool derived from AIFunction and defines:
- name and natural-language description
- input JSON schema (query)
- return JSON schema (selected Tavily fields)
- InvokeCoreAsync to call the Tavily search API
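Roughly, the tool looks like this. This is a sketch, not the workshop's exact code: the AIFunction member names below match recent Microsoft.Extensions.AI versions but may differ in yours, and the include_domains filter is an assumption about how the search is scoped to Wookieepedia.

```csharp
using System.Net.Http.Json;
using Microsoft.Extensions.AI;

public class WookiepediaTool(string apiKey) : AIFunction
{
    private readonly HttpClient _http = new();

    public override string Name => "WookiepediaTool";

    // The description is what the model reads when deciding whether to call the tool.
    public override string Description =>
        "Searches Wookieepedia for Star Wars facts. Use for characters, events, or lore.";

    protected override async ValueTask<object?> InvokeCoreAsync(
        AIFunctionArguments arguments, CancellationToken cancellationToken)
    {
        var query = arguments["query"]?.ToString() ?? "";

        // Tavily's search endpoint takes a POST with the API key and query;
        // include_domains (hypothetical here) restricts results to Wookieepedia.
        var response = await _http.PostAsJsonAsync("https://api.tavily.com/search",
            new { api_key = apiKey, query, include_domains = new[] { "starwars.fandom.com" } },
            cancellationToken);

        // Return the raw JSON; the SDK serializes it back into the chat as a tool result.
        return await response.Content.ReadAsStringAsync(cancellationToken);
    }
}
```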
Then it enables function invocation middleware:
.UseFunctionInvocation()
and passes the tool through ChatOptions.
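The wiring can be sketched like this, assuming a Microsoft.Extensions.AI IChatClient (innerClient and apiKey stand in for whatever the workshop configures):

```csharp
using Microsoft.Extensions.AI;

// UseFunctionInvocation() inserts middleware that intercepts the model's
// function-call requests, invokes the matching tool, and loops back.
IChatClient client = new ChatClientBuilder(innerClient)
    .UseFunctionInvocation()
    .Build();

// The tool is offered to the model per-request via ChatOptions.
var options = new ChatOptions
{
    Tools = [new WookiepediaTool(apiKey)]
};

var reply = await client.GetResponseAsync("Who is Kay Vess?", options);
Console.WriteLine(reply.Text);
```

Note that registering the tool only makes it available; whether the model calls it is still the model's decision.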
Important mental model
The LLM does not execute your code directly.
It emits a function-call request message. The SDK executes the tool, appends tool output, and calls the LLM again.
So one apparent “answer” can include multiple internal messages:
- assistant function call
- tool result
- final assistant response
Understanding this makes debugging much easier.
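One way to see the round trip is to dump the roles and content types in the response. A sketch, assuming the response exposes its message list as in recent Microsoft.Extensions.AI versions:

```csharp
using Microsoft.Extensions.AI;

// A single "answer" usually unfolds as:
//   assistant -> FunctionCallContent   (the model requests the tool)
//   tool      -> FunctionResultContent (the SDK appends the tool output)
//   assistant -> TextContent           (the grounded final answer)
foreach (var message in reply.Messages)
{
    foreach (var content in message.Contents)
    {
        Console.WriteLine($"{message.Role}: {content.GetType().Name}");
    }
}
```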
Prompting still matters
Even with tools registered, models can ignore them. The lesson improves reliability by nudging the system prompt:
“If you’re not sure, use WookiepediaTool.”
Small instruction, big behavior shift.
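In code, the nudge is just an extra line in the system message. A sketch reusing the client and options from the wiring step:

```csharp
using Microsoft.Extensions.AI;

List<ChatMessage> history =
[
    // The explicit tool mention is the lesson's reliability nudge.
    new(ChatRole.System,
        "You are a Star Wars expert. If you're not sure, use WookiepediaTool."),
    new(ChatRole.User, "Who is Kay Vess?")
];

var reply = await client.GetResponseAsync(history, options);
Console.WriteLine(reply.Text);
```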
Suggested banner prompt
A cinematic digital artwork of an AI assistant in a starship briefing room reaching into a holographic web of knowledge nodes labeled by icons, retrieving verified data into a chat window. Blue and gold lighting, high detail, dynamic composition, no text, no logos.
Follow along
Workshop source for this lesson: Lesson 4 README.
Next up: moving tools into an MCP server so they become reusable across clients.
Note: Original workshop repository: jimbobbennett/StarWarsCopilot.
