In my previous post, we saw how to build the simplest local AI agent using Semantic Kernel and Ollama in C#. In this short post, we will see how simple it is to extend that agent's capabilities by adding function calling.
Introduction
Function calling, sometimes referred to as tools, is a powerful way to extend the capabilities of your local AI agent. It allows the agent to call external APIs, access databases, and perform other tasks that are not possible with the built-in functions of Semantic Kernel. To better understand function calling, you might read Learning AI function calling in C# with Llama 3.2 SLM and Ollama running on your machine, where I describe the core concepts behind it.
Extending the Simplest Non-agentic Agent with Functions
I won’t go through the process of creating the console application and adding the required NuGet packages. You can refer to the previous post for that.
This time, we are using the llama3.2 model with Ollama, because that SLM supports function calling.
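If you don't already have the model locally, you can pull it with the Ollama CLI (this assumes Ollama is installed and its server is running on your machine):

```shell
# Download the llama3.2 model so it is available to the local Ollama server
ollama pull llama3.2
```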
Let’s enhance the simplest non-agentic agent with function calling by updating the created Program.cs file as follows:
```csharp
var builder = Kernel.CreateBuilder();
```
In summary, this involves adding a plugin to the Kernel and configuring the agent to allow the model to decide whether to call the function.
It’s impressively simple. Note that there are multiple ways to define a plugin; here, the simplest approach is used.
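The two steps described above can be sketched as follows. This is a minimal sketch, not the post's exact listing: the `TimeInformation` plugin class, its `GetCurrentUtcTime` function, and the local Ollama endpoint URL are illustrative assumptions, and the connector API may differ slightly between Semantic Kernel preview versions:

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.Ollama;

var builder = Kernel.CreateBuilder();
// Endpoint assumes Ollama's default local address.
builder.AddOllamaChatCompletion("llama3.2", new Uri("http://localhost:11434"));
var kernel = builder.Build();

// Step 1: register the plugin so the model can discover its functions.
kernel.Plugins.AddFromType<TimeInformation>();

// Step 2: let the model decide whether to call a function.
var settings = new OllamaPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

var result = await kernel.InvokePromptAsync(
    "What time is it in Illzach, France?",
    new KernelArguments(settings));
Console.WriteLine(result);

// The simplest way to define a plugin: a class whose methods are
// annotated with [KernelFunction] and a description for the model.
public class TimeInformation
{
    [KernelFunction]
    [Description("Retrieves the current time in UTC.")]
    public string GetCurrentUtcTime() => DateTime.UtcNow.ToString("R");
}
```

With `FunctionChoiceBehavior.Auto()`, the model is free to answer directly or to request a call to `GetCurrentUtcTime` when the prompt needs it, and Semantic Kernel invokes the function and feeds the result back to the model.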
Running the Agent
To see your agent in action, execute this command in your terminal:
```shell
dotnet run
```
You should see output similar to:
```
In Illzach, France, the current time is 11:00.
```
Conclusion
We demonstrated how easy it is to add function calling to a local AI agent built in C# with Semantic Kernel and running locally on Ollama.
In the next post, we will continue to explore the capabilities of Semantic Kernel agents running locally.
References
Get the source code on GitHub laurentkempe/aiPlayground/SKOllamaAgentWithFunction