In my previous post, we saw how to build the simplest local AI agent using Semantic Kernel and Ollama in C#. In this short post, we will see how simple it is to extend that agent's capabilities by adding function calling.

Introduction

Function calling, sometimes referred to as tools, is a powerful way to extend the capabilities of your local AI agent. It allows the agent to call external APIs, access databases, and perform other tasks that are not possible with the model alone. To better understand function calling, you might read Learning AI function calling in C# with Llama 3.2 SLM and Ollama running on your machine, where I describe the core concepts behind it.

Extending the Simplest Non-agentic Agent with Functions

I won’t go through the process of creating the console application and adding the required NuGet packages. You can refer to the previous post for that.

This time, we are using the llama3.2 model with Ollama, an SLM that supports function calling.

Let’s enhance the simplest non-agentic agent using function calling by updating the created Program.cs file as follows:

using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Agents;
using Microsoft.SemanticKernel.ChatCompletion;

var builder = Kernel.CreateBuilder();
// 👇🏼 Using llama3.2 with Ollama
builder.AddOllamaChatCompletion("llama3.2:3b", new Uri("http://localhost:11434"));

var kernel = builder.Build();

ChatCompletionAgent agent = new() // 👈🏼 Definition of the agent
{
    Instructions =
        """
        Answer questions about different locations.
        For France, use the time format: HH:MM.
        HH goes from 00 to 23 hours, MM goes from 00 to 59 minutes.
        """,
    Name = "Location Agent",
    Kernel = kernel,
    // 👇🏼 Allows the model to decide whether to call the function
    Arguments = new KernelArguments(new PromptExecutionSettings
        { FunctionChoiceBehavior = FunctionChoiceBehavior.Auto() })
};

// 👇🏼 Define a time tool plugin
var plugin =
    KernelPluginFactory.CreateFromFunctions(
        "Time",
        "Get the current time for a city",
        [KernelFunctionFactory.CreateFromMethod(GetCurrentTime)]);
agent.Kernel.Plugins.Add(plugin);

ChatHistory chat =
[
    new ChatMessageContent(AuthorRole.User, "What time is it in Illzach, France?")
];

await foreach (var response in agent.InvokeAsync(chat))
{
    chat.Add(response);
    Console.WriteLine(response.Content);
}

// 👇🏼 Define the time tool
[Description("Get the current time for a city")]
string GetCurrentTime(string city) =>
    $"It is {DateTime.Now.Hour}:{DateTime.Now.Minute} in {city}.";

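The Arguments property above uses FunctionChoiceBehavior.Auto(), which lets the model decide whether to invoke a tool. For reference, Semantic Kernel offers other behaviors as well; a quick sketch of the options:

```csharp
// Auto: the model decides whether to call one of the advertised functions.
FunctionChoiceBehavior.Auto();

// Required: the model is forced to call one of the advertised functions.
FunctionChoiceBehavior.Required();

// None: function definitions are sent to the model for context,
// but the model is not allowed to invoke them.
FunctionChoiceBehavior.None();
```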
In summary, this involves adding a plugin to the Kernel and configuring the agent to allow the model to decide whether to call the function.
It’s impressively simple. Note that there are multiple ways to define a plugin; here, the simplest approach is used.
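For example, instead of wrapping a standalone method with KernelFunctionFactory, you can group related tools in a class decorated with [KernelFunction] attributes and let Semantic Kernel discover them. This is a sketch of that alternative, not part of the original sample; the TimePlugin class name is my own:

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;

public class TimePlugin
{
    [KernelFunction, Description("Get the current time for a city")]
    public string GetCurrentTime(string city) =>
        $"It is {DateTime.Now.Hour}:{DateTime.Now.Minute} in {city}.";
}

// Registration then replaces the CreateFromFunctions call shown above:
// agent.Kernel.Plugins.Add(KernelPluginFactory.CreateFromType<TimePlugin>("Time"));
```

The attribute-based approach scales better once an agent exposes several related tools, since each method becomes a function automatically.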

Running the Agent

To see your agent in action, execute this command in your terminal:

dotnet run

You should see output similar to:

In Illzach, France, the current time is 11:00.

Conclusion

We demonstrated how easy it is to add function calling to a local AI agent built in C# with Semantic Kernel and running locally with Ollama.

In the next post, we will continue to explore the capabilities of Semantic Kernel agents running locally.

References

Get the source code on GitHub laurentkempe/aiPlayground/SKOllamaAgentWithFunction