
Kenichiro Nakamura


C#: Azure Open AI and Function Calling

As announced in this post, Azure Open AI now supports the Function Calling feature.

I won't explain what it is here; instead, I share my experiment results and the C# code.

Prerequisites

  • An Azure subscription and an Azure Open AI account
  • A deployed model that supports Function Calling, e.g. gpt-35-turbo-16k
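
The code below uses the Azure.AI.OpenAI client SDK, which was still a prerelease NuGet package at the time of writing. Assuming a plain .NET console project, adding it looks roughly like this (the exact version you get may differ):

dotnet add package Azure.AI.OpenAI --prerelease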

Test scenario

I wondered whether the LLM can chain functions when needed, and how it behaves when multiple functions are available. So I tested it.

Firstly, I added two functions.

GetWeatherFunction.cs

using System;
using System.Text.Json;
using Azure.AI.OpenAI;

public class GetWeatherFunction
{
    public static string Name = "get_current_weather";

    // Return the function metadata
    public static FunctionDefinition GetFunctionDefinition()
    {
        return new FunctionDefinition()
        {
            Name = Name,
            Description = "Get the current weather in a given location",
            Parameters = BinaryData.FromObjectAsJson(
                new
                {
                    Type = "object",
                    Properties = new
                    {
                        Location = new
                        {
                            Type = "string",
                            Description = "The city and state, e.g. San Francisco, CA",
                        },
                        Unit = new
                        {
                            Type = "string",
                            Enum = new[] { "Celsius", "Fahrenheit" },
                        }
                    },
                    Required = new[] { "location" },
                },
                new JsonSerializerOptions() { PropertyNamingPolicy = JsonNamingPolicy.CamelCase }),
        };
    }

    // The function implementation. It always returns 31 for now.
    public static Weather GetWeather(string location, string unit)
    {
        return new Weather() { Temperature = 31, Unit = unit };
    }
}

// Argument for the function
public class WeatherInput
{
    public string Location { get; set; } = string.Empty;
    public string Unit { get; set; } = "Celsius";
}

// Return type
public class Weather
{
    public int Temperature { get; set; }
    public string Unit { get; set; } = "Celsius";
}
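As a quick check (my own addition, not part of the sample), you can print the serialized Parameters to see exactly what schema the model receives. With the camelCase naming policy above, the anonymous object comes out like this:

// Print the JSON schema that will be sent to the model for this function.
Console.WriteLine(GetWeatherFunction.GetFunctionDefinition().Parameters.ToString());
// Output (a single line, shown here wrapped):
// {"type":"object","properties":{"location":{"type":"string",
//   "description":"The city and state, e.g. San Francisco, CA"},
//   "unit":{"type":"string","enum":["Celsius","Fahrenheit"]}},
//   "required":["location"]}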

GetCapitalFunction.cs

using System;
using System.Text.Json;
using Azure.AI.OpenAI;

public class GetCapitalFunction
{
    public static string Name = "get_capital";

    // Return the function metadata
    public static FunctionDefinition GetFunctionDefinition()
    {
        return new FunctionDefinition()
        {
            Name = Name,
            Description = "Get the capital of the location",
            Parameters = BinaryData.FromObjectAsJson(
                new
                {
                    Type = "object",
                    Properties = new
                    {
                        Location = new
                        {
                            Type = "string",
                            Description = "The city, state or country, e.g. San Francisco, CA",
                        }
                    },
                    Required = new[] { "location" },
                },
                new JsonSerializerOptions() { PropertyNamingPolicy = JsonNamingPolicy.CamelCase }),
        };
    }

    // The function implementation. It always returns Tokyo for now.
    public static string GetCapital(string location)
    {
        return "Tokyo";
    }
}

// Argument for the function
public class CapitalInput
{
    public string Location { get; set; } = string.Empty;
}

Then I register them in Program.cs.

using System.Text.Json;
using Azure;
using Azure.AI.OpenAI;

Uri openAIUri = new("https://<your account>.openai.azure.com/");
string openAIApiKey = "<your key>";
string model = "gpt-35-turbo-16k";

// Instantiate OpenAIClient for Azure Open AI.
OpenAIClient client = new(openAIUri, new AzureKeyCredential(openAIApiKey));

ChatCompletionsOptions chatCompletionsOptions = new();
ChatCompletions response;
ChatChoice responseChoice;

// Add function definitions
FunctionDefinition getWeatherFunctionDefinition = GetWeatherFunction.GetFunctionDefinition();
FunctionDefinition getCapitalFunctionDefinition = GetCapitalFunction.GetFunctionDefinition();
chatCompletionsOptions.Functions.Add(getWeatherFunctionDefinition);
chatCompletionsOptions.Functions.Add(getCapitalFunctionDefinition);
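As a side note, the prerelease SDK I used also exposes a FunctionCall property on ChatCompletionsOptions to control whether the model is allowed to call functions. Leaving it unset means the service decides automatically; if your SDK version has it, the explicit form looks like this (treat the property and FunctionDefinition.Auto as an assumption to verify against your SDK version):

// Assumption: available in the prerelease Azure.AI.OpenAI SDK.
// "Auto" lets the model decide whether to call one of the registered functions.
chatCompletionsOptions.FunctionCall = FunctionDefinition.Auto;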

I set the user question as below.

string question = "What's the weather in the capital city of Japan?";
chatCompletionsOptions.Messages.Add(new(ChatRole.User, question));

Then I make the first completion call and keep calling the API in a loop while the finish reason is a function call rather than stop.

  • If the finish reason is a function call, then
    • Get the argument values
    • Call the corresponding function
  • Register the response message and the function result to chatCompletionsOptions.Messages.
  • Call the LLM again with the history.
// Make the first call so responseChoice is populated before entering the loop.
response = await client.GetChatCompletionsAsync(model, chatCompletionsOptions);
responseChoice = response.Choices[0];

while (responseChoice.FinishReason == CompletionsFinishReason.FunctionCall)
{
    // Add the assistant message (including its function call) to the history.
    chatCompletionsOptions.Messages.Add(responseChoice.Message);

    if (responseChoice.Message.FunctionCall.Name == GetWeatherFunction.Name)
    {
        string unvalidatedArguments = responseChoice.Message.FunctionCall.Arguments;
        WeatherInput input = JsonSerializer.Deserialize<WeatherInput>(unvalidatedArguments,
            new JsonSerializerOptions() { PropertyNamingPolicy = JsonNamingPolicy.CamelCase })!;
        var functionResultData = GetWeatherFunction.GetWeather(input.Location, input.Unit);
        var functionResponseMessage = new ChatMessage(
            ChatRole.Function,
            JsonSerializer.Serialize(
                functionResultData,
                new JsonSerializerOptions() { PropertyNamingPolicy = JsonNamingPolicy.CamelCase }));
        functionResponseMessage.Name = GetWeatherFunction.Name;
        chatCompletionsOptions.Messages.Add(functionResponseMessage);
    }
    else if (responseChoice.Message.FunctionCall.Name == GetCapitalFunction.Name)
    {
        string unvalidatedArguments = responseChoice.Message.FunctionCall.Arguments;
        CapitalInput input = JsonSerializer.Deserialize<CapitalInput>(unvalidatedArguments,
            new JsonSerializerOptions() { PropertyNamingPolicy = JsonNamingPolicy.CamelCase })!;
        var functionResultData = GetCapitalFunction.GetCapital(input.Location);
        var functionResponseMessage = new ChatMessage(
            ChatRole.Function,
            JsonSerializer.Serialize(
                functionResultData,
                new JsonSerializerOptions() { PropertyNamingPolicy = JsonNamingPolicy.CamelCase }));
        functionResponseMessage.Name = GetCapitalFunction.Name;
        chatCompletionsOptions.Messages.Add(functionResponseMessage);
    }

    // Call the LLM again to generate the next response.
    response = await client.GetChatCompletionsAsync(model, chatCompletionsOptions);
    responseChoice = response.Choices[0];
}

Console.WriteLine(responseChoice.Message.Content);
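One design note on the loop above: the if/else chain grows with every new function. A sketch of how I might restructure the dispatch with a dictionary instead (the handlers variable and its shape are my own illustration, not from the sample above):

// A refactoring sketch (my own illustration): map function names to handlers that
// take the raw JSON arguments and return the serialized result.
var jsonOptions = new JsonSerializerOptions() { PropertyNamingPolicy = JsonNamingPolicy.CamelCase };
var handlers = new Dictionary<string, Func<string, string>>()
{
    [GetWeatherFunction.Name] = arguments =>
    {
        WeatherInput input = JsonSerializer.Deserialize<WeatherInput>(arguments, jsonOptions)!;
        return JsonSerializer.Serialize(GetWeatherFunction.GetWeather(input.Location, input.Unit), jsonOptions);
    },
    [GetCapitalFunction.Name] = arguments =>
    {
        CapitalInput input = JsonSerializer.Deserialize<CapitalInput>(arguments, jsonOptions)!;
        return JsonSerializer.Serialize(GetCapitalFunction.GetCapital(input.Location), jsonOptions);
    },
};

// Inside the while loop, the if/else chain then collapses to a single lookup:
// string name = responseChoice.Message.FunctionCall.Name;
// string result = handlers[name](responseChoice.Message.FunctionCall.Arguments);
// chatCompletionsOptions.Messages.Add(new ChatMessage(ChatRole.Function, result) { Name = name });

With this shape, the loop body stays the same as more functions are registered; only the handlers dictionary grows.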

Result

  1. I sent "What's the weather in the capital city of Japan?" to the LLM.
  2. The LLM returned CompletionsFinishReason.FunctionCall, asking to use get_capital.
  3. I sent the message back to the LLM with the function result (Tokyo).
  4. The LLM returned CompletionsFinishReason.FunctionCall, asking to use get_current_weather.
  5. I sent the message back to the LLM with the function result (31 degrees).
  6. The LLM returned the final response: "The current weather in the capital city of Japan, Tokyo, is 31 degrees Celsius."

Conclusion

I see that the LLM can chain functions depending on the user input. This behavior is similar to the Semantic Kernel Planner, so I will compare them when I have time.

Top comments (1)

Atinder Pal Singh


Did I miss where responseChoice is initialized?