Introduction

The world of AI software development is moving faster than most of us can build products with it. The OpenAI SDK is already on V2 of the Assistants API. Although the documentation isn't bad, there is a shortage of examples in C#. Even ChatGPT doesn't quite know how to use the .NET OpenAI SDK properly.

Interacting with Functions

The OpenAI Assistants API is pretty well thought out, although perhaps a bit over-engineered. The idea of functions is that the model determines when to call a function. Once this occurs, you must intercept the call in the polling loop, determine which function to run, and return the result to the run.
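
Before the model can decide to call a function, the assistant has to be created with a function tool definition describing the function's name, what it does, and its JSON-schema parameters. Below is a minimal sketch of that step; WeatherToday is a hypothetical function, and the exact property names (FunctionName, Description, Parameters) may vary slightly between SDK versions, so check against your installed package.

// A sketch of declaring a hypothetical WeatherToday function when creating the assistant.
// Property names follow the OpenAI .NET SDK but may differ by version.
var weatherTool = new FunctionToolDefinition()
{
    FunctionName = "WeatherToday",
    Description = "Gets the current temperature for a city.",
    Parameters = BinaryData.FromString("""
    {
        "type": "object",
        "properties": {
            "city": { "type": "string", "description": "The city to look up." }
        },
        "required": [ "city" ]
    }
    """)
};

var options = new AssistantCreationOptions()
{
    Name = "Weather Assistant",
    Instructions = "Answer weather questions. Call WeatherToday when you need the current temperature.",
    Tools = { weatherTool }
};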

Key Example

This is the main loop, which polls the run until it has completed.

                do
                {
                    // Wait briefly between polls so the API is not hammered
                    await Task.Delay(TimeSpan.FromMilliseconds(500));
                    runResponse = await Client.GetRunAsync(threadId, RunResponse.Id);

                    if (runResponse.Value.Status == RunStatus.RequiresAction) // The run is paused, waiting for one or more function results
                    {
                        var ToolOutputs = new List<ToolOutput>();
                        foreach (var RequiredAction in runResponse.Value.RequiredActions)
                        {
                            // Execute the requested function and capture its string result
                            var Result = await _chatBotService.ProcessFunctionCall(assistantId, RequiredAction.FunctionName, RequiredAction.FunctionArguments);

                            var ToolResponse = new ToolOutput(RequiredAction.ToolCallId, Result);
                            ToolOutputs.Add(ToolResponse);
                        }

                        // Hand the function results back so the run can continue
                        await Client.SubmitToolOutputsToRunAsync(threadId, RunResponse.Id, ToolOutputs);
                    }
                }
                while (runResponse.Value.Status == RunStatus.Queued || runResponse.Value.Status == RunStatus.InProgress || runResponse.Value.Status == RunStatus.RequiresAction);

					

Shown in the example is my _chatBotService, which does the work of calling the required function. You would replace this with your own function-calling code (a sketch of one possible dispatcher follows below). Result would be the string result of your function call; for example, if you are calling the function WeatherToday, the result would be a temperature. The model will interpret the result and return a response once the run finishes.

var Result = await _chatBotService.ProcessFunctionCall(assistantId,RequiredAction.FunctionName, RequiredAction.FunctionArguments);
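
If you don't already have something like _chatBotService, here is a minimal sketch of a dispatcher, assuming the hypothetical WeatherToday function from earlier; the arguments arrive as a JSON string that you parse yourself, and the returned string is what gets handed back to the model as the tool output.

// A minimal sketch of a function dispatcher for the hypothetical WeatherToday function.
// functionArguments is the JSON string produced by the model, e.g. {"city":"London"}.
public Task<string> ProcessFunctionCall(string assistantId, string functionName, string functionArguments)
{
    using var arguments = System.Text.Json.JsonDocument.Parse(functionArguments);

    switch (functionName)
    {
        case "WeatherToday":
        {
            var city = arguments.RootElement.GetProperty("city").GetString();
            // Replace this placeholder with a real weather lookup for the requested city.
            return Task.FromResult($"It is currently 21 °C in {city}.");
        }

        default:
            // Returning a message lets the model explain the problem instead of failing the run.
            return Task.FromResult($"Unknown function: {functionName}");
    }
}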

Full Example

The following is a full service class that can interact with an OpenAI Assistant using functions.

#pragma warning disable OPENAI001 // may be required: the Assistants API surface of the OpenAI .NET SDK is currently marked experimental
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using OpenAI;
using OpenAI.Assistants;
    public class OpenAiService
    {
        private readonly ILogger<OpenAiService> _logger;
        private readonly ChatBotService _chatBotService;
        private OpenAIClient OpenAiClient { get; set; }
        private AssistantClient Client { get; set; }

        public OpenAiService(IOptions<AppSettings> settings, ILogger<OpenAiService> logger, ChatBotService chatBotService)
        {
            _logger = logger;
            _chatBotService = chatBotService;

        }

        public void Initialize(string apiKey)
        {
            OpenAiClient = new OpenAIClient(apiKey);
            Client = OpenAiClient.GetAssistantClient();
        }


        public async Task<Assistant> CreateAssistantAsync(string model, AssistantCreationOptions options)
        {
            try
            {
                var AssistantResponse = await Client.CreateAssistantAsync(model, options);
                return AssistantResponse.Value;
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "Error creating assistant");
                throw;
            }
        }

        public async Task<Assistant> GetAssistantAsync(string id)
        {
            try
            {
                var AssistantResponse = await Client.GetAssistantAsync(id);
                return AssistantResponse.Value;
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "Error getting assistant");
                throw;
            }
        }

        public async Task<AssistantThread> CreateThreadAsync()
        {
            try
            {
                var ThreadResponse = await Client.CreateThreadAsync();
                return ThreadResponse.Value;
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "Error creating thread");
                throw;
            }
        }

        public async Task<AssistantThread> GetThreadAsync(string id)
        {
            try
            {
                var ThreadResponse = await Client.GetThreadAsync(id);
                return ThreadResponse.Value;
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "Error getting thread");
                throw;
            }
        }

        public async Task<ThreadMessage> AddMessageAsync(string id, string message)
        {
            try
            {
                var MessageResponse = await Client.CreateMessageAsync(id, MessageRole.User, new MessageContent[] { message });
                return MessageResponse.Value;
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "Error adding message");
                throw;
            }
        }

        public async Task<List<MessageContent>> RunThreadWithPollingAsync(string threadId, string assistantId)
        {
            try
            {
                var runResponse = await Client.CreateRunAsync(threadId, assistantId, new RunCreationOptions() { });
                var RunResponse = runResponse.Value;

                do
                {
                    await Task.Delay(TimeSpan.FromMilliseconds(500));
                    runResponse = await Client.GetRunAsync(threadId, RunResponse.Id);

                    if (runResponse.Value.Status == RunStatus.RequiresAction)
                    {
                        var ToolOutputs = new List<ToolOutput>();
                        foreach (var RequiredAction in runResponse.Value.RequiredActions)
                        {
                            var Result = await _chatBotService.ProcessFunctionCall(assistantId,RequiredAction.FunctionName, RequiredAction.FunctionArguments);
                            
                            var ToolResponse = new ToolOutput(RequiredAction.ToolCallId, Result);
                            ToolOutputs.Add(ToolResponse);

                        }

                        await Client.SubmitToolOutputsToRunAsync(threadId, RunResponse.Id, ToolOutputs);
                    }
                }
                while (runResponse.Value.Status == RunStatus.Queued || runResponse.Value.Status == RunStatus.InProgress || runResponse.Value.Status == RunStatus.RequiresAction);

                var AfterRunMessagesResponse = Client.GetMessagesAsync(threadId);
                var Messages = new List<MessageContent>();
                await foreach (var Message in AfterRunMessagesResponse)
                {
                    Messages.AddRange(Message.Content);
                }

                return Messages;

            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "Error in Running Thread");
                throw;
            }
        }

    }
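
For completeness, here is a rough sketch of how the pieces might be wired together end to end. The settings, logger, chatBotService, and apiKey variables are placeholders for your own configuration, weatherTool is the hypothetical FunctionToolDefinition sketched earlier, and the model name is just an example.

// Rough usage sketch; settings, logger, chatBotService and apiKey come from your own DI/configuration,
// and weatherTool is the hypothetical FunctionToolDefinition shown earlier.
var service = new OpenAiService(settings, logger, chatBotService);
service.Initialize(apiKey);

var assistant = await service.CreateAssistantAsync("gpt-4o", new AssistantCreationOptions()
{
    Name = "Weather Assistant",
    Instructions = "Call WeatherToday when asked about the current weather.",
    Tools = { weatherTool }
});

var thread = await service.CreateThreadAsync();
await service.AddMessageAsync(thread.Id, "What's the weather like in London today?");

// Any WeatherToday calls are handled inside the polling loop before the run completes.
var messages = await service.RunThreadWithPollingAsync(thread.Id, assistant.Id);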

Bonus Example: Streaming

With the V2 API it is now trivial to stream in C# using await foreach. It's almost like magic!

The example below is a basic implementation (that does work), although you would need to incorporate the function-handling code above to add that behavior to streaming.

IStreamingOutput can be any interface you would like. It is a basic interface that I created, which at the time just streamed to the console. It could be hooked into SignalR or other ways of streaming output.
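
The exact shape of IStreamingOutput is up to you; here is a minimal sketch of the interface and a console-backed implementation matching the Write/WriteLine calls used in the method below. ConsoleStreamingOutput is just an illustrative name.

// A minimal sketch of the streaming output abstraction used in the example below.
public interface IStreamingOutput
{
    void Write(string text);
    void WriteLine(string text);
}

// Console implementation; a SignalR version could push each chunk to connected clients instead.
public class ConsoleStreamingOutput : IStreamingOutput
{
    public void Write(string text) => Console.Write(text);
    public void WriteLine(string text) => Console.WriteLine(text);
}

With that in place, you could call RunThreadWithStreamingAsync(thread.Id, assistant.Id, new ConsoleStreamingOutput()) to stream a reply straight to the console.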


        public async Task RunThreadWithStreamingAsync(string threadId, string assistantId, IStreamingOutput output)
        {
            try
            {

                await foreach (var StreamingUpdate in Client.CreateRunStreamingAsync(threadId, assistantId))
                {
                    if (StreamingUpdate.UpdateKind == StreamingUpdateReason.RunCreated)
                    {
                        //output.WriteLine($"--- Run started! ---");
                    }
                    if (StreamingUpdate is MessageContentUpdate ContentUpdate)
                    {
                        output.Write(ContentUpdate.Text);
                    }

                    if (StreamingUpdate.UpdateKind == StreamingUpdateReason.Done)
                    {
                        output.WriteLine("");
                    }
                }

            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "Error in Running Thread");
                throw;
            }
        }
		

Update 11-25-2024

Well, as I mentioned, OpenAI moves quickly, and I didn't realize when writing this post that they include quite a few samples directly in their source code! So here is a link to even more samples; some overlap with the samples I have created.

You can view the samples here on their GitHub.