Introduction to Multifunction Calling using Azure OpenAI Agent recipe
The Multifunction Calling using Azure OpenAI Agent recipe is based on the REST API. Use the recipe to receive user queries from Microsoft Teams and send them to Azure OpenAI for function calling. Function calling directs each request to the appropriate AI agent, which analyzes the request, performs the required actions, and returns the result to the Microsoft Teams channel.
The process begins with receiving user input from Microsoft Teams and returns a
pre-configured answer in the same channel.
The process sends a request to Azure OpenAI containing the specified function declaration and the user prompt. Azure OpenAI interprets the query, extracts the key variables and values, and returns them in the appropriate format. Based on the function-calling response, a request is made to the Large Language Model (LLM) to summarize and explain the extracted parameters, and the result is sent back to the Microsoft Teams channel.
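The request and response shapes involved in this step can be sketched as follows. This is a minimal illustration, not the recipe's actual configuration: the `get_order_status` function declaration and its `order_id` parameter are hypothetical stand-ins for whatever tool schema the configured AI agent exposes.

```python
import json

# Hypothetical function declaration; the real tool schema depends on the
# AI agent configured in the recipe.
GET_ORDER_STATUS = {
    "type": "function",
    "function": {
        "name": "get_order_status",
        "description": "Look up the status of an order by its ID.",
        "parameters": {
            "type": "object",
            "properties": {
                "order_id": {
                    "type": "string",
                    "description": "The order identifier.",
                }
            },
            "required": ["order_id"],
        },
    },
}

def build_function_call_request(user_prompt: str) -> dict:
    """Body for a chat-completions call that offers the tool to the model."""
    return {
        "messages": [{"role": "user", "content": user_prompt}],
        "tools": [GET_ORDER_STATUS],
        "tool_choice": "auto",
    }

def extract_call(response: dict) -> tuple[str, dict]:
    """Pull the function name and parsed arguments from a function-calling response."""
    call = response["choices"][0]["message"]["tool_calls"][0]["function"]
    return call["name"], json.loads(call["arguments"])
```

The model does not execute the function itself; it only returns the chosen function name and a JSON string of arguments, which the process then parses and acts on.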
The process also sends a request to the Application Integration process with the parameters obtained from the function-calling response, which invokes the appropriate AI agent.
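One way to picture this dispatch step is a routing table from function name to agent endpoint. The endpoint URL and the payload shape below are assumptions for illustration only; the actual Application Integration process defines its own interface.

```python
# Hypothetical routing table: each function name maps to the endpoint of the
# Application Integration process that fronts the matching AI agent.
AGENT_ENDPOINTS = {
    "get_order_status": "https://example.com/ws/simple/getOrderStatus",
}

def route_to_agent(function_name: str, arguments: dict) -> tuple[str, dict]:
    """Choose the agent endpoint for a function call and package its payload."""
    endpoint = AGENT_ENDPOINTS[function_name]
    payload = {"parameters": arguments}
    return endpoint, payload
```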
The Agent uses an LLM to generate the initial API request based on the user’s query. After
executing the service API call, the Agent evaluates the response received. If additional
information is required, the Agent formulates and executes a subsequent service API request,
leveraging the previous response as context. This process repeats until the Agent determines
that sufficient data has been gathered to address the original query.
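The iterative loop described above can be sketched as pseudocode-style Python. The `ask_llm` and `call_api` callables are hypothetical stand-ins for the recipe's LLM and service-API connectors; the loop structure is the point, not the names.

```python
def run_agent(query: str, ask_llm, call_api, max_steps: int = 5) -> list:
    """Iteratively plan and execute service API calls until the LLM is done.

    ask_llm(query, context) returns the next API request (a dict), or None
    once sufficient data has been gathered; call_api(request) executes the
    service API call and returns its response. Both are assumed interfaces.
    """
    context = []
    for _ in range(max_steps):          # cap the loop to avoid runaway calls
        request = ask_llm(query, context)   # LLM drafts the next API request
        if request is None:                 # agent decides it has enough data
            break
        response = call_api(request)        # execute the service API call
        context.append(response)            # feed the result back as context
    return context
```

A guard such as `max_steps` is worth keeping in any real implementation, since the stopping decision is delegated to the LLM.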
Finally, the Agent compiles the accumulated context and uses the LLM to generate a comprehensive final response to the user's query. This LLM-generated response is returned to the user in the same Microsoft Teams channel.
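Assembling the final LLM call might look like the sketch below: the gathered context goes into the system message, and the original query is asked against it. The message layout is an assumption; the recipe may structure its prompt differently.

```python
def build_final_prompt(query: str, context: list) -> list:
    """Messages for the final LLM call that summarizes the gathered context."""
    gathered = "\n".join(str(item) for item in context)
    return [
        {
            "role": "system",
            "content": (
                "Answer the user's question using only the context below.\n"
                + gathered
            ),
        },
        {"role": "user", "content": query},
    ]
```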