When you invoke the Query LLM with Context using Azure OpenAI process, it prepares a request that includes the supplied context and sends the request to the Azure OpenAI LLM.
You can run the process using one of the following options:
- The Run Using option, by passing the input parameters
- REST or SOAP API endpoints in any API client such as cURL, Postman, SoapUI, or any programming language
- A web browser, by passing the input parameters
For example, the following image shows how to pass the input parameters with the Run Using option:
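If you call the process through its REST endpoint instead, you send the input parameters as a JSON payload in a POST request. The sketch below builds such a request in Python; the endpoint URL, process path, and parameter names (`prompt`, `context`) are illustrative assumptions, so substitute the values shown for your process and org, and add your authentication details before sending.

```python
import json
from urllib import request

# Hypothetical endpoint -- replace with the REST URL shown for your
# process in your environment.
base_url = "https://example.informaticacloud.com/active-bpel/rt"
process_path = "/QueryLLMWithContextUsingAzureOpenAI"

# Input parameters for the process (names are illustrative, not the
# documented parameter names).
payload = {
    "prompt": "Summarize the attached support ticket.",
    "context": "Ticket 123: customer reports intermittent login failures.",
}

req = request.Request(
    base_url + process_path,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# request.urlopen(req) would send the call; it is omitted here because
# the endpoint, credentials, and parameter names depend on your org.
print(req.get_full_url())
print(req.data.decode("utf-8"))
```

The same payload works from cURL, Postman, or SoapUI; only the client changes, not the request shape.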