Simple RAG Consumption using Azure OpenAI with DataSources

Step 4. Invoke the process

When you invoke the Query LLM with Context using Azure OpenAI process, the process prepares a request that includes the retrieved context and sends it to the Azure OpenAI LLM.
You can run the process using one of the following options:
  • The Run Using option, passing the input parameters
  • The REST or SOAP API endpoints, from any API client such as cURL, Postman, SoapUI, or any programming language
  • A web browser, passing the input parameters
For example, you can pass the input parameters with the Run Using option as shown in the following image:
[Image: Sample input parameters passed using the Run Using option.]
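If you prefer the REST endpoint over the Run Using option, you can invoke the process programmatically. The following is a minimal sketch in Python using only the standard library. The host URL, process API name, and input parameter names (`question`, `engagementId`) are hypothetical placeholders; replace them with the values from your own Cloud Application Integration environment and process definition.

```python
import json
from urllib import request

# Hypothetical values: substitute your environment's host and the
# API name assigned to your published process.
BASE_URL = "https://na1.ai.example-host.com/active-bpel/rt"
PROCESS_NAME = "Query_LLM_with_Context_using_Azure_OpenAI"


def build_invoke_request(question: str, engagement_id: str) -> request.Request:
    """Build a POST request that passes the process input parameters as JSON."""
    # Parameter names are illustrative; use the input field names
    # defined in your process.
    payload = {"question": question, "engagementId": engagement_id}
    return request.Request(
        f"{BASE_URL}/{PROCESS_NAME}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_invoke_request("What is the refund policy?", "12345")
print(req.full_url)

# Sending the request requires valid credentials for your environment:
# with request.urlopen(req) as resp:
#     print(resp.read().decode("utf-8"))
```

The sketch separates building the request from sending it, so you can inspect the URL and payload before supplying authentication details appropriate to your setup.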
