Describe the bug
I am trying to use the Guardrails AI server framework with the Azure OpenAI API to validate LLM output. Despite modifying the example to fit Azure OpenAI, I encounter a 404 - {'detail': 'Not Found'} error. Below are the details of my setup, code, and debugging steps.
I started the Guardrails server with the following command: guardrails start --config config.py
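For context, a minimal config.py along the lines of what I am using might look like this (a sketch only; the GibberishText validator and the on_fail setting are assumptions for illustration, not necessarily my exact file):

```python
# config.py -- sketch of a Guardrails server config that defines the guard
# referenced in the /guards/gibberish_guard/... route.
# Assumes the hub validator was installed beforehand, e.g.:
#   guardrails hub install hub://guardrails/gibberish_text
from guardrails import Guard
from guardrails.hub import GibberishText

guard = Guard()
guard.name = "gibberish_guard"          # must match the guard name in the URL
guard.use(GibberishText(on_fail="exception"))
```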
I used the following Python code to connect to the Guardrails server and proxy my Azure OpenAI API request:
from openai import AzureOpenAI

# Set up the Azure OpenAI client with the Guardrails server acting as the endpoint
client = AzureOpenAI(
    azure_endpoint="http://localhost:8000/guards/gibberish_guard/openai/v1",
    api_key="api-key",
    api_version="2024-09-01-preview",
)

# Send the request through the Guardrails proxy
response = client.chat.completions.create(
    model="model name",
    messages=[{
        "role": "user",
        "content": "Make up some gibberish for me please!"
    }]
)

# Access the validated response
print(response.choices[0].message.content)
print(response.guardrails['validation_passed'])
When running this code, I get the following error: openai.NotFoundError: Error code: 404 - {'detail': 'Not Found'}
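My (unverified) suspicion is that the AzureOpenAI client rewrites the request path, appending something like /openai/deployments/<model>/chat/completions to azure_endpoint, so the request never matches the route the Guardrails server exposes. For comparison, the non-Azure pattern I adapted from uses the standard OpenAI client pointed at the guard's OpenAI-compatible route; a sketch of it is below (model name, API key, and port are placeholders/assumptions on my side):

```python
# Sketch of the original, non-Azure example I adapted: the standard OpenAI
# client pointed directly at the guard's OpenAI-compatible route.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/guards/gibberish_guard/openai/v1",
    api_key="sk-placeholder",  # placeholder; assuming the Guardrails server holds the real key
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Make up some gibberish for me please!"}],
)

print(response.choices[0].message.content)
```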
Request for Assistance
Is there an example of using Guardrails with the Azure OpenAI API, or are there specific adjustments required to make the two compatible?
How can I configure Guardrails to handle Azure OpenAI properly with custom validation routes?
I would also like to integrate the server with litellm to access other models beyond OpenAI. Is there documentation or guidance for such integration?