Bring Your Own Model

Configure Go to use models hosted in your own accounts

The default models available in Go are served through V7's accounts with the respective model providers (OpenAI, Anthropic, Google, etc.).

Some users may prefer to integrate models hosted in their own cloud accounts (e.g. Azure) instead. This guide explains how to set them up.

You can configure which models are available to work with from the /settings/ai-models page within the UI.

📘

Important: the naming standards below are required because Go uses these names in the requests it makes. If your configuration does not match, Go will not be able to call the models.

IP Addresses

If you need to update firewall rules to allow Go traffic, requests originate from the following IPs:

  1. 18.202.82.120
  2. 54.75.198.141
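As an illustration, allowing these IPs with iptables might look like the following (iptables syntax is shown only as an example; adapt the rules to whatever firewall you actually use):

```shell
# Allow inbound traffic from Go's two egress IPs.
iptables -A INPUT -s 18.202.82.120 -j ACCEPT
iptables -A INPUT -s 54.75.198.141 -j ACCEPT
```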

Azure-hosted OpenAI GPT models

For general setup, please follow the Azure documentation.

The required naming standard in the table below is configured in Azure OpenAI Service, under the Deployments tab.

| Name | Model name | Model version |
| --- | --- | --- |
| gpt-3_5-turbo | gpt-35-turbo-16k | 0613 |
| gpt-4o | gpt-4o | 2024-08-06 |

It is recommended to set the Content filter to NoJailbreakProtection.
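A deployment matching the table above can also be created with the Azure CLI. The sketch below is illustrative, not a required step; the resource group and account names are placeholders you should replace with your own:

```shell
# Create an Azure OpenAI deployment whose name matches Go's expected
# naming standard (here: gpt-4o). Placeholders: my-rg, my-aoai-account.
az cognitiveservices account deployment create \
  --resource-group my-rg \
  --name my-aoai-account \
  --deployment-name gpt-4o \
  --model-name gpt-4o \
  --model-version "2024-08-06" \
  --model-format OpenAI \
  --sku-name "Standard" \
  --sku-capacity 1
```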

Azure OCR

Enable this under Azure AI services | Document intelligence.

The Endpoint field will contain a value of the form https://go-ocr.cognitiveservices.azure.com/. In this example, go-ocr is the Project ID to provide in the Go configuration.
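In other words, the Project ID is the first label of the endpoint's hostname. A quick way to extract it from any such URL (a one-liner sketch, assuming a standard `*.cognitiveservices.azure.com` endpoint):

```shell
# Strip the scheme, then take everything before the first dot.
endpoint="https://go-ocr.cognitiveservices.azure.com/"
project_id=$(echo "$endpoint" | sed 's|https\?://||' | cut -d. -f1)
echo "$project_id"   # prints: go-ocr
```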

If you have any questions, reach out to [email protected] or contact support with the Intercom widget.