
Choose Your AI Provider and Use Your Own API Key

Control which AI model processes your org metadata

Written by Pablo Gonzalez

Control which AI model processes your org metadata — and optionally route all requests through your own API key. This is the right starting point for teams with compliance requirements, IT-approved vendor policies, or data residency constraints.


Choosing a provider

Go to Configuration → Integrations → AI Providers. Three providers are available:

Provider         Fast model          Standard model     Notes
OpenAI           GPT-4.1 Nano        GPT-5.2            Default
Anthropic        Claude 3.5 Haiku    Claude Sonnet 4
Google Gemini    Gemini 2.5 Flash    Gemini 2.5 Pro     Experimental

The provider setting applies to the entire workspace — all AI features (org assessments, metadata explanations, code analysis) use the active provider. Only workspace admins can change it.

dx0 uses two capability tiers internally: a fast model for quick explanations and summaries, and a standard model for deep analysis and complex reasoning. You don't choose the tier directly — it's selected automatically based on the feature.
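The automatic tier selection can be pictured as a simple lookup. This is an illustrative sketch, not dx0's actual implementation — the feature names, tier labels, and mapping below are assumptions; only the model names come from the table above.

```python
FAST, STANDARD = "fast", "standard"

# Assumed feature-to-tier mapping: quick explanations and summaries use the
# fast tier; deep analysis uses the standard tier.
FEATURE_TIER = {
    "metadata_explanation": FAST,
    "summary": FAST,
    "org_assessment": STANDARD,
    "code_analysis": STANDARD,
}

# Model table from the provider comparison above.
MODELS = {
    "openai":    {FAST: "GPT-4.1 Nano",     STANDARD: "GPT-5.2"},
    "anthropic": {FAST: "Claude 3.5 Haiku", STANDARD: "Claude Sonnet 4"},
    "gemini":    {FAST: "Gemini 2.5 Flash", STANDARD: "Gemini 2.5 Pro"},
}

def model_for(provider: str, feature: str) -> str:
    """The tier follows the feature; users never choose it directly."""
    return MODELS[provider][FEATURE_TIER[feature]]
```

For example, `model_for("anthropic", "org_assessment")` resolves to Claude Sonnet 4, while a quick summary on the same provider would use Claude 3.5 Haiku.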

On Google Gemini: This provider is marked as experimental. It may occasionally be unavailable due to capacity limits on Google's side, which can cause AI features to temporarily fail. OpenAI and Anthropic do not have this limitation. If uninterrupted AI features matter for your workflow, use one of those instead.

Assessment quality varies between providers. Each has different strengths, and the output for the same org can read differently depending on the model. If you're comparing, Anthropic tends to produce more structured prose; OpenAI tends to be more direct.


Using your own API key (BYOK)

By default, dx0's platform key is used for all AI requests and the cost is included in your subscription. If you want AI requests to flow through your own cloud account instead, you can provide a custom API key per provider.

Reasons to use your own key:

  • Your IT or security policy requires that data only passes through vendor accounts you directly control

  • You want usage to appear in your own cloud spend and dashboards

  • You need to enforce your organization's data residency requirements

  • You want to apply your own rate limits or model quotas

Before you add a key

dx0 will ask you to confirm that you have configured usage caps and billing alerts in your provider's console before saving a key. This is required, not optional. When a custom key is active, all AI costs are billed directly to your account — dx0 has no visibility into or control over your spending.

Configure limits here before proceeding:

  • OpenAI — OpenAI Dashboard → Usage → API Keys (set a usage limit under Billing)

  • Anthropic — Anthropic Console → Settings → API Keys (set spend limits under Plans & Billing)

  • Google Cloud — Google Cloud Console → APIs & Services → Gemini API (set budget alerts under Billing)

Create a dedicated API key for dx0 in each console rather than reusing an existing key. This makes it easy to track usage, set per-key quotas, and distinguish dx0 traffic from other applications.

Adding a key

In the AI Providers list, expand the provider you want and click Use Your Own Key. Paste your API key — dx0 validates it immediately against the provider before saving.
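A common way to validate a key immediately is a cheap authenticated call to the provider's model-listing endpoint. The sketch below shows that technique using each provider's public API routes; dx0's actual validation call may differ.

```python
import urllib.error
import urllib.request

def validation_request(provider: str, api_key: str):
    """Build a lightweight authenticated request for the given provider."""
    if provider == "openai":
        return ("https://api.openai.com/v1/models",
                {"Authorization": f"Bearer {api_key}"})
    if provider == "anthropic":
        return ("https://api.anthropic.com/v1/models",
                {"x-api-key": api_key, "anthropic-version": "2023-06-01"})
    if provider == "gemini":
        return ("https://generativelanguage.googleapis.com/v1beta/models"
                f"?key={api_key}", {})
    raise ValueError(f"unknown provider: {provider}")

def key_is_valid(provider: str, api_key: str) -> bool:
    """True if the provider accepts the key (HTTP 200), False if rejected."""
    url, headers = validation_request(provider, api_key)
    req = urllib.request.Request(url, headers=headers)
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False  # 401/403 means the key was rejected
```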

Before the key is saved, you must type the confirmation phrase: I have configured usage caps and alerts. This is a deliberate friction step, not a formality.

Once saved:

  • The key is encrypted at rest using envelope encryption. It cannot be viewed again after saving — only the last 4 characters are shown as a reference hint.

  • All AI requests from the workspace route through your key.

  • The provider card shows a Custom Key badge.

The key is independent of the active provider. You can save a key for Anthropic while OpenAI is active — it takes effect the moment you switch to Anthropic.
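The resolution rule described above can be sketched in a few lines. This is an assumed model of the behavior, not dx0's code; the placeholder platform key and function names are illustrative.

```python
PLATFORM_KEY = "dx0-platform-key"  # placeholder, not a real credential

def effective_key(active_provider: str, custom_keys: dict[str, str]) -> str:
    """Custom key for the active provider if one is saved, else platform key."""
    return custom_keys.get(active_provider, PLATFORM_KEY)

def hint(key: str) -> str:
    """Reference hint shown in the UI: only the last 4 characters survive."""
    return "****" + key[-4:]
```

With a custom key saved for Anthropic while OpenAI is active, `effective_key` still returns the platform key; switching the active provider to Anthropic flips it to the custom key with no other change.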

Removing a key

Click Remove Key on the provider card. AI requests immediately revert to the dx0 platform key. To use your own key again later, you'll need to re-enter it — the previous key cannot be recovered.


Response language

AI-generated assessments can be written in a language other than English. Go to the bottom of the AI Providers settings page and select a language. All analysis, headings, and recommendations will use the selected language. Salesforce API names and technical identifiers stay in English regardless of the setting.

Available languages: English (default), Español, Português, Français, Deutsch, Italiano, Nederlands, 日本語.


Practical scenarios

Your firm has an approved vendor list that doesn't include OpenAI. Switch the active provider to Anthropic and save a BYOK key if your policy also requires that data route through your own account. The change takes effect immediately for all workspace members.

Your client's org contains sensitive configuration and they want to minimize third-party data exposure. Use BYOK to ensure AI requests go through your own cloud account, where you control data retention and can audit API activity directly in the provider console.

You want AI responses in Spanish for a client-facing report. Change the response language to Español. All assessments generated after that point will be written in Spanish. Switch back to English before sharing reports with English-speaking colleagues — the setting persists across sessions.

You need to audit which AI provider was used for a specific analysis. The provider setting is workspace-wide and applies at the time the job runs. If you change providers between generating two assessments, each will reflect the provider that was active at the time it was generated.
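That audit behavior amounts to snapshotting the provider at job start. The sketch below is an assumed model of it; the record fields and function names are illustrative, not dx0's schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AssessmentRecord:
    org_id: str
    provider: str          # snapshot of the workspace setting at job start
    generated_at: datetime

def run_assessment(org_id: str, active_provider: str) -> AssessmentRecord:
    """Capture the active provider when the job runs, so a later provider
    switch never rewrites the history of earlier assessments."""
    return AssessmentRecord(org_id, active_provider, datetime.now(timezone.utc))
```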

Google Gemini stops responding during a session. This can happen with the experimental provider under capacity constraints. Switch to OpenAI or Anthropic and re-run the affected analysis. You don't need to reconfigure anything other than the active provider.
