# Configure LLM Providers and System Prompts
You can configure PromptQL LLM/model settings and system instructions all in one place, via the `promptql_config.yaml` file. The following are possible:

- Configure the LLM provider and the model.
- Configure the LLM provider and model for AI Primitives such as Classification, Summarization, and Extraction tasks.
- Configure system instructions that apply to every PromptQL interaction.
## PromptQL Config File
The `promptql_config.yaml` file is created by the Hasura DDN CLI when initializing a new project with the `--with-promptql` flag. This file is present in the root directory of the DDN project structure.
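For example, a new DDN project with PromptQL enabled can be initialized like this (`my-project` is a placeholder name; check your CLI version for the exact command):

```bash
# Initialize a new DDN project with PromptQL enabled
ddn supergraph init my-project --with-promptql
```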
Here’s an example of a simple PromptQL config file:
```yaml
kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: hasura
  system_instructions: |
    You are a helpful AI Assistant.
```
When the `llm` provider is `hasura`, the model is `claude-3-5-sonnet-latest`, since that’s the recommended model for PromptQL Program generation.
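The same model can also be pinned explicitly via the `anthropic` provider (a minimal sketch; unlike the `hasura` provider, this presumably requires your own Anthropic credentials):

```yaml
kind: PromptQlConfig
version: v1
definition:
  llm:
    # Explicitly select the model the hasura provider uses by default
    provider: anthropic
    model: claude-3-5-sonnet-latest
```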
Here’s what a configuration with a different provider for `llm`, `ai_primitives_llm`, and `system_instructions` can look like:
```yaml
kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: openai
    model: o3-mini
  ai_primitives_llm:
    provider: openai
    model: gpt-4o
  system_instructions: |
    You are a helpful AI Assistant.
```
Note:

- The `model` key is unsupported if `hasura` is the LLM provider.
- If the `ai_primitives_llm` definition is not provided, it assumes the default provider configured in the `llm` definition (see the sketch after this list).
- The `system_instructions` definition is optional.
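As a sketch of the fallback behavior described above, the configuration below defines only `llm`; since `ai_primitives_llm` is omitted, AI primitives use the same `openai`/`o3-mini` settings:

```yaml
kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: openai
    model: o3-mini
  # ai_primitives_llm is omitted, so AI primitives
  # fall back to the openai / o3-mini settings above.
```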
## Supported LLM Providers and Models
The following table outlines the valid configurations of `provider` and `model` for `llm` and `ai_primitives_llm`.
| Provider | Supported Models |
|---|---|
| `hasura` | N/A |
| `openai` | `o1`, `o3-mini`, `gpt-4o` |
| `anthropic` | `claude-3-5-sonnet-latest` |
This flexibility lets you select the right LLM provider and model for each use case in PromptQL.