Getting Started with PromptQL
In this getting started guide, you'll get hands-on experience with PromptQL. In a few steps, you'll have PromptQL running locally, connected to a sample dataset.
Setup
Install the DDN CLI
curl -L https://graphql-engine-cdn.hasura.io/ddn/cli/v4/get.sh | bash
Install Docker
Follow the instructions on the Docker website to install Docker on your machine.
Log in with the CLI
Logging in allows you to connect to the PromptQL runtime, which is required for development. It also allows you to deploy your project to Hasura DDN.
ddn auth login
You will be redirected to the DDN signup/login page.
Validate the installation
You can verify that the DDN CLI is installed correctly by running:
ddn doctor
Build your PromptQL app
Clone the example project
The example project is already set up to connect PromptQL to a Hugging Face dataset.
git clone [email protected]:hasura/huggingface-dataset-promptql.git
cd huggingface-dataset-promptql
Configure .env
Head to the app/connector/huggingface directory to configure the dataset.
cd app/connector/huggingface
cp .env.sample .env
Modify the value of the HUGGINGFACE_DATASET environment variable. It uses the format "user/dataset/file-path".
For example:
HUGGINGFACE_DATASET="drossi/EDA_on_IMDB_Movies_Dataset/*.csv"
The IMDB example mentioned in the sample env is one dataset you can choose; feel free to configure any dataset you like. Note the use of a glob pattern to select all the CSV files. Replace it with your dataset of choice along with the path to its files. This works for any ".csv", ".parquet", or ".sqlite" files on Hugging Face. Refer to this DuckDB blog for more examples of this format.
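To make the "user/dataset/file-path" format concrete, here is an illustrative shell snippet (not part of the guide's setup steps) that splits a HUGGINGFACE_DATASET value into its three parts; the variable names are hypothetical.

```shell
# Illustrative only: split a HUGGINGFACE_DATASET value ("user/dataset/file-path")
# into its parts using standard POSIX parameter expansion.
HUGGINGFACE_DATASET="drossi/EDA_on_IMDB_Movies_Dataset/*.csv"
user="${HUGGINGFACE_DATASET%%/*}"    # everything before the first "/"
rest="${HUGGINGFACE_DATASET#*/}"     # drop the user segment
dataset="${rest%%/*}"                # the dataset name
file_path="${rest#*/}"               # the remainder; may contain a glob
echo "user=$user dataset=$dataset files=$file_path"
# → user=drossi dataset=EDA_on_IMDB_Movies_Dataset files=*.csv
```

Note that the file-path segment can be a glob (here `*.csv`), which is why it is kept quoted throughout.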
Introspect the Hugging Face Connector
ddn connector introspect huggingface --log-level=DEBUG
Note: Depending on the size of the dataset, it may take some time to fully import the data.
The command above runs in DEBUG mode to make it easier to catch errors from invalid files.
Add Models
A SQL schema is generated from the imported dataset. Let's track all the models to get started quickly.
ddn model add huggingface "*"
Set up your DDN project
This will create a Hasura DDN cloud project and set up PromptQL keys to connect to the PromptQL runtime.
ddn project init
Fire up your PromptQL project
Build your supergraph.
ddn supergraph build local
Then bring up the PromptQL API server, the engine, and the connector:
ddn run docker-start
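Once the stack is up, you can optionally sanity-check that the containers are running. This is an illustrative snippet using the standard Docker CLI, not a command from the guide; exact container names depend on your project.

```shell
# Optional sanity check (illustrative): list running containers if the Docker
# daemon is reachable, otherwise skip gracefully.
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  docker ps --format '{{.Names}}: {{.Status}}'
else
  echo "Docker daemon not reachable; skipping container check"
fi
```

If the engine or connector containers are missing from the list, re-check the output of `ddn run docker-start` in the other terminal.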
Act on your data
Open the PromptQL playground
In another terminal, run:
ddn console --local
Browser support: the PromptQL playground is supported on all browsers except Firefox and Safari. Support for these browsers is coming soon.
Ask questions about your dataset
The console is a web app hosted at console.hasura.io that connects to your local PromptQL API and data sources. Your data is processed in the DDN PromptQL runtime but isn’t persisted externally.
Head over to the console and ask a few questions about your data.
> Hi, what are some questions that you can answer?