Serve OpenAI-compatible API
AI JSON flows can be served as an OpenAI-compatible API. This makes them easily accessible from other processes, machines, and programming languages.
Quickstart
Running from the Preview Pane
This assumes you have followed the Quickstart and have a flow open in the preview window.
If the preview window is not already open, open it by clicking the “Preview” button in the top-right corner of the VSCode editor. Then navigate to Options > Serve OpenAI-compatible API.
Choose which variable the incoming prompt should be assigned to, and click “Serve”. The other variables will be set according to the values shown below them.
Running from the Command Line
For example, with `debono.ai.yaml` in your current folder, run the following command to start the server:
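The invocation below is a hypothetical sketch — the exact subcommand name and flags may differ in your installed version, so check the CLI's help output:

```shell
# Hypothetical command; confirm the subcommand and flags with --help
# in your installed version of the CLI.
aijson serve debono.ai.yaml
```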
You will see startup output indicating where the server is listening.
Invoking
You can now call it like an OpenAI ChatCompletions endpoint; just point the base URL at wherever you are running the server.
Control which action output you retrieve by changing the `model` parameter. We also expose a `models` endpoint that lists all available action outputs.
The last message in the `messages` array will be passed as the variable you chose when you ran the server.
For example, in Python:
Note: you may need to set an arbitrary `OPENAI_API_KEY` environment variable, or set `openai.api_key` to an arbitrary string, as the OpenAI library requires it regardless of `base_url`.
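For instance, before running your script you can export a placeholder key (any non-empty string works):

```shell
# Any placeholder value satisfies the client library's key check;
# the server itself ignores it.
export OPENAI_API_KEY="not-a-real-key"
```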