Basic Chat Completions
Overview
Basic chat completions allow you to send a message and receive a response from the model. This is the simplest way to interact with the LLM Gateway.
Getting started
Send a message and receive a response:
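A minimal Python sketch using the requests library. It assumes an OpenAI-compatible request body and that your AssemblyAI API key is sent in the authorization header; the model name is a placeholder rather than a confirmed value.

```python
import os
import requests

# Endpoint documented under "API reference" below.
url = "https://llm-gateway.assemblyai.com/v1/chat/completions"

headers = {
    # Assumption: the gateway authenticates with your AssemblyAI API key
    # passed in the authorization header.
    "authorization": os.environ["ASSEMBLYAI_API_KEY"],
    "content-type": "application/json",
}

payload = {
    "model": "<model-name>",  # placeholder: use a model the gateway supports
    "messages": [
        {"role": "user", "content": "Summarize the benefits of speech-to-text APIs."}
    ],
}

response = requests.post(url, headers=headers, json=payload)
response.raise_for_status()

# Assumption: the response mirrors the OpenAI chat completions shape.
print(response.json()["choices"][0]["message"]["content"])
```

Install the dependency with pip install requests if it is not already available.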
API reference
Request
The LLM Gateway accepts POST requests to https://llm-gateway.assemblyai.com/v1/chat/completions with the following parameters:
Request parameters
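As a hedged illustration of the request body, assuming OpenAI-compatible parameter names (model and messages as the core fields, with optional tuning parameters that are assumptions where this page does not list them):

```python
# Hedged example request body; optional fields below follow assumed
# OpenAI-compatible naming and may differ from the gateway's actual parameters.
payload = {
    "model": "<model-name>",  # which model to run (placeholder value)
    "messages": [             # the conversation so far, oldest message first
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    "max_tokens": 512,        # optional cap on generated tokens (assumed)
    "temperature": 0.7,       # optional sampling temperature (assumed)
    "stream": False,          # optional streaming toggle (assumed)
}
```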
Message object
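A hedged sketch of a message object, assuming the OpenAI-compatible shape in which content is either a plain string or a list of content part objects (see the next section):

```python
# Simple form: content is a plain string.
message = {"role": "user", "content": "What formats does the API accept?"}

# Multi-part form (assumed): content is a list of content part objects.
message_with_parts = {
    "role": "user",
    "content": [
        {"type": "text", "text": "Summarize this conversation."},
    ],
}
```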
Content part object
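A hedged sketch of a single content part, assuming an OpenAI-compatible type discriminator with a text variant:

```python
# A text content part; the "type" field selects the part's shape (assumed).
text_part = {"type": "text", "text": "Summarize the meeting in three bullet points."}
```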
Response
The API returns a JSON response with the model’s completion:
Response fields
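A hedged sketch of reading fields out of the response, assuming the OpenAI-compatible choices/message/usage layout; field names not shown on this page are assumptions:

```python
# Continuing from the getting-started request above.
data = response.json()

# Assumed layout: the generated text lives in choices[0].message.content,
# with token counts reported under usage.
completion_text = data["choices"][0]["message"]["content"]
total_tokens = data.get("usage", {}).get("total_tokens")

print(completion_text)
print(f"Total tokens (if reported): {total_tokens}")
```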
Error response
If a request fails, the API returns an error response:
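A hedged sketch of handling a failed request in Python, reusing url, headers, and payload from the getting-started sketch above; the exact shape of the error body is an assumption:

```python
import requests

response = requests.post(url, headers=headers, json=payload)

if not response.ok:
    # Assumption: error details are returned as JSON in the response body.
    try:
        print("Request failed:", response.status_code, response.json())
    except ValueError:
        print("Request failed:", response.status_code, response.text)
else:
    print(response.json()["choices"][0]["message"]["content"])
```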