Migration Guide: From LeMUR to LLM Gateway
LeMUR will be deprecated on March 31st, 2026. Migrate to the LLM Gateway before then to keep access to language model capabilities, with a wider model selection and better performance.
Overview
This guide helps you migrate from LeMUR to AssemblyAI’s LLM Gateway. While LeMUR was designed specifically for processing transcripts, LLM Gateway provides a more flexible, unified interface for working with multiple language models.
LeMUR only processes the text field from the transcript JSON — it does not
have access to speaker-separated data such as utterances or speaker labels.
LLM Gateway is functionally equivalent in this regard, since you pass the same
transcript text in your request. If you need speaker-separated context, you
can format the utterances yourself and pass them as text to either LeMUR or
LLM Gateway.
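If you do want speaker-separated context, flattening utterances into plain text is straightforward. This sketch assumes utterances shaped like the transcript API's utterances array, with `speaker` and `text` fields:

```python
def format_utterances(utterances: list[dict]) -> str:
    """Flatten speaker-separated utterances into plain text that can be
    passed as prompt context to either LeMUR or LLM Gateway."""
    return "\n".join(
        f"Speaker {u['speaker']}: {u['text']}" for u in utterances
    )

# Hypothetical utterances in the shape returned by the transcript API.
utterances = [
    {"speaker": "A", "text": "How was the demo received?"},
    {"speaker": "B", "text": "Very positively overall."},
]
print(format_utterances(utterances))
# prints:
# Speaker A: How was the demo received?
# Speaker B: Very positively overall.
```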
Key differences
- Input: LeMUR retrieves transcript text automatically from transcript IDs, while LLM Gateway requires you to include the transcript text directly in your request.
- Models: LLM Gateway uses different model identifiers and offers a wider selection of models.
- Response format: LeMUR returns a simple response object; LLM Gateway returns an OpenAI-compatible response.
- Endpoints: LLM Gateway uses different URLs than LeMUR.
Migration steps
Step 1: Replace transcript processing
Before (LeMUR): LeMUR automatically retrieved transcript text using transcript IDs. It only used the basic text field — not speaker labels, utterances, or other structured data from the transcript.
After (LLM Gateway): You include the transcript text directly in your request. Since LeMUR only used the text field, passing that same text to LLM Gateway produces equivalent results.
Before: LeMUR
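A minimal sketch of the LeMUR flow using the `/lemur/v3/generate/task` endpoint: the request references the transcript by ID, and LeMUR fetches the text itself. The model identifier is illustrative; check the LeMUR docs for the values your account supports.

```python
import json
import urllib.request

def build_lemur_request(transcript_id: str, prompt: str) -> dict:
    """Build the JSON body LeMUR expects: transcript IDs, not raw text."""
    return {
        "transcript_ids": [transcript_id],  # LeMUR fetches the text for you
        "prompt": prompt,
        "final_model": "anthropic/claude-3-5-sonnet",  # illustrative model name
    }

def run_lemur_task(api_key: str, transcript_id: str, prompt: str) -> str:
    """POST the task to LeMUR and return the generated text."""
    req = urllib.request.Request(
        "https://api.assemblyai.com/lemur/v3/generate/task",
        data=json.dumps(build_lemur_request(transcript_id, prompt)).encode("utf-8"),
        headers={"authorization": api_key, "content-type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]  # LeMUR's simple response object
```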
After: LLM Gateway
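A corresponding sketch for LLM Gateway. The transcript text is embedded directly in the chat messages instead of being referenced by ID. The gateway base URL and model identifier below are assumptions; confirm both against the LLM Gateway documentation.

```python
import json
import urllib.request

def build_gateway_request(transcript_text: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion body; the transcript text is
    embedded in the message rather than referenced by ID."""
    return {
        "model": "claude-sonnet-4-5",  # assumed identifier; verify against the model list
        "messages": [
            {"role": "user", "content": f"{prompt}\n\nTranscript:\n{transcript_text}"},
        ],
    }

def run_gateway_task(api_key: str, transcript_text: str, prompt: str) -> str:
    """POST a chat completion to the gateway and return the assistant message."""
    # Assumed endpoint; confirm the LLM Gateway base URL in your dashboard.
    req = urllib.request.Request(
        "https://llm-gateway.assemblyai.com/v1/chat/completions",
        data=json.dumps(build_gateway_request(transcript_text, prompt)).encode("utf-8"),
        headers={"authorization": f"Bearer {api_key}", "content-type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```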
Step 2: Update model names
LLM Gateway uses different model identifiers:
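As an illustration, model selection moves from LeMUR's final_model values to the gateway's own identifiers. The exact strings in this sketch are assumptions; consult the LLM Gateway model list for the canonical names.

```python
# Hypothetical mapping from LeMUR final_model values to LLM Gateway model
# identifiers; verify both sides against the current documentation.
MODEL_MAP = {
    "anthropic/claude-3-5-sonnet": "claude-sonnet-4-5",
}

def to_gateway_model(lemur_model: str, default: str = "claude-sonnet-4-5") -> str:
    """Translate a LeMUR model name, falling back to a default gateway model."""
    return MODEL_MAP.get(lemur_model, default)
```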
Step 3: Modify response handling
Before (LeMUR): a simple response object with the generated text in a single response field. After (LLM Gateway): an OpenAI-compatible chat completion format.
Before: LeMUR
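With LeMUR, the generated text sits in a top-level `response` field, so parsing looks like this:

```python
def parse_lemur_response(payload: dict) -> str:
    """Extract the generated text from a LeMUR task response."""
    return payload["response"]

# Example payload in the LeMUR response shape.
lemur_payload = {"request_id": "req_123", "response": "The meeting covered Q3 goals."}
print(parse_lemur_response(lemur_payload))  # prints: The meeting covered Q3 goals.
```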
After: LLM Gateway
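With LLM Gateway, the assistant message lives inside the OpenAI-style `choices` array instead:

```python
def parse_gateway_response(payload: dict) -> str:
    """Extract the assistant message from an OpenAI-compatible chat completion."""
    return payload["choices"][0]["message"]["content"]

# Example payload in the OpenAI-compatible chat completion shape.
gateway_payload = {
    "choices": [
        {"message": {"role": "assistant", "content": "The meeting covered Q3 goals."}}
    ]
}
print(parse_gateway_response(gateway_payload))  # prints: The meeting covered Q3 goals.
```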
Complete migration example
Here’s a complete example showing how to migrate a LeMUR sentiment analysis task:
Python (Before: LeMUR)
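A sketch of the "before" side: a sentiment analysis task sent to LeMUR by transcript ID, using the `/lemur/v3/generate/task` endpoint. The model identifier is illustrative.

```python
import json
import urllib.request

def build_sentiment_task(transcript_id: str) -> dict:
    """LeMUR body: references the transcript by ID; the service fetches the text."""
    return {
        "transcript_ids": [transcript_id],
        "prompt": ("Classify the overall sentiment of this call as positive, "
                   "negative, or neutral, and explain briefly."),
        "final_model": "anthropic/claude-3-5-sonnet",  # illustrative model name
    }

def analyze_sentiment_lemur(api_key: str, transcript_id: str) -> str:
    """POST the sentiment task to LeMUR and return the generated analysis."""
    req = urllib.request.Request(
        "https://api.assemblyai.com/lemur/v3/generate/task",
        data=json.dumps(build_sentiment_task(transcript_id)).encode("utf-8"),
        headers={"authorization": api_key, "content-type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```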
Python (After: LLM Gateway)
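The "after" side makes the migration explicit: first fetch the transcript's `text` field yourself (the same field LeMUR used internally), then send it to the gateway. The gateway URL and model identifier are assumptions to verify against the docs.

```python
import json
import urllib.request

def get_transcript_text(api_key: str, transcript_id: str) -> str:
    """Fetch the transcript and return its plain `text` field."""
    req = urllib.request.Request(
        f"https://api.assemblyai.com/v2/transcript/{transcript_id}",
        headers={"authorization": api_key},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["text"]

def build_sentiment_request(transcript_text: str) -> dict:
    """Embed the transcript text in an OpenAI-style chat completion body."""
    return {
        "model": "claude-sonnet-4-5",  # assumed identifier; verify against the model list
        "messages": [{
            "role": "user",
            "content": ("Classify the overall sentiment of this call as positive, "
                        "negative, or neutral, and explain briefly.\n\n"
                        f"Transcript:\n{transcript_text}"),
        }],
    }

def analyze_sentiment_gateway(api_key: str, transcript_id: str) -> str:
    """Fetch the transcript text, then run the sentiment task via the gateway."""
    body = build_sentiment_request(get_transcript_text(api_key, transcript_id))
    # Assumed gateway endpoint; confirm against your account's documentation.
    req = urllib.request.Request(
        "https://llm-gateway.assemblyai.com/v1/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={"authorization": f"Bearer {api_key}", "content-type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```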
Python SDK (Before: LeMUR)
Python SDK (After: LLM Gateway)
JavaScript SDK (Before: LeMUR)
JavaScript SDK (After: LLM Gateway)
Migration benefits
Moving to LLM Gateway provides several advantages:
More model choices
- 15+ models including Claude 4.5 Sonnet, GPT-5, and Gemini 2.5 Pro
- Better performance with newer, more capable models
Flexible input handling
- Multi-turn conversations for complex interactions
- Custom system prompts for better context control
Enhanced capabilities
- Tool calling for function execution
- Agentic workflows for multi-step reasoning
- OpenAI-compatible API for easier integration
Next steps
After migrating to LLM Gateway, explore additional capabilities:
- Multi-turn Conversations - Build conversational experiences
- Tool Calling - Enable function execution
- Agentic Workflows - Create multi-step reasoning workflows
Need help?
If you encounter issues during migration:
- Check model availability - Ensure your chosen model is supported
- Verify API endpoints - LLM Gateway uses different URLs than LeMUR
- Update response parsing - Response format follows OpenAI standards
- Review token limits - Different models have different context windows
The LLM Gateway provides more flexibility and capabilities than LeMUR, but requires slightly more setup to include transcript content in requests.