Migration Guide: From LeMUR to LLM Gateway

Learn how to migrate your existing LeMUR prompts to the new LLM Gateway

LeMUR will be deprecated on March 31st, 2026. Please migrate to the LLM Gateway before that date for continued access to language model capabilities with more models and better performance.

Overview

This guide helps you migrate from LeMUR to AssemblyAI’s LLM Gateway. While LeMUR was designed specifically for processing transcripts, LLM Gateway provides a more flexible, unified interface for working with multiple language models.

LeMUR only processes the text field from the transcript JSON — it does not have access to speaker-separated data such as utterances or speaker labels. LLM Gateway is functionally equivalent in this regard, since you pass the same transcript text in your request. If you need speaker-separated context, you can format the utterances yourself and pass them as text to either LeMUR or LLM Gateway.
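For example, a small formatter like the one below turns speaker-separated utterances into prompt-ready text. This is a sketch: it assumes each utterance entry in the transcript JSON carries `speaker` and `text` fields.

```python
def format_utterances(utterances: list) -> str:
    """Render speaker-labeled utterances as plain text for an LLM prompt."""
    return "\n".join(
        f"Speaker {u['speaker']}: {u['text']}" for u in utterances
    )

# Example:
dialog = format_utterances([
    {"speaker": "A", "text": "Hi, how can I help?"},
    {"speaker": "B", "text": "I'd like to cancel my order."},
])
```

The resulting string can be passed as the transcript text in either a LeMUR prompt or an LLM Gateway request.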

Key differences

Feature  | LeMUR                         | LLM Gateway
Purpose  | Transcript-specific LLM tasks | General-purpose LLM interface
Models   | Limited model selection       | 15+ models (Claude, GPT, Gemini)

Migration steps

Step 1: Replace transcript processing

Before (LeMUR): LeMUR automatically retrieved transcript text using transcript IDs. It only used the basic text field — not speaker labels, utterances, or other structured data from the transcript.

After (LLM Gateway): You include the transcript text directly in your request. Since LeMUR only used the text field, migrating to LLM Gateway produces equivalent results.

# LeMUR automatically fetched transcript content
result = transcript.lemur.task(
    prompt="What was the emotional sentiment of the phone call?",
    final_model=aai.LemurModel.claude_sonnet_4_20250514
)
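The LLM Gateway equivalent inlines the transcript text itself. The sketch below is an assumption-heavy illustration, not a definitive implementation: the gateway URL and the exact request shape are inferred from the OpenAI-compatible format described in this guide, and the helper names (`build_gateway_request`, `run_task`) are illustrative. Verify the endpoint against the AssemblyAI docs.

```python
import requests

# Assumed gateway endpoint -- confirm the exact URL in the AssemblyAI docs.
LLM_GATEWAY_URL = "https://llm-gateway.assemblyai.com/v1/chat/completions"

def build_gateway_request(transcript_text: str, prompt: str) -> dict:
    """Build an OpenAI-style chat payload with the transcript text inlined."""
    return {
        "model": "claude-sonnet-4-20250514",
        "messages": [
            {"role": "user",
             "content": f"{prompt}\n\nTranscript:\n{transcript_text}"},
        ],
    }

def run_task(api_key: str, transcript_text: str, prompt: str) -> dict:
    """POST the payload to the gateway and return the parsed JSON response."""
    payload = build_gateway_request(transcript_text, prompt)
    response = requests.post(LLM_GATEWAY_URL,
                             headers={"authorization": api_key},
                             json=payload)
    return response.json()
```

Calling `run_task("<YOUR_API_KEY>", transcript_text, prompt)` mirrors the LeMUR task above, except that you pass in the transcript text you fetched during transcription instead of a transcript ID.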

Step 2: Update model names

LLM Gateway uses different model identifiers:

LeMUR Model                              | LLM Gateway Model
aai.LemurModel.claude_sonnet_4_20250514  | "claude-sonnet-4-20250514"
"anthropic/claude-sonnet-4-20250514"     | "claude-sonnet-4-20250514"
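In code, the renaming can be captured with a small helper. This is an illustrative sketch, not part of either API; the fallback of stripping a vendor prefix is an assumption that matches the table above.

```python
# Illustrative mapping from LeMUR final_model strings to gateway model names.
LEMUR_TO_GATEWAY = {
    "anthropic/claude-sonnet-4-20250514": "claude-sonnet-4-20250514",
}

def gateway_model(lemur_model: str) -> str:
    """Translate a LeMUR model identifier; as a fallback, strip any vendor prefix."""
    return LEMUR_TO_GATEWAY.get(lemur_model, lemur_model.split("/", 1)[-1])
```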

Step 3: Modify response handling

Before: LeMUR returned a simple response object, with the text available directly on the response field. After: LLM Gateway returns an OpenAI-compatible response, so the text lives at choices[0].message.content.

result = transcript.lemur.task(prompt)
print(result.response)  # Direct access to response text
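Because the gateway response follows the OpenAI chat-completions shape, a small accessor covers the new parsing. The function is a sketch, not part of any SDK; the `sample` dict illustrates the response shape being parsed.

```python
def extract_text(gateway_response: dict) -> str:
    """Pull the assistant's text out of an OpenAI-style chat completion."""
    return gateway_response["choices"][0]["message"]["content"]

# Example of the shape being parsed:
sample = {"choices": [{"message": {"role": "assistant",
                                   "content": "The call was positive."}}]}
```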

Complete migration example

Here’s the original LeMUR sentiment analysis task — the “before” side of the migration:

import requests
import time

base_url = "https://api.assemblyai.com"
headers = {"authorization": "<YOUR_API_KEY>"}

# Step 1: Transcribe audio
audio_data = {"audio_url": "https://assembly.ai/call.mp4"}
transcript_response = requests.post(f"{base_url}/v2/transcript", headers=headers, json=audio_data)
transcript_id = transcript_response.json()["id"]

# Poll for completion
while True:
    status_response = requests.get(f"{base_url}/v2/transcript/{transcript_id}", headers=headers)
    status = status_response.json()["status"]
    if status == "completed":
        break
    elif status == "error":
        raise RuntimeError("Transcription failed")
    time.sleep(3)

# Step 2: Apply LeMUR
lemur_data = {
    "prompt": "What was the emotional sentiment of the phone call?",
    "transcript_ids": [transcript_id],
    "final_model": "anthropic/claude-sonnet-4-20250514"
}

result = requests.post(f"{base_url}/lemur/v3/generate/task", headers=headers, json=lemur_data)
print(result.json()["response"])
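The gateway-side “after” version keeps the transcription and polling steps unchanged and only swaps the final request: it fetches the completed transcript’s text and sends it to the gateway, since the gateway does not accept transcript IDs. This is a sketch under assumptions — the gateway URL is inferred from the OpenAI-compatible format, the system prompt is illustrative, and the helper names are not part of any SDK.

```python
import requests

BASE_URL = "https://api.assemblyai.com"
# Assumed gateway endpoint -- confirm the exact URL in the AssemblyAI docs.
LLM_GATEWAY_URL = "https://llm-gateway.assemblyai.com/v1/chat/completions"

def build_messages(transcript_text: str, prompt: str) -> list:
    """Illustrative system prompt plus a user turn carrying the transcript."""
    return [
        {"role": "system",
         "content": "You analyze call transcripts. Answer concisely."},
        {"role": "user",
         "content": f"{prompt}\n\nTranscript:\n{transcript_text}"},
    ]

def analyze_sentiment(api_key: str, transcript_id: str) -> str:
    """Fetch the completed transcript's text and ask the gateway about it."""
    headers = {"authorization": api_key}
    transcript = requests.get(
        f"{BASE_URL}/v2/transcript/{transcript_id}", headers=headers
    ).json()
    payload = {
        "model": "claude-sonnet-4-20250514",
        "messages": build_messages(
            transcript["text"],
            "What was the emotional sentiment of the phone call?",
        ),
    }
    response = requests.post(LLM_GATEWAY_URL, headers=headers, json=payload)
    # OpenAI-compatible response: text is under choices[0].message.content
    return response.json()["choices"][0]["message"]["content"]
```

After the polling loop in the example above completes, `analyze_sentiment("<YOUR_API_KEY>", transcript_id)` replaces the Step 2 LeMUR call.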

Migration benefits

Moving to LLM Gateway provides several advantages:

More model choices

  • 15+ models including Claude 4.5 Sonnet, GPT-5, and Gemini 2.5 Pro
  • Better performance with newer, more capable models

Flexible input handling

  • Multi-turn conversations for complex interactions
  • Custom system prompts for better context control

Enhanced capabilities

  • Tool calling for function execution
  • Agentic workflows for multi-step reasoning
  • OpenAI-compatible API for easier integration
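Multi-turn conversations work by resending the accumulated message list with each request. A minimal sketch of growing that history (the helper name is illustrative, not part of any API):

```python
def follow_up(messages: list, assistant_reply: str, question: str) -> list:
    """Append the model's last reply and a new user question, ready to resend."""
    return messages + [
        {"role": "assistant", "content": assistant_reply},
        {"role": "user", "content": question},
    ]

# Start with the first exchange, then grow the history turn by turn.
history = [{"role": "user", "content": "What was the sentiment of the call?"}]
history = follow_up(history, "The call was largely positive.",
                    "Which moments were negative?")
```

Each subsequent gateway request sends the full `history` list as its `messages`, so the model retains context from earlier turns.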

Next steps

After migrating to LLM Gateway, explore additional capabilities such as tool calling, multi-turn conversations, and custom system prompts.

Need help?

If you encounter issues during migration:

  1. Check model availability - Ensure your chosen model is supported
  2. Verify API endpoints - LLM Gateway uses different URLs than LeMUR
  3. Update response parsing - Response format follows OpenAI standards
  4. Review token limits - Different models have different context windows

The LLM Gateway provides more flexibility and capabilities than LeMUR, but requires slightly more setup to include transcript content in requests.