Using Grok with the OpenAI SDK

This guide shows you how to use Grok through the OpenAI SDK with Liona. Grok is compatible with the OpenAI API format, allowing you to integrate Grok AI capabilities while maintaining security and cost control.

JavaScript/TypeScript Integration

You can use the OpenAI JavaScript/TypeScript SDK to access Grok. Follow these steps to integrate it into your application.

Install the OpenAI SDK

If you haven’t already, install the OpenAI SDK using npm, yarn, or pnpm:

```shell
npm install openai
# or
yarn add openai
# or
pnpm add openai
```

Initialize the OpenAI client for Grok

Initialize the OpenAI client using your Liona access key and the Grok provider endpoint:

```javascript
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'your_liona_access_key_here', // Your Liona access key
  baseURL: 'https://api.liona.ai/v1/provider/grok',
});
```

Use the OpenAI SDK to access Grok

Now you can use the OpenAI SDK to access Grok models:

```javascript
async function generateWithGrok() {
  const completion = await openai.chat.completions.create({
    model: 'grok-1',
    messages: [
      { role: 'system', content: 'You are a helpful assistant.' },
      { role: 'user', content: 'Explain the basics of quantum computing.' },
    ],
    temperature: 0.7,
  });

  console.log(completion.choices[0].message.content);
}
```

Client-side usage (browser)

One of Liona’s key benefits is enabling secure client-side usage. You can safely use your Liona access key directly in browser-based applications:

```javascript
// In a React component or other client-side code
import { useState } from 'react';
import OpenAI from 'openai';

function GrokComponent() {
  const [result, setResult] = useState('');

  async function handleSubmit(userInput) {
    // Safe to use in client-side code with Liona!
    const openai = new OpenAI({
      apiKey: 'your_liona_access_key_here', // Your Liona access key
      baseURL: 'https://api.liona.ai/v1/provider/grok',
      dangerouslyAllowBrowser: true, // Required by the OpenAI SDK
    });

    const completion = await openai.chat.completions.create({
      model: 'grok-1',
      messages: [{ role: 'user', content: userInput }],
    });

    setResult(completion.choices[0].message.content);
  }

  // Component rendering...
}
```
Note: The `dangerouslyAllowBrowser` flag is required by the OpenAI SDK for browser use, but a Liona access key keeps this approach secure because your actual provider API keys are never exposed to the client.

Handle rate limits and policy errors

When using Liona, check for rate limit errors (HTTP 429) and for "policy limit exceeded" messages:

```javascript
async function callGrok() {
  try {
    const completion = await openai.chat.completions.create({
      model: 'grok-1',
      messages: [{ role: 'user', content: 'Hello, Grok!' }],
    });
    return completion.choices[0].message.content;
  } catch (error) {
    if (error.status === 429 || error.message.includes('policy limit exceeded')) {
      console.log('Rate limit or policy limit reached. Please try again later.');
      // Handle gracefully - perhaps show a user-friendly message
    } else {
      console.error('Error:', error);
    }
  }
}
```

Python Integration

You can also use the OpenAI Python SDK to access Grok through Liona.

Install the OpenAI Python package

If you haven’t already, install the OpenAI Python package:

```shell
pip install openai
# or
poetry add openai
```

Initialize the OpenAI client for Grok

Initialize the OpenAI client with your Liona access key and the Grok provider endpoint:

```python
from openai import OpenAI

client = OpenAI(
    api_key="your_liona_access_key_here",
    base_url="https://api.liona.ai/v1/provider/grok",
)
```

Use the OpenAI client to access Grok

You can now use the OpenAI client to access Grok models:

```python
response = client.chat.completions.create(
    model="grok-1",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain the concept of black holes."},
    ],
    temperature=0.7,
)

print(response.choices[0].message.content)
```

Handle rate limits and policy errors

Handle rate limits and policy errors in your Python applications:

```python
import openai
from openai import OpenAI

client = OpenAI(
    api_key="your_liona_access_key_here",
    base_url="https://api.liona.ai/v1/provider/grok",
)

try:
    response = client.chat.completions.create(
        model="grok-1",
        messages=[{"role": "user", "content": "Hello, Grok!"}],
    )
    print(response.choices[0].message.content)
except openai.APIStatusError as e:
    # APIStatusError carries the HTTP status code of the failed request
    if e.status_code == 429 or "policy limit exceeded" in str(e).lower():
        print("Rate limit or policy limit reached. Please try again later.")
    else:
        print(f"Error: {e}")
except openai.APIError as e:
    # Connection errors and other failures without an HTTP status
    print(f"Error: {e}")
```
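When a 429 does occur, retrying after a delay is usually enough. The helper below is a hypothetical, provider-agnostic sketch (`call_with_backoff` and `RateLimitError` are not part of the openai package): it retries any callable whose exceptions expose a `status_code` attribute, as the OpenAI SDK's status errors do.

```python
import random
import time

class RateLimitError(Exception):
    """Stand-in for an HTTP 429 error; real SDK errors expose status_code."""
    status_code = 429

def call_with_backoff(fn, max_retries=3, base_delay=1.0):
    """Retry fn() on rate-limit errors (HTTP 429) with exponential backoff."""
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except Exception as exc:
            status = getattr(exc, "status_code", None)
            if status != 429 or attempt == max_retries:
                raise  # not a rate limit, or out of retries
            # Wait 1x, 2x, 4x, ... the base delay, plus a little jitter
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
```

You would wrap the `client.chat.completions.create(...)` call in a small lambda or function and pass it to `call_with_backoff`.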

Additional Integration Options

Environment Variables

You can use environment variables to configure the OpenAI SDK for Grok:

```shell
# Read by both the JavaScript and Python SDKs
OPENAI_API_KEY=your_liona_access_key_here
OPENAI_BASE_URL=https://api.liona.ai/v1/provider/grok
```
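Recent versions of both SDKs pick these variables up automatically when the client is constructed without explicit arguments. As a sketch of the equivalent lookup (the fallback values here are the placeholders used throughout this guide):

```python
import os

# Mirrors what the SDKs do: prefer the environment, fall back to explicit values
api_key = os.environ.get("OPENAI_API_KEY", "your_liona_access_key_here")
base_url = os.environ.get("OPENAI_BASE_URL", "https://api.liona.ai/v1/provider/grok")
```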

Using with Next.js

For Next.js applications using server components:

```javascript
// In app/api/route.js or similar
import OpenAI from 'openai';
import { NextResponse } from 'next/server';

export async function POST(request) {
  const { prompt } = await request.json();

  const openai = new OpenAI({
    apiKey: process.env.LIONA_ACCESS_KEY,
    baseURL: 'https://api.liona.ai/v1/provider/grok',
  });

  const completion = await openai.chat.completions.create({
    model: 'grok-1',
    messages: [{ role: 'user', content: prompt }],
  });

  return NextResponse.json({ result: completion.choices[0].message.content });
}
```

Common Issues and Troubleshooting

Rate Limits and Policies

If you encounter rate limits or permission errors, check:

  1. The policy assigned to your user in Liona
  2. Your current usage against the set limits
  3. Whether the specific Grok model is allowed by your policy
Tip: You can check your usage and limits in the Liona dashboard under the "Usage" section.

Error Response Codes

Liona preserves the standard error structure while adding additional context:

  • HTTP 429: Rate limit or policy limit exceeded
  • HTTP 403: Access to the requested model or feature is forbidden by your policy
  • HTTP 401: Invalid or expired access key

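As an illustration, the codes above can be mapped to user-facing messages in one place (`explain_status` is a hypothetical helper, not part of any SDK):

```python
# Maps the Liona error codes documented above to user-facing messages
ERROR_MESSAGES = {
    429: "Rate limit or policy limit exceeded. Please try again later.",
    403: "Your Liona policy does not allow this model or feature.",
    401: "Invalid or expired access key. Check your Liona access key.",
}

def explain_status(status_code):
    return ERROR_MESSAGES.get(status_code, f"Unexpected error (HTTP {status_code}).")
```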
Model Availability

Ensure you've connected the Grok provider in your Liona dashboard and added your Grok API key before calling Grok models through Liona.

Next Steps

Now that you've integrated Grok with the OpenAI SDK through Liona, you can apply the same pattern to any other provider Liona supports by changing the provider endpoint in the base URL.
