🛠️ SDKs & API

Ship faster with Celuxe SDKs

Use the OpenAI SDK you already know - just change the base URL. Plus a live API playground to test right here.

pip install openai
npm install openai
go get github.com/sashabaranov/go-openai
Maven: openai-java

Celuxe is OpenAI-compatible: use the official OpenAI SDK and just set base_url="https://api.celuxe.shop/v1"

Py
Python
openai Python SDK v1.x
The industry standard. Full async support via AsyncOpenAI (built on httpx).
JS
Node.js / TypeScript
openai Node SDK v4.x
Typed. Works in Node 18+ and modern browsers.
Go
Go
go-openai by sashabaranov
Lightweight, fast, idiomatic Go. High throughput.
Java
Java / Kotlin
The official OpenAI Java SDK
Enterprise-ready. Works with Java 11+ and Kotlin.

Test the API right here

No setup needed. Enter your API key, pick a model, and hit Run.


💻 Code Examples - just copy & paste
Python - pip install openai
from openai import OpenAI

client = OpenAI(
    api_key="sk-celu-xxxxxxxxxxxx",
    base_url="https://api.celuxe.shop/v1",  # ← this is the only change
)

# Chat Completions
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are helpful."},
        {"role": "user", "content": "Explain DeepSeek V3 in one sentence."},
    ],
    temperature=0.7,
    max_tokens=256,
)
print(response.choices[0].message.content)

# Streaming
stream = client.chat.completions.create(
    model="deepseek-v3",
    messages=[{"role": "user", "content": "Write a Python function for Fibonacci."}],
    stream=True,
)
for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)

# Embeddings
emb = client.embeddings.create(
    model="deepseek-embed",
    input="Hello, world!",
)
print(emb.data[0].embedding[:5])
TypeScript - npm install openai
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "sk-celu-xxxxxxxxxxxx",
  baseURL: "https://api.celuxe.shop/v1", // ← only change needed
});

const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [
    { role: "system", content: "You are helpful." },
    { role: "user", content: "What is Celuxe in one sentence?" },
  ],
  temperature: 0.7,
  max_tokens: 256,
});
console.log(response.choices[0].message.content);

// Streaming
const stream = await client.chat.completions.create({
  model: "deepseek-v3",
  messages: [{ role: "user", content: "Count to 5" }],
  stream: true,
});
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0].delta?.content ?? "");
}
Go - go get github.com/sashabaranov/go-openai
package main

import (
	"context"
	"fmt"
	"log"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	cfg := openai.DefaultConfig("sk-celu-xxxxxxxxxxxx")
	cfg.BaseURL = "https://api.celuxe.shop/v1" // ← only change
	client := openai.NewClientWithConfig(cfg)

	resp, err := client.CreateChatCompletion(
		context.Background(),
		openai.ChatCompletionRequest{
			Model: "gpt-4o",
			Messages: []openai.ChatCompletionMessage{
				{Role: "user", Content: "Hello! What is Celuxe?"},
			},
			Temperature: 0.7,
			MaxTokens:   256,
		},
	)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Choices[0].Message.Content)
}
curl - works in any shell
# Set your Celuxe API key
export CELUXE_KEY="sk-celu-xxxxxxxxxxxx"

# Chat Completions (GPT-4o)
curl https://api.celuxe.shop/v1/chat/completions \
  -H "Authorization: Bearer $CELUXE_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "What is Celuxe?"}],
    "temperature": 0.7,
    "max_tokens": 256
  }'

# Switch model: just change the model name
curl https://api.celuxe.shop/v1/chat/completions \
  -H "Authorization: Bearer $CELUXE_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-v3", "messages": [{"role": "user", "content": "Hello!"}]}'

# Embeddings
curl https://api.celuxe.shop/v1/embeddings \
  -H "Authorization: Bearer $CELUXE_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-embed", "input": "Hello, world!"}'

⚡ Why use Celuxe SDKs?
🔁
OpenAI Drop-in
Already use the OpenAI SDK? One line change.
🌊
Full Streaming
Native SSE streaming in every language.
🔄
30+ Models
One API key, any model you need.
💰
Cost Effective
Better pricing than going direct.
⚡
<200ms Latency
Global edge network, fast everywhere.
🔧
Auto Retry
Built-in retry with backoff on failure.
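To make "retry with backoff" concrete, here is a minimal, hypothetical sketch of the pattern in plain Python (this illustrates the general technique, not Celuxe's or the OpenAI SDK's actual internals; the `with_backoff` helper and `flaky` function are made up for the demo):

```python
import time

def with_backoff(fn, max_retries=3, base_delay=0.5):
    """Call fn(); on failure, retry with exponentially growing delays."""
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_retries:
                raise  # out of retries: surface the error
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...

# Demo: a flaky call that fails twice, then succeeds on the third attempt
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network error")
    return "ok"

print(with_backoff(flaky, base_delay=0.01))  # prints "ok" after 2 retries
```

In the official OpenAI Python SDK the equivalent knob is the client's `max_retries` option, so you rarely need to write this loop yourself.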
💡

No Celuxe SDK needed. Celuxe is built on the OpenAI API protocol, the world's most widely adopted AI API standard. Any SDK, tool, or code that works with OpenAI works with Celuxe. Just set base_url="https://api.celuxe.shop/v1" and you're done.
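Often you can switch without editing code at all: the official OpenAI Python and Node SDKs read their key and base URL from environment variables. A sketch, assuming your app constructs the client with SDK defaults:

```shell
# Point stock OpenAI-SDK code at Celuxe via environment variables.
# (The official Python and Node SDKs read OPENAI_API_KEY and OPENAI_BASE_URL.)
export OPENAI_API_KEY="sk-celu-xxxxxxxxxxxx"
export OPENAI_BASE_URL="https://api.celuxe.shop/v1"

# Existing code that calls OpenAI() with no arguments now talks to Celuxe.
```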

🚀 Get Your Free API Key → 📊 View Pricing 📋 Migration Guide