
LLM API

LLM API provides access to large language models via an OpenAI-compatible API. Use it for chatbots, text generation, analysis, and more.

What it is

The LLM API in WAYSCloud gives developers and teams access to powerful language models through an OpenAI-compatible interface. Use the same SDKs, libraries, and tools you already know — just point them at WAYSCloud.

All inference runs in EU datacenters. Choose from general-purpose, reasoning, and specialized coding models.

When to use

Use this when you need:

  • AI-powered features in your applications (chat, summarization, generation)
  • OpenAI-compatible integration with existing code and SDKs
  • EU-hosted inference for data sovereignty and compliance
  • Multiple model options for different use cases and budgets

How it works

The LLM API works like OpenAI's Chat Completions API:

  1. Send a chat message with your chosen model
  2. Receive a completion response (or stream tokens in real time)
  3. Optionally use reasoning models for step-by-step thinking

All requests are authenticated with your API key. Usage is billed per token.
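The three steps above can be sketched in Python with only the standard library. This is a minimal sketch, not a definitive client: the `/chat/completions` path and the base URL are assumptions based on the OpenAI-compatible interface (confirm them against the LLM API reference), and the model name is illustrative.

```python
import json
import urllib.request

BASE_URL = "https://api.wayscloud.services/v1"  # assumed base URL; check the API reference

def build_chat_request(api_key: str, model: str, user_message: str) -> urllib.request.Request:
    """Step 1: build an OpenAI-style Chat Completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",  # endpoint path assumed from OpenAI compatibility
        data=json.dumps(payload).encode("utf-8"),
        headers={"X-API-Key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

# Step 2: send the request and read the completion text (requires a valid key):
# resp = urllib.request.urlopen(build_chat_request("YOUR_API_KEY", "mixtral-8x7b", "Hi"))
# reply = json.loads(resp.read())["choices"][0]["message"]["content"]
```

Because the wire format matches OpenAI's, the same payload also works through the official OpenAI SDKs by pointing their base URL at WAYSCloud.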

Features

WAYSCloud LLM API is built for compatibility, performance, and data sovereignty.

Compatibility

  • OpenAI-compatible Chat Completions API
  • Works with official OpenAI SDKs (Python, Node.js, etc.)
  • Streaming support for real-time token delivery
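In the OpenAI streaming format, tokens arrive as server-sent events: each `data:` line carries a JSON chunk with a content delta, and the stream ends with `data: [DONE]`. A minimal parser sketch, assuming WAYSCloud follows that chunk shape (confirm the exact fields against the API reference):

```python
import json

def extract_stream_tokens(sse_lines):
    """Yield content deltas from OpenAI-style streaming SSE lines."""
    for line in sse_lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and SSE comments
        data = line[len("data:"):].strip()
        if data == "[DONE]":  # end-of-stream sentinel
            break
        chunk = json.loads(data)
        delta = chunk["choices"][0].get("delta", {})
        if "content" in delta:
            yield delta["content"]

# Example lines as they would appear on the wire:
sample = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
print("".join(extract_stream_tokens(sample)))  # prints: Hello
```

In practice the official SDKs handle this parsing for you when you set `stream=True`; the sketch only shows what travels over the connection.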

Models

  • General-purpose models (Qwen, DeepSeek, Mixtral, LLaMA)
  • Reasoning models with chain-of-thought (DeepSeek-R1, Qwen-Thinking)
  • Specialized coding models (Qwen-Coder)
  • Content moderation (LlamaGuard)

Infrastructure

  • EU-hosted inference (no data leaves Europe)
  • Per-token billing with no minimum commitment
  • High availability with automatic failover

Getting started

All LLM API functionality is available via the WAYSCloud API.

List models

List available models (WAYSCloud endpoint)

bash
curl https://api.wayscloud.services/v1/llm/models \
  -H "X-API-Key: YOUR_API_KEY"

List available models (OpenAI-compatible endpoint)

Returns a list of all available LLM models.

bash
curl https://api.wayscloud.services/v1/models \
  -H "X-API-Key: YOUR_API_KEY"

Response example:

json
{
  "object": "list",
  "data": [
    {"id": "mixtral-8x7b", "object": "model", "owned_by": "wayscloud"},
    {"id": "qwen3-80b-instruct", "object": "model", "owned_by": "wayscloud"}
  ]
}
See all 5 endpoints in the LLM API reference.

Limits and quotas

Limits and quotas depend on your plan and region. See the dashboard or the API for current values.

Open LLM API in dashboard

WAYSCloud AS