---
name: 9router-chat
description: Chat / code generation via 9Router using OpenAI /v1/chat/completions or Anthropic /v1/messages format with streaming + auto-fallback combos. Use when the user wants to ask an LLM, generate code, summarize text, or run prompts through 9Router.
---

# 9Router — Chat

Requires `NINEROUTER_URL` (and `NINEROUTER_KEY` if auth is enabled). See https://raw.githubusercontent.com/decolua/9router/main/app/skills/9router/SKILL.md for setup.

## Endpoints

- `POST $NINEROUTER_URL/v1/chat/completions` — OpenAI format
- `POST $NINEROUTER_URL/v1/messages` — Anthropic format

## Discover models

```bash
curl $NINEROUTER_URL/v1/models | jq '.data[].id'
```

Combos (e.g. `vip`, `mycodex`) automatically fall back through multiple providers.
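The `/v1/models` payload follows the OpenAI list shape implied by the `jq '.data[].id'` query above (`{ "data": [{ "id": "..." }] }`); a minimal JS helper, under that assumption, pulls the ids out of a parsed response:

```javascript
// Extract model ids from a parsed /v1/models response body.
// Shape assumed from the jq query above: { data: [{ id: "..." }, ...] }.
function listModelIds(modelsResponse) {
  return (modelsResponse.data ?? []).map((m) => m.id);
}

// Sample payload of the kind the endpoint returns (ids are illustrative):
const sample = {
  data: [{ id: "openai/gpt-5" }, { id: "cc/claude-opus-4-7" }, { id: "vip" }],
};
console.log(listModelIds(sample)); // → ["openai/gpt-5", "cc/claude-opus-4-7", "vip"]
```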

## OpenAI format

```bash
curl -X POST $NINEROUTER_URL/v1/chat/completions \
  -H "Authorization: Bearer $NINEROUTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model":"openai/gpt-5","messages":[{"role":"user","content":"Hi"}],"stream":false}'
```

JS (OpenAI SDK):

```js
import OpenAI from "openai";

// Point the SDK at 9Router instead of api.openai.com
const client = new OpenAI({
  baseURL: `${process.env.NINEROUTER_URL}/v1`,
  apiKey: process.env.NINEROUTER_KEY,
});

// stream: true returns an async iterable of SSE chunks
const stream = await client.chat.completions.create({
  model: "openai/gpt-5",
  messages: [{ role: "user", content: "Hi" }],
  stream: true,
});
for await (const chunk of stream) process.stdout.write(chunk.choices[0]?.delta?.content || "");
```

## Anthropic format

```bash
curl -X POST $NINEROUTER_URL/v1/messages \
  -H "Authorization: Bearer $NINEROUTER_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "Content-Type: application/json" \
  -d '{"model":"cc/claude-opus-4-7","max_tokens":1024,"messages":[{"role":"user","content":"Hi"}]}'
```

## Response shape

OpenAI (`/v1/chat/completions`):

```json
{
  "id": "chatcmpl-...",
  "object": "chat.completion",
  "model": "openai/gpt-5",
  "choices": [
    { "index": 0, "message": { "role": "assistant", "content": "Hello!" }, "finish_reason": "stop" }
  ],
  "usage": { "prompt_tokens": 8, "completion_tokens": 2, "total_tokens": 10 }
}
```

Streaming (`"stream": true`) emits SSE: `data: {"choices":[{"delta":{"content":"..."}}]}\n\n` records, terminated by `data: [DONE]\n\n`.
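The streamed chunks can be reassembled client-side. A minimal sketch, assuming the framing above (one JSON chunk per `data:` line, `[DONE]` as the end sentinel):

```javascript
// Collect assistant text from an OpenAI-style SSE response body.
function collectStreamText(sseBody) {
  let text = "";
  for (const line of sseBody.split("\n")) {
    if (!line.startsWith("data: ")) continue;      // skip blank separator lines
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") break;               // end-of-stream sentinel
    const chunk = JSON.parse(payload);
    text += chunk.choices?.[0]?.delta?.content ?? "";
  }
  return text;
}

// Two delta chunks followed by the [DONE] sentinel:
const sse =
  'data: {"choices":[{"delta":{"content":"Hel"}}]}\n\n' +
  'data: {"choices":[{"delta":{"content":"lo!"}}]}\n\n' +
  "data: [DONE]\n\n";
console.log(collectStreamText(sse)); // → "Hello!"
```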

Anthropic (`/v1/messages`):

```json
{
  "id": "msg_...",
  "type": "message",
  "role": "assistant",
  "model": "cc/claude-opus-4-7",
  "content": [{ "type": "text", "text": "Hello!" }],
  "stop_reason": "end_turn",
  "usage": { "input_tokens": 8, "output_tokens": 2 }
}
```
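In the Anthropic shape, the assistant text lives in an array of content blocks rather than a single string; a sketch of joining them, assuming only the `text` block type shown above:

```javascript
// Join the text blocks of an Anthropic-format /v1/messages response.
// Non-text blocks (if any) are skipped.
function messageText(msg) {
  return (msg.content ?? [])
    .filter((block) => block.type === "text")
    .map((block) => block.text)
    .join("");
}

// Response of the shape shown above (values illustrative):
const res = {
  id: "msg_123",
  type: "message",
  role: "assistant",
  model: "cc/claude-opus-4-7",
  content: [{ type: "text", text: "Hello!" }],
  stop_reason: "end_turn",
};
console.log(messageText(res)); // → "Hello!"
```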