# AI SDK
Drop-in SDK for accessing AI models through Setu with automatic x402 payments via Solana USDC.
## Overview
`@ottocode/ai-sdk` gives you a single entry point to OpenAI, Anthropic, Google, Moonshot, MiniMax, and Z.AI models. All you need is a Solana wallet — the SDK handles authentication, payment negotiation, and provider routing automatically.

It returns ai-sdk-compatible model instances that work directly with `generateText()`, `streamText()`, `generateObject()`, and all other ai-sdk functions.
## Install

```sh
bun add @ottocode/ai-sdk ai
# or
npm install @ottocode/ai-sdk ai
```

## Quick Start
```ts
import { createSetu } from "@ottocode/ai-sdk";
import { generateText } from "ai";

const setu = createSetu({
  auth: { privateKey: process.env.SOLANA_PRIVATE_KEY! },
});

const { text } = await generateText({
  model: setu.model("claude-sonnet-4-20250514"),
  prompt: "Hello!",
});

console.log(text);
```

## External Signer (No Private Key)
Don't want to share your private key? Use an external signer with browser wallets, hardware wallets, or any custom signing logic. The interface is framework-agnostic — just provide callbacks for signing.
```ts
const setu = createSetu({
  auth: {
    signer: {
      walletAddress: "YOUR_SOLANA_PUBLIC_KEY",
      signNonce: async (nonce) => await myWallet.signMessage(nonce),
      signTransaction: async (tx) => await myWallet.signTransaction(tx),
    },
  },
});
```

See the Integration Guide for detailed examples with wallet adapters, auth-only mode, and more.
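The signer shape above is straightforward to produce from a wallet-adapter style wallet. A minimal sketch, assuming a wallet object that exposes `publicKey`, `signMessage`, and `signTransaction` — the adapter name and exact types here are illustrative, not part of the SDK:

```ts
// Hypothetical adapter: bridges a wallet-adapter style wallet to the
// external-signer shape expected by createSetu. Types are assumptions.
interface AdapterWallet {
  publicKey: { toBase58(): string };
  signMessage(message: Uint8Array): Promise<Uint8Array>;
  signTransaction<T>(tx: T): Promise<T>;
}

function toSetuSigner(wallet: AdapterWallet) {
  return {
    walletAddress: wallet.publicKey.toBase58(),
    signNonce: (nonce: Uint8Array) => wallet.signMessage(nonce),
    signTransaction: <T>(tx: T) => wallet.signTransaction(tx),
  };
}

// Usage (illustrative): createSetu({ auth: { signer: toSetuSigner(wallet) } });
```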
## Provider Auto-Resolution
Models are resolved to providers by prefix — no need to specify the provider manually:
| Prefix | Provider | API Format |
|---|---|---|
| `claude-` | Anthropic | Messages |
| `gpt-`, `o1`, `o3`, `o4`, `codex-` | OpenAI | Responses |
| `gemini-` | Google | Native |
| `kimi-` | Moonshot | OpenAI Chat |
| `MiniMax-` | MiniMax | Messages |
| `z1-` | Z.AI | OpenAI Chat |
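Conceptually, the lookup amounts to a prefix match against a routing table. A minimal sketch — the table contents, provider ids, and tie-breaking rule here are illustrative, not the SDK's internals:

```ts
// Illustrative routing table; provider ids are assumptions.
const PREFIXES: Record<string, string> = {
  "claude-": "anthropic",
  "gpt-": "openai",
  o1: "openai",
  o3: "openai",
  o4: "openai",
  "codex-": "openai",
  "gemini-": "google",
  "kimi-": "moonshot",
  "MiniMax-": "minimax",
  "z1-": "zai",
};

function resolveProvider(modelId: string): string | undefined {
  // Longest matching prefix wins, so a longer, more specific entry
  // can't be shadowed by a shorter one.
  const match = Object.keys(PREFIXES)
    .filter((p) => modelId.startsWith(p))
    .sort((a, b) => b.length - a.length)[0];
  return match ? PREFIXES[match] : undefined;
}
```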
```ts
setu.model("claude-sonnet-4-20250514"); // → anthropic
setu.model("gpt-4o"); // → openai
setu.model("gemini-2.5-pro"); // → google
setu.model("kimi-k2"); // → moonshot
```

### Explicit Provider
Override auto-resolution when needed:

```ts
const model = setu.provider("openai").model("gpt-4o");

// Optionally pin the API format as well:
const claude = setu.provider("anthropic", "anthropic-messages").model("claude-sonnet-4-20250514");
```

## Streaming
```ts
import { streamText } from "ai";

const result = streamText({
  model: setu.model("claude-sonnet-4-20250514"),
  prompt: "Write a short story about a robot.",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```

## Tool Calling
```ts
import { generateText, tool } from "ai";
import { z } from "zod";

const { text } = await generateText({
  model: setu.model("claude-sonnet-4-20250514"),
  prompt: "What's the weather in Tokyo?",
  tools: {
    getWeather: tool({
      description: "Get weather for a location",
      inputSchema: z.object({
        location: z.string(),
      }),
      execute: async ({ location }) => {
        return { temperature: 22, condition: "cloudy" };
      },
    }),
  },
});
```

## Balance
```ts
// Setu account balance
const balance = await setu.balance();
// { walletAddress, balance, totalSpent, totalTopups, requestCount }

// On-chain USDC balance
const wallet = await setu.walletBalance("mainnet");
// { walletAddress, usdcBalance, network }

// Wallet address
console.log(setu.walletAddress);
```

## Custom Providers
Register providers at init or runtime:

```ts
// At init
const setu = createSetu({
  auth,
  providers: [
    { id: "my-provider", apiFormat: "openai-chat", modelPrefix: "myp-" },
  ],
});

// At runtime
setu.registry.register({
  id: "another-provider",
  apiFormat: "anthropic-messages",
  models: ["specific-model-id"],
});

// Map a specific model to a provider
setu.registry.mapModel("some-model", "openai");
```

## API Formats
| Format | Description | Used by |
|---|---|---|
| `openai-responses` | OpenAI Responses API | OpenAI |
| `anthropic-messages` | Anthropic Messages API | Anthropic, MiniMax |
| `openai-chat` | OpenAI Chat Completions (compatible) | Moonshot, Z.AI |
| `google-native` | Google GenerativeAI native | Google |
## Low-Level: Custom Fetch
Use the x402-aware fetch wrapper directly:
```ts
const customFetch = setu.fetch();

const response = await customFetch(
  "https://api.setu.ottocode.io/v1/messages",
  {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "claude-sonnet-4-20250514",
      messages: [{ role: "user", content: "Hello" }],
      max_tokens: 1024,
    }),
  }
);
```

## Standalone Utilities
```ts
import {
  fetchBalance,
  fetchWalletUsdcBalance,
  getPublicKeyFromPrivate,
  addAnthropicCacheControl,
  createSetuFetch,
  createWalletContext,
} from "@ottocode/ai-sdk";

// Get wallet address from private key
const address = getPublicKeyFromPrivate(privateKey);

// Fetch balance without creating a full Setu instance
const balance = await fetchBalance({ privateKey });

// Fetch on-chain USDC
const usdc = await fetchWalletUsdcBalance({ privateKey }, "mainnet");

// Create a standalone x402-aware fetch
const setuFetch = createSetuFetch({
  wallet: createWalletContext({ privateKey }),
  baseURL: "https://api.setu.ottocode.io",
});
```

## How It Works
1. You call `setu.model("claude-sonnet-4-20250514")` — the SDK resolves this to Anthropic
2. It creates an ai-sdk provider (`@ai-sdk/anthropic`) pointed at the Setu proxy
3. A custom fetch wrapper intercepts all requests to:
   - Inject wallet auth headers (address, nonce, signature)
   - Inject Anthropic cache control (if enabled)
   - Handle 402 responses by signing USDC payments via x402
   - Sniff balance/cost info from SSE stream comments
4. The Setu proxy verifies the wallet, checks balance, forwards to the real provider, and tracks usage
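The 402 leg of this flow can be sketched as a small wrapper. This is illustrative only — the function names, the `X-PAYMENT` header, and the requirements shape follow the general x402 pattern and are assumptions, not the SDK's exact internals:

```ts
// Illustrative sketch of an x402 retry loop; not the SDK's actual code.
type FetchLike = (url: string, init: RequestInit) => Promise<Response>;

async function fetchWithX402(
  baseFetch: FetchLike,
  url: string,
  init: RequestInit,
  signPayment: (requirements: unknown) => Promise<string>,
): Promise<Response> {
  const first = await baseFetch(url, init);
  if (first.status !== 402) return first; // no payment needed

  // The 402 body describes what the server accepts (amount, asset, payTo).
  const requirements = await first.json();

  // Sign a USDC payment for those requirements with the wallet.
  const payment = await signPayment(requirements);

  // Retry the original request with the signed payment attached.
  return baseFetch(url, {
    ...init,
    headers: { ...(init.headers as Record<string, string>), "X-PAYMENT": payment },
  });
}
```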
## Requirements

- Solana wallet with USDC (for payments)
- `ai` SDK v6+ as a peer dependency
- Node.js 18+ or Bun
See Configuration for full options, Caching for Anthropic prompt caching details, and Setu Integration for raw HTTP usage.