# Using Node.js

## Project Setup

```shell
mkdir lazai-inference
cd lazai-inference
npm init -y
```
## Install Alith

```shell
npm i alith@latest
```
## Create the TypeScript Configuration

Create a file named `tsconfig.json` with the following content:
```json
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "ESNext",
    "moduleResolution": "bundler",
    "allowSyntheticDefaultImports": true,
    "esModuleInterop": true,
    "allowJs": true,
    "strict": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "resolveJsonModule": true,
    "isolatedModules": true,
    "noEmit": true
  },
  "ts-node": {
    "esm": true
  },
  "include": ["*.ts"],
  "exclude": ["node_modules"]
}
```
## Set Environment Variables

For the OpenAI/ChatGPT API:

```shell
export PRIVATE_KEY=<your wallet private key>
export OPENAI_API_KEY=<your openai api key>
```

For other OpenAI-compatible APIs (DeepSeek, Gemini, etc.):

```shell
export PRIVATE_KEY=<your wallet private key>
export LLM_API_KEY=<your api key>
export LLM_BASE_URL=<your api base url>
```
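A minimal sketch of how these variables might be consumed at runtime. The `resolveLlmConfig` helper and the fallback to the public OpenAI endpoint are illustrative assumptions, not part of the Alith API:

```typescript
// Minimal sketch: resolve provider settings from the environment.
// Prefers OPENAI_API_KEY, falls back to LLM_API_KEY; the default
// base URL is an assumption for illustration only.
function resolveLlmConfig(): { apiKey: string; baseUrl: string } {
  const apiKey = process.env.OPENAI_API_KEY ?? process.env.LLM_API_KEY;
  if (!apiKey) {
    throw new Error("Set OPENAI_API_KEY or LLM_API_KEY before running");
  }
  return {
    apiKey,
    baseUrl: process.env.LLM_BASE_URL ?? "https://api.openai.com/v1",
  };
}
```

Failing fast on a missing key here gives a clearer error than letting a request fail later with an opaque 401.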
## Step 1: Request Inference via the LazAI Client

Create a file named `app.ts` with the following content:
```typescript
import { ChainConfig, Client } from "alith/lazai";
import { Agent } from "alith";

// Set the private key for authentication (not needed if PRIVATE_KEY
// is already exported in your shell).
process.env.PRIVATE_KEY = "<your wallet private key>";
// Replace with your own inference node address
const node = "0xc3e98E8A9aACFc9ff7578C2F3BA48CA4477Ecf49";
const client = new Client(ChainConfig.testnet());
await client.getUser(client.getWallet().address);
console.log(
  "The inference account of the user is",
  await client.getInferenceAccount(client.getWallet().address, node)
);
const fileId = 10;
const nodeInfo = await client.getInferenceNode(node);
const url = nodeInfo.url;
const agent = new Agent({
  // OpenAI-compatible inference server URL
  baseUrl: `${url}/v1`,
  model: "gpt-3.5-turbo",
  // Extra headers for settlement and DAT file anchoring
  extraHeaders: await client.getRequestHeaders(node, BigInt(fileId)),
});
console.log(await agent.prompt("What is Alith?"));
```
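Note that `` `${url}/v1` `` assumes the node URL has no trailing slash. A hypothetical helper (not part of the Alith API) that normalizes the URL before handing it to the agent might look like:

```typescript
// Hypothetical helper (not part of the Alith API): normalize an
// inference node's URL into an OpenAI-compatible base URL, stripping
// trailing slashes that would otherwise produce "...//v1".
function toBaseUrl(nodeUrl: string): string {
  return `${nodeUrl.replace(/\/+$/, "")}/v1`;
}

console.log(toBaseUrl("https://inference-node.example/")); // → https://inference-node.example/v1
```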
## Step 2: Run the Application

```shell
npx tsx app.ts
```
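Optionally, you can pin `tsx` as a dev dependency (`npm i -D tsx`) and add a start script to `package.json` so the runner version stays consistent across machines. This is a convenience suggestion, not an Alith requirement:

```json
{
  "scripts": {
    "start": "tsx app.ts"
  }
}
```

With this in place, `npm start` runs the app.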