Decentralized AI + Blockchain + Data Ownership
LazAI is a decentralized platform that combines AI, blockchain, and data ownership to enable verifiable, privacy-preserving intelligence on-chain.
This documentation provides developers with the tools and workflows to:
Build and deploy smart contracts on the LazAI Network
Contribute encrypted data and mint Data Anchor Tokens (DATs)
Create AI-powered agents and Digital Twins using on-chain compute
Integrate private inference APIs within decentralized applications
AI models and computations are executed within trusted environments (TEE), ensuring that sensitive data never leaves secure execution zones. This architecture supports verifiable inference without exposing raw data.
Contributors retain full control over their data through tokenized certificates called Data Anchor Tokens (DATs).
Each DAT records provenance, permissions, and rewards for data usage.
LazAI supports deploying AI agents on-chain using smart contracts. Developers can build autonomous, verifiable agents that interact with blockchains and external data sources.
Use these parameters to connect your wallet or development environment to the LazAI Testnet.
This section shows how to build and integrate with the LazAI Network using the Data Anchoring Token (DAT).
It covers how developers can upload data, mint tokens, verify proofs, and track their assets on-chain.
The Developer Implementation guides help you:
Contribute your data or model to the LazAI network.
Mint your DAT — a token that represents ownership, access rights, and value share.
Verify proofs on-chain to confirm authenticity.
Manage your DATs using the LazAI Dashboard.
The implementation flow connects your data pipeline to LazAI’s verified network:
Alith SDK (Python) – for data upload, proof, and token minting.
IPFS – decentralized file storage.
DAT Smart Contracts – manage ownership and usage rights.
Dashboard UI – monitor token activity and rewards.
Start with the following guides:
Chain ID: 133718
Currency Symbol: LAZAI
RPC
Block Explorer
Faucet
1. Contribute: Upload encrypted datasets or models to IPFS.
2. Mint: Register and issue your DAT token using the Alith SDK.
3. Verify: Generate and validate proofs through verified computing nodes.
4. Monitor: View all DATs and rewards in the dashboard.
Deploy and test your first smart contract on the LazAI Testnet.
This guide walks you through connecting to the network, setting up your environment, and deploying a sample contract using Hardhat, Foundry, or Remix.
LazAI provides a developer-friendly Web3 environment for building AI-powered decentralized applications.
Each LazAI chain supports EVM-compatible contracts, so if you’ve deployed contracts on Ethereum, you’ll feel right at home.
LazAI provides an EVM-compatible environment for deploying and testing smart contracts.
If you have experience developing on Ethereum, you can use the same tools and workflows on LazAI.
This page covers:
Use these parameters to connect your wallet or development environment to the LazAI Testnet.
After deployment:
Copy the deployed contract address.
Open the block explorer.
Paste the contract address to confirm deployment.
If needed, verify your source code through your chosen framework’s verification plugin or CLI command.
RPC connection error
Confirm your RPC URL and chain ID (133718).
Retry with a clean browser session or different RPC endpoint if available.
Insufficient funds
Request test tokens from the faucet.
Contract verification issues
Match your Solidity compiler version with the one specified in your configuration file.
Continue with framework-specific deployment:
Chain ID: 133718
Currency Symbol: LAZAI
RPC
Block Explorer
Faucet
LazAI enables data contributors to securely share privacy-sensitive data, computation, and resources, earning rewards while retaining full control over their data. This section guides you through that contribution process.
Each DAT serves as a transparent certificate on the blockchain: you can trace exactly where the data came from, how it has been used, and who has rights to it.
LazAI leverages OpenPGP encryption to secure private data. The encryption workflow is as follows:
A random symmetric key is generated to encrypt the data using a symmetric encryption algorithm (e.g., AES).
This symmetric key is then encrypted with the recipient’s public key via an asymmetric encryption algorithm (RSA), producing the final encrypted payload.
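The same hybrid pattern can be sketched in a few lines of Python. This sketch uses the generic cryptography package rather than the Alith SDK, so the key sizes, nonce handling, and payload layout shown here are illustrative assumptions, not LazAI's actual implementation:

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Recipient (verifier) key pair; in practice the public key is published by the node.
recipient_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_public = recipient_private.public_key()

data = b"Your Privacy Data"

# 1) A random symmetric key encrypts the data (AES-256-GCM).
sym_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(sym_key).encrypt(nonce, data, None)

# 2) The symmetric key is wrapped with the recipient's RSA public key.
wrapped_key = recipient_public.encrypt(
    sym_key,
    padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None),
)

# The payload stored off-chain is the ciphertext plus the wrapped key.
payload = {"nonce": nonce, "ciphertext": ciphertext, "encrypted_key": wrapped_key}
```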
Workflow Steps:
A random key is derived from the user’s Web3 wallet (e.g., MetaMask) and used as a password to encrypt the user’s private data. This step authenticates the sender’s identity.
The encrypted symmetric key (via RSA) and the encrypted data are uploaded to a decentralized archive (DA) (e.g., IPFS, Google Drive, or Dropbox).
The encrypted data’s URL and the encrypted key are registered on the LazAI smart contract.
A designated data verifier decrypts the symmetric key using their private key, downloads the encrypted data via the URL, and decrypts it within a trusted execution environment (TEE) to ensure security.
The decrypted data and a TEE-generated proof are uploaded to the LazAI contract for validation.
Upon successful verification, users can submit a request via LazAI to receive Data Anchor Tokens (DAT) as rewards.
Create a file named tsconfig.json with the following content:
For OpenAI/ChatGPT API:
For other OpenAI-compatible APIs (DeepSeek, Gemini, etc.):
Create a file named app.ts with the following content:
mkdir lazai-inference
cd lazai-inference
npm init -y
npm i alith@latest
{
"compilerOptions": {
"target": "ES2022",
"module": "ESNext",
"moduleResolution": "bundler",
"allowSyntheticDefaultImports": true,
"esModuleInterop": true,
"allowJs": true,
"strict": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"resolveJsonModule": true,
"isolatedModules": true,
"noEmit": true
},
"ts-node": {
"esm": true
},
"include": ["*.ts"],
"exclude": ["node_modules"]
}
export PRIVATE_KEY=<your wallet private key>
export OPENAI_API_KEY=<your openai api key>
export PRIVATE_KEY=<your wallet private key>
export LLM_API_KEY=<your api key>
export LLM_BASE_URL=<your api base url>
import { ChainConfig, Client } from "alith/lazai";
import { Agent } from "alith";
// Set up the private key for authentication
process.env.PRIVATE_KEY = "<your wallet private key>";
const node = "0xc3e98E8A9aACFc9ff7578C2F3BA48CA4477Ecf49"; // Replace with your own inference node address
const client = new Client(ChainConfig.testnet());
await client.getUser(client.getWallet().address);
console.log(
"The inference account of user is",
await client.getInferenceAccount(client.getWallet().address, node)
);
const fileId = 10;
const nodeInfo = await client.getInferenceNode(node);
const url = nodeInfo.url;
const agent = new Agent({
// OpenAI-compatible inference server URL
baseUrl: `${url}/v1`,
model: "gpt-3.5-turbo",
// Extra headers for settlement and DAT file anchoring
extraHeaders: await client.getRequestHeaders(node, BigInt(fileId)),
});
console.log(await agent.prompt("What is Alith?"));
npx tsx app.ts
"ipfs://metadata/med-dataset-class"
)
mintDAT(
user1,
1, // classId: Medical Dataset
1000 * 1e6, // value: 1000 (6 decimals)
500, // shareRatio: 5%
0 // expireAt: never expires
);
💡 Option: Mint multiple DATs to simulate multi-user ownership
transferValue(user1TokenId, agentTreasuryTokenId, 100 * 1e6);
agentTreasuryTokenId may be a treasury contract address
Supports “pay-as-you-use” or delegated billing models
Assume an agent earns 10 USDC:
agent.payToDATHolders(classId, 10 USDC);
Contract calculates each DAT’s shareRatio
Distributes revenue proportionally
require(block.timestamp < dats[tokenId].expireAt, "Token expired");
Used for subscription-based AI services or time-bound licenses
ID: Each DAT is a unique token representing a specific AI asset
CLASS: Category of asset (e.g., POV, Model, Inference)
VALUE: Represents usage quota or revenue share
{
"name": "Timi + Fine-Tuned Llama Model v2",
"class": "0x03",
"value": 2500,
"proof": {
"type": "ZK-SNARK",
"hash": "0xabc123...",
"verifiedBy": "0xVerifierContract"
},
"usagePolicy": {
"maxCalls": 1000,
"expiresAt": "2026-12-31"
},
"revenueShare": {
"totalShares": 10000,
"holderShare": 2500
},
"rights": {
"license": "commercial_use_allowed",
"canTransfer": true
}
}
This page walks you through minting a Data Anchor Token (DAT) on the LazAI Pre-Testnet using one of our SDKs: Node.js, Python, or Rust.
LazAI enables contributors to securely share privacy-sensitive data, computation, and resources while earning rewards — all without surrendering ownership or control over their data.
Data contribution is the cornerstone of the LazAI and Alith ecosystems. Contributors decide exactly how their data can be used (e.g., on-chain training, inference, evaluation) and also gain governance rights.
LazAI uses strong encryption to ensure that only authorized parties can access your data:
Symmetric encryption — A random symmetric key (e.g., AES) is generated to encrypt your data.
Asymmetric encryption — The symmetric key is then encrypted with the recipient’s public key using RSA.
The final payload (encrypted data + encrypted key) is stored securely in decentralized storage.
Key derivation — A random key is derived from your Web3 wallet (e.g., MetaMask) and used to encrypt your private data, proving sender authenticity.
Storage — The encrypted symmetric key and encrypted data are uploaded to a Decentralized Archive (DA) such as IPFS, Google Drive, or Dropbox.
On-chain registration — The storage URL and encrypted key are registered in LazAI’s smart contracts.
Verification — A designated verifier decrypts the symmetric key with their private key, retrieves the data, and decrypts it inside a Trusted Execution Environment (TEE).
The LazAI API provides a simple way to run AI inference on private data without losing control of your information.
It enables developers to perform context engineering, training, and evaluation directly on LazAI, ensuring that data never leaves the owner’s control.
Once you contribute your private data and mint a Data Anchoring Token (DAT), you can invoke AI models in a privacy-preserving way.
This workflow guarantees that your sensitive data remains secure, auditable, and owned by you, while still powering intelligent AI services.
Proof & validation — The TEE generates a proof that is uploaded to LazAI for contract-level validation.
Rewards — Upon successful verification, contributors receive Data Anchor Tokens (DAT).
Data privacy is essential in industries such as healthcare, finance, and research.
Traditional AI services often require uploading datasets to centralized servers, increasing the risk of data exposure or misuse.
With LazAI, inference happens securely on your own terms:
No data handover: Your dataset never leaves your control.
End-to-end encryption: All model calls and outputs are cryptographically secured.
Verifiable execution: Each inference request can be verified using on-chain proofs.
Ownership preserved: You retain ownership and monetization rights via the DAT standard.
This allows you to build and run value-aligned AI agents that respect data sovereignty, combining performance with full privacy compliance.
Continue with the following guides to learn how to use the LazAI API in different environments:
Be consumed per use, recharged, or reused
Example:
A DAT with CLASS = 0x03 and value = 1000 allows 1000 calls to a model or equivalent compute.
Extensions:
Value can be recharged
Value is deducted upon use and recorded on-chain for traceability
In dataset or model-based DATs, value also indicates fractional ownership, allowing holders to earn revenue share:
If a model is used by other agents, income is split proportionally
If a dataset is used for training, contributors earn a share
Rewards are automatically settled to token holders — no manual claim needed
Example:
Model DAT #1234 has 10,000 total shares
You hold a token with value = 500
If the model earns 10,000 DAT in a month, you receive: 500 / 10,000 × 10,000 = 500 DAT
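The pro-rata rule above can be written as a one-line helper. This is purely illustrative; actual settlement is performed by the DAT contracts on-chain:

```python
def holder_reward(holder_value: int, total_shares: int, period_income: float) -> float:
    """Pro-rata share of period income: value held / total shares x income."""
    return holder_value / total_shares * period_income

# 10,000 total shares, you hold value = 500, the model earns 10,000 DAT this month.
print(holder_reward(500, 10_000, 10_000))  # 500.0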
PROOF: Attached proof for authenticity (e.g., ZK, TEE, OP)
The Data Anchoring Token (DAT) framework defines how AI-native digital assets are represented, verified, and monetized on the LazAI Network.
It provides the foundational logic for data ownership, usage rights, and value attribution, using an interoperable, on-chain token standard.
This section explores the design principles, technical architecture, and lifecycle that power the DAT ecosystem.
At its core, the DAT standard bridges AI data provenance with decentralized economic infrastructure.
It ensures that every dataset, model, or computational artifact is:
Verifiable — its origin, authenticity, and contribution are cryptographically recorded.
Usable — its access rights are programmable and enforceable via smart contracts.
Rewardable — contributors automatically receive fair value for downstream AI usage.
By combining these three properties, LazAI establishes a new foundation for a transparent and composable AI economy.
The DAT system is built on five interoperable layers:
📘 Learn more: View the Architecture →
DATs follow a transparent and programmable lifecycle:
Create Class — Define an AI asset category (dataset, model, or agent).
Contribute Data — Upload and encrypt data, storing metadata in decentralized storage.
Mint DAT — Bind ownership and usage rights to a token.
Invoke Service — Use DATs to access or call AI services.
📘 Learn more: See Lifecycle & Value Semantics →
The DAT standard integrates privacy-preserving computation and cryptographic guarantees to protect sensitive AI data.
📘 Learn more: Explore Security & Privacy Model →
The Concepts & Architecture layer provides developers with a clear understanding of how DAT integrates cryptography, smart contracts, and token economics to create a verifiable and monetizable AI data framework.
Together, these concepts enable a scalable foundation for the decentralized AI economy built on LazAI.
Deploying a Counter Contract with Foundry
This guide will walk you through deploying a counter contract using Foundry, a fast and portable toolkit for Ethereum application development.
Before you begin, make sure you have:
A code editor (e.g., VS Code)
MetaMask wallet for deploying to testnets
RPC endpoint for deploying to a network
Open your terminal and run:
This installs foundryup, the Foundry installer.
Next, run:
This will install the Foundry toolchain (forge, cast, anvil, chisel).
Check the installation:
Create a new directory for your project and initialize Foundry:
This creates a project with the following structure:
src/ - for your smart contracts
test/ - for Solidity tests
script/ - for deployment scripts
Foundry initializes your project with a Counter contract in src/Counter.sol:
This contract stores a number and allows you to increment, decrement, and read it.
Compile your smart contracts with:
This command compiles all contracts in src/ and outputs artifacts to the out/ directory.
Foundry supports writing tests in Solidity (in the test/ directory). To run all tests:
You’ll see output indicating which tests passed or failed. The default project includes a sample test for the Counter contract.
To deploy your contract to the LazAI testnet, you’ll need:
An RPC URL
A private key with testnet LAZAI
Example deployment command for LazAI testnet:
Replace <YOUR_PRIVATE_KEY> with your actual private key. Never share your private key.
You can use cast to interact with deployed contracts, send transactions, or query data. For example, to call getCount() on the LazAI testnet:
This guide will walk you through deploying a counter contract using Hardhat, a popular JavaScript-based development environment for Ethereum.
In 2025, LazAI will focus on developing the foundational components of its ecosystem: launching an AI agent for data alignment, deploying a high-performance blockchain for AI execution, and implementing data-driven PoS + Quorum-Based BFT consensus. This year will be testnet-focused, allowing developers to experiment with AI data anchoring and verification mechanisms.
LazAI will launch Alith, a simple, composable, high-performance, and Web3-friendly AI agent framework designed to facilitate decentralized AI data processing, governance, and alignment. Unlike traditional AI agents that run in silos, Alith will interact with LazAI on-chain and off-chain AI data sources, ensuring data provenance and verification.
Successfully Delivered: Establish AI-native data workflows, allowing developers to access verified, decentralized AI datasets for training and inference, and build high-performance AI Agent applications.
LazAI will launch its testnet, providing a blockchain environment optimized for AI data integrity, validation, and settlement. This testnet will serve as a sandbox for developers experimenting with AI data tokenization, provenance tracking, and alignment verification.
Expected Outcome: Developers gain access to a secure, scalable blockchain infrastructure where AI data can be registered, exchanged, and verified with on-chain provenance guarantees.
DAT is live on Pre-Testnet, visit here.
LazAI will officially launch its mainnet, introducing a hybrid consensus protocol tailored for AI data verification and governance.
Expected Outcome: A robust, AI-data-driven blockchain with verifiable training datasets, model alignment guarantees, and tamper-proof AI execution workflows.
With a strong blockchain foundation in place, LazAI will shift its focus toward ensuring AI data security, cross-chain AI dataset interoperability, and decentralized AI workflow automation.
LazAI will implement its first major mainnet upgrade, reinforcing its position as the fastest blockchain for AI data transactions and provenance tracking.
Expected Outcome: LazAI will emerge as the leading blockchain for AI data integrity, verification, and decentralized training workflows.
To reinforce trust in AI data sources and computation, LazAI will implement enhanced dispute resolution and decentralized AI security guarantees.
Expected Outcome: LazAI will set a new standard for AI data validation, ensuring models remain ethical, accountable, and bias-resistant.
LazAI will expand beyond its native blockchain, integrating with decentralized AI data platforms, oracles, and federated learning networks.
Expected Outcome: LazAI will become the global hub for AI data exchange, enabling seamless AI asset movement across blockchain ecosystems.
LazAI is building the first AI-native blockchain ecosystem centered around AI data integrity, accessibility, and provenance tracking. Our roadmap ensures fair, scalable, and verifiable AI model governance.
🔹 2025: Building AI Data Infrastructure – Launching AI agents, enabling AI dataset tokenization, and establishing decentralized AI arbitration mechanisms.
🔹 2026: Scaling AI Data Governance – Enhancing AI dataset privacy, security, and interoperability.
🔹 Beyond 2026: Expanding AI data monetization, federated learning, and autonomous AI data governance.
POV Inlet (Point-of-View Inlet)
POV is a protocol and standard for data unification in LazAI. The POV Inlet is a submission interface that allows individuals or iDAOs to inject human-annotated context or judgments into the AI validation pipeline.
Verified Computing
A modular system combining ZKPs, Optimistic Proofs, and TEE to ensure verifiable off-chain AI computation and inference.
ZK Proofs
Zero-Knowledge Proofs - Cryptographic proofs used to verify computation or data without revealing sensitive information.
Quorum
A decentralized validation group consisting of multiple iDAOs that reach consensus on dataset integrity, model validation, and proof verification.
Extension Layer
Allows LazAI to integrate external data sources, model providers, computing platforms, and oracle services.
Execution Layer
Supports high-throughput, low-latency AI inference and verifiable computation, including Parallel EVM Execution.
Settlement Layer
Handles the issuance, ownership, and lifecycle of AI assets, ensuring on-chain traceability and compliance.
Data Availability Layer
Ensures reliable access and verification for on-chain/off-chain datasets via IPFS, Arweave, Web2 APIs, etc.
DeFAI
Decentralized Financial AI - A LazAI-native DeFi module enabling staking, lending, bonding, and monetization of AI assets.
Optimistic Proofs (OP)
A lightweight validation mechanism assuming correctness by default, but allowing dispute via fraud proofs.
Fraud Proofs
A dispute mechanism where challengers submit evidence against invalid data or computation to trigger slashing or reward redistribution.
TEE
Trusted Execution Environment - Hardware-level secure enclave enabling verifiable and private AI execution.
AI Agent
A modular, autonomous agent powered by AI models, trained on verified datasets, and deployed via Alith.
Term
Definition
LazAI
A decentralized AI infrastructure platform integrating verifiable AI data, models, and computation with blockchain, enabling a trustless, composable, and privacy-preserving AI economy.
iDAO
Individual-centric DAO: the native social structure of the AI economy. iDAOs ensure decentralized validation of data sources and AI workflows and enable LazAI to process reward allocation.
LazChain
The dedicated blockchain infrastructure of LazAI for managing AI assets, proofs, consensus, and rewards.
Alith
LazAI’s decentralized AI Agent Framework for building and deploying autonomous AI Agents.
DAT
Data Anchoring Token - A semi-fungible token standard that anchors datasets, models, and agents with verifiable provenance, access rights, and ownership.
DAT Marketplace
A decentralized trading and verification hub where AI assets (datasets, models, agents) are exchanged and validated.
lib/ - for dependencies
foundry.toml - project configuration file
LazAI testnet tokens (get from faucet)
Open MetaMask extension
Click on the network dropdown (usually shows "Ethereum Mainnet")
Click "Add Network"
Enter the following details:
Click "Save" to add the network
Switch to LazAI Testnet in MetaMask
Visit the LazAI faucet to get Testnet Tokens for deployment and transaction fees.
Open your web browser
Remix IDE will load automatically - no installation required
In the File Explorer (left panel), click the "+" icon next to "contracts"
Name your file Counter.sol
Add the following code:
Click on the "Solidity Compiler" tab (second icon in left panel)
Select compiler version 0.8.0 or higher
Ensure "Auto compile" is checked (optional but recommended)
Click "Compile Counter.sol" button
Check for any compilation errors in the console
Green checkmark indicates successful compilation
Click on "Deploy & Run Transactions" tab (third icon in left panel)
In the "Environment" dropdown, select "Injected Provider - MetaMask"
MetaMask will prompt you to connect - click "Connect"
Ensure you're connected to LazAI Testnet in MetaMask
Under "Contract" dropdown, select "Counter"
Click "Deploy" button
MetaMask will open asking you to confirm the transaction
Review gas fees and click "Confirm"
Wait for transaction confirmation
Before you begin, ensure you have:
Node.js installed (v12 or later)
npm (comes with Node.js)
A code editor (e.g., VS Code)
MetaMask wallet and testnet tokens for deployment
Open your terminal and create a new project directory:
Initialize a new npm project:
Install Hardhat and required dependencies:
Run the Hardhat setup wizard:
Choose “Create a JavaScript project” when prompted.
This will create a project structure like:
contracts/ - for Solidity contracts
ignition/ - for deployment scripts
test/ - for tests
hardhat.config.js - configuration file
Create a new file in the contracts directory, Counter.sol:
Compile your contracts with:
You should see a success message if there are no errors.
Create a new file in the ignition directory, Counter.js:
Create a .env file in your project root:
Edit hardhat.config.js:
Local Deployment (Optional)
Start the Hardhat local node in a separate terminal:
Deploy to local network:
Make sure to:
Get testnet tokens from the faucet
Add your private key to the .env file
Never share your private key
Deploy to LazAI:
Test Setup
Create test/Counter.js:
Running Tests
curl -L https://foundry.paradigm.xyz | bash
foundryup
forge --version
forge init Counter
cd Counter
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;
contract Counter {
uint256 private count;
function increment() public {
count += 1;
}
function decrement() public {
count -= 1;
}
function getCount() public view returns (uint256) {
return count;
}
}
forge build
forge test
forge create --rpc-url https://testnet.lazai.network \
  --private-key <YOUR_PRIVATE_KEY> \
  src/Counter.sol:Counter \
  --broadcast
cast call <CONTRACT_ADDRESS> "getCount()(uint256)" --rpc-url https://lazai-testnet.metisdevops.link
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;
contract Counter {
uint256 private count;
function increment() public {
count += 1;
}
function decrement() public {
count -= 1;
}
function getCount() public view returns (uint256) {
return count;
}
}
mkdir counter-project
cd counter-project
npm init -y
npm install --save-dev hardhat @nomicfoundation/hardhat-toolbox dotenv
npm install --save-dev @nomicfoundation/hardhat-ignition
npx hardhat
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;
contract Counter {
uint256 private count;
function increment() public {
count += 1;
}
function decrement() public {
count -= 1;
}
function getCount() public view returns (uint256) {
return count;
}
}
npx hardhat compile
const { buildModule } = require("@nomicfoundation/hardhat-ignition/modules");

module.exports = buildModule("CounterModule", (m) => {
const counter = m.contract("Counter");
return { counter };
});
PRIVATE_KEY=your_private_key_here
require("@nomicfoundation/hardhat-toolbox");
require("dotenv").config();
module.exports = {
solidity: "0.8.28",
networks: {
hardhat: {
chainId: 31337,
},
lazai: {
url: "https://testnet.lazai.network",
chainId: 133718,
accounts: [process.env.PRIVATE_KEY],
},
},
};
npx hardhat node
npx hardhat ignition deploy ignition/modules/Counter.js --network localhost
npx hardhat ignition deploy ignition/modules/Counter.js --network lazai
const { expect } = require("chai");
const { ethers } = require("hardhat");
describe("Counter", function () {
it("Should increment the counter", async function () {
const Counter = await ethers.getContractFactory("Counter");
const counter = await Counter.deploy();
await counter.deployed();
await counter.increment();
expect(await counter.getCount()).to.equal(1);
});
});
npx hardhat test
Distribute Rewards — Automatically split revenue based on shareRatio.
Expire or Renew — Handle time-bound access or licensing renewal.
1. Encryption & Data Layer: Encrypts contributed data and anchors its proof of existence.
2. Metadata & Provenance Layer: Records dataset or model attributes, versioning, and authorship on-chain.
3. Smart Contract Layer: Governs DAT minting, validation, transfers, and settlement logic.
4. Verification Layer (TEE/ZK): Verifies authenticity through trusted execution or zero-knowledge proofs.
5. Tokenization & Economy Layer: Issues semi-fungible DATs that encode ownership, usage, and value participation.
Encryption at Source: Data is encrypted locally before upload using hybrid AES–RSA keys.
TEE Verification: Trusted enclaves validate computation integrity without exposing raw data.
Zero-Knowledge Proofs (ZKP): Optional layer for verifying claims or usage without revealing private details.
Access Control Policies: Enforced on-chain to prevent unauthorized dataset or model invocation.
Composable Data Assets: Combine or split data ownership across classes and users.
Royalty-Backed Tokenization: Link AI model or dataset revenue directly to token holders.
Programmable Usage Rights: Define dynamic access rules, quotas, or billing models.
Interoperable with EVM: Fully compatible with standard Ethereum tools and smart contracts.
1. Learn the DAT Architecture →
2. Understand Lifecycle & Value Semantics →
3. Review Security & Privacy Model →
4. Implement your first DAT using the Developer Implementation Guide →
In this tutorial, you will learn how to create a Node.js application that integrates X/Twitter with Alith’s Model Context Protocol (MCP) feature. This allows you to use LLM models and agents to fetch tweets from a specific user, post a new tweet, like a tweet, quote a tweet, etc.
Note: Although this tutorial uses Node.js, you can also build this Twitter agent with the Alith Rust SDK or Python SDK.
Before starting, ensure you have the following:
OpenAI API Key: Sign up at OpenAI and get your API key or use your favorite LLM models.
Node.js 18+ environment and pnpm.
A X/Twitter account.
Initialize the project and install the necessary Node.js libraries using pnpm:
Store your API keys and tokens as environment variables for security:
To obtain cookies:
Log in to Twitter in your browser.
Open Developer Tools (F12).
Go to the Application tab > Cookies.
Copy the values of auth_token and ct0 cookies.
Create a Typescript script (e.g., index.ts) and add the following code:
Note: You need to install tsc first.
Create a JSON file named mcp_twitter.json and add the following code:
Run your Typescript script to start and test the application:
The Data Anchoring Token (DAT) architecture defines the core technical stack that powers verifiable AI data ownership on the LazAI Network.
It provides a layered framework where data, metadata, and economic value interact securely through smart contracts and cryptographic proofs.
At a high level, the DAT framework connects contributors, AI agents, and the blockchain using a multi-layered architecture:
This structure ensures every contributed dataset, model, or inference is:
Third-Party Data Sources like IPFS & Arweave
For other third-party data sources, we will currently maintain the same approach as with cross-chain integration for heterogeneous chains, i.e., using Oracles and third-party bridges. In the future, we will support native cross-chain methods.
mkdir alith-twitter-example && cd alith-twitter-example && pnpm init
pnpm i alith
pnpm i --save-dev @types/json-schema
export OPENAI_API_KEY="your-openai-api-key"
export AUTH_METHOD=cookies
export TWITTER_COOKIES=["auth_token=your_auth_token; Domain=.twitter.com", "ct0=your_ct0_value; Domain=.twitter.com"]
export AUTH_METHOD=credentials
export TWITTER_USERNAME=your_username
export TWITTER_PASSWORD=your_password
export TWITTER_EMAIL=your_email # Optional
export TWITTER_2FA_SECRET=your_2fa_secret # Optional, required if 2FA is enabled
export AUTH_METHOD=api
export TWITTER_API_KEY=your_api_key
export TWITTER_API_SECRET_KEY=your_api_secret_key
export TWITTER_ACCESS_TOKEN=your_access_token
export TWITTER_ACCESS_TOKEN_SECRET=your_access_token_secret
import { Agent } from "alith";
const agent = new Agent({
name: "A twitter agent",
model: "gpt-4",
  preamble: "You are an automatic Twitter agent.",
mcpConfigPath: "mcp_twitter.json",
});
console.log(await agent.prompt("Search Twitter for tweets about AI"));
console.log(await agent.prompt('Post a tweet saying "Hello from Alith Twitter Agent!"'));
console.log(await agent.prompt("Get the latest tweets from @OpenAI"));
console.log(await agent.prompt("Chat with Grok about quantum computing"));{
"mcpServers": {
"agent-twitter-client-mcp": {
"command": "npx",
"args": ["-y", "agent-twitter-client-mcp"],
"env": {
"AUTH_METHOD": "cookies",
"TWITTER_COOKIES": "[\"auth_token=YOUR_AUTH_TOKEN; Domain=.twitter.com\", \"ct0=YOUR_CT0_VALUE; Domain=.twitter.com\", \"twid=u%3DYOUR_USER_ID; Domain=.twitter.com\"]"
}
}
}
}
npx tsc && node index.js
Encrypted and verifiable
Anchored to an immutable provenance record
Represented as a semi-fungible DAT token
Linked to programmable usage and revenue logic
Encrypts raw data locally before submission using hybrid AES + RSA.
Generates a unique data fingerprint (SHA-256 hash) for integrity tracking.
Stores encrypted payloads in decentralized archives (IPFS, Filecoin, or private storage).
Output: Encrypted file + integrity hash
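As a rough illustration, the integrity fingerprint is simply a SHA-256 digest over the encrypted payload. This is a sketch only; the exact hashing and serialization scheme is defined by the SDK:

```python
import hashlib

def data_fingerprint(encrypted_payload: bytes) -> str:
    """SHA-256 fingerprint of the encrypted payload, used for integrity tracking."""
    return hashlib.sha256(encrypted_payload).hexdigest()

print(data_fingerprint(b"...encrypted bytes..."))
```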
Records asset identity, class, description, and URI in a metadata schema.
Maintains the provenance of contribution (creator, timestamp, ownership chain).
Anchors metadata hashes to the blockchain for tamper-proof traceability.
Output: Immutable metadata anchor
Core on-chain logic that manages the DAT lifecycle:
Registering data contributions
Minting and binding tokens to assets
Managing value transfers and ownership rights
Enables composable operations like:
registerData(), mintDAT(), transferValue(), claimRewards()
Output: On-chain record of ownership, value, and access
Validates submitted data through Trusted Execution Environments (TEE) or Zero-Knowledge Proofs (ZKPs).
Ensures the computation or dataset matches the registered proof without revealing the raw data.
Provides verifiable attestations used for DAT minting authorization.
Output: Signed proof of authenticity
Issues a semi-fungible DAT token (SFT) representing the verified contribution.
Encodes three properties:
Ownership Certificate
Usage Rights (e.g., call credits, model usage)
Value Share (fractional rewards)
Integrates with payment and settlement contracts to automate royalty flow.
Output: Minted DAT with on-chain economic logic
createClass(name, uri): Defines a new class of AI assets (datasets, models, or agents).
mintDAT(owner, classId, value, shareRatio, expiry): Issues a token for a verified contribution.
transferValue(fromToken, toToken, amount): Enables fine-grained value or credit transfer.
claimRewards(classId): Distributes on-chain rewards proportionally.
verifyData(hash, proof): Validates integrity through an off-chain verifier.
TEE Verifiers: Used for confidential validation without exposing data.
AI Agents / Oracles: Consume DATs as compute or model credits.
External Data Feeds: Can be integrated via API or SDK for automated registration.
Wallets & Dashboards: Manage minting, ownership, and analytics visually.
Privacy First: No unencrypted data leaves the contributor’s device.
Interoperability: Fully EVM-compatible and modular for AI agent extensions.
Composability: DATs can be split, merged, or reused across workflows.
Transparency: Each operation emits verifiable on-chain events.
🔹 Lifecycle & Value Semantics →
Learn how DATs evolve from registration to value realization.
🔹 Security & Privacy Model →
Explore how encryption, TEE, and ZKP ensure data safety.
🔹 Developer Implementation →
Start building and minting your first DAT.
To avoid dependency conflicts and keep your environment clean, create and activate a Python virtual environment before installing any packages:
For OpenAI/ChatGPT API:
For other OpenAI-compatible APIs (DeepSeek, Gemini, etc.):
Note: The public address of the private key you expose to the inference server is the LAZAI_IDAO_ADDRESS. Once the inference server is running, the URL must be registered using the add_inference_node function in Alith. This can only be done by LazAI admins.
Local Development
For OpenAI/ChatGPT API:
For other OpenAI-compatible APIs (DeepSeek, Gemini, etc.):
Production Deployment on Phala TEE Cloud
For production-ready applications, deploy your inference server on Phala TEE Cloud for enhanced security and privacy. Once deployed, you will receive an inference URL that needs to be registered using the add_inference_node function by LazAI admins.
You can also use the existing inference nodes.
Your data never leaves your control. Inference is performed in a privacy-preserving environment, using cryptographic settlement and secure computation.
Settlement headers ensure only authorized users and nodes can access your data for inference.
File ID links your inference request to the specific data you contributed, maintaining a verifiable chain of custody.
The Data Anchoring Token (DAT) is the foundational standard of the LazAI and Alith ecosystems.
It enables contributors to share privacy-sensitive datasets, AI models, or computation results while retaining full ownership, control, and economic rights.
DATs act as on-chain certificates of contribution, linking data provenance, access permissions, and value distribution directly to blockchain-based records.
Each DAT encodes three primary dimensions of data ownership and utility:
AI data is dynamic, composable, and frequently reused across models and tasks — properties that traditional token standards like ERC-20 and ERC-721 don’t fully support.
DAT introduces a semi-fungible token (SFT) model designed for modularity, traceability, and partial ownership of AI assets.
Data Contribution:
A user encrypts and uploads a dataset or model output through LazAI’s privacy framework.
Metadata Anchoring:
A smart contract logs encrypted metadata, provenance proofs, and ownership claims on-chain.
Verification:
Validators or trusted enclaves (TEE) confirm authenticity and compliance.
Standard: Semi-Fungible Token (SFT)
Purpose: Tokenize AI datasets, models, and computation outputs
Blockchain Layer: LazAI Testnet (EVM-compatible)
Supports: On-chain provenance, privacy-preserving validation, and composable ownership logic
DAT Architecture →
Lifecycle & Value Semantics →
Contribute Your Data →
graph TD
A[Contributor / AI Developer] --> B[Encryption & Data Layer]
B --> C[Metadata & Provenance Layer]
C --> D[Smart Contract Layer]
D --> E[Verification Layer (TEE / ZKP)]
E --> F[Tokenization & Economy Layer]
F --> G[DAT Holder / Service Consumer]
1. Encrypt data → Generate hash
2. Upload to decentralized archive
3. Register metadata and hash on-chain
4. Validate via TEE or ZKP
5. Mint DAT token representing the asset
6. Use DAT to access AI services or earn rewards
python3 -m venv venv
source venv/bin/activate
pip install llama-cpp-python pymilvus "pymilvus[model]"
python3 -m pip install alith -U
export PRIVATE_KEY=<your wallet private key>
export OPENAI_API_KEY=<your openai api key>
export PRIVATE_KEY=<your wallet private key>
export LLM_API_KEY=<your api key>
export LLM_BASE_URL=<your api base url>
from alith.inference import run
"""Run the server and use the following command to test the server
curl http://localhost:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-H "X-LazAI-User: 0xc3e98E8A9aACFc9ff7578C2F3BA48CA4477Ecf49" \
-H "X-LazAI-Nonce: 123456" \
-H "X-LazAI-Signature: HSDGYUSDOWP123" \
-H "X-LazAI-Token-ID: 1" \
-d '{
"model": "gpt-3.5-turbo",
"messages": [
{"role": "system", "content": "You are a helpful assistant"},
{"role": "user", "content": "What is the capital of France?"}
],
"temperature": 0.7,
"max_tokens": 100
}'
"""
server = run(model="gpt-3.5-turbo", settlement=True, engine_type="openai")
from alith.inference import run
# Example: Using DeepSeek model from OpenRouter
server = run(settlement=True, engine_type="openai", model="deepseek/deepseek-r1-0528")
from alith import Agent, LazAIClient
# 1. Join the iDAO, register user wallet on LazAI and deposit fees (Only Once)
LAZAI_IDAO_ADDRESS = "0xc3e98E8A9aACFc9ff7578C2F3BA48CA4477Ecf49" # Replace with your own address
client = LazAIClient()
try:
client.get_user(client.wallet.address)
print("User already exists")
except Exception:
print("User does not exist, adding user")
client.add_user(10000000)
client.deposit_inference(LAZAI_IDAO_ADDRESS, 1000000)
# 2. Request the inference server with the settlement headers and DAT file id
file_id = 11 # Use the File ID you received from the Data Contribution step
url = client.get_inference_node(LAZAI_IDAO_ADDRESS)[1]
print("url", url)
agent = Agent(
# Note: replace with your model here
model="gpt-3.5-turbo",
base_url=f"{url}/v1",
# Extra headers for settlement and DAT file anchoring
extra_headers=client.get_request_headers(LAZAI_IDAO_ADDRESS, file_id=file_id),
)
print(agent.prompt("summarize it"))
Tokenization:
A DAT is minted as a semi-fungible token representing the data’s rights, access rules, and value distribution.
Ownership Certificate: Records verifiable proof of contribution or authorship for datasets, models, or computation results.
Usage Rights: Defines how and where the data can be accessed — for example, by AI services, model training, or agent execution.
Value Share: Assigns proportional economic rewards to contributors based on usage, staking, or licensing activity.
ERC-20 (Fungible): Fully interchangeable tokens, ideal for currency or credits. Limitation: cannot represent unique datasets or ownership records.
ERC-721 (Non-Fungible): Unique tokens for singular assets (e.g., one-of-a-kind NFTs). Limitation: lacks divisibility and modularity for AI workloads.
DAT (Semi-Fungible): Hybrid model combining ERC-20 and ERC-721 traits — divisible, composable, and traceable. Tailored for data provenance and AI-specific workflows.
Verifiable Provenance: Every dataset or model is cryptographically tied to its origin and contributor.
Data Monetization: Contributors can receive automatic rewards or royalties for approved AI usage.
Privacy by Design: Encryption and TEE validation ensure that raw data remains confidential.
Composable Ownership: DATs can be merged, split, or referenced across multiple models or applications.
This guide walks through the process of setting up your local environment, connecting your wallet, and minting your first Data Anchoring Token (DAT) using the LazAI Node.JS SDK.
It assumes familiarity with Node.js, basic blockchain interactions, and access to the LazAI Pre-Testnet. For a guided walkthrough of the entire setup and minting process, watch the official tutorial: ➡️ https://youtu.be/ayD46LALXpo
Prerequisites
Node.js ≥ 18.x (LTS)
A funded Testnet wallet (for gas)
Credentials:
PRIVATE_KEY – your wallet’s private key (starts with 0x)
IPFS_JWT – Pinata (or compatible) IPFS JWT
dotenv loads env vars, axios sends the proof request, node-rsa encrypts the symmetric key for the verifier.
Create a file index.ts and paste:
Create tsconfig.json:
Create a .env (recommended) or export in your shell:
Your PRIVATE_KEY must start with 0x.
The script reads these via dotenv.
This guide walks through the process of setting up your local environment, connecting your wallet, and minting your first Data Anchoring Token (DAT) using the LazAI Rust SDK.
It assumes familiarity with Rust, basic blockchain interactions, and access to the LazAI Pre-Testnet. For a guided walkthrough of the entire setup and minting process, watch the official tutorial: ➡️ https://youtu.be/LYN_ZaxFWXg
cargo init --bin lazai-dat-rust
cd lazai-dat-rust
# Alith (from GitHub) with needed features
cargo add alith --git https://github.com/0xLazAI/alith --features "lazai,wallet,crypto,ipfs"
# Runtime & utils
cargo add tokio --features full
cargo add reqwest --features json
cargo add anyhow
cargo add hex
cargo add rand_08Start a local or remote inference server that can process requests on your private data.
Use the LazAI client to send an inference request, referencing your File ID and providing settlement headers for secure access.
export PRIVATE_KEY=<your wallet private key> # must start with 0x
export IPFS_JWT=<your pinata ipfs jwt>
mkdir LazAI-contribution
cd LazAI-contribution
npm init -y
npm install alith@latest axios dotenv node-rsa ts-node typescript
npm install --save-dev @types/node-rsa
import { config } from 'dotenv'
import { Client } from 'alith/lazai'
import { PinataIPFS } from 'alith/data/storage'
import { encrypt } from 'alith/data'
import NodeRSA from 'node-rsa'
import axios from 'axios'
import { promises as fs } from 'fs'
// Load environment variables
config()
async function main() {
try {
// Check for required environment variables
const privateKey = process.env.PRIVATE_KEY
const ipfsJwt = process.env.IPFS_JWT
if (!privateKey) {
throw new Error('PRIVATE_KEY environment variable is required')
}
if (!ipfsJwt) {
console.warn('Warning: IPFS_JWT environment variable not set. IPFS operations may fail.')
}
// Initialize client with private key as third parameter
const client = new Client(undefined, undefined, privateKey)
const ipfs = new PinataIPFS()
console.log('✅ Client initialized successfully')
console.log('✅ IPFS client initialized successfully')
// 1. Prepare your privacy data and encrypt it
const dataFileName = 'your_encrypted_data.txt'
const privacyData = 'Your Privacy Data'
const encryptionSeed = 'Sign to retrieve your encryption key'
const password = client.getWallet().sign(encryptionSeed).signature
const encryptedData = await encrypt(Uint8Array.from(privacyData), password)
console.log('✅ Data encrypted successfully')
// 2. Upload the privacy data to IPFS and get the shared url
const fileMeta = await ipfs.upload({
name: dataFileName,
data: Buffer.from(encryptedData),
token: ipfsJwt || '',
})
const url = await ipfs.getShareLink({ token: ipfsJwt || '', id: fileMeta.id })
console.log('✅ File uploaded to IPFS:', url)
// 3. Upload the privacy url to LazAI
let fileId = await client.getFileIdByUrl(url)
if (fileId == BigInt(0)) {
fileId = await client.addFile(url)
}
console.log('✅ File registered with LazAI, file ID:', fileId.toString())
// 4. Request proof in the verified computing node
await client.requestProof(fileId, BigInt(100))
const jobIds = await client.fileJobIds(fileId)
const jobId = jobIds[jobIds.length - 1]
const job = await client.getJob(jobId)
const nodeInfo = await client.getNode(job.nodeAddress)
const nodeUrl = nodeInfo.url
const pubKey = nodeInfo.publicKey
const rsa = new NodeRSA(pubKey, 'pkcs1-public-pem')
const encryptedKey = rsa.encrypt(password, 'hex')
const proofRequest = {
job_id: Number(jobId),
file_id: Number(fileId),
file_url: url,
encryption_key: encryptedKey,
encryption_seed: encryptionSeed,
nonce: null,
proof_url: null,
}
console.log('✅ Proof request prepared')
// Write proof request to file
await fs.writeFile('proof_request.json', JSON.stringify(proofRequest, null, 2))
console.log('✅ Proof request saved to proof_request.json')
const response = await axios.post(`${nodeUrl}/proof`, proofRequest, {
headers: { 'Content-Type': 'application/json' },
})
if (response.status === 200) {
console.log('✅ Proof request sent successfully')
} else {
console.log('❌ Failed to send proof request:', response.data)
}
// 5. Request DAT reward
await client.requestReward(fileId)
console.log('✅ Reward requested for file id', fileId.toString())
console.log('All operations completed successfully!')
} catch (error) {
console.error('❌ Error in main function:', error)
process.exit(1)
}
}
// Execute the main function
main().catch((error) => {
console.error('❌ Unhandled error:', error)
process.exit(1)
})
{
"compilerOptions": {
"target": "ES2022",
"module": "Node16",
"moduleResolution": "node16",
"strict": true,
"esModuleInterop": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"outDir": "./dist",
"allowJs": true,
"resolveJsonModule": true,
"allowSyntheticDefaultImports": true,
"experimentalDecorators": true,
"emitDecoratorMetadata": true
},
"ts-node": {
"esm": true,
"experimentalSpecifierResolution": "node"
},
"include": ["*.ts"],
"exclude": ["node_modules"]
}
# .env
PRIVATE_KEY=<your wallet private key>
IPFS_JWT=<your pinata ipfs jwt>
node --loader ts-node/esm index.ts
✅ Client initialized successfully
✅ IPFS client initialized successfully
✅ Data encrypted successfully
✅ File uploaded to IPFS: https://...
✅ File registered with LazAI, file ID: <id>
✅ Proof request prepared
✅ Proof request saved to proof_request.json
✅ Proof request sent successfully
✅ Reward requested for file id <id>
All operations completed successfully!
use anyhow::Result;
use reqwest;
use rand_08::thread_rng;
use alith::data::crypto::{encrypt, DecodeRsaPublicKey, Pkcs1v15Encrypt, RsaPublicKey};
use alith::data::storage::{DataStorage, PinataIPFS, UploadOptions};
use alith::lazai::{Client, ProofRequest, U256};
#[tokio::main]
async fn main() -> Result<()> {
// PRIVATE_KEY is read by the wallet inside Client::new_default()
let client = Client::new_default()?;
let ipfs = PinataIPFS::default();
// 1) Prepare privacy data and encrypt it
let data_file_name = "your_encrypted_data.txt";
let privacy_data = "Your Privacy Data";
let encryption_seed = "Sign to retrieve your encryption key";
// Sign the seed to derive a password
let password = client
.wallet
.sign_message_hex(encryption_seed.as_bytes())
.await?;
let encrypted_data = encrypt(privacy_data, password.clone())?;
// 2) Upload encrypted data to IPFS and get a share link
let token = std::env::var("IPFS_JWT")?;
let file_meta = ipfs
.upload(
UploadOptions::builder()
.name(data_file_name.to_string())
.data(encrypted_data)
.token(token.clone())
.build(),
)
.await?;
let url = ipfs.get_share_link(token, file_meta.id).await?;
// 3) Register the URL on LazAI
let mut file_id = client.get_file_id_by_url(url.as_str()).await?;
if file_id.is_zero() {
file_id = client.add_file(url.as_str()).await?;
}
// 4) Request a proof from a verified computing node
client.request_proof(file_id, U256::from(100)).await?;
let job_id = client.file_job_ids(file_id).await?.last().cloned().unwrap();
let job = client.get_job(job_id).await?;
let node_info = client.get_node(job.nodeAddress).await?.unwrap();
let node_url = node_info.url;
let pub_key_pem = node_info.publicKey;
// Encrypt the password with the node's RSA public key
let pub_key = RsaPublicKey::from_pkcs1_pem(&pub_key_pem)?;
let mut rng = thread_rng();
let enc_key_bytes = pub_key.encrypt(&mut rng, Pkcs1v15Encrypt, password.as_bytes())?;
let encryption_key = hex::encode(enc_key_bytes);
// Send proof request to the node
let resp = reqwest::Client::new()
.post(format!("{node_url}/proof"))
.json(
&ProofRequest::builder()
.job_id(job_id.to())
.file_id(file_id.to())
.file_url(url)
.encryption_key(encryption_key)
.encryption_seed(encryption_seed.to_string())
.build(),
)
.send()
.await?;
if resp.status().is_success() {
println!("✅ Proof request sent successfully");
} else {
println!("❌ Failed to send proof request: {:?}", resp);
}
// 5) Claim DAT reward
client.request_reward(file_id, None).await?;
println!("✅ Reward requested for file id {}", file_id);
Ok(())
}
cargo run
✅ Proof request sent successfully
✅ Reward requested for file id <file_id>
LazAI’s Verified Computing system consists of the following core contract modules:
Verifier Contract: Core validator that receives proofs from the VSC and verifies TEE signatures, ZK proofs, or OP-based fraud evidence on-chain.
interface IVerifier {
function submitTEEProof(
bytes calldata taskId,
bytes calldata resultHash,
bytes calldata teeSignature,
bytes calldata attestationReport
) external;
function submitZKProof(
bytes calldata taskId,
bytes calldata zkProof,
bytes calldata publicSignals
) external;
function submitFraudProof(
bytes calldata taskId,
bytes calldata evidence,
address challenger
) external;
function verifyResult(bytes calldata taskId) external view returns (bool valid);
}
This section outlines the step-by-step process for verifiable execution under each supported mode: TEE, TEE+ZK, and OP. Each flow ensures secure and auditable AI computation within the LazAI framework.
The iDAO initiates a computation task via an off-chain UI or dApp, sending it to the VSC for orchestration.
Submitted Fields:
taskId: Unique hash for the task;
dataRef: Dataset reference (e.g., IPFS, Arweave hash);
modelId: The model or Agent identifier to execute;
params: Inference or training parameters;
The VSC stores this metadata and assigns the task to eligible TEE nodes.
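A task payload might look like the following sketch. The field names follow the list above; the dataset reference, model identifier, and the actual wire format used by the VSC are illustrative assumptions:

```python
import hashlib
import json

task = {
    "dataRef": "ipfs://Qm.../dataset",                   # hypothetical dataset reference
    "modelId": "agent-classifier-v1",                    # hypothetical model/Agent identifier
    "params": {"mode": "inference", "max_tokens": 128},  # inference or training parameters
    "verifyMode": "TEE+ZK",                              # TEE, TEE+ZK, or OP
}
# taskId: a unique hash over the task contents
task["taskId"] = hashlib.sha256(json.dumps(task, sort_keys=True).encode()).hexdigest()
print(task["taskId"])
```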
The assigned TEE Worker Node performs the task inside an isolated enclave:
In-Enclave Process:
Initialize environment using the specified dataset and model;
Run computation (e.g., forward pass, token prediction, classification);
Generate resultHash from the output;
Sign the result using TEE’s private key;
Depending on verifyMode, the node additionally:
Generates a ZK-SNARK proving the correctness of the computation without revealing input/output (for TEE+ZK mode);
Prepares a Fraud-Proof-ready execution trace, hashed and stored off-chain (for OP mode).
Security Guarantees:
Execution is isolated and tamper-resistant;
Signed results are cryptographically bound to the specific TEE environment.
Once execution is complete, the VSC (Verifiable Service Coordinator) aggregates the following:
The VSC then formats this package and submits it as a transaction to the Verifier Contract on LazChain.
The Verifier Contract receives the task package and performs the following steps:
TEE Mode
Verifies the TEE signature using trusted attestation keys;
Validates that resultHash and attestation correspond to the original request;
Marks task as verified and stores the output hash on-chain.
TEE + ZK Mode
Verifies the TEE signature as above;
Executes on-chain ZK verifier to validate the ZK-SNARK proof;
Records both resultHash and proofResult in ExecutionRecord.
OP Mode
Stores the result and OP Trace Hash;
Opens a challenge window (e.g., 12–24 blocks) for registered challengers to submit fraud claims;
If no valid fraud proof is submitted within the window, the result is finalized as valid.
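The finalization rule can be sketched as a simple predicate. This is illustrative only; the actual window length and dispute logic live in the Verifier Contract:

```python
def op_result_finalized(current_block: int, submission_block: int,
                        challenge_window: int = 24, fraud_proven: bool = False) -> bool:
    """An OP-mode result is final once the challenge window has passed
    without a valid fraud proof being accepted."""
    return (current_block - submission_block) >= challenge_window and not fraud_proven

print(op_result_finalized(current_block=118, submission_block=100))  # False: window still open
print(op_result_finalized(current_block=130, submission_block=100))  # True: finalized
```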
Once verification is complete:
The ExecutionRecord module stores taskId, verification status, resultHash, and timestamp;
The Settlement Contract:
Increases or adjusts DAT value for contributing iDAOs;
Releases access credits or rewards based on usage;
All results are publicly queryable and indexed on LazChain for audit, traceability, and downstream data monetization.
To establish clear and verifiable ownership of datasets and models within LazAI, the following protocol is implemented between iDAO users, Quorum nodes, and the LazChain infrastructure:
Initial Anchoring with TEE Attestation
When a user (or iDAO member) first uploads a dataset or model to a Quorum, the selected TEE Worker Node performs a cryptographic attestation by:
Computing the data hash (e.g., SHA256 of the dataset or model);
Binding it to the uploader’s public key or LazChain address;
Signing this tuple using the TEE’s private attestation key.
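Conceptually, the anchored tuple looks like the following sketch. Here an eth_account key stands in for the TEE's private attestation key, and the file path and address are placeholders, not real values:

```python
import hashlib
from eth_account import Account
from eth_account.messages import encode_defunct

dataset_bytes = open("dataset.bin", "rb").read()          # placeholder dataset file
data_hash = hashlib.sha256(dataset_bytes).hexdigest()     # 1) data hash
uploader = "0x0000000000000000000000000000000000000001"   # 2) uploader's LazChain address (placeholder)

tee_key = Account.create()                                 # stand-in for the TEE attestation key
message = encode_defunct(text=f"{data_hash}:{uploader}")
attestation = tee_key.sign_message(message)                # 3) signed (hash, uploader) tuple
print(data_hash, attestation.signature.hex())
```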
Users can indirectly query and use encrypted data through computing nodes, including fine-tuning and inference.
Verify that nodes control accounts that can access the data through contracts.
Joint data analysis allows users to securely process information across multiple iDAOs.
This guide walks through the process of setting up your local environment, connecting your wallet, and minting your first Data Anchoring Token (DAT) using the LazAI Python SDK.
It assumes familiarity with Python, basic blockchain interactions, and access to the LazAI Pre-Testnet. For a guided walkthrough of the entire setup and minting process, watch the official tutorial: ➡️ https://youtu.be/YawyZ3aziRE?si=2i104Js04nxJ-oez
Ensure the following tools and credentials are available before starting:
Create a new working directory and (optionally) set up a virtual environment.
Install the LazAI SDK and required libraries.
If you prefer to load environment variables from a .env file instead of exporting them directly, install:
Set your credentials in the shell environment:
If you are using a .env file, include:
at the top of your Python script.
Create a file named mint.py and paste the following code.
Execute the minting process.
Expected Output:
This tutorial demonstrated how to:
Set up a local LazAI development environment.
Encrypt and upload data to IPFS.
Register and anchor it on the LazAI Network.
Generate verifiable proof and mint a DAT.
Once this is running successfully, you can integrate the same workflow into your agent or service pipeline using the Alith SDK for automated data anchoring and reward management.
To avoid dependency conflicts and keep your environment clean, create and activate a Python virtual environment before installing any packages:
For local development use the below installation command
Note: The public address of the private key you expose to the Query server is the LAZAI_IDAO_ADDRESS. Once the query server is running, the URL must be registered using the add_query_node function in Alith. This can only be done by LazAI admins.
For local development, ask the LazAI admins to register your wallet address and query server URL using the Alith add_query_node function.
For OpenAI/ChatGPT API:
For other OpenAI-compatible APIs (DeepSeek, Gemini, etc.):
Local Development
For OpenAI API or OpenAI-compatible APIs (DeepSeek, Gemini, etc.):
Production Deployment on Phala TEE Cloud
For production-ready applications, deploy your data query server on Phala TEE Cloud for enhanced security and privacy. Once deployed, you will receive a data query URL that needs to be registered using the add_query_node function by LazAI admins.
Use this starter kit to create and push your Docker image
You can also use the existing data query nodes.
Your data never leaves your control. Data query is performed in a privacy-preserving environment, using cryptographic settlement and secure computation.
Settlement headers ensure only authorized users and nodes can access your data for data query.
File ID links your data query request to the specific data you contributed, maintaining a verifiable chain of custody.
python3 -m venv venv
source venv/bin/activate
pip install llama-cpp-python pymilvus "pymilvus[model]"

| Requirement | Description |
| --- | --- |
| Python | Version 3.10 or later |
| pip | Latest package manager version |
| Wallet | Funded Pre-Testnet wallet for gas fees |
| Credentials | PRIVATE_KEY: your wallet private key (must start with 0x); IPFS_JWT: Pinata (or compatible) IPFS JWT token |
| Issue | Cause | Resolution |
| --- | --- | --- |
| eth_keys or signing errors | Invalid private key format | Ensure PRIVATE_KEY starts with 0x and Python version ≥ 3.10 |
| 401 / 403 IPFS errors | Invalid or expired JWT | Verify your IPFS_JWT and permissions in Pinata |
| Request or connection failures | Node busy or network timeout | Wait a minute and retry; confirm your Pre-Testnet RPC configuration |
| StorageError | Invalid token or transient network issue | Retry the upload or reinitialize the IPFS connection |
mkdir lazai-contribution-py
cd lazai-contribution-py
# Optional: create a Python virtual environment
python3 -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
python3 -m pip install -U alith requests rsa eth-account
python3 -m pip install python-dotenv
export PRIVATE_KEY=<your wallet private key>  # must start with 0x
export IPFS_JWT=<your pinata ipfs jwt>
from dotenv import load_dotenv
load_dotenv()
from alith.lazai import Client, ProofRequest
from alith.data import encrypt
from alith.data.storage import (
PinataIPFS,
UploadOptions,
GetShareLinkOptions,
StorageError,
)
from eth_account.messages import encode_defunct
from os import getenv
import asyncio
import requests
import rsa
import aiohttp
from pydantic import BaseModel
from typing import Optional
class ActualPinataUploadResponse(BaseModel):
id: str
name: str
cid: str
size: int
number_of_files: int
mime_type: str
created_at: str
updated_at: str
network: str
streamable: bool
accept_duplicates: Optional[bool] = None
is_duplicate: Optional[bool] = None
group_id: Optional[str] = None
class CustomPinataIPFS(PinataIPFS):
async def upload(self, opts: UploadOptions):
url = "https://uploads.pinata.cloud/v3/files"
form = aiohttp.FormData()
form.add_field("file", opts.data, filename=opts.name, content_type="text/plain")
form.add_field("network", "public")
headers = {"Authorization": f"Bearer {opts.token}"}
try:
async with self.client.post(url, data=form, headers=headers) as response:
if response.status != 200:
error_text = await response.text()
raise StorageError(f"Pinata IPFS API error: {error_text}")
data = await response.json()
pinata_response = ActualPinataUploadResponse(**data["data"])
from alith.data.storage import FileMetadata
return FileMetadata(
id=pinata_response.cid,
name=pinata_response.name,
size=pinata_response.size,
modified_time=pinata_response.updated_at,
)
except aiohttp.ClientError as e:
raise StorageError(f"Network error: {str(e)}") from e
async def main():
client = Client()
ipfs = CustomPinataIPFS()
try:
# 1. Prepare and encrypt data
data_file_name = "your_encrypted_data.txt"
privacy_data = "Your Privacy Data"
encryption_seed = "Sign to retrieve your encryption key"
message = encode_defunct(text=encryption_seed)
password = client.wallet.sign_message(message).signature.hex()
encrypted_data = encrypt(privacy_data.encode(), password)
# 2. Upload to IPFS
token = getenv("IPFS_JWT", "")
file_meta = await ipfs.upload(
UploadOptions(name=data_file_name, data=encrypted_data, token=token)
)
url = await ipfs.get_share_link(GetShareLinkOptions(token=token, id=file_meta.id))
# 3. Register file on LazAI
file_id = client.get_file_id_by_url(url)
if file_id == 0:
file_id = client.add_file(url)
# 4. Request proof from verified node
client.request_proof(file_id, 100)
job_id = client.file_job_ids(file_id)[-1]
job = client.get_job(job_id)
node_info = client.get_node(job[-1])
node_url: str = node_info[1]
pub_key = node_info[-1]
encryption_key = rsa.encrypt(
password.encode(),
rsa.PublicKey.load_pkcs1(pub_key.strip().encode(), format="PEM"),
).hex()
response = requests.post(
f"{node_url}/proof",
json=ProofRequest(
job_id=job_id,
file_id=file_id,
file_url=url,
encryption_key=encryption_key,
encryption_seed=encryption_seed,
proof_url=None,
).model_dump(),
)
if response.status_code == 200:
print("Proof request sent successfully.")
else:
print("Failed to send proof request:", response.json())
# 5. Request DAT reward
client.request_reward(file_id)
print("Reward requested for file id", file_id)
except StorageError as e:
print(f"Storage error: {e}")
except Exception as e:
raise e
finally:
await ipfs.close()
if __name__ == "__main__":
    asyncio.run(main())
python mint.py
Proof request sent successfully
Reward requested for file id <file_id>
pip install llama-cpp-python pymilvus "pymilvus[milvus_lite]"
python3 -m pip install alith -U
export PRIVATE_KEY=<your wallet private key>
export OPENAI_API_KEY=<your openai api key>
export RSA_PRIVATE_KEY_BASE64=<your rsa private key>
export PRIVATE_KEY=<your wallet private key>
export LLM_API_KEY=<your api key>
export LLM_BASE_URL=<your api base url>
import logging
import sys
import json
import uvicorn
import argparse
from fastapi import FastAPI, Response, status
from fastapi.middleware.cors import CORSMiddleware
from alith.lazai import Client
from alith.lazai.node.middleware import HeaderValidationMiddleware
from alith.lazai.node.validator import decrypt_file_url
from alith import MilvusStore, chunk_text
from alith.query.types import QueryRequest
from alith.query.settlement import QueryBillingMiddleware
import os
from dotenv import load_dotenv
load_dotenv()
# Get OpenAI API key from environment variable
PRIVATE_KEY = os.getenv("PRIVATE_KEY")
RSA_PRIVATE_KEY_BASE64 = os.getenv("RSA_PRIVATE_KEY_BASE64")
# LLM_API_KEY = os.getenv("LLM_API_KEY")
# LLM_BASE_URL = os.getenv("LLM_BASE_URL")
# DSTACK_API_KEY = os.getenv("DSTACK_API_KEY")
# Set the API key for OpenAI
os.environ["PRIVATE_KEY"] = PRIVATE_KEY
os.environ["RSA_PRIVATE_KEY_BASE64"] = RSA_PRIVATE_KEY_BASE64
# os.environ["LLM_API_KEY"] = LLM_API_KEY
# os.environ["LLM_BASE_URL"] = LLM_BASE_URL
# os.environ["DSTACK_API_KEY"] = DSTACK_API_KEY
# Logging configuration
logging.basicConfig(
stream=sys.stdout,
level=logging.INFO,
format="%(asctime)s - %(levelname)s - %(message)s",
)
logger = logging.getLogger(__name__)
client = Client()
app = FastAPI(title="Alith LazAI Privacy Data Query Node", version="1.0.0")
store = MilvusStore()
collection_prefix = "query_"
@app.get("/health")
async def health_check():
return {"status": "healthy", "message": "Server is running"}
@app.get("/")
async def root():
return {"message": "Alith LazAI Privacy Data Query Node", "version": "1.0.0"}
@app.post("/query/rag")
async def query_rag(req: QueryRequest):
try:
file_id = req.file_id
if req.file_url:
file_id = client.get_file_id_by_url(req.file_url)
if file_id:
file = client.get_file(file_id)
else:
return Response(
status_code=status.HTTP_400_BAD_REQUEST,
content=json.dumps(
{
"error": {
"message": "File ID or URL is required",
"type": "invalid_request_error",
}
}
),
)
owner, file_url, file_hash = file[1], file[2], file[3]
collection_name = collection_prefix + file_hash
# Cache data in the vector database
if not store.has_collection(collection_name):
encryption_key = client.get_file_permission(
file_id, client.contract_config.data_registry_address
)
data = decrypt_file_url(file_url, encryption_key).decode("utf-8")
store.create_collection(collection_name=collection_name)
store.save_docs(chunk_text(data), collection_name=collection_name)
data = store.search_in(
req.query, limit=req.limit, collection_name=collection_name
)
logger.info(f"Successfully processed request for file: {file}")
return {
"data": data,
"owner": owner,
"file_id": file_id,
"file_url": file_url,
"file_hash": file_hash,
}
except Exception as e:
return Response(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
content=json.dumps(
{
"error": {
"message": f"Error processing request for req: {req}. Error: {str(e)}",
"type": "internal_error",
}
}
),
)
def run(host: str = "0.0.0.0", port: int = 8000, *, settlement: bool = False):
# FastAPI app and LazAI client initialization
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
if settlement:
app.add_middleware(HeaderValidationMiddleware)
app.add_middleware(QueryBillingMiddleware)
return uvicorn.run(app, host=host, port=port)
if __name__ == "__main__":
description = "Alith data query server. Host your own embedding models and support language query!"
parser = argparse.ArgumentParser(description=description)
parser.add_argument(
"--host",
type=str,
help="Server host",
default="0.0.0.0",
)
parser.add_argument(
"--port",
type=int,
help="Server port",
default=8000,
)
parser.add_argument(
"--model",
type=str,
help="Model name or path",
default="/root/models/qwen2.5-1.5b-instruct-q5_k_m.gguf",
)
args = parser.parse_args()
run(host=args.host, port=args.port, settlement=False)
from alith.lazai import Client
import requests
client = Client()
node = "0x2591E4C0e6E771927A45eFAE8Cd5Bf20e585A57A" #change this address with one you registered with admin
try:
print("try to get user")
user = client.get_user(client.wallet.address)
print(user)
except Exception as e:
print("try to get user failed")
print(e)
print("try to add user failed")
client.add_user(1000000)
print("user added")
print("try to get query account")
url = client.get_query_node(node)[1]
print(url)
headers = client.get_request_headers(node)
print("request headers:", headers)
print(
"request result:",
requests.post(
f"{url}/query/rag",
headers=headers,
json={
"file_id": 10, #change with your file_id
"query": "summarise the best character?",
},
).json(),
)
verifyMode: Enum specifying TEE, TEE+ZK, or OP.
Attach remote attestation report.
Optionally triggers stake updates or slashing (in case of OP fraud).
| Source | Target | Data Transmitted | Purpose |
| --- | --- | --- | --- |
| Verifier | ExecutionRecord | Status and result hash | Task tracking |
| Verifier | Settlement | valid status, DAT info | Reward or slashing |
Ownership Registration via LazChain
The Verifiable Service Coordinator (VSC) packages this attestation into a LazChain transaction, which includes:
The data or model hash;
The uploader’s public address;
The TEE signature and timestamp.
The Ownership Registry Contract stores this mapping, establishing an on-chain link between data hash and rightful owner.
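Conceptually, the registry is a mapping from data hash to owner plus attestation metadata. The following in-memory Python sketch is only an analogy for that on-chain contract; the class and method names are hypothetical and not part of the LazAI interface.

```python
import time

class OwnershipRegistry:
    """Illustrative in-memory analogue of the on-chain Ownership Registry Contract."""

    def __init__(self) -> None:
        self._records: dict[str, dict] = {}

    def register(self, data_hash: str, owner_address: str, tee_signature: str) -> None:
        # A hash can only be anchored once; later transfers would be separate transactions.
        if data_hash in self._records:
            raise ValueError("hash already anchored")
        self._records[data_hash] = {
            "owner": owner_address,
            "tee_signature": tee_signature,
            "timestamp": time.time(),
        }

    def owner_of(self, data_hash: str) -> str | None:
        # Ownership query: look up the binding between data hash and address.
        record = self._records.get(data_hash)
        return record["owner"] if record else None
```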
Consensus-Based Verification and Transfer
All future operations involving the dataset, whether validation, licensing, delegation, or ownership transfer, must be registered as consensus-approved transactions on LazChain. This ensures tamper-proof traceability and cryptographic auditability.
Ownership Query and Privacy-Preserving Claims
To prove ownership, a user simply queries the on-chain registry for the binding between their address and the data hash. When the user must prove specific dataset properties (e.g., that it contains 10,000 labeled entries or complies with regulatory filters) without revealing the raw data, a Quorum node may generate a Zero-Knowledge Proof (ZKP) certifying the claim, which can then be verified on-chain.
| Component | Description |
| --- | --- |
| VSC Coordinator | Middleware that aggregates proofs from TEE nodes, including optional ZK/OP proofs, and submits them to the Verifier contract. |
| Challenger Registry | Maintains a list of Quorum-elected challenger nodes and tracks their activity, reputation, and rewards. |
| ExecutionRecord | Stores metadata of each computation task, including result hash and verification status. |
| Settlement Contract | Handles reward allocation and slashing based on verification results, impacting DAT ownership and value. |
| Data | Included When | Purpose |
| --- | --- | --- |
| TEE Signature | Always | Confirms execution occurred inside TEE |
| ZK Proof | TEE+ZK only | Proves result validity without exposing data |
| OP Trace Hash | OP only | Enables later challenge by a third party |
| Execution Metadata | Always | Includes modelId, dataRoot, execution timestamp, etc. |
| Source | Target | Data Transmitted | Purpose |
| --- | --- | --- | --- |
| iDAO | VSC | taskId, modelId, dataId, verifyMode | Task configuration |
| TEE Node | VSC | resultHash, signature, attestation | Secure result |
| VSC | Verifier | Proofs and metadata | On-chain validation |
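Putting the two tables above together, a proof submission can be pictured as a payload that always carries the TEE signature and execution metadata, and conditionally carries a ZK proof or an OP trace hash depending on the verification mode. The dataclass below is a hypothetical Python sketch of that payload, not the Verifier contract's actual calldata.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ProofSubmission:
    # Always present
    tee_signature: str
    metadata: dict = field(default_factory=dict)   # modelId, dataRoot, execution timestamp, ...
    # Mode-dependent fields
    verify_mode: str = "TEE"                       # "TEE", "TEE+ZK", or "OP"
    zk_proof: Optional[bytes] = None               # required when verify_mode == "TEE+ZK"
    op_trace_hash: Optional[str] = None            # required when verify_mode == "OP"

    def validate(self) -> None:
        # Enforce the inclusion rules from the proof-contents table above.
        if self.verify_mode == "TEE+ZK" and self.zk_proof is None:
            raise ValueError("TEE+ZK submissions must include a ZK proof")
        if self.verify_mode == "OP" and self.op_trace_hash is None:
            raise ValueError("OP submissions must include a trace hash for later challenges")
```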
Your Digital Twin is an AI persona that speaks in your voice. We generate it from your Twitter/X archive, store it in a portable Characterfile (character.json), and load it into an Alith agent that writes tweets for you (manually or on a schedule).
Portable persona: Single JSON file that any LLM agent can use
Separation of concerns: Keep style/persona in JSON; keep logic in code
Composable: Swap personas without touching the app
macOS/WSL/Linux with Node.js 18+
An OpenAI or Anthropic (Claude) API key
A Twitter/X account + your archive .zip
Request your archive
Get it from X/Twitter: Settings → Download an archive.
Run the generator
Choose openai or claude
Paste your API key when prompted
Output: a character.json in the current directory
Move it into the Digital-Twin-Starter-kit directory
Place character.json at your root (same level as index.js), e.g.:
We create an Alith agent at runtime and pass your character as a preamble. This keeps persona separate from code and makes it hot‑swappable.
Loads character.json
Builds an Alith preamble from bio/lore/style
Generates a tweet in your voice
Provides postTweet (manual) and autoTweet (cron) helpers
Environment
Install deps:
We schedule the agent to post automatically (e.g., at minute 5 every hour).
services/cronService.js
index.js (ESM)
Route
Manual test
Re‑run npx tweets2character any time you want a fresh persona.
Replace character.json and restart your server.
The agent will immediately pick up the new style/preamble.
CommonJS + ESM: Alith is ESM, your project is CJS → use dynamic import inside functions (shown above).
Length control: We trim to 240 chars to be safe with links/quoted tweets.
Fallbacks: If the model isn’t reachable, we fall back to postExamples from your character file.
Safety: Add your own guardrails (e.g., profanity, duplicates, rate limits) before posting.
git clone https://github.com/0xLazAI/Digital-Twin-Starter-kit.git
cd Digital-Twin-Starter-kit
# You can run this anywhere; no cloning needed
npx tweets2character ~/Downloads/twitter-YYYY-MM-DD-<hash>.zip
/Digital-Twin-Starter-kit
├─ controller/
├─ routes/
├─ services/
├─ character.json ← here
└─ index.js
// controller/twitterController.js
const { initializeTwitterClient } = require('../config/twitter');
const fs = require('fs');
const path = require('path');
// Load character data
const characterData = JSON.parse(
fs.readFileSync(path.join(__dirname, '../character.json'), 'utf8')
);
// alith function with our character
const alithfunction = async (username = "") => {
try {
const { Agent, LLM } = await import('alith');
const preamble = [
`You are ${characterData.name}.`,
characterData.bio?.join(' ') || '',
characterData.lore ? `Lore: ${characterData.lore.join(' ')}` : '',
characterData.adjectives ? `Traits: ${characterData.adjectives.join(', ')}` : '',
characterData.style?.post ? `Style for posts: ${characterData.style.post.join(' ')}` : '',
].filter(Boolean).join('\n');
const model = LLM.from_model_name(process.env.LLM_MODEL || 'gpt-4o-mini');
const agent = Agent.new('twitter_agent', model).preamble(preamble);
const prompt = [
`Write one tweet in ${characterData.name}'s voice.`,
username ? `Optionally greet @${username}.` : '',
`<=240 chars, no code blocks, hashtags only if essential.`
].join(' ');
const chat = agent.chat();
const result = await chat.user(prompt).complete();
const text = (result?.content || '').toString().trim();
if (!text) throw new Error('Empty tweet from agent');
return text.slice(0, 240);
} catch (err) {
// Fallback to examples if Alith/model is unavailable
const examples = characterData.postExamples || [];
const base = examples[Math.floor(Math.random() * examples.length)] || 'Hello from my agent!';
return username ? `${base} @${username}`.slice(0, 240) : base.slice(0, 240);
}
};
const generateQuirkyMessage = async (username) => {
return await alithfunction(username);
};
let twitterClient = null;
// New function for cron job - posts tweet without requiring request/response
const postTweetCron = async () => {
try {
console.log('Cron job: Starting tweet posting...');
// Initialize Twitter client if not already initialized
if (!twitterClient) {
console.log('Cron job: Initializing Twitter client...');
twitterClient = await initializeTwitterClient();
}
// Generate message for cron job (you can customize this)
const message = await generateQuirkyMessage('cron');
console.log('Cron job: Posting tweet with message:', message);
// Send the tweet
const tweetResult = await twitterClient.sendTweet(message);
console.log('Cron job: Tweet result:', tweetResult);
// Log success
const tweetId = tweetResult.id || tweetResult.id_str;
if (tweetId) {
const tweetUrl = `https://twitter.com/${process.env.TWITTER_USERNAME}/status/${tweetId}`;
console.log('Cron job: Tweet posted successfully:', tweetUrl);
} else {
console.log('Cron job: Tweet posted but no ID received');
}
return { success: true, message: 'Tweet posted via cron job' };
} catch (error) {
console.error('Cron job: Error in postTweetCron:', error);
if (error.message.includes('authentication')) {
twitterClient = null;
}
throw error;
}
};
const postTweet = async (req, res) => {
console.log('Received request body:', req.body);
try {
const { username, address } = req.body;
console.log('Processing username:', username);
// Initialize Twitter client if not already initialized
if (!twitterClient) {
console.log('Initializing Twitter client...');
twitterClient = await initializeTwitterClient();
}
// Remove @ symbol if included
const cleanUsername = username.replace('@', '');
const message = await generateQuirkyMessage(cleanUsername);
console.log('Posting tweet with message:', message);
try {
// Send the tweet
const tweetResult = await twitterClient.sendTweet(message);
console.log('Tweet result:', tweetResult);
// Instead of fetching tweet details again, construct URL from the initial response
// Most Twitter API responses include either an id or id_str field
const tweetId = tweetResult.id || tweetResult.id_str;
console.log(tweetId)
if (!tweetId) {
console.log('Tweet posted but no ID received:', tweetResult);
return res.status(200).json({
success: true,
message: 'Tweet posted successfully',
tweetUrl: `https://twitter.com/${process.env.TWITTER_USERNAME}/`, // Fallback URL
tweetText: message,
updatedScore: 0
});
}
const tweetUrl = `https://twitter.com/${process.env.TWITTER_USERNAME}/status/${tweetId}`;
console.log('Constructed tweet URL:', tweetUrl);
return res.status(200).json({
success: true,
message: 'Tweet posted successfully',
tweetUrl,
tweetText: message,
updatedScore: 0
});
} catch (tweetError) {
console.error('Error with tweet operation:', tweetError);
throw tweetError;
}
} catch (error) {
console.error('Error in postTweet:', error);
if (error.message.includes('authentication')) {
twitterClient = null;
}
return res.status(500).json({
error: 'Failed to post tweet',
details: error.message
});
}
};
module.exports = {
postTweet,
postTweetCron
};
# .env
TWITTER_USERNAME=username
TWITTER_PASSWORD=password
TWITTER_EMAIL=email
# Alith / LLM
LLM_MODEL=gpt-4o-mini
ALITH_API_KEY=your_key_if_required  # only if your Alith setup needs it
npm i alith node-cron
# or pnpm add alith node-cron
const cron = require('node-cron');
const { postTweetCron } = require('../controller/twitterController');
class CronService {
constructor() {
this.isRunning = false;
}
start() {
if (this.isRunning) {
console.log('Cron service is already running');
return;
}
console.log('Starting cron service...');
// Schedule tweet posting every 1 minute
cron.schedule('* * * * *', async () => {
console.log('Cron job triggered: Posting tweet...');
try {
await postTweetCron();
console.log('Tweet posted successfully via cron job');
} catch (error) {
console.error('Error in cron job tweet posting:', error);
}
}, {
scheduled: true,
timezone: "UTC"
});
this.isRunning = true;
console.log('Cron service started successfully. Tweets will be posted every minute.');
}
stop() {
if (!this.isRunning) {
console.log('Cron service is not running');
return;
}
console.log('Stopping cron service...');
cron.getTasks().forEach(task => task.stop());
this.isRunning = false;
console.log('Cron service stopped');
}
getStatus() {
return {
isRunning: this.isRunning,
nextRun: this.isRunning ? 'Every minute' : 'Not scheduled'
};
}
}
module.exports = CronService;
import express from "express";
import cors from "cors";
import dotenv from 'dotenv';
dotenv.config();
import twitterRoutes from './routes/twitterRoutes.js';
import CronService from './services/cronService.js';
const app = express();
const cronService = new CronService();
app.use(cors({
origin: '*', // Allow all origins
}));
app.use(express.json());
app.use(express.urlencoded({ extended: true }));
// Debug middleware to log requests
app.use((req, res, next) => {
console.log('Received request:', {
method: req.method,
path: req.path,
body: req.body,
headers: req.headers
});
next();
});
app.use('/api', twitterRoutes);
// Add cron status endpoint
app.get('/api/cron/status', (req, res) => {
const status = cronService.getStatus();
res.json(status);
});
// Add cron control endpoints (optional - for manual control)
app.post('/api/cron/start', (req, res) => {
cronService.start();
res.json({ message: 'Cron service started' });
});
app.post('/api/cron/stop', (req, res) => {
cronService.stop();
res.json({ message: 'Cron service stopped' });
});
// Error handling middleware
app.use((err, req, res, next) => {
console.error('Server error:', err);
res.status(500).json({
error: 'Internal server error',
details: err.message
});
});
app.use(express.json({ limit: '10mb' }));
app.use(express.urlencoded({ limit: '10mb', extended: true }));
const port = process.env.PORT || 3005;
app.listen(port, () => {
console.log(`Server started on port ${port}`);
// Start the cron service
cronService.start();
});
// routes/twitterRoutes.js
const express = require('express');
const router = express.Router();
const { postTweet } = require('../controller/twitterController');
router.post('/tweet', postTweet);
module.exports = router;
curl -X POST http://localhost:3005/api/tweet \
-H "Content-Type: application/json" \
-d '{"username":"someone"}'npm run devComplete Step-by-Step Tutorial: On-chain AI Agent
This comprehensive tutorial will guide you through building an AI Agent that can handle both general conversations and blockchain operations, specifically token balance checking on the LazAI testnet.
Before starting, ensure you have:
Node.js 18+ installed
npm or yarn package manager
Basic knowledge of React, TypeScript, and Next.js
OpenAI API key (for AI conversations)
Code editor (VS Code recommended)
Your project should look like this:
What each package does:
ethers: Ethereum library for blockchain interactions
alith: AI SDK for OpenAI integration
node-loader: Webpack loader for native modules
Create or update next.config.ts:
Why this configuration is needed:
Handles native modules that can't be bundled by webpack
Prevents client-side bundling of server-only packages
Ensures proper module resolution
Create app/api/token-balance/route.ts:
Key Features:
Validates Ethereum addresses
Handles BigInt serialization
Provides fallback values for missing token data
Comprehensive error handling
Create app/api/chat/route.ts:
Key Features:
Smart message routing based on content analysis
Pattern recognition for balance requests
Automatic address extraction
Enhanced AI prompts with blockchain context
Create app/components/ChatInterface.tsx:
Key Features:
Real-time message updates
Markdown rendering for formatted responses
Auto-scroll to latest messages
Loading indicators
Create a .env.local file in your project root:
Go to
Sign up or log in to your account
Navigate to "API Keys" in the sidebar
Click "Create new secret key"
Security Note: Never commit your API key to version control!
Try these general questions:
"What is blockchain technology?"
"How does cryptocurrency work?"
"Tell me a joke"
"What are the benefits of decentralized systems?"
Try these formats:
"Check token balance for contract 0x1234567890123456789012345678901234567890 and wallet 0x0987654321098765432109876543210987654321"
"What's the balance of token 0x... in wallet 0x..."
"Check balance: contract 0x... wallet 0x..."
Note: You'll need valid contract and wallet addresses on the LazAI testnet for this to work.
Feature showcase for new users
Paste it in your .env.local file
# Create a new Next.js project with TypeScript
npx create-next-app@latest ai-agent --typescript --tailwind --eslint --app --src-dir=false --import-alias="@/*"
# Navigate to the project directory
cd ai-agent
ai-agent/
├── app/
│ ├── globals.css
│ ├── layout.tsx
│ └── page.tsx
├── public/
├── next.config.ts
├── package.json
└── tsconfig.json
# Install core dependencies
npm install ethers alith
# Install development dependencies
npm install --save-dev node-loader
import type { NextConfig } from "next";
const nextConfig: NextConfig = {
webpack: (config, { isServer }) => {
if (isServer) {
// On the server side, handle native modules
config.externals = config.externals || [];
config.externals.push({
'@lazai-labs/alith-darwin-arm64': 'commonjs @lazai-labs/alith-darwin-arm64',
});
} else {
// On the client side, don't bundle native modules
config.resolve.fallback = {
...config.resolve.fallback,
'@lazai-labs/alith-darwin-arm64': false,
'alith': false,
};
}
return config;
},
// Mark packages as external for server components
serverExternalPackages: ['@lazai-labs/alith-darwin-arm64', 'alith'],
};
export default nextConfig;
# Create the API directories
mkdir -p app/api/token-balance
mkdir -p app/api/chat
mkdir -p app/components
import { NextRequest, NextResponse } from 'next/server';
import { ethers } from 'ethers';
// ERC-20 Token ABI (minimal for balance checking)
const ERC20_ABI = [
{
"constant": true,
"inputs": [{"name": "_owner", "type": "address"}],
"name": "balanceOf",
"outputs": [{"name": "balance", "type": "uint256"}],
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "decimals",
"outputs": [{"name": "", "type": "uint8"}],
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "symbol",
"outputs": [{"name": "", "type": "string"}],
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "name",
"outputs": [{"name": "", "type": "string"}],
"type": "function"
}
];
// LazAI Testnet configuration
const LAZAI_RPC = 'https://testnet.lazai.network';
const LAZAI_CHAIN_ID = 133718;
export async function POST(request: NextRequest) {
try {
const { contractAddress, walletAddress } = await request.json();
// Validate inputs
if (!contractAddress || !walletAddress) {
return NextResponse.json(
{ error: 'Contract address and wallet address are required' },
{ status: 400 }
);
}
// Validate Ethereum addresses
if (!ethers.isAddress(contractAddress)) {
return NextResponse.json(
{ error: 'Invalid contract address format' },
{ status: 400 }
);
}
if (!ethers.isAddress(walletAddress)) {
return NextResponse.json(
{ error: 'Invalid wallet address format' },
{ status: 400 }
);
}
// Connect to LazAI testnet
const provider = new ethers.JsonRpcProvider(LAZAI_RPC);
// Create contract instance
const contract = new ethers.Contract(contractAddress, ERC20_ABI, provider);
try {
// Get token information with individual error handling
let balance, decimals, symbol, name;
try {
balance = await contract.balanceOf(walletAddress);
} catch (error) {
return NextResponse.json(
{ error: 'Failed to get token balance. Contract might not be a valid ERC-20 token.' },
{ status: 400 }
);
}
try {
decimals = await contract.decimals();
} catch (error) {
// If decimals call fails, assume 18 decimals (most common)
decimals = 18;
}
try {
symbol = await contract.symbol();
} catch (error) {
symbol = 'UNKNOWN';
}
try {
name = await contract.name();
} catch (error) {
name = 'Unknown Token';
}
// Format balance - convert BigInt to string first
const formattedBalance = ethers.formatUnits(balance.toString(), decimals);
// Get LAZAI balance for comparison
const lazaiBalance = await provider.getBalance(walletAddress);
const formattedLazaiBalance = ethers.formatEther(lazaiBalance.toString());
return NextResponse.json({
success: true,
data: {
tokenName: name,
tokenSymbol: symbol,
contractAddress: contractAddress,
walletAddress: walletAddress,
balance: formattedBalance,
rawBalance: balance.toString(), // Convert BigInt to string
decimals: Number(decimals), // Convert BigInt to number
lazaiBalance: formattedLazaiBalance,
network: {
name: 'LazAI Testnet',
chainId: LAZAI_CHAIN_ID,
rpc: LAZAI_RPC,
explorer: 'https://testnet-explorer.lazai.network'
}
}
});
} catch (contractError) {
console.error('Contract interaction error:', contractError);
return NextResponse.json(
{ error: 'Contract not found or not a valid ERC-20 token on LazAI testnet' },
{ status: 400 }
);
}
} catch (error) {
console.error('Error checking token balance:', error);
// Handle specific errors
if (error instanceof Error) {
if (error.message.includes('execution reverted')) {
return NextResponse.json(
{ error: 'Contract not found or not a valid ERC-20 token' },
{ status: 400 }
);
}
if (error.message.includes('network') || error.message.includes('connection')) {
return NextResponse.json(
{ error: 'Network connection failed. Please try again.' },
{ status: 500 }
);
}
}
return NextResponse.json(
{ error: 'Failed to check token balance' },
{ status: 500 }
);
}
}
import { NextRequest, NextResponse } from 'next/server';
import { Agent } from 'alith';
// Function to detect token balance requests
function isTokenBalanceRequest(message: string): { isRequest: boolean; contractAddress?: string; walletAddress?: string } {
const lowerMessage = message.toLowerCase();
// Check for common patterns
const balancePatterns = [
/check.*balance/i,
/token.*balance/i,
/balance.*check/i,
/how much.*token/i,
/token.*amount/i
];
const hasBalanceIntent = balancePatterns.some(pattern => pattern.test(lowerMessage));
if (!hasBalanceIntent) {
return { isRequest: false };
}
// Extract Ethereum addresses (basic pattern)
const addressPattern = /0x[a-fA-F0-9]{40}/g;
const addresses = message.match(addressPattern);
if (!addresses || addresses.length < 2) {
return { isRequest: false };
}
// Assume first address is contract, second is wallet
return {
isRequest: true,
contractAddress: addresses[0],
walletAddress: addresses[1]
};
}
export async function POST(request: NextRequest) {
try {
const { message } = await request.json();
if (!message || typeof message !== 'string') {
return NextResponse.json(
{ error: 'Message is required and must be a string' },
{ status: 400 }
);
}
// Check if this is a token balance request
const balanceRequest = isTokenBalanceRequest(message);
if (balanceRequest.isRequest && balanceRequest.contractAddress && balanceRequest.walletAddress) {
// Route to token balance API
try {
const balanceResponse = await fetch(`${request.nextUrl.origin}/api/token-balance`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
contractAddress: balanceRequest.contractAddress,
walletAddress: balanceRequest.walletAddress
}),
});
const balanceData = await balanceResponse.json();
if (balanceData.success) {
const formattedResponse = `🔍 **Token Balance Check Results**
**Token Information:**
• Name: ${balanceData.data.tokenName}
• Symbol: ${balanceData.data.tokenSymbol}
• Contract: \`${balanceData.data.contractAddress}\`
**Wallet Information:**
• Address: \`${balanceData.data.walletAddress}\`
• Token Balance: **${balanceData.data.balance} ${balanceData.data.tokenSymbol}**
• LAZAI Balance: **${balanceData.data.lazaiBalance} LAZAI**
**Network:** ${balanceData.data.network.name} (Chain ID: ${balanceData.data.network.chainId})
You can view this transaction on the [block explorer](${balanceData.data.network.explorer}/address/${balanceData.data.walletAddress}).`;
return NextResponse.json({ response: formattedResponse });
} else {
let errorMessage = `❌ **Error checking token balance:** ${balanceData.error}`;
// Provide helpful suggestions based on the error
if (balanceData.error.includes('not a valid ERC-20 token')) {
errorMessage += `\n\n💡 **Suggestions:**
• Make sure the contract address is a valid ERC-20 token on LazAI testnet
• Verify the contract exists and is deployed on the network
• Check if the contract implements the standard ERC-20 interface`;
} else if (balanceData.error.includes('Invalid contract address')) {
errorMessage += `\n\n💡 **Suggestion:** Please provide a valid Ethereum address starting with 0x followed by 40 hexadecimal characters.`;
} else if (balanceData.error.includes('Invalid wallet address')) {
errorMessage += `\n\n💡 **Suggestion:** Please provide a valid Ethereum wallet address starting with 0x followed by 40 hexadecimal characters.`;
}
return NextResponse.json({ response: errorMessage });
}
} catch (error) {
console.error('Error calling token balance API:', error);
return NextResponse.json({
response: "❌ **Error:** Failed to check token balance. Please try again later.\n\n💡 **Possible causes:**\n• Network connection issues\n• Invalid contract or wallet addresses\n• Contract not deployed on LazAI testnet"
});
}
}
// Check if API key is configured for AI responses
if (!process.env.OPENAI_API_KEY) {
return NextResponse.json(
{ error: 'OpenAI API key is not configured' },
{ status: 500 }
);
}
// Initialize the Alith agent with enhanced preamble
const agent = new Agent({
model: "gpt-4",
preamble: `Your name is Alith. You are a helpful AI assistant with blockchain capabilities.
**Available Features:**
1. **Token Balance Checker**: Users can check ERC-20 token balances on the LazAI testnet by providing a contract address and wallet address. The format should include both addresses in the message.
**Network Information:**
- Network: LazAI Testnet
- Chain ID: 133718
- RPC: https://testnet.lazai.network
- Explorer: https://testnet-explorer.lazai.network
**How to use token balance checker:**
Users can ask questions like:
- "Check token balance for contract 0x... and wallet 0x..."
- "What's the balance of token 0x... in wallet 0x..."
- "Check balance: contract 0x... wallet 0x..."
Provide clear, concise, and accurate responses. Be friendly and engaging in your conversations. If users ask about token balances, guide them to provide both contract and wallet addresses.`,
});
// Get response from the agent
const response = await agent.prompt(message);
return NextResponse.json({ response });
} catch (error) {
console.error('Error in chat API:', error);
return NextResponse.json(
{ error: 'Failed to get response from AI' },
{ status: 500 }
);
}
}
'use client';
import { useState, useRef, useEffect } from 'react';
interface Message {
id: string;
content: string;
role: 'user' | 'assistant';
timestamp: Date;
}
// Simple markdown renderer for basic formatting
const renderMarkdown = (text: string) => {
return text
.replace(/\*\*(.*?)\*\*/g, '<strong>$1</strong>')
.replace(/\*(.*?)\*/g, '<em>$1</em>')
.replace(/`(.*?)`/g, '<code class="bg-gray-100 px-1 py-0.5 rounded text-sm font-mono">$1</code>')
.replace(/\n/g, '<br>')
.replace(/•/g, '•');
};
export default function ChatInterface() {
const [messages, setMessages] = useState<Message[]>([]);
const [inputMessage, setInputMessage] = useState('');
const [isLoading, setIsLoading] = useState(false);
const messagesEndRef = useRef<HTMLDivElement>(null);
useEffect(() => {
// Auto-scroll to bottom when new messages are added
messagesEndRef.current?.scrollIntoView({ behavior: 'smooth' });
}, [messages]);
const handleSendMessage = async () => {
if (!inputMessage.trim() || isLoading) return;
const userMessage: Message = {
id: Date.now().toString(),
content: inputMessage.trim(),
role: 'user',
timestamp: new Date(),
};
setMessages(prev => [...prev, userMessage]);
setInputMessage('');
setIsLoading(true);
try {
const response = await fetch('/api/chat', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({ message: inputMessage.trim() }),
});
if (!response.ok) {
throw new Error('Failed to get response');
}
const data = await response.json();
if (data.error) {
throw new Error(data.error);
}
const assistantMessage: Message = {
id: (Date.now() + 1).toString(),
content: data.response,
role: 'assistant',
timestamp: new Date(),
};
setMessages(prev => [...prev, assistantMessage]);
} catch (error) {
console.error('Error getting response:', error);
const errorMessage: Message = {
id: (Date.now() + 1).toString(),
content: 'Sorry, I encountered an error. Please try again.',
role: 'assistant',
timestamp: new Date(),
};
setMessages(prev => [...prev, errorMessage]);
} finally {
setIsLoading(false);
}
};
const handleKeyPress = (e: React.KeyboardEvent) => {
if (e.key === 'Enter' && !e.shiftKey) {
e.preventDefault();
handleSendMessage();
}
};
const formatTime = (date: Date) => {
return date.toLocaleTimeString([], { hour: '2-digit', minute: '2-digit' });
};
return (
<div className="flex flex-col h-screen bg-gray-50">
{/* Header */}
<div className="bg-white border-b border-gray-200 px-6 py-4">
<h1 className="text-2xl font-bold text-gray-900">Alith AI Assistant</h1>
<p className="text-sm text-gray-600">Powered by Alith SDK & ChatGPT • Blockchain Capabilities</p>
</div>
{/* Messages Container */}
<div className="flex-1 overflow-y-auto px-6 py-4 space-y-4">
{messages.length === 0 && (
<div className="text-center text-gray-500 mt-8">
<div className="text-6xl mb-4">🤖</div>
<h3 className="text-lg font-medium mb-2">Welcome to Alith AI!</h3>
<p className="text-sm mb-4">I can help you with general questions and blockchain operations.</p>
{/* Feature showcase */}
<div className="bg-white rounded-lg p-4 max-w-md mx-auto border border-gray-200">
<h4 className="font-medium text-gray-900 mb-2">Available Features:</h4>
<ul className="text-sm text-gray-600 space-y-1">
<li>• 💬 General AI conversations</li>
<li>• 🔍 Token balance checking on LazAI testnet</li>
<li>• 📊 Blockchain data queries</li>
</ul>
<div className="mt-3 p-2 bg-blue-50 rounded text-xs text-blue-700">
<strong>Example:</strong> "Check token balance for contract 0x1234... and wallet 0x5678..."
</div>
</div>
</div>
)}
{messages.map((message) => (
<div
key={message.id}
className={`flex ${message.role === 'user' ? 'justify-end' : 'justify-start'}`}
>
<div
className={`max-w-[70%] rounded-lg px-4 py-3 ${
message.role === 'user'
? 'bg-blue-600 text-white'
: 'bg-white text-gray-900 border border-gray-200'
}`}
>
<div
className={`text-sm whitespace-pre-wrap ${
message.role === 'assistant' ? 'prose prose-sm max-w-none' : ''
}`}
dangerouslySetInnerHTML={{
__html: message.role === 'assistant'
? renderMarkdown(message.content)
: message.content
}}
/>
<div
className={`text-xs mt-2 ${
message.role === 'user' ? 'text-blue-100' : 'text-gray-500'
}`}
>
{formatTime(message.timestamp)}
</div>
</div>
</div>
))}
{isLoading && (
<div className="flex justify-start">
<div className="bg-white text-gray-900 border border-gray-200 rounded-lg px-4 py-3">
<div className="flex items-center space-x-2">
<div className="flex space-x-1">
<div className="w-2 h-2 bg-gray-400 rounded-full animate-bounce"></div>
<div className="w-2 h-2 bg-gray-400 rounded-full animate-bounce" style={{ animationDelay: '0.1s' }}></div>
<div className="w-2 h-2 bg-gray-400 rounded-full animate-bounce" style={{ animationDelay: '0.2s' }}></div>
</div>
<span className="text-sm text-gray-600">Alith is thinking...</span>
</div>
</div>
</div>
)}
<div ref={messagesEndRef} />
</div>
{/* Input Container */}
<div className="bg-white border-t border-gray-200 px-6 py-4">
<div className="flex space-x-4">
<div className="flex-1">
<textarea
value={inputMessage}
onChange={(e) => setInputMessage(e.target.value)}
onKeyPress={handleKeyPress}
placeholder="Ask me anything or check token balances..."
className="w-full px-4 py-3 border border-gray-300 rounded-lg focus:ring-2 focus:ring-blue-500 focus:border-transparent resize-none"
rows={1}
disabled={isLoading}
/>
</div>
<button
onClick={handleSendMessage}
disabled={!inputMessage.trim() || isLoading}
className="px-6 py-3 bg-blue-600 text-white rounded-lg hover:bg-blue-700 focus:ring-2 focus:ring-blue-500 focus:ring-offset-2 disabled:opacity-50 disabled:cursor-not-allowed transition-colors"
>
<svg
className="w-5 h-5"
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
>
<path
strokeLinecap="round"
strokeLinejoin="round"
strokeWidth={2}
d="M12 19l9 2-9-18-9 18 9-2zm0 0v-8"
/>
</svg>
</button>
</div>
{/* Quick help */}
<div className="mt-2 text-xs text-gray-500">
💡 <strong>Tip:</strong> Include both contract and wallet addresses to check token balances on LazAI testnet
</div>
</div>
</div>
);
}
# OpenAI API Key for AI conversations
OPENAI_API_KEY=your_openai_api_key_here
npm run dev
This guide will walk you through everything you need to get started with Alith. Whether you’re building intelligent agents, integrating tools, or experimenting with language models, Alith provides a seamless experience across multiple programming languages. Below, you’ll find installation instructions, quick start guides, and examples for Rust, Python, and Node.js.
npm install alith
# Or use pnpm
pnpm add alith
# Or use yarn
yarn add alith
npm i --save-dev @types/json-schema
# Or use pnpm
pnpm add --save-dev @types/json-schema
# Or use yarn
yarn add --dev @types/json-schema
To configure different AI model providers, we use the OpenAI model as an example here.
Unix
Windows
import { Agent } from "alith";
const agent = new Agent({
model: "gpt-4",
preamble: "You are a comedian here to entertain the user using humour and jokes.",
});
console.log(await agent.prompt("Entertain me!"));
export OPENAI_API_KEY=<your API key>
$env:OPENAI_API_KEY = "<your API key>"
tsc index.ts && node index.js