## Overview
The Data Anchoring Token (DAT) is the foundational standard of the LazAI and Alith ecosystems.
It enables contributors to share privacy-sensitive datasets, AI models, or computation results while retaining full ownership, control, and economic rights.
DATs act as on-chain certificates of contribution, tying data provenance, access permissions, and value distribution directly to verifiable blockchain records.
## Key Capabilities
Each DAT encodes three primary dimensions of data ownership and utility:
| Capability | Description |
| --- | --- |
| Ownership Certificate | Records verifiable proof of contribution or authorship for datasets, models, or computation results. |
| Usage Rights | Defines how and where the data can be accessed, for example by AI services, model training, or agent execution. |
| Value Share | Assigns proportional economic rewards to contributors based on usage, staking, or licensing activity. |
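The three dimensions above can be sketched as a single record. The `DATRecord` class and every field name below are hypothetical illustrations, not the actual LazAI data model:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DATRecord:
    """Illustrative view of the three dimensions a DAT encodes (names are hypothetical)."""
    contributor: str          # Ownership Certificate: who contributed the asset
    asset_hash: str           # content hash anchoring the dataset, model, or result
    usage_rights: frozenset   # Usage Rights: the approved kinds of access
    value_share_bps: int      # Value Share: reward share, in basis points

    def permits(self, action: str) -> bool:
        """Check whether a requested use falls within the granted rights."""
        return action in self.usage_rights

dat = DATRecord(
    contributor="0xA1ice",            # placeholder address
    asset_hash="sha256:placeholder",  # placeholder content hash
    usage_rights=frozenset({"inference", "training"}),
    value_share_bps=250,              # 2.5% of downstream value
)
print(dat.permits("training"))  # True
print(dat.permits("resale"))    # False
```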
## Why AI Needs a New Token Standard
AI data is dynamic, composable, and frequently reused across models and tasks — properties that traditional token standards like ERC-20 and ERC-721 don’t fully support.
DAT introduces a semi-fungible token (SFT) model designed for modularity, traceability, and partial ownership of AI assets.
### Comparison Summary
| Token Type | Description | Limitation |
| --- | --- | --- |
| ERC-20 (Fungible) | Fully interchangeable tokens, ideal for currency or credits. | Cannot represent unique datasets or ownership records. |
| ERC-721 (Non-Fungible) | Unique tokens for singular assets (e.g., one-of-a-kind NFTs). | Lacks divisibility and modularity for AI workloads. |
| DAT (Semi-Fungible) | Hybrid model combining ERC-20 and ERC-721 traits: divisible, composable, and traceable. | Tailored for data provenance and AI-specific workflows. |
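The semi-fungible idea can be illustrated with a toy ledger in the style of ERC-1155: each token id identifies a unique asset class, while balances within an id are divisible. The class and method names below are assumptions for illustration, not LazAI's contract interface:

```python
from collections import defaultdict

class SemiFungibleLedger:
    """Toy ERC-1155-style accounting: unique token ids (an ERC-721 trait)
    with divisible balances per id (an ERC-20 trait)."""

    def __init__(self):
        # (token_id, owner) -> units held
        self.balances = defaultdict(int)

    def mint(self, token_id: str, owner: str, units: int) -> None:
        self.balances[(token_id, owner)] += units

    def transfer(self, token_id: str, src: str, dst: str, units: int) -> None:
        if self.balances[(token_id, src)] < units:
            raise ValueError("insufficient balance")
        self.balances[(token_id, src)] -= units
        self.balances[(token_id, dst)] += units

ledger = SemiFungibleLedger()
ledger.mint("dataset-1", "alice", 100)            # one unique dataset id...
ledger.transfer("dataset-1", "alice", "bob", 40)  # ...with divisible shares
print(ledger.balances[("dataset-1", "alice")])  # 60
print(ledger.balances[("dataset-1", "bob")])    # 40
```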
## How DAT Works
1. Data Contribution: A user encrypts and uploads a dataset or model output through LazAI's privacy framework.
2. Metadata Anchoring: A smart contract logs encrypted metadata, provenance proofs, and ownership claims on-chain.
3. Verification: Validators or trusted execution environments (TEEs) confirm authenticity and compliance.
4. Tokenization: A DAT is minted as a semi-fungible token representing the data's rights, access rules, and value distribution.
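The four steps above can be sketched end to end. The XOR "encryption", the in-memory `registry`, and the function name are stand-ins for illustration; LazAI's actual privacy framework and contracts are not shown:

```python
import hashlib
import json

def anchor_and_mint(raw_data: bytes, contributor: str, registry: dict) -> str:
    # 1. Data Contribution: toy XOR stand-in for real encryption
    ciphertext = bytes(b ^ 0x5A for b in raw_data)
    # 2. Metadata Anchoring: record a provenance proof and ownership claim
    proof = hashlib.sha256(ciphertext).hexdigest()
    metadata = {"contributor": contributor, "proof": proof}
    # 3. Verification: stand-in for validator / TEE attestation of the proof
    assert hashlib.sha256(ciphertext).hexdigest() == proof
    # 4. Tokenization: derive a token id from the anchored metadata
    token_id = hashlib.sha256(
        json.dumps(metadata, sort_keys=True).encode()
    ).hexdigest()[:16]
    registry[token_id] = metadata
    return token_id

registry: dict = {}
token_id = anchor_and_mint(b"sample dataset bytes", "0xA1ice", registry)
print(token_id in registry)  # True
```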
## Technical Highlights
- Standard: Semi-Fungible Token (SFT)
- Purpose: Tokenize AI datasets, models, and computation outputs
- Blockchain Layer: LazAI Testnet (EVM-compatible)
- Supports: On-chain provenance, privacy-preserving validation, and composable ownership logic
## Benefits
| Benefit | Description |
| --- | --- |
| Verifiable Provenance | Every dataset or model is cryptographically tied to its origin and contributor. |
| Data Monetization | Contributors can receive automatic rewards or royalties for approved AI usage. |
| Privacy by Design | Encryption and TEE validation ensure that raw data remains confidential. |
| Composable Ownership | DATs can be merged, split, or referenced across multiple models or applications. |
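Splitting and merging can be illustrated on the value-share dimension alone. `split_share` and `merge_shares` are hypothetical helpers; the rounding remainder is assigned to the last part so the total share is conserved:

```python
def split_share(share_bps: int, fractions: list) -> list:
    """Split a DAT's value share (in basis points) into sub-shares."""
    parts = [int(share_bps * f) for f in fractions[:-1]]
    parts.append(share_bps - sum(parts))  # remainder keeps the total exact
    return parts

def merge_shares(parts: list) -> int:
    """Merge sub-shares back into a single holding."""
    return sum(parts)

shares = split_share(1000, [0.5, 0.3, 0.2])  # a 10% share split three ways
print(merge_shares(shares))  # 1000
```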
## Related Concepts
DAT Architecture →
Lifecycle & Value Semantics →
Contribute Your Data →