
FAQs


Last updated 1 month ago

The way data is modeled, stored, and processed directly affects its scalability and feasibility in certain applications. How does LazAI address this?

In LazAI, we model data in the POV format, which is not only a data format but also a data specification and protocol. The Alith Agent framework helps users unify the formats and sizes of different data and convert them into POV format. Public data is stored on-chain in a unified tensor form and limited to 32K; private data has a proof generated through the toolchain, and that proof is stored on-chain. LazAI uses verified computing to make data available but invisible.
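The split described above can be illustrated with a minimal sketch. All names here (`to_pov_record`, the record fields) are hypothetical stand-ins, not the actual POV specification; the hash is only a placeholder for the toolchain's real cryptographic proof.

```python
import hashlib
import json

PUBLIC_DATA_LIMIT = 32 * 1024  # 32K on-chain cap for public data, per the docs


def to_pov_record(data: bytes, private: bool) -> dict:
    """Package raw bytes into a hypothetical POV-style record.

    Public data goes on-chain directly (capped at 32K); for private data,
    only a commitment is recorded, so the data stays "available but invisible".
    """
    if private:
        # The raw data never leaves the owner; only a proof is anchored.
        return {"kind": "private", "proof": hashlib.sha256(data).hexdigest()}
    if len(data) > PUBLIC_DATA_LIMIT:
        raise ValueError("public POV payload exceeds the 32K on-chain limit")
    return {"kind": "public", "payload": data.hex()}


record = to_pov_record(b"example dataset bytes", private=True)
print(json.dumps(record))
```

The key design point is that the branch taken determines what is published: a payload for public data, or only a commitment for private data.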

If an individual user wants to join LazAI to monetize their data, what is the general process?

You can learn more about this in the E2E Process section.

Where is iDAO data stored? If it is personal off-chain private data, which oracle does LazAI use?

LazAI mainly supports putting private data on-chain. In theory, it can support data of any size by computing its credentials on-chain. Public data is limited to 32K, which is consistent with both the chain's storage requirements and the context window of typical AI models.

Through the Alith Agent toolkit, users can perform data preprocessing and proof computation locally, then upload to the LazAI network, making the data available but invisible.
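The local preprocess-then-prove flow can be sketched as below. The function names are illustrative, not the Alith toolkit's actual API, and the hash stands in for the real proof computation.

```python
import hashlib


def preprocess(raw: bytes) -> bytes:
    # Hypothetical local preprocessing step (normalization, cleanup, ...).
    return raw.strip()


def compute_proof(data: bytes) -> str:
    # Stand-in for the toolchain's cryptographic proof computation.
    return hashlib.sha256(data).hexdigest()


def upload_to_lazai(proof: str) -> dict:
    # Hypothetical upload: only the proof leaves the user's machine,
    # which is what keeps the underlying data invisible on-chain.
    return {"status": "anchored", "proof": proof}


raw = b"  sensitive user data  "
receipt = upload_to_lazai(compute_proof(preprocess(raw)))
```

Note that preprocessing happens before proving, so two differently formatted copies of the same content normalize to the same proof.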

How is the LazAI cross-chain designed?

LazAI's cross-chain functionality is divided into three scenarios:

  • Heterogeneous Chain Cross-Chain: For public chains not built with the Metis SDK, cross-chain interactions with LazAI are completed using third-party bridges. If data-layer cross-chain operations are involved, they are still facilitated through an oracle.

  • Homogeneous Chain Cross-Chain: For public chains also built with the Metis SDK, we plan to support a Native Shared Bridge in the future to enable fast sharing of assets and data. However, this may be scheduled for Q4 or even next year.

  • Third-Party Data Sources such as IPFS and Arweave: For other third-party data sources, we currently use the same approach as for heterogeneous chains, i.e., oracles and third-party bridges. In the future, we will support native cross-chain methods.

What kind of token is DAT? Is it ERC6551 or ERC721?

Data Anchoring Token (DAT) is a semi-fungible token (SFT) standard designed for AI dataset anchoring, licensing, and AI model provenance tracking. The protocol introduces a multi-layered ownership model, ensuring composability, structured access control, and verifiable AI asset usage in a decentralized ecosystem.
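Semi-fungible semantics can be sketched as ERC-1155-style balances: each token id anchors one dataset, and units within an id are fungible (e.g. license shares), while distinct ids are not interchangeable. This toy ledger is purely illustrative and is not the actual DAT contract interface.

```python
from collections import defaultdict


class DatLedger:
    """Toy sketch of semi-fungible DAT balances.

    Keys are (owner, token_id); amounts within one token_id are fungible,
    while different token_ids anchor different datasets.
    """

    def __init__(self):
        self.balances = defaultdict(int)

    def mint(self, owner: str, token_id: int, amount: int) -> None:
        self.balances[(owner, token_id)] += amount

    def transfer(self, frm: str, to: str, token_id: int, amount: int) -> None:
        if self.balances[(frm, token_id)] < amount:
            raise ValueError("insufficient balance")
        self.balances[(frm, token_id)] -= amount
        self.balances[(to, token_id)] += amount


ledger = DatLedger()
ledger.mint("alice", 1, 100)            # alice anchors dataset #1 with 100 shares
ledger.transfer("alice", "bob", 1, 30)  # bob acquires 30 license shares
```

This is what distinguishes an SFT from a plain ERC-721 NFT (one indivisible token per id) and from a plain ERC-20 (a single fungible pool with no per-dataset ids).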

You can find more information about DAT in the following section.
