LAZAI Workflow & Runtime
Overview

E2E Process

In LazAI, workflows can be divided into a data setup process, a data usage process, and an inference process according to role, corresponding to users who contribute data to obtain tokens and users who exchange tokens for data or models.

Data Setup Process

Data Usage & Inference Process

LazChain nodes consist not only of regular execution nodes and validation nodes, but also AI nodes (the AI Execution Extension), which handle routine AI tasks such as model pulling and deployment, data processing, and on-chain inference.
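The node-role split above can be sketched as a simple task router. The task names and routing rule here are illustrative assumptions, not LazChain's actual scheduler:

```python
# Illustrative routing of tasks to node classes. Regular transactions go to
# execution/validation nodes; AI tasks are dispatched to AI nodes (the AI
# Execution Extension). Task-type strings are assumptions for this sketch.

AI_TASKS = {"model_pull", "model_deploy", "data_processing", "onchain_inference"}

def dispatch(task_type: str) -> str:
    """Return which node class should run a given task type."""
    if task_type in AI_TASKS:
        return "ai_node"           # AI Execution Extension
    if task_type == "validate_block":
        return "validation_node"   # block/state validation
    return "execution_node"        # default: regular transaction execution
```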

Users provide a POV and send the data to the blockchain as transactions. A POV is usually stored as tensors, which can be transmitted and stored in blocks when necessary. Users can submit publicly available tensor data, or a ZK proof of the data, to the blockchain, along with the model ID required for subsequent inference tasks. Meanwhile, LazAI provides the Alith framework to assist users with data processing, format conversion, compression, and interaction with LazChain.
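As a rough illustration of the submission step, a POV payload might carry either public tensor data or a ZK proof, plus the model ID for later inference. All field names below (`tensor_data`, `zk_proof`, `model_id`) are assumptions for this sketch, not LazChain's actual transaction format:

```python
# Minimal sketch of a POV submission payload. Field names are illustrative
# assumptions; the real POV data structure is defined by LazAI.
import hashlib
import json
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class POVSubmission:
    contributor: str                            # submitting account address
    model_id: str                               # model requested for inference
    tensor_data: Optional[List[float]] = None   # public tensor payload, or
    zk_proof: Optional[bytes] = None            # a ZK proof of the data

    def fingerprint(self) -> str:
        """Hash the payload so the chain can verify it by fingerprint."""
        body = json.dumps({
            "contributor": self.contributor,
            "model_id": self.model_id,
            "tensor_data": self.tensor_data,
            "zk_proof": self.zk_proof.hex() if self.zk_proof else None,
        }, sort_keys=True).encode()
        return hashlib.sha256(body).hexdigest()

pov = POVSubmission(contributor="0xabc", model_id="example-model",
                    tensor_data=[0.12, 0.57, 0.99])
tx = {"to": "LazChain", "payload": pov.fingerprint()}
```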

When LazChain receives a POV, it executes the corresponding contract code, which requests the DAO contract to perform data fingerprint verification. Once verification passes, the DAO contract can pull a remote or local model and establish an inference service on an AI node. The basic capabilities required by the inference service, along with coprocessor acceleration, are provided through AI precompile contracts.
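The verification-then-deployment flow can be sketched as follows. The contract and node interfaces here are illustrative stand-ins, not LazAI's actual APIs:

```python
# Sketch of the flow above: the chain asks a DAO contract to verify the POV
# fingerprint, then the DAO contract has an AI node deploy the model for
# inference. All class and method names are assumptions for this sketch.

class DAOContract:
    def __init__(self, known_fingerprints):
        self.known_fingerprints = set(known_fingerprints)

    def verify_fingerprint(self, fp: str) -> bool:
        """Data fingerprint verification step."""
        return fp in self.known_fingerprints

class AINode:
    def __init__(self):
        self.services = {}

    def deploy_inference_service(self, model_id: str) -> str:
        # In practice this would pull a remote or local model; the service's
        # basic capabilities and coprocessor acceleration would come from
        # AI precompile contracts.
        self.services[model_id] = "ready"
        return f"service://{model_id}"

def handle_pov(fp: str, model_id: str, dao: DAOContract, node: AINode) -> str:
    if not dao.verify_fingerprint(fp):
        raise ValueError("fingerprint verification failed")
    return node.deploy_inference_service(model_id)
```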

After submitting POV data, a user can send a data or inference request to LazChain to trigger data verification. Once verification completes, the data contribution is recorded and the corresponding DAT tokens are issued. In addition, the LazAI network settles based on data and computing resource usage, with a dynamic fee strategy that reflects community and market conditions. For details, please refer to the settlement section.
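The reward-and-settlement step might look like the sketch below: a verified contribution credits DAT, and usage is settled against a fee rate that a dynamic strategy could adjust. The reward and fee numbers, and all names, are assumptions for illustration only; the authoritative rules live in the settlement section:

```python
# Illustrative ledger for the reward/settlement step. Flat reward and fee
# rate are placeholder values, not LazAI's actual economics.

class Ledger:
    def __init__(self, reward_per_contribution: int = 10, fee_rate: float = 0.01):
        self.dat_balances = {}
        self.contributions = []
        self.reward = reward_per_contribution
        self.fee_rate = fee_rate  # a dynamic strategy could adjust this

    def record_contribution(self, user: str, fingerprint: str) -> None:
        """After verification, record the contribution and credit DAT."""
        self.contributions.append((user, fingerprint))
        self.dat_balances[user] = self.dat_balances.get(user, 0) + self.reward

    def settle_usage(self, user: str, compute_units: float) -> float:
        """Settle data/compute usage against the current fee rate."""
        fee = compute_units * self.fee_rate
        self.dat_balances[user] = self.dat_balances.get(user, 0) - fee
        return fee
```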
