E2E Process
Last updated
Overview
In LazAI, workflows can be divided by role into a data setup process, a data usage process, and an inference process, corresponding to users who contribute data to earn tokens and users who exchange tokens for data or models.
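The role-to-workflow split above can be sketched as a simple mapping. This is an illustrative model only; the workflow names and the `workflow_for` helper are assumptions, not part of the LazAI API.

```python
from enum import Enum

class Workflow(Enum):
    DATA_SETUP = "data_setup"   # contributors prepare and submit data
    DATA_USAGE = "data_usage"   # holders exchange tokens for data or models
    INFERENCE = "inference"     # on-chain inference requests

# Hypothetical mapping from a user's goal to the workflow they enter.
def workflow_for(goal: str) -> Workflow:
    return {
        "contribute_data": Workflow.DATA_SETUP,
        "exchange_tokens": Workflow.DATA_USAGE,
        "run_inference": Workflow.INFERENCE,
    }[goal]
```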
LazChain nodes include not only regular execution and validation nodes but also AI nodes (AI Execution Extension) for routine AI tasks such as model pulling and deployment, data processing, and on-chain inference.
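A minimal sketch of this node taxonomy, assuming a flat `kind` field; the `Node` type and `NODE_KINDS` table are hypothetical and only restate the classification from the text.

```python
from dataclasses import dataclass

# Hypothetical node taxonomy following the text above.
NODE_KINDS = {
    "execution": "executes regular transactions",
    "validation": "validates blocks",
    "ai": "AI Execution Extension: model pull/deploy, data processing, on-chain inference",
}

@dataclass
class Node:
    node_id: str
    kind: str  # one of NODE_KINDS

    def is_ai_node(self) -> bool:
        # AI nodes are the ones that run routine AI tasks.
        return self.kind == "ai"
```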
Users provide POV and send the data to the blockchain as transactions. POV is usually stored as tensors and can be transmitted and stored in blocks when necessary. Users can submit publicly available tensor data, or a ZK proof of the data, to the blockchain, along with the model ID required for subsequent inference tasks. LazAI also provides the Alith framework to assist users with data processing, format conversion, compression, and interaction with LazChain.
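The submission step might look like the sketch below: serialize a tensor, fingerprint it, and wrap it in a transaction together with the requested model ID. The wire format, field names, and both helper functions are assumptions for illustration, not the actual LazChain transaction schema.

```python
import hashlib
import struct

def serialize_tensor(values, shape):
    """Pack a float tensor into bytes: a shape header followed by
    little-endian float64 values (a hypothetical encoding)."""
    header = struct.pack("<I", len(shape)) + struct.pack(f"<{len(shape)}I", *shape)
    body = struct.pack(f"<{len(values)}d", *values)
    return header + body

def build_pov_tx(tensor_bytes: bytes, model_id: str, sender: str) -> dict:
    """Build a hypothetical POV transaction carrying the data fingerprint
    and the model ID needed for later inference."""
    fingerprint = hashlib.sha256(tensor_bytes).hexdigest()
    return {
        "from": sender,
        "type": "pov_submission",
        "model_id": model_id,       # model requested for subsequent inference
        "fingerprint": fingerprint, # checked later by the DAO contract
        "payload": tensor_bytes.hex(),
    }
```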
When LazChain receives a POV, it executes the corresponding contract code, which asks the DAO contract to verify the data fingerprint. Once verification passes, the DAO contract can pull remote or local models to set up inference services on an AI node. The basic capabilities those services require, along with coprocessor acceleration, are provided through AI precompile contracts.
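This verify-then-deploy flow can be sketched as follows. `DAOContract`, `InferenceNode`, and `handle_pov` are invented names for illustration; the real precompile interface is not shown, and fingerprint checking is reduced to a SHA-256 comparison.

```python
import hashlib

class DAOContract:
    """Hypothetical sketch of the DAO contract's fingerprint verification."""
    def __init__(self, registered_fingerprints):
        self.registered = set(registered_fingerprints)

    def verify_fingerprint(self, payload: bytes, claimed: str) -> bool:
        # The payload must hash to the claimed fingerprint, and the
        # fingerprint must be registered on-chain.
        return (hashlib.sha256(payload).hexdigest() == claimed
                and claimed in self.registered)

class InferenceNode:
    """Hypothetical AI node that hosts inference services after verification.
    In LazChain the base capabilities come via AI precompile contracts;
    here we only record the deployment."""
    def __init__(self):
        self.services = {}

    def deploy(self, model_id: str, source: str = "remote") -> dict:
        self.services[model_id] = {"source": source, "status": "ready"}
        return self.services[model_id]

def handle_pov(dao, node, payload, fingerprint, model_id):
    """Verify the POV fingerprint, then spin up an inference service."""
    if not dao.verify_fingerprint(payload, fingerprint):
        raise ValueError("fingerprint verification failed")
    return node.deploy(model_id)
```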
After submitting POV data, a user can send a data or inference request to LazChain to trigger data verification. Once verification completes, the data contribution is recorded and the corresponding DAT tokens are issued. In addition, the LazAI network settles based on data and computing resource usage, with fees adjusted dynamically according to community and market conditions; for details, see the settlement section.
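A minimal sketch of usage-based settlement with a dynamic fee multiplier, under assumed parameters. The unit prices, the `demand_multiplier`, and the linear DAT reward are placeholders; the actual formulas are defined in the settlement section.

```python
def settle(data_units: float, compute_units: float,
           base_data_fee: float = 1.0, base_compute_fee: float = 2.0,
           demand_multiplier: float = 1.0) -> float:
    """Hypothetical settlement: charge per unit of data and compute used,
    scaled by a dynamic multiplier reflecting market conditions."""
    fee = (data_units * base_data_fee
           + compute_units * base_compute_fee) * demand_multiplier
    return round(fee, 6)

def reward_dat(contribution_score: float, rate: float = 10.0) -> float:
    """Hypothetical DAT issuance proportional to the verified contribution."""
    return contribution_score * rate
```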