Using Python

Best Practice: Use a Python Virtual Environment

To avoid dependency conflicts and keep your environment clean, create and activate a Python virtual environment before installing any packages:

python3 -m venv venv
source venv/bin/activate
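If you are unsure whether activation succeeded, a quick sanity check (a convenience sketch, not part of the official setup) is to compare the interpreter's prefix with its base prefix:

```python
import sys

def in_virtualenv() -> bool:
    # Inside a venv, sys.prefix points at the venv directory,
    # while sys.base_prefix still points at the system Python.
    return sys.prefix != sys.base_prefix

# Prints True when run from an activated virtual environment
print(in_virtualenv())
```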

Install Dependencies

pip install llama-cpp-python pymilvus "pymilvus[model]"

For local development, use the following installation command instead:

pip install llama-cpp-python pymilvus "pymilvus[milvus_lite]"

Install Alith

python3 -m pip install alith -U

Set Environment Variables

Note: The public address of the private key you expose to the query server is your LAZAI_IDAO_ADDRESS. Once the query server is running, its URL must be registered using the add_query_node function in Alith. Only LazAI admins can perform this registration.

For local development, ask the LazAI admins to register your wallet address and the URL "http://localhost:3000" via the add_query_node function in Alith.

For OpenAI/ChatGPT API:

export PRIVATE_KEY=<your wallet private key>
export OPENAI_API_KEY=<your openai api key>
export RSA_PRIVATE_KEY_BASE64=<your rsa private key>
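Before starting the query server, it can help to fail fast if any of these variables is unset. A minimal sketch (the variable names follow the exports above; the check itself is a convenience, not part of Alith):

```python
import os

REQUIRED_VARS = ["PRIVATE_KEY", "OPENAI_API_KEY", "RSA_PRIVATE_KEY_BASE64"]

def missing_env_vars(required=REQUIRED_VARS):
    # Return the names of required variables that are unset or empty.
    return [name for name in required if not os.environ.get(name)]

missing = missing_env_vars()
print("Missing environment variables:", missing or "none")
```

Running this before launching the server surfaces a clear message instead of an opaque authentication failure later.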

For other OpenAI-compatible APIs (DeepSeek, Gemini, etc.):

Step 1: Run the Query Server

Local Development

For OpenAI API or OpenAI-compatible APIs (DeepSeek, Gemini, etc.):

Production Deployment on Phala TEE Cloud

For production-ready applications, deploy your data query server on Phala TEE Cloud for enhanced security and privacy. Once deployed, you will receive a data query URL that must be registered by LazAI admins using the add_query_node function. Use this starter kit to create and push your Docker image: https://github.com/0xLazAI/LazAI-DATA-Query-Server-Setup-Kit

You can also use the existing data query nodes.

Step 2: Request Query via LazAI Client


Security & Privacy

  • Your data never leaves your control. Data query is performed in a privacy-preserving environment, using cryptographic settlement and secure computation.

  • Settlement headers ensure only authorized users and nodes can access your data for data query.

  • File ID links your data query request to the specific data you contributed, maintaining a verifiable chain of custody.
