Author: Darshan Gandhi, Founder of FutureX Labs; Translation: Jinse Finance xiaozou
Blockchain has developed significantly and has become a core technology in fields such as financial services and supply chain management. By improving the security, transparency, and efficiency of transactions, and by enabling dApps and smart contracts, blockchain addresses fundamental needs such as trust, fraud prevention, and data immutability.
Despite these advances, many Layer 1 blockchains still face scaling challenges. High demand often leads to:
Network congestion
Slower transaction times
Increased costs
Layer 2 solutions and sharding aim to address these issues. However, as usage grows, the need for faster and more efficient transaction processing becomes ever more critical.
This is where a "coprocessor" comes in.
A coprocessor is a specialized unit that can handle specific tasks more efficiently than a general-purpose CPU. Coprocessors offer a promising solution by working alongside the main processor (in this article, the L1 or L2 chain) to perform demanding tasks such as cryptographic operations and complex calculations. This “offloads” work from the main chain, improving overall performance and throughput.
In this article, we will take a deep dive into the coprocessor ecosystem to understand what coprocessors are, why we need them, what use cases they enable, who is building them, and where they are headed.
The main focus of this article will be on zero-knowledge (ZK) coprocessors, as they are the most advanced coprocessors currently available. Let's get started!
1 What is a coprocessor?
A coprocessor is a piece of hardware that is specifically designed to work with the main CPU to handle specific tasks, improving efficiency and performance.
The concept of the coprocessor originated in computer architecture as a way to improve the performance of general-purpose computers. Initially, computers relied entirely on the CPU, but as workloads grew more complex, a single CPU could no longer keep up. To solve this, coprocessors such as GPUs were introduced to handle specific tasks, including:
Graphics rendering
Encryption
Signal processing
Scientific computing
Examples of coprocessors include GPUs for graphics rendering, cryptographic accelerators, and math coprocessors. By dividing tasks between the CPU and these specialized units, computers have achieved significant performance improvements, enabling them to efficiently handle more complex workloads.
In the context of blockchain, coprocessors help manage complex tasks off-chain, ensuring transparency and trust through verifiable computations. They leverage technologies such as zk-SNARKs, MPC (multi-party computation), and TEE (trusted execution environment) to enhance security and scalability.
2 Why do we need coprocessors?
Coprocessors offer several benefits, especially for chains like Ethereum that face scalability issues. Benefits include:
Enhanced scalability
Gas-free transactions
Multi-chain support
To better understand this, let's look at a metaphor:
In web3, the blockchain can be compared to the CPU in web2, while the coprocessor plays the role of the GPU, handling large amounts of data and complex computational logic.
3 Use cases and the problems they help solve
An important problem with blockchains is the high cost of on-chain computation. Although archive nodes store historical data, it is expensive and complicated for smart contracts to access it. For example, the EVM can easily read data from recent blocks, but accessing older data is difficult.
Blockchain virtual machines are designed to execute smart contract code securely, not to process big data or computationally heavy tasks. Off-chain computation or scaling technologies are therefore necessary.
Coprocessors provide solutions to these challenges by leveraging ZK technology to enhance scalability. The details are as follows:
Efficient large-scale computing: ZK coprocessors handle large-scale computing while maintaining blockchain security.
Historical data access: They allow smart contracts to use zero-knowledge proofs to access historical data and perform off-chain computation, and then bring the results back on-chain.
Optimized scalability and efficiency: This separation improves scalability and efficiency without compromising security.
By adopting this new design, coprocessors can help applications access more data and run at a larger scale without paying high gas fees.
So, how do these services work? Here is a very good infographic to give you a better understanding.
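To make this flow concrete, here is a minimal Python sketch of the general pattern, with hypothetical names throughout: an application asks for a heavy computation over historical data, a coprocessor runs it off-chain and returns the result together with a proof artifact, and the chain only checks that artifact instead of redoing the work. For readability, a simple hash commitment stands in for the proof here; a real ZK coprocessor would attach a succinct validity proof that the verifier can check without re-executing the computation.

```python
import hashlib
import json
from dataclasses import dataclass

@dataclass
class ProvenResult:
    """An off-chain computation result plus the artifact the chain checks.

    In a real ZK coprocessor the artifact is a succinct validity proof;
    here a hash commitment over (query, result) merely stands in for it.
    """
    query: dict
    result: float
    proof: str

def run_coprocessor(query: dict, historical_data: list[float]) -> ProvenResult:
    """Off-chain: perform the heavy computation and attach the artifact."""
    # A computation that would be too expensive to run on-chain,
    # e.g. an average over a large range of historical values.
    result = sum(historical_data) / len(historical_data)
    payload = json.dumps({"query": query, "result": result}, sort_keys=True)
    return ProvenResult(query, result, hashlib.sha256(payload.encode()).hexdigest())

def on_chain_verify(submitted: ProvenResult) -> bool:
    """On-chain (sketch): accept the result only if the artifact checks out."""
    payload = json.dumps(
        {"query": submitted.query, "result": submitted.result}, sort_keys=True
    )
    return hashlib.sha256(payload.encode()).hexdigest() == submitted.proof

# Usage: a dApp asks for an average balance over many historical blocks.
data = [100.0, 250.0, 175.0, 300.0]   # stand-in for data read from old blocks
res = run_coprocessor({"metric": "avg_balance"}, data)
assert on_chain_verify(res)           # the contract accepts the proven result
print(res.result)                     # 206.25
```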
4 Coprocessor types and how they compare to rollups
When comparing coprocessors to other technologies, it is important to take into account the security model and assurance level required for the computation.
ZK coprocessors are ideal for sensitive computations that require maximum security and minimal trust. They use zero-knowledge proofs to ensure verifiable results without relying on an operator. However, this comes at the expense of efficiency and flexibility.
For less sensitive tasks such as analytics or risk modeling, MPC and trusted hardware are more efficient options. While these approaches offer fewer security guarantees, they enable a wider range of computations.
FHE-based coprocessors, such as those developed by Fhenix in collaboration with EigenLayer, offer significant improvements in confidential computing. These coprocessors maintain data confidentiality while offloading computational tasks.
Which of these technologies to choose depends on the risk tolerance and specific needs of the application.
Another comparison is often made between coprocessors and rollups:
Rollups focus on increasing transaction throughput and reducing fees by batching transactions and settling their state back to the main chain. This makes them better suited for high-frequency trading.
Coprocessors, on the other hand, handle complex logic and larger data volumes independently. They are ideal for advanced financial models and big data analytics across multiple blockchains and rollups.
5 Use cases and applications
Coprocessors are highly modular in nature and can be used for a wide variety of applications. Let’s explore some interesting use cases that are currently being built or can be built:
Coprocessors can handle heavy computations for DeFi projects, supporting sophisticated financial models and strategies that adapt in real time. By offloading these computations from the main chain, they ensure efficiency and scalability, which is critical for optimizing trading strategies and for high-frequency trading.
Coprocessors can offload complex functions from the EVM, enabling richer game mechanics and state updates. They can support advanced game logic and AI-driven features, creating more immersive and engaging gameplay than Web2 games.
Coprocessors provide transparent and verifiable margin logic for decentralized trading, enhancing the reliability of derivatives platforms. They ensure privacy and trust while supporting complex trading strategies and risk management practices.
Coprocessors can provide data capture, calculation, and verification services, enabling smart contracts to process large amounts of historical data. This helps achieve more advanced business logic and operational efficiency, enhancing the reliability of smart contracts.
Coprocessors can offload heavy computations to reduce gas fees for DAO operations, simplifying governance and decision-making. This improves the efficiency and transparency of DAO operations, supporting community-driven projects.
Coprocessors can support on-chain machine learning applications with verifiable off-chain computations, using historical data for security and risk management. This integration opens up new possibilities for advanced analytics and intelligent decision-making for blockchain applications.
Coprocessors can acquire off-chain data and create verifiable proofs for smart contracts, maintaining user privacy while ensuring compliance. This makes the KYC process in Web3 more secure, private, and efficient.
Coprocessors can use zero-knowledge proofs to verify digital identities and historical behaviors without revealing wallet addresses. This enhances the privacy and trustworthiness of social and identity verification applications, enabling secure proof of qualifications and activities.
Given the flexibility coprocessors provide, the possible applications are almost endless. The above are just a few exciting examples of what teams have already started building.
6 Who is creating coprocessors?
So, the next question is: who are the teams that are actually creating these coprocessors?
Axiom
Axiom is a ZK coprocessor for Ethereum that gives smart contracts secure, verifiable access to all historical on-chain data. It uses zero-knowledge proofs to read data from block headers, states, transactions, and receipts, and to perform computations over that data, such as analytics and machine learning.
By generating a ZK validity proof for each result, Axiom ensures the correctness of both the data access and the computation; these proofs are then verified on-chain. This trustless verification process paves the way for more reliable dApp development.
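For illustration only, here is a rough Python sketch of the query-and-callback pattern described above. The names and fields are hypothetical and are not Axiom's actual API: the application specifies which historical data to read and which computation to run over it, and a callback consumes the result only once its validity proof has been verified on-chain.

```python
from dataclasses import dataclass

@dataclass
class HistoricalQuery:
    """What to read from history and what to compute over it (illustrative)."""
    account: str
    start_block: int
    end_block: int
    computation: str        # e.g. "average_balance"

@dataclass
class QueryResult:
    query: HistoricalQuery
    value: int
    validity_proof: bytes   # generated by the prover, checked on-chain

def on_query_fulfilled(result: QueryResult, proof_is_valid: bool) -> None:
    """Callback: runs only after the on-chain verifier accepts the proof."""
    if not proof_is_valid:
        raise ValueError("rejected: invalid validity proof")
    # The dApp can now use the historical result as if it were on-chain state,
    # e.g. grant a fee discount based on an account's long-term activity.
    print(f"{result.query.computation} for {result.query.account}: {result.value}")

# Usage sketch with placeholder values
q = HistoricalQuery(account="0x1234", start_block=15_000_000,
                    end_block=19_000_000, computation="average_balance")
on_query_fulfilled(QueryResult(q, value=42, validity_proof=b"\x00" * 32),
                   proof_is_valid=True)
```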
RISC Zero
RISC Zero focuses on verifiable execution of off-chain computation for blockchain applications. Developers write programs in Rust and deploy them on the network, with zero-knowledge proofs guaranteeing the correctness of each program execution.
Its components include the zkVM and Bonsai. The zkVM proves execution of programs compiled for the RISC-V instruction set architecture, and Bonsai builds on it as a proving service that delivers high-performance proofs for general use cases.
Brevis
Brevis is a ZK coprocessor that enables decentralized applications to access and compute data across multiple blockchains in a trustless manner. Its architecture is as follows:
zkFabric is used to synchronize block headers
zkQueryNet is used to process data queries
zkAggregatorRollup is used to verify and submit proofs to the blockchain
Lagrange
Lagrange is an interoperable ZK coprocessor protocol that supports applications requiring large-scale data computation and cross-chain interoperability. Its core product, ZK Big Data, processes and verifies cross-chain data and generates ZK proofs through highly parallel coprocessors.
Lagrange includes a verifiable database, dynamic updates, and SQL query capabilities for smart contracts. The protocol supports complex cross-chain applications and integrates with platforms such as EigenLayer, Mantle, and LayerZero.
7 AI coprocessors
Coprocessors enhance applications in the crypto AI space by offloading complex computations and ensuring efficiency, security, and scalability for a variety of tasks such as DeFi management, personalized assistants, and secure data processing. Here are some noteworthy projects using coprocessors for different use cases and technologies:
Phala Network
Phala Network integrates blockchain with a trusted execution environment (TEE) to enable secure AI interactions. Their Phat Contracts offload complex computations to Phala’s network through coprocessors, which is critical for AI-driven DeFi management tasks such as portfolio management and yield management.
Phala’s cross-chain interoperability enables AI agents to perform cross-chain transactions and privacy-preserving computations while protecting sensitive data.
Ritual Network
Ritual is developing the first community-owned sovereign AI network using Infernet, a decentralized oracle network (DON) that allows smart contracts to access AI models.
Ritual Network’s strategic partnerships highlight its modular nature:
EigenLayer: Enhances economic security and protects against potential threats using a re-staking mechanism.
Celestia: Provides access to the Celestia scalable data availability layer, improving data management efficiency and overall scalability.
Modulus Labs
Modulus Labs is focused on bringing complex machine learning algorithms directly on-chain using ZK coprocessors. Their projects showcase a variety of possible applications:
Rockybot: An on-chain AI trading bot that leverages coprocessors for high-frequency trading operations.
Leela vs The World: An interactive AI game that uses a coprocessor to handle in-game action tracking.
zkMon: Using zero-knowledge proofs to verify AI-generated art.
Giza
Giza is a platform designed to simplify the creation, management, and hosting of verifiable machine learning models using zero-knowledge (ZK) proofs. It allows developers to convert any ML (machine learning) model into a verifiable model, ensuring tamper-proof proofs of ML execution.
Giza provides AI engineers with a dashboard to easily monitor, schedule, and deploy AI operations, and it integrates seamlessly with different cloud providers and ML libraries. The platform also supports protocol integration through EVM verifiers, improving efficiency, promoting revenue growth, and driving the adoption of decentralized applications.
EZKL
EZKL integrates zk-SNARKs with deep learning models and computational graphs, using familiar libraries such as PyTorch or TensorFlow. It allows developers to export these models as ONNX files and generate zk-SNARK circuits, ensuring privacy and security by proving statements about computations without revealing the underlying data.
These proofs can be verified on-chain, in a browser, or on a device. EZKL supports a variety of applications, including financial models, games, and data proofs, and provides Python, JavaScript, and command-line interface tools to simplify off-chain computations while maintaining security.
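As a rough sketch of the first step in that pipeline, the snippet below defines a tiny PyTorch model and exports it to ONNX, the format EZKL consumes. The model and file names are arbitrary assumptions; the subsequent circuit-generation and proving steps, which are done with EZKL's own tooling, are only indicated in comments.

```python
import torch
import torch.nn as nn

class TinyModel(nn.Module):
    """A toy model standing in for the financial or gaming model to be proven."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))

    def forward(self, x):
        return self.net(x)

model = TinyModel().eval()
dummy_input = torch.randn(1, 4)

# Export the computational graph to ONNX, the input format EZKL works from.
torch.onnx.export(model, dummy_input, "model.onnx",
                  input_names=["input"], output_names=["output"])

# From here, EZKL's tooling takes model.onnx, compiles a zk-SNARK circuit for it,
# and produces proofs of inference that can be verified on-chain, in a browser,
# or on a device (see the EZKL documentation for the exact commands).
```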
8 The future of coprocessors
In general, coprocessors are critical to the blockchain ecosystem. I think of them as "steroids" that make blockchains faster and more secure.
Coprocessors will be critical for a wide range of applications, including:
Develop trustless and censorship-resistant AI applications
Perform verifiable analysis of large data sets
Increase the reliability and transparency of AI-driven applications in crypto
Allow smart contracts to access more data and off-chain computing resources at a lower cost without compromising decentralization.
Potential applications for coprocessors could revolutionize areas such as decentralized finance (DeFi), where they could help maintain the competitiveness of platforms such as Sushiswap and Uniswap.
However, like any technology, coprocessors come with their own set of challenges, such as development complexity and high hardware costs.
Despite these challenges, some teams are working hard to address them. For example, the collaboration between Fhenix and EigenLayer reflects ongoing efforts to enhance confidential computation and accelerate the development of private on-chain transactions. Such collaborations are critical to overcoming existing barriers and unleashing the full potential of coprocessors in this space.
9 Conclusion
The coprocessor ecosystem is growing rapidly, with a variety of projects contributing both general-purpose solutions and specialized applications; Phala and Ritual, for example, have tailored their solutions to the AI field.
As this technology continues to develop, we expect new use cases and innovative applications to emerge. The future of coprocessors looks bright, and we are excited to witness the evolution of this space.