A primer on the missing piece to achieving mainstream blockchain adoption, with supplemental case studies.
A world with hundreds of chains is inevitable. Over time, nearly every team and developer will want to own their economics and users. Even if this can be done on general-purpose execution environments like Solana, the application then depends on the throughput of those environments, which has historically proven unreliable at times. If we believe a paradigm shift toward blockchain technology is imminent, the logical conclusion is hundreds of execution environments specialized for the applications built on them. We can already see this playing out today, with applications like dYdX, Hyperliquid, Frax, and other nascent projects becoming standalone app-chains and rollups. Furthermore, Layer 2 scaling solutions will likely exist in tandem with Layer 1s, as a smaller set of nodes can communicate globally far faster than a larger set. This would allow L2s, like rollups, to scale virtually without limit while inheriting security from L1s and carrying a 1-of-N trust assumption (as opposed to the high quorums L1s need to reach consensus). In essence, we envision a future with hundreds of L1s and L2s.
However, even in the current state of only a few dozen L1s and L2s, concerns have already been voiced about substantial UX hurdles in this multi-chain present. A multi-chain future therefore has many problems to overcome: fragmented liquidity, and complexity for end users who must juggle multiple bridges, RPC endpoints, and different gas tokens and fee markets. To date, there has been no sufficient methodology for abstracting these UX complexities even in a world with a few L1s and L2s. One can only imagine how unusable blockchains will become for end users if the multi-chain ecosystem continues to grow without fixing these significant UX hurdles first.
The internet didn’t get to where it is by having its users understand its core protocols like HTTP, TCP/IP, UDP. Instead, it abstracted away technical nuances and allowed the layman to use it. Over time, the exact same will be true for blockchains and blockchain-native applications.
In crypto, users need to deploy liquidity across multiple L1s and L2s, settle for a suboptimal UX as on-chain liquidity sources fragment across those L1s and L2s, and understand the technical nuances of these systems. The time has come to abstract all of this away from the average user – as far as they’re concerned, they shouldn’t need to know they’re using blockchain rails, let alone how many L1s and L2s exist under the hood. This is the only way the industry gains mass adoption.
Chain abstraction is a means of abstracting away blockchain nuances and technical specifics for the average user, delivering a user experience so seamless that they don’t even know they’re using a blockchain. It can be argued that this breakthrough in UX may be the missing piece for onboarding the next generation of businesses and users to blockchains and crypto-native ecosystems.
Before going over some of the projects building out infrastructure crucial to achieving a chain abstracted future, it’s prudent to go over some of the technological components powering chain abstraction.
Today’s wallets face many limitations. Apart from various security vulnerabilities, they offer only limited functionality unless used in tandem with other contracts, i.e. by interacting with external smart contracts. What if we reimagined this scenario and transformed externally owned accounts (EOAs) into smart contract wallets (SCWs)? Unlike EOAs, SCWs can’t initiate transactions on their own; they require an EOA’s prompt. By merging the capabilities of both, we effectively turn EOAs into SCWs, empowering them not only to initiate transactions but also to execute complex, arbitrary logic – the premise of smart contracts.
This could unlock a plethora of use cases. In this context, we will specifically focus on how this relates to chain abstraction.
When you transform an EOA into a SCW, you effectively separate who signs a transaction from who executes it. Users no longer need to execute transactions directly; instead, sophisticated actors (called executors) do this on their behalf. It is important to note that the user never gives up custody of the wallet during this process, as the user retains their private key. Having an executor comes with other benefits, like not needing gas balances on every blockchain you want to use, as transaction and gas fees can now also be abstracted away. Additionally, users can have bundles of transactions executed at the click of a button – for example, approving a token for a DEX, swapping it, and then lending the proceeds into an Aave market.
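The signer/executor split described above can be sketched in a few lines. The snippet below is an illustrative, heavily simplified model of an ERC-4337-style flow – the types and function names are hypothetical, not a real library’s API – in which the user signs once and an executor submits the atomic batch and fronts the gas.

```python
from dataclasses import dataclass

# Illustrative, heavily simplified ERC-4337-style flow. All names here
# are hypothetical, not a real library's API.

@dataclass
class Call:
    target: str   # contract address
    method: str   # e.g. "approve", "swap", "deposit"
    args: tuple

@dataclass
class UserOperation:
    sender: str         # the smart contract wallet
    calls: list         # batched calls, executed atomically
    signature: str = ""

def sign(op: UserOperation, private_key: str) -> UserOperation:
    # The user's only job: sign once. Custody never leaves them.
    op.signature = f"signed:{op.sender}:{len(op.calls)}"
    return op

def execute(op: UserOperation, executor: str) -> dict:
    # The executor submits the bundle and pays the gas; the wallet's
    # logic only runs operations carrying a valid user signature.
    assert op.signature, "wallet rejects unsigned operations"
    return {"executor": executor,
            "executor_paid_gas": True,
            "executed": [c.method for c in op.calls]}

# One click: approve a token for a DEX, swap it, lend the proceeds.
op = sign(UserOperation(
    sender="0xSmartWallet",
    calls=[
        Call("0xUSDC", "approve", ("0xDEX", 1000)),
        Call("0xDEX", "swap", ("USDC", "ETH", 1000)),
        Call("0xLendingMarket", "deposit", ("ETH",)),
    ],
), private_key="<user key>")
receipt = execute(op, executor="0xExecutor")
```

Note how the user’s wallet never hands over its key: the executor only gains the right to submit what the user already signed.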
Having an executor eliminates the need to interact with smart contracts directly, all while the user retains custody of their funds. Just imagine using any blockchain application you want through a Telegram bot – this dynamic becomes possible with account abstraction.
Furthermore, account abstraction allows users to self-custody assets and open DeFi positions on many chains without needing different wallets, RPCs, or needing to worry about different signature types, all without even knowing that they’re using a different chain. You can see a demo of this here or continue reading as we cover projects leading exactly these kinds of account abstraction efforts.
That’s not all – account abstraction also frees users from securing their accounts with self-held private keys, without handing management to a third party. Users can choose more traditional means of verification, like 2FA and fingerprints, in addition to social recovery to secure their wallets. Social recovery allows a lost wallet to be restored through, for example, the user’s family.
“The next billion users are not going to write 12 words on a piece of paper. Normal people don’t do that. We need to give them better usability; they shouldn’t need to think about crypto keys.” - Yoav Weiss, EF
As wallets are the entry point into crypto and blockchains, account abstraction ultimately enables chain abstraction to blossom.
For more details on the inner workings of account abstraction, refer to this thread by Jarrod Watts. Avocado Wallet by Instadapp is also taking significant strides in leveraging the power of account abstraction for end users.
Intents enable sophisticated actors, or “solvers”, to execute transactions in the most optimal manner on the user’s behalf. It’s in the name – a user expresses, off-chain, their intent to perform an on-chain action and have it executed as well as possible. For example, when you submit an order to CowSwap, you’re actually submitting an intent – an intent to swap one token for another at the best possible price. Because the intent is submitted off-chain, it bypasses the public mempool and is instead routed to an encrypted private mempool where solvers compete to fill, or solve, it at the best possible price – using their own balance sheets, private orderflow, or on-chain liquidity venues like Uniswap and Curve. In this way, solvers’ margins compress toward zero, giving users the best execution, because there’s always another solver ready to step in to fill the intent.
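The auction dynamic just described can be sketched as follows. This is an illustrative toy model, not CowSwap’s actual interfaces: an intent states only the desired outcome, and the best of the competing solver quotes wins.

```python
from dataclasses import dataclass

# Toy model of an intent auction; not CowSwap's actual interfaces.

@dataclass
class SwapIntent:
    sell_token: str
    buy_token: str
    sell_amount: float
    min_buy_amount: float   # the user's only hard constraint

def best_fill(intent: SwapIntent, solver_quotes: list):
    # Each solver quotes how much buy_token it can deliver, sourcing
    # from inventory, private orderflow, or on-chain venues. Quotes
    # below the user's minimum are ignored; the best price wins.
    valid = [q for q in solver_quotes if q[1] >= intent.min_buy_amount]
    return max(valid, key=lambda q: q[1]) if valid else None

intent = SwapIntent("USDC", "ETH", 3000.0, min_buy_amount=0.95)
quotes = [("solver_a", 0.97), ("solver_b", 0.99), ("solver_c", 0.94)]
winner = best_fill(intent, quotes)
```

The key design point: the user never specifies *how* the swap happens – only the outcome they will accept – and competition does the rest.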
So now that we have defined what intents are, how exactly can they help us achieve chain abstraction?
The answer comes back to the delineation between signers and executors in an account-abstracted world. If all users need to do is click a button to sign, they can outsource their on-chain needs to sophisticated actors, who take on the responsibility of finding the best execution. These actors then bear the risks of interacting with applications across L1s and L2s, the associated gas fees in different tokens on different chains, reorganization risks (wherein two competing versions of a chain exist), and other execution risks – and price the fees they charge users accordingly. Users no longer need to think about the complexities and risks of using on-chain products and services. And because of competition between solvers, the fees charged to users will compress toward zero, as there is always another solver ready to undercut the one winning the orderflow. It’s the magic of the free market – through competition, users get better services at lower prices.
Let’s explore an example: I have $ETH on Ethereum and want $SOL on Solana, executed at the best price. Through a Request for Quote (RFQ) system, the intent marketplace passes on the orderflow, and in a matter of seconds the user has $SOL on Solana. Notably, Ethereum has 12-second block times, which means that even though solvers don’t have settlement assurance, by running their own node they can be fairly certain that the user’s $ETH deposit transaction is valid and will go through. Further, by using their own balance sheets, solvers can front the $SOL capital on Solana and essentially fulfill the intent before they receive their capital. As the risks are borne not by users but by the sophisticated actors, users can get their intents fulfilled at sub-second latencies and at the best prices, without knowing the bridges they’re using, the RPCs, or the gas costs.
In this case, users still know which chains they’re using. This example acts to illustrate how intents are working in today’s landscape, not in a completely chain abstracted one. But intents don’t stop there – much more is possible.
It is easy to envision a future wherein intents meet all of a user’s needs: the user simply specifies what is to be done, and it is completed in the most efficient way possible. For example, a user may want to borrow $DAI against their $ETH and deposit the $DAI into a liquidity pool to earn $CRV rewards. In this example, an authorized solver compares all $DAI borrow rates against $ETH and takes out a loan at the lowest interest rate. The solver then deposits the $DAI in a Yearn-like vault to autocompound the yield from the highest-yielding 100% $DAI-denominated LP into $CRV, which streams to the user’s wallet.
However, an important caveat: risk is subjective and cannot be expressed in an intent the way objective inputs, like maximum price slippage for a trade, can. So which lending markets, liquidity pools, and chains are used to fulfill this intent? After all, each has a different risk profile and trust assumptions. That’s where the “authorized solvers” come in. Each authorized solver is, to some extent, trusted by the user to carry out the user’s intent according to risk and trust preferences expressed beforehand. For example, a user may specify no deposits into contracts deemed “risky”. However, it is likely that only power users will specify a large set of subjective preferences to a network of solvers. Players even more sophisticated than power users (HFTs, MMs, VCs, etc.) will likely interface with the chain(s) directly to avoid solver fees and tailor their risk and trust assumptions themselves. Users with little understanding of blockchains will likely choose from a set of presets (low, medium, or high risk, for example) upon which solvers can act.
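A minimal sketch of how such risk presets might constrain an authorized solver – the presets, venues, and numbers below are invented purely for illustration:

```python
# Invented presets, venues, and numbers, purely for illustration.
PRESETS = {"low": 1, "medium": 2, "high": 3}   # max risk tier allowed

VENUES = [  # (name, risk_tier, apy)
    ("blue_chip_lending", 1, 0.03),
    ("mid_cap_pool", 2, 0.07),
    ("exotic_farm", 3, 0.21),
]

def allowed_venues(preset: str) -> list:
    # The user never names venues; the preset bounds what the
    # authorized solver may touch.
    cap = PRESETS[preset]
    return [v for v in VENUES if v[1] <= cap]

def best_yield(preset: str) -> tuple:
    # Within the user's risk budget, the solver simply maximizes yield.
    return max(allowed_venues(preset), key=lambda v: v[2])

choice = best_yield("medium")   # highest APY among tiers 1 and 2
```

This is what makes presets workable for non-experts: the subjective judgment is made once, up front, and every subsequent intent inherits it.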
Utilizing a set of authorized solvers for the subjective needs of users enables competitive dynamics between the solvers, which incentivizes the fulfillment of user orders in the best possible manner without any hassle for the user. Furthermore, the fact that the user can “unauthorize” a solver by taking out their executor privileges at any time maintains a system of checks and balances. This way, solvers have an incentive to stay honest and follow the user’s preferences, as otherwise a different solver can prove they were acting maliciously to the user originating the orderflow.
Of course, intents are still a work in progress, and the speculation of how intents can transform into a more sophisticated technology is just that – speculation. However, it would be no surprise to see intents evolve in this manner. We believe intents will play the most instrumental role in materializing a chain abstracted future.
Two projects tackling intents head-on are CowSwap and deBridge. We have already written about CowSwap and the intent-based architecture it follows to deliver users superior UX and execution here. Similar to CowSwap, deBridge follows an intent-based architecture, but does so to enable lightning-fast cross-chain swaps. deBridge focuses on seamless UX: fast trading across chains, minimal fees, and great execution. Like most intent-based solutions, it utilizes a solver network composed of MMs, HFTs, and other sophisticated actors that front capital from their own balance sheets on the destination chain before collecting the user’s capital on the source chain. Apart from having solvers compete with each other to give users the best possible execution, deBridge also differentiates itself by pushing risks, like reorg risk, and other inconveniences, like gas fees and different RPCs on the chains involved, onto solvers.
The graphic below illustrates the deBridge model. In this example, a user with a USD stablecoin on Solana wants a EUR stablecoin on Ethereum. They express their intent to the deBridge application, which propagates it to the solver network, allowing a solver holding $ETH on Ethereum to swap that $ETH into $ethEUR, a EUR stablecoin on Ethereum. Once deBridge’s validator set verifies that the solver has fulfilled the user’s intent on the destination chain (in this case, delivering $ethEUR), the user’s capital on the source chain (in this case, Solana) is unlocked to the solver. Importantly, users don’t need to wait for this verification before receiving their capital on the destination chain.
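The fill-then-settle pattern described above can be sketched as follows. This is a toy model with hypothetical function names, not deBridge’s real interfaces: the solver fronts capital on the destination chain immediately, and its escrowed payout on the source chain is released only after verification.

```python
# Toy fill-then-settle model; function names are illustrative, not
# deBridge's real interfaces.

def fill_intent(solver_inventory: dict, user_intent: dict) -> dict:
    # Step 1: the solver fronts its own capital on the destination
    # chain, so the user is paid within seconds.
    token, amount = user_intent["want"]
    assert solver_inventory.get(token, 0) >= amount, "insufficient inventory"
    solver_inventory[token] -= amount
    return {"user_received": (token, amount), "settled": False}

def settle(fill: dict, escrow: tuple, solver_inventory: dict,
           verified: bool) -> dict:
    # Step 2: only after the validator set verifies the destination-chain
    # fill is the user's escrowed source-chain capital released.
    if verified:
        token, amount = escrow
        solver_inventory[token] = solver_inventory.get(token, 0) + amount
        fill["settled"] = True
    return fill

inventory = {"ethEUR": 10_000.0}                       # solver's balance sheet
fill = fill_intent(inventory, {"want": ("ethEUR", 900.0)})
fill = settle(fill, escrow=("solUSD", 1_000.0),
              solver_inventory=inventory, verified=True)
```

The ordering is the whole point: the user’s latency is decoupled from verification latency, because the solver’s capital, not the user’s, sits at risk during the gap.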
To better understand deBridge and its intent-based design, we recommend checking out this podcast episode.
One of the symptoms of an increasingly multi-chain future is extreme liquidity fragmentation, which is hard to aggregate cohesively. In a world with hundreds of rollups, validiums, L1s, and so on, each hosting its own liquidity, the UX gets increasingly worse for end users as the liquidity pool fragments further.
If only one centralized exchange (CEX) hosted the entire liquidity of the cryptocurrency markets – instead of the hundreds of CEXs that exist alongside even more on-chain DEXs, all sharing the same pie of liquidity – the execution end users receive would be the best it could possibly be, censorship and overall centralization concerns aside. This is only a hypothetical, however, because it is not feasible in a real world where competition is rife and decentralizing forces exist.
The advent of DEX aggregators, which aggregate fragmented liquidity sources across a single network into a unified interface, has been an important step for UX. However, as the inevitable multi-chain future started to play out, DEX aggregators no longer cut it: they can only aggregate liquidity on a single chain, not across many. Furthermore, on blockchains like Ethereum, the gas costs required to route liquidity across multiple sources made aggregators more expensive than direct liquidity sources. The model has seen greater success on cheap, low-latency networks like Solana, though aggregators there are still restricted in the liquidity sources from which they can route trades.
In a chain abstracted future, having technology to aggregate fragmented liquidity is crucial, as the ideal user experience will be a chain-agnostic one, and will likely rely on third-party solvers for their execution services. A few solutions that aim to push forward the defragmentation of multi-chain liquidity include Polygon AggLayer and Optimism Superchain. While these are the two that we will be touching on, there are plenty more teams working on such solutions.
As the Polygon website states: “The AggLayer will be a decentralized protocol with two components: a common bridge, and the ZK-powered mechanism that provides a cryptographic guarantee of safety for seamless, cross-chain interoperability. With ZK proofs providing security, chains connected to the AggLayer can remain sovereign and modular while preserving the seamless UX of monolithic chains.”
Fundamentally, Ethereum Layer 2 scaling solutions, like rollups, each have a canonical bridge with Ethereum, meaning all user funds bridged from Ethereum to an L2 reside in that bridge contract. However, this hampers interoperability among different L2s and the ability to seamlessly communicate data and transfer value between them. If you want to go from Base to Zora (both Ethereum rollups), as seen below, you must incur a 7-day withdrawal process from Base to Ethereum using the canonical Base bridge, and then use the canonical Zora bridge to go from Ethereum to Zora. For optimistic rollups like Base, this time is needed to allow the bridging transaction to be disputed with a fault/fraud proof. Apart from being lengthy, the process is also expensive, because you need to interact with the Ethereum main chain.
Polygon’s AggLayer flips this process on its head. Instead of each rollup having its own canonical bridge to Ethereum, where only that rollup’s users’ non-native assets sit, all chains utilizing the AggLayer share a bridge contract, creating a hub of liquidity, as seen below. Through this process, developers can connect their chain to the AggLayer and let users enjoy unified liquidity.
How AggLayer Works
At its core, the AggLayer aggregates zero-knowledge (ZK) proofs from all the chains connected to it – this allows it to facilitate cross-chain transactions. The AggLayer is essentially a place where all its supported chains post ZK proofs to show that some action has taken place – for example, that 5 $USDC has been withdrawn from Base in order to unlock liquidity on some other chain, like Zora.
To further illustrate this, consider how it works in practice. In this example, we are assuming all named chains are connected to the AggLayer.
A solver detects a request, or intent, from a user on Base. The user has $ETH and wants to purchase an NFT on Zora that costs 3,000 $DAI. Since the solver doesn’t have $DAI on its balance sheet, it must quickly find the best route to fulfill this intent. It realizes that $DAI on Optimism is cheaper than $DAI on Zora. Hence, the solver posts a proof to the AggLayer showing that the user has the $ETH on Base and wants a commensurate amount of $ETH on Optimism. Given that the bridge contract is shared, a ZK proof is all it takes to move a fungible asset residing on chain “X” in the same quantity to chain “Y”.
After posting the ZK proof and unlocking a commensurate amount of $ETH on Optimism, the solver then swaps into $DAI and does the same process to get the same amount of $DAI on Zora to then finish buying the NFT. Behind the scenes, the AggLayer also settles these ZK proofs to Ethereum for stronger security guarantees for end-users and AggLayer-connected chains.
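The shared-bridge mechanic can be modeled in miniature. In the sketch below, a boolean stands in for real ZK-proof verification and the replay check stands in for the nullifier logic a production bridge would need; everything here is illustrative, not Polygon’s actual contract design.

```python
# Miniature model of a shared bridge contract. A boolean stands in for
# real ZK-proof verification; nothing here is Polygon's actual design.

class SharedBridge:
    def __init__(self, balances: dict):
        self.balances = balances    # {chain: {token: amount}}
        self.processed = set()      # proof IDs already consumed

    def transfer(self, proof_id, src, dst, token, amount, proof_ok):
        assert proof_ok, "invalid ZK proof"
        assert proof_id not in self.processed, "replay rejected"
        assert self.balances[src].get(token, 0) >= amount
        # One shared contract means a proof is all it takes to move
        # the asset's accounting from chain X to chain Y.
        self.balances[src][token] -= amount
        self.balances[dst][token] = self.balances[dst].get(token, 0) + amount
        self.processed.add(proof_id)

bridge = SharedBridge({"Base": {"ETH": 5.0}, "Optimism": {}})
bridge.transfer("proof-1", "Base", "Optimism", "ETH", 2.0, proof_ok=True)
```

Contrast this with the canonical-bridge world: there, the same move would require an exit to Ethereum and a fresh deposit, rather than a single accounting update inside one contract.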
However, in this case, the solver/user/other actor bears inventory risk: the $DAI rate on Optimism being arbitraged, the cost of the NFT rising, the price of $ETH dropping, or any other adverse move between the time the user’s orderflow is originated and filled, with the losses falling on the respective party. Unlike DEX aggregators on a single chain, which enjoy atomic composability, solvers interacting with different state machines do not. Atomic composability ensures that all operations execute in a single, linear sequence and either all succeed or all fail together. Crossing between different state machines, by contrast, always requires at least a one-block delay due to the risk of reorgs on the destination chain.
However, this doesn’t mean the aforementioned use cases are impossible. These risks are mostly long-tail events, and solvers and other sophisticated actors can take them on and compensate by pricing them into what they charge users. For example, a solver can guarantee execution by covering losses if they occur, or by filling the user’s intents from its own balance sheet.
Another example of liquidity aggregation is the Optimism Superchain initiative. The Superchain, as defined by the Optimism documentation, is “a network of chains that share bridging, decentralized governance, upgrades, a communication layer and more – all built on the OP Stack.” Like the AggLayer, the project focuses on aggregating liquidity: all chains that are part of the Superchain utilize a shared bridge contract, the first step toward aggregated liquidity between Superchain chains.
The difference between the Superchain and the AggLayer is that the AggLayer relies on ZK proofs for seamless interoperability, whereas the Superchain relies on a shared sequencer between chains opting into the Superchain. While this post won’t get into the details of shared sequencing, you can refer to this to understand how shared sequencing unlocks benefits in the realm of seamless cross-chain interoperability and, to some extent, atomic composability (the same cross-chain atomic composability caveats elucidated above apply here too).
Because the Superchain mandates that opted-in chains use the shared sequencer, it could limit the execution environments available to them. Other cumbersome challenges arise, such as chains losing access to the MEV their users create, in addition to other challenges outlined here. However, teams like Espresso are working on ways to redistribute the MEV enabled by chains utilizing a shared sequencer. Furthermore, all chains connected to the Polygon AggLayer (which post ZK proofs to it) must use the same ZK circuits, which could likewise limit the execution environments available to AggLayer-connected chains.
Frontier Research has developed the CAKE (Chain Abstraction Key Elements) framework, which can be seen above. This outlines the three layers (excluding the user-facing application layer) required to reach a state where:
“In a chain abstracted world, a user goes to a dApp’s website, connects their wallet, signs the intended operation and waits for eventual settlement. All the complexity of acquiring the required assets to the target chain and the final settlement gets abstracted away from the user, happening in [the three] infrastructure layers of the CAKE.”
The framework identifies the three infrastructure layers of the CAKE as the permission layer, the solver layer, and the settlement layer. We have already touched on most of this: the permission layer covers account abstraction and policies (authorization, as we’ve called it), while the settlement layer includes lower-level technology like oracles, bridges, pre-confirmations, and other back-end features.
As such, the settlement layer stands to greatly benefit solvers, other sophisticated actors, and user-facing applications, as its components all work together to help solvers manage risk and provide better execution for users. This extends into other components like data availability and execution proofs – all requirements for chains to offer a safe building experience for application developers and security guarantees that are ultimately passed on to end users.
The CAKE framework encompasses many of the concepts mentioned in this post and provides a coherent way of looking at the various components of chain abstraction and their relation to each other. Those interested in the framework can read this introductory article.
While we’ve already touched on a few projects spearheading the effort towards a chain abstracted future, here are a few other notable projects that are doing the same.
Particle Network is launching a modular L1 blockchain built on the Cosmos SDK, which will operate as a high-performance EVM-compatible execution environment. Originally, Particle debuted as an account abstraction service provider, enabling users to create smart contract wallets linked to their Web2 social accounts to then be used natively within dApp-embedded interfaces. Since then, the protocol has expanded its offerings, aiming to proliferate chain abstraction across the broader blockchain landscape through a suite of wallet, liquidity, and gas abstraction services on its L1.
Similar to other chain abstraction service providers, Particle envisions a future in which anyone will be able to easily transact across multiple chains through a single account, paying gas fees in any token they wish. As such, the underlying L1 will serve as a coordinator for the multi-chain ecosystem, unifying users and liquidity across EVM and non-EVM domains alike.
Let’s see how it works.
Particle offers a multi-faceted toolkit for chain abstraction services, each core technology playing a unique role as part of a greater whole.
From the perspective of an end user, Particle’s chain abstraction stack starts with first principles: creating an account. Universal Accounts (UAs) on Particle function as ERC-4337 smart accounts attached to a pre-existing EOA (externally owned account), aggregating token balances across multiple chains into a single address by automatically routing and executing atomic cross-chain transactions. While a traditional crypto wallet can be used to create and manage an account, Particle’s WaaS enables users to onboard with social logins as well.
To abstract away the various complexities of blockchain-native operations, a UA functions as a unified interface built on top of existing wallets, allowing users to deposit and use tokens across multiple blockchain environments as if they existed on a single chain. To maintain synchronous state across UAs, account settings are stored on the Particle L1, which serves as a central source of truth for every instance. The network then facilitates cross-chain messaging to deploy new instances or update existing ones.
As such, the Particle L1 acts as a coordination and settlement layer for all cross-chain transactions processed through Particle’s UAs.
Another key component of Particle’s chain abstraction services is the Universal Liquidity functionality. While UAs provide a means for users to express their transactional request through an interface, Universal Liquidity refers to the layer responsible for the automatic execution of these requests, which in turn enables a unification of balances across different networks. This feature is key to enabling cross-chain transfers which would otherwise be hindered by current barriers to entry, like purchasing the native gas token and creating a native wallet for a new network.
For instance, when a user wishes to purchase an asset on a blockchain they have never used before and don’t have any funds on, the liquidity needed for this purchase is automatically sourced from a user’s existing balances, which may likely be on a different chain and a different token. This is largely made possible through Particle’s Decentralized Messaging Network (DMN), which enables specialized services, known as Relayer Nodes, to monitor external chain events and the settlement of state events. To be more exact, relayers in the DMN use a Messaging Protocol for monitoring the status of UserOperations on external chains and then settling the final execution status to the Particle L1.
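The relayer pattern just described can be sketched as follows, with hypothetical names: nodes watch external chains for terminal UserOperation statuses and settle each result once to the coordination layer.

```python
# Hypothetical sketch of the relayer pattern: watch external chains for
# terminal UserOperation statuses and settle each result once to the
# coordination layer (the Particle L1 in this model).

def relay(external_events: list, settlement_log: dict) -> dict:
    for op_id, status in external_events:
        if status in ("success", "failed") and op_id not in settlement_log:
            settlement_log[op_id] = status   # settle the final status once
    return settlement_log

log: dict = {}
relay([("op1", "pending"),     # intermediate states are ignored
       ("op1", "success"),
       ("op2", "failed")], log)
```

Only final execution statuses reach the settlement layer, which is what lets one chain act as the source of truth for activity happening everywhere else.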
The third pillar of Particle’s chain abstraction stack is the implementation of a Universal Gas Token - part of the network’s gas abstraction service. Accessed by interacting with Particle’s UAs, Universal Gas allows users to spend any token to pay for gas fees, meaning Bob can pay a transaction fee for a swap on Solana using his USDC on Base, while Alice pays a transaction fee for purchasing an NFT on Ethereum using her ARB token on Arbitrum.
When a user wishes to execute a transaction through a Particle UA, the interface will prompt the user to select their gas token of choice, which is then automatically routed through Particle’s native Paymaster contract. All gas payments are settled to their respective source and destination chains, while a portion of the fee is swapped into Particle’s native $PARTI token to be settled on the Particle L1.
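A toy model of this gas-token routing follows; the prices, fee level, and names are invented for illustration and are not Particle’s actual Paymaster logic.

```python
# Invented prices, fee level, and names; not Particle's actual
# Paymaster logic. The paymaster charges the user's chosen token and
# pays the destination chain's native gas on their behalf.

PRICE_IN_USER_TOKEN = {   # user-token units per 1 native-token unit
    ("USDC", "SOL"): 150.0,
    ("ARB", "ETH"): 3500.0,
}

def pay_gas(user_token: str, native_token: str, native_gas: float,
            fee_bps: int = 30) -> dict:
    price = PRICE_IN_USER_TOKEN[(user_token, native_token)]
    base = native_gas * price        # gas cost expressed in the user's token
    fee = base * fee_bps / 10_000    # paymaster's margin (a slice of which
                                     # could be swapped to the network token
                                     # at settlement)
    return {"charged": round(base + fee, 6), "native_paid": native_gas}

# Bob pays a Solana transaction fee using USDC he holds elsewhere.
quote = pay_gas("USDC", "SOL", native_gas=0.0005)
```

From the user’s side, the native gas token disappears entirely: they see one debit in a token they already hold.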
Particle builds on top of its existing account abstraction infrastructure, for which it has reported over 17m wallet activations and over 10m UserOperations to date. The addition of a Universal Liquidity layer, coupled with a Universal Gas token, marks Particle’s expansion into providing chain abstraction services to a broader spectrum of users and participants. The Particle L1 is not meant to be another blockchain directly competing with today’s incumbents; rather, it seeks to provide an interoperability layer connecting them all, working with key teams in the chain abstraction sector, including the Near and Cake R&D teams.
The Particle Network L1 is currently in its testnet phase, allowing early participants to try out Universal Gas within an experimental UA implementation.
Near is a sharded Proof-of-Stake Layer 1 blockchain that serves as a full-stack application platform for developers building decentralized products and services. Much of Near’s core ethos revolves around bridging the gap between blockchain-native applications and mainstream audiences, and a key to fulfilling this vision is abstracting the blockchain away from the end user. Near approaches this with Account Aggregation – a multi-faceted architecture built to abstract away key pain points of using blockchain networks, such as switching wallets, managing gas fees, and bridging – by funneling all operations through a single account.
Let’s dive deeper to better understand how this all works.
In addition to the alphanumeric public-key-hash standard on most blockchains today, Near’s proprietary account model enables each account to be mapped to a human-readable account name, i.e. alice.near. Near accounts also utilize two types of access keys, distinct in nature and function, enabling accounts to manage multiple keys across multiple blockchains, each key accounting for the various permissions and configurations unique to its domain: full-access keys, which grant complete control over the account and its funds, and function-call keys, which grant limited permission to call specified methods on a given contract with a capped gas allowance.
Further bolstering the abstraction of blockchains to the end-user is a simplified onboarding process with FastAuth, Near’s proprietary key management system. FastAuth enables users to sign up for a blockchain-native account with something as simple as their email address and uses passkeys, which replace passwords with biometrics, in place of long and complex seed phrases and passwords.
Multi-chain signatures are a key component of Near’s abstraction of blockchains, allowing any NEAR account to have associated remote addresses on other chains and to sign messages and execute transactions from those addresses. To enable this, Chain Signatures use the NEAR MPC (multi-party computation) network as the signer for these remote addresses, eliminating the need for explicit private keys. This is enabled by a novel threshold signature protocol, which implements a form of key resharing that allows the MPC signer to maintain the same aggregate public key, even as key shares and nodes constantly change.
Making MPC signer nodes also part of the NEAR network allows smart contracts to start the signing process for an account. By using different combinations of a chain ID, a NEAR account ID, and a specific path, each account can create an unlimited number of remote addresses on any chain.
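The derivation idea can be sketched as follows. The hashing here is purely illustrative: real Chain Signatures derive addresses from the MPC network’s aggregate public key, not a plain hash, but the property is the same, with each (chain, account, path) combination yielding its own deterministic remote address:

```python
import hashlib

def derive_remote_address(chain_id: str, near_account: str, path: str) -> str:
    """Deterministically derive a distinct remote address per
    (chain, account, path) tuple. Illustrative only."""
    preimage = f"{chain_id}:{near_account}:{path}".encode()
    # 20-byte hex address, EVM-style, purely for demonstration
    return "0x" + hashlib.sha256(preimage).hexdigest()[:40]

# One NEAR account controls many remote addresses by varying the path
eth_main = derive_remote_address("ethereum", "alice.near", "default")
eth_alt  = derive_remote_address("ethereum", "alice.near", "savings")
btc_main = derive_remote_address("bitcoin",  "alice.near", "default")
```

Because derivation is deterministic, any party can compute alice.near’s remote addresses without interacting with her, while only the MPC network can produce signatures for them.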
Another key issue hindering a seamless user experience across today’s blockchain landscape is that each blockchain requires gas fees to be paid in its own native token, forcing users to acquire that token before they can use the underlying network.
NEP-366 introduced meta transactions to Near, a feature which allows transactions to be executed on Near without the user owning any gas or tokens on the chain. This is made possible through Relayers: third-party services which receive signed transactions and relay them to the network while attaching the tokens necessary to subsidize their gas fees. From a technical perspective, the end user creates and signs a SignedDelegateAction, which contains the data necessary to construct a Transaction, and sends it to the relayer service. The relayer constructs and signs a Transaction from this data and sends the resulting SignedTransaction to the network via an RPC call; the relayer pays the gas fees while the actions are executed on the user’s behalf.
To better illustrate what this may look like in practice, consider the following example: Alice wants to send Bob some of her $ALICE tokens, but lacks $NEAR tokens needed to cover gas fees. By using meta transactions, she creates a DelegateAction, signs it, and sends it to a relayer. The relayer, who pays the gas fees, wraps it in a transaction and forwards it on-chain, allowing the transfer to be completed successfully.
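Alice’s flow can be sketched as a toy model. Names and fields are simplified for illustration; the actual NEP-366 types carry more data, such as nonces and block-height bounds:

```python
from dataclasses import dataclass

@dataclass
class SignedDelegateAction:
    """What the user signs: the action they want executed, minus any gas payment."""
    sender_id: str
    receiver_id: str
    action: dict
    signature: str

class Relayer:
    """Wraps a user's signed delegate action in a transaction and pays the gas."""
    def __init__(self, account_id: str, gas_balance: int):
        self.account_id = account_id
        self.gas_balance = gas_balance

    def relay(self, delegated: SignedDelegateAction, gas_cost: int) -> dict:
        # The relayer signs the outer transaction and attaches tokens for gas,
        # so the fee comes out of the relayer's balance, not the user's.
        self.gas_balance -= gas_cost
        return {
            "signer_id": self.account_id,  # relayer pays
            "actions": [delegated],        # executed on the user's behalf
            "gas_attached": gas_cost,
        }

# Alice sends $ALICE tokens to Bob while holding zero $NEAR
action = SignedDelegateAction(
    "alice.near", "alice-token.near",
    {"method": "ft_transfer", "to": "bob.near", "amount": 100},
    signature="ed25519:...",
)
relayer = Relayer("relayer.near", gas_balance=10**24)
tx = relayer.relay(action, gas_cost=10**21)
```

The key property is that the inner action still carries Alice’s signature, so the network attributes the transfer to her even though the relayer footed the gas bill.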
The key to a successful implementation of a seamless user experience across multiple blockchain networks is the integration and support of those blockchains, even if they are competing businesses. Though Near functions as a competitive business of its own, its growth strategy revolves around growing the industry as a whole, granting its users access to many other blockchains in a seamless and secure manner.
Here are some other teams building solutions for chain abstraction services worth keeping an eye on – this list is not necessarily exhaustive but instead provides a foundation for those interested in conducting further research into chain abstraction models.
Connext is a modular interoperability protocol which defined chain abstraction in their blog (May 2023) as a “pattern to improve dApp user experience by minimizing the need for users to care about the chain they’re on,” which accurately depicts the core principle chain abstraction service providers are building around today. Though Connext offers a set of smart contract modules for application developers through its Chain Abstraction Toolkit, its core feature is xCall, a primitive which enables smart contracts to interact with one another across different environments. The xCall function initiates a cross-chain transfer of funds, calldata, and/or various named properties, which the Chain Abstraction Toolkit wraps in simple logic, making integration a relatively straightforward process from the developer’s perspective.
Socket provides infrastructure for application developers building interoperability-centric products and services, with secure and efficient data and asset transfers across chains. Socket 2.0 marks the protocol’s shift from cross-chain to chain abstraction services, highlighted by its flagship Modular Order Flow Auction (MOFA) mechanism, which aims to enable competitive, efficient chain-abstracted markets. Traditional OFAs involve a network of actors performing specialized tasks who compete to deliver the best possible outcome for an end-user request. Similarly, MOFA is designed to provide an open marketplace for user intents and execution agents, called Transmitters. Within the MOFA, Transmitters compete to create and fulfill chain-abstracted bundles: ordered sequences of user requests that require transfers of data and value across multiple blockchains.
Infinex is building a single UX layer aimed at unifying decentralized applications and ecosystems. Its flagship product, Infinex Account, is a multi-layered service that functions as a platform for integrating any on-chain application into a simplified UX for the end-user. At its core, the Infinex Account is a set of cross-chain smart contracts that can be controlled, secured and recovered via standard web2 authentication.
Brahma Finance is building its flagship Console product, an on-chain execution and custody environment aimed at enhancing user experience across DeFi, focusing specifically on the EVM blockchain ecosystem. Brahma uses batched and chained transactions to synchronize transactions across different chains, and Smart Accounts for interacting on-chain. The end result will reflect a user experience which enables seamless cross-chain interactions within a single UI.
Agoric is a Cosmos-native Layer 1 blockchain for building cross-chain smart contracts in JavaScript. The Agoric platform is designed with an asynchronous, multi-block execution environment, and aims to be the go-to environment for developing cross-chain applications. Agoric utilizes the Cosmos InterBlockchain Communication (IBC) Protocol for interchain communications, while leveraging Axelar’s General Message Passing (GMP) for interactions beyond the Cosmos ecosystem. Agoric’s Orchestration API simplifies developer experience by abstracting the complexities involved in cross-chain communication and smart contract execution, while the end-user benefits from applications with inherent chain abstracted features.
By now, the advantages that chain abstraction unlocks for end-users should be clear - the complexities of using blockchain-native applications are entirely abstracted away into a unified interface layer, creating a global, chain-agnostic point of contact for anyone who wants to participate.
Equally importantly, chain abstraction could unlock a huge benefit for blockchain applications. Currently, Web2 developers don’t “choose” where to deploy their application; Airbnb, for instance, is available to anyone with an internet connection. In the Web3 landscape, however, application developers need to choose where to deploy their application (for example, on Ethereum, Solana, or Cosmos). Not only does this limit TAM, but it also means that application developers are encumbered by needing to choose the “right” chain to deploy on. This is not only a hard decision but a crucial one: there have been a handful of applications that were extremely successful yet struggled due to their underlying blockchain. Furthermore, with the continuous development and evolution of blockchains today, the “right” chain may constantly be changing. In a chain abstracted future, application developers are no longer encumbered by having to select a chain that their success is tied to.
It is evident that we are headed towards an increasingly multichain future. This will inevitably compound the UX issues that are among the most critical barriers to mainstream adoption. We believe chain abstraction, with its various components, is a possible solution to many of crypto’s UX problems today.
In crypto, users need to deploy liquidity across multiple L1s and L2s, settle for a suboptimal UX with on-chain liquidity sources fragmented across these networks, and understand the technical nuances of these systems. The time has come to abstract all of this away from the average user: as far as they’re concerned, they shouldn’t need to know they’re using blockchain rails, let alone how many L1s and L2s exist under the hood, for this is the only way the industry gains mass adoption.
Chain abstraction is a means through which we abstract the blockchain nuances and technical specifics for the average user to deliver a seamless user experience wherein they don’t even know they’re using the blockchain. It can be argued that this breakthrough in UX may be the missing piece from onboarding the next generation of businesses and users to blockchains and crypto-native ecosystems.
Before going over some of the projects building out infrastructure crucial to achieving a chain abstracted future, it’s prudent to go over some of the technological components powering chain abstraction.
Today’s wallets face many limitations. Apart from various security vulnerabilities, they offer limited standalone functionality, typically requiring interaction with other smart contracts to do anything complex. What if we reimagined this scenario to transform externally owned accounts (EOAs) into smart contract wallets (SCWs)? Unlike EOAs, SCWs can’t initiate transactions on their own; they require an EOA’s prompt. By merging the capabilities of both, we effectively turn EOAs into SCWs, empowering them not only to initiate transactions but also to execute complex, arbitrary logic, the premise of smart contracts.
This could unlock a plethora of use cases. In this context, we will specifically focus on how this relates to chain abstraction.
When you transform an EOA into a SCW, you effectively separate who executes a transaction from who signs it. This means users don’t need to execute transactions directly; instead, sophisticated actors (called executors) do so on their behalf. It is important to note that during this process the user doesn’t give up custody of the wallet, as they retain their private key. Having an executor comes with other benefits, like not needing gas balances on all the different blockchains you want to use, as transaction/gas fees can now also be abstracted away. Additionally, users can have bundles of transactions executed at the click of a button – for example, approving a token for a DEX, swapping it, and then lending the proceeds into an Aave market.
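The signer/executor split can be sketched as follows. Names and structure are hypothetical, and real smart-account batching (e.g. ERC-4337 UserOperations) differs in detail, but the division of labor is the same: the user signs once, the executor submits and pays gas:

```python
def sign(user_key: str, batch: list) -> dict:
    """User signs the whole batch once; the key never leaves the user."""
    return {"owner": user_key, "batch": batch, "sig": f"signed-by-{user_key}"}

def execute(signed: dict, pay_gas) -> list:
    """Executor verifies authorization, submits the batch, and fronts the gas."""
    assert signed["sig"] == f"signed-by-{signed['owner']}"  # check the signature
    results = []
    for op in signed["batch"]:
        pay_gas(op)  # gas is the executor's problem, not the user's
        results.append(f"{op['action']} ok")
    return results

# One click: approve -> swap -> lend
batch = [
    {"action": "approve", "token": "USDC", "spender": "dex"},
    {"action": "swap", "sell": "USDC", "buy": "ETH"},
    {"action": "lend", "token": "ETH", "market": "aave"},
]
gas_paid = []
receipt = execute(sign("alice", batch), pay_gas=gas_paid.append)
```

Custody never moves: the executor can only run what the owner’s signature covers, and the user sees a single confirmation instead of three.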
Having an executor eliminates the need to interact with smart contracts directly, all while the user retains custody of their funds. Just imagine using any blockchain application you want through a Telegram bot – this dynamic becomes possible with account abstraction.
Furthermore, account abstraction allows users to self-custody assets and open DeFi positions on many chains without needing different wallets or RPCs, or worrying about different signature types – all without even knowing that they’re using a different chain. You can see a demo of this here, or continue reading as we cover projects leading exactly these kinds of account abstraction efforts.
That’s not all – account abstraction also lets users secure their accounts without holding their own private keys, and without those accounts being managed by a third party. Users can choose more traditional means of verification, like 2FA and fingerprints, in addition to social recovery, to secure their wallets. Social recovery allows a lost wallet to be restored through, for example, the user’s family.
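Social recovery can be sketched as a guardian threshold. This is an illustrative model, not any specific wallet’s scheme: the user pre-registers guardians, and a quorum of them can reassign the account to a new key:

```python
def recover(guardians: set, approvals: set, threshold: int, new_key: str) -> dict:
    """Reassign the account to new_key if enough registered guardians approve."""
    valid = approvals & guardians  # only pre-registered guardians count
    if len(valid) >= threshold:
        return {"owner_key": new_key, "recovered": True}
    return {"recovered": False}

# Alice registered three guardians; any two can restore her lost wallet
wallet_guardians = {"mom", "brother", "best_friend"}
result = recover(wallet_guardians, {"mom", "brother"}, threshold=2, new_key="key2")
```

The threshold is the safety knob: one compromised guardian can’t steal the account, and one unreachable guardian can’t block recovery.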
“The next billion users are not going to write 12 words on a piece of paper. Normal people don’t do that. We need to give them better usability; they shouldn’t need to think about crypto keys.” - Yoav Weiss, EF
As wallets are the entry point into crypto and blockchains, account abstraction ultimately enables chain abstraction to blossom.
For more details on the inner workings of account abstraction, refer to this thread by Jarrod Watts. Avocado Wallet by Instadapp is also taking significant strides in leveraging the power of account abstraction for end users.
Intents enable sophisticated actors, or “solvers”, to execute transactions in the most optimal manner on the user’s behalf. It’s in the name: with an intent, a user expresses, off-chain, the on-chain outcome they desire, leaving the execution path to others. For example, when you submit an order to CoW Swap, you’re actually submitting an intent – an intent to swap one token for another at the best possible price. By being submitted off-chain, the intent bypasses the public mempool and is instead routed to an encrypted private mempool, where solvers compete to fill, or solve, it at the best possible price, whether using their own balance sheets, private orderflow, or on-chain liquidity venues like Uniswap and Curve. Because there is always another solver ready to step in to fill the intent, solvers’ margins compress towards zero, giving users the best execution.
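The competitive fill can be sketched as a toy auction. The quotes are invented for illustration, and real systems like CoW Swap additionally batch orders and match them against each other, but the selection rule is the same: the solver offering the best output wins:

```python
def best_fill(intent: dict, quotes: dict):
    """Pick the solver quoting the most buy-token for the intent's sell amount."""
    solver = max(quotes, key=quotes.get)
    return solver, quotes[solver]

# Intent: sell 1 ETH for USDC, with a minimum acceptable output
intent = {"sell": ("ETH", 1.0), "buy": "USDC", "min_out": 2990.0}
quotes = {"solver_a": 3001.5, "solver_b": 3004.2, "solver_c": 2998.0}

winner, out = best_fill(intent, quotes)
assert out >= intent["min_out"]  # the intent only settles if the limit is met
```

Note that the user never states *how* to execute – no route, venue, or gas details – only the outcome and its limit, which is what distinguishes an intent from a transaction.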
So now that we have defined what intents are, how exactly can they help us achieve chain abstraction?
The answer comes back to the delineation between signers and executors in an account-abstracted world. If all users need to do is click a button to sign a transaction, they can outsource all their on-chain needs to sophisticated actors, who then take on the responsibility of finding the best execution. These sophisticated actors bear the risks of interacting with all the different applications across L1s and L2s, the associated gas fees in different tokens on different chains, reorganization risks (wherein two competing versions of the chain exist), and other execution risks, and they price the fees charged to users accordingly. Users, in turn, no longer need to think about the various complexities and risks of using on-chain products and services. And because of competition between solvers, the fees charged to users will compress to near-zero, as there is always another solver ready to undercut the one winning all the orderflow. It’s the magic of the free market: through competition, users get better-quality services at lower prices.
Let’s explore an example: I have $ETH on Ethereum and want $SOL on Solana, executed at the best price. Through a Request for Quote (RFQ) system, the intent marketplace passes the orderflow to solvers and, in a matter of seconds, the user has $SOL on Solana. Notably, Ethereum has 12-second blocktimes, which means that even though solvers don’t have settlement assurance, by running their own node they can be fairly certain that the user’s $ETH deposit transaction is valid and will go through. Further, by using their own balance sheets, solvers can front the $SOL on Solana and essentially fulfill the intent before they receive their capital. As the risks are borne not by users but by the sophisticated actors, users can get their intents fulfilled at sub-second latencies and at the best prices, without knowing the bridges they’re using, the RPCs, or the gas costs.
In this case, users still know which chains they’re using. This example serves to illustrate how intents work in today’s landscape, not in a completely chain abstracted one. But intents don’t stop there – much more is possible.
It is easy to envision a future wherein intents work to meet all of a user’s needs. The user simply specifies what is to be done, and it is completed in the most efficient way possible. For example, a user may want to borrow $DAI against their $ETH and deposit the $DAI into a liquidity pool to earn $CRV rewards. In this example, an authorized solver compares all $DAI borrow rates against $ETH and takes out a loan at the lowest interest rate. The solver then deposits the $DAI in a Yearn-like vault to autocompound the yield from the highest-yielding 100% $DAI-denominated LP into $CRV, which streams to the user’s wallet.
However, an important caveat: risk is subjective and cannot be fully expressed in an intent, unlike objective inputs such as maximum price slippage for a trade. So which lending markets, liquidity pools, and chains are used to fulfill this intent? After all, each has a different risk profile and trust assumptions. That’s where the “authorized solvers” come in. Each authorized solver is, to some extent, trusted by the user to carry out the user’s intent according to risk and trust preferences expressed beforehand. For example, a user may specify that funds are never deposited into contracts deemed “risky”. It is likely, however, that only power users will specify a large set of subjective preferences to a network of solvers, while even more sophisticated players (HFTs, MMs, VCs, etc.) will interface with the chain(s) directly to avoid solver fees and tailor their risk and trust assumptions themselves. Users with little understanding of blockchains will likely choose from a set of presets (low, medium, or high risk, for example) upon which solvers can act.
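Risk presets might work roughly like this. Venue names, yields, and risk scores are invented for illustration; the point is that the preset caps which venues a solver may route through, and the solver optimizes within that cap:

```python
PRESETS = {"low": 1, "medium": 2, "high": 3}

# Hypothetical venues a solver could route a deposit through
venues = [
    {"name": "blue-chip-lender", "apy": 0.03, "risk": 1},
    {"name": "mid-cap-pool",     "apy": 0.08, "risk": 2},
    {"name": "new-farm",         "apy": 0.40, "risk": 3},
]

def eligible(preset: str, venues: list) -> list:
    """Venues at or below the user's chosen risk tier."""
    cap = PRESETS[preset]
    return [v for v in venues if v["risk"] <= cap]

def best_venue(preset: str, venues: list) -> dict:
    """Highest yield among the venues the user's preset permits."""
    return max(eligible(preset, venues), key=lambda v: v["apy"])
```

A “low” user gets only the blue-chip venue despite its lower yield, while a “high” user lets the solver chase the 40% farm – the subjective judgment is made once, up front, instead of per transaction.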
Utilizing a set of authorized solvers for users’ subjective needs enables competitive dynamics between solvers, incentivizing the fulfillment of user orders in the best possible manner without any hassle for the user. Furthermore, the fact that a user can “unauthorize” a solver at any time by revoking its executor privileges maintains a system of checks and balances. Solvers thus have an incentive to stay honest and follow the user’s preferences, as otherwise a different solver can prove to the user originating the orderflow that they acted maliciously.
Of course, intents are still a work in progress, and the speculation of how intents can transform into a more sophisticated technology is just that – speculation. However, it would be no surprise to see intents evolve in this manner. We believe intents will play the most instrumental role in materializing a chain abstracted future.
Two projects tackling intents head-on are CoW Swap and deBridge. We have already written here about CoW Swap and the intent-based architecture it follows to deliver users superior UX and execution. Similar to CoW Swap, deBridge follows an intent-based architecture, but does so to enable lightning-fast cross-chain swaps. deBridge focuses on a seamless UX: fast trading across chains, minimal fees, and great execution. Like most intent-based solutions, deBridge utilizes a solver network composed of MMs, HFTs, and other sophisticated actors that front capital from their own balance sheets on the destination chain before collecting the user’s capital on the source chain. Apart from having solvers compete with each other to give users the best execution possible, deBridge also differentiates itself by pushing risks, like reorg risk, and other inconveniences, like gas fees and different RPCs on the chains involved, onto solvers.
The graphic below illustrates the deBridge model. In this example, a user with a USD stablecoin on Solana wants a EUR stablecoin on Ethereum. They express their intent to the deBridge application, which propagates it to the solver network, allowing a solver holding $ETH on Ethereum to swap it into $ethEUR, a EUR stablecoin on Ethereum. Once deBridge’s validator set verifies that the solver has fulfilled the user’s intent on the destination chain (in this case, delivering $ethEUR to the user), the user’s capital on the source chain (in this case, Solana) is unlocked to the solver. Importantly, users don’t need to wait for this verification before receiving their capital on the destination chain.
To better understand deBridge and its intent-based design, we recommend checking out this podcast episode.
One of the symptoms of an increasingly multi-chain future is extreme liquidity fragmentation, which is hard to aggregate in a cohesive manner. In a world with hundreds of rollups, validiums, L1s, etc., each hosting its own liquidity on its own network, the UX gets increasingly worse for end users as the liquidity pool fragments further.
If only one centralized exchange (CEX) hosted the entire liquidity of the cryptocurrency markets – instead of the hundreds of CEXs that exist today alongside even more on-chain DEXs, all sharing the same pie of liquidity – execution for end users would be the best it could possibly be, censorship and overall centralization concerns aside. This is only a hypothetical, however, because it is not feasible in a real world where competition is rife and decentralizing forces exist.
The advent of DEX aggregators, which aggregate fragmented liquidity sources across a single network into a unified interface, was an important step for UX. However, as the inevitable multi-chain future started to play out, DEX aggregators could no longer cut it, as they could only aggregate liquidity on a single chain, not across many. Furthermore, on blockchains like Ethereum, the gas costs required to route liquidity across multiple sources made using aggregators more expensive than trading against liquidity sources directly. The model has demonstrated greater success on cheap, low-latency networks like Solana, though the aggregators themselves remain restricted in the liquidity sources from which they can route trades.
In a chain abstracted future, having technology to aggregate fragmented liquidity is crucial, as the ideal user experience will be a chain-agnostic one, and will likely rely on third-party solvers for their execution services. A few solutions that aim to push forward the defragmentation of multi-chain liquidity include Polygon AggLayer and Optimism Superchain. While these are the two that we will be touching on, there are plenty more teams working on such solutions.
As the Polygon website states: “The AggLayer will be a decentralized protocol with two components: a common bridge, and the ZK-powered mechanism that provides a cryptographic guarantee of safety for seamless, cross-chain interoperability. With ZK proofs providing security, chains connected to the AggLayer can remain sovereign and modular while preserving the seamless UX of monolithic chains.”
Fundamentally, Ethereum Layer 2 scaling solutions, like rollups, have a canonical bridge with Ethereum. This means that all user funds that are bridged from Ethereum to an L2 reside in this bridge contract. However, this disrupts the interoperability among different L2s as well as the ability to seamlessly communicate data and transfer value between them. This is because if you want to, for example, go from Base to Zora (both Ethereum rollups), as seen below, you need to incur a 7 day withdrawal process to go from Base to Ethereum using the canonical Base bridge and then use the canonical Zora bridge to go from Ethereum to Zora. This is because, for optimistic rollups like Base, the time is needed to dispute the bridging transaction using a fault/fraud proof. Apart from the fact that this is a lengthy process, it is also expensive because you need to interact with the Ethereum main chain.
Polygon’s AggLayer flips this process on its head. Instead of each rollup having its own canonical bridge to Ethereum, where that rollup’s bridged assets sit in isolation, all chains utilizing the AggLayer share a single bridge contract, creating a hub of liquidity, as seen below. Through this design, developers can connect their chain to the AggLayer and allow their users to enjoy unified liquidity.
How AggLayer Works
At its core, the AggLayer aggregates zero-knowledge (ZK) proofs from all the chains connected to it, which allows it to facilitate cross-chain transactions. The AggLayer is essentially a place where all its supported chains post ZK proofs to show that some action has taken place – for example, that 5 $USDC has been withdrawn on Base in order to unlock liquidity on some other chain, like Zora.
To further illustrate this, consider how it works in practice. In this example, we are assuming all named chains are connected to the AggLayer.
A solver detects a request, or intent, from a user who resides on Base. The user has $ETH and wants to purchase an NFT on Zora that costs 3,000 $DAI. Since the solver doesn’t have $DAI on their balance sheet, they must quickly find the best route to fulfill this intent. They realize that $DAI on Optimism is cheaper than $DAI on Zora. Hence, the solver posts a proof to the AggLayer showing that the user has the $ETH on Base and wants a commensurate amount of $ETH on Optimism. Given that the bridge contract is shared, a ZK proof is all it takes to move a fungible asset residing on chain “X” in the same quantity to chain “Y”.
After posting the ZK proof and unlocking a commensurate amount of $ETH on Optimism, the solver then swaps into $DAI and does the same process to get the same amount of $DAI on Zora to then finish buying the NFT. Behind the scenes, the AggLayer also settles these ZK proofs to Ethereum for stronger security guarantees for end-users and AggLayer-connected chains.
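The shared-bridge accounting can be sketched as a toy escrow, where a stubbed proof check stands in for the real ZK verification. All names and structure here are illustrative, not the AggLayer’s actual contracts:

```python
class SharedBridge:
    """One escrow shared by every connected chain: a verified proof of a
    burn/lock on chain X unlocks the same asset on chain Y."""
    def __init__(self):
        self.locked = {}  # (chain, asset) -> amount escrowed for that chain

    def deposit(self, chain: str, asset: str, amount: float):
        self.locked[(chain, asset)] = self.locked.get((chain, asset), 0) + amount

    def transfer(self, proof: dict, src: str, dst: str, asset: str, amount: float):
        """Move escrow accounting from src to dst once the proof verifies."""
        assert proof["valid"]                        # stand-in for ZK verification
        assert self.locked.get((src, asset), 0) >= amount
        self.locked[(src, asset)] -= amount
        self.deposit(dst, asset, amount)

bridge = SharedBridge()
bridge.deposit("base", "ETH", 5.0)
# A proof of the lock on Base unlocks ETH on Optimism without touching Ethereum L1
bridge.transfer({"valid": True}, "base", "optimism", "ETH", 2.0)
```

Because every chain shares the same escrow, the transfer is just a bookkeeping move gated by a proof – no 7-day withdrawal through the L1 is needed for the hop itself.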
However, in this case, the solver/user/other actor bears inventory risk. This comes in the form of the $DAI rate on Optimism being arbitraged, the cost of the NFT rising, the price of $ETH dropping, or any other risk arising between the time the user’s orderflow is originated and the time it is filled, with the respective party incurring the losses. Unlike DEX aggregators on a single chain, solvers that interact with different state machines are not privy to atomic composability, which ensures that all operations execute in a single, linear sequence and either all succeed or all fail together. This is because interactions between different state machines always require at least a one-block delay due to the potential risk of reorgs.
However, this doesn’t mean that the aforementioned use cases are not possible. Not only are these long-tail events, but there are solvers and other sophisticated actors willing to take on these risks and compensate for them by pricing them into user fees. For example, a solver can guarantee execution by covering losses if they occur, or by filling the user’s intents from its own balance sheet.
Another example of liquidity aggregation is the Optimism Superchain initiative. The Superchain, as defined by the Optimism documentation, is “a network of chains that share bridging, decentralized governance, upgrades, a communication layer and more – all built on the OP Stack.” The project focuses on aggregating liquidity, similar to the AggLayer: all chains that are part of the Superchain utilize a shared bridge contract, the first step towards aggregated liquidity between chains in the Superchain.
The difference between the Superchain and the AggLayer is that the AggLayer relies on ZK proofs for seamless cross-chain interoperability, whereas the Superchain relies on a shared sequencer between the chains opting in. While this post won’t get into the details of shared sequencing, you can refer to this to understand how shared sequencing unlocks benefits in the realm of seamless cross-chain interoperability and, to some extent, atomic composability (the same caveats elucidated above around cross-chain atomic composability apply here too).
Because the Superchain mandates that opting-in chains use the shared sequencer, it could limit the execution environments available to them. Other cumbersome challenges arise, such as chains losing access to the MEV their users create, in addition to other challenges outlined here. However, teams like Espresso are working on ways to redistribute MEV for chains utilizing a shared sequencer. Similarly, all chains connected to the Polygon AggLayer (which post ZK proofs to it) need to use the same ZK circuits, which could likewise limit the execution environments available to AggLayer-connected chains.
Frontier Research has developed the CAKE (Chain Abstraction Key Elements) framework, which can be seen above. This outlines the three layers (excluding the user-facing application layer) required to reach a state where:
“In a chain abstracted world, a user goes to a dApp’s website, connects their wallet, signs the intended operation and waits for eventual settlement. All the complexity of acquiring the required assets to the target chain and the final settlement gets abstracted away from the user, happening in [the three] infrastructure layers of the CAKE.”
The framework identifies the three infrastructure layers of the CAKE as the permission layer, the solver layer, and the settlement layer. We have already touched on the first two: the permission layer consists of account abstraction and policies (authorization, as we’ve called it), while the solver layer covers the intents and solvers discussed above. The settlement layer includes lower-level technology like oracles, bridges, pre-confirmations, and other back-end features.
As such, the settlement layer is expected to be greatly beneficial for solvers, other sophisticated actors, and user-facing applications, as its components all work together to help solvers manage their risk and provide better execution for users. This further extends into other components like data availability and execution proofs – all requirements for chains to provide a safe building experience for application developers and security guarantees that are eventually passed on to end-users.
The CAKE framework encompasses many of the concepts mentioned in this post and provides a coherent way of looking at the various components of chain abstraction and their relation to each other. Those interested in the framework can read this introductory article.
While we’ve already touched on a few projects spearheading the effort towards a chain abstracted future, here are a few other notable projects that are doing the same.
Particle Network is launching a modular L1 blockchain built on the Cosmos SDK, which will operate as a high-performance EVM-compatible execution environment. Originally, Particle debuted as an account abstraction service provider, enabling users to create smart contract wallets linked to their Web2 social accounts to then be used natively within dApp-embedded interfaces. Since then, the protocol has expanded its offerings, aiming to proliferate chain abstraction across the broader blockchain landscape through a suite of wallet, liquidity, and gas abstraction services on its L1.
Similar to other chain abstraction service providers, Particle envisions a future in which anyone will be able to easily transact across multiple chains through a single account, paying gas fees in any token they wish. As such, the underlying L1 will serve as a coordinator for the multi-chain ecosystem, unifying users and liquidity across EVM and non-EVM domains alike.
Let’s see how it works.
Particle offers a multi-faceted toolkit for chain abstraction services, each core technology playing a unique role as part of a greater whole.
From the perspective of an end-user, Particle’s chain abstraction stack starts from first principles: creating an account. Universal Accounts (UAs) on Particle function as ERC-4337 smart accounts attached to a pre-existing EOA (externally owned account), aggregating token balances across multiple chains into a single address by automatically routing and executing atomic cross-chain transactions. While a traditional crypto wallet can be used to create and manage an account, Particle’s WaaS also enables users to onboard with social logins.
To abstract away the various complexities of blockchain-native operations, a UA functions as a unified interface built on top of existing wallets, allowing users to deposit and use tokens across multiple blockchain environments as if they existed on a single chain. To maintain synchronous state across UAs, account settings are stored on the Particle L1, which serves as the central source of truth for every instance. The network then facilitates cross-chain messaging to deploy new instances or update existing ones.
As such, the Particle L1 acts as a coordination and settlement layer for all cross-chain transactions processed through Particle’s UAs.
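To make the idea of a unified balance concrete, here is a minimal sketch of how an account layer might present per-chain token holdings as a single figure. All names and shapes here are hypothetical illustrations, not Particle’s actual data model.

```typescript
// Illustrative only: a toy model of how a unified account view could
// aggregate per-chain token balances into one logical balance.
type ChainBalance = { chain: string; token: string; amount: number };

// Hypothetical helper: sums balances of the same token across chains,
// the way a Universal Account presents them as a single number.
function unifiedBalance(balances: ChainBalance[], token: string): number {
  return balances
    .filter((b) => b.token === token)
    .reduce((total, b) => total + b.amount, 0);
}

const holdings: ChainBalance[] = [
  { chain: "Base", token: "USDC", amount: 120 },
  { chain: "Arbitrum", token: "USDC", amount: 80 },
  { chain: "Ethereum", token: "ETH", amount: 1.5 },
];

// The user simply sees 200 USDC, regardless of which chains hold it.
```

The point of the sketch is that the per-chain breakdown still exists underneath; only the interface collapses it.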
Another key component of Particle’s chain abstraction services is the Universal Liquidity functionality. While UAs provide a means for users to express their transactional request through an interface, Universal Liquidity refers to the layer responsible for the automatic execution of these requests, which in turn enables a unification of balances across different networks. This feature is key to enabling cross-chain transfers which would otherwise be hindered by current barriers to entry, like purchasing the native gas token and creating a native wallet for a new network.
For instance, when a user wishes to purchase an asset on a blockchain they have never used before and hold no funds on, the liquidity needed for the purchase is automatically sourced from the user’s existing balances, which may well sit on a different chain and in a different token. This is made possible through Particle’s Decentralized Messaging Network (DMN), in which specialized services known as Relayer Nodes monitor external chain events and settle their final state. More precisely, relayers in the DMN use a Messaging Protocol to monitor the status of UserOperations on external chains and then settle the final execution status to the Particle L1.
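The sourcing step described above can be sketched as a simple routing problem: cover a required amount from whatever balances the user already holds. This is an assumption-laden illustration (a naive greedy strategy with invented types), not Particle’s actual routing logic.

```typescript
// Illustrative sketch: greedily source liquidity from a user's existing
// per-chain balances to fund a purchase on a chain they hold nothing on.
type Balance = { chain: string; token: string; amount: number };
type Route = { from: string; token: string; amount: number };

function sourceLiquidity(balances: Balance[], needed: number): Route[] {
  const routes: Route[] = [];
  let remaining = needed;
  for (const b of balances) {
    if (remaining <= 0) break;
    const take = Math.min(b.amount, remaining);
    routes.push({ from: b.chain, token: b.token, amount: take });
    remaining -= take;
  }
  if (remaining > 0) throw new Error("insufficient funds across chains");
  return routes;
}
```

A production router would also weigh bridge fees, slippage, and settlement risk when choosing which balances to draw from; the greedy pass above only shows the shape of the problem.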
The third pillar of Particle’s chain abstraction stack is the implementation of a Universal Gas Token - part of the network’s gas abstraction service. Accessed by interacting with Particle’s UAs, Universal Gas allows users to spend any token to pay for gas fees, meaning Bob can pay a transaction fee for a swap on Solana using his USDC on Base, while Alice pays a transaction fee for purchasing an NFT on Ethereum using her ARB token on Arbitrum.
When a user wishes to execute a transaction through a Particle UA, the interface will prompt the user to select their gas token of choice, which is then automatically routed through Particle’s native Paymaster contract. All gas payments are settled to their respective source and destination chains, while a portion of the fee is swapped into Particle’s native $PARTI token to be settled on the Particle L1.
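The fee flow above can be sketched as a simple split: most of the payment settles on the source and destination chains, while a fraction is swapped into the network token. The share and swap rate below are invented numbers for illustration, not Particle’s actual parameters.

```typescript
// Illustrative sketch of a universal-gas payment split. PROTOCOL_SHARE and
// partiRate are assumed values, not real Particle parameters.
const PROTOCOL_SHARE = 0.1; // assumed fraction converted into $PARTI

function splitGasPayment(feeInToken: number, partiRate: number) {
  // Portion swapped into $PARTI and settled on the Particle L1.
  const settledAsParti = feeInToken * PROTOCOL_SHARE * partiRate;
  // Remainder settled on the respective source and destination chains.
  const settledOnChains = feeInToken * (1 - PROTOCOL_SHARE);
  return { settledOnChains, settledAsParti };
}
```

So a 10-USDC fee at a swap rate of 2 $PARTI per USDC would settle 9 USDC on the chains involved and 2 $PARTI on the Particle L1, under these assumed parameters.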
Particle builds on top of its existing account abstraction infrastructure, for which it has reported over 17m wallet activations and over 10m UserOperations to date. The addition of a Universal Liquidity layer, coupled with a Universal Gas token, marks Particle’s expansion into providing chain abstraction services across a broader spectrum of users and participants. The Particle L1 is not meant to be another blockchain directly competing with today’s incumbents; rather, it seeks to provide an interoperability layer that connects them all, working with key teams in the chain abstraction sector, including the Near and Cake R&D teams.
The Particle Network L1 is currently in its testnet phase, allowing early participants to try out Universal Gas within an experimental UA implementation.
Near is a sharded Proof-of-Stake Layer 1 blockchain that serves as a full-stack application platform for developers building decentralized products and services. Much of Near’s core ethos revolves around bridging the gap between blockchain-native applications and mainstream audiences. Key to fulfilling this vision is abstracting the blockchain away from the end-user. Near approaches this with Account Aggregation, a multi-faceted architecture built to abstract away key pain points of using blockchain networks, such as switching wallets, managing gas fees, and bridging. It accomplishes this by funneling all operations through a single account.
Let’s dive deeper to better understand how this all works.
In addition to the alphanumeric public-key hashes standard on most blockchains today, Near’s account model maps each account to a human-readable name, e.g. alice.near. Near accounts also utilize two distinct types of access keys, enabling a single account to manage multiple keys across multiple blockchains, each key carrying the permissions and configurations unique to its domain: full access keys, which grant complete control over an account, and function call keys, which are limited to calling specified contract methods.
Further bolstering the abstraction of blockchains to the end-user is a simplified onboarding process with FastAuth, Near’s proprietary key management system. FastAuth enables users to sign up for a blockchain-native account with something as simple as their email address and uses passkeys, which replace passwords with biometrics, in place of long and complex seed phrases and passwords.
Multi-chain signatures are a key component of Near’s abstraction of blockchains, allowing any NEAR account to have associated remote addresses on other chains and to sign messages and execute transactions from those addresses. To enable this, Chain Signatures use the NEAR MPC (multi-party computation) network as the signer for these remote addresses, eliminating the need for explicit private keys. This is enabled by a novel threshold signature protocol, which implements a form of key resharing that allows the MPC signer to maintain the same aggregate public key, even as key shares and nodes constantly change.
Because the MPC signer nodes are themselves part of the NEAR network, smart contracts can initiate the signing process for an account. By combining a chain ID, a NEAR account ID, and a specific derivation path, each account can create an unlimited number of remote addresses on any chain.
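The key property here is that the (chain ID, account ID, path) tuple deterministically identifies a remote address. The sketch below stands in for that derivation with a plain hash; the real Chain Signatures scheme derives keys inside the MPC network using threshold cryptography, so this is an illustration of the determinism, not the actual mechanism.

```typescript
// Illustrative only: a hash stands in for MPC-based key derivation to show
// that (chainId, accountId, path) uniquely determines a remote address.
import { createHash } from "node:crypto";

function deriveRemoteAddress(
  chainId: string,
  accountId: string,
  path: string
): string {
  const digest = createHash("sha256")
    .update(`${chainId}:${accountId}:${path}`)
    .digest("hex");
  // Truncate to an EVM-style 20-byte address for illustration.
  return "0x" + digest.slice(0, 40);
}
```

Varying only the path gives the same NEAR account a fresh address on the same target chain, which is how one account can fan out to unlimited remote addresses.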
Another key issue hindering the development of a seamless user experience across the universal blockchain landscape today is that each blockchain requires gas fees to be paid in its own native token, requiring users to acquire these tokens prior to being able to use the underlying network.
NEP-366 introduced meta transactions to Near, a feature which allows transactions to be executed on Near without the user owning any gas or tokens on the chain. This is made possible through relayers, third-party services that receive signed transactions and relay them to the network while attaching the tokens necessary to subsidize their gas fees. From a technical perspective, the end user creates and signs a SignedDelegateAction, which contains the data necessary to construct a Transaction, and sends it to the relayer service. The relayer constructs and signs a Transaction from this data and sends the SignedTransaction to the network via an RPC call, paying the gas fees while the actions are executed on the user’s behalf.
To better illustrate what this may look like in practice, consider the following example: Alice wants to send Bob some of her $ALICE tokens, but lacks $NEAR tokens needed to cover gas fees. By using meta transactions, she creates a DelegateAction, signs it, and sends it to a relayer. The relayer, who pays the gas fees, wraps it in a transaction and forwards it on-chain, allowing the transfer to be completed successfully.
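The relayer flow in Alice’s example can be sketched with simplified types. These structures are loose stand-ins for NEAR’s actual DelegateAction and Transaction formats, and the signing here is mocked, so treat this purely as a shape of the flow.

```typescript
// Illustrative sketch (simplified types, not NEAR's actual structures):
// the user signs a delegated action; the relayer wraps it and attaches gas.
type DelegateAction = { sender: string; receiver: string; action: string };
type SignedDelegateAction = { delegate: DelegateAction; signature: string };
type Transaction = {
  payer: string; // the relayer, who covers gas
  gasAttached: number;
  inner: SignedDelegateAction;
};

// Step 1: the user signs the delegated action (signing mocked here).
function signDelegateAction(
  action: DelegateAction,
  key: string
): SignedDelegateAction {
  return { delegate: action, signature: `signed-with(${key})` };
}

// Step 2: the relayer wraps the signed action and covers the gas fee itself.
function relayerWrap(
  signed: SignedDelegateAction,
  relayer: string,
  gas: number
): Transaction {
  return { payer: relayer, gasAttached: gas, inner: signed };
}
```

Note the separation of concerns: the user’s signature covers only the intended action, while the outer transaction, and its gas, belong entirely to the relayer.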
The key to successfully delivering a seamless user experience across multiple blockchain networks is integrating and supporting those blockchains, even when they are competing businesses. Though Near is a competitive business in its own right, its growth strategy revolves around growing the industry as a whole, granting its users seamless and secure access to many other blockchains.
Here are some other teams building solutions for chain abstraction services worth keeping an eye on – this list is not necessarily exhaustive but instead provides a foundation for those interested in conducting further research into chain abstraction models.
Connext is a modular interoperability protocol which defined chain abstraction in their blog (May 2023) as a “pattern to improve dApp user experience by minimizing the need for users to care about the chain they’re on,” which accurately depicts the core principle chain abstraction service providers are building around today. Though Connext offers a set of smart contract modules for application developers through its Chain Abstraction Toolkit, its core feature is xCall, a primitive which enables smart contracts to interact with one another across different environments. The xCall function initiates a cross-chain transfer of funds, calldata, and/or various named properties, which the Chain Abstraction Toolkit wraps in simple logic, making integration a relatively simple process from a developer’s perspective.
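To give a feel for an xCall-style primitive, here is a minimal mock dispatcher. The parameter names loosely follow the shape described above (destination, target, asset, calldata), but this interface is an assumption for illustration, not Connext’s real contract ABI.

```typescript
// Illustrative mock (hypothetical interface, not the real xcall ABI):
// a dispatcher that records cross-chain call requests for later relay.
type XCallParams = {
  destination: string; // target chain/domain
  to: string;          // target contract address
  asset: string;       // token to transfer alongside the call, if any
  amount: number;
  callData: string;    // encoded call to execute on arrival
};

const queue: XCallParams[] = [];

// Enqueue a cross-chain request and return its message id.
function xcall(params: XCallParams): number {
  queue.push(params);
  return queue.length - 1;
}
```

In the real protocol, the enqueued message would be attested and relayed to the destination domain, where the calldata is executed against the target contract; the mock only captures the developer-facing call shape.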
Socket provides infrastructure for application developers building interoperability-centric products and services with secure and efficient data and asset transfers across chains. Socket 2.0 marks the protocol’s shift from cross-chain to chain abstraction services, highlighted by its flagship Modular Order Flow Auction (MOFA) mechanism, which aims to create a competitive marketplace for efficient chain-abstracted execution. Traditional OFAs involve a network of specialized actors competing to deliver the best possible outcome for an end-user request. Similarly, MOFA is designed to provide an open marketplace matching user intents with execution agents, called Transmitters. Within the MOFA, Transmitters compete to create and fulfill chain-abstracted bundles: ordered sequences of user requests that require the transfer of data and value across multiple blockchains.
Infinex is building a single UX layer aimed at unifying decentralized applications and ecosystems. Its flagship product, Infinex Account, is a multi-layered service that functions as a platform for integrating any on-chain application into a simplified UX for the end-user. At its core, the Infinex Account is a set of cross-chain smart contracts that can be controlled, secured and recovered via standard web2 authentication.
Brahma Finance is building its flagship Console product, an on-chain execution and custody environment aimed at enhancing user experience across DeFi, focused specifically on the EVM ecosystem. Brahma uses batched and chained transactions to synchronize operations across different chains, and Smart Accounts for on-chain interactions. The result is a user experience that enables seamless cross-chain interactions within a single UI.
Agoric is a Cosmos-native Layer 1 blockchain for building cross-chain smart contracts in JavaScript. The Agoric platform is designed with an asynchronous, multi-block execution environment, and aims to be the go-to environment for developing cross-chain applications. Agoric utilizes the Cosmos InterBlockchain Communication (IBC) Protocol for interchain communications, while leveraging Axelar’s General Message Passing (GMP) for interactions beyond the Cosmos ecosystem. Agoric’s Orchestration API simplifies developer experience by abstracting the complexities involved in cross-chain communication and smart contract execution, while the end-user benefits from applications with inherent chain abstracted features.
By now, the advantages that chain abstraction unlocks for end-users should be clear - the complexities of using blockchain-native applications are entirely abstracted away into a unified interface layer, creating a global and chain-agnostic point of contact for anyone who wants to participate.
Equally important, chain abstraction could unlock a huge benefit for blockchain applications. Currently, Web2 developers don’t “choose” where to deploy their application; Airbnb, for instance, is available to anyone with an internet connection. In the Web3 landscape, however, application developers need to choose where to deploy their application (for example, on Ethereum, Solana, or Cosmos). Not only does this limit TAM, but it also means that application developers are encumbered by needing to choose the “right” chain to deploy on. This is not only a hard decision but a crucial one: a handful of extremely successful applications have struggled due to limitations of their underlying blockchain. Furthermore, with the continuous development and evolution of blockchains today, the “right” chain may constantly be changing. In a chain abstracted future, application developers are no longer forced to select a chain that their success is tied to.
It is evident that we are headed toward an increasingly multichain future, which will inevitably compound the UX issues that stand among the most critical barriers to mainstream adoption. We believe chain abstraction, with its various components, is a possible solution to many of crypto’s UX problems today.