After the Cancun Upgrade, What Lies Ahead for Ethereum?

Advanced · 6/3/2024, 6:38:28 AM
Ethereum seems to have entered a bottleneck period, and although data availability and user costs have improved, it is still unable to keep up with the growing demand for data. Thinking deeply, what is the bottleneck of Ethereum?

*Forwarded under the original title ‘NHR:坎昆升级之后,以太坊前路在何方?(Ethereum特别篇)’ (‘NHR: After the Cancun Upgrade, What Lies Ahead for Ethereum? (Ethereum Special Edition)’)

TL;DR

Since the Cancun Upgrade (Dencun) took place two months ago, the Ethereum ecosystem seems to have hit a rough patch despite the major upgrade focusing on “improving gas fees”. On the surface, mainnet gas fees have indeed continued to decline after the upgrade, with a maximum decrease of over 70%. However, the underlying reasons for this phenomenon lie in the overall sluggishness of the Ethereum ecosystem and the lackluster market performance.

On one hand, ETH’s long-term trend is weaker than BTC’s even as their volatility converges, a combination unseen in previous bull cycles. A low ETH/BTC exchange rate, shrinking market share, and the uncertain prospects of Ethereum ETFs all point to a lack of confidence and interest in the market.

On the other hand, the no-show of the expected “altcoin season” has shaken investors’ faith in altcoins. L2 tokens plunged more than 60% from their peaks within a month of the Cancun Upgrade. ARB, the category leader, has fallen back to near its bear-market lows and looks “unrecoverable”, its earlier gains falling short of what a flagship project should deliver. In contrast, newer ecosystems such as Solana and Ton have grown on the back of memes.

Shaped as it is by the macro environment, this is still only a surface phenomenon. Intrinsically, however, the Cancun Upgrade does not appear to have brought substantial benefits to the L2 ecosystem. Data shows that L2 total value locked (TVL) rose only about 3% at its post-upgrade peak (Mar. 13 - Apr. 9) and then fell 20% in the market crash. Although that drop is milder than the token-price declines, TVL is back at pre-upgrade levels, suggesting the positive expectations around the upgrade have been almost entirely unwound. Still, there are some encouraging signs: 1) on-chain activity such as DeFi trading has held steady (the following figure takes Arbitrum as an example); 2) networks such as Base and Linea saw a temporary TVL bump driven by meme popularity, which at least shows the ecosystem is not completely dead.

Data source: DeFiLlama

Meanwhile, the user experience still varies widely across L2 networks, mainly in transaction costs. The leading ecosystems enjoy a clear cost advantage, with fees stabilizing at around $0.01, while mid-tier ecosystems (especially some zkRollups) charge roughly ten times more (about 5-7x for non-priority transactions), which is roughly the average level seen before the Cancun upgrade. The following figure shows incomplete statistics on L2 transaction fees for April.

Data source: L2Fee (as of Apr. 18)

At least for now, EIP-4844’s effect on transaction fees looks limited, which may be one reason for the growing fatigue.

Obviously, fees, a long-standing problem for Ethereum, cannot be solved all at once, because overall costs also depend on other factors:

1) Issues at the L2 level

i) Calldata transactions (blue section in the figure below) still account for a significant share of operating expenses across L2s, mainly because some operators (mostly zkRollups) still settle via calldata, which distorts pricing. These projects simply have not yet migrated to the new blob mechanism, and the situation should not persist for long.


Data source: L2BEAT (as of Apr. 18)

ii) Computational costs (pink section in the figure above) make up a large part of zkEVM operating expenses, and protocol-layer fees, typically lower TPS, and the need for operator profit all push user fees higher. L1 settlement fees have indeed dropped sharply, but the remaining cost burden is ultimately borne by users.

As a result, L2 operators’ profit models have been largely untouched by EIP-4844, since the base fee for EVM gas is set by developers/operators according to current demand. Rollups with the same revenue expectations but different business volumes and computational expenses will therefore price gas differently for users, producing the observed disparities.

Below is a simple fee propagation diagram ↓
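The same propagation can be put into a toy numeric sketch. The function, parameter names, and dollar figures below are illustrative assumptions, not any rollup’s actual pricing formula; the point is only that two operators with the same margin target but different proving costs end up quoting very different user fees:

```python
# Hypothetical sketch of how an L2 operator might price user gas.
# All names and numbers are illustrative assumptions.

def l2_user_fee(l1_blob_fee_per_tx: float,
                proving_cost_per_tx: float,
                target_margin: float) -> float:
    """User pays the L1 settlement share + off-chain compute share + operator margin."""
    cost = l1_blob_fee_per_tx + proving_cost_per_tx
    return cost * (1.0 + target_margin)

# An optimistic rollup posting cheap blob data, with negligible proving cost:
op_fee = l2_user_fee(l1_blob_fee_per_tx=0.002, proving_cost_per_tx=0.0, target_margin=0.2)

# A zkRollup with the same margin target but real proving expenses:
zk_fee = l2_user_fee(l1_blob_fee_per_tx=0.002, proving_cost_per_tx=0.03, target_margin=0.2)

print(f"optimistic: ${op_fee:.4f}, zk: ${zk_fee:.4f}")
```

Under these made-up numbers the zkRollup’s fee comes out more than an order of magnitude above the optimistic rollup’s, consistent with the roughly tenfold gap described earlier.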

iii) While zkEVM gas expenditure is typically higher, gas spikes have also hit Base (built on the OP Stack), with fees at times exceeding $1. These spikes stem from network congestion caused by speculative meme activity and a temporary imbalance in block resource allocation. They expose the limited capacity of some L2 networks under extreme conditions, and even intermittent episodes of this kind are unacceptable to users.

Network conditions are thus an undeniable factor in L2 transaction costs. An engine running “too cold” or “too hot” both hurt “fuel consumption”, and the leading ecosystems, with their stable network conditions, are simply more “fuel-efficient”.

2) Issues at the L1 level

For the Ethereum mainnet, each block can currently carry about 3 blobs, adding nearly 400 KB of data space and lifting the theoretical maximum TPS to roughly 300-1500, yet even that may not be enough. Given the distinct properties of blobs, they will foreseeably be adopted not only by L2s but by other mainnet data demands, including large volumes of low-value, high-frequency transactions (such as blobscriptions). These often need no strong data availability guarantees yet pay under the same gas pricing standard as regular blob users, which can lead to unnecessary occupation of the space.
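The capacity figures cited here can be sanity-checked with quick arithmetic. The blob parameters below are the published EIP-4844 values (4096 field elements of 32 bytes per blob, a target of 3 blobs per block); the bytes-per-transaction figures are assumptions chosen to show which compression levels reproduce the quoted 300-1500 TPS range:

```python
# Back-of-envelope check of the blob capacity figures cited above.
# EIP-4844: each blob is 4096 field elements x 32 bytes = 128 KiB;
# the target is 3 blobs per block (max 6), one block every ~12 s.

BLOB_BYTES = 4096 * 32          # 131_072 bytes = 128 KiB per blob
TARGET_BLOBS = 3
SLOT_SECONDS = 12

target_bytes = TARGET_BLOBS * BLOB_BYTES    # ~384 KiB of new DA space per block
print(f"target blob space per block: {target_bytes / 1024:.0f} KiB")

# The quoted 300-1500 TPS range corresponds to assuming a compressed
# rollup transaction takes roughly 22-110 bytes of blob space:
for tx_bytes in (110, 22):
    tps = target_bytes / tx_bytes / SLOT_SECONDS
    print(f"~{tx_bytes} B/tx -> ~{tps:.0f} TPS of DA-limited L2 capacity")
```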

A similar pattern has already appeared at the L2 level. Paradigm’s research indicates that Base has long carried a heavy load of junk data, imposing long-term storage burdens on the network that may be passed on to users indirectly through gas pricing.

Data source: Paradigm

In the short run, steadily growing resource-intensive activity can also erode L1’s effective payload: when such activity takes up a share of blob space, the real TPS that L1 can offer L2s shrinks in direct proportion.
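That proportional shrinkage is easy to model. Everything here is an illustrative assumption (the 110 bytes-per-transaction figure in particular), reusing the EIP-4844 target of 3 × 128 KiB of blob space per 12-second block:

```python
# Toy model: if inscription-style traffic occupies a share of the blob
# target, the DA throughput left for rollups shrinks in direct proportion.
# Numbers are illustrative assumptions.

TARGET_BLOB_BYTES_PER_BLOCK = 3 * 131_072   # EIP-4844 target: 3 x 128 KiB
SLOT_SECONDS = 12

def effective_l2_tps(spam_share: float, bytes_per_tx: float = 110.0) -> float:
    """DA-limited rollup TPS after `spam_share` of blob space goes to non-rollup data."""
    usable = TARGET_BLOB_BYTES_PER_BLOCK * (1.0 - spam_share)
    return usable / bytes_per_tx / SLOT_SECONDS

for share in (0.0, 0.25, 0.5):
    print(f"{share:.0%} of blob space lost -> ~{effective_l2_tps(share):.0f} TPS left for rollups")
```

Halving the available blob space halves the DA-limited throughput, which is the linear decrease described above.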

In addition, the long-term burden on the Ethereum consensus layer is also worth noting. Since the Blob capacity is approximately 10 times that of the main blocks, generating and processing larger amounts of data also require higher computational power. Therefore, will the hardware requirements for validators increase, and will production costs be passed on to users through Gas pricing?

The obvious conclusion is that Ethereum appears to have entered a bottleneck period: data availability and user costs have improved, yet they still cannot keep pace with ever-growing data demand. On deeper reflection, what exactly is Ethereum’s bottleneck?

A person hits a bottleneck when their capabilities fall short of their own expectations, and Ethereum’s bottleneck is similar: a mismatch between the scale of its ecosystem and its carrying capacity. It is like a car packed with passengers and clutter: with limited engine power it not only runs slowly but burns more fuel. The high costs mainly go toward sustaining the consensus system, which is crucial and indispensable for Ethereum. But when that burden passes through to users, it becomes a key obstacle to Web3’s mass adoption.

Of course, the simplest and most direct solution is to lower the price of ETH, and costs will naturally decrease. We would like that, but developers certainly won’t allow it…

During the market downturn and lull in ecosystem activity, both mainnet and L2 transaction fees have “improved significantly” (as of May 12). By contrast, next-generation PoS chains attack the cost problem through underlying scalability: Solana, Aptos, and Sui lean on parallel processing to raise throughput, while the Cosmos camp emphasizes modularity (Celestia) to expand along the ecosystem dimension, or uses high programmability to build specialized high-performance application chains (Sei). Ethereum has generality and a large user base on its side, but underlying scalability remains one of its biggest pain points.

At the market level, competition for user traffic among the major public-chain camps has grown increasingly fierce. The Bitcoin ecosystem is expanding rapidly, Solana is riding a fresh wave of meme-driven momentum, and newer camps like Sui and Ton have real network-effect potential. Ethereum, by contrast, seems stuck in a low-growth phase due to high costs and scaling difficulties; since the Cancun upgrade its “investment attractiveness” has declined, leaving people to wonder how much “ecosystem dividend” it can still pay out. So, is there no way out? Do we just watch ETH plummet? We would like that, but developers certainly won’t allow it…

The Cancun upgrade was never going to be a “panacea”; a chronic condition responds only to “long-term treatment”. For Ethereum, scalability remains the central development theme, headlined by the long-term roadmap toward Danksharding.

Looking ahead, can Danksharding help Ethereum break through its current predicament?

Figuratively speaking, to maximize a block’s capacity you have to grow all three of its baseline dimensions (length, width, and height) at once, and Ethereum’s massive data demands require exactly that. Danksharding is that kind of “three-dimensional” scaling solution, and its core is “sharding” technology. Sharding is not a new concept (a notable asynchronous-consensus variant was proposed by Professor Jiaping Wang in 2019), yet its scalability is no weaker than other mainstream approaches. Mainstream solutions focus on fine-grained low-level design, but parallel computing offers no advantage in verification and confirmation time, and Tendermint plus an optimistic process, while time-efficient, implies higher security risk and works against decentralization, which is why those designs usually land on application chains. As a general-purpose chain, Ethereum has to make multiple trade-offs, and sharding’s combined advantages are:

  • Asynchronous consensus group: Splitting blocks and validator subsets to achieve “parallelization” of the consensus layer, significantly improving verification efficiency and scalability;

  • Random rotation mechanism: Eliminating the correlation between validator combinations to ensure decentralization;

  • Sharding security: based on cryptographic and probabilistic confidence levels in the data, meeting the general baseline security definition along with strong privacy requirements.

As the best compromise among scalability, decentralization, and security, sharding suits Ethereum best. The arrival of blob data blocks marks EIP-4844 as Ethereum’s first step toward sharding (though asynchronous verification is not yet in place). Danksharding then refines the original concept to make scaling more “three-dimensional”, mainly shifting from “splitting” to “expanding”: raising the blob count to as many as 64 per block and introducing cross-verification. Its “three-dimensionality” lies in horizontally expanding blob capacity to drive blob adoption by layer-2/layer-3 ecosystems, thereby improving vertical scalability as well.

Can transaction fees, the thing ordinary users care about most, come down thanks to Danksharding? For L2 users, mainnet settlement fees are already ultra-low. For L1 users, ample mainnet space helps damp gas volatility, and under normal conditions the average gas per block should fall accordingly.

However, no technology is perfect. First, sharding is extremely hard engineering. Even though Danksharding’s optimizations have reduced the project’s difficulty, it will still take years to complete and imposes strict conditions on the chain itself, such as a sufficiently large volume of data to process and very high demands on dynamic coordination across the whole system.

Next, because blobs are time-limited and expire, limits on their data availability are unavoidable. To keep the vast amount of blob data in Danksharding trustworthy, the data must be “spot-checked”, so Data Availability Sampling (DAS), based on mathematical confidence levels, will be introduced to Ethereum. On one hand, DAS reduces the storage burden on nodes while still guaranteeing the “probability of correctness”, or validity, of historical data. On the other hand, it requires all nodes, including light nodes, to perform sampling to secure the whole, raising the bar for every ecosystem participant.

Moreover, growing mainnet traffic means growing total load. The network may need to handle up to 600 KB of data per second, roughly 1 GB every 30 minutes. Validators’ hardware and software performance, communication speed, and resource allocation will face higher requirements, or else pose new cost challenges to the ecosystem. Ethereum will therefore remain under heavy pressure all the way from Dencun to Danksharding, and falling L1 fees do not necessarily translate into structural improvements in L2 fees (L2s have their own congestion problems). Even so, developers will keep shipping updates to strengthen the ecosystem’s flexibility, such as:

1) Verkle Trees: an improved data structure based on Merkle trees and elliptic curves that turns raw data into feature vectors for quick verification of simple hashes, cutting information redundancy and storage burden (expected to land after the future Osaka upgrade).
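Two of the figures above can be checked with short calculations. The DAS bound below is the standard argument under 2x erasure coding (data is unrecoverable only if more than half the extended chunks are withheld, so each uniform sample then fails with probability at least 1/2); the sample counts are illustrative:

```python
# Two quick sanity checks on the Danksharding numbers above.

# 1) DAS confidence: with 2x erasure coding, data is unrecoverable only
# if more than half of the extended samples are withheld, so each random
# sample hits a missing chunk with probability >= 1/2. After k successful
# independent samples, the chance the data is actually unavailable is at
# most (1/2)**k.
def das_confidence(k: int) -> float:
    return 1.0 - 0.5 ** k

print(f"30 samples -> confidence {das_confidence(30):.12f}")

# 2) Load arithmetic: 600 KB/s sustained for 30 minutes.
kb_per_s = 600
total_gb = kb_per_s * 30 * 60 / 1_000_000
print(f"30 min at {kb_per_s} KB/s -> ~{total_gb:.2f} GB")
```

A few dozen samples already push the confidence extremely close to 1, which is why light nodes can participate in DAS without downloading full blobs; and 600 KB/s does indeed work out to about 1 GB per half hour.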

2) KZG commitments: a more economical consensus-layer coordination scheme within Ethereum. Validators can swiftly verify the validity of blob data against commitments derived from a trusted setup, without inspecting the complete data, akin to an “embedded zero-knowledge proof” inside L1.

3) Smart contract accounts (EIP-3074): expected in the Pectra upgrade, this proposal advances account-abstraction technology and brings users several new wallet features, including batched transactions, asset recovery, and paying gas in non-ETH tokens. That means you could pay gas in stablecoins without worrying about price drops, or pay in altcoins while enjoying their upside. Yes, please.

Finally, some questions and suggestions to ponder:

On the technical front:
- DAS offers cryptographic protection, and zk-STARK research indicates that the actual distribution of randomly sampled satisfying constraints is highly concentrated; is it worth adopting for other layered networks?
- The prospects of implementing parallel computation on EVM chains.
- Methods and penalty mechanisms for validator-combination correlation.
- Can infrastructure like Celestia’s (TIA) DA layer become a novel outside-Ethereum DA solution?

On the cost front:
- Multi-dimensional gas pricing (proposed by Vitalik Buterin), no longer constrained by a single linear factor.
- Balancing token price appreciation against the baseline setting of base fees to regulate overall ecosystem costs.

On the ecosystem front:
- Can domains like social, gaming, DeFi, esports, and generative AI become X-factors for ecosystem growth?
- Add the following ingredients to water, in moderation…
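The idea behind multi-dimensional gas pricing can be sketched briefly. Each resource (execution gas, blob data, and so on) gets its own base fee that adjusts independently, which is already how EIP-4844 treats blob gas. The exponential update rule below follows that mechanism in spirit; the `target` and `update_fraction` parameters are illustrative, not the protocol’s actual constants:

```python
# Sketch of a multidimensional fee market: each resource carries its own
# base fee that adjusts independently of the others. The exponential
# update below mirrors the spirit of EIP-4844's blob fee mechanism;
# parameters are illustrative.

import math

def next_base_fee(base_fee: float, used: float, target: float,
                  update_fraction: float) -> float:
    """Raise the fee when usage is above target, lower it when below."""
    excess = used - target
    return base_fee * math.exp(excess / update_fraction)

# The blob fee reacts only to blob demand, leaving execution gas untouched:
blob_fee = 1.0
for used_blobs in (6, 6, 6, 3, 0):   # blocks at max, at target, then empty
    blob_fee = next_base_fee(blob_fee, used_blobs, target=3, update_fraction=8)
    print(f"{used_blobs} blobs used -> blob base fee {blob_fee:.3f}")
```

Sustained demand above target compounds the fee upward, a block exactly at target leaves it unchanged, and an empty block pulls it back down; pricing each resource this way stops blob congestion from spilling into execution-gas prices and vice versa.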

. . .

Summary: The future of Ethereum depends on the growth of its community.

For a long time, L2 has relied on L1 for scalability, so when L1 hits a bottleneck, the whole ecosystem’s development is constrained. The Cancun upgrade marks a change in how L1 and L2 collaborate, so Ethereum’s growth can be framed as a transformation of “production relations”: L1 must now, in turn, solve problems for L2, actively create more favorable conditions for L2’s development, and gradually maximize the efficiency and reach of those relations. But a blockchain is ultimately just a machine, and machines have flaws, lifespans, and limits; the key lies with those who control the machine, namely the community. After a decade of development, this community still has one excellent trait: it actively seeks self-improvement to forge the machine into a stronger one, however long and arduous that road may be.

So will Ethereum ultimately succeed? Perhaps that depends on how we define a blockchain’s “success”: a hundredfold increase in value? Throughput in the tens of thousands? A user base in the billions? It is hard to pin down. But at least we can foresee that Ethereum still has “growth potential”, that is, more or less room left to grow. Using our ALL-ON-LINE™ algorithm, there is a significant correlation between asset-implied value and some form of data depth, and NEM, built on Ethereum, has the largest on-chain adoption of discrete data stacks among all blockchains. Setting on-chain adoption against off-chain demand, a simple algorithmic model yields the following estimated valuation levels for ETH:

Note: The above are long-term expected values, for reference only.

As for the market aspect, this article cannot provide concrete advice. When it comes to so-called “growth assets,” investors can only give themselves more confidence and time. Of course, whether they have a “promising future” or are just “pure garbage” can only be answered by time. But if you believe they are utterly “trash coins,” consider the following suggestions:

  1. Sell your trash coins to the 🐕 Whale;

  2. Spend your trash coins as fees on some future day;

  3. Treat us as a trash can and dump those trash coins here… (without paying)

Disclaimer:

  1. This article is reprinted from [NoHedge], originally titled ‘NHR:坎昆升级之后,以太坊前路在何方?(Ethereum特别篇)’ (‘NHR: After the Cancun Upgrade, What Lies Ahead for Ethereum? (Ethereum Special Edition)’). All copyrights belong to the original author [NHR Team]. If there are objections to this reprint, please contact the Gate Learn team, and they will handle it promptly.
  2. Liability Disclaimer: The views and opinions expressed in this article are solely those of the author and do not constitute any investment advice.
  3. Translations of the article into other languages are done by the Gate Learn team. Unless mentioned, copying, distributing, or plagiarizing the translated articles is prohibited.

After the Cancun Upgrade, What Lies Ahead for Ethereum?

Advanced6/3/2024, 6:38:28 AM
Ethereum seems to have entered a bottleneck period, and although data availability and user costs have improved, it is still unable to keep up with the growing demand for data. Thinking deeply, what is the bottleneck of Ethereum?

*Forward the Original Title ‘NHR:坎昆升级之后,以太坊前路在何方?(Ethereum特别篇)’

TL;DR

Since the Cancun Upgrade (Dencun) took place two months ago, the Ethereum ecosystem seems to have hit a rough patch despite the major upgrade focusing on “improving gas fees”. On the surface, mainnet gas fees have indeed continued to decline after the upgrade, with a maximum decrease of over 70%. However, the underlying reasons for this phenomenon lie in the overall sluggishness of the Ethereum ecosystem and the lackluster market performance.

On one hand, the long-term trend of ETH is weaker than BTC, despite their volatility tending to converge, a phenomenon unprecedented in historical bull cycles. Factors such as low ETH exchange rates and market share, as well as the uncertain prospects of Ethereum ETFs, reflect a lack of confidence and interest in the market. On the other hand, the disappointment of the “altcoin season” has caused investors to lose confidence in altcoins. L2 tokens have plummeted by over 60% from their peak within a month after the Cancun Upgrade, with ARB, which dominates the market, becoming “unrecoverable”, dropping back to near bear market lows, and its previous gains failing to meet expectations as a leading project. In contrast, new ecosystems such as Solana and Ton have experienced growth due to the power of memes. Influenced by the macro environment, this is still only a “surface phenomenon”. However, from an intrinsic perspective, it seems that the Cancun Upgrade has not brought substantial benefits to the L2 ecosystem. Data shows that the total value locked (TVL) of the L2 ecosystem increased by only about 3% at most after the upgrade (03.13 - 04.09), and then plummeted by 20% with the market crash, although the decline is smaller compared to token prices, it still fell back to the level before the upgrade, indicating that the positive expectations brought by this upgrade have been almost wiped out. However, there are some positive signs: 1) On-chain activities such as DeFi transactions have remained stable (the following figure takes Arbitrum as an example); 2) Networks such as Base and Linea have experienced a temporary increase in TVL driven by MEME popularity, which at least indicates that the ecosystem has not completely “extinct”.

Data source: DeFiLlama

On the other hand, there is still a significant gap in the user experience of different L2 networks (mainly in terms of transaction costs). The leading ecosystems enjoy a significant cost advantage, with transaction fees stabilizing at around $0.01, while medium-sized ecosystems (especially some zkRollups) are approximately ten times higher than the former (about 5-7 times higher for non-priority transactions), which was the average level before the Cancun upgrade. The following figure shows incomplete statistics on L2 transaction fees for April.

Data source: L2Fee (as of Apr. 18)

At least for now, we see that the improvement effect of EIP-4844 on transaction fees is not significant and relatively limited, which may also be one of the reasons for the growing fatigue.

Obviously, the issue of fees, as a long-standing problem for Ethereum, cannot be solved all at once, as overall costs also depend on other aspects:

1) Issues at the L2 level i) Calldata transaction types (blue section in the figure below) still account for a significant proportion of operating expenses across all L2s, mainly because some operators (mostly zkRollups) are still using this settlement method, leading to pricing deviations. This is because some projects have not yet caught up with the technology in a timely manner, but this situation will not persist in the long term.


Data source: L2BEAT (as of Apr. 18)

ii) Computational costs (pink section in the figure above) account for a large part of the operating expenses of zkEVM, while the presence of protocol layer fees, the typically lower TPS, and the need for inherent profits lead users to pay higher fees. Settlement fees on L1 have indeed significantly decreased, but this portion is actually borne by users.

Therefore, the profit model of L2 developers has almost not been affected by EIP-4844, as the base fee for EVM Gas is determined by developers/operators based on current demand. Assuming Rollups have the same revenue expectations but different business volumes and computational expenses, they will implement different Gas pricing for users, resulting in disparities.

Below is a simple fee propagation diagram ↓

iii) While the Gas expenditure of zkEVM is typically higher, instances of Gas spikes have also occurred within the Base of the OP Stack, with fees even exceeding $1 at times. This situation arises from network congestion caused by speculative meme activity and a temporary imbalance in block resource allocation. It reflects the limited capacity of some L2 networks to handle extreme conditions, which, if occurring intermittently, is equally unacceptable for users.

Therefore, the network status is an undeniable factor affecting L2 transaction costs. “Machine too cold” or “machine too hot” both affect “fuel consumption,” and due to the stable network status of leading ecosystems, they also demonstrate more “fuel efficiency”.

2) Issues at the L1 level

For the Ethereum mainnet, although each block can currently produce about 3 Blob data blocks, theoretically increasing the space by nearly 400KB and the maximum TPS to 300-1500, there may still not be enough. Due to the uniqueness of Blob, it is foreseeable that it will not only be gradually adopted by L2 but also by other data demands from the mainnet, including a large number of low-value/high-frequency transactions (such as blobscription), which often do not require high data availability but adopt the same Gas pricing standards as regular transactions. This may lead to unnecessary space occupation.

A similar situation has been reflected in the L2 ecosystem. Paradigm’s research indicates that Base has long been burdened with too much garbage data, leading to long-term data storage burdens on the network. These burdens may indirectly pass on to users through Gas pricing.

Data source: Paradigm

In the short-term dynamic process, continuously growing resource-intensive activities can also weaken the effective payload capacity of L1. Because such activities may significantly occupy Blob space in the short term, it means that the actual TPS provided by L1 to L2 decreases linearly.

In addition, the long-term burden on the Ethereum consensus layer is also worth noting. Since the Blob capacity is approximately 10 times that of the main blocks, generating and processing larger amounts of data also require higher computational power. Therefore, will the hardware requirements for validators increase, and will production costs be passed on to users through Gas pricing?

In conclusion, an obvious conclusion is that Ethereum seems to have entered a bottleneck period. Although data availability and user costs have improved, they still cannot keep up with the increasingly growing data demands. Upon deeper reflection, what is Ethereum’s bottleneck?

An individual enters a bottleneck period often because their capabilities are not sufficient to meet their own expectations. Similar to individuals, Ethereum’s bottleneck is the mismatch between the scale of the ecosystem and its carrying capacity. It’s like a car filled with passengers and clutter, if the engine power is limited, not only will it run slowly, but it will also consume more fuel. High costs are mainly used to maintain the sustainability of the consensus system, which is crucial and indispensable for Ethereum. However, when this impact indirectly passes on to the user level, it becomes a key obstacle for Web3 to achieve mass adoption.

Of course, the simplest and most direct solution is to lower the price of ETH, and costs will naturally decrease. We would like that, but developers certainly won’t allow it…

During the downturn in the market and the period of low activity in the ecosystem, both mainnet and L2 transaction fees have “significantly improved” (as of May 12th), compared to this, the next-generation PoS chains benefit from the updates in the underlying technology to solve the cost problem by improving scalability. For example, Solana, Aptos, and Sui focus on parallel processing to increase throughput, while the Cosmos blockchain emphasizes modularity (Celestia) to expand the ecosystem dimension, or utilizes high programmability to create specialized high-performance application chains (Sei). Although Ethereum has general advantages and a large user base, the underlying scalability is still one of its biggest pain points in development.

At the market level, the competition for user traffic among major public chain factions has become increasingly fierce. The Bitcoin ecosystem is growing rapidly, Solana continues its new round of growth momentum with memes, and new factions like Sui and Ton also have great network effect potential. However, Ethereum seems to be in a low-growth phase due to high costs and scalability difficulties, and after the Cancun upgrade, the “investment attractiveness” has decreased, leaving people questioning how much “ecosystem dividends” it can still bring. So, is there no way out? Do we just watch ETH plummet? We would like that, but developers certainly won’t allow it…

The Cancun upgrade is certainly not a “panacea”, and “long-term treatment” is often effective in solving “symptoms”. For Ethereum, scalability remains the most important development theme, including the long-term scalability project aimed at Danksharding.

Looking ahead, can Danksharding help Ethereum break through its current predicament?

Figuratively speaking, if you want to maximize the growth of a block’s capacity, then its three baseline dimensions (length, width, and height) must all increase simultaneously. Since Ethereum needs to cater to massive data demands, it must achieve this. Therefore, Danksharding is such a “three-dimensional” scalability solution. The core of Danksharding is “sharding” technology. Although sharding is not a new concept (first proposed by Professor Jiaping Wang in 2019), its scalability is no less than other mainstream technologies. Mainstream solutions focus on fine-grained underlying design, but parallel computing has no advantage in verification and confirmation time, and Tendermint + Optimistic Process, although ensuring time efficiency, implies higher security risks and is not conducive to decentralization, which is why they are usually deployed on application chains. As a general-purpose chain, Ethereum usually needs to make multiple trade-offs, and the comprehensive advantages embodied by sharding are:

  • Asynchronous consensus group: Splitting blocks and validator subsets to achieve “parallelization” of the consensus layer, significantly improving verification efficiency and scalability;

  • Random rotation mechanism: Eliminating the correlation between validator combinations to ensure decentralization;

  • Sharding: Based on cryptography and probability theory data confidence levels to meet the general underlying security definition and strong privacy requirements. Therefore, the best compromise solution that combines scalability, decentralization, and security is most suitable for Ethereum. The appearance of Blob data blocks indicates that EIP-4844 is the first step for Ethereum to implement sharding (but asynchronous verification has not yet been achieved). Danksharding then makes some improvements based on the original concept to make scalability more “three-dimensional”, mainly changing from “splitting” to “expanding”, increasing the number of Blob to up to 64, and implementing cross-verification. Its design of “three-dimensionality” lies in promoting the adoption of Blob by second-layer/third-layer ecosystems through horizontal expansion of Blob, thereby also improving vertical scalability.

Can Danksharding reduce transaction fees, the thing ordinary users care about most? For L2 users, the answer is already visible: they have been enjoying ultra-low mainnet settlement fees. For L1 users, ample mainnet space helps dampen Gas volatility, and under normal conditions the average Gas per block should decline as well.
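Why Blob settlement is so cheap can be seen in EIP-4844's fee rule: the blob base fee is an exponential function of the running `excess_blob_gas`, computed with the spec's integer approximation `fake_exponential`. When blob space is under-used, excess blob gas trends toward zero and the fee floors at 1 wei (constants below are taken from the EIP-4844 specification):

```python
MIN_BLOB_BASE_FEE = 1                    # wei, spec constant
BLOB_BASE_FEE_UPDATE_FRACTION = 3338477  # spec constant

def fake_exponential(factor: int, numerator: int, denominator: int) -> int:
    """Integer approximation of factor * e**(numerator / denominator),
    as defined in the EIP-4844 specification."""
    i = 1
    output = 0
    numerator_accum = factor * denominator
    while numerator_accum > 0:
        output += numerator_accum
        numerator_accum = numerator_accum * numerator // (denominator * i)
        i += 1
    return output // denominator

def blob_base_fee(excess_blob_gas: int) -> int:
    return fake_exponential(MIN_BLOB_BASE_FEE, excess_blob_gas,
                            BLOB_BASE_FEE_UPDATE_FRACTION)

# With no excess blob gas, the blob base fee sits at its 1 wei floor
assert blob_base_fee(0) == 1
```

The exponential update also explains the stabilizing effect mentioned above: sustained demand above the per-block target raises `excess_blob_gas` and the fee climbs geometrically, pricing out the excess rather than letting congestion spike unboundedly.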

However, no technology is perfect. First, sharding is extremely difficult to engineer. Although Danksharding's optimizations have reduced the scale of the project, it will still take years to complete, and it demands strict preconditions from the chain itself: a sufficiently large volume of data to process, and very high dynamic coordination across the entire system.

Next, because Blobs are time-limited and expire, limits on their data availability remain unavoidable. To keep the vast amount of Blob data in Danksharding trustworthy, the data must be "spot-checked": Data Availability Sampling (DAS), based on mathematical confidence levels, will therefore be introduced to Ethereum. On one hand, DAS reduces the storage burden on nodes while still guaranteeing the "probability of correctness", that is, the validity, of historical data. On the other hand, it requires every node, light nodes included, to perform sampling to strengthen overall security, which raises the participation threshold for all ecosystem participants.

In addition, growing mainnet traffic means growing overall load. The network may need to handle up to 600 KB of data per second, on the order of 1 GB every 30 minutes. Validators' hardware and software performance, communication speed, and resource-allocation capacity must therefore meet higher requirements, or the ecosystem faces new cost pressures. Ethereum will remain under heavy strain from Dencun through Danksharding, and a decrease in L1 fees does not automatically mean a structural improvement in L2 fees (L2s can become congested in their own right). Developers will nonetheless keep introducing updates to strengthen the ecosystem's flexibility, for example:

1) Verkle Trees: an improved data structure based on Merkle trees and elliptic curves that turns raw data into feature vectors so that simple hashes can be verified quickly, reducing information redundancy and storage burden (expected to land after the future Osaka upgrade).

2) KZG commitments: a more economical coordination strategy at Ethereum's consensus layer. Using a trusted setup, validators can quickly verify the validity of Blob data without inspecting the complete data, akin to an "embedded zero-knowledge proof" inside L1.

3) Smart contract accounts (EIP-3074): expected to arrive with the Pectra upgrade, this proposal advances account-abstraction technology while improving resistance to quantum computing. It brings users several new wallet features, including batched transactions, asset recovery, and paying gas with non-ETH tokens. That means you could pay gas in stablecoins without worrying about price drops, or pay gas in altcoins while enjoying their upside. Yes, please.

Finally, some questions and suggestions are left for us to ponder:

  • On the technical front:
    - DAS offers cryptographic protection, and zk-STARK research shows that the distribution of randomly sampled satisfying constraints is highly concentrated; is it worth adopting for other layered networks?
    - The prospects for implementing parallel computation on EVM chains.
    - Methods for measuring, and punishing, correlation among validator combinations.
    - Can infrastructure like Celestia's (TIA) DA layer become a novel outside-Ethereum DA solution?

  • On the cost front:
    - Multi-dimensional Gas pricing (proposed by Vitalik Buterin), no longer constrained by a single linear factor.
    - Balancing token-price appreciation against a baseline setting for base fees to regulate overall ecosystem costs.

  • On the ecosystem front:
    - Can domains such as social, gaming, DeFi, esports, and generative AI become X-factors for ecosystem growth?
    - Adding the following ingredients in moderation to water…
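The sampling-confidence question raised above can be made concrete with a toy model. Under 2x erasure coding, more than half of the extended chunks must be withheld before a Blob becomes unrecoverable, so each uniformly random sample against an unavailable Blob fails with probability at least 1/2; after k clean samples, a light node's confidence is at least 1 - 2^-k. This is a simplified model that ignores adversarial networking, not the full DAS protocol:

```python
def das_confidence(k: int) -> float:
    """Lower bound on a light node's confidence that Blob data is
    available after k independent random samples all succeed,
    assuming 2x erasure coding (over half the chunks must be
    withheld before the data becomes unrecoverable)."""
    return 1.0 - 0.5 ** k

# A single sample already gives 50% confidence...
assert das_confidence(1) == 0.5
# ...and ~30 tiny samples push the failure odds below one in a billion
assert das_confidence(30) > 1 - 1e-9
```

The exponential convergence is exactly why DAS lets light nodes participate: confidence grows geometrically with sample count while bandwidth grows only linearly, so a node never needs to download the full 1 GB-per-half-hour load described earlier.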

. . .

Summary: The future of Ethereum depends on the growth of its community.

For a long time, L2 has had to rely on L1 for scalability, so when L1 hits a bottleneck, the whole ecosystem's development is constrained. The London upgrade marked a change in how L1 and L2 collaborate, so Ethereum's growth can be read as a transformation of "social relationships": L1 now needs to solve problems for L2 in turn, actively creating more favorable conditions for L2's development and gradually maximizing the efficiency and reach of these production relations. But a blockchain is ultimately just a machine, and machines have flaws, lifespans, and limits; the key lies with those who operate the machine, namely the community. After a decade of development, this community still shows an excellent trait: it actively seeks self-improvement, forging the machine into a stronger one, though the journey is long and arduous.

So will Ethereum ultimately succeed? Perhaps it depends on how we define "success" for a blockchain: a hundredfold increase in value? Throughput in the tens of thousands? A user base in the billions? This is often hard to pin down. But at least we can foresee that Ethereum still has "growth potential", that is, more or less room to grow. Using our ALL-ON-LINE™ algorithm, there is a significant correlation between an asset's implied value and a certain measure of data depth, and NEM, built on Ethereum, has the largest on-chain adoption of discrete data stacks among all blockchains. By setting on-chain adoption against off-chain demand, a simple algorithmic model yields the following estimated valuation levels for ETH:

Note: The above are long-term expected values, for reference only.

As for the market aspect, this article cannot provide concrete advice. When it comes to so-called “growth assets,” investors can only give themselves more confidence and time. Of course, whether they have a “promising future” or are just “pure garbage” can only be answered by time. But if you believe they are utterly “trash coins,” consider the following suggestions:

  1. Sell your trash coins to the 🐕 Whale;

  2. Spend your trash coins as fees on some future day;

  3. Treat us as a trash can and dump those trash coins here… (without paying)

Disclaimer:

  1. This article is reprinted from [NoHedge], original title 'NHR:坎昆升级之后,以太坊前路在何方?(Ethereum特别篇)' ("NHR: After the Cancun Upgrade, What Lies Ahead for Ethereum? (Ethereum Special Edition)"). All copyrights belong to the original author [NHR Team]. If there are objections to this reprint, please contact the Gate Learn team, who will handle it promptly.
  2. Liability Disclaimer: The views and opinions expressed in this article are solely those of the author and do not constitute any investment advice.
  3. Translations of the article into other languages are done by the Gate Learn team. Unless mentioned, copying, distributing, or plagiarizing the translated articles is prohibited.