
{"id":25289,"date":"2024-12-03T09:28:49","date_gmt":"2024-12-03T09:28:49","guid":{"rendered":"https:\/\/mycryptomania.com\/?p=25289"},"modified":"2024-12-03T09:28:49","modified_gmt":"2024-12-03T09:28:49","slug":"2-surge-how-will-ethereum-scale","status":"publish","type":"post","link":"https:\/\/mycryptomania.com\/?p=25289","title":{"rendered":"2. Surge\u200a\u2014\u200aHow will Ethereum scale?"},"content":{"rendered":"<h3>2. Surge\u200a\u2014\u200aHow will Ethereum\u00a0scale?<\/h3>\n<p>While \u201cMerge\u201d focuses on optimizing the Proof-of-Stake mechanism, \u201cSurge\u201d focuses on scaling\u2014speed, throughput, and transactions per second (tps)\u2014without compromising on security and decentralization.<\/p>\n<h3><strong>Rollup-Centric Roadmap<\/strong><\/h3>\n<p>Unlike other performance-oriented blockchains, where the base layer itself supports high speed and volumes, Ethereum has decided on a Rollup-Centric Roadmap. The base layer (L1) will provide security, and a type of Layer 2 (L2) called a Rollup will handle most of the data and computation.<\/p>\n<p>An upgrade called Dencun went live in March \u201924 and implemented <a href=\"https:\/\/medium.com\/coinmonks\/ethereums-next-upgrade-eip4844-13012774213f\">Proto-Danksharding<\/a>, which enabled L2s to post data to Ethereum more efficiently in the form of blobs. This technically increased the Ethereum ecosystem\u2019s speed to <strong>1,000 tps<\/strong>. While an improvement, this is not nearly\u00a0enough.<\/p>\n<p>The major constraint limiting tps is that more transactions increase the amount of data that has to be stored on\u00a0L1.<\/p>\n<h4><strong>Reduce data per transaction<\/strong><\/h4>\n<p>Given that each transaction currently uses an average of 180 bytes of space, even if the slot size is increased to 16 MB, a maximum of <strong>7,407 tps<\/strong> (16,000,000 \/ 180 \/ 12, accounting for the 12-second slot time) can be\u00a0reached.<\/p>\n<p>The space used by each transaction must be reduced to increase capacity. 
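The throughput ceiling above can be checked with quick arithmetic. A minimal sketch, assuming Ethereum's 12-second slot time; the ~23-byte compressed transaction size is a hypothetical figure chosen to show how compression could reach the ~58,000 tps mentioned below:

```python
# Back-of-the-envelope throughput ceiling: slot data budget divided by
# bytes per transaction, divided by the 12-second slot time.
SLOT_BYTES = 16_000_000  # a 16 MB slot, as discussed above
SLOT_SECONDS = 12        # Ethereum's slot time

def max_tps(bytes_per_tx: int) -> int:
    return round(SLOT_BYTES / bytes_per_tx / SLOT_SECONDS)

print(max_tps(180))  # 7407 -- at today's ~180 bytes per rollup transaction
print(max_tps(23))   # 57971 -- if compression shrinks a transaction to ~23 bytes
```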
Some examples of data compression techniques being considered are:<\/p>\n<ul>\n<li>Long sequences of zero bytes can be replaced with 2 bytes representing the length of the run of zero bytes.<\/li>\n<li>Multiple transactions signed by multiple users can be combined under a single aggregated signature.<\/li>\n<li>Addresses take up 20 bytes. If an address has been used before, it can be replaced with an address pointer that uses only 4\u00a0bytes.<\/li>\n<li>Transaction and fee values can be represented more compactly. There could even be a dictionary of commonly used\u00a0values.<\/li>\n<\/ul>\n<p>Larger slots combined with data compression techniques should increase capacity to <strong>58,000\u00a0tps<\/strong>.<\/p>\n<h4><strong>Divide data between nodes for\u00a0storage<\/strong><\/h4>\n<p>16 MB per slot is a lot of data. If nodes have to store such large amounts of data, then only large and resource-rich entities will be able to carry out validation duties, centralizing the validator set. The solution is for each node to store only a small part of the data while being assured that the full data is available elsewhere and can be downloaded when required. This is possible with <a href=\"https:\/\/github.com\/ethereum\/research\/wiki\/A-note-on-data-availability-and-erasure-coding\"><strong>Data Availability Sampling\u00a0(DAS)<\/strong><\/a>.<\/p>\n<p>Implementing the ethos of \u201cDon\u2019t trust, verify,\u201d a full node downloads all data and executes all transactions before accepting a block. An honest full node cannot, therefore, be fooled into accepting invalid transactions.<\/p>\n<p>However, downloading every piece of data and executing every transaction is expensive. The solution is for every node to download only a small, randomly selected part of the data. Anyone looking to reconstruct the full blocks can do so by querying other nodes for the data stored with\u00a0them.<\/p>\n<p>But if you\u2019re only downloading a small part of the data, how can you be sure that all the data is available? 
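Before moving on to data availability: the first compression idea listed above, replacing runs of zero bytes with a two-byte marker, can be sketched as a toy codec (illustrative only, not any rollup's actual compression scheme):

```python
# Toy run-length codec for zero bytes: a run of up to 255 zeros becomes the
# two bytes (0x00, run_length); nonzero bytes pass through unchanged.
def compress(data: bytes) -> bytes:
    out, i = bytearray(), 0
    while i < len(data):
        if data[i] == 0:
            run = 0
            while i < len(data) and data[i] == 0 and run < 255:
                run += 1
                i += 1
            out += bytes([0, run])
        else:
            out.append(data[i])
            i += 1
    return bytes(out)

def decompress(data: bytes) -> bytes:
    out, i = bytearray(), 0
    while i < len(data):
        if data[i] == 0:
            out += b"\x00" * data[i + 1]  # expand the (marker, count) pair
            i += 2
        else:
            out.append(data[i])
            i += 1
    return bytes(out)

tx = b"\x01" + bytes(12) + b"\x02\x03"  # 15 bytes, mostly zero padding
packed = compress(tx)                    # 5 bytes: 01 00 0c 02 03
assert decompress(packed) == tx
```

Note the trade-off even in this toy: zero-heavy payloads shrink dramatically, but a lone zero byte grows from one byte to two, so a real scheme has to pick its markers carefully.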
Here, we\u2019re helped by a technique called \u201cerasure coding\u201d. The original data is expanded in such a way that it can be reconstructed from any 50% of the expanded data. The flip side is that withholding even a small piece of the original data requires making at least 50% of the expanded data unavailable. The number of samples downloaded per node can then be set so that there\u2019s a high probability that at least one sample will land on missing\u00a0data.<\/p>\n<p>Another advantage of erasure coding is that you don\u2019t need to download all the data: once you have 50% of the expanded data, you can reconstruct the full\u00a0data.<\/p>\n<p>Ethereum is currently deciding between <a href=\"https:\/\/ethresear.ch\/t\/peerdas-a-simpler-das-approach-using-battle-tested-p2p-components\/16541\"><strong>PeerDAS<\/strong><\/a> and <a href=\"https:\/\/ethresear.ch\/t\/subnetdas-an-intermediate-das-approach\/17169\"><strong>SubnetDAS<\/strong><\/a>. In PeerDAS, each node listens on a small number of subnets that broadcast blob samples and queries peers for samples on other subnets. In SubnetDAS, nodes rely on the subnet mechanism alone, without the additional layer of querying peers. The current plan is for nodes participating in consensus to use SubnetDAS and for other nodes to use PeerDAS.<\/p>\n<p>PeerDAS and SubnetDAS are implementations of 1D sampling. Once PeerDAS is implemented and the blob count is gradually increased, the next step is likely to be <a href=\"https:\/\/ethresear.ch\/t\/2d-data-availability-with-kate-commitments\/8081\">2D Sampling<\/a>, which samples randomly not just within blobs but also between blobs. This would further reduce the amount of data stored by each\u00a0node.<\/p>\n<h4><strong>L2s add speed. But they\u2019re not trustless. 
What is the roadmap to make them trustless?<\/strong><\/h4>\n<p>Most L2s today are not decentralized or trustless.<\/p>\n<p>An L2 can be in one of three stages of maturity, numbered 0 to 2:<\/p>\n<p><strong>Stage 0: <\/strong>The validation process may be fully centralized, but anyone should be able to run a node and sync the\u00a0chain.<\/p>\n<p><strong>Stage 1: <\/strong>A trustless proof system must ensure that only valid transactions are accepted. The Security Council can override the proof system provided it has at least a 75% majority. At least 26% of the Security Council (enough to block an override) must be outside the entity building the\u00a0Rollup.<\/p>\n<p><strong>Stage 2: <\/strong>A trustless proof system must ensure that only valid transactions are accepted. The Security Council can override the proof system, but only when there\u2019s a bug in the\u00a0code.<\/p>\n<p>In September 2024, Vitalik Buterin stated that he would <a href=\"https:\/\/cointelegraph.com\/news\/ethereum-l2-not-at-stage-1-are-dead-says-vitalik\">acknowledge L2s only if they reach at least Stage 1<\/a>. According to L2Beat, an analytics platform, <a href=\"https:\/\/l2beat.com\/scaling\/summary\">Arbitrum, OP Mainnet, dYdX v3, and ZKSync Lite<\/a> claim to have reached this milestone.<\/p>\n<h4><strong>Using L2s today doesn\u2019t feel like using part of a unified Ethereum ecosystem<\/strong><\/h4>\n<p>To move funds between L2s today, one must first withdraw funds to the L1, wait for the challenge period to pass (in the case of optimistic rollups), and then transfer the funds to the other\u00a0L2.<\/p>\n<p>Operating in the Ethereum ecosystem, therefore, does not feel like one is operating in a unified space. 
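As a rough illustration of that friction, here is the end-to-end timeline sketched in code, assuming a seven-day fraud-proof challenge period (a common default for optimistic rollups, not a protocol constant) and nominal times for the two on-chain legs:

```python
# Illustrative timeline for moving funds between two optimistic rollups today.
CHALLENGE_PERIOD_DAYS = 7.0  # assumed; the actual period varies by rollup

steps = [
    ("withdraw from rollup A to L1", 0.1),   # assumed: a few hours, in days
    ("wait out the challenge period", CHALLENGE_PERIOD_DAYS),
    ("deposit from L1 into rollup B", 0.1),  # assumed: a few hours, in days
]

total_days = sum(days for _, days in steps)
print(f"end-to-end transfer: ~{total_days:.1f} days")  # ~7.2 days
```

The challenge period dominates, which is why third-party bridges (and the shared-bridge ideas below) exist at all.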
Each rollup is its own\u00a0silo.<\/p>\n<p>Some upgrades being considered to make the Rollup-Centric Ecosystem more seamlessly interoperable are:<\/p>\n<ul>\n<li><strong>Chain-Specific Addresses: <\/strong>Ethereum ecosystem addresses look the same regardless of which L2 they are on. If the address specified the name of the chain, wallets could do all the background work involved in moving funds from one chain to another. All the user would have to do is specify the address to which they want to send\u00a0funds.<\/li>\n<li><strong>Chain-Specific Payment Requests: <\/strong>In addition to making payments, it should be possible to request payment in specified tokens from specified chains. For example, a merchant should be able to request payment in USDC on Arbitrum.<\/li>\n<li><strong>Light clients: <\/strong>One should not have to trust intermediaries to verify the chains one interacts with. Light clients should be able to do so themselves.<\/li>\n<li><strong>Keystore wallets: <\/strong>Today, if you want to update the keys to your wallet, you must do so on every chain on which the wallet exists. With a keystore wallet, the key information lives in one place and therefore needs to be updated only\u00a0once.<\/li>\n<li><strong>Shared Token Bridge: <\/strong>Instead of withdrawing from one L2 and depositing to another for every transaction, it would be more efficient to have a shared minimal rollup that keeps track of how many tokens of which type are owed by and to each L2. These balances can be settled by periodically transferring funds between L2s in\u00a0bulk.<\/li>\n<\/ul>\n<h4><strong>Could Plasma be a better option than\u00a0Rollups?<\/strong><\/h4>\n<p>While the progress made with Rollups is impressive, it still isn\u2019t sufficient to support use cases like decentralized social\u00a0media.<\/p>\n<p>Although Ethereum seems to have decided on a Rollup-Centric Roadmap, going the <a href=\"https:\/\/vitalik.eth.limo\/general\/2024\/10\/17\/futures2.html#4\">Plasma<\/a> route is possible. 
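Plasma, described next, publishes only a Merkle root of each off-chain block to L1. Here is a minimal sketch of such a root computation; it uses SHA-256 for simplicity, whereas Ethereum itself uses Keccak-256, and real trees add domain separation and inclusion proofs:

```python
import hashlib

def merkle_root(leaves: list[bytes]) -> bytes:
    """Hash each leaf, then pairwise-hash levels up to a single 32-byte root."""
    level = [hashlib.sha256(leaf).digest() for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:            # duplicate the last node on odd levels
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0]

block_txs = [b"tx1", b"tx2", b"tx3"]
root = merkle_root(block_txs)  # 32 bytes: all that gets posted to L1
print(root.hex())
```

Users who hold the off-chain block data can later prove their transaction's inclusion against this root, which is what makes exits back to L1 possible if the Operator misbehaves.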
Plasma is a scaling solution where an Operator publishes blocks off-chain, and only Merkle roots are published on L1. This is different from Rollups, which publish the full block data on-chain.<\/p>\n<p>Progress in SNARK technology makes Plasma more viable than it was in the\u00a0past.<\/p>\n<h3><strong>Finally, should we \u201calso\u201d scale L1 itself and not just rely on L2s for\u00a0scaling?<\/strong><\/h3>\n<p>While Ethereum has decided on a path of scaling with L2s, there are risks associated with not scaling the base\u00a0layer:<\/p>\n<ul>\n<li>Transactions on the base layer give ETH utility and raise its value. If the value of ETH drops, the capital securing the base layer will shrink, making both L1 and L2s less\u00a0secure.<\/li>\n<li>If the base layer does not provide the needed security and the ecosystem weakens, L2s might prefer to go independent.<\/li>\n<li>There are likely to be situations where an L2 fails. Users will then need to go through the base layer to recover their funds, so L1 should be able to support a high transaction load, at least temporarily.<\/li>\n<\/ul>\n<p>The easiest way to scale L1 is to increase the gas limit. This risks centralization, but there\u2019s more leeway today for raising the gas limit, given that technologies like statelessness and history expiry make large blocks easier to verify. Further, there\u2019s also scope for making some types of computation cheaper.<\/p>\n<p><a href=\"https:\/\/medium.com\/coinmonks\/2-surge-how-will-ethereum-scale-5fed48a79dde\">2. Surge\u200a\u2014\u200aHow will Ethereum scale?<\/a> was originally published in <a href=\"https:\/\/medium.com\/coinmonks\">Coinmonks<\/a> on Medium, where people are continuing the conversation by highlighting and responding to this story.<\/p>","protected":false},"excerpt":{"rendered":"<p>2. Surge\u200a\u2014\u200aHow will Ethereum\u00a0scale? 
While \u201cMerge\u201d focuses on optimizing the Proof-of-Stake mechanism, \u201cSurge\u201d is focused on Scaling\u2014speed, throughput, and transactions-per-second (tps)\u2014while not compromising on security and decentralization. Rollup-Centric Roadmap Unlike other performance-oriented blockchains, where the base layer itself supports high speed and volumes, Ethereum has decided on a Rollup-Centric Roadmap. The base layer (L1) will [&hellip;]<\/p>\n","protected":false},"author":0,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[],"class_list":["post-25289","post","type-post","status-publish","format-standard","hentry","category-interesting"],"_links":{"self":[{"href":"https:\/\/mycryptomania.com\/index.php?rest_route=\/wp\/v2\/posts\/25289"}],"collection":[{"href":"https:\/\/mycryptomania.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/mycryptomania.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/mycryptomania.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=25289"}],"version-history":[{"count":0,"href":"https:\/\/mycryptomania.com\/index.php?rest_route=\/wp\/v2\/posts\/25289\/revisions"}],"wp:attachment":[{"href":"https:\/\/mycryptomania.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=25289"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/mycryptomania.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=25289"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/mycryptomania.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=25289"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}