
What is Blockchain Scalability?

During the heyday of the 2017 bitcoin price spike, the limits of the payment network became apparent. Prior to the implementation of SegWit, transactions could take a week or more to process, with transaction fees measured in the hundreds of dollars. Not only did this make bitcoin impractical for microtransactions, but it also revealed a fundamental flaw of blockchains that may hinder their applicability in the real world.

When blockchains were introduced by Satoshi Nakamoto, mass adoption and network congestion were not problems to be solved. Nakamoto’s objective was simply to create a truly portable, trustless way to use digital money. At best, Nakamoto’s model was a proof of concept. However, as the world grows dependent on that model, a way to make it work in real-world applications becomes more necessary.

So, how does one bring elasticity to a rigid model? This article will look at what blockchain scalability is and at some of the solutions for achieving it.

 

Scalability

To understand scalability, we will return to an example used before on this website: that of a bank. When the bank functions well, the customer can come in, hand his transaction to the clerk, and have the clerk process it without any unnecessary waits or bottlenecks.

However, what happens if it is extraordinarily busy in the bank, such as on the first of the month? Our clerk may have trouble keeping up. Even if the clerk is the world’s fastest typist (which she is not), the infrastructure would betray her. The vacuum lines can only move transactions so fast, the receipt boxes can only hold so many files before needing to be emptied, and the computer’s sync time cannot be quickened. In other words, the bank is unscalable: unable to adjust to increases in demand.

“When talking about the future of blockchain technology, the tech’s scalability issues need to be addressed. Not because it’s anywhere near the most interesting aspect of it, but because it’s the biggest challenge the industry faces today,” Invest in Blockchain reports.

“Mass adoption won’t happen if blockchains can’t scale, simply because most people will not accept slower applications than they’re used to just for the sake of decentralization.”

“Since the decentralized ledger that forms the foundation of a blockchain keeps getting bigger with every block added to the chain, scalability is an inherent problem of blockchain technology. Moreover, decentralized networks severely increase the cost of maintenance and transactions, because the nodes need to be incentivized to validate the network.”

“Then there is the problem that with an increase in the number of nodes in a network, the harder it becomes to reach consensus among these nodes — but less nodes means less decentralization.”

“These problems inherent to blockchain technology, and decentralized ledger technology in general, have come to light over the past year as cryptocurrencies became more popular. Bitcoin and Ethereum display serious scalability issues, and most other blockchain projects haven’t actually been tested for this yet, as their blockchains haven’t been subjected to serious transaction volumes.”

Every blockchain has its own scalability problems. For the sake of space, we will focus only on bitcoin and Ethereum.

Bitcoin: Prior to SegWit, bitcoin operated within the limits established by Satoshi Nakamoto, which were created to prevent the possibility of real-time hacking. These include a block size limit of one megabyte and a block discovery time of roughly ten minutes.

The problem with this is that – using the bank example again – the receipt box can fill up before the ten minutes are up. What this means is that transactions sit waiting for verification because there is no room for them in the next block. During the height of the 2017 price spike, this meant that a transaction could take a week to process. The only way to get faster processing was to offer more in transaction fees, negating any value the system had for low-value transactions.
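These limits translate directly into a throughput ceiling. The back-of-the-envelope sketch below illustrates the arithmetic; the assumed average transaction size of 500 bytes is an illustrative figure, not a protocol constant.

```python
# Rough estimate of bitcoin's pre-SegWit throughput ceiling.
# Only the 1 MB block size and ~10 minute interval come from the protocol;
# the average transaction size is an assumption for illustration.

BLOCK_SIZE_BYTES = 1_000_000      # 1 MB block size limit
BLOCK_INTERVAL_SECONDS = 600      # ~10 minute block discovery time
AVG_TX_SIZE_BYTES = 500           # assumed average transaction size

txs_per_block = BLOCK_SIZE_BYTES // AVG_TX_SIZE_BYTES
tps = txs_per_block / BLOCK_INTERVAL_SECONDS

print(f"~{txs_per_block} transactions per block, ~{tps:.1f} tx/s")
# About 2,000 transactions per block, or roughly 3.3 tx/s
```

A result in the low single digits per second is consistent with the figures Vitalik Buterin cites later in this article.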

The debate over solving this fell along two lines: increase the block size or make transactions smaller. Both solutions were flawed. Making the block size bigger, for example, did not offer a permanent solution and would require more powerful computers to handle the increased memory demands. This could create quasi-centralization among the larger miners (which one can argue already exists due to energy costs and the existence of mega mining farms). Segregating the transaction signature or “witness file,” conversely, can also be seen as “kicking the can down the road,” as it does not offer a permanent solution either. While effectively reducing a transaction’s space requirements by about three-quarters, this only raises bitcoin’s maximum throughput to about 60 transactions per second – far below the major payment channels or even other cryptocurrencies.

Ethereum: Unlike bitcoin, Ethereum does not have a fixed block size limit, and its block generation time is approximately 14 seconds. This allows Ethereum to process significantly more transactions than bitcoin at a consistently lower cost.

However, Ethereum does have a maximum gas limit per block. Using our bank example, this is like saying that the clerk’s computer can only accept so many commands before syncing. Gas is the computational cost of performing a dApp operation in the Ethereum Virtual Machine; the transactions in a block must together meet or come under the gas limit in order to be processed. Overages must wait for the next block. This constraint effectively limits Ethereum’s scalability.
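The gas-limit bundling described above can be sketched as a simple packing rule. The gas limit and per-transaction gas costs below are illustrative assumptions (the real limit is adjusted by miners over time), but the mechanism is the same: once the next transaction would push the block over the limit, it waits.

```python
# Toy sketch of how a block's gas limit caps the transactions that fit.
# BLOCK_GAS_LIMIT and the per-transaction gas costs are assumed values
# for illustration, not current mainnet parameters.

BLOCK_GAS_LIMIT = 8_000_000

def fill_block(pending, gas_limit=BLOCK_GAS_LIMIT):
    """Pack transactions in order until the next would exceed the limit."""
    included, used = [], 0
    for tx_id, gas in pending:
        if used + gas > gas_limit:
            break  # overage: this transaction waits for the next block
        included.append(tx_id)
        used += gas
    return included, used

pending = [("tx1", 21_000), ("tx2", 3_000_000),
           ("tx3", 4_500_000), ("tx4", 1_000_000)]
block, gas_used = fill_block(pending)
print(block, gas_used)  # tx4 overflows the limit and must wait
```

Here tx1 through tx3 consume 7,521,000 gas, so the 1,000,000-gas tx4 cannot fit and is deferred, exactly the “overage” case described above.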

As Vitalik Buterin, the creator of Ethereum, explained in 2017: “Bitcoin is currently processing a bit less than three transactions per second and if it goes close to four, it is already at peak capacity. Ethereum has been doing five per second and if it goes above six, then it is also at peak capacity. On the other hand, Uber on average does 12 rides per second, PayPal several hundred, Visa several thousand, major stock exchanges tens of thousands, and in IoT, you’re talking hundreds of thousands per second.”

“There already is really a lot of institutional hype in the space and just public hype. So when you have Vladimir Putin having known what Blockchains and Ethereum are and Paris Hilton going out promoting ICOs on Twitter, that’s peak hype. But the reason why a lot of this hasn’t materialized into action yet is precisely because of some of these technical obstacles that make Blockchains work okay for niche use cases but not really work well for mainstream use.”

 

Bringing Flexibility to the Inflexible

Much of the difficulty in solving the scalability problem is that there are large egos in the cryptocurrency community that do not like to be told they may be wrong. One example is the failed SegWit2x proposal, which would have increased the bitcoin block size.

“The Segwit2x effort began in May with a simple purpose:  to increase the blocksize and improve Bitcoin scalability,” SegWit 2x proponent Mike Belshe wrote in a circular sent to a mailing list. “At the time, the Bitcoin community was in crisis after nearly 3 years of heavy debate, and consensus for Segwit seemed like a distant mirage with only 30% support among miners. Segwit2x found its first success in August, as it broke the deadlock and quickly led to Segwit’s successful activation. Since that time, the team shifted its efforts to phase two of the project – a 2MB blocksize increase.”

“Our goal has always been a smooth upgrade for Bitcoin.  Although we strongly believe in the need for a larger blocksize, there is something we believe is even more important: keeping the community together. Unfortunately, it is clear that we have not built sufficient consensus for a clean blocksize upgrade at this time. Continuing on the current path could divide the community and be a setback to Bitcoin’s growth. This was never the goal of Segwit2x.”

“As fees rise on the blockchain, we believe it will eventually become obvious that on-chain capacity increases are necessary. When that happens, we hope the community will come together and find a solution, possibly with a blocksize increase. Until then, we are suspending our plans for the upcoming 2MB upgrade.”

“We want to thank everyone that contributed constructively to Segwit2x, whether you were in favor or against. Your efforts are what makes Bitcoin great. Bitcoin remains the greatest form of money mankind has ever seen, and we remain dedicated to protecting and fostering its growth worldwide.”

For a cryptocurrency to effectively compete against the major payment systems, like Visa or Mastercard, or to serve as a money system, it must be able to adapt effectively to the demands daily transactions would impose. One way forward may be to abandon proof-of-work.

The way proof-of-work functions today, every server that runs the core software validates each transaction. While only a small number of nodes must verify a transaction for it to be considered “processed,” all nodes must eventually process it. Not only is this redundant, it is also so energy-intensive that many governments are either cognizant of the problem or developing plans to deal with it.

While proof-of-work was designed to encourage decentralization, it achieved the opposite. Proof-of-work led some miners to develop large mining farms to guarantee mining rewards. As a result, the majority of network hashing power is held by a small number of pools and mining facilities, with most small miners effectively forced out.

Proof-of-stake, which places all of the verification duties on validators who have placed a stake of the coin or token on the network, performs slightly better than proof-of-work. However, because a larger stake increases the odds that a validator wins the block reward, it constitutes a plutocracy in which the richest users with the most powerful nodes benefit most.
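The “rich get richer” dynamic follows directly from stake-weighted selection. The sketch below is a minimal illustration of that selection rule, with made-up validator names and stake amounts; real protocols layer on randomness beacons, slashing conditions, and other safeguards not modeled here.

```python
# Minimal sketch of stake-weighted validator selection in proof-of-stake.
# Validator names and stake amounts are illustrative assumptions.
import random

def pick_validator(stakes, rng=random):
    """Pick a validator with probability proportional to its stake."""
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return rng.choices(validators, weights=weights, k=1)[0]

stakes = {"alice": 70, "bob": 20, "carol": 10}
wins = {v: 0 for v in stakes}
rng = random.Random(42)  # fixed seed for a reproducible demonstration
for _ in range(10_000):
    wins[pick_validator(stakes, rng)] += 1
print(wins)  # alice, holding 70% of the stake, wins most of the rewards
```

Over many rounds, the validator holding 70% of the stake collects roughly 70% of the block rewards, which is the plutocratic tendency described above.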

Proof-of-assignment, which is currently being used with IOTW, offers a low-power alternative to proof-of-work for private blockchains. “Proof-of-Assignment (PoA) is a consensus algorithm which can be used for permissioned ledgers,” Captainaltcoin.com reports. “A permissioned ledger is one which isn’t available for public utility and has a limited number of users. It uses a set of ‘authorities’ which are basically designated nodes that are in charge of creating new blocks and securing the ledger. With the PoA algorithm, a ledger will need a majority of its authority nodes to ‘approve’ of a block in order for that block to be added.”

“This algorithm is very suitable for private networks, due to its ability to keep outsiders from participating in the consensus. However, its biggest feature is that it’s rather centralized. This helps it be more efficient and more scalable than the average public blockchain.”

“PoA is seen as the answer to the shortcomings of currently most popular algorithms – PoW and PoS – which come to light when we attempt to apply these algorithms on the Internet of Things (IoT). IoT will be the network of physical devices, vehicles, home appliances, and other items embedded with electronics, software, sensors, actuators, and connectivity; In 2017 there were 8.4 billion IoT devices recorded globally and this number is expected to rise to 30 billion in 2020, according to Gartner. Most of these devices can and will be very limited when it comes to energy consumption.”
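The approval rule in the quoted description – a block is added only when a majority of designated authority nodes approve it – can be sketched in a few lines. The node names and votes below are illustrative assumptions, not part of any specific PoA implementation.

```python
# Sketch of the majority-approval rule for a permissioned (PoA) ledger:
# a block is appended only if a strict majority of authority nodes approve.
# Node names and votes are illustrative.

def block_approved(votes):
    """votes: dict mapping authority node -> bool (approve / reject)."""
    approvals = sum(votes.values())
    return approvals > len(votes) / 2  # strict majority required

votes = {"node1": True, "node2": True, "node3": False,
         "node4": True, "node5": False}
print(block_approved(votes))  # 3 of 5 authorities approved, so the block passes
```

Because only a small, known set of authorities votes, consensus is cheap – which is exactly the efficiency-for-centralization trade the quote describes.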

Bitcoin’s Lightning Network represents a fix for those unwilling to change the underlying foundation. The Lightning Network is a secondary layer where transactions can be conducted without immediately recording them on the blockchain. By creating micropayment channels, users can create transaction pathways for micropayments that only need to be recorded to the blockchain after the fact, with the users’ stakes in the channel serving as collateral to ensure trust.
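A toy model makes the economics of a payment channel concrete: two parties lock up collateral on-chain, exchange any number of off-chain balance updates, and settle the final balances with a single on-chain transaction. The class, names, and amounts below are illustrative, not the actual Lightning protocol (which also involves commitment transactions and penalty mechanisms).

```python
# Toy model of a Lightning-style payment channel. Two parties fund the
# channel on-chain, trade many off-chain updates, and settle once.
# All names and amounts are illustrative assumptions.

class PaymentChannel:
    def __init__(self, deposit_a, deposit_b):
        self.balances = {"a": deposit_a, "b": deposit_b}  # on-chain collateral
        self.updates = 0  # off-chain state updates; none touch the chain

    def pay(self, sender, receiver, amount):
        if self.balances[sender] < amount:
            raise ValueError("payment exceeds sender's channel balance")
        self.balances[sender] -= amount
        self.balances[receiver] += amount
        self.updates += 1

    def settle(self):
        """Close the channel: one on-chain transaction records final balances."""
        return dict(self.balances)

channel = PaymentChannel(deposit_a=100, deposit_b=100)
for _ in range(50):          # fifty micropayments off-chain...
    channel.pay("a", "b", 1)
print(channel.settle(), channel.updates)  # ...but only one on-chain settlement
```

Fifty micropayments collapse into a single on-chain settlement, which is precisely how the Lightning Network sidesteps the base layer’s throughput ceiling.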

Another such solution is Ethereum’s proposed Plasma Cash, which allows users to focus only on the blocks relevant to them while safeguarding their transactions. Users would create Plasma coins that represent the Ether to be transferred. Because a Plasma coin cannot be altered and is not fungible, it not only creates a bridge to just the affected blocks – speeding up the verification process – but also creates a situation in which transactions cannot be hacked.

Regardless of the solution, the scalability issue is something that must be solved. As Cointelegraph explained: “Unless action is taken, it’s likely that transactions will take longer and longer to process. In a digital economy where fiat payments can be sent and received instantly, blockchain platforms need to offer the same if they are going to be regarded as a viable alternative — even if they offer an array of other compelling advantages. Otherwise, there’s a real risk that even the most ardent crypto enthusiasts will abandon this technology altogether.”

“Dwindling user numbers could see prices for major cryptocurrencies tumble, with assets once worth thousands of dollars depreciating in value to a small fraction of what they were before.”

“It could also mean that centralization is here to stay, with all of the imperfections that motivated the dawn of the blockchain community in the first place. From here, who knows how many brilliant crypto platforms may never come to fruition.”
