Blockchain scalability

One of the characteristic trends of the modern world is the consolidation of power and resources within single institutions. On the one hand, this makes it possible to achieve results more efficiently; on the other hand, it comes at the cost of personal freedom and restricted access to important information. Blockchain technology is one of the few ways to effectively counter this trend, at least at the most basic level of financial interaction. But the more actively these technologies were used, the more clearly their main problem emerged: poor scalability.

Scalability issue

A blockchain network is, in essence, a decentralized ledger in which every node stores all of the information it contains. This ensures that the loss of any single node does not affect the overall functionality, while constant authenticity checks prevent inconsistent changes to the stored data.
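To make the idea concrete, here is a minimal sketch of such a ledger (an illustration of the principle, not any specific implementation): each block references the hash of the previous one, so any node holding the full chain can detect inconsistent changes.

```python
# Minimal illustrative ledger: each block stores the hash of the previous block,
# so tampering with any block breaks the chain of hashes.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list[dict], transactions: list[str]) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def is_consistent(chain: list[dict]) -> bool:
    """Re-check every link; any modified block invalidates the chain."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list[dict] = []
add_block(chain, ["alice -> bob: 5"])
add_block(chain, ["bob -> carol: 2"])
print(is_consistent(chain))                            # True
chain[0]["transactions"] = ["alice -> mallory: 500"]   # tampering attempt
print(is_consistent(chain))                            # False
```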

But the more computers there are, the more time is spent on these checks and on transferring data to each of them. Speed drops and throughput decreases. To keep this somewhat under control, transaction fees began to rise, discouraging small transfers from "clogging the channel." This, however, was only a temporary fix.

This is the scaling problem in a nutshell: the more nodes in the network, the lower its throughput and the slower it works overall. This is exactly what Bitcoin users faced in 2017, when the growing popularity of cryptocurrencies led to a significant increase in the number of active nodes.

An active debate began on how to deal with this situation, and two different approaches emerged.

The first is increasing the block size and optimizing the information stored in it. The more transactions fit into a block, the fewer blocks are needed overall. The solution is effective, but only in the short term: it gives a linear improvement while the number of users grows exponentially, so the blocks would soon have to be enlarged again and again. Not to mention that larger blocks would complicate mining and reduce the competitiveness of machines with low computing power. A rough calculation, shown below, makes the mismatch clear.
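The sketch below uses assumed, illustrative numbers (transaction size, starting demand, growth rate are not Bitcoin's actual figures); it only shows how a one-off linear capacity gain is overtaken by demand that keeps doubling.

```python
# Illustrative capacity vs. demand comparison; all parameters are assumptions.
BLOCK_INTERVAL_S = 600          # ~10-minute blocks, as in Bitcoin
TX_SIZE_BYTES = 250             # assumed average transaction size

def capacity_tps(block_size_mb: float) -> float:
    """Transactions per second for a given block size."""
    return (block_size_mb * 1_000_000 / TX_SIZE_BYTES) / BLOCK_INTERVAL_S

demand_tps = 5.0                # assumed starting demand
for year in range(6):
    print(f"year {year}: capacity(2 MB) = {capacity_tps(2):.1f} tps, "
          f"demand = {demand_tps:.1f} tps")
    demand_tps *= 2             # demand doubling yearly (illustrative)
```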

The second is introducing additional external protocols and optimizing how information is stored. First, a certain number of transactions are collected in an external block; then, together with several such blocks, it is recorded in the Bitcoin protocol. The reverse operation is also possible, for the usual verification of the information's accuracy. In other words, the data is encoded twice, which not only increases throughput but also improves security in general.
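A hypothetical sketch of this layered idea is below: many small off-chain transfers are accumulated, and only a single settlement record (here, just a hash of the batch) is committed to the base chain. The class and method names are illustrative, not a real protocol API.

```python
# Hypothetical off-chain batching: many transfers collapse into one on-chain record.
import hashlib
import json

class OffChainChannel:
    def __init__(self):
        self.pending = []                       # transfers not yet settled

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        self.pending.append({"from": sender, "to": receiver, "amount": amount})

    def settle_on_chain(self) -> str:
        """Collapse all pending transfers into a single on-chain commitment."""
        batch = json.dumps(self.pending, sort_keys=True).encode()
        commitment = hashlib.sha256(batch).hexdigest()
        self.pending = []                       # the batch is now settled
        return commitment                       # this is all the base chain stores

channel = OffChainChannel()
channel.transfer("alice", "bob", 10)
channel.transfer("bob", "carol", 4)
print("on-chain record:", channel.settle_on_chain())
```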

It was the second approach that formed the basis of the Lightning Network, which sharply increased the throughput of the Bitcoin network. However, experts believe this too is only a temporary solution, since a further increase in the number of active users will keep raising the load on the network. New scaling improvements will be needed: Schnorr signatures, Liquid sidechains, and other approaches that are still being tested.

Still, there are several years left before the network's throughput stops meeting our needs, and it is better to spend those years searching for possible solutions to the scaling problem in advance.

Alternative consensus algorithms

The Bitcoin network uses the PoW (proof-of-work) consensus algorithm. It is reliable, but slow and demands serious computing power. However, it is not the only possible approach; there are at least two fairly promising alternatives.
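The snippet below is a minimal proof-of-work sketch, included only to show why PoW costs computing power: a nonce is searched until the block hash falls below a difficulty target. The difficulty value is arbitrary, not Bitcoin's.

```python
# Toy proof-of-work: brute-force a nonce until the hash meets the difficulty target.
import hashlib

def proof_of_work(block_data: str, difficulty_bits: int = 20) -> int:
    """Find a nonce whose SHA-256 hash has `difficulty_bits` leading zero bits."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

print("valid nonce found:", proof_of_work("block with some transactions"))
```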

The first is delegated proof of stake, or DPoS, a more advanced version of proof of stake in which users back the accuracy of information with their own money. Here they can delegate their authority to elected representatives, who then confirm the validity of transactions, while the users themselves receive part of the reward in proportion to the amounts they have staked. The system was devised by Daniel Larimer and works something like real-time voting combined with a complex system of social reputation.
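A hypothetical DPoS election might look like the sketch below: token holders vote for delegates with weight proportional to their stake, and the top candidates become block producers. The committee size and the data are made up for illustration.

```python
# Illustrative stake-weighted delegate election (all numbers are assumptions).
from collections import defaultdict

ACTIVE_DELEGATES = 3                       # assumed committee size

# (voter, delegate, stake) triples - illustrative data
votes = [
    ("alice", "delegate_1", 500),
    ("bob",   "delegate_2", 300),
    ("carol", "delegate_1", 200),
    ("dave",  "delegate_3", 100),
    ("erin",  "delegate_2", 450),
]

tally: dict[str, int] = defaultdict(int)
for _voter, delegate, stake in votes:
    tally[delegate] += stake               # vote weight = staked amount

elected = sorted(tally, key=tally.get, reverse=True)[:ACTIVE_DELEGATES]
print("elected block producers:", elected)
```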

The second option is Practical Byzantine Fault Tolerance (pBFT), developed in 1999 by Barbara Liskov and Miguel Castro, which achieves reliable consensus through mutual confirmation even when some nodes deliberately spread false information.

Both of these approaches are quite promising, especially for non-financial uses of blockchains, but they also have their drawbacks.

Delegated proof of stake is, on the one hand, an example of classical democracy: people elect deputies, vote for them (in our case, with their money), and the "chosen ones" act on behalf of their backers. But, as in real life, this does not always work as intended.

Reducing the number of verifiers to a few "trusted delegates" reduces the decentralization of the network, and with it the ability to withstand collusion among those trustees. A simple example: the EOS blockchain came under the control of 11 token holders from China who colluded with each other and gained complete control over the information within the network.

Practical Byzantine fault tolerance works by having all nodes periodically check the reliability of the system in a cyclic fashion, that is, in turn rather than simultaneously. This is not quite a classic blockchain, but the functions and results are largely the same, and the approach consumes fewer resources and offers higher throughput.

However, this algorithm is vulnerable to Sybil attacks, in which many new malicious nodes are created; controlling just over 33 percent of the nodes is enough to take control of the entire network. On the Bitcoin network this is not feasible, since new nodes must still perform proof of work to attack the system, and in PoS-based systems each node must stake a certain amount of cryptocurrency to operate.
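The roughly 33 percent bound comes from the classic pBFT result that a network of n = 3f + 1 nodes tolerates at most f Byzantine nodes and needs 2f + 1 matching confirmations to commit a decision; the node counts below are arbitrary examples.

```python
# Classic pBFT fault bound: f = floor((n - 1) / 3), quorum = 2f + 1.
def max_faulty(n: int) -> int:
    """Largest number of Byzantine nodes a pBFT network of size n can tolerate."""
    return (n - 1) // 3

def quorum(n: int) -> int:
    """Matching confirmations required to commit a decision."""
    return 2 * max_faulty(n) + 1

for n in (4, 7, 10, 100):
    f = max_faulty(n)
    print(f"n={n}: tolerates f={f} faulty nodes ({f / n:.0%}), quorum = {quorum(n)}")
```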

Conclusions

The existing PoW consensus algorithm is not enough to deal effectively with scaling issues. This means that widespread adoption of blockchain in, for example, electoral processes or the transport sector will not happen as long as the amount of information to be processed remains too large. New solutions are needed.