With that background, let’s look at the actual underlying technology of the current web and the “web3” vision. Today it costs me roughly $20 a month to participate in this distributed computing system.
In the current web we start with the DNS lookup, which maps a human-readable name to a server’s network address using a distributed system. As a site operator I contract with a registrar to provide my domain name. This is the first of two gatekeepers I will deal with, costing on the order of $10 a year. I then also need to either run my own authoritative DNS server or contract that out, a service the registrar will often provide if I don’t want to do it myself.
Now I set up my server and storage solution at the other gatekeeper: my hosting provider. A good (if notoriously pricey) hosting solution is Amazon Web Services’ EC2. I’m starting off with a small site, so I can probably get away with a micro-instance: 1 CPU core and 1 GB of memory for about $8 a month, plus $0.08/GB-month for persistent storage and $0.09/GB for data sent to visitors of my web site.
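As a rough sanity check, here is a back-of-the-envelope Python sketch of that monthly bill using the prices quoted above; the 20 GB of storage and 50 GB of monthly traffic are illustrative assumptions for a small site, not measurements.

```python
# Back-of-the-envelope monthly hosting cost for a small conventional site.
# Prices are the ones quoted above; the storage and traffic volumes are
# assumed figures for illustration.

INSTANCE_PER_MONTH = 8.00      # micro-instance: 1 CPU core, 1 GB RAM
STORAGE_PER_GB_MONTH = 0.08    # persistent storage
EGRESS_PER_GB = 0.09           # data sent to visitors

storage_gb = 20                # assumed persistent storage
egress_gb = 50                 # assumed monthly traffic to visitors

monthly = (INSTANCE_PER_MONTH
           + storage_gb * STORAGE_PER_GB_MONTH
           + egress_gb * EGRESS_PER_GB)

# Domain registration at ~$10/year adds roughly another dollar a month.
print(f"Estimated hosting bill: ${monthly:.2f}/month")   # ≈ $14/month
```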
Finally, I actually construct my site. My site really is a distributed computation split between my server and my users’ web browsers. The visitor’s browser runs JavaScript that performs the user-side computation and presentation, while my side consists of an HTTP server, my own custom logic, and probably a database to store user data efficiently. This design splits the trust: the user’s browser is only trusted with that identified user’s data, while the server logic is trusted to access all users’ data. Persistent storage lives primarily on the server, but I can cache data on the client for faster access. All told, this is probably costing me about $20 a month.
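To make that trust split concrete, here is a minimal sketch of the server half (using Flask purely for illustration; the login step is omitted). The server holds every user’s data, but each browser is only ever handed the record for the user it authenticated as.

```python
# Minimal sketch of the server side of the split computation.
# Flask and the in-memory "database" are illustrative choices, not the
# only way to build this.
from flask import Flask, jsonify, session

app = Flask(__name__)
app.secret_key = "change-me"   # placeholder secret for the sketch

# Stand-in for the database that holds *all* users' data.
USER_DATA = {"alice": {"notes": ["hello"]}, "bob": {"notes": ["world"]}}

@app.route("/api/me")
def my_data():
    user = session.get("user")         # set at login (not shown here)
    if user not in USER_DATA:
        return jsonify(error="not logged in"), 401
    # Only this user's slice of the data crosses the trust boundary
    # to the browser.
    return jsonify(USER_DATA[user])

if __name__ == "__main__":
    app.run()
```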
So what does the supposed “web3” add to this vision? The cryptocurrency web3 starts with all our existing infrastructure: I still need a DNS name, I still need a server, I still need storage, and I still have a distributed computation occurring between the browser and the server. So web3 hasn’t removed a single gatekeeper from the conventional distributed system, which already shows that the claims of gatekeeper-free decentralization are false.
Web3 is only about adding an additional layer of complexity in the name of justifying the underlying cryptocurrencies. The web browser is augmented with a cryptocurrency wallet, and part of the computation and storage is shifted from my server to the decentralized cryptocurrency infrastructure. When users want to use my service they pay some amount of cryptocurrency to perform the cryptocurrency-side computation, with any remainder transferred to me as a fee for my service. So does the new infrastructure provide anything useful? Let’s focus primarily on Ethereum, but the same problems appear regardless of the underlying cryptocurrency.
To begin with, Ethereum has the notion of coupling a small program to the transfer of Ether. These programs are written in a language called Solidity and then compiled into a stack-machine-based intermediate representation. Of course, letting arbitrary code potentially run forever wouldn’t work, so any program is run for only a limited amount of execution until it either completes or is terminated.
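The idea of metered execution is easy to illustrate. Below is a toy Python sketch of a stack machine that charges a fee per instruction and halts when the budget runs out; it is purely conceptual and is not the real EVM instruction set or gas schedule.

```python
# Toy gas-metered stack machine: each instruction deducts from a budget,
# and execution stops when the budget is exhausted. The opcodes and costs
# here are made up for illustration.

COSTS = {"PUSH": 3, "ADD": 3, "STOP": 0}

def run(program, gas_limit):
    stack, gas = [], gas_limit
    for op, *args in program:
        gas -= COSTS[op]
        if gas < 0:
            return "out of gas", stack    # terminated before completing
        if op == "PUSH":
            stack.append(args[0])
        elif op == "ADD":
            stack.append(stack.pop() + stack.pop())
        elif op == "STOP":
            break
    return "ok", stack

# Completes within budget:
print(run([("PUSH", 1), ("PUSH", 2), ("ADD",), ("STOP",)], gas_limit=9))
# Runs out of gas partway through:
print(run([("PUSH", 1), ("PUSH", 2), ("ADD",), ("STOP",)], gas_limit=5))
```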
The measure of the amount of compute is called “gas”, with different instructions and operations costing different amounts of gas to process. The total cost of a transaction is the amount of gas consumed times the gas price.
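The fee arithmetic itself is simple; the sketch below just multiplies it out. The 35 gwei gas price and $4,000 Ether price are assumed inputs for illustration, not current market values.

```python
# Transaction fee = gas consumed x gas price. Gas prices are quoted in
# gwei (10^-9 Ether).

GWEI = 1e-9

def tx_fee_usd(gas_used: int, gas_price_gwei: float, eth_price_usd: float) -> float:
    """Dollar cost of a transaction that consumes `gas_used` gas."""
    return gas_used * gas_price_gwei * GWEI * eth_price_usd

# A plain Ether transfer costs a fixed 21,000 gas.
print(f"${tx_fee_usd(21_000, 35, 4_000):.2f}")   # ≈ $2.94 at the assumed prices
```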
Any given block of the Ethereum blockchain represents a maximum amount of execution, currently 30 million gas, and the system adds a new block roughly every 15 seconds. That puts the total compute of the Ethereum network at 2 million gas per second, since that is the amount of computation that gets recorded into the Ethereum ledger.
Estimating the cost (measured in “gas”) of an arbitrary computation is complex, but let’s assume we are only interested in the simplest operation: 256-bit integer addition. Each addition costs 3 gas, so 2 million gas per second works out to about 667,000 adds; call it 600,000 adds per second on a worldwide basis.
Compare this amount of compute to a Raspberry Pi 4, a $45 single-board computer with four cores running at 1.5 GHz. Each core has two ALUs, and it takes four instructions to perform a 256-bit addition, since the native word size of the Raspberry Pi (and most other modern computers) is 64 bits. So each core has a peak of 750,000,000 adds per second, for a total peak of 3,000,000,000 adds per second across the four cores. Put bluntly, the Ethereum “world computer” has roughly 1/5,000 of the compute power of a Raspberry Pi 4!
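Putting the two throughput estimates side by side in a short sketch, with all the figures taken from the text above:

```python
# Ethereum's worldwide 256-bit-addition throughput vs. one Raspberry Pi 4.
# All figures are from the text above.

GAS_PER_BLOCK = 30_000_000
BLOCK_TIME_S = 15
GAS_PER_ADD = 3                       # cost of one 256-bit ADD

eth_adds_per_s = GAS_PER_BLOCK / BLOCK_TIME_S / GAS_PER_ADD   # ≈ 667,000

CORES = 4
CLOCK_HZ = 1.5e9
ALUS_PER_CORE = 2
INSNS_PER_256BIT_ADD = 4              # 256-bit add built from 64-bit adds

pi_adds_per_s = CORES * CLOCK_HZ * ALUS_PER_CORE / INSNS_PER_256BIT_ADD  # 3e9

print(f"Ethereum:     {eth_adds_per_s:,.0f} adds/s")
print(f"Raspberry Pi: {pi_adds_per_s:,.0f} adds/s")
# Exact ratio ≈ 1/4,500; rounding Ethereum down to 600,000 adds/s
# gives the 1/5,000 figure used in the text.
print(f"Ratio:        1/{pi_adds_per_s / eth_adds_per_s:,.0f}")
```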
This might be acceptable if the compute weren’t also terrifyingly expensive. The current transaction fees for a block’s worth of 30 million gas come to over 1 Ether. At the current price of roughly $4,000 per Ether, that means a second of Ethereum’s compute costs roughly $250. So a mere second of Ethereum’s virtual machine costs 25 times more than a month of my far more capable EC2 instance, or could buy me several Raspberry Pis outright.
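The same arithmetic in sketch form, taking the roughly 1 Ether of fees per block and the $4,000 Ether price above as given:

```python
# Dollar cost of one second of the Ethereum "world computer",
# assuming ~1 Ether of fees per 30M-gas block, as stated above.

ETH_PER_BLOCK_FEES = 1.0
ETH_PRICE_USD = 4_000
BLOCK_TIME_S = 15

usd_per_second = ETH_PER_BLOCK_FEES * ETH_PRICE_USD / BLOCK_TIME_S
print(f"${usd_per_second:.0f} per second of compute")   # ≈ $267; the text rounds to $250

EC2_MONTH_USD = 10      # micro-instance plus storage and bandwidth, from above
# ≈ 27x with the unrounded figure; at $250/second it is the 25x in the text.
print(f"{usd_per_second / EC2_MONTH_USD:.0f}x a month of EC2")
```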
What about the storage? The entire Ethereum blockchain is just 1 terabyte of data and grows by only a few hundred kilobytes a minute. Storing the entire Ethereum blockchain on a robust commercial service like Amazon S3 costs just $20 a month.
Even the most efficient storage strategy in Ethereum costs roughly 600 gas per byte. Yet the total network compute capacity is only 2 million gas per second, so storing 1 megabyte requires 300 seconds of the entire network’s capacity. Not only can the Ethereum blockchain store only about 3 kB of data a second, storing that 3 kB costs $250! So writing a single 3 kB message to the Ethereum blockchain costs about the same as a year of storing the entire 1 TB Ethereum blockchain on Amazon, or the same as buying a 1 TB M.2 SSD outright.
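And the storage comparison as a sketch, with the 600 gas/byte figure from above and standard S3 pricing of about $0.023/GB-month as the assumed inputs:

```python
# On-chain storage throughput and cost vs. keeping the same data on S3.
# 600 gas/byte and the $250/second compute cost come from the text above;
# $0.023/GB-month is standard S3 pricing.

GAS_PER_SECOND = 2_000_000
GAS_PER_BYTE = 600
USD_PER_SECOND = 250

bytes_per_second = GAS_PER_SECOND / GAS_PER_BYTE              # ≈ 3,333 B/s
seconds_per_mb = 1_000_000 / bytes_per_second                 # 300 s
usd_per_mb = seconds_per_mb * USD_PER_SECOND                  # $75,000

S3_USD_PER_GB_MONTH = 0.023
chain_size_gb = 1_000                                         # ~1 TB blockchain
s3_per_year = chain_size_gb * S3_USD_PER_GB_MONTH * 12        # ≈ $276/year

print(f"On-chain write rate:   {bytes_per_second / 1000:.1f} kB/s")
print(f"Writing 1 MB on-chain: {seconds_per_mb:.0f} s, ~${usd_per_mb:,.0f}")
print(f"Storing the 1 TB chain on S3: ${s3_per_year:.0f}/year")  # ≈ the $20/month above
```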