What the History of Headphones Says About the Internet’s Future

Predicting the blockchain future is easy. Knowing when to try is hard.

I’m going to talk about the future of the internet and blockchain technology, but first, let’s talk about noise-cancelling headphones.

John Wolpert is the group executive in charge of enterprise mainnet at ConsenSys AG and the chair of the technical steering committee of the Baseline Protocol. This article is part of CoinDesk’s Internet 2030 series, exploring the future of the digital economy.

Eighty-seven years to a headset

In 1933, Paul Lueg submitted a patent application describing the principle of noise cancellation. But it wasn’t until Amar Bose started work on the concept in 1978 that the path to a practical commercial product emerged. It was another two decades before processing power per milliwatt could cancel out the drone of an airplane in an affordable device that ran on batteries. 

So imagine if, in 1933, someone asked Lueg what the world would be like in 10 years, and if he thought it would include computers with the ability to grab sound waves from the air and cancel them out. He would probably have asked, “What’s a computer?”

It was possible in 1933 to imagine a device like the Bose headset, but it wasn’t possible to plot any kind of reasonable trajectory toward a real product. But after Gordon Moore correctly asserted the silicon lithography process would double the number of transistors you could place in the same amount of space every two years, companies like Bose could do the math and predict when chips might deliver the speed and efficiency needed to detect a sound wave and drive a speaker to cancel out the sound, in a fraction’s fraction of a second.

A question of scale

What will the internet look like in 2030? And specifically, will blockchain technology have a material impact on the way the internet of 2030 enables people to conduct business and live their lives? Whether or not we can answer these questions depends on whether blockchain has arrived at a stage of continuous improvement or whether we’re still waiting for new paradigm shifts.

The most important consideration for any new internet technology is scale. The internet scales because at its core it is largely stateless. Internet routers receive a packet of data and pass it to the next router. They don’t need to remember anything about those packets, and they don’t store the packet or check with other routers on the state of a packet before sending it along. 
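As a toy sketch of why statelessness scales (hypothetical table and function names, greatly simplified from real longest-prefix matching): the forwarding decision depends only on the packet in hand, so routers remember nothing and never need to synchronize with one another.

```python
# Toy sketch of stateless forwarding: the routing decision depends only on
# the packet itself plus a static table. Nothing is remembered between calls.
ROUTING_TABLE = {
    "10.0.0.0/8": "router-a",
    "192.168.0.0/16": "router-b",
}

def next_hop(dest_ip: str) -> str:
    """Pick the next hop from the destination address alone (illustrative prefix match)."""
    if dest_ip.startswith("10."):
        return ROUTING_TABLE["10.0.0.0/8"]
    if dest_ip.startswith("192.168."):
        return ROUTING_TABLE["192.168.0.0/16"]
    return "default-gateway"

def forward(packet: dict) -> str:
    # No per-packet state is stored: once this returns, the router forgets the packet.
    return next_hop(packet["dest"])
```

Because `forward` keeps no memory, any number of routers can handle any mix of packets without checking with each other, which is exactly the property a stateful system gives up.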

The stateful internet

Decentralized technology, and especially public blockchains like Ethereum, hold the promise of adding value to the internet by introducing state to this stateless system – what I call the stateful internet. 

An example of state: The state of that airplane was 30,000 feet traveling at 500 mph and now its new state is 30,100 feet traveling at 501 mph. 

Without shared state there is no shared truth, no way to agree the plane is at 30,100 feet or discover which of us is wrong if we disagree. We’re living the downsides of our failure to achieve shared truth everywhere today, both philosophically and technologically.

However, there is significant overhead involved in remembering things and coordinating between different machines that might have different memories about the same thing. Managing state makes it hard to achieve massive scale. 

See also: Paul Brody – How Small Business Can Achieve ‘Economies of Scale’ by 2030

We can’t know when blockchains will have a significant impact on the internet’s utility until we can write the “Moore’s Law” of scaling global state machines (e.g., public blockchains). 

Compartmentalization and privacy

The second most important thing to remember when considering what decentralized state machines could do to change our experience of the internet is the need to balance transparency and compartmentalization of information.

Neither today’s internet nor blockchains are very good at data compartmentalization. Any cryptographer or IT security professional will tell you that encrypting data is good, but without the ability to compartmentalize access to the actual bits – scrambled or not – encryption just increases the amount of time it will take to expose your information.

This is a particular issue for a stateful system like a blockchain. On today’s internet, an attacker at least has to catch packets in flight and figure out which ones must be reassembled with others to reconstitute a coherent message. But on a blockchain, the data is at rest. Anyone with a copy of the ledger has all the information stored there and can go to work decrypting data, decompiling business logic and analyzing metadata.

The question isn’t how we can get infinite levels of scaling and privacy. The question is how much scale and privacy we need to do useful work with the reliability and performance required by industry. If we expect the internet to serve as the back end to any and all applications – from enterprise recordkeeping to Twitch gaming – the level at which it needs to scale may border on defying the laws of physics. 

Even a massively sharded distributed database, involving no blockchain-style consensus algorithm, couldn’t handle the reading and writing of data for even a vanishingly small percentage of the applications out there. And even if we built a blockchain that could handle the throughput, most of us wouldn’t be comfortable having that kind of data, even encrypted, sitting in shared memory for others to read and analyze. No. 

Most, if not all, applications should avoid using decentralized systems like public – or even private – blockchains to read and write persistent data. 

At the frontlines of the user experience, we want performance. We want the data as close to the computation as possible and we don’t want to have to worry about other applications gumming up the system’s responsiveness. A blockchain will never be as good at that as a similar system that isn’t decentralized.

Real use

So, if the stateful internet isn’t going to be a back end for all application data, what should we use it for? One practical use is to manage cryptographic proofs that allow you to know that records in your recordkeeping system are verifiably identical to the matching records in my system and that multi-party workflows maintain integrity. This general and limited use makes the compartmentalization problem irrelevant and sets the read-write performance requirements at attainable levels. 

For example, I need to know that you and I both have the same purchase order information, so that I’m not surprised with an incorrect invoice or a delayed delivery date. To do this, we need a common frame of reference, a baseline. The public mainnet can provide the common frame of reference to let our respective systems maintain that baseline without either of us being able to say we “didn’t get the memo” or that we fat-fingered the price when we read it off the fax and typed it into the computer. These sorts of confirmations usually don’t need to be instantaneous. An acceptable time frame can be a minute, an hour or even a day. And often they can be batched. 
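One minimal sketch of that kind of consistency check, using a plain SHA-256 digest of a canonical serialization rather than the Baseline Protocol's actual zero-knowledge machinery, and with hypothetical field names: each party fingerprints its own copy of the purchase order, and only the short digests need to be compared, on chain or off.

```python
import hashlib
import json

def record_fingerprint(record: dict) -> str:
    """Hash a canonical serialization so identical records always yield identical digests."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Buyer's and supplier's copies of the same (hypothetical) purchase order.
buyer_po = {"po_number": "PO-1001", "sku": "WIDGET-9", "qty": 500, "unit_price": "4.25"}
supplier_po = {"po_number": "PO-1001", "sku": "WIDGET-9", "qty": 500, "unit_price": "4.52"}  # fat-fingered price

in_sync = record_fingerprint(buyer_po) == record_fingerprint(supplier_po)
print("records baselined:", in_sync)  # the transposed price makes the digests differ
```

The point of the sketch is that neither party reveals its record to the other or to the chain: only the fingerprints are compared, so a mismatch is detected without exposing the underlying data.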

See also: ‘Boring Is the New Exciting’: How Baseline Protocol Connected With 600 Corporates

So, what would be the minimum level of performance of a stateful internet that could confirm consistency for B2B events such as payments and inventory control? The number of non-cash payments between companies has been estimated at around 1.6 billion a day. Let’s say that another 6.4 billion non-payment events like purchase orders, RFPs and back-order notices would also require shared records to be baselined. That’s 8 billion events a day, give or take a few billion. 
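As rough arithmetic on the figures above (1.6 billion payments plus 6.4 billion other events), the sustained rate is large per event but modest once batching is allowed; the batch size below is illustrative, not a protocol parameter.

```python
EVENTS_PER_DAY = 8_000_000_000   # ~1.6B payments + ~6.4B other B2B events
SECONDS_PER_DAY = 86_400

events_per_second = EVENTS_PER_DAY / SECONDS_PER_DAY
print(f"{events_per_second:,.0f} events/second sustained")  # ~92,593

# Since these confirmations can wait a minute, an hour or a day, they can be
# batched, say 1,000 proofs rolled into one on-chain write (illustrative size):
BATCH_SIZE = 1_000
writes_per_second = events_per_second / BATCH_SIZE
print(f"{writes_per_second:,.1f} on-chain writes/second with batching")  # ~92.6
```

On this back-of-the-envelope math, tolerating latency and batching turns an impossible-sounding workload into double-digit writes per second, which is why the acceptable time frame matters so much.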

The true limitation here is the speed of coordinating a write to memory consistently across all machines maintaining the system. Sharding, and the ability to continuously improve the number of shards that can be added before performance degradation exceeds the marginal benefit of the next shard, is the key to scaling the stateful internet to provide this baseline service to industry.
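A common way to get the "common addressing" part of sharding is deterministic routing: any node can compute which shard holds a given key without asking the others. A minimal sketch follows (hash-mod placement with an illustrative shard count; production systems usually refine this into consistent hashing so resharding doesn't remap everything).

```python
import hashlib

N_SHARDS = 64  # illustrative shard count, not an Ethereum 2.0 parameter

def shard_for(key: str, n_shards: int = N_SHARDS) -> int:
    """Map a key to a shard deterministically, so every node routes the same way."""
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % n_shards

# Every participant computes the same shard for the same proof key,
# with no cross-shard lookup needed to find where a proof lives.
print(shard_for("proof:PO-1001"))
```

Because placement is a pure function of the key, adding shards spreads load without adding the per-write coordination that limits a single global state machine.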

Just as there are many other uses for silicon chips than noise cancellation, there are many other uses of a public blockchain like Ethereum. But what’s nice about the “baseline” case is that it sets specific requirements in order for practical applications to start going up the adoption curve. 

The Moore’s Law of sharded blockchains

Perhaps we are on the cusp of a “Moore’s Law” for blockchain that might say something like this: “The number of on-chain proofs that can be deposited on a mainnet shard with common addressing to all other mainnet shards doubles every 18 months.” 
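If such a law held, its compounding would look like Moore's: a doubling every 18 months multiplies capacity roughly a hundredfold over a decade. A quick projection (the initial capacity is a placeholder, not a measurement):

```python
def projected_capacity(initial: float, years: float, doubling_months: float = 18.0) -> float:
    """Capacity after `years`, assuming a fixed doubling period."""
    doublings = (years * 12.0) / doubling_months
    return initial * 2.0 ** doublings

# Relative growth over 10 years at an 18-month doubling period:
growth = projected_capacity(1.0, 10.0)
print(f"roughly {growth:.0f}x")  # roughly 100x
```

That hundredfold figure is why writing down such a law matters: it turns "will blockchains scale?" into a dated prediction that can be checked, the way Bose could check Moore's.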

Perhaps not. But it seems likely that the next 18 months will tell us a lot, as Ethereum 2.0 rolls out and, hopefully, subsequent advancements increase confidence in the feasibility of additional sharding. 

If we are there, if Ethereum 2.0 works and shows a path of continuous improvement, then we can expect the next ten years to deliver a stateful internet that, at the very least, will be a useful way to keep business data in sync. If Eth 2.0 doesn’t deliver on promises, or if unexpected problems arise in the sharding scheme, then we will be looking for new paradigm shifts. 

See also: Ben Edgington – It’s Time to Launch the Ethereum 2.0 Beacon Chain

Whether or not we can predict a 10-year timeline, what seems likely is that we are on our way to a stateful internet. And that will be profoundly transformative both in ways we can imagine today, and in ways we can’t yet see.


