by Molly White
When I speak about the inefficiency of popular blockchains, or mention that we seem to be hurtling towards a “web3” so centralized it rivals big tech’s firm grasp on today’s web, or point out that somehow no one has managed to find a positive use for blockchains that wouldn’t be better served by blockchainless technologies, I often hear “it’s the early days”. “Give it a chance”. “People are still figuring all this blockchain stuff out, ironing out the kinks”.
Bitcoin, currently one of the best-known and most-used blockchains, began to be used in 2009. Ethereum, another well-known and popular blockchain, launched in 2015. In the grand scheme of things, 2009 and 2015 were not that long ago. But in the technology world, they were a lifetime ago.
In 2009, smartphones without physical keyboards were starting to become more popular. We still aggregated our favorite blogs to read on Google Reader, but people had started posting their thoughts on this weird new website called “Twitter”. VentureBeat had just published an article urging people not to “believe the hype” around fully-electric cars, writing that Tesla was “floundering” after recently bringing in Elon Musk as CEO. Uber was founded, and people started to talk more widely about this “gig economy” idea. Intel had just released its first Core i5 processors, with the i3 line soon to follow. Consumer-grade desktop computers usually shipped with 2 or 4GB of RAM. In the software world more specifically, Go was publicly announced, though not yet popular. MongoDB and Redis were brand new players in databases. jQuery was just taking off, about to reach near-ubiquity over the coming years. Node.js was first released. Windows 7 was the hot new thing, after the horror that was Windows Vista.
All that to say, a lot has changed in the technology world in the past six to twelve years. One only needs to look at Moore’s law to see how this is pretty much built into the technology world, as once-impossible ideas are rapidly made possible by exponentially more processing power. And yet, we are to believe that as technology soared forward over the past decade, blockchain technologies spent that time tripping over their own feet?
“Blockchains have been around for a while,” some will say, “but so many web3 concepts are brand new!” Bullshit, I say to that. Cryptocurrency exchanges have been around for ages—the infamous Mt. Gox launched in 2010. Stablecoins have been around since 2014. One of the first well-known DAOs (decentralized autonomous organizations) was created in 2016. Smart contracts became popular in 2017, and along with them came decentralized finance platforms. NFTs were one of the more recent creations—2018—and a truly stunning example of how this space is apparently only getting worse the more people try to innovate in it. And in 2018, guess what Neha Narula and Alexis Ohanian were saying about cryptocurrencies and blockchain technologies? “It’s early days”.
So this raises the question: How long can it possibly be “early days”? How long do we need to wait before someone comes up with an actual application of blockchain technologies that isn’t a transparent attempt to retroactively justify a technology that is inefficient in every sense of the word? How much pollution must we justify pumping into our atmosphere while we wait to get out of the “early days” of proof-of-work blockchains? How many people must be scammed for all they’re worth while technologists talk about just beginning to think about building safeguards into their platforms? How long must laypeople, so eagerly hustled into blockchain-based projects that promise to make them millionaires, be scolded as though it is their own fault when they are scammed, as if they should have been capable of auditing smart contracts themselves?
The more you think about it, the more “it’s early days!” begins to sound like the desperate protestations of people with too much money sunk into a pyramid scheme, hoping they can bag a few more suckers and get out with their cash before the whole thing comes crashing down.