Or: How I Learned to Stop Worrying and Love My 4G Phone
By Jeffrey W. Page, Sr.
We live in an age of technological miracles, or so the marketing departments would have us believe. Every year brings promises of revolutionary breakthroughs that will transform our lives: 5G will change everything, quantum computers are just around the corner, and AI will do all our work for us. There’s just one tiny problem—most of it is smoke, mirrors, and very expensive vaporware.
On 5G: The Upgrade Nobody Asked For (And Nobody Can Tell the Difference)
Remember when 5G was going to revolutionize civilization? The hype machine promised us smart cities, autonomous vehicles, remote surgery, and downloads so fast they’d make our heads spin. Fast forward to 2025, and what do most people actually use 5G for? Scrolling through social media slightly faster than before—assuming they can even find a 5G signal.
The reality check comes from an unexpected source: some experts now argue that 4G remains the superior choice for most everyday users, offering better coverage, exceptional reliability, and longer battery life. That’s right—the “old” technology works better for actual human beings. Sure, 5G has enabled some impressive industrial applications like smart factories and vehicle-to-everything communication in controlled environments, but let’s be honest: when was the last time you needed your refrigerator to communicate with a traffic light?

The average person’s interaction with 5G consists of watching their battery drain faster while their phone desperately searches for a signal that may or may not exist. The theoretical speeds of 5G exceed 4G dramatically on paper, but scenarios where such extreme speeds are necessary remain limited in everyday life. Translation: you can download a movie in 10 seconds instead of 30 seconds. Congratulations, you’ve saved enough time to… continue staring at your phone.

The infrastructure costs are astronomical, requiring new towers and small cells everywhere, yet most of us are still waiting for reliable coverage in our own homes. It’s the technological equivalent of building a Ferrari to drive to the grocery store—impressive, expensive, and completely unnecessary.
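The arithmetic behind that kind of claim is simple division. Here is a back-of-the-envelope sketch; the movie size and link speeds are illustrative assumptions (real-world throughput varies enormously), not measurements:

```python
# Back-of-the-envelope download times. The 2 GB movie and the link speeds
# below are assumed round numbers for illustration, not benchmarks.
MOVIE_GB = 2.0
MOVIE_MEGABITS = MOVIE_GB * 8000  # 1 GB = 8000 megabits

def download_seconds(speed_mbps: float) -> float:
    """Time to transfer the movie at a sustained link speed in Mbps."""
    return MOVIE_MEGABITS / speed_mbps

for label, mbps in [("4G (good signal)", 50), ("5G (mid-band)", 500)]:
    print(f"{label}: {download_seconds(mbps):.0f} s")
```

Even granting 5G a generous 10x speed advantage, the practical difference is a few minutes of waiting versus half a minute—which is exactly the point: the time saved goes straight back into staring at the phone.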
On Quantum Computing: Schrödinger’s Computer (It’s Both Working and Not Working Until Someone Checks)
Ah, quantum computing—the technology that’s been “just five to ten years away” for the past two decades. It’s the fusion energy of the computing world, perpetually on the horizon but never quite arriving.

Let’s talk about what quantum computers actually need to function. Current dilution refrigerators for hundred-qubit systems cost millions of dollars and require specialized facilities with vibration isolation, electromagnetic shielding, and continuous helium supply chains. Oh, and they need to be cooled to approximately 15 millikelvin—nearly absolute zero, representing one of the most extreme engineering environments ever created for computing. Here’s the kicker: scaling to millions of qubits would require refrigeration consuming the power output of a small city. Your electric bill is going to be a problem.

But surely they must work well once you get past these minor inconveniences? Well, most current quantum computers run for only a few milliseconds, with record-breaking machines managing just over 10 seconds. A recent “breakthrough” achieved an astounding two hours of operation—which researchers celebrated like they’d just discovered fire. For context, my laptop has been running continuously for three weeks without complaining.
As one Tom’s Hardware commenter eloquently put it: “Quantum computing is starting to look like the ‘Fusion Energy’ of processing data. Always at a distance…like a mirage.” The scientific community isn’t much more optimistic. Experts broadly agree that the talk of “lots of applications” for quantum computers as they currently exist is just hype, with far fewer real-world uses than some would like us to believe. When even scientists are calling out the hype, you know something’s amiss.

Meanwhile, researchers have determined that single-GPU systems will continue to outperform quantum computers for the foreseeable future. So basically, the gaming rig in your bedroom is more powerful than a million-dollar quantum computer that needs the temperature of deep space to function.

Despite concerns about quantum computing’s potential impacts, only 5% of organizations consider it a high priority, and just 5% have a defined quantum computing strategy. Even the people who should be worried aren’t worried. That should tell you something.
On AI: The “Intelligence” That Needs You to Be Intelligent First
Finally, we arrive at AI—the technology that’s simultaneously going to steal all our jobs and also can’t write a coherent email without extensive human guidance.

The dirty secret about AI that nobody in Silicon Valley wants to admit: AI primarily functions as a complementary asset, augmenting human skills rather than replacing them, and it operates optimally only when guided by people who possess the requisite domain-specific knowledge and skills. In other words, if you don’t know what you’re doing, AI won’t magically make you competent.

Many people incorrectly assume AI systems can entirely fill gaps in their personal knowledge or expertise, thinking they can produce excellent design work without a design background or write persuasively without writing experience. The reality? Even sophisticated AI tools will be insufficient to guarantee successful outcomes if users are incapable of independently executing the task themselves.

The problem goes deeper. AI tells you things that sound very true but in fact are not, making it difficult for anyone without in-depth knowledge to tell the difference. So you need to be an expert to know when the AI is confidently spouting nonsense—which it does with alarming frequency.
AI needs vast and accurate data to produce useful results; without strong input, the system can make poor choices or show bias. Garbage in, garbage out, as they say. Except now the garbage comes with a fancy neural network wrapper.

And let’s not forget the limitations. AI lacks a deep understanding of the world, operating based on patterns learned from data without comprehending underlying concepts. It can’t handle common sense reasoning, emotional intelligence, or nuanced ethical judgment. Basically, all the things that make us human are the things AI spectacularly fails at.

When knowledge becomes commoditized through AI tools, its value paradoxically shifts from content to context—and the most valuable human expertise increasingly lies in identifying unasked questions and recognizing unknown unknowns. Translation: AI is great at giving you answers, but you still need to be smart enough to know what questions to ask.
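The garbage-in, garbage-out point doesn’t require a neural network to demonstrate. Here is a deliberately simplistic sketch (a toy majority-vote “model,” not a real ML pipeline, with made-up labels) showing how a model trained on a skewed sample simply reproduces the skew:

```python
from collections import Counter

def train_majority(labels: list[str]) -> str:
    """A trivial 'model' that always predicts the most common training label."""
    return Counter(labels).most_common(1)[0][0]

# Hypothetical loan-decision labels: the real world is balanced...
balanced_sample = ["approve", "deny"] * 50

# ...but the training data we happened to collect is not.
skewed_sample = ["approve"] * 95 + ["deny"] * 5

print(train_majority(balanced_sample))  # reflects the balanced data
print(train_majority(skewed_sample))    # inherits the skew wholesale
```

A billion-parameter model does the same thing with more decimal places: whatever bias is in the data comes out the other side, dressed up as a prediction.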
The Bottom Line
Don’t get me wrong—these technologies have legitimate applications. 5G works wonders in industrial settings with controlled deployments. Quantum computers might eventually solve specific problems in chemistry and cryptography (in about 20 years). And AI is genuinely useful as a tool for people who already know what they’re doing.

But let’s stop pretending these are magic solutions that will instantly revolutionize everything. Your life hasn’t dramatically changed because of 5G. Quantum computers still can’t run longer than a decent Hollywood movie. And AI is only as intelligent as the person using it—which is both reassuring and terrifying, depending on who that person is.

So the next time someone breathlessly tells you about the coming technological revolution, smile, nod, and then go back to your 4G phone that actually works reliably. Sometimes the old ways are old because they’re actually good.
The author can be reached on their 4G connection, when it’s available, which is most of the time, because 4G coverage is actually pretty good.