What CES 2026 didn’t show: The quiet crisis in wireless capacity nobody is talking about


Have you noticed that in crowded places we often no longer get decent internet connectivity? Our mobile applications have multiplied, and cellular networks are overloaded, no longer able to keep up with our demand for data.

Perhaps the most surprising revelation at this year’s Consumer Electronics Show (CES) wasn’t what was on display, but what wasn’t.

Founder and CEO of Taara.

Across the Las Vegas Convention Center, in nearby hotels, and along the Strip, internet connectivity was often slow, unreliable, or prohibitively expensive at $80 per day. Attendees adapted, as they always do, hopping between cellular access and Wi-Fi.

After all, that’s what we’ve learned to do in crowded places like malls, stadiums, and dense metros. But events like CES are where the future is meant to feel real. How can it feel real when the basic exchange of data struggles under the weight of demand?

At concerts, stadiums, festivals, and city centers, connectivity struggles precisely when it matters most. Messages stall, uploads crawl, calls drop. We’ve learned to work around it because we assume this is simply how wireless networks behave under pressure. But I think that assumption deserves to be challenged.

If the technologies on display are meant to define the next decade – autonomous systems, intelligent infrastructure, and AI that operates in real time at the edge – then the networks supporting them can’t be an afterthought. The networks we’ve designed have been fine for decades, but they’re not fit for what’s coming next.

Why connectivity fails when we need it most

Cellular networks are designed for steady, distributed usage, not for short, intense bursts of demand from tens of thousands of devices packed into a confined area. At events like CES, every attendee arrives with multiple connected devices, all competing for attention at the same time.


Phones are streaming video, uploading high-resolution photos, syncing cloud data, and running applications that take persistent, low-latency connections for granted – not to mention the likes of Claude, Gemini and ChatGPT.

When that load spikes suddenly and repeatedly, the network is pushed far beyond the conditions it was designed to handle.

Most wireless connectivity today relies on shared radio frequency spectrum. In dense environments, that bandwidth quickly becomes saturated, and the resulting congestion degrades performance for every user on the network.

And as more users join the network, each one receives a smaller slice of the available capacity. Attempts to solve the problem by adding more radios can actually make things worse, increasing congestion and interference, which slows speeds and undermines reliability. That’s why connectivity degrades exactly when it should be most reliable.
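To make that concrete, here’s a back-of-envelope sketch in Python of how the per-user share of a fixed cell capacity collapses as a venue fills up. The capacity figure and the per-user contention overhead are illustrative assumptions, not measurements from any real network:

```python
# Back-of-envelope: per-user share of one shared-spectrum cell.
# All numbers are illustrative assumptions, not measurements.

CELL_CAPACITY_MBPS = 1_000  # total usable downlink capacity of one cell

def per_user_mbps(n_users: int, contention_overhead: float = 0.00005) -> float:
    """Fair share of capacity per user, with a small per-user
    contention/signaling overhead eating into usable capacity."""
    usable = CELL_CAPACITY_MBPS * max(0.0, 1.0 - contention_overhead * n_users)
    return usable / n_users

for n in (100, 1_000, 10_000):
    print(f"{n:>6} users -> {per_user_mbps(n):6.2f} Mbps each")

#    100 users -> ~10 Mbps each: streaming is fine
# 10,000 users -> ~0.05 Mbps each: messages stall, uploads crawl
```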

The network isn’t failing because it’s neglected; it’s failing because it’s being asked to do something it was never architected to handle.

The interference problem

Interference is the key here, and it’s a limitation rooted in physics rather than policy or pricing. Radio frequency spectrum is inherently shared. In a dense environment, thousands of devices and access points are transmitting and receiving at the same time, all within a relatively small physical space.

Signals collide, overlap, and compete, and every additional transmission increases the noise floor for everyone else. The natural response has been to add more radios, more channels, and more complexity, but capacity doesn’t scale linearly when interference is the dominant bottleneck.
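That sublinear scaling is easy to see in a toy Shannon-capacity model. The bandwidth, power, and coupling values below are invented for illustration; the point is only the shape of the curve, in which each co-channel radio raises the interference every other radio sees, so aggregate capacity flattens out:

```python
import math

# Toy model (illustrative values only): n co-channel radios in one
# venue, each transmitting at the same power. Every extra radio adds
# interference for all the others (Shannon capacity with SINR).

BANDWIDTH_MHZ = 100.0
SIGNAL = 1.0                  # received signal power per link (normalized)
NOISE = 0.01                  # thermal noise floor (normalized)
INTERFERENCE_COUPLING = 0.05  # fraction of each other radio's power heard

def aggregate_capacity_mbps(n_radios: int) -> float:
    sinr = SIGNAL / (NOISE + INTERFERENCE_COUPLING * SIGNAL * (n_radios - 1))
    per_link = BANDWIDTH_MHZ * math.log2(1.0 + sinr)  # Mbps per radio
    return n_radios * per_link

for n in (1, 2, 4, 8, 16, 32):
    print(f"{n:>3} radios -> {aggregate_capacity_mbps(n):8.1f} Mbps total")

# Doubling from 16 to 32 radios adds only ~18% total capacity here:
# once interference dominates, more radios buy very little.
```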

What’s needed is a more deliberate approach to how capacity is delivered and used.

High-capacity backhaul should not be competing with access traffic over the same limited spectrum.

Instead, large amounts of data need to be moved into these environments through links that are predictable, isolated, and designed for throughput, feeding a greater number of smaller, shorter-range access points that serve users locally.

This separation allows limited radio resources to be used where they are most effective, rather than stretched thin across entire areas.
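As a hypothetical sketch of that separation, imagine one interference-free backhaul link feeding a cluster of short-range access points, each reusing the same spectrum over a small area. All figures below are assumptions chosen for round numbers, not a real deployment:

```python
# Hypothetical backhaul/access split: an isolated, high-throughput
# backhaul link feeds many small, short-range access points, each
# reusing the radio spectrum over a tiny area.

BACKHAUL_GBPS = 20.0            # e.g. one point-to-point link into the venue
AP_RADIO_CAPACITY_MBPS = 1_000  # what one short-range AP can deliver
USERS_PER_AP = 50

n_aps = int(BACKHAUL_GBPS * 1_000 // AP_RADIO_CAPACITY_MBPS)
served_users = n_aps * USERS_PER_AP
per_user = AP_RADIO_CAPACITY_MBPS / USERS_PER_AP

print(f"{n_aps} access points, {served_users} users at ~{per_user:.0f} Mbps each")
# -> 20 access points, 1000 users at ~20 Mbps each, because each AP's
#    spectrum is reused locally instead of shared across the whole venue
```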

When networks are designed with precision instead of simply “adding more”, performance becomes more consistent, latency becomes more predictable, and the experience improves for everyone, even under extreme demand.

Precision over “spray and pray”

We don’t need to rewrite the laws of physics to solve this problem – we just need to work with them. If radio-based interference is the limiting factor in dense environments, then one answer is to move as much high-capacity traffic as possible onto links that are not subject to interference in the first place.

This is where wireless optical communication begins to change things. Instead of broadcasting energy across wide areas of shared spectrum, optical systems use tightly focused beams of light to move data directly between two points, creating predictable, high-capacity links that don’t compete with surrounding wireless traffic.

Because these links are narrow and precise, they can carry large volumes of data without contributing to congestion, even in the most crowded settings.
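A rough geometric sketch shows why. Using illustrative values for beam divergence and initial diameter (real systems vary), the footprint of a focused optical beam after a kilometer is about a square meter, so its energy arrives at one receiver instead of raising the noise floor across a venue:

```python
import math

# Why a focused optical beam doesn't raise anyone else's noise floor:
# a rough geometric sketch with illustrative divergence/aperture values.

def beam_diameter_m(distance_m: float, divergence_mrad: float,
                    initial_diameter_m: float = 0.05) -> float:
    """Approximate beam diameter after travelling distance_m."""
    return initial_diameter_m + distance_m * divergence_mrad * 1e-3

d = beam_diameter_m(distance_m=1_000, divergence_mrad=1.0)
footprint = math.pi * (d / 2) ** 2

print(f"beam diameter at 1 km: {d:.2f} m, footprint {footprint:.2f} m^2")
# ~1.05 m across, under 1 m^2: the energy lands on one receiver.
# An omnidirectional RF transmitter reaching the same 1 km range
# illuminates roughly pi * 1000^2 ≈ 3.1 million m^2.
```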

That makes them well suited for backhauling capacity into convention centers, stadiums, and dense urban areas, where trenching fiber is slow, expensive, or impractical, and where cellular or satellite connections buckle under the broad load they’re being asked to carry.

By delivering gigabit-scale throughput over the air with fiber-like latency, optical links can feed a greater number of localized access points, allowing radio spectrum to be used where it performs best: at short range, serving users directly.

We don’t need to replace existing wireless technologies; we need to build on them to achieve a more balanced architecture that is deliberate and fit for purpose.

CES, as ever, was a remarkable showcase of breakthrough technologies. But if the connectivity infrastructure isn’t in place to support these new applications, they’re destined to remain on the stage rather than in people’s homes, offices, and cities.

In other words, the future hinted at in Las Vegas will only become real when the connectivity infrastructure is designed with the same ambition as the innovations it’s meant to support.


This article was produced as part of TechRadar Pro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadar Pro or Future plc. If you are interested in contributing, find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro

