Don’t dumb down smart tech
*This editorial first appeared in the Detroit News on July 6, 2017.*
In taking the first step towards overturning his predecessors’ net neutrality rules, FCC Chairman Ajit Pai got what he expected — lots of complaints. This issue demands a spirited debate, but when so much opinion-making has taken place in the muddy waters of social media, it’s also essential that we expose a central fallacy at the heart of much of the opposition.
The goal behind the existing FCC mandates has been to proactively address the possibility of anticompetitive behavior on the internet, even though no one has credibly argued that such behavior has occurred on any meaningful scale. But the bigger and more egregious fallacy is the claim that prioritization of internet traffic is harmful, while the equal treatment of every bit of data poses no risk to any service we currently need or enjoy.
An egalitarian approach to data may seem like common sense, and it might even be reasonable in a world where all data consists of emails and Twitter posts. But what happens when an Amber Alert is forced to compete with social media for limited bandwidth? Prioritization allows for important distinctions between these types of data, ensuring the continuity and quality of time-sensitive traffic such as emergency alerts and video, even if that means marginally slowing down simple text (by hundredths of a second or less).
It’s time to move past the overly simplistic treatment of prioritization as a dirty word and recognize that it is essential engineering DNA for handling the complexities of modern networks. In fact, attacks on prioritization fly in the face of the nearly constant innovation that has led to incredibly smart, reliable and fast networks.
Today, because of trillions of dollars invested in technology and networks, internet providers can simultaneously deliver the bandwidth needed for connected-car technologies, fully automated manufacturing and supply-chain logistics, and streaming video and music, all while making sure your Instagram post appears in a fraction of a second.
In this sense, prioritization should be understood not just as benevolent, but as necessary; it eliminates unfair and potentially destructive competition for bandwidth. Smart network systems constantly and swiftly shift resources to maintain quality for critical services. This prevents, as one example, the download of a large routine software update from interfering with a city’s traffic management system.
In a world where all data is mindlessly treated the same, the application needing a routine update could flood the network with repeated requests, crowding out data that is essential for the safety of drivers: a self-inflicted, denial-of-service-style attack that could cause vital network functions to fail.
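To make the idea concrete, here is a minimal, hypothetical sketch in Python (illustrative only, not any network operator’s actual traffic-management code; the class names and priority labels are invented for this example): a strict-priority queue in which a city’s traffic-light control message is always transmitted before queued chunks of a bulk software update, regardless of arrival order.

```python
import heapq

# Hypothetical priority classes for illustration: lower value drains first.
CRITICAL = 0     # e.g., a city's traffic-management signals, emergency alerts
INTERACTIVE = 1  # e.g., video calls, live streams
BULK = 2         # e.g., routine software-update downloads


class PriorityScheduler:
    """Toy strict-priority packet scheduler (a sketch, not a carrier's real system)."""

    def __init__(self):
        self._queue = []
        self._seq = 0  # tie-breaker preserves arrival order within a class

    def enqueue(self, priority, packet):
        heapq.heappush(self._queue, (priority, self._seq, packet))
        self._seq += 1

    def dequeue(self):
        # Always transmit the most urgent waiting packet first.
        return heapq.heappop(self._queue)[2] if self._queue else None


sched = PriorityScheduler()
sched.enqueue(BULK, "software-update chunk 1")
sched.enqueue(CRITICAL, "traffic-light control message")
sched.enqueue(BULK, "software-update chunk 2")

print(sched.dequeue())  # the traffic-light message goes out ahead of the update
```

Real networks use far more sophisticated scheduling than this toy example, but the principle is the same: the scheduler distinguishes between classes of traffic rather than treating every bit identically.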
Prioritization is about preserving the stability and reliability of service that is already fast. In fact, prioritization in different forms has existed for years and yet, as FCC Commissioner Michael O’Rielly recently said, there is “no credible evidence of harms to businesses or consumers.” O’Rielly pointed out that in other contexts, even strong net neutrality supporters “routinely pay for a variety of services to ensure the best possible experience.” Just as paid overnight package services don’t slow down all other mail, “express” delivery of online data has little or no impact on all other traffic.
Net neutrality, the idea that network providers shouldn’t pick winners and losers, is useful so long as it’s applied intelligently and not blindly. Unfortunately, that’s not the case with the existing FCC rules, which threaten network capacity and flexibility by putting the role of regulators and lawyers above the complex technological and business realities of network management.
The true innovators in the internet ecosystem are the engineers and developers who create new products and services. But as long as the internet is governed as a public utility, these creators must take a back seat to lawyers and regulators, needing permission slips for new innovations that require ultra-fast speeds, high degrees of reliability, or both. No company can invest in the next great thing in this uncertain environment.
This combination of business uncertainty and the need to take a cautious, lawyer-driven approach to network management will lead to dumbed-down systems and scaled-back investments. The ironic result is that consumers will have less access, slower speeds, and reduced reliability.
Instead, regulators should return to the longstanding and bipartisan approach of light-touch rules under which the internet has thrived for the past two decades. That is what Chairman Pai has begun to do, with the goal of putting our ever-smarter network technology and technologists back in charge.