American telecom giant AT&T proposed something this week that is almost certainly dead on arrival: the reincarnation of Ma Bell thought it would be just swell if advertisers and other businesses could pay for the wireless data its customers use for certain apps. Want to watch streaming video on your phone but don’t want to pay for the digital mileage? No worries: NBC would be happy to pay for your viewing time—provided, one imagines, it gets at least some access to the treasure trove of information on the average person’s smartphone.
Even if it didn’t raise privacy concerns, AT&T’s idea is likely to be torpedoed by the US Federal Communications Commission for the simple reason that it would punch a massive hole in the already leaky concept of “net neutrality,” the idea that the digital domain works best when its biggest players can’t buy their way to static dominance.
Whatever the future of AT&T’s proposal (and there may be a version out there that’s less offensive to American regulators), it illustrates one of the fundamental differences between the digital world we’ve built since the 1990s and the analog systems that preceded it: classic TV or radio was an all-you-can-eat affair, because the channel was always open and nobody was metering how much you watched. This was partly a function of technology, but also a function of business model: TV viewers didn’t have to pay, because advertisers already were. AT&T isn’t inventing some novel evil here.
If telco giants can’t sell as many high-priced megabytes as they want, that’s their problem. The problem we all face in the 21st century is that a digital, metered, controlled world has very few pieces that can easily be turned to a public purpose when needed. Television and radio, which both assumed their modern forms during the Cold War, were explicitly built with certain public functions in mind—like disseminating news during an emergency. (Even in Canada, anyone who received US signals on their TV ought to know what a test of the emergency broadcast system sounds like.) Because the channel was always open and nobody had to check their wallet before turning on the TV, government mostly didn’t have to worry about whether people would get the news that Soviet missiles were on their way.
The framework of the digital economy was mostly built after the fall of the Soviet Union, and the emergencies governments face now are for the most part less apocalyptic. Perhaps that’s why there’s no obvious lever for governments to pull when it comes to disseminating information using 21st century means during the 21st century’s emergencies. Whether it was 2003’s blackout across a substantial chunk of the continent or the recent ice storm in Ontario, radio and TV remain the go-to media for leaders looking to inform and calm their voters.
Sure, governments can and do release reams of information on their websites—but they have no way of making sure that information gets to citizens, especially in cases where the power goes out. Even if your cellphone is charged and the wireless network is still operating, what if you ran out of minutes this month?
For now, this is a small and mostly theoretical problem. But with younger consumers increasingly abandoning not just landline phones but traditional broadcasting altogether, governments are going to need to think hard about how they reach their citizens in a digital, metered world.