Half the equation and half the definition

There is a lot of confusion over what constitutes Net Neutrality, so much so that parties on fiercely opposite sides of an issue both claim to be for it. As an example, the current controversy over Free Basics pits Facebook, whose CEO penned an Op-Ed entitled "Free Basics protects net neutrality", against a volunteer coalition, SaveTheInternet (STI), whose entire charter is to protect Net Neutrality. As the Op-Ed from the volunteers suggests, the basic contention between Facebook and the volunteers is a different definition of Net Neutrality. While the concept of Net Neutrality was coined by Tim Wu back in 2003, the definition of what constitutes it has been evolving.


Let me walk you through the evolution of the definition that the STI coalition is going with, a definition that is widely accepted and that I arrived at after years of researching the issue. I will explain why Facebook (amongst countless others; they are not solely to blame here) considers only half the equation and thus ends up with half the definition.



I'll start with the folk definition that we started hearing around 10 years ago:


Folk Definition: All packets must be treated equally

As networking researchers, we knew that this definition was not practical, and it made little sense to us. Without getting too deep into the details, routers on the Internet do not treat all packets identically: TCP-SYN packets are treated differently from TCP-Data or TCP-ACK packets, UDP packets are treated differently, packets at the tail of a queue are dropped during congestion, and so on. However, we also knew what the principle of Net Neutrality was trying to say, and that was that the network should not discriminate. So the folk definition needed to be made crisper. The FCC adopted Net Neutrality rules last year, and its definition broadly laid out the following principles:

FCC: ISPs will not block or throttle any traffic and will not implement any paid prioritization (no fast lanes)

This definition shifts the abstraction from how packets are treated to how services are treated, which is a logical progression. However, the FCC missed one crucial aspect: it did not incorporate the concept of differential pricing into its Net Neutrality principles. Zero Rating, which is a special case of differential pricing, was not a big problem in the US when the Open Internet Order was voted upon, and the FCC preferred a wait-and-watch approach to it (as a refresher, Zero Rating is the practice where consumers don't pay for the bandwidth of some or all services; instead the cost of the bandwidth is borne either by the ISP or the content provider). The FCC definition focused on quality of service (QoS) as the determining factor for Net Neutrality and insisted that all content on the Internet receive the same quality of service from ISPs. The intent was to not provide a competitive advantage to any service on the Internet, as that is in the best interest of both consumers and entrepreneurs. However, it fell short in the following way:


As a brief background, in game theory (the mathematical tool we have used in our work on analyzing the issue), the quantity we focus upon is called Consumer Surplus. Surplus is defined as the utility derived from a particular service minus the cost paid to obtain that service. The utility is a mathematical quantity that models the impact of the QoS obtained for a particular application, and the FCC was absolutely correct in enforcing neutrality there. But the FCC did not model the cost paid in its definition of Net Neutrality (and that is the definition Facebook uses to justify Free Basics as being consistent with Net Neutrality). How much an application costs changes the surplus a consumer obtains, and applications with similar utility (quality) but differing costs provide different surpluses. In game-theoretic models, higher surpluses translate into competitive advantages, so it is crucial to model the cost aspect of an application to arrive at a definition of Net Neutrality that works. Differential pricing or Zero Rating of select services absolutely violates the principle of Net Neutrality if we consider the impact on consumer surplus.
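
To make this concrete, here is a minimal sketch of the surplus calculation in Python. The linear utility and the specific numbers are illustrative assumptions of mine, not the actual model from our game-theoretic work:

    # Toy consumer-surplus calculation (illustrative numbers only).
    # Surplus = utility derived from a service minus the price the consumer pays.

    def surplus(utility_from_qos, price_paid):
        """Consumer surplus for a single service."""
        return utility_from_qos - price_paid

    # Two services with identical QoS, and therefore identical utility.
    utility = 10.0          # same quality of service => same utility
    price_regular = 2.0     # the consumer pays for the bandwidth
    price_zero_rated = 0.0  # zero rated: the consumer pays nothing

    print(surplus(utility, price_regular))     # 8.0
    print(surplus(utility, price_zero_rated))  # 10.0
    # A QoS-only test sees two identical services; the surplus test does not.

A QoS-only rule, like the FCC's, is blind to the difference between these two services, while the surplus view exposes it immediately.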



Thus, if we model only half the equation, we end up with a definition of Net Neutrality that focuses solely on QoS; if we model the equation fully, the price of the service comes into play. A lot of people model only half the equation (Facebook included) and thus claim that differential pricing (Zero Rating specifically) is fine under Network Neutrality, but that is not true. If we are talking about a true level playing field, the other half of the equation cannot be ignored.
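
To illustrate why the pricing half matters for competition, here is a toy choice model, again purely an assumption for illustration (a simple logit rule, not our published model), in which consumers pick a service with probability proportional to the exponential of its surplus:

    # Toy logit choice model: market share grows with consumer surplus.
    # The logit rule and the numbers are illustrative assumptions only.
    import math

    def choice_shares(surpluses):
        """Probability of choosing each service, proportional to exp(surplus)."""
        weights = [math.exp(s) for s in surpluses]
        total = sum(weights)
        return [w / total for w in weights]

    # Same utility (same QoS), different consumer-facing prices.
    surplus_regular = 10.0 - 2.0     # consumer pays for the bandwidth
    surplus_zero_rated = 10.0 - 0.0  # zero rated

    print(choice_shares([surplus_regular, surplus_zero_rated]))
    # ~[0.12, 0.88]: the zero-rated service dominates despite identical QoS.

Even though the two services are indistinguishable on QoS, the zero-rated one walks away with most of the market, which is exactly the competitive advantage a workable definition has to rule out.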




Access Now, a global non-profit aimed at protecting the digital rights of citizens, has adopted a definition that states the following:


Access Now: Net neutrality requires that the Internet be maintained as an open platform, on which network providers treat all content, applications and services equally, without discrimination.

This definition implies that differential pricing cannot be adopted, but it does not say so explicitly, and people (usually differential pricing advocates) can easily ignore the pricing aspect of a service and claim that Zero Rating is consistent with it. To fix this and make things explicit, I have proposed the following definition, which has received acceptance from academics, policymakers, entrepreneurs and activists alike, and which I announced publicly some time back:


This definition has the following properties:


  1. It incorporates both QoS and pricing in the definition of Net Neutrality, thus correctly modeling consumer surplus.
  2. It makes explicit the notion that Net Neutrality is not about how we treat packets but about how we treat competition.
  3. It allows for reasonable traffic management by ISPs without violating Net Neutrality.
  4. It allows differential QoS and/or pricing as long as it is applied in a non-discriminatory way. ISPs can prioritize all real-time traffic (e.g., all voice or all video-conference traffic, in a provider-agnostic way) over all non-real-time traffic. Similarly, all emergency services or health-monitoring apps can be prioritized.
  5. It allows creating and differentially pricing an entire class of services. For instance, an ISP can create an extremely low-latency service and offer it to all games/gamers without discrimination, and that should be fine (see the sketch after this list). The definition permits differentiation between classes of service, but prohibits discrimination within a class.
  6. It ensures a level playing field on the Internet, where upstarts can come in and compete on the basis of ideas.
  7. Lastly, and this is only half in jest, the definition fits in 140 characters including a hashtag.
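
To illustrate properties 4 and 5, here is a hypothetical policy check. The rule structure and field names are my own invention for this sketch; the point is only that a rule passes when its QoS or pricing treatment keys on the class of service rather than on who provides it:

    # Hypothetical neutrality check (the rule format is invented for illustration).
    # A rule is acceptable if it applies to a whole class of service,
    # and discriminatory if it singles out a specific provider.

    def rule_is_neutral(rule):
        """True if the rule is provider-agnostic (class-wide), False otherwise."""
        return rule.get("provider") is None

    rules = [
        {"service_class": "real-time gaming", "qos": "low latency", "price_per_mb": 1.5},
        {"service_class": "video", "provider": "SomeStreamingCo", "price_per_mb": 0.0},
    ]

    for rule in rules:
        verdict = "fine" if rule_is_neutral(rule) else "discriminatory"
        print(rule["service_class"], "->", verdict)
    # A low-latency tier for all gaming traffic is fine;
    # zero rating one provider's video service is not.
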
I am a big believer in the power of capitalism, as I think humans are largely selfish with varying degrees of altruism. However, for capitalism to work for the greater good of society, it is critical that the selfish interests of corporations align with public interests. And that's where regulators step in, using the concept of mechanism design, to introduce a minimal set of regulations that incentivize corporations to act in societal interest. I think the concept of Net Neutrality, defined in the way above, provides the mechanism for the Internet economy to work in the public interest. I hope the right regulations get passed.


