Thursday, February 18, 2016

Why the FCC Should Adopt CalSPEED, Part 2

In Part 1, I briefly went through three mobile data-related questions that underlie the FCC’s Big Policy Question regarding Advanced Communications Services as defined in section 706 of the Telecommunications Act of 1996.

In Part 2, I’ll show how the mobile data the FCC is currently using falls short in answering the Big Policy Question and why the FCC should adopt the CPUC’s mobile testing methodology.

CPUC Mobile Field Test Data

The CPUC conducts a statewide field test of the four major mobile providers twice a year at 1,986 locations (52% rural, 37% urban, and 11% tribal). The test relies on both a west coast server and an east coast server in order to mimic the user experience for both cached and non-cached content. The test uses both an Android smartphone and a tablet in order to profile more than one device, and it does not filter any of its results: throughput, coverage, latency, or any other network metric.


Pros:

  • Tests at the same locations twice a year
  • Point data results are interpolated to create “heat maps” of speed, adjusted speed, latency, VoIP, streaming video, and video conferencing.
  • Test works where there’s no network. “No service” is recorded and the test result is uploaded to the CPUC’s database the next time the test is run in a connected environment.
  • Results from crowdsource version of test (CalSPEED) used for validation
  • Code is open source, available to anyone to modify for their own purposes (it’s being used by Virginia Tech—search for “Data Cardinal” on Google Play)


Cons:

  • The FCC references results from the CPUC's mobile field testing program, but test results from the semi-annual mobile field test are available only for California.

(Please visit the CPUC's mobile broadband testing page to download the latest reports, data, and an overview of the CalSPEED mobile testing application's back-end design.)
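The “heat map” step above relies on interpolating point measurements across the space between test locations. As a rough illustration only (the CPUC’s actual interpolation method may differ, and the coordinates and speeds below are made up), inverse-distance weighting estimates throughput at an unmeasured point from nearby measurements:

```python
import math

def idw_estimate(test_points, x, y, power=2):
    """Estimate throughput at (x, y) from measured points via
    inverse-distance weighting. Each test point is (px, py, mbps)."""
    num = 0.0
    den = 0.0
    for px, py, mbps in test_points:
        d = math.hypot(px - x, py - y)
        if d == 0:
            return mbps  # exactly at a test location
        w = 1.0 / d ** power  # closer measurements count more
        num += w * mbps
        den += w
    return num / den

# Hypothetical field-test results: (x, y, measured Mbps)
points = [(0.0, 0.0, 12.0), (1.0, 0.0, 4.0), (0.0, 1.0, 8.0)]
print(round(idw_estimate(points, 0.5, 0.5), 2))  # prints 8.0 (all three points equidistant)
```

Evaluating the estimator over a grid of points and coloring each cell by the result is what produces the heat-map image.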

Form 477

The FCC relies on Form 477 to collect “information about broadband connections to end-user locations, wired and wireless local telephone services, and interconnected Voice over Internet Protocol (VoIP) services in the 50 states, the District of Columbia, and the Territories and possessions.” The FCC uses 477 data to map broadband infrastructure deployment and estimate population coverage. Mobile operators are asked to provide their minimum advertised speeds.


Pros:

  • Form 477 has been used for a number of years to collect broadband subscriber data from providers. The addition of deployment data to the submission process following the end of the NTIA’s State Broadband Initiative means that the FCC can get both data sets directly from providers.


Cons:

  • Self-reported data
  • Who advertises minimum speeds? The only reliable minimum speed for mobile is zero, which you get when you leave a network coverage area.
  • Minimum advertised speed isn’t a reliable proxy for actual speed. The FCC says: “The relationship between advertised and actual speed is more complex for mobile services because the mobile providers report their minimum advertised speed and each mobile provider advertises the minimum speed at various points of their actual speed distribution.” (2016 Broadband Progress Report, footnote 246).

Rather than dive further into that topic, let’s compare the FCC’s estimate of population coverage at 10 Megabits per second down and 1 Megabit per second up (10/1) with the CPUC’s estimate of coverage at a slightly lower threshold. The CPUC’s mobile mean throughput speeds (upstream and downstream) are adjusted downward by two standard deviations and then used to create a “heat map” indicating likely coverage between test locations.

  • FCC: 53% “of Americans don’t have access to [a] mobile service provider with a LTE technology service with a minimum advertised speed of 10 Mbps/1 Mbps.” (paragraph 83, Table 4). By extension, 47% of Americans should have access to LTE at a minimum of 10/1.
  • CPUC: Only 16% of California’s population has access to speeds of at least 6 Mbps down and 1.5 Mbps up, using the adjusted mean throughput.

How does this data help answer the Big Policy Question? The FCC recently revised the definition of advanced telecommunications services to the level where “consumers have access to actual download (i.e., to the customer) speeds of at least 25 Mbps and actual upload (i.e., from the customer) speeds of at least 3 Mbps (25 Mbps/3 Mbps).” (paragraph 26). Note the use of the word “actual,” not “advertised” or “minimum advertised.” In this regard, the CPUC data is more suitable than 477 data, because it is more granular and based on actual tests.

Mosaik

Mosaik is a private company that advertises having the “largest mobile network coverage database in the world." The FCC's map page shows a nationwide map of mobile coverage based on data from Mosaik and the 2010 Census.


Pros:

  • Mosaik provides a one-stop shop for mobile coverage data.


Cons:

  • How accurate is the data? According to the FCC, Mosaik data may overstate coverage: "Coverage calculations based on Mosaik data, while useful for measuring developments in mobile coverage, have certain limitations that likely result in an overstatement of the extent of mobile coverage."
  • What service level does the coverage data represent?

As I explained in a previous blog post, you can’t show coverage without specifying service level. What level of service does the Mosaik coverage map indicate? It’s not clear. Moreover, the image below suggests that most of the state is “covered” by at least one major provider.

The Mosaik data is binary: green (coverage) and white (no coverage).

In contrast, here is the CPUC’s mobile broadband coverage map of California, based on a service level of at least 6 Megabits per second down and 1.5 Megabits per second up, interpolated from mobile field test results adjusted to mean minus two standard deviations.

Rather than showing a binary “coverage/no coverage” image, the CPUC’s map (lower left) shows areas where an end user is likely (i.e., mean minus two standard deviations – see explanation here of why we do this) to get at least 6 Megabits per second down and 1.5 Megabits per second up (green). Yellow indicates areas where a user is likely to get speeds below that threshold but higher than 768 Kilobits per second down and 200 Kilobits per second up (768/200). Finally, red indicates areas where not even 768/200 is likely.
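The adjustment and the color tiers described above can be sketched as follows. This is a minimal illustration: the thresholds (6/1.5 and 768/200) come from the post, but the function names, the sample measurements, and the choice of population standard deviation are assumptions:

```python
import statistics

def adjusted_speed(samples_mbps):
    """Mean minus two standard deviations: a conservative estimate
    of the throughput a user is likely to get at a location."""
    mean = statistics.mean(samples_mbps)
    sd = statistics.pstdev(samples_mbps)  # population std dev (assumption)
    return max(mean - 2 * sd, 0.0)  # throughput can't be negative

def coverage_tier(down_mbps, up_mbps):
    """Map adjusted down/up speeds to the map's color tiers."""
    if down_mbps >= 6.0 and up_mbps >= 1.5:
        return "green"   # served at 6/1.5 or better
    if down_mbps >= 0.768 and up_mbps >= 0.2:
        return "yellow"  # below 6/1.5 but at least 768/200
    return "red"         # not even 768/200 is likely

# Hypothetical repeated measurements at one location
down = adjusted_speed([9.0, 12.0, 7.5, 10.5])
up = adjusted_speed([2.0, 2.4, 1.8, 2.2])
print(coverage_tier(down, up))  # prints green
```

Because the adjustment subtracts two standard deviations, a location with a high average but wildly varying speeds can still land in yellow or red, which is the point: the map reflects what a user can count on, not a best case.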
The CPUC’s data provides more granularity than Mosaik’s, and it shows different service levels, depending on the geography. Furthermore, as I mentioned earlier, the CPUC data lets us map different types of service levels such as VoIP, streaming video, and video conferencing.

FCC's mobile testing app (mMBA)

In 2013, the FCC released a mobile testing app on Android and later followed up with a version for the iPhone. The purpose of the app is to crowdsource mobile performance across the United States. The FCC states that its speed test app “accurately measures mobile broadband (cellular) and WiFi network performance and delivers consumers an in-depth, real-time view of key metrics related to their mobile broadband experience.”


Pros:

  • Provides the FCC with real-world mobile performance data
  • Crowdsourcing lets the general public become more informed stakeholders in broadband policy (the same reason the CPUC offers its own crowdsource app, CalSPEED, on the iPhone and Android)


Cons:

  • Where is the data from the mMBA tests? One of the few snippets of mMBA data appeared in the 2016 Broadband Progress Report: "On a national level, mMBA reports median LTE download speeds for the first two quarters of 2015 as 11.6, 7.5, 5, and 13.6 megabits per second for Verizon, AT&T, Sprint and T-Mobile, respectively."
  • Crowdsourced data provides input only where the crowds are -- namely, mostly urban locations. This is why the CPUC conducts a separate, semi-annual mobile field test.


Ideally, data should line up perfectly with policies. It can be both costly and difficult to make that happen, but it’s good to be aware of the mismatch and have an idea of what data is needed to close the gap.

In an ideal world, every American would have the FCC’s app installed on his or her smartphone, and the app would provide petabytes of actual mobile broadband data to the FCC and the general public. Unfortunately, that doesn’t seem to be happening. In its place, there may be a way to take the methodological and analytical approach we’ve created at the CPUC and apply it to a larger-scale, nationwide drive test of geographically representative locations.

-Special thanks to Karen Eckersley and Ken Biba for their valuable feedback.

Why the FCC Should Adopt CalSPEED, Part 1

The FCC's 2016 Broadband Progress Report quotes a number of data sources to answer what I’ll call the Big Policy Question posed in section 706 of the Telecommunications Act of 1996 – namely, whether “advanced telecommunications capability” is being deployed to all Americans in a reasonable and timely fashion. Unfortunately, the FCC's mobile data falls short in helping answer that question. I’ll demonstrate how and show why the FCC should adopt the CPUC’s mobile testing methodology.

First, though, we need to address the following questions regarding the data:

1. What is the speed that consumers are getting compared to what is advertised?

2. Is speed the only consideration?

3. Should the basis of analysis be households or people?

Question 1 - the FCC has shown through its Measuring Broadband America reports that actual speeds are close to, if not higher than, advertised speeds for wireline technologies. For mobile, the answer is not so simple. However, the CPUC’s mobile testing program has produced mountains of data on availability and variability for California, and we find that actual speeds and performance vary greatly depending on whether you are in urban, rural, or tribal areas.

Question 2 - the FCC has considered other metrics such as latency and variability, but for now has excluded them from the definition of “advanced telecommunications capability” because of lack of data. Nevertheless, data from the CPUC’s mobile testing program has allowed us to create “heat maps” predicting coverage for varying service levels – E-mail, Voice over IP, streaming video, and video conferencing.

Question 3 is rhetorical. Mobile networks serve people. Sometimes people are inside their homes, and other times they are on the move. Hence, mobile deployment and availability need to focus on where people go, not where they are when they sleep at night.

So, how effective is the data used by the FCC in answering the Big Policy Question? In Part 2, I will summarize the pros and cons of the CPUC’s mobile field test data as well as those for three mobile data sets referenced by the FCC on their web site and in the 2016 Broadband Progress Report.

Thursday, February 11, 2016

CalSPEED now available on iPhone

We are ecstatic to finally provide an iPhone version of the popular CalSPEED mobile testing app. This is the first full beta release of the app, and we are already working on improvements to the user interface and the mapping feature. Some users told us that the "Unsatisfactory/Satisfactory" readout at the top of the results screen is misleading, because it applies only to the Mean Opinion Score, not to the entire test. Even so, nearly half of the 784 CalSPEED tests since the soft launch of the iPhone version last December have been run on iPhones.