Monday, June 19, 2017

Spring 2017 Testing - Week 3





Week 3 Status: Red = completed test locations, Blue = soon-to-be-tested

Coming into Week 4 of Spring 2017 mobile field testing, our drivers have compiled a large collection of memorable photos. Follow them on Instagram at: cpuc_broadband_testing

Here is a sample of some of their work.








Monday, June 12, 2017

CalSPEED Testing in France

I traveled to Montpellier, France three weeks ago and bought a prepaid SIM from SFR (France's #2 mobile carrier) for my unlocked iPhone. Montpellier is France's fastest-growing city and its 8th largest. Here are the CalSPEED results from a test I ran near the Place de la Comedie, pictured below.



As we saw with 3G/HSDPA technology in California, ping times were long, between 342 and 347 milliseconds, and hampered real-time streaming capabilities. Granted, my phone was pinging servers on the east and west coasts of the United States, but if I were using LTE instead of 3G, I would expect the ping times to be in the range of 150-200 milliseconds for CalSPEED.

This was my first time using a prepaid SIM from a local carrier, and, not knowing how much data to buy in advance, I bought 20 gigabytes. I thought that would be fine for two and a half weeks of exploring the Languedoc area. To my surprise, by the end of my stay I had used just over 1 gigabyte, and that was with heavy daily usage. Rather than data, it was battery life that ended up being the limiting factor.

Montpellier has a large student population, and there are many new cafes catering to them. Pictured below is a tongue-in-cheek sandwich board written in English.
The writer Rabelais was an ordained monk and practicing physician, and he taught for a few years at the Montpellier medical college, one of France's oldest.

Tuesday, March 14, 2017

Mobile Field Test Data Gets A Face Lift

We're beta testing a new visualization tool for the California Interactive Broadband Map that displays Rounds 6-10 of our mobile field test data. Below are images from the beta site. The graphics are interactive, so you can select different parts of the charts and drill down to more specifics, such as 1XRTT only (shown below).




 



Friday, January 13, 2017

T-Mobile Latest: A “Fire Chicken?”


With the winter holiday season behind us, it’s time to move that unsold inventory, time to repackage those frozen turkeys as “fire chickens” and call them “fresh.”

Fierce Wireless reported in August last year that T-Mobile was replacing its Binge-On sponsored data service with something called “T-Mobile One.” The new service includes unlimited data. However, as with any unlimited plan, the big question is, “Unlimited data at what speed and quality?” Unlimited data at 2.5G or 3G speeds is akin to washing your automobile with a teacup.

As I blogged last March, T-Mobile’s Binge-On sponsored data plan let you stream music and video from select content providers without incurring extra data charges, but the only way to get HD-quality streaming video was by disabling Binge-On, because T-Mobile throttled video streaming to 480p (also known as “SD,” for standard definition).

It’s no surprise that the Binge-On successor, T-Mobile One, by default only delivers SD-quality video. If you want HD quality (720p-1080p), you have to pay $25 extra. The only difference between Binge-On and T-Mobile One seems to be that the latter lets you stream any content (not just T-Mobile’s preferred providers), and it costs more. Also, just to make sure “unlimited data” doesn’t give people crazy ideas, T-Mobile limits tethering to 3G speeds and says it will throttle anyone who streams more than 28 gigabytes per month (that’s a lot of data, by the way).

Image: Read the fine print!

Read the fine print: “On all T-Mobile plans, during congestion the top 3% of data users (>28GB/mo.) may notice reduced speeds until next bill cycle. Video typically streams on smartphone/tablet at DVD quality (480p). Tethering at Max 3G speeds. Sales tax and regulatory fees included in monthly service price.”

Here is an updated image of estimated streaming video quality based on the interpolated spring 2016 field test results. The January 2016 blog post showing estimated streaming video quality using 2015 field test data is here.



Friday, December 2, 2016

Connect America Fund Layer Now Available on CA Interactive BB Map


As a follow-up to my November 1 post on AT&T's plans to deploy wireless local loops as part of its obligations under Connect America Fund Phase II (see "Hello Wireless Loops. Goodbye Fiber? Part 2"), the Connect America Fund Phase II eligible areas are now loaded onto the California Interactive Broadband Map. The legend to the left shows four carriers' territories: light blue (AT&T), dark blue (Consolidated), gray (Frontier), and red (Verizon, now also Frontier). To view the areas, open the map's menu on the right, expand the FCC Data menu, and check the box next to "Connect America Fund Phase II Locations."


Connect America Fund Phase II Eligible Census Blocks by Carrier



CalSPEED iPhone Ver 1.1.1 Now Available on iTunes

We've made some improvements to CalSPEED with iPhone version 1.1.1. Changes include:

  • Clearer indoor/outdoor slider
  • WiFi reminder screen if device is connected to WiFi
  • California and Virginia server testing segments indicated at bottom 
  • Streaming video quality indicator added to results summary
  • Results history shows streaming video quality and Mean Opinion Score (MOS) estimates for over-the-top service







Monday, November 21, 2016

New to CalSPEED? Frequently Answered Questions

Those of you new to CalSPEED often ask how it compares to other speed testing apps. Here are answers to some of the more common questions.
      
How Did CalSPEED Begin? CalSPEED was originally funded by a State Broadband Initiative grant from the National Telecommunications and Information Administration. Testing began in spring 2012, and ten rounds of semi-annual mobile testing have now been completed. The program collects not only speed data, but also speed variation, latency, jitter, and packet loss. With these data, we are able to estimate performance for “over-the-top” streaming voice and video service. As a result, the CPUC now has one of the largest public data sets of mobile broadband performance.
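For the technically curious, below is a minimal Python sketch of how latency, jitter, and packet loss can be folded into a Mean Opinion Score (MOS) estimate using a simplified E-model style calculation. The coefficients and delay assumptions are illustrative textbook values, not the exact formula used in the CalSPEED analysis.

    # Simplified, illustrative E-model style MOS estimate from round-trip
    # latency, jitter, and packet loss. The coefficients below are rough
    # textbook values, not the CalSPEED production methodology.

    def estimate_mos(rtt_ms, jitter_ms, loss_pct):
        # Approximate one-way delay: half the round trip, plus jitter
        # absorbed by a de-jitter buffer, plus ~10 ms of codec delay.
        one_way = rtt_ms / 2.0 + jitter_ms + 10.0

        r = 93.2                               # default R-factor, no impairments
        r -= 0.024 * one_way                   # delay impairment
        if one_way > 177.3:
            r -= 0.11 * (one_way - 177.3)      # extra penalty for long delay
        r -= 2.5 * loss_pct                    # ~2.5 R points per 1% packet loss
        r = max(0.0, min(100.0, r))

        # Standard E-model mapping from R-factor to MOS (1.0 to ~4.5).
        return round(1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r), 2)

    # Example: long 3G ping times versus a healthier LTE connection.
    print(estimate_mos(rtt_ms=345, jitter_ms=20, loss_pct=0.5))
    print(estimate_mos(rtt_ms=80, jitter_ms=5, loss_pct=0.1))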

Where Are Tests Performed? The CPUC tests the same 1,990 locations twice a year. The locations, which were randomly generated, break down as 37% urban, 56% rural, and 7% tribal. The field test relies on two devices from four major providers (AT&T, Sprint, T-Mobile, and Verizon). Eighty TCP tests are performed for each provider, on each device, at each of the 1,990 locations.

Is 1,990 Locations Enough? Using advanced geostatistical methods, we are able to interpolate the service characteristics a user is likely to experience anywhere in the state. The CPUC designed the mix of test locations to cover not only urban places where people live and work, but also rural locations people may be passing through, such as rural highways and state and national parks. All tests are performed along roads navigable by automobile.
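For readers who want a feel for what interpolation means here, the following Python sketch uses simple inverse-distance weighting to estimate throughput at an untested point from nearby measurements. The coordinates and speeds are invented, and the CPUC's production analysis uses more sophisticated geostatistical techniques than this.

    # Illustrative inverse-distance-weighted (IDW) interpolation. The CPUC's
    # actual geostatistical model is more sophisticated; this only conveys
    # the general idea of estimating service between tested locations.
    import math

    # Hypothetical field tests: (latitude, longitude, measured Mbps)
    tests = [
        (38.58, -121.49, 12.4),
        (38.61, -121.35, 9.8),
        (38.45, -121.55, 15.1),
        (38.70, -121.60, 6.2),
    ]

    def idw_estimate(lat, lon, samples, power=2):
        weighted_sum, weight_total = 0.0, 0.0
        for s_lat, s_lon, mbps in samples:
            d = math.hypot(lat - s_lat, lon - s_lon)
            if d == 0:
                return mbps                # exactly on a tested location
            w = 1.0 / d ** power           # nearer tests count for more
            weighted_sum += w * mbps
            weight_total += w
        return weighted_sum / weight_total

    print(round(idw_estimate(38.60, -121.50, tests), 1), "Mbps estimated at an untested point")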

Why Not Test In Every Census Block? Performing field tests in all of California’s 710,145 census blocks would be prohibitively expensive, impractical, and unnecessary. For this reason, neither the CPUC nor mobile providers like Verizon, AT&T, T-Mobile, and Sprint perform tests in every census block; instead, they use statistical techniques to approximate service characteristics between tested locations.
     
Why Two Servers? Most testing applications use only one, generally nearby, server. This method understates latency and overstates throughput as compared to using multiple, geographically diverse servers. Testing to a nearby server yields the speeds likely to be experienced for applications such as streaming movies, where popular content is often cached locally. However, much of the content broadband users access is not cached locally, so CalSPEED tests two servers -- one in Arlington, Virginia, the other in San Jose, California -- to understand the role of backhaul networks in each provider’s delivery of mobile broadband. While using more than two test servers in disparate locations across the globe would be desirable, using both east coast and west coast servers yields more representative results than testing to only one server.
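As a rough illustration of why server location matters, the short Python sketch below times a TCP connection to two placeholder hosts, one nominally on each coast. The hostnames are stand-ins chosen for the example, not the CalSPEED test servers, and TCP connection time is only a crude proxy for the latency CalSPEED measures.

    # Crude two-server latency comparison. The hostnames are placeholders,
    # not the CalSPEED servers, and connect time is only a rough stand-in
    # for measured latency (CDNs and routing will affect the numbers).
    import socket
    import time

    SERVERS = {
        "west (placeholder)": ("www.berkeley.edu", 443),
        "east (placeholder)": ("www.mit.edu", 443),
    }

    def tcp_connect_ms(host, port, timeout=5.0):
        start = time.monotonic()
        with socket.create_connection((host, port), timeout=timeout):
            pass
        return (time.monotonic() - start) * 1000.0

    for label, (host, port) in SERVERS.items():
        try:
            print(f"{label}: {tcp_connect_ms(host, port):.0f} ms to {host}")
        except OSError as err:
            print(f"{label}: connection failed ({err})")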


Is This Better Than SpeedTest.net or the FCC's Speed Test? As shown in a study published by Novarum in 2014 comparing the Ookla, FCC, and CalSPEED testing applications, results for the Ookla and FCC tests tend to be higher because both intentionally select the test server with the lowest latency, which tends to be geographically closer. Moreover, Ookla’s test further biases results by discarding the bottom half of upstream results and the bottom third of downstream results. By consistently testing to the same two servers, one on each coast of the continent, CalSPEED provides a reliable backhaul performance metric for each of the four mobile providers. Since we began testing in 2012, we have seen the performance (latency) difference between the east and west servers decrease.
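A quick example with made-up numbers shows how discarding the slowest results inflates a reported average:

    # Made-up sample of ten download measurements (Mbps) from one location.
    samples = [2.1, 3.4, 4.0, 5.2, 6.8, 8.5, 10.1, 12.3, 14.0, 15.6]

    mean_all = sum(samples) / len(samples)

    # Discard the bottom third of results, as described above for Ookla's
    # downstream methodology, and average what remains.
    kept = sorted(samples)[len(samples) // 3:]
    mean_trimmed = sum(kept) / len(kept)

    print(f"Mean of all samples:           {mean_all:.1f} Mbps")      # 8.2
    print(f"Mean after discarding slowest: {mean_trimmed:.1f} Mbps")  # 10.4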
 
How Else Does CalSPEED Differ From Other Speed Tests? Most speed test applications rely on crowdsourcing, which carries an inherent selection bias: data are collected only where and when users choose to run a test, so the results are skewed by who ran the test, why, when, and where. In contrast, the CalSPEED methodology has testers return to the same locations every round, and the geographic distribution of test locations provides a more complete picture of mobile broadband across the state.


How Many TCP Threads Does CalSPEED Use? Multi-threading means opening more than one connection to the host and combining them to boost overall throughput; it is used by many speed test applications. When the CPUC designed CalSPEED, we examined the effect of using multiple threads (“flows”) and concluded there was no material difference in mobile throughput among four, eight, or sixteen threads. The current test design uses four parallel threads: an upstream test consists of ten 1-second measurements to the west server and then ten to the east server, the same is done downstream, and the whole sequence is repeated a second time, for a total of eighty 1-second tests. Most applications use only one thread.
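To make the arithmetic explicit, here is a small Python sketch, a simplified stand-in rather than the actual CalSPEED source, that lays out the schedule of eighty 1-second tests (10 tests x 2 servers x 2 directions x 2 repetitions):

    # Simplified stand-in for the CalSPEED TCP test schedule described above;
    # not the actual CalSPEED source code.
    from itertools import product

    THREADS = 4                       # parallel TCP connections per test
    SERVERS = ["west", "east"]        # San Jose, CA and Arlington, VA
    DIRECTIONS = ["upstream", "downstream"]
    REPETITIONS = 2                   # the whole sequence runs twice
    SECONDS_PER_RUN = 10              # ten 1-second tests per direction/server

    schedule = list(product(range(1, REPETITIONS + 1), DIRECTIONS, SERVERS,
                            range(1, SECONDS_PER_RUN + 1)))

    print(len(schedule), "one-second tests in total")           # 80
    print("each aggregates", THREADS, "threads of throughput")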

Why Do Speed Testing? Carriers Already Have Coverage Maps. Most provider maps show a single coverage color and say things like "4G/LTE Coverage." Through CalSPEED, the CPUC has been able to discern more subtle speed and coverage differences by region. Some providers advertise speeds, but we have observed that those speeds are not ubiquitous; that is, they are not available everywhere the providers claim to offer service. Speeds vary widely depending on whether you are in an urban, rural, or tribal location. For this reason, we create a heat map of speeds based on actual field test data, which shows how speeds vary across the state.

Why Not Use Average Speed, Like Mean or Median? The CPUC has demonstrated[1] through years of methodical field testing that mean and median speeds, by themselves, are unreliable indicators of what consumers can expect to experience at a location: as mean throughput increases, so does the variability around the mean. CalSPEED therefore takes observed variability into account to determine speeds that consumers can consistently expect to receive.
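As a simple illustration with invented numbers, compare the mean of a variable set of throughput samples with a conservative lower percentile; the latter is closer in spirit to a speed consumers can consistently expect, though the CPUC's published statistic is derived differently:

    # Invented throughput samples (Mbps) from one location, illustrating why
    # a mean can overstate what a user can consistently expect.
    import statistics

    samples = [1.8, 2.2, 3.0, 4.5, 5.1, 6.7, 9.4, 14.8, 22.0, 30.5]

    mean = statistics.mean(samples)
    median = statistics.median(samples)
    # A conservative figure: the 20th percentile, met or exceeded in roughly
    # 80% of the samples. (Illustrative only; not the CPUC's actual statistic.)
    p20 = statistics.quantiles(samples, n=5)[0]

    print(f"mean            = {mean:.1f} Mbps")
    print(f"median          = {median:.1f} Mbps")
    print(f"20th percentile = {p20:.1f} Mbps")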

[1] See Section 2.4 Intra-Session Variation in “CalSPEED: California Mobile Broadband - An Assessment - Fall 2014,” by Novarum.