
Trust Is All About Verification

March 21, 2019

What CAF Recipients Must Deliver Per the FCC’s Order DA 18-710

The FCC distributes financial support via the Connect America Fund (CAF) to carriers in exchange for providing a certain quality of service. With that, the public seems to trust that such service is in fact being provided.

However, until recently, that service quality was never verified (Form 477 data is self-reported). As a remedy, the FCC moved to implement a Performance Testing requirement, which aims to provide accountability and transparency for the impact of CAF dollars. The key is to trust but verify.

That verification comes in the form of performance testing, outlined in FCC Order DA 18-710, released in July 2018, and due to begin this summer. The order identified 3 methods for conducting performance testing, established which companies are required to test, and specified testing parameters and requirements.

These testing parameters are strict, and consequences for noncompliance are severe. For companies whose speed and/or latency tests do not pass the compliance thresholds, penalties include combinations of increased reporting requirements and reduction or loss of support. However, many companies don’t have a solid grasp of what is required of them — or how to complete the testing correctly. Here’s what you need to know.

The 3 methods of conducting performance tests identified in the Order are:
1. MBA testing infrastructure.
2. Off-the-shelf testing tools.
3. Provider-developed self-testing configurations.

Each company will need to weigh these options carefully, as both operational and technical questions will influence which solution is best for your company:
• How will we optimize our testing conditions, including IXP selection?
• How will the test data be certified and filed?
• What can be done to minimize risk to our CAF support?
• How will our testing solution impact the customer broadband experience?
• When will the solution be available, and can we test it before committing?

Most recipients of CAF high-cost universal service support will be required to comply with performance testing. This includes price cap carriers, Rate-of-Return carriers, Rural Broadband Experiment (RBE) support recipients, Alaska Plan carriers, and CAF Phase II auction winners.

Notably, the order set the start date for performance testing for the 3rd quarter of 2019, which opens on July 1. (Note: There has been speculation of a potential delay. However, at the time of this writing, the July 1 date is still in place, and we are advising our clients to act in compliance with this date until official notice otherwise.) Beginning in that quarter and in each quarter thereafter, performance tests must be performed during a weeklong test window for the duration of the high-cost support term. The choice of which week within each quarter is at each company's discretion.

Performance testing measures 2 metrics: speed and latency. In accordance with the stated goal of greater accountability, the tests are intentionally required to be performed during peak usage times (7:00 p.m. – 11:00 p.m.) to mimic the actual experience of end users. Hence, testing for both speed and latency must begin 1 hour before the peak window and continue until 1 hour after. The test window, then, is 6:00 p.m. until 12:00 a.m. local time.

For many companies, testing during peak usage time is problematic: customers will already be using the network, and there may not be sufficient bandwidth to complete the test! The FCC did consider this and included a provision for idle line verification. If customer traffic (cross-talk) on the line exceeds 64 kbps downstream or 32 kbps upstream, the speed or latency test can be deferred until the next minute.
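
To make the deferral rule concrete, here is a minimal sketch of how a testing agent might apply the idle-line provision. The 64/32 kbps thresholds come from the order; the function and variable names are illustrative, not from any official implementation.

    # Idle-line check before starting a speed or latency test (illustrative sketch).
    # Per the order: defer the test a minute if customer traffic exceeds
    # 64 kbps downstream or 32 kbps upstream.

    DOWNSTREAM_BUSY_KBPS = 64
    UPSTREAM_BUSY_KBPS = 32

    def line_is_idle(downstream_kbps: float, upstream_kbps: float) -> bool:
        """Return True if the line is idle enough to start a test this minute."""
        return (downstream_kbps <= DOWNSTREAM_BUSY_KBPS
                and upstream_kbps <= UPSTREAM_BUSY_KBPS)

    # If the line is busy, the agent simply waits and re-checks at the next
    # minute boundary before attempting the test again.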

All test results must be certified and submitted to USAC annually, beginning July 1, 2020, based on tests from the 3rd and 4th quarters of 2019. All test data is subject to audit, and the reporting frequency may be increased for companies whose test results do not meet FCC requirements. At the time of this writing, the exact process for certifying and submitting test data has not been articulated by the FCC.

The Details

Speed and latency tests must both be run from the customer location to or through an FCC-defined Internet Exchange Point (IXP). These are largely located in major metro areas such as Atlanta, Dallas, and Seattle. The order notes that most mainland US locations are within 300 air miles of at least one IXP, and all are within 500 air miles of one. However, for many companies, especially in rural areas, testing to IXP locations will require utilizing networks outside the provider's control.

While the test windows, IXP locations, and reporting obligations are the same for both speed and latency tests, the tests differ in several ways. Speed tests must be performed once per hour during the test window, that is, once per hour from 6:00 p.m. until 12:00 a.m. on each day of the weeklong testing window each quarter. (For each location: 1 test per hour x 6 testing hours per day x 7 days per testing window x 4 quarters = 168 speed tests per location annually.)
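
As a quick sanity check, that schedule arithmetic can be reproduced in a few lines (a sketch; the variable names are mine):

    tests_per_hour = 1
    hours_per_day = 6        # 6:00 p.m. until 12:00 a.m.
    days_per_window = 7      # one weeklong window per quarter
    quarters_per_year = 4

    annual_speed_tests = (tests_per_hour * hours_per_day
                          * days_per_window * quarters_per_year)
    print(annual_speed_tests)  # 168 speed tests per location per year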

Speed tests are intended to verify that the actual speed is at least as fast as the speed tier for which the company is receiving support. They must meet an 80/80 standard for compliance: tests must achieve at least 80% of the CAF-supported speed in at least 80% of the tests performed. For instance, if a CAF II winner committed to providing 100 Mbps service, the speed test data should ideally show 100 Mbps, and must show at least 80 Mbps of actual speed in at least 80% of tests, even if the customer subscribes to a lower speed tier.

On the upper limit, test results should not exceed 150% of the subscribed speed. For example, if the CAF-supported speed is 25 Mbps, but the customer subscribes to a 50 Mbps package, the tests must show the customer is receiving at least 20 Mbps (80% of the 25 Mbps CAF-supported speed) but no greater than 75 Mbps (150% of the subscribed speed).
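
Put together, the 80/80 floor and the 150% ceiling suggest a compliance check along these lines. This is a hedged sketch: it assumes over-ceiling results are simply flagged as out of range, since the order's exact handling of them is not restated here, and the function names are mine.

    def speed_tests_comply(results_mbps, caf_speed_mbps, subscribed_speed_mbps):
        """80/80 check: at least 80% of tests reach 80% of the CAF-supported speed.

        Assumes at least one test result. Returns the pass/fail verdict plus
        any results flagged by the 150%-of-subscribed-speed ceiling.
        """
        floor_mbps = 0.80 * caf_speed_mbps
        ceiling_mbps = 1.50 * subscribed_speed_mbps
        out_of_range = [r for r in results_mbps if r > ceiling_mbps]
        passing = sum(1 for r in results_mbps if r >= floor_mbps)
        return (passing / len(results_mbps) >= 0.80, out_of_range)

    # e.g. CAF-supported 25 Mbps, subscribed 50 Mbps: each test should show
    # at least 20 Mbps, and results above 75 Mbps are out of range.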

In contrast, latency tests must be performed once per minute, which adds up to 10,080 tests annually for each test location. Latency tests must verify that latency is less than 100 ms for traditional carriers, or less than 750 ms for high-latency tiers (i.e., satellite). To meet compliance, 95% of latency tests must pass.
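
The 10,080 figure follows from the same window arithmetic, and the 95% standard can be expressed the same way (again a sketch with illustrative names):

    # 60 tests/hour x 6 hours/day x 7 days/window x 4 quarters = 10,080
    annual_latency_tests = 60 * 6 * 7 * 4
    print(annual_latency_tests)  # 10080

    def latency_tests_comply(latencies_ms, high_latency_tier=False):
        """95% of tests must beat the threshold: 100 ms, or 750 ms for high-latency tiers."""
        threshold_ms = 750 if high_latency_tier else 100
        passing = sum(1 for latency in latencies_ms if latency < threshold_ms)
        return passing / len(latencies_ms) >= 0.95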

Testing for speed and latency must be performed at 10% of active locations, but not fewer than 5 nor more than 50, per speed tier, per state. Thus, the number of locations that must be tested for speed and latency will vary for each company. Locations will be selected at random from active locations within the USAC HUBB and will change every 2 years. (As of this writing, guidelines for randomization and selection have not been published.)
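
That per-tier, per-state sample size reduces to a simple formula: 10% of active locations, floored at 5 and capped at 50. A sketch follows; rounding the 10% up to a whole location is my assumption, as the order's exact rounding rule is not restated here.

    def required_test_locations(active_locations: int) -> int:
        """10% of active locations per speed tier per state, min 5, max 50."""
        ten_percent = (active_locations + 9) // 10  # ceil(active / 10), integer-safe
        return min(50, max(5, ten_percent))

    # e.g. 120 active locations in a tier/state -> 12 to test; 800 -> 50 (capped).
    # (How a tier/state with fewer than 5 active locations is handled is not
    # addressed here.)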

The BETTI Box

Performance testing can seem complex, and no one wants to risk losing support or increasing their federal filing obligations. Therefore, as companies explore their options, they should carefully weigh both the technical and operational requirements of each testing mechanism before choosing a solution. One solution, the BETTI Box by Vantage Point, for instance, includes customized testing, certification and filing of results, and a suite of unique Support Preservation™ tools such as test condition optimization.

Ultimately, the end goal — accurate data for more transparency and accountability for CAF dollars — is a noble one, and for companies already providing the required service there is little risk in proving it. The FCC has trusted USF recipients. Now, it’s time to verify.

About the Author

Andy Deinert

Andy Deinert is Network & Security Services Manager at Vantage Point Solutions. Andy has over 14 years of experience in IT, helping network administrators design and secure their networks. His expertise is in the areas of cybersecurity, social engineering, data network design, and network monitoring and troubleshooting. Andy has engineered several regional fiber optic networks for broadband service providers. He also sits on the Cyber Security Industry Advisory Board at Dakota State University. For more information, please email [email protected] or visit https://www.vantagepnt.com. Follow Vantage Point Solutions on Twitter @Vantage_Pnt.