Big Data Analytics: A Service Provider’s Gold Mine

Fact and Fiction of Big Data

by: Brad Hine

(This article originally ran in the October 2015 issue of OSP Magazine)

Buzz word or business driver? Big Data Analytics has everyone in the Information and Communications Technology (ICT) industry buzzing, for sure — but is it hype or must-have? Is there a big return on investment for broadband service providers? It’s time to debunk Big Data.

Assessing network capacity during a network maintenance order requires a map displaying subscriber info, fiber segments, work order tracking, field tech and vehicle tracking, and available inventory.

So What Is Big Data, Anyway?

The 1880 US Census took 8 years to tabulate, and it was estimated that the 1890 census would take more than 10 years to calculate — meaning the report wouldn’t be completed before the 1900 census had to be taken.

As the fact above illustrates, human beings have been trying to gather and analyze data for centuries. Why? To ascertain patterns and behaviors that can predict future patterns and behaviors.

And look how far we have come: Today IBM reports that 2.5 quintillion bytes of data is created on a daily basis. Walmart processes over 1 million transactions per hour. Google handles over 100 billion searches per month. And according to Gartner (www.gartner.com), developing the capability to create value from data is a $34 billion industry.

(Service Outage Monitor) Monitoring network faults and threshold alarms requires data from network management integrations, inventory management, subscriber records, and workforce systems.

The term big data refers not only to the huge amount of data that is generated, collected, and available on a daily, even hourly, basis, but also to the fact that this data is created by multiple, disparate sources and is extremely hard to capture, analyze, and apply. For example, McKinsey Global Institute (http://www.mckinsey.com/insights/mgi) estimates that the US healthcare industry could save $300 billion a year through better integration and analysis of data, from clinical trials to insurance claims to connected running shoes.

This raises the question: How does big data apply to my operation? For a network operator and broadband service provider, big data is better defined as all of the structured and unstructured data a service provider collects that can be analyzed to reveal associations, patterns, and trends relating to the business. It certainly will not be quintillions of bytes, but it is significant to your organization if harnessed and analyzed correctly.

(Customer Experience) Tracking network health and consumption, repeat customer calls, and truck rolls with common network routes helps to secure revenue assurance and customer satisfaction.

Let’s take a moment and look at the type of data we are talking about:

Subscriber Statistics: Subscriber location, service packages, service technologies, purchasing trends, support calls, disconnected accounts.

Network Data: General network health statistics, all network alarms and thresholds, bandwidth usage, signal loss, and logical transmission routes.

Asset/Inventory Tracking: Details for every piece of equipment within the inside plant and outside plant (OSP) infrastructure.

Workforce Management: Field service calls, installation schedules, service tech vehicle tracking, current service call status.

Market Statistics: Service penetration, subscriber ARPU, average household revenue, real estate data, census info, and regional demographic info.

(Under-Used Assets) Monitoring under-used network assets overlaid with service penetration and capacity statistics can help providers repurpose equipment for higher priority areas.

Imagine the value to your entire organization if that type of information were at everyone's fingertips. Tracking this data allows service providers to run endless analytics to extract information that, at a high level, falls into three categories:

Category 1. Descriptive. What happened in the past, and how can I use this data to alert me when this pattern repeats itself?

Category 2. Predictive. What might happen in the future, and how can I use this data to plan more efficiently?

Category 3. Prescriptive. What should I do now, and how can I use the data to build a smart, automated, and responsive workflow?
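
The three categories can be made concrete with one metric. The sketch below, a hypothetical example (the samples, trend model, and alarm threshold are all invented for illustration), applies descriptive, predictive, and prescriptive analysis to hourly bandwidth utilization on a single network segment:

```python
# Illustrative sketch: the three analytics categories applied to
# hourly bandwidth utilization (percent of capacity) on one segment.
# Samples, trend model, and threshold are assumptions, not real data.

from statistics import mean

# Hypothetical utilization samples, oldest first.
samples = [42, 45, 44, 50, 53, 55, 58, 61, 63, 66, 70, 72]

# Descriptive: what happened in the past?
avg_util = mean(samples)
peak_util = max(samples)

# Predictive: a naive linear trend to estimate the next reading.
step = (samples[-1] - samples[0]) / (len(samples) - 1)
forecast = samples[-1] + step

# Prescriptive: a simple rule that turns the forecast into an action.
CAPACITY_ALARM = 70  # assumed threshold (percent)
action = "open capacity work order" if forecast >= CAPACITY_ALARM else "no action"

print(f"avg={avg_util:.1f}% peak={peak_util}% forecast={forecast:.1f}% -> {action}")
```

Real platforms would replace the naive trend with proper forecasting, but the shape is the same: describe the history, project it forward, and attach an automated action to the projection.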

And on a day-to-day operational and management basis, this data can be used for:

  • Improving the customer experience
  • Predicting churn
  • Capacity planning
  • Target market campaign management
  • Subscriber cross-sell and upsell strategies
  • Measuring capital expenditures against ROI metrics
  • Calculating revenue assurance
  • Service assurance management
  • Market segmentation analysis

The data that gives service providers the power to control these functions could easily reside in a dozen separate systems, most of which are not integrated and do not share information. The data is valuable, but without the ability to run cross-system analytics on it and produce smarter data, it is of little use. What you do with the data is as vital to your end goal as collecting it.

One of the main goals of big data analysis is to make the data easily accessible to users who approach problem solving with preemptive and proactive processes and methods. In the world of database design and architecture, a suitable approach, and for most a requirement, is that the data exist in a centralized location. A strong case for creating central storage of service provider data is that it solves several traditional problems associated with multi-system management:

  • Data consistency
  • Data redundancy
  • Data accuracy

This is especially true for Triple Play service providers, where the technologies for voice, video, and data are separate and do not communicate.

Integrating the structured and unstructured data into a consolidated data warehouse allows that system to evaluate and audit the data so it remains unique, precise, and accurate.

Although the process of migrating or integrating data from multiple systems into one can be daunting, service providers must ask themselves how much effort and how many resources are wasted as employees struggle to access the critical data they need to work efficiently.
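
To make the consolidation idea concrete, here is a minimal sketch, with hypothetical system names, fields, and records, of folding subscriber data from two unintegrated systems into one store while surfacing the three problems named above: redundancy (duplicate records), consistency (conflicting values), and accuracy (gaps filled and conflicts flagged for audit):

```python
# Illustrative sketch of consolidating subscriber records from two
# unintegrated systems. Source names, fields, and values are invented.

billing = [
    {"sub_id": "S100", "address": "12 Elm St", "package": "triple-play"},
    {"sub_id": "S101", "address": "9 Oak Ave", "package": "data-only"},
]
workforce = [
    {"sub_id": "S100", "address": "12 Elm St"},     # matches billing
    {"sub_id": "S101", "address": "9 Oak Avenue"},  # conflicting address
    {"sub_id": "S101", "address": "9 Oak Avenue"},  # redundant duplicate
]

warehouse = {}   # one consolidated record per subscriber
conflicts = []   # disagreements to audit, not silently overwrite

for source in (billing, workforce):
    for rec in source:
        key = rec["sub_id"]
        if key not in warehouse:
            warehouse[key] = dict(rec)  # first sighting creates the record
        else:
            for field, value in rec.items():
                existing = warehouse[key].get(field)
                if existing is None:
                    warehouse[key][field] = value  # fill a gap
                elif existing != value:
                    conflicts.append((key, field, existing, value))

print(len(warehouse), "unique subscribers;", len(conflicts), "conflicts to audit")
```

A production warehouse would do this with keys, constraints, and matching rules rather than a loop, but the principle is the same: duplicates collapse to one record, and disagreements become visible instead of lurking in separate systems.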

The Big Role of GIS in Big Data Analysis

Most of us live in a business environment where communication and problem solving are primarily done with flat-file documents, bulleted lists in emails, brief text threads, and scheduled face-to-face meetings or conference calls.

These great technologies allow employees and subscribers to communicate more effectively, right? But when the vast majority of all the data we consume and analyze to solve a problem is location-based, it makes sense to measure that data in a spatial context.

Traditionally, the role of Geographic Information System (GIS) technology in telecommunications has been to design, engineer, and manage service delivery networks and OSP assets. Representing big data sets through today's GIS platforms allows analytics applications to monitor business and operational groups, too. A cross-department GIS solution can provide a bird's-eye view of every element and activity within your service delivery footprint. And if that data is available in real time, a GIS analysis tool that can export spatial information in reports should become part of the daily workflow.

Even if your data exists in many unrelated systems, modern-day GIS platforms can extract from and query multiple resources to turn your raw data into actionable insight. In that way, GIS becomes the aggregator that provides the visual component, the map, through which to see the story behind your data. And when the data is communicated visually via a map, it can help accelerate response and resolution times over the usual interactions.

Seeing spatially enabled big data on a map allows us to answer questions, and also ask new ones:

  • Where in my footprint can I expand my network for the quickest route to ROI?
  • Where are device outages occurring?
  • What are the common network elements involved?
  • What are the standing SLAs in place within this service outage?
  • What is my service penetration in those crucial areas?
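Behind questions like "where are outages occurring" sits a basic spatial query, and a simplified version is easy to sketch. In the hypothetical example below (the coordinates, subscriber IDs, and radius are all invented), a great-circle distance calculation finds which subscribers fall within a given radius of a failed device:

```python
# Illustrative spatial query: which subscribers sit within a given
# radius of a reported device outage? All coordinates are invented.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius ~6371 km

outage = (33.75, -84.39)            # hypothetical failed node
subscribers = {
    "S100": (33.751, -84.391),      # roughly 0.15 km away
    "S101": (33.80, -84.39),        # roughly 5.6 km away
    "S102": (34.05, -84.39),        # roughly 33 km away
}

RADIUS_KM = 2.0
affected = [sid for sid, (lat, lon) in subscribers.items()
            if haversine_km(*outage, lat, lon) <= RADIUS_KM]

print("subscribers inside outage radius:", affected)
```

A real GIS platform answers this with indexed spatial joins against live network and subscriber layers, but the underlying operation, relating records by location rather than by table key, is exactly what a map-driven workflow adds.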

Geographic thinking adds a new dimension to big data problem solving, and helps you make sense of big data.

Putting social media data on a map helps you track a dynamic situation. Retailers display data feeds on maps to monitor and protect their brands. Banks use geographic analysis to detect fraud. The mapping of social media feeds has also helped governments worldwide gauge public sentiment in real time during significant events, such as elections and uprisings.

The Future: Social Media and Internet of Things (IoT)

The big data sets available to service providers are about to explode. The very networks that service providers manage today will track any device with the proper embedded technology, communicating a multitude of states and information. Software and hardware standardization will allow IoT-driven devices to be uniquely identified and managed. Traditional home devices (e.g., thermostats, washers, dryers, TVs) will all be candidates for remote control by their owners or service companies, and the possibilities for monitored devices are endless.

The rewards of data-driven decision making include:

Improved transparency: Making all relevant data easy to access, yielding accurate and actionable insight for problem solving.

Efficiency of expenditures: Helping teams spend operational and capital budgets more efficiently.

Better customer experience: Aggregating the proper data to positively influence the customer's experience and expectations.

About the Author

Brad Hine is a Product Manager at ETI Software Solutions. With 15 years in the software industry, Hine specializes in data analytics and is responsible for ETI’s broadband analytics platform, Overture. A frequent conference speaker, Hine has experience in IT support, technical writing, and IT network supervision. For more information, please email bhine@etisoftware.com or visit www.etisoftware.com.
