Proactive Network Management Post #3 of 5 - Integration: The Right Data At The Right Time
24 November 2015
Improved situational awareness is at the heart of PNM: relevant information is presented concisely and visually while also being actionable. This is far more challenging than it first seems, because a huge array of data relating to network status, performance and reliability is spread across many different systems. It is easy to throw data together and dump it on the engineering and operations teams to sift through; it is much more difficult to present it in a way that is intuitive to use and rapidly leads to root cause determination, impact analysis, decision support and proactive response.
In this post we will cover the variety of different engineering and operational data sources and how they can be integrated into your Enterprise Location Intelligence platform, as well as the value they add to the PNM puzzle.
There are a number of critical BSS (business support system) and OSS (operational support system) data sources that can be integrated to provide a single live operating view of your network. We will describe the common ones which deliver the most value, and how, practically, data should either be synced from or published by those systems to your Enterprise Location Intelligence platform.
GIS Design and Network Asset Management
Cable service providers typically have a standardized model and source of record for their network design and asset management that is used and maintained by the planning and engineering groups. This system is typically a Geospatial Information System (GIS) or Physical Network Inventory (PNI) that will include the network characteristics, location and topology within a job and design life cycle management workflow. This system of record is critical for operations so that issues, activities and events can be correlated with the related or upstream/downstream impacted assets.
Outside plant design and construction cycles are often very long, running days, weeks or months depending on the nature and scale of the project. The design and asset tracking will have an established process for posting engineering work orders through approvals, construction and as-built verification. Engineering work orders and designs are commonly channeled into a job queue and posted on a set schedule, for example daily or weekly. We recommend a daily sync of the network assets for operations to be effective if a live feed is not available.
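As a minimal sketch of that daily sync, the posted records from the system of record can simply be upserted into the operations view on a schedule. The table layout and field names below are illustrative, not taken from any particular GIS/PNI product:

```python
import sqlite3

# Hypothetical daily network-asset sync: take assets posted by the GIS/PNI
# system of record and upsert them into the operations database. An
# in-memory SQLite table stands in for the real operations store.

def sync_assets(gis_rows, ops_db):
    """Upsert posted asset records; return the total asset count."""
    cur = ops_db.cursor()
    cur.execute("""CREATE TABLE IF NOT EXISTS assets (
        asset_id TEXT PRIMARY KEY, kind TEXT,
        lat REAL, lon REAL, posted_on TEXT)""")
    # INSERT OR REPLACE makes the sync idempotent: re-posting the same
    # asset_id updates the row rather than duplicating it.
    cur.executemany("INSERT OR REPLACE INTO assets VALUES (?,?,?,?,?)",
                    gis_rows)
    ops_db.commit()
    return cur.execute("SELECT COUNT(*) FROM assets").fetchone()[0]

db = sqlite3.connect(":memory:")
posted = [("NODE-17", "node", 39.74, -104.99, "2015-11-24")]
print(sync_assets(posted, db))
```

Because the load is idempotent, the same scheduled job can safely be re-run if a nightly sync fails partway through.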
Network and Device Data
Network device and service data should be integrated on a more frequent basis because changes are more dynamic than outside plant engineering. For example, a customer premise equipment (CPE) device management platform can push out data on the currently deployed configurations at a regular interval, such as a few times per day or even every hour, without putting excess load on the system. This information can then be geo-coded, overlaid with the network topology and customer account locations, and becomes another hook point for more sensitive operational status and performance data.
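The overlay step can be sketched as a simple join of CPE records onto the network topology by serving node, so that device status hangs off the same map assets as everything else. The field names (`node_id`, `lat`, `lon`) are assumptions for illustration:

```python
# Sketch of overlaying CPE data onto network topology: attach each CPE
# record to the coordinates of its serving node so status can be rendered
# at the right place on the network map. Field names are illustrative.

def overlay(cpe_records, nodes_by_id):
    """Join CPE records to node locations; drop records with unknown nodes."""
    joined = []
    for cpe in cpe_records:
        node = nodes_by_id.get(cpe["node_id"])
        if node is None:
            continue  # unknown node: flag for data-quality review upstream
        joined.append({**cpe, "node_lat": node["lat"], "node_lon": node["lon"]})
    return joined

nodes = {"N1": {"lat": 39.74, "lon": -104.99}}
cpe = [{"mac": "aa:bb:cc:dd:ee:ff", "node_id": "N1", "status": "online"}]
print(overlay(cpe, nodes))
```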
Trouble Tickets and Work Orders
It is important that information related to issues impacting customer service be presented in near real-time. A practical approach is to sync a list of all active trouble tickets and restoration work orders every 10-15 minutes, geo-code them based on the customer account, address or infrastructure location, and render them on the network map.
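The sync-geocode-render pipeline above can be sketched as follows; the ticket fields and the geo-coder are hypothetical stand-ins for your ticketing system and geo-coding service:

```python
# Sketch of the 10-15 minute ticket sync: take the current list of active
# tickets, geo-code each by its account address, and emit GeoJSON point
# features ready to render on the network map. Field names are assumptions.

def tickets_to_features(tickets, geocode):
    """Convert active tickets into GeoJSON features via a geo-coder."""
    features = []
    for ticket in tickets:
        lat, lon = geocode(ticket["address"])
        features.append({
            "type": "Feature",
            # GeoJSON coordinate order is [longitude, latitude]
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {"ticket_id": ticket["id"],
                           "status": ticket["status"]},
        })
    return features

stub_geocode = lambda addr: (39.75, -104.99)  # stand-in geo-coding service
active = [{"id": "T-100", "address": "999 18th St", "status": "open"}]
print(tickets_to_features(active, stub_geocode)[0]["geometry"])
```

Re-running this every 10-15 minutes and replacing the rendered layer keeps the map close to real time without hammering the ticketing system.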
This paints a complete picture of what customers are experiencing and where your efforts should be directed. Extending this to relating tickets and restoration orders with the outside plant infrastructure data allows for customer issues to be associated to the network assets and topology for root cause analysis and correlation.
Performance and Trend Data
Performance and trend data is the keystone of an effective PNM program. There are a variety of commercial and home-grown systems in every enterprise that provide trend information about network health and performance. By integrating this information with your network assets and delivering it to everyone in your enterprise you will see a step change in overall network knowledge and rapid improvements in problem diagnosis, positively impacting areas such as Mean Time To Repair (MTTR).
Cloud Based Geospatial Data
Google has dominated the consumer mapping market with its high performance and ease of use. It extended its solution with Google Street View, a valuable resource for cable companies whose assets are geographically dispersed but concentrated in urban areas where the data is most complete. Google does not publish how current its imagery is, so this should be taken into account when viewing the data, but in context it can be a very valuable resource. Bing and OpenStreetMap (OSM) are alternative data sources which can also be considered.
Other online mapping systems which package data with cartographic tools can also be utilized. Examples include Mapbox, which offers simple, easy-to-use map publishing capabilities, and CartoDB, which delivers more complex capabilities.
Market and Competitive Intelligence
Marketing and competitive intelligence data is typically acquired or updated on a quarterly or monthly schedule. Examples of market and competitive data sources include Dun & Bradstreet business demographics, metropolitan building multi-tenant information, wireless infrastructure, residential broadband coverage, lot parcel characteristics, and competitor service area coverage. As the underlying information does not change rapidly, it does not need to be updated or refreshed all that frequently to still be a valuable aid in infrastructure planning, serviceability and customer response.
From a network operations perspective this information can provide valuable insight into where customers are located in relation to equipment, service areas and the end-to-end topology. Customer account data generally comes from the Billing or CRM system and can be refreshed daily based on accumulated changes.
Account locations must be geo-coded, either as a batch process or as part of the account location validation workflow. One consideration is the accuracy of the geo-coding: in some cases the engineering group handles this and the account location is very close to the center of the lot parcel or near the building footprint, while in many other cases results from Google Maps, Bing or other address geo-coding services are sufficient. However it is obtained and managed, the customer account location is a critical part of tying network and service usage, performance and status to the operational visualization.
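The accuracy trade-off above suggests a simple fallback rule: prefer the engineering-supplied parcel-centroid location when one exists, and only then fall back to an address geo-coding service. A minimal sketch, with all field names assumed for illustration:

```python
# Sketch of the geo-coding accuracy fallback: use the engineering group's
# high-accuracy parcel centroid when available, otherwise fall back to an
# address geo-coding service. The account fields are hypothetical.

def account_location(account, geocode_service):
    """Return ((lat, lon), source) for a customer account record."""
    if account.get("parcel_centroid"):
        # Engineering-maintained location: close to the parcel center
        return account["parcel_centroid"], "parcel"
    # Fall back to a standard address geo-coder (Google, Bing, etc.)
    return geocode_service(account["address"]), "geocoded"

stub_service = lambda addr: (39.7492, -104.9903)  # stand-in geo-coder
print(account_location({"address": "999 18th St"}, stub_service))
```

Tagging each location with its source ("parcel" vs "geocoded") also lets the map flag lower-confidence positions for later cleanup.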
Most modern field teams have GPS on their devices or trucks. This is useful in two ways: it provides automatic context for use in the field, and it gives other people in the organization the information needed to direct the closest qualified person to an incident. This data does not need to be real-time and can be shown in the system on a 1 to 5 minute update cycle.
Weather & Traffic
There is increasing sophistication in the data available about both current weather and traffic. Traffic data can assist in decisions about which work to do prior to leaving the office, as well as reducing issue response times in the field. Integrated weather allows you to proactively manage storm preparedness and recovery situations and aids issue diagnosis. This data is typically provided in a live stream from providers such as Google and WeatherBug.
Public data from federal, state and local sources tends to be an untapped resource. These data sources can be valuable for operations teams as they provide up-to-date information about areas such as road closures, rezoning and police incidents, all of which can fold into your planning and day-to-day operations. This data is generally updated on a weekly or monthly cycle and can easily be either synced or read from a live stream.
Map out your data timelines
We recommend you map out your data sources on a timeline to clearly illustrate the applicable refresh and update cycles of each data source. The source system database and application architecture often drives the approach to integration including the timing. Once mapped out the next step is to prioritize data integration and implement incrementally to maximize the time-to-value for your project.
So, having identified the data sources that are readily available for integration, there are some simple questions you need to ask about every source to determine the best method for integration.
Is there an existing data service I can plug into?
Increasingly, data services exist that publish data in industry-standard forms. Many GIS and spatial systems have documented services which most Enterprise Location Intelligence platforms support, and industry standards such as the Open Geospatial Consortium (OGC) WMS/WFS standards can be used.
For non-spatial data such as status, usage and performance, there are multiple ways that data feeds can be enabled, including product APIs, web services, EAI, ODBC/JDBC, and ETL (extract, transform and load). The latter approach can be simple and effective, using scheduled automatic bulk export and import procedures. Millions of records can be handled this way and it tends to be fast and easy, with minimal overhead and no additional software costs.
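The bulk export/import path can be sketched in a few lines: extract rows from the source, apply light transformation, and load the cleaned records into the target store. Here an in-memory CSV round trip stands in for the scheduled export file, and the MAC-address normalization is just an illustrative transform:

```python
import csv
import io

# Minimal sketch of the scheduled bulk ETL path: extract source rows to
# CSV (the export file), transform them (normalize MAC address format),
# and load the cleaned records. io.StringIO stands in for the export file.

def etl(source_rows):
    """Extract rows to CSV, transform, and return loaded records."""
    export_file = io.StringIO()
    writer = csv.writer(export_file)
    for row in source_rows:
        # Transform step: normalize MACs to lowercase colon-separated form
        mac = row["mac"].lower().replace("-", ":")
        writer.writerow([mac, row["status"]])
    export_file.seek(0)
    # Load step: read the export back as clean tuples
    return [tuple(rec) for rec in csv.reader(export_file)]

print(etl([{"mac": "AA-BB-CC-DD-EE-FF", "status": "online"}]))
```

In practice the extract would be a scheduled database export and the load a bulk import into the Location Intelligence platform, but the three-stage shape is the same.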
Can the source system meet the performance requirements?
If there is not an existing service you can take advantage of, then another series of questions must be posed. Enterprise Location Intelligence platforms, as the name suggests, make requests based on geospatial proximity, which may differ from how some of your internal systems are optimized for their primary use.
So a key question to ask is: will the source data system support the typical use cases you will see from day-to-day users? If the answer is yes, a direct integration can be implemented. If the answer is no, we recommend a sync into a spatially indexed database such as Oracle Spatial or PostGIS. Once in a spatial database, the data can either be rendered directly or served through a tiling mechanism to optimize performance, in a similar way to Google Maps.
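To make the tiling idea concrete, here is the standard Web Mercator tile math used by Google Maps and OSM-style slippy maps: any latitude/longitude maps to a tile (x, y) at a given zoom level, so pre-rendered tiles can be cached and served quickly. This sketch shows the scheme, not any particular platform's implementation:

```python
import math

# Sketch of Web Mercator tile addressing (the scheme behind Google Maps
# and OSM tiles): convert a lat/lon to integer tile coordinates at a zoom
# level, so rendered network layers can be cached and fetched per tile.

def latlon_to_tile(lat, lon, zoom):
    """Return the (x, y) slippy-map tile containing the given point."""
    n = 2 ** zoom                       # tiles per axis at this zoom
    x = int((lon + 180.0) / 360.0 * n)  # linear in longitude
    # Mercator projection in latitude, flipped so y=0 is the north edge
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

print(latlon_to_tile(39.7392, -104.9903, 12))  # downtown Denver at zoom 12
```

Because nearby assets share tiles, a map client only fetches the handful of tiles in its viewport instead of querying millions of features directly.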
Do I need to use this data offline?
Some data may be needed when you are operating in an area without network coverage, or in situations where the network itself is unavailable, such as during storms. In this case most Enterprise Location Intelligence platforms support the ability to incrementally sync data to a local device from a central repository.
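A common way to implement that incremental sync is a last-modified watermark: the device pulls only records changed since its previous sync and advances the watermark. A minimal sketch, with the record shape assumed for illustration:

```python
# Sketch of watermark-based incremental sync for offline devices: pull
# only records modified since the device's last sync, then advance the
# watermark. The "modified" field is an assumed change timestamp.

def incremental_pull(server_rows, last_sync):
    """Return (changed records, new watermark) for a device sync."""
    changed = [row for row in server_rows if row["modified"] > last_sync]
    # If nothing changed, keep the old watermark unchanged
    new_watermark = max((row["modified"] for row in changed),
                        default=last_sync)
    return changed, new_watermark

rows = [{"id": 1, "modified": 5}, {"id": 2, "modified": 9}]
print(incremental_pull(rows, 6))
```

Each pull is small, so a truck's device can top up its local copy whenever connectivity is available and still have a usable map when it is not.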
In our first blog post in this series we explained how integration and geospatial visualization of critical enterprise data becomes the foundation of PNM and provides improved decision support and situational awareness.
In the second post we showed how consumerization of IT is positively impacting business system usability, flexibility and speed. When combined, open source, mobility and cloud data create the perfect storm of disruption that is transforming how enterprise systems are designed and deployed. Applications that are intuitive and simple, and work across all devices are being rolled out in record time with unprecedented user acceptance and impact. These factors have become the platform of change in PNM.
In this post we have shown how taking a flexible and practical approach to how and when data is integrated, and presenting that information geospatially, enables service providers to quickly and effectively take network operations and customer service to a new level of intelligence, awareness and responsiveness.
In the next part of our PNM blog series we will discuss how to consolidate, correlate and present all this valuable data in usable ways to optimize network engineering and operations processes and arm service provider teams with the insights they need to improve key performance indicators (KPIs).
If you have questions about data sources and methodologies please contact the Ubisense team.
Todd Kuty, Director, Cable & Telecom Solutions
m: 720-506-1422 / e: [email protected]
Ubisense Inc, 999 18th Street #901, Denver CO 80202