American Telecom Company Builds a Data Hub to Take On a 2.8 PB Load
Overview

Large enterprises need better visibility into their data – it must be of good quality, accessible from any of their data centers, and monitorable in real time. The customer wanted a centralized hub to gather data coming in from its 54.3M customers across the US.

Problem

An initial assessment of the customer’s existing infrastructure revealed a few challenges:

  • Lack of a centralized data hub to support all customer and telecom data analytics
  • Lack of a solution to monitor the performance of the telecom network in real time
  • Lack of a system to collate data from switches supplied by multiple vendors for network capacity planning and performance monitoring

Solution

Apexon devised an end-to-end solution covering every stage of the customer’s data pipeline, using Ab Initio, Informatica, Hadoop, Splunk, erwin Data Modeler, Tableau reporting, and other technologies. The solution includes:

  • A full-scope enterprise data warehouse and BI application with data modeling, ETL, cubes, and a reporting framework
  • Data extraction from binary files on mediation servers, with checks for data gaps and quality in the multi-threaded mediation and ETL stages of the data cycle
  • Cubes designed and built with appropriate partition, aggregation, and caching strategies to trend network statistics on capacity, performance, and compliance
  • An end-to-end ETL solution that ingests data into Hadoop as well as into landing Teradata tables and Splunk dashboards
  • Customer and demographic services data pushed into the Snowflake cloud data warehouse, with SnowSQL reports and Chartio dashboards
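The data-gap check performed in the mediation/ETL stage can be sketched in a few lines. This is a minimal illustration under an assumed convention (an increasing per-switch record sequence number); the `find_gaps` helper is hypothetical and not Apexon’s actual implementation.

```python
from typing import List, Tuple

def find_gaps(sequence_numbers: List[int]) -> List[Tuple[int, int]]:
    """Return (start, end) ranges of missing record sequence numbers.

    Mediation files typically carry an increasing per-switch record
    sequence; a jump larger than 1 between consecutive numbers means
    records were dropped, and the ETL stage should flag the gap before
    the data lands in Hadoop or Teradata.
    """
    ordered = sorted(sequence_numbers)
    gaps = []
    for prev, curr in zip(ordered, ordered[1:]):
        if curr - prev > 1:
            gaps.append((prev + 1, curr - 1))
    return gaps

# Records 4-5 and 8 never arrived from the switch:
print(find_gaps([1, 2, 3, 6, 7, 9]))  # [(4, 5), (8, 8)]
```

In a production pipeline a check like this would run per switch and per file, with flagged ranges routed to a reconciliation queue rather than silently loaded.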
Impact

The comprehensive solution we created for the customer generated many benefits, such as:

  • A centralized data store that functions as a single version of the truth, improving operational performance and enabling faster data processing
  • The capacity to handle an incremental data load of 2.8 PB
  • Migration to a more scalable Teradata and Hadoop platform to meet growing data demands cost-efficiently

Other Case Studies
A large container operator implements centralized data warehouse to improve data quality by 30%

Learn how we assisted a Danish business conglomerate with activities in the transport, logistics, and energy sectors to implement a centralized data warehouse that improves data quality and supports analysis of business trends and performance.

A major US bank holding company reduced data anomaly and mismatch errors by 60%.

Discover how we aided a leading US bank holding company headquartered in Michigan with capability enhancements, establishing a long-term process that reduces data anomaly and mismatch errors, improves operations, and mitigates risks.

Emergency Department – Coding Improvement

Learn how we assisted a leading revenue cycle management services company in the United States in identifying opportunities to improve documentation while also modifying and automating its reconciliation and quality procedures.

US-based Physicians Group Improves ROI by 6X

Learn how we helped a large physicians group based in the US use deep learning models to improve error identification by 3x and reduce financial leakage by 40%.

Let’s head to the apex

Get in touch and let’s start a conversation!
