2-minute read

At a glance

When a new regulation required one of California’s largest natural gas providers to build a more accessible data foundation, the company turned to Logic20/20.

 

Customer challenge

Our partner was leading a major internal effort to improve data accessibility and quality across the enterprise. Through data governance and data engineering initiatives, they aimed to bring siloed datasets into a single, centralized data lake built on SAP HANA and AWS.

Driving this effort was the risk management team’s need to implement more than 60 new data requirements to comply with the Pipeline and Hazardous Materials Safety Administration’s (PHMSA) new “Mega Rule.” Non-compliance with the rule can result in multi-million-dollar fines from state and federal regulatory bodies.

 

Approach and solution

We delivered staged, pre-processed datasets ready for ingestion into the SAP HANA data lake, along with data validation and remediation to improve data quality. Our support also included:

• Data governance planning and implementation

• Cloud architecture design

• Agile coaching and transformation

• Requirements gathering and analysis

• Development and delivery of three in-person staff workshops

 

Value and benefits

Our team provided the data foundation our partner needed to build their annual risk models in compliance with the new PHMSA regulations. Further results included:

• Support for major organizational transformations in data governance

• Business process automation and improvement

• Agile delivery frameworks

• Cloud/AWS deployment

 

Authors

Steve Ernst is a Solutions Architect in Logic20/20’s Digital Transformation practice.

Chris Wu is a Manager in Logic20/20’s Advanced Analytics practice.
