
Modernizing Data Governance

If the first generation of data governance was about awareness, and the second generation introduced tools and operating models, data governance modernization today is about enabling information consumers to realize the full benefit of data and analytics capabilities.

 

As insurance companies invest heavily in core business system transformations (Guidewire, Duck Creek), cloud migration initiatives, and integrated data environments and analytic solutions, it has become increasingly important to have a comprehensive data governance program in place. For insurers, data governance is a big deal: customer experience, data risk mitigation, regulatory compliance, and a data-driven culture are all at stake. A comprehensive data governance program that incorporates data and analytics modernization, along with data management innovation, can transform how organizations manage, use, and benefit from their data assets.

 

Next-Generation Data Governance

 

Here are some defining elements for future-facing data governance programs:

 

Data Stewardship

A central component of an effective data governance program is data stewardship. Assigning data management responsibilities to data stewards where data is created and used promotes ownership, accountability, and a deeper understanding of the data, while still aligning with enterprise governance policies and standards. It is also important to distinguish between data producers and data consumers when defining your stewardship operating model. This distinction is gaining traction because producers must think enterprise-wide while consumers can be business-specific, so their governance requirements and stewardship functions differ. The number of analytic data producers and consumers is growing rapidly in our era of data decentralization, so it pays to organize governance around this scalable operating model, as the sketch below illustrates.
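As one way to picture the producer/consumer distinction operationally, here is a minimal Python sketch of a hypothetical stewardship registry. The asset names, steward titles, and record shape are illustrative assumptions, not a prescribed model:

```python
from dataclasses import dataclass, field

@dataclass
class StewardshipAssignment:
    """Illustrative record tying a data asset to its stewards."""
    asset: str                # e.g., a claims or policy data domain (hypothetical)
    producer_steward: str     # accountable for enterprise-wide definitions and source quality
    consumer_stewards: list = field(default_factory=list)  # accountable for fit-for-purpose use

# Hypothetical assignments for an insurance claims domain
registry = [
    StewardshipAssignment(
        asset="claims.claim_transactions",
        producer_steward="Claims Operations Data Steward",
        consumer_stewards=["Actuarial Analytics Steward", "Fraud Analytics Steward"],
    ),
]

for a in registry:
    print(f"{a.asset}: produced under {a.producer_steward}; "
          f"consumed by {', '.join(a.consumer_stewards)}")
```

Separating the two roles in the model itself keeps enterprise-wide accountability with producers while letting each consuming business area govern its own use.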

 

Enterprise Data Catalog and Metadata Automation

The data catalog helps insurers identify, understand, and manage their data assets. While data sources may be many and multiplying, the catalog can be the “single pane of glass” through which a single version of the truth is achieved. The catalog connects to many source types, including relational databases, file systems, and analytic and reporting tools, and extracts valuable metadata from each: data popularity and usage patterns, samples, profiling, and lineage.

 

Metadata automation streamlines the integration of new data sources and the maintenance of business and technical metadata, increasing efficiency, data transparency, and usability. The completeness of this information attracts catalog users and pushes the catalog to its adoption tipping point, i.e., the point where adoption accelerates because the catalog becomes the data “meeting place.”
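As a minimal sketch of what automated metadata harvesting can look like, the Python below profiles a table in a local SQLite database (standing in for any relational source) and assembles a metadata record that could then be pushed to a catalog. The table, data, and record shape are assumptions for illustration; real catalog products ship their own connectors and ingestion APIs.

```python
import sqlite3, json

def profile_table(conn: sqlite3.Connection, table: str) -> dict:
    """Harvest basic technical metadata: columns, types, row count, null counts."""
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    row_count = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    columns = []
    for _, name, col_type, *_ in cols:
        nulls = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {name} IS NULL"
        ).fetchone()[0]
        columns.append({"name": name, "type": col_type, "null_count": nulls})
    return {"table": table, "row_count": row_count, "columns": columns}

# Demo against an in-memory source (stand-in for a production database)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (policy_id INTEGER, state TEXT, premium REAL)")
conn.executemany("INSERT INTO policies VALUES (?, ?, ?)",
                 [(1, "OH", 1200.0), (2, "PA", None), (3, None, 950.0)])

metadata = profile_table(conn, "policies")
print(json.dumps(metadata, indent=2))  # in practice, send to the catalog's ingestion API
```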

 

Data Research and Analytic Environment

Creating a dedicated data research and analytic environment is another key component of a modern data governance program. This environment gives data scientists and analysts the ability to explore data and develop advanced analytics in a controlled, secure space without the rigidity of a “production” IT environment. It allows new data to be onboarded rapidly and put to work quickly, often while the data is still being discovered.
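To make the “onboard rapidly, explore immediately” pattern concrete, here is a small sketch using a local DuckDB file as the sandbox. The file name, columns, and sample values are assumptions; the point is that exploration can begin before any formal modeling.

```python
import duckdb, pathlib

# Stand-in raw extract; in practice this arrives from a source system
pathlib.Path("new_claims_extract.csv").write_text(
    "claim_id,state,paid_amount\n101,OH,2500\n102,PA,1800\n103,OH,4200\n"
)

# A local analytic sandbox, isolated from production systems
con = duckdb.connect("research_sandbox.duckdb")
con.execute("CREATE SCHEMA IF NOT EXISTS landing")

# Onboard the raw file as-is; column names and types are inferred
con.execute("""
    CREATE OR REPLACE TABLE landing.new_claims AS
    SELECT * FROM read_csv_auto('new_claims_extract.csv')
""")

# Start exploring immediately, while the data is still being discovered
print(con.execute("""
    SELECT state, COUNT(*) AS claim_count, AVG(paid_amount) AS avg_paid
    FROM landing.new_claims
    GROUP BY state ORDER BY claim_count DESC
""").fetchall())
```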

 

Data Analytics Marketplace

With a growing number of data producers and consumers, and an appetite for re-use rather than replication, many insurance companies have built (or are exploring) an analytics marketplace where data consumers can discover existing data products that may meet their needs. A data analytics marketplace transforms how data is shared and consumed across the enterprise. In some cases, the data catalog itself serves as the marketplace.
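Here is a toy sketch of the marketplace idea: published data products registered with descriptions and tags, and a keyword search that surfaces reusable candidates before anyone builds something new. The product names, tags, and descriptions are hypothetical.

```python
# Hypothetical registry of published, reusable data products
data_products = [
    {"name": "policy_inforce_monthly", "tags": ["policy", "exposure"],
     "description": "Month-end in-force policy snapshot, certified for reporting"},
    {"name": "claims_severity_features", "tags": ["claims", "modeling"],
     "description": "Curated claim-level features for severity modeling"},
]

def search_marketplace(query: str) -> list:
    """Return products whose name, tags, or description match the query."""
    q = query.lower()
    return [p for p in data_products
            if q in p["name"].lower()
            or q in p["description"].lower()
            or any(q in t for t in p["tags"])]

for hit in search_marketplace("claims"):
    print(f'{hit["name"]}: {hit["description"]}')
```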

 

AI-Enabled Data Quality Management 

Ensuring high data quality is a foundational component of data governance. This includes designing and implementing data quality rules and remediating data quality defects. AI-driven anomaly detection helps ensure the data powering the business stays reliable: AI-based tools can monitor the quality of your KPIs, the critical data elements that feed them, and the processing pipelines used to create them.
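As a simple statistical baseline for the kind of anomaly detection these tools perform, the sketch below flags daily KPI values that drift more than three rolling standard deviations from their recent history. Production tools use richer models (seasonality, pipeline signals), but the monitoring pattern is similar; the KPI series here is synthetic.

```python
import numpy as np
import pandas as pd

# Synthetic daily KPI (e.g., written premium) with an injected defect
rng = np.random.default_rng(7)
kpi = pd.Series(1000 + rng.normal(0, 25, 90),
                index=pd.date_range("2024-01-01", periods=90, freq="D"))
kpi.iloc[60] = 1400  # simulate an upstream pipeline defect

# Rolling z-score anomaly detection
window = 14
mean = kpi.rolling(window).mean().shift(1)  # score each day against prior days only
std = kpi.rolling(window).std().shift(1)
z = (kpi - mean) / std

anomalies = kpi[z.abs() > 3]
print(anomalies)  # dates needing data-quality triage before the KPI is published
```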




By Bill Gowans

Partner and Founder, PremiumIQ

