
The Data Fit Organization and the Lessons of Macondo

By Jess Kozman, Katalyst Data Management

A Data Fit Organization (DFO) is one where:

• data culture is a ubiquitous part of work, like safety is today,
• all employees have data competencies and capabilities,
• all employees demonstrate behaviors that deliver strategic value from data, and
• data roles and responsibilities are measured and incentivized.

The concept of a Data Fit Organization was developed by a Steering Committee in Perth, WA, with input from the CORE Innovation Hub (Australia’s first co-working collaboration space focused on resources technology), resource sector operators (petroleum and mining), academia, government agencies, and technology service providers. The Steering Committee is delivering a thought leadership framework to help map, assess, and improve data capabilities and behaviors across roles and to enable effective upskilling.

A Data Fit Organization is one where data is integral to how the organization operates, and where a shared industry approach is used to drive value consistently and effectively through data mapping and by assessing and improving data capability and maturity. It uses a simple, repeatable, and accessible framework informed by the Fitness-To-Operate (FTO) safety competency framework, built on three measurable “capitals”: Human, Social, and Organizational. The FTO framework was developed in 2013 for the offshore oil and gas industry by the International Regulators Forum to reduce the risk of an accident in Australian waters similar to the Macondo incident in the Gulf of Mexico. The Steering Committee continues to focus on the role of data in determining fitness to operate.
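
As an illustration only, the sketch below (Python; the dimension names, 0–5 scale, and unweighted averaging are assumptions, since the source does not define a scoring formula) shows how individual capability scores might roll up into per-capital and overall data-fitness scores:

    from dataclasses import dataclass, field

    # Hypothetical capability scores on a 0-5 maturity scale, grouped
    # under the three FTO-informed capitals named in the framework.
    @dataclass
    class CapitalScores:
        human: dict = field(default_factory=dict)           # individual skills
        social: dict = field(default_factory=dict)          # team and culture behaviors
        organizational: dict = field(default_factory=dict)  # processes and incentives

        @staticmethod
        def capital_average(scores: dict) -> float:
            return sum(scores.values()) / len(scores) if scores else 0.0

        def overall(self) -> float:
            # Simple unweighted mean across the three capitals (an assumption).
            capitals = (self.human, self.social, self.organizational)
            return sum(self.capital_average(c) for c in capitals) / len(capitals)

    assessment = CapitalScores(
        human={"data_literacy": 3, "tool_proficiency": 4},
        social={"data_sharing": 2, "questioning_culture": 3},
        organizational={"governance_roles": 2, "incentives": 1},
    )
    print(f"Overall data fitness: {assessment.overall():.2f} / 5")

Keeping the three capitals as separate scores, rather than a single number, preserves the framework’s point that individual skills, team behaviors, and organizational incentives can mature at different rates.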

The 2010 explosion and fire on the Deepwater Horizon rig, which was drilling for BP, was in part attributable to a data management and delivery failure. It led to the loss of the rig, 11 deaths, 17 injuries, an oil slick visible from space, and penalties and losses of over $60 billion. What became one of the largest environmental disasters in United States history, with an impact on six state coastlines, helped focus the resources industry on “Fitness To Operate” and ESG priorities.
A study by the SAS Institute in 2015 found that the first clear data indicator of fluid flow imbalance appeared 43 minutes before the blowout. The rig operators had the data to prevent the accident (see the figure below). The study used data artifacts from the Deepwater Horizon, materials from the Accident Investigation Report, the U.S. Coast Guard, and the U.S. National Commission Chief Counsel’s Report, and an IEEE causal chain-of-events (fishbone) analysis.


[Figure: rig sensor traces. Red: drill pipe pressure. Blue: flow out. Black: flow in.] “When the pumps are switched off the flow in is reduced to 0. Initially, the flow out signature drops but rapidly increases again. Either of these patterns are enough to red-flag that additional flow is in the annulus and is almost certainly coming from the formation as the well is kicking.” Reference: Walker and Duarte, 2013
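
The red-flag pattern described in that caption can be expressed as a simple automated check. The following is a minimal sketch (Python with pandas; the column names, tolerance, and window are hypothetical, and this is not the software that was on the rig) of flagging flow-out behavior that is inconsistent with the pumps being off:

    import pandas as pd

    def flag_possible_kick(df: pd.DataFrame,
                           flow_tol: float = 5.0,
                           window: str = "60s") -> pd.Series:
        # Expects a time-indexed DataFrame with hypothetical columns:
        #   'flow_in'  - pump output into the well
        #   'flow_out' - return flow from the annulus
        # With the pumps off, flow_in drops to ~0, so sustained or rising
        # flow_out suggests the formation is feeding the annulus (a kick).
        pumps_off = df["flow_in"] < flow_tol
        rising = df["flow_out"].diff().rolling(window).mean() > 0
        sustained = df["flow_out"].rolling(window).min() > flow_tol
        return pumps_off & (rising | sustained)

    # Usage: alerts = flag_possible_kick(rig_data); rig_data[alerts]

A boolean alert series like this could drive a prominent alarm, rather than relying on a person to spot a small inflection among many displayed traces.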

The data management failure was that the information appeared in a hard-to-read location, in a format that was not highlighted, at a time when multiple simultaneous operations were occurring on the rig floor. The driller would have had to notice a small inflection in one of at least 19 parameters displayed on a small auxiliary console. The Rig Manager had in fact earlier admitted to “having a little trouble” interpreting the data, but concluded at the time, “It’s no big deal”.

In Australia in 2014, a working group published a fitness-to-operate (FTO) conceptual framework for assessing behavioral factors that influence both short- and long-term safety outcomes, including how an organization encourages the questioning of operational data. This framework grew into the Data Fit Organization imperative and has since been piloted and refined in field operations at both oil and gas and mining operators in Australia. The data management capability and maturity aspects have been the subject of several workshops and presentations for industry data management organizations, including the PPDM Association and the Society for Petroleum Data Managers.

Guidance from the workshops, in the form of force-ranked priorities, can be used to focus data management efforts on the foundational, transformational, networking, and integration skill sets that allow knowledge workers to optimize the use of data in the current industry environment of volatility, uncertainty, complexity, and ambiguity (VUCA).

Embedded Data Workflows, similar to value stream modeling in business architecture, map all key data roles and capabilities to deliver business outcomes through standardized processes that leverage data. Successful data workflows are role-led, with data roles overlapping those identified in data governance frameworks.
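
To illustrate, a role-led workflow mapping might be represented as simply as the structure below (Python; the step, role, and capability names are invented for the example rather than taken from the framework):

    from typing import NamedTuple

    class WorkflowStep(NamedTuple):
        step: str
        role: str        # data role, as identified in a governance framework
        capability: str  # data capability exercised at this step

    # Hypothetical embedded data workflow for loading a seismic dataset.
    workflow = [
        WorkflowStep("receive_data",  "Data Custodian", "validate_format"),
        WorkflowStep("quality_check", "Data Steward",   "apply_standards"),
        WorkflowStep("load_to_store", "Data Engineer",  "integrate_systems"),
        WorkflowStep("publish",       "Data Owner",     "approve_release"),
    ]

    # Roll up which capabilities each role needs across the workflow,
    # giving a starting point for role-based upskilling.
    by_role: dict[str, set[str]] = {}
    for s in workflow:
        by_role.setdefault(s.role, set()).add(s.capability)
    print(by_role)

Mapping capabilities back to roles in this way is what makes the workflow “role-led”: the upskilling target for each role falls directly out of the steps it owns.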
