A Two-Lane Road to AI: Why Clean Data Is Your Only On-Ramp

I have been putting a lot of thought into what AI means in organizations, specifically around how to get to AI. In fact, I just had a discussion with my good friend Edwin Garcia, data scientist for MD Anderson in Houston, about how to get large organizations to the “AI level”. Here is the Cliffs Notes summary (pardon the ’90s reference):

The key raw material required to build a two-lane road to AI (not a highway, just two lanes) is clean data… and a lot of it.

There’s no amount of heuristic trickery or decision tree priming that can get around bad data. You (read: your organization) have to be really good at data to use AI. For an oil and gas company, it has to become as fundamental as drilling. It’s that important.

The Brutal Reality of “Garbage In, Garbage Out”

Let me start with an uncomfortable truth that every AI expert acknowledges but too many executives ignore: bad data WILL lead to bad outputs, AI hallucinations, etc. There’s no way around it. Doug Robinson, executive director of the National Association of State CIOs, puts it bluntly: “GenAI models are trained on and use massive amounts of data to be successful. You need to feed the beast! If you can’t trust the quality, integrity and reliability of your data, you can’t trust the results of the analysis.”

The computer science principle of “Garbage In, Garbage Out” (GIGO) isn’t just relevant here; it’s amplified in AI systems. As one data scientist explains: “AI systems can only perform to the quality of data they are fed. Any AI decisions, predictions, or actions will be biased when built on flawed data.” In our industry, where decisions impact millions of dollars in assets and worker safety, this isn’t just a technical issue; it’s an existential business risk.

Kel Wang, manager of applied data practices at GovEx, emphasizes a critical point: “All around, everybody talks about AI, but it’s honestly hard to come across articles around the lines of data quality or data inventories or data governance. Data quality is not the shiny icing on the cake.”

My Framework: The Seven-Stage Journey to AI Maturity

Based on my experience and extensive research into successful AI transformations, I’ve developed what I call the “Ashwood Enterprise AI Maturity Model” (my marketing folks will probably come up with a better name someday). This isn’t theoretical; it’s based on real implementations and the hard-learned lessons of what actually works in complex industrial environments.

The framework recognizes that AI readiness isn’t a destination—it’s a journey through seven distinct stages, each building on the foundation of the previous one. More importantly, it acknowledges that you can’t skip stages. You can’t jump from paper-based operations directly to autonomous AI systems any more than you can build the 40th floor before you’ve built the foundation.

Stage 1: Foundation (Prepare & Align)

This is where most oil and gas companies should start, but many want to skip. It’s about building organizational readiness for AI transformation through governance frameworks, leadership alignment, and cultural preparation. Without this foundation, technical capabilities become expensive experiments rather than business transformations.

Stage 2: Digitized (Data & Infrastructure)

Here’s where the rubber meets the road on my “clean data” imperative. This stage focuses on establishing comprehensive data capture, quality assurance, and governance. Data quality failures at this stage mean that “rather than garbage in garbage out (GIGO), you will have faster garbage in and garbage out (FGIGO), causing significant operational risk as data is used in more sophisticated modelling and automated processing.”
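To make “quality assurance” concrete, here is a minimal sketch of what a Stage 2 data-quality gate might look like for incoming sensor records. The field names, units, and thresholds are hypothetical illustrations, not taken from any specific client system; a real gate would cover far more rules (schema, units, sensor drift, deduplication).

```python
REQUIRED_FIELDS = {"sensor_id", "timestamp", "pressure_psi"}
PRESSURE_RANGE = (0.0, 15000.0)  # illustrative plausibility bounds, psi

def audit_records(records):
    """Return a list of (index, issue) pairs describing quality failures.

    Checks three basic rules: required fields present, values within a
    plausible range, and timestamps strictly increasing per sensor.
    """
    issues = []
    last_ts = {}  # most recent timestamp seen per sensor_id
    for i, rec in enumerate(records):
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            issues.append((i, f"missing fields: {sorted(missing)}"))
            continue  # remaining checks need the missing fields
        lo, hi = PRESSURE_RANGE
        if not (lo <= rec["pressure_psi"] <= hi):
            issues.append((i, "pressure out of range"))
        prev = last_ts.get(rec["sensor_id"])
        if prev is not None and rec["timestamp"] <= prev:
            issues.append((i, "timestamp not increasing"))
        last_ts[rec["sensor_id"]] = rec["timestamp"]
    return issues
```

The point of a gate like this is that bad records are flagged before they reach any model, which is exactly the FGIGO risk the stage is meant to eliminate.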

Stage 3: Decisions (Human-AI Collaboration)

Once you have good data flowing without manual adjustments, then you can start to add sophistication to your data analysis. This stage establishes effective human-AI decision-making processes with clear role definitions and proven ROI validation.

Stage 4: Delegated (Intelligent Automation)

You can automate more because you can trust that—if you’re doing the correct math—you’ll get the right answers. This stage scales AI-driven automation across business processes with comprehensive workforce transformation.

Stage 5: Dynamic (Adaptive Intelligence)

Once analysis is automated, you can start to layer on more conditional situations. AI systems adapt and optimize based on changing conditions, including advanced generative AI capabilities.

Stage 6: Deductive (Predictive Transformation)

Once you’ve reached this level of sophistication—NOW you can start to use your systems to train ML and AI models. This represents holistic prediction-driven business transformation with autonomous AI agents.

Stage 7: Discerning (Contextual Intelligence)

The pinnacle: AI systems with deep contextual understanding, ethical reasoning, and the ability to drive entirely new business models.
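Purely as an illustration of the “no skipping stages” rule, the seven stages can be modeled as an ordered sequence where advancement only ever happens one step at a time. The stage names come from the framework above; the code itself is a hypothetical sketch, not a real assessment tool.

```python
STAGES = ["Foundation", "Digitized", "Decisions", "Delegated",
          "Dynamic", "Deductive", "Discerning"]

def advance(current):
    """Move to the next stage; there is no jumping ahead."""
    i = STAGES.index(current)  # raises ValueError for an unknown stage
    if i == len(STAGES) - 1:
        return current  # already at the pinnacle
    return STAGES[i + 1]
```

The only path from "Foundation" to "Discerning" is six consecutive calls to `advance`, which is the two-lane-road idea in miniature.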

Why the Sequential Approach Matters

Good data requires disciplined practices and caretaking. You can’t shortcut this reality. As IT governance experts warn: “Not doing this now will lead to impacts further down the line through biased data” and exponentially more expensive fixes.

Here’s what I’ve learned from companies that tried to skip stages: they end up with what I call “AI theater”—impressive demos that fail in production, chatbots that hallucinate dangerous recommendations, and predictive models that predict the past better than the future.

The principle is clear: “The quality and relevant use of any analysis, analytics or business output is a direct function of the quality of the input data feeding the model.” In oil and gas, this means your reservoir models, production forecasts, and safety systems are only as reliable as your data governance.

The Two-Lane Road Philosophy

Why do I call this a “two-lane road” rather than a highway? Because most oil and gas companies don’t need the complexity of a full highway to AI—they need a reliable path that gets them there safely and sustainably.

A two-lane road implies:

  • Controlled access: You can’t just jump on at any point
  • Sequential progression: You travel through each stage in order
  • Bidirectional capability: You can adjust course based on what you learn
  • Appropriate scale: Sized for the journey, not over-engineered

The key raw material for this journey isn’t computing power or algorithm sophistication—it’s clean, well-governed data. Everything else is built on top of this foundation.

Real-World Application: What This Looks Like

Let me give you a practical example from a client engagement. They wanted AI-powered predictive maintenance for their offshore platforms. Their first instinct was to hire a machine learning team and start building models.

Instead, we started with Stage 1 (Foundation): establishing governance frameworks and getting leadership alignment on data quality standards. Then Stage 2 (Digitized): cleaning and standardizing their sensor data, maintenance records, and operational logs.

It took 18 months just to get clean, reliable data flowing. But once they reached Stage 3 (Decisions), their human-AI collaboration for maintenance planning showed immediate ROI. By Stage 4 (Delegated), they were automatically scheduling maintenance based on actual equipment conditions rather than arbitrary time intervals.

The result? 30% reduction in unplanned downtime and 25% reduction in maintenance costs. More importantly, they built a foundation that now supports multiple AI initiatives across their operations.
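The shift from calendar-based to condition-based scheduling described above can be sketched in a few lines. The vibration threshold and yearly cap here are invented for illustration; a real system would derive its limits from the cleaned sensor history and equipment specifications rather than hard-coded constants.

```python
def needs_maintenance(vibration_mm_s, hours_since_service,
                      vibration_limit=7.1, max_hours=8760):
    """Condition-based rule with a calendar backstop.

    Returns (due, reason). Measured degradation triggers service first;
    a yearly hour cap remains as a safety fallback.
    """
    if vibration_mm_s >= vibration_limit:  # measured condition wins
        return True, "vibration above limit"
    if hours_since_service >= max_hours:   # fall back to a yearly cap
        return True, "calendar cap reached"
    return False, "healthy"
```

Note that the arbitrary time interval doesn’t disappear entirely; it becomes a backstop behind the measured condition, which is what made the clean sensor data the prerequisite.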

The Data Quality Imperative

Recent data breach statistics underscore the criticality of proper data management: “In 2023, the 2,814 most significant data incidents resulted in the breach of 8,214,886,660 records.” For oil and gas companies, data breaches aren’t just about privacy—they’re about operational security and competitive intelligence.

The framework addresses this through progressive governance maturity. Each stage builds more sophisticated data protection while enabling more advanced AI capabilities. You don’t get the advanced capabilities without first proving you can handle the foundational responsibilities.

Cross-Stage Success Factors

The framework also identifies four critical success factors that apply across all stages:

Continuous Value Measurement: ROI tracking and business case validation at each stage, ensuring you’re building business value, not just technical capability.

Adaptive Governance: Governance frameworks that evolve with AI capabilities while maintaining risk management and regulatory compliance.

Human-Centric Transformation: Workforce development and reskilling programs that evolve roles rather than simply replacing people.

Technical Excellence: Platform architecture that supports current needs while being flexible enough for future AI capabilities.

Why This Framework Matters Now

As AI experts warn: “Whether pursuing a digital transformation or taking advantage of technologies like artificial intelligence (AI), machine learning (ML) or the Internet of Things (IoT), organizations need a strong foundation of trusted data to achieve their business goals.”

The oil and gas companies that master this sequential approach will have sustainable competitive advantages. Those that try to skip stages will find themselves with expensive AI projects that don’t deliver business value.

As one observer notes about AI implementations: “There’s no guarantee that the information you get from AI is accurate or beneficial. It will simply follow the popular opinion. And if that popular opinion is wrong, garbage in, garbage out.” In our industry, “popular opinion” can get people killed or cost billions in failed projects.

I’m Still Tweaking This Framework

I’ll be honest—this framework is still evolving. I’m continuously refining it based on client experiences, industry feedback, and emerging AI capabilities. But the core insight remains constant: clean data is your only reliable on-ramp to AI success.

The seven stages provide a roadmap, but every organization’s journey will be unique. The key is understanding where you are, where you need to go next, and what capabilities you must build at each stage.

Your thoughts are always welcome—just reach out! I’m particularly interested in hearing from companies that have attempted AI implementations and what they learned about data quality requirements along the way.

The Bottom Line

To reach the level of AI sophistication that transforms business operations, you must nail the fundamentals: data acquisition and maintenance, data-analysis-enabled decision making at the business level, automated conditional analyses, and so on. There’s no shortcut around this reality.

For an oil and gas company, data competency has to become as fundamental as drilling. It’s that important. Companies that embrace this reality and commit to the sequential journey will unlock AI’s transformative potential. Those that try to skip the hard work of data governance will find themselves stuck with expensive AI theater instead of business transformation.

The question isn’t whether AI will transform our industry—it will. The question is whether your organization will build the data foundation necessary to lead that transformation or be left behind by it.


JP Garcia is a Founding Partner at the Ashwood Advisory Group, specializing in helping energy companies navigate their AI maturity journey. He focuses on building the data governance and organizational capabilities that enable sustainable AI transformation in complex industrial environments.

Ready to assess where your organization stands on the AI maturity journey and develop a roadmap for systematic advancement? Contact Ashwood Advisory Group to discuss how Ashwood’s Enterprise AI Maturity Model can guide your transformation strategy.