When Industry Experience Meets AI Theory

Why Responsible Knowledge Design and Responsible AI Converge

For most of my journey building MyDrill, artificial intelligence was not the starting point.

The platform grew out of a much more practical discomfort: the way real industry capability, experience, and judgment were being reduced online to marketing language, directories, and keywords that didn’t reflect how decisions are actually made.

That work started well before the public GPT era.

Only recently did I come across the work of Yoshua Bengio, one of the foundational thinkers in modern AI. What struck me was not that his ideas changed my direction — but that they confirmed it.

This article is about that convergence.

The problem I was trying to solve — before AI entered the conversation

Working closely with industry, I kept seeing the same pattern:

  • Capability was presented as claims, not context
  • Experience was flattened into bullet points
  • Complex trade-offs were hidden behind promotional language

This wasn’t just a visibility problem. It was a decision-quality problem.

MyDrill was designed to address that by structuring knowledge differently:

  • companies as entities, not ads
  • capability as conditional, not declarative
  • constraints and trade-offs as first-class information

At the time, this was simply good industry design.

Discovering AI theory that explains the instinct

When I later encountered Bengio’s work, one idea stood out:

AI systems don’t learn the world by memorising information — they learn by building internal representations.

That immediately resonated.

If digital platforms shape the representations AI systems learn, then:

  • poorly structured industry knowledge doesn’t just mislead people,
  • it trains future systems to misunderstand the industry itself.

What Bengio articulated in theory, I had been responding to in practice.

Why structure matters more than scale

Many platforms optimise for reach, traffic, and content volume.

MyDrill took a different path:

  • structure over volume
  • clarity over promotion
  • governance over growth-at-all-costs

This wasn’t about being conservative. It was about recognising that what looks like content today becomes training data tomorrow.

Responsible AI doesn’t start at the model level. It starts much earlier — at the knowledge layer.

Values as defaults, not slogans

Another point of strong alignment with Bengio’s perspective concerns the role of assumptions.

In MyDrill:

  • safety is baseline
  • compliance is assumed
  • environmental and operational constraints are real inputs

These are not marketing differentiators. They are embedded defaults.

AI systems, like people, absorb what is implicit far more than what is proclaimed.

Human governance belongs in the architecture

One point where industry and AI theory align particularly well is governance.

Bengio consistently argues that high-impact AI systems must remain human-governed. Independently, MyDrill was designed as knowledge infrastructure, not a media platform.

Industry bodies, experts, and stewards are not an afterthought in this model — they are part of the system.

Why this matters for industry leaders

AI will increasingly influence:

  • procurement decisions
  • risk assessments
  • capability evaluation
  • strategic planning

The question for leaders is not whether AI will be used — but what it will learn from.

My journey with MyDrill has reinforced a simple insight:

Responsible knowledge design naturally converges with responsible AI theory.

You don’t need to start with AI to arrive there. You just need to care deeply about how decisions are made — and how understanding is formed.

A closing reflection

I’m genuinely glad to have discovered Yoshua Bengio’s work when I did. Not because it validated a platform, but because it validated a way of thinking.

As industry leaders, we have a choice:

  • allow algorithms to learn our industries accidentally,
  • or teach them deliberately, through well-designed, well-governed knowledge systems.

The future of AI in industry will be shaped less by models — and more by the quality of the knowledge we choose to share.

 

by Shadi Samieifar
Creator of MyDrill
