Operational Intelligence Requires Operational Experience
By Carlos Santos
Senior Vice President, Business Development
I’ve had the pleasure of working with engineering and automation companies, OT integrators, and IT companies over the course of my career. One thing has become increasingly clear to me, especially in the last few years: when it comes to operational technology and industrial AI, experience in the field matters.
Not presentation-deck experience, but real field experience: the kind that comes from designing control systems, commissioning plants on short turnarounds, troubleshooting a PLC communication failure at 2 a.m., and spending years working directly with operations teams.
That matters more now than ever.
There’s a growing wave of AI messaging in industry: every major consulting firm and enterprise software provider is talking about “digital transformation,” “industrial AI,” and “enterprise data strategy.” Some of that work is valuable. But too often, these initiatives start from the top down, driven primarily by IT-first thinking and corporate data architecture rather than operational understanding. Operational data is different.
Industrial environments can be messy. Two assets from the same manufacturer can have completely different tag structures. Naming conventions evolve over decades and with asset purchases. Data ranges are inconsistent. Some signals matter every millisecond; others only matter during abnormal situations, or, as is often said in the field, when “it hits the fan.” OT data isn’t clean consumer data or financial transaction data. It reflects real physical processes, maintained by generations of engineers, operators, and technicians solving practical problems over time.
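To make the tag-structure problem concrete, here is a minimal, invented sketch. The tag names, units, and conversions are hypothetical, not drawn from any real plant or historian; the point is that normalization can be driven by a single use case rather than a plant-wide data model.

```python
# Hypothetical example: two pumps from the same manufacturer, tagged
# under site conventions that evolved separately over decades.
pump_a_tags = {
    "P-101.DISCH_PRESS": 42.3,   # discharge pressure, psi
    "P-101.MTR_AMPS": 11.8,      # motor current, amps
}
pump_b_tags = {
    "AREA3_PUMP_7_PT": 2.91,       # discharge pressure, bar (different unit, too)
    "AREA3_PUMP_7_CURRENT": 12.1,  # motor current, amps
}

# A use-case-driven mapping: normalize only the signals the current
# problem actually needs, instead of modeling the whole plant up front.
TAG_MAP = {
    "P-101.DISCH_PRESS": ("discharge_pressure_bar", lambda psi: psi * 0.0689476),
    "AREA3_PUMP_7_PT": ("discharge_pressure_bar", lambda bar: bar),
    "P-101.MTR_AMPS": ("motor_current_a", lambda a: a),
    "AREA3_PUMP_7_CURRENT": ("motor_current_a", lambda a: a),
}

def normalize(raw_tags):
    """Map raw historian tags to canonical names and consistent units."""
    out = {}
    for tag, value in raw_tags.items():
        if tag in TAG_MAP:
            name, convert = TAG_MAP[tag]
            out[name] = round(convert(value), 3)
    return out

print(normalize(pump_a_tags))  # both pumps now comparable in bar / amps
print(normalize(pump_b_tags))
```

In practice this mapping work is exactly where field experience pays off: knowing which of the hundreds of raw tags on each asset actually carry the signal the use case needs.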
That’s why OT programs built solely around large enterprise platforms often struggle to deliver quick value. The infrastructure they demand is massive and incredibly expensive, even before the use cases are proven. Companies spend years trying to normalize data into rigid structures originally designed for ERP systems or business analytics, not high-frequency operational intelligence. Meanwhile, operations teams are waiting for something useful.
The irony is that the OT world has been doing many of these things for decades already. Long before AI became a boardroom buzzword, industrial automation teams were building centralized historians, integrating plant systems, and deploying advanced process control. In many ways, APC was the early industrial form of AI — using data models, process understanding, and real-time optimization to improve operations continuously.
Automation and OT companies understand where data originates because they’ve worked directly with the instruments, PLCs, SCADA systems, and operators generating it. They understand which data matters, which data is noisy, and which data should never leave the control room. Most importantly, they understand that successful industrial AI doesn’t begin with enterprise architecture. It begins with operational problems.
Start small. Prove value quickly. Build trust with operations. Create a scalable data foundation based on actual use cases, not theoretical future states. It’s not as exciting as a billion-dollar transformation, but it’s far more likely to produce results.
I’ve also seen what happens when large organizations acquire smaller system integrators. Initially, they gain operational knowledge and practical experience, but over time they force these companies into corporate systems. Processes become heavier. Decision-making moves further from the plant floor, and the focus shifts from solving operational problems to managing enterprise programs.
Somewhere along the way, the ties come out and the dirt under the fingernails disappears.
Industrial AI absolutely has transformative potential. But companies should be careful about confusing scale with effectiveness. Bigger platforms do not automatically produce better operational outcomes.
The future of OT and AI will belong to organizations that combine modern technology with genuine operational understanding — the people who know both the data and the process behind it.