After that Day 1 keynote, the mood throughout K26 was noticeably elevated, and rightfully so, considering the magnitude of the announcements. But we like to stay grounded here at CoreX, and one of the more valuable aspects of Knowledge so far has been how many conversations are moving into operational reality.
Take, for example, what has to exist underneath enterprise AI before autonomy actually becomes achievable. A session exploring Workflow Data Fabric and autonomous IT operations brought that conversation into focus in a very practical way.
While the examples came from NFP and its operational model, the challenges discussed are familiar across many large enterprise environments. As infrastructure ecosystems continue to expand, operations teams are being asked to manage growing volumes of telemetry, monitoring events, alerts, dependencies, workflows, and service relationships spread across increasingly fragmented tooling environments.
Most organizations do not lack monitoring data. In fact, they often have too much of it. The issue is that the data frequently exists without enough operational context surrounding it to support confident action.
That challenge showed up repeatedly throughout the 30-minute session. Teams are flooded with alerts coming from siloed monitoring platforms, yet many still struggle to determine:
Which events truly matter
What business services are affected
Who owns remediation
How incidents should be prioritized in real time
Organizations often invest heavily in observability, monitoring, and event collection before they establish the service relationships, workflow orchestration, and contextual visibility required to operationalize that information.
The result is often perceived as a lack of intelligence, when the real issue is a lack of coordinated operational context. That distinction matters because autonomous operations are not created simply by introducing AI into an existing environment.
AI systems still depend on trustworthy operational relationships underneath them. If alerts are disconnected from business services, if ownership structures are unclear, or if workflows are fragmented across systems, automation tends to amplify operational noise rather than reduce it.
That is what made this Workflow Data Fabric session particularly relevant. Rather than positioning the platform as another monitoring layer, the session focused on how the tool acts as a connective thread between telemetry and action.
Organizations can stream monitoring events directly into ServiceNow, where those events can then be enriched with CMDB relationships, workflow logic, service mapping, prioritization parameters, and broader operational intelligence.
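To make that enrichment step concrete, here is a minimal sketch of the pattern in Python. Everything in it is an illustrative assumption, not the actual ServiceNow API: the `CMDB` dictionary stands in for real service-mapping relationships, and the `Event` fields are invented for the example.

```python
from dataclasses import dataclass, field

# Hypothetical stand-in for CMDB service relationships. In a real
# deployment this context comes from service mapping, not a dict.
CMDB = {
    "db-prod-03": {"business_service": "Payments", "owner_group": "DBA-OnCall", "criticality": 1},
    "web-test-11": {"business_service": "QA Sandbox", "owner_group": "QA-Tools", "criticality": 4},
}

@dataclass
class Event:
    source: str          # monitoring tool that emitted the alert
    resource: str        # the configuration item the alert refers to
    severity: int        # 1 (critical) .. 5 (info), as reported by the tool
    context: dict = field(default_factory=dict)

def enrich(event: Event) -> Event:
    """Attach business context so downstream workflows can act on the event."""
    ci = CMDB.get(event.resource)
    if ci is None:
        # No known service relationship: the event cannot be routed
        # with confidence, which is exactly the gap the session described.
        event.context["routable"] = False
        return event
    event.context.update(ci, routable=True)
    return event

alert = enrich(Event(source="zabbix", resource="db-prod-03", severity=2))
print(alert.context["business_service"], alert.context["owner_group"])  # Payments DBA-OnCall
```

The point of the sketch is the branch in the middle: a raw alert with no service relationship is just telemetry, while the same alert joined to ownership and criticality becomes something a workflow can act on.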
Once operational signals are connected to meaningful business context:
Workflows can begin acting with much greater confidence.
Events can be correlated and deduplicated automatically.
Resolver groups can be assigned intelligently.
Incidents can be prioritized according to business impact instead of raw alert volume.
In some cases, remediation workflows can begin autonomously before human intervention is even required.
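The correlation and prioritization steps above can also be sketched minimally. Again, the alert fields and the 1-to-4 criticality scale are assumptions for illustration, not platform behavior: the sketch collapses duplicate alerts across monitoring silos, then ranks the resulting incidents by business criticality rather than raw alert volume.

```python
from collections import defaultdict

# Raw alerts from siloed tools; "criticality" (1 = most critical) is the
# business context that enrichment attached earlier.
raw_alerts = [
    {"tool": "zabbix",  "resource": "db-prod-03",  "metric": "disk", "criticality": 1},
    {"tool": "zabbix",  "resource": "db-prod-03",  "metric": "disk", "criticality": 1},  # duplicate
    {"tool": "datadog", "resource": "db-prod-03",  "metric": "disk", "criticality": 1},  # correlated
    {"tool": "nagios",  "resource": "web-test-11", "metric": "cpu",  "criticality": 4},
    {"tool": "nagios",  "resource": "web-test-11", "metric": "cpu",  "criticality": 4},
    {"tool": "nagios",  "resource": "web-test-11", "metric": "cpu",  "criticality": 4},
]

# Correlate and deduplicate: collapse alerts about the same resource and
# metric, regardless of which monitoring silo reported them.
incidents = defaultdict(lambda: {"count": 0, "criticality": 5})
for a in raw_alerts:
    key = (a["resource"], a["metric"])
    incidents[key]["count"] += 1
    incidents[key]["criticality"] = min(incidents[key]["criticality"], a["criticality"])

# Prioritize by business impact, not alert volume.
ranked = sorted(incidents.items(), key=lambda kv: kv[1]["criticality"])
for (resource, metric), info in ranked:
    print(resource, metric, f"alerts={info['count']}", f"criticality={info['criticality']}")
```

Ranking by raw volume would treat the two incidents equally (three alerts each); ranking by criticality surfaces the payments database first, which is the shift from alert volume to business impact the session described.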
Refreshingly, the session did not frame this as replacing existing operational tools. Instead, the opportunity lies in connecting those systems into a more operationally mature workflow architecture capable of supporting autonomous action.
Right now, much of the market conversation still focuses on models, copilots, and AI assistants. But inside enterprise environments, the harder challenge is operationalizing AI safely across real infrastructure, real workflows, and real business dependencies.
That operational maturity becomes even more critical as organizations move toward agentic AI and autonomous operations. AI agents still require context. They still require governance. They still require trusted operational relationships across systems and services. Without that foundation, enterprises risk accelerating workflows without improving outcomes.
We see this frequently in organizations beginning their AI journey. Automation initiatives often stall not because the technology is incapable, but because the underlying operational environment was never fully connected in the first place.
Workflow Data Fabric is about creating the operational structure necessary for AI, automation, and workflows to function together in a coordinated way across the enterprise. One slide during the session summarized the idea succinctly:
“AI needs Workflow Data Fabric to be operational.”
That statement captures something many organizations are starting to realize. The future of autonomous IT will be shaped by which organizations are able to connect their operational data, workflows, governance models, and service relationships well enough to let automation act with trust, visibility, and context.
--
Have questions about your own data strategy? Schedule a consultation with CoreX to better manage your data today, and prepare it for what's next.