For years, supply chain professionals have talked about visibility, resilience, and efficiency. The tools we have used (ERP systems, spreadsheets, and siloed databases) have served us well. However, as complexity increases and the margin for error narrows, there is a growing recognition that patchwork systems are no longer enough.
What is needed is a practical, scalable way to unify supply chain data across systems, make it useful in real time, and apply intelligence, whether from algorithms, machine learning models, or a trained human eye, to act on it quickly. That is exactly the direction taken by InterSystems and its customers, as detailed at InterSystems READY 2025 in Orlando, Florida.
Data Fabric Studio: One Place to Start
At the center of the discussions was the InterSystems Data Fabric Studio, a cloud-based system designed to integrate and organize data from multiple sources. It is aimed not just at IT departments or data scientists, but also at the people who manage daily operations: procurement leads, planners, and inventory analysts.
This product connects directly to systems like Snowflake, Kafka, AWS S3, and relational databases. It allows users to build and automate workflows (called “recipes”) that clean, reconcile, and move data into consistent formats, without having to code from scratch.
In short, it helps turn fragmented operational data into something trustworthy, structured, and ready for use across departments.
Use Case: Supplier Data Integration Across ERPs
One session focused on a familiar problem: integrating supplier data spread across two disconnected ERP systems. Each system used different IDs for the same suppliers, different formats for purchase orders, and different rules for reconciliation.
Using Data Fabric Studio, the team:
- Mapped and validated key identifiers (like DUNS numbers) across systems.
- Flagged inconsistencies in supplier names and standardized the records.
- Created lookup tables and transformation rules to automate future loads.
- Set a schedule to refresh the data daily, so no manual uploads are needed.
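The steps above can be sketched in code. This is a minimal illustration of the matching-and-flagging logic, not the actual Data Fabric Studio recipe; the field names and sample records are invented for the example.

```python
# Illustrative sketch of DUNS-based supplier reconciliation across two ERPs.
# All schemas and sample data below are hypothetical.

def normalize_name(name: str) -> str:
    """Standardize supplier names for comparison (case, punctuation, suffixes)."""
    cleaned = " ".join(name.upper().replace(",", " ").split())
    for suffix in (" INC", " LLC", " GMBH", " LTD"):
        cleaned = cleaned.removesuffix(suffix)
    return cleaned

def reconcile(erp_a: list[dict], erp_b: list[dict]) -> tuple[dict, list[str]]:
    """Match suppliers across systems on DUNS number; flag inconsistencies."""
    lookup, conflicts = {}, []
    b_by_duns = {rec["duns"]: rec for rec in erp_b}
    for rec in erp_a:
        match = b_by_duns.get(rec["duns"])
        if match is None:
            conflicts.append(f"DUNS {rec['duns']} missing from ERP B")
        elif normalize_name(rec["name"]) != normalize_name(match["name"]):
            conflicts.append(f"Name mismatch for DUNS {rec['duns']}")
        else:
            # Lookup table: both systems' local IDs keyed by the shared DUNS.
            lookup[rec["duns"]] = {"erp_a_id": rec["id"], "erp_b_id": match["id"]}
    return lookup, conflicts

erp_a = [{"id": "A-01", "duns": "150483782", "name": "Acme Industrial, Inc"},
         {"id": "A-02", "duns": "602917190", "name": "Borealis Parts"}]
erp_b = [{"id": "7731", "duns": "150483782", "name": "ACME INDUSTRIAL INC"}]

lookup, conflicts = reconcile(erp_a, erp_b)
print(lookup)     # one matched supplier, keyed by DUNS
print(conflicts)  # the second supplier is flagged as missing from ERP B
```

Once the lookup table exists, future loads reuse it automatically; only newly flagged conflicts need human review.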
The takeaway: fewer errors, faster onboarding, and one consistent view of supplier performance.
Forecasting, Not Just Reporting
Several sessions went far beyond integration, since once data is unified, it becomes possible to do more with it, like improve forecasts or detect early signs of trouble.
One method shown was to create snapshots of data tables at regular intervals, such as open purchase orders at the start of each week or inventory by location at shift change. These snapshots could then feed planning tools without requiring repeated rework or new queries every time someone asks for an update.
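The pattern is simple enough to sketch. The example below uses an in-memory list as a stand-in for a snapshot table, and invented PO records; the point is that each snapshot is a dated, immutable copy the planning tools can read without re-querying the live system.

```python
# Minimal sketch of the snapshot pattern: capture a timestamped copy of a
# working table at a fixed cadence. Table and column names are illustrative.
from datetime import date

snapshots: list[dict] = []  # stands in for a snapshot/history table

def take_snapshot(open_pos: list[dict], as_of: date) -> None:
    """Append a dated copy of each row in the current open-PO table."""
    for po in open_pos:
        snapshots.append({"as_of": as_of.isoformat(), **po})  # copy, not reference

open_pos = [{"po": "PO-1001", "supplier": "150483782", "qty": 40},
            {"po": "PO-1002", "supplier": "602917190", "qty": 12}]

take_snapshot(open_pos, date(2025, 6, 2))   # start of one week
open_pos[0]["qty"] = 25                      # live table changes during the week
take_snapshot(open_pos, date(2025, 6, 9))   # start of the next week

# Planners can now compare weeks without new queries against the live system.
week_over_week = {s["as_of"]: s["qty"] for s in snapshots if s["po"] == "PO-1001"}
print(week_over_week)  # {'2025-06-02': 40, '2025-06-09': 25}
```

Because each snapshot copies the rows rather than referencing them, later changes to the live table never rewrite history.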
It is not classical predictive AI, but it is the kind of practical structure that supports accurate forecasting and decision-making.
AI Integration
Many AI projects fail not because of the models themselves, but because the data going into them is disorganized or outdated. InterSystems’ position, with which ARC strongly agrees, is that data must be AI-ready (structured, validated, and governed) before AI can be reliably applied.
For those who are ready, Data Fabric Studio includes native support for vector search and retrieval-augmented generation (RAG). This means it can:
- Embed semantic search into procurement or customer service workflows.
- Feed large language models with accurate, up-to-date information drawn from verified data.
- Support natural language interfaces, including assistants that generate SQL or explain trends.
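To make the retrieval step behind these capabilities concrete, here is a toy illustration. Production deployments use learned embeddings and a vector index; here a bag-of-words vector and cosine similarity stand in, and all document text is invented.

```python
# Toy illustration of the retrieval step in vector search / RAG.
# Bag-of-words counts stand in for learned embedding vectors.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Stand-in embedding: token counts instead of a learned vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

docs = [
    "hydraulic pump seal kit for model HX-200",
    "supplier scorecard for on-time delivery Q2",
    "replacement seal kit compatible with HX series pumps",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Rank documents by similarity to the query; the top k become LLM context."""
    ranked = sorted(docs, key=lambda d: cosine(embed(query), embed(d)), reverse=True)
    return ranked[:k]

context = retrieve("seal kit for hydraulic pump HX-200")
# In a RAG pipeline, `context` is passed to the language model alongside the
# user's question, grounding the answer in verified records.
print(context)
```

Semantic matching is what lets a query surface the compatible "HX series" part even though it never mentions the exact model number, which is the behavior described in the parts-procurement example below.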
One example came from Agimero, a European firm that used vector search to streamline parts procurement. They built a semantic layer into their sourcing tool, which helped reduce turnaround time and freed up staff. Time to deploy? Less than a week!
Lessons from Healthcare That Apply Here
A keynote from the AI for Healthcare track may seem unrelated at first, but the core lesson was broadly applicable: data doesn’t have to be clinical to be useful in diagnosis. In one case, the team used shopping data to detect early signs of ovarian cancer based on changes in food purchases.
Now let’s translate that into supply chain language.
What if sudden shifts in supplier invoicing patterns indicated financial stress? What if internal communications flagged increasing lead times before they hit the dashboard? The tools now exist to explore those questions, not just log them.
The point is that it is time to examine where such signals might live in your own systems.
A Modular Approach That Does Not Lock You In
Another strength of the Data Fabric Studio is its modular design. You can start with basic data ingestion and cleaning, then layer on adaptive analytics, natural language assistants, or domain-specific modules (e.g., for supply chain, finance, or healthcare) when and if they make sense.
Unlike some vendor ecosystems, this one doesn’t insist you move everything into a new system. It works alongside existing data warehouses, ERP tools, and planning platforms. That flexibility matters, especially for organizations that cannot afford multi-year migrations.
Vector Search and RAG: Where It Fits
The integrated vector search capabilities shown during the sessions were grounded, not speculative. One demonstration showed how a company used it to improve search across 400 million records of biological data. The same tools were used in supply chain use cases, surfacing similar suppliers, matching part numbers across catalogues, or enabling text-based queries across historical documents.
These systems don’t replace human judgment, but they make pattern recognition faster, and reduce the time spent digging through dashboards and reports to get to the relevant piece.
Scalability Is Not Optional
For companies working on a global scale, performance is key. Sessions with Epic (the healthcare software company behind MyChart) showed how InterSystems IRIS, the underlying engine behind Data Fabric Studio, supports hundreds of millions of real-time transactions.
Why mention this in a supply chain context? Because once data becomes foundational to operations, slow queries and manual workarounds no longer suffice. The infrastructure must keep pace.
InterSystems has built its offerings with that in mind, whether for healthcare, finance, or logistics.
What stood out across the sessions was not hype but a clear theme:
- Organize your data first.
- Reconcile it across systems.
- Use automation to reduce repeat work.
- Add intelligence gradually, where it supports decisions.
- Prioritize infrastructure that can scale.
If you have worked in supply chain for any length of time, that list is no surprise. What is unusual is seeing those steps pulled together in a single system, accessible to both developers and business users.
Digital transformation is about doing what works better, faster, and with less friction, and that is exactly what was on display at InterSystems READY 2025.