
Supply Chain "Whack-a-Mole": How AI, Standardization, and Adoption will Fix the Mess

  • Writer: David Renne
  • Mar 17
  • 8 min read


TL;DR: In this Sentinel "Deep Dive," David Renne and Andy Reed explain how, after extensive research and conversations with industry leaders, they’ve come to see supply chain data management as a never-ending game of whack-a-mole: fix one issue, and another pops up. AI offers a path forward, but without standardized, interoperable data, even the best systems struggle. The real winners will be the adopters and innovators who break the cycle and make seamless data integration a reality.


"Every success story is a tale of constant adaptation, revision, and change." - Sir Richard Branson
 

Introduction


In our recent trips to several supply chain technology conferences (see our takeaways here), we reconnected with various industry leaders to hear their findings and further understand their pain points. With decades of experience in the supply chain planning space, their insights as both users of legacy systems and early adopters of new technology have been invaluable to our research.


As we shared in prior blogs (here and here), one of the biggest challenges in supply chain technology investment is adoption. And underpinning that macro challenge is a litany of micro challenges that have largely held the industry back over the past decade.


One of those micro challenges, and a key area of focus for us, is the widespread issue of master data management (MDM) across supply chains. MDM refers to the standardization of data - both internal and external - to ensure interoperability across supply chains and their systems, regardless of the suppliers, distributors, carriers, or other stakeholders involved.
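
To make the idea concrete, here is a minimal sketch of what a "golden record" approach to MDM could look like. The canonical schema, field names, and supplier mapping below are our own illustrative assumptions, not any particular vendor's data model.

```python
from dataclasses import dataclass

@dataclass
class MasterItem:
    """Hypothetical canonical ("golden") item record shared across partners."""
    gtin: str            # global trade item number
    description: str
    uom: str             # unit of measure, e.g. "EA"
    net_weight_kg: float

# Illustrative field mapping for one supplier's feed; every partner
# would need its own mapping into the same canonical schema.
SUPPLIER_A_MAP = {
    "item_no": "gtin",
    "desc": "description",
    "unit": "uom",
    "wt_kg": "net_weight_kg",
}

def normalize(raw: dict, field_map: dict) -> MasterItem:
    # Rename supplier-specific fields to canonical names, then build
    # the shared master record used by downstream planning systems.
    canonical = {field_map[k]: v for k, v in raw.items() if k in field_map}
    return MasterItem(**canonical)

print(normalize(
    {"item_no": "00012345678905", "desc": "Pallet jack", "unit": "EA", "wt_kg": 68.0},
    SUPPLIER_A_MAP,
))
```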


Without high-quality MDM, operations can grind to a halt, visibility becomes obscured, and supply chains lose their agility. While it may seem intuitive, data-driven decision-making only works when the data is accurate, up-to-date, and accessible.  Relying on incomplete, outdated, or siloed data is a recipe for inefficiency, resulting in imprecise and under-optimized outcomes. 


In this Sentinel Deep Dive, we’ll explore some of the common pitfalls in tackling the industry-wide MDM problem and highlight potential solutions for the future.


A big thank you to the adopters we met at each conference for sharing their expertise and experience with us. If you, like them, have insights, challenges, or visions for the future of supply chain technology, please reach out!  


 

The Scope of the Master Data Problem


Poor-quality data management can cost businesses worldwide up to 25% of their revenue through lost time and lost sales. Supply chains are no different – and the cost is often exacerbated when data inconsistencies ripple from suppliers to customers and back again.

The compounding nature of inconsistent data across supply chains creates a whack-a-mole problem.


For instance, if you are a large multinational shipper with hundreds of suppliers and want to impose MDM standards across all of them, you must drive change across a vast network of companies. This requires both upstream and downstream counterparties to comply with the standards that you, as the shipper, prefer – standards that may not necessarily align with what is best for each individual business.


Now, suspend disbelief for a moment and assume this standardization is achievable in today’s world. What happens when one of these suppliers adjusts its standards to accommodate another large multinational shipper?


This is where the whack-a-mole syndrome takes hold, and managing every possible node of your supply chain becomes untenable. This harkens back to our earlier piece – a key barrier to adoption is clean, accessible, and usable data. We believe the lack of MDM standards across the industry has both created opportunity and sunk some promising startups and talented founders in recent years.
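
A back-of-envelope illustration of why this becomes untenable: with point-to-point standards, every shipper-supplier pair ends up maintaining its own mapping, while a single shared standard needs only one mapping per participant. The numbers below are purely illustrative.

```python
# Illustrative network sizes, not figures from our research.
shippers, suppliers = 20, 500

point_to_point = shippers * suppliers   # one bespoke mapping per shipper-supplier pair
shared_standard = shippers + suppliers  # one mapping per participant onto a common standard

print(point_to_point)    # 10000 mappings to build, negotiate, and maintain
print(shared_standard)   # 520 mappings
```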

It is worth recognizing previous attempts to standardize data. The first that comes to mind is EDI. EDI, or electronic data interchange, was developed in the 1960s but not broadly adopted until the 1990s. Despite its long history, it remains the gold standard for most trading partners around the world. However, our conversations indicate that EDI implementation and use remain cost-prohibitive.


Other solutions, like ISO 8000, have attempted to force the issue by formalizing requirements for master data quality. Even so, we still see major gaps in the data that supply chain participants can reliably exchange. It bears repeating that only incremental strides have been made over the past 60 years, and a modern, adaptable, and, perhaps most importantly, affordable solution remains elusive for the industry as a whole.


Data is an acknowledged issue here, but the unspoken, more specific concerns are archaic systems and data pipelines. It costs too much to rebuild system architecture, leading many organizations to opt for bolt-on solutions rather than addressing core infrastructure.


 

Key Barriers in Data Standardization and Integration


Imagine, if you will: the year is 1950. Goods shipped internationally are loaded into sacks, barrels, and crates – each differing in size and weight, making loading and unloading 1) difficult and 2) time-consuming. Enter Malcolm McLean – the father of modern containerization. In 1956, McLean introduced a standardized size and shape for international shipping containers. Effectively, he personally ushered in a new standard for global trade and cut the time to load and unload international shipments from a matter of weeks to a matter of hours. Today, nearly 80% of goods worldwide move across the Earth’s oceans in shipping containers born of McLean’s vision.



You may ask - what does this have to do with MDM? The story illustrates that despite the seemingly infinite nature of global supply chains, consensus building and standardization are possible - and McLean accomplished it prior to the advent of the internet and modern communication technology. Fast forward to today, and despite those revolutions in communication, international supply chains still operate in silos, with global standards for how data and people interact seemingly far away.


Why haven’t we achieved a similar breakthrough in supply chain data management? Technologies like EDI connections and APIs, though expensive for smaller supply chain participants, are steps toward global unification. However, broader resistance to change and unification remains, as few suppliers are incentivized to adhere to one customer’s requirements or suggestions. This leaves us with fragmented, disparate systems that are unable to properly communicate and unlock value.


McLean’s revolution worked because there was only one viable standard. Today, we have 30 competing systems, each offering distinct benefits. How do we choose a solution that provides the necessary features while ensuring seamless, secure, and private data sharing? This question remains unresolved, but we believe the winners in this space will be those who successfully address this challenge – and we may already be seeing some early answers.


 

Emerging Solutions and Their Challenges


As we’ve seen across different industries (and have written about previously), open source technologies have revolutionized the way employees, companies, and consumers work, live, and collaborate. As we see it, ubiquitous open-source supply chain planning software could provide gateways to better collaboration and, over time, a potential solution to our MDM conundrum. Additionally, startups in the space can begin to explore freemium models to achieve peak virality, incentivizing participants to coalesce around standardized data to maximize ROI. 


Another thought: we believe one effective way to wrangle the master data problem could be grassroots efforts at universities and large organizations to better educate supply chain talent. This way, use cases are better understood and innovation can be achieved more scalably. After all, supply chain technology is unique, and its best programmers, data engineers, and developers have a very thorough understanding of the industry, its participants, and their needs.


Other industries have democratized access and improved data quality by standardizing data pipeline rails, getting insights closer to the point of creation, using common data stores that play nicely with other internal systems, and allowing for the quick ingestion and transformation of that data so it is usable by all functions of the business. When we think ten years ahead, we believe history will favor the founders and adopters who lead the way toward a more democratized supply chain, potentially following the pathway paved by many software and AI startups today.
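
As a rough sketch of what those "rails" might look like in practice, the snippet below follows a simple ingest-validate-load pattern into a common store. The canonical field set and the in-memory "store" are stand-ins we chose for illustration.

```python
import json
from datetime import datetime, timezone

# Hypothetical canonical fields for the shared data store; in practice these
# would be agreed with partners and internal teams, not chosen unilaterally.
CANONICAL_FIELDS = {"sku", "qty", "location", "event_time"}

def ingest(raw_line: str) -> dict:
    """Parse an inbound record as close to its point of creation as possible."""
    record = json.loads(raw_line)
    record.setdefault("event_time", datetime.now(timezone.utc).isoformat())
    return record

def validate(record: dict) -> dict:
    """Reject records that cannot be expressed in the canonical schema."""
    missing = CANONICAL_FIELDS - record.keys()
    if missing:
        raise ValueError(f"record missing canonical fields: {missing}")
    return record

def load(record: dict, store: list) -> None:
    # Stand-in for writing to a common data store that other internal
    # systems (planning, procurement, analytics) can read directly.
    store.append({k: record[k] for k in CANONICAL_FIELDS})

store: list = []
load(validate(ingest('{"sku": "A-100", "qty": 12, "location": "DC-04"}')), store)
print(store)
```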


But even within an organization, achieving consistent data flow across teams is a significant challenge, particularly as organizations scale. Teams often rely on different systems (e.g., ERP systems like SAP, Oracle, Acumatica, Plex Systems, and Blue Yonder), typically chosen based on leaders’ prior experience and preferences.


This fragmentation creates additional barriers to effective MDM within the organization itself. Efforts to integrate disparate systems often result in the same "whack-a-mole" syndrome observed at the broader supply chain level. Even when an organization commits to standardization, the process can take years to materialize due to resistance from teams unwilling to abandon their established workflows and alter how they operate. 


Additionally, many large organizations opt for custom-built systems over third-party solutions like Kinaxis. For instance, in our experience with enterprises such as Flipkart, most teams prioritize building proprietary tools tailored to their unique requirements, citing the need for customization and data privacy. This approach frequently exacerbates the problem of data inconsistency, as suppliers are left scrambling to accommodate the distinct data requirements of multiple clients, further destabilizing standardization efforts.


Another significant hurdle lies in the skill levels of staff at all levels of supply chain operations. Many advanced tools and platforms demand a steep learning curve, yet employees often lack the necessary training or expertise to navigate these systems effectively. To address this gap, organizations frequently resort to simpler data standards that can be managed by less-experienced staff, inadvertently deprioritizing long-term standardization needs. 


Small and medium-sized supply chain businesses, in particular, often operate in a reactive, "firefighting" mode, focusing on short-term survival rather than investing in long-term improvements to data quality and standards. This tendency further undermines efforts to create cohesive, standardized supply chain ecosystems. Introducing new tools may seem like an effective strategy, but many employees have the systems they use daily memorized, and as noted, change is often slow and stubborn - the implementation of new solutions must not feel overbearing to the user.


 

Path Forward: Strategic Recommendations


Shaping the future of interoperable commerce requires leadership. To that end, it is paramount that leaders in the industry – including ourselves and other VCs – find and fund the companies, founders, and ideas that will shape the future of interoperable supply chains. Companies with sufficient scale in the space should also exert their influence to help the industry coalesce around data standards and systems, leading to a future where the MDM conundrum becomes a relic of a bygone era.

One potential path toward standardization is the promotion of open-source or freemium models in the supply chain planning and supply chain technology space, as previously mentioned. Owners and managers can drive adoption by natively embedding these technologies into operations and building on communally determined standards for data presentation and storage.


Another potential solution is the rise of vertically integrated supply chain AI companies that can clean and transform disparate data into standardized formats. However, much like the containerization revolution ushered in by McLean, this innovation will only be possible once a universal standard is established across the industry. Without a common framework, even the most advanced AI systems risk being undermined by the same fragmentation that plagues current systems. 
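
To make this idea a bit more tangible, here is a heavily simplified sketch of the kind of field-mapping step such an AI company might automate. We use plain fuzzy string matching as a stand-in for a learned model; the standard field names, the cutoff, and the review step are all our own assumptions.

```python
from difflib import get_close_matches

# Target fields from a hypothetical shared standard (illustrative only).
STANDARD_FIELDS = ["purchase_order_id", "ship_date", "carrier_name", "gross_weight_kg"]

def suggest_mapping(source_fields: list[str], cutoff: float = 0.4) -> dict:
    """Propose source-to-standard field mappings.

    In a production system, low-confidence or unmatched fields would be
    routed to a human reviewer rather than silently dropped.
    """
    mapping = {}
    for field in source_fields:
        match = get_close_matches(field, STANDARD_FIELDS, n=1, cutoff=cutoff)
        mapping[field] = match[0] if match else None
    return mapping

# One supplier's ad hoc column names, mapped toward the shared standard.
print(suggest_mapping(["po_id", "shipdate", "carrier", "weight_kg"]))
```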


Different supply chain processes will naturally require tailored standards. For example, the data requirements for procurement differ significantly from those for freight tracking. To address this complexity, we foresee these vertical AI companies developing specialized models for each standard, enabling a more targeted and efficient approach to data standardization. The potential impact of such a solution is immense. By integrating advanced AI capabilities with clearly defined standards, the industry could achieve not only greater interoperability but also enhanced agility.
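
One way to picture the "specialized model per standard" idea is a simple registry that routes each record to the normalizer for its domain. The domain names and toy normalizers below are purely illustrative assumptions, not a description of any existing product.

```python
# Toy per-domain normalizers; real systems would plug in trained models here.
def normalize_procurement(record: dict) -> dict:
    return {"po_number": record.get("po_no"), "supplier_id": record.get("vendor")}

def normalize_freight(record: dict) -> dict:
    return {"shipment_id": record.get("bol"), "carrier": record.get("scac")}

# Registry mapping each supply chain domain to its own standard and model.
DOMAIN_MODELS = {
    "procurement": normalize_procurement,
    "freight_tracking": normalize_freight,
}

def standardize(record: dict, domain: str) -> dict:
    try:
        model = DOMAIN_MODELS[domain]
    except KeyError:
        raise ValueError(f"no standard defined for domain: {domain}")
    return model(record)

print(standardize({"po_no": "PO-7781", "vendor": "SUP-042"}, "procurement"))
```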


 

Conclusion


The challenges posed by MDM underscore the high-stakes nature of supply chain operations. Without effective solutions, minor inefficiencies cascade into significant disruptions, and bottlenecks beget other bottlenecks, stalling entire supply chains and eroding business performance.


To address these challenges, collaboration, innovation, and a willingness to embrace new paradigms are paramount. By prioritizing standardization, fostering creative solutions, and leveraging advanced technologies like AI, the industry can overcome its current limitations. 


Now is the time for stakeholders across the supply chain to work together and pave the way for a more resilient, efficient, and interoperable future. Now is the time for startups to find their footing and develop bold, innovative solutions that can change the face of the industry at large.

Another big thank you to the adopters who were kind enough to provide the inspiration for this post.


Please reach out to us if you’d like to speak more about supply chains, interoperability, and data pipelines or if you want to share any interesting ideas you may have yourself.





