12 November 2020 - by Trevor Miles
The quest for resilient supply chains is not new. In fact, it is relatively easy to go back to the early 2010s and beyond and find a lot of material on the topic of supply chain resiliency. Yossi Sheffi
wrote an article titled, “Building a Resilient Supply Chain”, which appeared in the Sloan Management Review of October 2005. It is a précis of his book, “The Power of Resilience”.
Of course, we need to start with a definition of resiliency as used in this context. A quote often attributed to Darwin captures it well, and there is a strong link between it and the sub-heading of Sheffi’s book:
“It is not the strongest of the species that survive, nor the most intelligent, but the one most responsive to change.”
The question is how. Yossi Sheffi breaks it down into two alternatives:
1. Redundancy: the use of high safety stock and spare capacity. Of course, this is all very expensive, and not necessarily that effective.
2. A “control system”: the ability to detect disruptions (visibility) and correct for them (planning), rather than absorbing them with buffers.
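To make the cost of redundancy concrete, here is a small sketch using the textbook safety-stock formula, not anything from Sheffi's article; the demand and lead-time numbers are purely hypothetical:

```python
import math
from statistics import NormalDist

# Illustrative safety-stock sizing (standard textbook formula):
#   SS = z * sigma_daily_demand * sqrt(lead_time_days)
def safety_stock(service_level, sigma_daily_demand, lead_time_days):
    z = NormalDist().inv_cdf(service_level)  # z-score for the target cycle service level
    return z * sigma_daily_demand * math.sqrt(lead_time_days)

# Hypothetical numbers: demand std-dev of 50 units/day, 14-day lead time.
low = safety_stock(0.90, 50, 14)    # 90% cycle service level
high = safety_stock(0.99, 50, 14)   # 99% cycle service level
print(round(low), round(high))
```

The last few percentage points of service are disproportionately expensive in stock, which is one reason pure redundancy is a costly route to resilience.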
However, recent events, particularly the COVID-19 crisis, have brought the topic back to the forefront, particularly the “control system” option, otherwise known in supply chain circles as visibility (detection) and planning (correction). These are crucial for avoiding the use of redundancy to achieve resilience.
Tim Payne at Gartner has been discussing resilient supply chain planning for a little while now. In an article titled, “Mastering Uncertainty: The Rise of Resilient Supply Chain Planning”, Tim states that:
“SCP has been driven by a deterministic planning paradigm since the invention of MRP.”
While he refers to MRP, the basis of nearly all planning tools, as being deterministic, he doesn’t explicitly mention that none of the existing planning tools can address the need for stochastic planning, which models the uncertainty directly. Without modeling stochasticity, there is little chance of developing a resilient plan. Another dirty little secret he does not mention is that all existing tools base the plan on master data parameters loaded from ERP systems. The quality of that master data is very questionable in many cases, so garbage in, garbage out, compounding the poor quality of the plan.
Let me be clear, range forecasting is not the same as stochastic planning. Undoubtedly, range forecasting is better than a one-number forecast, but it completely misses the uncertainty on the supply side, which in turn gives a false sense of confidence to the supply chain organization that the range forecast can be achieved.
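A toy simulation can illustrate the point. The distributions below are purely hypothetical assumptions, but they show how a demand-only range forecast overstates feasibility once supply-side variability (yield losses, late deliveries) is sampled as well:

```python
import random

random.seed(42)

# A range forecast quantifies demand uncertainty only. Here we also sample
# supply-side variability to show how much feasibility it can erase.
# All distributions and numbers are illustrative, not from the article.
N = 10_000
ok_demand_only = 0   # scenarios feasible if only demand varies
ok_both_sides = 0    # scenarios feasible when supply varies too
for _ in range(N):
    demand = random.gauss(1000, 100)        # demand uncertainty (the "range")
    capacity = 1100                          # planned supply, treated as certain
    if capacity >= demand:
        ok_demand_only += 1
    # Supply uncertainty: yield losses / late deliveries shrink usable supply.
    usable = capacity * random.uniform(0.85, 1.0)
    if usable >= demand:
        ok_both_sides += 1

print(ok_demand_only / N, ok_both_sides / N)
```

The demand-only view reports a comfortable success rate; adding even modest supply-side variation drops it sharply, which is exactly the false confidence described above.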
Because traditional planning tools are deterministic, they cannot give a risk estimate associated with a plan. He brings this up indirectly by referring to the need to deploy new tools to achieve resiliency; ergo, the existing tools cannot get us there.
Enter LOP.ai. The first thing that LOP.ai does is crawl through many transactions to determine the current demonstrated flows and performance of the supply chain. This is in sharp contrast to the use of ERP master data by other planning tools. In fact, LOP.ai goes beyond traditional process mining tools such as Celonis, UIPath, Signavio, Minit and others by using probabilistic matching to link material flows across the end-to-end supply chain. This also opens up the primary value of process mining, namely Business Process Improvement (BPI), including compliance and conformance.
The diagram above clearly shows the flow of materials through two different production lines at a facility before flowing into the distribution process at the bottom of the diagram (*). There is no limit to the number of supply chain variants that LOP.ai can capture.
(*) The names of the facilities and the products are blocked out for reasons of confidentiality.
Of course, nearly everyone is interested in performing value stream analysis of their supply chains. The diagram below shows a multi-stage supply chain, where stages are marked by the darker vertical line, in which the process (PT), wait (WT), and transit (TT) times have been analyzed over a given period. The ladder diagram below, common to value stream mapping, is a great way of visualizing where process improvements can be achieved. Notice also that each of the times is plotted as a box plot to give a sense of variability.
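As a sketch of how the box plots in such a ladder diagram are derived, the five-number summary for each time category can be computed from event-log durations. The figures below are made-up examples, not data from the diagram:

```python
from statistics import quantiles

# Hypothetical event-log durations (hours) for one supply chain stage;
# the ladder diagram renders each of these samples as a box plot.
process_times = [4.1, 3.8, 5.0, 4.4, 6.2, 3.9, 4.7, 5.5, 4.0, 4.3]   # PT
wait_times = [12.0, 30.5, 18.2, 45.0, 22.1, 16.8, 28.0, 35.4, 20.3, 25.7]  # WT

def box_stats(sample):
    """Five-number summary used to draw one box: min, Q1, median, Q3, max."""
    q1, median, q3 = quantiles(sample, n=4)  # quartiles (exclusive method)
    return min(sample), q1, median, q3, max(sample)

for name, sample in [("PT", process_times), ("WT", wait_times)]:
    lo, q1, med, q3, hi = box_stats(sample)
    print(f"{name}: min={lo} Q1={q1:.2f} med={med:.2f} Q3={q3:.2f} max={hi}")
```

Comparing the boxes makes the improvement target obvious: in this made-up stage, wait time dwarfs process time and varies far more, so that is where to attack first.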
While time is the obvious variable to capture, cost, quality, and other variables can also be captured.
While important, BPI does not get my juices flowing. Now for the part that excites me: The digital supply chain twin built up through the first stage of process mining can now be used for stochastic planning. As described by Tim Payne, this is the use of Monte Carlo simulation to run thousands of scenarios that sample from both the demand and supply side distributions of key variables to arrive at a Probability-to-Execute (PTE) score, which measures the probability that the goal can be achieved. In this case, the goal is a 90% customer service level. The PTE score is 97%, so the expected customer service level is around 87%. However, there is quite a high likelihood (79.5%) that the target customer service level will not be achieved, the average impact being 3.8%.
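LOP.ai’s actual model is not public, but a PTE-style score can be sketched as a minimal Monte Carlo loop that samples hypothetical demand-side and supply-side distributions and counts the scenarios that hit the service-level goal. Every distribution and number below is an illustrative assumption:

```python
import random

random.seed(7)

# Minimal Monte Carlo sketch of a Probability-to-Execute (PTE) style score.
# All distributions are illustrative assumptions, not LOP.ai's model.
TARGET_SERVICE = 0.90   # goal: 90% customer service level
N = 10_000

hits = 0
shortfalls = []
for _ in range(N):
    # Sample demand-side and supply-side variables for one scenario.
    demand = max(1.0, random.gauss(1000, 120))   # units ordered
    capacity = random.gauss(1050, 60)             # units producible
    on_time = random.betavariate(9, 1)            # share delivered on time
    served = max(0.0, min(demand, capacity)) * on_time
    service_level = served / demand
    if service_level >= TARGET_SERVICE:
        hits += 1
    else:
        shortfalls.append(TARGET_SERVICE - service_level)

pte = hits / N
avg_impact = sum(shortfalls) / len(shortfalls) if shortfalls else 0.0
print(f"PTE = {pte:.1%}, average shortfall when target is missed = {avg_impact:.1%}")
```

The score is simply the fraction of sampled scenarios that achieve the goal, and the average shortfall quantifies the impact when it is missed, which is the kind of risk estimate a deterministic plan cannot provide.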
That was a lot of numbers, so let’s focus on action. Are you interested in seeing how resilient planning could be a game changer for your company? Learn more about the rise of resilient planning here, or reach out via firstname.lastname@example.org and let’s connect soon!