Information flow

From the June 2020 print edition

Technology has advanced at an ever-increasing rate. This has put a vast amount of data at the fingertips of supply chain professionals, leading to the rise of so-called big data. The digital era’s increase in data generation and availability is a boon to the supply chain field, as it provides insights not previously available. But rapid data production also raises challenges: how to use all the information, what technology to leverage in doing so, what roadblocks exist to data collection and analysis, and where to seek the information needed to get started.

According to Jon Trask, CEO and founder of Blockchain Guru, big data is a combination of data (structured, semi-structured and unstructured) that an organization collects, which can then be mined for usable information. Organizations can employ the data in machine learning projects, predictive modelling and other advanced analytics applications.

Systems that process and store big data have become a common feature of many organizations’ data management architectures, Trask says. Big data is also often characterized by the “3Vs,” he notes: the wide variety of data types, and the volume and velocity at which the data is generated, collected and processed. Other “Vs” have been added recently, including veracity, value and variability.

“Although big data doesn’t equate to any specific volume of data, big data deployments often involve terabytes (TB), petabytes (PB) and even exabytes (EB) of data captured over time,” he says.

Companies and their supply chains gain an advantage by properly using advancements in technology that capture big data, as well as developments in data science (AI and machine learning, for example) that analyze big data, Trask says. For example, he notes, data science and advanced analytics can help uncover correlations and trends, potentially ahead of the competition. They can also give insights into opportunities; improve planning, forecasting and decision-making; help supply chain professionals secure the most critical competitive growth strategies; and automate data analysis to save time, resources and effort.

Trask notes that the advanced analytics of big data can also:

  • minimize ETD and ETA prediction errors, enable smart inventory management and category planning;
  • increase end-to-end coordination of the purchasing value chain and route optimization;
  • improve supplier selection and distribution channel efficiency;
  • optimize price points and deliver end-to-end cost reductions; and
  • help detect fraud and counterfeiting while improving risk management and quality.

“Understand the power of emerging technology – IoT as a data creation tool, blockchain as a data storage and sharing tool and AI or BI software to aid in transitioning data to improve predictions and forecasts,” Trask advises those looking to leverage big data analytics. “The supply chain world has changed significantly in the past few years. It’s harder for a supply chain professional to keep abreast of these changes and understand how to use emerging technology.”

The role of data
Data is a key ingredient to supply chain success for several reasons, says Josh Levin, senior director, supply chain at Synovos. Strong data allows practitioners to negotiate better with suppliers, identify cost-savings opportunities and focus standardization efforts on the areas with the greatest potential impact on an organization. “The more reliable the data set, the more we understand where the spend is going and how it’s being used,” Levin says. “It creates much larger cost-savings opportunities.”

Data collection and analytics mean better planning, forecasting and even automation of procurement, Levin notes, freeing up time to search out cost-savings opportunities. For example, data analytics can reveal that a part is underperforming its life expectancy, Levin says. It’s then possible to make recommendations about changing the source of the part or switching to an alternative. “The client can also consider changing its processes or equipment,” Levin says. “But it all starts with data integrity.”

The bigger the data set, the better the understanding of actual inventory usage, Levin adds. Accurate data can help establish minimum and maximum inventory levels and track usage.
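One simple way usage data translates into min/max levels is a reorder-point calculation: cover expected demand over the replenishment lead time, add safety stock for variability, and cap at one review period of extra demand. The sketch below is a hypothetical illustration (the formula choice, service factor and figures are assumptions, not Levin’s method):

```python
# Hypothetical sketch: deriving min/max inventory levels from daily
# usage history. Formula and numbers are illustrative only.

import statistics

def min_max_levels(daily_usage, lead_time_days, review_days, z=1.65):
    """Min = mean demand over lead time + safety stock (z * stdev * sqrt(LT)).
    Max = min + mean demand over one review period."""
    mean = statistics.mean(daily_usage)
    stdev = statistics.stdev(daily_usage)
    safety = z * stdev * lead_time_days ** 0.5
    minimum = mean * lead_time_days + safety
    maximum = minimum + mean * review_days
    return round(minimum), round(maximum)

usage = [12, 9, 15, 11, 13, 10, 14]  # units consumed per day, one week
print(min_max_levels(usage, lead_time_days=5, review_days=7))
```

Longer usage histories and a service factor (`z`) tuned to the item’s criticality would make the levels more robust; the point is that both numbers fall directly out of the usage data.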

In turn, this allows supply chain professionals to have the right materials on hand at the right time and to plan the procurement lifecycle with far more confidence.

Levin recommends maintaining data integrity through best practices. It’s possible to have three years’ worth of data in a system but no useful information such as manufacturer names, part numbers or proper units of measure, he says. Making sure that fundamental information is available is key, as is thinking strategically about data, Levin says. With proper information, a purchase at one site that seems insignificant can be seen to affect the entire organization.

“That includes being able to manage materials across different locations from the same data set,” he says. “It leads to stronger supply chain strategy and better inventory strategy.”

Big data’s importance to supply chain has grown as the number of distinct parts (SKUs) grows and operations spread globally, says Rodrigo Altaf, procurement analyst and category manager at Vale, a mining and metals company. By analyzing the consumption of each SKU per location over time, it’s possible to estimate demand more accurately and therefore plan deliveries, Altaf says. At the same time, big data can help paint a picture of clients’ consumption patterns, aiding supply professionals in planning production, shipping and warehousing activities accordingly and helping to avoid shortages or surpluses.
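The per-SKU, per-location analysis Altaf describes can be sketched very simply: group consumption records by (SKU, location) and use average historical consumption as a first demand estimate. The records, SKU codes and location names below are hypothetical, invented for illustration:

```python
# Hypothetical sketch: estimating demand per SKU and location from
# consumption records. All data below is invented for illustration.

from collections import defaultdict

records = [
    # (sku, location, month, units consumed)
    ("PUMP-01", "Site A", "2020-01", 40),
    ("PUMP-01", "Site A", "2020-02", 44),
    ("PUMP-01", "Site A", "2020-03", 42),
    ("VALVE-07", "Site B", "2020-01", 10),
    ("VALVE-07", "Site B", "2020-02", 14),
]

# Group consumption history by (SKU, location).
history = defaultdict(list)
for sku, location, _month, units in records:
    history[(sku, location)].append(units)

# Average monthly consumption serves as a naive demand forecast.
forecast = {key: sum(v) / len(v) for key, v in history.items()}
print(forecast[("PUMP-01", "Site A")])  # expected monthly demand, units
```

A production system would replace the simple average with trend- or seasonality-aware forecasting, but even this grouping makes shortages and surpluses visible per location rather than only in aggregate.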

“This improves the cashflow and generates savings,” he says. “Outbound logistics can also be optimized by providing more accurate information to the product carriers, which will help them plan their activities to better serve their customers.”

Altaf recommends investing in the quality and relevance of information that’s gathered before making any decisions. It’s also important to validate and align with end users and vendors, he adds.

“Have someone treating data who doesn’t necessarily perform the sourcing activity but is 100-per-cent dedicated to analyzing data and to verifying the software is functioning well,” Altaf says.

“Make sure the experience is shared between several category managers – the information may not be treated the same way across all categories, but the routine of analyzing the data can be rolled out across most categories.”