How do you spell ‘data’ backwards?
From the August 2023 print edition
Right off the top of your head, why is data important? Like answering the question ‘why is there air?’, answering the above probably took you less than a second to come up with at least one, if not many, answers.
So, here is a tougher question. Why is extracting data and turning it into actionable knowledge challenging? And why are we consistently finding ourselves in a reactive (or descriptive) state, playing catch-up rather than being proactively prescriptive?
Descriptive versus prescriptive?
In a recent article for a German publication, I wrote the following: “Descriptive analytics tell you what happened or is happening, while prescriptive analytics take risk management intelligence to the next level by telling you how to address a problem. It is an evolution of intelligence gathering, analysis, and optimized responses.”
The distinction above is more than wordplay or semantics. Even so, it is easy to use the words ‘descriptive’ and ‘prescriptive’ interchangeably.
For example, when it comes to responding to volatility in the supply chain, one reader observed:
“I am not sure what is talked about in the blog post are prescriptive analytics. Prescriptive analytics tell you what to do and are essentially automating response with a human validation.
The blog post, as far as I can see, is talking about enriching real-time data with more insights to make a better call, the industry standard since a couple of years.”
Given our traditional view of business intelligence and digital transformation, it was a reasonable point to raise. In other words, automation – particularly artificial intelligence (AI) – should be about enhancing decision-making and not simply providing data faster.
A top 3PL executive, Shira Yoskovitch, expressed a similar view when I talked to her about what happened with Peloton during the height of the pandemic. When the issue regarding the timely availability of data came up, she said the problem wasn’t that they didn’t have the information to see what was coming. The problem was that they didn’t know how to read it.
Once again, access to timely data, which was a very real problem in the past, is no longer an issue with today’s advanced digital and AI technologies.
In short, we now have the data. In some cases, we have an over-abundance of data. What we need is the ability to sift through that information and move beyond knowing what is happening to knowing what we can do about it.
Deep Blue and Garry Kasparov
In my response to the reader, I assured them that I believe we are saying the same thing regarding automating response with human validation: descriptive analytics tell you what happened or is happening, while prescriptive analytics take risk management intelligence to the next level by telling you how to address a problem.
I provided the following case study to demonstrate the evolution of descriptive data capture and availability to prescriptive intelligence gathering, analysis, and optimized responses beyond a conceptual understanding.
When I developed one of the industry’s first algorithm-driven web-based procurement systems for the Department of National Defence in the late 1990s and early 2000s, the buyer entered the weighted parameters for a specific purchase based on policies and priorities.
These weighted parameters could be changed to reflect an immediate change in the situation. At that point, the algorithms would instantly “recalculate” a different solution or answer to address the new scenario.
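A minimal sketch of how such weighted-parameter scoring might work. The criteria, weights, and function names here are illustrative assumptions, not details of the original Department of National Defence system; the point is only that changing the weights instantly produces a different ranked answer.

```python
def score_bid(bid, weights):
    """Weighted sum of normalized criteria (higher is better)."""
    return sum(weights[criterion] * value for criterion, value in bid.items())

def rank_bids(bids, weights):
    """Return (supplier, criteria) pairs sorted best-first.
    Re-running with new weights 'recalculates' the answer."""
    return sorted(bids.items(),
                  key=lambda item: score_bid(item[1], weights),
                  reverse=True)

# Hypothetical normalized scores (0-1) for two competing suppliers.
bids = {
    "Supplier A": {"price": 0.9, "delivery_speed": 0.4, "quality": 0.8},
    "Supplier B": {"price": 0.6, "delivery_speed": 0.9, "quality": 0.7},
}

# Peacetime priorities: price dominates.
weights = {"price": 0.6, "delivery_speed": 0.1, "quality": 0.3}
print(rank_bids(bids, weights)[0][0])  # Supplier A

# A disruption hits: the buyer re-weights toward delivery speed,
# and the same algorithm immediately returns a different winner.
weights = {"price": 0.2, "delivery_speed": 0.6, "quality": 0.2}
print(rank_bids(bids, weights)[0][0])  # Supplier B
```

The buyer never touches the algorithm itself; only the weighted parameters change, which mirrors the human-sets-policy, machine-recalculates division of labour described above.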
No matter how advanced AI becomes, you still need effective intelligence gathering – including accurate data and human analysis to assess the program’s output to achieve optimized outcomes.
In this context, I likened it to the interplay between chess grandmaster Garry Kasparov and IBM’s Deep Blue “technology.”
A progressive state of interplay
The first time Kasparov played Deep Blue was in a six-game match in 1996. Kasparov beat the computer four games to two.
For the next match in 1997, IBM upgraded Deep Blue. The result was that the machine defeated the man, winning two games, drawing three, and losing one, for a 3½–2½ victory.
In recounting his experience, Kasparov wrote, “While writing the book, I did a lot of research – analyzing the games with modern computers, also soul-searching – and I changed my conclusions.
I am not writing love letters to IBM, but my respect for the Deep Blue team went up, and my opinion of my own play, and Deep Blue’s play, went down.”
As you read the above quote from Kasparov’s book, what stood out the most? It was his revelation that neither the Deep Blue technology itself nor his own chess IQ was a match for the IBM team behind the machine.
When it comes to AI and data analytics, it is not a machine-versus-human or human-versus-machine scenario. It is a collaborative approach to intelligence gathering and analysis to enable organizations to identify and implement optimized responses. This ability is at the heart of a prescriptive analytics approach to supply chain volatility.
Turning your ‘ataD’ into Data
The main takeaway from this article is that data analytics and procurement spend are not static information meant simply to help you buy better. They are not about contract compliance or budgeting. They are not really about money at all. OK, maybe they are a little bit.
But it is really about what you achieve or accomplish with the dollars you spend in peaceful times, along with times of great volatility and uncertainty.
To put it another way, what good is a consolidated view of your spend alone if the goods or services you procure are not being delivered because of a disruption? Think of the example of Peloton.
When you think of data analytics, AI and consolidated spend, you must always have the end game or result in mind; otherwise, what is the point?