Do Your Big Data Analytics Measure Or Predict?
- by 7wData
Organizations have key business processes that they are constantly trying to re-engineer. These key business processes – loan approvals, college applications, mortgage underwriting, product and component testing, credit applications, medical reviews, employee hiring, environmental testing, requests for proposals, contract bidding, etc. – go through multiple steps, usually involving multiple people with different skill sets, with a business outcome at the end (accept/reject, bid/no bid, pass/fail, retest, reapply, etc.).
And while these processes typically include “analytics” that report on how well the processes worked (process effectiveness), the analytics only provide an “after the fact” view on what happened.
Instead of using analytics to measure how well the process worked, how about using predictive and prescriptive analytics to actually direct the process at the beginning? Instead of analytics that tell you what happened, how about creating analytics at the front of the process that predict what steps in the process are necessary and in what order? Sometimes the most effective process is the process that you don’t need to execute, or only have to execute in part.
High-Tech Manufacturing Testing Example
We had an engagement with a high-tech manufacturer to help them leverage analytics and the Internet of Things to optimize their 22-step product and component testing process. Not only was a significant amount of capital tied up in their in-process inventory, but the lengthy testing processes also created concerns about excess and obsolete inventory in an industry where product changes happened constantly.
The manufacturer had lots of data coming off the testing processes, but that data was only being used after the fact to tell them where testing had succeeded or failed. Instead, our approach was to use that data to predict which tests needed to be run for which components coming from which of the suppliers’ manufacturing facilities. Rather than measuring what happened and identifying waste and inefficiencies after the fact, the manufacturer wanted to predict the likely quality of each component (given the extensive amount of data they could be capturing but that was, at the time, hitting the manufacturing and testing floors) and identify which tests were needed in that particular situation. Think dynamic, or even smart, testing.
We worked with the client to identify all the data coming out of the different testing processes. We discovered that nearly 90% of the potential data was just “hitting the floor” because the organization had no method for capturing and subsequently analyzing it (most of it was either very detailed log files, or comments and notes generated by the testers, engineers and technicians during the testing processes). At a conceptual level, their processing looked like Figure 1: traditional dashboards and reports that answered the basic operational questions about how the process was working.
However, we wanted to transform the role of analytics: not just report how the processes were working, but employ predictive analytics to create a score for each component, and then use prescriptive analytics to recommend which tests had to be run, and in what order, given the predictive score.
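The predict-then-prescribe flow described above can be sketched in a few lines of Python. This is a minimal illustration, not the client's actual system: the feature names, score thresholds, and shortened test plans are all hypothetical stand-ins (a real deployment would use a trained model rather than the toy scoring function here).

```python
# Predictive step: score each component's likely quality.
# Prescriptive step: map the score to a (possibly reduced) test plan.
from dataclasses import dataclass

# Hypothetical 22-step test plan, mirroring the article's example.
FULL_TEST_PLAN = [f"test_{i:02d}" for i in range(1, 23)]

@dataclass
class Component:
    supplier: str
    facility: str
    historical_failure_rate: float  # fraction of past failures for similar lots

def predict_quality_score(component: Component) -> float:
    """Toy stand-in for a trained model: higher score = higher predicted quality."""
    return 1.0 - component.historical_failure_rate

def prescribe_tests(score: float) -> list[str]:
    """Prescriptive rule (illustrative thresholds): better-scoring
    components earn a shorter test plan."""
    if score >= 0.95:
        return FULL_TEST_PLAN[:5]   # spot-check only
    if score >= 0.80:
        return FULL_TEST_PLAN[:12]  # partial plan
    return FULL_TEST_PLAN           # full 22-step plan

good_lot = Component("supplier_a", "facility_1", historical_failure_rate=0.02)
plan = prescribe_tests(predict_quality_score(good_lot))
```

The point of the design is that the analytics run *before* the process: the score decides which of the 22 steps actually execute, rather than a dashboard reporting afterwards on all 22.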
We used a technique called the “By Analysis” to brainstorm the “variables and metrics that might be better predictors of testing performance”. For example, when examining components with a high failure rate, we started the brainstorming process with the following question:
“Show me the percentage of component failures by…”
We asked the workshop participants to brainstorm the “by” variables and metrics that we might want to test.
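Each brainstormed “by” variable can then be tested directly against historical data. Below is a minimal sketch of that check, assuming a simple flat record layout with made-up suppliers and results; it computes the percentage of component failures by whichever variable the workshop proposed.

```python
# "Show me the percentage of component failures by..." as a group-by count.
from collections import Counter

# Hypothetical historical test records.
records = [
    {"supplier": "A", "shift": "day",   "result": "fail"},
    {"supplier": "A", "shift": "night", "result": "pass"},
    {"supplier": "B", "shift": "day",   "result": "pass"},
    {"supplier": "B", "shift": "night", "result": "fail"},
    {"supplier": "B", "shift": "night", "result": "fail"},
]

def failure_pct_by(records: list[dict], by: str) -> dict[str, float]:
    """Failure percentage grouped by the candidate 'by' variable."""
    totals, failures = Counter(), Counter()
    for r in records:
        totals[r[by]] += 1
        if r["result"] == "fail":
            failures[r[by]] += 1
    return {k: 100.0 * failures[k] / totals[k] for k in totals}

by_supplier = failure_pct_by(records, "supplier")
by_shift = failure_pct_by(records, "shift")
```

Variables whose groups show sharply different failure percentages are the ones worth feeding into the predictive model; variables whose groups all look alike can be dropped.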