The importance of monitoring machine learning models
- by 7wData
Agile development teams must ensure that microservices, applications, and databases are observable, have monitoring in place to identify operational issues, and use AIops to correlate alerts into manageable incidents. When users and business stakeholders want enhancements, many devops teams follow agile methodologies to process feedback and deploy new versions.
Even if there are few such requests, devops teams know they must upgrade apps and patch underlying components; otherwise, the software developed today will become tomorrow’s technical debt.
The life-cycle management of machine learning models is more complex than that of software. Andy Dang, cofounder and head of engineering at WhyLabs, explains, “Model development life cycle resembles software development life cycle from a high level, but with much more complexity. We treat software as code, but data, the foundation of an ML model, is complex, highly dimensional, and its behavior is unpredictable.”
In addition to code, components, and infrastructure, models are built using algorithms, configuration, and training data sets. These are selected and optimized at design time but need updating as assumptions and the data change over time.
Like monitoring applications for performance, reliability, and error conditions, machine learning model monitoring gives data scientists visibility into model performance. ML monitoring is especially important when models are used for predictions or when they run on datasets with high volatility.
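To make that concrete, here is a minimal, illustrative drift check (not from the article) using the population stability index (PSI), a common statistic for comparing a live feature's distribution against its training baseline. It assumes NumPy and synthetic data; the 0.1/0.2 thresholds are conventional rules of thumb, not hard limits.

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """Compare a live sample of a feature against its training baseline.

    PSI below ~0.1 is usually read as stable; above ~0.2 signals drift.
    """
    expected = np.asarray(expected, dtype=float)
    # Bin edges come from the training (expected) distribution; clip live
    # values into that range so every value lands in a bin.
    edges = np.histogram_bin_edges(expected, bins=bins)
    actual = np.clip(actual, edges[0], edges[-1])
    exp_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    act_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Floor the proportions to avoid log(0) and division by zero.
    exp_pct = np.clip(exp_pct, 1e-6, None)
    act_pct = np.clip(act_pct, 1e-6, None)
    return float(np.sum((act_pct - exp_pct) * np.log(act_pct / exp_pct)))

rng = np.random.default_rng(0)
train_feature = rng.normal(0.0, 1.0, 10_000)  # distribution at training time
live_feature = rng.normal(0.5, 1.0, 10_000)   # live traffic has shifted
print(population_stability_index(train_feature, live_feature))
```

A monitoring job would run this per feature on a schedule and raise an alert when the PSI crosses the team's chosen threshold.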
Dmitry Petrov, cofounder and CEO of Iterative, says, “The main goals around model monitoring focus on performance and troubleshooting as ML teams want to be able to improve on their models and ensure everything is running as intended.”
Rahul Kayala, principal product manager at Moveworks, shares this explanation of ML model monitoring. “Monitoring can help businesses balance the benefits of AI predictions with their need for predictable outcomes,” he says. “Automated alerts can help ML operations teams detect outliers in real time, giving them time to respond before any harm occurs.”
Stu Bailey, cofounder of ModelOp, adds, “Coupling robust monitoring with automated remediation accelerates time to resolution, which is key for maximizing business value and reducing risk.”
In particular, data scientists need to be notified of unexpected outliers. “AI models are often probabilistic, meaning they can generate a range of results,” says Kayala. “Sometimes, models can produce an outlier, an outcome significantly outside the normal range. Outliers can be disruptive to business outcomes and often have major negative consequences if they go unnoticed. To ensure AI models are impactful in the real world, ML teams should also monitor trends and fluctuations in product and business metrics that AI impacts directly.”
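One simple way to automate the outlier alerts Kayala describes, sketched here with NumPy and an illustrative threshold, is a rolling z-score check on model outputs:

```python
import numpy as np

def alert_on_outliers(values, window=100, z_threshold=4.0):
    """Return indices of values that sit far outside the recent distribution.

    A value more than `z_threshold` standard deviations from the trailing
    window's mean raises an alert. A real system would also debounce
    alerts and route them to the on-call ML operations team.
    """
    values = np.asarray(values, dtype=float)
    alerts = []
    for i in range(window, len(values)):
        recent = values[i - window:i]
        mu, sigma = recent.mean(), recent.std()
        if sigma > 0 and abs(values[i] - mu) > z_threshold * sigma:
            alerts.append(i)
    return alerts

rng = np.random.default_rng(1)
model_outputs = rng.normal(0.0, 1.0, 500)
model_outputs[400] = 8.0  # inject an extreme model output
print(alert_on_outliers(model_outputs))  # index 400 should be flagged
```

The window and threshold are tuning knobs: a tighter threshold catches more anomalies but also produces more false alerts for teams to triage.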
For example, consider predicting a stock’s daily price. When market volatility is low, algorithms such as long short-term memory (LSTM) networks can provide rudimentary predictions, and more comprehensive deep learning architectures can improve accuracy.
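A monitoring job for this scenario might track whether live volatility has left the regime the model was validated on, since a price model tuned on calm markets can degrade sharply in turbulent ones. A minimal sketch, assuming NumPy, simulated daily prices, and an illustrative `validated_max_vol` threshold:

```python
import numpy as np

def rolling_volatility(prices, window=20):
    """Annualized rolling volatility of daily log returns."""
    returns = np.diff(np.log(prices))
    vols = np.array([returns[i - window:i].std()
                     for i in range(window, len(returns) + 1)])
    return vols * np.sqrt(252)  # annualize assuming 252 trading days

def volatility_regime_alert(prices, validated_max_vol, window=20):
    """True if current volatility exceeds what the model was validated on."""
    return float(rolling_volatility(prices, window)[-1]) > validated_max_vol

# Simulated prices: a calm regime followed by a volatile one (illustrative).
rng = np.random.default_rng(2)
calm = 100 * np.exp(np.cumsum(rng.normal(0, 0.005, 250)))
shock = calm[-1] * np.exp(np.cumsum(rng.normal(0, 0.03, 30)))
prices = np.concatenate([calm, shock])
print(volatility_regime_alert(prices, validated_max_vol=0.15))
```

When the alert fires, the team can fall back to a simpler model, widen prediction intervals, or trigger retraining rather than trusting stale assumptions.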