Monitoring and Analyzing Data Quality for XGBoost Churn Models With Amazon SageMaker Model Monitor
In this lesson, you will learn how to use Amazon SageMaker Model Monitor to track and analyze the quality of the data sent to a deployed machine learning model in real time.
Learning Objectives
Set up a SageMaker environment for model monitoring
Deploy a model to a SageMaker endpoint with data capture enabled
Create a baseline for model monitoring
Implement a monitoring schedule for a deployed model
Analyze monitoring results and detect data quality issues
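The objectives above map onto a small number of SageMaker Python SDK calls. The sketch below shows the overall flow at a glance; it assumes a trained `model` object and an execution `role` already exist, and the bucket path, schedule name, and instance types are illustrative placeholders, not values from this lesson:

```python
# Minimal sketch of the Model Monitor workflow (hypothetical names throughout).
from sagemaker.model_monitor import (
    CronExpressionGenerator,
    DataCaptureConfig,
    DefaultModelMonitor,
)
from sagemaker.model_monitor.dataset_format import DatasetFormat

bucket = "s3://my-monitoring-bucket"  # assumed S3 location

# 1. Deploy the model with data capture enabled, so inference requests
#    and responses are written to S3 for later analysis.
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
    data_capture_config=DataCaptureConfig(
        enable_capture=True,
        sampling_percentage=100,
        destination_s3_uri=f"{bucket}/data-capture",
    ),
)

# 2. Create a baseline: compute statistics and suggested constraints
#    from the training dataset.
monitor = DefaultModelMonitor(
    role=role, instance_count=1, instance_type="ml.m5.xlarge"
)
monitor.suggest_baseline(
    baseline_dataset=f"{bucket}/train.csv",
    dataset_format=DatasetFormat.csv(header=True),
    output_s3_uri=f"{bucket}/baseline",
)

# 3. Schedule recurring data-quality checks of captured traffic
#    against the baseline; violation reports land in S3.
monitor.create_monitoring_schedule(
    monitor_schedule_name="churn-data-quality",
    endpoint_input=predictor.endpoint_name,
    output_s3_uri=f"{bucket}/reports",
    statistics=monitor.baseline_statistics(),
    constraints=monitor.suggested_constraints(),
    schedule_cron_expression=CronExpressionGenerator.hourly(),
)
```

Analyzing the results then amounts to reading the generated constraint-violation reports from the schedule's S3 output location, which the lesson walks through step by step.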
Intended Audience
This lesson is ideal for machine learning practitioners, data scientists, and DevOps professionals who want hands-on experience monitoring model inference and ensuring the ongoing performance of deployed models with Amazon SageMaker.
Prerequisites
To get the most out of this lesson, you should have a basic working knowledge of machine learning concepts and AWS cloud services. Familiarity with Python, Amazon SageMaker, and experience training machine learning models will be beneficial.