hands-on lab

Monitoring Data Quality Issues with Amazon SageMaker

Difficulty: Beginner
Duration: Up to 1 hour and 30 minutes
Students: 18
  • Get guided in a real environment: Practice with a step-by-step scenario in a real, provisioned environment.
  • Learn and validate: Use validations to check your solutions every step of the way.
  • See results: Track your knowledge and monitor your progress.

Description

Amazon SageMaker is a service that enables you to build, train, and deploy machine learning models in the AWS cloud. In addition, SageMaker provides Model Monitor, a feature that allows you to monitor the quality of your deployed models over time.
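To give a flavor of what configuring Model Monitor looks like, here is a minimal sketch using the SageMaker Python SDK. It requires a live AWS account, and the role ARN, bucket, endpoint name, and schedule name below are illustrative placeholders, not values from the lab environment.

```python
# Sketch: creating a data-quality monitoring schedule with the
# SageMaker Python SDK. All ARNs, S3 URIs, and names are placeholders.
from sagemaker.model_monitor import CronExpressionGenerator, DefaultModelMonitor
from sagemaker.model_monitor.dataset_format import DatasetFormat

monitor = DefaultModelMonitor(
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder role
    instance_count=1,
    instance_type="ml.m5.xlarge",
    volume_size_in_gb=20,
    max_runtime_in_seconds=3600,
)

# Baseline the training data so Model Monitor can derive the statistics
# and constraints that captured endpoint traffic is compared against.
monitor.suggest_baseline(
    baseline_dataset="s3://example-bucket/train.csv",  # placeholder URI
    dataset_format=DatasetFormat.csv(header=True),
    output_s3_uri="s3://example-bucket/baseline",
)

# Attach an hourly data-quality monitoring schedule to the endpoint.
monitor.create_monitoring_schedule(
    monitor_schedule_name="data-quality-schedule",
    endpoint_input="example-endpoint",  # placeholder endpoint name
    output_s3_uri="s3://example-bucket/reports",
    statistics=monitor.baseline_statistics(),
    constraints=monitor.suggested_constraints(),
    schedule_cron_expression=CronExpressionGenerator.hourly(),
)
```

Because the schedule only fires hourly, the endpoint must have data capture enabled and receive traffic for the monitoring jobs to produce reports.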

Learning how to use the model monitoring feature in Amazon SageMaker will benefit anyone who is looking to deploy machine learning models in production environments.

In this hands-on lab, you will use a Jupyter notebook to examine a dataset and a deployed model endpoint, and then configure a model monitoring schedule for the endpoint.

Please note: This lab uses an Amazon SageMaker notebook and endpoint, which can take up to ten minutes to deploy, so ensure you have enough time available before starting the lab.

Learning objectives

Upon completion of this beginner-level lab, you will be able to:

  • Access a JupyterLab notebook
  • Generate and examine a synthetic dataset
  • Examine a deployed model endpoint
  • Configure a model monitor schedule for the endpoint
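The synthetic-dataset step might look something like the following sketch; the column names and distributions are illustrative assumptions, not the lab's actual dataset.

```python
import numpy as np
import pandas as pd

# Generate a small synthetic dataset with a fixed seed for reproducibility.
rng = np.random.default_rng(seed=42)
n = 1000
df = pd.DataFrame({
    "feature_1": rng.normal(loc=50, scale=10, size=n),  # continuous feature
    "feature_2": rng.integers(0, 5, size=n),            # categorical code 0-4
    "label": rng.integers(0, 2, size=n),                # binary label
})

# Examine the dataset, as the lab notebook does.
print(df.describe())
```

A baseline derived from a dataset like this is what Model Monitor later compares live endpoint traffic against.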

Intended audience

  • Candidates for the AWS Certified Machine Learning Engineer Associate certification
  • Cloud Architects
  • Data Engineers
  • DevOps Engineers
  • Machine Learning Engineers

Prerequisites

Familiarity with the following will be beneficial but is not required:

  • Amazon SageMaker
  • Amazon S3
  • The Python programming language


Environment before

Environment after


Lab steps

Logging In to the Amazon Web Services Console
Opening JupyterLab on Your SageMaker Notebook
Configuring Amazon SageMaker Model Monitor