hands-on lab

Importing Data Into Cloud Bigtable

Difficulty: Intermediate
Duration: Up to 40 minutes
Students: 35
Rating: 5/5

Description

Google Cloud Bigtable is a fully managed, scalable NoSQL database service for large-scale workloads. One common use case for Bigtable is Internet of Things (IoT) applications. This lab is based on an IoT traffic monitoring application that uses sensors to record vehicle speeds along freeways. Although this scenario is the focus of the lab, more general forms of the data pipeline are also discussed.
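
For a sense of how such readings map onto Bigtable, the sketch below uses the google-cloud-bigtable Python client to write one vehicle-speed reading under a row key that combines the sensor's location with a timestamp. This is a minimal illustration only: the project, instance, table, and column-family names are placeholder assumptions, not values used in the lab.

    from datetime import datetime, timezone

    from google.cloud import bigtable

    # Placeholder identifiers -- the lab provisions its own project and instance.
    client = bigtable.Client(project="my-project", admin=True)
    instance = client.instance("iot-traffic-instance")
    table = instance.table("vehicle-speeds")

    # Row key pattern: highway#milepost#timestamp, so a prefix scan returns
    # all readings from one sensor in time order.
    now = datetime.now(timezone.utc)
    row_key = f"I-90#042#{now:%Y%m%d%H%M%S}".encode()

    row = table.direct_row(row_key)
    row.set_cell("measurements", b"speed_mph", b"63", timestamp=now)
    row.set_cell("measurements", b"lane", b"2", timestamp=now)
    row.commit()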

In the lab, you will explore the cloud services and application code that are used to collect the sensor data, store it in Bigtable, and read data from Bigtable. Once data is streaming into Cloud Bigtable, you will also learn how to back up and restore the table for your recovery needs.
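
As a rough illustration of the read path, the same Python client can scan a row-key prefix to return the readings recorded by a single sensor; the cbt CLI offers an equivalent read from the command line. All identifiers below are placeholder assumptions for this sketch.

    from google.cloud import bigtable
    from google.cloud.bigtable.row_set import RowSet

    client = bigtable.Client(project="my-project")
    table = client.instance("iot-traffic-instance").table("vehicle-speeds")

    # Scan every reading whose row key starts with one sensor's prefix.
    row_set = RowSet()
    row_set.add_row_range_with_prefix("I-90#042#")
    for row in table.read_rows(row_set=row_set):
        for cell in row.cells["measurements"][b"speed_mph"]:
            print(row.row_key.decode(), cell.value.decode())

    # A roughly equivalent cbt read (assumed syntax; check `cbt help read`):
    #   cbt read vehicle-speeds prefix=I-90#042#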

Learning Objectives

Upon completion of this intermediate-level lab, you will be able to:

  • Describe how to import data into Cloud Bigtable
  • Use the Cloud Bigtable CLI (cbt) to query a table
  • Back up and restore a Cloud Bigtable table (a brief code sketch follows this list)
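
Although the lab walks you through backup and restore in the Google Cloud console, the same operations are available programmatically. The sketch below assumes the backup helpers in recent versions of the google-cloud-bigtable Python client (Table.backup, Backup.create, and Backup.restore) and uses placeholder resource names.

    from datetime import datetime, timedelta, timezone

    from google.cloud import bigtable

    client = bigtable.Client(project="my-project", admin=True)
    table = client.instance("iot-traffic-instance").table("vehicle-speeds")

    # Backups are taken on a specific cluster and must carry an expiry time.
    backup = table.backup(
        "vehicle-speeds-backup",
        cluster_id="iot-traffic-c1",
        expire_time=datetime.now(timezone.utc) + timedelta(days=7),
    )
    backup.create().result(timeout=600)  # wait for the backup to complete

    # Restore the backup into a new table in the same instance.
    backup.restore("vehicle-speeds-restored").result(timeout=600)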

Intended Audience

This lab is intended for:

  • Google Professional Cloud Database Engineer exam candidates
  • Google Cloud data engineers

Prerequisites

You should possess:

  • A basic understanding of the following Google Cloud services:

    • Cloud Bigtable
    • Cloud Dataflow

Environment before

Environment after

Covered topics

Lab steps

Signing In to the Google Cloud Console
Importing Data Into Cloud Bigtable With Dataflow (a brief pipeline sketch follows this list)
Backing Up the Bigtable Table
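
The import step runs a Dataflow pipeline that writes the sensor records into Bigtable. As a hedged sketch of that pattern, rather than the lab's actual pipeline code, an Apache Beam pipeline in Python can emit DirectRow objects to the bigtableio connector; every name below is a placeholder.

    import datetime

    import apache_beam as beam
    from apache_beam.io.gcp.bigtableio import WriteToBigTable
    from google.cloud.bigtable.row import DirectRow


    def to_bigtable_row(record):
        """Convert one sensor reading (a dict) into a Bigtable DirectRow."""
        now = datetime.datetime.utcnow()
        row_key = f"{record['highway']}#{record['milepost']}#{now:%Y%m%d%H%M%S}"
        row = DirectRow(row_key=row_key.encode())
        row.set_cell("measurements", b"speed_mph", str(record["speed_mph"]).encode(), now)
        return row


    with beam.Pipeline() as pipeline:
        (
            pipeline
            | "Sample readings" >> beam.Create(
                [{"highway": "I-90", "milepost": "042", "speed_mph": 63}]
            )
            | "To Bigtable rows" >> beam.Map(to_bigtable_row)
            | "Write to Bigtable" >> WriteToBigTable(
                project_id="my-project",
                instance_id="iot-traffic-instance",
                table_id="vehicle-speeds",
            )
        )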