
Data Management Courses - Page 2

Showing results 11-20 of 399
Developing Data Models with LookML
This course empowers you to develop scalable, performant LookML (Looker Modeling Language) models that provide your business users with the standardized, ready-to-use data that they need to answer their questions. Upon completing this course, you will be able to start building and maintaining LookML models to curate and manage data in your organization’s Looker instance.
Use Bash Scripting on Linux to Execute Common Commands
By the end of this project, you will use a bash script to execute commands and observe their output on a Linux system. Bash, or Bourne Again Shell, is more than a shell running in a terminal on Linux; it is a programming language that is used to create powerful programs called shell scripts. Shell scripts are often used to capture common repetitive tasks so they can be executed without the need to memorize multiple individual commands.
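As a rough illustration of the idea described above (not material from the course, which uses Bash rather than Python), a script that runs a short, invented sequence of common Linux commands and prints their output might look like this:

    # Illustrative sketch only: the course teaches Bash, but the same idea of
    # wrapping a repetitive sequence of commands in one script can be shown
    # with Python's standard library. The command list below is invented.
    import subprocess

    COMMANDS = [
        ["uname", "-a"],   # show kernel and system information
        ["df", "-h"],      # show disk usage in human-readable form
        ["whoami"],        # show the current user
    ]

    for cmd in COMMANDS:
        result = subprocess.run(cmd, capture_output=True, text=True)
        print(f"$ {' '.join(cmd)}")
        print(result.stdout.strip())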
Introduction to Data Engineering
This course introduces you to the core concepts, processes, and tools you need to build a foundational knowledge of data engineering. You will gain an understanding of the modern data ecosystem and the roles that data engineers, data scientists, and data analysts play in it.

The data engineering ecosystem includes several components: disparate data types, formats, and sources of data. Data pipelines gather data from multiple sources, transform it into analytics-ready data, and make it available to data consumers for analytics and decision-making. Data repositories, such as relational and non-relational databases, data warehouses, data marts, data lakes, and big data stores, process and store this data. Data integration platforms combine disparate data into a unified view for data consumers. You will learn about each of these components in this course, as well as Big Data and some of the tools used to process it.

A typical data engineering lifecycle includes architecting data platforms, designing data stores, and gathering, importing, wrangling, querying, and analyzing data. It also includes performance monitoring and fine-tuning to ensure systems perform at optimal levels. In this course, you will learn about the data engineering lifecycle, along with security, governance, and compliance.

Data engineering is recognized as one of the fastest-growing fields today. The course discusses the career opportunities available in the field and the different paths you can take to enter it. It also includes hands-on labs that guide you through creating an IBM Cloud Lite account, provisioning a database instance, loading data into it, and performing basic querying operations to help you understand your dataset.
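As a minimal, hypothetical sketch of the extract-transform-load pattern the description mentions (it is not the course's IBM Cloud lab; the source data, table, and column names are invented), one pipeline step might look like this in Python:

    # Hypothetical extract-transform-load sketch of the pipeline idea above;
    # the course's own lab uses an IBM Cloud database instance instead.
    import csv, io, sqlite3

    # Extract: pretend this CSV arrived from an upstream source system.
    raw = io.StringIO("id,amount\n1,10.5\n2,3.25\n2,3.25\n3,abc\n")
    rows = list(csv.DictReader(raw))

    # Transform: drop malformed records, normalize types, de-duplicate.
    clean, seen = [], set()
    for r in rows:
        try:
            rec = (int(r["id"]), float(r["amount"]))
        except ValueError:
            continue              # skip records that fail validation
        if rec not in seen:
            seen.add(rec)
            clean.append(rec)

    # Load: write analytics-ready rows into a repository (SQLite here).
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE payments (id INTEGER, amount REAL)")
    db.executemany("INSERT INTO payments VALUES (?, ?)", clean)
    print(db.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone())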
Dataflow: Qwik Start - Templates
This is a self-paced lab that takes place in the Google Cloud console. It shows you how to create a streaming pipeline using a Google-provided Cloud Dataflow template.
Security and Privacy for Big Data - Part 2
This course sensitizes you to privacy and data protection in Big Data environments. You will discover privacy-preserving methodologies, as well as data protection regulations and concepts that apply to your Big Data system. By the end of the course, you will be ready to plan your next Big Data project successfully, ensuring that all privacy and data protection issues are under control. You will be able to look at sizable Big Data projects with privacy-trained eyes, recognize the risks, and improve your systems to a mature and sustainable level. If you are an ICT professional or someone who designs and manages systems in Big Data environments, this course is for you! Knowledge of Big Data and IT is advantageous, but if you are, for example, a product manager just touching the surface of Big Data and privacy, this course will suit you as well.
Read an Input File with COBOL
In this project you will use COBOL code to read the data records from a sequential file. You will code, compile, and run programs using the PC-based COBOL IDE called OpenCobolIDE. Since COBOL is often used with large amounts of data, the ability to process files—to read the records in them as input and write the records as output—is a critical skill for any COBOL programmer.
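As a loose illustration of sequential record processing (the project itself does this in COBOL with OpenCobolIDE; the file name and 30-byte record layout below are invented), the basic read-records-in-order idea looks like this in Python:

    # Rough sketch of reading fixed-width records from a sequential file.
    # The 30-byte layout (10-char account id, 20-char name) is invented.
    with open("accounts.dat", "w") as f:
        f.write("0000000001Ada Lovelace        \n")
        f.write("0000000002Grace Hopper        \n")

    with open("accounts.dat") as f:
        for line in f:                  # read each record in sequence
            account_id = line[0:10]
            name = line[10:30].rstrip()
            print(account_id, name)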
Analytics as a Service for Data Sharing Partners
This is a self-paced lab that takes place in the Google Cloud console. In this lab you will learn how Authorized Views in BigQuery can be shared and consumed to create customer-specific dashboards.
Troubleshooting and Solving Data Join Pitfalls
This is a self-paced lab that takes place in the Google Cloud console. This lab focuses on how to reverse-engineer the relationships between data tables and the pitfalls to avoid when joining them together.
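As a small, invented example of the kind of pitfall the lab covers (the lab itself uses BigQuery; this sketch uses Python's built-in sqlite3), joining on a key that is not unique on one side silently duplicates rows and inflates aggregates:

    # Classic join pitfall: order 1 has two shipments, so its amount is
    # counted twice and SUM returns 250 instead of the true total of 150.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
    CREATE TABLE orders (order_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 100.0), (2, 50.0);
    CREATE TABLE shipments (order_id INTEGER, box TEXT);
    INSERT INTO shipments VALUES (1, 'A'), (1, 'B'), (2, 'C');
    """)

    print(db.execute("""
        SELECT SUM(o.amount)
        FROM orders o JOIN shipments s ON o.order_id = s.order_id
    """).fetchone())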
Aggregate Data in SQL using MySQL Workbench
In this project you will use MySQL Workbench to write SQL queries that aggregate (group) data. Using aggregate functions such as COUNT, SUM, and AVG, your SQL queries will group and summarize data. Data that is aggregated and presented in a logical format is a more valuable decision-making tool for users. Note: This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
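As a self-contained sketch of the same aggregation pattern (the project itself uses MySQL Workbench; the table and rows below are invented), here is the GROUP BY / COUNT / SUM / AVG idea using Python's built-in sqlite3 module:

    # Same aggregation pattern the project teaches, shown with sqlite3.
    # The invoices table and its rows are invented for illustration.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE invoices (customer TEXT, total REAL)")
    db.executemany("INSERT INTO invoices VALUES (?, ?)", [
        ("Acme", 120.0), ("Acme", 80.0), ("Globex", 200.0),
    ])

    # COUNT, SUM, and AVG summarize the rows within each group.
    query = """
        SELECT customer, COUNT(*) AS orders, SUM(total) AS revenue,
               AVG(total) AS avg_order
        FROM invoices
        GROUP BY customer
        ORDER BY customer
    """
    for row in db.execute(query):
        print(row)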
Advanced Features with Relational Database Tables Using SQLiteStudio
In this course, you’ll increase your knowledge of and experience with relational tables as you explore alternative ways of getting data into them. You’ll also look at some of the advanced features that can give relational tables superpowers. As you learn about these new features, you’ll use SQLiteStudio to apply them to your tables. They will enable your tables to manage data more efficiently while keeping your data safe and accurate.

Tables are great for data storage. The concept of organizing data in rows and columns is familiar to most people. Accountants use spreadsheets to organize financial data, making it easier to budget and track expenses. Parents use lists with columns to track their family’s schedules so that everyone gets to participate in outside activities. Even the Internal Revenue Service gets in the game by using tax tables to provide tax amounts for a variety of incomes. Even a simple grocery list is tabular in nature: each row is an item, with one column holding the item's name or description and a second column noting the quantity needed. It’s no surprise that database designers like to use tables in a relational database to organize and store data.

In the Design and Create a Relational Database Table Using SQLiteStudio course, you learned about tables and created and populated a relational table using the SQLiteStudio database management system. That was a great beginning. Now it's time for the next step!