
Data Management Courses - Page 30

Showing results 291-300 of 399
Advanced Topics and Future Trends in Database Technologies
This course consists of four modules covering some of the more advanced areas of database technologies, followed by a look at the future of database software and where the industry is heading.
Using Efficient Sorting Algorithms in Java to Arrange Tax Data
By the end of this project, you will learn how to create an application that sorts Missouri tax data into ascending order using a variety of critical sorting algorithms. We will learn how to process a real-life data set and see how various sorting algorithms differ in memory and time usage. In addition, we will learn how to analyze a sorting algorithm and how to design a readable implementation. Finally, we will cover which circumstances are ideal for each type of sorting algorithm. After completing this project, students will be able to move on to more advanced algorithms and data structures. Sorting algorithms are essential to the creation of powerful and efficient programs, for almost any circumstance in which we need to arrange data for our users in a certain order. Doing this can make it significantly faster for a human, or even a computer, to parse and understand the data and make business decisions. We will explore how each of these sorting methods differs and how to implement them. We will also briefly cover how to access these methods using built-in Java functions. In this course we will cover bubble sort, insertion sort, merge sort, selection sort, and quicksort. These five sorting techniques span a variety of efficiencies and use cases in real life. They are also all easy to implement with knowledge of arrays, recursion, and loops in Java.
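The course itself implements these algorithms in Java; as a language-neutral illustration of one of the techniques it names, here is a minimal sketch of merge sort in Python. The function and sample data are invented for illustration and are not taken from the course materials.

```python
def merge_sort(values):
    """Sort a list in ascending order using merge sort (O(n log n))."""
    if len(values) <= 1:
        return values  # zero or one elements: already sorted
    mid = len(values) // 2
    left = merge_sort(values[:mid])    # recursively sort each half
    right = merge_sort(values[mid:])

    # Merge the two sorted halves into one sorted list.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged


print(merge_sort([42, 7, 19, 3, 25]))  # [3, 7, 19, 25, 42]
```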
Data Publishing on BigQuery for Data Sharing Partners
This is a self-paced lab that takes place in the Google Cloud console. In this lab, you will learn how to share and publish datasets with BigQuery.
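The lab works in the Cloud console, but a programmatic equivalent of granting a partner read access to a dataset might look like the sketch below, using the google-cloud-bigquery client library. The project, dataset, and group names are placeholders, not values from the lab.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Placeholder identifiers; substitute your own project, dataset, and partner group.
dataset = client.get_dataset("my-project.shared_sales_data")

# Append a read-only access entry for the partner group, keeping existing entries.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="groupByEmail",
        entity_id="data-partners@example.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])  # persist only the ACL change
```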
Use the Apache Spark Structured Streaming API with MongoDB
By the end of this project, you will use the Apache Spark Structured Streaming API with Python to stream data from two different sources, store a dataset in the MongoDB database, and join two datasets. The Apache Spark Structured Streaming API is used to continuously stream data from various sources, including the file system or a TCP/IP socket. One application is to continuously capture data from weather stations for historical purposes.
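As a taste of the API, here is a minimal PySpark sketch that streams lines from a TCP/IP socket source and writes each micro-batch to the console. The host, port, and application name are placeholders, and the course's MongoDB sink is only noted in a comment rather than configured here.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("weather-stream").getOrCreate()

# Continuously read lines from a socket source (host and port are placeholders).
readings = (
    spark.readStream.format("socket")
    .option("host", "localhost")
    .option("port", 9999)
    .load()
)

# Print each micro-batch to the console; the course instead stores results in
# MongoDB via the MongoDB Spark connector.
query = readings.writeStream.outputMode("append").format("console").start()
query.awaitTermination()
```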
Analyze Data to Answer Questions
This is the fifth course in the Google Data Analytics Certificate. These courses will equip you with the skills needed to apply to introductory-level data analyst jobs. In this course, you’ll explore the “analyze” phase of the data analysis process. You’ll take what you’ve learned to this point and apply it to your analysis to make sense of the data you’ve collected. You’ll learn how to organize and format your data using spreadsheets and SQL to help you look at and think about your data in different ways. You’ll also find out how to perform complex calculations on your data to complete business objectives. You’ll learn how to use formulas, functions, and SQL queries as you conduct your analysis. Current Google data analysts will continue to instruct and provide you with hands-on ways to accomplish common data analyst tasks with the best tools and resources. Learners who complete this certificate program will be equipped to apply for introductory-level jobs as data analysts. No previous experience is necessary. By the end of this course, you will:
- Learn how to organize data for analysis.
- Discover the processes for formatting and adjusting data.
- Gain an understanding of how to aggregate data in spreadsheets and by using SQL (see the sketch after this list).
- Use formulas and functions in spreadsheets for data calculations.
- Learn how to complete calculations using SQL queries.
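To make the SQL-aggregation point concrete, here is a small, self-contained sketch using Python's built-in sqlite3 module. The table and values are invented for illustration and are not part of the course.

```python
import sqlite3

# In-memory database with a small, made-up sales table for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 120.0), ("North", 80.0), ("South", 200.0)],
)

# Aggregate revenue per region, the same kind of calculation the course
# performs with SQL queries and spreadsheet functions.
query = "SELECT region, SUM(amount) AS total FROM sales GROUP BY region ORDER BY region"
for region, total in conn.execute(query):
    print(region, total)  # North 200.0, then South 200.0
```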
Understanding, Using, and Securing Crypto and Digital Assets
This specialization covers the latest developments in blockchain technology through a highly engaging learning experience, with animated video components and an intuitive course flow to maximize your knowledge retention.
Eventarc for Cloud Run
This is a self-paced lab that takes place in the Google Cloud console. In this lab, you will use Eventarc for Cloud Run to listen to events from Cloud Pub/Sub and Audit Logs. At the end of this lab, you will be able to deliver events from various sources to Google Cloud sinks.
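The lab configures triggers in the Cloud console; as a rough sketch of the receiving side, the code below assumes a Python/Flask Cloud Run service that logs events Eventarc delivers as HTTP POST requests. The route and header handling follow the CloudEvents HTTP binding, and none of this is the lab's exact code.

```python
# Minimal Cloud Run service that receives events delivered by Eventarc.
import os

from flask import Flask, request

app = Flask(__name__)


@app.route("/", methods=["POST"])
def handle_event():
    # CloudEvents HTTP binding puts the event type in the ce-type header,
    # e.g. a Pub/Sub message-published or an Audit Log event type.
    event_type = request.headers.get("ce-type", "unknown")
    print(f"Received event of type {event_type}: {request.get_data(as_text=True)}")
    return ("", 204)


if __name__ == "__main__":
    # Cloud Run supplies the port to listen on via the PORT environment variable.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```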
OpenTelemetry
In this lab, you will build a test application that sends metrics and trace data to Prometheus and Zipkin backends via the OpenTelemetry collector and agent.
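For a feel of the instrumentation side, here is a minimal OpenTelemetry tracing sketch using the Python SDK. The lab's test application and its collector, Prometheus, and Zipkin endpoints may differ; this example simply creates a span and exports it to the console.

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Set up a tracer provider that prints finished spans to the console.
# In the lab, an exporter pointing at the OpenTelemetry collector would be
# configured here instead, so data can reach Prometheus and Zipkin backends.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("demo.tracer")

with tracer.start_as_current_span("handle-request"):
    # Application work happens here; the span records its timing automatically.
    print("doing work inside a traced span")
```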
Introduction to Privacy - Part 3
Part 3 of the Privacy Fundamentals series explores information security and data protection and how to manage a security breach. We will also take a look at continual improvement, which includes understanding performance, exploring metrics, examining different audit methodologies, and more.
Web Applications and Command-Line Tools for Data Engineering
In this fourth course of the Python, Bash and SQL Essentials for Data Engineering Specialization, you will build upon the data engineering concepts introduced in the first three courses to apply Python, Bash and SQL techniques in tackling real-world problems. First, we will dive deeper into leveraging Jupyter notebooks to create and deploy models for machine learning tasks. Then, we will explore how to use Python microservices to break up your data warehouse into small, portable solutions that can scale. Finally, you will build a powerful command-line tool to automate testing and quality control for publishing and sharing your tool with a data registry.
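To illustrate the flavor of the final project, here is a minimal sketch of a command-line quality-control check built with Python's standard argparse and csv modules. The file format, flags, and pass/fail rule are assumptions for illustration, not the course's actual tool.

```python
#!/usr/bin/env python3
"""Small command-line quality check for a CSV file (illustrative sketch)."""
import argparse
import csv
import sys


def main() -> int:
    parser = argparse.ArgumentParser(description="Run a basic quality check on a CSV file.")
    parser.add_argument("path", help="CSV file to check")
    parser.add_argument("--min-rows", type=int, default=1,
                        help="fail if the file has fewer data rows than this")
    args = parser.parse_args()

    with open(args.path, newline="") as handle:
        rows = list(csv.reader(handle))

    data_rows = max(len(rows) - 1, 0)  # subtract the header row
    if data_rows < args.min_rows:
        print(f"FAIL: {args.path} has {data_rows} data rows (< {args.min_rows})")
        return 1
    print(f"OK: {args.path} has {data_rows} data rows")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```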