
Data Science Courses - Page 3

Showing results 21-30 of 1407
Introduction to Microsoft Excel
By the end of this project, you will learn how to create an Excel spreadsheet using a free version of Microsoft Office Excel. Excel is a spreadsheet application that can be used much like a database: it consists of individual cells that can be used to build functions, formulas, tables, and graphs that easily organize and analyze large amounts of information and data. Excel is organized into rows (represented by numbers) and columns (represented by letters) that contain your information. This format allows you to present large amounts of information and data in a concise and easy-to-follow format. Microsoft Excel is among the most widely used software in the business community: bankers, accountants, business analysts, marketing professionals, scientists, and entrepreneurs alike use Excel on a regular basis. You will learn what an Excel spreadsheet is, why we use it, and the most important keyboard shortcuts, functions, and basic formulas.
Build an End-to-End Data Capture Pipeline using Document AI
This is a self-paced lab that takes place in the Google Cloud console. In this lab, you use Cloud Functions and Pub/Sub to create an end-to-end document processing pipeline using Document AI. The Document AI API is a document understanding solution that takes unstructured data, such as documents and emails, and makes the data easier to understand, analyze, and consume. In this lab, you will create a document processing pipeline that automatically processes documents uploaded to Cloud Storage. The pipeline consists of a primary Cloud Function that processes newly uploaded files with a Document AI form processor and saves the form data detected in those files to BigQuery. If the form data includes any address fields, the address data is written to a Pub/Sub topic, which in turn triggers a second Cloud Function that uses the Geocoding API to obtain geographic coordinates for the address; these coordinates are also written to BigQuery. This is a simple pipeline that uses a general form processor to detect basic form data, such as a labelled field containing address information. Document AI processors that use one of the specialized parsers (beyond the scope of this lab) provide enhanced entity information for specific document types even when those documents do not include labelled fields. For example, a Document AI Invoice parser can provide detailed address and supplier information from an unlabelled invoice document because it understands the layout of invoices.
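A minimal Python sketch of what the primary Cloud Function in such a pipeline might look like is shown below. This is not the lab's actual code: the project ID, processor ID, BigQuery table, and Pub/Sub topic names are placeholders, the sketch assumes PDF uploads, and error handling and schema details are omitted.

```python
import json

from google.cloud import bigquery, documentai, pubsub_v1, storage

# Placeholder resource names; the lab provisions its own project, processor,
# dataset, and topic.
PROJECT_ID = "your-project-id"
LOCATION = "us"
PROCESSOR_ID = "your-form-processor-id"
BQ_TABLE = "your-project-id.document_ai.form_fields"
ADDRESS_TOPIC = "address-topic"


def _anchor_text(document, layout):
    """Assemble the text referenced by a layout's text anchor."""
    return "".join(
        document.text[int(seg.start_index): int(seg.end_index)]
        for seg in layout.text_anchor.text_segments
    ).strip()


def process_document(event, context):
    """Background Cloud Function triggered when a file lands in Cloud Storage."""
    # Download the uploaded file (assumed to be a PDF here)
    blob = storage.Client().bucket(event["bucket"]).blob(event["name"])
    content = blob.download_as_bytes()

    # Run the Document AI form processor on the raw document
    docai = documentai.DocumentProcessorServiceClient()
    name = docai.processor_path(PROJECT_ID, LOCATION, PROCESSOR_ID)
    result = docai.process_document(
        request=documentai.ProcessRequest(
            name=name,
            raw_document=documentai.RawDocument(content=content, mime_type="application/pdf"),
        )
    )
    document = result.document

    # Collect detected form fields and stream them into BigQuery
    rows = []
    publisher = pubsub_v1.PublisherClient()
    for page in document.pages:
        for field in page.form_fields:
            field_name = _anchor_text(document, field.field_name)
            field_value = _anchor_text(document, field.field_value)
            rows.append({"file": event["name"], "field": field_name, "value": field_value})

            # Forward address fields to Pub/Sub for the geocoding function
            if "address" in field_name.lower():
                topic = publisher.topic_path(PROJECT_ID, ADDRESS_TOPIC)
                publisher.publish(topic, json.dumps({"address": field_value}).encode("utf-8"))

    bigquery.Client().insert_rows_json(BQ_TABLE, rows)
```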
Image Denoising Using AutoEncoders in Keras and Python
In this 1-hour long project-based course, you will be able to:
- Understand the theory and intuition behind autoencoders
- Import key libraries and the dataset, and visualize images
- Perform image normalization and pre-processing, and add random noise to images
- Build an autoencoder using Keras with TensorFlow 2.0 as a backend
- Compile and fit the autoencoder model to training data
- Assess the performance of the trained autoencoder using various KPIs
Note: This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
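For readers curious what the end result can look like, here is a minimal sketch of a denoising autoencoder in Keras with TensorFlow 2.x, using MNIST as a stand-in dataset; the course may use a different dataset and architecture.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Load and normalize images to [0, 1]
(x_train, _), (x_test, _) = tf.keras.datasets.mnist.load_data()
x_train = (x_train.astype("float32") / 255.0)[..., np.newaxis]
x_test = (x_test.astype("float32") / 255.0)[..., np.newaxis]

# Add random Gaussian noise and clip back to the valid range
noise_factor = 0.3
x_train_noisy = np.clip(x_train + noise_factor * np.random.normal(size=x_train.shape), 0.0, 1.0)
x_test_noisy = np.clip(x_test + noise_factor * np.random.normal(size=x_test.shape), 0.0, 1.0)

# Small convolutional autoencoder: the encoder compresses, the decoder reconstructs
autoencoder = models.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, 3, activation="relu", padding="same", strides=2),
    layers.Conv2D(16, 3, activation="relu", padding="same", strides=2),
    layers.Conv2DTranspose(16, 3, activation="relu", padding="same", strides=2),
    layers.Conv2DTranspose(32, 3, activation="relu", padding="same", strides=2),
    layers.Conv2D(1, 3, activation="sigmoid", padding="same"),
])

autoencoder.compile(optimizer="adam", loss="binary_crossentropy")

# Fit on (noisy input, clean target) pairs so the model learns to denoise
autoencoder.fit(x_train_noisy, x_train, epochs=5, batch_size=128,
                validation_data=(x_test_noisy, x_test))
```

The key idea is in the last call: the model receives noisy images as input but is trained against the clean originals, so reconstruction error doubles as a denoising objective.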
Data Analysis Tools
In this course, you will develop and test hypotheses about your data. You will learn a variety of statistical tests, as well as strategies to know how to apply the appropriate one to your specific data and question. Using your choice of two powerful statistical software packages (SAS or Python), you will explore ANOVA, Chi-Square, and Pearson correlation analysis. This course will guide you through basic statistical principles to give you the tools to answer questions you have developed. Throughout the course, you will share your progress with others to gain valuable feedback and provide insight to other learners about their work.
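For the Python track, the three analyses mentioned above map naturally onto SciPy. The sketch below uses a made-up toy DataFrame (the column names and data are illustrative only) to show the general shape of each test:

```python
import numpy as np
import pandas as pd
from scipy import stats

# Toy dataset: a categorical group, a binary category, and two numeric measures
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "group": rng.choice(["A", "B", "C"], size=300),
    "smoker": rng.choice(["yes", "no"], size=300),
    "score": rng.normal(50, 10, size=300),
    "hours": rng.normal(6, 2, size=300),
})

# ANOVA: does the mean score differ across the three groups?
f_stat, p_anova = stats.f_oneway(
    *[df.loc[df["group"] == g, "score"] for g in ["A", "B", "C"]]
)

# Chi-Square test of independence between two categorical variables
contingency = pd.crosstab(df["group"], df["smoker"])
chi2, p_chi2, dof, expected = stats.chi2_contingency(contingency)

# Pearson correlation between two quantitative variables
r, p_pearson = stats.pearsonr(df["score"], df["hours"])

print(f"ANOVA p={p_anova:.3f}, Chi-Square p={p_chi2:.3f}, Pearson r={r:.2f} (p={p_pearson:.3f})")
```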
Diabetic Retinopathy Detection with Artificial Intelligence
In this project, we will train a deep neural network model based on Convolutional Neural Networks (CNNs) and residual blocks to detect the type of Diabetic Retinopathy from images. Diabetic Retinopathy is the leading cause of blindness in the working-age population of the developed world and is estimated to affect over 347 million people worldwide. Diabetic Retinopathy is a disease that results from complications of type 1 and type 2 diabetes and can develop if blood sugar levels are left uncontrolled for a prolonged period of time. With the power of Artificial Intelligence and Deep Learning, doctors will be able to detect signs of the disease before blindness occurs.
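The course description does not specify its exact architecture, but a small Keras sketch of a CNN with residual blocks and a hypothetical 5-class severity output (a common grading scheme for Diabetic Retinopathy) might look like this; the input size, layer widths, and class count are assumptions:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def residual_block(x, filters):
    """Two conv layers with a skip connection added back to the input."""
    shortcut = x
    y = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    y = layers.Conv2D(filters, 3, padding="same")(y)
    # Match the channel count of the shortcut before adding, if needed
    if shortcut.shape[-1] != filters:
        shortcut = layers.Conv2D(filters, 1, padding="same")(shortcut)
    y = layers.Add()([shortcut, y])
    return layers.Activation("relu")(y)

# Hypothetical 5-class severity output (no DR through proliferative DR)
inputs = layers.Input(shape=(224, 224, 3))
x = layers.Conv2D(32, 7, strides=2, padding="same", activation="relu")(inputs)
x = layers.MaxPooling2D(3, strides=2, padding="same")(x)
x = residual_block(x, 64)
x = residual_block(x, 128)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(5, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```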
Introduction to Deep Learning & Neural Networks with Keras
Looking to start a career in Deep Learning? Look no further. This course will introduce you to the field of deep learning and help you answer many questions that people are asking nowadays, like what is deep learning, and how do deep learning models compare to artificial neural networks? You will learn about the different deep learning models and build your first deep learning model using the Keras library. After completing this course, learners will be able to:
• Describe what a neural network is, what a deep learning model is, and the difference between them.
• Demonstrate an understanding of unsupervised deep learning models such as autoencoders and restricted Boltzmann machines.
• Demonstrate an understanding of supervised deep learning models such as convolutional neural networks and recurrent networks.
• Build deep learning models and networks using the Keras library.
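As a taste of what "your first deep learning model using the Keras library" can look like, here is a minimal sketch that trains a small dense network on synthetic tabular data; the data, layer sizes, and class count are placeholders rather than course material:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Synthetic tabular data: 8 input features, 3 output classes (stand-ins for a real dataset)
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 8)).astype("float32")
y = rng.integers(0, 3, size=1000)

# A first deep learning model: stacked dense layers ending in a softmax classifier
model = keras.Sequential([
    layers.Input(shape=(8,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(32, activation="relu"),
    layers.Dense(3, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Train and hold out 20% of the data for validation
model.fit(X, y, epochs=10, batch_size=32, validation_split=0.2)
```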
Facial Keypoint Detection with PyTorch
In this 2-hour project-based course, you will be able to:
- Understand the facial keypoint dataset and write a custom dataset class for image-keypoint pairs. Additionally, you will apply keypoint augmentation to augment the images as well as their keypoints, using the albumentations library, and plot image-keypoint pairs.
- Load a pretrained, state-of-the-art convolutional neural network using the timm library.
- Create a train function and an evaluator function, which will help you write the training loop, and then use the training loop to train the model.
- Lastly, use the trained model to find keypoints for any given image.
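A hedged sketch of the main ingredients, not the course's actual notebook, is shown below: a custom PyTorch Dataset, a keypoint-aware albumentations pipeline, and a pretrained timm backbone repurposed as a keypoint regressor. The number of keypoints, file paths, and keypoint arrays are placeholders.

```python
import albumentations as A
from albumentations.pytorch import ToTensorV2
import cv2
import timm
import torch
from torch.utils.data import Dataset

NUM_KEYPOINTS = 15  # hypothetical; depends on the dataset used in the project

class KeypointDataset(Dataset):
    """Pairs each face image with its (x, y) keypoints and applies augmentation."""

    def __init__(self, image_paths, keypoints, transform):
        self.image_paths = image_paths      # list of file paths
        self.keypoints = keypoints          # array of shape (N, NUM_KEYPOINTS, 2)
        self.transform = transform

    def __len__(self):
        return len(self.image_paths)

    def __getitem__(self, idx):
        image = cv2.cvtColor(cv2.imread(self.image_paths[idx]), cv2.COLOR_BGR2RGB)
        kps = self.keypoints[idx].tolist()
        # albumentations transforms the image and its keypoints together
        out = self.transform(image=image, keypoints=kps)
        image = out["image"].float() / 255.0
        kps = torch.tensor(out["keypoints"], dtype=torch.float32).flatten()
        return image, kps

# Keypoint-aware augmentation pipeline
train_transform = A.Compose(
    [
        A.Resize(224, 224),
        A.RandomBrightnessContrast(p=0.3),
        A.Rotate(limit=15, p=0.5),
        ToTensorV2(),
    ],
    keypoint_params=A.KeypointParams(format="xy", remove_invisible=False),
)

# Pretrained backbone from timm with a regression head for 2 * NUM_KEYPOINTS outputs
model = timm.create_model("resnet18", pretrained=True, num_classes=2 * NUM_KEYPOINTS)
criterion = torch.nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
```

Setting `num_classes` to twice the keypoint count simply swaps the classifier head for a linear layer whose outputs are interpreted as flattened (x, y) coordinates and trained with a regression loss.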
Materials Data Sciences and Informatics
This course aims to provide a succinct overview of the emerging discipline of Materials Informatics at the intersection of materials science, computational science, and information science. Attention is drawn to specific opportunities afforded by this new field in accelerating materials development and deployment efforts. A particular emphasis is placed on materials exhibiting hierarchical internal structures spanning multiple length/structure scales and the impediments involved in establishing invertible process-structure-property (PSP) linkages for these materials. More specifically, it is argued that modern data sciences (including advanced statistics, dimensionality reduction, and formulation of metamodels) and innovative cyberinfrastructure tools (including integration platforms, databases, and customized tools for enhancement of collaborations among cross-disciplinary team members) are likely to play a pivotal role in addressing the above challenges.
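As a loose illustration of the "dimensionality reduction plus metamodel" idea mentioned above, the following sketch compresses synthetic microstructure descriptors with PCA and fits a Gaussian process metamodel to a property; the data are entirely synthetic and the descriptor/property setup is a placeholder, not course material.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in: 200 microstructures described by 500 statistical descriptors,
# each with a single measured property (e.g., an effective stiffness)
rng = np.random.default_rng(0)
descriptors = rng.normal(size=(200, 500))
prop = descriptors[:, :3] @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=200)

# Dimensionality reduction: compress the descriptors into a few principal components
pca = PCA(n_components=5)
scores = pca.fit_transform(descriptors)

# Metamodel: a Gaussian process regressor linking the reduced structure to the property
X_train, X_test, y_train, y_test = train_test_split(scores, prop, random_state=0)
gp = GaussianProcessRegressor().fit(X_train, y_train)
print("Held-out R^2:", gp.score(X_test, y_test))
```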
Analyze Survey Data with Tableau
Surveys are used in a variety of scenarios, both in business and in research. Companies are using them to better understand consumer insights and feedback, and researchers are going beyond the traditional uses to learn more about the world around us. Tableau can help visualize survey data of all kinds in a useful way, without requiring advanced statistics or graphic design skills. In this project, learners will create an account in Tableau and learn how to manipulate data with joins and pivots. Students will then learn how to create different kinds of visualizations, including tables, pie charts, and a stacked pie chart. This would be a great project for business and academic uses of survey data. The project is designed for those somewhat familiar with Tableau and data visualization, but it is accessible to those new to Tableau as well.