
Machine Learning Courses - Page 10

Object Localization with TensorFlow
Welcome to this 2-hour-long guided project on creating and training an Object Localization model with TensorFlow. In this guided project, we are going to use TensorFlow's Keras API to create a convolutional neural network which will be trained to classify as well as localize emojis in images. Localization, in this context, means predicting the position of the emojis in the images, so the network will have one input and two outputs: the emoji class and its position. Think of this task as a simpler version of Object Detection. In Object Detection, we might have multiple objects in the input images, and an object detection model predicts the classes as well as bounding boxes for all of those objects. In Object Localization, we work with the assumption that there is just one object in any given image, and our CNN model will classify and localize that object. Please note that you will need prior programming experience in Python and familiarity with TensorFlow. This is a practical, hands-on guided project for learners who already have a theoretical understanding of Neural Networks, Convolutional Neural Networks, and optimization algorithms like Gradient Descent, but want to understand how to use TensorFlow to solve computer vision tasks like Object Localization.
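For a sense of what that one-input, two-output architecture looks like, here is a minimal Keras sketch; the image size, layer sizes, emoji class count, and the choice of an (x, y) position target are illustrative assumptions, not the project's exact notebook.

    # Minimal sketch: one image input, two outputs (emoji class + position).
    # All sizes and the 9-class assumption are illustrative placeholders.
    import tensorflow as tf
    from tensorflow.keras import layers

    inputs = tf.keras.Input(shape=(144, 144, 3))
    x = layers.Conv2D(16, 3, activation='relu')(inputs)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(32, 3, activation='relu')(x)
    x = layers.MaxPooling2D()(x)
    x = layers.Flatten()(x)
    x = layers.Dense(128, activation='relu')(x)

    class_out = layers.Dense(9, activation='softmax', name='class_out')(x)  # which emoji
    box_out = layers.Dense(2, name='box_out')(x)                            # (x, y) position, a regression target

    model = tf.keras.Model(inputs=inputs, outputs=[class_out, box_out])
    model.compile(
        optimizer='adam',
        loss={'class_out': 'categorical_crossentropy', 'box_out': 'mse'},
        metrics={'class_out': 'accuracy'},
    )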
Specialized Models: Time Series and Survival Analysis
This course introduces you to additional topics in Machine Learning that complement essential tasks, including forecasting and analyzing censored data. You will learn how to analyze data with a time component, as well as censored data whose outcomes must be inferred. You will learn a few techniques for Time Series Analysis and Survival Analysis. The hands-on section of this course focuses on using best practices and verifying assumptions derived from Statistical Learning. By the end of this course you should be able to:
- Identify common modeling challenges with time series data
- Explain how to decompose Time Series data into trend, seasonality, and residuals
- Explain how autoregressive, moving average, and ARIMA models work
- Select and implement various Time Series models
- Describe hazard and survival modeling approaches
- Identify types of problems suitable for survival analysis
Who should take this course? This course targets aspiring data scientists interested in acquiring hands-on experience with Time Series Analysis and Survival Analysis. What skills should you have? To make the most out of this course, you should have familiarity with programming in a Python development environment, as well as a fundamental understanding of Data Cleaning, Exploratory Data Analysis, Calculus, Linear Algebra, Supervised Machine Learning, Unsupervised Machine Learning, Probability, and Statistics.
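As a taste of the Time Series portion, here is a minimal sketch of two of the techniques listed above, seasonal decomposition and an ARIMA fit, using statsmodels; the file name, column names, monthly period, and (1, 1, 1) order are placeholder assumptions rather than course materials.

    # Minimal sketch: decompose a monthly series and fit an ARIMA model.
    import pandas as pd
    from statsmodels.tsa.seasonal import seasonal_decompose
    from statsmodels.tsa.arima.model import ARIMA

    # Hypothetical monthly sales series (column and file names are assumptions)
    sales = pd.read_csv('sales.csv', index_col='month', parse_dates=True)['sales']

    decomposition = seasonal_decompose(sales, model='additive', period=12)
    trend, seasonal, resid = decomposition.trend, decomposition.seasonal, decomposition.resid

    arima = ARIMA(sales, order=(1, 1, 1)).fit()   # AR(1), first differencing, MA(1)
    forecast = arima.forecast(steps=6)            # six periods ahead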
Convolutional Neural Networks in TensorFlow
If you are a software developer who wants to build scalable AI-powered algorithms, you need to understand how to use the tools to build them. This course is part of the upcoming Machine Learning in TensorFlow Specialization and will teach you best practices for using TensorFlow, a popular open-source framework for machine learning. In Course 2 of the deeplearning.ai TensorFlow Specialization, you will learn advanced techniques to improve the computer vision model you built in Course 1. You will explore how to work with real-world images in different shapes and sizes, visualize the journey of an image through convolutions to understand how a computer “sees” information, plot loss and accuracy, and explore strategies to prevent overfitting, including augmentation and dropout. Finally, Course 2 will introduce you to transfer learning and how learned features can be extracted from models. The Machine Learning course and Deep Learning Specialization from Andrew Ng teach the most important and foundational principles of Machine Learning and Deep Learning. This new deeplearning.ai TensorFlow Specialization teaches you how to use TensorFlow to implement those principles so that you can start building and applying scalable models to real-world problems. To develop a deeper understanding of how neural networks work, we recommend that you take the Deep Learning Specialization.
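For example, the augmentation and dropout strategies mentioned above can be sketched in Keras roughly as follows; the specific augmentation ranges, image size, and layer sizes are illustrative, not the course's notebook code.

    # Minimal sketch: image augmentation plus a Dropout layer to reduce overfitting.
    import tensorflow as tf
    from tensorflow.keras.preprocessing.image import ImageDataGenerator

    train_gen = ImageDataGenerator(
        rescale=1 / 255.0,
        rotation_range=40,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
    )

    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation='relu', input_shape=(150, 150, 3)),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dropout(0.5),               # randomly drop half the activations during training
        tf.keras.layers.Dense(1, activation='sigmoid'),
    ])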
Deploy Machine Learning Models in Azure
Did you know that there is more than one way you can deploy models in Azure? This Guided Project, “Deploy machine learning models in Azure”, is for everybody working with ML models in Azure. In this 1-hour long project-based course, you will learn how to deploy machine learning models from the Azure Portal, from a Python script, and using the Azure CLI. To achieve this, we will use an example dataset, train a machine learning model, prepare all the files needed for deployment, and deploy it! There are a couple of ways of deployment in Azure, so you can pick the one that is most convenient for you. In order to be successful in this project, you will need knowledge of the Python language and experience with machine learning in Python. Also, an Azure subscription is required (a free trial is an option for those who don’t have it), as well as an Azure Machine Learning resource and a compute instance within it. Instructional links will be provided to guide you through creation, if needed. Let's get started!
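As a rough illustration of the "deploy from a Python script" path, here is a minimal sketch using the Azure ML SDK's v1-style API; the workspace config, model file, entry script, and environment file names are placeholders, and the exact steps in the project may differ.

    # Minimal sketch: register a model and deploy it as a web service from Python.
    from azureml.core import Workspace, Model, Environment
    from azureml.core.model import InferenceConfig
    from azureml.core.webservice import AciWebservice

    ws = Workspace.from_config()                               # reads config.json for your workspace

    model = Model.register(ws, model_path='model.pkl', model_name='demo-model')  # placeholder names

    inference_config = InferenceConfig(
        entry_script='score.py',                               # placeholder script defining init() and run(data)
        environment=Environment.from_conda_specification('demo-env', 'env.yml'),
    )
    deployment_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)

    service = Model.deploy(ws, 'demo-service', [model], inference_config, deployment_config)
    service.wait_for_deployment(show_output=True)
    print(service.scoring_uri)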
Predict Taxi Fare with a BigQuery ML Forecasting Model
This is a self-paced lab that takes place in the Google Cloud console. In this lab, you will explore millions of New York City yellow taxi cab trips available in a BigQuery Public Dataset, create an ML model inside of BigQuery to predict the fare, and evaluate your model's prediction performance.
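For orientation, training such a model boils down to a CREATE MODEL statement in BigQuery ML; the sketch below submits one from Python with the google-cloud-bigquery client, and the dataset, table, and feature columns shown are assumptions that may differ from the lab's exact queries.

    # Minimal sketch: train and evaluate a BigQuery ML fare model from Python.
    from google.cloud import bigquery

    client = bigquery.Client()

    create_model_sql = """
    CREATE OR REPLACE MODEL `my_dataset.taxi_fare_model`        -- placeholder dataset/model names
    OPTIONS (model_type = 'linear_reg', input_label_cols = ['fare_amount']) AS
    SELECT
      fare_amount,
      passenger_count,
      pickup_longitude, pickup_latitude,
      dropoff_longitude, dropoff_latitude
    FROM `nyc-tlc.yellow.trips`                                  -- assumed public taxi table
    WHERE fare_amount > 0
    LIMIT 100000
    """
    client.query(create_model_sql).result()   # runs the training job and waits for it

    evaluation = client.query(
        "SELECT * FROM ML.EVALUATE(MODEL `my_dataset.taxi_fare_model`)"
    ).to_dataframe()
    print(evaluation)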
Using Machine Learning in Trading and Finance
This course provides the foundation for developing advanced trading strategies using machine learning techniques. In this course, you’ll review the key components that are common to every trading strategy, no matter how complex. You’ll be introduced to multiple trading strategies including quantitative trading, pairs trading, and momentum trading. By the end of the course, you will be able to design basic quantitative trading strategies, build machine learning models using Keras and TensorFlow, build a pairs trading strategy prediction model and backtest it, and build a momentum-based trading model and backtest it. To be successful in this course, you should have advanced competency in Python programming and familiarity with pertinent libraries for machine learning, such as Scikit-Learn, StatsModels, and Pandas. Experience with SQL is recommended. You should have a background in statistics (expected values and standard deviation, Gaussian distributions, higher moments, probability, linear regressions) and foundational knowledge of financial markets (equities, bonds, derivatives, market structure, hedging).
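As a flavor of the momentum-based portion, here is a minimal pandas sketch of a long-only momentum signal and a naive backtest; the file name, 20-day lookback, and the absence of transaction costs are simplifying assumptions, not the course's model.

    # Minimal sketch: go long when trailing returns are positive, compare to buy-and-hold.
    import pandas as pd

    # Hypothetical daily close prices (file and column names are placeholders)
    prices = pd.read_csv('prices.csv', index_col='date', parse_dates=True)['close']

    returns = prices.pct_change()
    momentum = prices.pct_change(periods=20)                 # roughly one trading month of lookback
    signal = (momentum > 0).astype(int).shift(1)             # trade on yesterday's signal to avoid lookahead

    strategy_returns = signal * returns
    cumulative = (1 + strategy_returns.fillna(0)).cumprod()  # naive backtest: no costs or slippage
    buy_and_hold = (1 + returns.fillna(0)).cumprod()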
Nearest Neighbor Collaborative Filtering
In this course, you will learn the fundamental techniques for making personalized recommendations through nearest-neighbor techniques. First you will learn user-user collaborative filtering, an algorithm that identifies other people with similar tastes to a target user and combines their ratings to make recommendations for that user. You will explore and implement variations of the user-user algorithm, and will explore the benefits and drawbacks of the general approach. Then you will learn the widely-practiced item-item collaborative filtering algorithm, which identifies global product associations from user ratings, but uses these product associations to provide personalized recommendations based on a user's own product ratings.
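To make the user-user idea concrete, here is a minimal sketch with toy data: compute similarities between users' rating vectors, then score an unseen item as a similarity-weighted combination of neighbors' ratings. Real implementations typically mean-center ratings and keep signed similarities; this is a simplified illustration, not the course's code.

    # Minimal sketch of user-user collaborative filtering on a toy ratings matrix.
    import numpy as np
    import pandas as pd

    # rows = users, columns = items, NaN = not rated (toy data)
    ratings = pd.DataFrame(
        [[5, 4, np.nan], [4, 5, 2], [1, 2, 5]],
        index=['alice', 'bob', 'carol'],
        columns=['item1', 'item2', 'item3'],
    )

    similarity = ratings.T.corr()                     # Pearson similarity between users

    def predict(user, item):
        rated = ratings[item].dropna()                # neighbors who rated the item
        weights = similarity.loc[user, rated.index].abs()   # simplified: unsigned weights
        return np.average(rated, weights=weights) if weights.sum() else np.nan

    print(predict('alice', 'item3'))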
MLOps (Machine Learning Operations) Fundamentals
This course introduces participants to MLOps tools and best practices for deploying, evaluating, monitoring and operating production ML systems on Google Cloud. MLOps is a discipline focused on the deployment, testing, monitoring, and automation of ML systems in production. Machine Learning Engineering professionals use tools for continuous improvement and evaluation of deployed models. They work with (or can be) Data Scientists, who develop models, to enable velocity and rigor in deploying the best performing models. This course is primarily intended for the following participants: Data Scientists looking to quickly go from machine learning prototype to production to deliver business impact. Software Engineers looking to develop Machine Learning Engineering skills. ML Engineers who want to adopt Google Cloud for their ML production projects. >>> By enrolling in this course you agree to the Qwiklabs Terms of Service as set out in the FAQ and located at: https://qwiklabs.com/terms_of_service <<<
Turn Ethical Frameworks into Actionable Steps
Ethical principles build a strong foundation for driving ethical technologies. Principles alone can be elusive and impractical for application. Ethical frameworks based upon these principles provide a structure to guide technologists when implementing data-driven solutions. However, ethical frameworks, along with standards and regulations, can make compliance tasks more complex, and they can also raise the tension between ethical duties and business practicalities. An approach is needed to reconcile these issues. This second course within the Certified Ethical Emerging Technologist (CEET) professional certificate is designed for learners seeking to analyze ethical frameworks, regulations, standards, and best practices and integrate them into data-driven solutions. Students will become familiar with frameworks and the common ethical principles they are based upon and how they can be applied across a variety of ethically driven dilemmas. You will learn applicable regulations and best practices established across global organizations and governments and how to navigate the integration of these standards in the context of business needs. This course is the second of five courses within the Certified Ethical Emerging Technologist (CEET) professional certificate. The preceding course is titled Promote the Ethical Use of Data-Driven Technologies.
Machine Learning: Classification
Case Studies: Analyzing Sentiment & Loan Default Prediction
In our case study on analyzing sentiment, you will create models that predict a class (positive/negative sentiment) from input features (text of the reviews, user profile information,...). In our second case study for this course, loan default prediction, you will tackle financial data, and predict when a loan is likely to be risky or safe for the bank. These tasks are examples of classification, one of the most widely used areas of machine learning, with a broad array of applications, including ad targeting, spam detection, medical diagnosis and image classification. In this course, you will create classifiers that provide state-of-the-art performance on a variety of tasks. You will become familiar with the most successful techniques, which are most widely used in practice, including logistic regression, decision trees and boosting. In addition, you will be able to design and implement the underlying algorithms that can learn these models at scale, using stochastic gradient ascent. You will implement these techniques on real-world, large-scale machine learning tasks. You will also address significant tasks you will face in real-world applications of ML, including handling missing data and measuring precision and recall to evaluate a classifier. This course is hands-on, action-packed, and full of visualizations and illustrations of how these techniques will behave on real data. We've also included optional content in every module, covering advanced topics for those who want to go even deeper!
Learning Objectives: By the end of this course, you will be able to:
-Describe the input and output of a classification model.
-Tackle both binary and multiclass classification problems.
-Implement a logistic regression model for large-scale classification.
-Create a non-linear model using decision trees.
-Improve the performance of any model using boosting.
-Scale your methods with stochastic gradient ascent.
-Describe the underlying decision boundaries.
-Build a classification model to predict sentiment in a product review dataset.
-Analyze financial data to predict loan defaults.
-Use techniques for handling missing data.
-Evaluate your models using precision-recall metrics.
-Implement these techniques in Python (or in the language of your choice, though Python is highly recommended).
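As one concrete example of the "logistic regression at scale with stochastic gradient ascent" objective, here is a minimal sketch on toy data; the learning rate, epoch count, and features are illustrative, not the course's assignment code.

    # Minimal sketch: fit logistic regression by stochastic gradient ascent on the log-likelihood.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def fit_logistic_sga(X, y, lr=0.1, epochs=50, seed=0):
        rng = np.random.default_rng(seed)
        w = np.zeros(X.shape[1])
        for _ in range(epochs):
            for i in rng.permutation(len(y)):
                error = y[i] - sigmoid(X[i] @ w)   # per-example gradient of the log-likelihood
                w += lr * error * X[i]             # ascent step: move up the likelihood surface
        return w

    # toy sentiment-like data: an intercept column plus two features
    X = np.array([[1, 2.0, 0.5], [1, 1.5, 1.0], [1, -1.0, -0.5], [1, -2.0, 0.2]])
    y = np.array([1, 1, 0, 0])
    print(fit_logistic_sga(X, y))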