
Data Analysis Courses - Page 30

Showing results 291-300 of 998
Stock Analysis: Create a Buy Signal Filter using R and the Quantmod Package
In this 1-hour long project-based course, you will learn how to pull down stock data using the R quantmod package. You will also learn how to perform analytics and apply financial risk functions to the data. Note: This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
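The course itself works in R with quantmod; purely as an illustration of what a simple buy-signal filter involves, here is a minimal Python sketch under assumed substitutions: the yfinance package stands in for quantmod::getSymbols, and the ticker, date range, and moving-average windows are illustrative, as is the choice of a moving-average-crossover rule.

```python
# Hypothetical Python analogue of the R/quantmod workflow: pull daily prices,
# compute fast/slow moving averages, and flag a moving-average-crossover "buy".
import yfinance as yf   # stand-in for quantmod::getSymbols (assumption)

prices = yf.download("AAPL", start="2023-01-01", end="2024-01-01")["Close"].squeeze()

fast = prices.rolling(window=20).mean()   # 20-day moving average
slow = prices.rolling(window=50).mean()   # 50-day moving average

# Buy signal: fast average crosses above slow average on that day
signal = (fast > slow) & (fast.shift(1) <= slow.shift(1))
print(prices.index[signal])               # dates on which the filter fires
```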
Deploying an Open Source Cassandra™ Database using the GCP Marketplace
This is a self-paced lab that takes place in the Google Cloud console. In this lab you will deploy an Apache Cassandra™ database using the GCP Marketplace. You will connect to the database using CQL Shell and run some simple CQL statements to create a table, load some data, and query it.
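The lab runs these statements interactively in CQL Shell; as a rough, self-contained sketch of the same steps, the snippet below uses the DataStax Python driver instead. The node address, keyspace, table, and sample row are placeholders, not values from the lab.

```python
# Minimal sketch (not the lab's cqlsh session): connect to a Cassandra node,
# create a table, load one row, and query it back.
from cassandra.cluster import Cluster

cluster = Cluster(["10.128.0.2"])   # placeholder address of the deployed node
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS demo
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS demo.users (id int PRIMARY KEY, name text)
""")
session.execute("INSERT INTO demo.users (id, name) VALUES (1, 'Ada')")

for row in session.execute("SELECT id, name FROM demo.users"):
    print(row.id, row.name)

cluster.shutdown()
```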
Decision Tree Classifier for Beginners in R
Welcome to this project-based course, Decision Tree Classifier for Beginners in R. This is a hands-on project that introduces beginners to the world of statistical modeling. In this project, you will learn how to build decision tree models using the tree and rpart libraries in R. We will start this hands-on project by importing the Sonar data into R and exploring the dataset. By the end of this 2-hour long project, you will understand the basic intuition behind the decision tree algorithm and how it works. To build the model, we will divide, or partition, the data into training and testing sets. You will then learn how to evaluate the model’s performance using metrics such as Accuracy, Sensitivity, Specificity, and F1-Score. Finally, you will learn how to save the trained model on your local system. Although you do not need to be an expert data analyst or data scientist to succeed in this guided project, it does require a basic knowledge of R, especially writing R syntax. Therefore, to complete this project, you must have prior experience with R. If you are not familiar with R, please complete my previous project, “Getting Started with R”, first; it will give you the knowledge you need to continue with this project on decision trees. However, if you are comfortable working with R, please join me on this beautiful ride! Let’s get our hands dirty!
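The project builds its models in R with tree and rpart; purely as an illustration of the same workflow (partition, fit, evaluate, save), here is a hedged sketch in Python with scikit-learn. The built-in breast-cancer dataset stands in for the Sonar data, and the tree depth and split ratio are arbitrary choices.

```python
# Illustrative Python analogue of the project's R workflow: split the data,
# fit a decision tree, report the evaluation metrics, and save the model.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, recall_score, f1_score, confusion_matrix
import joblib

X, y = load_breast_cancer(return_X_y=True)          # stand-in for the Sonar data
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

model = DecisionTreeClassifier(max_depth=4, random_state=42).fit(X_train, y_train)
pred = model.predict(X_test)

tn, fp, fn, tp = confusion_matrix(y_test, pred).ravel()
print("Accuracy:   ", accuracy_score(y_test, pred))
print("Sensitivity:", recall_score(y_test, pred))    # true positive rate
print("Specificity:", tn / (tn + fp))                # true negative rate
print("F1-Score:   ", f1_score(y_test, pred))

joblib.dump(model, "decision_tree_model.joblib")     # save to the local system
```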
Query a Database Table with SQL in LibreOffice Base
By the end of this project, you will have written SQL queries to retrieve data from a database table in LibreOffice Base. While Base includes a WYSIWYG query utility, learning to access data with SQL provides an additional measure of control over the data retrieval process. Moreover, SQL skills can be applied across a variety of relational database management systems beyond LibreOffice Base. Note: This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
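In the project the SQL is run inside Base's query editor; as a self-contained analogue, the sketch below runs comparable SELECT statements against a throwaway SQLite table from Python. The table, columns, and data are hypothetical.

```python
# Illustrative only: the same kinds of queries (filter, sort, aggregate)
# you would write in LibreOffice Base, run here against an in-memory SQLite table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, city TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "Avery", "Denver"), (2, "Blake", "Austin"), (3, "Casey", "Denver")],
)

for row in conn.execute("SELECT name FROM customers WHERE city = 'Denver' ORDER BY name"):
    print(row)
for row in conn.execute("SELECT city, COUNT(*) FROM customers GROUP BY city"):
    print(row)

conn.close()
```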
Mastering Data Analysis with Pandas: Learning Path Part 4
In this structured series of hands-on guided projects, we will master the fundamentals of data analysis and manipulation with Pandas and Python. Pandas is a super powerful, fast, flexible, and easy-to-use open-source data analysis and manipulation tool. This guided project is the fourth in a multi-part series of guided projects (a learning path) designed for anyone who wants to master data analysis with pandas. Note: This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
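As a small taste of the kind of Pandas operations the learning path covers, here is a minimal sketch of building a DataFrame, filtering rows, and aggregating with groupby; the data and column names are purely illustrative and not taken from the course.

```python
# Illustrative Pandas basics: construct, filter, and aggregate a DataFrame.
import pandas as pd

df = pd.DataFrame({
    "region": ["East", "West", "East", "West"],
    "sales":  [120, 95, 140, 80],
})

east = df[df["region"] == "East"]             # row filtering
totals = df.groupby("region")["sales"].sum()  # aggregation by group
print(east)
print(totals)
```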
Data Warehouse Concepts, Design, and Data Integration
This is the second course in the Data Warehousing for Business Intelligence specialization. Ideally, the courses should be taken in sequence. In this course, you will learn exciting concepts and skills for designing data warehouses and creating data integration workflows. These are fundamental skills for data warehouse developers and administrators. You will gain hands-on experience with data warehouse design and use open source products for manipulating pivot tables and creating data integration workflows. In the data integration assignment, you can use Oracle, MySQL, or PostgreSQL. You will also gain conceptual background about maturity models, architectures, multidimensional models, and management practices, providing an organizational perspective on data warehouse development. If you are currently a business or information technology professional and want to become a data warehouse designer or administrator, this course will give you the knowledge and skills to do that. By the end of the course, you will have the design experience, software background, and organizational context that prepares you to succeed with data warehouse development projects. In this course, you will create data warehouse designs and data integration workflows that satisfy the business intelligence needs of organizations. When you’re done with this course, you’ll be able to:
* Evaluate an organization for data warehouse maturity and business architecture alignment;
* Create a data warehouse design and reflect on alternative design methodologies and design goals;
* Create data integration workflows using prominent open source software;
* Reflect on the role of change data, refresh constraints, refresh frequency trade-offs, and data quality goals in data integration process design; and
* Perform operations on pivot tables to satisfy typical business analysis requests using prominent open source software (a pivot operation is sketched below).
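The course performs its pivot-table operations in open source BI tools; purely to illustrate what such an operation does, here is a small pandas sketch that summarizes a hypothetical fact table by two dimensions. The sales data and column names are invented for the example.

```python
# Illustration of a pivot operation: aggregate a fact table by product and year.
import pandas as pd

sales = pd.DataFrame({
    "year":    [2023, 2023, 2024, 2024],
    "product": ["A", "B", "A", "B"],
    "revenue": [100, 150, 120, 170],
})

pivot = sales.pivot_table(index="product", columns="year",
                          values="revenue", aggfunc="sum")
print(pivot)
```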
Analyze Box Office Data with Plotly and Python
Welcome to this project-based course on Analyzing Box Office Data with Plotly and Python. In this course, you will be working with The Movie Database (TMDB) Box Office Prediction data set. The motion picture industry is raking in more revenue than ever with its expansive growth the world over. Can we build models to accurately predict movie revenue? Could the results from these models be used to further increase revenue? We try to answer these questions by way of exploratory data analysis (EDA) and feature engineering. We will primarily use Plotly for data visualization. Plotly's Python graphing library makes interactive, publication-quality graphs that are ready for both online and offline use. This course runs on Coursera's hands-on project platform called Rhyme. On Rhyme, you do projects in a hands-on manner in your browser. You will get instant access to pre-configured cloud desktops containing all of the software and data you need for the project. Everything is already set up directly in your internet browser so you can just focus on learning. For this project, you’ll get instant access to a cloud desktop with Python, Jupyter, and scikit-learn pre-installed. Notes:
- You will be able to access the cloud desktop 5 times. However, you will be able to access the instruction videos as many times as you want.
- This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
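As a minimal sketch of the kind of interactive Plotly figure built during the EDA, here is a budget-versus-revenue scatter with plotly.express; the tiny inline data frame and its columns are stand-ins, not the actual TMDB data.

```python
# Illustrative Plotly figure: interactive scatter of budget against revenue.
import pandas as pd
import plotly.express as px

movies = pd.DataFrame({
    "budget":  [10_000_000, 50_000_000, 150_000_000],
    "revenue": [25_000_000, 90_000_000, 400_000_000],
    "title":   ["Indie", "Mid-budget", "Blockbuster"],
})

fig = px.scatter(movies, x="budget", y="revenue", hover_name="title",
                 title="Budget vs. revenue")
fig.show()   # opens an interactive figure in the browser or notebook
```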
Healthcare Data Quality and Governance
Career prospects are bright for those qualified to work with healthcare data or as Health Information Management (HIM) professionals. Perhaps you work in data analytics but are considering a move into healthcare, or you work in healthcare but are considering a transition into a new role. In either case, Healthcare Data Quality and Governance will provide insight into how valuable data assets are protected to maintain data quality. This serves care providers, patients, doctors, clinicians, and those who carry out the business of improving health outcomes. "Big Data" makes headlines, but that data must be managed to maintain quality. High-quality data is one of the most valuable assets gathered and used by any business, and it holds even greater significance in healthcare, where the maintenance and governance of data quality directly impact people’s lives. This course will explain how data quality is improved and maintained. You’ll learn why data quality matters, then see how healthcare professionals monitor, manage, and improve data quality. You’ll see how human and computerized systems interact to sustain data quality through data governance. You’ll discover how to measure data quality using metadata, data provenance tracking, and data validation and verification, along with a communication framework commonly used in healthcare settings. This knowledge matters because high-quality data can be transformed into valuable insights that save lives, reduce costs, and make healthcare more accessible and affordable. What you gain from this course will make you more of an asset in the healthcare field.
Get Started With Tableau
Tableau is a powerful software program frequently used by business analysts in a variety of departments, including sales, marketing, finance, operations, and more. Analysts within these departments use Tableau to create visualizations that explain datasets and tell data stories. In this project, learners will cover the basic steps to begin using Tableau: how to upload data and how the user interface works. Learners will move on to understand the difference between dimensions and measures as well as discrete and continuous variables. Learners will apply these new skills as they build bar graphs, line graphs, and tables. At the conclusion of this project, learners will feel confident in their ability to answer common business questions with Tableau visualizations. Along the way, there are questions and challenges to test learning and to display skills.
Dashboarding and Deployment
This course will take you through the various parts of analytical dashboarding: from best practices for designing a dashboard and creating a unified analytical environment to deploying and publishing visualizations. We will briefly discuss advanced visualization techniques, and for your capstone project you will develop an information layout of the biggest gainers and losers in the financial markets and compare those movements to economic data.