
Data Science Courses - Page 88

Showing results 871-880 of 1407
Neural Networks and Deep Learning
In the first course of the Deep Learning Specialization, you will study the foundational concepts of neural networks and deep learning. By the end, you will be familiar with the significant technological trends driving the rise of deep learning; be able to build, train, and apply fully connected deep neural networks; implement efficient (vectorized) neural networks; identify key parameters in a neural network’s architecture; and apply deep learning to your own applications. The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step in the world of AI.
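To make "vectorized" concrete: instead of looping over training examples one at a time, a fully connected layer can process a whole batch with a single matrix product. The sketch below is a minimal illustration of that idea, assuming a made-up two-layer network (the dimensions and random weights are illustrative, not course material):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical dimensions: 3 input features, 4 hidden units, 1 output,
# and a batch of 5 examples processed at once (one example per column).
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 5))
W1 = rng.standard_normal((4, 3)) * 0.01
b1 = np.zeros((4, 1))
W2 = rng.standard_normal((1, 4)) * 0.01
b2 = np.zeros((1, 1))

# Vectorized forward pass: matrix products replace per-example loops.
Z1 = W1 @ X + b1       # broadcasting adds b1 to every column
A1 = np.tanh(Z1)       # hidden-layer activation
Z2 = W2 @ A1 + b2
A2 = sigmoid(Z2)       # output probabilities, shape (1, 5)
print(A2)
```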
Data Engineering with MS Azure Synapse Apache Spark Pools
In this course, you will learn how to perform data engineering with Azure Synapse Apache Spark Pools, which enable you to boost the performance of big-data analytic applications through in-memory cluster computing. You will learn how to differentiate between Apache Spark, Azure Databricks, HDInsight, and SQL Pools and understand the use cases of data engineering with Apache Spark in Azure Synapse Analytics. You will also learn how to ingest data using Apache Spark Notebooks in Azure Synapse Analytics and transform data using DataFrames in Apache Spark Pools in Azure Synapse Analytics. You will integrate SQL and Apache Spark pools in Azure Synapse Analytics. You will also learn how to monitor and manage data engineering workloads with Apache Spark in Azure Synapse Analytics. This course is part of a Specialization intended for data engineers and developers who want to demonstrate their expertise in designing and implementing data solutions that use Microsoft Azure data services, and for anyone interested in preparing for Exam DP-203: Data Engineering on Microsoft Azure (beta). You will take a practice exam that covers key skills measured by the certification exam. This is the sixth course in a program of 10 courses that helps prepare you for the exam so that you can gain expertise in designing and implementing data solutions that use Microsoft Azure data services. The Data Engineering on Microsoft Azure exam is an opportunity to prove expertise in integrating, transforming, and consolidating data from various structured and unstructured data systems into structures that are suitable for building analytics solutions with Microsoft Azure data services. Each course teaches you the concepts and skills measured by the exam. By the end of this Specialization, you will be ready to sign up for and take Exam DP-203: Data Engineering on Microsoft Azure (beta).
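For a flavor of the ingest-and-transform pattern described above, here is a hedged PySpark sketch. In a Synapse notebook the `spark` session is already provided; the storage path, table, and column names here are hypothetical placeholders, not course assets:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("synapse-demo").getOrCreate()

# Ingest: read raw files from a (hypothetical) data lake path into a DataFrame.
sales = spark.read.parquet("abfss://data@myaccount.dfs.core.windows.net/raw/sales/")

# Transform: filter, derive a column, and aggregate.
daily = (sales
         .filter(F.col("amount") > 0)
         .withColumn("order_date", F.to_date("order_ts"))
         .groupBy("order_date")
         .agg(F.sum("amount").alias("revenue"),
              F.countDistinct("customer_id").alias("customers")))

# Persist the result as a table that SQL pools could query.
daily.write.mode("overwrite").saveAsTable("analytics.daily_sales")
```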
Multi-Echelon Inventory Simulation Using R Simmer
Welcome to "Multi Echelon Inventory Simulation Using R Simmer". This is a project-based course which should take about 2 hours to finish. Before diving into the project, please take a look at the course objectives and structure. By the end of this project, you will gain introductiory knowledge of Discrete Event Simulation, Multi Echelon Inventory Systems, be able to use R Studio and Simmer library, create statistical variables required for simulation, define process trajectory, define and assign resources, define arrivals (eg. incoming customers / work units), run simulation in R, store results in data frames, plot charts and interpret the results.
Econometrics: Methods and Applications
Welcome! Do you wish to know how to analyze and solve business and economic questions with data analysis tools? Then Econometrics by Erasmus University Rotterdam is the right course for you, as you learn how to translate data into models to make forecasts and to support decision making.

* What do I learn? When you know econometrics, you are able to translate data into models to make forecasts and to support decision making in a wide variety of fields, ranging from macroeconomics to finance and marketing. Our course starts with introductory lectures on simple and multiple regression, followed by topics of special interest to deal with model specification, endogenous variables, binary choice data, and time series data. You learn these key topics in econometrics by watching the videos with in-video quizzes and by completing post-video training exercises.

* Do I need prior knowledge? The course is suitable for (advanced undergraduate) students in economics, finance, business, engineering, and data analysis, as well as for those who work in these fields. The course requires some basics of matrices, probability, and statistics, which are reviewed in the Building Blocks module. If you are searching for a MOOC on econometrics of a more introductory nature that needs less background in mathematics, you may be interested in the Coursera course “Enjoyable Econometrics”, which is also from Erasmus University Rotterdam.

* What literature can I consult to support my studies? You can follow the MOOC without studying additional sources. Further reading on the discussed topics (including the Building Blocks) is provided in the textbook that we wrote and on which the MOOC is based: Econometric Methods with Applications in Business and Economics, Oxford University Press. The connection between the MOOC modules and the book chapters is shown in the Course Guide – Further Information – How can I continue my studies.

* Will there be teaching assistants active to guide me through the course? Staff and PhD students of our Econometric Institute will provide guidance in January and February of each year. In other periods, we provide only elementary guidance. We always advise you to connect with fellow learners of this course to discuss topics and exercises.

* How will I get a certificate? To gain the certificate of this course, you are asked to complete six Test Exercises (one per module) and a Case Project. Further, you review the work of three of your fellow learners of this MOOC. You gain the certificate if you pass all seven assignments.

Have a nice journey into the world of Econometrics! The Econometrics team
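As a small taste of the simple regression the course opens with, here is a minimal Python sketch on synthetic data (the numbers and library choice are illustrative assumptions, not course material):

```python
import numpy as np
import statsmodels.api as sm

# Synthetic data generated from a known model: y = 2 + 0.5*x + noise.
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=200)
y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=200)

X = sm.add_constant(x)           # adds the intercept column
model = sm.OLS(y, X).fit()       # ordinary least squares
print(model.summary())           # coefficients, t-stats, R-squared
```

The fitted intercept and slope should land close to the true values 2 and 0.5, which is exactly the "translate data into a model" step the course builds on.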
Python and Statistics for Financial Analysis
Course Overview: https://youtu.be/JgFV5qzAYno

Python is now becoming the number one programming language for data science. Due to Python’s simplicity and high readability, it is gaining importance in the financial industry. The course combines Python coding with statistical concepts and applies them to analyzing financial data, such as stock data. By the end of the course, you will be able to do the following using Python:

- Import, pre-process, save, and visualize financial data in a pandas DataFrame
- Manipulate existing financial data by generating new variables from multiple columns
- Recall and apply important statistical concepts (random variable, frequency, distribution, population and sample, confidence interval, linear regression, etc.) in financial contexts
- Build a trading model using multiple linear regression
- Evaluate the performance of the trading model using different investment indicators

A Jupyter Notebook environment is configured in the course platform for practicing Python coding without installing any client applications.
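A hedged sketch of the pandas workflow outlined in the list above, using synthetic prices (in the course, real stock data would come from a CSV file or a market-data source):

```python
import numpy as np
import pandas as pd

# Fabricate a daily closing-price series for illustration only.
rng = np.random.default_rng(7)
dates = pd.date_range("2023-01-02", periods=250, freq="B")
close = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, size=250)))
df = pd.DataFrame({"Close": close}, index=dates)

# Generate new variables from existing columns.
df["Return"] = df["Close"].pct_change()                 # daily simple return
df["MA10"] = df["Close"].rolling(10).mean()             # 10-day moving average
df["Signal"] = (df["Close"] > df["MA10"]).astype(int)   # naive long signal

# Toy evaluation: strategy return when yesterday's signal was on.
df["StratReturn"] = df["Signal"].shift(1) * df["Return"]
print(df[["Return", "StratReturn"]].dropna().sum())
```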
Data Analytics for Lean Six Sigma
Welcome to this course on Data Analytics for Lean Six Sigma. In this course you will learn data analytics techniques that are typically useful within Lean Six Sigma improvement projects. At the end of this course you will be able to analyse and interpret data gathered within such a project, and you will be able to use Minitab to analyse the data. I will also briefly explain what Lean Six Sigma is. I will emphasize the use of data analytics tools and the interpretation of the outcomes. I will use many different examples from actual Lean Six Sigma projects to illustrate all the tools, and I will not discuss any mathematical background. The setting we chose for our data examples is a Lean Six Sigma improvement project; however, data analytics tools are very widely applicable, so you will find that you learn techniques you can use in broader settings beyond improvement projects. I hope that you enjoy this course. Good luck! Dr. Inez Zwetsloot & the IBIS UvA team
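The course itself uses Minitab, a point-and-click tool; purely as an illustrative Python analogue (not course material), this is the kind of before/after comparison a Lean Six Sigma project might run on process data, with made-up cycle times:

```python
import numpy as np
from scipy import stats

# Hypothetical cycle times (minutes) before and after a process change.
rng = np.random.default_rng(3)
before = rng.normal(loc=12.0, scale=2.0, size=40)
after = rng.normal(loc=10.5, scale=2.0, size=40)

# Two-sample t-test: did the improvement shift the mean cycle time?
t_stat, p_value = stats.ttest_ind(before, after)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # a small p suggests a real shift
```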
Digging Deeper into Audience Reports in Google Analytics
In this project, you will discover some of the potentially less familiar Audience Reports. You will learn about the Active Users report, the Lifetime Value report, the Cohort Analysis report, the Benchmarking reports and the Users Flow Report. But, even more importantly, you will learn how to use these reports to help you make better decisions when it comes to reaching and engaging your website audience. Note: This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
Cognitive Solutions and RPA Analytics
Millions of companies in the world today are processing endless documents in various formats. Although Robotic Process Automation (RPA) thrives in almost every industry and is growing fast, it works well only with structured data sources. What about the data that’s not fully structured and comes in varying layouts? To address this problem, there is another aspect of RPA that is taking the industry by storm: cognitive automation. While implementing RPA, you can deploy automations with “cognitive” capabilities. Cognitive automation uses Artificial Intelligence (AI) to learn and understand the same way the human brain works. Thus, it assists humans in making decisions, completing tasks, or meeting goals. Using cognitive automation, you can extract semi-structured or unstructured data, which makes up 80% of all data!

Data is a precious asset. Businesses struggle to make sense of the large volumes of available data and to generate tangible value from them. Manual business processes barely contain any data that is available for capture. This is where Robotic Process Automation (RPA) Analytics comes in. RPA bots don’t just automate business processes; they also digitize them. They take the large volume of data and interpret it in near real-time to provide actionable information.

In this course, you will be introduced to cognitive automation, the role that AI plays in it, and Automation Anywhere’s cognitive solution, IQ Bot. You will also be introduced to RPA analytics and explore how Automation Anywhere’s Web Control Room and Bot Insight provide this functionality.

As you begin this course, you will learn the six steps to deploy cognitive automation. Next, you will explore the IQ Bot portal – Automation Anywhere’s web-based application for developing cognitive IQ bots. You will then learn to use the portal by following the IQ Bot workflow. Next, you will learn how RPA analytics help interpret and improve automated business processes. You will see how Bot Insight functions as an RPA analytics platform. You will also explore the different types of RPA analytics and learn how to generate RPA analytics via two mechanisms – the Web Control Room for Operational Analytics and Bot Insight for Business Analytics and CoE Analytics. Finally, you will learn how to use the RPA mobile app to study and edit the default CoE dashboard that is published via Bot Insight.
Geospatial Data Visualization using Python and Folium
In this project, we are going to learn how to process and analyze geospatial data. We are going to work with a dataset containing information about almost 100 taxis running in Porto, Portugal. We are going to learn how to prepare our data and how to use different geospatial visualization techniques in order to answer some analytical questions. During this project, we will learn how to work with the Folium module in Python, which is one of the best tools when it comes to geospatial data visualization.
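A minimal Folium sketch of the kind of map this project builds; the center coordinates are central Porto, and the taxi pickup points below are made up for illustration:

```python
import folium

# Center the map on Porto, Portugal.
m = folium.Map(location=[41.15, -8.61], zoom_start=13)

# Plot a few hypothetical taxi pickup points as circle markers.
pickups = [(41.1579, -8.6291), (41.1496, -8.6109), (41.1421, -8.6160)]
for lat, lon in pickups:
    folium.CircleMarker(location=[lat, lon], radius=5,
                        color="blue", fill=True).add_to(m)

m.save("porto_taxis.html")  # open the HTML file in a browser to view the map
```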
Artificial Intelligence Privacy and Convenience
In this course, we will explore fundamental concepts involved in the security and privacy of machine learning projects. Diving into the ethics behind these decisions, we will explore how to protect users from privacy violations while creating useful predictive models. We will also ask big questions about how businesses implement algorithms and how that affects user privacy and transparency now and in the future.
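One widely used technique for trading model utility against user privacy is differential privacy; the course description does not name it, so treat the sketch below as an illustrative assumption rather than course content. It adds calibrated Laplace noise to a simple count query:

```python
import numpy as np

def dp_count(records, epsilon=1.0):
    """Differentially private count: the true count plus Laplace noise
    scaled to sensitivity/epsilon (the sensitivity of a count is 1)."""
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return len(records) + noise

users_opted_in = list(range(137))          # hypothetical user records
print(dp_count(users_opted_in, epsilon=0.5))  # smaller epsilon = more privacy, more noise
```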