
Course Outline

Introduction to Applied Machine Learning

  • Statistical learning versus machine learning
  • Iteration and evaluation
  • Bias-variance trade-off
  • Supervised versus unsupervised learning
  • Problems solved with machine learning
  • Train/validation/test split – an ML workflow that prevents overfitting
  • Workflow of machine learning
  • Machine learning algorithms
  • Selecting the appropriate algorithm for the problem
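The train/validation/test workflow can be sketched with scikit-learn; the data, names, and split ratios below are illustrative only:

```python
from sklearn.model_selection import train_test_split
import numpy as np

# Toy data: 1000 samples, 5 features (illustrative only)
X = np.random.rand(1000, 5)
y = np.random.randint(0, 2, size=1000)

# First split off the test set, then carve a validation set out of the
# remainder. The validation set is used for model selection and tuning;
# the test set is touched only once, at the very end, to get an
# unbiased estimate of generalization performance.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(
    X_train, y_train, test_size=0.25, random_state=42)

print(len(X_train), len(X_val), len(X_test))  # 600 200 200
```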

Algorithm Evaluation

  • Evaluating numerical predictions
    • Accuracy measures: ME, MSE, RMSE, MAPE
    • Parameter and prediction stability
  • Evaluating classification algorithms
    • Accuracy and its limitations
    • The confusion matrix
    • Issues with unbalanced classes
  • Visualizing model performance
    • Profit curve
    • ROC curve
    • Lift curve
  • Model selection
  • Model tuning – grid search strategies
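Why accuracy misleads on unbalanced classes, and what the confusion matrix reveals instead, can be shown with a minimal NumPy sketch (the class ratio is illustrative):

```python
import numpy as np

# 95% negatives, 5% positives – a typically unbalanced dataset
y_true = np.array([0] * 95 + [1] * 5)
# A useless classifier that always predicts the majority class
y_pred = np.zeros(100, dtype=int)

accuracy = (y_true == y_pred).mean()
print(accuracy)  # 0.95 – looks excellent, yet every positive is missed

# Confusion matrix cells: rows = actual class, columns = predicted class
tp = np.sum((y_true == 1) & (y_pred == 1))
fn = np.sum((y_true == 1) & (y_pred == 0))
fp = np.sum((y_true == 0) & (y_pred == 1))
tn = np.sum((y_true == 0) & (y_pred == 0))
print([[tn, fp], [fn, tp]])  # [[95, 0], [5, 0]] – recall on positives is 0
```

The confusion matrix exposes what the single accuracy number hides: the minority class is never detected.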

Data Preparation for Modelling

  • Data import and storage
  • Understanding the data – basic explorations
  • Data manipulations with the pandas library
  • Data transformations – data wrangling
  • Exploratory analysis
  • Missing observations – detection and solutions
  • Outliers – detection and strategies
  • Standardization, normalization, binarization
  • Recoding qualitative data
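Several of these steps – detecting missing observations, standardization, and recoding qualitative data – can be sketched with pandas (column names and values are illustrative):

```python
import pandas as pd
import numpy as np

# Toy frame with a missing value and a qualitative column (illustrative)
df = pd.DataFrame({
    "income": [30_000, 45_000, np.nan, 80_000],
    "city":   ["Oslo", "Bergen", "Oslo", "Tromsø"],
})

# Detect missing observations
print(df["income"].isna().sum())  # 1

# One simple solution: impute with the median (robust to outliers)
df["income"] = df["income"].fillna(df["income"].median())

# Standardization: zero mean, unit variance
df["income_std"] = (df["income"] - df["income"].mean()) / df["income"].std()

# Recoding qualitative data as dummy (one-hot) columns
df = pd.get_dummies(df, columns=["city"])
```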

Machine Learning Algorithms for Outlier Detection

  • Supervised algorithms
    • KNN
    • Ensemble gradient boosting
    • SVM
  • Unsupervised algorithms
    • Distance-based methods
    • Density-based methods
    • Probabilistic methods
    • Model-based methods
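A common distance-based method scores each point by its distance to its k-th nearest neighbor: isolated points get large scores. A minimal NumPy sketch (data, function name, and k are illustrative):

```python
import numpy as np

def knn_outlier_scores(X, k=3):
    """Distance-based outlier score: distance to the k-th nearest neighbor.
    Larger score = more isolated = more likely an outlier."""
    # Pairwise Euclidean distances between all points
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Sort each row; column 0 is the distance to the point itself (0),
    # so column k holds the distance to the k-th nearest neighbor.
    return np.sort(d, axis=1)[:, k]

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))              # a tight cluster of normal points
X = np.vstack([X, [[8.0, 8.0]]])          # one obvious outlier far away

scores = knn_outlier_scores(X, k=3)
print(scores.argmax())  # 50 – the injected outlier has the largest score
```

Density-based methods (e.g. LOF) refine this idea by comparing a point's local density to that of its neighbors.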

Understanding Deep Learning

  • Overview of the Basic Concepts of Deep Learning
  • Differentiating Between Machine Learning and Deep Learning
  • Overview of Applications for Deep Learning

Overview of Neural Networks

  • What Are Neural Networks
  • Neural Networks versus Regression Models
  • Understanding Mathematical Foundations and Learning Mechanisms
  • Constructing an Artificial Neural Network
  • Understanding Neural Nodes and Connections
  • Working with Neurons, Layers, and Input and Output Data
  • Understanding Single-Layer Perceptrons
  • Differences Between Supervised and Unsupervised Learning
  • Learning Feedforward and Feedback Neural Networks
  • Understanding Forward Propagation and Back Propagation
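Forward propagation is just alternating affine maps and non-linearities. A minimal NumPy sketch of a one-hidden-layer network (layer sizes and weights are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer: 3 inputs -> 4 hidden units -> 1 output (toy sizes)
rng = np.random.default_rng(42)
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

x = np.array([0.5, -1.2, 0.3])

# Forward propagation: each layer applies weights, adds a bias,
# then passes the result through a non-linear activation.
h = sigmoid(x @ W1 + b1)      # hidden-layer activations
y_hat = sigmoid(h @ W2 + b2)  # network output, squashed into (0, 1)
```

Back propagation runs the same graph in reverse, using the chain rule to attribute the output error to each weight.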

Building Simple Deep Learning Models with Keras

  • Creating a Keras Model
  • Understanding Your Data
  • Specifying Your Deep Learning Model
  • Compiling Your Model
  • Fitting Your Model
  • Working with Your Classification Data
  • Working with Classification Models
  • Using Your Models
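The specify → compile → fit → use workflow can be sketched with Keras (a minimal sketch, assuming TensorFlow 2.x is installed; the data, layer sizes, and hyperparameters are illustrative):

```python
import numpy as np
from tensorflow import keras

# Toy binary-classification data (illustrative only)
X = np.random.rand(200, 4).astype("float32")
y = (X.sum(axis=1) > 2.0).astype("float32")

# Specify the model: layers stacked in sequence
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])

# Compile: choose optimizer, loss function, and metrics
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Fit: iterate over the data, adjusting weights by backpropagation
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

# Use the model: predicted probabilities for new observations
probs = model.predict(X[:3], verbose=0)
```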

Working with TensorFlow for Deep Learning

  • Preparing the Data
    • Downloading the Data
    • Preparing Training Data
    • Preparing Test Data
    • Scaling Inputs
    • Using Placeholders and Variables
  • Specifying the Network Architecture
  • Using the Cost Function
  • Using the Optimizer
  • Using Initializers
  • Fitting the Neural Network
  • Building the Graph
    • Inference
    • Loss
    • Training
  • Training the Model
    • The Graph
    • The Session
    • Train Loop
  • Evaluating the Model
    • Building the Eval Graph
    • Evaluating with Eval Output
  • Training Models at Scale
  • Visualizing and Evaluating Models with TensorBoard
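The placeholder/graph/session vocabulary comes from the TensorFlow 1.x API; TensorFlow 2.x expresses the same inference → loss → training-step cycle as an eager loop. A minimal sketch (toy noiseless data and illustrative hyperparameters), assuming TensorFlow 2.x is installed:

```python
import tensorflow as tf

# Noiseless toy data for the line y = 3x + 1 (illustrative)
X = tf.random.uniform((100, 1))
y = 3.0 * X + 1.0

# Variables hold the trainable parameters (weight and bias)
w = tf.Variable(tf.zeros((1, 1)))
b = tf.Variable(tf.zeros((1,)))
opt = tf.keras.optimizers.SGD(learning_rate=0.5)

for step in range(200):
    with tf.GradientTape() as tape:
        y_hat = tf.matmul(X, w) + b              # inference
        loss = tf.reduce_mean((y_hat - y) ** 2)  # cost function (MSE)
    grads = tape.gradient(loss, [w, b])          # backpropagation
    opt.apply_gradients(zip(grads, [w, b]))      # one optimizer step
```

After training, `w` and `b` approach the true slope 3 and intercept 1.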

Application of Deep Learning in Anomaly Detection

  • Autoencoder
    • Encoder-decoder architecture
    • Reconstruction loss
  • Variational autoencoder
    • Variational inference
  • Generative adversarial network (GAN)
    • Generator-discriminator architecture
    • Approaches to anomaly detection using GANs
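The autoencoder approach to anomaly detection fits an encoder/decoder to normal data and flags points whose reconstruction loss is high. A minimal NumPy sketch using a *linear* encoder/decoder (a principal-component projection) in place of a trained neural network; all data and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# Normal data lies close to a 1-D line embedded in 3-D space
t = rng.normal(size=(200, 1))
X = np.hstack([t, 2 * t, -t]) + 0.05 * rng.normal(size=(200, 3))

# Linear encoder/decoder: project onto the top principal component.
# (A real autoencoder learns non-linear encode/decode maps by SGD.)
mean = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
encode = lambda x: (x - mean) @ Vt[:1].T   # 3-D point -> 1-D code
decode = lambda z: z @ Vt[:1] + mean       # 1-D code -> 3-D point

def reconstruction_loss(x):
    return np.sum((x - decode(encode(x))) ** 2, axis=-1)

normal_point = X[0]
anomaly = np.array([3.0, -3.0, 3.0])  # far from the learned manifold
print(reconstruction_loss(anomaly) > reconstruction_loss(normal_point))  # True
```

Points the model cannot reconstruct well do not lie on the manifold of normal data, which is exactly the anomaly signal.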

Ensemble Frameworks

  • Combining results from different methods
  • Bootstrap aggregating (bagging)
  • Averaging outlier scores
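Averaging outlier scores from different detectors requires putting them on a common scale first. A minimal sketch using min-max normalization followed by a plain average (one common strategy; the function name and data are illustrative):

```python
import numpy as np

def average_outlier_scores(score_lists):
    """Combine outlier scores from several detectors by min-max
    normalizing each to [0, 1] and then averaging."""
    combined = []
    for s in score_lists:
        s = np.asarray(s, dtype=float)
        combined.append((s - s.min()) / (s.max() - s.min()))
    return np.mean(combined, axis=0)

# Scores for 4 observations from two detectors on different scales
knn_scores = [0.1, 0.2, 0.15, 5.0]      # e.g. a kNN distance score
density_scores = [1.0, 1.1, 1.05, 9.0]  # e.g. a density-based score

avg = average_outlier_scores([knn_scores, density_scores])
print(avg.argmax())  # 3 – both detectors agree this point is the outlier
```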

Requirements

  • Experience with Python programming
  • Basic familiarity with statistics and mathematical concepts

Target Audience

  • Developers
  • Data scientists

Duration: 28 Hours
