Artificial Neural Networks, Machine Learning, Deep Thinking

An Artificial Neural Network (ANN) is a computational model used in the development of Artificial Intelligence (AI) systems capable of performing "intelligent" tasks.

Course Format

Online

Accreditation Type

Certificate

Skill Level

Intermediate

Course Cost

R60225

COURSE OVERVIEW

Introduction and ANN Structure.

  • Biological neurons and artificial neurons.
  • Model of an ANN.
  • Activation functions used in ANNs (a short sketch follows this list).
  • Typical classes of network architectures.
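
To make the neuron model above concrete, here is a minimal illustrative sketch, assuming NumPy; the input values, weights and bias are arbitrary examples:

    import numpy as np

    def sigmoid(z):
        # Logistic activation: squashes input into (0, 1)
        return 1.0 / (1.0 + np.exp(-z))

    def tanh(z):
        # Hyperbolic tangent: squashes input into (-1, 1)
        return np.tanh(z)

    def relu(z):
        # Rectified linear unit: passes positive input, zeroes the rest
        return np.maximum(0.0, z)

    # A single artificial neuron: weighted sum of inputs plus a bias,
    # followed by a nonlinear activation (illustrative values only).
    x = np.array([0.5, -1.2, 3.0])   # inputs
    w = np.array([0.4, 0.1, -0.6])   # synaptic weights
    b = 0.2                          # bias
    z = np.dot(w, x) + b

    for name, f in [("sigmoid", sigmoid), ("tanh", tanh), ("ReLU", relu)]:
        print(f"{name}: {f(z):.4f}")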

Mathematical Foundations and Learning Mechanisms.

  • Revisiting vector and matrix algebra.
  • State-space concepts.
  • Concepts of optimization.
  • Error-correction learning (illustrated, together with Hebbian learning, after this list).
  • Memory-based learning.
  • Hebbian learning.
  • Competitive learning.
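
A minimal sketch of two of the learning mechanisms listed above (Hebbian and error-correction updates), assuming NumPy; the toy input, learning rate and target are arbitrary illustrative values:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=3)               # presynaptic activity (toy input)
    w = rng.normal(scale=0.1, size=3)    # small random initial weights
    eta = 0.1                            # learning rate (illustrative)

    # Hebbian learning: strengthen each weight in proportion to the
    # correlation between its input and the neuron's output.
    y = np.dot(w, x)
    w_hebb = w + eta * y * x

    # Error-correction (delta rule): move the weights so as to reduce
    # the squared error against a desired response d.
    d = 1.0
    w_delta = w + eta * (d - y) * x

    print("Hebbian update:         ", w_hebb.round(3))
    print("Error-correction update:", w_delta.round(3))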

Single-Layer Perceptrons.

  • Structure and learning of perceptrons (a minimal learning-rule example follows this list).
  • Pattern classifier - introduction and Bayes' classifiers.
  • Perceptron as a pattern classifier.
  • Perceptron convergence.
  • Limitations of the perceptron.
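
A minimal sketch of the perceptron learning rule on the logical AND problem, assuming NumPy; the bipolar target coding, learning rate and epoch limit are illustrative choices:

    import numpy as np

    # Toy linearly separable problem: logical AND with bipolar targets.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    t = np.array([-1, -1, -1, 1])      # targets in {-1, +1}

    w = np.zeros(2)
    b = 0.0
    eta = 0.1

    # Perceptron learning rule: update only on misclassified points.
    for epoch in range(20):
        errors = 0
        for x_i, t_i in zip(X, t):
            y_i = 1 if np.dot(w, x_i) + b >= 0 else -1
            if y_i != t_i:
                w += eta * t_i * x_i
                b += eta * t_i
                errors += 1
        if errors == 0:        # converged: every point correctly classified
            break

    print("weights:", w, "bias:", b, "epochs used:", epoch + 1)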

Feedforward ANN.

  • Structures of Multi-layer feedforward networks.
  • Backpropagation algorithm (an illustrative training sketch follows this list).
  • Backpropagation - training and convergence.
  • Functional approximation with backpropagation.
  • Practical and design issues of backpropagation learning.
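
A small sketch of backpropagation training for a two-layer feedforward network on the XOR problem, assuming NumPy; the hidden-layer size, learning rate and iteration count are illustrative choices:

    import numpy as np

    rng = np.random.default_rng(1)

    # XOR: a problem a single-layer perceptron cannot solve, but a
    # two-layer feedforward network trained with backpropagation can.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    t = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    W1 = rng.normal(scale=1.0, size=(2, 8))   # input -> hidden weights
    b1 = np.zeros(8)
    W2 = rng.normal(scale=1.0, size=(8, 1))   # hidden -> output weights
    b2 = np.zeros(1)
    eta = 0.5                                 # learning rate

    for step in range(5000):
        # Forward pass
        h = sigmoid(X @ W1 + b1)
        y = sigmoid(h @ W2 + b2)

        # Backward pass: propagate the output error through the layers.
        delta2 = (y - t) * y * (1 - y)            # output-layer error signal
        delta1 = (delta2 @ W2.T) * h * (1 - h)    # hidden-layer error signal

        # Gradient-descent updates
        W2 -= eta * h.T @ delta2
        b2 -= eta * delta2.sum(axis=0)
        W1 -= eta * X.T @ delta1
        b1 -= eta * delta1.sum(axis=0)

    print("network outputs for XOR:", y.ravel().round(3))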

Radial Basis Function Networks.

  • Pattern separability and interpolation.
  • Regularization Theory.
  • Regularization and RBF networks.
  • RBF network design and training (a toy fitting example follows this list).
  • Approximation properties of RBF.
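
A toy sketch of an RBF network fitted to a noisy sine curve, assuming NumPy; the fixed grid of Gaussian centres and the shared width are illustrative design choices, with the linear output weights obtained by least squares:

    import numpy as np

    # Toy regression problem: approximate a sine curve with a
    # Gaussian radial basis function (RBF) network.
    rng = np.random.default_rng(2)
    x = np.linspace(0, 2 * np.pi, 50)
    y = np.sin(x) + rng.normal(scale=0.05, size=x.shape)

    centres = np.linspace(0, 2 * np.pi, 10)   # RBF centres (fixed grid here)
    width = 0.5                                # shared Gaussian width

    def design_matrix(x):
        # Each column is one Gaussian basis function evaluated at every input.
        return np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2 * width ** 2))

    Phi = design_matrix(x)
    # The output layer is linear, so its weights have a closed-form
    # least-squares solution.
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

    y_hat = Phi @ w
    print("training RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))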

Competitive Learning and Self-Organizing ANNs.

  • General clustering procedures.
  • Learning Vector Quantization (LVQ).
  • Competitive learning algorithms and architectures.
  • Self-organizing feature maps (sketched after this list).
  • Properties of feature maps.
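
A minimal sketch of a one-dimensional self-organizing feature map (plain competitive learning corresponds to shrinking the neighbourhood to the winner alone), assuming NumPy; the map size, learning-rate and neighbourhood schedules are illustrative choices:

    import numpy as np

    rng = np.random.default_rng(3)
    data = rng.uniform(0, 1, size=(500, 2))          # toy inputs in the unit square

    n_units = 10
    weights = rng.uniform(0, 1, size=(n_units, 2))   # prototype vectors
    grid = np.arange(n_units)                        # 1-D map coordinates
    eta = 0.2                                        # learning rate
    sigma = 2.0                                      # neighbourhood width

    for epoch in range(20):
        for x in data:
            # Competition: the winner is the unit closest to the input.
            winner = np.argmin(np.linalg.norm(weights - x, axis=1))
            # Cooperation: units near the winner on the map also adapt.
            h = np.exp(-((grid - winner) ** 2) / (2 * sigma ** 2))
            # Adaptation: move prototypes towards the input.
            weights += eta * h[:, None] * (x - weights)
        eta *= 0.9      # decay the learning rate
        sigma *= 0.9    # shrink the neighbourhood over time

    print("ordered prototype vectors:\n", weights.round(2))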

Fuzzy Neural Networks.

  • Neuro-fuzzy systems.
  • Background of fuzzy sets and logic.
  • Design of fuzzy systems (a membership-function sketch follows this list).
  • Design of fuzzy ANNs.
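
A small sketch of triangular fuzzy membership functions and the firing strength of a single fuzzy rule, assuming NumPy; the temperature example and the min/complement operators are illustrative choices:

    import numpy as np

    def triangular(x, a, b, c):
        # Triangular fuzzy membership function with support [a, c] and peak at b.
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    # Toy fuzzy sets over a temperature axis (degrees Celsius).
    temp = 23.0
    cold = triangular(temp, 0, 10, 20)
    warm = triangular(temp, 15, 22, 30)
    hot  = triangular(temp, 25, 35, 45)

    print(f"membership at {temp} C -> cold: {cold:.2f}, warm: {warm:.2f}, hot: {hot:.2f}")

    # A single Mamdani-style rule, "IF warm AND NOT hot THEN fan = medium",
    # evaluated with min for AND and (1 - m) for NOT.
    rule_strength = min(warm, 1.0 - hot)
    print("firing strength of the rule:", round(float(rule_strength), 2))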

Applications

  • A few examples of neural network applications, together with their advantages and problems, will be discussed.
  • The PAC Learning Framework
    • Guarantees for finite hypothesis set – consistent case
    • Guarantees for finite hypothesis set – inconsistent case
    • Generalities
      • Deterministic vs. stochastic scenarios
      • Bayes error and noise
      • Estimation and approximation errors
      • Model selection
  • Rademacher Complexity and VC-Dimension
  • Bias-variance tradeoff
  • Regularisation
  • Over-fitting
  • Validation
  • Support Vector Machines
  • Kriging (Gaussian Process regression)
  • PCA and Kernel PCA (a minimal PCA sketch follows this list)
  • Self-Organising Maps (SOM)
  • Kernel-induced vector space
    • Mercer kernels and kernel-induced similarity metrics
  • Reinforcement Learning
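
As one worked example from the list above (see the PCA item), the sketch below performs PCA via the singular value decomposition of a centred data matrix, assuming NumPy and a synthetic correlated dataset:

    import numpy as np

    # Principal Component Analysis (PCA) via the singular value
    # decomposition of the centred data matrix.
    rng = np.random.default_rng(4)
    X = rng.normal(size=(200, 5))
    X[:, 1] = 2.0 * X[:, 0] + 0.1 * X[:, 1]   # introduce correlation

    Xc = X - X.mean(axis=0)                   # centre the data
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

    explained = S ** 2 / np.sum(S ** 2)       # variance explained per component
    Z = Xc @ Vt[:2].T                         # project onto first 2 principal axes

    print("explained variance ratio:", explained.round(3))
    print("projected shape:", Z.shape)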

The following topics will be taught in relation to the material covered on Day 1 and Day 2.

  • Logistic and Softmax Regression (a softmax-regression sketch follows this list)
  • Sparse Autoencoders
  • Vectorization, PCA and Whitening
  • Self-Taught Learning
  • Deep Networks
  • Linear Decoders
  • Convolution and Pooling
  • Sparse Coding
  • Independent Component Analysis
  • Canonical Correlation Analysis
  • Demos and Applications
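
A minimal sketch of softmax regression trained by batch gradient descent on a synthetic three-class problem, assuming NumPy; the class means, learning rate and iteration count are illustrative choices:

    import numpy as np

    # Softmax regression (multinomial logistic regression) trained by
    # batch gradient descent on a toy three-class problem.
    rng = np.random.default_rng(5)
    n, d, k = 300, 2, 3
    means = np.array([[0, 0], [3, 3], [0, 4]], dtype=float)
    X = np.vstack([rng.normal(loc=m, size=(n // k, d)) for m in means])
    y = np.repeat(np.arange(k), n // k)

    W = np.zeros((d, k))
    b = np.zeros(k)
    eta = 0.1

    def softmax(z):
        z = z - z.max(axis=1, keepdims=True)   # numerical stability
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)

    Y = np.eye(k)[y]                           # one-hot targets
    for step in range(500):
        P = softmax(X @ W + b)
        grad_W = X.T @ (P - Y) / n             # gradient of the cross-entropy loss
        grad_b = (P - Y).mean(axis=0)
        W -= eta * grad_W
        b -= eta * grad_b

    accuracy = np.mean(np.argmax(softmax(X @ W + b), axis=1) == y)
    print("training accuracy:", round(float(accuracy), 3))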

COURSE REQUIREMENTS

  • Good understanding of mathematics.
  • Good understanding of basic statistics.
  • Basic programming skills are not required, but are recommended.

COURSE DURATION

21 hours (usually 3 days including breaks)


COURSE COMPLETION

Artificial Neural Networks, Machine Learning, Deep Thinking Training Course

CREDIT BEARING

This course is NOT credit-bearing

COURSE LICENCE

This course is available under the Creative Commons Attribution-ShareAlike 2.0 South Africa licence