Neural Networks and Deep Learning – A Practical Introduction with CL Kim

When: 
Saturday, October 16, 2021 - 9:00am
Room: 
online via Zoom
Lecturer(s): 
CL Kim

Image recognition, speech recognition, and natural language processing are among the challenging problems for which neural networks and deep learning can provide solutions.

• A neural network can learn the weights and biases of its artificial neurons from training examples using stochastic gradient descent (a minimal code sketch follows this list).
• The backpropagation algorithm makes it practical to use neural networks to solve problems previously considered intractable.
• Training a deep neural network for too many epochs can result in overfitting, but there are ways to mitigate that problem.
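To make the first bullet concrete, here is a minimal, illustrative Python sketch of one stochastic-gradient-descent update for a single sigmoid neuron with a quadratic cost. It is not the seminar's demo program; the function names are assumptions for this example, and NumPy is assumed to be installed.

import numpy as np

def sigmoid(z):
    # The sigmoid activation function introduced in the talk.
    return 1.0 / (1.0 + np.exp(-z))

def sgd_step(w, b, x, y, eta=3.0):
    # One gradient-descent update for a single sigmoid neuron with
    # quadratic cost C = 0.5 * (a - y)**2 on one training example (x, y).
    a = sigmoid(np.dot(w, x) + b)
    delta = (a - y) * a * (1.0 - a)   # dC/dz, using sigma'(z) = a * (1 - a)
    return w - eta * delta * x, b - eta * delta

# Example: repeatedly nudge the weights and bias toward a target output of 1.0.
w, b = np.zeros(2), 0.0
for _ in range(100):
    w, b = sgd_step(w, b, np.array([0.5, -1.0]), 1.0)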

Seminar in Detail: 

This talk will introduce the sigmoid artificial neuron and the core ideas and principles behind feedforward neural networks, deep learning, stochastic gradient descent, the four backpropagation equations, and the backpropagation algorithm. It will also introduce approaches to reducing overfitting beyond increasing the size of the training data, such as L2 regularization, dropout, and artificially expanding the training data.
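For those who want a preview of the theory, the four backpropagation equations covered in the talk can be stated compactly in the notation of the reference book listed below (\delta^l is the error in layer l, w^l and b^l the weights and biases, a^l the activations, z^l the weighted inputs, C the cost, \sigma the sigmoid function, and \odot the elementwise product):

\begin{align*}
\delta^L &= \nabla_a C \odot \sigma'(z^L) && \text{(BP1)} \\
\delta^l &= \big( (w^{l+1})^T \delta^{l+1} \big) \odot \sigma'(z^l) && \text{(BP2)} \\
\frac{\partial C}{\partial b^l_j} &= \delta^l_j && \text{(BP3)} \\
\frac{\partial C}{\partial w^l_{jk}} &= a^{l-1}_k \, \delta^l_j && \text{(BP4)}
\end{align*}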

• Feedforward Neural Networks:
o Simple (Python) Network to classify a handwritten digit
o Learning with Gradient Descent
o How the backpropagation algorithm works
• Improving the way neural networks learn:
o Cross-entropy cost function (see the sketch after this list)
o Softmax activation function and log-likelihood cost function
o Rectified Linear Unit
• Overfitting and Regularization:
o L2 regularization
o Dropout
o Artificially expanding the training data
o Hyper-parameters
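As a taste of the cross-entropy and L2 regularization topics above, here is a minimal, illustrative Python sketch. It is not the seminar's demo code; the function names and the lmbda parameter are assumptions for this example.

import numpy as np

def cross_entropy(a, y):
    # Cross-entropy cost for one example, summed over output neurons;
    # np.nan_to_num guards against log(0) when an activation saturates.
    return np.sum(np.nan_to_num(-y * np.log(a) - (1.0 - y) * np.log(1.0 - a)))

def regularized_cost(outputs, targets, weights, lmbda):
    # Mean cross-entropy over n training examples plus the L2 penalty
    # (lmbda / 2n) * sum of squared weights, which discourages large weights.
    n = len(outputs)
    cost = sum(cross_entropy(a, y) for a, y in zip(outputs, targets)) / n
    cost += (lmbda / (2.0 * n)) * sum(np.linalg.norm(w) ** 2 for w in weights)
    return cost

# Example with two training examples and a single 2x3 weight matrix.
outs = [np.array([0.9, 0.1]), np.array([0.2, 0.8])]
tgts = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
print(regularized_cost(outs, tgts, [np.ones((2, 3))], lmbda=0.1))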

Benefits of attending:
• Learn the core principles behind neural networks and deep learning.
• See a simple Python program that solves a concrete problem: teaching a computer to recognize a handwritten digit.
• Improve the results by incorporating more and more of the core ideas of neural networks and deep learning.
• Understand the theory, with worked-out proofs of the fundamental equations of backpropagation for those interested.
• Run the straightforward Python demo code example.

Who should attend:

Anyone who would like a practical overview of neural networks and deep learning will benefit from this presentation. A basic familiarity with multivariable calculus and matrix algebra will help you understand the four fundamental equations behind backpropagation. These equations appear in the demo program and in the reference book, so you can work through them outside the presentation. Some knowledge of Python or a similar programming language would also be helpful.

Demo program and reference book:

The Python demo program runs in a Docker container on your Mac, Windows, or Linux personal computer. Even if you don't program in Python, it should be easy to understand with a little effort. Your registration confirmation email will include:
• A link to the demo program in GitHub
• A link to the demo program instructions
Early registration will allow you more time to set up the demo and work with it before the presentation.

Reference book: "Neural Networks and Deep Learning" by Michael Nielsen, http://neuralnetworksanddeeplearning.com

Speaker’s background:

CL Kim works in Software Engineering at CarGurus, Inc. He has graduate degrees in Business Administration and in Computer and Information Science from the University of Pennsylvania. He previously taught, for several years, the well-rated IEEE Boston Section class introducing the Android platform and API.

Register now for $65 at https://www.eventbrite.com/e/neural-networks-and-deep-learning-a-practic...

After registering, you will receive a confirmation email containing information about joining the webinar.
