Simplifying Neural Networks and Deep Learning Basics!
Ahmed Elhelbawy
Naxcen Quantum Society
Neural Network
•Mimics the functionality of a brain.
•A neural network is a graph with neurons (nodes, units, etc.) connected by links.
Neural Network: Neuron
•A neuron computes the weighted sum of its inputs X1, X2 (with weights W1, W2) and fires when the sum exceeds a threshold t.
•Choosing the weights and threshold lets a single neuron implement logic gates:
– OR gate: W1 = 1, W2 = 1, t = 0.5
– NOT gate (single input): W1 = -1, t = -0.5
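The gate weights above can be sanity-checked with a minimal threshold-unit sketch (the weights and thresholds are the ones from the slide; the function name is illustrative):

```python
def perceptron(inputs, weights, t):
    """Fire (output 1) when the weighted input sum exceeds the threshold t."""
    a = sum(w * x for w, x in zip(weights, inputs))
    return 1 if a > t else 0

# OR gate: W1 = W2 = 1, t = 0.5
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, "OR", x2, "=", perceptron((x1, x2), (1, 1), 0.5))

# NOT gate: single input, W1 = -1, t = -0.5
print("NOT 0 =", perceptron((0,), (-1,), -0.5))  # 1
print("NOT 1 =", perceptron((1,), (-1,), -0.5))  # 0
```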
Neural Network: Multi-Layer Perceptron
(MLP) or Feed-Forward Neural Network (FNN)
• A network with n+1 layers:
• One output layer and n hidden layers.
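A minimal sketch of the forward pass such a network computes, assuming tanh activations and random illustrative weights:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, layer_sizes):
    """Forward pass through an MLP: n hidden layers plus one output layer.

    layer_sizes[0] is the input dimension; each later entry is a layer width.
    Weights are random here purely for illustration.
    """
    a = x
    for d_in, d_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        W = rng.standard_normal((d_out, d_in))
        b = np.zeros(d_out)
        a = np.tanh(W @ a + b)  # activation choice is illustrative
    return a

# 4-dim input, two hidden layers of width 8, 2-dim output
out = mlp_forward(np.ones(4), [4, 8, 8, 2])
print(out.shape)  # (2,)
```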
Deep Learning
What is Deep Learning?
•A family of methods that uses deep (multi-layer) architectures to learn high-level feature representations.
Layer-wise Pre-training
•First, train the bottom layer, optimizing the data-likelihood objective P(x).
•Then, train the second layer, optimizing the data-likelihood objective P(h).
Layer-wise Pre-training
Finally, fine-tune on the labelled objective P(y|x) by
• Backpropagation
Example:
• Let weights w(h1, x1) and w(h1, x3)
be positive, all other weights be
zero, and biases b = d = 0.
• Which configuration maximizes p(x, h)?
• Ans: p is maximized at x1 = 1, x2 = 0, x3 = 1, h1 = 1, h2 = 0, h3 = 0
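The answer can be checked by enumerating all binary configurations and comparing energies (a sketch using the standard RBM energy E(x, h) = -hᵀWx - bᵀx - dᵀh, so p(x, h) ∝ exp(-E); taking the positive weights as 1 is an assumption):

```python
import itertools

import numpy as np

# Setup from the slide: w(h1, x1) and w(h1, x3) positive (assumed to be 1),
# all other weights zero, biases b = d = 0.
W = np.zeros((3, 3))
W[0, 0] = 1.0  # w(h1, x1)
W[0, 2] = 1.0  # w(h1, x3)
b = np.zeros(3)
d = np.zeros(3)

def energy(x, h):
    """Standard RBM energy; lower energy means higher probability."""
    return -(h @ W @ x + b @ x + d @ h)

# Enumerate all 2^6 binary configurations (x1..x3, h1..h3).
configs = list(itertools.product([0.0, 1.0], repeat=6))
best = min(energy(np.array(c[:3]), np.array(c[3:])) for c in configs)
maximizers = [c for c in configs
              if energy(np.array(c[:3]), np.array(c[3:])) == best]

# The slide's answer x = (1, 0, 1), h = (1, 0, 0) attains the minimum
# energy (tied with configurations differing only in zero-weight units).
print((1.0, 0.0, 1.0, 1.0, 0.0, 0.0) in maximizers)  # True
```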
Contrastive Divergence: an approximate maximum-likelihood learning rule for RBMs.
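One update can be sketched as follows (CD-1, a standard formulation of contrastive divergence; the names, sizes, and learning rate are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cd1_step(W, b, d, x, lr=0.1):
    """One CD-1 update: positive phase on data x, negative phase on a
    one-step Gibbs reconstruction (probabilities used for the reconstruction,
    a common simplification)."""
    ph = sigmoid(W @ x + d)                        # P(h = 1 | x)
    h = (rng.random(ph.shape) < ph).astype(float)  # sample hidden units
    px = sigmoid(W.T @ h + b)                      # P(x = 1 | h)
    ph2 = sigmoid(W @ px + d)                      # hidden probs on reconstruction
    W += lr * (np.outer(ph, x) - np.outer(ph2, px))
    b += lr * (x - px)
    d += lr * (ph - ph2)
    return W, b, d

W = np.zeros((3, 3))
b = np.zeros(3)
d = np.zeros(3)
W, b, d = cd1_step(W, b, d, np.array([1.0, 0.0, 1.0]))
```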
Deep Belief Nets (DBN) = Stacked RBM
RNN Extensions
• Bidirectional RNN
• Deep (Bidirectional) RNNs
RNN (Cont..)
• “the clouds are in the sky”: the context needed to predict “sky” (“clouds”) is nearby.
• “India is my home country. I can speak fluent Hindi.”: the context needed to predict “Hindi” (“India”) sits far back in the sequence, a long-range dependency that simple RNNs handle poorly.
Simple RNN
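The recurrence a simple RNN computes, h_t = tanh(Wxh·x_t + Whh·h_{t-1} + bh), can be sketched as follows (numpy; weight names and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_forward(xs, Wxh, Whh, bh):
    """h_t = tanh(Wxh @ x_t + Whh @ h_{t-1} + bh), starting from h_0 = 0."""
    h = np.zeros(Whh.shape[0])
    states = []
    for x in xs:
        h = np.tanh(Wxh @ x + Whh @ h + bh)
        states.append(h)
    return states

# Hidden size 5, input size 3, sequence length 4 (all illustrative)
Wxh = rng.standard_normal((5, 3))
Whh = rng.standard_normal((5, 5))
bh = np.zeros(5)
seq = [rng.standard_normal(3) for _ in range(4)]
states = rnn_forward(seq, Wxh, Whh, bh)
print(len(states), states[0].shape)  # 4 (5,)
```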
LSTM
•LSTMs remove or add information to the cell state, a process carefully regulated by structures called gates.
LSTM
• Gates
– Forget Gate
– Input Gate
– Output Gate
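The three gates combine into one cell update; a minimal sketch of a standard LSTM step (weight names are illustrative, biases omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, Wf, Wi, Wc, Wo):
    """One LSTM step on the concatenated input z = [h_prev, x]."""
    z = np.concatenate([h_prev, x])
    f = sigmoid(Wf @ z)            # forget gate: what to drop from the cell
    i = sigmoid(Wi @ z)            # input gate: what to write to the cell
    c_tilde = np.tanh(Wc @ z)      # candidate cell values
    c = f * c_prev + i * c_tilde   # new cell state
    o = sigmoid(Wo @ z)            # output gate: what to expose
    h = o * np.tanh(c)             # new hidden state
    return h, c

# Hidden size 4, input size 3 (illustrative); random weights for the 4 gates
n_h, n_x = 4, 3
Ws = [rng.standard_normal((n_h, n_h + n_x)) for _ in range(4)]
h, c = lstm_step(rng.standard_normal(n_x), np.zeros(n_h), np.zeros(n_h), *Ws)
print(h.shape, c.shape)  # (4,) (4,)
```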
LSTM Variants
Convolutional Neural Network (CNN)
1. Convolutional layer
2. Pooling layer
3. Fully connected layer
1. Convolutional layer
Convolution Filter (3×3):
1 0 1
0 1 0
1 0 1
Image (5×5):
1 1 1 0 0
0 1 1 1 0
0 0 1 1 1
0 0 1 1 0
0 1 1 0 0
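Sliding the 3×3 filter over the 5×5 image (“valid” positions only) yields a 3×3 feature map; a minimal cross-correlation sketch:

```python
import numpy as np

K = np.array([[1, 0, 1],
              [0, 1, 0],
              [1, 0, 1]])
I = np.array([[1, 1, 1, 0, 0],
              [0, 1, 1, 1, 0],
              [0, 0, 1, 1, 1],
              [0, 0, 1, 1, 0],
              [0, 1, 1, 0, 0]])

def conv2d_valid(img, k):
    """Slide the filter over every patch and sum the elementwise products."""
    kh, kw = k.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.zeros((oh, ow), dtype=int)
    for r in range(oh):
        for c in range(ow):
            out[r, c] = np.sum(img[r:r + kh, c:c + kw] * k)
    return out

print(conv2d_valid(I, K))
# [[4 3 4]
#  [2 4 3]
#  [2 3 4]]
```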
Convolutional Neural Network (CNN)
1. Convolutional layer
• Local receptive field
• Shared weights
Convolutional Neural Network (CNN)
2. Pooling layer
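Pooling downsamples each feature map; a minimal non-overlapping 2×2 max-pooling sketch (the 4×4 input values are illustrative):

```python
import numpy as np

def max_pool(feature, size=2):
    """Non-overlapping size x size max pooling (stride = size)."""
    h, w = feature.shape
    out = np.zeros((h // size, w // size), dtype=feature.dtype)
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = feature[r * size:(r + 1) * size,
                                c * size:(c + 1) * size].max()
    return out

fm = np.array([[1, 3, 2, 0],
               [4, 2, 1, 1],
               [0, 1, 5, 2],
               [2, 0, 3, 4]])
print(max_pool(fm))
# [[4 2]
#  [2 5]]
```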
[Figure: CNN pipeline, image → convolution features → pooled features → labels]