Mrinal Walia takes us through the concept of Long Short-Term Memory:
A simple recurrent neural network has a very simple structure: a chain of repeating modules, each containing just a single activation function such as a tanh layer. An LSTM also has a chain-like structure with repeating modules, but instead of a single neural network layer, each module has four layers that interact in a particular way, each performing its own function in the network.
Read on for a good amount of theory followed by an example using Keras.
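To give a flavor of the Keras side before you click through, here is a minimal sketch of an LSTM model for binary sequence classification. The shapes, layer sizes, and hyperparameters below are illustrative assumptions, not taken from Walia's article:

```python
# Minimal Keras LSTM sketch. Input shape, unit counts, and the toy
# data are illustrative assumptions, not the article's example.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Toy data: 100 sequences, each 20 timesteps of 8 features.
X = np.random.rand(100, 20, 8)
y = np.random.randint(0, 2, size=(100,))

model = keras.Sequential([
    layers.Input(shape=(20, 8)),
    # One LSTM layer wraps the four internal transformations
    # (forget gate, input gate, candidate tanh layer, output gate).
    layers.LSTM(32),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=16, verbose=0)
```

Note that the four interacting layers the excerpt describes all live inside that single `layers.LSTM` call; Keras hides the gating machinery behind one layer object.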