Learn about RNNs (Recurrent Neural Networks) → http://ibm.biz/rnn-guide
Check out IBM Watson → http://ibm.biz/ibm-watson
Long Short-Term Memory networks, also known as LSTMs, are a special kind of Recurrent Neural Network (RNN) architecture capable of learning long-term dependencies; they also address the vanishing gradient problem that can occur when training traditional RNNs.
In this lightboard video, Martin Keen with IBM breaks down why we need LSTMs to address the problem of long-term dependencies, how the cell state and its various gates help carry relevant information through a sequence chain, and a few key LSTM use cases.
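To make the gating idea concrete, here is a minimal sketch of a single LSTM step in plain Python. It uses toy scalar weights (all values here are hypothetical, chosen only for illustration) to show how the forget, input, and output gates combine with the cell state:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    # w maps each gate name to hypothetical scalar weights (wx, wh, b)
    def gate(name, squash):
        wx, wh, b = w[name]
        return squash(wx * x + wh * h_prev + b)

    f = gate("forget", sigmoid)   # forget gate: how much old cell state to keep
    i = gate("input", sigmoid)    # input gate: how much new candidate to write
    g = gate("cand", math.tanh)   # candidate values to add to the cell state
    o = gate("output", sigmoid)   # output gate: how much cell state to expose

    c = f * c_prev + i * g        # additive cell-state update eases gradient flow
    h = o * math.tanh(c)          # new hidden state passed along the sequence
    return h, c

# Toy weights, all hypothetical
w = {name: (0.5, 0.5, 0.0) for name in ("forget", "input", "cand", "output")}
h, c = 0.0, 0.0
for x in [1.0, -1.0, 0.5]:       # run a short input sequence
    h, c = lstm_step(x, h, c, w)
```

Because the cell state is updated additively (gated by `f` and `i`) rather than repeatedly multiplied through a squashing function, gradients can survive across many time steps, which is the core of how LSTMs handle long-term dependencies.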
Watch “What are Convolutional Neural Networks (CNNs)?” lightboard video → https://youtu.be/QzY57FaENXg
Watch “What are GANs (Generative Adversarial Networks)?” → https://youtu.be/TpMIssRdhco
Get started on IBM Cloud at no cost → https://www.ibm.biz/BdfLkF
Subscribe to see more videos like this in the future → http://ibm.biz/subscribe-now
#LSTM #RNN #AI