Go to [[Week 2 - Introduction]] or back to the [[Main AI Page]]. Part of the pages on [[Artificial Intelligence/Introduction to AI/Week 2 - Introduction/Natural Language Processing]] and [[Attention Mechanism]].
LSTMs (Long Short-Term Memory networks) are a deep learning architecture, in the same way that [[CNNS - Convolutional neural networks]] are.
These networks are able to selectively forget information, according to the RNN section of the beginners' guide to NLP.
According to Wikipedia:

> A common LSTM unit is composed of a cell, an input gate, an output gate and a forget gate. The cell remembers values over arbitrary time intervals and the three gates regulate the flow of information into and out of the cell.
An LSTM unit is made up of a:
- cell, which remembers values over arbitrary time intervals
- input gate
- forget gate
- output gate
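To make the cell and the three gates concrete, here is a minimal sketch of a single LSTM time step in NumPy. It assumes the standard formulation (sigmoid gates, tanh candidate values); the function name `lstm_step` and the tiny sizes are illustrative, not from any particular library.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step. W maps [h_prev; x] to the four gate
    pre-activations, stacked as [input, forget, candidate, output]."""
    hidden = h_prev.shape[0]
    z = W @ np.concatenate([h_prev, x]) + b
    i = sigmoid(z[0:hidden])            # input gate: how much new info to write
    f = sigmoid(z[hidden:2*hidden])     # forget gate: how much of the cell to keep
    g = np.tanh(z[2*hidden:3*hidden])   # candidate values for the cell
    o = sigmoid(z[3*hidden:4*hidden])   # output gate: how much of the cell to expose
    c = f * c_prev + i * g              # the cell remembers across time steps
    h = o * np.tanh(c)                  # hidden state / output at this step
    return h, c

# Tiny usage example with random weights (sizes are arbitrary).
rng = np.random.default_rng(0)
hidden, inputs = 4, 3
W = rng.standard_normal((4 * hidden, hidden + inputs)) * 0.1
b = np.zeros(4 * hidden)
h, c = lstm_step(rng.standard_normal(inputs),
                 np.zeros(hidden), np.zeros(hidden), W, b)
```

The key point is the cell update `c = f * c_prev + i * g`: because it is additive, information can flow across many time steps, and the forget gate `f` is what lets the network discard it.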
The main difference between LSTMs and [[GRUs]] is that a GRU merges the forget and input gates into a single update gate and has no separate cell state, so it has fewer parameters.
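For comparison, a sketch of one GRU time step under the standard formulation (again, names and sizes are illustrative assumptions, not a library API). Note there are only two gates and no separate cell state:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, Wz, Wr, Wh, bz, br, bh):
    """One GRU time step: update gate z, reset gate r, no cell state."""
    hx = np.concatenate([h_prev, x])
    z = sigmoid(Wz @ hx + bz)                              # update gate
    r = sigmoid(Wr @ hx + br)                              # reset gate
    h_tilde = np.tanh(Wh @ np.concatenate([r * h_prev, x]) + bh)
    return (1 - z) * h_prev + z * h_tilde                  # blend old and new state

# Tiny usage example with random weights (sizes are arbitrary).
rng = np.random.default_rng(1)
hidden, inputs = 4, 3
cat = hidden + inputs
Wz, Wr, Wh = (rng.standard_normal((hidden, cat)) * 0.1 for _ in range(3))
h = gru_step(rng.standard_normal(inputs), np.zeros(hidden),
             Wz, Wr, Wh, np.zeros(hidden), np.zeros(hidden), np.zeros(hidden))
```

Here the single update gate `z` plays the role of both the LSTM's input and forget gates, which is why GRUs end up with fewer parameters.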