One of the most commonly used gated recurrent neural network architectures is the LSTM, which stands for long short-term memory. Now, replace every hidden unit with something called an LSTM cell, and add another connection between the cells called the cell state. This here is now our LSTM RNN. LSTMs were designed to mitigate the vanishing and exploding gradient problems. Apart from the hidden state vector, each LSTM cell maintains a cell state vector, and at each time step the cell can choose to read from it, write to it, or reset it using an explicit gating mechanism. Because the cell state is updated additively rather than through repeated matrix multiplications, gradients can flow through it over many time steps without shrinking or blowing up as quickly.
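To make the gating concrete, here is a minimal sketch of a single LSTM time step in NumPy. The function name `lstm_step` and the stacked weight layout are my own illustrative choices, not from the lecture; the point is to show how the forget gate resets the cell, the input gate writes to it, and the output gate reads from it.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step (illustrative sketch).

    x      : input at this time step, shape (d,)
    h_prev : previous hidden state, shape (n,)
    c_prev : previous cell state, shape (n,)
    W      : stacked gate weights, shape (4n, d + n)
    b      : stacked gate biases, shape (4n,)
    """
    n = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b

    f = sigmoid(z[0 * n:1 * n])   # forget gate: how much of c_prev to keep (reset)
    i = sigmoid(z[1 * n:2 * n])   # input gate: how much new content to write
    o = sigmoid(z[2 * n:3 * n])   # output gate: how much of the cell to read out
    g = np.tanh(z[3 * n:4 * n])   # candidate cell content

    c = f * c_prev + i * g        # write: additive, gated update of the cell state
    h = o * np.tanh(c)            # read: gated exposure of the cell state
    return h, c
```

Note that the cell state update `c = f * c_prev + i * g` is a sum, not a product of many Jacobians, which is exactly why the gradient along the cell state behaves better than in a plain RNN.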