What is LSTM?
LSTM is a type of recurrent neural network (RNN) in which some layers have feedback connections. Unlike a conventional RNN, it is well suited to learning from experience to predict time series even when the relevant time lags between events are of arbitrary length.
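What makes the LSTM robust to long time lags is its gated cell state. As an illustration only, the following toy sketch implements one LSTM time step for scalar inputs and states; the weight layout (`w_x`, `w_h`, bias per gate) is a simplifying assumption, not the architecture used in this work:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One LSTM time step for scalar inputs/states (toy sketch).

    w maps each gate name to (w_x, w_h, b):
    'f' forget, 'i' input, 'g' candidate, 'o' output.
    """
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])    # forget gate
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])    # input gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2])  # candidate cell value
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])    # output gate
    c = f * c_prev + i * g       # cell state: old memory kept by f, new info admitted by i
    h = o * math.tanh(c)         # hidden state: gated read-out of the cell
    return h, c
```

Because the cell state `c` is updated additively (scaled by the forget gate rather than squashed at every step), gradients can survive across many time steps, which is the property the text alludes to.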
To generate deep, invariant features for step-ahead prediction of cryptocurrency prices, this work presents a deep learning framework for financial time series forecasting that integrates stacked autoencoders with long short-term memory.
Long short-term memory is one of many variants of the recurrent neural network (RNN) architecture. This section introduces the RNN model and its LSTM variant for forecasting the closing price, starting from the basic RNN model and then proceeding to the LSTM model.
The RNN is a deep neural network architecture whose depth unfolds along the temporal dimension. A conventional feed-forward network assumes that the elements of the input vector are independent of one another, so it cannot exploit sequential information. In contrast, the RNN maintains a hidden state that is updated by the sequential data of the time series, and its output depends on that hidden state.
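The hidden-state recurrence can be sketched in a few lines. This is a minimal scalar example of the vanilla RNN update h_t = tanh(w_x * x_t + w_h * h_{t-1} + b); the weight values are illustrative assumptions:

```python
import math

def rnn_forward(xs, w_x=1.0, w_h=1.0, b=0.0):
    """Run a scalar vanilla RNN over a sequence and return all hidden states.

    Each step mixes the new input with the previous hidden state,
    so every output depends on the entire sequence seen so far.
    """
    h = 0.0       # initial hidden state
    hs = []
    for x in xs:
        h = math.tanh(w_x * x + w_h * h + b)
        hs.append(h)
    return hs
```

Feeding the same values in a different order yields a different final hidden state, which is exactly the sequential dependence that a feed-forward network cannot capture.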
The basic design of the LSTM network fixes the number of hidden layers and the number of time steps, i.e., how many past observations are used for training and testing. The financial time series is split into three subsets, a training subset, a validation subset, and a test subset, in proportions of 50%, 30%, and 20%, respectively. The back-propagation algorithm is used to train the WSAEs-LSTM model as well as the models in the experimental control group: WLSTM, LSTM, and RNN. The speed of convergence is controlled by the learning rate, which is a decreasing function of time. The experimental results are considered reliable once training converges for the chosen parameter combinations.
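The two preparation steps above, the chronological 50%/30%/20% split and the decaying learning rate, can be sketched as follows. The split ratios come from the text; the specific decay schedule (inverse-time decay with hypothetical `lr0` and `decay` values) is an assumption for illustration, since the text only states that the rate decreases over time:

```python
def chronological_split(series, train_frac=0.5, val_frac=0.3):
    """Split a time series in temporal order (no shuffling) into
    train/validation/test subsets, defaulting to the 50/30/20 proportions."""
    n = len(series)
    i = int(n * train_frac)
    j = int(n * (train_frac + val_frac))
    return series[:i], series[i:j], series[j:]

def learning_rate(step, lr0=0.05, decay=0.01):
    """Inverse-time decay: the learning rate shrinks as training proceeds,
    slowing the updates as the model approaches convergence."""
    return lr0 / (1.0 + decay * step)
```

Keeping the split chronological matters for financial data: shuffling before splitting would let the model peek at future prices during training.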