In a paper published in the journal Entropy, researchers explored a hybrid quantum-classical approach for stock price prediction using a quantum long short-term memory (QLSTM) model, which integrated classical LSTM with quantum computing.
They validated QLSTM on an International Business Machines (IBM) quantum simulator and a real quantum computer, comparing its performance to classical models. The QLSTM model achieved superior results, with lower RMSE and higher prediction accuracy, outperforming the classical LSTM and other models. Hyperparameter impacts were also analyzed to optimize performance.
Related Work
Past stock price prediction work has utilized classical and quantum machine learning techniques. Classical models, such as LSTM, showed strong error minimization but lacked robustness for portfolio selection, while social media-based models achieved varied accuracy. Quantum approaches explored methods like quantum neural networks and quantum support vector machines (SVMs), with mixed results in prediction effectiveness. Hybrid models, although promising, faced challenges with computational complexity and scalability.
Quantum-Enhanced Stock Prediction Model
The QLSTM architecture for stock price prediction combines classical LSTM networks with variational quantum circuits (VQCs) to leverage quantum-enhanced learning. This model is designed to be compatible with noisy intermediate-scale quantum (NISQ) devices, involving stages like input data initialization, encoding into a quantum state, and employing quantum gates for state manipulation.
Data pre-processing methods such as normalization prepare the stock-price dataset before feeding it into the hybrid model, which integrates both classical and quantum layers. Quantum gates rotate and entangle qubits, and the results from VQCs are measured to yield predicted prices and iteratively refined to minimize error.
The core structure of QLSTM mirrors that of LSTM but replaces the classical gates with VQCs. VQC1 to VQC4 correspond to the forget, input, update, and output gates, respectively, while VQC5 and VQC6 transform the cell state into the hidden and final output states. These VQCs generate values through quantum operations, which may be further refined with activation functions. Whereas classical LSTM gates compute affine transformations of their inputs followed by activations, the QLSTM's VQC-based gates process information in a high-dimensional quantum state space, enhancing the model's capacity to learn complex temporal dependencies in data.
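The six-VQC gate layout described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the quantum circuits are stood in for by a hypothetical callable `vqc(name, v)` (here a fixed tanh-squashed linear map, so outputs stay in [-1, 1] like Pauli-Z expectation values), and the vector sizes are arbitrary choices for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def qlstm_step(x_t, h_prev, c_prev, vqc):
    """One QLSTM time step following the six-VQC layout described above.

    `vqc(name, v)` is a hypothetical stand-in for a variational quantum
    circuit: any callable mapping a vector to a vector of measurements.
    """
    v = np.concatenate([h_prev, x_t])   # classical concatenation, as in LSTM
    f = sigmoid(vqc("VQC1", v))         # forget gate
    i = sigmoid(vqc("VQC2", v))         # input gate
    g = np.tanh(vqc("VQC3", v))         # update (candidate cell state)
    o = sigmoid(vqc("VQC4", v))         # output gate
    c = f * c_prev + i * g              # new cell state
    h = o * np.tanh(vqc("VQC5", c))     # hidden state via VQC5
    y = vqc("VQC6", h)                  # final output via VQC6
    return h, c, y

# Toy stand-in circuits: fixed linear maps squashed by tanh, so outputs
# stay in [-1, 1] like Pauli-Z expectation values of a real VQC.
rng = np.random.default_rng(0)
W = {f"VQC{k}": rng.normal(scale=0.1, size=(4, 6)) for k in range(1, 5)}
W["VQC5"] = rng.normal(scale=0.1, size=(4, 4))
W["VQC6"] = rng.normal(scale=0.1, size=(1, 4))
vqc_stub = lambda name, v: np.tanh(W[name] @ v)

h, c, y = qlstm_step(np.array([0.5, 0.2]), np.zeros(4), np.zeros(4), vqc_stub)
```

In a trained model, `vqc_stub` would be replaced by actual parameterized circuits whose rotation angles are optimized by gradient descent; the surrounding cell-state arithmetic stays classical.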
VQCs form a vital element of the QLSTM model. They consist of data encoding, variational, and measurement layers. The encoding layer uses quantum gates to transform classical inputs into quantum states, typically using Hadamard and rotation gates. This layer ensures that input values are appropriately normalized. Variational layers containing adjustable parameters are crucial for optimizing the quantum model's predictive power. These parameters are iteratively tuned through gradient-based methods to map nonlinear complex relationships within the input data effectively.
The performance of VQCs in the QLSTM model depends on the entanglement and rotation operations, which map information into a quantum state space. Using controlled-NOT (CNOT) and single-qubit rotation gates, these circuits entangle qubits to represent multiqubit interactions.
The optimization process adjusts rotation angles, which are not predefined but instead learned during training. The integration of VQCs within the QLSTM structure, including repeated layers for higher expressiveness, allows the model to capture intricate patterns in stock price movements, albeit with trade-offs in computation time.
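The encoding, variational, and measurement layers described above can be simulated directly with numpy for a small register. This is a two-qubit sketch under stated assumptions: the specific choice of arctan angle encoding, a single RY variational layer, and one CNOT are illustrative, not the paper's exact circuit.

```python
import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)  # Hadamard
I2 = np.eye(2)
Z = np.diag([1.0, -1.0])                                 # Pauli-Z

def RY(theta):
    # Single-qubit rotation about the Y axis
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def one_qubit_op(gate, qubit, n):
    # Lift a single-qubit gate to the full n-qubit register
    full = np.array([[1.0]])
    for q in range(n):
        full = np.kron(full, gate if q == qubit else I2)
    return full

def vqc(x, params):
    """Two-qubit VQC: Hadamard + angle encoding, variational RY layer,
    CNOT entanglement, then Pauli-Z expectation values."""
    n = 2
    state = np.zeros(2 ** n)
    state[0] = 1.0
    for q in range(n):                                    # encoding layer
        state = one_qubit_op(H, q, n) @ state
        state = one_qubit_op(RY(np.arctan(x[q])), q, n) @ state
    for q in range(n):                                    # variational layer
        state = one_qubit_op(RY(params[q]), q, n) @ state
    cnot = np.array([[1, 0, 0, 0], [0, 1, 0, 0],          # entangle 0 -> 1
                     [0, 0, 0, 1], [0, 0, 1, 0]], dtype=float)
    state = cnot @ state
    # measurement layer: <Z> expectation on each qubit
    return np.array([state @ one_qubit_op(Z, q, n) @ state for q in range(n)])

expvals = vqc(x=np.array([0.3, -0.7]), params=np.array([0.1, 0.4]))
```

The arctan in the encoding layer plays the normalization role mentioned above, squashing unbounded inputs into a valid rotation-angle range; during training, only `params` would be updated by the optimizer.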
Experimental Setup Overview
The experimental setup included the use of stock price data for Apple Inc. from 1 January 2022 to 1 January 2023, with 251 observations across five columns: date, open, high, low, and close. Data pre-processing was performed for numerical stability, and the dataset was split into 70% for training and 30% for testing. Evaluation metrics like root mean square error (RMSE) and prediction accuracy were used to assess model performance. Classical LSTM and QLSTM models were trained with a learning rate of 0.01, using mean squared error (MSE) as the loss function and the Adam optimizer.
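The pre-processing, chronological split, and RMSE metric described above can be sketched as follows. Assumptions are labeled in the code: the source says only "normalization," so min-max scaling is assumed, and the price series is synthetic since the actual Apple data is not reproduced here.

```python
import numpy as np

def minmax_normalize(series):
    # Min-max scaling (assumed; the source only says "normalization")
    lo, hi = series.min(), series.max()
    return (series - lo) / (hi - lo)

def chronological_split(series, train_frac=0.70):
    # Time-series split: earliest 70% for training, remainder for testing
    cut = int(len(series) * train_frac)
    return series[:cut], series[cut:]

def rmse(y_true, y_pred):
    # Root mean square error, the paper's headline error metric
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# 251 synthetic daily closes standing in for the Apple dataset
rng = np.random.default_rng(42)
closes = 150.0 + np.cumsum(rng.normal(0.0, 1.0, size=251))
scaled = minmax_normalize(closes)
train, test = chronological_split(scaled)
```

With 251 observations, this split yields 175 training points and 76 test points; scaling before splitting keeps all values in [0, 1] for numerical stability during training.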
The QLSTM model showed superior performance over classical LSTM and other models. Training losses (MSE) for QLSTM remained lower with less fluctuation, indicating improved data representation through quantum encoding. The Noiseless QLSTM achieved an accuracy of 1.0 and an RMSE of 0.0371, while the Noisy QLSTM achieved an accuracy of 0.9714 and an RMSE of 0.0511. Despite the limitations of actual quantum devices, QLSTM outperformed classical models, demonstrating a 10% improvement in accuracy and a 50% reduction in RMSE.
For prediction performance, QLSTM outperformed classical models in the Noiseless and Noisy scenarios, though its performance dropped in the Actual environment due to quantum noise. An experiment on the impact of qubit count revealed no significant improvement as the number of qubits increased. The degradation in performance between 8 and 11 qubits could be explained by the barren plateau problem, highlighting the challenge of optimizing VQCs as the number of qubits grows.
Conclusion
To sum up, the study introduced a hybrid quantum-classical framework, QLSTM, which combined classical LSTM with variational quantum circuits (VQCs) for stock price prediction. QLSTM outperformed classical LSTM, achieving a 50% reduction in RMSE and a 10% accuracy improvement.
However, QLSTM's performance was lower than that of classical models when evaluated on an actual quantum machine. Future work aims to improve scalability, explore quantum error mitigation, and extend QLSTM to other time-series tasks such as energy forecasting and healthcare.
Journal Reference
Kea, K., et al. (2024). A Hybrid Quantum-Classical Model for Stock Price Prediction Using Quantum-Enhanced Long Short-Term Memory. Entropy, 26(11), 954. DOI: 10.3390/e26110954. https://www.mdpi.com/1099-4300/26/11/954