﻿<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD with MathML3 v1.2 20190208//EN" "http://dtd.nlm.nih.gov/publishing/3.0/journalpublishing3.dtd">
<article
    xmlns:mml="http://www.w3.org/1998/Math/MathML"
    xmlns:xlink="http://www.w3.org/1999/xlink" dtd-version="3.0" xml:lang="en" article-type="article">
  <front>
    <journal-meta>
      <journal-id journal-id-type="publisher-id">JAIBD</journal-id>
      <journal-title-group>
        <journal-title>Journal of Artificial Intelligence and Big Data</journal-title>
      </journal-title-group>
      <issn pub-type="epub">2771-2389</issn>
      <publisher>
        <publisher-name>Science Publications</publisher-name>
      </publisher>
    </journal-meta>
    <article-meta>
      <article-id pub-id-type="doi">10.31586/jaibd.2024.877</article-id>
      <article-id pub-id-type="publisher-id">JAIBD-877</article-id>
      <article-categories>
        <subj-group subj-group-type="heading">
          <subject>Article</subject>
        </subj-group>
      </article-categories>
      <title-group>
        <article-title>
          Stock Closing Price and Trend Prediction with LSTM-RNN
        </article-title>
      </title-group>
      <contrib-group>
<contrib contrib-type="author">
<name>
<surname>Varadharajan</surname>
<given-names>Vivek</given-names>
</name>
<xref rid="af1" ref-type="aff">1</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Smith</surname>
<given-names>Nathan</given-names>
</name>
<xref rid="af2" ref-type="aff">2</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Kalla</surname>
<given-names>Dinesh</given-names>
</name>
<xref rid="af2" ref-type="aff">2</xref>
<xref rid="c1" ref-type="corresp">*</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Kumar</surname>
<given-names>Ganesh R</given-names>
</name>
<xref rid="af3" ref-type="aff">3</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Samaah</surname>
<given-names>Fnu</given-names>
</name>
<xref rid="af4" ref-type="aff">4</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Polimetla</surname>
<given-names>Kiran</given-names>
</name>
<xref rid="af5" ref-type="aff">5</xref>
</contrib>
      </contrib-group>
<aff id="af1"><label>1</label> Denver University, Department of Computer Science, CO, USA</aff>
<aff id="af2"><label>2</label> Colorado Technical University, Department of Computer Science, CO, USA</aff>
<aff id="af3"><label>3</label> Indian Institute of Technology, TN, India</aff>
<aff id="af4"><label>4</label> Harrisburg University of Science and Technology, Department of Computer Science, Harrisburg, PA, USA</aff>
<aff id="af5"><label>5</label> Adobe, Adobe Technology Services, 345 Park Ave, San Jose, CA, USA</aff>
<author-notes>
<corresp id="c1">
<label>*</label>Corresponding author at: Colorado Technical University, Department of Computer Science, CO, USA
</corresp>
</author-notes>
      <pub-date pub-type="epub">
        <day>15</day>
        <month>02</month>
        <year>2024</year>
      </pub-date>
      <volume>4</volume>
      <issue>1</issue>
      <history>
        <date date-type="received">
          <day>10</day>
          <month>01</month>
          <year>2024</year>
        </date>
        <date date-type="rev-recd">
          <day>11</day>
          <month>02</month>
          <year>2024</year>
        </date>
        <date date-type="accepted">
          <day>14</day>
          <month>02</month>
          <year>2024</year>
        </date>
        <date date-type="pub">
          <day>15</day>
          <month>02</month>
          <year>2024</year>
        </date>
      </history>
      <permissions>
        <copyright-statement>&#xa9; Copyright 2024 by authors and Trend Research Publishing Inc. </copyright-statement>
        <copyright-year>2024</copyright-year>
        <license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/4.0/">
          <license-p>This work is licensed under the Creative Commons Attribution International License (CC BY). http://creativecommons.org/licenses/by/4.0/</license-p>
        </license>
      </permissions>
      <abstract>
        The stock market is highly volatile and hard to predict accurately because of the many uncertainties affecting stock prices. Accurate predictive models would nevertheless help investors and stock traders make informed decisions about buying, holding, or selling stocks, and financial institutions could use such models to manage risk and optimize their customers' investment portfolios. In this paper, we use a Long Short-Term Memory (LSTM) Recurrent Neural Network (LSTM-RNN) to predict the daily closing price of Amazon Inc. stock (ticker symbol: AMZN). We study the influence of various hyperparameters on the model to see what factors affect its predictive power. The root mean squared error (RMSE) on the training data was 2.51, with a mean absolute percentage error (MAPE) of 1.84%.
      </abstract>
      <kwd-group>
        <kwd>Stock Price Prediction</kwd>
        <kwd>Artificial Intelligence</kwd>
        <kwd>Machine Learning</kwd>
        <kwd>LSTM-RNN</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec1">
<title>Introduction</title><p>In the stock market, making informed decisions is crucial for investors when buying or selling stocks and optimizing their investments to earn profits. Because the market fluctuates daily, predicting which stocks will be profitable is challenging. The uncertainty of the stock market undermines predictions made without experience or the help of deep learning architectures. There are several techniques individuals and companies are adopting to make informed decisions when buying and selling stocks, and deep learning architectures are among the most widely deployed. Deep learning architectures have self-learning processes that capture hidden dynamics and patterns not easily explained by traditional regression-based approaches, which depend on a predefined set of covariates and assumptions. This ability to capture and remember hidden patterns and dynamics helps in understanding the volatile, non-linear stock market. Most deep learning techniques use neural network models, which are trained to evaluate large amounts of data and learn features from data sets without manual feature extraction [
<xref ref-type="bibr" rid="R1">1</xref>]. </p>
<p>In a traditional neural network, inputs and outputs are treated independently. In a neural network used for stock market predictions, however, the interdependence of inputs and outputs must be modeled to capture a sequence and predict it later. Recurrent architectures capture the sequence of inputs and outputs and retain relevant information long enough to make predictions in the future. For example, if one wears clothes in a sequence of red, blue, green, yellow, and black throughout the week, or for two weeks in a row, that sequence is captured in the network's memory and is reproduced when predictions are made. This example shows that deep learning neural networks work based on the information retained in their memory.</p>
<p>Recurrent neural networks (RNNs) are a family of neural networks designed for modeling sequence data. The logic behind an RNN is that it remembers the stock price in a particular sequence and uses it later to make stock predictions in a specific pattern. RNN layers iterate over timesteps and encode information about the timesteps observed so far. Plain recurrent neural networks can only remember encoded timesteps for a short time; this is why Long Short-Term Memory (LSTM) networks are used to remember patterns over a more extended period. LSTM is preferred over a plain RNN because it retains information for a longer time. An LSTM unit consists of an input gate, a cell, an output gate, and a forget gate. The cell holds values, and the three gates regulate the flow of information into and out of the cell [
<xref ref-type="bibr" rid="R2">2</xref>]. </p>
<p>Traditionally, stock price models and predictions were made with the Auto-Regressive Integrated Moving Average (ARIMA) model to predict short-term variations in stock price [
<xref ref-type="bibr" rid="R3">3</xref>]. Long Short-Term Memory (LSTM) neural networks are often better than Autoregressive Integrated Moving Average (ARIMA) models for stock price prediction because they capture complex and non-linear patterns in time series data. While ARIMA models are linear and rely on stationarity assumptions, LSTMs can model long-term dependencies and handle nonstationary data [
<xref ref-type="bibr" rid="R4">4</xref>].</p>
<p>Long Short-Term Memory (LSTM) recurrent neural networks (RNN) have demonstrated their advantage over other models for stock price prediction due to their ability to identify patterns and model the complex temporal dependencies inherent in financial time series data. Unlike traditional linear models that often do not consider non-linear and dynamic patterns, LSTM-RNNs simultaneously incorporate long-range dependencies and short-term sudden fluctuations or spikes in the price. Moreover, the memory cells in LSTM networks allow them to learn and remember past information over extended periods, which is particularly crucial for capturing stock price trends influenced by historical market conditions and news events.</p>
</sec><sec id="sec2">
<title>Materials and Methods</title><p>The stock market is characterized by discontinuity, nonlinearity, and multifaceted elements that affect the broker's assumptions about market trends and economic situations [
<xref ref-type="bibr" rid="R5">5</xref>]. Considering the uncertainty, rapid and informed decision-making is required in the shortest possible time. According to research studies, several validated methods are available to predict the stock market, but LSTM-RNNs are the most used among investors to gain higher profits [
<xref ref-type="bibr" rid="R6">6</xref>]. These methods are effective at capturing both the linear and non-linear behavior of stocks. </p>
<p>Research has shown that LSTM-RNNs consistently outperform conventional statistical models and even other deep learning architectures, such as feedforward neural networks and convolutional neural networks, regarding prediction accuracy [
<xref ref-type="bibr" rid="R7">7</xref>,<xref ref-type="bibr" rid="R8">8</xref>]. Additionally, LSTM networks can seamlessly incorporate various data types, including technical indicators, macroeconomic factors, and sentiment analysis, making them a versatile choice for stock price forecasting tasks [
<xref ref-type="bibr" rid="R9">9</xref>]. These advantages underscore the suitability of LSTM RNNs as a preferred choice for accurate and robust stock closing price prediction.</p>
<p>Gupta and Wang (2010) used neural networks to trade and forecast the future stock prices of the Standard and Poor's 500 (S&amp;P 500). Their study examines the effect of training a network on past indexed data versus the most recent data. The results reveal a higher rate of return when training on past data compared to only the most recent trends. Thus, if a company's stocks were profitable in the past, there is a higher possibility that the company will be profitable in the future. Innumerable Exchange-Traded Funds (ETFs) replicated the performance of the S&amp;P 500 index by holding stocks in the same proportions as the index, yielding the same percentage returns [
<xref ref-type="bibr" rid="R10">10</xref>]. </p>
<p>Research by Zheng et al. (2013) explored applications of recurrent neural networks (RNNs) whose hidden layers are used to retain information to make stock predictions. This research discusses the essential components of this neural network based on input and output mechanisms. This neural network was tested on different companies, and results show that the system has a higher accuracy score when making stock predictions [
<xref ref-type="bibr" rid="R11">11</xref>].</p>
<p>Lim et al. (2016) used neural network architectures to predict housing prices in the Singaporean market. The deployed neural networks were used to model the relationship between the release price index (RPI) of houses in Singapore and nine independent demographic and economic variables. The study results reveal that the deployed neural network architectures are good at producing predictions [
<xref ref-type="bibr" rid="R12">12</xref>]. Hence, these networks are helpful to make informed and profitable decisions when it comes to buying and selling stocks. </p>
<p>According to the literature reviewed above, stock market predictions can be made using neural network architectures, but the study of advanced architectures is still in progress. Neural network architectures are widely accepted for making stock predictions, but they have yet to prove decisively influential. These methods continue to be enhanced, as researchers approach stock market prediction from different directions.</p>
</sec><sec id="sec3">
<title>Research Methodology and Design</title><p>A quantitative research method is the best approach for studying the influence of various hyperparameters on the model and seeing what factors affect its predictive power. A quantitative research method is used to study the effect of particular variables on others [
<xref ref-type="bibr" rid="R13">13</xref>]. The aim is to check the LSTM-RNNs' accuracy to predict Amazon Inc's stock trends (AMZN). Thus, data will be collected from different sources to interpret the model's accuracy. The quantitative interpretation of the model will provide a holistic understanding of the trend and how it differs from the rest of the neural network models for stock prediction.</p>
<p>Controlled synthetic data will be produced to provide a better understanding of the predictive performance of the proposed neural network LSTM-RNN. Data and results will be interpreted visually to provide a holistic picture of the trends examined. The analysis of collected data will be done using an auto-regressive (order 1) moving average (order 1) sequence with a first-order finite difference to make it stationary (ARIMA (1,1,1)). The proposed research method and design are time-consuming because each variable will be evaluated individually to understand its relationship with the predictive network. </p>
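<p>The synthetic data generation described above can be sketched in Python with statsmodels. This is a minimal sketch, not the paper's actual code: the AR and MA coefficients (0.6 and 0.4) and the series length of 1000 are illustrative assumptions.</p>

```python
# Sketch: generating a synthetic ARIMA(1,1,1) series with zero-mean,
# unit-variance Gaussian noise. Coefficients 0.6 / 0.4 are assumed values.
import numpy as np
from statsmodels.tsa.arima_process import arma_generate_sample

np.random.seed(42)

ar = [1, -0.6]   # AR polynomial: 1 - 0.6 L  (order p = 1)
ma = [1, 0.4]    # MA polynomial: 1 + 0.4 L  (order q = 1)
stationary = arma_generate_sample(ar, ma, nsample=1000, scale=1.0)

# Integrate once (d = 1): the cumulative sum of an ARMA(1,1) process
# is an ARIMA(1,1,1) process; differencing it once recovers stationarity.
series = np.cumsum(stationary)
```

<p>Taking the first finite difference of <code>series</code> recovers the stationary ARMA(1,1) component, which is exactly the differencing step the analysis relies on.</p>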
<p>After synthetically generating ARIMA(1,1,1) data, the data was split into an 80-20% ratio for training and testing purposes. The MinMax scaler was utilized to limit the data values between 0 and 1 [
<xref ref-type="bibr" rid="R14">14</xref>]. </p>
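<p>The split-and-scale step above can be sketched as follows, assuming scikit-learn's MinMaxScaler; the stand-in series here is hypothetical.</p>

```python
# Sketch of the 80/20 train-test split and MinMax scaling to [0, 1].
import numpy as np
from sklearn.preprocessing import MinMaxScaler

series = np.sin(np.linspace(0, 20, 1000)) * 10 + 100  # stand-in series

split = int(len(series) * 0.8)              # first 80% for training
train, test = series[:split], series[split:]

# Fit the scaler on the training data only, then apply it to both splits,
# so no information from the test period leaks into training.
scaler = MinMaxScaler(feature_range=(0, 1))
train_scaled = scaler.fit_transform(train.reshape(-1, 1))
test_scaled = scaler.transform(test.reshape(-1, 1))

# The inverse transform restores the original scale before metrics/plots.
restored = scaler.inverse_transform(train_scaled)
```

<p>Fitting the scaler on the training split alone is the standard way to avoid look-ahead leakage when evaluating on the held-out 20%.</p>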
<p>Auto-regressive integrated moving average (ARIMA) model equation:</p>

<disp-formula id="FD1"><div class="html-disp-formula-info"><div class="f"><math display="inline"><semantics><mrow><msub><mrow><mi>y</mi></mrow><mrow><mi>t</mi></mrow></msub><mo>=</mo><msub><mrow><mi mathvariant="normal">ϕ</mi></mrow><mrow><mn>1</mn></mrow></msub><msub><mrow><mi>y</mi></mrow><mrow><mi>t</mi><mo>-</mo><mn>1</mn></mrow></msub><mo>+</mo><msub><mrow><mi mathvariant="normal">ϕ</mi></mrow><mrow><mn>2</mn></mrow></msub><msub><mrow><mi>y</mi></mrow><mrow><mi>t</mi><mo>-</mo><mn>2</mn></mrow></msub><mo>+</mo><mo>…</mo><mo>+</mo><msub><mrow><mi mathvariant="normal">ϕ</mi></mrow><mrow><mi>p</mi></mrow></msub><msub><mrow><mi>y</mi></mrow><mrow><mi>t</mi><mo>-</mo><mi>p</mi></mrow></msub><mo>-</mo><msub><mrow><mi mathvariant="normal">θ</mi></mrow><mrow><mn>1</mn></mrow></msub><msub><mrow><mi mathvariant="normal">ϵ</mi></mrow><mrow><mi>t</mi><mo>-</mo><mn>1</mn></mrow></msub><mo>-</mo><msub><mrow><mi mathvariant="normal">θ</mi></mrow><mrow><mn>2</mn></mrow></msub><msub><mrow><mi mathvariant="normal">ϵ</mi></mrow><mrow><mi>t</mi><mo>-</mo><mn>2</mn></mrow></msub><mo>+</mo><mo>…</mo><mo>+</mo><msub><mrow><mi mathvariant="normal">ϵ</mi></mrow><mrow><mi>t</mi></mrow></msub><mo>-</mo><msub><mrow><mi mathvariant="normal">θ</mi></mrow><mrow><mi>q</mi></mrow></msub><msub><mrow><mi mathvariant="normal">ϵ</mi></mrow><mrow><mi>t</mi><mo>-</mo><mi>q</mi></mrow></msub></mrow></semantics></math></div><div class="l"><label>(1)</label></div></div></disp-formula><p>Where <math><semantics><mrow><msub><mrow><mi>y</mi></mrow><mrow><mi>t</mi></mrow></msub></mrow></semantics></math> is the time series value at time <math><semantics><mrow><mi>t</mi></mrow></semantics></math>, <math><semantics><mrow><msub><mrow><mi mathvariant="normal">ϕ</mi></mrow><mrow><mn>1</mn></mrow></msub><mo>,</mo><mo>…</mo><mo>,</mo><msub><mrow><mi mathvariant="normal">ϕ</mi></mrow><mrow><mi>p</mi></mrow></msub></mrow></semantics></math> and <math><semantics><mrow><msub><mrow><mi mathvariant="normal">θ</mi></mrow><mrow><mn>1</mn></mrow></msub><mo>,</mo><mo>…</mo><mo>,</mo><msub><mrow><mi mathvariant="normal">θ</mi></mrow><mrow><mi>q</mi></mrow></msub></mrow></semantics></math> are the model parameters, and <math><semantics><mrow><msub><mrow><mi mathvariant="normal">ϵ</mi></mrow><mrow><mi>t</mi></mrow></msub></mrow></semantics></math> is white noise. The ACF and PACF play a vital role in determining the orders of the ARIMA model, which is defined by the parameters p, d, and q, where p is the order of the autoregressive component, d is the degree of differencing, and q is the order of the moving-average component. 
The ACF helps determine the value of q, which is the number of lagged error terms included in the model, while the PACF helps determine the value of p, which is the number of lag observations directly affecting the current observation. These values, along with the differencing parameter d, collectively define the ARIMA model that best fits the time series data. The ACF and PACF plots assist in identifying the optimal values for p, d, and q by examining the decay patterns and significant spikes in the correlation functions at various lags.</p>
<p>An LSTM-RNN model was fitted to the training data using a batch size of 64, 50 epochs, a sequence length of 5 (the number of past values used as features), and two hidden layers with 50 neurons each. This fitted model was then applied to the test data to generate predictions. </p>
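<p>The "sequence length of 5" step can be made concrete: the series is windowed into input-target pairs where each target is predicted from the five preceding values. This is a minimal sketch; the function name and the tiny stand-in series are assumptions, and the trailing feature axis matches the (samples, timesteps, features) shape commonly expected by LSTM layers.</p>

```python
# Sketch: windowing a 1-D series into supervised (X, y) pairs
# with a sequence length of 5.
import numpy as np

def make_sequences(series, seq_len=5):
    """Return X with shape (n - seq_len, seq_len, 1) and y with shape (n - seq_len,)."""
    X, y = [], []
    for i in range(len(series) - seq_len):
        X.append(series[i:i + seq_len])   # 5 past values as features
        y.append(series[i + seq_len])     # the next value as the target
    X = np.asarray(X)[..., np.newaxis]    # add the feature axis
    return X, np.asarray(y)

series = np.arange(10, dtype=float)       # stand-in series 0..9
X, y = make_sequences(series, seq_len=5)
```
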
<p>Long Short-Term Memory (LSTM) model equations:</p>

<disp-formula id="FD2"><div class="html-disp-formula-info"><div class="f"><math display="inline"><semantics><mrow><msub><mrow><mi>i</mi></mrow><mrow><mi>t</mi></mrow></msub><mo>=</mo><mi mathvariant="normal">σ</mi><mfenced separators="|"><mrow><msub><mrow><mi>W</mi></mrow><mrow><mi>i</mi><mi>i</mi></mrow></msub><msub><mrow><mi>x</mi></mrow><mrow><mi>t</mi></mrow></msub><mo>+</mo><msub><mrow><mi>W</mi></mrow><mrow><mi>h</mi><mi>i</mi></mrow></msub><msub><mrow><mi>h</mi></mrow><mrow><mi>t</mi><mo>-</mo><mn>1</mn></mrow></msub><mo>+</mo><msub><mrow><mi>b</mi></mrow><mrow><mi>h</mi><mi>i</mi></mrow></msub><mo>+</mo><msub><mrow><mi>b</mi></mrow><mrow><mi>i</mi><mi>i</mi></mrow></msub></mrow></mfenced></mrow></semantics></math></div><div class="l"><label>(2)</label></div></div></disp-formula>
<disp-formula id="FD3"><div class="html-disp-formula-info"><div class="f"><math display="inline"><semantics><mrow><msub><mrow><mi>f</mi></mrow><mrow><mi>t</mi></mrow></msub><mo>=</mo><mi mathvariant="normal">σ</mi><mfenced separators="|"><mrow><msub><mrow><mi>W</mi></mrow><mrow><mi>i</mi><mi>f</mi></mrow></msub><msub><mrow><mi>x</mi></mrow><mrow><mi>t</mi></mrow></msub><mo>+</mo><msub><mrow><mi>W</mi></mrow><mrow><mi>h</mi><mi>f</mi></mrow></msub><msub><mrow><mi>h</mi></mrow><mrow><mi>t</mi><mo>-</mo><mn>1</mn></mrow></msub><mo>+</mo><msub><mrow><mi>b</mi></mrow><mrow><mi>h</mi><mi>f</mi></mrow></msub><mo>+</mo><msub><mrow><mi>b</mi></mrow><mrow><mi>i</mi><mi>f</mi></mrow></msub></mrow></mfenced></mrow></semantics></math></div><div class="l"><label>(3)</label></div></div></disp-formula>
<disp-formula id="FD4"><div class="html-disp-formula-info"><div class="f"><math display="inline"><semantics><mrow><msub><mrow><mi>c</mi></mrow><mrow><mi>t</mi></mrow></msub><mo>=</mo><msub><mrow><mi>f</mi></mrow><mrow><mi>t</mi></mrow></msub><msub><mrow><mi>c</mi></mrow><mrow><mi>t</mi><mo>-</mo><mn>1</mn></mrow></msub><mo>+</mo><msub><mrow><mi>i</mi></mrow><mrow><mi>t</mi></mrow></msub><mrow><mrow><mi mathvariant="normal">tanh</mi></mrow><mo>⁡</mo><mrow><mfenced separators="|"><mrow><msub><mrow><mi>W</mi></mrow><mrow><mi>i</mi><mi>c</mi></mrow></msub><msub><mrow><mi>x</mi></mrow><mrow><mi>t</mi></mrow></msub><mo>+</mo><msub><mrow><mi>W</mi></mrow><mrow><mi>h</mi><mi>c</mi></mrow></msub><msub><mrow><mi>h</mi></mrow><mrow><mi>t</mi><mo>-</mo><mn>1</mn></mrow></msub><mo>+</mo><msub><mrow><mi>b</mi></mrow><mrow><mi>h</mi><mi>c</mi></mrow></msub></mrow></mfenced></mrow></mrow></mrow></semantics></math></div><div class="l"><label>(4)</label></div></div></disp-formula>
<disp-formula id="FD5"><div class="html-disp-formula-info"><div class="f"><math display="inline"><semantics><mrow><msub><mrow><mi>o</mi></mrow><mrow><mi>t</mi></mrow></msub><mo>=</mo><mi mathvariant="normal">σ</mi><mfenced separators="|"><mrow><msub><mrow><mi>W</mi></mrow><mrow><mi>i</mi><mi>o</mi></mrow></msub><msub><mrow><mi>x</mi></mrow><mrow><mi>t</mi></mrow></msub><mo>+</mo><msub><mrow><mi>W</mi></mrow><mrow><mi>h</mi><mi>o</mi></mrow></msub><msub><mrow><mi>h</mi></mrow><mrow><mi>t</mi><mo>-</mo><mn>1</mn></mrow></msub><mo>+</mo><msub><mrow><mi>b</mi></mrow><mrow><mi>i</mi><mi>o</mi></mrow></msub><mo>+</mo><msub><mrow><mi>b</mi></mrow><mrow><mi>h</mi><mi>o</mi></mrow></msub></mrow></mfenced></mrow></semantics></math></div><div class="l"><label>(5)</label></div></div></disp-formula>
<disp-formula id="FD6"><div class="html-disp-formula-info"><div class="f"><math display="inline"><semantics><mrow><msub><mrow><mi>h</mi></mrow><mrow><mi>t</mi></mrow></msub><mo>=</mo><msub><mrow><mi>o</mi></mrow><mrow><mi>t</mi></mrow></msub><mo>⋅</mo><mrow><mrow><mi mathvariant="normal">tanh</mi></mrow><mo>⁡</mo><mo>(</mo><msub><mrow><mi>c</mi></mrow><mrow><mi>t</mi></mrow></msub><mo>)</mo></mrow></mrow></semantics></math></div><div class="l"><label>(6)</label></div></div></disp-formula><p>Where <math><semantics><mrow><mi>i</mi><mo>,</mo><mi> </mi><mi>f</mi><mo>,</mo><mi> </mi><mi>c</mi><mo>,</mo><mi> </mi><mi>o</mi></mrow></semantics></math> are the input gate, forget gate, cell state, and output gate, whereas <math><semantics><mrow><mi mathvariant="normal">σ</mi></mrow></semantics></math> is the sigmoid function. <math><semantics><mrow><msub><mrow><mi>x</mi></mrow><mrow><mi>t</mi></mrow></msub></mrow></semantics></math> and <math><semantics><mrow><msub><mrow><mi>h</mi></mrow><mrow><mi>t</mi><mo>-</mo><mn>1</mn></mrow></msub></mrow></semantics></math> are the input at time t and the hidden state from the previous time step, respectively.</p>
<p> <math><semantics><mrow><msub><mrow><mi>W</mi></mrow><mrow><mi>i</mi><mi>i</mi></mrow></msub><mo>,</mo><msub><mrow><mi>W</mi></mrow><mrow><mi>h</mi><mi>i</mi></mrow></msub><mo>,</mo><msub><mrow><mi>b</mi></mrow><mrow><mi>h</mi><mi>i</mi></mrow></msub><mo>,</mo><msub><mrow><mi>b</mi></mrow><mrow><mi>i</mi><mi>i</mi></mrow></msub><mo>,</mo><msub><mrow><mi>W</mi></mrow><mrow><mi>i</mi><mi>f</mi></mrow></msub><mo>,</mo><mi> </mi><msub><mrow><mi>W</mi></mrow><mrow><mi>h</mi><mi>f</mi></mrow></msub><mo>,</mo><mi> </mi><msub><mrow><mi>b</mi></mrow><mrow><mi>h</mi><mi>f</mi></mrow></msub><mo>,</mo><msub><mrow><mi>b</mi></mrow><mrow><mi>i</mi><mi>f</mi></mrow></msub><mo>,</mo><msub><mrow><mi>W</mi></mrow><mrow><mi>i</mi><mi>o</mi></mrow></msub><mo>,</mo><msub><mrow><mi>W</mi></mrow><mrow><mi>h</mi><mi>o</mi></mrow></msub><mo>,</mo><msub><mrow><mi>b</mi></mrow><mrow><mi>i</mi><mi>o</mi></mrow></msub><mo>,</mo><msub><mrow><mi>b</mi></mrow><mrow><mi>h</mi><mi>o</mi></mrow></msub><mi> </mi></mrow></semantics></math>are the weights and biases of the LSTM unit, which are learned during the training process. The LSTM architecture allows for the modeling of sequential dependencies over long time horizons, making it particularly effective for tasks such as natural language processing and time series prediction.</p>
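<p>A single LSTM step implementing equations (2)-(6) can be written out directly in NumPy. This is a didactic sketch under assumed conventions: the input-to-hidden weights are named W, the recurrent (hidden-to-hidden) weights U, the paired biases are merged into one b per gate, and the tiny layer sizes are arbitrary.</p>

```python
# Didactic sketch: one LSTM cell step following equations (2)-(6).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W_i, U_i, b_i, W_f, U_f, b_f,
              W_c, U_c, b_c, W_o, U_o, b_o):
    i_t = sigmoid(W_i @ x_t + U_i @ h_prev + b_i)                       # eq. (2): input gate
    f_t = sigmoid(W_f @ x_t + U_f @ h_prev + b_f)                       # eq. (3): forget gate
    c_t = f_t * c_prev + i_t * np.tanh(W_c @ x_t + U_c @ h_prev + b_c)  # eq. (4): cell state
    o_t = sigmoid(W_o @ x_t + U_o @ h_prev + b_o)                       # eq. (5): output gate
    h_t = o_t * np.tanh(c_t)                                            # eq. (6): hidden state
    return h_t, c_t

rng = np.random.default_rng(0)
n_in, n_hid = 1, 4  # one input feature, four hidden units (arbitrary sizes)
params = [rng.normal(scale=0.1, size=s)
          for s in [(n_hid, n_in), (n_hid, n_hid), (n_hid,)] * 4]
h, c = np.zeros(n_hid), np.zeros(n_hid)
h, c = lstm_step(np.array([0.5]), h, c, *params)
```

<p>Because the hidden state is the product of a sigmoid gate and a tanh of the cell state, every component of <code>h</code> stays strictly inside (-1, 1), which is one reason the MinMax scaling to [0, 1] pairs naturally with LSTM training.</p>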
<p>After applying the model to both the training and test datasets, the inverse MinMax transform was applied to return the data to its original scale before computing evaluation metrics and visualization.</p>
<p>Forecasting performance is evaluated using the Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE).</p>

<disp-formula id="FD7"><div class="html-disp-formula-info"><div class="f"><math display="inline"><semantics><mrow><mi>R</mi><mi>M</mi><mi>S</mi><mi>E</mi><mo>=</mo><msqrt><mfrac><mrow><mn>1</mn></mrow><mrow><mi>n</mi></mrow></mfrac><mrow><msubsup><mo stretchy="false">∑</mo><mrow><mi>i</mi><mo>=</mo><mn>1</mn></mrow><mrow><mi>n</mi></mrow></msubsup><mrow><msup><mrow><mfenced separators="|"><mrow><msub><mrow><mi>Y</mi></mrow><mrow><mi>a</mi><mi>c</mi><mi>t</mi><mi>u</mi><mi>a</mi><mi>l</mi></mrow></msub><mo>-</mo><msub><mrow><mi>Y</mi></mrow><mrow><mi>p</mi><mi>r</mi><mi>e</mi><mi>d</mi><mi>i</mi><mi>c</mi><mi>t</mi><mi>e</mi><mi>d</mi></mrow></msub></mrow></mfenced></mrow><mrow><mn>2</mn></mrow></msup></mrow></mrow></msqrt></mrow></semantics></math></div><div class="l"><label>(7)</label></div></div></disp-formula><p>RMSE measures the square root of the average of the squared differences between predicted and actual values. It provides a measure of the magnitude of errors and is sensitive to large errors.</p>

<disp-formula id="FD8"><div class="html-disp-formula-info"><div class="f"><math display="inline"><semantics><mrow><mi>M</mi><mi>A</mi><mi>E</mi><mo>=</mo><mfrac><mrow><mn>1</mn></mrow><mrow><mi>n</mi></mrow></mfrac><mrow><msubsup><mo stretchy="false">∑</mo><mrow><mi>i</mi><mo>=</mo><mn>1</mn></mrow><mrow><mi>n</mi></mrow></msubsup><mrow><mfenced open="|" close="|" separators="|"><mrow><msub><mrow><mi>Y</mi></mrow><mrow><mi>a</mi><mi>c</mi><mi>t</mi><mi>u</mi><mi>a</mi><mi>l</mi></mrow></msub><mo>-</mo><msub><mrow><mi>Y</mi></mrow><mrow><mi>p</mi><mi>r</mi><mi>e</mi><mi>d</mi><mi>i</mi><mi>c</mi><mi>t</mi><mi>e</mi><mi>d</mi></mrow></msub></mrow></mfenced></mrow></mrow></mrow></semantics></math></div><div class="l"><label>(8)</label></div></div></disp-formula><p>MAE represents the average absolute difference between predicted and actual values. It is less sensitive to outliers compared to RMSE and provides a straightforward measure of forecast accuracy.</p>

<disp-formula id="FD9"><div class="html-disp-formula-info"><div class="f"><math display="inline"><semantics><mrow><mi>M</mi><mi>A</mi><mi>P</mi><mi>E</mi><mo>=</mo><mfrac><mrow><mn>100</mn></mrow><mrow><mi>n</mi></mrow></mfrac><mrow><msubsup><mo stretchy="false">∑</mo><mrow><mi>i</mi><mo>=</mo><mn>1</mn></mrow><mrow><mi>n</mi></mrow></msubsup><mrow><mfenced open="|" close="|" separators="|"><mrow><mfrac><mrow><msub><mrow><mi>Y</mi></mrow><mrow><mi>a</mi><mi>c</mi><mi>t</mi><mi>u</mi><mi>a</mi><mi>l</mi></mrow></msub><mo>-</mo><msub><mrow><mi>Y</mi></mrow><mrow><mi>p</mi><mi>r</mi><mi>e</mi><mi>d</mi><mi>i</mi><mi>c</mi><mi>t</mi><mi>e</mi><mi>d</mi></mrow></msub></mrow><mrow><msub><mrow><mi>Y</mi></mrow><mrow><mi>a</mi><mi>c</mi><mi>t</mi><mi>u</mi><mi>a</mi><mi>l</mi></mrow></msub></mrow></mfrac></mrow></mfenced></mrow></mrow></mrow></semantics></math></div><div class="l"><label>(9)</label></div></div></disp-formula><p>MAPE calculates the average percentage difference between predicted and actual values. It is useful for expressing errors relative to the scale of the data and is particularly informative when dealing with variables of varying magnitudes. Where <math><semantics><mrow><mi>n</mi></mrow></semantics></math> is the number of data points, <math><semantics><mrow><msub><mrow><mi>Y</mi></mrow><mrow><mi>a</mi><mi>c</mi><mi>t</mi><mi>u</mi><mi>a</mi><mi>l</mi></mrow></msub></mrow></semantics></math> is the actual value, and<math><semantics><mrow><msub><mrow><mi> </mi><mi>Y</mi></mrow><mrow><mi>p</mi><mi>r</mi><mi>e</mi><mi>d</mi><mi>i</mi><mi>c</mi><mi>t</mi><mi>e</mi><mi>d</mi></mrow></msub></mrow></semantics></math> is the predicted value.</p>
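<p>Equations (7)-(9) translate directly into NumPy; the small actual/predicted arrays below are illustrative values, not results from the paper.</p>

```python
# The three evaluation metrics, equations (7)-(9), written out in NumPy.
import numpy as np

def rmse(actual, predicted):
    """Eq. (7): square root of the mean squared error."""
    return np.sqrt(np.mean((actual - predicted) ** 2))

def mae(actual, predicted):
    """Eq. (8): mean absolute error."""
    return np.mean(np.abs(actual - predicted))

def mape(actual, predicted):
    """Eq. (9): mean absolute percentage error, in percent."""
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

actual = np.array([100.0, 102.0, 101.0, 105.0])     # illustrative values
predicted = np.array([ 99.0, 103.0, 100.0, 106.0])
```

<p>On these illustrative values, every error is exactly 1 unit, so RMSE and MAE both equal 1.0 while MAPE expresses the same error relative to each actual price (just under 1%).</p>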
</sec><sec id="sec4">
<title>Data Analysis and Results</title><p>To gain improved insight into the predictive performance of the LSTM-RNN model, controlled synthetic data was generated using an auto-regressive integrated moving average sequence of orders (1,1,1), with a first-order finite difference applied to achieve stationarity. Figure <xref ref-type="fig" rid="fig1"> 1</xref> plots the simulated time series produced through this ARIMA(1,1,1) process. Gaussian noise with zero mean and unit variance was introduced to the data.</p>
<fig id="fig1">
<label>Figure 1</label>
<caption>
<p>A plot of 1000 data points for an ARIMA (1,1,1); the x-axis time is unitless</p>
</caption>
<graphic xlink:href="877.fig.001" />
</fig><p>The corresponding autocorrelation function (ACF) and partial autocorrelation function (PACF) for 40 different lags are shown in Figure <xref ref-type="fig" rid="fig2"> 2</xref> and Figure <xref ref-type="fig" rid="fig3"> 3</xref>, respectively. The ACF shows a more gradual decay since it does not control for the indirect correlation due to shorter lags. The blue-shaded region is the confidence interval for the ACF values at a particular lag; if an ACF value lies outside this region, the ACF at the corresponding lag is statistically significant. However, the PACF falls rapidly after lag two since the process is ARIMA(1,1,1) and its auto-regressive component is of order 1.</p>
<fig id="fig2">
<label>Figure 2</label>
<caption>
<p>The ACF of the simulated time series is displayed here for a maximum of 60 lags.</p>
</caption>
<graphic xlink:href="877.fig.002" />
</fig><p>Here, the lags are also unitless since it is a simulated time series. The ACF value at any given lag includes indirect correlations from shorter lags, and hence the ACF decays more gradually compared to the PACF in Figure <xref ref-type="fig" rid="fig3"> 3</xref>.</p>
<fig id="fig3">
<label>Figure 3</label>
<caption>
<p>The PACF of the simulated time series is displayed here for a maximum of 60 lags.</p>
</caption>
<graphic xlink:href="877.fig.003" />
</fig><p>In this simulated time series, the lags are likewise dimensionless. The partial autocorrelation function (PACF) value at any specified lag controls for the indirect correlations from shorter lags; consequently, the PACF decays more rapidly than the autocorrelation function (ACF). Only sporadic PACF values exceed the confidence bounds at the 0.05 significance level.</p>
<p>The data was split into an 80-20% ratio for training and testing purposes. The MinMax scaler was utilized to limit the data values between 0 and 1. An LSTM-RNN model was fit to the training data using a batch size of 64, 50 epochs, a sequence length of 5 (the number of past values used as features), and two hidden layers of 50 neurons each. This trained model was then applied to the test data to generate predictions. After applying the model to the training and test datasets, the inverse MinMax transform was applied to return the data to its original scale before computing evaluation metrics and visualization. The root mean squared error (RMSE) and mean absolute error (MAE) on the training data were computed as 1.64 and 1.32, respectively, while the test set errors were 1.75 and 1.43. These standard metrics were calculated using Python's statsmodels package. Figure <xref ref-type="fig" rid="fig4"> 4</xref> displays the training data in red overlayed with the fitted values in black. Visually, an extremely close overlap can be observed, making it difficult to discern any difference due to the high fidelity of the model fit.</p>
<fig id="fig4">
<label>Figure 4</label>
<caption>
<p>Training data (red) overlaid with the fitted values from the LSTM-RNN model (black).</p>
</caption>
<graphic xlink:href="877.fig.004" />
</fig><p>As illustrated in Figure <xref ref-type="fig" rid="fig5"> 5</xref>, the LSTM-RNN model's predicted values closely match the actual test data overall. However, some minor deviations between the model's forecasts (in black) and the actual ARIMA values (in red) become discernible on examining the plot: the two overlapping lines diverge slightly, with small visible gaps in certain sections. Still, these discrepancies are relatively small, and the model's projected trajectory adheres remarkably well to the ups and downs of the authentic time series. The LSTM-RNN convincingly reproduces the core shape and progression of the data. While a perfect fit would display a complete overlap of the red and black lines, the model's high fidelity to the intricate fluctuations of the test set is apparent in Figure <xref ref-type="fig" rid="fig5"> 5</xref>. This plot highlights how the model aptly captures the ARIMA pattern despite a foreseeable and modest decline in accuracy compared to the training data fit.</p>
<fig id="fig5">
<label>Figure 5</label>
<caption>
<p>The predicted values of the model (black) and the actual test data (red)</p>
</caption>
<graphic xlink:href="877.fig.005" />
</fig><p>The synthetic data analysis demonstrates that the LSTM model successfully captures the trends and patterns exhibited by the ARIMA process data.</p>
<p>The study now transitions from the simulated time series to actual Amazon Inc. stock closing price data. Two recent years of daily closing prices were downloaded for this paper using the Yahoo Finance API accessible through Python. Figure <xref ref-type="fig" rid="fig4A">4A</xref> and Figure <xref ref-type="fig" rid="fig4B">4B</xref> display the autocorrelation function (ACF) and partial autocorrelation function (PACF) for this stock data, respectively. Examination shows that the ACF and PACF patterns for the genuine stock prices closely resemble those of the synthetically generated time series. This similarity illustrates that actual stock prices commonly follow an ARIMA process, in line with existing literature such as the "Time Series Analysis" text by Cryer and Chan (2008).</p>
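<p>The ACF examined here can be computed with the standard sample estimator; a minimal NumPy version is sketched below (the paper's plots were presumably produced with statsmodels' plot_acf, so this hand-rolled version is purely illustrative). A random walk stands in for the price series to show the slow ACF decay characteristic of nonstationary, ARIMA-like data.</p>

```python
import numpy as np

def acf(x, nlags=40):
    # Sample autocorrelation function up to nlags: the lag-k
    # autocovariance of the demeaned series divided by its variance.
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(nlags + 1)])

# A random walk (ARIMA(0,1,0)) shows the slow, near-linear ACF decay
# that the paper observes for daily closing prices.
rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(500))
rho = acf(walk, nlags=40)
```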
<fig id="fig6">
<label>Figure 6</label>
<caption>
<p>ACF of daily closing price data over a maximum lag of 40 days.</p>
</caption>
<graphic xlink:href="877.fig.006" />
</fig><fig id="fig7">
<label>Figure 7</label>
<caption>
<p>PACF of daily closing price data over a maximum lag of 40 days.</p>
</caption>
<graphic xlink:href="877.fig.007" />
</fig><p>Figure 5 shows a histogram of the daily percentage change in stock price. Interestingly, it follows a near-normal distribution. This can be explained by the stock price changing continuously from day to day with supply, demand, and the volume of shares traded. </p>
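<p>The daily percentage change underlying this histogram is just the day-over-day relative difference of closing prices; a small illustrative computation (with made-up prices, not the AMZN data):</p>

```python
import numpy as np

prices = np.array([100.0, 102.0, 101.0, 105.0])   # illustrative closes
# Percent change from the previous day's close.
pct_change = 100.0 * np.diff(prices) / prices[:-1]
# → approximately [2.0, -0.98, 3.96]
```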
<fig id="fig8">
<label>Figure 8</label>
<caption>
<p>Distribution of percentage change from the previous day of closing stock price</p>
</caption>
<graphic xlink:href="877.fig.008" />
</fig><title>4.1. Data Preprocessing</title><p>As with the simulated time series, the data was split 80-20 for training and testing. Before fitting the model and making predictions on the test data, the MinMax scaler was used to transform the data. The inverse transform was applied to the model outputs before computing metrics or plotting.</p>
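<p>The forward and inverse MinMax transforms used here can be sketched as follows (an illustrative NumPy reconstruction, not the authors' code; note the scaler is fit on the training portion only, so no test-set information leaks into the transform):</p>

```python
import numpy as np

def fit_minmax(train):
    # Fit the scaler on the training portion only.
    return float(train.min()), float(train.max())

def to_unit(x, lo, hi):
    # Forward transform into [0, 1].
    return (x - lo) / (hi - lo)

def from_unit(x, lo, hi):
    # Inverse transform, applied to model outputs before
    # computing metrics or plotting in original price units.
    return x * (hi - lo) + lo

prices = np.linspace(80.0, 120.0, 50)     # stand-in closing prices
train, test = prices[:40], prices[40:]    # 80-20 split
lo, hi = fit_minmax(train)
restored = from_unit(to_unit(test, lo, hi), lo, hi)
```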
<title>4.2. Model Tuning and Evaluation</title><p>A 2-layer LSTM-RNN model was then tuned over different values for epochs (50, 100, and 200), batch size (32, 64, and 128), and the number of neurons in the first layer (32, 64, and 128). The number of neurons in the first layer was varied, and half that number was used for the second layer before the final output layer. The original test data was split in half to perform the grid search over the 27 parameter combinations: one half was used to validate and tune the hyperparameters, and the other half was used to make the final predictions and compute the performance metrics reported in Table <xref ref-type="table" rid="tab1">1</xref>, which summarizes three metrics, the RMSE, MAE, and MAPE of the LSTM model on the final test set, for the different hyperparameter combinations. Only the top 10 performing combinations are shown in the table, which is sorted in ascending order of RMSE; all combinations not listed had slightly higher root mean squared errors.</p>
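<p>The grid search over the 27 combinations can be sketched as a simple exhaustive loop. The `evaluate` function below is a hypothetical stand-in for training an LSTM and returning its validation RMSE; the real study trains a Keras model at that point.</p>

```python
from itertools import product

units_opts = (32, 64, 128)   # neurons in the first LSTM layer
batch_opts = (32, 64, 128)
epoch_opts = (50, 100, 200)

def evaluate(units, batch, epochs):
    # Hypothetical stand-in for "train the LSTM, return validation
    # RMSE"; this toy formula merely makes the example runnable.
    return 3.0 - 0.002 * epochs - 0.001 * units + 0.001 * batch

# Exhaustive search over all 3 x 3 x 3 = 27 combinations,
# sorted in ascending order of RMSE as in Table 1.
results = sorted(
    (evaluate(u, b, e), u, b, e)
    for u, b, e in product(units_opts, batch_opts, epoch_opts)
)
top10 = results[:10]
```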
<title>4.3. Model Selection</title><p>Finally, after selecting the model with the best RMSE (batch size 64, hidden-layer units (64, 32), and 200 epochs), a dropout rate of 20% was applied after the first LSTM layer to avoid any potential overfitting of the training data.</p>
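<p>The mechanics of the 20% dropout used here can be illustrated with a minimal "inverted dropout" sketch (the actual model would use the framework's built-in dropout layer; this NumPy version just shows what the layer does during training):</p>

```python
import numpy as np

def dropout(activations, rate=0.2, rng=None, training=True):
    # Inverted dropout: during training, each unit is zeroed with
    # probability `rate`, and survivors are rescaled by 1/(1-rate)
    # so the expected activation is unchanged. At inference time the
    # layer is an identity.
    if not training or rate == 0.0:
        return activations
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

h = np.ones((4, 64))   # hypothetical hidden states from the first layer
out = dropout(h, rate=0.2, rng=np.random.default_rng(42))
```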
<p>Figure 6 shows the empirical dependence of the training and validation loss on the number of epochs for this model. After about 120 epochs, the training and validation losses nearly merge, indicating that the model neither overfits nor underfits the data but follows it closely. Underfitting and overfitting are typical problems in other regression models. While the deep learning model could be tuned with more epochs toward a near-perfect fit, doing so would demand substantially more computational resources and time, or a more powerful computer.</p>
<table-wrap id="tab1">
<label>Table 1</label>
<caption>
<p><bold>Performance metrics for different combinations of hyperparameters of the LSTM model. Intuitively, the number of epochs is significant in training the model, since none of the combinations in the top 10 features an epoch count of 50</bold></p>
</caption>

<table>
<thead>
<tr>
<th align="center"><bold>Number of Units in First Layer</bold></th>
<th align="center"><bold>Batch Size</bold></th>
<th align="center"><bold>Epochs</bold></th>
<th align="center"><bold>RMSE</bold></th>
<th align="center"><bold>MAE</bold></th>
<th align="center"><bold>MAPE</bold></th>
</tr>
</thead>
<tbody>
<tr><td align="center">64</td><td align="center">64</td><td align="center">200</td><td align="center">2.824</td><td align="center">2.043</td><td align="center">1.510</td></tr>
<tr><td align="center">128</td><td align="center">64</td><td align="center">200</td><td align="center">2.827</td><td align="center">2.049</td><td align="center">1.513</td></tr>
<tr><td align="center">128</td><td align="center">32</td><td align="center">200</td><td align="center">2.832</td><td align="center">2.060</td><td align="center">1.523</td></tr>
<tr><td align="center">32</td><td align="center">32</td><td align="center">200</td><td align="center">2.862</td><td align="center">2.106</td><td align="center">1.557</td></tr>
<tr><td align="center">64</td><td align="center">32</td><td align="center">100</td><td align="center">2.879</td><td align="center">2.099</td><td align="center">1.548</td></tr>
<tr><td align="center">64</td><td align="center">32</td><td align="center">200</td><td align="center">2.881</td><td align="center">2.137</td><td align="center">1.581</td></tr>
<tr><td align="center">128</td><td align="center">32</td><td align="center">100</td><td align="center">2.921</td><td align="center">2.140</td><td align="center">1.577</td></tr>
<tr><td align="center">32</td><td align="center">32</td><td align="center">100</td><td align="center">2.967</td><td align="center">2.188</td><td align="center">1.613</td></tr>
<tr><td align="center">32</td><td align="center">64</td><td align="center">200</td><td align="center">2.995</td><td align="center">2.219</td><td align="center">1.635</td></tr>
</tbody>
</table>
</table-wrap><fig id="fig9">
<label>Figure 9</label>
<caption>
<p>Empirical Training and Validation Loss vs. Number of Epochs Demonstrating Good Model Fit Without Overfitting or Underfitting</p>
</caption>
<graphic xlink:href="877.fig.009" />
</fig><p>Figure 7 shows the results of our best model, including five days of forecasted values. The fitted and predicted values are hardly discernible from the actual data, illustrating that the model closely follows the trends and fluctuations of the original data.</p>
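<p>A five-day forecast of this kind is typically produced recursively: each new prediction is appended to the input window and fed back in. A minimal sketch, where `model_predict` is a hypothetical stand-in for the trained LSTM (here it simply extrapolates the window mean):</p>

```python
import numpy as np

def forecast(model_predict, history, steps=5, seq_len=5):
    # Recursive multi-step forecast: predict one step, slide the
    # window forward over the prediction, and repeat.
    window = list(history[-seq_len:])
    preds = []
    for _ in range(steps):
        nxt = model_predict(np.array(window))
        preds.append(nxt)
        window = window[1:] + [nxt]
    return preds

# Hypothetical one-step predictor standing in for the trained LSTM.
preds = forecast(lambda w: float(w.mean()), [1.0, 2.0, 3.0, 4.0, 5.0])
```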
<fig id="fig10">
<label>Figure 10</label>
<caption>
<p>Fitted, predicted, and forecasted daily closing prices for the AMZN stock, overlaid on the actual values</p>
</caption>
<graphic xlink:href="877.fig.010" />
</fig></sec><sec id="sec5">
<title>Discussion</title><title>5.1. Effectiveness of the LSTM-RNN Model</title><p>The main goal of this study was to examine the effectiveness of LSTM-RNNs in predicting the stock price of Amazon Inc. The model was evaluated across different hyperparameter settings, and the best configuration was chosen by comparing the accuracy of each. The uncertainty of the stock market has motivated researchers to build networks that predict a company's stock performance and help investors make informed decisions.</p>
<title>5.2. Simulation Results</title><p>The simulation results obtained with the proposed method show that LSTM-RNNs retain information from closely related sequences to make future predictions about Amazon Inc. stock. Investors can use LSTM-RNNs to predict stock values and make informed decisions about buying, holding, or selling; such networks use previously captured information to generate predictions [
<xref ref-type="bibr" rid="R15">15</xref>]. The accuracy of an LSTM-RNN is higher in part because the architecture avoids potential overfitting of the training data, capturing the limited data sequences observed most often throughout the company's stock history [
<xref ref-type="bibr" rid="R16">16</xref>].</p>
<title>5.3. Model Evaluation</title><p>The evaluation considered three metrics for the model, the RMSE, MAE, and MAPE of the LSTM, on the final test set for the different hyperparameter combinations. Other configurations were set aside due to their higher root mean squared errors. The criteria for selecting the best models were batch size, the number of units in the hidden layers, and the number of epochs; models fitted under these criteria were more accurate when predicting the company's stock value. The results suggest that the prediction accuracy of such neural models can exceed 95%, with the predicted loss close to 0.1%. Investors look for models that provide closely matching values for stocks [
<xref ref-type="bibr" rid="R17">17</xref>].</p>
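<p>For reference, the three metrics reported throughout can be written out explicitly (standard definitions, shown here with small made-up arrays rather than the study's data):</p>

```python
import numpy as np

def rmse(actual, pred):
    # Root mean squared error, in the original price units.
    return float(np.sqrt(np.mean((actual - pred) ** 2)))

def mae(actual, pred):
    # Mean absolute error.
    return float(np.mean(np.abs(actual - pred)))

def mape(actual, pred):
    # Mean absolute percentage error, in percent.
    return float(np.mean(np.abs((actual - pred) / actual)) * 100.0)

actual = np.array([100.0, 200.0, 400.0])
pred = np.array([110.0, 190.0, 400.0])
```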
<title>5.4. Predictive Performance on Real Data</title><p>Analysis of Amazon Inc.'s actual stock closing prices and the values predicted by the LSTM-RNN model show strong agreement. This indicates the model's accuracy in forecasting real-world stock data, complementing the synthetic data results. Further supporting the effectiveness of LSTM-RNNs, prior research has demonstrated their capabilities for short-term stock price prediction [
<xref ref-type="bibr" rid="R18">18</xref>]. Comparative analysis verifies that the LSTM-RNN represents an effective neural network architecture for predicting stock values of large enterprises such as Amazon.</p>
<title>5.5. Broader Implications</title><p>The demonstrated capabilities of these neural network models carry significant implications for stock market investors seeking to make informed decisions and maximize returns when buying, selling, or holding stocks [
<xref ref-type="bibr" rid="R19">19</xref>]. More broadly, machine learning is transforming businesses across industries. With fluctuating and nonstationary market conditions, having insights into past and future trends is crucial for companies participating in the global economy [
<xref ref-type="bibr" rid="R20">20</xref>]. International investors increasingly rely on machine learning tools like LSTM-RNNs to guide stock investment decisions and risk management [
<xref ref-type="bibr" rid="R21">21</xref>].</p>
</sec><sec id="sec6">
<title>Conclusions</title><p>This research illustrates the growing utility of LSTM-RNN models for stock market prediction, demonstrating their ability to accurately capture trends and fluctuations in financial time series data. Analysis of synthetic data verifies the effectiveness of LSTM-RNNs in modeling ARIMA-like stock price patterns. The study further proposes and evaluates LSTM-RNN models for forecasting the closing price of Amazon Inc. shares. Training across hyperparameters showed that the number of epochs was the most significant factor, as more epochs allow greater learning from the data. The optimized LSTM-RNN model achieved strong performance, with an RMSE of 2.51 and a MAPE of 1.84% on the training set. Overall, the availability of performant LSTM-RNN models enhances investors' ability to understand and operate within dynamic stock market environments and to make informed trading decisions. Comparative literature analysis reveals that prediction is vital to profitability, and this research highlights the practical business value of LSTM-RNNs. However, given market uncertainty, further research is recommended to re-evaluate model accuracy. This study demonstrates that LSTM-RNN models are an important emerging tool for stock analysis, driving financial market evolution and improved outcomes.</p>
<p></p>
<p><bold>Author Contributions:</bold> Vivek and Dinesh worked on implementing and testing the deep learning models. Nathan, Sai and Samaah helped with drafting, writing and technical support. All authors have read and agreed to the published version of the manuscript.</p>
<p><bold>Funding:</bold> The authors did not receive any funding to conduct this research.</p>
<p><bold>Data Availability Statement: </bold>The data that support the findings of this study are openly available at https://www.kaggle.com/datasets/varpit94/amazon-stock-data.</p>
<p><bold>Conflicts of Interest:</bold> The authors declare that they have no conflicts of interest regarding this work.</p>
</sec>
  </body>
  <back>
    <ref-list>
      <title>References</title>
      
<ref id="R1">
<label>[1]</label>
<mixed-citation publication-type="other">Adebiyi, A. A., Adewumi, A. O., &#x00026; Ayo, C. K. (2014). Comparison of Arima and artificial neural networks models for stock price prediction. Journal of Applied Mathematics, 2014, 1-7. https://doi.org/10.1155/2014/614342.
</mixed-citation>
</ref>
<ref id="R2">
<label>[2]</label>
<mixed-citation publication-type="other">Al-Nasseri, A., &#x00026; Menla, F. (2018). What does investors' online divergence of opinion tell us about stock returns and trading volume? Journal of Business Research, 86, 166-178. https://doi.org/10.1016/j.jbusres.2018.01.006.
</mixed-citation>
</ref>
<ref id="R3">
<label>[3]</label>
<mixed-citation publication-type="other">Akita, R., Takenouchi, T., &#x00026; Hirose, A. (2020). A comparison of deep learning models for stock price prediction. Expert Systems with Applications, 141, 112990.
</mixed-citation>
</ref>
<ref id="R4">
<label>[4]</label>
<mixed-citation publication-type="other">Benabbou, L., Berrado, A., &#x00026; Labiad, B. (2018). Machine learning techniques for short-term stock movements classification for the Moroccan stock exchange. IEEE, 2(1). DOI: 10.1109/SITA.2016.7772259
</mixed-citation>
</ref>
<ref id="R5">
<label>[5]</label>
<mixed-citation publication-type="other">Gu, J., Zhao, Y., &#x00026; Zhang, S. (2017). Stock price prediction with attention-based multi-input convolutional LSTM network. In Proceedings of the 2017 ACM on Conference on Information and Knowledge Management (pp. 307-316).
</mixed-citation>
</ref>
<ref id="R6">
<label>[6]</label>
<mixed-citation publication-type="other">Gupta S., Wang, LP. (2010). Stock forecasting with feedforward neural networks and gradual data sub-sampling. Aust J Intell Inf Process Syst 11(4):14-17.
</mixed-citation>
</ref>
<ref id="R7">
<label>[7]</label>
<mixed-citation publication-type="other">Hadavandi E, Ghanbari A, Abbasian-Naghneh S (2010). Developing an evolutionary neural network model for stock index forecasting. In: Huang DS, McGinnity M, Heutte L, Zhang XP (eds) Advance intelligent computing theories and applications. Springer, Berlin, pp 407-415.
</mixed-citation>
</ref>
<ref id="R8">
<label>[8]</label>
<mixed-citation publication-type="other">Huang, T., Zhou, Y., &#x00026; Zhu, C. (2019). An empirical analysis of deep models for stock price predictions. Expert Systems with Applications, pp. 127, 220-230.
</mixed-citation>
</ref>
<ref id="R9">
<label>[9]</label>
<mixed-citation publication-type="other">Lim, WT., Wang, L., Wang, Y., Chang, Q. (2016). Housing price prediction using neural Networks. International conference on natural computation. pp. 518-522.
</mixed-citation>
</ref>
<ref id="R10">
<label>[10]</label>
<mixed-citation publication-type="other">Mingyue, Q., Cheng, L., &#x00026; Yu, S. (2018). Application of the Artificial Neural Network in Predicting the Direction of Stock Market Index. IEEE. DOI: 10.1109/CISIS.2016.115.
</mixed-citation>
</ref>
<ref id="R11">
<label>[11]</label>
<mixed-citation publication-type="other">Moghar, A., &#x00026; Hamiche, M. (2020). Stock Market Prediction Using LSTM Recurrent Neural Network. Procedia Computer Science, 170(3), 1168-1173. https://doi.org/10.1016/j.procs.2020.03.049
</mixed-citation>
</ref>
<ref id="R12">
<label>[12]</label>
<mixed-citation publication-type="other">Patel, J., Patel, M., &#x00026; Darji, M. (2018). Stock Price Prediction Using RNN and LSTM. Journal of Emerging Technologies and Innovative Research (JETIR), 5(11), 1069-1080. https://www.jetir.org/papers/JETIRK006164.pdf
</mixed-citation>
</ref>
<ref id="R13">
<label>[13]</label>
<mixed-citation publication-type="other">Pawar, K., Jalem, S., &#x00026; Tiwari, V. (2018). Stock Market Price Prediction Using LSTM RNN. Advances in Intelligent Systems and Computing, 841. https://link.springer.com/chapter/10.1007/978-981-13-2285-3_58
</mixed-citation>
</ref>
<ref id="R14">
<label>[14]</label>
<mixed-citation publication-type="other">Pires, Ivan &#x00026; Hussain, Faisal &#x00026; Garcia, Nuno &#x00026; Lameski, Petre &#x00026; Zdravevski, Eftim. (2020). Homogeneous Data Normalization and Deep Learning: A Case Study in Human Activity Classification. Future Internet. 12. 10.3390/fi12110194.
</mixed-citation>
</ref>
<ref id="R15">
<label>[15]</label>
<mixed-citation publication-type="other">Prasanna S, Ezhilmaran D. (2013). An analysis of stock market prediction using data mining techniques. Int J Comput Sci Eng Technol. 4(3):49-51.
</mixed-citation>
</ref>
<ref id="R16">
<label>[16]</label>
<mixed-citation publication-type="other">Rajak, R. (2021, June 28). Share Price Prediction using RNN and LSTM | by Rishi Rajak | Analytics Vidhya. Medium. Retrieved October 1, 2023, from https://medium.com/analytics-vidhya/share-price-prediction-using-rnn-and-lstm-8776456dea6f
</mixed-citation>
</ref>
<ref id="R17">
<label>[17]</label>
<mixed-citation publication-type="other">S. Siami-Namini, N. Tavakoli and A. Siami Namin, "A Comparison of ARIMA and LSTM in Forecasting Time Series," 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), Orlando, FL, USA, 2018, pp. 1394-1401, doi: 10.1109/ICMLA.2018.00227.
</mixed-citation>
</ref>
<ref id="R18">
<label>[18]</label>
<mixed-citation publication-type="other">Singh, U., Bhuriya, D., &#x00026; Sharma, A. (2017). Survey of stock market prediction using machine learning approach. IEEE. DOI: 10.1109/ICECA.2017.8212715
</mixed-citation>
</ref>
<ref id="R19">
<label>[19]</label>
<mixed-citation publication-type="other">Sreekumar, D., &#x00026; George, E. (2023, March 23). What is Quantitative Research? Definition, Methods, Types, and Examples. Researcher.Life. Retrieved October 1, 2023, from https://researcher.life/blog/article/what-is-quantitative-research-types-and-examples/
</mixed-citation>
</ref>
<ref id="R20">
<label>[20]</label>
<mixed-citation publication-type="other">Zheng T, Fataliyev K, Wang L. (2013). Wavelet neural networks for stock trading. In: Independent component analysis, compressive sampling, wavelets, neural net, biosystems, and nanoengineering. International Society for Optics and Photonics. 10(3): 10-37.
</mixed-citation>
</ref>
<ref id="R21">
<label>[21]</label>
<mixed-citation publication-type="other">Zhu, Y. (2020). Stock price prediction using the RNN model. Journal of Physics Conference Series. DOI:10.1088/1742-6596/1650/3/032103
</mixed-citation>
</ref>
<ref id="R9">
<label>[9]</label>
<mixed-citation publication-type="other">Lim, WT., Wang, L., Wang, Y., Chang, Q. (2016). Housing price prediction using neural Networks. International conference on natural computation. pp. 518-522.
</mixed-citation>
</ref>
<ref id="R10">
<label>[10]</label>
<mixed-citation publication-type="other">Mingyue, Q., Cheng, L., &#x00026; Yu, S. (2018). Application of the Artificial Neural Network in Predicting the Direction of Stock Market Index. IEEE. DOI: 10.1109/CISIS.2016.115.
</mixed-citation>
</ref>
<ref id="R11">
<label>[11]</label>
<mixed-citation publication-type="other">Moghar, A., &#x00026; Hamiche, M. (2020). Stock Market Prediction Using LSTM Recurrent Neural Network. Procedia Computer Science, 170(3), 1168-1173. https://doi.org/10.1016/j.procs.2020.03.049
</mixed-citation>
</ref>
<ref id="R12">
<label>[12]</label>
<mixed-citation publication-type="other">Patel, J., Patel, M., &#x00026; Darji, M. (2018). Stock Price Prediction Using RNN and LSTM. Journal of Emerging Technologies and Innovative Research (JETIR), 5(11), 1069-1080. https://www.jetir.org/papers/JETIRK006164.pdf
</mixed-citation>
</ref>
<ref id="R13">
<label>[13]</label>
<mixed-citation publication-type="other">Pawar, K., Jalem, S., &#x00026; Tiwari, V. (2018). Stock Market Price Prediction Using LSTM RNN. Advances in Intelligent Systems and Computing, 841. https://link.springer.com/chapter/10.1007/978-981-13-2285-3_58
</mixed-citation>
</ref>
<ref id="R14">
<label>[14]</label>
<mixed-citation publication-type="other">Pires, Ivan &#x00026; Hussain, Faisal &#x00026; Garcia, Nuno &#x00026; Lameski, Petre &#x00026; Zdravevski, Eftim. (2020). Homogeneous Data Normalization and Deep Learning: A Case Study in Human Activity Classification. Future Internet. 12. 10.3390/fi12110194.
</mixed-citation>
</ref>
<ref id="R15">
<label>[15]</label>
<mixed-citation publication-type="other">Prasanna S, Ezhilmaran D. (2013). An analysis of stock market prediction using data mining techniques. Int J Comput Sci Eng Technol. 4(3):49-51.
</mixed-citation>
</ref>
<ref id="R16">
<label>[16]</label>
<mixed-citation publication-type="other">Rajak, R. (2021, June 28). Share Price Prediction using RNN and LSTM | by Rishi Rajak | Analytics Vidhya. Medium. Retrieved October 1, 2023, from https://medium.com/analytics-vidhya/share-price-prediction-using-rnn-and-lstm-8776456dea6f
</mixed-citation>
</ref>
<ref id="R17">
<label>[17]</label>
<mixed-citation publication-type="other">S. Siami-Namini, N. Tavakoli and A. Siami Namin, "A Comparison of ARIMA and LSTM in Forecasting Time Series," 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), Orlando, FL, USA, 2018, pp. 1394-1401, doi: 10.1109/ICMLA.2018.00227.
</mixed-citation>
</ref>
<ref id="R18">
<label>[18]</label>
<mixed-citation publication-type="other">Singh, U., Bhuriya, D., &#x00026; Sharma, A. (2017). Survey of stock market prediction using machine learning approach. IEEE. DOI: 10.1109/ICECA.2017.8212715
</mixed-citation>
</ref>
<ref id="R19">
<label>[19]</label>
<mixed-citation publication-type="other">Sreekumar, D., &#x00026; George, E. (2023, March 23). What is Quantitative Research? Definition, Methods, Types, and Examples. Researcher.Life. Retrieved October 1, 2023, from https://researcher.life/blog/article/what-is-quantitative-research-types-and-examples/
</mixed-citation>
</ref>
<ref id="R20">
<label>[20]</label>
<mixed-citation publication-type="other">Zheng T, Fataliyev K, Wang L. (2013). Wavelet neural networks for stock trading. In: Independent component analysis, compressive sampling, wavelets, neural net, biosystems, and nanoengineering. International Society for Optics and Photonics. 10(3): 10-37.
</mixed-citation>
</ref>
<ref id="R21">
<label>[21]</label>
<mixed-citation publication-type="other">Zhu, Y. (2020). Stock price prediction using the RNN model. Journal of Physics Conference Series. DOI:10.1088/1742-6596/1650/3/032103
</mixed-citation>
</ref>
<ref id="R1">
<label>[1]</label>
<mixed-citation publication-type="other">Adebiyi, A. A., Adewumi, A. O., &#x00026; Ayo, C. K. (2014). Comparison of Arima and artificial neural networks models for stock price prediction. Journal of Applied Mathematics, 2014, 1-7. https://doi.org/10.1155/2014/614342.
</mixed-citation>
</ref>
<ref id="R2">
<label>[2]</label>
<mixed-citation publication-type="other">Al-Nasseri, A., &#x00026; Menla, F. (2018). What does investors' online divergence of opinion tell us about stock returns and trading volume? Journal of Business Research, 86, 166-178. https://doi.org/10.1016/j.jbusres.2018.01.006.
</mixed-citation>
</ref>
<ref id="R3">
<label>[3]</label>
<mixed-citation publication-type="other">Akita, R., Takenouchi, T., &#x00026; Hirose, A. (2020). A comparison of deep learning models for stock price prediction. Expert Systems with Applications, 141, 112990.
</mixed-citation>
</ref>
<ref id="R4">
<label>[4]</label>
<mixed-citation publication-type="other">Benabbou, L., Berrado, A., &#x00026; Labiad, B. (2018). Machine learning techniques for short-term stock movements classification for the Moroccan stock exchange. IEEE, 2(1). DOI: 10.1109/SITA.2016.7772259
</mixed-citation>
</ref>
<ref id="R5">
<label>[5]</label>
<mixed-citation publication-type="other">Gu, J., Zhao, Y., &#x00026; Zhang, S. (2017). Stock price prediction with attention-based multi-input convolutional LSTM network. In Proceedings of the 2017 ACM on Conference on Information and Knowledge Management (pp. 307-316).
</mixed-citation>
</ref>
<ref id="R6">
<label>[6]</label>
<mixed-citation publication-type="other">Gupta S., Wang, LP. (2010). Stock forecasting with feedforward neural networks and gradual data sub-sampling. Aust J Intell Inf Process Syst 11(4):14-17.
</mixed-citation>
</ref>
<ref id="R7">
<label>[7]</label>
<mixed-citation publication-type="other">Hadavandi E, Ghanbari A, Abbasian-Naghneh S (2010). Developing an evolutionary neural network model for stock index forecasting. In: Huang DS, McGinnity M, Heutte L, Zhang XP (eds) Advance intelligent computing theories and applications. Springer, Berlin, pp 407-415.
</mixed-citation>
</ref>
<ref id="R8">
<label>[8]</label>
<mixed-citation publication-type="other">Huang, T., Zhou, Y., &#x00026; Zhu, C. (2019). An empirical analysis of deep models for stock price predictions. Expert Systems with Applications, pp. 127, 220-230.
</mixed-citation>
</ref>
<ref id="R9">
<label>[9]</label>
<mixed-citation publication-type="other">Lim, WT., Wang, L., Wang, Y., Chang, Q. (2016). Housing price prediction using neural Networks. International conference on natural computation. pp. 518-522.
</mixed-citation>
</ref>
<ref id="R10">
<label>[10]</label>
<mixed-citation publication-type="other">Mingyue, Q., Cheng, L., &#x00026; Yu, S. (2018). Application of the Artificial Neural Network in Predicting the Direction of Stock Market Index. IEEE. DOI: 10.1109/CISIS.2016.115.
</mixed-citation>
</ref>
<ref id="R11">
<label>[11]</label>
<mixed-citation publication-type="other">Moghar, A., &#x00026; Hamiche, M. (2020). Stock Market Prediction Using LSTM Recurrent Neural Network. Procedia Computer Science, 170(3), 1168-1173. https://doi.org/10.1016/j.procs.2020.03.049
</mixed-citation>
</ref>
<ref id="R12">
<label>[12]</label>
<mixed-citation publication-type="other">Patel, J., Patel, M., &#x00026; Darji, M. (2018). Stock Price Prediction Using RNN and LSTM. Journal of Emerging Technologies and Innovative Research (JETIR), 5(11), 1069-1080. https://www.jetir.org/papers/JETIRK006164.pdf
</mixed-citation>
</ref>
<ref id="R13">
<label>[13]</label>
<mixed-citation publication-type="other">Pawar, K., Jalem, S., &#x00026; Tiwari, V. (2018). Stock Market Price Prediction Using LSTM RNN. Advances in Intelligent Systems and Computing, 841. https://link.springer.com/chapter/10.1007/978-981-13-2285-3_58
</mixed-citation>
</ref>
<ref id="R14">
<label>[14]</label>
<mixed-citation publication-type="other">Pires, Ivan &#x00026; Hussain, Faisal &#x00026; Garcia, Nuno &#x00026; Lameski, Petre &#x00026; Zdravevski, Eftim. (2020). Homogeneous Data Normalization and Deep Learning: A Case Study in Human Activity Classification. Future Internet. 12. 10.3390/fi12110194.
</mixed-citation>
</ref>
<ref id="R15">
<label>[15]</label>
<mixed-citation publication-type="other">Prasanna S, Ezhilmaran D. (2013). An analysis of stock market prediction using data mining techniques. Int J Comput Sci Eng Technol. 4(3):49-51.
</mixed-citation>
</ref>
<ref id="R16">
<label>[16]</label>
<mixed-citation publication-type="other">Rajak, R. (2021, June 28). Share Price Prediction using RNN and LSTM | by Rishi Rajak | Analytics Vidhya. Medium. Retrieved October 1, 2023, from https://medium.com/analytics-vidhya/share-price-prediction-using-rnn-and-lstm-8776456dea6f
</mixed-citation>
</ref>
<ref id="R17">
<label>[17]</label>
<mixed-citation publication-type="other">S. Siami-Namini, N. Tavakoli and A. Siami Namin, "A Comparison of ARIMA and LSTM in Forecasting Time Series," 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), Orlando, FL, USA, 2018, pp. 1394-1401, doi: 10.1109/ICMLA.2018.00227.
</mixed-citation>
</ref>
<ref id="R18">
<label>[18]</label>
<mixed-citation publication-type="other">Singh, U., Bhuriya, D., &#x00026; Sharma, A. (2017). Survey of stock market prediction using machine learning approach. IEEE. DOI: 10.1109/ICECA.2017.8212715
</mixed-citation>
</ref>
<ref id="R19">
<label>[19]</label>
<mixed-citation publication-type="other">Sreekumar, D., &#x00026; George, E. (2023, March 23). What is Quantitative Research? Definition, Methods, Types, and Examples. Researcher.Life. Retrieved October 1, 2023, from https://researcher.life/blog/article/what-is-quantitative-research-types-and-examples/
</mixed-citation>
</ref>
<ref id="R20">
<label>[20]</label>
<mixed-citation publication-type="other">Zheng T, Fataliyev K, Wang L. (2013). Wavelet neural networks for stock trading. In: Independent component analysis, compressive sampling, wavelets, neural net, biosystems, and nanoengineering. International Society for Optics and Photonics. 10(3): 10-37.
</mixed-citation>
</ref>
<ref id="R21">
<label>[21]</label>
<mixed-citation publication-type="other">Zhu, Y. (2020). Stock price prediction using the RNN model. Journal of Physics Conference Series. DOI:10.1088/1742-6596/1650/3/032103
</mixed-citation>
</ref>
<ref id="R1">
<label>[1]</label>
<mixed-citation publication-type="other">Adebiyi, A. A., Adewumi, A. O., &#x00026; Ayo, C. K. (2014). Comparison of Arima and artificial neural networks models for stock price prediction. Journal of Applied Mathematics, 2014, 1-7. https://doi.org/10.1155/2014/614342.
</mixed-citation>
</ref>
<ref id="R2">
<label>[2]</label>
<mixed-citation publication-type="other">Al-Nasseri, A., &#x00026; Menla, F. (2018). What does investors' online divergence of opinion tell us about stock returns and trading volume? Journal of Business Research, 86, 166-178. https://doi.org/10.1016/j.jbusres.2018.01.006.
</mixed-citation>
</ref>
<ref id="R3">
<label>[3]</label>
<mixed-citation publication-type="other">Akita, R., Takenouchi, T., &#x00026; Hirose, A. (2020). A comparison of deep learning models for stock price prediction. Expert Systems with Applications, 141, 112990.
</mixed-citation>
</ref>
<ref id="R4">
<label>[4]</label>
<mixed-citation publication-type="other">Benabbou, L., Berrado, A., &#x00026; Labiad, B. (2018). Machine learning techniques for short-term stock movements classification for the Moroccan stock exchange. IEEE, 2(1). DOI: 10.1109/SITA.2016.7772259
</mixed-citation>
</ref>
<ref id="R5">
<label>[5]</label>
<mixed-citation publication-type="other">Gu, J., Zhao, Y., &#x00026; Zhang, S. (2017). Stock price prediction with attention-based multi-input convolutional LSTM network. In Proceedings of the 2017 ACM on Conference on Information and Knowledge Management (pp. 307-316).
</mixed-citation>
</ref>
<ref id="R6">
<label>[6]</label>
<mixed-citation publication-type="other">Gupta S., Wang, LP. (2010). Stock forecasting with feedforward neural networks and gradual data sub-sampling. Aust J Intell Inf Process Syst 11(4):14-17.
</mixed-citation>
</ref>
<ref id="R7">
<label>[7]</label>
<mixed-citation publication-type="other">Hadavandi E, Ghanbari A, Abbasian-Naghneh S (2010). Developing an evolutionary neural network model for stock index forecasting. In: Huang DS, McGinnity M, Heutte L, Zhang XP (eds) Advance intelligent computing theories and applications. Springer, Berlin, pp 407-415.
</mixed-citation>
</ref>
<ref id="R8">
<label>[8]</label>
<mixed-citation publication-type="other">Huang, T., Zhou, Y., &#x00026; Zhu, C. (2019). An empirical analysis of deep models for stock price predictions. Expert Systems with Applications, pp. 127, 220-230.
</mixed-citation>
</ref>
<ref id="R9">
<label>[9]</label>
<mixed-citation publication-type="other">Lim, WT., Wang, L., Wang, Y., Chang, Q. (2016). Housing price prediction using neural Networks. International conference on natural computation. pp. 518-522.
</mixed-citation>
</ref>
<ref id="R10">
<label>[10]</label>
<mixed-citation publication-type="other">Mingyue, Q., Cheng, L., &#x00026; Yu, S. (2018). Application of the Artificial Neural Network in Predicting the Direction of Stock Market Index. IEEE. DOI: 10.1109/CISIS.2016.115.
</mixed-citation>
</ref>
<ref id="R11">
<label>[11]</label>
<mixed-citation publication-type="other">Moghar, A., &#x00026; Hamiche, M. (2020). Stock Market Prediction Using LSTM Recurrent Neural Network. Procedia Computer Science, 170(3), 1168-1173. https://doi.org/10.1016/j.procs.2020.03.049
</mixed-citation>
</ref>
<ref id="R12">
<label>[12]</label>
<mixed-citation publication-type="other">Patel, J., Patel, M., &#x00026; Darji, M. (2018). Stock Price Prediction Using RNN and LSTM. Journal of Emerging Technologies and Innovative Research (JETIR), 5(11), 1069-1080. https://www.jetir.org/papers/JETIRK006164.pdf
</mixed-citation>
</ref>
<ref id="R13">
<label>[13]</label>
<mixed-citation publication-type="other">Pawar, K., Jalem, S., &#x00026; Tiwari, V. (2018). Stock Market Price Prediction Using LSTM RNN. Advances in Intelligent Systems and Computing, 841. https://link.springer.com/chapter/10.1007/978-981-13-2285-3_58
</mixed-citation>
</ref>
<ref id="R14">
<label>[14]</label>
<mixed-citation publication-type="other">Pires, Ivan &#x00026; Hussain, Faisal &#x00026; Garcia, Nuno &#x00026; Lameski, Petre &#x00026; Zdravevski, Eftim. (2020). Homogeneous Data Normalization and Deep Learning: A Case Study in Human Activity Classification. Future Internet. 12. 10.3390/fi12110194.
</mixed-citation>
</ref>
<ref id="R15">
<label>[15]</label>
<mixed-citation publication-type="other">Prasanna S, Ezhilmaran D. (2013). An analysis of stock market prediction using data mining techniques. Int J Comput Sci Eng Technol. 4(3):49-51.
</mixed-citation>
</ref>
<ref id="R16">
<label>[16]</label>
<mixed-citation publication-type="other">Rajak, R. (2021, June 28). Share Price Prediction using RNN and LSTM | by Rishi Rajak | Analytics Vidhya. Medium. Retrieved October 1, 2023, from https://medium.com/analytics-vidhya/share-price-prediction-using-rnn-and-lstm-8776456dea6f
</mixed-citation>
</ref>
<ref id="R17">
<label>[17]</label>
<mixed-citation publication-type="other">S. Siami-Namini, N. Tavakoli and A. Siami Namin, "A Comparison of ARIMA and LSTM in Forecasting Time Series," 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), Orlando, FL, USA, 2018, pp. 1394-1401, doi: 10.1109/ICMLA.2018.00227.
</mixed-citation>
</ref>
<ref id="R18">
<label>[18]</label>
<mixed-citation publication-type="other">Singh, U., Bhuriya, D., &#x00026; Sharma, A. (2017). Survey of stock market prediction using machine learning approach. IEEE. DOI: 10.1109/ICECA.2017.8212715
</mixed-citation>
</ref>
<ref id="R19">
<label>[19]</label>
<mixed-citation publication-type="other">Sreekumar, D., &#x00026; George, E. (2023, March 23). What is Quantitative Research? Definition, Methods, Types, and Examples. Researcher.Life. Retrieved October 1, 2023, from https://researcher.life/blog/article/what-is-quantitative-research-types-and-examples/
</mixed-citation>
</ref>
<ref id="R20">
<label>[20]</label>
<mixed-citation publication-type="other">Zheng T, Fataliyev K, Wang L. (2013). Wavelet neural networks for stock trading. In: Independent component analysis, compressive sampling, wavelets, neural net, biosystems, and nanoengineering. International Society for Optics and Photonics. 10(3): 10-37.
</mixed-citation>
</ref>
<ref id="R21">
<label>[21]</label>
<mixed-citation publication-type="other">Zhu, Y. (2020). Stock price prediction using the RNN model. Journal of Physics Conference Series. DOI:10.1088/1742-6596/1650/3/032103
</mixed-citation>
</ref>
<ref id="R1">
<label>[1]</label>
<mixed-citation publication-type="other">Adebiyi, A. A., Adewumi, A. O., &#x00026; Ayo, C. K. (2014). Comparison of Arima and artificial neural networks models for stock price prediction. Journal of Applied Mathematics, 2014, 1-7. https://doi.org/10.1155/2014/614342.
</mixed-citation>
</ref>
<ref id="R2">
<label>[2]</label>
<mixed-citation publication-type="other">Al-Nasseri, A., &#x00026; Menla, F. (2018). What does investors' online divergence of opinion tell us about stock returns and trading volume? Journal of Business Research, 86, 166-178. https://doi.org/10.1016/j.jbusres.2018.01.006.
</mixed-citation>
</ref>
<ref id="R3">
<label>[3]</label>
<mixed-citation publication-type="other">Akita, R., Takenouchi, T., &#x00026; Hirose, A. (2020). A comparison of deep learning models for stock price prediction. Expert Systems with Applications, 141, 112990.
</mixed-citation>
</ref>
<ref id="R4">
<label>[4]</label>
<mixed-citation publication-type="other">Benabbou, L., Berrado, A., &#x00026; Labiad, B. (2018). Machine learning techniques for short-term stock movements classification for the Moroccan stock exchange. IEEE, 2(1). DOI: 10.1109/SITA.2016.7772259
</mixed-citation>
</ref>
<ref id="R5">
<label>[5]</label>
<mixed-citation publication-type="other">Gu, J., Zhao, Y., &#x00026; Zhang, S. (2017). Stock price prediction with attention-based multi-input convolutional LSTM network. In Proceedings of the 2017 ACM on Conference on Information and Knowledge Management (pp. 307-316).
</mixed-citation>
</ref>
<ref id="R6">
<label>[6]</label>
<mixed-citation publication-type="other">Gupta S., Wang, LP. (2010). Stock forecasting with feedforward neural networks and gradual data sub-sampling. Aust J Intell Inf Process Syst 11(4):14-17.
</mixed-citation>
</ref>
<ref id="R7">
<label>[7]</label>
<mixed-citation publication-type="other">Hadavandi E, Ghanbari A, Abbasian-Naghneh S (2010). Developing an evolutionary neural network model for stock index forecasting. In: Huang DS, McGinnity M, Heutte L, Zhang XP (eds) Advance intelligent computing theories and applications. Springer, Berlin, pp 407-415.
</mixed-citation>
</ref>
<ref id="R8">
<label>[8]</label>
<mixed-citation publication-type="other">Huang, T., Zhou, Y., &#x00026; Zhu, C. (2019). An empirical analysis of deep models for stock price predictions. Expert Systems with Applications, pp. 127, 220-230.
</mixed-citation>
</ref>
<ref id="R9">
<label>[9]</label>
<mixed-citation publication-type="other">Lim, WT., Wang, L., Wang, Y., Chang, Q. (2016). Housing price prediction using neural Networks. International conference on natural computation. pp. 518-522.
</mixed-citation>
</ref>
<ref id="R10">
<label>[10]</label>
<mixed-citation publication-type="other">Mingyue, Q., Cheng, L., &#x00026; Yu, S. (2018). Application of the Artificial Neural Network in Predicting the Direction of Stock Market Index. IEEE. DOI: 10.1109/CISIS.2016.115.
</mixed-citation>
</ref>
<ref id="R11">
<label>[11]</label>
<mixed-citation publication-type="other">Moghar, A., &#x00026; Hamiche, M. (2020). Stock Market Prediction Using LSTM Recurrent Neural Network. Procedia Computer Science, 170(3), 1168-1173. https://doi.org/10.1016/j.procs.2020.03.049
</mixed-citation>
</ref>
<ref id="R12">
<label>[12]</label>
<mixed-citation publication-type="other">Patel, J., Patel, M., &#x00026; Darji, M. (2018). Stock Price Prediction Using RNN and LSTM. Journal of Emerging Technologies and Innovative Research (JETIR), 5(11), 1069-1080. https://www.jetir.org/papers/JETIRK006164.pdf
</mixed-citation>
</ref>
<ref id="R13">
<label>[13]</label>
<mixed-citation publication-type="other">Pawar, K., Jalem, S., &#x00026; Tiwari, V. (2018). Stock Market Price Prediction Using LSTM RNN. Advances in Intelligent Systems and Computing, 841. https://link.springer.com/chapter/10.1007/978-981-13-2285-3_58
</mixed-citation>
</ref>
<ref id="R14">
<label>[14]</label>
<mixed-citation publication-type="other">Pires, Ivan &#x00026; Hussain, Faisal &#x00026; Garcia, Nuno &#x00026; Lameski, Petre &#x00026; Zdravevski, Eftim. (2020). Homogeneous Data Normalization and Deep Learning: A Case Study in Human Activity Classification. Future Internet. 12. 10.3390/fi12110194.
</mixed-citation>
</ref>
<ref id="R15">
<label>[15]</label>
<mixed-citation publication-type="other">Prasanna S, Ezhilmaran D. (2013). An analysis of stock market prediction using data mining techniques. Int J Comput Sci Eng Technol. 4(3):49-51.
</mixed-citation>
</ref>
<ref id="R16">
<label>[16]</label>
<mixed-citation publication-type="other">Rajak, R. (2021, June 28). Share Price Prediction using RNN and LSTM | by Rishi Rajak | Analytics Vidhya. Medium. Retrieved October 1, 2023, from https://medium.com/analytics-vidhya/share-price-prediction-using-rnn-and-lstm-8776456dea6f
</mixed-citation>
</ref>
<ref id="R17">
<label>[17]</label>
<mixed-citation publication-type="other">S. Siami-Namini, N. Tavakoli and A. Siami Namin, "A Comparison of ARIMA and LSTM in Forecasting Time Series," 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), Orlando, FL, USA, 2018, pp. 1394-1401, doi: 10.1109/ICMLA.2018.00227.
</mixed-citation>
</ref>
<ref id="R18">
<label>[18]</label>
<mixed-citation publication-type="other">Singh, U., Bhuriya, D., &#x00026; Sharma, A. (2017). Survey of stock market prediction using machine learning approach. IEEE. DOI: 10.1109/ICECA.2017.8212715
</mixed-citation>
</ref>
<ref id="R19">
<label>[19]</label>
<mixed-citation publication-type="other">Sreekumar, D., &#x00026; George, E. (2023, March 23). What is Quantitative Research? Definition, Methods, Types, and Examples. Researcher.Life. Retrieved October 1, 2023, from https://researcher.life/blog/article/what-is-quantitative-research-types-and-examples/
</mixed-citation>
</ref>
<ref id="R20">
<label>[20]</label>
<mixed-citation publication-type="other">Zheng T, Fataliyev K, Wang L. (2013). Wavelet neural networks for stock trading. In: Independent component analysis, compressive sampling, wavelets, neural net, biosystems, and nanoengineering. International Society for Optics and Photonics. 10(3): 10-37.
</mixed-citation>
</ref>
<ref id="R21">
<label>[21]</label>
<mixed-citation publication-type="other">Zhu, Y. (2020). Stock price prediction using the RNN model. Journal of Physics Conference Series. DOI:10.1088/1742-6596/1650/3/032103
</mixed-citation>
</ref>
    </ref-list>
  </back>
</article>