
Gold price analysis (Neural network)

April 17, 2024

1 Analysis of the historical evolution of gold and creation of a neural network algorithm for forecasting

Created by: David Zerpa

Import of packages for managing DataFrames

[45]: import pandas as pd
import numpy as np

Load Data (Data from Investing.com)

[46]: gold_1 = pd.read_csv('Datos Históricos - Precios Oro.csv')  # Gold prices from January 1993 to October 2012

[47]: gold_2 = pd.read_csv('Dados Históricos - Precios Oro(1).csv')  # Gold prices from October 2012 to March 2024

[48]: gold_1.head()

[48]: Data Último Abertura Máxima Mínima Vol. Var%


0 16.10.2012 1.746,30 1.738,00 1.749,60 1.736,10 105,63K 0,50%
1 15.10.2012 1.737,60 1.755,20 1.755,50 1.729,70 172,77K -1,26%
2 12.10.2012 1.759,70 1.769,70 1.775,00 1.753,50 130,39K -0,62%
3 11.10.2012 1.770,60 1.764,90 1.776,60 1.760,20 117,57K 0,31%
4 10.10.2012 1.765,10 1.766,00 1.770,00 1.758,50 129,54K 0,01%

[49]: gold_1.info()

<class 'pandas.core.frame.DataFrame'>
RangeIndex: 5000 entries, 0 to 4999
Data columns (total 7 columns):
# Column Non-Null Count Dtype
--- ------ -------------- -----
0 Data 5000 non-null object
1 Último 5000 non-null object
2 Abertura 5000 non-null object
3 Máxima 5000 non-null object
4 Mínima 5000 non-null object
5 Vol. 4351 non-null object
6 Var% 5000 non-null object

dtypes: object(7)
memory usage: 273.6+ KB

[50]: gold_2.head()

[50]: Data Último Abertura Máxima Mínima Vol. Var%


0 03.04.2024 2.316,90 2.299,65 2.317,10 2.285,90 NaN 1,54%
1 02.04.2024 2.281,80 2.272,70 2.301,90 2.267,10 277,23K 1,09%
2 01.04.2024 2.257,10 2.259,20 2.286,40 2.249,10 220,77K 0,10%
3 29.03.2024 2.254,80 2.254,80 2.254,80 2.254,80 NaN 0,73%
4 28.03.2024 2.238,40 2.215,70 2.256,90 2.207,50 250,25K 1,16%

[51]: gold_2.info()

<class 'pandas.core.frame.DataFrame'>
RangeIndex: 2967 entries, 0 to 2966
Data columns (total 7 columns):
# Column Non-Null Count Dtype
--- ------ -------------- -----
0 Data 2967 non-null object
1 Último 2967 non-null object
2 Abertura 2967 non-null object
3 Máxima 2967 non-null object
4 Mínima 2967 non-null object
5 Vol. 2945 non-null object
6 Var% 2967 non-null object
dtypes: object(7)
memory usage: 162.4+ KB
We can see that all columns are of type object. This is because special characters prevent Pandas from parsing the values correctly when the data is loaded. We must clean the data and convert every column to the proper format before starting to work with the DataFrame.
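For example, a single value from the table above illustrates the problem: '1.746,30' uses '.' as the thousands separator and ',' as the decimal separator, so a direct conversion to float fails. A minimal sketch of the conversion we will apply column by column:

raw = '1.746,30'                                 # European number formatting used in the CSV
clean = raw.replace('.', '').replace(',', '.')   # drop thousands separator, use '.' as decimal
print(float(clean))                              # 1746.3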

1.1 Data cleaning


We create a function for future use so we don't have to repeat the cleaning process. We also have two DataFrames that we want to combine into one, so writing a function saves us from having to write the same code twice.

[52]: def cleaningDataFrame(df):
    # Start by converting the 'Data' column from object to datetime format
    df.Data = pd.to_datetime(df.Data, dayfirst=True)

    # Convert the numeric columns to a suitable format. These columns use '.' as the
    # thousands separator and ',' as the decimal separator, so we first remove the
    # dots and then replace the commas with dots; otherwise converting to a numeric
    # type gives an error, because Python uses dots (not commas) as the decimal separator.
    for col in df.columns[1:5]:
        df[col] = df[col].str.replace('.', '', regex=False)
        df[col] = df[col].str.replace(',', '.', regex=False).astype(float)

    # Clean the last two columns, removing non-numeric characters and changing
    # commas to dots so the conversion succeeds without errors
    df['Vol.'] = df['Vol.'].str.replace('K', '', regex=False)
    df['Vol.'] = df['Vol.'].str.replace(',', '.', regex=False).astype(float)
    df['Vol.'] = df['Vol.'] * 1000
    df['Var%'] = df['Var%'].str.replace('%', '', regex=False)
    df['Var%'] = df['Var%'].str.replace(',', '.', regex=False).astype(float)

    # Return the DataFrame after the cleaning process
    return df

[53]: gold_1 = cleaningDataFrame(gold_1)

[54]: gold_1.info()

<class 'pandas.core.frame.DataFrame'>
RangeIndex: 5000 entries, 0 to 4999
Data columns (total 7 columns):
# Column Non-Null Count Dtype
--- ------ -------------- -----
0 Data 5000 non-null datetime64[ns]
1 Último 5000 non-null float64
2 Abertura 5000 non-null float64
3 Máxima 5000 non-null float64
4 Mínima 5000 non-null float64
5 Vol. 4351 non-null float64
6 Var% 5000 non-null float64
dtypes: datetime64[ns](1), float64(6)
memory usage: 273.6 KB

[55]: gold_1.head()

[55]: Data Último Abertura Máxima Mínima Vol. Var%


0 2012-10-16 1746.3 1738.0 1749.6 1736.1 105630.0 0.50
1 2012-10-15 1737.6 1755.2 1755.5 1729.7 172770.0 -1.26
2 2012-10-12 1759.7 1769.7 1775.0 1753.5 130390.0 -0.62

3 2012-10-11 1770.6 1764.9 1776.6 1760.2 117570.0 0.31
4 2012-10-10 1765.1 1766.0 1770.0 1758.5 129540.0 0.01

As we can see, all the formats have been changed just as we want, ready for subsequent analysis.

[56]: gold_2 = cleaningDataFrame(gold_2)


[57]: gold_2.info()

<class 'pandas.core.frame.DataFrame'>
RangeIndex: 2967 entries, 0 to 2966
Data columns (total 7 columns):
# Column Non-Null Count Dtype
--- ------ -------------- -----
0 Data 2967 non-null datetime64[ns]
1 Último 2967 non-null float64
2 Abertura 2967 non-null float64
3 Máxima 2967 non-null float64
4 Mínima 2967 non-null float64
5 Vol. 2945 non-null float64
6 Var% 2967 non-null float64
dtypes: datetime64[ns](1), float64(6)
memory usage: 162.4 KB

[58]: gold_2.head()

[58]: Data Último Abertura Máxima Mínima Vol. Var%


0 2024-04-03 2316.9 2299.65 2317.1 2285.9 NaN 1.54
1 2024-04-02 2281.8 2272.70 2301.9 2267.1 277230.0 1.09
2 2024-04-01 2257.1 2259.20 2286.4 2249.1 220770.0 0.10
3 2024-03-29 2254.8 2254.80 2254.8 2254.8 NaN 0.73
4 2024-03-28 2238.4 2215.70 2256.9 2207.5 250250.0 1.16

1.2 Unification of the DataFrames into one


[59]: gold_prices93_2024 = pd.concat([gold_1,gold_2],ignore_index=True)

[60]: gold_prices93_2024.head(-5)

[60]: Data Último Abertura Máxima Mínima Vol. Var%


0 2012-10-16 1746.3 1738.0 1749.6 1736.1 105630.0 0.50
1 2012-10-15 1737.6 1755.2 1755.5 1729.7 172770.0 -1.26

2 2012-10-12 1759.7 1769.7 1775.0 1753.5 130390.0 -0.62
3 2012-10-11 1770.6 1764.9 1776.6 1760.2 117570.0 0.31
4 2012-10-10 1765.1 1766.0 1770.0 1758.5 129540.0 0.01
… … … … … … … …
7957 2012-10-30 1712.1 1710.3 1715.7 1705.2 57250.0 0.20
7958 2012-10-29 1708.7 1712.2 1717.8 1706.2 57610.0 -0.19
7959 2012-10-26 1711.9 1712.1 1719.9 1701.4 126510.0 -0.06
7960 2012-10-25 1713.0 1702.8 1718.9 1702.3 141160.0 0.67
7961 2012-10-24 1701.6 1709.0 1715.2 1698.7 140580.0 -0.46

[7962 rows x 7 columns]

[61]: gold_prices93_2024.info()

<class 'pandas.core.frame.DataFrame'>
RangeIndex: 7967 entries, 0 to 7966
Data columns (total 7 columns):
# Column Non-Null Count Dtype
--- ------ -------------- -----
0 Data 7967 non-null datetime64[ns]
1 Último 7967 non-null float64
2 Abertura 7967 non-null float64
3 Máxima 7967 non-null float64
4 Mínima 7967 non-null float64
5 Vol. 7296 non-null float64
6 Var% 7967 non-null float64
dtypes: datetime64[ns](1), float64(6)
memory usage: 435.8 KB

[62]: gold_prices93_2024 = gold_prices93_2024.sort_values(by='Data',ascending=False)

[63]: gold_prices93_2024.head(-5)

[63]: Data Último Abertura Máxima Mínima Vol. Var%


5000 2024-04-03 2316.9 2299.65 2317.1 2285.9 NaN 1.54
5001 2024-04-02 2281.8 2272.70 2301.9 2267.1 277230.0 1.09
5002 2024-04-01 2257.1 2259.20 2286.4 2249.1 220770.0 0.10
5003 2024-03-29 2254.8 2254.80 2254.8 2254.8 NaN 0.73
5004 2024-03-28 2238.4 2215.70 2256.9 2207.5 250250.0 1.16
… … … … … … … …
4990 1993-01-15 327.5 327.50 327.5 327.5 NaN -0.09
4991 1993-01-14 327.8 328.40 328.4 327.8 0.0 -0.09
4992 1993-01-13 328.1 328.10 328.1 328.1 NaN -0.36
4993 1993-01-12 329.3 329.00 329.0 329.0 NaN 0.30
4994 1993-01-11 328.3 329.20 329.2 329.2 0.0 -0.52

[7962 rows x 7 columns]

1.3 Exploratory data analysis
[64]: import matplotlib.pyplot as plt
import seaborn as sns

[65]: gold_prices93_2024.set_index('Data', inplace=True)

[66]: gold_prices93_2024.head()

[66]: Último Abertura Máxima Mínima Vol. Var%


Data
2024-04-03 2316.9 2299.65 2317.1 2285.9 NaN 1.54
2024-04-02 2281.8 2272.70 2301.9 2267.1 277230.0 1.09
2024-04-01 2257.1 2259.20 2286.4 2249.1 220770.0 0.10
2024-03-29 2254.8 2254.80 2254.8 2254.8 NaN 0.73
2024-03-28 2238.4 2215.70 2256.9 2207.5 250250.0 1.16

[67]: time_series = gold_prices93_2024['Último']

[68]: # We extract the time series from our DataFrame
time_series.head()

[68]: Data
2024-04-03 2316.9
2024-04-02 2281.8
2024-04-01 2257.1
2024-03-29 2254.8
2024-03-28 2238.4
Name: Último, dtype: float64

Trend Visualization

[69]: plt.figure(figsize=(12, 6))


sns.lineplot(x=time_series.index, y=time_series)
plt.title('Gold price time series')
plt.xlabel('Date')
plt.ylabel('Price (USD)')
plt.show()

Decomposition of the time series to identify trend and seasonality

[70]: from statsmodels.tsa.seasonal import seasonal_decompose

[71]: decomposition = seasonal_decompose(time_series, model='additive', period=365)  # annual period

trend = decomposition.trend
seasonal = decomposition.seasonal
residual = decomposition.resid

[72]: plt.figure(figsize=(12, 8))


plt.subplot(411)
plt.plot(time_series, label='Original')
plt.legend(loc='best')
plt.subplot(412)
plt.plot(trend, label='Trend')
plt.legend(loc='best')
plt.subplot(413)
plt.plot(seasonal, label='Seasonality')
plt.legend(loc='best')
plt.subplot(414)
plt.plot(residual, label='Resid')
plt.legend(loc='best')
plt.tight_layout()

We can see that there is an upward trend since the beginning of the series. There is also seasonality (which we will continue analysing to reach better conclusions), and there is also noise in the series. These are elements to take into account when developing our time series model.
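As a quick visual check of the upward trend (a minimal sketch, assuming time_series is the 'Último' series defined above), a long rolling mean smooths out the seasonal and noise components:

# Sort chronologically (the series is stored newest-first) and smooth with a 365-observation window
ts_sorted = time_series.sort_index()
rolling_mean = ts_sorted.rolling(window=365).mean()

plt.figure(figsize=(12, 6))
plt.plot(ts_sorted, label='Daily close', alpha=0.5)
plt.plot(rolling_mean, label='365-day rolling mean')
plt.legend()
plt.show()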

[73]: # Autocorrelation
plt.figure(figsize=(10, 6))
pd.plotting.autocorrelation_plot(time_series)
plt.title('Time Series Autocorrelation')
plt.show()

Based on the autocorrelation, we can conclude that there is initially a strong positive correlation at the first temporal lags, which could indicate the presence of seasonality or repetitive patterns in the series. However, this correlation decreases over time and eventually becomes weak or disappears, suggesting greater random variability in the time series.
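To put approximate numbers on that observation, a small sketch using pandas' Series.autocorr (which shifts the series positionally, so we sort it chronologically first) samples the autocorrelation at a few lags:

ts_sorted = time_series.sort_index()   # chronological order
for lag in [1, 5, 30, 90, 365, 1000]:
    print(f'lag {lag:4d}: autocorrelation = {ts_sorted.autocorr(lag=lag):.3f}')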
Monthly seasonality analysis

[74]: monthly_seasonality = time_series.groupby(time_series.index.month).mean()

plt.figure(figsize=(10, 6))
monthly_seasonality.plot(kind='bar')
plt.title('Gold prices monthly seasonality')
plt.xlabel('Month')
plt.ylabel('Average price')
plt.xticks(rotation=0)
plt.show()

Annual seasonality analysis

[75]: yearly_seasonality = time_series.groupby(time_series.index.year).mean()

plt.figure(figsize=(10, 6))
yearly_seasonality.plot(kind='line', marker='o')
plt.title('Gold prices Annual seasonality')
plt.xlabel('Year')
plt.ylabel('Average Price')
plt.xticks(rotation=45)
plt.grid(True)
plt.show()

Quarterly seasonality analysis

[76]: quarterly_seasonality = time_series.groupby(lambda x: x.quarter).mean()

plt.figure(figsize=(10, 6))
quarterly_seasonality.plot(kind='bar')
plt.title('Gold prices quarterly seasonality')
plt.xlabel('Quarter')
plt.ylabel('Average price')
plt.xticks(rotation=0)
plt.grid(True)
plt.show()

As we can see, we apparently cannot identify seasonality by month, year, or quarter.
While we may not find clear seasonality in terms of years, months, or quarters, the fact that we can observe a general seasonal pattern in the graphical representation suggests that some form of seasonality is present in the data. This could indicate that the data follows a recurring pattern over a time period other than a year, month, or quarter.
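One way to probe other frequencies, along the same lines as the monthly and quarterly groupings above, is a day-of-week grouping (a sketch; weekends are mostly absent from trading data, so their bars may be empty):

weekday_seasonality = time_series.groupby(time_series.index.dayofweek).mean()

plt.figure(figsize=(10, 6))
weekday_seasonality.plot(kind='bar')
plt.title('Gold prices by day of week')
plt.xlabel('Day of week (0 = Monday)')
plt.ylabel('Average price')
plt.xticks(rotation=0)
plt.show()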

[77]: # Now we create a new DataFrame with the time series to improve neural network model performance

gold_series = pd.DataFrame({'Price': time_series.values}, index=time_series.index)

[78]: gold_series.head()

[78]: Price
Data
2024-04-03 2316.9
2024-04-02 2281.8
2024-04-01 2257.1
2024-03-29 2254.8
2024-03-28 2238.4

1.4 Data preparation
[79]: # Importing necessary libraries to normalize
from sklearn.preprocessing import MinMaxScaler

# Normalizing the data
scaler = MinMaxScaler(feature_range=(0, 1))
df_scaled = scaler.fit_transform(gold_series.Price.values.reshape(-1, 1))

## Creating sequences
def create_dataset(dataset, time_step=1):
    dataX, dataY = [], []
    for i in range(len(dataset) - time_step - 1):
        a = dataset[i:(i + time_step), 0]
        dataX.append(a)
        dataY.append(dataset[i + time_step, 0])
    return np.array(dataX), np.array(dataY)

time_step = 60
X, y = create_dataset(df_scaled, time_step)
X = X.reshape(X.shape[0], X.shape[1], 1)

## Split data into training and test sets
split_fraction = 0.8
train_size = int(len(X) * split_fraction)

X_train, X_test = X[:train_size], X[train_size:]
y_train, y_test = y[:train_size], y[train_size:]
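A quick sanity check on the resulting arrays (a sketch; the exact counts depend on the number of rows loaded above — with roughly 7,967 daily records and a 60-day window, about 7,900 sequences are produced):

print('X:', X.shape)                                       # (samples, time_step, 1)
print('X_train:', X_train.shape, 'X_test:', X_test.shape)
print('y_train:', y_train.shape, 'y_test:', y_test.shape)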

1.5 Building the RNN Model – MODEL 1


[80]: from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM, Dropout, Input

# Model definition
model = Sequential()
# 1. Sequential model: this initializes a linear stack of layers. It's the
#    foundation for building the neural network.

model.add(Input(shape=(time_step, 1)))
# 2. Input layer: specifies the shape of the input data. Here, it expects data
#    with `time_step` timesteps, each with 1 feature. This is crucial for time
#    series forecasting where sequence matters.

model.add(LSTM(50, return_sequences=True))
# 3. LSTM layer: a type of recurrent neural network (RNN) layer suited for
#    sequence prediction problems. It has 50 units (neurons). The
#    `return_sequences=True` parameter means it returns the output for each
#    timestep, allowing the next LSTM layer to receive sequences rather than a
#    flattened output, which is essential for capturing temporal dependencies in data.

model.add(Dropout(0.2))
# 4. Dropout layer: randomly sets input units to 0 with a frequency of 20% at
#    each step during training, which helps prevent overfitting. Overfitting
#    occurs when a model learns the detail and noise in the training data to the
#    extent that it negatively impacts performance on new data. By dropping out
#    neurons, the network is forced to learn more robust features that are useful
#    in conjunction with many different random subsets of the other neurons.

model.add(LSTM(50))
# 5. Second LSTM layer: another LSTM layer with 50 units but without returning
#    sequences, meaning this layer only returns the output of the last timestep.
#    It further analyzes the information provided by the previous LSTM layer,
#    summarizing the learned features before passing them to the dense layers.

model.add(Dropout(0.2))
# 6. Second Dropout layer: again, this layer helps reduce overfitting by randomly
#    dropping 20% of the connections, ensuring that the model does not rely too
#    heavily on any particular neuron.

model.add(Dense(25))
# 7. Dense layer: a fully connected layer with 25 neurons. It takes the output
#    from the previous LSTM layer and integrates the learned features ahead of
#    the final regression output.

model.add(Dense(1))
# 8. Output Dense layer: the final layer with a single neuron, which outputs the
#    model's prediction. For regression tasks, like forecasting a continuous
#    value (e.g., a price), it uses linear activation by default.

# Model compilation
model.compile(optimizer='adam', loss='mean_squared_error')
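To inspect the stacked layers and parameter counts before training, the standard Keras summary can be printed:

model.summary()   # lists each layer's output shape and number of trainable parameters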

1.6 Model training and validation


[81]: # Model training
history = model.fit(X_train, y_train, epochs=100, batch_size=64, validation_data=(X_test, y_test), verbose=2)

Epoch 1/100
99/99 - 4s - 38ms/step - loss: 0.0130 - val_loss: 9.4971e-05
Epoch 2/100
99/99 - 2s - 23ms/step - loss: 0.0019 - val_loss: 7.5848e-06
Epoch 3/100
99/99 - 2s - 23ms/step - loss: 0.0015 - val_loss: 6.6699e-06
Epoch 4/100
99/99 - 3s - 25ms/step - loss: 0.0013 - val_loss: 1.3386e-05
Epoch 5/100
99/99 - 3s - 26ms/step - loss: 0.0011 - val_loss: 1.1442e-05
Epoch 6/100
99/99 - 3s - 26ms/step - loss: 0.0010 - val_loss: 6.1237e-06
Epoch 7/100
99/99 - 2s - 24ms/step - loss: 9.7885e-04 - val_loss: 7.0141e-06
Epoch 8/100
99/99 - 2s - 23ms/step - loss: 8.5437e-04 - val_loss: 5.6327e-05
Epoch 9/100
99/99 - 2s - 24ms/step - loss: 7.5665e-04 - val_loss: 7.0768e-06
Epoch 10/100
99/99 - 2s - 25ms/step - loss: 7.3605e-04 - val_loss: 9.8303e-06
Epoch 11/100
99/99 - 2s - 23ms/step - loss: 7.5934e-04 - val_loss: 6.9272e-06
Epoch 12/100
99/99 - 2s - 24ms/step - loss: 6.9216e-04 - val_loss: 5.3564e-06
Epoch 13/100
99/99 - 2s - 23ms/step - loss: 6.4600e-04 - val_loss: 1.1854e-05
Epoch 14/100
99/99 - 2s - 23ms/step - loss: 6.2626e-04 - val_loss: 5.4214e-06
Epoch 15/100
99/99 - 2s - 23ms/step - loss: 6.7290e-04 - val_loss: 1.7734e-05
Epoch 16/100
99/99 - 2s - 24ms/step - loss: 5.9394e-04 - val_loss: 5.4544e-06
Epoch 17/100
99/99 - 2s - 24ms/step - loss: 5.8198e-04 - val_loss: 2.5878e-05

Epoch 18/100
99/99 - 2s - 24ms/step - loss: 6.0419e-04 - val_loss: 4.8426e-06
Epoch 19/100
99/99 - 2s - 25ms/step - loss: 6.5408e-04 - val_loss: 7.4312e-05
Epoch 20/100
99/99 - 2s - 25ms/step - loss: 6.0698e-04 - val_loss: 2.2073e-04
Epoch 21/100
99/99 - 2s - 24ms/step - loss: 6.0888e-04 - val_loss: 1.1143e-04
Epoch 22/100
99/99 - 2s - 23ms/step - loss: 6.0037e-04 - val_loss: 2.3428e-05
Epoch 23/100
99/99 - 2s - 24ms/step - loss: 6.1911e-04 - val_loss: 2.0695e-05
Epoch 24/100
99/99 - 2s - 25ms/step - loss: 6.0378e-04 - val_loss: 1.0429e-05
Epoch 25/100
99/99 - 3s - 27ms/step - loss: 6.1293e-04 - val_loss: 1.3531e-04
Epoch 26/100
99/99 - 3s - 26ms/step - loss: 5.8850e-04 - val_loss: 3.5075e-05
Epoch 27/100
99/99 - 2s - 24ms/step - loss: 5.7250e-04 - val_loss: 8.5741e-06
Epoch 28/100
99/99 - 3s - 25ms/step - loss: 5.9071e-04 - val_loss: 1.8856e-05
Epoch 29/100
99/99 - 3s - 26ms/step - loss: 6.1326e-04 - val_loss: 5.3208e-05
Epoch 30/100
99/99 - 3s - 27ms/step - loss: 5.8837e-04 - val_loss: 1.0588e-05
Epoch 31/100
99/99 - 3s - 26ms/step - loss: 5.6843e-04 - val_loss: 1.5737e-04
Epoch 32/100
99/99 - 3s - 25ms/step - loss: 5.7359e-04 - val_loss: 5.1152e-06
Epoch 33/100
99/99 - 3s - 26ms/step - loss: 5.6634e-04 - val_loss: 1.0342e-04
Epoch 34/100
99/99 - 3s - 27ms/step - loss: 6.1721e-04 - val_loss: 3.9151e-06
Epoch 35/100
99/99 - 3s - 26ms/step - loss: 5.6913e-04 - val_loss: 1.1561e-05
Epoch 36/100
99/99 - 3s - 26ms/step - loss: 6.0420e-04 - val_loss: 3.7633e-06
Epoch 37/100
99/99 - 3s - 27ms/step - loss: 5.8666e-04 - val_loss: 1.1366e-05
Epoch 38/100
99/99 - 3s - 26ms/step - loss: 5.4071e-04 - val_loss: 3.4420e-05
Epoch 39/100
99/99 - 3s - 26ms/step - loss: 5.5971e-04 - val_loss: 5.0782e-05
Epoch 40/100
99/99 - 3s - 26ms/step - loss: 5.7754e-04 - val_loss: 3.2586e-05
Epoch 41/100
99/99 - 3s - 25ms/step - loss: 5.6429e-04 - val_loss: 1.1648e-04

Epoch 42/100
99/99 - 3s - 27ms/step - loss: 6.0148e-04 - val_loss: 3.6349e-05
Epoch 43/100
99/99 - 2s - 25ms/step - loss: 5.3978e-04 - val_loss: 1.3722e-05
Epoch 44/100
99/99 - 3s - 27ms/step - loss: 5.8052e-04 - val_loss: 8.3608e-06
Epoch 45/100
99/99 - 3s - 26ms/step - loss: 5.5936e-04 - val_loss: 3.5196e-05
Epoch 46/100
99/99 - 3s - 26ms/step - loss: 5.8337e-04 - val_loss: 3.4584e-05
Epoch 47/100
99/99 - 2s - 25ms/step - loss: 5.8590e-04 - val_loss: 1.2082e-05
Epoch 48/100
99/99 - 3s - 26ms/step - loss: 5.4867e-04 - val_loss: 5.7363e-06
Epoch 49/100
99/99 - 3s - 27ms/step - loss: 5.9701e-04 - val_loss: 4.6206e-06
Epoch 50/100
99/99 - 3s - 27ms/step - loss: 5.6745e-04 - val_loss: 5.2262e-06
Epoch 51/100
99/99 - 3s - 27ms/step - loss: 5.8653e-04 - val_loss: 1.5183e-04
Epoch 52/100
99/99 - 3s - 27ms/step - loss: 5.5162e-04 - val_loss: 4.4254e-05
Epoch 53/100
99/99 - 3s - 26ms/step - loss: 5.5882e-04 - val_loss: 7.2491e-05
Epoch 54/100
99/99 - 3s - 27ms/step - loss: 5.6574e-04 - val_loss: 1.0307e-04
Epoch 55/100
99/99 - 3s - 26ms/step - loss: 5.7468e-04 - val_loss: 1.4743e-04
Epoch 56/100
99/99 - 3s - 27ms/step - loss: 6.0322e-04 - val_loss: 8.1270e-05
Epoch 57/100
99/99 - 3s - 27ms/step - loss: 5.5312e-04 - val_loss: 6.3004e-05
Epoch 58/100
99/99 - 2s - 25ms/step - loss: 5.5602e-04 - val_loss: 1.2979e-05
Epoch 59/100
99/99 - 2s - 25ms/step - loss: 5.5543e-04 - val_loss: 1.2124e-05
Epoch 60/100
99/99 - 2s - 25ms/step - loss: 5.4647e-04 - val_loss: 9.1151e-06
Epoch 61/100
99/99 - 2s - 24ms/step - loss: 5.6134e-04 - val_loss: 7.5481e-06
Epoch 62/100
99/99 - 2s - 24ms/step - loss: 5.2441e-04 - val_loss: 3.0390e-06
Epoch 63/100
99/99 - 3s - 28ms/step - loss: 5.5493e-04 - val_loss: 1.3405e-05
Epoch 64/100
99/99 - 2s - 24ms/step - loss: 5.3586e-04 - val_loss: 9.0481e-05
Epoch 65/100
99/99 - 2s - 24ms/step - loss: 5.9240e-04 - val_loss: 5.8765e-06

Epoch 66/100
99/99 - 2s - 25ms/step - loss: 5.3960e-04 - val_loss: 1.2643e-04
Epoch 67/100
99/99 - 2s - 24ms/step - loss: 5.6391e-04 - val_loss: 1.1913e-05
Epoch 68/100
99/99 - 2s - 24ms/step - loss: 5.5925e-04 - val_loss: 2.6741e-05
Epoch 69/100
99/99 - 2s - 24ms/step - loss: 5.8842e-04 - val_loss: 6.1013e-05
Epoch 70/100
99/99 - 2s - 24ms/step - loss: 5.8400e-04 - val_loss: 4.0890e-04
Epoch 71/100
99/99 - 2s - 25ms/step - loss: 5.5859e-04 - val_loss: 2.1998e-05
Epoch 72/100
99/99 - 2s - 23ms/step - loss: 5.4594e-04 - val_loss: 2.9647e-06
Epoch 73/100
99/99 - 2s - 23ms/step - loss: 5.1706e-04 - val_loss: 4.9980e-05
Epoch 74/100
99/99 - 3s - 26ms/step - loss: 5.4792e-04 - val_loss: 1.8434e-04
Epoch 75/100
99/99 - 3s - 26ms/step - loss: 5.3778e-04 - val_loss: 3.7872e-04
Epoch 76/100
99/99 - 2s - 23ms/step - loss: 5.5681e-04 - val_loss: 1.1534e-04
Epoch 77/100
99/99 - 2s - 23ms/step - loss: 5.3330e-04 - val_loss: 1.5711e-05
Epoch 78/100
99/99 - 3s - 26ms/step - loss: 5.9710e-04 - val_loss: 1.7520e-04
Epoch 79/100
99/99 - 3s - 25ms/step - loss: 5.4407e-04 - val_loss: 2.1019e-05
Epoch 80/100
99/99 - 2s - 23ms/step - loss: 5.4519e-04 - val_loss: 1.1159e-04
Epoch 81/100
99/99 - 2s - 23ms/step - loss: 5.3514e-04 - val_loss: 2.6046e-05
Epoch 82/100
99/99 - 2s - 25ms/step - loss: 5.0742e-04 - val_loss: 6.1785e-05
Epoch 83/100
99/99 - 2s - 25ms/step - loss: 5.5356e-04 - val_loss: 1.5859e-04
Epoch 84/100
99/99 - 2s - 24ms/step - loss: 5.2888e-04 - val_loss: 2.0536e-05
Epoch 85/100
99/99 - 2s - 23ms/step - loss: 5.7007e-04 - val_loss: 1.5923e-05
Epoch 86/100
99/99 - 2s - 24ms/step - loss: 5.2447e-04 - val_loss: 8.3547e-05
Epoch 87/100
99/99 - 3s - 25ms/step - loss: 5.2700e-04 - val_loss: 3.1569e-06
Epoch 88/100
99/99 - 2s - 24ms/step - loss: 5.6278e-04 - val_loss: 2.0232e-05
Epoch 89/100
99/99 - 2s - 22ms/step - loss: 5.2179e-04 - val_loss: 2.0799e-04

Epoch 90/100
99/99 - 2s - 24ms/step - loss: 5.2928e-04 - val_loss: 1.7738e-05
Epoch 91/100
99/99 - 2s - 25ms/step - loss: 5.6055e-04 - val_loss: 7.2826e-06
Epoch 92/100
99/99 - 2s - 25ms/step - loss: 5.6139e-04 - val_loss: 8.6739e-06
Epoch 93/100
99/99 - 2s - 23ms/step - loss: 5.5577e-04 - val_loss: 4.2765e-06
Epoch 94/100
99/99 - 2s - 23ms/step - loss: 5.4223e-04 - val_loss: 1.1974e-05
Epoch 95/100
99/99 - 3s - 27ms/step - loss: 5.0363e-04 - val_loss: 1.9093e-05
Epoch 96/100
99/99 - 3s - 26ms/step - loss: 5.1833e-04 - val_loss: 3.1829e-06
Epoch 97/100
99/99 - 2s - 23ms/step - loss: 5.3811e-04 - val_loss: 7.6731e-05
Epoch 98/100
99/99 - 2s - 23ms/step - loss: 5.6258e-04 - val_loss: 9.4047e-05
Epoch 99/100
99/99 - 2s - 24ms/step - loss: 5.4084e-04 - val_loss: 5.9582e-06
Epoch 100/100
99/99 - 2s - 25ms/step - loss: 5.3078e-04 - val_loss: 2.8627e-06

[82]: # Model evaluation on test set


test_loss = model.evaluate(X_test, y_test)
print(f'Pérdida en el conjunto de prueba: {test_loss}')

50/50 �������������������� 0s 7ms/step - loss:


2.6073e-06
Pérdida en el conjunto de prueba: 2.8626843686652137e-06

[83]: # Generate predictions
predictions = model.predict(X_test)
# Reverse normalization for predictions
actual_predictions = scaler.inverse_transform(predictions)

50/50 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step

[84]: y_test_resized = y_test.reshape(-1, 1)


y_test_reales = scaler.inverse_transform(y_test_resized)

[85]: plt.figure(figsize=(10,6))
plt.plot(y_test_reales, label='Real Value')
plt.plot(actual_predictions, label='Forecast')
plt.title('Gold Price Forecast')
plt.xlabel('Date')
plt.ylabel('Gold Price')
plt.legend()
plt.show()

2 Prediction of the next 60 days
[86]: last_days = df_scaled[:time_step]

future_predictions = []

for i in range(60):
    # Predict using the last available days
    pred = model.predict(last_days.reshape(1, time_step, 1))
    # Add the new prediction at the start of the available data
    last_days = np.insert(last_days, 0, pred, axis=0)
    # Ensure the array maintains the correct size
    last_days = last_days[:time_step]
    # Save the prediction
    future_predictions.append(pred[0, 0])

# Reverse the predictions to match the correct temporal order
future_predictions = future_predictions[::-1]

# Reverse normalization
future_predictions = scaler.inverse_transform(np.array(future_predictions).reshape(-1, 1))


[87]: # Generate future dates without the 'closed' argument
# Assuming gold_series.index[0] is the most recent date

# The last date in the data
last_date = gold_series.index[0]

# Generate 60 future dates starting from the day after the last date
future_dates = pd.date_range(start=last_date + pd.Timedelta(days=1), periods=60)

# Ensure the display is correctly adjusted
plt.figure(figsize=(15, 9))
plt.plot(gold_series.index, gold_series.Price, label='Historical Data')
# Since future dates are generated in order, but the data is in reverse,
# we might need to reverse future_dates or gold_series.index to match them on the graph
plt.plot(future_dates, future_predictions, label='Future Predictions', linestyle='--')

plt.title('Gold Price Prediction for the Next 60 Days')
plt.xlabel('Date')
plt.ylabel('Gold Price')
plt.legend()
plt.xticks(rotation=45)
# If the historical data is in reverse (most recent date first),
# we should also reverse the X-axis to show recent dates on the left
plt.show()

[88]: last_month_data = gold_series.head(30)
last_month_dates = last_month_data.index
plt.figure(figsize=(15, 9))

# Last month's data
plt.plot(last_month_dates, last_month_data.Price, label='Historical Data from Last Month', marker='o')

# Future predictions
# Ensure future_dates start right after the last day of the historical data
plt.plot(future_dates, future_predictions, label='Future Predictions', linestyle='--', marker='x')

# plt.plot(future_dates[::-1], future_predictions, label='Future Predictions', linestyle='--')

plt.title('Gold Price Comparison: Last Month vs. Future Predictions')
plt.xlabel('Date')
plt.ylabel('Gold Price')
plt.legend()
plt.xticks(rotation=45)
plt.grid(True)
plt.show()

[89]: # This code assumes that 'future_dates' is a pandas DatetimeIndex of prediction dates
# and 'future_predictions' contains the predicted values.

plt.figure(figsize=(15, 9))
plt.plot(future_dates, future_predictions, label='Future Predictions', linestyle='--', marker='o')

plt.title('Predicted Evolution of Gold Price for the Next 60 Days')
plt.xlabel('Date')
plt.ylabel('Gold Price')
plt.legend()
plt.xticks(rotation=45)
plt.grid(True)
plt.show()

2.1 Building the RNN Model – MODEL 2
The provided Python function, create_weighted_dataset, is designed to prepare a dataset for
machine learning purposes, where more emphasis is placed on more recent data. This is particularly
useful in time series forecasting where recent trends and patterns might be more indicative of future
outcomes than older data.

[90]: def create_weighted_dataset(dataset, recent_days=7, total_days=60):
    X, y = [], []
    # Iterate through the dataset from the beginning (most recent date) backwards
    for i in range(total_days, len(dataset) - 1):
        # More recent data with greater detail (using a backward approach)
        recent_data = dataset[i - total_days:i - total_days + recent_days]
        # The rest of the older data
        older_data = dataset[i - total_days + recent_days:i]
        # Combine
        combined_data = np.concatenate([recent_data, older_data])
        X.append(combined_data)
        y.append(dataset[i, 0])
    return np.array(X), np.array(y)

[91]: from sklearn.model_selection import train_test_split
# Normalization and data preparation
scaler = MinMaxScaler(feature_range=(0, 1))
df_scaled = scaler.fit_transform(gold_series.Price.values.reshape(-1, 1))

X, y = create_weighted_dataset(df_scaled, recent_days=30, total_days=90)

X = X.reshape(X.shape[0], X.shape[1], 1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

Custom Loss Function: We will implement a custom loss function to make the model more
sensitive to errors in the most recent data.
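To see what the weighting scheme itself looks like, here is a small sketch (with a shortened sequence length purely for illustration) of the linearly increasing weights defined below:

import tensorflow as tf

demo_weights = tf.linspace(1.0, 2.0, 6)   # weights grow linearly from 1.0 to 2.0 across positions
print(demo_weights.numpy())               # [1.  1.2 1.4 1.6 1.8 2. ]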

[92]: import tensorflow as tf

sequence_length = 90
weights = tf.linspace(1.0, 2.0, sequence_length)
weights = tf.reshape(weights, (1, sequence_length))

def weighted_mse(y_true, y_pred):
    sq_diff = tf.square(y_true - y_pred)
    weighted_sq_diff = sq_diff * weights
    return tf.reduce_mean(weighted_sq_diff)

[93]: # Model 2 definition
model_2 = Sequential()
model_2.add(Input(shape=(X.shape[1], 1)))  # Using Input to define the entry shape
model_2.add(LSTM(50, return_sequences=True))
model_2.add(Dropout(0.2))
model_2.add(LSTM(50))
model_2.add(Dropout(0.2))
model_2.add(Dense(25))
model_2.add(Dense(1))
model_2.compile(optimizer='adam', loss=weighted_mse)  # Ensure 'weighted_mse' is correctly defined

# Training the model
history_2 = model_2.fit(X_train, y_train, epochs=100, batch_size=64, validation_data=(X_test, y_test), verbose=2)

Epoch 1/100
99/99 - 5s - 47ms/step - loss: 0.0121 - val_loss: 3.1189e-04
Epoch 2/100
99/99 - 3s - 31ms/step - loss: 0.0022 - val_loss: 3.2325e-04
Epoch 3/100
99/99 - 3s - 31ms/step - loss: 0.0019 - val_loss: 3.2663e-04
Epoch 4/100
99/99 - 3s - 31ms/step - loss: 0.0016 - val_loss: 2.8749e-04

Epoch 5/100
99/99 - 3s - 31ms/step - loss: 0.0015 - val_loss: 4.2687e-04
Epoch 6/100
99/99 - 3s - 31ms/step - loss: 0.0013 - val_loss: 2.2582e-04
Epoch 7/100
99/99 - 3s - 33ms/step - loss: 0.0012 - val_loss: 3.0889e-04
Epoch 8/100
99/99 - 3s - 33ms/step - loss: 0.0011 - val_loss: 2.0867e-04
Epoch 9/100
99/99 - 3s - 33ms/step - loss: 0.0011 - val_loss: 2.1665e-04
Epoch 10/100
99/99 - 3s - 32ms/step - loss: 0.0010 - val_loss: 3.9222e-04
Epoch 11/100
99/99 - 3s - 29ms/step - loss: 0.0010 - val_loss: 1.9851e-04
Epoch 12/100
99/99 - 3s - 30ms/step - loss: 0.0010 - val_loss: 2.4068e-04
Epoch 13/100
99/99 - 3s - 33ms/step - loss: 9.9799e-04 - val_loss: 3.9945e-04
Epoch 14/100
99/99 - 3s - 33ms/step - loss: 9.3437e-04 - val_loss: 3.2376e-04
Epoch 15/100
99/99 - 3s - 33ms/step - loss: 9.6877e-04 - val_loss: 3.3437e-04
Epoch 16/100
99/99 - 3s - 33ms/step - loss: 8.8937e-04 - val_loss: 2.0148e-04
Epoch 17/100
99/99 - 3s - 34ms/step - loss: 8.9148e-04 - val_loss: 3.6924e-04
Epoch 18/100
99/99 - 3s - 32ms/step - loss: 8.8240e-04 - val_loss: 2.2275e-04
Epoch 19/100
99/99 - 3s - 32ms/step - loss: 8.5599e-04 - val_loss: 2.1810e-04
Epoch 20/100
99/99 - 3s - 35ms/step - loss: 9.0986e-04 - val_loss: 1.8227e-04
Epoch 21/100
99/99 - 3s - 34ms/step - loss: 8.8320e-04 - val_loss: 2.6968e-04
Epoch 22/100
99/99 - 3s - 32ms/step - loss: 9.4969e-04 - val_loss: 3.3517e-04
Epoch 23/100
99/99 - 3s - 32ms/step - loss: 9.0847e-04 - val_loss: 1.7822e-04
Epoch 24/100
99/99 - 4s - 36ms/step - loss: 9.1784e-04 - val_loss: 2.2486e-04
Epoch 25/100
99/99 - 3s - 35ms/step - loss: 9.2302e-04 - val_loss: 1.9838e-04
Epoch 26/100
99/99 - 3s - 35ms/step - loss: 8.8649e-04 - val_loss: 1.5054e-04
Epoch 27/100
99/99 - 3s - 31ms/step - loss: 8.7615e-04 - val_loss: 1.7378e-04
Epoch 28/100
99/99 - 3s - 32ms/step - loss: 8.6541e-04 - val_loss: 4.0889e-04

Epoch 29/100
99/99 - 3s - 32ms/step - loss: 9.1472e-04 - val_loss: 1.5639e-04
Epoch 30/100
99/99 - 3s - 32ms/step - loss: 9.0386e-04 - val_loss: 1.3789e-04
Epoch 31/100
99/99 - 3s - 32ms/step - loss: 9.0380e-04 - val_loss: 1.4517e-04
Epoch 32/100
99/99 - 3s - 32ms/step - loss: 8.3637e-04 - val_loss: 1.3180e-04
Epoch 33/100
99/99 - 3s - 31ms/step - loss: 9.0694e-04 - val_loss: 1.4760e-04
Epoch 34/100
99/99 - 3s - 34ms/step - loss: 8.9918e-04 - val_loss: 1.4307e-04
Epoch 35/100
99/99 - 3s - 32ms/step - loss: 9.0477e-04 - val_loss: 1.5028e-04
Epoch 36/100
99/99 - 3s - 32ms/step - loss: 8.8931e-04 - val_loss: 1.2974e-04
Epoch 37/100
99/99 - 3s - 35ms/step - loss: 8.6720e-04 - val_loss: 1.8156e-04
Epoch 38/100
99/99 - 3s - 32ms/step - loss: 8.7824e-04 - val_loss: 2.2094e-04
Epoch 39/100
99/99 - 3s - 34ms/step - loss: 8.4188e-04 - val_loss: 4.5172e-04
Epoch 40/100
99/99 - 3s - 34ms/step - loss: 9.2531e-04 - val_loss: 2.9625e-04
Epoch 41/100
99/99 - 3s - 33ms/step - loss: 8.8754e-04 - val_loss: 1.7715e-04
Epoch 42/100
99/99 - 3s - 34ms/step - loss: 8.5897e-04 - val_loss: 2.3130e-04
Epoch 43/100
99/99 - 3s - 29ms/step - loss: 8.7322e-04 - val_loss: 1.3088e-04
Epoch 44/100
99/99 - 3s - 30ms/step - loss: 9.0617e-04 - val_loss: 1.9477e-04
Epoch 45/100
99/99 - 3s - 30ms/step - loss: 8.7715e-04 - val_loss: 1.7226e-04
Epoch 46/100
99/99 - 4s - 36ms/step - loss: 8.8053e-04 - val_loss: 1.0559e-04
Epoch 47/100
99/99 - 4s - 36ms/step - loss: 8.6210e-04 - val_loss: 1.6573e-04
Epoch 48/100
99/99 - 4s - 37ms/step - loss: 8.8757e-04 - val_loss: 1.2454e-04
Epoch 49/100
99/99 - 3s - 34ms/step - loss: 8.7902e-04 - val_loss: 2.1976e-04
Epoch 50/100
99/99 - 3s - 31ms/step - loss: 8.6466e-04 - val_loss: 2.0781e-04
Epoch 51/100
99/99 - 3s - 34ms/step - loss: 7.9545e-04 - val_loss: 1.3942e-04
Epoch 52/100
99/99 - 3s - 29ms/step - loss: 8.5283e-04 - val_loss: 1.0940e-04

Epoch 53/100
99/99 - 3s - 35ms/step - loss: 8.1868e-04 - val_loss: 1.0285e-04
Epoch 54/100
99/99 - 3s - 34ms/step - loss: 9.1962e-04 - val_loss: 1.8148e-04
Epoch 55/100
99/99 - 3s - 33ms/step - loss: 9.1332e-04 - val_loss: 1.2319e-04
Epoch 56/100
99/99 - 3s - 33ms/step - loss: 8.1458e-04 - val_loss: 1.6473e-04
Epoch 57/100
99/99 - 3s - 32ms/step - loss: 8.5022e-04 - val_loss: 1.7144e-04
Epoch 58/100
99/99 - 4s - 36ms/step - loss: 8.0837e-04 - val_loss: 1.1498e-04
Epoch 59/100
99/99 - 4s - 37ms/step - loss: 9.0002e-04 - val_loss: 1.1828e-04
Epoch 60/100
99/99 - 3s - 33ms/step - loss: 8.6013e-04 - val_loss: 2.2564e-04
Epoch 61/100
99/99 - 4s - 37ms/step - loss: 8.5969e-04 - val_loss: 2.4868e-04
Epoch 62/100
99/99 - 4s - 37ms/step - loss: 8.5208e-04 - val_loss: 1.1475e-04
Epoch 63/100
99/99 - 3s - 34ms/step - loss: 8.1505e-04 - val_loss: 9.4489e-05
Epoch 64/100
99/99 - 3s - 33ms/step - loss: 8.3087e-04 - val_loss: 1.4662e-04
Epoch 65/100
99/99 - 4s - 36ms/step - loss: 8.5577e-04 - val_loss: 2.6066e-04
Epoch 66/100
99/99 - 3s - 31ms/step - loss: 8.2369e-04 - val_loss: 1.0735e-04
Epoch 67/100
99/99 - 3s - 34ms/step - loss: 8.3865e-04 - val_loss: 1.0648e-04
Epoch 68/100
99/99 - 3s - 31ms/step - loss: 8.5144e-04 - val_loss: 1.1340e-04
Epoch 69/100
99/99 - 3s - 30ms/step - loss: 8.8682e-04 - val_loss: 2.5481e-04
Epoch 70/100
99/99 - 3s - 32ms/step - loss: 9.3077e-04 - val_loss: 1.7128e-04
Epoch 71/100
99/99 - 3s - 32ms/step - loss: 8.3553e-04 - val_loss: 3.2599e-04
Epoch 72/100
99/99 - 3s - 33ms/step - loss: 8.1229e-04 - val_loss: 1.2546e-04
Epoch 73/100
99/99 - 3s - 33ms/step - loss: 7.8226e-04 - val_loss: 1.5081e-04
Epoch 74/100
99/99 - 3s - 29ms/step - loss: 8.8546e-04 - val_loss: 2.5479e-04
Epoch 75/100
99/99 - 3s - 31ms/step - loss: 8.1387e-04 - val_loss: 7.8812e-05
Epoch 76/100
99/99 - 3s - 34ms/step - loss: 8.2031e-04 - val_loss: 2.4666e-04

Epoch 77/100
99/99 - 3s - 30ms/step - loss: 8.0514e-04 - val_loss: 9.5507e-05
Epoch 78/100
99/99 - 3s - 30ms/step - loss: 9.0787e-04 - val_loss: 1.9740e-04
Epoch 79/100
99/99 - 3s - 34ms/step - loss: 8.4965e-04 - val_loss: 8.4725e-05
Epoch 80/100
99/99 - 3s - 32ms/step - loss: 8.3462e-04 - val_loss: 1.0201e-04
Epoch 81/100
99/99 - 3s - 30ms/step - loss: 8.2107e-04 - val_loss: 1.8833e-04
Epoch 82/100
99/99 - 3s - 32ms/step - loss: 8.4170e-04 - val_loss: 9.5876e-05
Epoch 83/100
99/99 - 3s - 35ms/step - loss: 8.2495e-04 - val_loss: 9.4773e-05
Epoch 84/100
99/99 - 3s - 30ms/step - loss: 8.1833e-04 - val_loss: 1.0244e-04
Epoch 85/100
99/99 - 3s - 33ms/step - loss: 8.1254e-04 - val_loss: 1.3573e-04
Epoch 86/100
99/99 - 3s - 33ms/step - loss: 7.8993e-04 - val_loss: 1.5358e-04
Epoch 87/100
99/99 - 3s - 29ms/step - loss: 8.7287e-04 - val_loss: 8.1502e-05
Epoch 88/100
99/99 - 3s - 31ms/step - loss: 8.0609e-04 - val_loss: 1.7097e-04
Epoch 89/100
99/99 - 3s - 30ms/step - loss: 8.0802e-04 - val_loss: 7.3461e-05
Epoch 90/100
99/99 - 3s - 30ms/step - loss: 8.2437e-04 - val_loss: 1.4295e-04
Epoch 91/100
99/99 - 3s - 32ms/step - loss: 8.2870e-04 - val_loss: 1.6173e-04
Epoch 92/100
99/99 - 3s - 34ms/step - loss: 8.4807e-04 - val_loss: 1.6688e-04
Epoch 93/100
99/99 - 3s - 31ms/step - loss: 8.2351e-04 - val_loss: 1.4493e-04
Epoch 94/100
99/99 - 3s - 30ms/step - loss: 7.9081e-04 - val_loss: 7.6080e-05
Epoch 95/100
99/99 - 3s - 32ms/step - loss: 8.3733e-04 - val_loss: 1.0382e-04
Epoch 96/100
99/99 - 3s - 33ms/step - loss: 8.1837e-04 - val_loss: 1.5865e-04
Epoch 97/100
99/99 - 3s - 30ms/step - loss: 7.8793e-04 - val_loss: 1.0073e-04
Epoch 98/100
99/99 - 3s - 33ms/step - loss: 8.0536e-04 - val_loss: 9.7567e-05
Epoch 99/100
99/99 - 3s - 32ms/step - loss: 8.1719e-04 - val_loss: 8.9304e-05
Epoch 100/100
99/99 - 3s - 32ms/step - loss: 8.2972e-04 - val_loss: 2.1712e-04

[94]: # Model evaluation on the test set
test_loss = model_2.evaluate(X_test, y_test)
print(f'Test set loss: {test_loss}')

50/50 ━━━━━━━━━━━━━━━━━━━━ 1s 11ms/step - loss: 2.1447e-04
Test set loss: 0.00021711694716941565

[96]: # Assuming df_scaled is the normalized data and is reversed (most recent dates at the beginning)

# Take the first `time_step` days to start the prediction (the most recent)
last_days = df_scaled[:60]

future_predictions = []

for i in range(60):
    # Predict using the last available days
    # (note: model_2 was trained on 90-step windows, total_days=90, while a 60-step window is fed here)
    pred = model_2.predict(last_days.reshape(1, time_step, 1))
    # Add the new prediction at the beginning of the available data
    last_days = np.insert(last_days, 0, pred, axis=0)
    # Ensure the array maintains the correct size
    last_days = last_days[:time_step]
    # Save the prediction
    future_predictions.append(pred[0, 0])

# Reverse the predictions to match the correct temporal order
future_predictions = future_predictions[::-1]

# Reverse normalization
future_predictions = scaler.inverse_transform(np.array(future_predictions).reshape(-1, 1))


[97]: # Last month's data
last_month_data = gold_series.head(30)  # Last 30 days of data
last_month_dates = last_month_data.index

plt.figure(figsize=(15, 9))
plt.plot(last_month_dates, last_month_data['Price'], label='Last Month Data', marker='o', color='red')
plt.plot(future_dates, future_predictions, label='Future Predictions', linestyle='--', marker='x', color='blue')

plt.title('Gold Price Comparison: Last Month vs. Future Predictions')
plt.xlabel('Date')
plt.ylabel('Gold Price')
plt.legend()
plt.xticks(rotation=45)
plt.grid(True)
plt.show()

[98]: # Visualization of only future predictions with actual dates
plt.figure(figsize=(10, 6))
plt.plot(future_dates, future_predictions, label='Future Predictions', linestyle='--', marker='o', color='blue')

plt.title('Future Predictions of Gold Price')
plt.xlabel('Date')
plt.ylabel('Gold Price')
plt.legend()
plt.grid(True)
plt.xticks(rotation=45)
plt.show()

2.2 Building the RNN Model – MODEL 3


[99]: # Normalizing the data
scaler = MinMaxScaler(feature_range=(0, 1))
df_scaled = scaler.fit_transform(gold_series.Price.values.reshape(-1, 1))

X, y = create_dataset(df_scaled, time_step)
X = X.reshape(X.shape[0], X.shape[1], 1)

## Splitting the data into training and test sets


split_fraction = 0.8
train_size = int(len(X) * split_fraction)

X_train, X_test = X[:train_size], X[train_size:]


y_train, y_test = y[:train_size], y[train_size:]

Brief Comment on the Attention Layer: The Attention layer allows the model to prioritize
recent gold prices, which can be more relevant for forecasting. By focusing on these key data points,
the model aims to improve prediction accuracy for future gold price trends.
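As background for the layer used below, Keras' Attention layer with a single tensor passed as both query and value computes plain dot-product attention. A minimal sketch on a toy tensor reproduces the same computation manually (the difference printed at the end should be close to zero, up to float precision):

import tensorflow as tf

x = tf.random.normal((1, 4, 8))                  # (batch, timesteps, features) toy tensor

scores = tf.matmul(x, x, transpose_b=True)       # (1, 4, 4) similarity between timesteps
dist = tf.nn.softmax(scores, axis=-1)            # attention weights over the timesteps
context = tf.matmul(dist, x)                     # (1, 4, 8) weighted combination of timesteps

layer_out = tf.keras.layers.Attention()([x, x])  # the same computation via the Keras layer
print(tf.reduce_max(tf.abs(context - layer_out)).numpy())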

[100]: from tensorflow.keras.layers import Attention
from tensorflow.keras.models import Model

# Model 3 definition
def attention_model(time_steps, features):
    inputs = Input(shape=(time_steps, features))
    x = LSTM(50, return_sequences=True)(inputs)
    x = Dropout(0.2)(x)
    # Apply attention
    attention = Attention()([x, x])  # The Attention layer
    x = LSTM(50)(attention)
    x = Dropout(0.2)(x)
    x = Dense(25)(x)
    output = Dense(1)(x)
    model = Model(inputs, output)
    model.compile(optimizer='adam', loss='mean_squared_error')
    return model

model_3 = attention_model(60, 1)

[101]: # Training model 3
history_3 = model_3.fit(X_train, y_train, epochs=100, batch_size=64, validation_data=(X_test, y_test), verbose=2)

Epoch 1/100
99/99 - 5s - 50ms/step - loss: 0.0108 - val_loss: 2.3655e-05
Epoch 2/100
99/99 - 2s - 25ms/step - loss: 0.0020 - val_loss: 2.4251e-05
Epoch 3/100
99/99 - 3s - 25ms/step - loss: 0.0018 - val_loss: 7.4476e-05
Epoch 4/100
99/99 - 3s - 26ms/step - loss: 0.0016 - val_loss: 3.4537e-05
Epoch 5/100
99/99 - 3s - 27ms/step - loss: 0.0015 - val_loss: 3.6620e-05
Epoch 6/100
99/99 - 3s - 28ms/step - loss: 0.0014 - val_loss: 3.2552e-05
Epoch 7/100
99/99 - 3s - 29ms/step - loss: 0.0014 - val_loss: 2.6645e-05
Epoch 8/100
99/99 - 3s - 29ms/step - loss: 0.0012 - val_loss: 4.5494e-05
Epoch 9/100
99/99 - 3s - 29ms/step - loss: 0.0012 - val_loss: 8.3663e-05
Epoch 10/100
99/99 - 3s - 28ms/step - loss: 0.0012 - val_loss: 9.8340e-05
Epoch 11/100
99/99 - 3s - 26ms/step - loss: 0.0011 - val_loss: 2.2851e-05
Epoch 12/100
99/99 - 3s - 29ms/step - loss: 0.0011 - val_loss: 3.5734e-05

Epoch 13/100
99/99 - 3s - 29ms/step - loss: 0.0011 - val_loss: 3.1172e-05
Epoch 14/100
99/99 - 3s - 29ms/step - loss: 0.0011 - val_loss: 1.0445e-04
Epoch 15/100
99/99 - 3s - 28ms/step - loss: 0.0011 - val_loss: 3.9797e-05
Epoch 16/100
99/99 - 3s - 28ms/step - loss: 0.0010 - val_loss: 7.7866e-05
Epoch 17/100
99/99 - 3s - 28ms/step - loss: 0.0010 - val_loss: 2.0950e-04
Epoch 18/100
99/99 - 3s - 28ms/step - loss: 0.0011 - val_loss: 1.8806e-04
Epoch 19/100
99/99 - 3s - 28ms/step - loss: 0.0010 - val_loss: 2.8015e-05
Epoch 20/100
99/99 - 3s - 28ms/step - loss: 0.0011 - val_loss: 2.6986e-05
Epoch 21/100
99/99 - 3s - 29ms/step - loss: 0.0010 - val_loss: 2.9329e-05
Epoch 22/100
99/99 - 3s - 27ms/step - loss: 0.0010 - val_loss: 5.7443e-05
Epoch 23/100
99/99 - 3s - 26ms/step - loss: 0.0011 - val_loss: 7.9482e-05
Epoch 24/100
99/99 - 2s - 25ms/step - loss: 0.0011 - val_loss: 4.3668e-05
Epoch 25/100
99/99 - 3s - 26ms/step - loss: 0.0010 - val_loss: 3.5643e-05
Epoch 26/100
99/99 - 2s - 25ms/step - loss: 0.0010 - val_loss: 3.6051e-05
Epoch 27/100
99/99 - 2s - 25ms/step - loss: 0.0011 - val_loss: 2.2128e-05
Epoch 28/100
99/99 - 3s - 27ms/step - loss: 0.0010 - val_loss: 2.2250e-05
Epoch 29/100
99/99 - 3s - 28ms/step - loss: 0.0011 - val_loss: 1.7235e-04
Epoch 30/100
99/99 - 3s - 28ms/step - loss: 0.0010 - val_loss: 2.8551e-05
Epoch 31/100
99/99 - 3s - 28ms/step - loss: 0.0010 - val_loss: 2.2803e-05
Epoch 32/100
99/99 - 3s - 29ms/step - loss: 0.0010 - val_loss: 4.0071e-05
Epoch 33/100
99/99 - 3s - 28ms/step - loss: 0.0010 - val_loss: 9.4300e-05
Epoch 34/100
99/99 - 3s - 27ms/step - loss: 0.0010 - val_loss: 6.8638e-05
Epoch 35/100
99/99 - 3s - 26ms/step - loss: 0.0011 - val_loss: 2.2473e-05
Epoch 36/100
99/99 - 3s - 27ms/step - loss: 0.0010 - val_loss: 7.5569e-05

Epoch 37/100
99/99 - 3s - 28ms/step - loss: 0.0010 - val_loss: 6.1416e-05
Epoch 38/100
99/99 - 3s - 26ms/step - loss: 0.0010 - val_loss: 5.2404e-05
Epoch 39/100
99/99 - 3s - 28ms/step - loss: 0.0010 - val_loss: 2.3320e-05
Epoch 40/100
99/99 - 3s - 28ms/step - loss: 9.9523e-04 - val_loss: 1.2577e-04
Epoch 41/100
99/99 - 3s - 29ms/step - loss: 9.6982e-04 - val_loss: 2.0828e-04
Epoch 42/100
99/99 - 3s - 29ms/step - loss: 9.5740e-04 - val_loss: 5.0008e-05
Epoch 43/100
99/99 - 3s - 28ms/step - loss: 0.0011 - val_loss: 2.4536e-05
Epoch 44/100
99/99 - 3s - 27ms/step - loss: 9.9017e-04 - val_loss: 9.3202e-05
Epoch 45/100
99/99 - 3s - 28ms/step - loss: 0.0010 - val_loss: 4.0402e-05
Epoch 46/100
99/99 - 3s - 26ms/step - loss: 0.0010 - val_loss: 2.3427e-05
Epoch 47/100
99/99 - 3s - 27ms/step - loss: 0.0010 - val_loss: 1.6582e-04
Epoch 48/100
99/99 - 3s - 28ms/step - loss: 9.7491e-04 - val_loss: 1.8634e-04
Epoch 49/100
99/99 - 3s - 28ms/step - loss: 9.8393e-04 - val_loss: 2.4522e-05
Epoch 50/100
99/99 - 3s - 28ms/step - loss: 9.4542e-04 - val_loss: 2.2694e-05
Epoch 51/100
99/99 - 3s - 29ms/step - loss: 9.5204e-04 - val_loss: 2.2759e-05
Epoch 52/100
99/99 - 3s - 29ms/step - loss: 9.2859e-04 - val_loss: 4.4513e-05
Epoch 53/100
99/99 - 3s - 27ms/step - loss: 9.4062e-04 - val_loss: 2.3401e-05
Epoch 54/100
99/99 - 3s - 26ms/step - loss: 9.7780e-04 - val_loss: 2.4313e-05
Epoch 55/100
99/99 - 2s - 24ms/step - loss: 0.0010 - val_loss: 4.8990e-04
Epoch 56/100
99/99 - 3s - 27ms/step - loss: 9.5884e-04 - val_loss: 1.6682e-04
Epoch 57/100
99/99 - 2s - 25ms/step - loss: 9.3835e-04 - val_loss: 1.3326e-04
Epoch 58/100
99/99 - 3s - 28ms/step - loss: 9.3860e-04 - val_loss: 3.8619e-05
Epoch 59/100
99/99 - 3s - 26ms/step - loss: 9.3360e-04 - val_loss: 5.7910e-05
Epoch 60/100
99/99 - 3s - 27ms/step - loss: 9.9899e-04 - val_loss: 2.5418e-05

Epoch 61/100
99/99 - 3s - 28ms/step - loss: 9.0982e-04 - val_loss: 3.0850e-05
Epoch 62/100
99/99 - 2s - 25ms/step - loss: 8.6784e-04 - val_loss: 2.1483e-05
Epoch 63/100
99/99 - 3s - 27ms/step - loss: 8.9512e-04 - val_loss: 3.1655e-05
Epoch 64/100
99/99 - 3s - 27ms/step - loss: 8.8231e-04 - val_loss: 5.0047e-05
Epoch 65/100
99/99 - 3s - 26ms/step - loss: 8.7356e-04 - val_loss: 2.7084e-05
Epoch 66/100
99/99 - 2s - 25ms/step - loss: 8.6573e-04 - val_loss: 3.1293e-05
Epoch 67/100
99/99 - 3s - 28ms/step - loss: 9.0713e-04 - val_loss: 2.7708e-05
Epoch 68/100
99/99 - 3s - 29ms/step - loss: 8.8666e-04 - val_loss: 2.0709e-05
Epoch 69/100
99/99 - 3s - 29ms/step - loss: 8.6372e-04 - val_loss: 3.3895e-05
Epoch 70/100
99/99 - 3s - 28ms/step - loss: 8.3681e-04 - val_loss: 1.5723e-04
Epoch 71/100
99/99 - 2s - 25ms/step - loss: 8.8624e-04 - val_loss: 2.1448e-05
Epoch 72/100
99/99 - 3s - 26ms/step - loss: 8.7173e-04 - val_loss: 4.1779e-05
Epoch 73/100
99/99 - 2s - 24ms/step - loss: 9.1587e-04 - val_loss: 3.2554e-04
Epoch 74/100
99/99 - 3s - 26ms/step - loss: 8.1774e-04 - val_loss: 3.2291e-05
Epoch 75/100
99/99 - 3s - 28ms/step - loss: 8.8838e-04 - val_loss: 2.2719e-05
Epoch 76/100
99/99 - 3s - 29ms/step - loss: 7.9876e-04 - val_loss: 2.0587e-05
Epoch 77/100
99/99 - 3s - 29ms/step - loss: 8.4625e-04 - val_loss: 2.0133e-05
Epoch 78/100
99/99 - 3s - 29ms/step - loss: 8.2607e-04 - val_loss: 2.4182e-05
Epoch 79/100
99/99 - 3s - 29ms/step - loss: 8.9032e-04 - val_loss: 3.9642e-05
Epoch 80/100
99/99 - 3s - 29ms/step - loss: 7.6884e-04 - val_loss: 1.0231e-04
Epoch 81/100
99/99 - 3s - 29ms/step - loss: 7.8472e-04 - val_loss: 2.3773e-05
Epoch 82/100
99/99 - 3s - 29ms/step - loss: 7.6791e-04 - val_loss: 1.4441e-04
Epoch 83/100
99/99 - 3s - 29ms/step - loss: 7.9383e-04 - val_loss: 3.2360e-05
Epoch 84/100
99/99 - 3s - 28ms/step - loss: 7.7345e-04 - val_loss: 1.1168e-04

Epoch 85/100
99/99 - 3s - 29ms/step - loss: 7.5280e-04 - val_loss: 2.3077e-05
Epoch 86/100
99/99 - 3s - 29ms/step - loss: 8.4925e-04 - val_loss: 2.1367e-05
Epoch 87/100
99/99 - 3s - 29ms/step - loss: 8.0057e-04 - val_loss: 1.6006e-04
Epoch 88/100
99/99 - 3s - 28ms/step - loss: 7.3700e-04 - val_loss: 9.2010e-05
Epoch 89/100
99/99 - 3s - 29ms/step - loss: 7.7159e-04 - val_loss: 3.3814e-05
Epoch 90/100
99/99 - 3s - 29ms/step - loss: 8.0079e-04 - val_loss: 1.6051e-04
Epoch 91/100
99/99 - 3s - 28ms/step - loss: 7.7512e-04 - val_loss: 1.5174e-04
Epoch 92/100
99/99 - 3s - 28ms/step - loss: 7.3159e-04 - val_loss: 2.6150e-05
Epoch 93/100
99/99 - 3s - 29ms/step - loss: 7.4907e-04 - val_loss: 2.6058e-05
Epoch 94/100
99/99 - 3s - 29ms/step - loss: 7.1977e-04 - val_loss: 1.9258e-05
Epoch 95/100
99/99 - 3s - 29ms/step - loss: 7.0355e-04 - val_loss: 1.9429e-05
Epoch 96/100
99/99 - 3s - 28ms/step - loss: 6.9538e-04 - val_loss: 2.5064e-05
Epoch 97/100
99/99 - 3s - 29ms/step - loss: 6.9249e-04 - val_loss: 4.3426e-05
Epoch 98/100
99/99 - 3s - 29ms/step - loss: 6.9669e-04 - val_loss: 6.2162e-05
Epoch 99/100
99/99 - 3s - 28ms/step - loss: 6.7062e-04 - val_loss: 1.1153e-04
Epoch 100/100
99/99 - 3s - 28ms/step - loss: 6.9950e-04 - val_loss: 1.3921e-04

[102]: # Model evaluation on the test set
test_loss = model_3.evaluate(X_test, y_test)
print(f'Test set loss: {test_loss}')

50/50 ━━━━━━━━━━━━━━━━━━━━ 1s 7ms/step - loss: 1.2123e-04
Test set loss: 0.00013920964556746185

[103]: last_days = df_scaled[:time_step]

future_predictions = []

for i in range(60):
    # Predict using the last available days
    pred = model_3.predict(last_days.reshape(1, time_step, 1))
    # Add the new prediction to the beginning of the available data
    last_days = np.insert(last_days, 0, pred, axis=0)
    # Ensure the array maintains the correct size
    last_days = last_days[:time_step]
    # Save the prediction
    future_predictions.append(pred[0, 0])

# Reverse the predictions to match the correct temporal order
future_predictions = future_predictions[::-1]

# Reverse normalization
future_predictions = scaler.inverse_transform(np.array(future_predictions).reshape(-1, 1))


[104]: # Ensuring the visualization is correctly adjusted
plt.figure(figsize=(15, 9))
plt.plot(gold_series.index, gold_series['Price'], label='Historical Data')
# Since future dates are generated in order, but the data is reversed,
# future_dates or gold_series.index might need to be reversed to align them on the graph
plt.plot(future_dates, future_predictions, label='Future Predictions', linestyle='--')

plt.title('Gold Price Prediction for the Next 60 Days')
plt.xlabel('Date')
plt.ylabel('Gold Price')
plt.legend()
plt.xticks(rotation=45)
# If the historical data is reversed (most recent date first),
# the X axis might also need to be reversed to show recent dates on the left
plt.show()

[105]: # Last month's data
last_month_data = gold_series.head(30)  # Last 30 days of data
last_month_dates = last_month_data.index

plt.figure(figsize=(15, 9))
plt.plot(last_month_dates, last_month_data['Price'], label='Last Month Data', marker='o', color='red')
plt.plot(future_dates, future_predictions, label='Future Predictions', linestyle='--', marker='x', color='blue')

plt.title('Gold Price Comparison: Last Month vs. Future Predictions')
plt.xlabel('Date')
plt.ylabel('Gold Price')
plt.legend()
plt.xticks(rotation=45)
plt.grid(True)
plt.show()

[106]: # Visualization of only future predictions with actual dates
plt.figure(figsize=(10, 6))
plt.plot(future_dates, future_predictions, label='Future Predictions', linestyle='--', marker='o', color='blue')

plt.title('Future Gold Price Predictions')
plt.xlabel('Date')
plt.ylabel('Gold Price')
plt.legend()
plt.grid(True)
plt.xticks(rotation=45)
plt.show()
