Migrating from keras.wrappers.scikit_learn to scikeras

Asked by g0czyy6m, 7 months ago

I am not very experienced with programming, so I really don't know how to carry out the migration. There isn't much documentation about it; the only thing I found is this page, https://adriangb.com/scikeras/stable/migration.html, and I also tried going back to the old Keras.
Any guidance or help would be greatly appreciated.
Here is my code:

import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense
from tensorflow.keras.optimizers import SGD
from sklearn.metrics import r2_score
from sklearn.model_selection import TimeSeriesSplit
from sklearn.model_selection import GridSearchCV

def create_model(lstm_1,lstm_2,dense):
    model = Sequential()
    model.add(LSTM(lstm_1, input_shape=(X_train.shape[1], X_train.shape[2]), return_sequences=True))
    model.add(LSTM(lstm_2))
    model.add(Dense(dense, activation='linear'))
    model.add(Dense(1, activation='linear'))
    model.compile(optimizer='adam', loss='mean_squared_error')
    return model

param_grid = {
    'lstm_1': [32, 64, 128],
    'lstm_2': [32, 64, 128],
    'dense': [16, 32, 64]
}


tscv = TimeSeriesSplit(n_splits=5)

class TwoLayerFeedForward:
    def __call__(self):
        clf = Sequential()
        clf.add(Dense(9, activation='relu', input_dim=3))
        clf.add(Dense(9, activation='relu'))
        clf.add(Dense(3, activation='softmax'))
        clf.compile(loss='categorical_crossentropy', optimizer=SGD())
        return clf


grid = GridSearchCV(estimator=model, param_grid=param_grid, cv=tscv, n_jobs=-1)
grid_result = grid.fit(X_train, y_train)

print("The Best parameters: ", grid_result.best_params_)

best_model = grid_result.best_estimator_.model

NN_predictions = best_model.predict(X_test)

NN_mse = np.mean((y_test - NN_predictions.flatten()) ** 2)
NN_rmse = np.sqrt(NN_mse)
print("The Root Mean Squared Error for the Neural Network model is: ", NN_rmse)

NN_r_squared = r2_score(y_test, NN_predictions)

n = len(y_test)
p = 8
NN_adjusted_r_squared = 1 - (1 - NN_r_squared) * (n - 1) / (n - p - 1)

print('The OOS R-squared is: ', NN_r_squared)
print('The OOS Adjusted R-squared is: ', NN_adjusted_r_squared)

plt.figure(figsize=(12, 6))
plt.plot(y_test, label='Implied Volatility')
plt.plot(NN_predictions, label='NN Forecast')
plt.xlabel('Date Index')
plt.ylabel('Implied Volatility')
plt.title('Actual vs Predicted values')
plt.legend()
plt.show()


from tensorflow.keras.regularizers import L2

LSTM1 = grid_result.best_params_['lstm_1']
LSTM2 = grid_result.best_params_['lstm_2']
DENSE = grid_result.best_params_['dense']

model_REG = Sequential([
    LSTM(LSTM1, input_shape=(X_train.shape[1], X_train.shape[2]), return_sequences=True, kernel_regularizer= L2(0.001)),
    LSTM(LSTM2),
    Dense(DENSE, activation = 'linear'),
    Dense(K, activation = 'linear')
])

model_REG.compile(optimizer='adam', loss='mean_squared_error')

history_REG = model_REG.fit(X_train, y_train, epochs=N_epochs, batch_size=N_batch_size, validation_data=(X_test, y_test))

NN_REG_predictions = model_REG.predict(X_test)

NN_REG_mse = np.mean((y_test - NN_REG_predictions.flatten()) ** 2)
NN_REG_rmse = np.sqrt(NN_REG_mse)
print("The Root Mean Squared Error for the Neural Network model is: ", NN_REG_rmse)

NN_r_squared = r2_score(y_test, NN_REG_predictions)

n = len(y_test)
p = 8
NN_adjusted_r_squared = 1 - (1 - NN_r_squared) * (n - 1) / (n - p - 1)

print('The OOS R-squared is: ', NN_r_squared)
print('The OOS Adjusted R-squared is: ', NN_adjusted_r_squared)

plt.figure(figsize=(12, 6))
plt.plot(y_test, label='Implied Volatility')
plt.plot(NN_REG_predictions, label='NN Forecast')
plt.xlabel('Date Index')
plt.ylabel('Implied Volatility')
plt.title('Actual vs Predicted values')
plt.legend()
plt.show()


Answer by nhjlsmyf:

There is an answer at https://stackoverflow.com/a/77251441. Migrating from the old Keras wrappers to scikeras generally follows the same approach, though there are a few differences. Perhaps that link contains your answer.
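To make this concrete for the code in the question, below is a minimal sketch of the wrapping step under scikeras. It reuses create_model, tscv, X_train and y_train from the question; the epochs and batch_size values are placeholders you would replace with your own settings. The key changes are that the model-building function is passed to scikeras.wrappers.KerasRegressor via the model= argument, and hyperparameters routed to it are prefixed with model__.

from scikeras.wrappers import KerasRegressor
from sklearn.model_selection import GridSearchCV

# Wrap the build function; create_model already compiles the network,
# so no loss/optimizer arguments are needed on the wrapper itself.
reg = KerasRegressor(
    model=create_model,
    model__lstm_1=64,   # defaults; overridden by the grid search below
    model__lstm_2=64,
    model__dense=32,
    epochs=10,          # placeholder value
    batch_size=32,      # placeholder value
    verbose=0,
)

# Hyperparameters aimed at create_model get the "model__" prefix.
scikeras_param_grid = {
    'model__lstm_1': [32, 64, 128],
    'model__lstm_2': [32, 64, 128],
    'model__dense': [16, 32, 64],
}

grid = GridSearchCV(estimator=reg, param_grid=scikeras_param_grid, cv=tscv, n_jobs=-1)
grid_result = grid.fit(X_train, y_train)
print("The Best parameters: ", grid_result.best_params_)

One further difference worth noting: on a fitted scikeras estimator the underlying Keras model is exposed as the model_ attribute (with a trailing underscore), so grid_result.best_estimator_.model_ replaces the old grid_result.best_estimator_.model.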
