R/Lrnr_bilstm.R
Lrnr_bilstm.Rd
This learner supports the bidirectional long short-term memory (LSTM) recurrent neural network algorithm. To use this learner, you will need the keras Python module, version 2.0.0 or higher. Note that all preprocessing, such as differencing and seasonal effects for time series, should be addressed before using this learner. See the example sketch after the parameter list below.
R6Class object.
Lrnr_base object with methods for training and prediction.
units: Positive integer, dimensionality of the output space.
loss: Name of the loss function to use.
optimizer: Name of the optimizer, or an optimizer object.
batch_size: Number of samples per gradient update.
epochs: Number of epochs to train the model.
window: Size of the sliding window input.
activation: The activation function to use.
dense: Regular, densely-connected NN layer. Default is 1.
dropout: Float between 0 and 1, the fraction of the input units to drop.
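Example: a minimal sketch of fitting this learner within the standard sl3 workflow (make_sl3_Task, $train(), $predict()). It assumes keras (Python module >= 2.0.0) is installed and reachable from R; the simulated AR(1) data and the hyperparameter values below are illustrative only, not recommended settings.

library(sl3)

# simulate a univariate time series and use its lag-1 value as the
# single covariate (hypothetical setup for illustration)
set.seed(49753)
n <- 500
y <- as.numeric(arima.sim(model = list(ar = 0.8), n = n))
data <- data.frame(y = y[-1], x = y[-n])

task <- make_sl3_Task(
  data = data,
  covariates = "x",
  outcome = "y"
)

# bidirectional LSTM learner; arguments mirror those documented above
lrnr_bilstm <- Lrnr_bilstm$new(
  units = 32,
  loss = "mean_squared_error",
  optimizer = "adam",
  batch_size = 1,
  epochs = 10,
  window = 5,
  activation = "linear",
  dense = 1,
  dropout = 0
)

fit <- lrnr_bilstm$train(task)
preds <- fit$predict(task)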
Other Learners:
Custom_chain,
Lrnr_HarmonicReg,
Lrnr_arima,
Lrnr_bartMachine,
Lrnr_base,
Lrnr_bayesglm,
Lrnr_caret,
Lrnr_cv_selector,
Lrnr_cv,
Lrnr_dbarts,
Lrnr_define_interactions,
Lrnr_density_discretize,
Lrnr_density_hse,
Lrnr_density_semiparametric,
Lrnr_earth,
Lrnr_expSmooth,
Lrnr_gam,
Lrnr_ga,
Lrnr_gbm,
Lrnr_glm_fast,
Lrnr_glmnet,
Lrnr_glm,
Lrnr_grf,
Lrnr_gru_keras,
Lrnr_gts,
Lrnr_h2o_grid,
Lrnr_hal9001,
Lrnr_haldensify,
Lrnr_hts,
Lrnr_independent_binomial,
Lrnr_lightgbm,
Lrnr_lstm_keras,
Lrnr_mean,
Lrnr_multiple_ts,
Lrnr_multivariate,
Lrnr_nnet,
Lrnr_nnls,
Lrnr_optim,
Lrnr_pca,
Lrnr_pkg_SuperLearner,
Lrnr_polspline,
Lrnr_pooled_hazards,
Lrnr_randomForest,
Lrnr_ranger,
Lrnr_revere_task,
Lrnr_rpart,
Lrnr_rugarch,
Lrnr_screener_augment,
Lrnr_screener_coefs,
Lrnr_screener_correlation,
Lrnr_screener_importance,
Lrnr_sl,
Lrnr_solnp_density,
Lrnr_solnp,
Lrnr_stratified,
Lrnr_subset_covariates,
Lrnr_svm,
Lrnr_tsDyn,
Lrnr_ts_weights,
Lrnr_xgboost,
Pipeline,
Stack,
define_h2o_X(),
undocumented_learner