Calls readRDS() on a file that is expected to contain a serialized lgb.Booster object, then restores the Booster's handle via lgb.restore_handle().

This function is deprecated: it emits a warning when called and will be removed in a future release. Use readRDS() instead.

readRDS.lgb.Booster(file, refhook = NULL)

Arguments

file

a connection, or the name of the file from which the serialized R object is read.

refhook

a hook function for handling reference objects.

Value

An lgb.Booster object.

Examples

# \donttest{
library(lightgbm)
data(agaricus.train, package = "lightgbm")
train <- agaricus.train
dtrain <- lgb.Dataset(train$data, label = train$label)
data(agaricus.test, package = "lightgbm")
test <- agaricus.test
dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
params <- list(
  objective = "regression"
  , metric = "l2"
  , min_data = 1L
  , learning_rate = 1.0
)
valids <- list(test = dtest)
model <- lgb.train(
  params = params
  , data = dtrain
  , nrounds = 10L
  , valids = valids
  , early_stopping_rounds = 5L
)
#> [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.001056 seconds.
#> You can set `force_row_wise=true` to remove the overhead.
#> And if memory is not enough, you can set `force_col_wise=true`.
#> [LightGBM] [Info] Total Bins 232
#> [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
#> [LightGBM] [Info] Start training from score 0.482113
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [1] "[1]:  test's l2:6.44165e-17"
#> [1] "Will train until there is no improvement in 5 rounds."
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [1] "[2]:  test's l2:1.97215e-31"
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [1] "[3]:  test's l2:0"
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [1] "[4]:  test's l2:0"
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [1] "[5]:  test's l2:0"
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [1] "[6]:  test's l2:0"
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [1] "[7]:  test's l2:0"
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [1] "[8]:  test's l2:0"
#> [1] "Early stopping, best iteration is: [3]:  test's l2:0"
model_file <- tempfile(fileext = ".rds")
saveRDS.lgb.Booster(model, model_file)
#> Warning: 'saveRDS.lgb.Booster' is deprecated and will be removed in a future release. Use saveRDS() instead.
new_model <- readRDS.lgb.Booster(model_file)
#> Warning: 'readRDS.lgb.Booster' is deprecated and will be removed in a future release. Use readRDS() instead.
# }
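As the deprecation warnings above indicate, the recommended path is plain base-R serialization. A Booster's underlying C++ handle is not serializable, so after loading it must be re-attached with lgb.restore_handle(), which is exactly the step this deprecated wrapper performed. A minimal sketch of the replacement, assuming `model` is a trained lgb.Booster as produced in the example above:

```r
library(lightgbm)

# assumes `model` is a trained lgb.Booster, as in the example above
model_file <- tempfile(fileext = ".rds")

# plain base-R serialization; no lightgbm-specific wrapper needed
saveRDS(model, model_file)

# read the object back, then re-attach the C++ handle in place
new_model <- readRDS(model_file)
lgb.restore_handle(new_model)
```

Calling lgb.restore_handle() is only required before operations that need the live handle (for example, further training or lgb.save()); prediction on a freshly loaded model will work once the handle is restored.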