Attempts to save a Booster model to a file using RDS. Compared with saveRDS(), it has one additional parameter, raw, which controls whether the model is also stored in its raw (serialized) form.

saveRDS.lgb.Booster(
  object,
  file = "",
  ascii = FALSE,
  version = NULL,
  compress = TRUE,
  refhook = NULL,
  raw = TRUE
)

Arguments

object

R object to serialize.

file

a connection or the name of the file where the R object is saved to or read from.

ascii

a logical. If TRUE or NA, an ASCII representation is written; otherwise (default), a binary one is used. See the comments in the help for save.

version

the workspace format version to use. NULL specifies the current default version (2). Versions prior to 2 are not supported, so this will only be relevant when there are later versions.

compress

a logical specifying whether saving to a named file is to use "gzip" compression, or one of "gzip", "bzip2" or "xz" to indicate the type of compression to be used. Ignored if file is a connection.

refhook

a hook function for handling reference objects.

raw

whether to also save the model as a raw vector inside the Booster object; it is recommended to leave this set to TRUE (see the sketch after this list).
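
A minimal sketch (not part of the official examples) of how the two settings differ; the described effect of raw on re-loading is an assumption, so verify it against your lightgbm version:

library(lightgbm)
data(agaricus.train, package = "lightgbm")
dtrain <- lgb.Dataset(agaricus.train$data, label = agaricus.train$label)
bst <- lgb.train(params = list(objective = "binary"), data = dtrain, nrounds = 5L)

# raw = TRUE embeds the serialized model in the saved object, so the file
# can be restored later (e.g. in a fresh R session)
f_raw <- tempfile(fileext = ".rds")
saveRDS.lgb.Booster(bst, f_raw, raw = TRUE)

# raw = FALSE (assumption) saves only the in-memory handle; the restored
# object may not be usable once the current session ends
f_handle <- tempfile(fileext = ".rds")
saveRDS.lgb.Booster(bst, f_handle, raw = FALSE)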

Value

NULL invisibly.

Examples

# \donttest{
library(lightgbm)
data(agaricus.train, package = "lightgbm")
train <- agaricus.train
dtrain <- lgb.Dataset(train$data, label = train$label)
data(agaricus.test, package = "lightgbm")
test <- agaricus.test
dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
params <- list(objective = "regression", metric = "l2")
valids <- list(test = dtest)
model <- lgb.train(
  params = params
  , data = dtrain
  , nrounds = 10L
  , valids = valids
  , min_data = 1L
  , learning_rate = 1.0
  , early_stopping_rounds = 5L
)
#> [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000961 seconds.
#> You can set `force_row_wise=true` to remove the overhead.
#> And if memory is not enough, you can set `force_col_wise=true`.
#> [LightGBM] [Info] Total Bins 232
#> [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
#> [LightGBM] [Info] Start training from score 0.482113
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [1]: test's l2:6.44165e-17
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [2]: test's l2:1.97215e-31
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [3]: test's l2:0
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [4]: test's l2:0
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [5]: test's l2:0
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [6]: test's l2:0
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [7]: test's l2:0
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [8]: test's l2:0
model_file <- tempfile(fileext = ".rds")
saveRDS.lgb.Booster(model, model_file)
# }
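
As a follow-up not shown on the original page, the saved file can be read back and used for prediction; readRDS.lgb.Booster() is assumed to be the matching loader shipped alongside this function in the same lightgbm version:

# read the model back and predict on the test matrix
model2 <- readRDS.lgb.Booster(model_file)
preds <- predict(model2, test$data)
head(preds)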