Save LightGBM model

lgb.save(booster, filename, num_iteration = NULL, start_iteration = 1L)

Arguments

booster

Object of class lgb.Booster

filename

Path to the file where the model will be saved

num_iteration

Number of iterations to save; NULL or a value <= 0 means use the best iteration

start_iteration

Index (1-based) of the first boosting round to save. For example, passing start_iteration = 5 and num_iteration = 3 for a regression model means "save the fifth, sixth, and seventh tree", as in the sketch below.

New in version 4.4.0
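
For instance, a minimal sketch of saving only a subset of trees (assuming a fitted lgb.Booster named model; the file name is illustrative):

# keep only the fifth, sixth, and seventh trees
lgb.save(model, "trees_5_to_7.txt", num_iteration = 3L, start_iteration = 5L)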

Value

lgb.Booster

Examples

# \donttest{
library(lightgbm)
# limit threading so the example runs predictably
setLGBMthreads(2L)
data.table::setDTthreads(1L)
# load the built-in agaricus training data
data(agaricus.train, package = "lightgbm")
train <- agaricus.train
dtrain <- lgb.Dataset(train$data, label = train$label)
# load the matching test data as a validation set
data(agaricus.test, package = "lightgbm")
test <- agaricus.test
dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
params <- list(
  objective = "regression"
  , metric = "l2"
  , min_data = 1L
  , learning_rate = 1.0
  , num_threads = 2L
)
# evaluate on the held-out test set while training
valids <- list(test = dtest)
# train for up to 10 rounds, stopping early if l2 does not improve for 5 rounds
model <- lgb.train(
  params = params
  , data = dtrain
  , nrounds = 10L
  , valids = valids
  , early_stopping_rounds = 5L
)
#> [LightGBM] [Info] Auto-choosing row-wise multi-threading, the overhead of testing was 0.000901 seconds.
#> You can set `force_row_wise=true` to remove the overhead.
#> And if memory is not enough, you can set `force_col_wise=true`.
#> [LightGBM] [Info] Total Bins 232
#> [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
#> [LightGBM] [Info] Start training from score 0.482113
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [1]:  test's l2:6.44165e-17 
#> Will train until there is no improvement in 5 rounds.
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [2]:  test's l2:1.97215e-31 
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [3]:  test's l2:0 
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [4]:  test's l2:0 
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [5]:  test's l2:0 
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [6]:  test's l2:0 
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [7]:  test's l2:0 
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [8]:  test's l2:0 
#> Early stopping, best iteration is: [3]:  test's l2:0
# save the fitted model to a temporary text file
lgb.save(model, tempfile(fileext = ".txt"))
# }
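
A model written by lgb.save() can be read back with lgb.load(). A minimal round-trip sketch, reusing model and test from the example above:

model_file <- tempfile(fileext = ".txt")
lgb.save(model, model_file)
# reload from disk and check that predictions match the in-memory booster
model2 <- lgb.load(model_file)
stopifnot(all.equal(predict(model, test$data), predict(model2, test$data)))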