Attempts to unload the LightGBM packages so you can remove objects cleanly without having to restart R. This is useful, for instance, when an object becomes stuck for no apparent reason and you do not want to restart R to fix it.

Usage

lgb.unloader(restore = TRUE, wipe = FALSE, envir = .GlobalEnv)

Arguments

restore

Whether to reload LightGBM immediately after detaching it from R. Defaults to TRUE, which means LightGBM is automatically reloaded once unloading is performed.

wipe

Whether to wipe all lgb.Dataset and lgb.Booster objects from the environment given by envir. Defaults to FALSE, which means they are not removed.

envir

The environment to wipe when wipe == TRUE. Defaults to .GlobalEnv, the global environment.
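
Putting the arguments together, a minimal sketch of the wipe = TRUE form (assuming the lightgbm package is loaded and a booster object already exists in the global environment; `model` here is a hypothetical name):

```r
library(lightgbm)

# Suppose `model` is an lgb.Booster and `dtrain` an lgb.Dataset
# sitting in .GlobalEnv after a training run.

# Unload LightGBM, remove all lgb.Dataset / lgb.Booster objects
# from the global environment, and reload the package in one call:
lgb.unloader(restore = TRUE, wipe = TRUE, envir = .GlobalEnv)

# No manual rm(model, dtrain) or gc() is needed afterwards,
# because wipe = TRUE already removed the objects.
```

With restore = TRUE the package is usable again immediately after the call; with restore = FALSE you would need to call library(lightgbm) yourself, as the Examples section below shows.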

Value

NULL invisibly.

Examples

# \donttest{
data(agaricus.train, package = "lightgbm")
train <- agaricus.train
dtrain <- lgb.Dataset(train$data, label = train$label)
data(agaricus.test, package = "lightgbm")
test <- agaricus.test
dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
params <- list(objective = "regression", metric = "l2")
valids <- list(test = dtest)
model <- lgb.train(
    params = params
    , data = dtrain
    , nrounds = 5L
    , valids = valids
    , min_data = 1L
    , learning_rate = 1.0
)
#> [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.001088 seconds.
#> You can set `force_row_wise=true` to remove the overhead.
#> And if memory is not enough, you can set `force_col_wise=true`.
#> [LightGBM] [Info] Total Bins 232
#> [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
#> [LightGBM] [Info] Start training from score 0.482113
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [1] "[1]: test's l2:6.44165e-17"
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [1] "[2]: test's l2:1.97215e-31"
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [1] "[3]: test's l2:0"
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [1] "[4]: test's l2:0"
#> [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
#> [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
#> [1] "[5]: test's l2:0"
lgb.unloader(restore = FALSE, wipe = FALSE, envir = .GlobalEnv)
rm(model, dtrain, dtest)  # Not needed if wipe = TRUE
gc()  # Not needed if wipe = TRUE
#>           used  (Mb) gc trigger  (Mb) max used  (Mb)
#> Ncells 1900987 101.6    2916840 155.8  2916840 155.8
#> Vcells 3747984  28.6    8388608  64.0  8388473  64.0
library(lightgbm)
# Do whatever you want again with LightGBM without object clashing
# }