Models
NeuroTabRegressor
NeuroTabModels.Learners.NeuroTabRegressor Type
NeuroTabRegressor(arch::Architecture; kwargs...)
NeuroTabRegressor(; arch_name="NeuroTreeConfig", arch_config::AbstractDict=Dict(), kwargs...)
A model type for constructing a NeuroTabRegressor, based on NeuroTabModels.jl, and implementing both an internal API and the MLJ model interface.
Hyper-parameters
- `loss=:mse`: Loss to be minimized during training. One of: `:mse`, `:mae`, `:logloss`, `:mlogloss`, `:gaussian_mle`.
- `nrounds=100`: Max number of rounds (epochs).
- `lr=1.0f-2`: Learning rate. Must be > 0. A lower `lr` results in slower learning, typically requiring a higher `nrounds`.
- `wd=0.f0`: Weight decay applied to the gradients by the optimizer.
- `batchsize=2048`: Batch size.
- `seed=123`: An integer used as a seed to the random number generator.
- `device=:cpu`: Device on which to perform the computation, either `:cpu` or `:gpu`.
- `gpuID=0`: GPU device to use; only relevant if `device = :gpu`.
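For illustration, a configuration overriding several of these defaults might look like the following sketch; the specific values are arbitrary and chosen only for demonstration:
config = NeuroTabRegressor(
    loss = :mae,        # any of the supported losses listed above
    nrounds = 200,      # more epochs, typically paired with a smaller learning rate
    lr = 5.0f-3,
    batchsize = 1024,
    seed = 42,
    device = :cpu,      # switch to :gpu (and set gpuID) if a GPU is available
)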
Internal API
Do `config = NeuroTabRegressor()` to construct an instance with default hyper-parameters. Provide keyword arguments to override hyper-parameter defaults, as in `NeuroTabRegressor(loss=:logloss, depth=5, ...)`.
Training model
A model is trained using fit:
m = fit(config, dtrain; feature_names, target_name, kwargs...)
Inference
Models act as functors, returning predictions when called as a function with features as the argument:
m(data)
MLJ Interface
From MLJ, the type can be imported using:
NeuroTabRegressor = @load NeuroTabRegressor pkg=NeuroTabModels
Do `model = NeuroTabRegressor()` to construct an instance with default hyper-parameters. Provide keyword arguments to override hyper-parameter defaults, as in `NeuroTabRegressor(loss=...)`.
Training model
In MLJ or MLJBase, bind an instance model to data with mach = machine(model, X, y) where
- `X`: any table of input features (e.g., a `DataFrame`) whose columns each have one of the following element scitypes: `Continuous`, `Count`, or `<:OrderedFactor`; check column scitypes with `schema(X)`
- `y`: the target, which can be any `AbstractVector` whose element scitype is `<:Continuous`; check the scitype with `scitype(y)`
Train the machine using fit!(mach, rows=...).
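As a sketch of how the `rows` keyword might be used to train on a holdout split (the split below is illustrative and uses the machine bound above; it is not part of the package API):
using MLJBase
train, test = partition(eachindex(y), 0.7)  # 70/30 row split of the bound data
fit!(mach, rows=train)                      # fit on the training rows only
p = predict(mach, rows=test)                # predict on the held-out rows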
Operations
- `predict(mach, Xnew)`: return predictions of the target given features `Xnew` having the same scitype as `X` above.
Fitted parameters
The fields of fitted_params(mach) are:
- `:fitresult`: The `NeuroTabModel` object.
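For example, the underlying fitted model could be retrieved from a trained machine like so (using the `mach` from above):
fp = fitted_params(mach)
fp.fitresult    # the trained NeuroTabModel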
Report
The fields of report(mach) are:
- `:features`: The names of the features encountered in training.
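Similarly, the training report can be inspected as follows:
r = report(mach)
r.features    # names of the features encountered in training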
Examples
Internal API
using NeuroTabModels, DataFrames
config = NeuroTabRegressor(depth=5, nrounds=10)
nobs, nfeats = 1_000, 5
dtrain = DataFrame(randn(nobs, nfeats), :auto)
dtrain.y = rand(nobs)
feature_names, target_name = names(dtrain, r"x"), "y"
m = fit(config, dtrain; feature_names, target_name)
p = m(dtrain)
MLJ Interface
using MLJBase, NeuroTabModels
m = NeuroTabRegressor(depth=5, nrounds=10)
X, y = @load_boston
mach = machine(m, X, y) |> fit!
p = predict(mach, X)
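To get a rough sense of fit quality on the training data, the RMSE of these predictions can be computed by hand (this assumes `predict` returns point predictions; the check itself is not part of the package API):
using Statistics
rmse = sqrt(mean((p .- y) .^ 2))    # training-set RMSE of the predictions above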
NeuroTabClassifier
NeuroTabModels.Learners.NeuroTabClassifier Type
NeuroTabClassifier(arch::Architecture; kwargs...)
NeuroTabClassifier(; arch_name="NeuroTreeConfig", arch_config::AbstractDict=Dict(), kwargs...)
A model type for constructing a NeuroTabClassifier, based on NeuroTabModels.jl, and implementing both an internal API and the MLJ model interface.
Hyper-parameters
- `nrounds=100`: Max number of rounds (epochs).
- `lr=1.0f-2`: Learning rate. Must be > 0. A lower `lr` results in slower learning, typically requiring a higher `nrounds`.
- `wd=0.f0`: Weight decay applied to the gradients by the optimizer.
- `batchsize=2048`: Batch size.
- `seed=123`: An integer used as a seed to the random number generator.
- `device=:cpu`: Device on which to perform the computation, either `:cpu` or `:gpu`.
- `gpuID=0`: GPU device to use; only relevant if `device = :gpu`.
Internal API
Do `config = NeuroTabClassifier()` to construct an instance with default hyper-parameters. Provide keyword arguments to override hyper-parameter defaults, as in `NeuroTabClassifier(depth=5, ...)`.
Training model
A model is trained using fit:
m = fit(config, dtrain; feature_names, target_name, kwargs...)
Inference
Models act as functors, returning predictions when called as a function with features as the argument:
m(data)
MLJ Interface
From MLJ, the type can be imported using:
NeuroTabClassifier = @load NeuroTabClassifier pkg=NeuroTabModels
Do `model = NeuroTabClassifier()` to construct an instance with default hyper-parameters. Provide keyword arguments to override hyper-parameter defaults, as in `NeuroTabClassifier(loss=...)`.
Training model
In MLJ or MLJBase, bind an instance model to data with mach = machine(model, X, y) where
- `X`: any table of input features (e.g., a `DataFrame`) whose columns each have one of the following element scitypes: `Continuous`, `Count`, or `<:OrderedFactor`; check column scitypes with `schema(X)`
- `y`: the target, which can be any `AbstractVector` whose element scitype is `<:Finite`; check the scitype with `scitype(y)`
Train the machine using fit!(mach, rows=...).
Operations
- `predict(mach, Xnew)`: return predictions of the target given features `Xnew` having the same scitype as `X` above.
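If these predictions are probabilistic (an assumption here, following common MLJ classifier conventions rather than anything stated above), point predictions could be recovered with MLJ's `predict_mode`, using the `mach` bound in the Training section:
p = predict(mach, Xnew)         # predictions for the new features
ŷ = predict_mode(mach, Xnew)    # point predictions, if `predict` returns distributions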
Fitted parameters
The fields of fitted_params(mach) are:
- `:fitresult`: The `NeuroTabModel` object.
Report
The fields of report(mach) are:
- `:features`: The names of the features encountered in training.
Examples
Internal API
using NeuroTabModels, DataFrames, CategoricalArrays, Random
config = NeuroTabClassifier(depth=5, nrounds=10)
nobs, nfeats = 1_000, 5
dtrain = DataFrame(randn(nobs, nfeats), :auto)
dtrain.y = categorical(rand(1:2, nobs))
feature_names, target_name = names(dtrain, r"x"), "y"
m = fit(config, dtrain; feature_names, target_name)
p = m(dtrain)
MLJ Interface
using MLJBase, NeuroTabModels
m = NeuroTabClassifier(depth=5, nrounds=10)
X, y = @load_crabs
mach = machine(m, X, y) |> fit!
p = predict(mach, X)
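As a quick sanity check on the training data, the predictions can be compared against the observed labels; the snippet below assumes `predict` returns class probabilities, so `predict_mode` is used to get labels (if `predict` already returns labels, compare `p` directly):
using Statistics
ŷ = predict_mode(mach, X)    # point predictions on the training features
train_acc = mean(ŷ .== y)    # fraction of correctly classified training rows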