Getting Started
BuildingBlock.ai’s Python module bbai automatically tunes hyperparameters for logistic regression, ridge regression, and other GLMs to optimize Approximate Leave-one-out Cross-validation.
To install bbai on macOS or Linux, run
pip install bbai
Fitting Ridge Regression
We’ll use bbai to find the regularization hyperparameter for ridge regression that optimizes leave-one-out cross-validation.
First, load an example data set.
from sklearn.datasets import load_boston
from sklearn.preprocessing import StandardScaler
# Note: load_boston was removed in scikit-learn 1.2. Run this example
# with scikit-learn < 1.2, or substitute another regression data set.
X, y = load_boston(return_X_y=True)
X = StandardScaler().fit_transform(X)
Next, fit the model. Because bbai uses second-order information to find the best hyperparameters, there’s no need to specify a search space.
from bbai.glm import RidgeRegression
model = RidgeRegression()
model.fit(X, y)
We can now print the hyperparameter that was found.
print('alpha = ', model.alpha_)
Prints
alpha = 4.680170622758263
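For comparison, scikit-learn’s RidgeCV also uses an efficient leave-one-out scheme, but only scores a fixed grid of candidate alphas rather than optimizing alpha directly. The sketch below illustrates that grid-based approach on the diabetes data set (used here because load_boston is no longer shipped with recent scikit-learn releases); it is an illustration of the concept, not of bbai’s internals.

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import RidgeCV
from sklearn.preprocessing import StandardScaler

X, y = load_diabetes(return_X_y=True)
X = StandardScaler().fit_transform(X)

# RidgeCV with the default cv=None evaluates each candidate alpha with an
# efficient leave-one-out formula, then keeps the best one from the grid.
alphas = np.logspace(-3, 3, 100)
model = RidgeCV(alphas=alphas).fit(X, y)
print('alpha =', model.alpha_)
```

Because the grid is fixed, the answer is only as fine-grained as the grid; bbai instead uses second-order information to converge on the optimal value without a search space.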
Fitting Logistic Regression
We can also find the hyperparameter for logistic regression that optimizes Approximate Leave-one-out Cross-validation.
Load an example classification data set.
from sklearn.datasets import load_breast_cancer
from sklearn.preprocessing import StandardScaler
X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)
Fit the model and find the best hyperparameter.
from bbai.glm import LogisticRegression
model = LogisticRegression()
model.fit(X, y)
Print out the hyperparameter we found.
print('C = ', model.C_)
Outputs
C = 0.6655139682151202
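A grid-based alternative for the same task is scikit-learn’s LogisticRegressionCV, which scores a fixed grid of C values with k-fold cross-validation. The sketch below uses 5-fold CV (exact leave-one-out over a grid would be far slower); it is a contrast to, not a reproduction of, bbai’s approximate leave-one-out optimization.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegressionCV
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)

# Score 20 candidate C values with 5-fold cross-validation and keep the best.
model = LogisticRegressionCV(Cs=np.logspace(-2, 2, 20), cv=5, max_iter=5000)
model.fit(X, y)
print('C =', model.C_[0])
```

As with ridge regression, the grid search can only return one of the candidate values supplied, whereas bbai optimizes the cross-validation objective directly.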