SGDRegressor

#include <Skigen/LinearModel>

template <typename Scalar = double>
class Skigen::SGDRegressor

SGDRegressor(Scalar alpha = 1e-4, int max_iter = 1000, Scalar tol = 1e-3, Scalar eta0 = 0.01, unsigned int random_state = 42)

Linear regressor fitted by minimizing a regularized empirical loss with SGD.

SGDRegressor implements regularized linear regression with stochastic gradient descent (squared error loss, L2 penalty).

Mirrors sklearn.linear_model.SGDRegressor.


Parameters:

  • alpha : Scalar, default=1e-4 Constant multiplying the L2 regularization term.

  • max_iter : int, default=1000 Maximum number of passes (epochs) over the training data.

  • tol : Scalar, default=1e-3 Stopping tolerance on the training loss.

  • eta0 : Scalar, default=0.01 Initial learning rate.

  • random_state : unsigned int, default=42 Seed for the random number generator used to shuffle the data.


Attributes:

  • coef : RowVectorType Parameter vector w (1 × n_features).

  • intercept_value : Scalar Independent term in the decision function.


Methods

fit(X, y)

Fit the linear model with SGD.

Parameters:

  • X : MatrixType Training matrix of shape (n_samples, n_features).

  • y : VectorType Target vector of shape (n_samples,).

Returns:

  • result : SGDRegressor& Reference to the fitted estimator (*this).

Throws:

  • std::invalid_argument — if X and y have inconsistent lengths.

predict(X)

Predict using the linear model.

Parameters:

  • X : MatrixType Samples of shape (n_samples, n_features).

Returns:

  • y_pred : VectorType Predicted values of shape (n_samples,).


score(X, y)

Return the coefficient of determination R² on the given test data.

Parameters:

  • X : MatrixType Test samples of shape (n_samples, n_features).

  • y : VectorType True target values of shape (n_samples,).

Returns:

  • score : Scalar R² of the predictions on X with respect to y.


Example

// Assumes an existing train/test split object "split_reg" exposing
// X_train / y_train / X_test / y_test (not defined in this snippet).
Skigen::SGDRegressor<double> regressor;
regressor.fit(split_reg.X_train, split_reg.y_train);

std::cout << "=== SGD Regressor ===\n";
std::cout << "R²: " << regressor.score(split_reg.X_test, split_reg.y_test) << "\n";
std::cout << "Coef: " << regressor.coef() << "\n";