KNeighborsClassifier
#include <Skigen/Neighbors>
template <typename Scalar = double>
class Skigen::KNeighborsClassifier
KNeighborsClassifier(int n_neighbors = 5)
Classifier implementing the k-nearest neighbors vote.
Uses brute-force computation of distances. For each query point, finds the n_neighbors closest training points and predicts by majority vote.
Mirrors sklearn.neighbors.KNeighborsClassifier.
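The brute-force majority-vote scheme described above can be sketched in plain C++ (std containers instead of Skigen's MatrixType/IndexVector; this illustrates the algorithm only, not the library's actual implementation):

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <map>
#include <utility>
#include <vector>

// Predict the label of one query point by brute-force k-NN:
// compute the distance to every training point, keep the k closest,
// and return the most common label among them.
int knn_predict(const std::vector<std::vector<double>>& X_train,
                const std::vector<int>& y_train,
                const std::vector<double>& query, int k) {
    // Squared Euclidean distance to every training point.
    std::vector<std::pair<double, int>> dist;  // (distance, label)
    for (std::size_t i = 0; i < X_train.size(); ++i) {
        double d = 0.0;
        for (std::size_t j = 0; j < query.size(); ++j) {
            double diff = X_train[i][j] - query[j];
            d += diff * diff;
        }
        dist.emplace_back(d, y_train[i]);
    }
    // Keep only the k closest points.
    std::partial_sort(dist.begin(), dist.begin() + k, dist.end());
    // Majority vote among the k labels.
    std::map<int, int> votes;
    for (int i = 0; i < k; ++i) ++votes[dist[i].second];
    return std::max_element(votes.begin(), votes.end(),
                            [](auto& a, auto& b) { return a.second < b.second; })
        ->first;
}
```

Squared distances are sufficient because only the ordering of neighbors matters, not the distance values themselves.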
Parameters:
- n_neighbors : int, default=5
Number of neighbors to use.
Attributes:
- is_fitted : bool
Whether the estimator has been fitted.
- n_neighbors : int
Number of neighbors.
Methods
fit(X, y)
Fit the k-nearest neighbors classifier from the training set.
Stores the training data for later queries.
Parameters:
- X : MatrixType
Training data of shape (n_samples, n_features).
- y : IndexVector
Target values of shape (n_samples,) with integer class labels.
Returns:
- result : KNeighborsClassifier
Reference to the fitted estimator (*this).
Throws:
- std::invalid_argument: if n_samples < n_neighbors, or if X and y have inconsistent lengths.
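The precondition checks fit performs can be sketched as follows (std::vector stands in for the library's MatrixType/IndexVector; the exact internal check order is an assumption):

```cpp
#include <cassert>
#include <stdexcept>
#include <vector>

// Sketch of fit's input validation: reject mismatched X/y lengths and
// training sets smaller than the requested neighbor count.
void validate_fit(const std::vector<std::vector<double>>& X,
                  const std::vector<int>& y, int n_neighbors) {
    if (X.size() != y.size())
        throw std::invalid_argument("X and y have inconsistent lengths");
    if (static_cast<int>(X.size()) < n_neighbors)
        throw std::invalid_argument("n_samples < n_neighbors");
}
```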
predict(X)
Predict class labels for the provided data.
Parameters:
- X : MatrixType
Test samples of shape (n_samples, n_features).
Returns:
- result : IndexVector
Integer vector of predicted class labels of shape (n_samples,).
Throws:
- std::runtime_error: if the model has not been fitted.
- std::invalid_argument: if the feature count does not match the training data.
score(X, y)
Return the mean accuracy on the given test data and labels.
Parameters:
- X : MatrixType
Test samples of shape (n_samples, n_features).
- y : IndexVector
True class labels of shape (n_samples,).
Returns:
- result : Scalar
Mean accuracy of predict(X) with respect to y.
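Mean accuracy is simply the fraction of predicted labels that match the true labels. A minimal sketch of that final step (plain std::vector in place of IndexVector):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Fraction of positions where the predicted label equals the true label.
double mean_accuracy(const std::vector<int>& y_true,
                     const std::vector<int>& y_pred) {
    int correct = 0;
    for (std::size_t i = 0; i < y_true.size(); ++i)
        if (y_true[i] == y_pred[i]) ++correct;
    return static_cast<double>(correct) / y_true.size();
}
```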
Example
// Compare different k values; `split` holds the train/test partition
// (X_train, y_train, X_test, y_test).
std::cout << "=== KNN: varying k ===\n";
for (int k : {1, 3, 5, 7, 11}) {
    Skigen::KNeighborsClassifier<double> knn(k);
    knn.fit(split.X_train, split.y_train);
    auto pred = knn.predict(split.X_test);
    std::cout << "  k=" << std::setw(2) << k
              << " accuracy=" << Skigen::Metrics::accuracy_score(split.y_test, pred)
              << " F1=" << Skigen::Metrics::f1_score(split.y_test, pred) << "\n";
}