Benchmarks

The benchmarks module provides a comprehensive collection of test problems for evaluating optimization algorithms and machine learning models. It includes classification datasets, optimization test functions, and benchmark suites.

Contents

ML Datasets

Machine learning datasets from the UCI Machine Learning Repository and other sources. These datasets are commonly used for testing classification algorithms.

Reference: Dua, D. and Graff, C. (2019). UCI Machine Learning Repository. Irvine, CA: University of California, School of Information and Computer Science.

Dataset               Description
IrisDataset           Famous dataset with iris flower measurements (150 samples, 4 features, 3 classes)
WineDataset           Wine recognition dataset from chemical analysis (178 samples, 13 features, 3 classes)
BreastCancerDataset   Breast cancer diagnostic dataset (569 samples, 30 features, 2 classes)
DigitsDataset         Handwritten digits recognition (5620 samples, 64 features, 10 classes)
CreditRiskDataset     Credit risk prediction dataset (3 features, 2 classes)
UserKnowladgeDataset  Student knowledge modeling (403 samples, 5 features, 4 classes)
BanknoteDataset       Banknote authentication dataset (1372 samples, 4 features, 2 classes)
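
All of the dataset classes expose the same accessor interface (get_X, get_X_names, get_y, get_y_names), documented per class below, so they can be inspected uniformly. A minimal sketch using a few of the classes listed above:

from thefittest.benchmarks import (
    IrisDataset,
    WineDataset,
    BreastCancerDataset,
    BanknoteDataset,
)

# Print the feature-matrix shape and the number of classes for several datasets.
for dataset_cls in (IrisDataset, WineDataset, BreastCancerDataset, BanknoteDataset):
    dataset = dataset_cls()
    X = dataset.get_X()                    # shape (n_samples, n_features)
    y = dataset.get_y()                    # shape (n_samples,)
    print(dataset_cls.__name__, X.shape, len(dataset.get_y_names()), "classes")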

IrisDataset

class thefittest.benchmarks.IrisDataset

Bases: Dataset

The Iris dataset - one of the most famous datasets in machine learning.

Contains measurements of iris flowers from three different species.

Features (4):
  • sepal length in cm

  • sepal width in cm

  • petal length in cm

  • petal width in cm

Classes (3):
  • Iris-setosa (0)

  • Iris-versicolor (1)

  • Iris-virginica (2)

Samples: 150 (50 per class)

References

Fisher, R. A. (1988). Iris. UCI Machine Learning Repository. https://doi.org/10.24432/C56C76.

__init__() None
get_X() ndarray[Any, dtype[float64]]

Get feature matrix.

Returns:
NDArray[np.float64]

Feature matrix of shape (n_samples, n_features)

get_X_names() Dict[int, str]

Get feature names.

Returns:
Dict[int, str]

Dictionary mapping feature indices to feature names

get_y() ndarray[Any, dtype[int64 | float64]]

Get target values.

Returns:
NDArray[Union[np.int64, np.float64]]

Target array of shape (n_samples,)

get_y_names() Dict[int, str]

Get class/target names.

Returns:
Dict[int, str]

Dictionary mapping class indices to class names
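
Usage Example (a minimal sketch of the documented accessors):

from thefittest.benchmarks import IrisDataset

data = IrisDataset()

X = data.get_X()                    # feature matrix of shape (150, 4)
y = data.get_y()                    # class labels of shape (150,), values 0, 1, 2
feature_names = data.get_X_names()  # dict mapping feature index -> feature name
class_names = data.get_y_names()    # dict mapping class index -> class name

print(X.shape, y.shape)
print(class_names[int(y[0])])       # class name of the first sample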

WineDataset

class thefittest.benchmarks.WineDataset

Bases: Dataset

Wine recognition dataset.

Contains results of a chemical analysis of wines grown in the same region in Italy but derived from three different cultivars.

Features (13):
  • Alcohol

  • Malic acid

  • Ash

  • Alcalinity of ash

  • Magnesium

  • Total phenols

  • Flavanoids

  • Nonflavanoid phenols

  • Proanthocyanins

  • Color intensity

  • Hue

  • OD280/OD315 of diluted wines

  • Proline

Classes (3): class 1 (0), class 2 (1), class 3 (2)

Samples: 178

References

Aeberhard, Stefan and Forina, M. (1991). Wine. UCI Machine Learning Repository. https://doi.org/10.24432/C5PC7J.

__init__() None
get_X() ndarray[Any, dtype[float64]]

Get feature matrix.

Returns:
NDArray[np.float64]

Feature matrix of shape (n_samples, n_features)

get_X_names() Dict[int, str]

Get feature names.

Returns:
Dict[int, str]

Dictionary mapping feature indices to feature names

get_y() ndarray[Any, dtype[int64 | float64]]

Get target values.

Returns:
NDArray[Union[np.int64, np.float64]]

Target array of shape (n_samples,)

get_y_names() Dict[int, str]

Get class/target names.

Returns:
Dict[int, str]

Dictionary mapping class indices to class names

BreastCancerDataset

class thefittest.benchmarks.BreastCancerDataset

Bases: Dataset

Breast Cancer Wisconsin (Diagnostic) dataset.

Contains features computed from digitized images of fine needle aspirate (FNA) of breast masses. They describe characteristics of the cell nuclei present in the image.

Features (30):

Ten real-valued features computed for each cell nucleus:

  • Mean: radius, texture, perimeter, area, smoothness, compactness, concavity, concave points, symmetry, fractal dimension

  • Error: radius error, texture error, perimeter error, area error, smoothness error, compactness error, concavity error, concave points error, symmetry error, fractal dimension error

  • Worst: worst radius, texture, perimeter, area, smoothness, compactness, concavity, concave points, symmetry, fractal dimension

Classes (2): M (malignant, 0), B (benign, 1)

Samples: 569

References

Wolberg, William, Mangasarian, Olvi, Street, Nick, and Street, W. (1995). Breast Cancer Wisconsin (Diagnostic). UCI Machine Learning Repository. https://doi.org/10.24432/C5DW2B.

__init__() None
get_X() ndarray[Any, dtype[float64]]

Get feature matrix.

Returns:
NDArray[np.float64]

Feature matrix of shape (n_samples, n_features)

get_X_names() Dict[int, str]

Get feature names.

Returns:
Dict[int, str]

Dictionary mapping feature indices to feature names

get_y() ndarray[Any, dtype[int64 | float64]]

Get target values.

Returns:
NDArray[Union[np.int64, np.float64]]

Target array of shape (n_samples,)

get_y_names() Dict[int, str]

Get class/target names.

Returns:
Dict[int, str]

Dictionary mapping class indices to class names

DigitsDataset

class thefittest.benchmarks.DigitsDataset

Bases: Dataset

Optical Recognition of Handwritten Digits dataset.

Contains normalized bitmaps of handwritten digits from 0 to 9.

Features (64): 8x8 pixel values (0-16)

Classes (10): Digits 0-9

Samples: 5620

References

Alpaydin, E. and Kaynak, C. (1998). Optical Recognition of Handwritten Digits. UCI Machine Learning Repository. https://doi.org/10.24432/C50P49.

__init__() None
get_X() ndarray[Any, dtype[float64]]

Get feature matrix.

Returns:
NDArray[np.float64]

Feature matrix of shape (n_samples, n_features)

get_X_names() Dict[int, str]

Get feature names.

Returns:
Dict[int, str]

Dictionary mapping feature indices to feature names

get_y() ndarray[Any, dtype[int64 | float64]]

Get target values.

Returns:
NDArray[Union[np.int64, np.float64]]

Target array of shape (n_samples,)

get_y_names() Dict[int, str]

Get class/target names.

Returns:
Dict[int, str]

Dictionary mapping class indices to class names
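
Because each sample stores the 8x8 bitmap as 64 flat features, individual samples can be reshaped back into images. A minimal sketch (the row-major pixel ordering is an assumption):

import numpy as np
from thefittest.benchmarks import DigitsDataset

data = DigitsDataset()
X = data.get_X()
y = data.get_y()

# Reshape the first sample from 64 flat features to an 8x8 pixel grid
# (pixel values 0-16 as documented above; row-major order assumed).
image = X[0].reshape(8, 8)
print("label:", int(y[0]))
print(image.astype(np.int64))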

CreditRiskDataset

class thefittest.benchmarks.CreditRiskDataset

Bases: Dataset

Credit Risk dataset.

For predicting whether a client is a good or bad credit risk based on financial information.

Features (3):
  • income

  • age

  • loan

Classes (2): good client (0), bad client (1)

References

https://www.kaggle.com/datasets/upadorprofzs/credit-risk

__init__() None
get_X() ndarray[Any, dtype[float64]]

Get feature matrix.

Returns:
NDArray[np.float64]

Feature matrix of shape (n_samples, n_features)

get_X_names() Dict[int, str]

Get feature names.

Returns:
Dict[int, str]

Dictionary mapping feature indices to feature names

get_y() ndarray[Any, dtype[int64 | float64]]

Get target values.

Returns:
NDArray[Union[np.int64, np.float64]]

Target array of shape (n_samples,)

get_y_names() Dict[int, str]

Get class/target names.

Returns:
Dict[int, str]

Dictionary mapping class indices to class names

UserKnowladgeDataset

class thefittest.benchmarks.UserKnowladgeDataset

Bases: Dataset

User Knowledge Modeling dataset.

A real dataset describing students’ knowledge status on the subject of Electrical DC Machines.

Features (5):
  • STG: The degree of study time for goal object materials

  • SCG: The degree of repetition number of user for goal object materials

  • STR: The degree of study time of user for related objects with goal object

  • LPR: The exam performance of user for related objects with goal object

  • PEG: The exam performance of user for goal objects

Classes (4): Very Low (0), Low (1), Middle (2), High (3)

Samples: 403

References

Kahraman, Hamdi, Colak, Ilhami, and Sagiroglu, Seref. (2013). User Knowledge Modeling. UCI Machine Learning Repository. https://doi.org/10.24432/C5231X.

__init__() None
get_X() ndarray[Any, dtype[float64]]

Get feature matrix.

Returns:
NDArray[np.float64]

Feature matrix of shape (n_samples, n_features)

get_X_names() Dict[int, str]

Get feature names.

Returns:
Dict[int, str]

Dictionary mapping feature indices to feature names

get_y() ndarray[Any, dtype[int64 | float64]]

Get target values.

Returns:
NDArray[Union[np.int64, np.float64]]

Target array of shape (n_samples,)

get_y_names() Dict[int, str]

Get class/target names.

Returns:
Dict[int, str]

Dictionary mapping class indices to class names

BanknoteDataset

class thefittest.benchmarks.BanknoteDataset

Bases: Dataset

Banknote authentication dataset.

Data extracted from images taken for the evaluation of an authentication procedure for banknotes.

Features (4):
  • variance of Wavelet Transformed image (continuous)

  • skewness of Wavelet Transformed image (continuous)

  • curtosis of Wavelet Transformed image (continuous)

  • entropy of image (continuous)

Classes (2): not original (0), original (1)

Samples: 1372

References

Lohweg, Volker. (2013). banknote authentication. UCI Machine Learning Repository. https://doi.org/10.24432/C55P57.

__init__() None
get_X() ndarray[Any, dtype[float64]]

Get feature matrix.

Returns:
NDArray[np.float64]

Feature matrix of shape (n_samples, n_features)

get_X_names() Dict[int, str]

Get feature names.

Returns:
Dict[int, str]

Dictionary mapping feature indices to feature names

get_y() ndarray[Any, dtype[int64 | float64]]

Get target values.

Returns:
NDArray[Union[np.int64, np.float64]]

Target array of shape (n_samples,)

get_y_names() Dict[int, str]

Get class/target names.

Returns:
Dict[int, str]

Dictionary mapping class indices to class names

Optimization Functions

Classic benchmark functions for testing continuous optimization algorithms. Each function has different characteristics (unimodal/multimodal, separable/non-separable) that challenge different aspects of optimization algorithms.

Function                  Description
Sphere                    Simple quadratic function, unimodal and convex
Rosenbrock                Valley-shaped function, unimodal with narrow parabolic valley
Rastrigin                 Highly multimodal with many local minima
Ackley                    Multimodal with nearly flat outer region and large hole at center
Griewank                  Multimodal with many widespread local minima
Weierstrass               Continuous nowhere differentiable function with fractal structure
Schwefe1_2                Unimodal with non-separable variables
HighConditionedElliptic   Unimodal with high condition number
OneMax                    Simple sum of all variables
Sphere

class thefittest.benchmarks.Sphere

Bases: TestFunction

Sphere function - simple quadratic function.

One of the simplest optimization test functions. It is continuous, convex, and unimodal. The global minimum is at x = [0, 0, …, 0] with f(x) = 0.

Formula: f(x) = sum(x_i^2)

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, n_dimensions)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

f(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]
build_grid(x: ndarray[Any, dtype[float64]], y: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]
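
Usage Example (a minimal sketch; the batch shape follows the parameter documentation above):

import numpy as np
from thefittest.benchmarks import Sphere

sphere = Sphere()

# Evaluate a batch of points of shape (n_samples, n_dimensions).
points = np.array([[0.0, 0.0], [1.0, 2.0], [-3.0, 4.0]])
values = sphere.f(points)
print(values)   # expected: [ 0.  5. 25.]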

Rosenbrock

class thefittest.benchmarks.Rosenbrock

Bases: TestFunction

Rosenbrock function (De Jong’s function 2, Valley function).

A classic optimization test function with a narrow, parabolic valley. The global minimum is inside a long, narrow, parabolic shaped flat valley. Finding the valley is trivial, but convergence to the global minimum is difficult. The global minimum is at x = [1, 1, …, 1] with f(x) = 0.

Formula: f(x) = sum_{i=1}^{n-1} [100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2]

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, n_dimensions)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

f(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]
build_grid(x: ndarray[Any, dtype[float64]], y: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Rastrigin

class thefittest.benchmarks.Rastrigin

Bases: TestFunction

Rastrigin function.

A highly multimodal function with a large number of local minima. The function is based on the Sphere function with added cosine modulation to create the local minima. The global minimum is at x = [0, 0, …, 0] with f(x) = 0.

Formula: f(x) = 10n + sum_{i=1}^n [x_i^2 - 10*cos(2*pi*x_i)]

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, n_dimensions)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

f(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]
build_grid(x: ndarray[Any, dtype[float64]], y: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]
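
The build_grid helper is convenient for tabulating the function over a 2-D mesh, e.g. for contour plots. A minimal sketch, assuming build_grid(x, y) returns the function values over the Cartesian grid spanned by the two 1-D coordinate arrays:

import numpy as np
from thefittest.benchmarks import Rastrigin

rastrigin = Rastrigin()

x = np.linspace(-5.12, 5.12, 200)
y = np.linspace(-5.12, 5.12, 200)

# Function values over the x-y mesh (grid semantics assumed as described above).
Z = rastrigin.build_grid(x, y)
print(Z.shape, Z.min())   # the grid minimum should be close to 0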

Ackley

class thefittest.benchmarks.Ackley

Bases: TestFunction

Ackley function.

A multimodal function with many local minima and a single global minimum. It is characterized by a nearly flat outer region and a large hole at the center. The global minimum is at x = [0, 0, …, 0] with f(x) = 0.

Formula: f(x) = -a*exp(-b*sqrt(sum(x_i^2)/n)) - exp(sum(cos(c*x_i))/n) + a + e where a=20, b=0.2, c=2*pi

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, n_dimensions)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

f(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]
build_grid(x: ndarray[Any, dtype[float64]], y: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Griewank

class thefittest.benchmarks.Griewank

Bases: TestFunction

Griewank function.

A multimodal function with many widespread local minima. The number and positioning of local minima is space dependent. The global minimum is at x = [0, 0, …, 0] with f(x) = 0.

Formula: f(x) = 1 + sum_{i=1}^n x_i^2/4000 - prod_{i=1}^n cos(x_i/sqrt(i))

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, n_dimensions)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

f(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]
build_grid(x: ndarray[Any, dtype[float64]], y: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Weierstrass

class thefittest.benchmarks.Weierstrass

Bases: TestFunction

Weierstrass function.

A continuous but nowhere differentiable function with a fractal structure. It is highly multimodal with many local optima. The global minimum is at x = [0, 0, …, 0].

Formula: f(x) = sum_{i=1}^n sum_{k=0}^{k_max} [a^k * cos(2*pi*b^k*(x_i+0.5))] - n*sum_{k=0}^{k_max} [a^k * cos(2*pi*b^k*0.5)]

where a=0.5, b=3, k_max=20

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, n_dimensions)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

f(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]
build_grid(x: ndarray[Any, dtype[float64]], y: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Schwefe1_2

class thefittest.benchmarks.Schwefe1_2

Bases: TestFunction

Schwefel’s Problem 1.2.

A unimodal function with a single global minimum. The variables are not separable, which makes it harder to optimize than the Sphere function. The global minimum is at x = [0, 0, …, 0] with f(x) = 0.

Formula: f(x) = sum_{i=1}^n (sum_{j=1}^i x_j)^2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, n_dimensions)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

f(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]
build_grid(x: ndarray[Any, dtype[float64]], y: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]
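
The nested sum in the formula is the squared cumulative sum of the coordinates, which is what makes the variables non-separable. A minimal NumPy sketch of the formula itself (not necessarily the library's implementation):

import numpy as np

def schwefel_1_2(x):
    # x has shape (n_samples, n_dimensions); the inner sums are
    # cumulative sums along the dimension axis.
    partial_sums = np.cumsum(x, axis=1)
    return np.sum(partial_sums ** 2, axis=1)

points = np.array([[0.0, 0.0, 0.0], [1.0, 2.0, 3.0]])
print(schwefel_1_2(points))   # expected: [ 0. 46.]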

HighConditionedElliptic

class thefittest.benchmarks.HighConditionedElliptic

Bases: TestFunction

High Conditioned Elliptic function.

A unimodal function with high condition number, making it difficult for optimization algorithms that are sensitive to the scaling of variables. The global minimum is at x = [0, 0, …, 0] with f(x) = 0.

Formula: f(x) = sum_{i=1}^n (10^6)^((i-1)/(n-1)) * x_i^2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, n_dimensions)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

f(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]
build_grid(x: ndarray[Any, dtype[float64]], y: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

OneMax

class thefittest.benchmarks.OneMax

Bases: TestFunction

OneMax function - simple sum of all variables.

A basic test function that simply sums all input variables. The global minimum is at x = [0, 0, …, 0] with f(x) = 0.

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, n_dimensions)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

f(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]
build_grid(x: ndarray[Any, dtype[float64]], y: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Benchmark Suites

Comprehensive benchmark suites with multiple test functions for systematic algorithm evaluation.

CEC2005

The CEC 2005 Special Session on Real-Parameter Optimization provides 25 test functions organized into categories: unimodal (F1-F5), basic multimodal (F6-F12), expanded (F13-F14), and hybrid composition functions (F15-F25).

Reference: Suganthan, P. N., Hansen, N., Liang, J. J., Deb, K., Chen, Y. P., Auger, A., & Tiwari, S. (2005). Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization.

Module: thefittest.benchmarks.CEC2005

Usage Example:

from thefittest.benchmarks import CEC2005

# Access problem dictionary
problems = CEC2005.problems_dict

# Get F1 (Shifted Sphere)
f1_config = problems["F1"]
function = f1_config["function"]()
bounds = f1_config["bounds"]
optimum = f1_config["optimum"]
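
A possible follow-up, evaluating F1 on a random batch. It assumes the instantiated problem exposes the same f(x) batch interface as the test functions above, that "bounds" stores the (low, high) limits of the search space, and an illustrative dimensionality of 10:

import numpy as np

low, high = bounds
X = np.random.uniform(low, high, size=(5, 10))   # 5 points in 10 dimensions
values = function.f(X)
print(values.shape)   # (5,)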

Symbolic Regression

A collection of 17 test functions for symbolic regression and genetic programming benchmarks. Functions range from 1D to 2D with varying complexity.

Module: thefittest.benchmarks.symbolicregression17

thefittest.benchmarks.symbolicregression17.z(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Helper function used in F13 and F14.

Combination of three inverse squared terms with different parameters.

Parameters:
x : NDArray[np.float64]

Input array

Returns:
NDArray[np.float64]

Function values

thefittest.benchmarks.symbolicregression17.F1(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Complex oscillatory function with exponential decay (1D).

Domain: [-1, 1] Variables: 1

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 1)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F2(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Multi-frequency cosine composition (1D).

Domain: [-1, 1] Variables: 1

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 1)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F3(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Quadratic with cosine terms (2D).

Domain: [-16, 16] Variables: 2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 2)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F4(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Scaled version of F3 (2D).

Domain: [-16, 16] Variables: 2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 2)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F5(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Rosenbrock function (2D).

Classic optimization benchmark with a narrow parabolic valley.

Domain: [-2, 2] Variables: 2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 2)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F6(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Modified Griewank-like function (2D).

Domain: [-16, 16] Variables: 2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 2)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F7(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Inverse Rosenbrock (2D).

Domain: [-5, 5] Variables: 2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 2)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F8(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Modified Schaffer function (2D).

Domain: [-10, 10] Variables: 2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 2)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F9(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Quadratic with multi-frequency cosines (2D).

Domain: [-2.5, 2.5] Variables: 2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 2)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F10(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Similar to F9 with different domain (2D).

Domain: [-5, 5] Variables: 2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 2)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F11(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Absolute sine products with inverse term (2D).

Domain: [-4, 4] Variables: 2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 2)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F12(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Cross-term interaction function (2D).

Domain: [0, 4] Variables: 2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 2)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F13(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Product of z-functions (2D).

Domain: [0, 4] Variables: 2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 2)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F14(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Sum of z-functions (2D).

Domain: [0, 4] Variables: 2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 2)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F15(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Simple quadratic (2D).

Domain: [-5, 5] Variables: 2

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 2)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F16(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Sine with quadratic term (1D).

Domain: [-5, 5] Variables: 1

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 1)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

thefittest.benchmarks.symbolicregression17.F17(x: ndarray[Any, dtype[float64]]) ndarray[Any, dtype[float64]]

Linear sine function (1D).

Domain: [-5, 5] Variables: 1

Parameters:
x : NDArray[np.float64]

Input array of shape (n_samples, 1)

Returns:
NDArray[np.float64]

Function values of shape (n_samples,)

Usage Example:

from thefittest.benchmarks import symbolicregression17
import numpy as np

# Get F5 (Rosenbrock)
f5_config = symbolicregression17.problems_dict["F5"]
function = f5_config["function"]

# Generate data
X = np.random.uniform(*f5_config["bounds"], size=(100, 2))
y = function(X)
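
For the 1-D problems (e.g. F17) the input must still be two-dimensional with a single column, matching the (n_samples, 1) shape documented above. A minimal sketch:

import numpy as np
from thefittest.benchmarks.symbolicregression17 import F17

# 100 evenly spaced points on the documented domain [-5, 5],
# reshaped into a column so the input has shape (n_samples, 1).
X = np.linspace(-5, 5, 100).reshape(-1, 1)
y = F17(X)
print(X.shape, y.shape)   # (100, 1) (100,)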