Thefittest Documentation
thefittest is an open-source Python library for applying classical evolutionary algorithms and their modern modifications to optimization and machine learning problems. The project aims to combine performance, accessibility, and ease of use, making advanced evolutionary methods widely available.
Modules
Optimizers
DifferentialEvolution, jDE, SHADE, GeneticAlgorithm, SelfCGA, PDPGA, GeneticProgramming, SelfCGP, PDPGP, SHAGA
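All of these classes are importable from thefittest.optimizers, the same path used in the Quick Start examples below:
from thefittest.optimizers import SHADE, SelfCGA, GeneticProgramming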
Classifiers
GeneticProgrammingClassifier, MLPEAClassifier
Installation
Basic installation (for evolutionary algorithms and symbolic regression):
pip install thefittest
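A quick way to confirm the installation succeeded (standard library only; importlib.metadata needs Python ≥ 3.8):
from importlib.metadata import version

import thefittest  # raises ImportError if the installation is broken

print(version("thefittest"))  # version string of the installed distribution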
Full installation with neural network support (requires a CUDA-capable GPU). First, install PyTorch with CUDA support (see https://pytorch.org/get-started/locally/):
pip3 install torch --index-url https://download.pytorch.org/whl/cu124
pip install thefittest
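Before using the neural-network features, you can verify that PyTorch sees the GPU (plain PyTorch calls, nothing thefittest-specific):
import torch

print(torch.__version__)          # e.g. a 2.x build with a +cu124 suffix
print(torch.cuda.is_available())  # True if a usable CUDA device was found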
Quick Start
Optimization Example
from thefittest.optimizers import SHADE
# Define the objective function to minimize
def custom_problem(x):
    return (5 - x[:, 0]) ** 2 + (12 - x[:, 1]) ** 2
# Initialize the SHADE optimizer
optimizer = SHADE(
    fitness_function=custom_problem,
    iters=25,
    pop_size=10,
    left_border=-100,
    right_border=100,
    num_variables=2,
    show_progress_each=10,
    minimization=True,
)
optimizer.fit()
fittest = optimizer.get_fittest()
print('Best solution:', fittest['phenotype'])
print('Fitness:', fittest['fitness'])
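Note that the fitness function is vectorized: as the x[:, 0] indexing above suggests, it receives the whole population as a 2D array of shape (pop_size, num_variables) and returns one value per individual. A minimal sketch of that contract in plain NumPy (no thefittest API involved):
import numpy as np

# custom_problem is the objective defined in the example above
population = np.random.uniform(-100, 100, size=(10, 2))  # 10 candidates, 2 variables
fitness = custom_problem(population)
print(fitness.shape)  # (10,) -- one fitness value per candidate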
Machine Learning Example
from thefittest.optimizers import SHAGA
from thefittest.benchmarks import IrisDataset
from thefittest.classifiers import MLPEAClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import minmax_scale
from sklearn.metrics import f1_score
# Load and prepare data
data = IrisDataset()
X = minmax_scale(data.get_X())
y = data.get_y()
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1)
# Train model
model = MLPEAClassifier(
    n_iter=500,
    pop_size=500,
    hidden_layers=[5, 5],
    weights_optimizer=SHAGA,
    weights_optimizer_args={"show_progress_each": 10},
)
model.fit(X_train, y_train)
# Evaluate
predict = model.predict(X_test)
print("F1 score:", f1_score(y_test, predict, average="macro"))
Dependencies
Required
- Python ≥ 3.7, ≤ 3.13: core runtime, supported across a wide range of Python versions
- NumPy ≥ 1.26: vectorized numerical computations and population-level data representation
- Numba ≥ 0.60: JIT compilation of performance-critical evolutionary operators and neural network routines
- SciPy: linear algebra and numerical utilities used in optimization procedures
- scikit-learn ≥ 1.4: dataset utilities, preprocessing, metrics, and sklearn-compatible API design
- joblib ≥ 1.3.0: lightweight parallelization and efficient execution of independent tasks
Optional
- networkx: visualization of genetic programming trees and neural network structures
- PyTorch ≥ 2.0 (CUDA): evolutionary training of neural networks with optional GPU acceleration