Top Related Projects
- Optuna: A hyperparameter optimization framework
- BoTorch: Bayesian optimization in PyTorch
- BayesianOptimization: A Python implementation of global optimization with Gaussian processes
- Hyperopt: Distributed Asynchronous Hyperparameter Optimization in Python
- MLflow: Open source platform for the machine learning lifecycle
Quick Overview
Ax is an open-source, modular, and accessible platform for adaptive experimentation developed by Facebook. It is designed to optimize any kind of experiment, from A/B tests to complex, multi-armed bandit problems. Ax integrates with PyTorch and leverages Bayesian optimization to efficiently explore large parameter spaces.
Pros
- Flexible and modular architecture, allowing for easy customization and extension
- Powerful Bayesian optimization capabilities for efficient parameter tuning
- Seamless integration with PyTorch for machine learning experiments
- Comprehensive documentation and examples for various use cases
Cons
- Steep learning curve for users unfamiliar with Bayesian optimization
- Limited support for distributed experiments across multiple machines
- Requires additional setup and dependencies for certain advanced features
- May be overkill for simple A/B testing scenarios
Code Examples
- Creating a simple experiment:
from ax import SimpleExperiment, ParameterType, RangeParameter, ChoiceParameter

experiment = SimpleExperiment(
    name="test_experiment",
    parameters=[
        RangeParameter(name="x1", lower=0, upper=1, parameter_type=ParameterType.FLOAT),
        ChoiceParameter(name="x2", values=["a", "b", "c"], parameter_type=ParameterType.STRING),
    ],
    objective_name="metric",
    minimize=False,
)
- Running trials and logging data:
from ax.utils.measurement.synthetic_functions import hartmann6

def evaluate(parameters):
    x = [parameters.get(f"x{i+1}") for i in range(6)]
    return {"metric": (hartmann6(x), 0.0)}

for _ in range(5):
    parameters, trial_index = experiment.new_trial()
    experiment.trial(trial_index).run()
    data = evaluate(parameters)
    experiment.trial(trial_index).complete(data)
- Retrieving the best parameters:
best_parameters, values = experiment.get_best_parameters()
print(f"Best parameters: {best_parameters}")
print(f"Best objective value: {values[0]['metric']}")
Getting Started
To get started with Ax, follow these steps:
- Install Ax using pip:

pip install ax-platform

- Import the necessary modules:

from ax import SimpleExperiment, ParameterType, RangeParameter
from ax.utils.measurement.synthetic_functions import hartmann6

- Create an experiment, define parameters, and run trials:

experiment = SimpleExperiment(
    name="my_experiment",
    parameters=[
        RangeParameter(name="x1", lower=0, upper=1, parameter_type=ParameterType.FLOAT),
        RangeParameter(name="x2", lower=0, upper=1, parameter_type=ParameterType.FLOAT),
    ],
    objective_name="hartmann6",
    minimize=True,
)

def evaluate(parameters):
    x = [parameters.get(f"x{i+1}", 0.0) for i in range(6)]
    return {"hartmann6": (hartmann6(x), 0.0)}

for _ in range(10):
    parameters, trial_index = experiment.new_trial()
    experiment.trial(trial_index).run()
    data = evaluate(parameters)
    experiment.trial(trial_index).complete(data)

best_parameters, values = experiment.get_best_parameters()
print(f"Best parameters: {best_parameters}")
print(f"Best objective value: {values[0]['hartmann6']}")
Competitor Comparisons
Optuna: A hyperparameter optimization framework
Pros of Optuna
- More lightweight and easier to integrate into existing projects
- Supports a wider range of optimization algorithms, including pruning methods
- Offers better visualization tools for hyperparameter importance and optimization history
Cons of Optuna
- Less suitable for multi-objective optimization compared to Ax
- Lacks built-in support for Bayesian optimization with contextual information
Code Comparison
Optuna:
import optuna

def objective(trial):
    x = trial.suggest_float('x', -10, 10)
    return (x - 2) ** 2

study = optuna.create_study()
study.optimize(objective, n_trials=100)
Ax:
from ax import optimize
def evaluation_function(parameters):
    x = parameters.get("x")
    return {"objective": (x - 2) ** 2}

best_parameters, values, experiment, model = optimize(
    parameters=[{"name": "x", "type": "range", "bounds": [-10, 10]}],
    evaluation_function=evaluation_function,
    minimize=True,
)
Both Optuna and Ax are powerful hyperparameter optimization frameworks, but they cater to different use cases. Optuna is more flexible and easier to use for simple optimization tasks, while Ax provides more advanced features for complex experimental designs and multi-objective optimization.
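To make the pruning advantage noted above concrete, here is a minimal sketch of Optuna's pruning API; the hyperparameter name and the inner training loop are placeholders, not taken from either project's documentation.

import optuna

def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    score = 0.0
    for step in range(10):
        score += lr  # placeholder for one epoch of training and validation
        trial.report(score, step)   # report an intermediate value
        if trial.should_prune():    # let the pruner stop unpromising trials early
            raise optuna.TrialPruned()
    return score

study = optuna.create_study(direction="maximize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=20)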
BoTorch: Bayesian optimization in PyTorch
Pros of BoTorch
- More flexible and customizable for advanced users
- Deeper integration with PyTorch for neural network-based models
- Better suited for research and development of new Bayesian optimization algorithms
Cons of BoTorch
- Steeper learning curve for beginners
- Requires more manual setup and configuration
- Less out-of-the-box functionality for common use cases
Code Comparison
Ax example:
from ax import optimize
best_parameters, values, experiment, model = optimize(
    parameters=[
        {"name": "x1", "type": "range", "bounds": [-10, 10]},
        {"name": "x2", "type": "range", "bounds": [-10, 10]},
    ],
    evaluation_function=lambda p: (p["x1"] - 1)**2 + (p["x2"] - 2)**2,
    minimize=True,
)
BoTorch example:
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_model
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

train_X = torch.rand(10, 2)
train_Y = ((train_X[:, 0] - 1)**2 + (train_X[:, 1] - 2)**2).unsqueeze(-1)

model = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(model.likelihood, model)
fit_gpytorch_model(mll)
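The BoTorch snippet above stops after fitting the GP. As a rough, non-authoritative sketch of the acquisition step it is building toward (using the ExpectedImprovement and optimize_acqf imports above, and assuming the unit-cube domain implied by torch.rand), the loop could continue with:

# Candidate selection over the assumed [0, 1]^2 domain.
bounds = torch.stack([torch.zeros(2), torch.ones(2)])
acq = ExpectedImprovement(model=model, best_f=train_Y.min(), maximize=False)
candidate, acq_value = optimize_acqf(
    acq_function=acq,
    bounds=bounds,
    q=1,
    num_restarts=5,
    raw_samples=32,
)
print(candidate)  # the next point to evaluate and append to train_X / train_Y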
BayesianOptimization: A Python implementation of global optimization with Gaussian processes
Pros of BayesianOptimization
- Lightweight and easy to use, with a simple API
- Focuses specifically on Bayesian optimization, making it more straightforward for this particular task
- Well-documented with clear examples and tutorials
Cons of BayesianOptimization
- Less feature-rich compared to Ax, with fewer advanced optimization techniques
- Limited support for complex experiment designs and multi-objective optimization
- Smaller community and less frequent updates
Code Comparison
BayesianOptimization:
from bayes_opt import BayesianOptimization

def black_box_function(x, y):
    return -x**2 - (y - 1)**2 + 1

optimizer = BayesianOptimization(
    f=black_box_function,
    pbounds={'x': (-2, 2), 'y': (-3, 3)},
    random_state=1,
)
optimizer.maximize(init_points=2, n_iter=3)
Ax:
from ax import optimize
def evaluation_function(parameters):
    x, y = parameters["x"], parameters["y"]
    return -x**2 - (y - 1)**2 + 1

best_parameters, values, experiment, model = optimize(
    parameters=[
        {"name": "x", "type": "range", "bounds": [-2.0, 2.0]},
        {"name": "y", "type": "range", "bounds": [-3.0, 3.0]},
    ],
    evaluation_function=evaluation_function,
    objective_name="score",
    total_trials=5,
)
Hyperopt: Distributed Asynchronous Hyperparameter Optimization in Python
Pros of Hyperopt
- More mature and established project with a larger user base
- Supports a wider range of optimization algorithms, including Tree of Parzen Estimators (TPE)
- Easier to integrate with existing Python projects due to its simplicity
Cons of Hyperopt
- Less actively maintained compared to Ax
- Lacks some advanced features like multi-objective optimization and bandit optimization
- Limited built-in visualization tools
Code Comparison
Hyperopt:
from hyperopt import fmin, tpe, hp

def objective(x):
    return x**2

best = fmin(fn=objective, space=hp.uniform('x', -10, 10), algo=tpe.suggest, max_evals=100)
Ax:
from ax import optimize
def objective(parameters):
    x = parameters["x"]
    return x**2

best_parameters, values, experiment, model = optimize(
    parameters=[{"name": "x", "type": "range", "bounds": [-10.0, 10.0]}],
    evaluation_function=objective,
    minimize=True,
)
Both libraries offer concise ways to define and optimize objective functions, but Ax provides a more structured approach with its parameter definitions and built-in optimization methods.
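For anything beyond the one-parameter example above, Hyperopt search spaces are usually dictionaries. The following is a minimal sketch (the parameter names and toy loss are illustrative) that also keeps the optimization history in a Trials object:

from hyperopt import fmin, tpe, hp, Trials

space = {
    "lr": hp.loguniform("lr", -7, 0),                      # bounds are in log space
    "optimizer": hp.choice("optimizer", ["sgd", "adam"]),
}

def objective(params):
    penalty = 0.1 if params["optimizer"] == "sgd" else 0.0  # toy loss
    return (params["lr"] - 0.01) ** 2 + penalty

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=50, trials=trials)
print(best)  # hp.choice entries come back as indices, continuous ones as raw values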
MLflow: Open source platform for the machine learning lifecycle
Pros of MLflow
- More comprehensive end-to-end ML lifecycle management
- Broader language support (Python, R, Java, and more)
- Larger community and ecosystem of integrations
Cons of MLflow
- Less specialized for hyperparameter optimization
- May require more setup for specific use cases
- Can be overwhelming for simple projects
Code Comparison
MLflow:
import mlflow
mlflow.start_run()
mlflow.log_param("learning_rate", 0.01)
mlflow.log_metric("accuracy", 0.85)
mlflow.end_run()
Ax:
from ax import optimize
def evaluate(params):
    return {"accuracy": params["learning_rate"] * 10}

best_params, values, experiment, model = optimize(
    parameters=[{"name": "learning_rate", "type": "range", "bounds": [0.001, 0.1]}],
    evaluation_function=evaluate,
    objective_name="accuracy",
)
Summary
MLflow offers a more comprehensive solution for ML lifecycle management, supporting various languages and integrations. It's suitable for a wide range of projects but may be overkill for simpler tasks. Ax, on the other hand, specializes in hyperparameter optimization and experimentation, making it more focused but potentially less versatile for full ML pipeline management.
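Because the two tools target different layers, they can also be combined. The following is a minimal, illustrative sketch (not an official integration) that logs each Ax trial to MLflow, reusing the toy accuracy metric from the snippet above; exact AxClient arguments may vary by Ax version.

import mlflow
from ax.service.ax_client import AxClient

ax_client = AxClient()
ax_client.create_experiment(
    name="lr_tuning",
    parameters=[{"name": "learning_rate", "type": "range", "bounds": [0.001, 0.1]}],
    objective_name="accuracy",
)

for _ in range(10):
    parameters, trial_index = ax_client.get_next_trial()
    accuracy = parameters["learning_rate"] * 10  # placeholder metric from the toy example
    with mlflow.start_run():
        mlflow.log_params(parameters)
        mlflow.log_metric("accuracy", accuracy)
    ax_client.complete_trial(trial_index=trial_index, raw_data={"accuracy": accuracy})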
README
Ax is an accessible, general-purpose platform for understanding, managing, deploying, and automating adaptive experiments.
Adaptive experimentation is the machine-learning guided process of iteratively exploring a (possibly infinite) parameter space in order to identify optimal configurations in a resource-efficient manner. Ax currently supports Bayesian optimization and bandit optimization as exploration strategies. Bayesian optimization in Ax is powered by BoTorch, a modern library for Bayesian optimization research built on PyTorch.
For full documentation and tutorials, see the Ax website.
Why Ax?
- Expressive API: Ax has an expressive API that can address many real-world optimization tasks. It handles complex search spaces, multiple objectives, constraints on both parameters and outcomes, and noisy observations. It supports suggesting multiple designs to evaluate in parallel (both synchronously and asynchronously) and the ability to early-stop evaluations (see the sketch after this list).
- Strong performance out of the box: Ax abstracts away optimization details that are important but obscure, providing sensible defaults and enabling practitioners to leverage advanced techniques otherwise only accessible to optimization experts.
- State-of-the-art methods: Ax leverages state-of-the-art Bayesian optimization algorithms implemented in BoTorch to deliver strong performance across a variety of problem classes.
- Flexible: Ax is highly configurable, allowing researchers to plug in novel optimization algorithms, models, and experimentation flows.
- Production ready: Ax offers automation and orchestration features as well as robust error handling for real-world deployment at scale.
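As a rough illustration of the constraint support mentioned in the first bullet, here is a minimal sketch using the one-shot optimize loop; the metric names and constraint expressions are made up for the example.

from ax import optimize

def evaluate(p):
    # Return both the objective and the outcome used in the constraint.
    return {
        "objective": (p["x1"] - 0.5) ** 2 + (p["x2"] - 0.5) ** 2,
        "cost": p["x1"] + p["x2"],
    }

best_parameters, values, experiment, model = optimize(
    parameters=[
        {"name": "x1", "type": "range", "bounds": [0.0, 1.0]},
        {"name": "x2", "type": "range", "bounds": [0.0, 1.0]},
    ],
    evaluation_function=evaluate,
    objective_name="objective",
    minimize=True,
    parameter_constraints=["x1 + x2 <= 1.5"],  # linear constraint on the parameters
    outcome_constraints=["cost <= 1.0"],       # constraint on an observed outcome
    total_trials=20,
)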
Getting Started
To run a simple optimization loop in Ax (using the Booth response surface as the artificial evaluation function):
>>> from ax import Client, RangeParameterConfig
>>> client = Client()
>>> client.configure_experiment(
parameters=[
RangeParameterConfig(
name="x1",
bounds=(-10.0, 10.0),
parameter_type=ParameterType.FLOAT,
),
RangeParameterConfig(
name="x2",
bounds=(-10.0, 10.0),
parameter_type=ParameterType.FLOAT,
),
],
)
>>> client.configure_optimization(objective="-1 * booth")
>>> for _ in range(20):
>>> for trial_index, parameters in client.get_next_trials(max_trials=1).items():
>>> client.complete_trial(
>>> trial_index=trial_index,
>>> raw_data={
>>> "booth": (parameters["x1"] + 2 * parameters["x2"] - 7) ** 2
>>> + (2 * parameters["x1"] + parameters["x2"] - 5) ** 2
>>> },
>>> )
>>> client.get_best_parameterization()
Installation
Ax requires Python 3.10 or newer. A full list of Ax's direct dependencies can be found in setup.py.
We recommend installing Ax via pip, even if you are using a Conda environment:
pip install ax-platform
Installation will use Python wheels from PyPI, available for OSX, Linux, and Windows.
Note: Make sure the pip being used to install ax-platform is actually the one from the newly created Conda environment. If you're using a Unix-based OS, you can use which pip to check.
Installing with Extras
Ax can be installed with additional dependencies, which are not included in the
default installation. For example, in order to use Ax within a Jupyter notebook,
install Ax with the notebook
extra:
pip install "ax-platform[notebook]"
Extras for using Ax with MySQL storage (mysql), for running Ax's tutorials locally (tutorials), and for installing all dependencies necessary for developing Ax (dev) are also available.
Install Ax from source
You can install the latest (bleeding edge) version from GitHub using pip.
The bleeding edge for Ax depends on bleeding-edge versions of BoTorch and GPyTorch. We therefore recommend installing those from GitHub as well, and setting the following environment variables to allow Ax to use the latest versions of both BoTorch and GPyTorch.
export ALLOW_LATEST_GPYTORCH_LINOP=true
export ALLOW_BOTORCH_LATEST=true
pip install git+https://github.com/cornellius-gp/gpytorch.git
pip install git+https://github.com/pytorch/botorch.git
pip install 'git+https://github.com/facebook/Ax.git#egg=ax-platform'
Join the Ax Community
Getting help
Please open an issue on our issues page with any questions, feature requests or bug reports! If posting a bug report, please include a minimal reproducible example (as a code snippet) that we can use to reproduce and debug the problem you encountered.
Contributing
See the CONTRIBUTING file for how to help out.
When contributing to Ax, we recommend cloning the repository and installing all optional dependencies:
pip install git+https://github.com/cornellius-gp/linear_operator.git
pip install git+https://github.com/cornellius-gp/gpytorch.git
export ALLOW_LATEST_GPYTORCH_LINOP=true
pip install git+https://github.com/pytorch/botorch.git
export ALLOW_BOTORCH_LATEST=true
git clone https://github.com/facebook/ax.git --depth 1
cd ax
pip install -e .[tutorial]
See the recommendation above for installing PyTorch on macOS.
The above example limits the cloned directory size via the --depth argument to git clone. If you require the entire commit history, you may remove this argument.
License
Ax is licensed under the MIT license.