Optimizing the configuration settings that control how a machine learning model is trained (values chosen before training rather than learned from the data) to improve its performance. Like fine-tuning a recipe by adjusting cooking temperature and time for the best results.
Data scientists tune hyperparameters such as the learning rate and the number of layers to make a neural network more accurate.
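To make the idea concrete, here is a minimal, self-contained sketch of hyperparameter search using scikit-learn's RandomizedSearchCV on a toy dataset. The dataset, model, and search space below are illustrative choices for the two hyperparameters mentioned above, not tied to any cloud service:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)

# Search space: the two hyperparameters from the example above --
# the learning rate and the network's layer sizes.
param_distributions = {
    "learning_rate_init": [1e-4, 1e-3, 1e-2, 1e-1],
    "hidden_layer_sizes": [(32,), (64,), (64, 32), (128, 64)],
}

search = RandomizedSearchCV(
    MLPClassifier(max_iter=300, random_state=0),
    param_distributions=param_distributions,
    n_iter=8,   # number of trials (parameter combinations) to run
    cv=3,       # 3-fold cross-validation scores each trial
    random_state=0,
)
search.fit(X, y)

print("best params:", search.best_params_)
print("best CV accuracy:", round(search.best_score_, 3))
```

Each trial trains a model with one candidate configuration, the cross-validated accuracy acts as the objective metric, and the best-scoring configuration is kept. Managed cloud tuning services automate this same loop at larger scale.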
All four clouds support managed hyperparameter tuning: the service runs many training trials with different parameter values, tracks the objective metric for each, and surfaces the best configuration. AWS offers SageMaker Automatic Model Tuning (HPO jobs), Azure Machine Learning offers sweep jobs (formerly HyperDrive), GCP offers Vertex AI hyperparameter tuning jobs, and OCI handles tuning through OCI Data Science jobs and pipelines.
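The workflow is similar across these services: define a search space, an objective metric, and a trial budget, then let the service launch and score the trials. As one hedged example, a SageMaker HPO job might be set up roughly as follows; the container image, IAM role, S3 path, and metric regex are placeholders you would replace with your own values:

```python
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.tuner import ContinuousParameter, IntegerParameter, HyperparameterTuner

session = sagemaker.Session()

# Placeholder values -- substitute your own training image, role, and data location.
estimator = Estimator(
    image_uri="<your-training-image>",
    role="<your-sagemaker-execution-role>",
    instance_count=1,
    instance_type="ml.m5.xlarge",
    sagemaker_session=session,
)

# Ranges the tuner is allowed to explore for each hyperparameter.
hyperparameter_ranges = {
    "learning_rate": ContinuousParameter(1e-4, 1e-1, scaling_type="Logarithmic"),
    "num_layers": IntegerParameter(2, 8),
}

tuner = HyperparameterTuner(
    estimator=estimator,
    objective_metric_name="validation:accuracy",
    objective_type="Maximize",
    hyperparameter_ranges=hyperparameter_ranges,
    # Regex that extracts the metric from the training logs;
    # it depends on what your training script prints.
    metric_definitions=[{"Name": "validation:accuracy",
                         "Regex": "validation accuracy: ([0-9\\.]+)"}],
    max_jobs=20,           # total trials
    max_parallel_jobs=2,   # trials run concurrently
)

tuner.fit({"train": "s3://<your-bucket>/train"})
print(tuner.best_training_job())
```

Azure ML sweep jobs, Vertex AI tuning jobs, and OCI Data Science jobs follow the same pattern of search space plus objective metric plus trial budget, differing mainly in SDK syntax and in how trials are scheduled.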