autoemulate.emulators.base#
- class Emulator(x=None, y=None, **kwargs)[source]#
Bases: ABC, ValidationMixin, ConversionMixin, TorchDeviceMixin
Base class for all emulators.
This class provides the basic structure and methods for emulators in AutoEmulate. It includes methods for fitting, predicting, and managing devices.
- is_fitted_ = False#
- supports_grad = False#
- scheduler_cls = None#
- x_transform = None#
- y_transform = None#
- classmethod short_name()[source]#
Return a short name for the model.
Take the capital letters of the class name and return them as a lowercase string. For example, if the class name is GaussianProcess, this returns gp.
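The documented behaviour can be sketched as a small standalone function (an illustrative assumption about how the classmethod is implemented, not the library's actual code):

```python
# Sketch of the documented short_name() rule: keep the capital letters of
# the class name, then lower-case them. Hypothetical stand-in function.
def short_name(class_name: str) -> str:
    return "".join(c for c in class_name if c.isupper()).lower()

print(short_name("GaussianProcess"))  # gp
```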
- predict(x, with_grad=False)[source]#
Predict the output for the given input.
- Parameters:
x (TensorLike) – Input tensor to make predictions for.
with_grad (bool) – Whether to enable gradient calculation. Defaults to False.
- Returns:
The predicted output.
- Return type:
OutputLike
- static get_tune_params()[source]#
Return a dictionary of hyperparameters to tune.
The keys in the TuneParams must be implemented as keyword arguments in the __init__ method of any subclasses.
e.g.
- tune_params: TuneParams = {
      "lr": [0.01, 0.1], "batch_size": [16, 32]
  }
- model_params: ModelParams = {
      "lr": 0.01, "batch_size": 16
  }
- class MySubClass(Emulator):
      def __init__(self, lr, batch_size):
          self.lr = lr
          self.batch_size = batch_size
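The relationship between TuneParams and ModelParams can be sketched as a runnable snippet (an assumption about the tuning convention: each tuning step picks one candidate value per key, and the keys must match __init__ keyword arguments):

```python
import random

# TuneParams: candidate values per hyperparameter (keys must match
# __init__ keyword arguments of the subclass).
tune_params = {"lr": [0.01, 0.1], "batch_size": [16, 32]}

# ModelParams: one concrete value per key, e.g. sampled during tuning.
random.seed(0)
model_params = {k: random.choice(v) for k, v in tune_params.items()}
print(sorted(model_params))  # ['batch_size', 'lr']
```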
- class DeterministicEmulator(x=None, y=None, **kwargs)[source]#
Bases: Emulator
A base class for deterministic emulators.
- class ProbabilisticEmulator(x=None, y=None, **kwargs)[source]#
Bases: Emulator
A base class for probabilistic emulators.
- predict(x, with_grad=False)[source]#
Predict the output distribution for the given input.
- Parameters:
x (TensorLike) – Input tensor to make predictions for.
with_grad (bool) – Whether to enable gradient calculation. Defaults to False.
- Returns:
The emulator predicted distribution for x.
- Return type:
DistributionLike
- class GaussianEmulator(x=None, y=None, **kwargs)[source]#
Bases: ProbabilisticEmulator
A base class for Gaussian emulators.
- supports_grad = True#
- predict(x, with_grad=False)[source]#
Predict the Gaussian distribution for the given input.
- Parameters:
x (TensorLike) – Input tensor to make predictions for.
with_grad (bool) – Whether to enable gradient calculation. Defaults to False.
- Returns:
The emulator predicted Gaussian distribution for x.
- Return type:
GaussianLike
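GaussianLike is not defined on this page; assuming it behaves like torch's MultivariateNormal, a caller might extract the predictive moments like this (a sketch, not the library's API):

```python
import torch
from torch.distributions import MultivariateNormal

# Assumption: a GaussianLike prediction exposes mean and covariance_matrix,
# as torch's MultivariateNormal does. We build one directly as a stand-in
# for the distribution an emulator's predict(x) would return.
dist = MultivariateNormal(loc=torch.zeros(2), covariance_matrix=torch.eye(2))
mean = dist.mean              # predictive mean, shape (2,)
cov = dist.covariance_matrix  # predictive covariance, shape (2, 2)
print(mean.shape, cov.shape)
```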
- class GaussianProcessEmulator(x=None, y=None, **kwargs)[source]#
Bases: GaussianEmulator
A base class for Gaussian Process emulators.
- predict(x, with_grad=False)[source]#
Predict the Gaussian distribution for the given input.
- Parameters:
x (TensorLike) – Input tensor to make predictions for.
with_grad (bool) – Whether to enable gradient calculation. Defaults to False.
- Returns:
The emulator predicted Gaussian distribution for x.
- Return type:
GaussianLike
- class PyTorchBackend(*args, **kwargs)[source]#
Bases: Module, Emulator
PyTorchBackend provides a backend for PyTorch models.
The class provides the basic structure and methods for PyTorch-based emulators to enable further subclassing and customization. It supplies default implementations so that model-specific subclasses only need to implement:
.__init__(): the constructor for the model
.forward(): the forward pass of the model
.get_tune_params(): the hyperparameters to tune for the model
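The three-method pattern above can be sketched as follows. To keep the sketch runnable without autoemulate, a plain nn.Module stands in for PyTorchBackend; the MLPEmulator name and architecture are illustrative assumptions:

```python
import torch
import torch.nn as nn

class MLPEmulator(nn.Module):
    """Minimal MLP showing the three methods a PyTorchBackend subclass
    is documented to implement (hypothetical example)."""

    def __init__(self, n_in: int = 2, n_out: int = 1, lr: float = 0.1):
        super().__init__()
        self.lr = lr  # keyword argument matching a get_tune_params() key
        self.net = nn.Sequential(
            nn.Linear(n_in, 8), nn.ReLU(), nn.Linear(8, n_out)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

    @staticmethod
    def get_tune_params():
        # Keys must match __init__ keyword arguments.
        return {"lr": [0.01, 0.1]}

model = MLPEmulator()
out = model(torch.zeros(4, 2))
print(out.shape)  # torch.Size([4, 1])
```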
- batch_size = 16#
- shuffle = True#
- epochs = 10#
- loss_history = []#
- verbose = False#
- loss_fn = MSELoss()#
- optimizer_cls# alias of Adam
- optimizer#
- supports_grad = True#
- lr = 0.1#
- scheduler_cls = None#
- class SklearnBackend(x=None, y=None, **kwargs)[source]#
Bases: DeterministicEmulator
SklearnBackend provides a backend for sklearn models.
The class provides the basic structure and methods for sklearn-based emulators to enable further subclassing and customization. It supplies default implementations so that model-specific subclasses only need to implement:
.__init__(): the constructor for the model
.get_tune_params(): the hyperparameters to tune for the model
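The analogous sklearn pattern can be sketched standalone. The RandomForestSketch class below is illustrative, not the library's implementation; its self.model attribute mirrors the model attribute documented below:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

class RandomForestSketch:
    """Hypothetical SklearnBackend-style subclass: only __init__ (setting
    self.model to a fitted-interface sklearn estimator) and
    get_tune_params() are needed."""

    def __init__(self, n_estimators: int = 10):
        self.model = RandomForestRegressor(
            n_estimators=n_estimators, random_state=0
        )

    @staticmethod
    def get_tune_params():
        # Keys must match __init__ keyword arguments.
        return {"n_estimators": [10, 50, 100]}

rng = np.random.default_rng(0)
x = rng.random((20, 2))
y = x.sum(axis=1)
emu = RandomForestSketch()
emu.model.fit(x, y)
pred = emu.model.predict(x)
print(pred.shape)  # (20,)
```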
- model#
- normalize_y = False#
- y_mean#
- y_std#
- supports_grad = False#
- class DropoutTorchBackend(*args, **kwargs)[source]#
Bases: PyTorchBackend
PyTorch backend model that is able to support dropout.
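Dropout support typically matters for Monte Carlo dropout: keeping dropout active at prediction time yields stochastic forward passes whose spread approximates predictive uncertainty. A standalone sketch assuming standard torch behaviour, not autoemulate internals:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# A small network with a dropout layer.
net = nn.Sequential(
    nn.Linear(2, 16), nn.ReLU(), nn.Dropout(p=0.5), nn.Linear(16, 1)
)

# Monte Carlo dropout: keep the module in train mode so dropout stays
# active, then collect multiple stochastic forward passes.
net.train()
x = torch.zeros(8, 2)
samples = torch.stack([net(x) for _ in range(20)])
print(samples.shape)  # torch.Size([20, 8, 1])
```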