# Model Builder

Factory functions for building models, optimizers, loss functions, and LR schedulers.

## `models.model_builder`
### `build_loss_function(loss_type: str, kwargs: dict[str, object] | None = None) -> nn.Module`

Build and return a loss function.

The function accepts an explicit `kwargs` mapping instead of a dynamic `**kwargs` to improve static typing. If `kwargs` is `None`, an empty mapping is used when constructing the loss class.
Source code in src/models/model_builder.py
### `build_lr_scheduler(optimizer: optim.Optimizer, scheduler_config: dict[str, Any] | None = None, steps_per_epoch: int | None = None, epochs: int | None = None) -> LRScheduler | None`

Build and return a learning rate scheduler.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `optimizer` | `Optimizer` | Optimizer to wrap with the scheduler. | *required* |
| `scheduler_config` | `dict[str, Any] \| None` | Mapping containing the scheduler configuration. | `None` |
| `steps_per_epoch` | `int \| None` | Number of steps per epoch (required for OneCycleLR if not in args). | `None` |
| `epochs` | `int \| None` | Total number of training epochs (required for OneCycleLR if not in args). | `None` |
Returns:

| Type | Description |
|---|---|
| `LRScheduler \| None` | Instantiated scheduler, or `None` if no scheduler is configured. |
Source code in src/models/model_builder.py
### `build_model(model_type: str, encoder_name: str, in_channels: int, n_classes: int, encoder_weights: str | None = None, activation: str | None = None, *, model_config: dict[str, Any] | None = None, stochastic_depth: float | None = None, decoder_norm: bool | str | dict[str, Any] | None = None) -> nn.Module`

Build a segmentation model.

Supports both SMP models and temporal models like U-TAE.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `model_type` | `str` | Type of model (e.g., `'Unet'`, `'FPN'`, `'UTAE'`). | *required* |
| `encoder_name` | `str` | Name of the encoder backbone. | *required* |
| `in_channels` | `int` | Number of input channels. | *required* |
| `n_classes` | `int` | Number of output classes. | *required* |
| `encoder_weights` | `str \| None` | Pre-trained weights for the encoder. | `None` |
| `activation` | `str \| None` | Activation function for the output. | `None` |
| `model_config` | `dict[str, Any] \| None` | Additional model-specific configuration parameters. | `None` |
| `stochastic_depth` | `float \| None` | Drop-path rate for stochastic depth (regularization). | `None` |
| `decoder_norm` | `bool \| str \| dict[str, Any] \| None` | Decoder normalization config: `True` uses BatchNorm (default); `False` disables normalization; a `str` selects `'batchnorm'`, `'groupnorm'`, `'layernorm'`, or `'instancenorm'`; a `dict` gives a full config, e.g. `{'type': 'groupnorm', 'num_groups': 8}`. | `None` |
Returns:

| Type | Description |
|---|---|
| `Module` | Initialized model. |
Source code in src/models/model_builder.py
### `build_optimizer(model: nn.Module, optimizer_type: str, learning_rate: float, weight_decay: float = 0.0, betas: tuple[float, float] | None = None, encoder_lr_mult: float | None = None) -> optim.Optimizer`

Build and return an optimizer for the given model.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `encoder_lr_mult` | `float \| None` | Optional multiplier for the encoder learning rate (`encoder_lr = learning_rate * mult`). | `None` |
Source code in src/models/model_builder.py
### `freeze_encoder(model: nn.Module) -> None`

Freeze encoder parameters (set `requires_grad=False`).
Source code in src/models/model_builder.py
### `unfreeze_encoder(model: nn.Module) -> None`

Unfreeze encoder parameters (set `requires_grad=True`).
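Freezing is just toggling `requires_grad` on the encoder's parameters. A minimal stand-in sketch; the `Param`/`Encoder`/`Model` classes are dummies standing in for torch objects, and reaching the encoder via a `model.encoder` attribute is an assumption:

```python
class Param:
    """Dummy stand-in for a torch parameter with a requires_grad flag."""
    def __init__(self) -> None:
        self.requires_grad = True

class Encoder:
    def __init__(self) -> None:
        self._params = [Param(), Param()]
    def parameters(self):
        return self._params

class Model:
    def __init__(self) -> None:
        self.encoder = Encoder()  # assumed attribute name

def freeze_encoder(model: Model) -> None:
    for p in model.encoder.parameters():
        p.requires_grad = False  # excluded from gradient updates

def unfreeze_encoder(model: Model) -> None:
    for p in model.encoder.parameters():
        p.requires_grad = True   # trains again

m = Model()
freeze_encoder(m)
print(all(not p.requires_grad for p in m.encoder.parameters()))  # True
```

A typical use is fine-tuning: freeze the pre-trained encoder for the first few epochs, then unfreeze it once the decoder has stabilized.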