torchdms.loss
Loss functions and functions relevant to losses.
Functions

- Computes the L1 norm of the difference between adjacent betas for each latent dimension.
- l1_loss, perhaps with loss decay or target exponentiation.
- mse_loss, perhaps with loss decay or target exponentiation.
- Computes the L1 norm of the product of betas across latent dimensions.
- Root mean square error, perhaps with loss decay or target exponentiation.
- The sum of the 2-norms of the columns.
- Computes the L1 norm of the difference between aggregated betas at adjacent sites for each latent dimension.
- Generic loss function decorator with loss decay or target exponentiation.
- torchdms.loss.weighted_loss(base_loss)[source]
Generic loss function decorator with loss decay or target exponentiation.
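The docstring only states that the decorator adds loss decay or target exponentiation to a base loss; one way such a decorator could look is sketched below. The weighting scheme (multiplying per-element losses by exp(loss_decay * y_true)), the exponentiation of both targets and predictions, and the helper name l1_sketch are assumptions for illustration, not the package's actual implementation.

```python
import torch

def weighted_loss(base_loss):
    """Sketch of a decorator adding optional loss_decay / exp_target handling.

    Assumptions (not taken from the docs): exp_target exponentiates both
    targets and predictions before the base loss is applied, and loss_decay
    re-weights each observation by exp(loss_decay * y_true) so that
    high-scoring variants contribute more to the loss.
    """
    def loss(y_true, y_predicted, loss_decay=None, exp_target=None):
        if exp_target is not None:
            y_true = exp_target ** y_true
            y_predicted = exp_target ** y_predicted
        per_element = base_loss(y_true, y_predicted)
        if loss_decay is not None:
            per_element = per_element * torch.exp(loss_decay * y_true)
        return per_element.mean()
    return loss

@weighted_loss
def l1_sketch(y_true, y_predicted):
    """Per-element absolute error; reduced to a mean by the decorator."""
    return torch.abs(y_true - y_predicted)
```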
- torchdms.loss.l1(y_true, y_predicted, loss_decay=None, exp_target=None)
l1_loss, perhaps with loss decay or target exponentiation.
- torchdms.loss.mse(y_true, y_predicted, loss_decay=None, exp_target=None)
mse_loss, perhaps with loss decay or target exponentiation.
- torchdms.loss.rmse(*args, **kwargs)[source]
Root mean square error, perhaps with loss decay or target exponentiation.
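Going by the signatures above, the decorated losses take true and predicted target tensors plus the optional keyword arguments. A minimal usage sketch follows; the keyword values are illustrative examples, not recommended settings.

```python
import torch
from torchdms.loss import l1, mse, rmse

y_true = torch.tensor([0.0, -1.2, 0.5, -3.0])
y_pred = torch.tensor([0.1, -1.0, 0.4, -2.5])

print(l1(y_true, y_pred))     # plain L1 loss
print(mse(y_true, y_pred))    # plain MSE
print(rmse(y_true, y_pred))   # square root of the MSE

# Optional keyword arguments from the signatures above; the numeric
# values here are arbitrary examples.
print(mse(y_true, y_pred, loss_decay=0.1))
print(l1(y_true, y_pred, exp_target=2.0))
```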
- torchdms.loss.sitewise_group_lasso(matrix)[source]
The sum of the 2-norms of the columns.
We omit the square root of the group sizes, as they are all constant in our case.
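Reading each column of matrix as one group, the penalty described above can be written in a single line; this is a sketch under that assumption, not necessarily the package's exact code, and the function name and example shapes are illustrative.

```python
import torch

def sitewise_group_lasso_sketch(matrix: torch.Tensor) -> torch.Tensor:
    """Sum of the 2-norms of the columns of a 2-D tensor.

    The usual sqrt(group size) factor of the group lasso is omitted,
    since all groups (columns) have the same size here.
    """
    return matrix.norm(dim=0).sum()


beta_matrix = torch.randn(2, 5)  # e.g. 2 latent dimensions x 5 sites
print(sitewise_group_lasso_sketch(beta_matrix))
```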
- torchdms.loss.product_penalty(betas)[source]
Computes the L1 norm of the product of betas across latent dimensions.
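Assuming betas arrives as a collection of same-shaped coefficient tensors, one per latent dimension (the exact layout is not specified here), the penalty could be computed as follows; the function name and shapes are illustrative.

```python
import torch

def product_penalty_sketch(betas) -> torch.Tensor:
    """L1 norm of the elementwise product of betas across latent dimensions.

    Assumes `betas` is an iterable of same-shaped tensors, one per latent
    dimension; this layout is an assumption, not taken from the docs.
    """
    stacked = torch.stack(list(betas))  # (latent_dims, n_coefficients)
    return torch.prod(stacked, dim=0).abs().sum()


betas = [torch.randn(10), torch.randn(10)]  # two latent dimensions
print(product_penalty_sketch(betas))
```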