BinaryCrossEntropy
minnt.losses.BinaryCrossEntropy
Bases: Loss
Binary cross-entropy loss implementation.
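For reference, for a single predicted probability p and a target t in [0, 1], binary cross-entropy is conventionally defined as

$$
\mathcal{L}_{\text{BCE}}(p, t) = -\bigl(t \log p + (1 - t) \log (1 - p)\bigr),
$$

with the per-element losses combined according to the reduction parameter described below.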
Source code in minnt/losses/binary_cross_entropy.py
__init__
__init__(
    *,
    label_smoothing: float = 0.0,
    probs: bool = False,
    reduction: Reduction = "mean"
) -> None
Create the BinaryCrossEntropy loss object with the specified label smoothing, prediction format, and reduction method.
Parameters:

- label_smoothing (float, default: 0.0) – A float in [0.0, 1.0] specifying the label smoothing factor. If greater than 0.0, the ground-truth targets used in the loss are computed as a mixture of the original targets, with weight 1 - label_smoothing, and the uniform distribution, with weight label_smoothing (illustrated in the sketch below).
- probs (bool, default: False) – If False, the predictions are assumed to be logits; if True, the predictions are assumed to be probabilities. Note that gold targets are always expected to be probabilities.
- reduction (Reduction, default: "mean") – The reduction method to apply to the computed loss.
Source code in minnt/losses/binary_cross_entropy.py
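As an illustration of how these options interact, here is a minimal sketch assuming a PyTorch-style Tensor. The helper below is hypothetical and only mirrors the documented semantics; it is not minnt's actual implementation:

```python
import torch

# Hypothetical sketch of the documented semantics (not minnt's actual code).
def binary_cross_entropy(
    y: torch.Tensor,
    y_true: torch.Tensor,
    *,
    label_smoothing: float = 0.0,
    probs: bool = False,
    reduction: str = "mean",
) -> torch.Tensor:
    if label_smoothing > 0.0:
        # Mix the original targets (weight 1 - label_smoothing) with the
        # uniform distribution, which is 0.5 in the binary case.
        y_true = (1.0 - label_smoothing) * y_true + label_smoothing * 0.5
    if probs:
        # Predictions are already probabilities; clamp for numerical stability.
        y = y.clamp(1e-7, 1.0 - 1e-7)
        loss = -(y_true * y.log() + (1.0 - y_true) * (1.0 - y).log())
    else:
        # Predictions are logits; use the numerically stable formulation.
        loss = torch.nn.functional.binary_cross_entropy_with_logits(
            y, y_true, reduction="none")
    if reduction == "mean":
        return loss.mean()
    if reduction == "sum":
        return loss.sum()
    return loss  # reduction == "none"
```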
__call__
__call__(
    y: Tensor,
    y_true: Tensor,
    sample_weights: Tensor | None = None
) -> Tensor
Compute the binary cross-entropy loss, optionally with sample weights.
Parameters:

- y (Tensor) – The predicted outputs. Their shape either has to be exactly the same as the shape of y_true (no broadcasting), or it can contain an additional single dimension of size 1.
- y_true (Tensor) – The ground-truth targets.
- sample_weights (Tensor | None, default: None) – Optional sample weights. If provided, their shape must be broadcastable to a prefix of the shape of y_true, and the loss of every sample is weighted accordingly (see the usage sketch below).
Returns:

- Tensor – A tensor representing the computed loss. A scalar tensor if the reduction is "mean" or "sum"; otherwise (if the reduction is "none"), a tensor of the same shape as y_true.
Source code in minnt/losses/binary_cross_entropy.py
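A minimal usage sketch, assuming minnt is installed and follows the PyTorch-style API documented above; the shapes and values are illustrative only:

```python
import torch
from minnt.losses import BinaryCrossEntropy

loss_fn = BinaryCrossEntropy(label_smoothing=0.1, reduction="mean")

logits = torch.randn(4, 8)   # predicted logits, shape (4, 8)
targets = torch.rand(4, 8)   # ground-truth probabilities in [0, 1]
# One weight per sample: shape (4,) is a prefix of targets' shape (4, 8).
weights = torch.tensor([1.0, 0.5, 2.0, 1.0])

loss = loss_fn(logits, targets, sample_weights=weights)
print(loss.item())  # a scalar, because reduction="mean"
```

With reduction="none", the returned tensor would instead keep the shape of targets, one loss value per element.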