BinaryCrossEntropy
minnt.metrics.BinaryCrossEntropy
Bases: Mean
Binary cross-entropy metric implementation.
Source code in minnt/metrics/binary_cross_entropy.py
__init__
Create the BinaryCrossEntropy metric object.
Parameters:
- label_smoothing (float, default: 0.0) – A float in [0.0, 1.0] specifying the label smoothing factor. If greater than 0.0, the ground-truth targets used in the computation are a mixture of the original targets (with weight 1 - label_smoothing) and the uniform distribution (with weight label_smoothing).
- probs (bool, default: False) – If False, the predictions are assumed to be logits; if True, the predictions are assumed to be probabilities. Note that gold targets are always expected to be probabilities.
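The interplay of the two options can be illustrated with a small sketch. The helper below is hypothetical (it is not minnt's implementation); it only demonstrates the documented semantics: logits are passed through a sigmoid unless probs is True, and label smoothing mixes the target with the uniform distribution (0.5 in the binary case).

```python
import math

def binary_cross_entropy(pred, target, probs=False, label_smoothing=0.0):
    # Hypothetical helper illustrating the documented semantics;
    # not the actual minnt implementation.
    if not probs:
        pred = 1.0 / (1.0 + math.exp(-pred))  # sigmoid: logit -> probability
    # Mix the original target (weight 1 - label_smoothing) with the
    # uniform distribution over {0, 1}, i.e. the constant 0.5.
    target = target * (1.0 - label_smoothing) + 0.5 * label_smoothing
    return -(target * math.log(pred) + (1.0 - target) * math.log(1.0 - pred))

# With probs=False (the default), the prediction is a logit:
loss_from_logit = binary_cross_entropy(0.0, 1.0)  # sigmoid(0.0) == 0.5
# With probs=True, the prediction is already a probability:
loss_from_prob = binary_cross_entropy(0.5, 1.0, probs=True)
# Both calls describe the same prediction, so both losses equal -log(0.5).
```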
Source code in minnt/metrics/binary_cross_entropy.py
update
Update the accumulated binary cross-entropy by introducing new values.
Optional sample weights may be provided; if not, all values are weighted by 1.
Parameters:
- y (Tensor) – The predicted outputs. Their shape must either be exactly the same as that of y_true (no broadcasting), or may contain an additional single dimension of size 1.
- y_true (Tensor) – The ground-truth targets.
- sample_weights (Tensor | None, default: None) – Optional sample weights. If provided, their shape must be broadcastable to a prefix of the shape of y_true, and the loss for each sample is weighted accordingly.
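Since the metric derives from Mean, repeated update calls accumulate a weighted mean of per-sample losses. The following sketch mimics that accumulation on plain Python lists; the class name, the compute method, and the loop structure are illustrative assumptions, not minnt's code (minnt operates on tensors and handles the broadcasting described above).

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class BCEAccumulator:
    # Hypothetical sketch of a Mean-style metric: update adds weighted
    # per-sample losses; the metric value is the weighted mean.
    def __init__(self):
        self.total, self.weight = 0.0, 0.0

    def update(self, y, y_true, sample_weights=None):
        if sample_weights is None:
            sample_weights = [1.0] * len(y_true)  # unweighted: all ones
        for logit, target, w in zip(y, y_true, sample_weights):
            p = sigmoid(logit)
            loss = -(target * math.log(p) + (1 - target) * math.log(1 - p))
            self.total += w * loss
            self.weight += w

    def compute(self):
        return self.total / self.weight

acc = BCEAccumulator()
acc.update([2.0, -1.0], [1.0, 0.0])             # first batch, weight 1 each
acc.update([0.5], [1.0], sample_weights=[2.0])  # second batch, weighted by 2
result = acc.compute()                          # weighted mean over all samples
```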
Source code in minnt/metrics/binary_cross_entropy.py