BinaryCrossEntropy

minnt.metrics.BinaryCrossEntropy

Bases: Mean

Binary cross-entropy metric implementation.

Source code in minnt/metrics/binary_cross_entropy.py
class BinaryCrossEntropy(Mean):
    """Binary cross-entropy metric implementation."""

    def __init__(self, *, label_smoothing: float = 0.0, probs: bool = False) -> None:
        """Create the BinaryCrossEntropy metric object.

        Parameters:
          label_smoothing: A float in [0.0, 1.0] specifying the label smoothing factor.
            If greater than 0.0, the ground-truth targets used are computed as a mixture
            of the original targets (with weight `1 - label_smoothing`) and the uniform
            distribution (with weight `label_smoothing`).
          probs: If `False`, the predictions are assumed to be logits; if `True`, the
            predictions are assumed to be probabilities. Note that gold targets are
            always expected to be probabilities.
        """
        super().__init__()
        self._bce_loss = losses.BinaryCrossEntropy(label_smoothing=label_smoothing, probs=probs, reduction="none")

    def update(
        self, y: torch.Tensor, y_true: torch.Tensor, sample_weights: torch.Tensor | None = None,
    ) -> None:
        """Update the accumulated binary cross-entropy by introducing new values.

        Optional sample weights may be provided; if not, all values are weighted by 1.

        Parameters:
          y: The predicted outputs. Their shape must either be exactly the same as that
            of `y_true` (no broadcasting), or may contain an additional single dimension of size 1.
          y_true: The ground-truth targets.
          sample_weights: Optional sample weights. If provided, their shape must be broadcastable
            to a prefix of the shape of `y_true`, and the loss for each sample is weighted accordingly.
        """
        super().update(self._bce_loss(y, y_true), sample_weights=sample_weights)

__init__

__init__(*, label_smoothing: float = 0.0, probs: bool = False) -> None

Create the BinaryCrossEntropy metric object.

Parameters:

  • label_smoothing (float, default: 0.0 ) –

    A float in [0.0, 1.0] specifying the label smoothing factor. If greater than 0.0, the ground-truth targets used are computed as a mixture of the original targets (with weight 1 - label_smoothing) and the uniform distribution (with weight label_smoothing).

  • probs (bool, default: False ) –

    If False, the predictions are assumed to be logits; if True, the predictions are assumed to be probabilities. Note that gold targets are always expected to be probabilities.
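
The smoothing mixture described above can be sketched in plain Python, assuming the usual label-smoothing convention (the uniform distribution over {0, 1} has expected value 0.5); `smooth_targets` is an illustrative helper, not part of `minnt`:

```python
def smooth_targets(y_true: float, label_smoothing: float) -> float:
    # Mixture of the original target (weight 1 - label_smoothing) and the
    # uniform distribution over {0, 1}, whose expected value is 0.5.
    return y_true * (1.0 - label_smoothing) + 0.5 * label_smoothing

# With label_smoothing=0.1, a hard target of 1.0 becomes 0.95 and 0.0 becomes 0.05.
```

With label_smoothing=0.0 the targets are returned unchanged, matching the default behavior.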

Source code in minnt/metrics/binary_cross_entropy.py
def __init__(self, *, label_smoothing: float = 0.0, probs: bool = False) -> None:
    """Create the BinaryCrossEntropy metric object.

    Parameters:
      label_smoothing: A float in [0.0, 1.0] specifying the label smoothing factor.
        If greater than 0.0, the ground-truth targets used are computed as a mixture
        of the original targets (with weight `1 - label_smoothing`) and the uniform
        distribution (with weight `label_smoothing`).
      probs: If `False`, the predictions are assumed to be logits; if `True`, the
        predictions are assumed to be probabilities. Note that gold targets are
        always expected to be probabilities.
    """
    super().__init__()
    self._bce_loss = losses.BinaryCrossEntropy(label_smoothing=label_smoothing, probs=probs, reduction="none")

update

update(y: Tensor, y_true: Tensor, sample_weights: Tensor | None = None) -> None

Update the accumulated binary cross-entropy by introducing new values.

Optional sample weights may be provided; if not, all values are weighted by 1.

Parameters:

  • y (Tensor) –

    The predicted outputs. Their shape must either be exactly the same as that of y_true (no broadcasting), or may contain an additional single dimension of size 1.

  • y_true (Tensor) –

    The ground-truth targets.

  • sample_weights (Tensor | None, default: None ) –

    Optional sample weights. If provided, their shape must be broadcastable to a prefix of the shape of y_true, and the loss for each sample is weighted accordingly.

Source code in minnt/metrics/binary_cross_entropy.py
def update(
    self, y: torch.Tensor, y_true: torch.Tensor, sample_weights: torch.Tensor | None = None,
) -> None:
    """Update the accumulated binary cross-entropy by introducing new values.

    Optional sample weights may be provided; if not, all values are weighted by 1.

    Parameters:
      y: The predicted outputs. Their shape must either be exactly the same as that
        of `y_true` (no broadcasting), or may contain an additional single dimension of size 1.
      y_true: The ground-truth targets.
      sample_weights: Optional sample weights. If provided, their shape must be broadcastable
        to a prefix of the shape of `y_true`, and the loss for each sample is weighted accordingly.
    """
    super().update(self._bce_loss(y, y_true), sample_weights=sample_weights)