Common objective (loss) functions

Attribution

This section introduces the FlatLoss class. The code is, for the most part, adapted from the fast.ai library.

Wrappers

Below we define several wrappers for well-known losses implemented in the PyTorch library. The main idea is that we need to flatten our predictions and targets before passing them to the chosen loss function.
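To see why flattening matters, here is a small hand-rolled example. The shapes and variable names are purely illustrative and not taken from the library.

```python
import torch
from torch import nn

# nn.CrossEntropyLoss expects (N, C) predictions and (N,) targets, but a model
# working on sequences typically emits (batch, steps, classes). Collapsing the
# batch and step dimensions makes the shapes line up.
preds   = torch.randn(8, 10, 5)         # (batch, steps, n_classes)
targets = torch.randint(0, 5, (8, 10))  # (batch, steps) class indices

flat_preds   = preds.reshape(-1, preds.shape[-1])  # (80, 5)
flat_targets = targets.reshape(-1)                 # (80,)
loss = nn.CrossEntropyLoss()(flat_preds, flat_targets)
```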

class FlatLoss[source]

FlatLoss(func:_Loss, axis:int=-1, to_float:bool=False, is_2d:bool=False, **kwargs)

Same as whatever func is, but with flattened input and target.

The FlatLoss class creates a callable that applies the loss function we pass in, but flattens the input and target tensors before the computation.
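The following is a minimal sketch of what such a wrapper might look like, modeled on fast.ai's FlattenedLoss and the signature above. It is an illustrative reimplementation, not the library's actual code, and the handling of `func` (class vs. instance) is an assumption.

```python
import torch
from torch.nn.modules.loss import _Loss


class FlatLoss:
    "Illustrative sketch: call `func` on flattened input and target."

    def __init__(self, func, axis: int = -1, to_float: bool = False,
                 is_2d: bool = False, **kwargs):
        # Assumption: `func` may be a loss class (e.g. nn.MSELoss) or an instance;
        # extra kwargs are forwarded to the constructor when a class is given.
        self.func = func(**kwargs) if isinstance(func, type) else func
        self.axis, self.to_float, self.is_2d = axis, to_float, is_2d

    def __call__(self, input: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # Move the chosen axis last, then collapse every other dimension.
        input = input.transpose(self.axis, -1).contiguous()
        target = target.transpose(self.axis, -1).contiguous()
        if self.to_float:
            target = target.float()
        # Losses such as cross-entropy need (N, C) input; others take a flat vector.
        input = input.view(-1, input.shape[-1]) if self.is_2d else input.view(-1)
        return self.func(input, target.view(-1))
```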

Common losses

FlatCrossEntropyLoss[source]

FlatCrossEntropyLoss(axis:int=-1, to_float:bool=True, is_2d:bool=False, **kwargs)

Same as nn.CrossEntropyLoss, but with flattened input and target.
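A hypothetical usage example for per-step classification. The shapes and the explicit is_2d/to_float settings below are assumptions, chosen so that standard class-index targets work with nn.CrossEntropyLoss.

```python
import torch

preds   = torch.randn(8, 10, 5)         # (batch, steps, n_classes)
targets = torch.randint(0, 5, (8, 10))  # (batch, steps) class indices

# is_2d=True keeps the class axis, so nn.CrossEntropyLoss receives
# (batch*steps, n_classes) predictions and (batch*steps,) integer targets.
loss_func = FlatCrossEntropyLoss(is_2d=True, to_float=False)
loss = loss_func(preds, targets)
```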

FlatBCELoss[source]

FlatBCELoss(axis:int=-1, to_float:bool=True, is_2d:bool=False, **kwargs)

Same as nn.BCELoss, but with flattened input and target.
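A hypothetical multi-label example with illustrative shapes. Both tensors are flattened to 1D, and the integer targets are cast to floats, which nn.BCELoss requires.

```python
import torch

probs   = torch.rand(16, 4)             # sigmoid-activated predictions in [0, 1]
targets = torch.randint(0, 2, (16, 4))  # 0/1 labels per class

loss_func = FlatBCELoss()   # to_float=True casts the targets to float
loss = loss_func(probs, targets)
```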

FlatMSELoss[source]

FlatMSELoss(axis:int=-1, to_float:bool=True, is_2d:bool=False, **kwargs)

Same as nn.MSELoss, but with flattened input and target.
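A hypothetical regression example with illustrative shapes. Predictions and targets share the same shape and are both flattened to 1D before nn.MSELoss is applied.

```python
import torch

preds   = torch.randn(8, 10)   # (batch, steps) predicted values
targets = torch.randn(8, 10)   # (batch, steps) true values

loss = FlatMSELoss()(preds, targets)
```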