bitorch.layers.qactivation.QActivation

class bitorch.layers.qactivation.QActivation(activation: Optional[Union[Quantization, str]] = None, gradient_cancellation_threshold: Optional[float] = 0.0)[source]

Activation layer that applies a configurable quantization function to its input.

Methods

__init__

Initializes the layer and fetches the suitable activation (quantization) function.

forward

Forwards the input tensor through the activation function.

__init__(activation: Optional[Union[Quantization, str]] = None, gradient_cancellation_threshold: Optional[float] = 0.0) → None[source]

Initializes the layer and fetches the suitable activation (quantization) function.

Parameters:
  • activation (Union[str, Quantization], optional) – quantization module or the name of a quantization function. Defaults to None.

  • gradient_cancellation_threshold (Optional[float], optional) – threshold for input gradient cancellation; cancellation is disabled if the threshold is 0. Defaults to 0.0.
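
A minimal construction sketch, assuming "sign" names a quantization function registered in bitorch (the threshold value of 1.0 is illustrative, not a prescribed default):

    import torch
    from bitorch.layers import QActivation

    # "sign" is assumed to be a registered quantization name;
    # a Quantization instance may be passed instead of a string.
    # A nonzero threshold enables input gradient cancellation.
    qact = QActivation(activation="sign", gradient_cancellation_threshold=1.0)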

forward(input_tensor: Tensor) → Tensor[source]

Forwards the input tensor through the activation function.

Parameters:
  • input_tensor (torch.Tensor) – input tensor

Returns:
  quantized input tensor

Return type:
  torch.Tensor
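
A hedged forward-pass sketch; the input shape, the "sign" quantization name, and the exact quantized values are assumptions for illustration:

    import torch
    from bitorch.layers import QActivation

    qact = QActivation(activation="sign", gradient_cancellation_threshold=1.0)
    x = torch.randn(4, 8, requires_grad=True)
    y = qact(x)          # calls forward(); quantizes the input (e.g. to {-1, +1} for sign)
    y.sum().backward()   # gradients for inputs beyond the threshold are cancelled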