bitorch.layers.qactivation.QActivation¶
- class bitorch.layers.qactivation.QActivation(activation: Optional[Union[Quantization, str]] = None, gradient_cancellation_threshold: Optional[float] = 0.0)[source]¶
Activation layer for quantization.
Methods

- __init__ – initialization function for fetching a suitable activation function.
- forward – forwards the input tensor through the activation function.
- __init__(activation: Optional[Union[Quantization, str]] = None, gradient_cancellation_threshold: Optional[float] = 0.0) None [source]¶
Initializes the layer by fetching a suitable activation function.
- Parameters:
activation (Union[str, Quantization], optional) – a quantization module or the name of a quantization function. Defaults to None.
gradient_cancellation_threshold (Optional[float], optional) – threshold for input gradient cancellation. Disabled if the threshold is 0. Defaults to 0.0.
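Gradient cancellation is a common technique in quantized networks: the forward pass quantizes the input, while the backward pass lets the gradient pass through unchanged (a straight-through estimator) except where the input's magnitude exceeds the threshold, in which case the gradient is zeroed. A minimal pure-Python sketch of this idea, assuming sign quantization; the helper names are hypothetical and this is not bitorch's actual implementation:

```python
def sign_quantize(x: float) -> float:
    # Forward pass: quantize the input to +1 / -1.
    return 1.0 if x >= 0 else -1.0


def cancelled_grad(x: float, upstream_grad: float, threshold: float) -> float:
    # Backward pass (straight-through estimator with cancellation):
    # pass the upstream gradient through unchanged, unless the input's
    # magnitude exceeds the threshold. A threshold of 0 disables
    # cancellation entirely, matching the parameter's default.
    if threshold > 0 and abs(x) > threshold:
        return 0.0
    return upstream_grad
```

With a threshold of 1.0, an input of 1.7 has its gradient cancelled, while an input of 0.4 passes the gradient through; with a threshold of 0, cancellation is disabled regardless of the input.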