bitorch.models.dlrm.create_mlp

bitorch.models.dlrm.create_mlp(layer_sizes: List[int], quantized: bool = False) → Sequential

Creates an MLP module.

Parameters:

layer_sizes (List[int]) – linear layer unit sizes

quantized (bool) – if True, build the MLP with quantized linear layers instead of full-precision ones. Defaults to False.

Returns:

the created MLP module

Return type:

Sequential

The last layer will have a sigmoid activation function; all other layers will have ReLU activation.
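
A minimal usage sketch (the layer sizes and batch size below are illustrative assumptions, not values from the library; it assumes consecutive entries of layer_sizes give the input/output features of each linear layer)::

    import torch
    from bitorch.models.dlrm import create_mlp

    # Build an MLP with layers 13 -> 512 -> 256 -> 1 (sizes are illustrative).
    # Hidden layers use ReLU; the last layer uses sigmoid.
    mlp = create_mlp([13, 512, 256, 1], quantized=False)

    x = torch.randn(32, 13)   # batch of 32 dense feature vectors
    out = mlp(x)              # shape (32, 1), values in (0, 1) after sigmoid
    print(out.shape)

Passing quantized=True would instead build the MLP from bitorch's quantized linear layers, which is the point of the flag in a binary/quantized network library.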