Hello! Thanks so much for your entry! We've successfully evaluated your checkpoint, and the quality checks out. We also greatly appreciate the organization and quality of the code.
One question on your quantization scoring: in your report you say that you count the additions and multiplications separately, but in flops_counter.py it looks like you sum them together and scale both by the reduced-precision factor:
Linear/Conv Counting:
https://github.com/yashbhalgat/QualcommAI-MicroNet-submission-MixNet/blob/master/lsq_quantizer/flops_counter.py#L286
https://github.com/yashbhalgat/QualcommAI-MicroNet-submission-MixNet/blob/master/lsq_quantizer/flops_counter.py#L346
Quantization Scaling (flops_counter.py, line 176 at 0d67473):

```python
mod_flops = module.__flops__*max(w_str[quant_idx], a_str[quant_idx])/32.0
```
Am I understanding this correctly? It looks like you're properly rounding the weights and activations before each linear operation during evaluation, but the additions in these kernels should be counted as FP32, unless I'm missing something.
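To illustrate the distinction I mean, here's a minimal sketch (function and variable names are mine, not from your repo) of counting the two op types separately, with only the multiplies discounted by precision, versus scaling the summed count:

```python
# Hypothetical sketch of separate vs. summed op counting for one
# quantized linear/conv layer. All names here are illustrative.

def separate_count(mults, adds, w_bits, a_bits):
    """Multiplies scale with the reduced precision; the accumulations
    (additions) are still counted at full FP32 cost."""
    mult_factor = max(w_bits, a_bits) / 32.0  # reduced-precision multiplies
    return mults * mult_factor + adds         # adds charged as FP32

def summed_count(mults, adds, w_bits, a_bits):
    """Sum mults and adds first, then scale everything by the
    precision factor -- what the quoted line appears to do."""
    return (mults + adds) * max(w_bits, a_bits) / 32.0

# Example: 100 multiplies and 100 adds at 8-bit weights/activations.
# separate_count gives 100*0.25 + 100 = 125.0,
# summed_count gives (100+100)*0.25 = 50.0 -- a large difference.
```

The gap between the two numbers is exactly the discount applied to the additions, which is why I wanted to check.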
Trevor