Drop the last layer weights during fine-tuning if the number of classes is different
At the moment, the pre-trained models are fully loaded for fine-tuning. We should also support cases where the number of classes differs between the pre-trained model and the fine-tuning task. For this, the weights of the model's last layer need to be dropped so that it can be re-initialised with the new number of output classes.
See https://gitlab.com/teklia/dla/doc-ufcn/-/blob/main/doc_ufcn/model.py#L41
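A minimal sketch of what this could look like, assuming the checkpoint is a plain PyTorch state dict and the final classification layer is registered under a key such as `last_conv` (the actual attribute name in `doc_ufcn/model.py` may differ):

```python
import torch


def load_pretrained_weights(model, checkpoint_path, num_classes):
    """Load pre-trained weights, dropping the last layer if class counts differ.

    Note: `last_conv` is a placeholder for the name of the model's final
    classification layer and may not match the real doc-ufcn attribute.
    """
    state_dict = torch.load(checkpoint_path, map_location="cpu")

    # Compare the pre-trained output channels with the fine-tuning classes.
    pretrained_classes = state_dict["last_conv.weight"].shape[0]
    if pretrained_classes != num_classes:
        # Drop the final layer's parameters so the model keeps its freshly
        # initialised last layer, sized for the new number of classes.
        state_dict = {
            key: value
            for key, value in state_dict.items()
            if not key.startswith("last_conv.")
        }

    # strict=False tolerates the missing last-layer keys.
    model.load_state_dict(state_dict, strict=False)
    return model
```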