Correctly support "DistributedDataParallel"
I got this error when trying to train DAN (`teklia-dan train document`) with the parameter `"use_ddp": True`:
```
  File "/home/users/mblanco/dan/dan/manager/dataset.py", line 117, in load_dataloaders
    self.train_loader = DataLoader(
  File "/home/users/mblanco/dan/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 334, in __init__
    raise ValueError('batch_sampler option is mutually exclusive '
ValueError: batch_sampler option is mutually exclusive with batch_size, shuffle, sampler, and drop_last
```
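For context, PyTorch raises this `ValueError` whenever `batch_sampler` is passed to `DataLoader` together with any of `batch_size`, `shuffle`, `sampler`, or `drop_last`. Below is a minimal sketch of how a DDP-compatible loader is usually built (the dataset and variable names are hypothetical, not DAN's actual API): either pass a `DistributedSampler` via `sampler` alongside `batch_size`, or wrap it in a `BatchSampler` and pass only `batch_sampler`, but never mix both groups of arguments.

```python
import torch
from torch.utils.data import BatchSampler, DataLoader, DistributedSampler, TensorDataset

# Hypothetical dataset standing in for DAN's training dataset.
dataset = TensorDataset(torch.randn(100, 3), torch.randint(0, 10, (100,)))

use_ddp = torch.distributed.is_available() and torch.distributed.is_initialized()

if use_ddp:
    # DDP case: DistributedSampler shards the data across ranks.
    sampler = DistributedSampler(dataset, shuffle=True)

    # Option A: pass the sampler together with batch_size/drop_last.
    train_loader = DataLoader(dataset, batch_size=8, sampler=sampler, drop_last=True)

    # Option B: wrap the sampler in a BatchSampler and pass ONLY batch_sampler,
    # without batch_size/shuffle/sampler/drop_last (mixing them triggers the
    # ValueError shown in the traceback above):
    # batch_sampler = BatchSampler(sampler, batch_size=8, drop_last=True)
    # train_loader = DataLoader(dataset, batch_sampler=batch_sampler)
else:
    # Single-process case: plain shuffled loader.
    train_loader = DataLoader(dataset, batch_size=8, shuffle=True, drop_last=True)
```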