Automatic Text Recognition / DAN · Merge requests · !229

Support DistributedDataParallel
Merged · Mélodie Boillet requested to merge fix-ddp into main · 1 year ago
Closes #116 (closed).
With this fix, we can:

- Train on multiple GPUs
- Continue a training started with 1 GPU on multiple GPUs
- Continue a training started with multiple GPUs on 1 GPU
- Load any pre-trained model (trained with 0, 1 or more GPUs); see the sketch below
It also closes #143 (closed) and #118 (closed).
Edited 1 year ago by Yoann Schneider.
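
The first commit adds a `fix_ddp_layers_names` function. DistributedDataParallel wraps the model and prefixes every parameter name with `module.`, so checkpoints saved with and without DDP have incompatible state-dict keys. A minimal sketch of such a helper, assuming the usual prefix-normalization approach (not the project's actual implementation):

```python
def fix_ddp_layers_names(state_dict: dict, to_ddp: bool) -> dict:
    """Add or strip the 'module.' prefix that DistributedDataParallel
    places in front of every parameter name, so a checkpoint trained
    with any number of GPUs can be loaded on any other setup.
    """
    prefix = "module."
    fixed = {}
    for name, weight in state_dict.items():
        if to_ddp and not name.startswith(prefix):
            # Checkpoint saved from a plain model, loading into a DDP-wrapped one.
            fixed[prefix + name] = weight
        elif not to_ddp and name.startswith(prefix):
            # Checkpoint saved from a DDP-wrapped model, loading into a plain one.
            fixed[name[len(prefix):]] = weight
        else:
            fixed[name] = weight
    return fixed


# Hypothetical usage (file name and checkpoint layout are assumptions):
# checkpoint = torch.load("model.pt", map_location="cpu")
# model.load_state_dict(fix_ddp_layers_names(checkpoint, to_ddp=False))
```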
Commits (3) · Aug 07, 2023

- Add fix_ddp_layers_names function · 41872136 · authored by Mélodie Boillet, 1 year ago
- Document training on multiple GPUs · 27001cba · authored by Mélodie Boillet, 1 year ago
- Support DistributedDataParallel · bc660a06 · authored and committed by Mélodie Boillet, 1 year ago
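
For the multi-GPU training itself, the standard PyTorch pattern is to launch one process per GPU and wrap the model in `DistributedDataParallel`; a minimal, generic sketch (the model and script name below are placeholders, not DAN's actual entry point):

```python
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main() -> None:
    # torchrun sets RANK, LOCAL_RANK and WORLD_SIZE for every process.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Placeholder model standing in for the real network.
    model = torch.nn.Linear(128, 10).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    # ... training loop: each process works on its own data shard,
    # and gradients are all-reduced automatically during backward().

    dist.destroy_process_group()


if __name__ == "__main__":
    main()

# Launch one process per GPU, e.g. on a 2-GPU machine:
#   torchrun --nproc_per_node=2 train.py
```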