Automatic Text Recognition / DAN · Merge requests · !141

Remove growing models
Merged · Mélodie Boillet requested to merge remove-growing-models into main, 1 year ago
Closes #60 (closed)
Compare main (base) and latest version (3ddaebef, 1 commit, 1 year ago)

1 file changed: +0 −18
dan/manager/training.py (+0 −18)
@@ -357,7 +357,6 @@ class GenericTrainingManager:
         Load the optimizer of each model
         """
         for model_name in self.models.keys():
-            new_params = dict()
             if (
                 checkpoint
                 and "optimizer_named_params_{}".format(model_name) in checkpoint
@@ -365,16 +364,6 @@ class GenericTrainingManager:
                 self.optimizers_named_params_by_group[model_name] = checkpoint[
                     "optimizer_named_params_{}".format(model_name)
                 ]
-                # for progressively growing models
-                for name, param in self.models[model_name].named_parameters():
-                    existing = False
-                    for gr in self.optimizers_named_params_by_group[model_name]:
-                        if name in gr:
-                            gr[name] = param
-                            existing = True
-                            break
-                    if not existing:
-                        new_params.update({name: param})
             else:
                 self.optimizers_named_params_by_group[model_name] = [
                     dict(),
@@ -420,13 +409,6 @@ class GenericTrainingManager:
                     checkpoint[
                         "lr_scheduler_{}_state_dict".format(model_name)]
                 )
-            # for progressively growing models, keeping learning rate
-            if checkpoint and new_params:
-                self.optimizers_named_params_by_group[model_name].append(new_params)
-                self.optimizers[model_name].add_param_group(
-                    {"params": list(new_params.values())}
-                )
-
     @staticmethod
     def set_model_learnable(model, learnable=True):
         for p in list(model.parameters()):
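For context on what was dropped: the deleted branch used PyTorch's Optimizer.add_param_group to register parameters created after the optimizer was built, which is how progressively growing models picked up parameters for their new stages when resuming from a checkpoint. Below is a minimal standalone sketch of that mechanism; the model and layer names are hypothetical and not taken from DAN.

# Sketch only: the PyTorch add_param_group mechanism the removed code
# relied on. Model, layer, and optimizer here are hypothetical.
import torch
import torch.nn as nn

model = nn.Linear(8, 8)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Parameters created after the optimizer exists (e.g. a new stage of a
# growing model) are invisible to it until registered explicitly.
new_layer = nn.Linear(8, 4)
new_params = dict(new_layer.named_parameters())

# Mirrors the deleted call: from here on, the optimizer also updates
# the newly created tensors.
optimizer.add_param_group({"params": list(new_params.values())})

Since DAN no longer grows models during training, every parameter exists when the optimizer is constructed, and this bookkeeping (new_params, the per-group name scan, and the add_param_group call) becomes dead code.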