Always use 1D and 2D positional embedding

Merged: Manon Blanco requested to merge fix-positional-embedding into main
6 files changed: +2 −18
@@ -151,8 +151,6 @@ def get_config():
         "dec_pred_dropout": 0.1, # dropout rate before decision layer
         "dec_att_dropout": 0.1, # dropout rate in multi head attention
         "dec_dim_feedforward": 256, # number of dimension for feedforward layer in transformer decoder layers
-        "use_2d_pe": True, # use 2D positional embedding
-        "use_1d_pe": True, # use 1D positional embedding
         "attention_win": 100, # length of attention window
         # Curriculum dropout
         "dropout_scheduler": {