This repository has been archived by the owner on Jun 10, 2021. It is now read-only.

non recurrent mode for variational dropout #390

Open · wants to merge 8 commits into master

Conversation

jsenellart
Contributor

Variational dropout as described in Gal et al. (2016) does not yield the expected results for NMT. This adds a new mode, `variational_non_recurrent`, for further exploration.
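The distinction the PR explores can be sketched as follows. In variational dropout (Gal et al., 2016), one dropout mask is sampled per sequence and reused at every timestep; a "non-recurrent" variant instead samples a fresh mask at each timestep, like ordinary dropout applied step-wise. This is an illustrative Python sketch, not the actual Lua/Torch implementation; all function names here are hypothetical.

```python
import random

def make_mask(size, p, rng):
    # Inverted-dropout mask: each unit is zeroed with probability p,
    # otherwise scaled by 1/(1-p) so the expected value is unchanged.
    return [0.0 if rng.random() < p else 1.0 / (1.0 - p) for _ in range(size)]

def variational_dropout(seq, p, rng):
    # Variational (recurrent) dropout: sample ONE mask and share it
    # across every timestep of the sequence.
    mask = make_mask(len(seq[0]), p, rng)
    return [[x * m for x, m in zip(step, mask)] for step in seq]

def non_recurrent_dropout(seq, p, rng):
    # Non-recurrent variant: sample a NEW mask at each timestep.
    return [[x * m for x, m in zip(step, make_mask(len(step), p, rng))]
            for step in seq]
```

With the variational version, the same hidden units are dropped at every timestep of a given sequence, whereas the non-recurrent version decorrelates the masks across time.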

codecov-io commented Oct 4, 2017

Codecov Report

Merging #390 into master will decrease coverage by <.01%.
The diff coverage is 100%.


@@            Coverage Diff             @@
##           master     #390      +/-   ##
==========================================
- Coverage   69.62%   69.61%   -0.01%     
==========================================
  Files          74       74              
  Lines        6282     6290       +8     
==========================================
+ Hits         4374     4379       +5     
- Misses       1908     1911       +3
| Impacted Files | Coverage Δ | |
|---|---|---|
| onmt/modules/Encoder.lua | 92.2% <100%> (ø) | ⬆️ |
| onmt/modules/LSTM.lua | 98.3% <100%> (ø) | ⬆️ |
| onmt/modules/GRU.lua | 94.44% <100%> (ø) | ⬆️ |
| onmt/train/Trainer.lua | 17.69% <0%> (-0.21%) | ⬇️ |
| onmt/data/Preprocessor.lua | 92.97% <0%> (+0.05%) | ⬆️ |

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update b3b26fc...7cf6909. Read the comment docs.
