fixed issue of passing empty parameter list to optimizer #9
Hi @Andras7, first I want to thank you for providing this code; it really is a big help. I ran into this error when trying to train the model:
```
Traceback (most recent call last):
  File "/path/to/test.py", line 8, in <module>
    w2v_trainer.train()
  File "/path/to/venv/lib/python3.8/site-packages/word2vec/trainer.py", line 37, in train
    optimizer = optim.SparseAdam(self.skip_gram_model.parameters(), lr=self.initial_lr)
  File "/path/to/venv/lib/python3.8/site-packages/torch/optim/sparse_adam.py", line 49, in __init__
    super(SparseAdam, self).__init__(params, defaults)
  File "/path/to/venv/lib/python3.8/site-packages/torch/optim/optimizer.py", line 47, in __init__
    raise ValueError("optimizer got an empty parameter list")
ValueError: optimizer got an empty parameter list
```
So I went ahead and wrapped `self.skip_gram_model.parameters()` in `list()`.
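For anyone hitting the same error, here is a minimal sketch of why the `list()` wrapper helps. The likely cause is that `SparseAdam` in torch 1.7.x iterates over `params` to check that they are sparse before passing them to the base `Optimizer`; if `params` is a generator (as `model.parameters()` returns), that first pass exhausts it, so the base class then sees an empty sequence. The function names below are hypothetical stand-ins, not the real torch source:

```python
# Plain-Python stand-ins illustrating the suspected generator-exhaustion bug.

def check_all_sparse(params):
    """Stand-in for SparseAdam's validation loop: it iterates over params."""
    for _ in params:
        pass

def make_optimizer(params):
    """Stand-in for Optimizer.__init__, which materializes params."""
    check_all_sparse(params)       # exhausts a generator argument
    params = list(params)          # the base class only sees what is left
    if len(params) == 0:
        raise ValueError("optimizer got an empty parameter list")
    return params

gen = (p for p in [1, 2, 3])       # like self.skip_gram_model.parameters()
try:
    make_optimizer(gen)            # fails: the generator was already consumed
except ValueError as e:
    print(e)

# Wrapping in list() up front gives the optimizer a re-iterable sequence:
make_optimizer(list(p for p in [1, 2, 3]))
```

Newer torch releases appear to materialize the parameter list inside `SparseAdam` itself, which would explain why this only shows up on some versions.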
Might this be a version issue? I ran it with Python 3.8 and torch 1.7.1. It works for me now; if the change looks fine to you, feel free to accept the PR.