Some answers in phi3-vision just return </s> #823
Comments
I have noticed it works better if I leave out the newline characters.
(The same thing happens in version 0.4.0.) And there is no way to tell it to ignore this token and pick a different one. There needs to be a way to give us more control over which tokens are chosen from the distribution.
The original … You can also update your … to …
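The comment above points at editing the EOS token in the model's generation config. For onnxruntime-genai models this lives in a `genai_config.json` file next to the model; the exact key layout and the token id 32000 below are assumptions for illustration, so check your own file before copying this sketch:

```python
# Hedged sketch: rewrite the eos_token_id in a genai_config.json-style
# file. The {"model": {"eos_token_id": ...}} layout and the id values
# are assumptions, not confirmed against the real phi-3 config.
import json
import os
import tempfile

def update_eos_token(config_path: str, new_eos_id: int) -> None:
    """Load the config, replace eos_token_id, and write it back."""
    with open(config_path) as f:
        config = json.load(f)
    config["model"]["eos_token_id"] = new_eos_id
    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)

# Demo against a throwaway file with a made-up layout.
path = os.path.join(tempfile.mkdtemp(), "genai_config.json")
with open(path, "w") as f:
    json.dump({"model": {"eos_token_id": 2}}, f)

update_eos_token(path, 32000)  # 32000 is a placeholder id, not verified

with open(path) as f:
    print(json.load(f)["model"]["eos_token_id"])  # 32000
```

Back up the original config before editing it, since a wrong EOS id can make generation run past the intended stopping point.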
@elephantpanda did you still see the issue when you updated the EOS token?
Yes, it still seems to end without an answer for certain inputs. My bigger problem with the vision model, though, is that it crashes when I try to give it an image. I think I narrowed that problem down, with more details here.
@elephantpanda Did you happen to try this on CPU?
It's the same on GPU and CPU. Not including a newline character after `<|assistant|>` in the prompt seems to fix it, as far as I can tell. I'll keep testing to see whether this really does fix it.
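The workaround above comes down to how the chat template string is assembled. A small helper makes the trailing newline an explicit switch, so it is easy to compare both variants; the `<|user|>`/`<|end|>`/`<|assistant|>`/`<|image_1|>` markers follow the documented phi-3 vision prompt format, but treat this as a sketch rather than the library's own template logic:

```python
# Sketch of the phi-3 vision chat prompt discussed in this thread.
# newline_after_assistant toggles the trailing "\n" that the reporter
# found causes the model to emit </s> immediately for some questions.
def build_prompt(question: str, with_image: bool = False,
                 newline_after_assistant: bool = False) -> str:
    image_tag = "<|image_1|>\n" if with_image else ""
    prompt = f"<|user|>\n{image_tag}{question}<|end|>\n<|assistant|>"
    if newline_after_assistant:
        prompt += "\n"  # the variant that reportedly triggers the bug
    return prompt

print(build_prompt("What is the capital of France?"))
```

Feeding the two variants of the same question through the tokenizer and comparing outputs is a quick way to confirm whether the newline is really the trigger on your setup.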
In phi-3 vision DirectML, using either Python or C#, certain questions just return `</s>`. For example, "Why is the sky blue?" returns a complete answer, but "What is the capital of France?" consistently returns just `</s>` (i.e. it returns a blank answer). I'm not sure this is a bug in genai specifically; it may be a bug in the model itself. Either way, is there a suggested way to combat this? Has anyone else noticed it?
Perhaps genai needs to be fixed so that it doesn't generate the stop token as its first token! (It also shouldn't be writing the stop token to the screen.)
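The second complaint, the stop token being written to the screen, can at least be handled on the application side while waiting for a fix. This is a minimal sketch, not the real onnxruntime-genai streaming API: the EOS id of 2 for `</s>` and the `decode` callback are placeholder assumptions.

```python
# Hedged sketch: consume a token stream, stop at the EOS token, and
# never emit it to the caller. EOS_ID = 2 (</s>) is an assumption
# borrowed from llama-style tokenizers, not verified for phi-3 vision.
EOS_ID = 2

def decode_stream(token_ids, decode, eos_id=EOS_ID):
    """Yield decoded text per token, silently dropping the stop token."""
    for tok in token_ids:
        if tok == eos_id:
            break  # end of answer; do not print the stop token itself
        yield decode(tok)

# Demo with a dummy decoder that just brackets the id.
pieces = list(decode_stream([5, 7, EOS_ID, 9], decode=lambda t: f"<{t}>"))
print("".join(pieces))  # <5><7>
```

This only hides the symptom; the "empty answer" case (EOS sampled as the very first token) still needs a fix in generation itself, e.g. suppressing EOS for the first step of sampling.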