Which layer's activation is used? #1

Open
iyupan opened this issue Feb 28, 2024 · 1 comment

iyupan commented Feb 28, 2024

Hello,

This is great work! I am wondering which layer the analyzed activations come from. Is it the last layer?

@QiaoranC

This is addressed in Section 2.1, "Which Layers?":

In LLaMA2-7B, massive activations first appear in layer 2 and remain as nearly constant values until layer 30. Intriguingly, for LLaMA2-7B and 13B, massive activations emerge very rapidly after one layer of computation, e.g., layer 2 and layer 4 respectively. This means that they do not emerge as a result of gradual accumulation through many layers, but are caused by a rather different mechanism.
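
For reference, here is a minimal sketch (not from this repo) of how one could inspect per-layer hidden states with Hugging Face `transformers` to see where large-magnitude activations first appear; the checkpoint name and input prompt are just placeholders for illustration.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint; swap in whatever LLaMA2 weights you have access to.
model_name = "meta-llama/Llama-2-7b-hf"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)
model.eval()

inputs = tokenizer("Summer is warm. Winter is cold.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

# hidden_states[0] is the embedding output; hidden_states[i] is the output of decoder layer i.
for i, h in enumerate(outputs.hidden_states):
    print(f"layer {i:2d}: max |activation| = {h.abs().max().item():.1f}")
```

Running something like this, you would expect the max absolute activation to jump by orders of magnitude at the layer where massive activations emerge (layer 2 for 7B per the paper) and stay large through most of the remaining layers.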
