Commit

remove unnecessary model
CuriousPanCake committed Oct 16, 2024
1 parent 0e32749 commit e60122d
Showing 2 changed files with 2 additions and 3 deletions.
@@ -6,7 +6,7 @@
 if using cache-eviction) containing a map in the
 following format with nodes number changes for each model:
-pa_reference_map = {
+ref_diff_map = {
 "hf-internal-testing/tiny-random-LlamaForCausalLM" : {
 "PagedAttentionExtension" : 2,
 "ScaledDotProductAttention" : -2,
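The docstring above describes a per-model map of expected node-count deltas after the SDPA-to-PagedAttention conversion. Below is a minimal sketch, not the repository's actual test code, of how such a ref_diff_map entry could be checked; check_node_count_changes is a hypothetical helper, and the op-type lists stand in for whatever the tests collect from the converted models (e.g. [node.get_type_name() for node in model.get_ops()], which is an assumption about the collection step).

from collections import Counter

# Expected node-count deltas per model after the SDPA -> PagedAttention
# transformation (same shape as the map shown in the docstring above).
ref_diff_map = {
    "hf-internal-testing/tiny-random-LlamaForCausalLM": {
        "PagedAttentionExtension": 2,
        "ScaledDotProductAttention": -2,
    },
}

def check_node_count_changes(model_id, op_types_before, op_types_after):
    # op_types_* are plain lists of operation type names taken from the model
    # graph before and after the transformation (collection method assumed).
    before = Counter(op_types_before)
    after = Counter(op_types_after)
    for op_type, expected_delta in ref_diff_map[model_id].items():
        actual_delta = after[op_type] - before[op_type]
        assert actual_delta == expected_delta, (
            f"{model_id}: {op_type} count changed by {actual_delta}, "
            f"expected {expected_delta}"
        )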
@@ -40,5 +40,4 @@ Xenova/tiny-random-Phi3ForCausalLM,https://huggingface.co/Xenova/tiny-random-Phi3ForCausalLM
 facebook/opt-125m,https://huggingface.co/facebook/opt-125m
 facebook/opt-350m,https://huggingface.co/facebook/opt-350m
 katuni4ka/tiny-random-chatglm2,https://huggingface.co/katuni4ka/tiny-random-chatglm2
-katuni4ka/tiny-random-glm4,https://huggingface.co/katuni4ka/tiny-random-glm4
-katuni4ka/tiny-random-orion,https://huggingface.co/katuni4ka/tiny-random-orion,xfail,No ScaledDotProductAttention operation observed in the graph CVS-145820
+katuni4ka/tiny-random-glm4,https://huggingface.co/katuni4ka/tiny-random-glm4
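For illustration only, a small sketch (an assumption, not the repository's parser) of how rows in the model-list file above could be read: the first two columns are the model id and its Hugging Face URL, and an optional trailing "xfail,<reason>" pair marks models expected to fail. load_model_list and its path argument are hypothetical names.

import csv

def load_model_list(path):
    entries = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue
            model_id, url = row[0], row[1]
            xfail_reason = None
            if len(row) >= 4 and row[2] == "xfail":
                # e.g. "No ScaledDotProductAttention operation observed in the graph CVS-145820"
                xfail_reason = ",".join(row[3:])
            entries.append((model_id, url, xfail_reason))
    return entries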

0 comments on commit e60122d
