fix: default proxyllm generator function (#971)
xtyuns authored Dec 25, 2023
1 parent 0c46c33 commit 048fb6c
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion dbgpt/model/llm_out/proxy_llm.py
@@ -34,7 +34,7 @@ def proxyllm_generate_stream(
     model_name = model_params.model_name
     default_error_message = f"{model_name} LLM is not supported"
     generator_function = generator_mapping.get(
-        model_name, lambda: default_error_message
+        model_name, lambda *args: [default_error_message]
     )

     yield from generator_function(model, tokenizer, params, device, context_len)
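The one-line change fixes two problems with the fallback: the old lambda took no arguments, so calling it with the five positional arguments passed at the `yield from` site raised a `TypeError`; and even if it had been called, `yield from` over a plain string would have streamed the error message character by character. A minimal self-contained sketch (using hypothetical stand-in values, not the real DB-GPT generators) illustrates both points:

```python
# Sketch of the fallback-lookup pattern from proxy_llm.py.
# generator_mapping, model_name, and the dummy call arguments here
# are hypothetical stand-ins for illustration.

generator_mapping = {}  # no proxy generator registered for this model
model_name = "unknown-model"
default_error_message = f"{model_name} LLM is not supported"

# Before the fix: zero-argument lambda returning a plain string.
broken_fallback = generator_mapping.get(
    model_name, lambda: default_error_message
)

# After the fix: accepts any arguments and returns an iterable that
# yields the whole message exactly once.
fixed_fallback = generator_mapping.get(
    model_name, lambda *args: [default_error_message]
)

def stream(fn):
    # Mirrors the call site:
    # yield from generator_function(model, tokenizer, params, device, context_len)
    yield from fn(None, None, {}, "cpu", 4096)

try:
    list(stream(broken_fallback))
except TypeError as exc:
    # The old lambda takes 0 positional arguments but is called with 5.
    print("old fallback:", exc)

print("new fallback:", list(stream(fixed_fallback)))
```

Wrapping the message in a list (rather than returning the bare string) matters because the caller consumes the result with `yield from`: a list yields the complete error message as a single chunk, matching the chunked-output contract of the real generator functions.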
