model_handler

Model handlers for BFCL integration with Stained Glass.

Classes:

| Name | Description |
| --- | --- |
| `LlamaChatCompletionsHandler` | Handler for Llama models in chat/completions mode for function calling. |

LlamaChatCompletionsHandler

Bases: OpenAICompletionsHandler

Handler for Llama models in chat/completions mode for function calling. It uses the same system prompt that is used when sending a completions request to a Llama model. The chat template can be used directly or with the default Hugging Face system prompt removed. According to the Llama model card, function calling should be handled differently than the standard Hugging Face chat template suggests; for details, see https://www.llama.com/docs/model-cards-and-prompt-formats/llama4_omni/#-zero-shot-function-calling---system-message- This applies to all Llama 3 and Llama 4 series models.
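To make the zero-shot function-calling guidance concrete, here is a minimal sketch of building a chat/completions message list where the function definitions live in the system message rather than in the default Hugging Face chat template. The tool definition, prompt wording, and `build_messages` helper are illustrative assumptions, not BFCL's or the handler's actual prompt.

```python
import json

# Hypothetical tool definition for illustration only.
tools = [
    {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
]


def build_messages(user_query: str, tools: list) -> list:
    """Put the function definitions in the system message, per the
    zero-shot function-calling section of the Llama model card,
    instead of relying on the default chat-template system prompt."""
    system_prompt = (
        "You have access to the following functions. To call a function, "
        'respond with a JSON object of the form '
        '{"name": <function-name>, "parameters": <args-dict>}.\n'
        + json.dumps(tools, indent=2)
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_query},
    ]


messages = build_messages("What's the weather in Paris?", tools)
```

The same message list is sent in both function-calling and prompt-style evaluation, which is why the formatted prompt string is identical in both modes.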

In addition, because Llama uses the same system prompt as the default BFCL system prompt that is normally provided to the model in "prompt mode", the constructed formatted prompt string is the same in both modes. As a result, there is no separate "prompt mode" for Llama models, which avoids confusion.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `model_name` | `str` | Name of the model to be evaluated, drawn from the supported models in the BFCL documentation. | required |
| `temperature` | `float` | Temperature for generation. | required |
| `registry_name` | `str` | Internal BFCL name for the model. | required |
| `is_fc_model` | `bool` | Whether the evaluation runs in function-calling mode. | required |
| `**kwargs` | `Any` | Other keyword arguments. | `{}` |

Raises:

| Type | Description |
| --- | --- |
| `ValueError` | If "llama" is not in the model name. |
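The constructor signature and the documented `ValueError` can be sketched as follows. This is a simplified stand-in, not the actual implementation; in particular, the case-insensitive substring check is an assumption about how the name validation is performed.

```python
class LlamaChatCompletionsHandler:
    """Simplified stand-in illustrating the documented parameters
    and the ValueError raised for non-Llama model names."""

    def __init__(
        self,
        model_name: str,
        temperature: float,
        registry_name: str,
        is_fc_model: bool,
        **kwargs,
    ) -> None:
        # Assumed check: the real handler rejects models whose name
        # does not identify a Llama model.
        if "llama" not in model_name.lower():
            raise ValueError(f"Expected a Llama model, got {model_name!r}.")
        self.model_name = model_name
        self.temperature = temperature
        self.registry_name = registry_name
        self.is_fc_model = is_fc_model
```

Under this sketch, a name such as `"meta-llama/Llama-3.1-8B-Instruct"` is accepted, while a non-Llama name raises `ValueError`.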

Added in version v3.6.3.