# huggingface

Modules:

| Name | Description |
| --- | --- |
| `data_collator` | |
| `tokenization_utils` | |
| `transform` | |

Classes:

| Name | Description |
| --- | --- |
| `SGHFLM` | Eval harness model class for evaluating Stained Glass Transforms. |

## SGHFLM

Bases: `HFLM`

Eval harness model class for evaluating Stained Glass Transforms.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `transform_model_path` | `str` | The path to the Stained Glass Transform. | required |
| `apply_stainedglass` | `bool` | Whether to apply the Stained Glass Transform. | required |
| `args` | `Any` | Additional positional arguments to pass to the parent class. | `()` |
| `kwargs` | `Any` | Additional keyword arguments to pass to the parent class. | `{}` |
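For illustration, a minimal usage sketch of constructing the model and running it through lm-evaluation-harness. The import path is hypothetical (it depends on where this `huggingface` module is packaged), and the `pretrained` and `batch_size` keyword arguments are assumptions that follow the standard lm-evaluation-harness `HFLM` constructor; they are simply forwarded to the parent class via `*args`/`**kwargs`.

```python
import lm_eval

# Hypothetical import path; adjust to wherever this `huggingface` module is installed.
from stainedglass.huggingface import SGHFLM

lm = SGHFLM(
    transform_model_path="path/to/stained_glass_transform",  # the Stained Glass Transform to load
    apply_stainedglass=True,  # set False to evaluate the untransformed baseline
    # The remaining keyword arguments are forwarded to the parent HFLM class
    # (names assume the standard lm-evaluation-harness HFLM constructor).
    pretrained="meta-llama/Llama-2-7b-hf",
    batch_size=8,
)

# lm-evaluation-harness entry point; it accepts an LM instance directly.
results = lm_eval.simple_evaluate(model=lm, tasks=["hellaswag"])
```

Running the same tasks once with `apply_stainedglass=True` and once with `apply_stainedglass=False` gives a direct comparison between transformed and plain inputs.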

Methods:

| Name | Description |
| --- | --- |
| `apply_chat_template` | Return the context from the chat history because the template is applied in the `_encode_pair` method. |
| `tok_batch_encode` | Encode a batch of strings into input ids and attention masks. |

### apply_chat_template

`apply_chat_template(chat_history: list[dict[str, str]]) -> str`

Return the context from the chat history because the template is applied in the `_encode_pair` method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `chat_history` | `list[dict[str, str]]` | The chat history. | required |

Raises:

| Type | Description |
| --- | --- |
| `ValueError` | If the chat history is not a single user message. |

Returns:

| Type | Description |
| --- | --- |
| `str` | The context. |
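To make the documented contract concrete, here is a hedged sketch of the expected behavior, continuing the instance from the earlier example: a single user turn returns its context, and anything else raises `ValueError`. The exact returned string is an assumption based on the description above, not a guarantee of the implementation.

```python
# Expected to succeed: a single user message.
context = lm.apply_chat_template(
    [{"role": "user", "content": "What is the capital of France?"}]
)
# `context` is the extracted user context; the chat template itself is applied
# later, inside `_encode_pair` (per the docstring above).

# Expected to raise ValueError: more than one message / non-user roles.
lm.apply_chat_template([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello"},
])
```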

### tok_batch_encode

`tok_batch_encode(strings: list[str], padding_side: str = 'left', left_truncate_len: int | None = None, truncation: bool = False) -> tuple[torch.Tensor, torch.Tensor]`

Encode a batch of strings into input ids and attention masks.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `strings` | `list[str]` | The list of strings to encode. | required |
| `padding_side` | `str` | The side on which to pad the sequences. | `'left'` |
| `left_truncate_len` | `int \| None` | If set, the length to which sequences are truncated from the left. | `None` |
| `truncation` | `bool` | Whether to truncate the sequences. | `False` |

Returns:

| Type | Description |
| --- | --- |
| `tuple[torch.Tensor, torch.Tensor]` | A tuple of input embeddings and attention masks if `apply_stainedglass` is True, otherwise the input ids and attention masks. |
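A short usage sketch of the two return modes, again continuing the instance from the first example. The tensor shapes are illustrative assumptions (batch by sequence, plus a hidden dimension when embeddings are returned) rather than documented guarantees.

```python
first, attention_mask = lm.tok_batch_encode(
    ["The capital of France is", "The capital of Japan is"],
    padding_side="left",
)

# Assumed shapes: with apply_stainedglass=True, `first` holds input embeddings,
# e.g. (batch, seq_len, hidden_size); otherwise it holds input ids of shape
# (batch, seq_len). `attention_mask` is (batch, seq_len) in both cases.
if first.dim() == 3:
    print("Stained Glass embeddings:", tuple(first.shape))
else:
    print("input ids:", tuple(first.shape))
print("attention mask:", tuple(attention_mask.shape))
```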