# parameterizations

## CloakStandardDeviationParameterization

Bases: `ScaledStandardDeviationParameterization`
A parameterization of rhos tensors (on the domain of all real numbers) as standard deviation tensors (on the open interval from `min_scale` to `max_scale`). `min_scale` must be strictly less than `max_scale`, and both must be nonnegative real numbers.
Added in version 0.11.0.
### __init__

`__init__(scale: tuple[float, float] | Tensor = (0.0001, 2.0), shallow: float | Tensor = 1.0) -> None`
Construct a layer to perform the reparameterization.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `scale` | `tuple[float, float] \| Tensor` | The asymptotic minimum and maximum values of the parameterized standard deviations. | `(0.0001, 2.0)` |
| `shallow` | `float \| Tensor` | A temperature-like parameter that controls the spread of the parameterization: both the magnitude of the parameterized standard deviations and their rate of change with respect to rhos. | `1.0` |
Raises:

| Type | Description |
|---|---|
| `ValueError` | If … |
| `ValueError` | If … |
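This extract does not show how the squashing onto the open interval is performed. As an illustrative sketch only — the sigmoid mapping, the helper name `cloak_std`, and its exact form are assumptions, not the library's implementation — a rho on the real line can be mapped into the open interval (`min_scale`, `max_scale`), with `shallow` acting as a temperature:

```python
import math

def cloak_std(rho: float, min_scale: float = 0.0001, max_scale: float = 2.0,
              shallow: float = 1.0) -> float:
    """Hypothetical rho -> standard deviation mapping onto an open interval.

    A sigmoid squashes the real line into (0, 1); dividing rho by `shallow`
    flattens (large values) or steepens (small values) the curve, changing
    both the magnitude of the output and its rate of change in rho.
    """
    squashed = 1.0 / (1.0 + math.exp(-rho / shallow))
    return min_scale + (max_scale - min_scale) * squashed
```

Under this form the bounds are asymptotic: no finite rho produces exactly `min_scale` or `max_scale`, consistent with the open-interval wording above.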
## DirectStandardDeviationParameterization

Bases: `ScaledStandardDeviationParameterization`
A direct parameterization of rhos tensors as standard deviation tensors (clamped into the closed interval from `min_scale` to `max_scale`).
### __init__
Construct a layer to perform the reparameterization.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `scale` | `tuple[float, float] \| Tensor` | The minimum and maximum values of the parameterized standard deviations. | `(0.0001, 2.0)` |
Raises:

| Type | Description |
|---|---|
| `ValueError` | If … |
| `ValueError` | If … |
| `ValueError` | If … |
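As a minimal sketch of the difference between a clamped and an asymptotic mapping — the helper name `direct_std` and its exact form are assumptions, not the library's implementation — a direct parameterization can reach the closed endpoints exactly:

```python
def direct_std(rho: float, min_scale: float = 0.0001, max_scale: float = 2.0) -> float:
    """Hypothetical direct mapping: identity inside [min_scale, max_scale],
    with a hard clamp at the (closed, reachable) endpoints."""
    return max(min_scale, min(rho, max_scale))
```

Unlike the asymptotic bounds of `CloakStandardDeviationParameterization`, the endpoints here are attained, at the cost of a zero derivative for rhos outside the interval.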
## ScaledStandardDeviationParameterization

Bases: `StandardDeviationParameterization`
Defines the common structures necessary to parameterize rhos tensors (on the domain of all real numbers) as standard deviation tensors (on the interval from `min_scale` to `max_scale`). `min_scale` must be strictly less than `max_scale`, and both must be nonnegative real numbers.
### __init__
Construct a layer to perform the reparameterization.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `scale` | `tuple[float, float] \| Tensor` | The minimum and maximum values of the parameterized standard deviations. | `(0.0001, 2.0)` |
Raises:

| Type | Description |
|---|---|
| `ValueError` | If … |
| `ValueError` | If … |
| `ValueError` | If … |
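The specific `ValueError` conditions are truncated in this extract. A plausible validation sketch, consistent with the constraints stated above (strictly ordered, nonnegative bounds) — the helper name `validate_scale`, the third check, and the exact messages are assumptions — might look like:

```python
def validate_scale(scale: tuple[float, float]) -> None:
    """Check the documented invariants on (min_scale, max_scale)."""
    if len(scale) != 2:
        raise ValueError("scale must contain exactly (min_scale, max_scale)")
    min_scale, max_scale = scale
    if min_scale < 0 or max_scale < 0:
        raise ValueError("min_scale and max_scale must be nonnegative")
    if not min_scale < max_scale:
        raise ValueError("min_scale must be strictly less than max_scale")
```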
## StandardDeviationParameterization
Defines the interface for reparameterizing rhos tensors (on the domain of all real numbers) as the standard deviation tensors (on the domain of nonnegative real numbers) of the applied transformation.

Rhos can be learned directly or estimated as the output of a neural network.

The derivative of this parameterization defines the rate of change of the standard deviations with respect to rhos. When combined with a so-called "noise loss", which penalizes the standard deviations for deviating from the target distribution, and a task loss (e.g. classification or token prediction), this defines a complete training objective and loss landscape for transform layers.
Added in version 0.11.0.
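The interplay of the parameterization's derivative, the noise loss, and the task loss described above can be sketched numerically. Everything here — the smooth sigmoid mapping, the squared-error noise loss, and the weighting — is an illustrative assumption, not the library's actual objective:

```python
import math

def parameterize(rho: float, lo: float = 0.0001, hi: float = 2.0) -> float:
    # Hypothetical smooth rho -> standard deviation mapping (sigmoid assumed).
    return lo + (hi - lo) / (1.0 + math.exp(-rho))

def dstd_drho(rho: float, eps: float = 1e-6) -> float:
    # Central finite difference: the rate of change of the standard
    # deviation with respect to rho, i.e. how gradient signal reaches rho.
    return (parameterize(rho + eps) - parameterize(rho - eps)) / (2 * eps)

def total_loss(task_loss: float, rho: float, target_std: float,
               weight: float = 1.0) -> float:
    # The noise loss penalizes the parameterized standard deviation for
    # deviating from its target; summed with the task loss, it defines
    # the complete training objective for the transform layer.
    noise_loss = (parameterize(rho) - target_std) ** 2
    return task_loss + weight * noise_loss
```

A smooth mapping keeps `dstd_drho` nonzero everywhere, so both loss terms can always move rho; a clamped mapping would cut that gradient off outside its interval.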