#[non_exhaustive]
pub struct TextGenerationJobConfigBuilder { /* private fields */ }
A builder for TextGenerationJobConfig.
Implementations

impl TextGenerationJobConfigBuilder
pub fn completion_criteria(self, input: AutoMlJobCompletionCriteria) -> Self
How long a fine-tuning job is allowed to run. For TextGenerationJobConfig problem types, the MaxRuntimePerTrainingJobInSeconds attribute of AutoMLJobCompletionCriteria defaults to 72h (259200s).
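A minimal sketch of setting this field, assuming the aws-sdk-sagemaker crate's types module and the builder conventions shown on this page:

use aws_sdk_sagemaker::types::{AutoMlJobCompletionCriteria, TextGenerationJobConfig};

// Cap each training job at 2 hours instead of the 72-hour default.
let criteria = AutoMlJobCompletionCriteria::builder()
    .max_runtime_per_training_job_in_seconds(7_200)
    .build();

let config = TextGenerationJobConfig::builder()
    .completion_criteria(criteria)
    .build();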
pub fn set_completion_criteria(self, input: Option<AutoMlJobCompletionCriteria>) -> Self
How long a fine-tuning job is allowed to run. For TextGenerationJobConfig problem types, the MaxRuntimePerTrainingJobInSeconds attribute of AutoMLJobCompletionCriteria defaults to 72h (259200s).
pub fn get_completion_criteria(&self) -> &Option<AutoMlJobCompletionCriteria>
How long a fine-tuning job is allowed to run. For TextGenerationJobConfig problem types, the MaxRuntimePerTrainingJobInSeconds attribute of AutoMLJobCompletionCriteria defaults to 72h (259200s).
pub fn base_model_name(self, input: impl Into<String>) -> Self
The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see Text generation models supporting fine-tuning in Autopilot. If no BaseModelName is provided, the default model used is Falcon7BInstruct.
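A short sketch of setting the base model explicitly, reusing the default name mentioned above (any other value must come from the supported-models list):

let config = TextGenerationJobConfig::builder()
    .base_model_name("Falcon7BInstruct") // explicit, same as the default
    .build();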
pub fn set_base_model_name(self, input: Option<String>) -> Self
The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see Text generation models supporting fine-tuning in Autopilot. If no BaseModelName is provided, the default model used is Falcon7BInstruct.
pub fn get_base_model_name(&self) -> &Option<String>
The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see Text generation models supporting fine-tuning in Autopilot. If no BaseModelName is provided, the default model used is Falcon7BInstruct.
pub fn text_generation_hyper_parameters(self, k: impl Into<String>, v: impl Into<String>) -> Self
Adds a key-value pair to text_generation_hyper_parameters. To override the contents of this collection use set_text_generation_hyper_parameters.
The hyperparameters used to configure and optimize the learning process of the base model. You can set any combination of the following hyperparameters for all base models. For more information on each supported hyperparameter, see Optimize the learning process of your text generation models with hyperparameters.
- "epochCount": The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".
- "batchSize": The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".
- "learningRate": The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".
- "learningRateWarmupSteps": The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

Here is an example where all four hyperparameters are configured:

{ "epochCount": "5", "learningRate": "0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }
pub fn set_text_generation_hyper_parameters(self, input: Option<HashMap<String, String>>) -> Self
The hyperparameters used to configure and optimize the learning process of the base model. You can set any combination of the following hyperparameters for all base models. For more information on each supported hyperparameter, see Optimize the learning process of your text generation models with hyperparameters.
- "epochCount": The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".
- "batchSize": The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".
- "learningRate": The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".
- "learningRateWarmupSteps": The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

Here is an example where all four hyperparameters are configured:

{ "epochCount": "5", "learningRate": "0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }
pub fn get_text_generation_hyper_parameters(&self) -> &Option<HashMap<String, String>>
The hyperparameters used to configure and optimize the learning process of the base model. You can set any combination of the following hyperparameters for all base models. For more information on each supported hyperparameter, see Optimize the learning process of your text generation models with hyperparameters.
- "epochCount": The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".
- "batchSize": The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".
- "learningRate": The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".
- "learningRateWarmupSteps": The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

Here is an example where all four hyperparameters are configured:

{ "epochCount": "5", "learningRate": "0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }
pub fn build(self) -> TextGenerationJobConfig
Consumes the builder and constructs a TextGenerationJobConfig.
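An end-to-end sketch combining the fields documented above (values come from this page's examples; module paths are assumed from the aws-sdk-sagemaker crate):

use std::collections::HashMap;
use aws_sdk_sagemaker::types::{AutoMlJobCompletionCriteria, TextGenerationJobConfig};

// Build the hyperparameter map up front, then hand it to the set_ variant.
let hyper_parameters: HashMap<String, String> = [
    ("epochCount", "5"),
    ("learningRate", "0.5"),
]
.into_iter()
.map(|(k, v)| (k.to_string(), v.to_string()))
.collect();

let config = TextGenerationJobConfig::builder()
    .base_model_name("Falcon7BInstruct")
    .set_text_generation_hyper_parameters(Some(hyper_parameters))
    .completion_criteria(
        AutoMlJobCompletionCriteria::builder()
            .max_runtime_per_training_job_in_seconds(7_200)
            .build(),
    )
    .build();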
Trait Implementations

impl Clone for TextGenerationJobConfigBuilder
fn clone(&self) -> TextGenerationJobConfigBuilder
fn clone_from(&mut self, source: &Self)

Performs copy-assignment from source.
impl Default for TextGenerationJobConfigBuilder
fn default() -> TextGenerationJobConfigBuilder
impl PartialEq for TextGenerationJobConfigBuilder
fn eq(&self, other: &TextGenerationJobConfigBuilder) -> bool
Tests for self and other values to be equal, and is used by ==.