#[non_exhaustive]
pub struct TextGenerationJobConfigBuilder { /* private fields */ }
A builder for TextGenerationJobConfig.
Implementations

impl TextGenerationJobConfigBuilder
pub fn completion_criteria(self, input: AutoMlJobCompletionCriteria) -> Self

How long a fine-tuning job is allowed to run. For TextGenerationJobConfig problem types, the MaxRuntimePerTrainingJobInSeconds attribute of AutoMLJobCompletionCriteria defaults to 72h (259200s).
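For example, a fine-tuning job can be capped at 24 hours instead of the 72-hour default by passing in an AutoMlJobCompletionCriteria value. This is a minimal sketch: it assumes a recent aws-sdk-sagemaker where both types live under aws_sdk_sagemaker::types and where AutoMlJobCompletionCriteria::builder() exposes max_runtime_per_training_job_in_seconds with an infallible build().

use aws_sdk_sagemaker::types::{AutoMlJobCompletionCriteria, TextGenerationJobConfig};

// Cap each fine-tuning training job at 24 hours (86,400 seconds).
let criteria = AutoMlJobCompletionCriteria::builder()
    .max_runtime_per_training_job_in_seconds(86_400)
    .build();

let config = TextGenerationJobConfig::builder()
    .completion_criteria(criteria)
    .build();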
pub fn set_completion_criteria(self, input: Option<AutoMlJobCompletionCriteria>) -> Self

How long a fine-tuning job is allowed to run. For TextGenerationJobConfig problem types, the MaxRuntimePerTrainingJobInSeconds attribute of AutoMLJobCompletionCriteria defaults to 72h (259200s).
pub fn get_completion_criteria(&self) -> &Option<AutoMlJobCompletionCriteria>

How long a fine-tuning job is allowed to run. For TextGenerationJobConfig problem types, the MaxRuntimePerTrainingJobInSeconds attribute of AutoMLJobCompletionCriteria defaults to 72h (259200s).
pub fn base_model_name(self, input: impl Into<String>) -> Self

The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see Text generation models supporting fine-tuning in Autopilot. If no BaseModelName is provided, the default model used is Falcon7BInstruct.
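A minimal sketch of selecting a base model by name; the exact identifiers accepted for BaseModelName are listed in the Autopilot documentation referenced above, and Falcon7BInstruct is used here only because it is the documented default.

use aws_sdk_sagemaker::types::TextGenerationJobConfig;

// Explicitly request the default base model rather than relying on the fallback.
let config = TextGenerationJobConfig::builder()
    .base_model_name("Falcon7BInstruct")
    .build();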
pub fn set_base_model_name(self, input: Option<String>) -> Self

The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see Text generation models supporting fine-tuning in Autopilot. If no BaseModelName is provided, the default model used is Falcon7BInstruct.
pub fn get_base_model_name(&self) -> &Option<String>

The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see Text generation models supporting fine-tuning in Autopilot. If no BaseModelName is provided, the default model used is Falcon7BInstruct.
pub fn text_generation_hyper_parameters(self, k: impl Into<String>, v: impl Into<String>) -> Self

Adds a key-value pair to text_generation_hyper_parameters. To override the contents of this collection use set_text_generation_hyper_parameters.
The hyperparameters used to configure and optimize the learning process of the base model. You can set any combination of the following hyperparameters for all base models. For more information on each supported hyperparameter, see Optimize the learning process of your text generation models with hyperparameters.

- "epochCount": The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".
- "batchSize": The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".
- "learningRate": The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".
- "learningRateWarmupSteps": The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

Here is an example where all four hyperparameters are configured:

{ "epochCount":"5", "learningRate":"0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }
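The additive setter mirrors the JSON example above: each call inserts one key-value pair into the map. A minimal sketch, assuming TextGenerationJobConfig is in scope from aws_sdk_sagemaker::types:

use aws_sdk_sagemaker::types::TextGenerationJobConfig;

// Build the same four-hyperparameter configuration shown above, one pair at a time.
let config = TextGenerationJobConfig::builder()
    .text_generation_hyper_parameters("epochCount", "5")
    .text_generation_hyper_parameters("learningRate", "0.5")
    .text_generation_hyper_parameters("batchSize", "32")
    .text_generation_hyper_parameters("learningRateWarmupSteps", "10")
    .build();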
pub fn set_text_generation_hyper_parameters(self, input: Option<HashMap<String, String>>) -> Self

The hyperparameters used to configure and optimize the learning process of the base model. You can set any combination of the following hyperparameters for all base models. For more information on each supported hyperparameter, see Optimize the learning process of your text generation models with hyperparameters.

- "epochCount": The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".
- "batchSize": The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".
- "learningRate": The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".
- "learningRateWarmupSteps": The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

Here is an example where all four hyperparameters are configured:

{ "epochCount":"5", "learningRate":"0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }
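Because this setter replaces the whole collection, a HashMap<String, String> can be assembled first and passed in one call. A minimal sketch under the same assumptions as above:

use std::collections::HashMap;
use aws_sdk_sagemaker::types::TextGenerationJobConfig;

// Assemble the full hyperparameter map up front, then override the collection in one call.
let mut hyper_parameters = HashMap::new();
hyper_parameters.insert("epochCount".to_string(), "5".to_string());
hyper_parameters.insert("learningRate".to_string(), "0.5".to_string());
hyper_parameters.insert("batchSize".to_string(), "32".to_string());
hyper_parameters.insert("learningRateWarmupSteps".to_string(), "10".to_string());

let config = TextGenerationJobConfig::builder()
    .set_text_generation_hyper_parameters(Some(hyper_parameters))
    .build();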
pub fn get_text_generation_hyper_parameters(&self) -> &Option<HashMap<String, String>>

The hyperparameters used to configure and optimize the learning process of the base model. You can set any combination of the following hyperparameters for all base models. For more information on each supported hyperparameter, see Optimize the learning process of your text generation models with hyperparameters.

- "epochCount": The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".
- "batchSize": The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".
- "learningRate": The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".
- "learningRateWarmupSteps": The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

Here is an example where all four hyperparameters are configured:

{ "epochCount":"5", "learningRate":"0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }
pub fn model_access_config(self, input: ModelAccessConfig) -> Self

The access configuration file to control access to the ML model. You can explicitly accept the model end-user license agreement (EULA) within the ModelAccessConfig.

- If you are a JumpStart user, see the End-user license agreements section for more details on accepting the EULA.
- If you are an AutoML user, see the Optional Parameters section of Create an AutoML job to fine-tune text generation models using the API for details on how to set the EULA acceptance when fine-tuning a model using the AutoML API.
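A minimal sketch of accepting the EULA through ModelAccessConfig. It assumes ModelAccessConfig lives under aws_sdk_sagemaker::types and that its builder exposes accept_eula; the nested build() is assumed to be fallible here because AcceptEula is a required member, so check the signature in your SDK version.

use aws_sdk_sagemaker::types::{ModelAccessConfig, TextGenerationJobConfig};

// Explicitly accept the base model's end-user license agreement.
let access_config = ModelAccessConfig::builder()
    .accept_eula(true)
    .build() // assumed fallible because accept_eula is required
    .expect("accept_eula must be set");

let config = TextGenerationJobConfig::builder()
    .model_access_config(access_config)
    .build();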
pub fn set_model_access_config(self, input: Option<ModelAccessConfig>) -> Self

The access configuration file to control access to the ML model. You can explicitly accept the model end-user license agreement (EULA) within the ModelAccessConfig.

- If you are a JumpStart user, see the End-user license agreements section for more details on accepting the EULA.
- If you are an AutoML user, see the Optional Parameters section of Create an AutoML job to fine-tune text generation models using the API for details on how to set the EULA acceptance when fine-tuning a model using the AutoML API.
pub fn get_model_access_config(&self) -> &Option<ModelAccessConfig>

The access configuration file to control access to the ML model. You can explicitly accept the model end-user license agreement (EULA) within the ModelAccessConfig.

- If you are a JumpStart user, see the End-user license agreements section for more details on accepting the EULA.
- If you are an AutoML user, see the Optional Parameters section of Create an AutoML job to fine-tune text generation models using the API for details on how to set the EULA acceptance when fine-tuning a model using the AutoML API.
pub fn build(self) -> TextGenerationJobConfig

Consumes the builder and constructs a TextGenerationJobConfig.
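Putting the pieces together, a complete TextGenerationJobConfig might be assembled as follows. This is a sketch under the same assumptions as the per-method examples above (type paths under aws_sdk_sagemaker::types, infallible build() on this builder per the signature shown).

use aws_sdk_sagemaker::types::{AutoMlJobCompletionCriteria, TextGenerationJobConfig};

// Fine-tune the default base model for five epochs, capped at 24 hours per training job.
let config = TextGenerationJobConfig::builder()
    .base_model_name("Falcon7BInstruct")
    .completion_criteria(
        AutoMlJobCompletionCriteria::builder()
            .max_runtime_per_training_job_in_seconds(86_400)
            .build(),
    )
    .text_generation_hyper_parameters("epochCount", "5")
    .build();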
Trait Implementations

impl Clone for TextGenerationJobConfigBuilder

fn clone(&self) -> TextGenerationJobConfigBuilder

fn clone_from(&mut self, source: &Self)

Performs copy-assignment from source.

impl Default for TextGenerationJobConfigBuilder

fn default() -> TextGenerationJobConfigBuilder

impl PartialEq for TextGenerationJobConfigBuilder

fn eq(&self, other: &TextGenerationJobConfigBuilder) -> bool

Tests for self and other values to be equal, and is used by ==.

impl StructuralPartialEq for TextGenerationJobConfigBuilder
Auto Trait Implementations
impl Freeze for TextGenerationJobConfigBuilder
impl RefUnwindSafe for TextGenerationJobConfigBuilder
impl Send for TextGenerationJobConfigBuilder
impl Sync for TextGenerationJobConfigBuilder
impl Unpin for TextGenerationJobConfigBuilder
impl UnwindSafe for TextGenerationJobConfigBuilder
Blanket Implementations

impl<T> BorrowMut<T> for T where T: ?Sized

fn borrow_mut(&mut self) -> &mut T

impl<T> CloneToUninit for T where T: Clone

impl<T> Instrument for T

fn instrument(self, span: Span) -> Instrumented<Self>

fn in_current_span(self) -> Instrumented<Self>

impl<T> IntoEither for T

fn into_either(self, into_left: bool) -> Either<Self, Self>

Converts self into a Left variant of Either<Self, Self> if into_left is true. Converts self into a Right variant of Either<Self, Self> otherwise.

fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>

Converts self into a Left variant of Either<Self, Self> if into_left(&self) returns true. Converts self into a Right variant of Either<Self, Self> otherwise.

impl<T> Paint for T where T: ?Sized
fn fg(&self, value: Color) -> Painted<&T>

Returns a styled value derived from self with the foreground set to value.

This method should be used rarely. Instead, prefer to use color-specific builder methods like red() and green(), which have the same functionality but are pithier.

Example

Set foreground color to white using fg():

use yansi::{Paint, Color};
painted.fg(Color::White);

Set foreground color to white using white():

use yansi::Paint;
painted.white();

fn bright_black(&self) -> Painted<&T>
fn bright_red(&self) -> Painted<&T>
fn bright_green(&self) -> Painted<&T>
fn bright_yellow(&self) -> Painted<&T>
fn bright_blue(&self) -> Painted<&T>
fn bright_magenta(&self) -> Painted<&T>
fn bright_cyan(&self) -> Painted<&T>
fn bright_white(&self) -> Painted<&T>

fn bg(&self, value: Color) -> Painted<&T>

Returns a styled value derived from self with the background set to value.

This method should be used rarely. Instead, prefer to use color-specific builder methods like on_red() and on_green(), which have the same functionality but are pithier.

Example

Set background color to red using bg():

use yansi::{Paint, Color};
painted.bg(Color::Red);

Set background color to red using on_red():

use yansi::Paint;
painted.on_red();

fn on_primary(&self) -> Painted<&T>
fn on_magenta(&self) -> Painted<&T>
fn on_bright_black(&self) -> Painted<&T>
fn on_bright_red(&self) -> Painted<&T>
fn on_bright_green(&self) -> Painted<&T>
fn on_bright_yellow(&self) -> Painted<&T>
fn on_bright_blue(&self) -> Painted<&T>
fn on_bright_magenta(&self) -> Painted<&T>
fn on_bright_cyan(&self) -> Painted<&T>
fn on_bright_white(&self) -> Painted<&T>

fn attr(&self, value: Attribute) -> Painted<&T>

Enables the styling Attribute value.

This method should be used rarely. Instead, prefer to use attribute-specific builder methods like bold() and underline(), which have the same functionality but are pithier.

Example

Make text bold using attr():

use yansi::{Paint, Attribute};
painted.attr(Attribute::Bold);

Make text bold using bold():

use yansi::Paint;
painted.bold();

fn rapid_blink(&self) -> Painted<&T>

fn quirk(&self, value: Quirk) -> Painted<&T>

Enables the yansi Quirk value.

This method should be used rarely. Instead, prefer to use quirk-specific builder methods like mask() and wrap(), which have the same functionality but are pithier.

Example

Enable wrapping using quirk():

use yansi::{Paint, Quirk};
painted.quirk(Quirk::Wrap);

Enable wrapping using wrap():

use yansi::Paint;
painted.wrap();

fn clear(&self) -> Painted<&T>

Deprecated since 1.0.1: renamed to resetting() due to conflicts with Vec::clear(). The clear() method will be removed in a future release.

fn whenever(&self, value: Condition) -> Painted<&T>

Conditionally enable styling based on whether the Condition value applies. Replaces any previous condition. See the crate level docs for more details.

Example

Enable styling painted only when both stdout and stderr are TTYs:

use yansi::{Paint, Condition};
painted.red().on_yellow().whenever(Condition::STDOUTERR_ARE_TTY);