Struct google_ml1::GoogleCloudMlV1__Version

pub struct GoogleCloudMlV1__Version {
    pub error_message: Option<String>,
    pub machine_type: Option<String>,
    pub description: Option<String>,
    pub runtime_version: Option<String>,
    pub manual_scaling: Option<GoogleCloudMlV1__ManualScaling>,
    pub labels: Option<HashMap<String, String>>,
    pub framework: Option<String>,
    pub create_time: Option<String>,
    pub name: Option<String>,
    pub prediction_class: Option<String>,
    pub auto_scaling: Option<GoogleCloudMlV1__AutoScaling>,
    pub service_account: Option<String>,
    pub package_uris: Option<Vec<String>>,
    pub python_version: Option<String>,
    pub state: Option<String>,
    pub etag: Option<String>,
    pub last_use_time: Option<String>,
    pub deployment_uri: Option<String>,
    pub is_default: Option<bool>,
}

Represents a version of the model.

Each version is a trained model deployed in the cloud, ready to handle prediction requests. A model can have multiple versions. You can get information about all of the versions of a given model by calling projects.models.versions.list.
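Since every field is optional, a request value of this type is usually built by starting from Default and setting only the fields you need. The following sketch is not taken from the crate's generated examples; the field names come from the struct definition above, and the concrete values are purely illustrative.

use std::collections::HashMap;
use google_ml1::GoogleCloudMlV1__Version;

let mut labels = HashMap::new();
labels.insert("team".to_string(), "forecasting".to_string());

// Fill in only what the request needs; everything else stays None.
let version = GoogleCloudMlV1__Version {
    name: Some("v3".to_string()),
    description: Some("First TensorFlow build".to_string()),
    deployment_uri: Some("gs://my-bucket/model/".to_string()),
    runtime_version: Some("1.14".to_string()),
    python_version: Some("3.5".to_string()),
    framework: Some("TENSORFLOW".to_string()),
    labels: Some(labels),
    ..Default::default()
};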

Activities

This type is used in activities: methods you may call on this type or in which this type is involved. The list links each activity name with information about where the type is used (as part of the request or the response).

Fields

error_message: Option<String>

Output only. The details of a failure or a cancellation.

machine_type: Option<String>

Optional. The type of machine on which to serve the model. Currently only applies to online prediction service.

mls1-c1-m2: The default machine type, with 1 core and 2 GB RAM. The deprecated name for this machine type is "mls1-highmem-1".
mls1-c4-m2: In Beta. This machine type has 4 cores and 2 GB RAM. The deprecated name for this machine type is "mls1-highcpu-4".

description: Option<String>

Optional. The description specified for the version when it was created.

runtime_version: Option<String>

Optional. The AI Platform runtime version to use for this deployment. If not set, AI Platform uses the default stable version, 1.0. For more information, see the runtime version list and how to manage runtime versions.

manual_scaling: Option<GoogleCloudMlV1__ManualScaling>

Manually select the number of nodes to use for serving the model. You should generally use auto_scaling with an appropriate min_nodes instead, but this option is available if you want more predictable billing. Beware that latency and error rates will increase if traffic exceeds the capacity of the system to serve it with the selected number of nodes.
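A minimal sketch of fixed-node serving follows. It assumes GoogleCloudMlV1__ManualScaling exposes a nodes field mirroring the REST API's ManualScaling.nodes; check the crate's struct definition before relying on that name.

use google_ml1::{GoogleCloudMlV1__ManualScaling, GoogleCloudMlV1__Version};

// Assumption: ManualScaling carries a `nodes: Option<i32>` count.
let mut scaling = GoogleCloudMlV1__ManualScaling::default();
scaling.nodes = Some(2); // billed for 2 nodes at all times, regardless of traffic

let version = GoogleCloudMlV1__Version {
    manual_scaling: Some(scaling),
    ..Default::default()
};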

labels: Option<HashMap<String, String>>

Optional. One or more labels that you can add to organize your model versions. Each label is a key-value pair, where both the key and the value are arbitrary strings that you supply. For more information, see the documentation on using labels.

framework: Option<String>

Optional. The machine learning framework AI Platform uses to train this version of the model. Valid values are TENSORFLOW, SCIKIT_LEARN, XGBOOST. If you do not specify a framework, AI Platform will analyze files in the deployment_uri to determine a framework. If you choose SCIKIT_LEARN or XGBOOST, you must also set the runtime version of the model to 1.4 or greater.

Do not specify a framework if you're deploying a custom prediction routine.

create_time: Option<String>

Output only. The time the version was created.

name: Option<String>

Required. The name specified for the version when it was created.

The version name must be unique within the model it is created in.

prediction_class: Option<String>

Optional. The fully qualified name (module_name.class_name) of a class that implements the Predictor interface described in this reference field. The module containing this class should be included in a package provided to the packageUris field.

Specify this field if and only if you are deploying a custom prediction routine (beta). If you specify this field, you must set runtimeVersion to 1.4 or greater.

The following code sample provides the Predictor interface:

class Predictor(object):
    """Interface for constructing custom predictors."""

    def predict(self, instances, **kwargs):
        """Performs custom prediction.

        Instances are the decoded values from the request. They have already
        been deserialized from JSON.

        Args:
            instances: A list of prediction input instances.
            **kwargs: A dictionary of keyword args provided as additional
                fields on the predict request body.

        Returns:
            A list of outputs containing the prediction results. This list must
            be JSON serializable.
        """
        raise NotImplementedError()

    @classmethod
    def from_path(cls, model_dir):
        """Creates an instance of Predictor using the given path.

        Loading of the predictor should be done in this method.

        Args:
            model_dir: The local directory that contains the exported model
                file along with any additional files uploaded when creating the
                version resource.

        Returns:
            An instance implementing this Predictor class.
        """
        raise NotImplementedError()

Learn more about the Predictor interface and custom prediction routines.

auto_scaling: Option<GoogleCloudMlV1__AutoScaling>

Automatically scale the number of nodes used to serve the model in response to increases and decreases in traffic. Care should be taken to ramp up traffic according to the model's ability to scale, or you will start seeing increases in latency and 429 response codes.
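A matching sketch for auto-scaling, assuming GoogleCloudMlV1__AutoScaling exposes a min_nodes field mirroring the REST API's AutoScaling.minNodes; keeping a small floor of warm nodes softens the latency spikes described above.

use google_ml1::{GoogleCloudMlV1__AutoScaling, GoogleCloudMlV1__Version};

// Assumption: AutoScaling carries a `min_nodes: Option<i32>` floor.
let mut scaling = GoogleCloudMlV1__AutoScaling::default();
scaling.min_nodes = Some(1); // never scale below one warm node

let version = GoogleCloudMlV1__Version {
    auto_scaling: Some(scaling),
    ..Default::default()
};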

service_account: Option<String>

Optional. Specifies the service account for resource access control.

package_uris: Option<Vec<String>>

Optional. Cloud Storage paths (gs://…) of packages for custom prediction routines or scikit-learn pipelines with custom code.

For a custom prediction routine, one of these packages must contain your Predictor class (see predictionClass). Additionally, include any dependencies that your Predictor or scikit-learn pipeline uses and that are not already included in your selected runtime version.

If you specify this field, you must also set runtimeVersion to 1.4 or greater.
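To make the interplay concrete, here is a sketch of the fields that travel together for a custom prediction routine. The bucket paths and module name are placeholders, not values from this crate's documentation.

use google_ml1::GoogleCloudMlV1__Version;

let version = GoogleCloudMlV1__Version {
    deployment_uri: Some("gs://my-bucket/model/".to_string()),
    package_uris: Some(vec![
        "gs://my-bucket/packages/my_predictor-0.1.tar.gz".to_string(),
    ]),
    prediction_class: Some("my_predictor.MyPredictor".to_string()),
    runtime_version: Some("1.14".to_string()), // must be 1.4 or greater
    ..Default::default()
};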

python_version: Option<String>

Optional. The version of Python used in prediction. If not set, the default version is '2.7'. Python '3.5' is available when runtime_version is set to '1.4' and above. Python '2.7' works with all supported runtime versions.

state: Option<String>

Output only. The state of a version.

etag: Option<String>

etag is used for optimistic concurrency control as a way to help prevent simultaneous updates of a model from overwriting each other. It is strongly suggested that systems make use of the etag in the read-modify-write cycle to perform model updates in order to avoid race conditions: An etag is returned in the response to GetVersion, and systems are expected to put that etag in the request to UpdateVersion to ensure that their change will be applied to the model as intended.
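The read-modify-write cycle can be sketched as follows. get_version and update_version are hypothetical stand-ins for the crate's versions.get and versions.patch activities (they are not functions exported by google_ml1); the point is only that the etag read in step 1 is sent back unchanged in step 3.

use google_ml1::GoogleCloudMlV1__Version;

fn update_description(version_name: &str) -> Result<(), Box<dyn std::error::Error>> {
    // 1. Read: fetch the current resource, including its etag (hypothetical helper).
    let mut current: GoogleCloudMlV1__Version = get_version(version_name)?;

    // 2. Modify: change only the fields you intend to update; leave `etag` as read.
    current.description = Some("Updated description".to_string());

    // 3. Write: send the resource back with the original etag (hypothetical helper);
    //    the service rejects the update if someone else changed the version meanwhile.
    update_version(version_name, &current)?;
    Ok(())
}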

last_use_time: Option<String>

Output only. The time the version was last used for prediction.

deployment_uri: Option<String>

Required. The Cloud Storage location of the trained model used to create the version. See the guide to model deployment for more information.

When passing Version to projects.models.versions.create the model service uses the specified location as the source of the model. Once deployed, the model version is hosted by the prediction service, so this location is useful only as a historical record. The total number of model files can't exceed 1000.

is_default: Option<bool>

Output only. If true, this version will be used to handle prediction requests that do not specify a version.

You can change the default version by calling projects.models.versions.setDefault.

Trait Implementations

impl ResponseResult for GoogleCloudMlV1__Version

impl RequestValue for GoogleCloudMlV1__Version

impl Default for GoogleCloudMlV1__Version

impl Clone for GoogleCloudMlV1__Version

fn clone_from(&mut self, source: &Self)

Performs copy-assignment from source.

impl Debug for GoogleCloudMlV1__Version

impl Serialize for GoogleCloudMlV1__Version

impl<'de> Deserialize<'de> for GoogleCloudMlV1__Version

Auto Trait Implementations

Blanket Implementations

impl<T> ToOwned for T where
    T: Clone

type Owned = T

The resulting type after obtaining ownership.

impl<T> From<T> for T

impl<T, U> Into<U> for T where
    U: From<T>

impl<T, U> TryFrom<U> for T where
    U: Into<T>

type Error = Infallible

The type returned in the event of a conversion error.

impl<T, U> TryInto<U> for T where
    U: TryFrom<T>

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.

impl<T> BorrowMut<T> for T where
    T: ?Sized

impl<T> Borrow<T> for T where
    T: ?Sized

impl<T> Any for T where
    T: 'static + ?Sized

impl<T> Typeable for T where
    T: Any

fn get_type(&self) -> TypeId

Get the TypeId of this object.

impl<T> DeserializeOwned for T where
    T: for<'de> Deserialize<'de>