google.ai.generativelanguage.ModelServiceAsyncClient

Provides methods for getting metadata information about Generative Models.

credentials Optional[google.auth.credentials.Credentials]

The authorization credentials to attach to requests. These credentials identify the application to the service; if none are specified, the client will attempt to ascertain the credentials from the environment.

transport Union[str, ~.ModelServiceTransport]

The transport to use. If set to None, a transport is chosen automatically.

client_options ClientOptions

Custom options for the client. These options take no effect if a transport instance is provided.

(1) The api_endpoint property can be used to override the default endpoint provided by the client. The GOOGLE_API_USE_MTLS_ENDPOINT environment variable can also be used to override the endpoint: "always" (always use the default mTLS endpoint), "never" (always use the default regular endpoint), and "auto" (switch to the default mTLS endpoint if a client certificate is present; this is the default value). However, the api_endpoint property takes precedence if provided.

(2) If the GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable is "true", the client_cert_source property can be used to provide a client certificate for mutual TLS transport. If not provided, the default SSL client certificate will be used if present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not set, no client certificate will be used.

Raises
google.auth.exceptions.MutualTLSChannelError If mutual TLS transport creation failed for any reason.

transport Returns the transport used by the client instance.
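
As an illustration of the client_options behavior described above, here is a minimal sketch of constructing the client with an explicit endpoint override; the endpoint value shown is simply the default endpoint, used as a placeholder.

from google.api_core.client_options import ClientOptions
from google.ai import generativelanguage_v1beta

# api_endpoint overrides the endpoint the client talks to and takes
# precedence over the GOOGLE_API_USE_MTLS_ENDPOINT environment variable.
options = ClientOptions(api_endpoint="generativelanguage.googleapis.com")
client = generativelanguage_v1beta.ModelServiceAsyncClient(client_options=options)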

Methods

create_tuned_model

View source

Creates a tuned model. Intermediate tuning progress (if any) is accessed through the [google.longrunning.Operations] service.

Status and results can be accessed through the Operations service. Example: GET /v1/tunedModels/az2mb0bpw6i/operations/000-111-222

# This snippet has been automatically generated and should be regarded as a
# code template only.
# It will require modifications to work:
# - It may require correct/in-range values for request initialization.
# - It may require specifying regional endpoints when creating the service
#   client as shown in:
#   https://googleapis.dev/python/google-api-core/latest/client_options.html
from google.ai import generativelanguage_v1beta

async def sample_create_tuned_model():
    # Create a client
    client = generativelanguage_v1beta.ModelServiceAsyncClient()

    # Initialize request argument(s)
    tuned_model = generativelanguage_v1beta.TunedModel()
    example = generativelanguage_v1beta.TuningExample()
    example.text_input = "text_input_value"
    example.output = "output_value"
    tuned_model.tuning_task.training_data.examples.examples.append(example)

    request = generativelanguage_v1beta.CreateTunedModelRequest(
        tuned_model=tuned_model,
    )

    # Make the request
    operation = await client.create_tuned_model(request=request)

    print("Waiting for operation to complete...")

    response = await operation.result()

    # Handle the response
    print(response)

Args
request Optional[Union[google.ai.generativelanguage.CreateTunedModelRequest, dict]]

The request object. Request to create a TunedModel.

tuned_model (:class:google.ai.generativelanguage.TunedModel): Required. The tuned model to create. This corresponds to the ``tuned_model`` field on the ``request`` instance; if ``request`` is provided, this should not be set.

tuned_model_id (:class:str): Optional. The unique id for the tuned model, if specified. This value should be up to 40 characters; the first character must be a letter, and the last may be a letter or a number. The id must match the regular expression: ``[a-z]([a-z0-9-]{0,38}[a-z0-9])?``.

This corresponds to the ``tuned_model_id`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.

retry google.api_core.retry_async.AsyncRetry

Designation of what errors, if any, should be retried.

timeout float

The timeout for this request.

metadata Sequence[Tuple[str, str]]

Strings which should be sent along with the request as metadata.

Returns
google.api_core.operation_async.AsyncOperation An object representing a long-running operation.

The result type for the operation will be :class:google.ai.generativelanguage.TunedModel, a fine-tuned model created using ModelService.CreateTunedModel.

delete_tuned_model

View source

Deletes a tuned model.

# This snippet has been automatically generated and should be regarded as a
# code template only.
# It will require modifications to work:
# - It may require correct/in-range values for request initialization.
# - It may require specifying regional endpoints when creating the service
#   client as shown in:
#   https://googleapis.dev/python/google-api-core/latest/client_options.html
from google.ai import generativelanguage_v1beta

async def sample_delete_tuned_model():
    # Create a client
    client = generativelanguage_v1beta.ModelServiceAsyncClient()

    # Initialize request argument(s)
    request = generativelanguage_v1beta.DeleteTunedModelRequest(
        name="name_value",
    )

    # Make the request
    await client.delete_tuned_model(request=request)

Args
request Optional[Union[google.ai.generativelanguage.DeleteTunedModelRequest, dict]]

The request object. Request to delete a TunedModel.

name (:class:str): Required. The resource name of the model. Format: tunedModels/my-model-id

This corresponds to the ``name`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.

retry google.api_core.retry_async.AsyncRetry

Designation of what errors, if any, should be retried.

timeout float

The timeout for this request.

metadata Sequence[Tuple[str, str]]

Strings which should be sent along with the request as metadata.

from_service_account_file

View source

Creates an instance of this client using the provided credentials file.

Args
filename str

The path to the service account private key json file.

args Additional arguments to pass to the constructor.
kwargs Additional arguments to pass to the constructor.

Returns
ModelServiceAsyncClient The constructed client.

from_service_account_info

View source

Creates an instance of this client using the provided credentials info.

Args
info dict

The service account private key info.

args Additional arguments to pass to the constructor.
kwargs Additional arguments to pass to the constructor.

Returns
ModelServiceAsyncClient The constructed client.

from_service_account_json

View source

Creates an instance of this client using the provided credentials file.

Args
filename str

The path to the service account private key json file.

args Additional arguments to pass to the constructor.
kwargs Additional arguments to pass to the constructor.

Returns
ModelServiceAsyncClient The constructed client.
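
The three constructors above differ only in how the service account key is supplied. As a minimal, hedged sketch (the file path and the loaded dict are placeholders, not a real key):

import json

from google.ai import generativelanguage_v1beta

# From a service account key file on disk.
client = generativelanguage_v1beta.ModelServiceAsyncClient.from_service_account_file(
    "service-account.json"
)

# From an already-parsed key dict, e.g. loaded from a secret store.
with open("service-account.json") as f:
    info = json.load(f)

client = generativelanguage_v1beta.ModelServiceAsyncClient.from_service_account_info(info)

from_service_account_json accepts the same filename argument as from_service_account_file.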

get_model

View source

Gets information about a specific Model.

# This snippet has been automatically generated and should be regarded as a
# code template only.
# It will require modifications to work:
# - It may require correct/in-range values for request initialization.
# - It may require specifying regional endpoints when creating the service
#   client as shown in:
#   https://googleapis.dev/python/google-api-core/latest/client_options.html
from google.ai import generativelanguage_v1beta

async def sample_get_model():
    # Create a client
    client = generativelanguage_v1beta.ModelServiceAsyncClient()

    # Initialize request argument(s)
    request = generativelanguage_v1beta.GetModelRequest(
        name="name_value",
    )

    # Make the request
    response = await client.get_model(request=request)

    # Handle the response
    print(response)

Args
request Optional[Union[google.ai.generativelanguage.GetModelRequest, dict]]

The request object. Request for getting information about
a specific Model.

name (:class:str): Required. The resource name of the model.

This name should match a model name returned by the
``ListModels`` method.

Format: ``models/{model}``

This corresponds to the ``name`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.

retry google.api_core.retry_async.AsyncRetry

Designation of what errors, if any, should be retried.

timeout float

The timeout for this request.

metadata Sequence[Tuple[str, str]]

Strings which should be sent along with the request as metadata.

Returns
google.ai.generativelanguage.Model Information about a Generative Language Model.

get_mtls_endpoint_and_cert_source

View source

Return the API endpoint and client cert source for mutual TLS.

The client cert source is determined in the following order: (1) if the GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable is not "true", the client cert source is None; (2) if client_options.client_cert_source is provided, use the provided one; if the default client cert source exists, use the default one; otherwise the client cert source is None.

The API endpoint is determined in the following order: (1) if client_options.api_endpoint is provided, use the provided one; (2) if the GOOGLE_API_USE_MTLS_ENDPOINT environment variable is "always", use the default mTLS endpoint; if the environment variable is "never", use the default API endpoint; otherwise, if a client cert source exists, use the default mTLS endpoint, otherwise use the default API endpoint.

More details can be found at https://google.aip.dev/auth/4114

Args
client_options google.api_core.client_options.ClientOptions

Custom options for the client. Only the api_endpoint and client_cert_source properties may be used in this method.

Returns
Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the client cert source to use.

Raises
google.auth.exceptions.MutualTLSChannelError If any errors happen.
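
For illustration only, a sketch of calling this classmethod directly (the constructor normally invokes it internally); the empty ClientOptions is a placeholder.

from google.api_core.client_options import ClientOptions
from google.ai import generativelanguage_v1beta

options = ClientOptions()  # no api_endpoint or client_cert_source override
endpoint, cert_source = (
    generativelanguage_v1beta.ModelServiceAsyncClient.get_mtls_endpoint_and_cert_source(
        options
    )
)
# With no overrides and the mTLS environment variables unset, this should
# fall back to the default API endpoint and a None cert source.
print(endpoint, cert_source)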

get_transport_class

partial(func, *args, **keywords): a functools.partial over the underlying get_transport_class classmethod; calling it with a transport name returns the corresponding transport class.
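
As a hedged sketch of how this partial is used: calling it with a transport name returns the registered transport class (the "grpc_asyncio" label below is an assumption based on the async client's default transport).

from google.ai import generativelanguage_v1beta

# "grpc_asyncio" is assumed to be the registered label for the async gRPC transport.
transport_cls = generativelanguage_v1beta.ModelServiceAsyncClient.get_transport_class(
    "grpc_asyncio"
)
print(transport_cls)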

get_tuned_model

View source

Gets information about a specific TunedModel.

# This snippet has been automatically generated and should be regarded as a
# code template only.
# It will require modifications to work:
# - It may require correct/in-range values for request initialization.
# - It may require specifying regional endpoints when creating the service
#   client as shown in:
#   https://googleapis.dev/python/google-api-core/latest/client_options.html
from google.ai import generativelanguage_v1beta

async def sample_get_tuned_model():
    # Create a client
    client = generativelanguage_v1beta.ModelServiceAsyncClient()

    # Initialize request argument(s)
    request = generativelanguage_v1beta.GetTunedModelRequest(
        name="name_value",
    )

    # Make the request
    response = await client.get_tuned_model(request=request)

    # Handle the response
    print(response)

Args
request Optional[Union[google.ai.generativelanguage.GetTunedModelRequest, dict]]

The request object. Request for getting information about
a specific Model.

name (:class:str): Required. The resource name of the model.

Format: ``tunedModels/my-model-id``

This corresponds to the ``name`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.

retry google.api_core.retry_async.AsyncRetry

Designation of what errors, if any, should be retried.

timeout float

The timeout for this request.

metadata Sequence[Tuple[str, str]]

Strings which should be sent along with the request as metadata.

Returns
google.ai.generativelanguage.TunedModel A fine-tuned model created using ModelService.CreateTunedModel.

list_models

View source

Lists models available through the API.

# This snippet has been automatically generated and should be regarded as a
# code template only.
# It will require modifications to work:
# - It may require correct/in-range values for request initialization.
# - It may require specifying regional endpoints when creating the service
#   client as shown in:
#   https://googleapis.dev/python/google-api-core/latest/client_options.html
from google.ai import generativelanguage_v1beta

async def sample_list_models():
    # Create a client
    client = generativelanguage_v1beta.ModelServiceAsyncClient()

    # Initialize request argument(s)
    request = generativelanguage_v1beta.ListModelsRequest(
    )

    # Make the request
    page_result = await client.list_models(request=request)

    # Handle the response
    async for response in page_result:
        print(response)

Args
request Optional[Union[google.ai.generativelanguage.ListModelsRequest, dict]]

The request object. Request for listing all Models.

page_size (:class:int): The maximum number of Models to return (per page).

The service may return fewer models. If unspecified, at
most 50 models will be returned per page. This method
returns at most 1000 models per page, even if you pass a
larger page_size.

This corresponds to the ``page_size`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.

page_token (:class:str): A page token, received from a previous ListModels call.

Provide the ``page_token`` returned by one request as an
argument to the next request to retrieve the next page.

When paginating, all other parameters provided to
``ListModels`` must match the call that provided the
page token.

This corresponds to the ``page_token`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.

retry google.api_core.retry_async.AsyncRetry

Designation of what errors, if any, should be retried.

timeout float

The timeout for this request.

metadata Sequence[Tuple[str, str]]

Strings which should be sent along with the request as metadata.

Returns
google.ai.generativelanguage_v1beta.services.model_service.pagers.ListModelsAsyncPager Response from ListModels containing a paginated list of Models.

Iterating over this object will yield results and resolve additional pages automatically.

list_tuned_models

View source

Lists tuned models owned by the user.

# This snippet has been automatically generated and should be regarded as a
# code template only.
# It will require modifications to work:
# - It may require correct/in-range values for request initialization.
# - It may require specifying regional endpoints when creating the service
#   client as shown in:
#   https://googleapis.dev/python/google-api-core/latest/client_options.html
from google.ai import generativelanguage_v1beta

async def sample_list_tuned_models():
    # Create a client
    client = generativelanguage_v1beta.ModelServiceAsyncClient()

    # Initialize request argument(s)
    request = generativelanguage_v1beta.ListTunedModelsRequest(
    )

    # Make the request
    page_result = await client.list_tuned_models(request=request)

    # Handle the response
    async for response in page_result:
        print(response)

Args
request Optional[Union[google.ai.generativelanguage.ListTunedModelsRequest, dict]]

The request object. Request for listing TunedModels.

page_size (:class:int): Optional. The maximum number of TunedModels to return (per page). The service may return fewer tuned models.

If unspecified, at most 10 tuned models will be
returned. This method returns at most 1000 models per
page, even if you pass a larger page_size.

This corresponds to the ``page_size`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.

page_token (:class:str): Optional. A page token, received from a previous ListTunedModels call.

Provide the ``page_token`` returned by one request as an
argument to the next request to retrieve the next page.

When paginating, all other parameters provided to
``ListTunedModels`` must match the call that provided
the page token.

This corresponds to the ``page_token`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.

retry google.api_core.retry_async.AsyncRetry

Designation of what errors, if any, should be retried.

timeout float

The timeout for this request.

metadata Sequence[Tuple[str, str]]

Strings which should be sent along with the request as metadata.

Returns
google.ai.generativelanguage_v1beta.services.model_service.pagers.ListTunedModelsAsyncPager Response from ListTunedModels containing a paginated list of Models.

Iterating over this object will yield results and resolve additional pages automatically.

update_tuned_model

View source

Updates a tuned model.

# This snippet has been automatically generated and should be regarded as a
# code template only.
# It will require modifications to work:
# - It may require correct/in-range values for request initialization.
# - It may require specifying regional endpoints when creating the service
#   client as shown in:
#   https://googleapis.dev/python/google-api-core/latest/client_options.html
from google.ai import generativelanguage_v1beta

async def sample_update_tuned_model():
    # Create a client
    client = generativelanguage_v1beta.ModelServiceAsyncClient()

    # Initialize request argument(s)
    tuned_model = generativelanguage_v1beta.TunedModel()
    example = generativelanguage_v1beta.TuningExample()
    example.text_input = "text_input_value"
    example.output = "output_value"
    tuned_model.tuning_task.training_data.examples.examples.append(example)

    request = generativelanguage_v1beta.UpdateTunedModelRequest(
        tuned_model=tuned_model,
    )

    # Make the request
    response = await client.update_tuned_model(request=request)

    # Handle the response
    print(response)

Args
request Optional[Union[google.ai.generativelanguage.UpdateTunedModelRequest, dict]]

The request object. Request to update a TunedModel.

tuned_model (:class:google.ai.generativelanguage.TunedModel): Required. The tuned model to update. This corresponds to the ``tuned_model`` field on the ``request`` instance; if ``request`` is provided, this should not be set.

update_mask (:class:google.protobuf.field_mask_pb2.FieldMask): Required. The list of fields to update.

This corresponds to the ``update_mask`` field
on the ``request`` instance; if ``request`` is provided, this
should not be set.

retry google.api_core.retry_async.AsyncRetry

Designation of what errors, if any, should be retried.

timeout float

The timeout for this request.

metadata Sequence[Tuple[str, str]]

Strings which should be sent along with the request as metadata.

Returns
google.ai.generativelanguage.TunedModel A fine-tuned model created using ModelService.CreateTunedModel.

DEFAULT_ENDPOINT 'generativelanguage.googleapis.com'
DEFAULT_MTLS_ENDPOINT 'generativelanguage.mtls.googleapis.com'