Provides methods for getting metadata information about Generative Models.
```python
google.ai.generativelanguage.ModelServiceClient(
    *,
    credentials: Optional[ga_credentials.Credentials] = None,
    transport: Optional[Union[str, ModelServiceTransport]] = None,
    client_options: Optional[Union[client_options_lib.ClientOptions, dict]] = None,
    client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO
) -> None
```
Raises | |
---|---|
`google.auth.exceptions.MutualTLSChannelError` | If mutual TLS transport creation failed for any reason. |

Attributes | |
---|---|
`transport` | Returns the transport used by the client instance. |
Methods
create_tuned_model
```python
create_tuned_model(
    request: Optional[Union[google.ai.generativelanguage.CreateTunedModelRequest, dict]] = None,
    *,
    tuned_model: Optional[google.ai.generativelanguage.TunedModel] = None,
    tuned_model_id: Optional[str] = None,
    retry: OptionalRetry = gapic_v1.method.DEFAULT,
    timeout: Union[float, object] = gapic_v1.method.DEFAULT,
    metadata: Sequence[Tuple[str, str]] = ()
) -> operation.Operation
```
Creates a tuned model. Intermediate tuning progress (if any) is accessed through the `google.longrunning.Operations` service.
Status and results can be accessed through the Operations service. Example: `GET /v1/tunedModels/az2mb0bpw6i/operations/000-111-222`
```python
# This snippet has been automatically generated and should be regarded as a
# code template only.
# It will require modifications to work:
# - It may require correct/in-range values for request initialization.
# - It may require specifying regional endpoints when creating the service
#   client as shown in:
#   https://googleapis.dev/python/google-api-core/latest/client_options.html
from google.ai import generativelanguage_v1beta


def sample_create_tuned_model():
    # Create a client
    client = generativelanguage_v1beta.ModelServiceClient()

    # Initialize request argument(s)
    tuned_model = generativelanguage_v1beta.TunedModel()
    tuned_model.tuning_task.training_data.examples.examples.text_input = "text_input_value"
    tuned_model.tuning_task.training_data.examples.examples.output = "output_value"

    request = generativelanguage_v1beta.CreateTunedModelRequest(
        tuned_model=tuned_model,
    )

    # Make the request
    operation = client.create_tuned_model(request=request)

    print("Waiting for operation to complete...")

    response = operation.result()

    # Handle the response
    print(response)
```
Args | |
---|---|
request | `Union[google.ai.generativelanguage.CreateTunedModelRequest, dict]` The request object. Request to create a TunedModel. |
tuned_model | `google.ai.generativelanguage.TunedModel` Required. The tuned model to create. This corresponds to the `tuned_model` field on the `request` instance; if `request` is provided, this should not be set. |
tuned_model_id | `str` Optional. The unique id for the tuned model, if specified. This value should be up to 40 characters; the first character must be a letter, and the last may be a letter or a number. The id must also match a service-defined regular expression. This corresponds to the `tuned_model_id` field on the `request` instance; if `request` is provided, this should not be set. |
retry | `google.api_core.retry.Retry` Designation of what errors, if any, should be retried. |
timeout | `float` The timeout for this request. |
metadata | `Sequence[Tuple[str, str]]` Strings which should be sent along with the request as metadata. |

Returns | |
---|---|
`google.api_core.operation.Operation` | An object representing a long-running operation. The result type for the operation will be `google.ai.generativelanguage.TunedModel`. |
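The returned `operation.Operation` follows the standard long-running-operation contract: `result()` blocks until completion, while `done()` allows non-blocking progress checks. A minimal sketch of that contract, using a hypothetical stand-in class rather than a live service call (names and tick counts are illustrative only):

```python
import time


class FakeOperation:
    """Hypothetical stand-in mimicking google.api_core.operation.Operation."""

    def __init__(self, result, ticks_until_done=3):
        self._result = result
        self._ticks = ticks_until_done

    def done(self):
        # Each poll consumes one tick; the operation finishes at zero.
        self._ticks -= 1
        return self._ticks <= 0

    def result(self, poll_interval=0.0):
        # Block (by polling) until the operation completes, then return the result.
        while not self.done():
            time.sleep(poll_interval)
        return self._result


op = FakeOperation(result={"name": "tunedModels/az2mb0bpw6i"})
print(op.result())  # blocks until done, then yields the tuned model
```

A real `Operation` additionally exposes `metadata`, which is where intermediate tuning progress would surface while the tuning job runs.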
delete_tuned_model
```python
delete_tuned_model(
    request: Optional[Union[google.ai.generativelanguage.DeleteTunedModelRequest, dict]] = None,
    *,
    name: Optional[str] = None,
    retry: OptionalRetry = gapic_v1.method.DEFAULT,
    timeout: Union[float, object] = gapic_v1.method.DEFAULT,
    metadata: Sequence[Tuple[str, str]] = ()
) -> None
```
Deletes a tuned model.
```python
# This snippet has been automatically generated and should be regarded as a
# code template only.
# It will require modifications to work:
# - It may require correct/in-range values for request initialization.
# - It may require specifying regional endpoints when creating the service
#   client as shown in:
#   https://googleapis.dev/python/google-api-core/latest/client_options.html
from google.ai import generativelanguage_v1beta


def sample_delete_tuned_model():
    # Create a client
    client = generativelanguage_v1beta.ModelServiceClient()

    # Initialize request argument(s)
    request = generativelanguage_v1beta.DeleteTunedModelRequest(
        name="name_value",
    )

    # Make the request
    client.delete_tuned_model(request=request)
```
Args | |
---|---|
request | `Union[google.ai.generativelanguage.DeleteTunedModelRequest, dict]` The request object. Request to delete a TunedModel. |
name | `str` Required. The resource name of the model. This corresponds to the `name` field on the `request` instance; if `request` is provided, this should not be set. |
retry | `google.api_core.retry.Retry` Designation of what errors, if any, should be retried. |
timeout | `float` The timeout for this request. |
metadata | `Sequence[Tuple[str, str]]` Strings which should be sent along with the request as metadata. |
from_service_account_file
```python
@classmethod
from_service_account_file(filename: str, *args, **kwargs)
```
Creates an instance of this client using the provided credentials file.
Args | |
---|---|
filename | `str` The path to the service account private key json file. |
args | Additional arguments to pass to the constructor. |
kwargs | Additional arguments to pass to the constructor. |

Returns | |
---|---|
`ModelServiceClient` | The constructed client. |
from_service_account_info
```python
@classmethod
from_service_account_info(info: dict, *args, **kwargs)
```
Creates an instance of this client using the provided credentials info.
Args | |
---|---|
info | `dict` The service account private key info. |
args | Additional arguments to pass to the constructor. |
kwargs | Additional arguments to pass to the constructor. |

Returns | |
---|---|
`ModelServiceClient` | The constructed client. |
from_service_account_json
```python
@classmethod
from_service_account_json(filename: str, *args, **kwargs)
```
Creates an instance of this client using the provided credentials file.
Args | |
---|---|
filename | `str` The path to the service account private key json file. |
args | Additional arguments to pass to the constructor. |
kwargs | Additional arguments to pass to the constructor. |

Returns | |
---|---|
`ModelServiceClient` | The constructed client. |
get_model
```python
get_model(
    request: Optional[Union[google.ai.generativelanguage.GetModelRequest, dict]] = None,
    *,
    name: Optional[str] = None,
    retry: OptionalRetry = gapic_v1.method.DEFAULT,
    timeout: Union[float, object] = gapic_v1.method.DEFAULT,
    metadata: Sequence[Tuple[str, str]] = ()
) -> google.ai.generativelanguage.Model
```
Gets information about a specific Model.
```python
# This snippet has been automatically generated and should be regarded as a
# code template only.
# It will require modifications to work:
# - It may require correct/in-range values for request initialization.
# - It may require specifying regional endpoints when creating the service
#   client as shown in:
#   https://googleapis.dev/python/google-api-core/latest/client_options.html
from google.ai import generativelanguage_v1beta


def sample_get_model():
    # Create a client
    client = generativelanguage_v1beta.ModelServiceClient()

    # Initialize request argument(s)
    request = generativelanguage_v1beta.GetModelRequest(
        name="name_value",
    )

    # Make the request
    response = client.get_model(request=request)

    # Handle the response
    print(response)
```
Args | |
---|---|
request | `Union[google.ai.generativelanguage.GetModelRequest, dict]` The request object. Request for getting information about a specific Model. |
name | `str` Required. The resource name of the model. This name should match a model name returned by the `ListModels` method. This corresponds to the `name` field on the `request` instance; if `request` is provided, this should not be set. |
retry | `google.api_core.retry.Retry` Designation of what errors, if any, should be retried. |
timeout | `float` The timeout for this request. |
metadata | `Sequence[Tuple[str, str]]` Strings which should be sent along with the request as metadata. |

Returns | |
---|---|
`google.ai.generativelanguage.Model` | Information about a Generative Language Model. |
get_mtls_endpoint_and_cert_source
```python
@classmethod
get_mtls_endpoint_and_cert_source(
    client_options: Optional[client_options_lib.ClientOptions] = None
)
```
Return the API endpoint and client cert source for mutual TLS.

The client cert source is determined in the following order:
(1) if the `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the client cert source is None.
(2) if `client_options.client_cert_source` is provided, use the provided one; if the default client cert source exists, use the default one; otherwise the client cert source is None.

The API endpoint is determined in the following order:
(1) if `client_options.api_endpoint` is provided, use the provided one.
(2) if the `GOOGLE_API_USE_MTLS_ENDPOINT` environment variable is "always", use the default mTLS endpoint; if the environment variable is "never", use the default API endpoint; otherwise, if a client cert source exists, use the default mTLS endpoint; otherwise use the default API endpoint.

More details can be found at https://google.aip.dev/auth/4114
Args | |
---|---|
client_options | `google.api_core.client_options.ClientOptions` Custom options for the client. Only the `api_endpoint` and `client_cert_source` properties may be used in this method. |

Returns | |
---|---|
`Tuple[str, Callable[[], Tuple[bytes, bytes]]]` | The API endpoint and the client cert source to use. |

Raises | |
---|---|
`google.auth.exceptions.MutualTLSChannelError` | If any errors happen. |
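The endpoint precedence described above can be illustrated with a small, self-contained sketch. The helper function and its parameters are hypothetical (the real logic lives inside the generated client); only the two endpoint constants come from this page:

```python
DEFAULT_ENDPOINT = "generativelanguage.googleapis.com"
DEFAULT_MTLS_ENDPOINT = "generativelanguage.mtls.googleapis.com"


def resolve_endpoint(api_endpoint=None, use_mtls_env="auto", has_cert_source=False):
    """Mirror the documented endpoint precedence (illustrative only)."""
    # (1) An explicitly configured endpoint always wins.
    if api_endpoint is not None:
        return api_endpoint
    # (2) Otherwise the environment variable decides:
    if use_mtls_env == "always":
        return DEFAULT_MTLS_ENDPOINT
    if use_mtls_env == "never":
        return DEFAULT_ENDPOINT
    # "auto": use mTLS only when a client certificate source is available.
    return DEFAULT_MTLS_ENDPOINT if has_cert_source else DEFAULT_ENDPOINT


print(resolve_endpoint(use_mtls_env="auto", has_cert_source=True))
# → generativelanguage.mtls.googleapis.com
```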
get_tuned_model
```python
get_tuned_model(
    request: Optional[Union[google.ai.generativelanguage.GetTunedModelRequest, dict]] = None,
    *,
    name: Optional[str] = None,
    retry: OptionalRetry = gapic_v1.method.DEFAULT,
    timeout: Union[float, object] = gapic_v1.method.DEFAULT,
    metadata: Sequence[Tuple[str, str]] = ()
) -> google.ai.generativelanguage.TunedModel
```
Gets information about a specific TunedModel.
```python
# This snippet has been automatically generated and should be regarded as a
# code template only.
# It will require modifications to work:
# - It may require correct/in-range values for request initialization.
# - It may require specifying regional endpoints when creating the service
#   client as shown in:
#   https://googleapis.dev/python/google-api-core/latest/client_options.html
from google.ai import generativelanguage_v1beta


def sample_get_tuned_model():
    # Create a client
    client = generativelanguage_v1beta.ModelServiceClient()

    # Initialize request argument(s)
    request = generativelanguage_v1beta.GetTunedModelRequest(
        name="name_value",
    )

    # Make the request
    response = client.get_tuned_model(request=request)

    # Handle the response
    print(response)
```
Args | |
---|---|
request | `Union[google.ai.generativelanguage.GetTunedModelRequest, dict]` The request object. Request for getting information about a specific Model. |
name | `str` Required. The resource name of the model. This corresponds to the `name` field on the `request` instance; if `request` is provided, this should not be set. |
retry | `google.api_core.retry.Retry` Designation of what errors, if any, should be retried. |
timeout | `float` The timeout for this request. |
metadata | `Sequence[Tuple[str, str]]` Strings which should be sent along with the request as metadata. |

Returns | |
---|---|
`google.ai.generativelanguage.TunedModel` | A fine-tuned model created using ModelService.CreateTunedModel. |
list_models
```python
list_models(
    request: Optional[Union[google.ai.generativelanguage.ListModelsRequest, dict]] = None,
    *,
    page_size: Optional[int] = None,
    page_token: Optional[str] = None,
    retry: OptionalRetry = gapic_v1.method.DEFAULT,
    timeout: Union[float, object] = gapic_v1.method.DEFAULT,
    metadata: Sequence[Tuple[str, str]] = ()
) -> pagers.ListModelsPager
```
Lists models available through the API.
```python
# This snippet has been automatically generated and should be regarded as a
# code template only.
# It will require modifications to work:
# - It may require correct/in-range values for request initialization.
# - It may require specifying regional endpoints when creating the service
#   client as shown in:
#   https://googleapis.dev/python/google-api-core/latest/client_options.html
from google.ai import generativelanguage_v1beta


def sample_list_models():
    # Create a client
    client = generativelanguage_v1beta.ModelServiceClient()

    # Initialize request argument(s)
    request = generativelanguage_v1beta.ListModelsRequest()

    # Make the request
    page_result = client.list_models(request=request)

    # Handle the response
    for response in page_result:
        print(response)
```
Args | |
---|---|
request | `Union[google.ai.generativelanguage.ListModelsRequest, dict]` The request object. Request for listing all Models. |
page_size | `int` The maximum number of `Models` to return (per page). The service may return fewer models. If unspecified, at most 50 models will be returned per page. This method returns at most 1000 models per page, even if you pass a larger page_size. This corresponds to the `page_size` field on the `request` instance; if `request` is provided, this should not be set. |
page_token | `str` A page token, received from a previous `ListModels` call. Provide the `page_token` returned by one request as an argument to the next request to retrieve the next page. When paginating, all other parameters provided to `ListModels` must match the call that provided the page token. This corresponds to the `page_token` field on the `request` instance; if `request` is provided, this should not be set. |
retry | `google.api_core.retry.Retry` Designation of what errors, if any, should be retried. |
timeout | `float` The timeout for this request. |
metadata | `Sequence[Tuple[str, str]]` Strings which should be sent along with the request as metadata. |

Returns | |
---|---|
`google.ai.generativelanguage_v1beta.services.model_service.pagers.ListModelsPager` | Response from ListModel containing a paginated list of Models. Iterating over this object will yield results and resolve additional pages automatically. |
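Pagers hide the `page_token` bookkeeping: iterating the pager fetches follow-up pages on demand. A toy model of that behaviour, with a hypothetical `fetch_page` callable standing in for the RPC (all names and data here are illustrative, not part of the library):

```python
def fetch_page(page_token=""):
    """Hypothetical RPC stand-in: returns (items, next_page_token)."""
    pages = {
        "": (["model-a", "model-b"], "t1"),  # first page points at token "t1"
        "t1": (["model-c"], ""),             # empty token: no further pages
    }
    return pages[page_token]


def iterate_all(fetch):
    """Yield every item, resolving additional pages automatically."""
    token = ""
    while True:
        items, token = fetch(token)
        yield from items
        if not token:  # an empty next_page_token ends the iteration
            return


print(list(iterate_all(fetch_page)))  # → ['model-a', 'model-b', 'model-c']
```

With the real pager, passing a `page_token` explicitly (while keeping all other parameters identical) resumes listing from that page instead of the beginning.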
list_tuned_models
```python
list_tuned_models(
    request: Optional[Union[google.ai.generativelanguage.ListTunedModelsRequest, dict]] = None,
    *,
    page_size: Optional[int] = None,
    page_token: Optional[str] = None,
    retry: OptionalRetry = gapic_v1.method.DEFAULT,
    timeout: Union[float, object] = gapic_v1.method.DEFAULT,
    metadata: Sequence[Tuple[str, str]] = ()
) -> pagers.ListTunedModelsPager
```
Lists tuned models owned by the user.
```python
# This snippet has been automatically generated and should be regarded as a
# code template only.
# It will require modifications to work:
# - It may require correct/in-range values for request initialization.
# - It may require specifying regional endpoints when creating the service
#   client as shown in:
#   https://googleapis.dev/python/google-api-core/latest/client_options.html
from google.ai import generativelanguage_v1beta


def sample_list_tuned_models():
    # Create a client
    client = generativelanguage_v1beta.ModelServiceClient()

    # Initialize request argument(s)
    request = generativelanguage_v1beta.ListTunedModelsRequest()

    # Make the request
    page_result = client.list_tuned_models(request=request)

    # Handle the response
    for response in page_result:
        print(response)
```
Args | |
---|---|
request | `Union[google.ai.generativelanguage.ListTunedModelsRequest, dict]` The request object. Request for listing TunedModels. |
page_size | `int` Optional. The maximum number of `TunedModels` to return (per page). If unspecified, at most 10 tuned models will be returned. This method returns at most 1000 models per page, even if you pass a larger page_size. This corresponds to the `page_size` field on the `request` instance; if `request` is provided, this should not be set. |
page_token | `str` Optional. A page token, received from a previous `ListTunedModels` call. Provide the `page_token` returned by one request as an argument to the next request to retrieve the next page. When paginating, all other parameters provided to `ListTunedModels` must match the call that provided the page token. This corresponds to the `page_token` field on the `request` instance; if `request` is provided, this should not be set. |
retry | `google.api_core.retry.Retry` Designation of what errors, if any, should be retried. |
timeout | `float` The timeout for this request. |
metadata | `Sequence[Tuple[str, str]]` Strings which should be sent along with the request as metadata. |

Returns | |
---|---|
`google.ai.generativelanguage_v1beta.services.model_service.pagers.ListTunedModelsPager` | Response from ListTunedModels containing a paginated list of Models. Iterating over this object will yield results and resolve additional pages automatically. |
update_tuned_model
```python
update_tuned_model(
    request: Optional[Union[google.ai.generativelanguage.UpdateTunedModelRequest, dict]] = None,
    *,
    tuned_model: Optional[google.ai.generativelanguage.TunedModel] = None,
    update_mask: Optional[field_mask_pb2.FieldMask] = None,
    retry: OptionalRetry = gapic_v1.method.DEFAULT,
    timeout: Union[float, object] = gapic_v1.method.DEFAULT,
    metadata: Sequence[Tuple[str, str]] = ()
) -> google.ai.generativelanguage.TunedModel
```
Updates a tuned model.
```python
# This snippet has been automatically generated and should be regarded as a
# code template only.
# It will require modifications to work:
# - It may require correct/in-range values for request initialization.
# - It may require specifying regional endpoints when creating the service
#   client as shown in:
#   https://googleapis.dev/python/google-api-core/latest/client_options.html
from google.ai import generativelanguage_v1beta


def sample_update_tuned_model():
    # Create a client
    client = generativelanguage_v1beta.ModelServiceClient()

    # Initialize request argument(s)
    tuned_model = generativelanguage_v1beta.TunedModel()
    tuned_model.tuning_task.training_data.examples.examples.text_input = "text_input_value"
    tuned_model.tuning_task.training_data.examples.examples.output = "output_value"

    request = generativelanguage_v1beta.UpdateTunedModelRequest(
        tuned_model=tuned_model,
    )

    # Make the request
    response = client.update_tuned_model(request=request)

    # Handle the response
    print(response)
```
Args | |
---|---|
request | `Union[google.ai.generativelanguage.UpdateTunedModelRequest, dict]` The request object. Request to update a TunedModel. |
tuned_model | `google.ai.generativelanguage.TunedModel` Required. The tuned model to update. This corresponds to the `tuned_model` field on the `request` instance; if `request` is provided, this should not be set. |
update_mask | `google.protobuf.field_mask_pb2.FieldMask` Required. The list of fields to update. This corresponds to the `update_mask` field on the `request` instance; if `request` is provided, this should not be set. |
retry | `google.api_core.retry.Retry` Designation of what errors, if any, should be retried. |
timeout | `float` The timeout for this request. |
metadata | `Sequence[Tuple[str, str]]` Strings which should be sent along with the request as metadata. |

Returns | |
---|---|
`google.ai.generativelanguage.TunedModel` | A fine-tuned model created using ModelService.CreateTunedModel. |
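The `update_mask` names exactly the fields to change; fields outside the mask keep their current values. Conceptually (sketched here on plain dicts rather than protobuf messages, with hypothetical field names):

```python
def apply_update_mask(current, patch, paths):
    """Copy only the masked top-level fields from patch into current."""
    updated = dict(current)
    for path in paths:
        updated[path] = patch[path]
    return updated


model = {"display_name": "old name", "temperature": 0.7}
patch = {"display_name": "new name", "temperature": 0.2}

# Only display_name is in the mask, so temperature keeps its old value.
print(apply_update_mask(model, patch, ["display_name"]))
# → {'display_name': 'new name', 'temperature': 0.7}
```

The real `FieldMask` also supports nested paths (dotted names), which this top-level sketch deliberately omits.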
__enter__
```python
__enter__() -> 'ModelServiceClient'
```
__exit__
```python
__exit__(
    type, value, traceback
)
```
Releases underlying transport's resources.

**Warning:** ONLY use as a context manager if the transport is NOT shared with other clients! Exiting the `with` block will CLOSE the transport and may cause errors in other clients!
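The hazard described in the warning can be seen with a toy client that mirrors the `__enter__`/`__exit__` contract (all names here are illustrative, not the library's classes):

```python
class Transport:
    """Toy transport whose only state is whether it has been closed."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


class Client:
    """Toy client mirroring the context-manager contract described above."""

    def __init__(self, transport):
        self.transport = transport

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        self.transport.close()  # exiting the with block CLOSES the transport


shared = Transport()
a, b = Client(shared), Client(shared)

with a:
    pass

# b's transport is now closed too: the exact failure mode the warning describes.
print(shared.closed)  # → True
```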
Class Variables | |
---|---|
DEFAULT_ENDPOINT | `'generativelanguage.googleapis.com'` |
DEFAULT_MTLS_ENDPOINT | `'generativelanguage.mtls.googleapis.com'` |