Module: google.generativeai

A high-level client library for generative AI.


pip install google-generativeai
import google.generativeai as palm
import os



Use the palm.generate_text function to have the model complete some initial text.

response = palm.generate_text(prompt="The opposite of hot is")
print(response.result) #  'cold.'


Use the palm.chat function to have a discussion with a model:

response = palm.chat(messages=["Hello."])
print(response.last) #  'Hello! What can I help you with?'
response = response.reply("Can you tell me a joke?")


Use the model service to discover models and find out more about them:

Use palm.get_model to get details if you know a model's name:

model = palm.get_model('models/chat-bison-001') # 🦬

Use palm.list_models to discover models:

import pprint
for model in palm.list_models():
    pprint.pprint(model) # 🦎🦦🦬🦄


types module: A collection of type definitions used throughout the library.


chat(...): Calls the API and returns a types.ChatResponse containing the response.

chat_async(...): The async version of chat: calls the API asynchronously and returns a types.ChatResponse containing the response.

configure(...): Captures default client configuration.
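Before calling any of the functions below, set up the client with configure. A minimal sketch, assuming your API key is stored in an environment variable (the variable name here is an assumption; use wherever you keep your key):

```python
import os
import google.generativeai as palm

# PALM_API_KEY is a hypothetical environment variable name.
palm.configure(api_key=os.environ["PALM_API_KEY"])
```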


create_tuned_model(...): Launches a tuning job to create a TunedModel.
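Tuning is a long-running operation: create_tuned_model returns an operation object rather than the finished model. A sketch, assuming a toy training set and a model id of your choosing (both hypothetical):

```python
import google.generativeai as palm

# Hypothetical training data: teach the model to increment a number.
operation = palm.create_tuned_model(
    source_model='models/text-bison-001',
    training_data=[
        {'text_input': '1', 'output': '2'},
        {'text_input': '2', 'output': '3'},
    ],
    id='my-increment-model',  # assumption: an id you choose
)
model = operation.result()  # blocks until the tuning job finishes
```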


generate_embeddings(...): Calls the API to create an embedding for the text passed in.
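For example, to embed a single string with the PaLM-era embedding model (the model name and the shape of the result are assumptions based on this release):

```python
import google.generativeai as palm

embedding = palm.generate_embeddings(
    model='models/embedding-gecko-001',
    text='What do squirrels eat?',
)
# The result is expected to carry the vector under the 'embedding' key.
print(embedding['embedding'][:5])
```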

generate_text(...): Calls the API and returns a types.Completion containing the response.

get_base_model(...): Get the types.Model for the given base model name.

get_model(...): Given a model name, fetch the types.Model or types.TunedModel object.

get_tuned_model(...): Get the types.TunedModel for the given tuned model name.

list_models(...): Lists available models.

list_tuned_models(...): Lists available tuned models.

update_tuned_model(...): Push updates to the tuned model. Only certain attributes are updatable.
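A sketch of pushing an update, assuming a tuned model named 'tunedModels/my-increment-model' already exists (the name and the updated field are hypothetical; only mutable attributes such as the description can be changed):

```python
import google.generativeai as palm

model = palm.update_tuned_model(
    'tunedModels/my-increment-model',
    {'description': 'A model that increments numbers.'},
)
```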

version: '0.2.0'
annotations: Instance of __future__._Feature