MakerSuite quickstart

MakerSuite is a browser-based IDE for prototyping with generative language models. MakerSuite lets you quickly try out models and experiment with different prompts.

When you've built something you're happy with, you can easily export it to Python code, and call the same models using the PaLM API.
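
When you do export, the generated Python typically reduces to a single call to the PaLM API. The following is a minimal sketch, assuming the google.generativeai client library and an API key created in MakerSuite; the prompt string is only a placeholder.

import google.generativeai as palm

# Authenticate with the API key you create in MakerSuite.
palm.configure(api_key="YOUR_API_KEY")

# Send a prompt to a PaLM text model and print the top response.
completion = palm.generate_text(
    model="models/text-bison-001",
    prompt="Explain what a large language model is in one sentence.",
)
print(completion.result)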

Which prompt?

At a high level, MakerSuite provides three interfaces for prompts that are designed for different use cases.

MakerSuite homepage

The right prompt type for you depends on what you want to do.

  • Text prompts - Text prompts in MakerSuite offer a lightweight, freeform experience that lets you creatively express yourself through your prompts. Experiment with simple zero-shot prompts or with more structured few-shot prompts where you include some examples.

    MakerSuite also lets you write the same prompt in different ways. To learn how, check out the text prompt example below.

    When you are familiar with structuring some few-shot prompts, try using data prompts.

  • Data prompts - Data prompts let you construct structured few-shot prompts in a tabular format. Use them to provide a repeated input-and-output prompt structure tailored to your specific application. See the data prompt example below for a walkthrough of this prompt type.

    Get started with data prompts by writing your own or by modifying an example from the prompt gallery.

  • Chat prompts - Use chat prompts to build conversational experiences. The quickstart tutorial below shows you how to prototype a PaLM-powered chatbot in MakerSuite.

Text prompt example: Generate code for a game

This example shows you how to use text prompts to generate code in multiple languages for the same computational problem.

Note that this example builds on an existing example in the prompt gallery.

Step 1 - Create a text prompt

On the MakerSuite homepage, create a new Text prompt. Write the following prompt in the prompt window:

Write fizzbuzz in python. Comment each line.

Fizzbuzz is a group word game for children that is commonly posed as a computer science problem to assess programming ability.

Run the prompt and the model returns a solution for the Fizzbuzz problem in Python.

# This function prints the numbers from 1 to 100.
def fizz_buzz():
    for number in range(1, 101):
        # Check if the number is divisible by 3 and 5.
        if number % 3 == 0 and number % 5 == 0:
            print("FizzBuzz")
        # Check if the number is divisible by 3.
        elif number % 3 == 0:
            print("Fizz")
        # Check if the number is divisible by 5.
        elif number % 5 == 0:
            print("Buzz")
        # Otherwise, print the number.
        else:
            print(number)

# Call the fizz_buzz function.
fizz_buzz()

Step 2 - Same prompt using a test input

The Python code is correct, but what if you wanted to solve the problem in C++? You could rewrite the prompt, but that can get cumbersome for more complex examples. MakerSuite makes it easy to reuse a prompt in different ways by adding test inputs to it.

Click Insert test input in the top right corner of the text prompt field and update your prompt as shown:

Replace python with variable

Change the value of the test input in the prompt testing pane at the bottom to run the same prompt and get code in C++, JavaScript, or any other language.
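
In the exported code, the test input simply becomes a value you substitute into the prompt yourself. Here is a sketch of the same idea using ordinary Python string formatting (not a MakerSuite-specific API):

import google.generativeai as palm
palm.configure(api_key="YOUR_API_KEY")

# The test input from the prompt becomes a variable you can swap out.
language = "C++"  # try "JavaScript", "Go", and so on

completion = palm.generate_text(
    model="models/text-bison-001",
    prompt=f"Write fizzbuzz in {language}. Comment each line.",
)
print(completion.result)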

Step 3 - Experiment with model parameters

As you're prototyping your prompt, you can also play around with the model run settings by clicking the Run settings button at the bottom of the screen:

Run settings

The temperature controls how much randomness is allowed in the model's responses. Often, raising this number allows the model to produce more unexpected, "creative" responses.

You can also increase the number of outputs (the number of responses the model returns for each call) to go through multiple, alternative model responses in the testing panel.

This is also where you can adjust safety settings for text prompts. See the safety settings guide for details about the settings.
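
These run settings correspond to parameters on the underlying API call. A sketch, again assuming the google.generativeai library (exact parameter names can vary between client versions):

import google.generativeai as palm
palm.configure(api_key="YOUR_API_KEY")

completion = palm.generate_text(
    model="models/text-bison-001",
    prompt="Write fizzbuzz in python. Comment each line.",
    temperature=0.7,    # higher values allow more varied, "creative" responses
    candidate_count=3,  # request several alternative responses per call
    # safety_settings can also be passed here; see the safety settings guide
)

# Look through every returned candidate, not just the top result.
for candidate in completion.candidates:
    print(candidate["output"])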

Step 4 - Next steps

You've just prototyped your first LLM-powered application. At this stage, you can move to code by clicking the Get code button in the top right corner of the screen:

Get code

You can export your prompt as Python or JavaScript code, JSON objects, or as a cURL command. Note that you can run the Python code in Google Colaboratory.

You can also save your prompt for later use and access it from your prompt library.

Prompt library

Once you save your prompt, you can also share it with others!

Data prompt example: Build a product copy generator

Data prompts in MakerSuite combine instructions with a table of examples. The table makes examples easier to manage: you can quickly visualize, reformat, delete, and generate them.

Providing examples can help get better results from the model, especially when it's difficult to encapsulate what you want the model to do in a single instruction. For example, if you want the LLM to create ad copy for a product, it might be more efficient to provide several examples of product and copy pairs instead of trying to describe the style of copy you're looking for. This section shows you how to create such a prompt using the data prompt type in MakerSuite.

Step 1 - Create a data prompt

  1. On the MakerSuite homepage, create a new Data prompt.

  2. For data prompts, you must provide examples of inputs and outputs for the model to mimic. You can also optionally provide instructions for the model to set context or to specify the tone and style of the outputs. For this example, use the following instructions:

    You are a product marketer targeting a Gen Z audience. Create exciting and
    fresh advertising copy for products and their simple description. Keep copy
    under a few sentences long.
    
  3. Next, set up the structure of your prompt using the examples table provided. You can add more input and output columns and rename each column for readability. This example uses two columns, one for input and one for output. Rename the input column to Product: and the output column to Product copy:.

Step 2 - Add examples

Now that you have a structure for your examples, fill in the fields with examples to send to the model. You can enter examples manually or import from a file using the import data menu.

Manually enter examples

Enter your examples, one per row, in the table you created in step 1.

Data prompt examples table

After you've manually added a few examples, you can use the Generate examples feature under the Action menu to have the model generate additional examples for you.

Generate examples menu

Import examples

To import examples from a file, choose the Import examples option under the Action menu. You can import from a CSV or Google Sheets file in your Google Drive or upload from your local machine.

In the import examples dialog, you can choose which columns to import and which to leave out. The dialog also lets you specify which data column maps to which table column.

Import examples dialog

How examples are sent to the model

Under the hood, MakerSuite constructs a prompt by combining the instructions with the examples you provide. As you add more examples, these get added to the text sent to the model. Depending on how long your examples are, you may start hitting the model's token limit. All LLMs have a token limit, which is the maximum length of the text they can accept as input.

To see the complete prompt, click Text preview at the bottom of the screen to bring up the preview pane.

Preview pane

The preview pane shows how the structured examples are formatted as a single prompt that's sent to the model.
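
Conceptually, the assembled prompt is just the instructions followed by each example pair and then your new input. The sketch below illustrates the idea in Python; the separators and placeholder values are illustrative, not the exact format MakerSuite produces (check the text preview for that).

instructions = (
    "You are a product marketer targeting a Gen Z audience. Create exciting and "
    "fresh advertising copy for products and their simple description. Keep copy "
    "under a few sentences long."
)

# Each table row becomes an input/output pair appended to the prompt text.
examples = [
    ("Product: <example product 1>", "Product copy: <example copy 1>"),
    ("Product: <example product 2>", "Product copy: <example copy 2>"),
]

new_input = "Product: <your test input>"

# Instructions, examples, and the new input are concatenated into one prompt,
# so every added example also adds to the token count.
prompt = instructions + "\n\n"
for example_input, example_output in examples:
    prompt += example_input + "\n" + example_output + "\n\n"
prompt += new_input + "\nProduct copy:"

print(prompt)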

Step 3 - Test your prompt

Once you have the examples that show the model what you want, test your prompt with new input in the Test your prompt table at the bottom. As with the text prompt type, you can adjust model parameters to test if they help produce better results for your use case.

Step 4 - Next steps

Similar to the other prompt types, once you've prototyped your prompt to your satisfaction, you can use the Get code button to start coding, or save your prompt to work on later or share with others.

You can also export the examples table to a CSV file or Google Sheets to use it in a different prompt or to add to it offline. Choose the Export examples option under the Action menu to export your examples.

Chat prompt example: Build a custom chat application

If you've used a general-purpose chatbot like Bard before, you've experienced first-hand how powerful LLMs can be for open-ended dialog.

These chatbots can be used for a variety of different tasks, like creative brainstorming ("Write me a haiku about Android phones"), planning ("Make a bulleted list of what I should pack for my 3-day camping trip"), learning ("Explain quantum physics to me like I'm five"), and cooking ("generate a lasagna recipe for me").

But while these general-purpose chatbots are useful, they often need to be tailored for particular use cases. For example, maybe you want to build a customer service chatbot that only supports conversations about a company's products. Or maybe you want to build a motivational chatbot, one that always responds with positive affirmations, like this:

User: How are you doing today?

Bot: Amazing! The sun is shining! ☀️ The earth is turning! 🌎 What a glorious time
to be a chatbot!!

Beyond these specific use cases, you might just want to build a chatbot that speaks with a particular tone or style - like a bot that cracks lots of jokes, or rhymes like a poet, or uses lots of emojis in its answers. Or maybe you'd like to simulate a conversation with a character or personality, like an alien living on Jupiter's moon Europa.

Step 1 - Create a chat prompt

On the MakerSuite homepage, create a new Chat prompt.

On the left pane of the chat prompt interface, create a prompt that helps define the chatbot behavior.

LLMs excel at recognizing and replicating patterns in text. So, to create a chatbot that behaves like an alien, create prompts that contain some examples of what a hypothetical conversation between a user and an alien living on Europa might look like.

Use the User and the Model fields to provide some examples of what a conversation between a user and your chatbot might look like. The model recognizes the pattern and provides good alien responses on its own. Formally, this technique is known as few-shot prompting.

User: Hi!
Model: Hi! My name is Tim and I live on Europa, one of Jupiter's moons. Brr!
It's cold down here!

Instruct the model to behave a certain way by providing a descriptive instructional sentence in the Context field, such as:

Context: You are Tim, a friendly alien that lives on Europa, one of
Jupiter's moons.

Example prompt with context

After you've filled out an example, start testing your application by chatting with the model on the right pane of the chat prompt interface. For example, ask it what the weather is like:

Start chatting

The one example you provided in the previous step was enough to get the model to reply with an appropriate response - the weather on Europa.
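
In exported code, the Context field and your example exchange map onto the PaLM chat API roughly as follows. This is a sketch, assuming the google.generativeai library, where examples is a list of (user, model) pairs:

import google.generativeai as palm
palm.configure(api_key="YOUR_API_KEY")

response = palm.chat(
    context=(
        "You are Tim, a friendly alien that lives on Europa, one of "
        "Jupiter's moons."
    ),
    examples=[
        ("Hi!",
         "Hi! My name is Tim and I live on Europa, one of Jupiter's moons. "
         "Brr! It's cold down here!"),
    ],
    messages="What's the weather like?",
)

# The model's latest reply in the conversation.
print(response.last)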

Step 2 - Teach your bot to chat better

By providing a single example, you were able to build a basic Europa alien bot. However, a single example is usually not enough to ensure consistency and quality in the model's responses. In the previous example, the model's response to the weather question is very long and sounds like it comes out of a textbook rather than from a friendly alien.

Customize your chatbot by editing this model response to match the tone and style you want. In the top left corner of your conversation, click Add to examples to convert the test chat between you and the model (your question and the model's generated response) into an example.

Add to teaching

Edit the model's response to match the desired style and tone of your chatbot.

Edit model response

You can also use the model to add more examples to the prompt. For example, if you want to give the model an example of a user asking "What's your favorite thing about home?", you can generate a model response right here in the examples panel by clicking on the magic wand:

Generate response

Edit the model's response to get the tone and style you desire.

Continue to add examples and test how they modify the behavior of your chatbot. Typically, more examples correspond to higher quality chatbot responses.

To see the complete conversation history (that is, all the text that gets sent to the model), click Text preview at the bottom of the screen to bring up the preview pane.

Preview pane

The preview pane shows how the example text is combined with the prior conversation history text to form a single prompt that's sent to the model.

Note that the model token limit is displayed at the bottom of the preview pane.
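
In code, the same accumulation happens as you continue a chat: each reply is appended to the conversation history, and everything sent so far counts toward the token limit. A sketch, continuing with the google.generativeai library:

import google.generativeai as palm
palm.configure(api_key="YOUR_API_KEY")

response = palm.chat(
    context="You are Tim, a friendly alien that lives on Europa, one of Jupiter's moons.",
    messages="How are you doing today?",
)

# Each reply extends the same conversation history.
response = response.reply("What's your favorite thing about home?")

# All prior turns are part of what gets sent to the model.
print(response.messages)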

Step 3 - Experiment with model parameters

You can also try adjusting the model parameters to see if they produce more appropriate results for your use case.

Step 4 - Next steps

Similar to the other prompt types, once you've prototyped your prompt to your satisfaction, you can use the Get code button to start coding, or save your prompt to work on later or share with others.

Model tuning example: Create a tuned model

Prompt design strategies such as few-shot prompting may not always produce the results you need. Use model tuning to improve a model's performance on specific tasks or to help the model adhere to specific output requirements when instructions aren't sufficient and you have a set of examples that demonstrate the outputs you want.

Step 1 - Create a tuned model

On the MakerSuite homepage, create a new Tuned model.

The most important part of model tuning is the dataset you provide. If you've been testing out examples using a data prompt, you can use the drop-down to import those examples as your dataset. The Import button also gives you the option of importing data from files.

Data import dialog

See the tuning guide to learn how to prepare your dataset.
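
Whatever format you import from, the dataset boils down to pairs of example inputs and the outputs you want the tuned model to produce. Conceptually (the field names below are illustrative, not the exact schema; the tuning guide has the details):

# A tuning dataset is a collection of input/output example pairs,
# typically many more than would fit in a few-shot prompt.
training_examples = [
    {"input": "<example input 1>", "output": "<desired output 1>"},
    {"input": "<example input 2>", "output": "<desired output 2>"},
    # ...
]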

Step 2 - Monitor the tuning job

The Tune button becomes active when all the required fields are filled in. Click it to start the tuning job. You can monitor the job's status on your My Library page, and cancel it there if you need to.

Step 3 - Experiment with the tuned model

Once tuning completes, you can create text and data prompts with your tuned model. Select your model name from the Model drop-down in the Run settings.
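
You can also call the tuned model from code by using its name in place of the base model name. A sketch, assuming the google.generativeai library and a hypothetical tuned model ID:

import google.generativeai as palm
palm.configure(api_key="YOUR_API_KEY")

completion = palm.generate_text(
    model="tunedModels/my-product-copy-model",  # hypothetical tuned model name
    prompt="Product: <your test input>",
)
print(completion.result)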

Step 4 - Next steps

Once you are satisfied with your tuned model, it's ready to use from the API. If the model doesn't meet your needs, you can delete it and start a new tuning job.

Further reading