
Including more than 1 tool for a Gemini Model results in 400 - Request contains an invalid argument #3771

Open
expresspotato opened this issue May 10, 2024 · 4 comments
Labels
api: vertex-ai Issues related to the googleapis/python-aiplatform API.

Comments

@expresspotato

Environment details

  • OS type and version: Mac OS / Python

Steps to reproduce

  1. Include more than one Tool object
  2. Generate

Code example

'''
Created on May 10, 2024

@author: Kevin
'''

from django.core.management.base import BaseCommand

import vertexai
from vertexai.generative_models import (
    Content,
    FunctionDeclaration,
    GenerationConfig,
    GenerativeModel,
    Part,
    Tool,
)
from pprint import pprint

PROJECT_ID = ""
LOCATION_ID = ""
AGENT_ID = ""

MODEL_ID = 'gemini-1.5-pro-preview-0409'

class ModelFunction():
    def __init__(self):
        self.function_name = self.__class__.__name__
        self.data = {}

class MFGetCurrentWeather(ModelFunction):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.description = "Get the current weather in a given location"
        self.parameters = [("location", "string", "Location"), ]

    def set_parameter_values(self, location=None):
        self.data['location'] = location

    def get_functions_gemini(self):
        properties = {}
        for item in self.parameters: properties[item[0]] = {"type": item[1], "description": item[2]}

        return [
            FunctionDeclaration(
                name=self.function_name,
                description=self.description,
                parameters={
                    "type": "object",
                    "properties": properties,
                },
            )
        ]
    
    def get_response_gemini(self):
        return """{ "location": "Boston, MA", "temperature": 38, "description": "Partly Cloudy", "icon": "partly-cloudy", "humidity": 65, "wind": { "speed": 10, "direction": "NW" } }"""
    
class MFGetCurrentTimezone(ModelFunction):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.description = "Get the current timezone in a given location"
        self.parameters = [("location", "string", "Location"), ]

    def set_parameter_values(self, location=None):
        self.data['location'] = location

    def get_functions_gemini(self):
        properties = {}
        for item in self.parameters: properties[item[0]] = {"type": item[1], "description": item[2]}

        return [
            FunctionDeclaration(
                name=self.function_name,
                description=self.description,
                parameters={
                    "type": "object",
                    "properties": properties,
                },
            )
        ]
    
    def get_response_gemini(self):
        return """{ "location": "Boston, MA", "timeZone": {"name": "US / Eastern", "friendlyName": "Eastern Time"}} """

class Command(BaseCommand):
    help = 'Test basic VertexAI / Gemini functionality'
    
    def handle(self, *args, **options):
        print('--> Test...')

        generation_config = {
            'candidate_count': 1,
            'temperature': 0.1,
            'top_k': 5,
            'top_p': 0.90,
            'max_output_tokens': 1000,
        }

        messages = []

        prompt = 'What is the timezone in Boston?'

        messages.append({'role': 'user', 'parts': [{'text': prompt}]})

        mfgcw = MFGetCurrentWeather()
        mfgct = MFGetCurrentTimezone()

        weather_tool = Tool(function_declarations=mfgcw.get_functions_gemini())
        timezone_tool = Tool(function_declarations=mfgct.get_functions_gemini())

        # import ipdb; ipdb.set_trace();

        vertexai.init(project=PROJECT_ID, location=LOCATION_ID)
        gemini_model = GenerativeModel(MODEL_ID, system_instruction=[])

        model_response = gemini_model.generate_content(messages, tools=[timezone_tool, weather_tool], generation_config=generation_config)

        pprint(model_response)

        function_call = model_response.candidates[0].function_calls[0]
        print(f'--> Model requests function call: \n{function_call}')

        if function_call:
            if function_call.name == mfgcw.function_name:
                mfgcw.set_parameter_values(location=function_call.args['location'])
                function_response = mfgcw.get_response_gemini()
            elif function_call.name == mfgct.function_name:
                mfgct.set_parameter_values(location=function_call.args['location'])
                function_response = mfgct.get_response_gemini()
                
            # Return the API response to Gemini so it can generate a model response or request another function call
            response = gemini_model.generate_content(
                [
                    Content(role="user", parts=[Part.from_text(prompt)]),
                    model_response.candidates[0].content,  # Model turn containing the function call
                    Content(
                        parts=[
                            Part.from_function_response(
                                name=function_call.name,  # name of the function that was actually called
                                response={
                                    "content": function_response,  # Return the API response to Gemini
                                },
                            ),
                        ],
                    ),
                ],
                tools=[weather_tool],
            )

            pprint(response)
            pprint(response.candidates[0].content.parts[0].text)

Stack trace

google.api_core.exceptions.InvalidArgument: 400 Request contains an invalid argument.

I tried each tool on its own and the response is fine; the 400 only occurs when both tools are passed in the same request.
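
To make the failing case explicit, here is a minimal sketch of the two calls; it reuses gemini_model, messages, generation_config and the two Tool objects built in the repro above, so it is only illustrative:

# Works: a single Tool per request.
ok_response = gemini_model.generate_content(
    messages,
    tools=[timezone_tool],
    generation_config=generation_config,
)

# Fails: two Tool objects in the same request.
# bad_response = gemini_model.generate_content(
#     messages,
#     tools=[timezone_tool, weather_tool],  # -> 400 InvalidArgument
#     generation_config=generation_config,
# )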

@said-rasidin

I also get the same error when using Google Search grounding with the provided example, already on vertexai version 1.51.0:

import vertexai

from vertexai.generative_models import grounding
from vertexai.generative_models import GenerationConfig, GenerativeModel, Tool

# TODO(developer): Update and un-comment below line
project_id = "PROJECT ID"

vertexai.init(project=project_id, location="us-central1")

model = GenerativeModel(model_name="gemini-1.0-pro-002")

# Use Google Search for grounding
tool = Tool.from_google_search_retrieval(grounding.GoogleSearchRetrieval())

prompt = "When is the next total solar eclipse in US?"
response = model.generate_content(
    prompt,
    tools=[tool],
    generation_config=GenerationConfig(
        temperature=0.0,
    ),
)
print(response)

Error returned:

_InactiveRpcError                         Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/google/api_core/grpc_helpers.py in error_remapped_callable(*args, **kwargs)
     71         try:
---> 72             return callable_(*args, **kwargs)
     73         except grpc.RpcError as exc:

7 frames
_InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.INVALID_ARGUMENT
	details = "Request contains an invalid argument."
	debug_error_string = "UNKNOWN:Error received from peer ipv4:142.251.163.95:443 {grpc_message:"Request contains an invalid argument.", grpc_status:3, created_time:"2024-05-11T14:06:26.807301202+00:00"}"
>

The above exception was the direct cause of the following exception:

InvalidArgument                           Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/google/api_core/grpc_helpers.py in error_remapped_callable(*args, **kwargs)
     72             return callable_(*args, **kwargs)
     73         except grpc.RpcError as exc:
---> 74             raise exceptions.from_grpc_error(exc) from exc
     75 
     76     return error_remapped_callable

InvalidArgument: 400 Request contains an invalid argument.

@expresspotato
Author

> I also get the same error when using Google Search grounding with the provided example, already on vertexai version 1.51.0 […]

Your code seems valid, but I suspect you've not agreed to the terms for grounding.
See here: https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/ground-gemini

@Ark-kun
Contributor

Ark-kun commented May 14, 2024

@expresspotato Can you please try putting all function declarations in a single Tool?

P.S. AFAIK, some of the most recent models might support multiple tools.
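
For reference, a minimal sketch of that suggestion, assuming the mfgcw / mfgct helpers and the messages / generation_config objects from the repro above (untested; it only illustrates the single-Tool shape):

# Put both function declarations into one Tool instead of passing two Tool objects.
combined_tool = Tool(
    function_declarations=mfgcw.get_functions_gemini() + mfgct.get_functions_gemini()
)

model_response = gemini_model.generate_content(
    messages,
    tools=[combined_tool],  # a single Tool carrying both declarations
    generation_config=generation_config,
)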

@koverholt
Copy link

Sorry to hear about your troubles here. We've (Google DevRel) also heard reports from users who want to use different types of tools (e.g., grounding/retrieval tools along with FunctionDeclarations).

There is a related issue in the generative-ai repo at GoogleCloudPlatform/generative-ai#636, and they've opened an issue in the public issue tracker for Vertex AI here: https://issuetracker.google.com/issues/340729475. Feel free to add more details and/or +1 the latter issue in the public tracker to increase the signal to the product teams. Thanks!
