
[BUG] when running Langchain AgentExecutor, TypeError occurs - 'generator' object does not support the context manager protocol #3079

Closed
hyeonje-cho opened this issue Apr 30, 2024 · 2 comments · Fixed by #3179

@hyeonje-cho

Describe the bug
When running the langchain AgentExecutor inside a Python tool, a TypeError occurs. The same function runs fine in environments without the Python tool decorator.

How To Reproduce the bug
Here is my code.

from promptflow.core import tool
from dotenv import load_dotenv, find_dotenv
from langchain_openai import AzureChatOpenAI
from langchain.agents import create_tool_calling_agent, AgentExecutor
from langchain_community.tools import DuckDuckGoSearchRun
from promptflow.connections import AzureOpenAIConnection
from langchain import hub


@tool
def my_python_tool(question: str, openai_connect: AzureOpenAIConnection) -> str:
    load_dotenv(find_dotenv(), override=True)
    llm = AzureChatOpenAI(
        azure_deployment="gpt-35-turbo-16k",  # gpt-35-turbo-16k or gpt-4-32k
        openai_api_key=openai_connect.api_key,
        azure_endpoint=openai_connect.api_base,
        openai_api_type=openai_connect.api_type,
        openai_api_version=openai_connect.api_version,
    )
    search = DuckDuckGoSearchRun()
    tools = [search]
    prompt = hub.pull("hwchase17/openai-tools-agent")
    agent = create_tool_calling_agent(llm, tools, prompt)
    agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=False)
    result = agent_executor.invoke({"input": question})

    return result["output"]

Expected behavior
Return agent_executor's output

Screenshots

(screenshot: traceback ending in TypeError: 'generator' object does not support the context manager protocol)

Running Information(please complete the following information):

  • Promptflow Package Version using pf -v:
    • {
      "promptflow": "1.9.0",
      "promptflow-core": "1.9.0",
      "promptflow-devkit": "1.9.0",
      "promptflow-tracing": "1.9.0"
      }
  • Operating System: Windows 11
  • Python Version using python --version: python==3.12.3

Additional context
Add any other context about the problem here.

@hyeonje-cho hyeonje-cho added the bug Something isn't working label Apr 30, 2024
@guming-learning guming-learning self-assigned this May 7, 2024
@guming-learning
Contributor

Root cause:
langchain-openai wrapped its streaming code in a context manager block in this PR, which was released in langchain-openai 0.1.2 on Apr 10, 2024. That introduced the following change:

with self.client.create(messages=message_dicts, **params) as response:
    ...

However, promptflow wraps the generator output of the OpenAI API for tracing, and the wrapped generator does not implement the __enter__ and __exit__ methods, which causes the error.

We will change the wrapper to align with the original context manager behavior.

@gurvinder-dhillon

Is this released? I am still seeing the same error:
TypeError: 'generator' object does not support the context manager protocol

promptflow 1.11.0
promptflow-azure 1.11.0
promptflow-core 1.11.0
promptflow-devkit 1.11.0
promptflow-tools 1.4.0
promptflow-tracing 1.11.0
langchain-openai 0.1.7
