
Discrepancy Between Mentioned and Available Plugins in Repository #132

Closed
yiyiyi0817 opened this issue Apr 29, 2024 · 8 comments


yiyiyi0817 commented Apr 29, 2024

Hello,

I was reading through your paper, specifically section A.2.3 AUTO-SCALING PLUGINS, where it's mentioned, "This method has enabled us to introduce over 200 high-quality plugins." However, upon inspecting the GitHub repository at https://github.com/xlang-ai/OpenAgents/blob/main/real_agents/plugins_agent/plugins/plugin_names.py, I only found references to 16 plugins.

Here's a brief overview of what's listed in plugin_names.py:

from enum import Enum


class PluginName(str, Enum):
    """
    Enum class for plugin names
    each name is a plugin name 🔌 , each value is the folder name 📁 of the plugin
    """
    KLARNA = "klarna"
    ZAPIER = "zapier"
    COURSERA = "Coursera"
    JOBSEARCH = "jobsearch"
    SHOW_ME = "show_me"
    SPEAK = "speak"
    CREATE_QR_CODE = "create_qr_code"
    MAPS = "maps"
    ASKYOURPDF = "askyourpdf"
    OUTSCHOOL = "Outschool"
    NBA_STATS = "nba_stats"
    WOLFRAM = "wolfram"
    WEB_SCRAPER = "web_scraper"
    DREAMINTERPRETER = "DreamInterpreter"
    BIZTOC = "biztoc"
    XWEATHER = "XWeather"

Given the significant discrepancy between the 200+ plugins mentioned in the paper and the 16 plugins listed in the repository, I'm curious about the reason behind this. Are the additional plugins not open-sourced, or do the listed 16 plugins serve as wrappers for a broader set of functionalities?

Thank you for clarifying this discrepancy. 🙂 @Timothyxxx

Timothyxxx self-assigned this Apr 29, 2024
Timothyxxx (Contributor) commented:

Thank you for pointing it out. In our publicly released demo (which we no longer maintain, since it is costly to run), the 200+ APIs are complete, but as you mentioned, this part is missing from the code. I believe this is because we did not fully clean up the code when uploading and making it public (considering the varying quality of the APIs, many of which are constantly changing, cleaning and maintaining them takes some time). This aspect was overlooked while we were engaged in other tasks and maintaining other features. If you need it, I will find some time this week to continue the cleanup and upload it.

Tianbao

yiyiyi0817 (Author) commented:

Thank you very much for your clarification. I got it.

If you could upload the missing part, I would greatly appreciate it. I hope it's not too much trouble for you. Please feel free to upload it whenever convenient.

yiyiyi0817 (Author) commented:

> Thank you for pointing it out. In our publicly released demo (which we no longer maintain, since it is costly to run), the 200+ APIs are complete, but as you mentioned, this part is missing from the code. I believe this is because we did not fully clean up the code when uploading and making it public (considering the varying quality of the APIs, many of which are constantly changing, cleaning and maintaining them takes some time). This aspect was overlooked while we were engaged in other tasks and maintaining other features. If you need it, I will find some time this week to continue the cleanup and upload it.
>
> Tianbao

I have another question on which I would like to seek your advice. Regarding the 200+ tools you mentioned, I noticed that each one has a .py file in its corresponding path, such as the search.py file in the real_agents/plugins_agent/plugins/Coursera/paths directory.

Were these collected by you via Rapid API, or were they generated offline using the OpenAPI Generator tool from a YAML file? Does this process involve a significant amount of manual effort?

I am a developer at Camel, and I've observed that compared to other LLM agent projects, your work incorporates a notably large number of tools, which I find to be very useful. I greatly appreciate the effort and innovation your team has put into this project, and I look forward to your advice on the matter. @Timothyxxx

Ziyi Yang


Timothyxxx commented May 1, 2024

No problem, I have an ICLR conference to attend next week, so I'll try to find some time this week to organize and upload the information, and then I'll let you know~

Regarding the source of the tools, that's a good question; I suddenly realized we haven't mentioned this anywhere else. Initially, we tried to import some tools from Rapid API to expand our collection, but we found (I'm not sure if any research is currently addressing this) that the input parameters and returned data of these traditional APIs are overly complex and heavily structured. Current models struggle to plan over them and to accurately parse natural language into such inputs, and their understanding of the responses is also limited, especially with noisy data. On top of that, the API providers are generally unwilling to grant API access to individual developers rather than companies, so we could not afford the effort of wrapping these APIs.
So we actually reverse-engineered the functionality of OpenAI Plugins: we obtained the endpoints and openapi.yaml files from the providers offering Plugins APIs to OpenAI, automatically pinged them to see whether they were free, required additional verification, or had IP restrictions, and then kept the APIs that were accessible. At that time, we were able to retrieve over a hundred APIs using this method. I'm not sure whether the OpenAI Plugins ecosystem has significantly improved by now (I think they abandoned Plugins in favor of the GPTs store, seemingly to try something bigger instead), but if it has, you might be able to access even more APIs for use with Camel (I know this project has also been in touch with Guohao, haha).
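
For concreteness, the "automated ping" step described above could look roughly like the sketch below. This is not the authors' actual script; it is only an illustration of the idea using prance (the parser mentioned later in this thread) and requests, and the spec path, the GET-only restriction, and the status-code interpretation are all assumptions on my part.

# Hypothetical sketch of the "automated ping" idea: parse a plugin's
# openapi.yaml, then probe its GET endpoints to see which respond without
# extra authentication. Paths and behavior are illustrative only.
import requests
from prance import ResolvingParser


def probe_plugin(spec_path: str, timeout: float = 10.0) -> dict:
    """Map each GET path in one plugin spec to an HTTP status code or an error string."""
    spec = ResolvingParser(spec_path).specification    # resolves $ref entries in the spec
    base_url = spec.get("servers", [{}])[0].get("url", "")
    results = {}
    for path, operations in spec.get("paths", {}).items():
        if "get" not in operations:
            continue                                   # only probe simple GET endpoints
        try:
            resp = requests.get(base_url + path, timeout=timeout)
            results[path] = resp.status_code           # 200 ~ likely free; 401/403 ~ needs auth
        except requests.RequestException as exc:
            results[path] = f"error: {exc}"            # timeouts, IP restrictions, etc.
    return results


if __name__ == "__main__":
    # Placeholder spec path; substitute the openapi.yaml of the plugin you want to probe.
    print(probe_plugin("openapi.yaml"))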

Once we know the available API meta-information, we unify the APIs into the same function interface so that the prompting works (it has been a while since I last read the code; I will update this after I check it again). This can be changed according to your needs!
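
As a rough illustration of what unifying everything behind one function interface could look like (hypothetical names and placeholder URLs; not the actual OpenAgents code):

# Hypothetical sketch of a single, uniform call interface over heterogeneous
# plugin endpoints. call_plugin and PLUGIN_BASE_URLS are illustrative names,
# not taken from the OpenAgents codebase.
from typing import Any, Dict, Optional

import requests

PLUGIN_BASE_URLS: Dict[str, str] = {
    "Coursera": "https://example-coursera-plugin.invalid",  # placeholder base URL
    "klarna": "https://example-klarna-plugin.invalid",      # placeholder base URL
}


def call_plugin(plugin_name: str, path: str,
                params: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
    """One entry point for every plugin: (plugin_name, path, params) -> JSON response."""
    base_url = PLUGIN_BASE_URLS[plugin_name]
    resp = requests.get(f"{base_url}{path}", params=params or {}, timeout=15)
    resp.raise_for_status()
    return resp.json()


# Usage: the agent only ever sees this one signature, regardless of the plugin.
# call_plugin("Coursera", "/search", {"query": "machine learning"})  # path is illustrative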

Contact me if you need more information for the development on your side.

Thank you,
Tianbao

yiyiyi0817 (Author) commented:

Thank you very much for your detailed explanation; it has been incredibly helpful. 🙏🙏

Could I ask what tools or Python libraries you used to convert an openapi.yaml file into code that automatically pings the endpoints?

yiyiyi0817 (Author) commented:

> Thank you very much for your detailed explanation; it has been incredibly helpful. 🙏🙏
>
> Could I ask what tools or Python libraries you used to convert an openapi.yaml file into code that automatically pings the endpoints?

Oh, I saw the use of prance in your code: https://github.com/xlang-ai/OpenAgents/blob/main/real_agents/adapters/data_model/plugin/base.py

From my search, it also seems to be the best OpenAPI file parser in Python.
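
For reference, minimal prance usage looks something like this (a generic sketch with a placeholder file name, not the exact call in base.py):

# Load and resolve an OpenAPI spec with prance, then list its endpoint paths.
from prance import ResolvingParser

parser = ResolvingParser("openapi.yaml")    # placeholder path; resolves all $ref references
spec = parser.specification                 # plain dict of the resolved spec
print(spec["info"]["title"])
print(list(spec.get("paths", {}).keys()))   # available endpoint paths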

I think I know the answer. Thank you for your time.


github-actions bot commented May 6, 2024

This issue is stale because it has been open 3 days with no activity. Remove stale label or comment or this will be closed in 4 days.

github-actions bot added the Stale label May 6, 2024

This issue was closed because it has been stalled for 4 days with no activity.

github-actions bot closed this as not planned (stale) May 10, 2024