[Feature Request: ChatGPTClient] Add option to avoid buildPrompt and use standard messages payload #404

Open
danny-avila opened this issue Jun 4, 2023 · 0 comments
Labels
enhancement New feature or request
I might work on this myself, but I want to highlight the discrepancy in token usage between building the prompt with buildPrompt and making the standard API call with all messages as an array in the payload.

In each test, I use the same set of messages.

ChatGPTClient
initial message prompt tokens: 58
followup prompt tokens: 82
2nd followup prompt tokens: 114
final: 179

Raw API Calls
initial message prompt tokens: 9
followup prompt tokens: 32
2nd followup prompt tokens: 63
final: 127

The completions I get are very comparable either way, so the standard payload would save tokens.
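For reference, a minimal sketch of the two request shapes being compared (the model name, message contents, and prompt framing are placeholders, not the library's actual output):

```javascript
// Hypothetical illustration of the two request shapes, not ChatGPTClient's actual code.

// buildPrompt-style: the whole history is flattened into a single prompt string
// with instruction/label framing, then sent as one message. The framing text is
// what accounts for the extra prompt tokens above.
const promptPayload = {
  model: 'gpt-3.5-turbo',
  messages: [
    {
      role: 'user',
      content: 'You are ChatGPT...\n\nUser:\nHello\n\nChatGPT:\nHi! How can I help?\n\nUser:\nTell me a joke.\n\nChatGPT:\n',
    },
  ],
};

// Standard chat payload: each turn is its own message object, so the only
// per-message overhead is what the API itself adds.
const messagesPayload = {
  model: 'gpt-3.5-turbo',
  messages: [
    { role: 'user', content: 'Hello' },
    { role: 'assistant', content: 'Hi! How can I help?' },
    { role: 'user', content: 'Tell me a joke.' },
  ],
};
```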


With System Message

Note that the current buildPrompt method is more useful when you want a system message: its token usage is comparable in that case, and it often nets better results for gpt-3.5.

ChatGPTClient
initial message prompt tokens: 47
followup prompt tokens: 111
2nd followup prompt tokens: 194
final: 280

Raw API Calls
initial message prompt tokens: 34
followup prompt tokens: 98
2nd followup prompt tokens: 181
final: 267
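For completeness, in the standard payload a system message would just be the first entry in the messages array (again a hypothetical sketch, with placeholder content):

```javascript
// Hypothetical sketch: a system message carried in the standard chat payload
// as a message with role 'system', ahead of the conversation turns.
const payloadWithSystem = {
  model: 'gpt-3.5-turbo',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello' },
  ],
};
```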
