👽 (Compatibility) Make model optional to allow using TogetherAI endpoint #177
Conversation
TogetherAI's API mirrors OpenAI's, but the model is not returned in chat stream responses, so it must be made optional to avoid a decoding error.
Does this fall under scope creep? This project specifically implements the OpenAI API.
It's an extremely minor change, and most inference providers intentionally clone the OpenAI API in order to be compatible with projects such as this. I have submitted a ticket to them regarding this issue as well, but this solution will be more robust.
Maintainers will most likely ask for a cleaner pull. Is it just the streaming version of this one endpoint that you're interested in compatibility for?
@@ -124,7 +121,7 @@ public struct ChatStreamResult: Codable, Equatable {
     /// Each chunk has the same timestamp.
     public let created: TimeInterval
     /// The model to generate the completion.
-    public let model: String
+    public let model: String?
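To see why the optional matters, here is a minimal sketch (the struct name and trimmed fields are hypothetical, not the library's full `ChatStreamResult` type): with `model: String?`, a stream chunk whose JSON omits the `model` key still decodes instead of throwing `keyNotFound`.

```swift
import Foundation

// Hypothetical, trimmed stand-in for the stream chunk type.
struct ChunkSketch: Codable {
    let created: TimeInterval
    let model: String?   // optional: TogetherAI omits this field in stream chunks
}

// A TogetherAI-style chunk with no "model" key.
let json = #"{"created": 1714740000}"#.data(using: .utf8)!
let chunk = try! JSONDecoder().decode(ChunkSketch.self, from: json)
print(chunk.model == nil)  // true — decoding succeeds rather than throwing
```

With the original non-optional `model: String`, the same decode would throw `DecodingError.keyNotFound`.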
this is the only change relevant to the intent of the pull
Assumes that there is no trailing slash.
I've now added support for paths in the model host config, enabling Groq API support (api.groq.com/openai):

extension OpenAI {
    func buildURL(path: String) -> URL {
        var components = URLComponents()
        components.scheme = "https"
        // Check whether the host contains a path component
        let hostParts = configuration.host.split(separator: "/", maxSplits: 1, omittingEmptySubsequences: true)
        if hostParts.count > 1 {
            // Host contains a path component
            components.host = String(hostParts[0])
            let hostPath = "/" + hostParts[1]
            components.path = hostPath + path
        } else {
            // Host does not contain a path component
            components.host = configuration.host
            components.path = path
        }
        guard let url = components.url else {
            fatalError("Unable to construct URL from components.")
        }
        return url
    }
}
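Under the same no-trailing-slash assumption, the host-splitting logic can be exercised standalone (a sketch with a hypothetical free-function name, not the library's actual `buildURL` method):

```swift
import Foundation

// Standalone sketch of the host-splitting logic above (hypothetical helper).
func sketchURL(host: String, path: String) -> URL {
    var components = URLComponents()
    components.scheme = "https"
    let parts = host.split(separator: "/", maxSplits: 1, omittingEmptySubsequences: true)
    if parts.count > 1 {
        // Host like "api.groq.com/openai": first part is the host, rest prefixes the path.
        components.host = String(parts[0])
        components.path = "/" + String(parts[1]) + path
    } else {
        components.host = host
        components.path = path
    }
    return components.url!
}

print(sketchURL(host: "api.groq.com/openai", path: "/v1/chat/completions"))
// https://api.groq.com/openai/v1/chat/completions
print(sketchURL(host: "api.together.xyz", path: "/v1/chat/completions"))
// https://api.together.xyz/v1/chat/completions
```

A host configured with a trailing slash (e.g. "api.groq.com/openai/") would produce a double slash in the path, which is the limitation the review comment points out.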
Any update on this? I'm interested because many models can be used this way: Groq, TogetherAI, Huggingface, Azure, etc.
It works, yeah. I use it with Groq, TogetherAI, and all the LiteLLM-supported models (Anthropic mostly). My fork will be updated to maintain that as well as OpenAI; unsure if they will merge this into master here.
Branching far away from the original scope of this project now.
What
TogetherAI's API mirrors OpenAI's, but the model is not returned in chat stream responses, so it must be made optional to avoid a decoding error.
Simply added a `?` to make `model` optional, enabling use of the TogetherAI API endpoint.
Affected Areas
ChatStreamResponse.swift