Consider replacing ApiClient #1502
grammY is not going to split out its client implementation anytime soon. It doesn't make much sense from the grammY point of view. I understand that it would be cool to reuse it for Telegraf, but we can't start designing grammY around what is good for Telegraf. If anything, you can copy and, if necessary, adjust the files of the grammY client over to this repo. I won't spend any time on this; I'm rather investing it in making grammY itself better.
Yeah, would definitely be interested. Usually a bare-bones client is all that's needed if you're only using a few methods like …. As for the file handling, I think this implementation is pretty standard. NTBA has a similar implementation (supporting streams, buffers, etc.).
Yup, only useful in particular cases where you are 100% sure of no error when making the "blind" reply.
Done: https://github.com/telegraf/client. Currently ESM-only, because …. @ejnshtein, is it ok if I publish it as …?
Sure
@kamikazechaser should I try switching https://github.com/telegraf/client to https://github.com/nodejs/undici and CJS? Catch: requires Node >=16.5.
Yeah, undici > standard lib
To be clear, currently telegraf/client uses ….
Node.js 16 will be LTS in 1.5 months. Updating to Node 16 is easier and less disruptive than switching a JavaScript project to ESM.
I'll wait until nodejs/node#37340. telegraf/client is ahead of its time :)
FYI, octet-stream/form-data is CJS-only for now, so it can't depend on the latest fetch-blob, which is ESM-only.
Actually, …
Fixed all known caveats and published ….
I think I don't like how you're importing the dependencies dynamically for the first call. There's no performance gain when measuring from instantiation to first completed request. On the other hand, there's a small performance overhead for every request because you need to look up the cached module. Why even use dynamic imports in the first place? Also consider supporting arbitrary third-party API servers with a custom URL scheme, not just one that uses user+token instead of bot+token. grammY can do this. LGTM otherwise.
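The configurable-server point can be sketched roughly like this; `buildMethodUrl` and its parameters are hypothetical names for illustration, not grammY's actual API:

```javascript
// Hypothetical helper: build a Bot API method URL from an arbitrary
// root and path prefix, so third-party servers that use e.g.
// user<token> instead of bot<token> still work.
function buildMethodUrl(root, token, method, prefix = 'bot') {
  return `${root}/${prefix}${token}/${method}`;
}

// Official server, default bot<token> shape:
buildMethodUrl('https://api.telegram.org', '123:abc', 'getMe');
// Third-party server with a user<token> path:
buildMethodUrl('https://example.com/tdlight', '123:abc', 'getMe', 'user');
```

Accepting the root and prefix as plain strings is what lets the runtime support arbitrary servers without special-casing any of them.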
I don't like the dynamic imports either, but it's the only way to import ESM-only `node-fetch` from CJS. Do third-party APIs with a custom URL scheme actually exist? In 0.2.0, I'll replace …. Unsure how ….
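For context, the caching pattern being debated looks roughly like this; the sketch uses the built-in `node:util` as a stand-in for an ESM-only package such as `node-fetch@3`, and the helper names are made up:

```javascript
// Cache the import() promise so module resolution happens once per
// process; every later call just awaits the already-settled promise.
let utilPromise;

function loadUtil() {
  utilPromise ??= import('node:util'); // dynamic import() works from CJS
  return utilPromise;
}

async function demo() {
  const { format } = await loadUtil();
  return format('%s@%d', 'node-fetch', 3);
}

demo().then(console.log); // prints "node-fetch@3"
```

The per-request cost mentioned above is the `await` on the cached promise on every call; resolving the module once at construction time would remove it.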
Why do you need v3 again?
I'm not sure, I'm not aware of any. The point is rather that the runtime code supports arbitrary strings, and you're artificially constraining usage here. You don't lose anything if you allow people to use your implementation's full capabilities.

I've heard of deasync, and I lost quite some time because some dependency deep down the chain thought it was a good idea to use it … *nervous laughter that reveals the hope that you were joking*
Whaaat!?! That reminds me of an old package I wrote 4 years ago, turning async stuff into something sync-ish using a Proxy and promise chains. I posted it on Stack Overflow a long time ago: https://stackoverflow.com/a/44048287/1008999

You can use it like this:

```js
const resolve = require('lazy-resolver')

// node-fetch@3 is an ESM-only package
const fetch = resolve(import('node-fetch')).default

fetch('https://httpbin.org/get').then(console.log)
```

It even goes above and beyond, so you can actually abuse the promise chain all the way down and do crazy stuff like this:

```js
const resolve = require('lazy-resolver')
const fetch = resolve(import('node-fetch')).default

fetch('https://httpbin.org/get?items=4&items=2')
  .json() // no need to call then
  .args
  .items
  .map(n => ~~n * 4)
  .then(console.log, console.warn)
```
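A hedged reconstruction of how such a proxy-based lazy resolver can be implemented (my sketch of the idea, not the actual lazy-resolver code):

```javascript
// Wrap a promise in a Proxy: property access queues a .then that
// reads the property (binding methods to their receiver), calling
// the proxy queues a function call, and .then/.catch/.finally
// unwrap the accumulated chain.
function resolve(promise) {
  return new Proxy(function () {}, {
    get(_, prop) {
      if (prop === 'then' || prop === 'catch' || prop === 'finally') {
        return (...args) => promise[prop](...args);
      }
      return resolve(promise.then((value) => {
        const member = value[prop];
        // Bind methods so e.g. .map still sees its array as `this`.
        return typeof member === 'function' ? member.bind(value) : member;
      }));
    },
    apply(_, thisArg, args) {
      return resolve(promise.then((fn) => fn(...args)));
    },
  });
}

// The whole chain is recorded synchronously but only runs as the
// underlying promises settle:
resolve(Promise.resolve({ args: { items: ['4', '2'] } }))
  .args.items.map((n) => ~~n * 4)
  .then(console.log); // logs [ 16, 8 ]
```

Every `get` and `apply` returns a fresh wrapped promise, which is why arbitrarily long chains work without ever calling `then` in the middle.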
Magical. Honestly: if your code doesn't support CJS, then it doesn't support CJS. I wouldn't try to hack myself back into the past, that's only gonna bite you eventually. It's not a bug, it's a feature: "minimalist cutting-edge Telegram bot client built around streams and ESM" and you're good :D
I'm planning to switch to ….
Resolved in d65b938 towards v5, using native fetch; supports Node 18+ only. |
It's too early to start writing Telegraf 5, but not too early to discuss what changes it should bring.

node-fetch/node-fetch#1226 will make upgrading to `node-fetch@3` a breaking change. If Bot API starts supporting HTTP/2, https://npm.im/fetch-h2 could be used instead; https://npm.im/undici otherwise. They don't support `Agent` tho.

Whoever ends up working on Telegraf 5 could replace the entire `ApiClient` instead of just `fetch`. Current `ApiClient` depends on `sandwich-stream`, ….

I wrote my own minimal `Client`: https://gist.github.com/fdf35587d4fbb3b55a5a67a1adca32ec. It uses `FormData`, `Blob`/`File`, and `fetch`, and handles `sendDocument` and `sendMediaGroup`, so it should be able to perform any API call. I can easily turn it into an npm package if anyone is interested in using it (@kamikazechaser? yagop/node-telegram-bot-api#855).

It should be easy to extend such a small and simple client with any features necessary. Except for uploading files via a stream, since that's not supported by `FormData`: a file has to be read in full so that it can be reuploaded (node-fetch/node-fetch#1167).

I won't support `webhookReply` at all: #1250, #1082. According to @KnightNiwrem's test, `webhookReply` requests are still rate-limited, but no errors (including 429) can be reported back to the client.

Alternatively, @KnorpelSenf could turn grammY's `ApiClient` into a standalone npm package.