
Filtering of the content using transformations in the pipeline for multiple transports #1916

Open
WoZ opened this issue Mar 7, 2024 · 1 comment

Comments

WoZ commented Mar 7, 2024

Hi!

I'm looking into how to implement the following logic: I want to add a filtering feature that limits debug logs by specifying a set of modules. To demonstrate the approach, we can take the debug module as an example.

const debug = require('debug');
const debugWorkerA = debug('worker:a');
const debugWorkerB = debug('worker:b');

debugWorkerA('worker:a - debug message');
debugWorkerB('worker:b - debug message');

When you launch such a script with the env variable DEBUG=worker:a, the output will be

worker:a - debug message

If you launch with DEBUG=worker:*, the output will be

worker:a - debug message
worker:b - debug message

So, I want to use the same approach and specify which modules must be taken into account. I've found similar discussions (#206 and #208), and pino-filter was mentioned there, but pino-filter can't be used as an in-code transformer and can't be applied to every transport in the list.

What do I mean? Let's say I want to have 2 transports working in parallel:

  • pino-socket;
  • pino-slack-webhook.

But when something is completely wrong with the application, there is a need to lower the log level and start sending messages with level >= trace. By default, you will then receive debug messages from all the "modules" I have (I attach a module field via a mixin). But I don't need all of them, and the logs may be huge, with a huge impact on the production environment. So here the idea of filtering, like the debug module provides, comes into play.
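
For context, here is roughly how that module field ends up on every record (a minimal sketch, assuming pino's standard mixin option; the 'worker:a' value is just illustrative):

const pino = require('pino');

// Every record emitted by this logger carries a module field,
// which is what a filtering step would match against.
const logger = pino({
  level: 'trace',
  mixin() {
    return { module: 'worker:a' };
  }
});

logger.debug('debug message'); // record includes module: 'worker:a'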

Let's say I develop my own transport-transformer and put it before my current transports in the pipeline:

transport: {
    pipeline: [
        {
            target: 'my-pino-filter',
        },
        {
            target: 'pino-socket',
            options: {}
        },
        {
            target: '@youngkiu/pino-slack-webhook',
            options: {}
        }
    ],
}
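
For illustration, such a my-pino-filter step could look roughly like this (a sketch only, assuming pino-abstract-transport with its enablePipelining option; the modules option name is hypothetical):

// my-pino-filter.js
const build = require('pino-abstract-transport');
const { pipeline, Transform } = require('stream');

module.exports = function (opts = {}) {
  const allowed = new Set(opts.modules || []); // e.g. ['worker:a']
  return build(function (source) {
    const filter = new Transform({
      objectMode: true,
      autoDestroy: true,
      transform(chunk, enc, cb) {
        // drop records whose module field is not in the allow list
        if (!chunk.module || allowed.has(chunk.module)) {
          this.push(JSON.stringify(chunk) + '\n');
        }
        cb();
      }
    });
    pipeline(source, filter, () => {});
    return filter;
  }, { enablePipelining: true });
};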

But this will not work even if I play with a Transform stream, because both pino-socket and pino-slack-webhook implement a Writable stream. Pino just throws an exception:

TypeError [ERR_INVALID_ARG_TYPE]: The "val" argument must be an instance of Readable, Iterable, AsyncIterable, ReadableStream, or TransformStream. Received an instance of Writable.

Besides that, by design, there is no option to mix the pipeline and transform options to separate these steps.

Could you suggest any solution that allows:

  1. processing the filtering/transformation as the very first step, in the main thread or in every worker thread?
  2. applying this logic to multiple transports in parallel?

Thank you!

mcollina commented Mar 7, 2024

Currently this is not supported.

You can likely create two worker threads by using pino.transport(), and combine them with pino.multistream inside the main process.

It would be better to have a richer syntax for "pipeline" that would allow forks. A PR for that would be welcomed.
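
A minimal sketch of that approach, assuming the documented pino.transport() and pino.multistream() APIs (the transport options shown are placeholders):

const pino = require('pino');

// One worker thread per transport, created via pino.transport()...
const socketTransport = pino.transport({
  target: 'pino-socket',
  options: { /* address, port, ... */ }
});

const slackTransport = pino.transport({
  target: '@youngkiu/pino-slack-webhook',
  options: { /* webhookUrl, channel, ... */ }
});

// ...combined in the main process with pino.multistream().
const logger = pino(
  { level: 'trace' },
  pino.multistream([
    { level: 'trace', stream: socketTransport },
    { level: 'trace', stream: slackTransport }
  ])
);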
