Add postProcess plugin hook #749
Comments
I haven't fully absorbed this proposal yet, but just wanted to note that it does not have to be backward incompatible. The old/new method signature can be detected, as we've done before when evolving other plugin methods.
Ah, of course, that's great. Well, the main thing here is that I want to avoid reading from and writing to the filesystem all the time, and to get proper chaining. All the cache-busting, hashing and gzipping stuff are just examples and use cases. Oh, and while I'm at it: the …
Hmm, does autoreload-brunch depend on the files already being written to disk when it is run? after-brunch probably does. But all other plugins would benefit. So do we need …? Btw, I could try to implement this, if it is accepted.
Does it hurt performance to read+write instead of copying assets? |
Yes, we need another API, but onAfter doesn't sound great!
Yes, but I guess not much.
Why another one? If …

Doesn't … Any older … The current semantics of …
Yes. It doesn't really matter to me what we improve in the end. I just feel like lots of current plugins could get more help from brunch. I also want to develop that hash-filename plugin, but it's no fun currently. Using …
Yeah, I think it'll be hard to pipeline something that manipulates filenames; that should probably be an … A single plugin can potentially do both: an …
We must just remember that a file created by one plugin should be readable by another. For example, my "hash-filename" plugin creates app-HASH.js, which I then want gzipped by another plugin. That could be solved by letting the plugins run in a chain via a callback. Still, reading files is boring ...
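The chaining idea can be sketched like this. This is a hypothetical illustration, not brunch's actual implementation: each plugin's `onCompile` receives one shared in-memory array and a callback, so a file created by one plugin is visible to the next without touching the file system. All names here (`runPluginChain`, the two toy plugins) are made up.

```javascript
// Hypothetical sketch: run each plugin's onCompile in sequence,
// passing the same in-memory `public` array through the chain.
function runPluginChain(plugins, publicFiles, done) {
  function next(i, err) {
    if (err || i >= plugins.length) return done(err || null, publicFiles);
    plugins[i].onCompile(publicFiles, function (pluginErr) {
      next(i + 1, pluginErr);
    });
  }
  next(0, null);
}

// A hash-renaming plugin followed by a gzip-like plugin: the second
// one sees the file the first one created, still in memory.
var hashPlugin = {
  onCompile: function (publicFiles, callback) {
    publicFiles.push({ path: 'app-HASH.js', data: publicFiles[0].data });
    callback(null);
  },
};
var gzipPlugin = {
  onCompile: function (publicFiles, callback) {
    publicFiles.push({ path: 'app-HASH.js.gz', data: '<gzipped>' });
    callback(null);
  },
};

var resultPaths;
runPluginChain([hashPlugin, gzipPlugin], [{ path: 'app.js', data: 'x' }],
  function (err, files) {
    resultPaths = files.map(function (f) { return f.path; }).join(',');
  });
// resultPaths is 'app.js,app-HASH.js,app-HASH.js.gz'
```

No plugin here ever reads or writes the disk; brunch would do one final write pass at the end of the chain.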
Perhaps a bit can be solved by more documentation. For example, what's the proper way to read the public folder? |
Ok.. I hadn't really absorbed what your proposed … +1 on this concept from me if you want to go ahead and work on it (assuming @paulmillr is on board).
Why …?
So it receives all files? |
Yes, look at the first paragraph of @lydell's "Proposal" above. It receives the path and data of everything brunch was about to write to the file system, and allows the plugin to mutate it before it's written. |
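A minimal sketch of that behavior, assuming the proposed `{path, data}` shape (hypothetical code; the plugin and file names are made up for illustration):

```javascript
// Hypothetical sketch of the proposed hook: `publicFiles` holds every
// {path, data} pair brunch is about to write; the plugin may mutate
// entries (or add new ones) before anything reaches the disk.
var bannerPlugin = {
  onCompile: function (publicFiles, callback) {
    publicFiles.forEach(function (file) {
      if (/\.js$/.test(file.path)) {
        file.data = '/* built by brunch */\n' + file.data;
      }
    });
    callback(null); // null = success; pass an Error to abort the chain
  },
};

var publicFiles = [
  { path: 'app.js', data: 'code' },
  { path: 'index.html', data: '<html>' },
];
bannerPlugin.onCompile(publicFiles, function () {});
// publicFiles[0].data now starts with the banner; the HTML is untouched.
```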
Yeah great idea. |
I thought I could start working on this these past days, and continue this weekend, but things got in the way, unfortunately :(
let me know when you start - I may end up working on it in the meantime if you don't |
Will do, but don't expect anything at least before Christmas. |
I would love to see this API be added as well. I just finished digest-brunch, which implements the example above, but feels like a hack. |
I still haven't started working on this, and it looks like it won't happen soon :( |
Hello everyone! I'm facing the same issue trying to implement a plugin. Do we still want to implement this? If the answer is yes, I can take some time during the week to send a PR. |
Background
In one of my projects I've added the following to my config:

I use the after-brunch plugin to run the following command-line tools, to achieve cache busting and gzip compression for production:

- hash-filename (`npm install -g hash-filename`)
- map-replace (`npm install -g map-replace`)

That works fine for me in this case, where my public/ directory contains no subdirectories. If it did, though, it wouldn't be so easy to specify files for the commands in a way that works cross-platform.
I've developed the hash-filename and map-replace programs myself. I initially intended them to be brunch `onCompile` plugins, but in the end I found this solution much easier to program (and I needed such tools for more than brunch, so I killed two birds with one stone).

But I've read many brunch issues about cache busting and gzip compression. I guess there should be brunch plugins for that, since there seems to be great demand for it.
There actually is a gzip plugin (gzip-brunch), but just like most other `onCompile` plugins, it suffers from a few things.

Issues

`onCompile` plugin issues:

- `hash-filename` must be run before `map-replace`. Luckily, it is possible to hash synchronously, but not all modules have sync interfaces.
- … `readDirSyncRecursive(...)` …

So a change is needed, both for my use case and for `onCompile` plugins in general.

Proposal
Change `onCompile: (generatedFiles) ->` to `onCompile: (public, callback) ->`, where `public` is an array of `file`s. A `file` is a `{path: ..., data: ...}` object. `file.path` is relative to the public/ directory. `file.data` is the contents of the file. None of the files have been written yet. The `public` array contains not only generated files, but also asset files that should just be copied. In effect, the `public` array contains everything that the public/ folder will eventually contain.

All `onCompile` plugins are run asynchronously in a chain, by using `callback`. `callback` takes an error as its only parameter. If a plugin does not need to be run in the chain, it can just call `callback(null)` immediately.

Doing something for every file that matches some pattern is now really easy:
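The original snippet was lost from this copy of the issue; a sketch of the idea in JavaScript (the proposal's examples were in CoffeeScript, and the API shape is the one proposed above, not an existing brunch API):

```javascript
// Sketch (assumed API): transform every file whose path matches the
// plugin's pattern, in place, before anything is written to disk.
var plugin = {
  pattern: /\.js$/,
  onCompile: function (publicFiles, callback) {
    publicFiles
      .filter(function (file) { return plugin.pattern.test(file.path); })
      .forEach(function (file) {
        file.data += '\n// processed';
      });
    callback(null);
  },
};

var files = [
  { path: 'app.js', data: 'a' },
  { path: 'main.css', data: 'b' },
];
plugin.onCompile(files, function () {});
// files[0].data gains the comment; main.css is left alone.
```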
Or asynchronously:
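Again a hypothetical JavaScript sketch of the lost snippet, under the proposed API: each matching file is processed asynchronously, and `callback` fires only once every file is done.

```javascript
// Sketch (assumed API): the same transformation, done asynchronously.
var plugin = {
  pattern: /\.js$/,
  onCompile: function (publicFiles, callback) {
    var matching = publicFiles.filter(function (file) {
      return plugin.pattern.test(file.path);
    });
    if (matching.length === 0) return callback(null);
    var remaining = matching.length;
    matching.forEach(function (file) {
      // setImmediate stands in for real async work (hashing, gzipping, ...).
      setImmediate(function () {
        file.data += '\n// processed';
        if (--remaining === 0) callback(null);
      });
    });
  },
};
```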
`@pattern` comes from the plugin definition.

If you want to modify a file, you just modify `file.data`.

If you want to create a new file, just append it to the `public` array.

When brunch has run all `onCompile` plugins, it finally writes whatever is in the `public` array to disk.

Example
If something like this comes true, I'd like to write a hash-filename-like plugin:

Doesn't that look great? The "map.json" file produced could then be fed into a function used in some templating language. static-underscore-brunch could be used to update all paths in static HTML etc.
Notes
I've been looking through the plugin list, and my conclusion is that lots of plugins would benefit from this change. It's backwards incompatible, though, but I'd be happy to contribute updates to plugins so they use the new API. And while speaking about the API: I love how simple it is, don't you think?