groupBy (permanently?) enters a broken state when used as transducer #3232
Update: just tried the same against 0.27.2 and it worked as expected, so this can at least be narrowed down to a bug introduced in 0.28.x. Thanks!
Thank you for this detailed report.
Fixes ramda#3232. Reverts code change in #9b5d8925. Setting the accumulator when not present, rather than passing it in to reduceBy, prevents the cached-copy behavior from polluting accumulators in subsequent invocations. It may be more desirable to correct the copy caching to prevent the error while still deterministically setting the accumulator, but it's in line with traditional transducer behavior to get the initial accumulator value from the transformer, albeit not quite in this fashion.
Fixes ramda#3232. Reverts code change in #9b5d8925. Setting an absent/nullish accumulator, rather than passing an empty one to reduceBy, appears to correct both the cached-copy behavior causing pollution in the accumulators on subsequent invocations and the "passthrough"; I believe this is due to the cached shallow copying combined with the currying of `reduceBy`. It may be more desirable to correct the copy caching to prevent the error while still deterministically setting the accumulator, but for what it's worth, it's at least in line with traditional transducer behavior to get the initial value from the transformer, albeit not *quite* this way!
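To illustrate the failure mode being described, here is a toy model of my own (not Ramda's actual `reduceBy` source): a curried "reduce by" that closes over one shared accumulator object instead of creating a fresh one per invocation, so the first call mutates it and every later call starts from the polluted copy.

```javascript
// Toy sketch (assumption: not Ramda's real implementation) of how a shared
// default accumulator pollutes subsequent invocations.
const brokenReduceBy = (valueFn, acc, keyFn) => (list) =>
  list.reduce((a, x) => {
    const k = keyFn(x);
    a[k] = valueFn(k in a ? a[k] : [], x); // mutates the captured `acc`
    return a;
  }, acc); // <- the same object is reused on every invocation

const groupByLen = brokenReduceBy((group, x) => group.concat([x]), {}, (x) => x.length);

const first = groupByLen(['a', 'ab']);
const second = groupByLen(['b']);
// `second` also contains the groups accumulated by the first call; in fact
// `first` and `second` are literally the same (polluted) object.
```

Defaulting the accumulator inside the reducer (or handing out a fresh one per run) breaks this sharing, which matches the shape of the fix described above.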
@customcommander Sure thing. Mostly because I couldn't leave it well enough alone, I poked at it a bit more and noticed that reverting the most recent commit to that module fixes the issue; I went ahead and submitted a PR that does so, as well as throwing in a new test case. If you'd rather fix it on the…
Damn, do I want to stop supporting transducers altogether. The code complexity is much higher than it would otherwise be because of them.
@CrossEye Aw, I love the transducer support! Though you're not wrong that the internal implementations can be… a bit mind-bending in JS (more so than the Clojure equivalents, even), so I totally see where you're coming from. Though, to play devil's advocate for a sec, at a glance it looks like a lot of the complexity comes from adding transducer support on top of things that are actually reducers, which is sort of doing a transform + reduce on top of what is already a reduce? I was tempted to bring this up earlier, but held off since it's a bit beyond the scope of the bug. But hey, why not while it's on my mind 😀

**Leaning into transducers**

Since transducers are already transform + reduce, it might actually simplify Ramda over the current dispatcher/wrapper stuff to make transducers the core of the operators that support both usages; once you implement the transform, the normal, non-transducer equivalents are easily expressed in terms of a transducer. For example, here's `groupBy` written that way:

```js
function groupByXF(keyFn) {
  return function groupByTransformer(xform) {
    return {
      keyFn,
      xform,
      groups: new Map(),
      '@@transducer/init'() { return this.xform['@@transducer/init'](); },
      '@@transducer/step'(value, input) {
        const key = this.keyFn(input);
        if (this.groups.has(key)) {
          this.groups.get(key).push(input);
        } else {
          this.groups.set(key, [input]);
        }
        return value;
      },
      '@@transducer/result'(value) {
        for (const groupEntry of this.groups.entries()) {
          value = this.xform['@@transducer/step'](value, groupEntry);
          if (value && value['@@transducer/reduced']) {
            value = value['@@transducer/value'];
            break;
          }
        }
        return this.xform['@@transducer/result'](value);
      }
    };
  };
}

export function groupBy(keyFn, col) {
  if (arguments.length > 1) {
    // you can optimize a bit more than this by overriding the step/result
    // functions instead of passing to `into`, which would skip the final
    // "copy the groups to the object", but I apparently didn't bother when
    // I wrote this!
    return into({}, groupByXF(keyFn), col);
  } else {
    return groupByXF(keyFn);
  }
}
```

That's the whole thing! Which allows stuff like:

```js
groupBy(x => `length ${x.length}`, ['a', 'ab', 'abc']);
// => { 'length 1': [ 'a' ], 'length 2': [ 'ab' ], 'length 3': [ 'abc' ] }

into([], compose(groupBy(x => x.length), take(2)), ['a', 'ab', 'abc']);
// => [ [ 1, [ 'a' ] ], [ 2, [ 'ab' ] ] ]
```
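For readers unfamiliar with the `@@transducer/*` protocol the sketch above relies on, here's a minimal hand-rolled version of the loop that something like `into`/`transduce` runs (my own simplification, not Ramda's implementation):

```javascript
// Minimal transduce loop (illustrative simplification, not Ramda's code).
// `xf` wraps the bottom transformer `rf`; the loop threads the accumulator
// through @@transducer/step and honors early termination via "reduced".
const transduce = (xf, rf, init, xs) => {
  const xform = xf(rf);
  let acc = init;
  for (const x of xs) {
    acc = xform['@@transducer/step'](acc, x);
    if (acc && acc['@@transducer/reduced']) {
      acc = acc['@@transducer/value'];
      break;
    }
  }
  return xform['@@transducer/result'](acc);
};

// A bottom transformer that appends to an array, plus a trivial map transducer.
const appendXF = {
  '@@transducer/init': () => [],
  '@@transducer/step': (acc, x) => { acc.push(x); return acc; },
  '@@transducer/result': (acc) => acc,
};
const mapXF = (f) => (xform) => ({
  '@@transducer/init': () => xform['@@transducer/init'](),
  '@@transducer/step': (acc, x) => xform['@@transducer/step'](acc, f(x)),
  '@@transducer/result': (acc) => xform['@@transducer/result'](acc),
});

transduce(mapXF((x) => x * 2), appendXF, [], [1, 2, 3]);
// => [2, 4, 6]
```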
...that said, for reduce applications like these, you could probably get away with that too? It can be kinda nice, though. For one thing, you don't necessarily need to coerce the key to a string if you aggregate the groups using a Map or similar...
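A quick sketch of that last point (illustrative, not Ramda code): aggregating into a Map keeps keys un-coerced, so e.g. numeric keys stay numbers instead of becoming object property strings.

```javascript
// Grouping into a Map preserves the key's type and compares keys by
// SameValueZero rather than by their string representation.
const groupByMap = (keyFn, xs) => {
  const groups = new Map();
  for (const x of xs) {
    const k = keyFn(x);
    if (groups.has(k)) groups.get(k).push(x);
    else groups.set(k, [x]);
  }
  return groups;
};

const byLength = groupByMap((s) => s.length, ['a', 'ab', 'abc', 'cd']);
// => Map { 1 => ['a'], 2 => ['ab', 'cd'], 3 => ['abc'] }  (keys are numbers)
```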
That's fascinating stuff. The last few days I've been trying to think of entirely different ways to manage this. What I would really like to find is some way to make…
While `groupBy` is one of the Ramda functions documented as being usable as a transducer, and I believe this worked years ago when I last tried it, it now appears to pass every item through into every group list. After this, every subsequent use of `groupBy` (as a transducer or otherwise) is broken as well: the polluted result will be cloned as the starting accumulator for each group.

I'm guessing there was an update to `reduceBy` at some point that broke the arity and caused this, but I haven't quite gone through all the layers of indirection to narrow it down.

**Repro** (comparing `countBy`)