
Random operators (| and ?) always make the same choice in a cycle #530

Closed
ijc8 opened this issue Mar 20, 2023 · 4 comments · Fixed by #531

ijc8 commented Mar 20, 2023

Hello! I've observed a funny interaction between rand and seq().

I initially noticed that s("[sd|cp] [sd|cp]") will always select both sds or both cps in any given cycle; you never get the combinations sd cp or cp sd. This is surprising because it seems like the choices should be independent.

I confirmed that this differs from Tidal's behavior: after evaluating d1 $ s "bd [sd|cp] [sd|cp]", I observe all four combinations of [sd, cp] x [sd, cp].

The | syntax is mini-notation for chooseCycles(), which uses rand; I can see that rand itself is producing duplicate values:

s("hh*2").gain(seq(rand, rand).log())

generates console output like

[hap] 136/1 → 273/2: s:hh gain:0.13287117518484592
[hap] 273/2 → 137/1: s:hh gain:0.13287117518484592
[hap] 137/1 → 275/2: s:hh gain:0.1258764062076807
[hap] 275/2 → 138/1: s:hh gain:0.1258764062076807
[hap] 138/1 → 277/2: s:hh gain:0.564989872276783
[hap] 277/2 → 139/1: s:hh gain:0.564989872276783

However, this does not occur with s("hh*2").gain(rand.log()).

(s("hh").gain(rand.log()).pan(rand.log()) also produces duplicate values for gain and pan, but I get the impression that's expected behavior because the PRNG is a pure function of time, and the two uses of rand here are simultaneous for each hap.)
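That time-purity can be sketched in plain JavaScript. This is a hypothetical stand-in, not Strudel's actual timeToRand or chooseCycles code: a toy pure hash from time to [0, 1), and a per-cycle choice function driven by it.

```javascript
// Toy stand-in for a pure time-to-random hash (NOT Strudel's real algorithm).
const timeToRand = (t) => {
  const x = Math.sin(t * 12.9898) * 43758.5453;
  return x - Math.floor(x); // fractional part, in [0, 1)
};

// A chooseCycles-style pick: one option per cycle, driven purely by time.
const chooseForCycle = (options, cycle) =>
  options[Math.floor(timeToRand(cycle + 0.5) * options.length)];

// Two "[sd|cp]" slots queried in the same cycle sample the random signal at
// the same time, so they always agree:
const a = chooseForCycle(['sd', 'cp'], 0);
const b = chooseForCycle(['sd', 'cp'], 0);
// a === b on every cycle, which is exactly the reported symptom.
```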

In short, this seems like a regression from the behavior described here: #165 (comment).

That's as far as I've gotten for now; I plan to take a closer look at this soon and hopefully submit a fix.

@felixroos (Collaborator) commented

Hey! Thanks for the investigation.

You're right, the randomness is currently a bit weird.

> In short, this seems like a regression from the behavior described here: #165 (comment).

That's true; I rolled some of it back because it introduced non-deterministic behavior, see #245.

> I can see that rand itself is producing duplicate values

Interesting observation. This produces the same value three cycles in a row:

s("hh").gain(cat(rand, rand, rand)).log()

I am not sure what to do here, as I haven't fully dived into the inner workings of the RNG. @yaxu, any ideas?

ijc8 commented Mar 20, 2023

Thanks for the quick reply!

> Interesting observation. This produces the same value three cycles in a row:
>
> s("hh").gain(cat(rand, rand, rand)).log()

Ah, I noticed that s("hh*2").gain(cat(rand, rand).log()) did some interleaved repetition, but you're right: just s("hh").gain(cat(rand, rand).log()) already repeats values. And I see that seq() is defined in terms of slowcat(), so maybe that's the real source of the issue.

Tracing backwards from rand, it looks like timeToIntSeed() generates the same value on consecutive cycles because it receives the same time twice: signal gets the same state twice, and so calls func (in this case timeToRand) with the same midpoints.

For example, if I run cat(rand, rand).queryArc(0, 2) and log the state received here (before it is passed to timeToRand()):

export const signal = (func) => {
  const query = (state) => [new Hap(undefined, state.span, func(state.span.midpoint()))];
  return new Pattern(query);
};

then I see:

state: {"span":{"begin":{"s":1,"n":0,"d":1},"end":{"s":1,"n":1,"d":1}},"controls":{}}
state: {"span":{"begin":{"s":1,"n":0,"d":1},"end":{"s":1,"n":1,"d":1}},"controls":{}}

so rand is seeing the same state twice and thus generating the same value twice.
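A minimal plain-JS model of this (hypothetical and heavily simplified, not Strudel's real Pattern/slowcat code) reproduces the duplicate spans: slowcat maps output cycle n to inner cycle floor(n / k) of sub-pattern n % k, so a pure time signal is sampled at the same midpoint on consecutive cycles.

```javascript
// Toy stand-in for a pure time-to-random hash (NOT Strudel's real algorithm).
const timeToRand = (t) => {
  const x = Math.sin(t * 12.9898) * 43758.5453;
  return x - Math.floor(x);
};

// A signal samples its function at the midpoint of each queried cycle span.
const signal = (func) => (begin, end) => [{ begin, end, value: func((begin + end) / 2) }];
const rand = signal(timeToRand);

// Simplified slowcat: cycle n plays sub-pattern n % k, but the sub-pattern is
// queried at its own inner cycle floor(n / k), then the result is shifted back.
const slowcat = (...pats) => (begin, end) => {
  const haps = [];
  for (let cycle = Math.floor(begin); cycle < end; cycle++) {
    const pat = pats[cycle % pats.length];
    const innerCycle = Math.floor(cycle / pats.length);
    const offset = cycle - innerCycle;
    haps.push(...pat(innerCycle, innerCycle + 1).map((h) => ({
      begin: h.begin + offset, end: h.end + offset, value: h.value,
    })));
  }
  return haps;
};

const haps = slowcat(rand, rand)(0, 2);
// Both inner rands were queried over span [0, 1), so haps[0] and haps[1]
// carry the same value: the duplicate observed in the logs above.
```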

ijc8 commented Mar 20, 2023

Upon further contemplation, the stuff with rand itself actually seems like expected behavior. Other signals behave the same way, which is why e.g. there's a difference between these expressions:

s("hh*2").gain(seq(sine, sine).log()) // always yields ~0.5 because each inner signal starts from the beginning
s("hh*2").gain(sine.log())            // alternates between 0 and 1 because the signal is sampled at two points per cycle

and indeed, Tidal behaves identically; seq(rand, rand).queryArc(0, 1) in Strudel returns two events with the same value, and fastcat [rand, rand] does the same thing in Tidal.
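These numbers can be checked directly, assuming the usual unipolar sine scaled to the range 0..1:

```javascript
// Unipolar sine in [0, 1], one period per cycle.
const sineAt = (t) => (Math.sin(2 * Math.PI * t) + 1) / 2;

// seq(sine, sine): each copy is queried over a whole inner cycle, so it is
// sampled at the inner midpoint t = 0.5:
const mid = sineAt(0.5);     // ≈ 0.5

// plain sine with hh*2: sampled at the two event midpoints of the cycle:
const first = sineAt(0.25);  // ≈ 1
const second = sineAt(0.75); // ≈ 0
```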

So then why doesn't random choice behave like this in Tidal? It looks like it's because of this, which causes a new seed to be associated with each occurrence of | in the mini-notation (and pRand does the same for ?). (Now I understand the rest of @bpow's comment!)

Regarding the problem mentioned in #245 (comment), maybe the issue is that _seedState was global state, whereas in Tidal the state only exists within the context of a single parse. That is, evaluating parseTPat "[a|b] [a|b]" always yields the same result, where the first TPat_CycleChoose has a seed of 0 and the second has a seed of 1.

If so, probably the right thing to do is have some similar parser state in Strudel, so that each occurrence of | or ? within a pattern can get a new seed, without affecting future patterns.
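A sketch of that idea (all names here are hypothetical, not Strudel's actual parser): keep the seed counter local to one parse, so each choice site in a pattern gets a distinct seed while re-parsing the same pattern stays deterministic.

```javascript
// Hypothetical parser sketch: a seed counter scoped to a single parse.
function parsePattern(source) {
  let seedState = 0;                  // parser-local, NOT module-global
  const nextSeed = () => seedState++;
  // Stand-in "parse": give every `|` choice site its own seed.
  return source.split(' ').map((word) =>
    word.includes('|') ? { choose: word, seed: nextSeed() } : { atom: word });
}

const first = parsePattern('[a|b] [a|b]');
const second = parsePattern('[a|b] [a|b]');
// Within one parse, the two choice sites get seeds 0 and 1 (so their choices
// are independent), and re-parsing yields the same seeds (so evaluation stays
// deterministic, avoiding the problem from #245).
```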

@ijc8 ijc8 changed the title rand produces duplicate values when used repeatedly in seq() Random operators (| and ?) always make the same choice in a cycle Mar 20, 2023
yaxu commented Mar 20, 2023

Yes, that's a solid analysis! rand is a function of time, so you have to shift it, or slow it down / speed it up, if you want different results. The Tidal mini-notation does some of this time manipulation for you, as you describe.
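For illustration, shifting time is enough to decorrelate two uses of a pure time signal (timeToRand below is an assumed stand-in hash, not Strudel's real one):

```javascript
// Toy stand-in for a pure time-to-random hash (NOT Strudel's real algorithm).
const timeToRand = (t) => {
  const x = Math.sin(t * 12.9898) * 43758.5453;
  return x - Math.floor(x);
};

const randAt = (t) => timeToRand(t);
const shiftedRandAt = (t) => timeToRand(t + 0.123); // in spirit, like rand.early(0.123)
// At the same query time the two streams now differ, because the second one
// samples the hash at a shifted point in time.
```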
