
MIDI API should be available from Workers? #99

Open
cwilso opened this issue Feb 13, 2014 · 43 comments · May be fixed by #256
Assignees
Labels
class: substantive (https://www.w3.org/2023/Process-20230612/#correction-classes); status: ready for editing (Enough information should be available to implement this change in the spec)
Milestone

Comments

@cwilso
Contributor

cwilso commented Feb 13, 2014

It has been suggested to me that the MIDI API should be available from Workers. For the purposes of keeping a sequence going in a reasonable manner, this sounds like a great idea to me.

@toyoshim
Contributor

+1

@nfroidure

+1

@notator

notator commented Feb 14, 2014

+1

@marcoscaceres
Contributor

For record keeping, can you just list some of the use cases that you envision for having the API running in workers?

@nfroidure

@marcoscaceres at least this one cwilso/WebMIDIAPIShim#33

@igorclark

Hi @marcoscaceres, this would be really helpful for my use case. I'm generating MIDI events in Javascript, and GUI/window resize events can really affect the timing.

I've moved the scheduling code into a web worker, which works well: events inside the worker fire within 1.5–1.7ms of their intended time on average, tested over periods of five minutes and more.

However, after that, the scheduler thread has to send the outgoing MIDI messages back via the main thread, at which point they can still be interrupted. If the web worker could send the MIDI messages directly, this wouldn't happen, and the timing could be much more reliable.

@notator

notator commented Feb 17, 2014

@marcoscaceres @igorclark
Hi, my AssistantPerformer (https://github.com/notator/assistant-performer)
also generates MIDI events in Javascript, and wants to do that over arbitrarily long periods of time without being disturbed by interruptions in the main thread. So +1 to igorclark -- especially his last paragraph.
I'm also planning a polyphonic version of this application. This would be like playing a prepared piano. Obviously, the more integration with workers, the better.

@marcoscaceres
Contributor

Thanks everyone! These are really helpful.

@jussi-kalliokoski
Member

Sounds great to me!

@cwilso cwilso added this to the V1 milestone May 6, 2014
@cwilso cwilso self-assigned this May 6, 2014
@jaycliff

jaycliff commented May 7, 2014

This would be really nice :)

@cwilso cwilso removed their assignment Mar 6, 2015
@cwilso cwilso removed this from the V1 milestone Mar 6, 2015
@cwilso cwilso added the status: needs WG review label (Needs to be discussed with the Audio Working Group before proceeding) Mar 6, 2015
@joeberkovitz

+1 on the feature, for sure. But perhaps this isn't critical for release, given that events in a sequence can be assigned exact timestamps and that the main thread need not schedule events in tight proximity to their physical output.

@MidiHax

MidiHax commented Mar 9, 2015

+1 I think @igorclark nailed the use case that all of us developing sequencers have encountered. This seems like a rather essential feature in my opinion.

@toyoshim
Contributor

I also think this is an important feature. Once the basic parts of the spec are fixed, I'll work on this.

@cwilso cwilso removed the future label Mar 10, 2015
@cwilso cwilso added this to the V1 milestone Mar 10, 2015
@cwilso
Contributor Author

cwilso commented Mar 10, 2015

It's sounding like people want this in v1. (It's pretty straightforward to do; in addition to

partial interface Navigator {
    Promise<MIDIAccess> requestMIDIAccess (optional MIDIOptions options);
};

we need

partial interface WorkerNavigator {
    Promise<MIDIAccess> requestMIDIAccess (optional MIDIOptions options);
};

and (I think) that's pretty much it.) Additional cost on implementations, of course.
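For illustration only (this is not in the spec yet): under the proposed IDL, worker-side usage would presumably mirror today's window-side API exactly. A minimal sketch, assuming requestMIDIAccess lands on WorkerNavigator unchanged; the navigator object is passed in as a parameter purely so the sketch can be exercised with a stub outside a browser:

```javascript
// Hypothetical worker-side usage, assuming requestMIDIAccess is exposed on
// WorkerNavigator with the same shape it has on Navigator today.
async function sendNoteFromWorker(nav) {
  const access = await nav.requestMIDIAccess({ sysex: false });
  // MIDIAccess.outputs is maplike; take the first available output, if any.
  const output = access.outputs.values().next().value;
  if (output) {
    // Note On (0x90), middle C (60), velocity 100 -- sent immediately.
    output.send([0x90, 60, 100]);
  }
  return output;
}
```

In a real worker this would be called as `sendNoteFromWorker(self.navigator)`, contingent on the WorkerNavigator exposure this issue proposes.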

@cwilso cwilso self-assigned this Mar 10, 2015
@cwilso cwilso added the status: ready for editing label (Enough information should be available to implement this change in the spec) and removed the status: needs WG review label (Needs to be discussed with the Audio Working Group before proceeding) Jun 2, 2015
@ryanlaws

ryanlaws commented Apr 2, 2017

This is critical for acceptable timing. How can I help move this forward?

@7ombie

7ombie commented Jan 24, 2022

What happened to this? The spec talks about Launchpads and DJ controllers, but you can't build anything serious around MIDI controllers with main-thread latencies. In a pro-audio context, that would be super nasty.

@7ombie

7ombie commented Jan 24, 2022

> I think AudioWorkletGlobalScope might be the better fit for this purpose. Then you can process MIDI data right inside of the audio rendering thread.

I don't think the audio thread should be handling MIDI (especially outbound MIDI messages for controlling LEDs etc). It's not what worklets were designed for. Besides, a regular worker can use a shared array buffer (or Wasm memory) to write directly to the memory that the audio thread is rendering from, without extra latency or blocking the audio thread.

@7ombie

7ombie commented Jan 31, 2022

Just nudging this thread, hoping to get a status update. A few noteworthy points...

There's an interesting proposal for exposing regular input (basically keyboard and pointer events) in workers. That proposal contains some use cases and general observations that are relevant to controlling realtime audio.

Chromium et al automatically use high priority threads for audio worklets now, and that has substantially expanded the range of serious audio applications that are possible on the platform. The only major limitation that remains (on Chromium, at least) is the inability to handle low-latency input (WebUSB is threadable, but does not expose (class-compliant) HID or MIDI devices).

In short, if we could (directly) handle keyboard, touch, and MIDI events in workers, we would have all of the essential primitives for serious realtime audio programming in the browser. Given that this thread is just about updating the spec, I cannot see any reason to hesitate any longer.

@cwilso
Contributor Author

cwilso commented Feb 3, 2022

@hoch and @padenot should weigh in on the feasibility, but I'd hesitate to just put it in the spec without some confidence that it's possible and likely to implement.

@hoch
Member

hoch commented Feb 3, 2022

In terms of feasibility, I believe @toyoshim can provide us with a better answer.

I am also aware that Firefox has a working implementation, so if both implementors are positive about the feasibility, the spec change doesn't seem controversial.

@cwilso
Contributor Author

cwilso commented Feb 3, 2022

@padenot is Firefox's implementation exposed on Worker?

@padenot
Member

padenot commented Feb 4, 2022

We implement the spec as it is today, so no.

I don't think it would be particularly hard to do so, though, modulo the permission aspect, which is tied to navigator at the minute.

I do think it would be a good idea, and in fact, necessary, for any software that does anything non-trivial on the main thread.

@toyoshim
Contributor

toyoshim commented Feb 7, 2022

I experimentally created a POC patch to support Web MIDI in dedicated workers.
https://chromium-review.googlesource.com/c/chromium/src/+/3439238

We have some code that depends on information bound to DOMWindow, but we can fix that easily. So what we really need is just to expose the requestMIDIAccess interface on WorkerNavigator. It should not be difficult to brush up this CL to be production-ready.

@7ombie

7ombie commented Apr 2, 2022

Can we move this forward, please, and get the specification updated? I believe the reasons for holding off on this have been addressed now. It's just the API that needs finalizing (re. navigator).
Thanks.

@Boscop

Boscop commented Oct 30, 2022

I'd really appreciate it if the Web MIDI API were accessible from Web Workers.
When implementing a MIDI player that needs to be stepped forward at 60 FPS, the only options right now seem to be setInterval (bad performance and jitter) or requestAnimationFrame (which doesn't run while the window is unfocused, so MIDI playback pauses when the user switches from the browser window to the DAW window).
Or did I miss another way? I'm curious if there is one. If I run the MIDI player in a worker but use message passing to send the MIDI events to the main thread before sending them out to the MIDI port, do you think that would work without added latency/jitter, or would it be as suboptimal as setInterval/requestAnimationFrame?

Anyhow, if the MIDI API was accessible from web workers, a loop could be run in a worker thread for the MIDI playback.
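A partial workaround that exists today (a sketch, not something proposed in this thread): MIDIOutput.send accepts an optional DOMHighResTimeStamp, so even a coarse, jittery timer can schedule every event inside a lookahead window with an exact future timestamp and let the browser's MIDI back end handle the precise timing. It mitigates jitter, though background-tab timer throttling can still outrun a short lookahead. The scheduler shape below is illustrative; `output` and `now` are injected so the logic can run outside a browser:

```javascript
// Lookahead scheduler sketch: a jittery timer calls tick() every so often,
// and every event falling inside the next `lookahead` ms is handed to the
// MIDI output with an exact timestamp, so output timing does not depend on
// timer precision. In a browser, `output` would be a MIDIOutput and `now`
// would be performance.now.
function makeScheduler(output, now, lookahead = 100) {
  let queue = []; // pending events: { at: ms timestamp, data: [status, d1, d2] }
  return {
    add(at, data) {
      queue.push({ at, data });
    },
    tick() {
      const horizon = now() + lookahead;
      const due = queue.filter((e) => e.at <= horizon);
      queue = queue.filter((e) => e.at > horizon);
      // MIDIOutput.send takes a DOMHighResTimeStamp as its second argument.
      for (const e of due) output.send(e.data, e.at);
    },
  };
}
```

Typical wiring would be `setInterval(() => sched.tick(), 25)` with a lookahead comfortably larger than the worst expected timer jitter.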

@JSmithOner

Any updates on this? Is it that hard to implement? I'm using Web MIDI extensively in my app, and the DOM is drastically slowing down some processes. Thanks in advance.

@hoch
Member

hoch commented Dec 5, 2022

I believe everyone in the WG understood and agreed with the need for this change, but the group needs to come up with the spec text first. Sorry for the delay, but I am cautiously looking at 2023.

@7ombie

7ombie commented Dec 7, 2022

Thanks, @hoch. I appreciate the update. It's nice to know things are moving along, even if it'll take a while.

If this spec is committed to threaded MIDI, then the Input for Workers Proposal should drop their (stalled) effort to do the same thing. IMO, the scope of that proposal should have always been limited to keyboard and pointer events (which are tied to the DOM), and never addressed MIDI events (which have always belonged in the MIDI spec).

I'm not sure if the Input for Workers proposal is actively maintained, but if so, somebody should let them know to remove MIDI events.

I forgot that I tried to contact the Input for Workers people six months ago, just asking for a sign of life, and got nowhere. I've updated them anyway.

@7ombie

7ombie commented Dec 13, 2022

As mentioned briefly before, while I'd like to see MIDI in workers, I don't personally think MIDI should be exposed to audio worklets.

Worklets are very specialized. I don't think it's appropriate to handle MIDI on the browser's audio thread.

Note: Apps that are sensitive to latency can implement their MIDI interface in a regular worker, then use a shared array buffer to write directly to the memory that the audio worklet is rendering from.

I'm not certain, but I don't think any of the other input APIs (WebHID, WebUSB, WebBluetooth etc) are planning to make their APIs available in worklets (of any kind).

@JohnWeisz

> Note: Apps that are sensitive to latency can implement their MIDI interface in a regular worker, then use a shared array buffer to write directly to the memory that the audio worklet is rendering from.

Or a MessageChannel to communicate directly between the MIDI worker thread and the audio worklet/rendering thread, avoiding the main thread.

@padenot
Member

padenot commented Dec 13, 2022

There are no plans to expose MIDI to Audio Worklets. I also don't know of other APIs being exposed in Worklets.

Communicating from a regular Web Worker to an AudioWorkletGlobalScope using SharedArrayBuffer, via an SPSC wait-free queue, is the way to go.

postMessage(...) will cause problems; this blog post explains why, with benchmarks.
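The worker-to-worklet path described above can be sketched as a wait-free single-producer/single-consumer ring buffer over a SharedArrayBuffer. This is an illustrative minimal version (function names are hypothetical, and a production queue would need more care), not any particular implementation referenced in this thread:

```javascript
// Minimal wait-free SPSC byte queue over a SharedArrayBuffer. A MIDI worker
// would push incoming bytes; the audio worklet would pop them each render
// quantum. Neither side ever blocks or allocates on the hot path.
// Capacity must be a power of two so 32-bit index wraparound stays safe.
function createSpscQueue(capacity = 1024) {
  // Bytes 0-7 hold the write/read indices; the rest is the data ring.
  const sab = new SharedArrayBuffer(8 + capacity);
  return { sab, capacity };
}

function makeProducer({ sab, capacity }) {
  const idx = new Uint32Array(sab, 0, 2); // [0] = write index, [1] = read index
  const data = new Uint8Array(sab, 8, capacity);
  return function push(bytes) {
    const w = Atomics.load(idx, 0);
    const r = Atomics.load(idx, 1);
    const used = (w - r) >>> 0;
    if (used + bytes.length > capacity) return false; // full: drop, never block
    for (let i = 0; i < bytes.length; i++) {
      data[(w + i) % capacity] = bytes[i];
    }
    // Publish the new write index only after the data is in place, so the
    // consumer never observes a partially written message.
    Atomics.store(idx, 0, w + bytes.length);
    return true;
  };
}

function makeConsumer({ sab, capacity }) {
  const idx = new Uint32Array(sab, 0, 2);
  const data = new Uint8Array(sab, 8, capacity);
  return function pop(maxBytes) {
    const w = Atomics.load(idx, 0);
    const r = Atomics.load(idx, 1);
    const avail = Math.min(maxBytes, (w - r) >>> 0);
    const out = new Uint8Array(avail);
    for (let i = 0; i < avail; i++) {
      out[i] = data[(r + i) % capacity];
    }
    Atomics.store(idx, 1, r + avail);
    return out;
  };
}
```

The `sab` would be posted once to the worklet at setup time; after that, all communication goes through the shared memory with no per-message postMessage traffic.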

@mjwilson-google mjwilson-google added the class: substantive label (https://www.w3.org/2023/Process-20230612/#correction-classes) and the status: needs WG review label (Needs to be discussed with the Audio Working Group before proceeding) Sep 13, 2023
@mjwilson-google
Contributor

mjwilson-google commented Oct 5, 2023

Audio Working Group 2023-10-05 meeting conclusions:

  • We will pursue this for V1
  • Two things to do to resolve this:
    • Make MIDIAccess Transferable
    • Expose MIDIAccess in WorkletGlobalScope

@mjwilson-google mjwilson-google added the status: ready for editing label (Enough information should be available to implement this change in the spec) and removed the status: needs WG review label (Needs to be discussed with the Audio Working Group before proceeding) Oct 5, 2023
@mjwilson-google mjwilson-google self-assigned this Oct 15, 2023
@mjwilson-google
Contributor

I wrote "WorkletGlobalScope" but it looks like we will expose this in WorkerGlobalScope; please see discussion in #256 if interested.
