
New Demo of Networked-Aframe with MediaPipe Self-Segmentation #296

Open
jimver04 opened this issue Oct 28, 2021 · 8 comments
@jimver04

There is no dedicated place for demos, so I am adding it here!

We have made a stage for live concerts using Networked-Aframe and MediaPipe.

Video

https://www.youtube.com/watch?v=2d8P70C7N_4

image
In the comments of the video you can find the demo link for live testing. It is currently in beta. Since MediaPipe is sometimes glitchy, you may need to refresh Firefox or Chrome. Use a PC, laptop, or tablet; on mobile (Chrome only) there may be issues.

@arpu
Member

arpu commented Oct 28, 2021

Nice! Do you use VP9 or VP8 (in WebRTC) for the transparent video?

@vincentfretin
Member

This is a cool scene, @jimver04!
@arpu see open-easyrtc/open-easyrtc#73 (comment) and #269 and #291 (comment) for context about this demo.

@jimver04 tell me if I'm wrong: your demo currently uses a custom A-Frame build that stops the camera video track and adds a new video track, coming from canvas.captureStream(), to the default easyrtc sending stream. It's a bit hacky and you may encounter some issues, as discussed. It should be easier to add a new video track once #294 is merged.
If I understand it correctly, MediaPipe gets access to the camera, replaces the background with green, and renders the result to a canvas; it is this canvas that you send, via canvas.captureStream(), as a second video track to the other participants.
On the receiving side, you have a custom networked-video-source-mediapiped component that renders the second video track with a custom THREE.ShaderMaterial shader turning green pixels transparent.
With #294, networked-video-source gets a streamName parameter. After the PR is merged, it may be good to add an option that replaces a given color with transparent via a custom shader.
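For readers following along, the receiving-side shader could be sketched roughly like this. This is a minimal hypothetical sketch, not the demo's actual code; the `keyColor`/`similarity` uniforms and the `makeChromaKeyMaterial` helper name are my own assumptions.

```javascript
// Fragment shader: discard pixels close to the key color (green),
// so the background becomes transparent on the receiving plane.
const chromaKeyFragmentShader = `
  uniform sampler2D map;
  uniform vec3 keyColor;      // e.g. pure green (0.0, 1.0, 0.0)
  uniform float similarity;   // how close a pixel must be to keyColor
  varying vec2 vUv;
  void main() {
    vec4 texel = texture2D(map, vUv);
    if (distance(texel.rgb, keyColor) < similarity) discard;
    gl_FragColor = texel;
  }
`;

const chromaKeyVertexShader = `
  varying vec2 vUv;
  void main() {
    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  }
`;

// In a browser with three.js loaded, the material would be built like this
// (THREE is passed in so the sketch stays self-contained):
function makeChromaKeyMaterial(THREE, videoTexture) {
  return new THREE.ShaderMaterial({
    transparent: true,
    uniforms: {
      map: { value: videoTexture },
      keyColor: { value: new THREE.Color(0x00ff00) },
      similarity: { value: 0.4 },
    },
    vertexShader: chromaKeyVertexShader,
    fragmentShader: chromaKeyFragmentShader,
  });
}
```

The `discard` statement is what makes this cheaper than writing an alpha value: discarded fragments never reach the framebuffer, so no blending order issues arise for fully keyed-out pixels.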

@vincentfretin
Member

@jimver04 The multi-streams example is now merged
https://github.com/networked-aframe/networked-aframe/blob/master/examples/basic-multi-streams.html
and live here
https://naf-examples.glitch.me/basic-multi-streams.html

If you can adapt your demo to use the new API and give feedback on it that would be awesome.
You should be able to use networked-scene="audio:true;video:false" so the camera isn't enabled, then call
NAF.connection.adapter.addLocalMediaStream(canvasStream, "mediapipedCamera");
and rewrite your networked-video-source-mediapiped to use the latest changes of networked-video-source, using it like this: networked-video-source-mediapiped="streamName:mediapipedCamera".
Be aware of the known issues in #299, though. I don't know if you currently have those issues in your demo.
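A minimal sketch of that wiring, assuming a page where NAF is connected and the segmentation canvas already exists (the function name and canvas selector below are my own, not part of the demo or the NAF API; only addLocalMediaStream comes from the comment above):

```javascript
// Hypothetical glue code: capture the segmentation canvas as a video
// stream and register it with networked-aframe under a stream name.
function publishCanvasStream(adapter, canvas, streamName = 'mediapipedCamera') {
  // captureStream(30) turns the canvas content into a ~30 fps MediaStream
  const canvasStream = canvas.captureStream(30);
  adapter.addLocalMediaStream(canvasStream, streamName);
  return canvasStream;
}

// In the page, once connected, it would be called like:
// publishCanvasStream(NAF.connection.adapter, document.querySelector('#segmented'));
```

On the receiving side the same name is then referenced via networked-video-source-mediapiped="streamName:mediapipedCamera".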

@vincentfretin
Member

https://webrtchacks.com/how-to-make-virtual-backgrounds-transparent-in-webrtc/ describes an implementation similar to yours.

@jimver04
Author

jimver04 commented Nov 2, 2021

@vincentfretin Yes, it is based on the same MediaPipe library 👍
I have tried to make video avatars coexist with 3D avatars, but there is a problem with the streams. We have a project evaluation this week and I do not have time right now. I am also preparing an authoring tool, a WordPress plugin using Three.js, that lets artists build their own environments. I will definitely come back to this next week. Best, D.

https://www.youtube.com/embed/WataNoHgjlo
image

@jimver04
Author

Sorry for not replying earlier, but I was busy with another project in UE4.

The steps are as follows:

  1. Create a canvas.
  2. Draw MediaPipe's selfie-segmented video into this canvas: draw only the user's pixels, and fill the background with green (transparent pixels cannot pass through WebRTC).
  3. Add this stream by calling, within EasyRtcAdapter.js:

this.easyrtc.initMediaSource(
  function (stream) {
    // Stop the original camera video track
    stream.getVideoTracks()[0].stop();

    // Add the new video track captured from the canvas
    if (that.capturedCanvasStreamSource.getVideoTracks()[0]) {
      stream.addTrack(that.capturedCanvasStreamSource.getVideoTracks()[0]);
    }

    that.setMediaStream(that.easyrtc.myEasyrtcid, stream);
    that.easyrtc.connect(that.app, connectSuccess, connectFailure);
  },
  function (errorCode, errmesg) {
    NAF.log.error(errorCode, errmesg);
  }
);

Note that I have stopped the original video stream to free up some bandwidth.

  4. On the client side, for all the streams received from others, replace green pixels with transparent ones.

Now I see that you have in basic-multi-streams.html

NAF.connection.adapter.addLocalMediaStream(stream, "screen");

I will try to build on that to make:

  1. An example that just replaces green pixels with transparent ones (without MediaPipe). Let us assume you have a real green-screen background like mine (not perfect, but it will do the job):

image

PS: I always get the same stream nested inside the other; I cannot send two different streams by pressing "Share Screen". Is this a bug?

Then, once (1) works, I will do:

  2. A full example with "artificial" green pixels and MediaPipe (let's hope Google has fixed the WebAssembly instability issues).
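For the full MediaPipe example, the "artificial" green background of step 2 above could be composited roughly like this, following the shape of the public @mediapipe/selfie_segmentation onResults callback (a sketch under those assumptions, not the demo's actual code):

```javascript
// Composite a MediaPipe selfie-segmentation result onto a 2D canvas
// context with a solid green background:
//   results.segmentationMask - person mask produced by MediaPipe
//   results.image            - the original camera frame
function drawSegmented(ctx, results, width, height) {
  ctx.save();
  ctx.clearRect(0, 0, width, height);
  // Draw the mask, then keep the camera frame only where the mask is opaque
  ctx.drawImage(results.segmentationMask, 0, 0, width, height);
  ctx.globalCompositeOperation = 'source-in';
  ctx.drawImage(results.image, 0, 0, width, height);
  // Fill everything behind the person with green (WebRTC drops alpha)
  ctx.globalCompositeOperation = 'destination-over';
  ctx.fillStyle = '#00ff00';
  ctx.fillRect(0, 0, width, height);
  ctx.restore();
}
```

The source-in / destination-over pair is the usual 2D-canvas way to mask a frame and backfill it, so the canvas sent via captureStream() always carries an opaque green background for the receiver to key out.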

@jimver04
Author

OK, forget about the bug; it was a leftover from a previous version of NAF. I have managed to share my PhpStorm screen:

image

@vincentfretin
Member

Hi, no worries. Thanks for getting back to it.
The two planes are at the same position in the demo; the screen share is the smaller plane. The demo could be improved there, maybe by moving the plane up or to the right.
