mediasoup-ios-client

Objective-C wrapper library around libmediasoupclient for building mediasoup-based iOS applications.

This project supports both 64-bit iOS devices and 64-bit iOS Simulators.

License

Website and Documentation

Support Forum


Getting Started

Cocoapods

Add the following to your Podfile:

use_frameworks!

target "target" do
  pod 'vl_mediasoup_client_ios', :git => 'https://github.com/VLprojects/mediasoup-ios-client.git'
end

You will need to set Enable Bitcode to No in your target's Build Settings.

Due to the size of WebRTC.framework when built with bitcode, it cannot be uploaded to GitHub.

Swift users will need to add an Objective-C bridging header.

Bridging header sample:

https://github.com/ethand91/mediasoup-ios-client-sample/blob/master/mediasoup-ios-cient-sample/mediasoup-ios-cient-sample-Bridging-Header.h
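
A minimal bridging header only needs to import the wrapper's umbrella header. The sketch below assumes the same import path used in the usage example further down; the file name is illustrative:

// MyApp-Bridging-Header.h (name is illustrative)
// Exposes the Objective-C wrapper to Swift.
#import "mediasoup_client_ios/Mediasoupclient.h"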

Documentation

API

https://github.com/VLprojects/mediasoup-ios-client/blob/master/documentation/Api.md

INSTALLATION (only needed for development; not needed if you only intend to use the library)

https://github.com/VLprojects/mediasoup-ios-client/blob/master/documentation/Installation.md

Usage Example

#import "mediasoup_client_ios/Mediasoupclient.h

// Create a Device
MediasoupDevice *device = [[MediasoupDevice alloc] init];

// Communicate with our server app to retrieve router RTP capabilities
NSString *routerRtpCapabilities = [mySignalling request:@"getRouterRtpCapabilities"];

// Load the device with the routerRtpCapabilities
[device load:routerRtpCapabilities];

// Check whether we can produce video to the router
if (![device canProduce:@"video"]) {
 NSLog(@"cannot produce video");
 // Abort next steps
}

// Create a transport in the server for sending our media through it
NSDictionary *transportData = [mySignalling request:@"createTransport"];

// Object to handle SendTransportListener events
@interface SendTransportHandler: NSObject<SendTransportListener>
@property (nonatomic) id delegate;
@end

@implementation SendTransportHandler
-(void)onConnect:(Transport *)transport dtlsParameters:(NSString *)dtlsParameters {
 // Here we communicate our local parameters to our remote transport
 [mySignalling request:@"transport-connect" transportId:[transport getId] dtlsParameters:dtlsParameters];
}

-(void)onConnectionStateChange:(Transport *)transport connectionState:(NSString *)connectionState {
 NSLog(@"sendTransport::onConnectionStateChange newState = %@", connectionState);
}

-(void)onProduce:(Transport *)transport kind:(NSString *)kind rtpParameters:(NSString *)rtpParameters appData:(NSString *)appData callback:(void(^)(NSString *))callback {
 // Here we must communicate our local parameters to our remote transport
 NSString *producerId = [mySignalling request:@"produce" transportId:[transport getId] kind:kind rtpParameters:rtpParameters appData:appData];

 // Hand the server-side producer id back through the callback
 callback(producerId);
}
@end

// Instantiate the listener defined above; it acts as its own delegate
SendTransportHandler *sendTransportHandler = [[SendTransportHandler alloc] init];
sendTransportHandler.delegate = sendTransportHandler;

SendTransport *sendTransport = [device createSendTransport:sendTransportHandler.delegate id:transportData[@"id"] iceParameters:transportData[@"iceParameters"] iceCandidates:transportData[@"iceCandidates"] dtlsParameters:transportData[@"dtlsParameters"]];

// Get the device camera
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

// Start capturing it, feeding frames into a WebRTC video source
RTCPeerConnectionFactory *factory = [[RTCPeerConnectionFactory alloc] init];
RTCVideoSource *videoSource = [factory videoSource];
RTCCameraVideoCapturer *videoCapturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:videoSource];
[videoCapturer startCaptureWithDevice:devices[0] format:[devices[0] activeFormat] fps:30];
[videoSource adaptOutputFormatToWidth:640 height:480 fps:30];

RTCVideoTrack *videoTrack = [factory videoTrackWithSource:videoSource trackId:@"trackId"];

// Handler to handle producer events
@interface ProducerHandler : NSObject<ProducerListener>
@property (nonatomic) id delegate;
@end

@implementation ProducerHandler
-(void)onTransportClose:(Producer *)producer {
 NSLog(@"Producer::onTransportClose");
}
@end

// Produce our camera video
ProducerHandler *producerHandler = [[ProducerHandler alloc] init];
producerHandler.delegate = producerHandler;

Producer *videoProducer = [sendTransport produce:producerHandler.delegate track:videoTrack encodings:nil codecOptions:nil];
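
The example above only produces video. As a minimal sketch (not part of the original sample), an audio track can be produced the same way, reusing the sendTransport, producerHandler and factory objects created above; the nil constraints and the track id are assumptions:

// Sketch: produce an audio track through the same send transport
RTCAudioSource *audioSource = [factory audioSourceWithConstraints:nil];
RTCAudioTrack *audioTrack = [factory audioTrackWithSource:audioSource trackId:@"audioTrackId"];

Producer *audioProducer = [sendTransport produce:producerHandler.delegate track:audioTrack encodings:nil codecOptions:nil];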

Roadmap

  • Fix memory leaks
  • Update usage examples and tests, translate to modern Swift syntax
  • Update build instructions
  • Implement configuring codec factories
