This article describes how to push iOS ARKit video to WebRTC.
Background
Integrate iOS ARKit into a live-streaming SDK.
Feeding data into WebRTC
When creating the PeerConnection, you also need to create a VideoTrackSourceInterface object; it serves as the entry point through which external video data is fed into WebRTC.
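In the Objective-C WebRTC SDK this entry point is exposed as RTCVideoSource (the wrapper around VideoTrackSourceInterface). Below is a minimal sketch of wiring it into a peer connection; _factory, _peerConnection, the track id and the stream id are placeholder names, and the exact method names may differ slightly between WebRTC revisions:

// Create the video source, wrap it in a track, and attach the track to the
// peer connection. _factory and _peerConnection are assumed to exist already.
RTCVideoSource *_videoSource = [_factory videoSource];
RTCVideoTrack *videoTrack = [_factory videoTrackWithSource:_videoSource trackId:@"ARDAMSv0"];
[_peerConnection addTrack:videoTrack streamIds:@[ @"ARDAMS" ]];
// Every RTCVideoFrame pushed into _videoSource (see the code below) is then
// encoded and sent on this track like any other local video.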
ARKit data sources
ARKit provides two kinds of data: the raw frames captured directly from the camera, and the AR-rendered frames, i.e. the augmented-reality image you actually see. Either way, the data must go through the conversion CVPixelBuffer ==> RTCCVPixelBuffer ==> RTCVideoFrame before it can be handed to the VideoTrackSourceInterface and streamed out through WebRTC.
Raw camera data
Implement the following method of the ARSessionDelegate protocol:
……
RTCVideoCapturer *_dummyCapturer = [[RTCVideoCapturer alloc] init];
……
- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame {
    CVPixelBufferRef pixelBuffer = frame.capturedImage; // Grab the CVPixelBufferRef captured by the camera.
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer]; // Wrap it in an RTCCVPixelBuffer.
    int64_t timeStampNs = frame.timestamp * 1000000000; // Convert the timestamp to nanoseconds.
    RTCVideoRotation rotation = RTCVideoRotation_0; // TBD: check the actual rotation.
    RTCVideoFrame *videoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer rotation:rotation timeStampNs:timeStampNs]; // Convert to an RTCVideoFrame.
    [_videoSource capturer:_dummyCapturer didCaptureVideoFrame:videoFrame]; // Hand the frame to WebRTC; _dummyCapturer only needs to be a dummy instance.
}
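For this callback to fire, the ARSession has to be running with the current object set as its delegate. A minimal sketch, assuming an ARWorldTrackingConfiguration (any configuration that fits your scene works) and an _arSession ivar like the one declared further below:

- (void)startARCapture {
    ARWorldTrackingConfiguration *configuration = [[ARWorldTrackingConfiguration alloc] init];
    _arSession = [[ARSession alloc] init];
    _arSession.delegate = self;                      // delivers ARFrames to -session:didUpdateFrame:
    [_arSession runWithConfiguration:configuration]; // starts camera capture and tracking
}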
Rendered AR data
Create an SCNRenderer and call snapshotAtTime at a fixed frame rate to capture the rendered scene. Since that method returns a UIImage, the UIImage must first be converted to a CVPixelBufferRef before the data can be sent to WebRTC as described above. Sample code:
#define screenWidth [UIScreen mainScreen].bounds.size.width
#define screenHeight [UIScreen mainScreen].bounds.size.height
……
ARSCNView *_arView;
ARSession *_arSession;
SCNRenderer *_scnRenderer;
……
- (ARSession *)arSession {
    if (_arSession == nil) {
        _arSession = [[ARSession alloc] init];
        _arSession.delegate = self;
    }
    return _arSession;
}

- (ARSCNView *)arView {
    if (_arView == nil) {
        _arView = [[ARSCNView alloc] initWithFrame:CGRectMake(0, 0, screenWidth, screenHeight)];
        _arView.session = self.arSession;
        _arView.automaticallyUpdatesLighting = YES;
        _arView.delegate = self;
    }
    return _arView;
}

- (SCNRenderer *)scnRenderer {
    if (_scnRenderer == nil) {
        _scnRenderer = [SCNRenderer rendererWithDevice:nil options:nil];
        _scnRenderer.scene = self.arView.scene;
    }
    return _scnRenderer;
}

- (CVPixelBufferRef)capturePixelBuffer:(NSTimeInterval)timestamp {
    UIImage *image = [self.scnRenderer snapshotAtTime:timestamp
                                             withSize:CGSizeMake(_outputSize.width, _outputSize.height)
                                     antialiasingMode:SCNAntialiasingModeMultisampling4X];
    CVPixelBufferRef pixelBuffer = [self imageToPixelBuffer:image.CGImage]; // UIImage ==> CVPixelBuffer.
    return pixelBuffer;
}

// CGImageRef ==> CVPixelBufferRef, i.e. UIImage ==> CVPixelBuffer.
- (CVPixelBufferRef)imageToPixelBuffer:(CGImageRef)image {
    CGSize frameSize = CGSizeMake(_outputSize.width, _outputSize.height); // _outputSize = CGSizeMake(720, 1280); the higher the resolution, the more expensive the conversion.
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width, frameSize.height,
                                          kCVPixelFormatType_32BGRA, (__bridge CFDictionaryRef)options, &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width, frameSize.height,
                                                 8, CVPixelBufferGetBytesPerRow(pxbuffer), rgbColorSpace,
                                                 (CGBitmapInfo)kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, _outputSize.width, _outputSize.height), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer; // The caller is responsible for releasing this buffer (CVPixelBufferRelease).
}

- (void)captureAndSend {
    NSTimeInterval timestamp = [self getCurrentTimestamp]; // Current timestamp in seconds.
    CVPixelBufferRef pixelBuffer = [self capturePixelBuffer:timestamp]; // Grab the rendered image for that timestamp.
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer]; // Wrap it in an RTCCVPixelBuffer.
    CVPixelBufferRelease(pixelBuffer); // RTCCVPixelBuffer retains the buffer, so release our +1 reference here.
    int64_t timeStampNs = timestamp * 1000000000; // Convert the timestamp to nanoseconds.
    RTCVideoRotation rotation = RTCVideoRotation_0; // TBD: check the actual rotation.
    RTCVideoFrame *videoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer rotation:rotation timeStampNs:timeStampNs]; // Convert to an RTCVideoFrame.
    [_videoSource capturer:_dummyCapturer didCaptureVideoFrame:videoFrame]; // Hand the frame to WebRTC; _dummyCapturer only needs to be a dummy instance.
}
Call captureAndSend periodically from a timer.
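As a sketch, one way to do that is a CADisplayLink capped at 30 fps; the frame rate, the _displayLink ivar and the choice of CADisplayLink over an NSTimer or dispatch timer are all assumptions, not part of the original code:

- (void)startRenderLoop {
    _displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(onDisplayLink:)];
    _displayLink.preferredFramesPerSecond = 30; // snapshotting the scene is expensive, so cap the capture rate
    [_displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)onDisplayLink:(CADisplayLink *)link {
    [self captureAndSend]; // grab the rendered AR frame and push it to WebRTC
}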
Original AR demo
https://github.com/miliPolo/ARSolarPlay
Pushing this stream out over WebRTC turns out to be quite interesting; AR + WebRTC may well be a direction worth exploring.