This article is a summary of AVFoundation in iOS development; I hope it serves as a useful reference for anyone working with media capture and playback.
1. Using Assets [my understanding: the source of the data]
An asset can come from the iPod media library or the photo library, or it can be a file.
Creating an asset object:
NSURL *url = // a URL that identifies a resource, such as a movie file
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
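A minimal sketch of creating the asset and loading its tracks before use, assuming the AVFoundation framework is imported and this runs inside a method; the example.com URL is a stand-in, and the @"tracks" key plus the completion handler follow the usual AVAsynchronousKeyValueLoading pattern:
NSURL *url = [NSURL URLWithString:@"http://example.com/movie.mp4"]; // hypothetical URL
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
// Load the "tracks" property asynchronously so that later access does not block.
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
    if (status == AVKeyValueStatusLoaded) {
        // The asset's tracks can now be read safely.
    } else {
        // Handle the failure (the error describes what went wrong).
    }
}];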
2. Getting images from a video
Use the AVAssetImageGenerator class.
It generates single images or image sequences from an asset.
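A short sketch, assuming the asset created above, of grabbing a single frame with AVAssetImageGenerator; the one-second time value is arbitrary:
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
imageGenerator.appliesPreferredTrackTransform = YES; // respect the video's orientation
NSError *error = nil;
CMTime actualTime;
// Copy the frame closest to the one-second mark.
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:CMTimeMake(1, 1) actualTime:&actualTime error:&error];
if (imageRef) {
    UIImage *frame = [UIImage imageWithCGImage:imageRef];
    // Use the frame (display it, write it to disk, and so on).
    CGImageRelease(imageRef); // copyCGImageAtTime: follows the Create/Copy rule
}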
3. Playback
When playing video you can use AVPlayer or AVQueuePlayer; AVPlayer is the parent class of AVQueuePlayer.
a. First create a URL for the media.
b. Load the URL into an AVPlayerItem.
c. Play the item with an AVPlayer.
You can also control the playback speed through the rate property: 1.0 is normal speed, 0.0 is paused, and other values play slower or faster.
You can also play several items in sequence:
NSArray *items = // an array of AVPlayerItems to play
AVQueuePlayer *queuePlayer = [[AVQueuePlayer alloc] initWithItems:items];
Then work with an AVPlayerItem:
AVPlayerItem *anItem = // get a player item
Use canInsertItem:afterItem: to test whether the item can be inserted into the queue.
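A rough sketch tying steps a to c together, assuming a playable url such as the one from section 1; the rate change and the queue insertion at the end are optional illustrations:
// Steps a to c: wrap the URL in a player item and play it with an AVPlayer.
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:url];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];
player.rate = 0.5; // half speed; 1.0 is normal speed, 0.0 pauses

// For several items, build separate AVPlayerItems and hand them to an AVQueuePlayer.
AVPlayerItem *first = [AVPlayerItem playerItemWithURL:url];  // reusing url is purely illustrative
AVPlayerItem *second = [AVPlayerItem playerItemWithURL:url];
AVQueuePlayer *queuePlayer = [[AVQueuePlayer alloc] initWithItems:[NSArray arrayWithObject:first]];
if ([queuePlayer canInsertItem:second afterItem:nil]) {
    [queuePlayer insertItem:second afterItem:nil]; // nil appends the item to the end of the queue
}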
4. Media capture
You can configure the quality and resolution of captured output with the session presets below:
AVCaptureSessionPresetHigh (High): Highest recording quality; this varies per device.
AVCaptureSessionPresetMedium (Medium): Suitable for WiFi sharing; the actual values may change.
AVCaptureSessionPresetLow (Low): Suitable for 3G sharing; the actual values may change.
AVCaptureSessionPreset640x480 (640x480): VGA.
AVCaptureSessionPreset1280x720 (1280x720): 720p HD.
AVCaptureSessionPresetPhoto (Photo): Full photo resolution; not supported for video output.
Check whether a preset can be used on the current device:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    session.sessionPreset = AVCaptureSessionPreset1280x720;
} else {
    // Handle the failure.
}
Of course, operations such as adding the devices you want to use and removing the old ones should be performed between [session beginConfiguration] and [session commitConfiguration].
5. When you do not know what features a device has, you can enumerate the devices and inspect them:
NSArray *devices = [AVCaptureDevice devices];
for (AVCaptureDevice *device in devices) {
    NSLog(@"Device name: %@", [device localizedName]);
    // You can also check the device's position.
    if ([device hasMediaType:AVMediaTypeVideo]) {
        if ([device position] == AVCaptureDevicePositionBack) {
            NSLog(@"Device position: back");
        } else {
            NSLog(@"Device position: front");
        }
    }
}
The following demo shows how to find video input devices that have a torch:
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
NSMutableArray *torchDevices = [[NSMutableArray alloc] init];
for (AVCaptureDevice *device in devices) {
    if ([device hasTorch] && [device supportsAVCaptureSessionPreset:AVCaptureSessionPreset640x480]) {
        [torchDevices addObject:device];
    }
}
6. Switching between devices
AVCaptureSession *session = // an existing capture session
[session beginConfiguration];
[session removeInput:frontFacingCameraDeviceInput];
[session addInput:backFacingCameraDeviceInput];
[session commitConfiguration];
7. Configuring an AVCaptureDeviceInput
AVCaptureSession *captureSession = <#Get a capture session#>;
AVCaptureDeviceInput *captureDeviceInput = <#Get a capture device input#>;
// Check whether the input can be added.
if ([captureSession canAddInput:captureDeviceInput]) {
// Add it if it can.
[captureSession addInput:captureDeviceInput];
} else {
// Handle the failure.
}
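For completeness, a small sketch of where that input comes from; backCamera here stands for an AVCaptureDevice found as in section 5, and the error handling is the usual pattern:
NSError *error = nil;
AVCaptureDeviceInput *captureDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
if (!captureDeviceInput) {
    NSLog(@"Could not create device input: %@", error);
}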
8. Configuring an AVCaptureOutput
Output types:
a. AVCaptureMovieFileOutput writes the output to a movie file.
b. AVCaptureVideoDataOutput provides access to the video frames as they are captured.
c. AVCaptureAudioDataOutput provides access to the audio data as it is captured.
d. AVCaptureStillImageOutput captures still images together with their metadata.
AVCaptureSession *captureSession = <#Get a capture session#>;
AVCaptureMovieFileOutput *movieOutput = <#Create and configure a movie output#>;
if ([captureSession canAddOutput:movieOutput]) {
[captureSession addOutput:movieOutput];
} else {
// Handle the failure.
}
9. Saving to a movie file
AVCaptureMovieFileOutput *aMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
CMTime maxDuration = <#Create a CMTime to represent the maximum duration#>;
aMovieFileOutput.maxRecordedDuration = maxDuration;
aMovieFileOutput.minFreeDiskSpaceLimit = <#An appropriate minimum given the quality of the movie format and the duration#>;
10. Starting a recording
The delegate must conform to the AVCaptureFileOutputRecordingDelegate protocol, and must implement the captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error: method.
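A sketch of starting a recording and the required delegate callback, assuming aMovieFileOutput from section 9 has been added to a running session and that self adopts AVCaptureFileOutputRecordingDelegate:
// Inside the method that kicks off recording:
NSURL *outputURL = <#A writable file URL for the movie#>;
[aMovieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];

// The required delegate callback:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    if (error) {
        NSLog(@"Recording failed: %@", error);
    } else {
        NSLog(@"Movie saved to %@", outputFileURL);
    }
}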
11. Pixel and encoding formats
Supported pixel and encoding formats per device:
iPhone 3G: yuvs, 2vuy, BGRA, jpeg
iPhone 3GS: 420f, 420v, BGRA, jpeg
iPhone 4: 420f, 420v, BGRA, jpeg
12. Capturing still images
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
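A sketch of actually taking the picture, assuming the stillImageOutput above has been added to a running session; the connection lookup and JPEG conversion are the standard AVCaptureStillImageOutput calls:
AVCaptureConnection *videoConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                               completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    if (imageSampleBuffer) {
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        // Use the image (save it to the photo library, show it in the UI, and so on).
    }
}];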
13. Gravity modes
The preview layer supports three gravity modes that you set using videoGravity (a short sketch follows this list):
● AVLayerVideoGravityResizeAspect: This preserves the aspect ratio, leaving black bars where the video does not fill the available screen area.
● AVLayerVideoGravityResizeAspectFill: This preserves the aspect ratio, but fills the available screen area, cropping the video when necessary.
● AVLayerVideoGravityResize: This simply stretches the video to fill the available screen area, even if doing so distorts the image.
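A minimal sketch of attaching a preview layer to a capture session and picking one of the gravity modes above; session is assumed to be a configured AVCaptureSession and self.view an arbitrary host view:
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill the view, cropping when necessary
previewLayer.frame = self.view.bounds;
[self.view.layer addSublayer:previewLayer];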
Switching between the front and back cameras:
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices)
        if (device.position == position)
            return device;
    return nil;
}

- (void)swapFrontAndBackCameras {
    // Assume the session is already running
    NSArray *inputs = self.session.inputs;
    for (AVCaptureDeviceInput *input in inputs) {
        AVCaptureDevice *device = input.device;
        if ([device hasMediaType:AVMediaTypeVideo]) {
            AVCaptureDevicePosition position = device.position;
            AVCaptureDevice *newCamera = nil;
            AVCaptureDeviceInput *newInput = nil;
            if (position == AVCaptureDevicePositionFront)
                newCamera = [self cameraWithPosition:AVCaptureDevicePositionBack];
            else
                newCamera = [self cameraWithPosition:AVCaptureDevicePositionFront];
            newInput = [AVCaptureDeviceInput deviceInputWithDevice:newCamera error:nil];
            // beginConfiguration ensures that pending changes are not applied immediately
            [self.session beginConfiguration];
            [self.session removeInput:input];
            [self.session addInput:newInput];
            // Changes take effect once the outermost commitConfiguration is invoked.
            [self.session commitConfiguration];
            break;
        }
    }
}
That concludes this summary of AVFoundation for iOS development; I hope it is helpful.