iOS AVFoundation/AVCaptureSession: Implementing a Custom Camera Recording UI (Part 3)

This article walks through recording video behind a custom camera UI using AVFoundation's AVCaptureSession, as a practical reference for developers solving the same problem.

Classes used:

  • AVCaptureSession
  • AVCaptureVideoPreviewLayer
  • AVCaptureDeviceInput
  • AVCaptureConnection
  • AVCaptureVideoDataOutput
  • AVCaptureAudioDataOutput
  • AVAssetWriter
  • AVAssetWriterInput

    // AVCaptureMovieFileOutput is a subclass of AVCaptureFileOutput. AVCaptureOutput's subclasses are AVCaptureFileOutput, AVCaptureAudioDataOutput, AVCaptureVideoDataOutput, and AVCaptureStillImageOutput (replaced by AVCapturePhotoOutput in iOS 10.0 and later) -- these kinds of subclasses differ in how they hand data back to you.

    /* Data outputs manage the captured data directly:
       AVCaptureAudioDataOutput  // audio output, delivered as sample buffers
       AVCaptureVideoDataOutput  // video output, delivered as sample buffers
       AVCaptureStillImageOutput // still-image output, retrievable as NSData */

    /* File outputs write the captured data to a file:
       AVCaptureFileOutput
       {   // subclasses
           AVCaptureAudioFileOutput  // writes an audio file
           AVCaptureMovieFileOutput  // writes a movie file
       } */

        // Enable video stabilization

@property (nonatomic, strong) AVCaptureConnection *videoConnection;

    self.videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([self.videoConnection isVideoStabilizationSupported]) {
        self.videoConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto; // stabilization mode
    }




=========================




1. AVCaptureSession is AVFoundation's capture session class. To capture video, a client instantiates an AVCaptureSession and adds the appropriate AVCaptureInputs (such as AVCaptureDeviceInput) and outputs (such as AVCaptureMovieFileOutput). [AVCaptureSession startRunning] starts the flow of data from the inputs to the outputs, and [AVCaptureSession stopRunning] stops it. A client can customize the recording quality level or output bitrate by setting the sessionPreset property.
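The lifecycle described above can be sketched as follows (a minimal sketch; the preset choice and error handling are illustrative, not from the original article):

```objectivec
// Minimal capture-session lifecycle sketch.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh; // quality/bitrate level

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input]) {
    [session addInput:input];
}

AVCaptureMovieFileOutput *output = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:output]) {
    [session addOutput:output];
}

[session startRunning]; // data starts flowing from inputs to outputs
// ... record ...
[session stopRunning];  // stop the flow
```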


2. Each instance of AVCaptureDevice corresponds to a device, such as a camera or a microphone. Instances of AVCaptureDevice cannot be created directly. All existing devices can be obtained with the class methods devicesWithMediaType: and defaultDeviceWithMediaType:; a device may provide one or more streams of a given media type. An AVCaptureDevice instance is handed to an AVCaptureSession by wrapping it in an AVCaptureDeviceInput input source.
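As a small illustration of device discovery, using the same pre-iOS-10 `devicesWithMediaType:` API the article relies on:

```objectivec
// Enumerate all video-capable devices and pick the front camera.
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *frontCamera = nil;
for (AVCaptureDevice *device in devices) {
    if (device.position == AVCaptureDevicePositionFront) {
        frontCamera = device;
        break;
    }
}
// frontCamera can now be wrapped in an AVCaptureDeviceInput and added to a session.
```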


3. AVCaptureDeviceInput is an AVCaptureSession input source that feeds media data from a device into the session. It is created from an AVCaptureDevice instance -- here, the front or back camera, obtained via [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo].


4. AVCaptureVideoPreviewLayer is a Core Animation layer subclass used to preview an AVCaptureSession's video output -- in short, the layer on which the captured video is rendered.


5. AVCaptureMovieFileOutput is a subclass of AVCaptureFileOutput used to write media to QuickTime movie files. Because on iPhone this class cannot pause recording and cannot choose the output file type, it is not used here; instead, the more flexible AVCaptureVideoDataOutput and AVCaptureAudioDataOutput are used to implement recording.


6. AVCaptureVideoDataOutput is a subclass of AVCaptureOutput that outputs captured video frames, compressed or uncompressed. The frames it produces can be processed with whatever media APIs are appropriate; an application receives them through the captureOutput:didOutputSampleBuffer:fromConnection: delegate method.


7. AVCaptureAudioDataOutput is a subclass of AVCaptureOutput that outputs captured audio samples, compressed or uncompressed. The samples it produces can likewise be processed with appropriate media APIs; an application receives them through the same captureOutput:didOutputSampleBuffer:fromConnection: delegate method.


8. AVCaptureConnection represents a connection between one or more AVCaptureInputPorts and an AVCaptureOutput or AVCaptureVideoPreviewLayer within an AVCaptureSession.


9. AVAssetWriter provides services for writing media data to a new file. An AVAssetWriter instance specifies the format of the output file, such as QuickTime movie or MPEG-4. AVAssetWriter writes multiple parallel tracks of media data -- fundamentally a video track and an audio track, introduced below. A single AVAssetWriter instance can only be used to write one file; a client that wants to write several files must create a new AVAssetWriter instance each time.

10. AVAssetWriterInput appends media samples (CMSampleBuffer instances) to one track of an AVAssetWriter's output file. When there are several inputs, AVAssetWriter tries to interleave their media data in a pattern that is ideal for storage and playback efficiency. Whether an input can currently accept media data is indicated by its readyForMoreMediaData property: if readyForMoreMediaData is YES, the input can accept data, and only then may you append media data to it.
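Points 9 and 10 can be condensed into the following sketch (the output path and settings here are illustrative):

```objectivec
// One AVAssetWriter per output file; one AVAssetWriterInput per track.
NSURL *outURL = [NSURL fileURLWithPath:@"/tmp/out.mp4"]; // illustrative path
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outURL
                                                 fileType:AVFileTypeMPEG4
                                                    error:nil];
NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                 AVVideoWidthKey  : @720,
                                 AVVideoHeightKey : @1280 };
AVAssetWriterInput *videoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];
videoInput.expectsMediaDataInRealTime = YES;
[writer addInput:videoInput];

// Later, for each CMSampleBufferRef delivered by the capture callback:
// if (videoInput.readyForMoreMediaData) {
//     [videoInput appendSampleBuffer:sampleBuffer];
// }
```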



#import "WCLRecordVideoVC.h"

#import "WCLRecordEngine.h"

#import "WCLRecordProgressView.h"

#import <MobileCoreServices/MobileCoreServices.h>

#import <MediaPlayer/MediaPlayer.h>


typedef NS_ENUM(NSUInteger, UploadVieoStyle) {

    VideoRecord = 0,

    VideoLocation,

};


@interface WCLRecordVideoVC ()<WCLRecordEngineDelegate,UIImagePickerControllerDelegate,UINavigationControllerDelegate>


@property (weak, nonatomic) IBOutlet UIButton *flashLightBT;
@property (weak, nonatomic) IBOutlet UIButton *changeCameraBT;
@property (weak, nonatomic) IBOutlet UIButton *recordNextBT;
@property (weak, nonatomic) IBOutlet UIButton *recordBt;
@property (weak, nonatomic) IBOutlet UIButton *locationVideoBT;
@property (weak, nonatomic) IBOutlet NSLayoutConstraint *topViewTop;
@property (weak, nonatomic) IBOutlet WCLRecordProgressView *progressView;
@property (strong, nonatomic) WCLRecordEngine         *recordEngine;
@property (assign, nonatomic) BOOL                    allowRecord; // whether recording is allowed
@property (assign, nonatomic) UploadVieoStyle         videoStyle;  // how the video is sourced
@property (strong, nonatomic) UIImagePickerController *moviePicker; // video picker
@property (strong, nonatomic) MPMoviePlayerViewController *playerVC;


@end


@implementation WCLRecordVideoVC


- (void)dealloc {
    _recordEngine = nil;
    [[NSNotificationCenter defaultCenter] removeObserver:self name:MPMoviePlayerPlaybackDidFinishNotification object:[_playerVC moviePlayer]];
}


- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    [self.navigationController setNavigationBarHidden:YES animated:YES]; // hide the navigation bar
}


- (void)viewDidDisappear:(BOOL)animated {
    [super viewDidDisappear:animated];
    [self.recordEngine shutdown]; // shut recording down when the view disappears
}


// _recordEngine, the preview layer, and the session are initialized here
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    if (_recordEngine == nil) {
        // accessing self.recordEngine lazily creates the WCLRecordEngine
        [self.recordEngine previewLayer].frame = self.view.bounds;
        [self.view.layer insertSublayer:[self.recordEngine previewLayer] atIndex:0]; // insert the preview layer at the bottom
    }
    [self.recordEngine startUp]; // recording is controlled by starting/stopping the session
}


- (void)viewDidLoad {
    [super viewDidLoad];
    self.allowRecord = YES;
}


// Adjust the view layout according to the recording state
- (void)adjustViewFrame {
    [self.view layoutIfNeeded];
    [UIView animateWithDuration:0.4 delay:0.0 options:UIViewAnimationOptionCurveEaseInOut animations:^{
        if (self.recordBt.selected) {
            self.topViewTop.constant = -64;
            [[UIApplication sharedApplication] setStatusBarHidden:YES withAnimation:UIStatusBarAnimationSlide];
        } else {
            [[UIApplication sharedApplication] setStatusBarHidden:NO withAnimation:UIStatusBarAnimationSlide];
            self.topViewTop.constant = 0;
        }
        if (self.videoStyle == VideoRecord) {
            self.locationVideoBT.alpha = 0;
        }
        [self.view layoutIfNeeded]; // re-run layout
    } completion:nil];
}


#pragma mark - Getters/Setters
// Lazily create the WCLRecordEngine
- (WCLRecordEngine *)recordEngine {
    if (_recordEngine == nil) {
        _recordEngine = [[WCLRecordEngine alloc] init];
        _recordEngine.delegate = self;
    }
    return _recordEngine;
}


- (UIImagePickerController *)moviePicker {
    if (_moviePicker == nil) {
        _moviePicker = [[UIImagePickerController alloc] init];
        _moviePicker.delegate = self;
        _moviePicker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
        _moviePicker.mediaTypes = @[(NSString *)kUTTypeMovie];
    }
    return _moviePicker;
}


#pragma mark - Photo-library picker delegate
// Called when a video has been picked
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    if ([[info objectForKey:UIImagePickerControllerMediaType] isEqualToString:(NSString *)kUTTypeMovie]) {
        // Extract the video's file name
        NSString *videoPath = [NSString stringWithFormat:@"%@", [info objectForKey:UIImagePickerControllerMediaURL]];
        NSRange range = [videoPath rangeOfString:@"trim."]; // index of the matched substring
        NSString *content = [videoPath substringFromIndex:range.location + 5];
        // The video's extension
        NSRange rangeSuffix = [content rangeOfString:@"."];
        NSString *suffixName = [content substringFromIndex:rangeSuffix.location + 1];
        // If the video is in MOV format, convert it to MP4
        if ([suffixName isEqualToString:@"MOV"]) {
            NSURL *videoUrl = [info objectForKey:UIImagePickerControllerMediaURL];
            __weak typeof(self) weakSelf = self;
            [self.recordEngine changeMovToMp4:videoUrl dataBlock:^(UIImage *movieImage) {
                [weakSelf.moviePicker dismissViewControllerAnimated:YES completion:^{
                    weakSelf.playerVC = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:weakSelf.recordEngine.videoPath]];
                    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playVideoFinished:) name:MPMoviePlayerPlaybackDidFinishNotification object:[weakSelf.playerVC moviePlayer]];
                    [[weakSelf.playerVC moviePlayer] prepareToPlay];
                    [weakSelf presentMoviePlayerViewControllerAnimated:weakSelf.playerVC];
                    [[weakSelf.playerVC moviePlayer] play];
                }];
            }];
        }
    }
}


#pragma mark - WCLRecordEngineDelegate
- (void)recordProgress:(CGFloat)progress {
    if (progress >= 1) {
        [self recordAction:self.recordBt];
        self.allowRecord = NO;
    }
    self.progressView.progress = progress;
}


#pragma mark - Button actions
// Back button tapped
- (IBAction)dismissAction:(id)sender {
    [self.navigationController popViewControllerAnimated:YES];
}


// Toggle the torch
- (IBAction)flashLightAction:(id)sender {
    if (self.changeCameraBT.selected == NO) {
        self.flashLightBT.selected = !self.flashLightBT.selected;
        if (self.flashLightBT.selected == YES) {
            [self.recordEngine openFlashLight];
        } else {
            [self.recordEngine closeFlashLight];
        }
    }
}


// Switch between the front and back cameras
- (IBAction)changeCameraAction:(id)sender {
    self.changeCameraBT.selected = !self.changeCameraBT.selected;
    if (self.changeCameraBT.selected == YES) {
        // Front camera
        [self.recordEngine closeFlashLight];
        self.flashLightBT.selected = NO;
        [self.recordEngine changeCameraInputDeviceisFront:YES];
    } else {
        [self.recordEngine changeCameraInputDeviceisFront:NO];
    }
}


// "Next" tapped after recording
- (IBAction)recordNextAction:(id)sender {
    if (_recordEngine.videoPath.length > 0) {
        __weak typeof(self) weakSelf = self;
        [self.recordEngine stopCaptureHandler:^(UIImage *movieImage) {
            weakSelf.playerVC = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:weakSelf.recordEngine.videoPath]];
            [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playVideoFinished:) name:MPMoviePlayerPlaybackDidFinishNotification object:[weakSelf.playerVC moviePlayer]];
            [[weakSelf.playerVC moviePlayer] prepareToPlay];
            [weakSelf presentMoviePlayerViewControllerAnimated:weakSelf.playerVC];
            [[weakSelf.playerVC moviePlayer] play];
        }];
    } else {
        NSLog(@"Record a video first~");
    }
}


// Called when Done is tapped or playback finishes
- (void)playVideoFinished:(NSNotification *)theNotification {
    MPMoviePlayerController *player = [theNotification object];
    [[NSNotificationCenter defaultCenter] removeObserver:self name:MPMoviePlayerPlaybackDidFinishNotification object:player];
    [player stop];
    [self.playerVC dismissMoviePlayerViewControllerAnimated];
    self.playerVC = nil;
}


// "Local video" button tapped
- (IBAction)locationVideoAction:(id)sender {
    self.videoStyle = VideoLocation;
    [self.recordEngine shutdown];
    [self presentViewController:self.moviePicker animated:YES completion:nil];
}


// Start/pause recording (the red button)
- (IBAction)recordAction:(UIButton *)sender {
    if (self.allowRecord) { // recording allowed
        self.videoStyle = VideoRecord;
        self.recordBt.selected = !self.recordBt.selected; // toggle selected state
        if (self.recordBt.selected) {
            if (self.recordEngine.isCapturing) { // already capturing
                [self.recordEngine resumeCapture];
            } else {
                [self.recordEngine startCapture];
            }
        } else {
            [self.recordEngine pauseCapture];
        }
        [self adjustViewFrame];
    }
}


- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}


@end

==========================

#import "WCLRecordEngine.h"

#import "WCLRecordEncoder.h"

#import <AVFoundation/AVFoundation.h>

#import <Photos/Photos.h>


@interface WCLRecordEngine ()<AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate, CAAnimationDelegate>
{
    CMTime _timeOffset;  // accumulated pause offset
    CMTime _lastVideo;   // timestamp of the last video sample
    CMTime _lastAudio;   // timestamp of the last audio sample

    NSInteger _cx;       // video width
    NSInteger _cy;       // video height
    int _channels;       // audio channel count
    Float64 _samplerate; // audio sample rate
}


@property (strong, nonatomic) WCLRecordEncoder           *recordEncoder;    // recording encoder
@property (strong, nonatomic) AVCaptureSession           *recordSession;    // capture session
@property (strong, nonatomic) AVCaptureVideoPreviewLayer *previewLayer;     // layer that renders the captured video
@property (strong, nonatomic) AVCaptureDeviceInput       *backCameraInput;  // back-camera input
@property (strong, nonatomic) AVCaptureDeviceInput       *frontCameraInput; // front-camera input
@property (strong, nonatomic) AVCaptureDeviceInput       *audioMicInput;    // microphone input
@property (copy  , nonatomic) dispatch_queue_t           captureQueue;      // capture queue
@property (strong, nonatomic) AVCaptureConnection        *audioConnection;  // audio connection
@property (strong, nonatomic) AVCaptureConnection        *videoConnection;  // video connection
@property (strong, nonatomic) AVCaptureVideoDataOutput   *videoOutput;      // video output
@property (strong, nonatomic) AVCaptureAudioDataOutput   *audioOutput;      // audio output
@property (atomic, assign) BOOL isCapturing;          // currently recording
@property (atomic, assign) BOOL isPaused;             // paused
@property (atomic, assign) BOOL discont;              // was interrupted
@property (atomic, assign) CMTime startTime;          // time recording started
@property (atomic, assign) CGFloat currentRecordTime; // current recording duration


@end


@implementation WCLRecordEngine


- (void)dealloc {
    [_recordSession stopRunning];
    _captureQueue     = nil;
    _recordSession    = nil;
    _previewLayer     = nil;
    _backCameraInput  = nil;
    _frontCameraInput = nil;
    _audioOutput      = nil;
    _videoOutput      = nil;
    _audioConnection  = nil;
    _videoConnection  = nil;
    _recordEncoder    = nil;
}


- (instancetype)init
{
    self = [super init];
    if (self) {
        self.maxRecordTime = 60.0f;
    }
    return self;
}


#pragma mark - Public methods
// Bring the recorder up
- (void)startUp {
    NSLog(@"starting up the recorder");
    self.startTime = CMTimeMake(0, 0);
    self.isCapturing = NO;
    self.isPaused = NO;
    self.discont = NO;
    [self.recordSession startRunning]; // start the session
}

// Shut the recorder down
- (void)shutdown {
    _startTime = CMTimeMake(0, 0);
    if (_recordSession) {
        [_recordSession stopRunning]; // stop the session
    }
    [_recordEncoder finishWithCompletionHandler:^{
//        NSLog(@"recording finished");
    }];
}


// Start capturing
- (void)startCapture {
    @synchronized(self) {
        if (!self.isCapturing) {
//            NSLog(@"start capturing");
            self.recordEncoder = nil;
            self.isPaused = NO;
            self.discont = NO;
            _timeOffset = CMTimeMake(0, 0);
            self.isCapturing = YES;
        }
    }
}

// Pause capturing
- (void)pauseCapture {
    @synchronized(self) {
        if (self.isCapturing) {
//            NSLog(@"pause capturing");
            self.isPaused = YES;
            self.discont = YES;
        }
    }
}

// Resume capturing
- (void)resumeCapture {
    @synchronized(self) {
        if (self.isPaused) {
//            NSLog(@"resume capturing");
            self.isPaused = NO;
        }
    }
}

// Stop capturing
- (void)stopCaptureHandler:(void (^)(UIImage *movieImage))handler {
    @synchronized(self) {
        if (self.isCapturing) {
            NSString *path = self.recordEncoder.path;
            NSURL *url = [NSURL fileURLWithPath:path];
            self.isCapturing = NO;
            dispatch_async(_captureQueue, ^{
                [self.recordEncoder finishWithCompletionHandler:^{
                    self.isCapturing = NO;
                    self.recordEncoder = nil;
                    self.startTime = CMTimeMake(0, 0);
                    self.currentRecordTime = 0;
                    if ([self.delegate respondsToSelector:@selector(recordProgress:)]) {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            [self.delegate recordProgress:self.currentRecordTime / self.maxRecordTime];
                        });
                    }
                    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
                        [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:url];
                    } completionHandler:^(BOOL success, NSError *_Nullable error) {
                        if (success) { // only log when the save actually succeeded
                            NSLog(@"saved to the photo library");
                        }
                    }];
                    [self movieToImageHandler:handler];
                }];
            });
        }
    }
}


// Grab the first frame of the video as an image
- (void)movieToImageHandler:(void (^)(UIImage *movieImage))handler {
    NSURL *url = [NSURL fileURLWithPath:self.videoPath];
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = TRUE;
    CMTime thumbTime = CMTimeMakeWithSeconds(0, 60);
    generator.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;
    AVAssetImageGeneratorCompletionHandler generatorHandler =
    ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded) {
            UIImage *thumbImg = [UIImage imageWithCGImage:im];
            if (handler) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    handler(thumbImg);
                });
            }
        }
    };
    [generator generateCGImagesAsynchronouslyForTimes:
     [NSArray arrayWithObject:[NSValue valueWithCMTime:thumbTime]] completionHandler:generatorHandler];
}


#pragma mark - Getters/Setters
// The capture session
- (AVCaptureSession *)recordSession {
    if (_recordSession == nil) {
        _recordSession = [[AVCaptureSession alloc] init];
        // Add the back-camera input
        if ([_recordSession canAddInput:self.backCameraInput]) {
            [_recordSession addInput:self.backCameraInput];
        }
        // Add the microphone input
        if ([_recordSession canAddInput:self.audioMicInput]) {
            [_recordSession addInput:self.audioMicInput];
        }
        // Add the video output
        if ([_recordSession canAddOutput:self.videoOutput]) {
            [_recordSession addOutput:self.videoOutput];
            // Video resolution
            _cx = 720;
            _cy = 1280;
        }
        // Add the audio output
        if ([_recordSession canAddOutput:self.audioOutput]) {
            [_recordSession addOutput:self.audioOutput];
        }
        // Set the recording orientation
        self.videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    return _recordSession;
}


// Back-camera input
- (AVCaptureDeviceInput *)backCameraInput {
    if (_backCameraInput == nil) {
        NSError *error;
        _backCameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backCamera] error:&error];
        if (error) {
            NSLog(@"failed to get the back camera~");
        }
    }
    return _backCameraInput;
}


// Front-camera input
- (AVCaptureDeviceInput *)frontCameraInput {
    if (_frontCameraInput == nil) {
        NSError *error;
        _frontCameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self frontCamera] error:&error];
        if (error) {
            NSLog(@"failed to get the front camera~");
        }
    }
    return _frontCameraInput;
}


// Microphone input
- (AVCaptureDeviceInput *)audioMicInput {
    if (_audioMicInput == nil) {
        AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        NSError *error;
        _audioMicInput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:&error];
        if (error) {
            NSLog(@"failed to get the microphone~");
        }
    }
    return _audioMicInput;
}


// Video output
- (AVCaptureVideoDataOutput *)videoOutput {
    if (_videoOutput == nil) {
        _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        [_videoOutput setSampleBufferDelegate:self queue:self.captureQueue];
        NSDictionary *setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                        [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], (id)kCVPixelBufferPixelFormatTypeKey,
                                        nil];
        _videoOutput.videoSettings = setcapSettings;
    }
    return _videoOutput;
}


// Audio output
- (AVCaptureAudioDataOutput *)audioOutput {
    if (_audioOutput == nil) {
        _audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        [_audioOutput setSampleBufferDelegate:self queue:self.captureQueue];
    }
    return _audioOutput;
}


// Video connection
- (AVCaptureConnection *)videoConnection {
    _videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    return _videoConnection;
}


// Audio connection
- (AVCaptureConnection *)audioConnection {
    if (_audioConnection == nil) {
        _audioConnection = [self.audioOutput connectionWithMediaType:AVMediaTypeAudio];
    }
    return _audioConnection;
}


// Layer that renders the captured video
- (AVCaptureVideoPreviewLayer *)previewLayer {
    if (_previewLayer == nil) {
        // Initialized from the AVCaptureSession
        AVCaptureVideoPreviewLayer *preview = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.recordSession]; // self.recordSession lazily creates the session
        // Scale the video to fill the screen
        preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
        _previewLayer = preview;
    }
    return _previewLayer;
}


// The capture queue
- (dispatch_queue_t)captureQueue {
    if (_captureQueue == nil) {
        _captureQueue = dispatch_queue_create("cn.qiuyouqun.im.wclrecordengine.capture", DISPATCH_QUEUE_SERIAL);
    }
    return _captureQueue;
}


#pragma mark - Camera-flip transition animation
- (void)changeCameraAnimation {
    CATransition *changeAnimation = [CATransition animation];
    changeAnimation.delegate = self;
    changeAnimation.duration = 0.45;
    changeAnimation.type = @"oglFlip";
    changeAnimation.subtype = kCATransitionFromRight;
    // timingFunction expects a CAMediaTimingFunction (the original assigned a UIViewAnimationCurve constant)
    changeAnimation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionEaseInEaseOut];
    [self.previewLayer addAnimation:changeAnimation forKey:@"changeAnimation"];
}


- (void)animationDidStart:(CAAnimation *)anim {
    self.videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
    [self.recordSession startRunning];
}


#pragma mark - Convert a MOV file to MP4
- (void)changeMovToMp4:(NSURL *)mediaURL dataBlock:(void (^)(UIImage *movieImage))handler {
    AVAsset *video = [AVAsset assetWithURL:mediaURL];
    AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:video presetName:AVAssetExportPreset1280x720];
    exportSession.shouldOptimizeForNetworkUse = YES;
    exportSession.outputFileType = AVFileTypeMPEG4;
    NSString *basePath = [self getVideoCachePath];
    self.videoPath = [basePath stringByAppendingPathComponent:[self getUploadFile_type:@"video" fileType:@"mp4"]];
    exportSession.outputURL = [NSURL fileURLWithPath:self.videoPath];
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        [self movieToImageHandler:handler];
    }];
}


#pragma mark - Video devices
// The front camera
- (AVCaptureDevice *)frontCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionFront];
}


// The back camera
- (AVCaptureDevice *)backCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionBack];
}



// Switch between the front and back cameras
- (void)changeCameraInputDeviceisFront:(BOOL)isFront {
    if (isFront) {
        [self.recordSession stopRunning];
        [self.recordSession removeInput:self.backCameraInput];
        if ([self.recordSession canAddInput:self.frontCameraInput]) {
            [self changeCameraAnimation];
            [self.recordSession addInput:self.frontCameraInput];
        }
    } else {
        [self.recordSession stopRunning];
        [self.recordSession removeInput:self.frontCameraInput];
        if ([self.recordSession canAddInput:self.backCameraInput]) {
            [self changeCameraAnimation];
            [self.recordSession addInput:self.backCameraInput];
        }
    }
}


// Return the camera at the given position (front or back)
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    // All default devices capable of capturing video
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    // Return the device matching the requested position
    for (AVCaptureDevice *device in devices) {
        if ([device position] == position) {
            return device;
        }
    }
    return nil;
}


// Turn the torch on
- (void)openFlashLight {
    AVCaptureDevice *backCamera = [self backCamera];
    if (backCamera.torchMode == AVCaptureTorchModeOff) {
        [backCamera lockForConfiguration:nil];
        backCamera.torchMode = AVCaptureTorchModeOn;
        backCamera.flashMode = AVCaptureFlashModeOn;
        [backCamera unlockForConfiguration];
    }
}

// Turn the torch off
- (void)closeFlashLight {
    AVCaptureDevice *backCamera = [self backCamera];
    if (backCamera.torchMode == AVCaptureTorchModeOn) {
        [backCamera lockForConfiguration:nil];
        backCamera.torchMode = AVCaptureTorchModeOff;
        backCamera.flashMode = AVCaptureFlashModeOff; // flashMode takes AVCaptureFlashMode values (the original assigned a torch-mode constant)
        [backCamera unlockForConfiguration];
    }
}


// The directory where videos are cached
- (NSString *)getVideoCachePath {
    NSString *videoCache = [NSTemporaryDirectory() stringByAppendingPathComponent:@"videos"];
    BOOL isDir = NO;
    NSFileManager *fileManager = [NSFileManager defaultManager];
    BOOL existed = [fileManager fileExistsAtPath:videoCache isDirectory:&isDir];
    if (!(isDir == YES && existed == YES)) {
        [fileManager createDirectoryAtPath:videoCache withIntermediateDirectories:YES attributes:nil error:nil];
    }
    return videoCache;
}


// Build a file name like "video_HHmmss.mp4"
- (NSString *)getUploadFile_type:(NSString *)type fileType:(NSString *)fileType {
    NSTimeInterval now = [[NSDate date] timeIntervalSince1970];
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    [formatter setDateFormat:@"HHmmss"];
    NSDate *NowDate = [NSDate dateWithTimeIntervalSince1970:now];
    NSString *timeStr = [formatter stringFromDate:NowDate];
    NSString *fileName = [NSString stringWithFormat:@"%@_%@.%@", type, timeStr, fileType];
    return fileName;
}


#pragma mark - Writing the data
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
//    NSLog(@"writing data");
    BOOL isVideo = YES;
    @synchronized(self) {
        if (!self.isCapturing || self.isPaused) {
            return;
        }
        if (captureOutput != self.videoOutput) {
            isVideo = NO;
        }
        // Create the encoder once the audio parameters are known (i.e. on the first audio sample)
        if ((self.recordEncoder == nil) && !isVideo) {
            CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
            [self setAudioFormat:fmt];
            NSString *videoName = [self getUploadFile_type:@"video" fileType:@"mp4"];
            self.videoPath = [[self getVideoCachePath] stringByAppendingPathComponent:videoName];
            self.recordEncoder = [WCLRecordEncoder encoderForPath:self.videoPath Height:_cy width:_cx channels:_channels samples:_samplerate];
        }
        // Handle an interruption (pause) in the recording
        if (self.discont) {
            if (isVideo) {
                return;
            }
            self.discont = NO;
            // Compute how long the pause lasted
            CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTime last = isVideo ? _lastVideo : _lastAudio;
            if (last.flags & kCMTimeFlags_Valid) {
                if (_timeOffset.flags & kCMTimeFlags_Valid) {
                    pts = CMTimeSubtract(pts, _timeOffset);
                }
                CMTime offset = CMTimeSubtract(pts, last);
                if (_timeOffset.value == 0) {
                    _timeOffset = offset;
                } else {
                    _timeOffset = CMTimeAdd(_timeOffset, offset);
                }
            }
            _lastVideo.flags = 0;
            _lastAudio.flags = 0;
        }
        // Retain the sample buffer so it cannot be released while we hold or modify it
        CFRetain(sampleBuffer);
        if (_timeOffset.value > 0) {
            CFRelease(sampleBuffer);
            // Shift the timestamps by the accumulated pause offset
            sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
        }
        // Remember this sample's timestamp for the next pause calculation
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        CMTime dur = CMSampleBufferGetDuration(sampleBuffer);
        if (dur.value > 0) {
            pts = CMTimeAdd(pts, dur);
        }
        if (isVideo) {
            _lastVideo = pts;
        } else {
            _lastAudio = pts;
        }
    }
    CMTime dur = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (self.startTime.value == 0) {
        self.startTime = dur;
    }
    CMTime sub = CMTimeSubtract(dur, self.startTime);
    self.currentRecordTime = CMTimeGetSeconds(sub);
    if (self.currentRecordTime > self.maxRecordTime) {
        if (self.currentRecordTime - self.maxRecordTime < 0.1) {
            if ([self.delegate respondsToSelector:@selector(recordProgress:)]) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.delegate recordProgress:self.currentRecordTime / self.maxRecordTime];
                });
            }
        }
        return;
    }
    if ([self.delegate respondsToSelector:@selector(recordProgress:)]) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.delegate recordProgress:self.currentRecordTime / self.maxRecordTime];
        });
    }
    // Hand the sample to the encoder
    [self.recordEncoder encodeFrame:sampleBuffer isVideo:isVideo];
    CFRelease(sampleBuffer);
}


// Record the audio format
- (void)setAudioFormat:(CMFormatDescriptionRef)fmt {
    const AudioStreamBasicDescription *asbd = CMAudioFormatDescriptionGetStreamBasicDescription(fmt);
    _samplerate = asbd->mSampleRate;
    _channels = asbd->mChannelsPerFrame;
}


// Shift a sample buffer's timestamps by the given offset
- (CMSampleBufferRef)adjustTime:(CMSampleBufferRef)sample by:(CMTime)offset {
    CMItemCount count;
    CMSampleBufferGetSampleTimingInfoArray(sample, 0, nil, &count);
    CMSampleTimingInfo *pInfo = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sample, count, pInfo, &count);
    for (CMItemCount i = 0; i < count; i++) {
        pInfo[i].decodeTimeStamp = CMTimeSubtract(pInfo[i].decodeTimeStamp, offset);
        pInfo[i].presentationTimeStamp = CMTimeSubtract(pInfo[i].presentationTimeStamp, offset);
    }
    CMSampleBufferRef sout;
    CMSampleBufferCreateCopyWithNewTiming(nil, sample, count, pInfo, &sout);
    free(pInfo);
    return sout;
}


@end


=================================

#import <Foundation/Foundation.h>

#import <UIKit/UIKit.h>

#import <AVFoundation/AVCaptureVideoPreviewLayer.h>


@protocol WCLRecordEngineDelegate <NSObject>


- (void)recordProgress:(CGFloat)progress;


@end


@interface WCLRecordEngine : NSObject


@property (atomic, assign, readonly) BOOL isCapturing;          // currently recording
@property (atomic, assign, readonly) BOOL isPaused;             // paused
@property (atomic, assign, readonly) CGFloat currentRecordTime; // current recording duration
@property (atomic, assign) CGFloat maxRecordTime;               // maximum recording duration
@property (weak, nonatomic) id<WCLRecordEngineDelegate> delegate;
@property (atomic, strong) NSString *videoPath;                 // path of the recorded video


// Layer that renders the captured video
- (AVCaptureVideoPreviewLayer *)previewLayer;
// Bring the recorder up
- (void)startUp;
// Shut the recorder down
- (void)shutdown;
// Start capturing
- (void)startCapture;
// Pause capturing
- (void)pauseCapture;
// Stop capturing
- (void)stopCaptureHandler:(void (^)(UIImage *movieImage))handler;
// Resume capturing
- (void)resumeCapture;
// Turn the torch on
- (void)openFlashLight;
// Turn the torch off
- (void)closeFlashLight;
// Switch between the front and back cameras
- (void)changeCameraInputDeviceisFront:(BOOL)isFront;
// Convert a MOV video to MP4
- (void)changeMovToMp4:(NSURL *)mediaURL dataBlock:(void (^)(UIImage *movieImage))handler;


@end


=======================


Writing the video (WCLRecordEncoder)

#import "WCLRecordEncoder.h"


@interface WCLRecordEncoder ()


@property (nonatomic, strong) AVAssetWriter *writer;          // media writer
@property (nonatomic, strong) AVAssetWriterInput *videoInput; // video track input
@property (nonatomic, strong) AVAssetWriterInput *audioInput; // audio track input
@property (nonatomic, strong) NSString *path;                 // output path


@end


@implementation WCLRecordEncoder


- (void)dealloc {
    _writer = nil;
    _videoInput = nil;
    _audioInput = nil;
    _path = nil;
}


// Convenience constructor for WCLRecordEncoder
+ (WCLRecordEncoder *)encoderForPath:(NSString *)path Height:(NSInteger)cy width:(NSInteger)cx channels:(int)ch samples:(Float64)rate {
    WCLRecordEncoder *enc = [WCLRecordEncoder alloc];
    return [enc initPath:path Height:cy width:cx channels:ch samples:rate];
}


// Designated initializer
- (instancetype)initPath:(NSString *)path Height:(NSInteger)cy width:(NSInteger)cx channels:(int)ch samples:(Float64)rate {
    self = [super init];
    if (self) {
        self.path = path;
        // Delete any existing file at the path so the recording starts fresh
        [[NSFileManager defaultManager] removeItemAtPath:self.path error:nil];
        NSURL *url = [NSURL fileURLWithPath:self.path];
        // Create a writer whose container format is MP4
        _writer = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeMPEG4 error:nil];
        // Interleave samples so the file plays better over a network
        _writer.shouldOptimizeForNetworkUse = YES;
        // Set up the video input
        [self initVideoInputHeight:cy width:cx];
        // Only set up the audio input once a valid sample rate and channel count are known
        if (rate != 0 && ch != 0) {
            [self initAudioInputChannels:ch samples:rate];
        }
    }
    return self;
}


// Set up the video writer input
- (void)initVideoInputHeight:(NSInteger)cy width:(NSInteger)cx {
    // Video configuration: codec and resolution
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              AVVideoCodecH264, AVVideoCodecKey,
                              [NSNumber numberWithInteger:cx], AVVideoWidthKey,
                              [NSNumber numberWithInteger:cy], AVVideoHeightKey,
                              nil];
    _videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];
    // Tell the input to tune its processing for a real-time data source
    _videoInput.expectsMediaDataInRealTime = YES;
    // Attach the video input to the writer
    [_writer addInput:_videoInput];
}


// Set up the audio writer input
- (void)initAudioInputChannels:(int)ch samples:(Float64)rate {
    // Audio configuration: AAC encoding, channel count, sample rate, and bit rate
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                              [NSNumber numberWithInt:ch], AVNumberOfChannelsKey,
                              [NSNumber numberWithFloat:rate], AVSampleRateKey,
                              [NSNumber numberWithInt:128000], AVEncoderBitRateKey,
                              nil];
    _audioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:settings];
    // Tell the input to tune its processing for a real-time data source
    _audioInput.expectsMediaDataInRealTime = YES;
    // Attach the audio input to the writer
    [_writer addInput:_audioInput];
}


// Called when recording finishes
- (void)finishWithCompletionHandler:(void (^)(void))handler {
    [_writer finishWritingWithCompletionHandler:handler];
}


// All sample data is written through this method
- (BOOL)encodeFrame:(CMSampleBufferRef)sampleBuffer isVideo:(BOOL)isVideo {
    // Only proceed once the buffer's data is ready
    if (CMSampleBufferDataIsReady(sampleBuffer)) {
        // While the writer's status is still unknown, wait for the first
        // video frame so the session starts on a video timestamp
        if (_writer.status == AVAssetWriterStatusUnknown && isVideo) {
            // Use this frame's presentation timestamp as the session start time
            CMTime startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            [_writer startWriting];                       // prepare to write
            [_writer startSessionAtSourceTime:startTime]; // start the session at this timestamp
        }
        // Bail out if the writer has failed
        if (_writer.status == AVAssetWriterStatusFailed) {
            NSLog(@"writer error %@", _writer.error.localizedDescription);
            return NO;
        }
        if (isVideo) {
            // Append only when the video input can accept more media data
            if (_videoInput.readyForMoreMediaData == YES) {
                [_videoInput appendSampleBuffer:sampleBuffer];
                return YES;
            }
        } else {
            // Append only when the audio input can accept more media data
            if (_audioInput.readyForMoreMediaData) {
                [_audioInput appendSampleBuffer:sampleBuffer];
                return YES;
            }
        }
    }
    return NO;
}


@end

=========================

#import <Foundation/Foundation.h>

#import <AVFoundation/AVFoundation.h>


/**
 *  Class that writes and encodes the video
 */
@interface WCLRecordEncoder : NSObject


@property (nonatomic,readonly)NSString *path;


/**
 *  Convenience constructor for WCLRecordEncoder
 *
 *  @param path Path where the media is stored
 *  @param cy   Video height in pixels
 *  @param cx   Video width in pixels
 *  @param ch   Number of audio channels
 *  @param rate Audio sample rate
 *
 *  @return A WCLRecordEncoder instance
 */
+ (WCLRecordEncoder *)encoderForPath:(NSString *)path Height:(NSInteger)cy width:(NSInteger)cx channels:(int)ch samples:(Float64)rate;


/**
 *  Designated initializer
 *
 *  @param path Path where the media is stored
 *  @param cy   Video height in pixels
 *  @param cx   Video width in pixels
 *  @param ch   Number of audio channels
 *  @param rate Audio sample rate
 *
 *  @return A WCLRecordEncoder instance
 */
- (instancetype)initPath:(NSString *)path Height:(NSInteger)cy width:(NSInteger)cx channels:(int)ch samples:(Float64)rate;


/**
 *  Called when recording finishes
 *
 *  @param handler Completion block
 */
- (void)finishWithCompletionHandler:(void (^)(void))handler;


/**
 *  All sample data is written through this method
 *
 *  @param sampleBuffer The sample buffer to write
 *  @param isVideo      Whether the buffer is video
 *
 *  @return Whether the write succeeded
 */
- (BOOL)encodeFrame:(CMSampleBufferRef)sampleBuffer isVideo:(BOOL)isVideo;


@end


================================

#import "WCLRecordProgressView.h"


@implementation WCLRecordProgressView


- (void)setProgress:(CGFloat)progress {
    _progress = progress;
    [self setNeedsDisplay];
}

- (void)setProgressBgColor:(UIColor *)progressBgColor {
    _progressBgColor = progressBgColor;
    [self setNeedsDisplay];
}

- (void)setLoadProgressColor:(UIColor *)loadProgressColor {
    _loadProgressColor = loadProgressColor;
    [self setNeedsDisplay];
}

- (void)setLoadProgress:(CGFloat)loadProgress {
    _loadProgress = loadProgress;
    [self setNeedsDisplay];
}

- (void)setProgressColor:(UIColor *)progressColor {
    _progressColor = progressColor;
    [self setNeedsDisplay];
}


- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Background track, drawn at half opacity
    CGContextAddRect(context, CGRectMake(0, 0, rect.size.width, rect.size.height));
    [self.progressBgColor set];
    CGContextSetAlpha(context, 0.5);
    CGContextDrawPath(context, kCGPathFill);
    // Loaded (buffered) portion
    CGContextAddRect(context, CGRectMake(0, 0, rect.size.width * self.loadProgress, rect.size.height));
    [self.loadProgressColor set];
    CGContextSetAlpha(context, 1);
    CGContextDrawPath(context, kCGPathFill);
    // Current recording progress
    CGContextAddRect(context, CGRectMake(0, 0, rect.size.width * self.progress, rect.size.height));
    [self.progressColor set];
    CGContextSetAlpha(context, 1);
    CGContextDrawPath(context, kCGPathFill);
}


@end


=====================

#import <UIKit/UIKit.h>


IB_DESIGNABLE


@interface WCLRecordProgressView : UIView

@property (assign, nonatomic) IBInspectable CGFloat progress;         // current progress
@property (strong, nonatomic) IBInspectable UIColor *progressBgColor; // track background color
@property (strong, nonatomic) IBInspectable UIColor *progressColor;   // progress bar color
@property (assign, nonatomic) CGFloat loadProgress;                   // loaded (buffered) progress
@property (strong, nonatomic) UIColor *loadProgressColor;             // color of the loaded portion


@end



Reprinted from: https://github.com/631106979/WCLRecordVideo.git


This concludes part 3 of implementing a custom camera recording interface with AVFoundation/AVCaptureSession on iOS. I hope it proves useful to fellow developers!


