Camera2 + OpenGL ES + MediaCodec + AudioRecord: recording audio/video and writing H.264 SEI data


This post documents my learning process. The task: record audio and video based on Camera2 + OpenGL ES + MediaCodec + AudioRecord.

Requirements:

  1. Write custom SEI data into every encoded video frame, so that per-frame data can be recovered later when decoding.
  2. When recording is triggered, save the audio/video from N seconds before the trigger to N seconds after it; output files are split into 60-second segments.
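
Requirement 2 amounts to keeping a rolling cache and, when recording is triggered at time T, selecting everything in [T - N, T + N]. Below is a minimal standalone sketch of that selection; the class and method names are illustrative, not from the project, and since frames after T keep arriving, a real implementation re-runs the selection as they come in.

```java
import java.util.ArrayList;
import java.util.List;

public class PrePostWindow {
    // Given cached frame timestamps (microseconds) and a trigger time,
    // pick the frames inside [trigger - window, trigger + window].
    public static List<Long> select(List<Long> cachedUs, long triggerUs, long windowUs) {
        List<Long> out = new ArrayList<>();
        for (long t : cachedUs) {
            if (t >= triggerUs - windowUs && t <= triggerUs + windowUs) {
                out.add(t);
            }
        }
        return out;
    }
}
```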

A note up front: the walkthrough covers the following topics, so you can quickly check whether what you need is here.

  • Using MediaCodec: createInputSurface() creates a Surface, which receives the camera2 frames through EGL.
  • Using AudioRecord
  • Using Camera2
  • Basic OpenGL usage
  • A simple example of writing H.264 SEI data
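
On the SEI point: an H.264 SEI user-data NAL unit is just a byte sequence prepended to the encoded frame: an Annex-B start code, NAL header 0x06 (SEI), payload type 0x05 (user_data_unregistered), a payload size, the payload itself, and an rbsp_trailing_bits byte 0x80. Here is a hedged standalone sketch of writing and reading such a unit. Note the spec also expects a 16-byte UUID before the user data, which this sketch (like the article's buildSEIData) omits, so only our own reader understands the payload.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class SeiUtil {
    // Build: start code + NAL type 6 (SEI) + payload type 5 + size + payload + 0x80.
    public static byte[] build(String message) {
        byte[] payload = message.getBytes(StandardCharsets.UTF_8);
        ByteBuffer b = ByteBuffer.allocate(6 + 1 + payload.length + 1);
        b.put(new byte[]{0, 0, 0, 1, 6, 5});
        b.put((byte) payload.length);   // single size byte: payloads < 255 bytes only
        b.put(payload);
        b.put((byte) 0x80);             // rbsp_trailing_bits
        return b.array();
    }

    // Parse the message back out of a buffer that starts with such an SEI unit.
    public static String parse(byte[] data) {
        if (data.length < 7 || data[4] != 6 || data[5] != 5) {
            return null;
        }
        int size = data[6] & 0xFF;
        return new String(data, 7, size, StandardCharsets.UTF_8);
    }
}
```

A decoder that does not understand the payload is required to skip SEI units, which is what makes them a safe carrier for per-frame metadata.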

The overall design is simple: open the camera and set up the OpenGL environment, then start a video thread for the video data and an audio thread for the audio data. Both streams are cached in custom Lists; a separate encoding thread drains the video and audio lists and muxes them into an MP4. The project targets Android SDK 28, because saving files is more awkward on API 29+. The full project is not uploaded yet; message me if you need it.
Each of these responsibilities is split into its own class. The standalone modules come first.
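
The thread layout described above is a classic producer/consumer design. A toy standalone sketch of the data flow, with BlockingQueue standing in for the custom ImageList caches and strings standing in for encoded buffers (all names here are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class PipelineSketch {
    public static List<String> run() {
        BlockingQueue<String> videoQ = new ArrayBlockingQueue<>(16);
        BlockingQueue<String> audioQ = new ArrayBlockingQueue<>(16);
        List<String> muxed = new ArrayList<>();
        // producer threads, like the video and audio capture threads
        Thread video = new Thread(() -> { for (int i = 0; i < 3; i++) videoQ.add("video-" + i); });
        Thread audio = new Thread(() -> { for (int i = 0; i < 3; i++) audioQ.add("audio-" + i); });
        video.start();
        audio.start();
        try {
            video.join();
            audio.join();
            // encoding-thread role: drain both caches into a single output
            while (!videoQ.isEmpty()) muxed.add(videoQ.take());
            while (!audioQ.isEmpty()) muxed.add(audioQ.take());
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return muxed;
    }
}
```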

UI layout

The UI is simple: one GLSurfaceView and two Button controls.


<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <android.opengl.GLSurfaceView
        android:id="@+id/glView"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <Button
        android:id="@+id/recordBtn"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_marginBottom="80dp"
        android:text="Record"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintLeft_toLeftOf="parent"
        app:layout_constraintRight_toRightOf="parent" />

    <Button
        android:id="@+id/exit"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_marginTop="20dp"
        android:layout_marginRight="20dp"
        android:text="Exit"
        app:layout_constraintTop_toTopOf="parent"
        app:layout_constraintRight_toRightOf="parent" />
</androidx.constraintlayout.widget.ConstraintLayout>

Camera2

Using the camera2 framework is straightforward. One thing worth noting: the Surface passed into startPreview is later handed to mCaptureRequestBuilder.addTarget(surface). That Surface is produced in the following basic steps (full code below):
1. Generate an OpenGL texture: GLES30.glGenTextures(1, mTexture, 0);
2. Wrap the texture in a SurfaceTexture: mSurfaceTexture = new SurfaceTexture(mTexture[0]);
3. Create a Surface from the SurfaceTexture: mSurface = new Surface(mSurfaceTexture);
4. mCamera.startPreview(mSurface);

public class Camera2 {
    private final String TAG = "Abbott Camera2";
    private Context mContext;
    private CameraManager mCameraManager;
    private CameraDevice mCameraDevice;
    private String[] mCamList;
    private String mCameraId;
    private Size mPreviewSize;
    private HandlerThread mBackgroundThread;
    private Handler mBackgroundHandler;
    private CaptureRequest.Builder mCaptureRequestBuilder;
    private CaptureRequest mCaptureRequest;
    private CameraCaptureSession mCameraCaptureSession;

    public Camera2(Context Context) {
        mContext = Context;
        mCameraManager = (CameraManager) mContext.getSystemService(android.content.Context.CAMERA_SERVICE);
        try {
            mCamList = mCameraManager.getCameraIdList();
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        mBackgroundThread = new HandlerThread("CameraThread");
        mBackgroundThread.start();
        mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
    }

    public void openCamera(int width, int height, String id) {
        try {
            Log.d(TAG, "openCamera: id:" + id);
            CameraCharacteristics characteristics = mCameraManager.getCameraCharacteristics(id);
            if (characteristics.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT) {
            }
            StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            mPreviewSize = getOptimalSize(map.getOutputSizes(SurfaceTexture.class), width, height);
            mCameraId = id;
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        try {
            if (ActivityCompat.checkSelfPermission(mContext, android.Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                return;
            }
            Log.d(TAG, "mCameraManager.openCamera: " + mCameraId);
            mCameraManager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private Size getOptimalSize(Size[] sizeMap, int width, int height) {
        List<Size> sizeList = new ArrayList<>();
        for (Size option : sizeMap) {
            if (width > height) {
                if (option.getWidth() > width && option.getHeight() > height) {
                    sizeList.add(option);
                }
            } else {
                if (option.getWidth() > height && option.getHeight() > width) {
                    sizeList.add(option);
                }
            }
        }
        if (sizeList.size() > 0) {
            return Collections.min(sizeList, new Comparator<Size>() {
                @Override
                public int compare(Size lhs, Size rhs) {
                    return Long.signum((long) lhs.getWidth() * lhs.getHeight() - (long) rhs.getWidth() * rhs.getHeight());
                }
            });
        }
        return sizeMap[0];
    }

    private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            mCameraDevice = camera;
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            camera.close();
            mCameraDevice = null;
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            camera.close();
            mCameraDevice = null;
        }
    };

    public void startPreview(Surface surface) {
        try {
            mCaptureRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            mCaptureRequestBuilder.addTarget(surface);
            mCameraDevice.createCaptureSession(Collections.singletonList(surface), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession session) {
                    try {
                        mCaptureRequest = mCaptureRequestBuilder.build();
                        mCameraCaptureSession = session;
                        mCameraCaptureSession.setRepeatingRequest(mCaptureRequest, null, mBackgroundHandler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                }
            }, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
}

ImageList

This class is the cache used for both the video and the audio samples. There is nothing special about it; use it as-is.

public class ImageList {
    private static final String TAG = "Abbott ImageList";
    private Object mImageListLock = new Object();
    int kCapacity;
    private List<ImageItem> mImageList = new CopyOnWriteArrayList<>();

    public ImageList(int capacity) {
        kCapacity = capacity;
    }

    public synchronized void addItem(long Timestamp, ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
        synchronized (mImageListLock) {
            ImageItem item = new ImageItem(Timestamp, byteBuffer, bufferInfo);
            mImageList.add(item);
            if (mImageList.size() > kCapacity) {
                int excessItems = mImageList.size() - kCapacity;
                mImageList.subList(0, excessItems).clear();
            }
        }
    }

    public synchronized List<ImageItem> getItemsInTimeRange(long startTimestamp, long endTimestamp) {
        List<ImageItem> itemsInTimeRange = new ArrayList<>();
        synchronized (mImageListLock) {
            for (ImageItem item : mImageList) {
                long itemTimestamp = item.getTimestamp();
                // check whether the timestamp falls inside the requested range
                if (itemTimestamp >= startTimestamp && itemTimestamp <= endTimestamp) {
                    itemsInTimeRange.add(item);
                }
            }
        }
        return itemsInTimeRange;
    }

    public synchronized ImageItem getItem() {
        return mImageList.get(0);
    }

    public synchronized void removeItem() {
        mImageList.remove(0);
    }

    public synchronized int getSize() {
        return mImageList.size();
    }

    public static class ImageItem {
        private long mTimestamp;
        private ByteBuffer mVideoBuffer;
        private MediaCodec.BufferInfo mVideoBufferInfo;

        public ImageItem(long first, ByteBuffer second, MediaCodec.BufferInfo bufferInfo) {
            this.mTimestamp = first;
            this.mVideoBuffer = second;
            this.mVideoBufferInfo = bufferInfo;
        }

        public synchronized long getTimestamp() {
            return mTimestamp;
        }

        public synchronized ByteBuffer getVideoByteBuffer() {
            return mVideoBuffer;
        }

        public synchronized MediaCodec.BufferInfo getVideoBufferInfo() {
            return mVideoBufferInfo;
        }
    }
}
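
The one behavior worth isolating is the capacity eviction in addItem: once the cache grows past kCapacity, the oldest entries are dropped. The same rule on a plain list (illustrative helper, not part of the project):

```java
import java.util.List;

public class BoundedList {
    // Same eviction rule as ImageList.addItem: drop the oldest entries
    // once the cache exceeds its capacity.
    public static <T> void addBounded(List<T> list, T item, int capacity) {
        list.add(item);
        if (list.size() > capacity) {
            int excess = list.size() - capacity;
            list.subList(0, excess).clear();
        }
    }
}
```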

GlProgram

A helper class that creates the OpenGL program. OpenGL ES 3.0 is used.

public class GlProgram {
    public static final String mVertexShader =
            "#version 300 es \n" +
            "in vec4 vPosition;" +
            "in vec2 vCoordinate;" +
            "out vec2 vTextureCoordinate;" +
            "void main() {" +
            "   gl_Position = vPosition;" +
            "   vTextureCoordinate = vCoordinate;" +
            "}";

    // GLSL ES 3.0 requires a user-declared fragment output; "gl_FragColor"
    // is a reserved name and may fail to compile on strict drivers.
    public static final String mFragmentShader =
            "#version 300 es \n" +
            "#extension GL_OES_EGL_image_external : require \n" +
            "#extension GL_OES_EGL_image_external_essl3 : require \n" +
            "precision mediump float;" +
            "in vec2 vTextureCoordinate;" +
            "uniform samplerExternalOES oesTextureSampler;" +
            "out vec4 fragColor;" +
            "void main() {" +
            "    fragColor = texture(oesTextureSampler, vTextureCoordinate);" +
            "}";

    public static int createProgram(String vertexShaderSource, String fragShaderSource) {
        int program = GLES30.glCreateProgram();
        if (0 == program) {
            Log.e("Arc_ShaderManager", "create program error ,error=" + GLES30.glGetError());
            return 0;
        }
        int vertexShader = loadShader(GLES30.GL_VERTEX_SHADER, vertexShaderSource);
        if (0 == vertexShader) {
            return 0;
        }
        int fragShader = loadShader(GLES30.GL_FRAGMENT_SHADER, fragShaderSource);
        if (0 == fragShader) {
            return 0;
        }
        GLES30.glAttachShader(program, vertexShader);
        GLES30.glAttachShader(program, fragShader);
        GLES30.glLinkProgram(program);
        int[] status = new int[1];
        GLES30.glGetProgramiv(program, GLES30.GL_LINK_STATUS, status, 0);
        if (GLES30.GL_FALSE == status[0]) {
            String errorMsg = GLES30.glGetProgramInfoLog(program);
            Log.e("Arc_ShaderManager", "createProgram error : " + errorMsg);
            GLES30.glDeleteShader(vertexShader);
            GLES30.glDeleteShader(fragShader);
            GLES30.glDeleteProgram(program);
            return 0;
        }
        GLES30.glDetachShader(program, vertexShader);
        GLES30.glDetachShader(program, fragShader);
        GLES30.glDeleteShader(vertexShader);
        GLES30.glDeleteShader(fragShader);
        return program;
    }

    private static int loadShader(int type, String shaderSource) {
        int shader = GLES30.glCreateShader(type);
        if (0 == shader) {
            Log.e("Arc_ShaderManager", "create shader error, shader type=" + type + " , error=" + GLES30.glGetError());
            return 0;
        }
        GLES30.glShaderSource(shader, shaderSource);
        GLES30.glCompileShader(shader);
        int[] status = new int[1];
        GLES30.glGetShaderiv(shader, GLES30.GL_COMPILE_STATUS, status, 0);
        if (0 == status[0]) {
            String errorMsg = GLES30.glGetShaderInfoLog(shader);
            Log.e("Arc_ShaderManager", "createShader shader = " + type + "  error: " + errorMsg);
            GLES30.glDeleteShader(shader);
            return 0;
        }
        return shader;
    }
}

OesTexture

This class drives the GlProgram above: it feeds vertex and texture coordinates to the shaders and draws the external (OES) texture.

public class OesTexture {
    private static final String TAG = "Abbott OesTexture";
    private int mProgram;
    private final FloatBuffer mCordsBuffer;
    private final FloatBuffer mPositionBuffer;
    private int mPositionHandle;
    private int mCordsHandle;
    private int mOESTextureHandle;

    public OesTexture() {
        float[] positions = {
                -1.0f, 1.0f,
                -1.0f, -1.0f,
                1.0f, 1.0f,
                1.0f, -1.0f
        };
        float[] texCords = {
                0.0f, 0.0f,
                0.0f, 1.0f,
                1.0f, 0.0f,
                1.0f, 1.0f,
        };
        mPositionBuffer = ByteBuffer.allocateDirect(positions.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
        mPositionBuffer.put(positions).position(0);
        mCordsBuffer = ByteBuffer.allocateDirect(texCords.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
        mCordsBuffer.put(texCords).position(0);
    }

    public void init() {
        this.mProgram = GlProgram.createProgram(GlProgram.mVertexShader, GlProgram.mFragmentShader);
        if (0 == this.mProgram) {
            Log.e(TAG, "createProgram failed");
        }
        mPositionHandle = GLES30.glGetAttribLocation(mProgram, "vPosition");
        mCordsHandle = GLES30.glGetAttribLocation(mProgram, "vCoordinate");
        mOESTextureHandle = GLES30.glGetUniformLocation(mProgram, "oesTextureSampler");
        GLES30.glDisable(GLES30.GL_DEPTH_TEST);
    }

    public void PrepareTexture(int OESTextureId) {
        GLES30.glUseProgram(this.mProgram);
        GLES30.glEnableVertexAttribArray(mPositionHandle);
        GLES30.glVertexAttribPointer(mPositionHandle, 2, GLES30.GL_FLOAT, false, 2 * 4, mPositionBuffer);
        GLES30.glEnableVertexAttribArray(mCordsHandle);
        GLES30.glVertexAttribPointer(mCordsHandle, 2, GLES30.GL_FLOAT, false, 2 * 4, mCordsBuffer);
        GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, OESTextureId);
        GLES30.glUniform1i(mOESTextureHandle, 0);
        GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, 4);
        GLES30.glDisableVertexAttribArray(mPositionHandle);
        GLES30.glDisableVertexAttribArray(mCordsHandle);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
    }
}

The next three classes, VideoRecorder, AudioEncoder, and EncodingRunnable, are designed to work together.

public class AudioEncoder extends Thread {
    private static final String TAG = "Abbott AudioEncoder";
    private static final int SAVEMP4_INTERNAL = Param.recordInternal * 1000 * 1000;
    private static final int SAMPLE_RATE = 44100;
    private static final int CHANNEL_COUNT = 1;
    private static final int BIT_RATE = 96000;
    private EncodingRunnable mEncodingRunnable;
    private MediaCodec mMediaCodec;
    private AudioRecord mAudioRecord;
    private MediaFormat mFormat;
    private MediaFormat mOutputFormat;
    private long nanoTime;
    int mBufferSizeInBytes = 0;
    boolean mExitThread = true;
    private ImageList mAudioList;
    private MediaCodec.BufferInfo mAudioBufferInfo;
    private boolean mAlarm = false;
    private long mAlarmTime;
    private long mAlarmStartTime;
    private long mAlarmEndTime;
    private List<ImageList.ImageItem> mMuxerImageItem;
    private Object mLock = new Object();
    private MediaCodec.BufferInfo mAlarmBufferInfo;

    public AudioEncoder(EncodingRunnable encodingRunnable) throws IOException {
        mEncodingRunnable = encodingRunnable;
        nanoTime = System.nanoTime();
        createAudio();
        createMediaCodec();
        int kCapacity = 1000 / 20 * Param.recordInternal;
        mAudioList = new ImageList(kCapacity);
    }

    public void createAudio() {
        mBufferSizeInBytes = AudioRecord.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        mAudioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, mBufferSizeInBytes);
    }

    public void createMediaCodec() throws IOException {
        mFormat = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, SAMPLE_RATE, CHANNEL_COUNT);
        mFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
        mFormat.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
        mFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 8192);
        mMediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_AUDIO_AAC);
        mMediaCodec.configure(mFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    }

    public synchronized void setAlarm() {
        synchronized (mLock) {
            Log.d(TAG, "setAudio Alarm enter");
            mEncodingRunnable.setAudioFormat(mOutputFormat);
            mEncodingRunnable.setAudioAlarmTrue();
            mAlarmTime = mAlarmBufferInfo.presentationTimeUs;
            mAlarmEndTime = mAlarmTime + SAVEMP4_INTERNAL;
            if (!mAlarm) {
                mAlarmStartTime = mAlarmTime - SAVEMP4_INTERNAL;
            }
            mAlarm = true;
            Log.d(TAG, "setAudio Alarm exit");
        }
    }

    @Override
    public void run() {
        super.run();
        mMediaCodec.start();
        mAudioRecord.startRecording();
        while (mExitThread) {
            synchronized (mLock) {
                byte[] inputAudioData = new byte[mBufferSizeInBytes];
                int res = mAudioRecord.read(inputAudioData, 0, inputAudioData.length);
                if (res > 0) {
                    if (mAudioRecord != null) {
                        enCodeAudio(inputAudioData);
                    }
                }
            }
        }
        Log.d(TAG, "AudioRecord run: exit");
    }

    private void enCodeAudio(byte[] inputAudioData) {
        mAudioBufferInfo = new MediaCodec.BufferInfo();
        int index = mMediaCodec.dequeueInputBuffer(-1);
        if (index < 0) {
            return;
        }
        ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
        ByteBuffer audioInputBuffer = inputBuffers[index];
        audioInputBuffer.clear();
        audioInputBuffer.put(inputAudioData);
        audioInputBuffer.limit(inputAudioData.length);
        mMediaCodec.queueInputBuffer(index, 0, inputAudioData.length, (System.nanoTime() - nanoTime) / 1000, 0);
        int status = mMediaCodec.dequeueOutputBuffer(mAudioBufferInfo, 0);
        ByteBuffer outputBuffer;
        if (status == MediaCodec.INFO_TRY_AGAIN_LATER) {
        } else if (status == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            mOutputFormat = mMediaCodec.getOutputFormat();
        } else {
            while (status >= 0) {
                MediaCodec.BufferInfo tmpaudioBufferInfo = new MediaCodec.BufferInfo();
                tmpaudioBufferInfo.set(mAudioBufferInfo.offset, mAudioBufferInfo.size, mAudioBufferInfo.presentationTimeUs, mAudioBufferInfo.flags);
                mAlarmBufferInfo = new MediaCodec.BufferInfo();
                mAlarmBufferInfo.set(mAudioBufferInfo.offset, mAudioBufferInfo.size, mAudioBufferInfo.presentationTimeUs, mAudioBufferInfo.flags);
                outputBuffer = mMediaCodec.getOutputBuffer(status);
                ByteBuffer buffer = ByteBuffer.allocate(tmpaudioBufferInfo.size);
                buffer.limit(tmpaudioBufferInfo.size);
                buffer.put(outputBuffer);
                buffer.flip();
                if (tmpaudioBufferInfo.size > 0) {
                    if (mAlarm) {
                        mMuxerImageItem = mAudioList.getItemsInTimeRange(mAlarmStartTime, mAlarmEndTime);
                        for (ImageList.ImageItem item : mMuxerImageItem) {
                            mEncodingRunnable.pushAudio(item);
                        }
                        mAlarmStartTime = tmpaudioBufferInfo.presentationTimeUs;
                        mAudioList.addItem(tmpaudioBufferInfo.presentationTimeUs, buffer, tmpaudioBufferInfo);
                        if (tmpaudioBufferInfo.presentationTimeUs - mAlarmTime > SAVEMP4_INTERNAL) {
                            mAlarm = false;
                            mEncodingRunnable.setAudioAlarmFalse();
                            Log.d(TAG, "mEncodingRunnable.setAudio itemAlarmFalse();");
                        }
                    } else {
                        mAudioList.addItem(tmpaudioBufferInfo.presentationTimeUs, buffer, tmpaudioBufferInfo);
                    }
                }
                mMediaCodec.releaseOutputBuffer(status, false);
                status = mMediaCodec.dequeueOutputBuffer(mAudioBufferInfo, 0);
            }
        }
    }

    public synchronized void stopAudioRecord() throws IllegalStateException {
        synchronized (mLock) {
            mExitThread = false;
        }
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        mMediaCodec.stop();
        mMediaCodec.release();
        mMediaCodec = null;
    }
}
public class VideoRecorder extends Thread {
    private static final String TAG = "Abbott VideoRecorder";
    private static final int SAVE_MP4_Internal = 1000 * 1000 * Param.recordInternal;
    // EGL
    private static final int EGL_RECORDABLE_ANDROID = 0x3142;
    private EGLContext mEGLContext = EGL14.EGL_NO_CONTEXT;
    private EGLDisplay mEGLDisplay = EGL14.EGL_NO_DISPLAY;
    private EGLSurface mEGLSurface = EGL14.EGL_NO_SURFACE;
    private EGLContext mSharedContext = EGL14.EGL_NO_CONTEXT;
    private Surface mSurface;
    private int mOESTextureId;
    private OesTexture mOesTexture;
    private ImageList mImageList;
    private List<ImageList.ImageItem> muxerImageItem;
    // Thread
    private boolean mExitThread;
    private Object mLock = new Object();
    private Object object = new Object();
    private MediaCodec mMediaCodec;
    private MediaFormat mOutputFormat;
    private boolean mAlarm = false;
    private long mAlarmTime;
    private long mAlarmStartTime;
    private long mAlarmEndTime;
    private MediaCodec.BufferInfo mBufferInfo;
    private EncodingRunnable mEncodingRunnable;
    private String mSeiMessage;

    public VideoRecorder(EGLContext eglContext, EncodingRunnable encodingRunnable) {
        mSharedContext = eglContext;
        mEncodingRunnable = encodingRunnable;
        int kCapacity = 1000 / 40 * Param.recordInternal;
        mImageList = new ImageList(kCapacity);
        try {
            MediaFormat mediaFormat = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1920, 1080);
            mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 1920 * 1080 * 25 / 5);
            mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
            mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            mMediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            mSurface = mMediaCodec.createInputSurface();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void run() {
        super.run();
        try {
            initEgl();
            mOesTexture = new OesTexture();
            mOesTexture.init();
            synchronized (mLock) {
                mLock.wait(33);
            }
            guardedRun();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void guardedRun() throws InterruptedException, RuntimeException {
        mExitThread = false;
        while (true) {
            synchronized (mLock) {
                if (mExitThread) {
                    break;
                }
                mLock.wait(33);
            }
            mOesTexture.PrepareTexture(mOESTextureId);
            swapBuffers();
            enCodeVideo();
        }
        Log.d(TAG, "guardedRun: exit");
        unInitEgl();
    }

    private void enCodeVideo() {
        mBufferInfo = new MediaCodec.BufferInfo();
        int status = mMediaCodec.dequeueOutputBuffer(mBufferInfo, 0);
        if (status == MediaCodec.INFO_TRY_AGAIN_LATER) {
        } else if (status == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            mOutputFormat = mMediaCodec.getOutputFormat();
        } else if (status == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        } else {
            ByteBuffer outputBuffer = mMediaCodec.getOutputBuffer(status);
            if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                mBufferInfo.size = 0;
            }
            if (mBufferInfo.size > 0) {
                outputBuffer.position(mBufferInfo.offset);
                outputBuffer.limit(mBufferInfo.offset + mBufferInfo.size);
                mSeiMessage = "avcIndex" + String.format("%05d", 0);
                mEncodingRunnable.setTimeUs(mBufferInfo.presentationTimeUs);
                // prepend the SEI NAL unit to the encoded frame
                ByteBuffer seiData = buildSEIData(mSeiMessage);
                ByteBuffer frameWithSEI = ByteBuffer.allocate(outputBuffer.remaining() + seiData.remaining());
                frameWithSEI.put(seiData);
                frameWithSEI.put(outputBuffer);
                frameWithSEI.flip();
                mBufferInfo.size = frameWithSEI.remaining();
                MediaCodec.BufferInfo tmpBufferInfo = new MediaCodec.BufferInfo();
                tmpBufferInfo.set(mBufferInfo.offset, mBufferInfo.size, mBufferInfo.presentationTimeUs, mBufferInfo.flags);
                if (mAlarm) {
                    muxerImageItem = mImageList.getItemsInTimeRange(mAlarmStartTime, mAlarmEndTime);
                    mAlarmStartTime = tmpBufferInfo.presentationTimeUs;
                    for (ImageList.ImageItem item : muxerImageItem) {
                        mEncodingRunnable.push(item);
                    }
                    mImageList.addItem(tmpBufferInfo.presentationTimeUs, frameWithSEI, tmpBufferInfo);
                    if (mBufferInfo.presentationTimeUs - mAlarmTime > SAVE_MP4_Internal) {
                        Log.d(TAG, "mEncodingRunnable.setVideoAlarmFalse()");
                        Log.d(TAG, tmpBufferInfo.presentationTimeUs + " " + mAlarmTime);
                        mAlarm = false;
                        mEncodingRunnable.setVideoAlarmFalse();
                    }
                } else {
                    mImageList.addItem(tmpBufferInfo.presentationTimeUs, frameWithSEI, tmpBufferInfo);
                }
            }
            // release only after the encoded bytes have been copied out
            mMediaCodec.releaseOutputBuffer(status, false);
        }
    }

    public synchronized void setAlarm() {
        synchronized (mLock) {
            Log.d(TAG, "setAlarm enter");
            mEncodingRunnable.setMediaFormat(mOutputFormat);
            mEncodingRunnable.setVideoAlarmTrue();
            if (mBufferInfo.presentationTimeUs != 0) {
                mAlarmTime = mBufferInfo.presentationTimeUs;
            }
            mAlarmEndTime = mAlarmTime + SAVE_MP4_Internal;
            if (!mAlarm) {
                mAlarmStartTime = mAlarmTime - SAVE_MP4_Internal;
            }
            mAlarm = true;
            Log.d(TAG, "setAlarm exit");
        }
    }

    public synchronized void startRecord() throws IllegalStateException {
        super.start();
        mMediaCodec.start();
    }

    public synchronized void stopVideoRecord() throws IllegalStateException {
        synchronized (mLock) {
            mExitThread = true;
            mLock.notify();
        }
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        mMediaCodec.signalEndOfInputStream();
        mMediaCodec.stop();
        mMediaCodec.release();
        mMediaCodec = null;
    }

    public void requestRender(int i) {
        synchronized (object) {
            mOESTextureId = i;
        }
    }

    private void initEgl() {
        this.mEGLDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
        if (this.mEGLDisplay == EGL14.EGL_NO_DISPLAY) {
            throw new RuntimeException("EGL14.eglGetDisplay fail...");
        }
        int[] major_version = new int[2];
        boolean eglInited = EGL14.eglInitialize(this.mEGLDisplay, major_version, 0, major_version, 1);
        if (!eglInited) {
            this.mEGLDisplay = null;
            throw new RuntimeException("EGL14.eglInitialize fail...");
        }
        // choose an EGLConfig for the display
        int[] attrib_list = new int[]{
                EGL14.EGL_SURFACE_TYPE, EGL14.EGL_WINDOW_BIT,
                EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                EGL14.EGL_RED_SIZE, 8,
                EGL14.EGL_GREEN_SIZE, 8,
                EGL14.EGL_BLUE_SIZE, 8,
                EGL14.EGL_ALPHA_SIZE, 8,
                EGL14.EGL_DEPTH_SIZE, 16,
                EGL_RECORDABLE_ANDROID, 1,
                EGL14.EGL_NONE
        };
        EGLConfig[] configs = new EGLConfig[1];
        int[] numConfigs = new int[1];
        boolean eglChose = EGL14.eglChooseConfig(this.mEGLDisplay, attrib_list, 0, configs, 0, configs.length, numConfigs, 0);
        if (!eglChose) {
            throw new RuntimeException("eglChooseConfig [RGBA888 + recordable] ES2 EGL_config_fail...");
        }
        int[] attr_list = {EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE};
        this.mEGLContext = EGL14.eglCreateContext(this.mEGLDisplay, configs[0], this.mSharedContext, attr_list, 0);
        checkEglError("eglCreateContext");
        if (this.mEGLContext == EGL14.EGL_NO_CONTEXT) {
            throw new RuntimeException("eglCreateContext == EGL_NO_CONTEXT");
        }
        int[] surface_attr = {EGL14.EGL_NONE};
        this.mEGLSurface = EGL14.eglCreateWindowSurface(this.mEGLDisplay, configs[0], this.mSurface, surface_attr, 0);
        if (this.mEGLSurface == EGL14.EGL_NO_SURFACE) {
            throw new RuntimeException("eglCreateWindowSurface == EGL_NO_SURFACE");
        }
        Log.d(TAG, "initEgl , display=" + this.mEGLDisplay + " ,context=" + this.mEGLContext + " ,sharedContext= " +
                this.mSharedContext + ", surface=" + this.mEGLSurface);
        boolean success = EGL14.eglMakeCurrent(this.mEGLDisplay, this.mEGLSurface, this.mEGLSurface, this.mEGLContext);
        if (!success) {
            checkEglError("makeCurrent");
            throw new RuntimeException("eglMakeCurrent failed");
        }
    }

    private void unInitEgl() {
        boolean success = EGL14.eglMakeCurrent(mEGLDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_CONTEXT);
        if (!success) {
            checkEglError("makeCurrent");
            throw new RuntimeException("eglMakeCurrent failed");
        }
        if (this.mEGLDisplay != EGL14.EGL_NO_DISPLAY) {
            EGL14.eglDestroySurface(this.mEGLDisplay, this.mEGLSurface);
            EGL14.eglDestroyContext(this.mEGLDisplay, this.mEGLContext);
            EGL14.eglTerminate(this.mEGLDisplay);
        }
        this.mEGLDisplay = EGL14.EGL_NO_DISPLAY;
        this.mEGLContext = EGL14.EGL_NO_CONTEXT;
        this.mEGLSurface = EGL14.EGL_NO_SURFACE;
        this.mSharedContext = EGL14.EGL_NO_CONTEXT;
        this.mSurface = null;
    }

    private boolean swapBuffers() {
        if ((null == this.mEGLDisplay) || (null == this.mEGLSurface)) {
            return false;
        }
        boolean success = EGL14.eglSwapBuffers(this.mEGLDisplay, this.mEGLSurface);
        if (!success) {
            checkEglError("eglSwapBuffers");
        }
        return success;
    }

    private void checkEglError(String msg) {
        int error = EGL14.eglGetError();
        if (error != EGL14.EGL_SUCCESS) {
            throw new RuntimeException(msg + ": EGL_ERROR_CODE: 0x" + Integer.toHexString(error));
        }
    }

    private ByteBuffer buildSEIData(String message) {
        // build the SEI NAL unit: start code + NAL type 6 (SEI) + payload type 5
        int seiSize = 128;
        ByteBuffer seiBuffer = ByteBuffer.allocate(seiSize);
        seiBuffer.put(new byte[]{0, 0, 0, 1, 6, 5});
        // SEI payload size
        String seiMessage = "h264testdata" + message;
        seiBuffer.put((byte) seiMessage.length());
        // SEI user data
        seiBuffer.put(seiMessage.getBytes());
        seiBuffer.flip();
        return seiBuffer;
    }
}
public class EncodingRunnable extends Thread {
    private static final String TAG = "Abbott EncodingRunnable";
    private Object mRecordLock = new Object();
    private boolean mExitThread = false;
    private MediaMuxer mMediaMuxer;
    private int avcIndex;
    private int mAudioIndex;
    private MediaFormat mOutputFormat;
    private MediaFormat mAudioOutputFormat;
    private ImageList mImageList;
    private ImageList mAudioImageList;
    private boolean itemAlarm;
    private long mAudioImageListTimeUs = -1;
    private boolean mAudioAlarm;
    private int mVideoCapcity = 1000 / 40 * Param.recordInternal;
    private int mAudioCapcity = 1000 / 20 * Param.recordInternal;
    private int recordSecond = 1000 * 1000 * 60;
    long Video60sStart = -1;
    private boolean mIsRecoding = false;

    public EncodingRunnable() {
        mImageList = new ImageList(mVideoCapcity);
        mAudioImageList = new ImageList(mAudioCapcity);
    }

    public void setMediaFormat(MediaFormat OutputFormat) {
        if (mOutputFormat == null) {
            mOutputFormat = OutputFormat;
        }
    }

    public void setAudioFormat(MediaFormat OutputFormat) {
        if (mAudioOutputFormat == null) {
            mAudioOutputFormat = OutputFormat;
        }
    }

    public void setMediaMuxerConfig() {
        long currentTimeMillis = System.currentTimeMillis();
        Date currentDate = new Date(currentTimeMillis);
        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.getDefault());
        String fileName = dateFormat.format(currentDate);
        File mFile = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM), fileName + ".MP4");
        Log.d(TAG, "setMediaMuxerSavaPath: new MediaMuxer  " + mFile.getPath());
        try {
            mMediaMuxer = new MediaMuxer(mFile.getPath(), MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        } catch (IOException e) {
            e.printStackTrace();
        }
        avcIndex = mMediaMuxer.addTrack(mOutputFormat);
        mAudioIndex = mMediaMuxer.addTrack(mAudioOutputFormat);
        mMediaMuxer.start();
    }

    public void setMediaMuxerSavaPath() {
        if (!mIsRecoding) {
            mExitThread = false;
            setMediaMuxerConfig();
            setRecording();
            notifyStartRecord();
        }
    }

    @Override
    public void run() {
        super.run();
        while (true) {
            synchronized (mRecordLock) {
                try {
                    mRecordLock.wait();
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
            MediaCodec.BufferInfo tmpAudioBufferInfo = new MediaCodec.BufferInfo();
            while (mIsRecoding) {
                if (mAudioImageList.getSize() > 0) {
                    ImageList.ImageItem audioItem = mAudioImageList.getItem();
                    tmpAudioBufferInfo.set(audioItem.getVideoBufferInfo().offset,
                            audioItem.getVideoBufferInfo().size,
                            audioItem.getVideoBufferInfo().presentationTimeUs + mAudioImageListTimeUs,
                            audioItem.getVideoBufferInfo().flags);
                    mMediaMuxer.writeSampleData(mAudioIndex, audioItem.getVideoByteBuffer(), tmpAudioBufferInfo);
                    mAudioImageList.removeItem();
                }
                if (mImageList.getSize() > 0) {
                    ImageList.ImageItem item = mImageList.getItem();
                    if (Video60sStart < 0) {
                        Video60sStart = item.getVideoBufferInfo().presentationTimeUs;
                    }
                    mMediaMuxer.writeSampleData(avcIndex, item.getVideoByteBuffer(), item.getVideoBufferInfo());
                    if (item.getVideoBufferInfo().presentationTimeUs - Video60sStart > recordSecond) {
                        Log.d(TAG, "presentationTimeUs - Video60sStart :" + (item.getVideoBufferInfo().presentationTimeUs - Video60sStart));
                        mMediaMuxer.stop();
                        mMediaMuxer.release();
                        mMediaMuxer = null;
                        setMediaMuxerConfig();
                        Video60sStart = -1;
                    }
                    mImageList.removeItem();
                }
                if (itemAlarm == false && mAudioAlarm == false) {
                    mIsRecoding = false;
                    Log.d(TAG, "mediaMuxer.stop()");
                    mMediaMuxer.stop();
                    mMediaMuxer.release();
                    mMediaMuxer = null;
                    break;
                }
            }
            if (mExitThread) {
                break;
            }
        }
    }

    public synchronized void setRecording() throws IllegalStateException {
        synchronized (mRecordLock) {
            mIsRecoding = true;
        }
    }

    public synchronized void setAudioAlarmTrue() throws IllegalStateException {
        synchronized (mRecordLock) {
            mAudioAlarm = true;
        }
    }

    public synchronized void setVideoAlarmTrue() throws IllegalStateException {
        synchronized (mRecordLock) {
            itemAlarm = true;
        }
    }

    public synchronized void setAudioAlarmFalse() throws IllegalStateException {
        synchronized (mRecordLock) {
            mAudioAlarm = false;
        }
    }

    public synchronized void setVideoAlarmFalse() throws IllegalStateException {
        synchronized (mRecordLock) {
            itemAlarm = false;
        }
    }

    public synchronized void notifyStartRecord() throws IllegalStateException {
        synchronized (mRecordLock) {
            mRecordLock.notify();
        }
    }

    public synchronized void push(ImageList.ImageItem item) {
        mImageList.addItem(item.getTimestamp(), item.getVideoByteBuffer(), item.getVideoBufferInfo());
    }

    public synchronized void pushAudio(ImageList.ImageItem item) {
        synchronized (mRecordLock) {
            mAudioImageList.addItem(item.getTimestamp(), item.getVideoByteBuffer(), item.getVideoBufferInfo());
        }
    }

    public synchronized void setTimeUs(long l) {
        if (mAudioImageListTimeUs != -1) {
            return;
        }
        mAudioImageListTimeUs = l;
        Log.d(TAG, "setTimeUs: " + l);
    }

    public synchronized void setExitThread() {
        mExitThread = true;
        mIsRecoding = false;
        notifyStartRecord();
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
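
The 60-second rollover in EncodingRunnable can be isolated as: remember the first video timestamp written to the current file, and rotate to a new file once a frame's presentationTimeUs exceeds it by more than 60 s. The rule in isolation (illustrative names, microsecond timestamps):

```java
import java.util.ArrayList;
import java.util.List;

public class SegmentRoller {
    // Assign each frame timestamp (us) to a segment index, rolling over
    // every segmentUs, the way EncodingRunnable restarts its MediaMuxer.
    public static List<Integer> assign(List<Long> ptsUs, long segmentUs) {
        List<Integer> segments = new ArrayList<>();
        long segmentStart = -1;
        int index = 0;
        for (long pts : ptsUs) {
            if (segmentStart < 0) segmentStart = pts;
            if (pts - segmentStart > segmentUs) {   // rotate to a new file
                index++;
                segmentStart = pts;
            }
            segments.add(index);
        }
        return segments;
    }
}
```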

Finally, Camera2Renderer and MainActivity.

Camera2Renderer

Camera2Renderer implements GLSurfaceView.Renderer; it is the class that wires all of the other pieces together.

public class Camera2Renderer implements GLSurfaceView.Renderer {
    private static final String TAG = "Abbott Camera2Renderer";
    final private Context mContext;
    final private GLSurfaceView mGlSurfaceView;
    private Camera2 mCamera;
    private int[] mTexture = new int[1];
    private SurfaceTexture mSurfaceTexture;
    private Surface mSurface;
    private OesTexture mOesTexture;
    private EGLContext mEglContext = null;
    private VideoRecorder mVideoRecorder;
    private EncodingRunnable mEncodingRunnable;
    private AudioEncoder mAudioEncoder;

    public Camera2Renderer(Context context, GLSurfaceView glSurfaceView, EncodingRunnable encodingRunnable) {
        mContext = context;
        mGlSurfaceView = glSurfaceView;
        mEncodingRunnable = encodingRunnable;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        mCamera = new Camera2(mContext);
        mCamera.openCamera(1920, 1080, "0");
        mOesTexture = new OesTexture();
        mOesTexture.init();
        // Grab the render thread's EGL context so the encoder thread can share it.
        mEglContext = EGL14.eglGetCurrentContext();
        mVideoRecorder = new VideoRecorder(mEglContext, mEncodingRunnable);
        mVideoRecorder.startRecord();
        try {
            mAudioEncoder = new AudioEncoder(mEncodingRunnable);
            mAudioEncoder.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        // Create the OES texture that receives camera frames.
        GLES30.glGenTextures(1, mTexture, 0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTexture[0]);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
        mSurfaceTexture = new SurfaceTexture(mTexture[0]);
        mSurfaceTexture.setDefaultBufferSize(1920, 1080);
        // Render on demand whenever the camera delivers a new frame.
        mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
            @Override
            public void onFrameAvailable(SurfaceTexture surfaceTexture) {
                mGlSurfaceView.requestRender();
            }
        });
        mSurface = new Surface(mSurfaceTexture);
        mCamera.startPreview(mSurface);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        mSurfaceTexture.updateTexImage();
        mOesTexture.PrepareTexture(mTexture[0]);
        mVideoRecorder.requestRender(mTexture[0]);
    }

    public VideoRecorder getVideoRecorder() {
        return mVideoRecorder;
    }

    public AudioEncoder getAudioEncoder() {
        return mAudioEncoder;
    }
}
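Since the whole point of this pipeline is to tag every encoded frame with custom SEI data, it helps to see what such a NAL unit looks like before it is spliced into the H264 stream. The following is a minimal standalone sketch, not the article's VideoRecorder code: it builds a user_data_unregistered SEI NAL unit (payload type 5) that could be prepended to an encoded access unit. The class name `SeiBuilder` is hypothetical, and emulation-prevention byte insertion (0x03 escaping) is deliberately omitted for brevity; a real muxer path would need it.

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;

public class SeiBuilder {
    // Build a user_data_unregistered SEI NAL unit (payloadType = 5).
    // Layout: Annex-B start code + NAL header (0x06) + payload type
    //         + ff-escaped payload size + 16-byte UUID + user data
    //         + rbsp_trailing_bits (0x80).
    // NOTE: emulation-prevention bytes are omitted for brevity.
    public static byte[] buildSei(byte[] uuid16, byte[] userData) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write(0); out.write(0); out.write(0); out.write(1); // start code
        out.write(0x06);                 // nal_unit_type = 6 (SEI)
        out.write(5);                    // payloadType = user_data_unregistered
        int size = 16 + userData.length;
        while (size >= 255) {            // sizes >= 255 are coded as runs of 0xFF
            out.write(0xFF);
            size -= 255;
        }
        out.write(size);
        out.write(uuid16, 0, 16);        // 16-byte UUID identifying the payload owner
        out.write(userData, 0, userData.length);
        out.write(0x80);                 // rbsp_trailing_bits
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] uuid = new byte[16];      // placeholder UUID
        byte[] sei = buildSei(uuid, "frame:42".getBytes(StandardCharsets.UTF_8));
        System.out.println(sei.length);  // total NAL size in bytes
    }
}
```

A decoder-side consumer would scan for NAL type 6 / payload type 5, match the UUID, and read the per-frame string back out.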

The main activity is straightforward; it mostly just requests the runtime permissions and wires the pieces together.

public class MainActivity extends AppCompatActivity {
    private static final String TAG = "Abbott MainActivity";
    private static final String FRAGMENT_DIALOG = "dialog";
    private final Object mLock = new Object();
    private GLSurfaceView mGlSurfaceView;
    private Button mRecordButton;
    private Button mExitButton;
    private Camera2Renderer mCamera2Renderer;
    private VideoRecorder mVideoRecorder;
    private EncodingRunnable mEncodingRunnable;
    private AudioEncoder mAudioEncoder;
    private static final int REQUEST_CAMERA_PERMISSION = 1;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
            requestCameraPermission();
            return;
        }
        setContentView(R.layout.activity_main);
        mGlSurfaceView = findViewById(R.id.glView);
        mRecordButton = findViewById(R.id.recordBtn);
        mExitButton = findViewById(R.id.exit);
        mGlSurfaceView.setEGLContextClientVersion(3);
        mEncodingRunnable = new EncodingRunnable();
        mEncodingRunnable.start();
        mCamera2Renderer = new Camera2Renderer(this, mGlSurfaceView, mEncodingRunnable);
        mGlSurfaceView.setRenderer(mCamera2Renderer);
        mGlSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
    }

    @Override
    protected void onResume() {
        super.onResume();
        mRecordButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                synchronized (MainActivity.this) {
                    startRecord();
                }
            }
        });
        mExitButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                stopRecord();
                Log.d(TAG, "onClick: exit program");
                finish();
            }
        });
    }

    private void requestCameraPermission() {
        if (shouldShowRequestPermissionRationale(Manifest.permission.CAMERA) ||
                shouldShowRequestPermissionRationale(Manifest.permission.WRITE_EXTERNAL_STORAGE) ||
                shouldShowRequestPermissionRationale(Manifest.permission.RECORD_AUDIO)) {
            new ConfirmationDialog().show(getSupportFragmentManager(), FRAGMENT_DIALOG);
        } else {
            requestPermissions(new String[]{
                    Manifest.permission.CAMERA,
                    Manifest.permission.WRITE_EXTERNAL_STORAGE,
                    Manifest.permission.RECORD_AUDIO}, REQUEST_CAMERA_PERMISSION);
        }
    }

    public static class ConfirmationDialog extends DialogFragment {
        @NonNull
        @Override
        public Dialog onCreateDialog(Bundle savedInstanceState) {
            final Fragment parent = getParentFragment();
            return new AlertDialog.Builder(getActivity())
                    .setMessage(R.string.request_permission)
                    .setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
                        @Override
                        public void onClick(DialogInterface dialog, int which) {
                        }
                    })
                    .setNegativeButton(android.R.string.cancel, new DialogInterface.OnClickListener() {
                        @Override
                        public void onClick(DialogInterface dialog, int which) {
                            Activity activity = parent.getActivity();
                            if (activity != null) {
                                activity.finish();
                            }
                        }
                    })
                    .create();
        }
    }

    private void startRecord() {
        synchronized (mLock) {
            try {
                if (mVideoRecorder == null) {
                    mVideoRecorder = mCamera2Renderer.getVideoRecorder();
                }
                if (mAudioEncoder == null) {
                    mAudioEncoder = mCamera2Renderer.getAudioEncoder();
                }
                mVideoRecorder.setAlarm();
                mAudioEncoder.setAlarm();
                mEncodingRunnable.setMediaMuxerSavaPath();
                Log.d(TAG, "Start Record ");
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

    private void stopRecord() {
        if (mVideoRecorder == null) {
            mVideoRecorder = mCamera2Renderer.getVideoRecorder();
        }
        if (mAudioEncoder == null) {
            mAudioEncoder = mCamera2Renderer.getAudioEncoder();
        }
        mEncodingRunnable.setExitThread();
        mVideoRecorder.stopVideoRecord();
        mAudioEncoder.stopAudioRecord();
    }
}
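Note that startRecord() does not start capture; it only calls setAlarm() on the recorder and encoder. That matches the design described at the top: frames are continuously buffered in lists, and the alarm marks the trigger moment so the saved file can cover the N seconds before it as well as the N seconds after. A hedged sketch of such a rolling pre-record cache follows; `FrameCache`, the 5-second window, and the `long[]` stand-in for real encoded frames are all illustrative assumptions, not the article's actual classes.

```java
import java.util.ArrayDeque;

// Illustrative sketch of a rolling pre-record cache: encoded frames are
// buffered with their timestamps, and frames older than the pre-record
// window are evicted until recording is triggered.
public class FrameCache {
    private static final long PRE_RECORD_US = 5_000_000L; // N = 5 seconds (assumed value)
    // Each entry stands in for one encoded frame: {presentation time in us, payload size}.
    private final ArrayDeque<long[]> cache = new ArrayDeque<>();

    public synchronized void push(long ptsUs, long size) {
        cache.addLast(new long[]{ptsUs, size});
        // Evict frames that have fallen out of the pre-record window.
        while (!cache.isEmpty() && ptsUs - cache.peekFirst()[0] > PRE_RECORD_US) {
            cache.removeFirst();
        }
    }

    public synchronized int pending() {
        return cache.size();
    }

    public static void main(String[] args) {
        FrameCache c = new FrameCache();
        for (long t = 0; t <= 9_000_000L; t += 1_000_000L) {
            c.push(t, 100); // one dummy frame per second
        }
        System.out.println(c.pending()); // frames still inside the window
    }
}
```

When the alarm fires, the muxer thread would first drain whatever the cache holds (the "pre" part) and then keep appending live frames until the 60-second file boundary.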

That wraps up this walkthrough of recording audio and video with Camera2 + OpenGL ES + MediaCodec + AudioRecord and writing H264 SEI data. Hopefully it is useful to other developers.



