Format Conversion of the Android Camera Video Stream (YUV to RGB): Turning a Stream Received over the Network into Video

This article walks through converting the Android camera preview stream from YUV to RGB and turning a stream received over the network into video, and is meant as a practical reference for developers facing the same problem.

Android lets an application capture the camera's preview stream in real time, which is very useful for camera-based AR applications: each frame can be analysed as it arrives, enabling things like face recognition, barcode recognition, or replacing specific images in the scene. Most image-processing software, however, expects RGB input, while the preview stream arrives in a compact YUV format; on Android this is YUV420SP (NV21). The preview format can nominally be changed in code, but in practice the change appears to have no effect, so the only stream you actually get is encoded as YUV420SP. That means the YUV420SP frames have to be converted to RGB before they can be fed to image-recognition code. The conversion function below is posted here for convenience.

Java code:

static public void decodeYUV420SP(byte[] rgbBuf, byte[] yuv420sp, int width, int height) {
    final int frameSize = width * height;
    if (rgbBuf == null)
        throw new NullPointerException("buffer 'rgbBuf' is null");
    if (rgbBuf.length < frameSize * 3)
        throw new IllegalArgumentException("buffer 'rgbBuf' size "
                + rgbBuf.length + " < minimum " + frameSize * 3);
    if (yuv420sp == null)
        throw new NullPointerException("buffer 'yuv420sp' is null");
    if (yuv420sp.length < frameSize * 3 / 2)
        throw new IllegalArgumentException("buffer 'yuv420sp' size " + yuv420sp.length
                + " < minimum " + frameSize * 3 / 2);
    int i = 0, y = 0;
    int uvp = 0, u = 0, v = 0;
    int y1192 = 0, r = 0, g = 0, b = 0;
    for (int j = 0, yp = 0; j < height; j++) {
        // Interleaved VU values start after the Y plane; one VU pair covers a 2x2 block.
        uvp = frameSize + (j >> 1) * width;
        u = 0;
        v = 0;
        for (i = 0; i < width; i++, yp++) {
            y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }
            // Fixed-point YUV-to-RGB conversion, scaled by 1024 (hence the >> 10 below).
            y1192 = 1192 * y;
            r = (y1192 + 1634 * v);
            g = (y1192 - 833 * v - 400 * u);
            b = (y1192 + 2066 * u);
            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;
            rgbBuf[yp * 3] = (byte) (r >> 10);
            rgbBuf[yp * 3 + 1] = (byte) (g >> 10);
            rgbBuf[yp * 3 + 2] = (byte) (b >> 10);
        }
    }
}
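As a quick usage sketch (an addition, not part of the original article; it is meant to sit in the same class as decodeYUV420SP() and assumes width/height match the preview size), the decoded RGB bytes can be packed into an android.graphics.Bitmap for display or further processing:

// Decode one YUV420SP preview frame and pack the RGB bytes into a Bitmap.
static public android.graphics.Bitmap yuvFrameToBitmap(byte[] yuv420sp, int width, int height) {
    byte[] rgb = new byte[width * height * 3];
    decodeYUV420SP(rgb, yuv420sp, width, height); // the function shown above
    int[] pixels = new int[width * height];
    for (int p = 0; p < pixels.length; p++) {
        int r = rgb[p * 3] & 0xff;
        int g = rgb[p * 3 + 1] & 0xff;
        int b = rgb[p * 3 + 2] & 0xff;
        pixels[p] = 0xff000000 | (r << 16) | (g << 8) | b; // opaque ARGB pixel
    }
    return android.graphics.Bitmap.createBitmap(pixels, width, height,
            android.graphics.Bitmap.Config.ARGB_8888);
}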


During camera preview on Android you can intercept the frame data: the preview callback is invoked once for every frame.
My development platform is Android 1.5. The program below captures the preview stream and, to keep things simple, writes the 20th frame to a file so it can be copied to a PC for analysis.

Reference code:

package com.sunshine;

import java.io.File;
import java.io.RandomAccessFile;

import android.app.Activity;
import android.content.res.Configuration;
import android.graphics.PixelFormat;
import android.hardware.Camera;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.Window;
import android.view.WindowManager;
import android.view.SurfaceHolder.Callback;

public class AndroidVideo extends Activity implements Callback,
        Camera.PictureCallback {
    private SurfaceView mSurfaceView = null;
    private SurfaceHolder mSurfaceHolder = null;
    private Camera mCamera = null;
    private boolean mPreviewRunning = false;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().setFormat(PixelFormat.TRANSLUCENT);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.main);
        mSurfaceView = (SurfaceView) this.findViewById(R.id.surface_camera);
        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.addCallback(this);
        mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        try {
            Log.v("System.out", "get it!");
            File file = new File("/sdcard/camera.jpg");
            RandomAccessFile raf = new RandomAccessFile(file, "rw");
            raf.write(data);
            raf.close();
        } catch (Exception ex) {
            Log.v("System.out", ex.toString());
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width,
            int height) {
        if (mPreviewRunning) {
            mCamera.stopPreview();
        }
        Camera.Parameters p = mCamera.getParameters();
        p.setPreviewSize(width, height);
        // Every preview frame is delivered to StreamIt.onPreviewFrame().
        mCamera.setPreviewCallback(new StreamIt());
        mCamera.setParameters(p);
        try {
            mCamera.setPreviewDisplay(holder);
        } catch (Exception ex) {
            Log.v("System.out", ex.toString());
        }
        mCamera.startPreview();
        mPreviewRunning = true;
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        mCamera = Camera.open();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        mCamera.setPreviewCallback(null); // stop callbacks before releasing the camera
        mCamera.stopPreview();
        mPreviewRunning = false;
        mCamera.release();
    }

    @Override
    public void onConfigurationChanged(Configuration newConfig) {
        try {
            super.onConfigurationChanged(newConfig);
            if (this.getResources().getConfiguration().orientation == Configuration.ORIENTATION_LANDSCAPE) {
                // landscape: nothing special to do here
            } else if (this.getResources().getConfiguration().orientation == Configuration.ORIENTATION_PORTRAIT) {
                // portrait: nothing special to do here
            }
        } catch (Exception ex) {
            Log.v("System.out", ex.toString());
        }
    }
}

class StreamIt implements Camera.PreviewCallback {
    private int tick = 1;

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // Dump only the 20th frame to a file so it can be analysed on a PC.
        if (tick == 20) {
            System.out.println("data len: " + data.length);
            try {
                File file = new File("/sdcard/pal.pal");
                if (!file.exists())
                    file.createNewFile();
                RandomAccessFile raf = new RandomAccessFile(file, "rw");
                raf.write(data);
                raf.close();
            } catch (Exception ex) {
                Log.v("System.out", ex.toString());
            }
        }
        tick++;
    }
}


XML layout file:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:orientation="vertical">
    <SurfaceView android:id="@+id/surface_camera"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent">
    </SurfaceView>
</LinearLayout>


Note that you also need to declare the camera permission in the project's manifest file:
<uses-permission android:name="android.permission.CAMERA" />
Checking the documentation confirms that the frames Android delivers in the preview stream are in YUV420 format (specifically YUV420SP/NV21).

The YUV420SP-to-RGB conversion function is the one given in the first part above.
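If you want to pin this down in code rather than rely on the default, here is a small sketch (an addition, not from the original article; it assumes you already hold an opened Camera and a chosen preview size) that requests the NV21/YUV420SP format explicitly and computes the buffer sizes the conversion needs:

import android.graphics.ImageFormat;
import android.hardware.Camera;

public class PreviewFormatSetup {
    public static void configure(Camera camera, int width, int height) {
        Camera.Parameters p = camera.getParameters();
        p.setPreviewSize(width, height);
        // NV21 is the YUV420SP layout described above; on very old SDKs such as the
        // Android 1.5 used in this article the equivalent constant is
        // PixelFormat.YCbCr_420_SP (both have the value 17).
        p.setPreviewFormat(ImageFormat.NV21);
        camera.setParameters(p);

        int yuvLen = width * height * 3 / 2; // bytes handed to onPreviewFrame()
        int rgbLen = width * height * 3;     // bytes required by decodeYUV420SP()
        System.out.println("YUV bytes per frame: " + yuvLen + ", RGB bytes: " + rgbLen);
    }
}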

 

Android side (note that this version of the Activity also expects an EditText with id remoteIP and a Button with id connect in the layout):

package com.sunshine;

import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.Socket;

import android.app.Activity;
import android.content.res.Configuration;
import android.graphics.PixelFormat;
import android.hardware.Camera;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.Window;
import android.view.WindowManager;
import android.view.SurfaceHolder.Callback;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.EditText;

public class AndroidVideo extends Activity implements Callback, OnClickListener {
    private SurfaceView mSurfaceView = null;
    private SurfaceHolder mSurfaceHolder = null;
    private Camera mCamera = null;
    private boolean mPreviewRunning = false;
    // connection widgets
    private EditText remoteIP = null;
    private Button connect = null;
    private String remoteIPStr = null;
    // video frame data
    private StreamIt streamIt = null;
    public static Kit kit = null;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().setFormat(PixelFormat.TRANSLUCENT);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.main);
        mSurfaceView = (SurfaceView) this.findViewById(R.id.surface_camera);
        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.addCallback(this);
        mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        remoteIP = (EditText) this.findViewById(R.id.remoteIP);
        connect = (Button) this.findViewById(R.id.connect);
        connect.setOnClickListener(this);
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int width,
            int height) {
        if (mPreviewRunning) {
            mCamera.stopPreview();
        }
        Camera.Parameters p = mCamera.getParameters();
        p.setPreviewSize(width, height);
        streamIt = new StreamIt();
        kit = new Kit();
        mCamera.setPreviewCallback(streamIt);
        mCamera.setParameters(p);
        try {
            mCamera.setPreviewDisplay(holder);
        } catch (Exception ex) {
            ex.printStackTrace();
        }
        mCamera.startPreview();
        mPreviewRunning = true;
    }

    public void surfaceCreated(SurfaceHolder holder) {
        mCamera = Camera.open();
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        mCamera.setPreviewCallback(null); // stop callbacks before releasing the camera
        mCamera.stopPreview();
        mPreviewRunning = false;
        mCamera.release();
    }

    @Override
    public void onConfigurationChanged(Configuration newConfig) {
        try {
            super.onConfigurationChanged(newConfig);
            if (this.getResources().getConfiguration().orientation == Configuration.ORIENTATION_LANDSCAPE) {
                // landscape: nothing special to do here
            } else if (this.getResources().getConfiguration().orientation == Configuration.ORIENTATION_PORTRAIT) {
                // portrait: nothing special to do here
            }
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }

    class Kit implements Runnable {
        private boolean run = true;
        // private final int dataLen = 57600; // 307200 OR 230400, 76800 OR 57600
        private final int tt = 28800;

        public void run() {
            try {
                Socket socket = new Socket(remoteIPStr, 8899);
                DataOutputStream dos = new DataOutputStream(socket.getOutputStream());
                DataInputStream dis = new DataInputStream(socket.getInputStream());
                while (run) {
                    // Wait until the first preview frame has been captured.
                    if (streamIt.yuv420sp == null) {
                        Thread.sleep(50);
                        continue;
                    }
                    // Send one YUV420SP frame in two chunks of tt bytes each.
                    dos.write(streamIt.yuv420sp, 0, tt);
                    dos.write(streamIt.yuv420sp, tt, tt);
                    dis.readBoolean(); // wait for the PC to acknowledge the frame
                    Thread.sleep(155);
                }
            } catch (Exception ex) {
                run = false;
                ex.printStackTrace();
            }
        }
    }

    @Override
    public void onClick(View view) {
        if (view == connect) { // connect button: start streaming to the entered IP
            remoteIPStr = remoteIP.getText().toString();
            new Thread(AndroidVideo.kit).start();
        }
    }
}

class StreamIt implements Camera.PreviewCallback {
    public byte[] yuv420sp = null;
    // private boolean t = true;

    public void onPreviewFrame(byte[] data, Camera camera) {
        // if (t) {
        //     t = false;
        //     new Thread(AndroidVideo.kit).start();
        // }
        yuv420sp = data; // keep a reference to the latest frame for the sender thread
    }
}
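A note on the hard-coded chunk size (an inference, not stated explicitly in the original): the sender pushes 2 × 28800 = 57600 bytes per loop, which is exactly one YUV420SP frame at the 240x160 resolution the PC receiver below assumes, since such a frame carries width × height luma bytes plus half as many interleaved chroma bytes. If the phone negotiates a different preview size in surfaceChanged(), both sides need to be updated together, for example with a hypothetical helper like this:

public final class FrameSizes {
    // One YUV420SP (NV21) frame: a full-resolution Y plane plus interleaved
    // VU samples at quarter resolution, i.e. width * height * 3 / 2 bytes.
    public static int yuv420spBytes(int width, int height) {
        return width * height * 3 / 2; // 240 * 160 * 3 / 2 = 57600
    }
}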


PC side:

 

import java.awt.Frame;
import java.awt.Graphics;
import java.awt.Point;
import java.awt.Transparency;
import java.awt.color.ColorSpace;
import java.awt.image.BufferedImage;
import java.awt.image.ComponentColorModel;
import java.awt.image.DataBuffer;
import java.awt.image.DataBufferByte;
import java.awt.image.PixelInterleavedSampleModel;
import java.awt.image.Raster;
import java.awt.image.SampleModel;
import java.awt.image.WritableRaster;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class FlushMe extends Frame {
    private static final long serialVersionUID = 1L;
    private BufferedImage im;
    // image geometry
    // private final int width = 480;
    // private final int height = 320;
    private static final int width = 240;
    private static final int height = 160;
    private static final int numBands = 3;
    private static final int dataLen = 57600; // 307200 OR 230400 // 57600, 76800
    private static final int tt = 28800; // 14400; // 28800;
    // image buffers
    private byte[] byteArray = new byte[width * height * numBands]; // decoded RGB pixels
    private byte[] yuv420sp = new byte[dataLen]; // raw YUV420SP frame from the phone
    private static final int[] bandOffsets = new int[] { 0, 1, 2 };
    private static final SampleModel sampleModel = new PixelInterleavedSampleModel(
            DataBuffer.TYPE_BYTE, width, height, 3, width * 3, bandOffsets);
    // ColorModel
    private static final ColorSpace cs = ColorSpace.getInstance(ColorSpace.CS_sRGB);
    private static final ComponentColorModel cm = new ComponentColorModel(cs, false, false,
            Transparency.OPAQUE, DataBuffer.TYPE_BYTE);

    public FlushMe() {
        super("Flushing");
        updateIM();
        setSize(480, 320);
        // close the application when the window is closed
        this.addWindowListener(new java.awt.event.WindowAdapter() {
            public void windowClosing(java.awt.event.WindowEvent e) {
                System.exit(0);
            }
        });
        // center the window on screen
        this.setLocationRelativeTo(null);
        this.setResizable(false);
        this.setVisible(true);
        this.getData();
    }

    public void update(Graphics g) {
        paint(g);
    }

    public void paint(Graphics g) {
        g.drawImage(im, 0, 0, 480, 320, this);
    }

    public void getData() {
        try {
            ServerSocket server = new ServerSocket(8899);
            Socket socket = server.accept();
            DataInputStream dis = new DataInputStream(socket.getInputStream());
            DataOutputStream dos = new DataOutputStream(socket.getOutputStream());
            while (true) {
                for (int i = 0; i < dataLen / tt; i++) {
                    // readFully blocks until the whole chunk has arrived
                    dis.readFully(yuv420sp, i * tt, tt);
                }
                // a full frame has arrived: decode and repaint immediately
                updateIM();
                im.flush();
                repaint();
                dos.writeBoolean(true); // acknowledge the frame to the phone
            }
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }

    private void updateIM() {
        try {
            // decode the YUV frame into the RGB byte array
            decodeYUV420SP(byteArray, yuv420sp, width, height);
            // the DataBuffer size must cover the whole pixel array
            DataBuffer dataBuffer = new DataBufferByte(byteArray, byteArray.length);
            WritableRaster wr = Raster.createWritableRaster(sampleModel,
                    dataBuffer, new Point(0, 0));
            im = new BufferedImage(cm, wr, false, null);
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }

    // Same conversion routine as in the first part of this article.
    private static void decodeYUV420SP(byte[] rgbBuf, byte[] yuv420sp,
            int width, int height) {
        final int frameSize = width * height;
        if (rgbBuf == null)
            throw new NullPointerException("buffer 'rgbBuf' is null");
        if (rgbBuf.length < frameSize * 3)
            throw new IllegalArgumentException("buffer 'rgbBuf' size "
                    + rgbBuf.length + " < minimum " + frameSize * 3);
        if (yuv420sp == null)
            throw new NullPointerException("buffer 'yuv420sp' is null");
        if (yuv420sp.length < frameSize * 3 / 2)
            throw new IllegalArgumentException("buffer 'yuv420sp' size "
                    + yuv420sp.length + " < minimum " + frameSize * 3 / 2);
        int i = 0, y = 0;
        int uvp = 0, u = 0, v = 0;
        int y1192 = 0, r = 0, g = 0, b = 0;
        for (int j = 0, yp = 0; j < height; j++) {
            uvp = frameSize + (j >> 1) * width;
            u = 0;
            v = 0;
            for (i = 0; i < width; i++, yp++) {
                y = (0xff & ((int) yuv420sp[yp])) - 16;
                if (y < 0)
                    y = 0;
                if ((i & 1) == 0) {
                    v = (0xff & yuv420sp[uvp++]) - 128;
                    u = (0xff & yuv420sp[uvp++]) - 128;
                }
                y1192 = 1192 * y;
                r = (y1192 + 1634 * v);
                g = (y1192 - 833 * v - 400 * u);
                b = (y1192 + 2066 * u);
                if (r < 0) r = 0; else if (r > 262143) r = 262143;
                if (g < 0) g = 0; else if (g > 262143) g = 262143;
                if (b < 0) b = 0; else if (b > 262143) b = 262143;
                rgbBuf[yp * 3] = (byte) (r >> 10);
                rgbBuf[yp * 3 + 1] = (byte) (g >> 10);
                rgbBuf[yp * 3 + 2] = (byte) (b >> 10);
            }
        }
    }

    public static void main(String[] args) {
        Frame f = new FlushMe();
    }
}


Follow-up question: a byte array in RGB format has been received over the network; how can it be displayed on Android, without saving it to the phone, just shown on screen?

Is it really a String in RGB/original format? If so, you can simply do
String s1 = new String(b); // b is the byte[]
Usually, though, what comes back from the network is an InputStream; in that case the following code converts it to a String:
/**
 * Converts an InputStream to a String.
 *
 * @param is the stream to read
 * @return the stream contents as a String
 */
public static String InputStreamToString(InputStream is) {
    BufferedReader reader = new BufferedReader(new InputStreamReader(is));
    StringBuilder sb = new StringBuilder();
    String line = null;
    try {
        while ((line = reader.readLine()) != null) {
            sb.append(line);
        }
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        try {
            is.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    return sb.toString();
}
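If, on the other hand, the byte array holds raw RGB pixel data rather than text, turning it into a String will not produce a viewable image. Here is a minimal sketch for showing it directly (an addition; the class and method names are hypothetical, and it assumes the sender's width and height are known, e.g. the 240x160 used earlier, with bytes packed R, G, B per pixel as decodeYUV420SP produces):

import android.graphics.Bitmap;
import android.widget.ImageView;

public class RgbDisplay {
    // Pack packed-RGB bytes into a Bitmap and show it; nothing is written to storage.
    public static void show(ImageView view, byte[] rgb, int width, int height) {
        int[] pixels = new int[width * height];
        for (int i = 0; i < pixels.length; i++) {
            int r = rgb[i * 3] & 0xff;
            int g = rgb[i * 3 + 1] & 0xff;
            int b = rgb[i * 3 + 2] & 0xff;
            pixels[i] = 0xff000000 | (r << 16) | (g << 8) | b; // opaque ARGB pixel
        }
        Bitmap bmp = Bitmap.createBitmap(pixels, width, height, Bitmap.Config.ARGB_8888);
        view.setImageBitmap(bmp); // display only, no file is saved
    }
}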


 


That concludes this article on converting the Android camera video stream format (YUV to RGB) and turning a stream received over the network into video; hopefully it proves useful.



http://www.chinasem.cn/article/870482
