This article describes how to implement a streaming media server with V4L2 + FFMPEG + live555; I hope it is a useful reference for developers tackling the same problem.
Source code download: source download
The code mainly references the blog post below, along with many other sources too numerous to list individually.
Blog post: http://blog.csdn.net/nieyongs/article/details/17919325
After analyzing the live555 library, I designed two classes:
1. V4L2FramedSource, a direct subclass of FramedSource. It mainly provides:
- MMAP memory mapping via V4L2, setting up V4L2FramedSource::fNumCameraBuffers buffer frames
- image scaling and pixel-format conversion via FFMPEG's libswscale module
2. StreamEncoder, a direct subclass of FramedFilter (and therefore a grandchild of FramedSource), which implements H.264 encoding via FFMPEG's libavcodec module.
These two classes follow a very common live555 pattern: any number of FramedFilter instances can be inserted along the path from source to sink, which makes it easy to modularize the processing pipeline.
Image data flows through the program as:
V4L2FramedSource → StreamEncoder → H264VideoStreamFramer → H264VideoRTPSink
The calls run in exactly the opposite direction: once the RTPSink has finished packetizing its current data, it asks H264VideoStreamFramer for more, which in turn asks its own upstream source, and so on up the chain.
H264VideoRTPSink fragments NALUs according to RFC 3984, but the fragmentation itself is delegated to the H264or5Fragmenter class:
// liveMedia/H264VideoRTPSink.cpp
// "liveMedia"
// Copyright (c) 1996-2014 Live Networks, Inc. All rights reserved.
// RTP sink for H.264 video (RFC 3984)
// Implementation
The fragmentation details are in H264or5Fragmenter::doGetNextFrame() in live/liveMedia/H264or5VideoRTPSink.cpp.
H264VideoStreamFramer parses the individual image frames and passes exactly one NAL unit at a time to the H264or5Fragmenter object.
Tip: as long as you can guarantee that exactly one NALU is delivered at a time, the NALU-parsing module can be dropped entirely; it is the biggest piece of dead weight here. Unfortunately, the AVPacket returned by FFMPEG's avcodec_encode_video2() is a whole encoded H.264 frame that may contain several NAL units, so it cannot simply be passed straight through.
StreamEncoder and H264VideoStreamFramer are subclasses of FramedFilter.
V4L2FramedSource is a subclass of FramedSource.
H264VideoRTPSink is a subclass of RTPSink.
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)

This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.

You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// Copyright (c) 1996-2014, Live Networks, Inc. All rights reserved
// A test program that demonstrates how to stream - via unicast RTP
// - various kinds of file on demand, using a built-in RTSP server.
// main program
#include <liveMedia.hh>
#include <BasicUsageEnvironment.hh>
#include "DD_H264VideoFileServerMediaSubsession.hh"

UsageEnvironment* env;

// To make the second and subsequent client for each stream reuse the same
// input stream as the first client (rather than playing the file from the
// start for each client), change the following "False" to "True":
Boolean reuseFirstSource = True;

static void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
                           char const* streamName, char const* inputFileName); // fwd

int main(int argc, char** argv) {
  // Begin by setting up our usage environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  UserAuthenticationDatabase* authDB = NULL;
#ifdef ACCESS_CONTROL
  // To implement client access control to the RTSP server, do the following:
  authDB = new UserAuthenticationDatabase;
  authDB->addUserRecord("username1", "password1"); // replace these with real strings
  // Repeat the above with each <username>, <password> that you wish to allow
  // access to the server.
#endif

  // Create the RTSP server:
  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554, authDB);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    exit(1);
  }

  char const* descriptionString
    = "Session streamed by \"testOnDemandRTSPServer\"";

  // Set up each of the possible streams that can be served by the
  // RTSP server.  Each such stream is implemented using a
  // "ServerMediaSession" object, plus one or more
  // "ServerMediaSubsession" objects for each audio/video substream.

  // A H.264 video elementary stream:
  {
    char const* streamName = "h264ESVideoTest";
    char const* inputFileName = "test.264";
    ServerMediaSession* sms
      = ServerMediaSession::createNew(*env, streamName, streamName,
                                      descriptionString);
    sms->addSubsession(DD_H264VideoFileServerMediaSubsession
                       ::createNew(*env, inputFileName, reuseFirstSource));
    rtspServer->addServerMediaSession(sms);

    announceStream(rtspServer, sms, streamName, inputFileName);
  }

  // Also, attempt to create a HTTP server for RTSP-over-HTTP tunneling.
  // Try first with the default HTTP port (80), and then with the alternative HTTP
  // port numbers (8000 and 8080).
  if (rtspServer->setUpTunnelingOverHTTP(80)
      || rtspServer->setUpTunnelingOverHTTP(8000)
      || rtspServer->setUpTunnelingOverHTTP(8080)) {
    *env << "\n(We use port " << rtspServer->httpServerPortNum()
         << " for optional RTSP-over-HTTP tunneling.)\n";
  } else {
    *env << "\n(RTSP-over-HTTP tunneling is not available.)\n";
  }

  env->taskScheduler().doEventLoop(); // does not return

  return 0; // only to prevent compiler warning
}

static void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
                           char const* streamName, char const* inputFileName) {
  char* url = rtspServer->rtspURL(sms);
  UsageEnvironment& env = rtspServer->envir();
  env << "\n\"" << streamName << "\" stream, from the file \""
      << inputFileName << "\"\n";
  env << "Play this stream using the URL \"" << url << "\"\n";
  delete[] url;
}
/*
 * V4L2.h
 *
 *  Created on: Dec 17, 2013
 *      Author: ny
 */

#ifndef V4L2_H_
#define V4L2_H_

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <assert.h>
#include <getopt.h>
#include <fcntl.h>
#include <unistd.h>
#include <errno.h>
#include <malloc.h>
#include <sys/stat.h>
#include <sys/types.h>
#include <sys/time.h>
#include <sys/mman.h>
#include <sys/ioctl.h>
#include <asm/types.h>
#include <linux/videodev2.h>

#ifndef _FRAMED_SOURCE_HH
#include "FramedSource.hh"
#endif

extern "C"
{
#include <libavcodec/avcodec.h>
// #include <libavformat/avformat.h>
// #include <libavfilter/avfilter.h>
#include <libswscale/swscale.h>
#include "libavutil/opt.h"
#include <libavutil/imgutils.h>
}

#define CLEAR(x) memset(&(x), 0, sizeof(x))

struct buffer
{
    void * start;
    unsigned int length;
};

class V4L2FramedSource: public FramedSource
{
public:
    V4L2FramedSource(UsageEnvironment& env);
    virtual ~V4L2FramedSource();
    // bool initDev(const char * devName, int width, int height); // initialize the camera
    bool startStream(); // start camera streaming
    int getWidth();
    int getHeight();
    void camera_get_format(void);
    void camera_set_format(int width, int height);
    void output_get_format(void);
    void output_set_format(int width, int height);
    // bool setSize(int width, int height);
That concludes this article on implementing a streaming media server with V4L2 + FFMPEG + live555; I hope it proves helpful.