A live-streaming system built on the live555 library: streaming an H.264 file
2017-02-23 15:18
Download the latest live555 source and build it to produce the four live555 static libraries: libBasicUsageEnvironment.a, libgroupsock.a, libliveMedia.a, and libUsageEnvironment.a. With these four libraries plus the test programs that ship with live555, a simple live broadcast is easy to set up. Note that the streaming program live555 provides here can only broadcast video that has already been recorded (this differs from on-demand playback). The code is as follows:
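The build steps mentioned above can be sketched as follows. This is a minimal sketch of the standard live555 build procedure; the tarball URL and the `linux` config name are the usual ones published by live555.com, so adjust the config target (e.g. `linux-64bit`, `macosx`) for your platform:

```shell
# Download and unpack the live555 source tarball
wget http://www.live555.com/liveMedia/public/live555-latest.tar.gz
tar xzf live555-latest.tar.gz
cd live

# Generate the platform Makefiles and build; this produces the four static
# libraries used below: libliveMedia.a, libgroupsock.a,
# libBasicUsageEnvironment.a and libUsageEnvironment.a
./genMakefiles linux
make
```

After the build, copy the four `.a` files into the project's `lib/` directory and the corresponding headers into `include/`, matching the layout the Makefile below expects.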
/*=============================================================================
  FileName:   h264live.cpp
  Desc:       use the live555 libraries to stream an H.264 file live
  Author:     licaibiao
  Version:
  LastChange: 2017-02-23
=============================================================================*/
#include "h264live.hh"

UsageEnvironment* env;
char const* inputFileName = "test.264";
H264VideoStreamFramer* videoSource;
RTPSink* videoSink;

void play(); // forward

int main(int argc, char** argv) {
  // Begin by setting up our usage environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // Create 'groupsocks' for RTP and RTCP:
  struct in_addr destinationAddress;
  destinationAddress.s_addr = chooseRandomIPv4SSMAddress(*env);
  // Note: This is a multicast address.  If you wish instead to stream
  // using unicast, then you should use the "testOnDemandRTSPServer"
  // test program - not this test program - as a model.

  const unsigned short rtpPortNum = 18888;
  const unsigned short rtcpPortNum = rtpPortNum+1;
  const unsigned char ttl = 255;

  const Port rtpPort(rtpPortNum);
  const Port rtcpPort(rtcpPortNum);

  Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);
  rtpGroupsock.multicastSendOnly(); // we're a SSM source
  Groupsock rtcpGroupsock(*env, destinationAddress, rtcpPort, ttl);
  rtcpGroupsock.multicastSendOnly(); // we're a SSM source

  // Create a 'H264 Video RTP' sink from the RTP 'groupsock':
  OutPacketBuffer::maxSize = 100000;
  videoSink = H264VideoRTPSink::createNew(*env, &rtpGroupsock, 96);

  // Create (and start) a 'RTCP instance' for this RTP sink:
  const unsigned estimatedSessionBandwidth = 500; // in kbps; for RTCP b/w share
  const unsigned maxCNAMElen = 100;
  unsigned char CNAME[maxCNAMElen+1];
  gethostname((char*)CNAME, maxCNAMElen);
  CNAME[maxCNAMElen] = '\0'; // just in case
  RTCPInstance* rtcp = RTCPInstance::createNew(*env, &rtcpGroupsock,
              estimatedSessionBandwidth, CNAME,
              videoSink, NULL /* we're a server */,
              True /* we're a SSM source */);
  // Note: This starts RTCP running automatically

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    exit(1);
  }
  ServerMediaSession* sms
    = ServerMediaSession::createNew(*env, "testStream", inputFileName,
        "Session streamed by \"testH264VideoStreamer\"", True /*SSM*/);
  sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp));
  rtspServer->addServerMediaSession(sms);

  char* url = rtspServer->rtspURL(sms);
  *env << "Play this stream using the URL \"" << url << "\"\n";
  delete[] url;

  // Start the streaming:
  *env << "Beginning streaming...\n";
  play();

  env->taskScheduler().doEventLoop(); // does not return

  return 0; // only to prevent compiler warning
}

void afterPlaying(void* /*clientData*/) {
  *env << "...done reading from file\n";
  videoSink->stopPlaying();
  Medium::close(videoSource);
  // Note that this also closes the input file that this source read from.

  // Start playing once again:
  play();
}

void play() {
  // Open the input file as a 'byte-stream file source':
  ByteStreamFileSource* fileSource
    = ByteStreamFileSource::createNew(*env, inputFileName);
  if (fileSource == NULL) {
    *env << "Unable to open file \"" << inputFileName
         << "\" as a byte-stream file source\n";
    exit(1);
  }

  FramedSource* videoES = fileSource;

  // Create a framer for the Video Elementary Stream:
  videoSource = H264VideoStreamFramer::createNew(*env, videoES);

  // Finally, start playing:
  *env << "Beginning to read from file...\n";
  videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
}
The Makefile is as follows:
INCLUDES = -I./include/usageEnvironment/ -I./include/groupsock/ -I./include/liveMedia/ -I./include/BasicUsageEnvironment
COMPILE_OPTS = $(INCLUDES) -I. -O2 -DSOCKLEN_T=socklen_t -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS) $(CPPFLAGS) $(CFLAGS)
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DBSD=1 $(CPPFLAGS) $(CXXFLAGS)
OBJ = o
LINK = c++ -o
LINK_OPTS = -L. $(LDFLAGS)
CONSOLE_LINK_OPTS = $(LINK_OPTS)
USAGE_ENVIRONMENT_LIB = ./lib/libUsageEnvironment.a
BASIC_USAGE_ENVIRONMENT_LIB = ./lib/libBasicUsageEnvironment.a
LIVEMEDIA_LIB = ./lib/libliveMedia.a
GROUPSOCK_LIB = ./lib/libgroupsock.a
LOCAL_LIBS = $(LIVEMEDIA_LIB) $(GROUPSOCK_LIB) $(BASIC_USAGE_ENVIRONMENT_LIB) $(USAGE_ENVIRONMENT_LIB)
LIBS = $(LOCAL_LIBS)
MEDIA_SERVER_OBJS = h264live.$(OBJ)
APP = h264live

.$(C).$(OBJ):
	$(C_COMPILER) -c $(C_FLAGS) $<
.$(CPP).$(OBJ):
	$(CPLUSPLUS_COMPILER) -c $(CPLUSPLUS_FLAGS) $<

h264live: $(MEDIA_SERVER_OBJS) $(LOCAL_LIBS)
	$(LINK)$@ $(CONSOLE_LINK_OPTS) $(MEDIA_SERVER_OBJS) $(LIBS)

clean:
	-rm -rf *.$(OBJ) $(APP) core *.core *~ include/*~
The project directory looks like this:
[root@redhat h264live]# tree -L 2
.
├── h264live
├── h264live.cpp
├── h264live.hh
├── h264live.o
├── include
│   ├── basicUsageEnvironment
│   ├── groupsock
│   ├── liveMedia
│   └── usageEnvironment
├── lib
│   ├── libBasicUsageEnvironment.a
│   ├── libgroupsock.a
│   ├── libliveMedia.a
│   └── libUsageEnvironment.a
├── Makefile
└── test.264
Run the h264live binary directly, then open the URL "rtsp://192.168.0.127:8554/testStream" in an RTSP-capable client (VLC, ffplay, etc.) to watch the live stream. Substitute your server's actual IP address for 192.168.0.127.
Complete project source: the live555 H.264 file live-streaming project.