Porting mjpg-streamer to the FL2440 platform
2016-05-26 16:38
Preface:
First of all, mjpg-streamer is based on UVC and libjpeg, so your kernel must have the UVC driver and your library directory must contain libjpeg; these are the prerequisites. Assuming UVC is already working, we start with porting the libjpeg library. The basis for this is the following passage in the mjpg-streamer README:
In case of error:
* the input plugin "input_uvc.so" depends on libjpeg, make sure it is installed.

Dependencies for the input plugin "input_uvc.so":
* libjpeg
* recent Linux-UVC driver (newer then revision #170)

Dependencies for the output plugin "output_autofocus.so":
* libmath
Installing the libjpeg library:
1. Download the libjpeg source package
2. Extract, configure, build, and install:
anzyelay@ubuntu:arm$ tar xvf jpegsrc.v8b.tar.gz
anzyelay@ubuntu:arm$ cd jpeg-8b
anzyelay@ubuntu:jpeg-8b$ ./configure --prefix=(install directory) --host=arm-none-linux-gnueabi
anzyelay@ubuntu:jpeg-8b$ make
anzyelay@ubuntu:jpeg-8b$ make install
After installation, the install directory contains four subdirectories: ./lib ./include ./bin ./share
3. Copy the shared libraries into the lib/ directory of the board's root filesystem:
anzyelay@ubuntu:libjpeg_install$ cp lib/*.so* ../myrootfs/lib/ -av
`lib/libjpeg.so' -> `../myrootfs/lib/libjpeg.so'
`lib/libjpeg.so.8' -> `../myrootfs/lib/libjpeg.so.8'
`lib/libjpeg.so.8.0.2' -> `../myrootfs/lib/libjpeg.so.8.0.2'
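Before copying the libraries over, it is worth confirming that configure really picked the cross toolchain rather than the host gcc. A minimal sketch of such a check (the helper name and the library path are assumptions; any of the installed .so files can be passed in):

```shell
# Warn if a supposedly cross-compiled library is not an ARM binary.
# Usage: check_cross_build lib/libjpeg.so.8.0.2
check_cross_build() {
    desc=$(file -b "$1")
    case "$desc" in
        *ARM*) echo "ok: $1 is an ARM binary" ;;
        *)     echo "warning: $1 is not an ARM binary: $desc" >&2
               return 1 ;;
    esac
}
```

If the output mentions x86 instead of ARM, the --host option was ignored and the board will refuse to load the library.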
Porting mjpg-streamer:
1. Download the mjpg-streamer source
2. Extract, configure, build, and install
Extract:
anzyelay@ubuntu:arm$ unzip mjpg-streamer-code-182.zip
anzyelay@ubuntu:arm$ cd mjpg-streamer-code-182/
anzyelay@ubuntu:mjpg-streamer-code-182$ ls
doc  mjpg-streamer  udp_client  mjpeg-client  mjpg-streamer-experimental  uvc-streamer
We only use mjpg-streamer, so that is the only subdirectory to build. Enter it:
anzyelay@ubuntu:mjpg-streamer-code-182$ cd mjpg-streamer
anzyelay@ubuntu:mjpg-streamer$ ls
CHANGELOG  Makefile  mjpg_streamer.h  README  start.sh  utils.c  www  LICENSE  mjpg_streamer.c  plugins  scripts  TODO  utils.h
Makefiles are already provided, so all we need to change is the compiler: replace every CC in the Makefiles with arm-linux-gcc.
anzyelay@ubuntu:mjpg-streamer$ grep 'CC = gcc' -rl . | xargs sed -i "s/CC = gcc/CC=arm-linux-gcc/g"
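The same replacement can be wrapped in a small helper that also verifies nothing was missed. A sketch; the pattern `CC = gcc` is assumed to match the assignment in every Makefile of the tree, as it does in the 182 snapshot used here:

```shell
# Replace the host compiler with the cross compiler in every Makefile
# under the given directory, then verify no occurrence is left.
patch_cc() {
    dir=$1
    grep -rl 'CC = gcc' "$dir" | xargs -r sed -i 's/CC = gcc/CC = arm-linux-gcc/g'
    # succeed only if no file still selects the host gcc
    ! grep -rq 'CC = gcc' "$dir"
}
```

Running `patch_cc .` at the top of mjpg-streamer should then return success; a non-zero exit means some Makefile spells the assignment differently and needs editing by hand.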
Since "input_uvc.so" depends on libjpeg, the library and header search paths must be added to the ./plugins/input_uvc/Makefile. Two additions:
1. CFLAGS += -I /(libjpeg install directory)/include
2. LFLAGS += -L /(libjpeg install directory)/lib
anzyelay@ubuntu:mjpg-streamer$ cd plugins/input_uvc/
anzyelay@ubuntu:input_uvc$ vi Makefile
The lines added (as shown by cat Makefile):
CFLAGS += -I /home/anzyelay/Desktop/arm/libjpeg_install/include # anzyelay add
LFLAGS += -L /home/anzyelay/Desktop/arm/libjpeg_install/lib # anzyelay add
If you want to use make install, you must specify the install path by modifying the DESTDIR variable in the top-level Makefile; otherwise it installs to /usr/local by default. For example:
# specifies where to install the binaries after compilation
# to use another directory you can specify it with:
# $ sudo make DESTDIR=/some/path install
DESTDIR =/home/anzyelay/Desktop/arm/myrootfs/usr/local
Return to the mjpg-streamer directory and build:
anzyelay@ubuntu:mjpg-streamer$ make clean all
When I ran make install, I got the error
is not a directory: No such file or directory
Inspecting the install directory showed that even ./bin had been created as a regular file rather than a directory; after manually running mkdir bin lib, the install succeeded.
anzyelay@ubuntu:local$ mkdir bin lib
anzyelay@ubuntu:mjpg-streamer$ make install
install --mode=755 mjpg_streamer /home/anzyelay/Desktop/arm/myrootfs/usr/local/bin
install --mode=644 input_uvc.so output_file.so output_udp.so output_http.so input_testpicture.so input_file.so /home/anzyelay/Desktop/arm/myrootfs/usr/local/lib/
install --mode=755 -d /home/anzyelay/Desktop/arm/myrootfs/usr/local/www
install --mode=644 -D www/* /home/anzyelay/Desktop/arm/myrootfs/usr/local/www
anzyelay@ubuntu:local$ ls
bin  lib  www
3. Copy the libraries and executable to the board
For early experiments I recommend copying to /usr/local/ and adjusting the environment by adding the following two lines to /etc/profile:
export PATH=/usr/local/bin:$PATH
# the next line sets the search directory for shared libraries
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
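If /etc/profile may be sourced more than once, exports written this way keep prepending the same directories on every login. A duplicate-safe variant is sketched below (the helper name is an assumption; the same pattern works for LD_LIBRARY_PATH):

```shell
# Prepend a directory to PATH only if it is not already present.
prepend_path() {
    case ":$PATH:" in
        *":$1:"*) ;;                # already on PATH, do nothing
        *) PATH="$1:$PATH" ;;
    esac
    export PATH
}
prepend_path /usr/local/bin
prepend_path /usr/local/bin         # second call is a no-op
```
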
Put newly ported shared libraries in /usr/local/lib first; once they have been tested and are stable, move them to /lib. That way, a faulty library cannot overwrite the ones in the root filesystem.
Transfer all the *.so files to the board's library directory, and transfer the mjpg_streamer executable as well. If you used make install, simply copy bin/, lib/, and www/ to the corresponding directories on the board. The files to move are:
[root@bst:/]# ls usr/local/ -R
usr/local/:
bin  lib  www
usr/local/bin:
mjpg_streamer
usr/local/lib:
input_file.so  input_uvc.so  output_http.so  input_testpicture.so  output_file.so  output_udp.so
The www directory is optional.
4. Testing:
Plug in the camera and check:
[root@bst:/]# ls dev/video*
dev/video0
Run the mjpg_streamer command:
[root@bst:/]# mjpg_streamer -i "input_uvc.so" -o "output_http.so"
On success, the output looks like this:
[root@bst:/]# mjpg_streamer -i "input_uvc.so" -o "output_http.so"
MJPG-streamer [1125]: starting application
MJPG Streamer Version: svn rev: exported
MJPG-streamer [1125]: MJPG Streamer Version: svn rev: exported
i: Using V4L2 device.: /dev/video0
MJPG-streamer [1125]: Using V4L2 device.: /dev/video0
i: Desired Resolution: 640 x 480
MJPG-streamer [1125]: Desired Resolution: 640 x 480
i: Frames Per Second.: 5
MJPG-streamer [1125]: Frames Per Second.: 5
i: Format............: MJPEG
MJPG-streamer [1125]: Format............: MJPEG
Adding control for Pan (relative)
Control exists: File exists
Adding control for Tilt (relative)
Control exists: File exists
Adding control for Pan Reset
Control exists: File exists
Adding control for Tilt Reset
Control exists: File exists
Adding control for Pan/tilt Reset
Control exists: File exists
Adding control for Focus (absolute)
Control exists: File exists
mapping control for Pan (relative)
Mapping exists: File exists
mapping control for Tilt (relative)
Mapping exists: File exists
mapping control for Pan Reset
Mapping exists: File exists
mapping control for Tilt Reset
Mapping exists: File exists
mapping control for Pan/tilt Reset
Mapping exists: File exists
mapping control for Focus (absolute)
Mapping exists: File exists
mapping control for LED1 Mode
Mapping exists: File exists
mapping control for LED1 Frequency
Mapping exists: File exists
mapping control for Disable video processing
Mapping exists: File exists
mapping control for Raw bits per pixel
Mapping exists: File exists
o: www-folder-path...: disabled
MJPG-streamer [1125]: www-folder-path...: disabled
o: HTTP TCP port.....: 8080
MJPG-streamer [1125]: HTTP TCP port.....: 8080
o: username:password.: disabled
MJPG-streamer [1125]: username:password.: disabled
o: commands..........: enabled
MJPG-streamer [1125]: commands..........: enabled
MJPG-streamer [1125]: starting input plugin input_uvc.so
MJPG-streamer [1125]: starting output plugin: output_http.so (ID: 00)
In the browser address bar, enter:
192.168.10.110:8080/?action=stream
where the IP address is that of your board. For a still image, use:
192.168.10.110:8080/?action=snapshot
Notes:
My UVC camera outputs MJPEG, so the commands above produce images. If the camera outputs YUV and does not support MJPEG, you must add the -y option to the input_uvc.so plugin options, otherwise you get an error like this:
i: Format............: MJPEG
MJPG-streamer [1013]: Format............: MJPEG
Unable to set format: 1196444237 res: 176x144
Init v4L2 failed !! exit fatal
i: init_VideoIn failed
MJPG-streamer [1013]: init_VideoIn failed
With no UVC camera plugged in, the error is:
ERROR opening V4L interface: No such file or directory
Init v4L2 failed !! exit fatal
i: init_VideoIn failed
MJPG-streamer [1192]: init_VideoIn failed
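To fail with a clearer message, the device node can be checked before launching. A sketch; /dev/video0 is the usual default and an assumption here:

```shell
# Verify a V4L2 character device exists before starting mjpg_streamer.
check_cam() {
    dev=${1:-/dev/video0}
    if [ -c "$dev" ]; then
        echo "camera present: $dev"
    else
        echo "no camera at $dev, is the UVC module loaded?" >&2
        return 1
    fi
}
```

A start script could call `check_cam || exit 1` before invoking mjpg_streamer.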
However, when I tried a camera that only supports YUV output, using
[root@bst:/]# mjpg_streamer -i "input_uvc.so -y" -o "output_http.so"
the terminal output looked normal and reached the starting stage:
MJPG-streamer [1014]: starting input plugin input_uvc.so
MJPG-streamer [1014]: starting output plugin: output_http.so (ID: 00)
But the browser kept showing "transferring data" without ever displaying an image. Some googling turned up an article explaining that the resolution was too high for the link to carry; lowering the resolution fixed it, so it does appear to be a transfer limit:
[root@bst:/]# mjpg_streamer -i "input_uvc.so -r 176x144 --fps 10 -q 80" -o "output_http.so -w /usr/local/www"
mjpg_streamer command reference:
Use mjpg_streamer -h for the help text. Usage:
mjpg_streamer -i "<input-plugin.so> [-opt]" -o "<output-plugin.so> [-opt]" [-h/v/b]
-i plugin options (input capture settings):
-d: (device) the UVC device node
-r: (resolution) capture resolution; one of QSIF QCIF CGA QVGA CIF VGA SVGA XGA SXGA,
or written out directly, e.g. 640x480
-f :frames per second
-y :enable YUYV format and disable MJPEG mode
-q: JPEG compression quality in percent; selecting this implies -y.
-m: drop frames smaller than this limit; useful
if the webcam produces small-sized garbage frames,
which may happen under low-light conditions
-n: do not initialize dynctrls of the Linux-UVC driver
-l: switch the LED "on", "off", let it "blink" or leave
it up to the driver using the value "auto"
-o plugin options (HTTP output settings):
Available options:
-w: specify the www directory
-p: TCP port of the HTTP server
-c: require a username and password for browser access: "-c username:password"
-n: disable execution of commands
If input-plugin.so or output-plugin.so cannot be found, set the search directory with export LD_LIBRARY_PATH=<plugins path>, or use the full path, e.g.:
mjpg_streamer -i "/path/to/modules/input_uvc.so"
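On the board, the export and the full invocation can be collected into one start script. A sketch; every path here is an assumption matching the /usr/local layout used above, and the script only prints the command so it can be inspected first:

```shell
#!/bin/sh
# Hypothetical start script for the rootfs layout used in this article.
PLUGINS=${PLUGINS:-/usr/local/lib}
export LD_LIBRARY_PATH="$PLUGINS:$LD_LIBRARY_PATH"
# build the argument list once so it can be shown and reused
set -- -i "$PLUGINS/input_uvc.so -r 640x480 -f 10" \
       -o "$PLUGINS/output_http.so -p 8080 -w /usr/local/www"
echo "would run: mjpg_streamer $*"
# exec mjpg_streamer "$@"    # uncomment to actually start it on the board
```

Keeping the plugin paths absolute avoids relying on LD_LIBRARY_PATH being set correctly in every shell.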