V4L2 USB Video Stream Capture Test Code
Video4Linux2 (Video for Linux Two, V4L2 for short) is the Linux driver framework for video devices; it gives upper layers a uniform interface to the underlying video hardware. V4L2 mainly supports three classes of devices: video input/output devices, VBI devices, and radio devices, which appear under /dev as the device nodes videoX, vbiX, and radioX respectively, where X is a digit such as 0, 1, or 2. A USB camera is a typical video input device. Both FFmpeg and OpenCV support V4L2.
When capturing video from a USB camera, the main header file that exposes the V4L2 API to user space is videodev2.h, located under /usr/include/linux.
The V4L2 flow for capturing a USB video stream is as follows (reference code: https://linuxtv.org/downloads/v4l-dvb-apis-new/uapi/v4l/capture.c.html):
1. 打開設(shè)備調(diào)用open_device函數(shù):
(1). 調(diào)用stat函數(shù),通過設(shè)備名字獲取設(shè)備信息,并將結(jié)果保存在結(jié)構(gòu)體stat中,結(jié)構(gòu)體stat內(nèi)容如下:
struct stat {
    dev_t     st_dev;     /* ID of device containing file */
    ino_t     st_ino;     /* inode number */
    mode_t    st_mode;    /* protection */
    nlink_t   st_nlink;   /* number of hard links */
    uid_t     st_uid;     /* user ID of owner */
    gid_t     st_gid;     /* group ID of owner */
    dev_t     st_rdev;    /* device ID (if special file) */
    off_t     st_size;    /* total size, in bytes */
    blksize_t st_blksize; /* blocksize for file system I/O */
    blkcnt_t  st_blocks;  /* number of 512B blocks allocated */
    time_t    st_atime;   /* time of last access */
    time_t    st_mtime;   /* time of last modification */
    time_t    st_ctime;   /* time of last status change */
};
(2). Call S_ISCHR to check whether the device is a character device;
(3). 調(diào)用open函數(shù)打開設(shè)備,以可讀可寫方式(O_RDWR)和無阻塞方式(O_NONBLOCK)打開,open函數(shù)返回一個(gè)文件(或設(shè)備)描述符。
2. Initialize the device by calling init_device:
(1). 調(diào)用ioctl函數(shù),對(duì)設(shè)備的I/O通道進(jìn)行管理,查詢?cè)O(shè)備能力,并將結(jié)果保存在結(jié)構(gòu)體v4l2_capability中,結(jié)構(gòu)體v4l2_capability內(nèi)容如下:
struct v4l2_capability {
    __u8    driver[16];   // name of the driver module (e.g. "bttv")
    __u8    card[32];     // name of the card (e.g. "Hauppauge WinTV")
    __u8    bus_info[32]; // name of the bus (e.g. "PCI:" + pci_name(pci_dev) )
    __u32   version;      // KERNEL_VERSION
    __u32   capabilities; // capabilities of the physical device as a whole
    __u32   device_caps;  // capabilities accessed via this particular device (node)
    __u32   reserved[3];  // reserved fields for future extensions
};
(2). Check whether it is a video capture device: V4L2_CAP_VIDEO_CAPTURE;
(3). Check whether streaming I/O is supported: V4L2_CAP_STREAMING;
(4). 調(diào)用ioctl函數(shù),查詢?cè)O(shè)備視頻裁剪和縮放功能信息,并將結(jié)果保存在結(jié)構(gòu)體v4l2_cropcap中,結(jié)構(gòu)體v4l2_cropcap內(nèi)容如下:
enum v4l2_buf_type {
    V4L2_BUF_TYPE_VIDEO_CAPTURE        = 1,
    V4L2_BUF_TYPE_VIDEO_OUTPUT         = 2,
    V4L2_BUF_TYPE_VIDEO_OVERLAY        = 3,
    V4L2_BUF_TYPE_VBI_CAPTURE          = 4,
    V4L2_BUF_TYPE_VBI_OUTPUT           = 5,
    V4L2_BUF_TYPE_SLICED_VBI_CAPTURE   = 6,
    V4L2_BUF_TYPE_SLICED_VBI_OUTPUT    = 7,
#if 1
    /* Experimental */
    V4L2_BUF_TYPE_VIDEO_OUTPUT_OVERLAY = 8,
#endif
    V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE = 9,
    V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE  = 10,
    /* Deprecated, do not use */
    V4L2_BUF_TYPE_PRIVATE              = 0x80,
};

struct v4l2_cropcap {
    __u32               type;   /* enum v4l2_buf_type */
    struct v4l2_rect    bounds;
    struct v4l2_rect    defrect;
    struct v4l2_fract   pixelaspect;
};
(5). Call ioctl to set the current cropping rectangle, described by struct v4l2_crop, whose contents are:
struct v4l2_crop {
    __u32               type;   /* enum v4l2_buf_type */
    struct v4l2_rect    c;
};
(6). Call ioctl to set the stream data format, including width, height and pixel format, storing the result in struct v4l2_format, whose contents are:
struct v4l2_pix_format {
    __u32   width;
    __u32   height;
    __u32   pixelformat;
    __u32   field;          /* enum v4l2_field */
    __u32   bytesperline;   /* for padding, zero if unused */
    __u32   sizeimage;
    __u32   colorspace;     /* enum v4l2_colorspace */
    __u32   priv;           /* private data, depends on pixelformat */
};

struct v4l2_format { // stream data format
    __u32   type; // enum v4l2_buf_type; type of the data stream
    union {
        struct v4l2_pix_format          pix;     // V4L2_BUF_TYPE_VIDEO_CAPTURE, definition of an image format
        struct v4l2_pix_format_mplane   pix_mp;  // V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE, definition of a multiplanar image format
        struct v4l2_window              win;     // V4L2_BUF_TYPE_VIDEO_OVERLAY, definition of an overlaid image
        struct v4l2_vbi_format          vbi;     // V4L2_BUF_TYPE_VBI_CAPTURE, raw VBI capture or output parameters
        struct v4l2_sliced_vbi_format   sliced;  // V4L2_BUF_TYPE_SLICED_VBI_CAPTURE, sliced VBI capture or output parameters
        __u8    raw_data[200];                   // user-defined, placeholder for future extensions and custom formats
    } fmt;
};
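A condensed sketch of steps (1)-(6), assuming fd is the descriptor returned in step 1 and that, as in the test code below, a 640x480 YUYV format is requested; the optional crop calls (4)-(5) and all error messages are collapsed into exit:

struct v4l2_capability cap;
struct v4l2_format fmt;

if (ioctl(fd, VIDIOC_QUERYCAP, &cap) == -1) exit(EXIT_FAILURE);       // (1) query capabilities
if (!(cap.capabilities & V4L2_CAP_VIDEO_CAPTURE)) exit(EXIT_FAILURE); // (2) must be a capture device
if (!(cap.capabilities & V4L2_CAP_STREAMING)) exit(EXIT_FAILURE);     // (3) must support streaming I/O

memset(&fmt, 0, sizeof(fmt));
fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
fmt.fmt.pix.width = 640;                      // (6) requested width
fmt.fmt.pix.height = 480;                     //     requested height
fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;  //     requested pixel format
fmt.fmt.pix.field = V4L2_FIELD_INTERLACED;
if (ioctl(fd, VIDIOC_S_FMT, &fmt) == -1) exit(EXIT_FAILURE); // driver may adjust width/height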
3. Since this example uses memory-mapped (MMAP) I/O, init_mmap is called:
(1). 調(diào)用ioctl函數(shù),設(shè)置內(nèi)存映射I/O,并將結(jié)果保存在結(jié)構(gòu)體v4l2_requestbuffers中,結(jié)構(gòu)體v4l2_requestbuffers內(nèi)容如下:
enum v4l2_memory {
    V4L2_MEMORY_MMAP    = 1,
    V4L2_MEMORY_USERPTR = 2,
    V4L2_MEMORY_OVERLAY = 3,
    V4L2_MEMORY_DMABUF  = 4,
};

struct v4l2_requestbuffers {
    __u32   count;
    __u32   type;       /* enum v4l2_buf_type */
    __u32   memory;     /* enum v4l2_memory */
    __u32   reserved[2];
};
(2). Call ioctl to query the status of each buffer, storing the result in struct v4l2_buffer, whose contents are:
/**
 * struct v4l2_buffer - video buffer info
 * @index:      id number of the buffer
 * @type:       enum v4l2_buf_type; buffer type (type == *_MPLANE for
 *              multiplanar buffers)
 * @bytesused:  number of bytes occupied by data in the buffer (payload);
 *              unused (set to 0) for multiplanar buffers
 * @flags:      buffer informational flags
 * @field:      enum v4l2_field; field order of the image in the buffer
 * @timestamp:  frame timestamp
 * @timecode:   frame timecode
 * @sequence:   sequence count of this frame
 * @memory:     enum v4l2_memory; the method, in which the actual video data is
 *              passed
 * @offset:     for non-multiplanar buffers with memory == V4L2_MEMORY_MMAP;
 *              offset from the start of the device memory for this plane,
 *              (or a "cookie" that should be passed to mmap() as offset)
 * @userptr:    for non-multiplanar buffers with memory == V4L2_MEMORY_USERPTR;
 *              a userspace pointer pointing to this buffer
 * @fd:         for non-multiplanar buffers with memory == V4L2_MEMORY_DMABUF;
 *              a userspace file descriptor associated with this buffer
 * @planes:     for multiplanar buffers; userspace pointer to the array of plane
 *              info structs for this buffer
 * @length:     size in bytes of the buffer (NOT its payload) for single-plane
 *              buffers (when type != *_MPLANE); number of elements in the
 *              planes array for multi-plane buffers
 * @input:      input number from which the video data has been captured
 *
 * Contains data exchanged by application and driver using one of the Streaming
 * I/O methods.
 */
struct v4l2_buffer {
    __u32                   index;
    __u32                   type;
    __u32                   bytesused;
    __u32                   flags;
    __u32                   field;
    struct timeval          timestamp;
    struct v4l2_timecode    timecode;
    __u32                   sequence;

    /* memory location */
    __u32                   memory;
    union {
        __u32               offset;
        unsigned long       userptr;
        struct v4l2_plane   *planes;
        __s32               fd;
    } m;
    __u32                   length;
    __u32                   reserved2;
    __u32                   reserved;
};
};(3). 調(diào)用mmap函數(shù),應(yīng)用程序通過內(nèi)存映射將幀緩沖區(qū)地址映射到用戶空間;通常在需要對(duì)文件進(jìn)行頻繁讀寫時(shí)使用,這樣用內(nèi)存讀寫取代I/O讀寫,以獲得較高的性能。
4. Start capturing by calling start_capturing:
(1). 調(diào)用ioctl函數(shù),VIDIOC_QBUF,并將結(jié)果保存在結(jié)構(gòu)體v4l2_buffer中;
(2). 調(diào)用ioctl函數(shù),啟動(dòng)流I/O,并將結(jié)果保存在結(jié)構(gòu)體v4l2_buf_type中;
5. Run the main loop by calling mainloop:
(1). Call FD_ZERO to clear the file descriptor set, then FD_SET to add the device descriptor to it;
(2). 調(diào)用select函數(shù);
(3). 調(diào)用ioctl函數(shù),VIDIOC_DQBUF,并將結(jié)果保存在結(jié)構(gòu)體v4l2_buffer中;
(4). 將幀數(shù)據(jù)寫入指定的文件;
(5). 調(diào)用ioctl函數(shù),VIDIOC_QBUF,并將結(jié)果保存在結(jié)構(gòu)體v4l2_buffer中。
6. Stop capturing by calling stop_capturing:
?(1). 調(diào)用ioctl函數(shù),停止流I/O,并將結(jié)果保存在結(jié)構(gòu)體v4l2_buf_type中。
7. 調(diào)用uninit_device函數(shù):
(1). 調(diào)用munmap函數(shù),取消映射設(shè)備內(nèi)存;
(2). 調(diào)用free函數(shù),刪除calloc申請(qǐng)的內(nèi)存。
8. 關(guān)閉設(shè)備,調(diào)用close_device函數(shù):
(1). 調(diào)用close函數(shù)關(guān)閉設(shè)備。
Checking the encoding types supported by this USB camera with ffmpeg shows that it offers two encodings, raw and MJPEG, corresponding to the pixel formats V4L2_PIX_FMT_YUYV and V4L2_PIX_FMT_MJPEG.
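One way to list the supported formats (the device node /dev/video0 is assumed, as in the test code; either of these commonly used commands works):

ffmpeg -f v4l2 -list_formats all -i /dev/video0
v4l2-ctl --device=/dev/video0 --list-formats-ext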
測(cè)試代碼如下:
#include "funset.hpp"
#include <string.h>
#include <assert.h>
#include <iostream>
#include <cstdio>   // fprintf, fopen, fwrite
#include <cstdlib>  // exit, EXIT_FAILURE, calloc, malloc, free

#ifndef _MSC_VER
#include <fcntl.h>
#include <unistd.h>
#include <errno.h>
#include <sys/stat.h>
#include <sys/types.h>
#include <sys/time.h>
#include <sys/mman.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

namespace {

#define CLEAR(x) memset(&(x), 0, sizeof(x))

enum io_method {
    IO_METHOD_READ,
    IO_METHOD_MMAP,
    IO_METHOD_USERPTR,
};

struct buffer {
    void *start;
    size_t length;
};

const char* dev_name = nullptr; // set in test_v4l2_usb_stream()
enum io_method io = IO_METHOD_MMAP;
int fd = -1;
struct buffer* buffers;
unsigned int n_buffers;
int out_buf;
int force_format = 1;
int frame_count = 10;
int width = 640;
int height = 480;
FILE* f;

void errno_exit(const char *s)
{
    fprintf(stderr, "%s error %d, %s\n", s, errno, strerror(errno));
    exit(EXIT_FAILURE);
}

int xioctl(int fh, int request, void *arg)
{
    int r;

    do {
        r = ioctl(fh, request, arg);
    } while (-1 == r && EINTR == errno);

    return r;
}

void process_image(const void *p, int size)
{
    /*if (out_buf)
        fwrite(p, size, 1, stdout);
    fflush(stderr);
    fprintf(stderr, ".");
    fflush(stdout);*/
    fwrite(p, size, 1, f);
}

int read_frame(void)
{
    struct v4l2_buffer buf;
    unsigned int i;

    switch (io) {
    case IO_METHOD_READ:
        if (-1 == read(fd, buffers[0].start, buffers[0].length)) {
            switch (errno) {
            case EAGAIN:
                return 0;
            case EIO:
                // Could ignore EIO, see spec. fall through
            default:
                errno_exit("read");
            }
        }
        process_image(buffers[0].start, buffers[0].length);
        break;
    case IO_METHOD_MMAP:
        CLEAR(buf);
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        if (-1 == xioctl(fd, VIDIOC_DQBUF, &buf)) {
            switch (errno) {
            case EAGAIN:
                return 0;
            case EIO:
                // Could ignore EIO, see spec. fall through
            default:
                errno_exit("VIDIOC_DQBUF");
            }
        }
        assert(buf.index < n_buffers);
        process_image(buffers[buf.index].start, buf.bytesused);
        if (-1 == xioctl(fd, VIDIOC_QBUF, &buf))
            errno_exit("VIDIOC_QBUF");
        break;
    case IO_METHOD_USERPTR:
        CLEAR(buf);
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_USERPTR;
        if (-1 == xioctl(fd, VIDIOC_DQBUF, &buf)) {
            switch (errno) {
            case EAGAIN:
                return 0;
            case EIO:
                // Could ignore EIO, see spec. fall through
            default:
                errno_exit("VIDIOC_DQBUF");
            }
        }
        for (i = 0; i < n_buffers; ++i)
            if (buf.m.userptr == (unsigned long)buffers[i].start && buf.length == buffers[i].length)
                break;
        assert(i < n_buffers);
        process_image((void *)buf.m.userptr, buf.bytesused);
        if (-1 == xioctl(fd, VIDIOC_QBUF, &buf))
            errno_exit("VIDIOC_QBUF");
        break;
    }

    return 1;
}

void mainloop(void)
{
    unsigned int count = frame_count;

    while (count-- > 0) {
        for (;;) {
            fd_set fds;
            struct timeval tv;
            int r;

            FD_ZERO(&fds);
            FD_SET(fd, &fds);

            // Timeout
            tv.tv_sec = 2;
            tv.tv_usec = 0;

            r = select(fd + 1, &fds, NULL, NULL, &tv);
            if (-1 == r) {
                if (EINTR == errno)
                    continue;
                errno_exit("select");
            }
            if (0 == r) {
                fprintf(stderr, "select timeout\n");
                exit(EXIT_FAILURE);
            }

            if (read_frame())
                break;
            // EAGAIN - continue select loop
        }
    }
}

void stop_capturing(void)
{
    enum v4l2_buf_type type;

    switch (io) {
    case IO_METHOD_READ:
        // Nothing to do
        break;
    case IO_METHOD_MMAP:
    case IO_METHOD_USERPTR:
        type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        if (-1 == xioctl(fd, VIDIOC_STREAMOFF, &type))
            errno_exit("VIDIOC_STREAMOFF");
        break;
    }
}

void start_capturing(void)
{
    unsigned int i;
    enum v4l2_buf_type type;

    switch (io) {
    case IO_METHOD_READ:
        // Nothing to do
        break;
    case IO_METHOD_MMAP:
        for (i = 0; i < n_buffers; ++i) {
            struct v4l2_buffer buf;
            CLEAR(buf);
            buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            buf.memory = V4L2_MEMORY_MMAP;
            buf.index = i;
            if (-1 == xioctl(fd, VIDIOC_QBUF, &buf))
                errno_exit("VIDIOC_QBUF");
        }
        type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        if (-1 == xioctl(fd, VIDIOC_STREAMON, &type))
            errno_exit("VIDIOC_STREAMON");
        break;
    case IO_METHOD_USERPTR:
        for (i = 0; i < n_buffers; ++i) {
            struct v4l2_buffer buf;
            CLEAR(buf);
            buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            buf.memory = V4L2_MEMORY_USERPTR;
            buf.index = i;
            buf.m.userptr = (unsigned long)buffers[i].start;
            buf.length = buffers[i].length;
            if (-1 == xioctl(fd, VIDIOC_QBUF, &buf))
                errno_exit("VIDIOC_QBUF");
        }
        type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        if (-1 == xioctl(fd, VIDIOC_STREAMON, &type))
            errno_exit("VIDIOC_STREAMON");
        break;
    }

    f = fopen("usb.yuv", "w");
    if (!f) {
        errno_exit("fail to open file");
    }
}

void uninit_device(void)
{
    unsigned int i;

    switch (io) {
    case IO_METHOD_READ:
        free(buffers[0].start);
        break;
    case IO_METHOD_MMAP:
        for (i = 0; i < n_buffers; ++i)
            if (-1 == munmap(buffers[i].start, buffers[i].length))
                errno_exit("munmap");
        break;
    case IO_METHOD_USERPTR:
        for (i = 0; i < n_buffers; ++i)
            free(buffers[i].start);
        break;
    }

    free(buffers);
    fclose(f);
}

void init_read(unsigned int buffer_size)
{
    buffers = (struct buffer*)calloc(1, sizeof(*buffers));
    if (!buffers) {
        fprintf(stderr, "Out of memory\n");
        exit(EXIT_FAILURE);
    }

    buffers[0].length = buffer_size;
    buffers[0].start = malloc(buffer_size);
    if (!buffers[0].start) {
        fprintf(stderr, "Out of memory\n");
        exit(EXIT_FAILURE);
    }
}

void init_mmap(void)
{
    struct v4l2_requestbuffers req;
    CLEAR(req);
    req.count = 4;
    req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;

    if (-1 == xioctl(fd, VIDIOC_REQBUFS, &req)) {
        if (EINVAL == errno) {
            fprintf(stderr, "%s does not support memory mapping\n", dev_name);
            exit(EXIT_FAILURE);
        } else {
            errno_exit("VIDIOC_REQBUFS");
        }
    }

    if (req.count < 2) {
        fprintf(stderr, "Insufficient buffer memory on %s\n", dev_name);
        exit(EXIT_FAILURE);
    }

    buffers = (struct buffer*)calloc(req.count, sizeof(*buffers));
    if (!buffers) {
        fprintf(stderr, "Out of memory\n");
        exit(EXIT_FAILURE);
    }

    for (n_buffers = 0; n_buffers < req.count; ++n_buffers) {
        struct v4l2_buffer buf;
        CLEAR(buf);
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = n_buffers;

        if (-1 == xioctl(fd, VIDIOC_QUERYBUF, &buf))
            errno_exit("VIDIOC_QUERYBUF");

        buffers[n_buffers].length = buf.length;
        buffers[n_buffers].start =
            mmap(NULL, // start anywhere
                 buf.length,
                 PROT_READ | PROT_WRITE, // required
                 MAP_SHARED, // recommended
                 fd, buf.m.offset);

        if (MAP_FAILED == buffers[n_buffers].start)
            errno_exit("mmap");
    }
}

void init_userp(unsigned int buffer_size)
{
    struct v4l2_requestbuffers req;
    CLEAR(req);
    req.count = 4;
    req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_USERPTR;

    if (-1 == xioctl(fd, VIDIOC_REQBUFS, &req)) {
        if (EINVAL == errno) {
            fprintf(stderr, "%s does not support user pointer i/o\n", dev_name);
            exit(EXIT_FAILURE);
        } else {
            errno_exit("VIDIOC_REQBUFS");
        }
    }

    buffers = (struct buffer*)calloc(4, sizeof(*buffers));
    if (!buffers) {
        fprintf(stderr, "Out of memory\n");
        exit(EXIT_FAILURE);
    }

    for (n_buffers = 0; n_buffers < 4; ++n_buffers) {
        buffers[n_buffers].length = buffer_size;
        buffers[n_buffers].start = malloc(buffer_size);
        if (!buffers[n_buffers].start) {
            fprintf(stderr, "Out of memory\n");
            exit(EXIT_FAILURE);
        }
    }
}

void init_device(void)
{
    struct v4l2_capability cap;
    struct v4l2_cropcap cropcap;
    struct v4l2_crop crop;
    struct v4l2_format fmt;
    unsigned int min;

    if (-1 == xioctl(fd, VIDIOC_QUERYCAP, &cap)) {
        if (EINVAL == errno) {
            fprintf(stderr, "%s is no V4L2 device\n", dev_name);
            exit(EXIT_FAILURE);
        } else {
            errno_exit("VIDIOC_QUERYCAP");
        }
    }

    if (!(cap.capabilities & V4L2_CAP_VIDEO_CAPTURE)) {
        fprintf(stderr, "%s is no video capture device\n", dev_name);
        exit(EXIT_FAILURE);
    }

    switch (io) {
    case IO_METHOD_READ:
        if (!(cap.capabilities & V4L2_CAP_READWRITE)) {
            fprintf(stderr, "%s does not support read i/o\n", dev_name);
            exit(EXIT_FAILURE);
        }
        break;
    case IO_METHOD_MMAP:
    case IO_METHOD_USERPTR:
        if (!(cap.capabilities & V4L2_CAP_STREAMING)) {
            fprintf(stderr, "%s does not support streaming i/o\n", dev_name);
            exit(EXIT_FAILURE);
        }
        break;
    }

    // Select video input, video standard and tune here
    CLEAR(cropcap);
    cropcap.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (0 == xioctl(fd, VIDIOC_CROPCAP, &cropcap)) {
        crop.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        crop.c = cropcap.defrect; // reset to default
        if (-1 == xioctl(fd, VIDIOC_S_CROP, &crop)) {
            switch (errno) {
            case EINVAL:
                // Cropping not supported
                break;
            default:
                // Errors ignored
                break;
            }
        }
    } else {
        // Errors ignored
    }

    CLEAR(fmt);
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (force_format) {
        fmt.fmt.pix.width = width;
        fmt.fmt.pix.height = height;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
        fmt.fmt.pix.field = V4L2_FIELD_INTERLACED;
        // Note VIDIOC_S_FMT may change width and height
        if (-1 == xioctl(fd, VIDIOC_S_FMT, &fmt))
            errno_exit("VIDIOC_S_FMT");
    } else {
        // Preserve original settings as set by v4l2-ctl for example
        if (-1 == xioctl(fd, VIDIOC_G_FMT, &fmt))
            errno_exit("VIDIOC_G_FMT");
    }

    // Buggy driver paranoia
    min = fmt.fmt.pix.width * 2;
    if (fmt.fmt.pix.bytesperline < min)
        fmt.fmt.pix.bytesperline = min;
    min = fmt.fmt.pix.bytesperline * fmt.fmt.pix.height;
    if (fmt.fmt.pix.sizeimage < min)
        fmt.fmt.pix.sizeimage = min;

    switch (io) {
    case IO_METHOD_READ:
        init_read(fmt.fmt.pix.sizeimage);
        break;
    case IO_METHOD_MMAP:
        init_mmap();
        break;
    case IO_METHOD_USERPTR:
        init_userp(fmt.fmt.pix.sizeimage);
        break;
    }
}

void close_device(void)
{
    if (-1 == close(fd))
        errno_exit("close");
    fd = -1;
}

void open_device(void)
{
    struct stat st;

    if (-1 == stat(dev_name, &st)) {
        fprintf(stderr, "Cannot identify '%s': %d, %s\n", dev_name, errno, strerror(errno));
        exit(EXIT_FAILURE);
    }

    if (!S_ISCHR(st.st_mode)) {
        fprintf(stderr, "%s is no device\n", dev_name);
        exit(EXIT_FAILURE);
    }

    fd = open(dev_name, O_RDWR | O_NONBLOCK, 0); // O_RDWR: required
    if (-1 == fd) {
        fprintf(stderr, "Cannot open '%s': %d, %s\n", dev_name, errno, strerror(errno));
        exit(EXIT_FAILURE);
    }
}

} // namespace

int test_v4l2_usb_stream()
{
    // reference: https://linuxtv.org/downloads/v4l-dvb-apis-new/uapi/v4l/capture.c.html
    dev_name = "/dev/video0";

    open_device();
    init_device();
    start_capturing();
    mainloop();
    stop_capturing();
    uninit_device();
    close_device();

    fprintf(stdout, "test finish\n");
    return 0;
}

#else

int test_v4l2_usb_stream()
{
    fprintf(stderr, "Error: this test code only supports the Linux platform\n");
    return -1;
}

#endif

The generated usb.yuv file can be played back with ffplay to check the captured frames.
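For example, assuming the 640x480 YUYV format requested above was accepted (ffplay's name for V4L2_PIX_FMT_YUYV is yuyv422; adjust -video_size if the driver negotiated a different size):

ffplay -f rawvideo -pixel_format yuyv422 -video_size 640x480 usb.yuv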
GitHub: https://github.com/fengbingchun/OpenCV_Test