Stephen Crampton wrote:
I've implemented a simple capture program for my webcam; however, my frame
rate is very slow (around 4 fps). Under Windows XP, my system achieves a
frame rate of 30 fps (I have a Dell Inspiron 8200 with a Pentium 4 at 1.4
GHz and a Creative WebCam III).
The Windows driver supports compression, which allows for a much higher
frame rate. The Linux driver does too, but you'll need to get the latest
stable version (1.62) from http://alpha.dyndns.org/ov511/ . After
compiling the driver, you need to:
insmod ov511.o compress=1
insmod ov511_decomp.o
Then check dmesg. If the driver reports that your camera has an OV7620
sensor, you should be able to get 30 FPS at up to 352x288, and 10-15 FPS
at 640x480. If it says you have something else, the frame rate may be lower.
...
info.format = VIDEO_PALETTE_RGB24;
The newer drivers (2.01+) and the driver in the 2.5 kernel don't support
RGB/BGR any more. You might want to capture VIDEO_PALETTE_YUV420P images
instead, and use the conversion code that's in the 1.62 driver
(yuv420p_to_rgb()) to do the conversion in user-space. That will work
with the 1.xx drivers too.
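For reference, here is a rough user-space sketch of that conversion, assuming
the usual ITU-R 601 coefficients. The actual yuv420p_to_rgb() in the 1.62
driver may differ in rounding, range handling, and byte order (V4L1's "RGB24"
is often stored BGR), so treat this as illustrative rather than a drop-in
replacement:

static unsigned char clamp_byte(int v)
{
    return (unsigned char)(v < 0 ? 0 : (v > 255 ? 255 : v));
}

/* Convert one planar YUV 4:2:0 frame to packed 24-bit RGB.
 * src: Y plane (w*h bytes), then U and V planes ((w/2)*(h/2) bytes each).
 * dst: w*h*3 bytes of output. */
void yuv420p_to_rgb24(const unsigned char *src, unsigned char *dst,
                      int w, int h)
{
    const unsigned char *yp = src;
    const unsigned char *up = src + w * h;
    const unsigned char *vp = up + (w / 2) * (h / 2);
    int x, y;

    for (y = 0; y < h; y++) {
        for (x = 0; x < w; x++) {
            int Y = yp[y * w + x];
            int U = up[(y / 2) * (w / 2) + x / 2] - 128;
            int V = vp[(y / 2) * (w / 2) + x / 2] - 128;
            unsigned char *p = dst + (y * w + x) * 3;

            /* integer approximation of ITU-R 601 YUV -> RGB */
            p[0] = clamp_byte(Y + ((359 * V) >> 8));          /* R */
            p[1] = clamp_byte(Y - ((88 * U + 183 * V) >> 8)); /* G */
            p[2] = clamp_byte(Y + ((454 * U) >> 8));          /* B */
        }
    }
}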
...
I notice from the API that you can have the camera dump frames directly to
the video card's frame buffer.
The driver doesn't support that for a number of reasons, not the least
of which is that most video cards don't support a YUV420P frame buffer.
However, I want to be able to manipulate
the images in real time in my code, so I need them in main memory.
Any suggestions?
Just use mmap() like you're doing now, but use multiple buffers (the
driver supports up to two by default). Here's a pseudo-code example from
the V4L API docs:
/* setup everything */
VIDIOCMCAPTURE(0)
while (whatever) {
    VIDIOCMCAPTURE(1)
    VIDIOCSYNC(0)
    /* process frame 0 while the hardware captures frame 1 */
    VIDIOCMCAPTURE(0)
    VIDIOCSYNC(1)
    /* process frame 1 while the hardware captures frame 0 */
}
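If it helps, here's an expanded (untested) C sketch of that loop against the
V4L1 mmap interface. The device path, resolution, palette, and frame count
are placeholders; adjust them and add proper error handling for your setup:

#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <linux/videodev.h>   /* V4L1 header on 2.4/2.5-era systems */

int main(void)
{
    int fd = open("/dev/video0", O_RDWR);   /* assumed device node */
    struct video_mbuf mbuf;
    struct video_mmap vm;
    unsigned char *map;
    int frame, i;

    if (fd < 0 || ioctl(fd, VIDIOCGMBUF, &mbuf) < 0) {
        perror("VIDIOCGMBUF");
        return 1;
    }

    /* map all of the driver's capture buffers in one go */
    map = mmap(NULL, mbuf.size, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (map == MAP_FAILED) {
        perror("mmap");
        return 1;
    }

    vm.width  = 352;                        /* CIF; adjust to your sensor */
    vm.height = 288;
    vm.format = VIDEO_PALETTE_YUV420P;

    /* queue frame 0, then alternate: sync on one buffer while the
     * hardware captures into the other */
    vm.frame = 0;
    ioctl(fd, VIDIOCMCAPTURE, &vm);

    for (i = 0; i < 100; i++) {
        vm.frame = (i + 1) % 2;             /* start capturing the other buffer */
        ioctl(fd, VIDIOCMCAPTURE, &vm);

        frame = i % 2;                      /* wait for the buffer queued last time */
        ioctl(fd, VIDIOCSYNC, &frame);

        /* process the YUV420P data at map + mbuf.offsets[frame] here */
    }

    munmap(map, mbuf.size);
    close(fd);
    return 0;
}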
--
Mark McClelland
mark@xxxxxxxxxxxxxxxx