Mark,

A few additional questions:

> The Windows driver supports compression, which allows for a much higher
> frame rate. The Linux driver does too, but you'll need to get the latest
> stable version (1.62) from http://alpha.dyndns.org/ov511/ . After
> compiling the driver, you need to:
>
> insmod ov511.o compress=1
> insmod ov511_decomp.o
>
> Then check dmesg. If the driver reports that your camera has an OV7620
> sensor, you should be able to get 30 FPS at up to 352x288, and 10-15 FPS
> at 640x480. If it says you have something else, the frame rate may be lower.

FYI, dmesg reports a USB OV511+ video device found, "Sensor is an OV7620."

When I use compression as you suggest, there are faint but noticeable squares
throughout the image. They look like artifacts of the compression scheme. Are
there options that affect how the compression is done? I would be willing to
trade frame rate for image quality; if I could achieve a consistent 15 fps
with decent image quality, I would be satisfied. Currently, I'm getting a
highly variable frame rate that jumps between about 20 and 37 fps (the rate
is measured each frame, so it reflects the time between the current and
previous frames).

> The newer drivers (2.01+) and the driver in the 2.5 kernel don't support
> RGB/BGR any more. You might want to capture VIDEO_PALETTE_YUV420P images
> instead, and use the conversion code that's in the 1.62 driver
> (yuv420p_to_rgb()) to do the conversion in user-space. That will work
> with the 1.xx drivers too.

Do you mean to say that the RGB format is supplied by the driver? I thought
that was the native mode of the hardware. Also, I thought I heard somewhere
that there were video cameras that perform compression on the CCD chip
itself, in parallel, much more efficiently than in software. I'm doing
research in computer vision, and I would prefer that technical issues, such
as acquisition of images, consume as few resources as possible.
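To make sure I understand the user-space conversion you're suggesting, here
is a rough sketch of what I would write. It uses the standard integer
approximation of the YUV-to-RGB formulas; I'm not assuming it matches the
rounding details of the driver's yuv420p_to_rgb(), and the function names
are my own:

```c
/* Sketch: convert one planar YUV420P frame to packed RGB24.
 * Layout assumed: width*height Y bytes, then (width/2)*(height/2)
 * U bytes, then the same number of V bytes. */

static unsigned char clamp_u8(int v)
{
    if (v < 0)   return 0;
    if (v > 255) return 255;
    return (unsigned char)v;
}

void yuv420p_to_rgb24(const unsigned char *src, unsigned char *dst,
                      int width, int height)
{
    const unsigned char *y_plane = src;
    const unsigned char *u_plane = src + width * height;
    const unsigned char *v_plane = u_plane + (width / 2) * (height / 2);
    int row, col;

    for (row = 0; row < height; row++) {
        for (col = 0; col < width; col++) {
            int y = y_plane[row * width + col];
            /* chroma is subsampled 2x2, so one U,V pair per 2x2 block */
            int u = u_plane[(row / 2) * (width / 2) + (col / 2)] - 128;
            int v = v_plane[(row / 2) * (width / 2) + (col / 2)] - 128;
            unsigned char *p = dst + 3 * (row * width + col);

            /* fixed-point coefficients: 1.402*256 ~ 359, 0.344*256 ~ 88,
             * 0.714*256 ~ 183, 1.772*256 ~ 454 */
            p[0] = clamp_u8(y + ((359 * v) >> 8));           /* R */
            p[1] = clamp_u8(y - ((88 * u + 183 * v) >> 8));  /* G */
            p[2] = clamp_u8(y + ((454 * u) >> 8));           /* B */
        }
    }
}
```

If that is roughly what the 1.62 driver does, I can keep the conversion in
my own code and stay compatible with the 2.x drivers.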
What I'm getting at is: what's the most efficient way of doing things that
would be forward compatible? Is YUV420P the native mode of the chip? Given
my requirements, would that be your recommendation?

Also, the image freezes briefly every so often. Do you have recommendations
for how to keep the frame rate more or less constant? I have implemented the
double-buffering you suggested. I tried implementing a frame queue, but I
think it adds a lot of overhead as well as a time lag. Since I'm interested
in using video input to control the computer, I would prefer to minimize the
time lag.

One other point regarding compression and image quality. From my standpoint,
the important issue is consistency from frame to frame, so that I can, e.g.,
track objects. I use brightness (e.g., grayscale images) for most of my
routines, but sometimes I am interested in color (e.g., skin color). If I
could get much better performance, I might opt for grayscale images from the
camera. (This would be pointless, however, if the camera always sends color
and the driver merely converts the images to grayscale.) Like I said
previously, I don't need 30 fps, but 4 fps is a bit low. Somewhere in the
middle would be ideal.
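On the grayscale point: if I capture VIDEO_PALETTE_YUV420P, my understanding
is that the first width*height bytes of the frame are the luminance (Y)
plane, which is already an 8-bit grayscale image, so extracting grayscale
should cost essentially nothing regardless of what the camera sends. A
minimal sketch, assuming that standard planar layout (the function name is
mine):

```c
#include <string.h>

/* Sketch: in YUV420P the Y plane comes first and is itself an 8-bit
 * grayscale image, so "converting" to grayscale is just a copy (or,
 * even cheaper, a pointer into the frame buffer). */
void yuv420p_to_gray(const unsigned char *src, unsigned char *gray,
                     int width, int height)
{
    memcpy(gray, src, (size_t)(width * height));
}
```

Is that correct? If so, I can run my brightness-based routines straight off
the Y plane and only pay for the chroma conversion in the frames where I
need color.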