Re: Re: v4l2 api




Michael H. Schimek wrote:

> Gerd Knorr wrote:
>> That reminds me, how should (digital) USB webcams implement
>> VIDIOC_ENUMSTD?  Should they set std.id = V4L2_STD_UNKNOWN? Should they
>> just return -EINVAL? Or should we have a V4L2_STD_DIGITAL_CAM for
>> devices where modulation standards and fixed frame rates don't apply?
> Just returning -EINVAL looks fine to me.

> What can we really expect from usb cams? Will they appear like tv
> cards, just dropping 4/5 frames or something like that? Even when lots of
> frames are missing they still arrive at some regular interval, a
> multiple of 33 or 40 ms I suppose.
It's not even safe to assume that, particularly if the frames are compressed. Even without compression, there are a dozen things that can cause the interval to change unpredictably. At least with the cameras I work with (OV511/OV518), though, it is possible to get a fairly predictable FPS, maybe within +/- 5% averaged over a few seconds. The cameras have internal feedback loops that vary compression, exposure, etc. to try to get as close to the desired frame rate as possible, but it can take a few seconds to stabilize.

> Assuming the wrong frame period, timestamps will suggest a non-integer
> number of frames missing between any two received frames. A decent app
> with good a/v sync will handle that and accumulate the error fractions.
> But when such a stream is recorded all the timestamps may be lost,
> leaving only the alleged frame rate (probably 30 Hz, being an upper
> limit) and no way to determine how far apart any two frames or fields
> really are.
It isn't necessarily a matter of missing frames. The frame interval of the image sensor itself may not be constant, or even a multiple of any reasonable value, due to variables such as exposure time. As long as the camera uses real-time (isochronous) transmission though, the timestamps should be fairly accurate in the relative sense. They won't be accurate in the absolute sense (e.g. for syncing audio to video), though: the camera's internal pipeline can be quite long, so there can be a significant delay between when the image is sensed and when it is timestamped by the driver.
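
For illustration, a rough sketch of how an app could watch the actual interval between dequeued frames, assuming mmap streaming I/O and the videodev2.h definitions (buffers already queued, streaming on, error handling cut down):

/*
 * Print the time between successive captured frames, using the
 * driver-supplied v4l2_buffer.timestamp rather than assuming a
 * nominal frame rate.
 */
#include <stdio.h>
#include <string.h>
#include <sys/time.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

static void watch_intervals(int fd, int frames)
{
    struct timeval prev = { 0, 0 };

    for (int i = 0; i < frames; i++) {
        struct v4l2_buffer buf;

        memset(&buf, 0, sizeof(buf));
        buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;

        if (ioctl(fd, VIDIOC_DQBUF, &buf) < 0) {
            perror("VIDIOC_DQBUF");
            return;
        }

        if (i > 0) {
            long usec = (buf.timestamp.tv_sec  - prev.tv_sec) * 1000000L
                      + (buf.timestamp.tv_usec - prev.tv_usec);
            printf("frame %u: %ld us since previous frame\n",
                   buf.sequence, usec);
        }
        prev = buf.timestamp;

        /* hand the buffer back to the driver */
        if (ioctl(fd, VIDIOC_QBUF, &buf) < 0) {
            perror("VIDIOC_QBUF");
            return;
        }
    }
}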

Obviously, none of that is good enough for recording to, e.g., a VCD-compliant MPEG, but there are file formats out there that can deal with variable frame intervals.

> What about the v4l2_buffer.sequence field? I always assumed this is the
> number of transmitted, not read frames. The app can count read frames
> itself, so the latter would be useless. When a usb cam driver behaves
> accordingly, the sequence number won't match the assumed frame period,
> greatly confusing apps. Or will v4l2_std_id = V4L2_STD_UNKNOWN change
> the semantics, considering "transmitted" to now mean those sent over usb,
> instead of the number of frames really taken by the camera?

Well, there's still the possibility that the driver might need to drop frames if the app can't keep up for whatever reason. That's probably worth accounting for, since the 100 ms discrepancy due to a lost frame at 10 FPS is much larger than anything the camera itself might cause.
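
A sketch of how an app could spot those driver-side drops, assuming v4l2_buffer.sequence counts frames captured by the device rather than frames handed to the app (the semantics discussed above):

#include <linux/videodev2.h>

/*
 * Number of frames lost between two successively dequeued buffers.
 * Consecutive frames differ by exactly 1 in their sequence numbers;
 * any larger gap means frames were dropped in between.
 */
static unsigned int frames_dropped(const struct v4l2_buffer *prev,
                                   const struct v4l2_buffer *cur)
{
    return cur->sequence - prev->sequence - 1;
}

The unsigned subtraction also does the right thing if the counter ever wraps around.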

IMHO the frame period shouldn't be assumed based on v4l2_std_id. There are all sorts of problems with doing so:

- USB TV capture devices, for the reasons mentioned above.
- ATSC, where there are 6 possible frame rates (23.976, 24, 29.97, 30, 59.94, 60), yet only two modulation standards.
- NTSC, where the field rate could conceivably be 60 Hz (and not 59.94 Hz). Not quite sure about that.
- Security camera hacks, where multiple sources are multiplexed into a single CVBS input and demuxed into multiple V4L devices by the driver.

Not to mention that it adds unnecessary complexity to apps, which really shouldn't care about the modulation standard, aside from letting the user pick one.
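
For illustration, here is roughly what the app side could look like, assuming the videodev2.h VIDIOC_ENUMSTD interface: enumerate whatever the driver offers, let the user pick, and treat EINVAL at index 0 as "no analogue standards" (the webcam case above).

#include <errno.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* List the video standards a device supports; returns how many it found. */
static int list_standards(int fd)
{
    int n = 0;

    for (;;) {
        struct v4l2_standard std;

        memset(&std, 0, sizeof(std));
        std.index = n;

        if (ioctl(fd, VIDIOC_ENUMSTD, &std) < 0) {
            if (errno == EINVAL)
                break;          /* end of list (or no standards at all) */
            perror("VIDIOC_ENUMSTD");
            return -1;
        }

        printf("%d: %s (%u/%u s per frame)\n", n, (const char *)std.name,
               std.frameperiod.numerator, std.frameperiod.denominator);
        n++;
    }

    if (n == 0)
        printf("no analogue standards - probably a webcam-like device\n");

    return n;
}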

>> And to answer your question, with USB webcams you cannot infer the
>> timeperframe from any other fields in the V4L(2) API.
> OK, then let's keep it.

> You do know v4l2_captureparm.timeperframe was intended to request a
> different frame period (frame skipping on the driver side, reducing the
> bus load), I suppose. Of course the app can request 33 ms and see what
> the driver is able to sustain, but that's not the primary purpose. The
> entire struct and ioctl are concerned with optimizing the capture
> process, which is optional, as I understand it.

I have no problem with it being optional. In a lot of cases (viewing live video, webcams) the exact frame interval doesn't matter. Apps that care can set the video standard, and then do VIDIOC_G_PARM to see what the driver thinks the timeperframe is for that standard.
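
Roughly, assuming the videodev2.h v4l2_streamparm / v4l2_captureparm definitions, that could look like the sketch below; drivers that don't implement the ioctl will simply fail it, which fits the "optional" nature mentioned above.

#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Ask the driver what frame period it believes it is delivering. */
static void show_timeperframe(int fd)
{
    struct v4l2_streamparm parm;

    memset(&parm, 0, sizeof(parm));
    parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

    if (ioctl(fd, VIDIOC_G_PARM, &parm) < 0) {
        perror("VIDIOC_G_PARM");
        return;
    }

    printf("driver reports %u/%u s per frame\n",
           parm.parm.capture.timeperframe.numerator,
           parm.parm.capture.timeperframe.denominator);

    /* A driver that supports frame skipping advertises it here; the app
     * could then request a longer timeperframe with VIDIOC_S_PARM. */
    if (parm.parm.capture.capability & V4L2_CAP_TIMEPERFRAME)
        printf("timeperframe is adjustable via VIDIOC_S_PARM\n");
}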

--
Mark McClelland
mark@xxxxxxxxxxxxxxxx





