Hello all,

with the 2.5.x version of Video4Linux2, some problems concerning the behaviour of streaming capture, read() capture and overlay video arise. I've discussed this in private with Gerd Knorr, but we haven't come to a definite solution.

Because all opens on the device have equal rights, we need to specify some sort of importance or "order" between the different uses. First of all, we have to define what the different uses are:

- overlay video: clear.
- streaming capture: full-fps capture via REQBUFS, STREAMON, QBUF, DQBUF, STREAMOFF (see the small sketch at the end of this mail).
- read() capture: capture via the read() system call.

First question: should capture via read() be able to deliver all frames (i.e. 25 fps for PAL) by definition?

I think we all agree that one application should not be able to "steal" capture frames from another application that is already capturing. But what about overlay <=> capture?

Ok, the first attempt at a definition might be:

"All opens have equal rights and all uses have equal rights."

This would mean overlay and capture cannot interrupt each other. If one application does overlay video (e.g. xawtv), another application (e.g. motion detection) cannot do any capture of any sort. Both applications would need to synchronize with each other, so that the first one can do an "overlay off" before the other one can grab frames. For driver authors, this is very easy to implement.

Second attempt:

"Capture is always more important than overlay (regardless of streaming capture or capture via read())."

This would mean that one application can completely "steal" the "focus" from an application that is currently doing overlay video. Is this desirable?

Third attempt:

"Streaming capture can only be activated if no other capture/overlay is running. But you can capture single frames via read() while overlay is running."

This is tricky for driver authors. Most of the time, read() and streaming capture are very similar, so it's possible to steal all frames from the overlay application via read() by simply calling it endlessly. To avoid this, the driver would have to do some sort of fair queueing. (Not very elegant and very driver dependent -- not the way we want it, I assume.)

Another problem for driver authors: if there is an overlay from application A and application B wants to capture a frame via read(), the driver needs to reprogram the capture engine and then capture the frame. After that, it needs to reprogram the engine to do overlay video again, but, most likely, the needed information is saved in a per-open structure for open A, which is not accessible in the read() call by application B...

One solution would be to save the state of the capture engine, do the capture, then reprogram it with the old state (rough sketch at the end of this mail). But in this case, it would be almost impossible to get full fps for read() capture because of the reprogramming overhead.

------------

bttv/saa7134 currently do solution 3) implicitly, with the problem that other applications can completely steal all frames from the overlay application by using read(). bttv uses a hack to save the overlay status before activating the capture, so it can be reprogrammed afterwards. saa7134 has two independent video tasks, so it's no problem there. But for other, mostly inferior hardware this is perhaps a real problem.

So how do we solve these problems? Any ideas are really appreciated.

CU
Michael.
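
P.S.: To make the terminology concrete, this is roughly what I mean by "streaming capture", seen from the application side. It is written against the V4L2 ioctls as they appear in the current headers (VIDIOC_REQBUFS, VIDIOC_QUERYBUF, VIDIOC_QBUF, VIDIOC_STREAMON, VIDIOC_DQBUF, VIDIOC_STREAMOFF); struct fields may differ in detail in the 2.5.x tree, and error handling is left out, so please treat it as an illustration only:

/* Minimal mmap streaming capture loop: REQBUFS -> QUERYBUF/mmap/QBUF ->
 * STREAMON -> DQBUF/QBUF ... -> STREAMOFF.  Illustration only, no error
 * handling, no munmap/cleanup. */
#include <fcntl.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
    int fd = open("/dev/video0", O_RDWR);

    struct v4l2_requestbuffers req;
    memset(&req, 0, sizeof(req));
    req.count  = 4;
    req.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    ioctl(fd, VIDIOC_REQBUFS, &req);          /* ask the driver for buffers */

    void *maps[req.count];                    /* driver may adjust the count */
    for (unsigned i = 0; i < req.count; i++) {
        struct v4l2_buffer buf;
        memset(&buf, 0, sizeof(buf));
        buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index  = i;
        ioctl(fd, VIDIOC_QUERYBUF, &buf);     /* where/how big is buffer i? */
        maps[i] = mmap(NULL, buf.length, PROT_READ | PROT_WRITE,
                       MAP_SHARED, fd, buf.m.offset);
        ioctl(fd, VIDIOC_QBUF, &buf);         /* hand it to the driver */
    }

    enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    ioctl(fd, VIDIOC_STREAMON, &type);        /* start the capture engine */

    for (int frame = 0; frame < 100; frame++) {
        struct v4l2_buffer buf;
        memset(&buf, 0, sizeof(buf));
        buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        ioctl(fd, VIDIOC_DQBUF, &buf);        /* wait for a filled buffer */
        /* ... process maps[buf.index], buf.bytesused bytes ... */
        ioctl(fd, VIDIOC_QBUF, &buf);         /* give it back for reuse */
    }

    ioctl(fd, VIDIOC_STREAMOFF, &type);
    close(fd);
    return 0;
}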
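
And this is the "save state, capture, restore state" idea from above, sketched as driver code. All names (struct mydev, the engine_* helpers, file_to_mydev) are invented for illustration -- this is not bttv or saa7134 code, and it is written in today's kernel style rather than 2.5.x style. It only shows where the per-read() reprogramming overhead comes from:

#include <linux/fs.h>
#include <linux/mutex.h>

struct engine_state {
    int overlay_active;
    /* window position, clipping, pixel format, ... */
};

struct mydev {
    struct mutex lock;          /* serializes access to the hardware */
    struct engine_state state;  /* how the capture engine is programmed now */
};

/* hypothetical hardware helpers */
static void engine_stop(struct mydev *dev);
static void engine_program_capture(struct mydev *dev);
static void engine_capture_one_frame(struct mydev *dev);      /* blocks */
static void engine_program(struct mydev *dev, struct engine_state *s);
static struct mydev *file_to_mydev(struct file *file);        /* per-device */

static ssize_t mydev_read(struct file *file, char __user *buf,
                          size_t count, loff_t *ppos)
{
    struct mydev *dev = file_to_mydev(file);
    struct engine_state saved;

    mutex_lock(&dev->lock);

    /* Remember how the engine is programmed right now.  We have to use the
     * global device state: the per-open data of the overlay open
     * (application A) is not reachable from B's read() call. */
    saved = dev->state;

    if (saved.overlay_active)
        engine_stop(dev);

    engine_program_capture(dev);
    engine_capture_one_frame(dev);

    /* Put the engine back the way the overlay application left it.  Doing
     * this once per read() is exactly the overhead that makes full-rate
     * read() capture unrealistic with this scheme. */
    engine_program(dev, &saved);
    dev->state = saved;

    mutex_unlock(&dev->lock);

    /* copy_to_user() of the captured frame omitted */
    return count;
}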