Re: How to detect loss of COMPOSITE signal?




Michel Bardiaux (mbardiaux@xxxxxxxxxxx):

> But this led me to a related question: when there is no composite or
> no RF, I have seen that a sequence VIDIOCMCAPTURE+VIDIOCSYNC does not
> block, and one gets either noise, or a frozen image, or a patchwork of
> both. But what is the *timing*? Will VIDIOCSYNC return at once? None
> of the V4L documents I have found seems to say anything about that.

  If there is no signal, the bttv driver here seems to sometimes block
for up to 100ms or so when calling VIDIOCSYNC.  In my app (tvtime), this
can significantly reduce interactivity as you try to navigate around
channels with no signal.
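
  For reference, the capture sequence in question looks roughly like
this (a minimal V4L1 sketch, assuming an fd already opened on
/dev/video* and buffers set up via VIDIOCGMBUF/mmap, which I've
omitted). With no signal, the VIDIOCSYNC call is the one that may
block:

```c
#include <sys/ioctl.h>
#include <linux/videodev.h>   /* V4L1 API */

/* Queue a capture into mmap'ed buffer `frame` and wait for it.
 * Returns 0 on success, -1 on error. */
static int grab_frame(int fd, int frame, int width, int height)
{
    struct video_mmap vm;
    vm.frame  = frame;                  /* which buffer to fill */
    vm.width  = width;
    vm.height = height;
    vm.format = VIDEO_PALETTE_YUV422;   /* example pixel format */

    if (ioctl(fd, VIDIOCMCAPTURE, &vm) < 0)
        return -1;                      /* queue the capture */

    /* With no signal, this is where I see bttv block for up to
     * 100ms or so; otherwise it returns when the frame is done. */
    return ioctl(fd, VIDIOCSYNC, &frame);
}
```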

  To get around this, tvtime relies on VIDIOCGTUNER to give us the
signal bit, and we only capture new frames if the card reports a signal.
If the ioctl fails, or the input has no tuner, we have no choice but to
always assume that there is a signal.  If no signal is present, tvtime
shows a blue frame and says "No signal".
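
  The check is roughly this (a sketch, not tvtime's exact code), with
the same fallback: if GTUNER fails we assume a signal is present:

```c
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev.h>   /* V4L1 API */

/* Return 1 if we should capture, 0 if we should show "No signal". */
static int have_signal(int fd)
{
    struct video_tuner tuner;
    memset(&tuner, 0, sizeof(tuner));
    tuner.tuner = 0;                    /* query the first tuner */

    if (ioctl(fd, VIDIOCGTUNER, &tuner) < 0)
        return 1;   /* ioctl failed or input has no tuner:
                     * no choice but to assume a signal */

    return tuner.signal != 0;           /* card reports a signal */
}
```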

  There are two problems with this:

  1. Some users have over-the-air channels with bad reception, fooling
     the signal detection on the card, and these become unwatchable.
  2. I worry about drivers that answer GTUNER but don't set the
     signal field correctly, although I have not seen any yet.

  To get around this, we currently have a config file option to shut off
our signal detection, but of course that sucks (users need to discover
that it exists, and know it would help).

  Hope that helps,
  -Billy




