Re: How to detect loss of COMPOSITE signal?




Billy Biggs wrote:
Michel Bardiaux (mbardiaux@xxxxxxxxxxx):


But this led me to a related question: when there is no composite or
no RF, I have seen that a sequence VIDIOCMCAPTURE+VIDIOCSYNC does not
block, and one gets either noise, or a frozen image, or a patchwork of
both. But what is the *timing*? Will VIDIOCSYNC return at once? None
of the V4L documents I have found seems to say anything about that.


  If there is no signal, the bttv driver here seems to sometimes block
for up to 100ms or so when calling VIDIOCSYNC.  In my app (tvtime), this
can significantly reduce interactivity as you try to navigate around
channels with no signal.

I have some more data now.

At this stage, my capture application grabs only 2 frames per second, using simple VIDIOCMCAPTURE+VIDIOCSYNC pairs; I observe delays between 80 and 120ms, whether there is a signal or not! The delay changes from session to session, but seems stable from second to second within one session; EXCEPT when the signal disappears or returns: then there is a sudden change in delay. It looks like my board at least (Hercules SmartTV) uses ticks from somewhere (internal clock? PCI bus?) as a stand-in for the VSYNC.
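For reference, the measurement loop looks roughly like this. This is a sketch, not my actual code: since <linux/videodev.h> is V4L1-era and gone from current kernels, the few structs and ioctl numbers needed are declared inline (values taken from the old header), and the 384x288 geometry, YUV422 palette, and the helper names elapsed_ms/time_grabs are illustrative choices of mine:

```c
/* Timing a single-buffer VIDIOCMCAPTURE+VIDIOCSYNC loop (sketch). */
#include <assert.h>
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <sys/time.h>

/* Minimal V4L1 declarations, copied from the old videodev.h. */
struct video_mmap { unsigned int frame; int height, width; unsigned int format; };
struct video_mbuf { int size; int frames; int offsets[32]; };
#define VIDIOCGMBUF    _IOR('v', 20, struct video_mbuf)
#define VIDIOCMCAPTURE _IOW('v', 19, struct video_mmap)
#define VIDIOCSYNC     _IOW('v', 18, int)
#define VIDEO_PALETTE_YUV422 7

/* Milliseconds between two gettimeofday() samples. */
long elapsed_ms(struct timeval t0, struct timeval t1)
{
    return (t1.tv_sec - t0.tv_sec) * 1000L + (t1.tv_usec - t0.tv_usec) / 1000L;
}

/* Grab `n` frames through buffer 0 only, printing the per-grab delay.
 * Returns 0 on success, -1 on setup failure. */
int time_grabs(int fd, int n)
{
    struct video_mbuf mbuf;
    if (ioctl(fd, VIDIOCGMBUF, &mbuf) < 0)
        return -1;
    void *base = mmap(NULL, mbuf.size, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (base == MAP_FAILED)
        return -1;

    struct video_mmap vm = { 0, 288, 384, VIDEO_PALETTE_YUV422 }; /* frame 0 */
    for (int i = 0; i < n; i++) {
        struct timeval t0, t1;
        int frame = 0;
        gettimeofday(&t0, NULL);
        if (ioctl(fd, VIDIOCMCAPTURE, &vm) < 0 || ioctl(fd, VIDIOCSYNC, &frame) < 0)
            break;
        gettimeofday(&t1, NULL);
        printf("grab %d: %ld ms\n", i, elapsed_ms(t0, t1));
    }
    munmap(base, mbuf.size);
    return 0;
}
```

A caller would open /dev/video0 with O_RDWR and pass the descriptor to time_grabs().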

A delay of up to 80 msec would be easily explained if VIDIOCMCAPTURE has to wait for a VSYNC on an even-numbered frame before actually initiating the grab (up to 40 msec), after which the frame takes exactly 40 msec to complete. The rest of the delay might be due to the fact that I always use the 1st buffer page here; if the driver or the device uses alternate pages internally, then there can be an extra wait of 1 frame for an even-numbered *frame*, not field.

Does all this sound reasonable?

When I use the 'butterfly' arrangement of VIDIOCMCAPTURE and VIDIOCSYNC recommended by all the HOWTOs, the delay between the ends of 2 successive VIDIOCSYNCs is still 40 msec, very stable, signal or no signal. When the signal disappears, the delay is unchanged. When the signal comes back, some of the delays can reach 80 msec for a few frames. It looks like the TV board has an internal PLL; that would explain why there is no bump at signal loss, but bumps when it comes back.
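For the record, the butterfly arrangement is just: queue captures into both buffers up front, then in the steady state sync on one buffer while the driver fills the other. A sketch, with the V4L1 declarations copied inline (the header is gone from modern kernels) and butterfly_loop/process_frame being illustrative names of mine, not real API:

```c
/* Double-buffered 'butterfly' grab loop (sketch). */
#include <assert.h>
#include <sys/ioctl.h>

/* Minimal V4L1 declarations, copied from the old videodev.h. */
struct video_mmap { unsigned int frame; int height, width; unsigned int format; };
#define VIDIOCMCAPTURE _IOW('v', 19, struct video_mmap)
#define VIDIOCSYNC     _IOW('v', 18, int)
#define VIDEO_PALETTE_YUV422 7

/* With two buffers, the partner of buffer b is simply b XOR 1. */
int other_buffer(int b)
{
    return b ^ 1;
}

/* Run `n` grabs; process_frame() consumes buffer `cur` between the SYNC
 * and the re-queue, while the other buffer is still filling.
 * Returns 0, or -1 on any ioctl failure. */
int butterfly_loop(int fd, int n, void (*process_frame)(int buffer))
{
    struct video_mmap vm[2] = {
        { 0, 288, 384, VIDEO_PALETTE_YUV422 },
        { 1, 288, 384, VIDEO_PALETTE_YUV422 },
    };

    /* Prime the pump: start captures into both buffers. */
    if (ioctl(fd, VIDIOCMCAPTURE, &vm[0]) < 0 ||
        ioctl(fd, VIDIOCMCAPTURE, &vm[1]) < 0)
        return -1;

    int cur = 0;
    for (int i = 0; i < n; i++) {
        if (ioctl(fd, VIDIOCSYNC, &cur) < 0)          /* wait for `cur` */
            return -1;
        if (process_frame)
            process_frame(cur);
        if (ioctl(fd, VIDIOCMCAPTURE, &vm[cur]) < 0)  /* re-queue `cur` */
            return -1;
        cur = other_buffer(cur);
    }
    return 0;
}
```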

This is good news for me: I had been afraid I would have to forcibly terminate any initiated grab on signal loss, re-initiate the butterfly loop later, supply my own timings, and so on. It seems I just have to skip the transfer; the grab logic is unaffected.


  To get around this, tvtime relies on VIDIOCGTUNER to give us the
signal bit, and we only capture new frames if the card reports a signal.
If the ioctl fails, or the input has no tuner, we have no choice but to
always assume that there is a signal.  If no signal is present, tvtime
shows a blue frame and says "No signal".

Exactly what I am doing.
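Concretely, that check can be sketched like this in V4L1 terms. The struct is copied inline (values from the old videodev.h); tuner_has_signal/input_has_signal are illustrative helpers of mine, reading the 'signal bit' from the tuner's `signal` field, which bttv leaves at zero when there is no lock:

```c
/* Signal detection via VIDIOCGTUNER (sketch). */
#include <assert.h>
#include <sys/ioctl.h>

/* Minimal V4L1 declaration, copied from the old videodev.h. */
struct video_tuner {
    int tuner;
    char name[32];
    unsigned long rangelow, rangehigh;
    unsigned int flags;
    unsigned short mode;
    unsigned short signal;   /* signal strength; 0 means no lock on bttv */
};
#define VIDIOCGTUNER _IOWR('v', 4, struct video_tuner)

/* Nonzero `signal` means the card reports a lock. */
int tuner_has_signal(const struct video_tuner *vt)
{
    return vt->signal != 0;
}

/* Returns 1 if a signal is present, 0 if not.  When the ioctl fails or
 * the input has no tuner, we have no choice but to assume a signal. */
int input_has_signal(int fd)
{
    struct video_tuner vt;
    vt.tuner = 0;
    if (ioctl(fd, VIDIOCGTUNER, &vt) < 0)
        return 1;
    return tuner_has_signal(&vt);
}
```

A capture loop would call input_has_signal() before queuing the next grab, and show the blue "No signal" frame instead when it returns 0.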


  There are two problems with this:

  1. Some users have over-the-air channels with bad reception, fooling
     the signal detection on the card, and these become unwatchable.
  2. I have fears for drivers that listen to GTUNER but don't set the
     bit correctly, although I have not seen any yet.

  To get around this, we currently have a config file option to shut off
our signal detection, but of course that sucks (users need to discover
that it exists, and know it would help).

  Hope that helps,
  -Billy



--
video4linux-list mailing list
Unsubscribe mailto:video4linux-list-request@xxxxxxxxxx?subject=unsubscribe
https://listman.redhat.com/mailman/listinfo/video4linux-list


--
Michel Bardiaux
Peaktime Belgium S.A.  Bd. du Souverain, 191  B-1160 Bruxelles
Tel : +32 2 790.29.41





