Re: xawtv and XVideo vs interlaced output




  After playing around, I now agree with you that if you scale both
fields to the same location on the screen (pretend the input is a
720x240 stream, for example), then the results are unacceptable.

> > Sure, but if you're scaling from a stream of 720x240 fields, you'd
> > need to have every second field start a half-scanline below the
> > first one, which you can't do with memory location tricks.  Instead,
> > you must talk to the video card each time to tell it to change the
> > scale destination for each field.
> 
> Doing this is close to impossible.  The X server runs in userland and
> thus I can't easily make the irq handler handle this.  Also, the
> X server's internal interfaces don't allow setting up overlays with
> sub-pixel resolution (which you would need to get the half-scanline
> issue right).

  The problem with sub-pixel resolution has been brought up on Xpert and
will be fixed.  The other bug is, in my opinion, a serious design flaw.
With v4l2 we have (afaict) a nice clean way to export an interrupt per
field (since you can read from the device as a stream of fields), so if
the X server were running SCHED_FIFO it would be able to update the
overlay position on each field, triggered by that interrupt.  So that is
one ugly but potentially workable way to solve the problem.

  Another idea is that maybe in the DRI stuff we could allow control
over overlay surfaces, so that way the bttv driver could talk through a
kernel dri api.

  Really, though, with hardware scaling of the fields this is a
technically solvable problem.  We could have a system which uses almost
no CPU and still looks close to TV quality.

> > > Please don't confuse *frame* rate and *field* rate.  The frame
> > > rate is 30 fps only.
> > 
> > But, you do understand that when watching TV, you don't see both
> > fields at the same time.  [ ... ]
> 
> Hmm, well, yes and no.  The TV people play tricks there, using the
> fact that the human eyes are too slow to actually see what really
> happens ...

  Regardless, the human eye can definitely tell the difference between
refreshing at 30fps and 60fps, and 30fps just looks chunky compared to
the smoothness of video.

> With plenty of CPU time at the problem you can fix it of course :-)

  My app currently uses just over 50% of my P3-733 to deinterlace and
display at 59.94fps, giving lots of room to improve the algorithm.
Right now the quality is comparable to my TV, with slightly more
flicker.
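  For reference, the cheapest software approach along these lines is a
simple "bob" deinterlacer: expand each 720x240 field to a full-height
frame by copying the field lines and linearly interpolating the missing
ones.  This is only a generic sketch of that technique, not the actual
algorithm my app uses, and the names are made up:

```c
#include <assert.h>
#include <string.h>

/* Expand one field (h lines of w bytes, luma only for simplicity) to a
 * 2*h-line frame: field lines are copied into place, and each missing
 * line is the rounded average of the field lines above and below it. */
static void bob_field(const unsigned char *src, unsigned char *dst,
                      int w, int h)
{
    for (int y = 0; y < h; y++) {
        /* Copy the real field line into its frame position. */
        memcpy(dst + (2 * y) * w, src + y * w, w);

        /* Interpolate the line in between; clamp at the bottom edge. */
        const unsigned char *above = src + y * w;
        const unsigned char *below = (y + 1 < h) ? src + (y + 1) * w : above;
        for (int x = 0; x < w; x++)
            dst[(2 * y + 1) * w + x] =
                (unsigned char)((above[x] + below[x] + 1) / 2);
    }
}
```

Done per field, this refreshes at the full 59.94 field rate; the vertical
interpolation is what costs CPU and softens the image slightly compared
to true hardware field scaling.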

-- 
Billy Biggs
vektor@xxxxxxxxxxxx

