Hi,

I hope that this doesn't go too far off topic for the list.

I have a Linux machine that I use for video functions at home. It outputs video to a PAL TV using a Voodoo3 card (with a bt869 TV encoder) and XFree86, and I mostly play video files using mplayer's XV driver. So - my output display is interlaced.

A niggly problem for me has been that mplayer's frame playback is not locked to the video card's refresh, so playback suffers from a "tear" when mplayer changes frames whilst the TV is scanning a visible part of the screen. Playback is also juddery - especially noticeable on pans and scrolls - caused by inconsistent display time for each video frame. These effects are irritating but not a showstopper for me. I have always used MPEG-1 capture at 352x288 resolution; because this captures only one of the two fields of each frame and uses a 25Hz frame rate, the effect is reduced - at least the displayed output always moves forward in time.

But I've now added a DVB-T card to my setup. This captures over-the-air transmitted MPEG-2 streams, and this video is often true interlaced video. Playback of these streams on my system gives rather horrible results. On top of the general problems above, the lack of sync between the TV and mplayer's displayed frames often results in the video fields appearing out of order - the top field of frame 2 being displayed followed by the bottom field of frame 1. This shows the two fields out of time order. Result: horrible flicker in moving areas.

To fix this properly I need to get mplayer's frame playback locked to the frame VSYNC of the video card (or, perhaps, lock the video card to mplayer's frame playback?). I believe that with this done, a Linux box could give video playback indistinguishable from broadcast. A deinterlace filter can somewhat patch over this problem, but it isn't the right solution as it throws away both spatial and temporal resolution.

Seeing as we now have serviceable video capture and playback on Linux, now seems the time to make it really good quality! What are my options? Are there any facilities in Xv/X to sense the display card's vsync? Have I missed the obvious fix for this?

Thanks,
Steve Davies

---->> Some background on the problem follows ----------->>

Let me try to diagram what happens.

What we want to happen:

  0ms   Mplayer writes frame 1 to XV
        Video card/bt869 outputs frame 1, top field
 20ms   Video card/bt869 outputs frame 1, bottom field
 40ms   Mplayer writes frame 2 to XV
        Video card/bt869 outputs frame 2, top field
 60ms   Video card/bt869 outputs frame 2, bottom field
 etc etc.

For non-interlaced source files, each frame is output completely over two fields - the 25Hz frame time is accurately reproduced. If the original file has frames made up of two fields, then the two original fields are correctly displayed one after the other: the 25Hz frame-rate input file appears correctly at a 50Hz field rate. (The usual combing effect that interlaced video has on a non-interlaced display doesn't occur; deinterlacing filters are not required and are in fact undesirable, as they throw away spatial and temporal resolution.)

What currently can happen:

  0ms   Mplayer writes frame 1 to XV
 20ms   Video card outputs frame 1, top field
 40ms   Mplayer writes frame 2 to XV
        Video card outputs frame 2, bottom field
 60ms   Video card outputs frame 2, top field
 80ms   Mplayer writes frame 3 to XV
        Video card outputs frame 3, bottom field

As you can see, the original fields are displayed in reverse order, with resulting horrible judder.
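For reference, here are two avenues I've turned up so far. Neither is tested on my tdfx setup, so please treat them as sketches rather than answers. The first is the GLX_SGI_video_sync extension, which exposes the card's retrace counter to GL clients - but only if the GLX driver implements it, and I don't know whether the Voodoo3/tdfx DRI driver does. Something along these lines (compile with -lGL -lX11):

/* Sketch: sensing vertical retrace via GLX_SGI_video_sync.
 * Assumes a GLX driver that implements the extension. */
#include <stdio.h>
#include <string.h>
#include <X11/Xlib.h>
#include <GL/gl.h>
#include <GL/glx.h>

/* Entry points from GLX_SGI_video_sync; Mesa's libGL exports these
 * directly, otherwise resolve them with glXGetProcAddressARB(). */
extern int glXGetVideoSyncSGI(unsigned int *count);
extern int glXWaitVideoSyncSGI(int divisor, int remainder,
                               unsigned int *count);

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    int scr = DefaultScreen(dpy);
    const char *ext = glXQueryExtensionsString(dpy, scr);
    if (!ext || !strstr(ext, "GLX_SGI_video_sync")) {
        fprintf(stderr, "GLX_SGI_video_sync not available\n");
        return 1;
    }

    /* The extension only works with a current GL context, so make a
     * throwaway 1x1 window and context. */
    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
    XVisualInfo *vi = glXChooseVisual(dpy, scr, attribs);
    if (!vi) { fprintf(stderr, "no GLX visual\n"); return 1; }

    XSetWindowAttributes swa;
    swa.border_pixel = 0;
    swa.colormap = XCreateColormap(dpy, RootWindow(dpy, scr),
                                   vi->visual, AllocNone);
    Window win = XCreateWindow(dpy, RootWindow(dpy, scr), 0, 0, 1, 1, 0,
                               vi->depth, InputOutput, vi->visual,
                               CWBorderPixel | CWColormap, &swa);
    GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
    glXMakeCurrent(dpy, win, ctx);

    unsigned int count;
    glXGetVideoSyncSGI(&count);
    /* Block until the retrace counter advances; the usual idiom is
     * divisor 2, remainder (count + 1) % 2. */
    glXWaitVideoSyncSGI(2, (count + 1) % 2, &count);
    printf("woke on retrace %u\n", count);
    return 0;
}

The idea would be for mplayer's xv driver to block in glXWaitVideoSyncSGI() and issue its XvShmPutImage() immediately on waking, so the frame change always lands in the blanking interval. Whether mixing a GL context into an Xv player like this is sane, I'm not sure.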
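The second avenue bypasses X entirely: the kernel DRM layer has a DRM_IOCTL_WAIT_VBLANK ioctl, wrapped by libdrm as drmWaitVBlank(). Again this assumes the card's DRM driver actually implements vblank waits, which I have not verified for tdfx; the device node is also an assumption for your setup. A rough sketch (compile with -ldrm):

/* Sketch: waiting on the vertical blank via the kernel DRM layer. */
#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <xf86drm.h>

int main(void)
{
    /* Adjust the device node for your system. */
    int fd = open("/dev/dri/card0", O_RDWR);
    if (fd < 0) { perror("open /dev/dri/card0"); return 1; }

    drmVBlank vbl;
    memset(&vbl, 0, sizeof vbl);
    vbl.request.type = DRM_VBLANK_RELATIVE; /* relative to now...      */
    vbl.request.sequence = 1;               /* ...wait for next vblank */

    if (drmWaitVBlank(fd, &vbl) != 0) {
        perror("drmWaitVBlank");
        return 1;
    }

    /* vbl.reply carries the vblank count and a timestamp. */
    printf("vblank %u at %ld.%06ld\n", vbl.reply.sequence,
           vbl.reply.tval_sec, vbl.reply.tval_usec);
    return 0;
}

Either way, the fix really belongs inside mplayer's video output driver: wait for the vblank, then put the frame, and drop or repeat a frame whenever the stream's 25Hz clock and the card's 50Hz field clock drift apart. But maybe I've missed an existing facility - hence my question above.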