Brian J. Murrell (v4l@xxxxxxxxxxxxxxx):

> > Consider MPEG2. You could do 352x480 frames, interlaced, put in the
> > dominance, and you get exactly what you describe later
>
> So MPEG2 decoder/playback is capable of displaying interlaced fields
> @59.94? That would have to be dependent on the video driver in the
> case of a TV out (i.e. G400) situation still though, right?

Well sure, but my dxr2 hardware decoder can play to a TV at 59.94 from
a DVD and it works well. In movietime I intend to get software DVD
playback to 59.94 working also. If you go pick up a dxr3 it can
probably do it too.

> > So that means that instead of seeing the raw data you get a blur
> > between both fields, [...]
>
> Right. I think between the "combing" effect of not deinterlacing and
> blurring of interpolation, I would choose the latter. I just find the
> combing effect too annoying.

Well of course. There are lots of deinterlacing filters you can apply
to make the output suck less. I kinda liked the ffmpeg filter, vertical
with kernel [-1 4 2 4 -1], although I know there are probably better
ones that are just as cheap.

Still, I think it would be just as expensive to not do any
deinterlacing and record interlaced frames to MPEG2. It would be neat
to turn mp1e into an MPEG2 recorder, I think. Then you could play back
to a dxr3 card and maybe get smooth 59.94 playback for free. Blah, or
we can just get that Hauppauge PVR thing working and it will probably
do the right thing too.

> > Well you can send it frames that are interlaced,
>
> Just to make sure we are using the same nomenclature, when you say
> "frames that are interlaced," do you mean sending a frame that is a
> composition of the two fields, or do you mean sending a frame that is
> only one field, interlaced into every other line?

I mean a frame that contains two fields.
> > but you have no control over the dominance (which field is shown
> > first),
>
> Which seems to imply sending the two fields in one frame because if
> you were sending the two fields separately, the timing of them would
> indicate dominance, no?

Uh, well you'd still have to say 'here is a top field', and 'here is a
bottom field'. Regardless, you can't use the API we have right now,
which is 'draw RGB to an fbcon device and it will appear on your TV'.

> > nor do you get an interrupt to tell you when to blit the next frame.
>
> The G400 supports "stacking" (or buffering) of frames into its memory
> and it will display them at the framerate of the output device, i.e.
> asynchronous operation instead of synchronous. Does that not make the
> non-interrupt a non-issue?

Well, still: if you're sending buffers too fast you'll need to be
blocked, and if you're sending them too slow you'll get underruns. So
we need some sort of API that exposes that.

> > Because I'm a deinterlacer. Each field I have to show as a frame.
> > I want to watch TV at full framerate. So I interpolate each field
> > to frame height and send it. Is there something that isn't clear?
>
> Ahhh. I guess what was not clear is that I assumed you were using the
> hardware scaler on the video card to stretch the 240 lines into 480.
> If you could, you'd cut your bandwidth requirements in half.

Yeah, but then I'd need to tell it to subpixelly move the destination
accordingly. I had some discussions with Matt Sottek and Mark Vojkovich
on the Xpert list about this, and we sort of came up with an API to do
this with XVideo, but neither Matt nor I have had time to implement it
on the X side, and nobody else seems interested in doing it. I mean
sure, there are some people who want to get this done, just not enough
with free time to get it done too quickly. :)

> Well I did a 720x480 capture and played back with MPlayer's -fs option
> to my G400/TV-Out.
> I got, predictably, black bands at the top and
> bottom of the picture, because my framebuffer mode is set at 800x600.
> I guess the goal would be to have a framebuffer ratio of 3:2 rather
> than 4:3.

I think the real problem is that the API MPlayer is using to talk to
the G400 TV out is just broken. Is it still using an fbcon context? You
should be able to use the G400 TV out as if it were an overlay device,
give it a 720x480 Y'CbCr frame and have it output that to the TV
encoder directly. We need to get rid of these hacks and do it properly.

> > But again, I want to write a high quality VCR app before I write my
> > PVR, so this is how I approach this stuff. :)
>
> Where are you drawing your distinction between PVR and VCR?

I think like this:

VCR: Record to like 40GB/hour, take 3 days postprocessing, get out a
high quality DVD.

PVR: Record directly to storage format, let me pause during programs
and stuff, time-shifting, etc. No postprocessing of video (already
gone to a lossy format).

> > You should go to OLS. [...]
>
> Oh, I am going to be there!

Awesome. :)

> > You need to do image heuristics to detect the sequence (except on
> > well-mastered NTSC DVDs, which store the progressive 24fps stream
> > and have flags to tell the player to perform the process). I have
> > code to do this and so do others, but since it's heuristics based,
> > off-line versions can do a better job.
>
> Of course. I guess you would capture the NTSC and then telecine it
> offline. But if I were just going to watch the movie and then delete
> it, does the difference in viewing pleasure really warrant the
> process? Automated, I suppose it would be alright.

I currently do 3:2 pulldown inversion in realtime on DVD content, so
that works well. You can do a better job offline (see the phase
detection code I have in reetpvr), but it's easy enough to do in
realtime that it's worth it. dscaler does it.

> > There are telecine detectors in higher end TVs.
> > For a neat (GPL'ed)
> > Windows deinterlacer that does 59.94fps output and can detect
> > pulldown, see http://deinterlace.sf.net/
>
> Wow. It would be nice if they abstracted enough to separate the
> display stuff from the rest so that the results could be used on other
> platforms. Maybe one day. Maybe MPlayer can make use of the win32
> binaries.

The code isn't so bad, but I don't like some of their algorithms. :)
There have been some ports, though; some of their filters were modified
and put in xine, for example.

> > Um, well any video card with TV output that has some drivers
> > started, I guess. :) [...]
>
> A card that you will have enough access to the hardware that you can
> output field-at-a-time, interlaced, with proper dominance?

Yeah, well, they pretty much all can; it's just a matter of putting
them in the right mode and knowing what they expect. I'm pretty much
convinced that with sufficient hacking you could do a V4L2 TV-Output
API for all the consumer cards with TV output that would be good
enough for what we've been discussing.

-- 
Billy Biggs
vektor@xxxxxxxxxxxx