Hi, I hope I have provided the necessary information for this problem. I have the following hardware:

Nvidia TNT2 Ultra 32MB (Viper V770)
Bt878A (PixelView PlayTV Pro PV-BT878+ w/FM)

I'm currently running:

RH 6.1 (all changed around)
Kernel 2.4.0
Nvidia 0.9-5 drivers
xawtv 3.30
bttv 0.7.50

The problem is that when I resize the xawtv window it works fine until I make it wider than 512 pixels. The height seems reasonably tolerant and will eventually place a black border on the top and bottom. I haven't been able to figure this one out.

I've tried running xawtv with the "-novm" option and this doesn't have any effect that I can see. I've also tried playing around with the DGA options under X; omitting the DGA option from my XF86Config doesn't appear to help either. If I've missed something, I don't know what it is. I don't appear to get any errors of any sort. Any help would be much appreciated, even if it is a pointer to where I should continue my investigation.

Here are the relevant sections from dmesg:

i2c-core.o: i2c core module
i2c-algo-bit.o: i2c bit algorithm module
bttv: driver version 0.7.50 loaded
bttv: using 2 buffers with 2080k (4160k total) for capture
bttv: Bt8xx card found (0).
PCI: Found IRQ 10 for device 00:0a.0
PCI: The same IRQ used for device 00:0a.1
bttv0: Bt878 (rev 17) at 00:0a.0, irq: 10, latency: 64, memory: 0xe4000000
bttv0: model: BT878(PixelView PlayTV pro) [insmod option]
i2c-algo-bit.o: Adapter: bt848 #0 scl: 1 sda: 1 -- testing...
i2c-algo-bit.o:1 scl: 1 sda: 0
i2c-algo-bit.o:2 scl: 1 sda: 1
i2c-algo-bit.o:3 scl: 0 sda: 1
i2c-algo-bit.o:4 scl: 1 sda: 1
i2c-algo-bit.o: bt848 #0 passed test.
i2c-core.o: adapter bt848 #0 registered as adapter 0.
bttv0: i2c: checking for TDA9875 @ 0xb0... not found
bttv0: i2c: checking for TDA7432 @ 0x8a... not found
i2c-core.o: driver i2c TV tuner driver registered.
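In case it matters: the "[insmod option]" in the dmesg above means the card type is being forced at module load time. My setup is roughly along these lines (just a sketch; the card number here is a placeholder, the real value for a given board comes from the CARDLIST file that ships with the bttv source):

```
# /etc/modules.conf sketch -- card=<n> is a placeholder, not a verified
# value; look the board up in bttv's CARDLIST before using it.
alias char-major-81-0 bttv
options bttv card=<number from CARDLIST>
```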
tuner: chip found @ 0x61
bttv0: i2c attach [Philips PAL]
i2c-core.o: client [Philips PAL] registered to adapter [bt848 #0](pos. 0).
bttv0: PLL: 28636363 => 35468950 ... ok

xawtv gives me the following with -debug 1:

This is xawtv-3.30, running on Linux/i686 (2.4.0)
visual: id=0x21 class=4 (TrueColor), depth=24
visual: id=0x22 class=5 (DirectColor), depth=24
visual: id=0x23 class=4 (TrueColor), depth=24
visual: id=0x24 class=4 (TrueColor), depth=24
visual: id=0x25 class=4 (TrueColor), depth=24
visual: id=0x26 class=4 (TrueColor), depth=24
visual: id=0x27 class=4 (TrueColor), depth=24
visual: id=0x28 class=4 (TrueColor), depth=24
visual: id=0x29 class=4 (TrueColor), depth=24
check if the X-Server is local ... **** ok
x11 socket: me=localhost, server=localhost
DGA version 2.0
Xv: 2 adaptors available.
Xv: video4linux: input video, ports 42-42
Xv: NV04 Video Overlay: input image, ports 43-43
Xv: using port 42 for video
XV_ENCODING get set, -1000 -> 1000
XV_BRIGHTNESS get set, -1000 -> 1000
XV_CONTRAST get set, -1000 -> 1000
XV_SATURATION get set, -1000 -> 1000
XV_HUE get set, -1000 -> 1000
XV_MUTE get set, 0 -> 1
XV_FREQ get set, 0 -> 16000
image format list for port 43
0x32595559 (YUY2) packed
0x32315659 (YV12) planar
0x59565955 (UYVY) packed
0x30323449 (I420) planar
Xv: using port 43 for hw scaling
x11: color depth: 24 bits, 3 bytes - pixmap: 4 bytes
x11: color masks: red=0x00ff0000 green=0x0000ff00 blue=0x000000ff
x11: server byte order: little endian
x11: client byte order: little endian
main thread [pid=853]
main: creating windows ...
main: read config file ...
main: checking for vidmode extention ...
main: checking for lirc ...
main: mapping main window ...
xt: pointer show
main: initialize hardware ...
Xv: getattr 3
Xv: getattr 4
Xv: getattr 5
Xv: getattr 1
cmd: "setfreqtab" "australia"
cmd: "capture" "overlay"
Xv: tune getfreq
cmd: "setchannel" "10"
Xv: video: win=0x1400052, size=0x0, off
Xv: video: win=0x1400052, size=384x288, on
main: known station tuned, not changing
Xv: video: win=0x1400052, size=384x288, on
Xv: video: win=0x1400052, size=384x288, on
Xv: video: win=0x1400052, size=384x288, on
Xv: video: win=0x1400052, size=384x288, on
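One thing I can think of checking next, in case it helps narrow things down: since the debug output shows the video going through Xv port 42 (video4linux) with port 43 doing hw scaling, my guess is the 512-pixel ceiling might be a limit advertised by one of the Xv adaptors rather than by bttv itself. Something like this should show what the ports claim to support (the grep/fallback is just to keep the output short):

```shell
# Guess: the 512-pixel width ceiling could be an Xv adaptor limit.
# xvinfo lists each adaptor's encodings and maximum image size; the grep
# keeps just the interesting lines, and the fallback echo only fires when
# no X display is reachable.
out=$(xvinfo 2>/dev/null | grep -iE 'Adaptor|maximum|encoding' \
      || echo "xvinfo unavailable (no X display here)")
echo "$out"
```

I could also try running xawtv with "-noxv" to take the Xv paths out of the picture entirely and see whether the 512-pixel limit moves (assuming I'm reading the man page right).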