Gerd, I finally figured out why xawtv wasn't working with OpenGL here. It
is most likely an NVidia issue. Xawtv tries to find the extension (with
glGetString(GL_EXTENSIONS)) to use with the current video format before
creating a context. That is fine with some libGLs, like Matrox's, but with
NVidia's it always returns NULL: you have to call
glXCreateContext + glXMakeCurrent before using glGetString.

I tried to fix it and send you a patch, but doing so changes the order in
which xawtv initializes its stuff and I'm not sure where to begin. Do you
think I could help anyway? Are you going to fix it?

Thanks,

Fabio

> First, thanks Gerd.
>
> In x11/blit.c, line 556, wouldn't it be nice to have:
>       ext = glGetString(GL_EXTENSIONS);
> +     if (NULL == ext) {
> +         fprintf(stderr, "blit: gl: no extensions available\n");
> +         return 0;
> +     }
>       if (NULL == (pos = strstr(ext,find)))
>
> I took HAVE_LIBXV out of config.h to make OpenGL work; just the "-noxv"
> command-line argument wasn't enough.
>
> Sorry, I'm new to OpenGL. Could you help me by explaining why
> glGetString(GL_EXTENSIONS) returns NULL on my system? Could it be a
> limitation of my GeForce MX400? I don't think so; I rather think I made
> a mistake somewhere.
>
> Thanks a lot,
>
> Fabio
>
> Gerd Knorr wrote:
> > Fabio Roger wrote:
> > > Hi all,
> > >
> > > I am looking for some information on how to use OpenGL to render a
> > > scene, using frames grabbed with v4l as textures, at an acceptable
> > > capture rate.
> > >
> > > Could someone point me somewhere (url, mailing list, ...)? Or give
> > > me a hint whether it is possible?
> >
> > Some stuff is in current xawtv releases (x11/blit.c). It uses OpenGL
> > textures to do hardware-accelerated scaling of RGB data. It could
> > certainly use some improvements; using GL extensions for bgr(a) byte
> > ordering would be useful, for example ...
> >
> > Gerd
> >
> > --
> > You can't please everybody. And usually if you _try_ to please
> > everybody, the end result is one big mess.
> >    -- Linus Torvalds, 2002-04-20