First, thanks Gerd.

In x11/blit.c, line 556, wouldn't it be nice to have:

      ext = glGetString(GL_EXTENSIONS);
    + if (NULL == ext) {
    +     fprintf(stderr, "blit: gl: no extensions available\n");
    +     return 0;
    + }
      if (NULL == (pos = strstr(ext,find)))

I had to remove HAVE_LIBXV from config.h to make OpenGL work; passing just the "-noxv" argument on the command line wasn't enough.

Sorry, I'm new to OpenGL, but could you help me understand why GL_EXTENSIONS returns NULL on my system? Could it be a limitation of my GeForce MX400? I don't think so; I rather suspect I made a mistake somewhere.

Thanks a lot,
Fabio

Gerd Knorr wrote:
> Fabio Roger wrote:
> > Hi all,
> >
> > I am looking for some information about how to use OpenGL to render a
> > scene with frames grabbed using v4l as textures, at an acceptable capture
> > rate.
> >
> > Could someone point me somewhere (URL, mailing list, ...)? Or give a
> > hint as to whether it is possible.
>
> Some stuff is in current xawtv releases (x11/blit.c). It uses OpenGL
> textures to do hardware-accelerated scaling of RGB data. It could
> certainly use some improvements; using GL extensions for BGR(A) byte
> ordering would be useful, for example ...
>
> Gerd
>
> --
> You can't please everybody. And usually if you _try_ to please
> everybody, the end result is one big mess.
>   -- Linus Torvalds, 2002-04-20
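P.S.: for what it's worth, below is a minimal sketch of the extension check as a standalone helper. It is only an illustration: the function name gl_has_ext() is made up here and not part of xawtv, and it assumes a GL context is already current. (A NULL return from glGetString(GL_EXTENSIONS) is commonly what you get when no context is current yet, independent of the card.) It also matches whole space-separated tokens rather than relying on a bare strstr(), so a short extension name cannot match the prefix of a longer one.

    #include <string.h>
    #include <stdio.h>
    #include <GL/gl.h>

    /* Hypothetical helper, not part of xawtv: returns 1 if the named
     * extension is listed in GL_EXTENSIONS, 0 otherwise. */
    static int gl_has_ext(const char *find)
    {
        const char *ext = (const char *) glGetString(GL_EXTENSIONS);
        const char *pos;
        size_t len = strlen(find);

        if (NULL == ext) {
            /* Typically means no current GL context (or a GL error). */
            fprintf(stderr, "blit: gl: no extensions available\n");
            return 0;
        }
        /* Walk all occurrences and accept only whole, space-separated tokens. */
        for (pos = ext; (pos = strstr(pos, find)) != NULL; pos += len) {
            if ((pos == ext || pos[-1] == ' ') &&
                (pos[len] == ' ' || pos[len] == '\0'))
                return 1;
        }
        return 0;
    }

Used as gl_has_ext("GL_EXT_bgra"), something like this could gate the BGR(A) upload path Gerd mentions, falling back to swapping bytes in software when the extension is missing.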