Michael,

I've started a document on Y'CbCr encodings used in video applications
that will hopefully clarify all of this. It's unfortunately unfinished,
but you can view it at:

  http://vektor.theorem.ca/graphics/ycbcr/

To my dismay, you revealed some glaring errors in my post. I didn't
even have the sense to check my own webpage before posting random
comments. I stick by the #defines, but not the comments. See below:

Michael H. Schimek (mschimek@xxxxxx):

> > 2. Doing a general colourspace definition is dangerous, IMHO, so I
> >    like the idea of having a bunch of #defines, like with the video
> >    standard. It will make things easier. Having a way to detect the
> >    broken bt878 extents would kick! Comments on this as a start?
> >
> >    /* 601 transfer with oldschool chromaticities. */
> >    #define V4L2_COLORSPACE_SMPTE170M   0x000001
>
> So this is NTSC sampled according to 601.
> What about PAL/SECAM chromaticities? (Actually the video standard
> already implies all this information.)

The video standard does not itself specify the chromaticities used.
The chromaticities have changed over time, and changed again for HDTV.
S170M is my default reference for broadcast NTSC/PAL chromaticities.

> >    /* I know this is used in some PAL areas I think. */
> >    #define V4L2_COLORSPACE_SMPTE240M   0x000002
>
> 1125-Line (US) HDTV?

Yes, S240M is 1125-line HDTV. My point in including it is that the
gamma function defined in S240M differs from the others, so it needs
to be distinguished separately.

> >    /* HD and modern captures. */
> >    #define V4L2_COLORSPACE_REC709      0x000004
>
> Yup, another HDTV related standard. I don't have 709 here but I can
> get it if necessary, for a check. The RGB->YUV coefficients are
> everywhere on the net, all of the above. For example the colorspace
> FAQ, MPEG-2 docs.

I have a copy here. It's an HDTV standard, and it is also common
because its chromaticities match the sRGB standard. It is the default
colourspace for MPEG-2, for example.
Are you disputing including these in V4L?

> >    /* Be able to detect the broken BT878 extents. */
> >    #define V4L2_COLORSPACE_BT878       0x000008
>
> Excuse a stupid question, what are extents and how are those of bt878
> broken?

The bt878 spec says that luma goes from 16-253 instead of the normal
16-235. At first I thought it was an error in the datasheet, but I
confirmed (as did others on the DScaler lists) that the hardware does
give 16-253 for the nominal range. The chroma also seems to always
give 250 quantization units. So, in my deinterlacer I specifically map
back to the correct excursions for my hardware overlay, which applies
the 601 transfer function.

> >    /* These should be useful. Assume 601 extents. */
> >    #define V4L2_COLORSPACE_470_SYSTEM_M  0x000010
> >    #define V4L2_COLORSPACE_470_SYSTEM_BG 0x000020
>
> I thought SMPTE 170M defines system M as in BT.470? If not, what is
> the difference? Suppose 470_SYSTEM_BG means PAL (former BT.624)
> chromaticities and ITU-R Rec 601 sampling. Why only BG? According to
> 470 all PAL and SECAM standards (except M/PAL) have the same
> chromaticities. The RGB->YUV conversion is also the same for all
> standards A-Z, and the same as defined in 601. Please explain.

BT.470 specifies chromaticities distinct from those in S170M. I
separate out BG because that is what MPEG-2 defines as well. Also,
System M uses a different white point than all the other standards in
470. See my page (I'll try to finally finish it this weekend).

> >    /* I know there will be cameras that send this. So, this is
> >     * unspecified chromaticities and full 0-255 on each of the
> >     * Y'CbCr components.
> >     */
> >    #define V4L2_COLORSPACE_JPEG        0x000040
>
> Can't say anything here. T.81 speaks only of components and locates
> colorspace information at some higher layer. The libjpeg refers to
> 601 for YCbCr data, which of course assumes 220/225 quantization
> levels, not 256. But one must always expect "excursions", as 601
> says.
The quantization levels are the problem. I'd like consistent colour
where possible. JPEG uses 256 levels, uses the conversion from E'_Y
et al., and specifies 601 as its reference. Here is the quote from the
JFIF standard:

  Standard color space

  The color space to be used is YCbCr as defined by CCIR 601 (256
  levels). The RGB components calculated by linear conversion from
  YCbCr shall not be gamma corrected (gamma = 1.0). If only one
  component is used, that component shall be Y.

Of course, that text goes against pretty much everything. The JPEG
problem is the main reason why so many open source video applications
mistakenly use 0-255 "yuv to rgb" conversion functions rather than the
601 or 709 conversions, resulting in poor colour.

> How would this be used anyway? Some hardware encodes or decodes JPEG
> images. Where do they come from, a camera, user? And the driver does
> what, sends/receives raw images, JPEG data? "Cameras" sounds like the
> driver "captures" JPEG data. The colorspace information should
> already be in there. What's the catch?

This would be used to say that the image is Y'CbCr using the JPEG
0-255 style Y'CbCr, instead of the 601 quantization levels. I figure
there are webcams that return images like this. What I was really
getting at is something more like the bt878 'full luma' option, where
we spec it as S170M but with 0-255 quantization on each component.

> >    /* For RGB colourspaces, this is probably a good start. */
> >    #define V4L2_COLORSPACE_SRGB        0x000080
>
> What does this one tell us?

Well, for RGB outputs, you have to specify the gamma and
chromaticities just like everything else. sRGB is the leading RGB
standard, accepted by pretty much everyone, so it seems like a good
start.

--
Billy Biggs
vektor@xxxxxxxxxxxx