Re: Re: v4l2 api

Billy Biggs wrote:
> 
>   2. Doing a general colourspace definition is dangerous, IMHO, so I
> like the idea of having a bunch of #defines, like with the video
> standard.  It will make things easier.  Having a way to detect the
> broken bt878 extents would kick!  Comments on this as a start?
> 
> /* 601 transfer with oldschool chromaticities. */
> #define V4L2_COLORSPACE_SMPTE170M     0x000001

So this is NTSC sampled according to 601.
What about PAL/SECAM chromaticities? (Actually the video standard
already implies all this information.)
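To illustrate the point that the video standard already implies the chromaticities, here is a minimal sketch (the enum and helper are mine for illustration, not a proposed API) of how a driver could derive the colorspace define from the standard instead of reporting it separately. The #define values are the ones proposed above.

```c
#include <assert.h>

#define V4L2_COLORSPACE_SMPTE170M     0x000001
#define V4L2_COLORSPACE_470_SYSTEM_M  0x000010
#define V4L2_COLORSPACE_470_SYSTEM_BG 0x000020

/* Hypothetical standard enum, just for the sketch. */
enum video_std { STD_NTSC_M, STD_PAL_M, STD_PAL_BG, STD_SECAM };

static unsigned int std_to_colorspace(enum video_std std)
{
    switch (std) {
    case STD_NTSC_M:   /* 525/60, SMPTE 170M chromaticities */
        return V4L2_COLORSPACE_SMPTE170M;
    case STD_PAL_M:    /* 525/60 PAL, system M primaries */
        return V4L2_COLORSPACE_470_SYSTEM_M;
    case STD_PAL_BG:   /* fall through: all 625/50 systems share */
    case STD_SECAM:    /* the BT.470 B/G chromaticities */
        return V4L2_COLORSPACE_470_SYSTEM_BG;
    }
    return 0;
}
```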

> /* I know this is used in some PAL areas I think. */
> #define V4L2_COLORSPACE_SMPTE240M     0x000002

1125-Line (US) HDTV?
 
> /* HD and modern captures. */
> #define V4L2_COLORSPACE_REC709        0x000004

Yup, another HDTV-related standard. I don't have Rec. 709 here, but I can
get it if necessary for a check. The RGB->YUV coefficients for all of the
above are easy to find on the net, for example in the colorspace FAQ or
the MPEG-2 docs.

> /* Be able to detect the broken BT878 extents. */
> #define V4L2_COLORSPACE_BT878         0x000008

Excuse a stupid question: what are extents, and how are those of the
bt878 broken?

> /* These should be useful.  Assume 601 extents. */
> #define V4L2_COLORSPACE_470_SYSTEM_M  0x000010
> #define V4L2_COLORSPACE_470_SYSTEM_BG 0x000020

I thought SMPTE 170M defines system M as in BT.470? If not, what is the
difference? I suppose 470_SYSTEM_BG means PAL (former BT.624)
chromaticities and ITU-R Rec. 601 sampling, but why only B/G? According
to BT.470, all PAL and SECAM standards (except M/PAL) have the same
chromaticities. The RGB->YUV conversion is also the same for all systems
A-Z, and the same as defined in Rec. 601. Please explain.
 
> /* I know there will be cameras that send this.  So, this is
>  * unspecified chromaticities and full 0-255 on each of the Y'CbCr
>  * components
>  */
> #define V4L2_COLORSPACE_JPEG          0x000040

Can't say anything here. T.81 speaks only of components and leaves
colorspace information to some higher layer. libjpeg refers to Rec. 601
for YCbCr data, which of course assumes 220/225 quantization levels, not
256. But one must always expect "excursions", as 601 says.

How would this be used, anyway? Some hardware encodes or decodes JPEG
images. Where do they come from, a camera, the user? And what does the
driver do, send/receive raw images or JPEG data? "Cameras" sounds like
the driver "captures" JPEG data, but then the colorspace information
should already be in there. What's the catch?

> /* For RGB colourspaces, this is probably a good start. */
> #define V4L2_COLORSPACE_SRGB          0x000080

What does this one tell us?

Michael
