Friday, 8 June 2018

How is color represented in RAW formats?


I was browsing the forum and found this thread:



Does the selection of sRGB or Adobe RGB in camera when shooting RAW ever matter?


It makes sense that the selection of either sRGB or Adobe RGB as the colorspace is only relevant when converting from RAW to JPEG, but that means RAW file formats must use some other convention to represent color information, maybe even a proprietary one.


Does anyone know which convention that would be? And how to map it to sRGB, Adobe RGB, and/or any other colorspace?



Answer



It's just the values from the sensor, which is a (mostly) linear counter. The different photosites on a Bayer sensor have different colored filters, and the value for each site represents the light which gets through that filter. The name "RAW" is meant to convey precisely that: the values are simply that "uncooked" reading.
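To make that concrete, here is a small sketch of how a Bayer mosaic stores a single filtered value per photosite. The RGGB layout used here is a common one, but it's an assumption for illustration; actual layouts vary by camera model.

```python
import numpy as np

# A tiny 4x4 mosaic of raw sensor counts, assuming a hypothetical
# RGGB Bayer layout (top-left 2x2 cell is R G / G B).
raw = np.arange(16).reshape(4, 4)

# Each photosite records ONE filtered value, so the per-color planes
# are subsampled views of the mosaic, not full-resolution images.
red   = raw[0::2, 0::2]   # even rows, even cols
green = np.concatenate([raw[0::2, 1::2].ravel(),   # two green sites
                        raw[1::2, 0::2].ravel()])  # per 2x2 cell
blue  = raw[1::2, 1::2]   # odd rows, odd cols

print(red.shape, green.size, blue.shape)  # (2, 2) 8 (2, 2)
```

Note that green occupies half the photosites, which is why demosaicing (interpolating the missing two colors at each site) is a necessary step before any colorspace conversion.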


In a sense, then, the RAW file is in the camera's native colorspace, where the primaries match whatever the spectral responses of the filters happen to be. Converting to a reference color space like sRGB or Adobe RGB is a matter of transforming from the native space to the target one, and to do it correctly you need to know the particular properties of the filters in a given camera's sensor.


Raw converters use a color matrix (specific to each camera model) to get their results. You can see this in the dcraw source code — it's my understanding that author Dave Coffin gets most of this information from Adobe, which in turn works directly with the camera manufacturers.
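The conversion itself is just a 3×3 matrix multiply applied to each (white-balanced, demosaiced) linear pixel. The matrix below is made up for illustration; real coefficients are per-model and ship inside raw converters such as dcraw. A useful sanity property is that each row sums to 1, so neutral gray in the camera space stays neutral after conversion:

```python
import numpy as np

# Illustrative only: an invented camera-to-linear-sRGB matrix.
# Real matrices are measured per camera model.
cam_to_srgb = np.array([
    [ 1.80, -0.70, -0.10],
    [-0.20,  1.50, -0.30],
    [ 0.05, -0.45,  1.40],
])
# Rows sum to 1 so a neutral pixel maps to the same neutral value.
assert np.allclose(cam_to_srgb.sum(axis=1), 1.0)

cam_rgb = np.array([0.5, 0.5, 0.5])   # neutral gray in camera space
srgb_linear = cam_to_srgb @ cam_rgb
print(srgb_linear)                    # [0.5 0.5 0.5] — still neutral
```

After this matrix step, the converter applies the destination space's tone curve (e.g. the sRGB gamma) to get the final encoded values.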

