


Does Color Space Matter When Shooting Raw

Digital cameras often have a setting that allows you to select either sRGB or Adobe RGB.  What, exactly, does this setting do?  The answer is that, like the white balance and several other camera settings, it affects only the JPEG files that the camera produces.  If you shoot RAW (and if you don't, you really need to start), the sRGB or Adobe RGB setting has no impact whatsoever on the raw image data stored in the RAW file.

Figure 1:  Digital Camera Electronics

Figure 1 shows how image data is captured in a digital camera.  The sensor is an array of photo-sites.  A photo-site consists of a photo-detector, which is essentially a photon counter, under a "red," "green" or "blue" filter.  Typically, each pixel has one "red," one "blue" and two "green" photo-sites, so there are alternating rows of blue/green and green/red photo-sites.  This is called the Bayer pattern, as shown in Figure 1.  The Bayer pattern is very common, but there are variations.
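
For the code-inclined, here is a toy Python sketch of the RGGB layout just described.  The 2x2 cell arrangement is the common one; the labels and dimensions are purely illustrative.

    import numpy as np

    # Toy sketch: label an array of photo-sites with the RGGB Bayer layout.
    def bayer_labels(rows, cols):
        labels = np.empty((rows, cols), dtype="<U1")
        labels[0::2, 0::2] = "R"   # even rows alternate red/green
        labels[0::2, 1::2] = "G"
        labels[1::2, 0::2] = "G"   # odd rows alternate green/blue
        labels[1::2, 1::2] = "B"
        return labels

    print(bayer_labels(4, 6))      # alternating R/G and G/B rows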

I use "red," "greenish" and "blue" in quotes here because I want to emphasize that these are photo-site filter responses every bit determined by the photograph-chemical backdrop of the dyes used in the sensor'south color filters.  These colour responses determine the sensor's native color space .  They are non like the carmine, green and blueish primaries of the standard RGB spaces , such every bit sRGB, Adobe RGB (time to comeaRGB), or ProPhoto RGB.

After capturing an exposure, each photo-site's photon count is realized as an analog voltage.  This voltage is read off of the sensor array row-by-row in a raster sequence.  The signal is sampled for each photo-site response and converted into digital words by an Analog-to-Digital Converter ( ADC ).  It is these digitized samples at the ADC that are the sensor's raw image data .

Figure 1 also indicates that the camera's ISO setting is an analog amplifier prior to the ADC.  Increasing the amplification, for higher ISO, amplifies low level signals but at the cost of also amplifying sensor noise.  That story, the story of dynamic range, we'll hold for another day.
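
To make the capture chain concrete, here is a minimal numerical sketch of the ISO gain and ADC stages.  The 14-bit depth and unit full-scale voltage are assumptions for illustration, not any particular camera's design.

    import numpy as np

    # Hypothetical model: analog ISO gain, then a 14-bit ADC quantizes each
    # photo-site voltage into a digital code (the raw image data).
    def adc_sample(voltages, iso_gain=1.0, v_full=1.0, bits=14):
        levels = 2**bits - 1                           # 16383 codes at 14 bits
        v = np.clip(voltages * iso_gain, 0.0, v_full)  # amplifier saturates
        return np.round(v / v_full * levels).astype(np.uint16)

    row = np.array([0.20, 0.51, 0.03])      # one raster row of voltages
    print(adc_sample(row))                  # -> [3277 8355  491]
    print(adc_sample(row, iso_gain=4.0))    # higher ISO scales signal and noise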

After the ADC, a processor core called a Digital Signal Processor ( DSP ) (also called an Image Processor ) applies in-camera digital processing to the raw image data.  Both in-camera and post processing of raw image data include the following.

  • Demosaicing:   The rearrangement and interpolation of the raw image data (row-by-row sampled photo-site data) into per-pixel color channels (blue-green-green-red for the Bayer pattern), still in the sensor's native color space.  (A toy sketch follows below.)
  • Color Space Transformation :  The transformation from raw image data to a standard RGB color space.  This step does apply the camera's white balance setting.  In camera, the destination space is determined by the s/aRGB camera setting.
  • Other Adjustments :  Contrast, saturation, sharpening, etc.
  • JPEG encoding
  • Lossless data compression (typically LZW coding)

Sometimes, the combination of demosaicing and color space transformation is called rendering .
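
Here is the promised toy sketch of the demosaicing step, using simple neighbor averaging on an RGGB mosaic.  Real cameras and raw converters use far more sophisticated interpolation; this only illustrates the idea of filling in the two missing color samples at each pixel.

    import numpy as np
    from scipy.signal import convolve2d

    def demosaic_bilinear(mosaic):
        h, w = mosaic.shape
        r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True  # "red" sites
        b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True  # "blue" sites
        g_mask = ~(r_mask | b_mask)                                 # two "greens"
        rgb = np.zeros((h, w, 3))
        k = np.ones((3, 3))                                         # 3x3 window
        for ch, mask in enumerate((r_mask, g_mask, b_mask)):
            num = convolve2d(np.where(mask, mosaic, 0.0), k, mode="same")
            den = convolve2d(mask.astype(float), k, mode="same")
            interp = num / np.maximum(den, 1e-9)            # neighbor average
            rgb[..., ch] = np.where(mask, mosaic, interp)   # keep real samples
        return rgb   # still in the sensor's native color space

    mosaic = np.random.default_rng(0).uniform(size=(4, 6))
    print(demosaic_bilinear(mosaic).shape)   # (4, 6, 3)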

Your camera always creates a JPEG even if you shoot in RAW-only mode.  If for nothing else, the LCD preview display is a JPEG.  JPEG production can only be applied after demosaicing and color space transformation, and other adjustments may be applied prior to JPEG production.  Nevertheless,

RAW files store raw image data as produced at the ADC in the sensor's native color space – hence, the name RAW.  The only in-camera digital processing applied to the raw image data is lossless data compression, and the only camera setting that affects the raw image data is the ISO setting*.

*There are actually some very minor exceptions to the 'ISO only rule,' but they are not generally relevant to modern digital cameras.  I'll get back to those exceptions below.

The color space transformation uses the four (Bayer pattern) samples per pixel to calculate the red, green and blue channels of a standard RGB space.  That's actually a rather messy signal processing operation, as it must compensate for sensor nonlinearities before the linear transformation to a standard RGB space.  If you are a digital wonk, like me, you can download the open standard Adobe DNG Specification (which is the main reference for this blog) and read about the gory details like I did.
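
As a rough illustration, the linear part of that transformation is a 3x3 matrix applied to each demosaiced pixel.  The matrix below is invented for the example; real matrices are calibrated per camera model and carried in DNG tags such as ColorMatrix.

    import numpy as np

    # Invented camera-native -> standard RGB matrix (rows roughly sum to 1).
    CAM_TO_RGB = np.array([[ 1.80, -0.55, -0.25],
                           [-0.20,  1.45, -0.25],
                           [ 0.05, -0.60,  1.55]])

    def camera_to_rgb(pixels, matrix=CAM_TO_RGB):
        """pixels: (..., 3) linearized sensor-native values -> linear RGB."""
        return np.clip(pixels @ matrix.T, 0.0, None)

    native = np.array([0.42, 0.61, 0.18])   # one demosaiced pixel
    print(camera_to_rgb(native))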

White balance ( WB ) is a special adjustment setting.  It does not affect the raw image data.  However, color space transformation, both in-camera and in post, does apply the camera's WB setting.  Thus, the commonly cited claim that camera WB does not affect post-processing of RAW files is not quite right.  The camera's WB setting is stored in the RAW file and used for color space transformation.  That determines the initial preview that you start with in post processing.
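
Here is a minimal sketch of how a raw converter might apply the stored WB setting during rendering: per-channel gains on the linear data.  The gains are invented; the point is that the raw data itself is never modified, only the rendering changes.

    import numpy as np

    WB_GAINS = {"daylight":     np.array([2.0, 1.0, 1.5]),   # invented gains
                "incandescent": np.array([1.3, 1.0, 2.4])}

    def apply_wb(pixel, setting="daylight"):
        return pixel * WB_GAINS[setting]    # applied during rendering only

    pixel = np.array([0.25, 0.50, 0.30])    # demosaiced sensor-native pixel
    print(apply_wb(pixel, "daylight"))      # same raw data...
    print(apply_wb(pixel, "incandescent"))  # ...different rendering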

If you are a serious post-processing photographer, be careful with auto-WB.  The reason is that auto-WB sets the WB for each image individually.  In post processing, for a group of pictures shot in the same setting, you'll take the time to get your color balance right once and then apply that adjustment across the entire group.  But if the group was shot with auto-WB, then strictly speaking you have to color balance each image separately.  It doesn't really matter what the WB setting is; it only matters that it is the same across the group.

I have recently adopted the practice of using the neutral "daylight" WB setting for all shooting.  This ensures a uniform WB setting across all images, so I don't have to check before applying a batch color balance adjustment.  The only drawback is that the in-camera previews have the wrong WB rendering, so the color balance is off in the camera previews.  I really don't care; I don't use the in-camera LCD preview for adjustments.  However, I do care about the histogram and blow-out indicators provided on the camera's LCD to help me set exposures.  Thus, I want the camera's histogram to accurately reflect the raw image data, not a white-balanced preview.  That's why I use a neutral in-camera rendering (daylight WB) to calculate histograms.

My Nikon D700 offers many more adjustments under the heading "picture controls."  There are presets: standard, neutral, vivid and monochrome, and you can add custom picture controls allowing you to specify things like contrast, brightness, saturation, sharpening, etc.  We might call this "pre-processing" because all the adjustments are set prior to capturing the image.

Now, if you're like me, then you agree that the camera's LCD is not the right place to make adjustments.  I do that on my 27" iMac, thank you very much.  Nevertheless, with the exception of WB, these in-camera adjustments, along with the camera's a/sRGB setting, apply only to in-camera JPEG production.  I tested this on my D700.  I shot my SpyderChecker card with each of the "picture control" presets (neutral, vivid, …), and then opened the RAW files in Lightroom.  The big hint was that the monochrome image, which was in B&W on the camera's LCD preview, showed up in glorious color when I opened the RAW file in Lightroom.  Indeed, all four images were identical.

Now, I understand why manufacturers offer all of these in-camera adjustments.  First, of course, they want pictures captured with their cameras to look great with no post processing!  But there is more to it than that.  Some photographers need to shoot a lot of photos and process them quickly.  Think wedding photography.  They can set the white balance, sharpening, that soft wedding-style contrast, etc., for each shooting environment, and then just shoot and deliver the in-camera JPEGs.  There's nothing wrong with that.  It is just an efficient way to deliver a quality product to clients quickly.

Nikon also offers something called Active D-Lighting, which supposedly increases dynamic range.  Other manufacturers may offer similar gimmicks.  I did a little internet searching, and found that (apparently) Active D-Lighting is a combination of exposure modification and a digital tone mapping algorithm.  This exposure modification is one of those "exceptions" to the 'ISO only rule,' because it does affect the raw image data.  This sounds similar to the recommended practice of exposure to the right ( ETTR ).  I'll discuss that in a blog on dynamic range.  I just turn that stuff off!  If you are a serious post-processing photographer, you will make your own tone-mapping adjustments (using things like gradient filters and masked layers).  I don't want to rely on a "canned algorithm."  Fortunately, the tone mapping of Active D-Lighting is digital processing that is applied only to in-camera JPEGs.  Still, it might monkey with your exposure.  So just don't do it.  You should learn the ETTR technique and manage it yourself.

Post processing software uses a working RGB space .  You should configure your post processor for the largest possible working space, which typically is ProPhoto RGB .  ProPhoto is the default setting for Lightroom.  In Photoshop, you set the working color space in the Color Settings dialog box, which gives you an incredible array of options.  You can even specify your own custom RGB space, such as a monitor's native RGB space, but ProPhoto is more than sufficient for the vast majority of photographers.  Using a wide gamut working space like ProPhoto (which is much wider than can be produced in print or on a monitor) allows processing to create new colors via various adjustments without having to perform the destructive (and nonlinear) out-of-gamut rendering operation after each adjustment.  You should apply the out-of-gamut rendering only as the very last step to produce your final product (a JPEG or print), under your control, using tools like soft proofing.
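
Here is a toy numerical demonstration of why out-of-gamut rendering should happen once, at the end.  Clipping to [0, 1] stands in for rendering into a small gamut; a saturation boost followed by an exactly compensating reduction should round-trip, but it cannot if you clip in between.

    import numpy as np

    def saturate(rgb, s):                   # push colors away from their gray
        gray = rgb.mean()
        return gray + s * (rgb - gray)

    pixel = np.array([0.95, 0.40, 0.10])    # vivid but in-gamut color

    # Wide working space: adjust freely, render (clip) once at export.
    wide = saturate(saturate(pixel, 1.8), 1 / 1.8)
    print(np.clip(wide, 0, 1))              # recovers the original pixel

    # Small working space: clipping after the first adjustment destroys data.
    small = np.clip(saturate(pixel, 1.8), 0, 1)
    print(saturate(small, 1 / 1.8))         # no longer the original pixel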

Still, many photographers seem to think that RAW files store image data in sRGB or aRGB, undoubtedly because it is a setting on the camera.  If that were true, there would certainly be a massive outcry on photography blogs.  It just wouldn't make sense to compress raw image data to s/aRGB in-camera, and then transform to a larger working space like ProPhoto for post processing.  There should be only one transformation from sensor native to a wide gamut working space, and then, after processing, one transformation from the working space to the production JPEG or print gamut, where you control out-of-gamut rendering.

I recommended above to always use a neutral WB setting in-camera, but I do presume you will post-process in a wide working space.  Transforming to sRGB as the working space with the wrong WB setting can produce undesirable results that post adjustments cannot correct, because color data was lost.

Unfortunately, manufacturers tend to obfuscate these facts.  My Nikon D700 user's manual, for example, makes no mention of the fact that the s/aRGB setting, the 'picture control' adjustments and Active D-Lighting tone-mapping apply only to in-camera JPEG production.  It simply doesn't say anything about which output format these settings affect – as if JPEG were the only assumed image capture format.

A few years back, Nikon introduced some lossy compression in their RAW file format, NEF.  You can now select either 12 or 14 bit NEF files.  12 bit data is slightly compressed to produce a smaller RAW file.  While the 12 bit compression distortions may be imperceptible, the public relations blowback was not.  That said, I have my D700 set to 14 bit data to get a true RAW file, as any absolute data integrity purist would do.

More Details, Anyone?

The rest of this blog is going to be a deeper dive into some details, so feel free to stop reading here.

What exactly is the sensor's native color space?  As stated above, it is determined by the response curves of the "red," "green" and "blue" color filters on top of the sensor photo-sites.  I don't have color response curves for commercial sensors (which are likely proprietary anyway).  However, these color filters play exactly the same role as the opsin proteins in the cone cells of our eyes.  The cone cell response curves, and the color space they determine, are called LMS for Long, Medium and Short wave, which roughly correspond to red, green and blue.  The figure below shows the LMS response curves vs. the wavelength of monochromatic light.  (Monochromatic means pure light of a single wavelength.)  Note that the LMS responses are very broad and have substantial overlap.

Figure 2: Human cone cell LMS response curves.  This figure is due to BenRG (own work) [Public domain], copied from https://commons.wikimedia.org/wiki/File:Cone-fundamentals-with-srgb-spectrum.svg.  It is reproduced and attributed here in accordance with the Wikimedia Commons copyright agreement.

We perceive color as combinations of LMS responses – not directly from wavelength.  As a result, different combinations of monochromatic light can produce the same LMS responses, and hence are perceived as the same color.
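
If you like to see this numerically, here is a sketch using invented Gaussian stand-ins for the broad, overlapping LMS curves.  Adding a component from the null space of the sensing matrix changes the physical spectrum without changing any of the three responses – a metamer.  (For brevity, the sketch ignores the physical requirement that spectra be non-negative.)

    import numpy as np

    wl = np.arange(400, 701, dtype=float)        # wavelength grid, nm

    def curve(center, width=60):                 # invented LMS-like response
        return np.exp(-0.5 * ((wl - center) / width) ** 2)

    LMS = np.stack([curve(565), curve(540), curve(445)])   # L, M, S rows

    def sense(spectrum):
        return LMS @ spectrum                    # three numbers per light

    rng = np.random.default_rng(1)
    s1 = rng.uniform(0.2, 1.0, wl.size)          # some arbitrary light
    null_proj = np.eye(wl.size) - np.linalg.pinv(LMS) @ LMS
    s2 = s1 + null_proj @ rng.normal(0.0, 0.05, wl.size)
    print(np.allclose(sense(s1), sense(s2)))     # True: identical responses
    print(np.abs(s1 - s2).max())                 # yet the spectra differ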

In contrast, the red, green and blue colors of standard RGB spaces are nearly monochromatic colors.   How do I know this?  Because they are primary colors, which have a very special property.  While other colors can (generally) be represented as combinations of the red, green and blue primaries, the reverse is not true.  Consider yellow, which is a secondary color.  Our eyes and brains perceive yellow as a combination of the L and M responses.  Thus, we can perceive the same yellow as a mixture of red and green monochromatic light, or as a single monochromatic yellow light.  So, why are the standard RGB primaries "nearly monochromatic?"  Because you want the widest color gamut that can be produced by just three primaries, and that requires highly saturated (which essentially means "nearly monochromatic") primary colors.

One should not confuse the sensor photo-site color filter response curves with the primaries of standard RGB spaces.  It's just apples and oranges.  There are infinitely many combinations of monochromatic light that produce the same perception of a yellow color, and a color sensing system must properly recognize all of them as yellow.  A reproduction system, on the other hand (like a monitor, a post processor working space, or a print), only needs to produce one of the combinations that result in the same color perception.

Still not convinced?  Well, suppose we build a sensor with response curves equal to the red, green and blue spectra of, say, the sRGB primaries, as illustrated in Figure 3.  Case (a) shows the case where we excite the sensor with equal levels of red and green light.  The sensor works fine.  The sensed levels for both red and green are equal, and our eyes and brains perceive that as yellow.  Case (b) shows the case where this sensor is excited with a monochromatic yellow light that we perceive as exactly the same yellow as the red-green combination of (a).  However, in this case, there is no sensor response.  The narrow sRGB primary spectra, used here as sensor response curves, don't overlap, and don't cover the entire range of visible wavelengths.  In case (b), the sensor reports that the color is black – and that's just wrong!

Figure 3: Hypothetical "sRGB sensor" excited with (a) a green-red combination, or (b) monochromatic yellow.  Both are perceived as the same yellow by the human eye.  Black dot represents sensed value.
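
The same point in toy code: narrow, non-overlapping filters (invented 10 nm Gaussians at sRGB-primary-like wavelengths) miss monochromatic yellow almost entirely, while broad LMS-like filters sense it just fine.

    import numpy as np

    wl = np.arange(400, 701, dtype=float)

    def curve(center, width):
        return np.exp(-0.5 * ((wl - center) / width) ** 2)

    narrow = np.stack([curve(c, 10) for c in (612, 549, 465)])  # "sRGB sensor"
    broad  = np.stack([curve(c, 60) for c in (565, 540, 445)])  # LMS-like

    yellow = (wl == 580).astype(float)     # monochromatic yellow light
    print(narrow @ yellow)                 # ~[0.006 0.008 0.000]: reads "black"
    print(broad @ yellow)                  # healthy L and M response: yellow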

It is, therefore, a source of confusion that camera manufacturers describe their color photo-sites as "red," "green" and "blue."  It would be better (in my opinion) to call these "sensor LMS" responses.

Ever notice that nobody talks about camera color gamut?  You don't see it discussed in reviews.  The reason is that the sensor's color gamut is likely not a limiting factor.  In fact, I can provide a mathematical proof to show that a camera sensor can accurately sense 100% of the humanly perceivable colors if and only if its response curves are a linear combination of the LMS responses.  (Send a request to [email protected].)  The sensor issue is not gamut (the range of colors that can be sensed), but rather color accuracy , that is, the degree to which all of the combinations of monochromatic light that are humanly perceived as the same color are sensed correctly as the same color.

Alas, much of the literature (and many "expert" blogs on the internet) does seem to confuse the distinction between a sensor's native color space and processing/reproduction spaces like the standard RGBs.  Of course, different manufacturers will apply different variations of the color filter technology, which leads to different native color spaces.  Perhaps this is why the industry has resisted adopting a uniform standard for RAW files.  DNG provides all the necessary support for native sensor color space transformation.  It continues to be an enigma that many camera manufacturers have resisted DNG.

DNG includes all the raw image data exactly as it exists in the manufacturer's RAW file.  Like all RAW files, DNG also includes metadata and camera settings.  This additional data is stored in data fields called tags .  DNG is actually an extension of the TIFF (Tagged Image File Format) standard that includes additional DNG specific tags.  Those DNG tags are used for color space transformation in Adobe Camera Raw.  I purchased an app called ExifExtreme from the Apple App Store that reads all the tags in DNG and TIFF files.  (There is also a UNIX command line tool called tiffutil that will read TIFF tags, but it doesn't read the DNG specific tags, or the manufacturer's specific data.)  The DNG tags are listed in the DNG spec.  The DNG also includes all of the manufacturer specific tags (in my case, from the Nikon NEF file).  You get all those in-camera adjustment settings (standard, vivid, Active-D, etc.), even though they are not incorporated in the raw image data.   If you are a digital wonk, ExifExtreme is a good $5 investment.

Lastly, I need to mention the other exception to my 'ISO only rule.'  According to the Adobe DNG Specification, some cameras can apply an "analog WB."  This is implemented as different analog gains on different color channels prior to the ADC.  Hence, this WB setting does affect the raw image data.  Adobe says it can increase dynamic range.  My internet searches turned up only scant discussion of this.  Analog WB was apparently implemented on the Nikon D1, and perhaps in some Pentax cameras.  My guess is that this is a difficult technology to implement, that there are better ways to increase dynamic range via sensor design, and hence, that this has not been a widely adopted technology.  If you know more about cameras with analog white balance, please share.  The presence of analog white balance is indicated by the DNG tag AnalogBalance.  If your DNG does not include the AnalogBalance tag, or it has a value of 1 1 1, then you don't have analog white balance (as is the case for my D700 files).
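
If you would rather script this than buy an app, here is a hedged sketch using the third-party Python package tifffile.  The path "photo.dng" is a placeholder, and tag code 50727 is AnalogBalance per the DNG specification; exact tag value formatting may vary by library version.

    import tifffile

    with tifffile.TiffFile("photo.dng") as tif:
        tags = tif.pages[0].tags
        for tag in tags:                    # dump every tag in the first IFD
            print(tag.code, tag.name, tag.value)
        ab = tags.get(50727)                # AnalogBalance, if present
        print("AnalogBalance:", None if ab is None else ab.value)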

John Sadowsky
