# Accurate Color Capture



## eninja (Mar 14, 2014)

I'm an engineer working in R&D. I'm not a rocket scientist. I'm studying the fluorescence of materials. Fluorescence, in my own terms, is when you illuminate a given sample with a single wavelength of light; the sample absorbs this light energy and in return it glows, or gives off light of a different color. 

Now I am using a DSLR camera to capture and record the fluorescence of a given material. I don't quite know how to construct my questions. There are many factors: a different camera, or different settings, will produce a different color output as seen by the naked eye; it also depends on the screen, the screen settings, white balance, etc. 

With all of these factors, how can I narrow things down, or keep these variables as constant as possible, so that I can compare color output at any given time?

What are the factors that affect the color output by the camera? Also, does HSB (Hue, Saturation, Brightness) play an important role in this topic?

Thanks.


----------



## neuroanatomist (Mar 14, 2014)

The best way to ensure accurate color representation is to use something like a ColorChecker Passport to generate a camera profile, shoot RAW, and apply that profile. 

Having said that, you may get imperfect results with fluorescence, because of the way a restricted set of wavelengths interacts with the color filter array (CFA, aka Bayer mask) on the sensor. That would be most problematic if the fluorescence emission spectrum of your material is narrow. 

If you're trying to accurately capture the color of the fluorescence emission, are you filtering out the excitation (e.g. with a dichroic)? If the material is at all reflective (or transmissive, if it's thin and you're illuminating the other side), the excitation wavelength will strongly affect the recorded color, since the reflected excitation will be brighter than the emission. You can avoid that with other 'tricks', like collimated illumination at an acute angle onto a non-scattering surface: if you image from near the illumination angle, you record only the omnidirectional fluorescent emission.

Backing up a step, what's your goal here?


----------



## eninja (Mar 14, 2014)

My goal is to gather a few of our samples, record their fluorescence, and be able to compare them. 
I'm still digesting what you've said. 

Thanks.


----------



## neuroanatomist (Mar 14, 2014)

Occupational hazard: fluorescence microscopy is part of my day job, usually biological samples (transmission fluorescence) but some solids imaged via reflection. Mostly I work with known fluorophores, so 'filter sets' are available that combine a bandpass excitation filter, a dichroic mirror, and a bandpass (or longpass) emission filter, so you see only the specific emission wavelengths. 

With unknowns, I use a light source with 10 different color LEDs ranging from long-wave UV to red to try several excitation wavelengths, and detect the emission with a multispectral camera that images the visible and short-wave IR spectrum in 20 nm increments.


----------



## eninja (Mar 14, 2014)

Thanks again for sharing. I believe in my case the samples are inorganic, in contrast to yours, which are organic. I hope it's valid to say that. 

Quick question: how about exposure settings? We can narrow it down to shutter speed. If you increase the exposure, the color output by the camera will be different from a short exposure; e.g. at a shorter exposure time the fluorescence comes out red, while at a longer exposure the resulting image is orange.

What is the science/reason behind this?

Thank you.


----------



## neuroanatomist (Mar 14, 2014)

Shutter speed should not affect color, assuming you're exposing properly. If you have a red light and you overexpose, the red channel will saturate and you will get increasing contributions from the green channel, giving you a progressively yellower image with longer exposures. At high ISO, you also see some desaturation in the red channel. When setting exposure, keep in mind that the camera's meter does not deal with monochromatic light very well.
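Here's a toy Python sketch of that clipping effect. The channel values are made up for illustration (they're not from any real sensor): a red-dominant pixel with a small green contribution, scaled up by exposure and clipped at full well.

```python
import colorsys

# Hypothetical linear sensor response for a red fluorescence emission:
# red dominates, with a small green contribution from CFA filter overlap
# (numbers are illustrative, not measured from any camera).
base = (0.8, 0.1, 0.0)  # (R, G, B)

def expose(rgb, exposure):
    """Scale by exposure factor and clip each channel at full well (1.0)."""
    return tuple(min(c * exposure, 1.0) for c in rgb)

for exposure in (1.0, 4.0, 10.0):
    r, g, b = expose(base, exposure)
    hue_deg = colorsys.rgb_to_hsv(r, g, b)[0] * 360
    print(f"exposure x{exposure:>4}: RGB=({r:.2f}, {g:.2f}, {b:.2f}), hue={hue_deg:.0f} deg")
```

At 1x the hue sits near red; at 4x the red channel has clipped while green keeps rising, so the hue drifts toward orange; at 10x both red and green have clipped and the hue computes as 60 degrees, pure yellow, even though the light itself never changed.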

I should mention one caveat about shutter speed not affecting color. With illumination from fluorescent lights (the tubes used in light fixtures; I'm not talking about fluorescence emission from a material), shutter speed can affect color. Fluorescent tubes emit light of different colors depending on where they are in the power cycle, which is based on the frequency of the electric current that powers them. Because of that, shutter speeds shorter than the cycle duration will capture different portions of the cycle, resulting in shifting color casts under fluorescent light.
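To put rough numbers on that (assuming 60 Hz mains; 50 Hz elsewhere just shifts the figures), the tube's output cycles at twice the line frequency, since the discharge fires on both half-cycles:

```python
# Fluorescent tubes flicker at twice the mains frequency (the discharge
# fires on both halves of the AC cycle). Shutter speeds much shorter than
# one flicker period freeze an arbitrary slice of the cycle, so the color
# varies frame to frame; speeds spanning several periods average it out.
MAINS_HZ = 60            # assumed; 50 Hz in much of the world
flicker_hz = 2 * MAINS_HZ
period_ms = 1000 / flicker_hz

print(f"flicker period: {period_ms:.1f} ms")

for denom in (30, 60, 125, 250, 1000):
    shutter_ms = 1000 / denom
    cycles = shutter_ms / period_ms
    verdict = "averages the cycle" if cycles >= 2 else "samples part of a cycle"
    print(f"1/{denom:>4} s = {shutter_ms:6.2f} ms ~ {cycles:4.1f} cycles -> {verdict}")
```

So on 60 Hz mains, speeds of 1/60 s or slower span the full flicker cycle and give a stable color, while 1/125 s and faster catch only part of it.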


----------



## eninja (Mar 17, 2014)

I read about color filter array.

Sorry for getting into basic and fundamentals.

Is it that one pixel can be red, green, or blue, or a combination of them, right?

Does that mean that in a Bayer mask, one set of RGBG is considered one pixel?

Sorry, I got confused.


----------



## neuroanatomist (Mar 17, 2014)

Each pixel has an R, G or B color filter over it (and there are twice as many G). The spatial resolution is based on the actual pixels, but the color value ultimately assigned to each pixel (done when the RAW capture is converted, either in-camera or on the computer) is interpolated based on neighboring pixels. The demosaicing algorithm basically guesses the actual color of the light hitting a pixel based on the pixels in the local area. 
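Here's a toy Python sketch of that interpolation idea, using naive bilinear averaging on an RGGB tile. Real demosaicing algorithms are far more sophisticated (edge-aware, gradient-corrected, etc.), and the scene values here are made up:

```python
# Toy bilinear demosaic on a tiny RGGB Bayer mosaic (illustrative only).
# Scene: uniform light with assumed linear RGB = (0.6, 0.5, 0.2).
SCENE = (0.6, 0.5, 0.2)
SIZE = 4

def cfa_color(y, x):
    """RGGB pattern: which filter (0=R, 1=G, 2=B) sits over pixel (y, x)."""
    if y % 2 == 0:
        return 0 if x % 2 == 0 else 1   # even rows: R G R G ...
    return 1 if x % 2 == 0 else 2       # odd rows:  G B G B ...

# The sensor records ONE number per pixel: the scene channel its filter passes.
mosaic = [[SCENE[cfa_color(y, x)] for x in range(SIZE)] for y in range(SIZE)]

def demosaic(y, x):
    """Estimate full RGB at (y, x) by averaging same-filter neighbors
    in the surrounding 3x3 window."""
    totals = [0.0, 0.0, 0.0]
    counts = [0, 0, 0]
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            ny, nx = y + dy, x + dx
            if 0 <= ny < SIZE and 0 <= nx < SIZE:
                c = cfa_color(ny, nx)
                totals[c] += mosaic[ny][nx]
                counts[c] += 1
    return tuple(totals[c] / counts[c] for c in range(3))

print(demosaic(1, 1))  # interior pixel recovers the scene color (up to rounding)
```

For a uniform scene the guess is essentially perfect, since every same-filter neighbor saw the same light; the interesting (and error-prone) cases are edges and fine detail, where the neighbors disagree.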

Your eyes are analogous, with red, green and blue cones, and your brain interpolates the actual color.


----------

