MATLAB: Brightness measurement with the webcam

Is it possible to measure brightness with the webcam? Any suggestion is welcome. Thanks!
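A minimal sketch of one way to get a relative brightness number from a webcam frame. It assumes the MATLAB Support Package for USB Webcams is installed (that package provides `webcam` and `snapshot`); the result depends on exposure and gain, so it is a relative level, not a calibrated photometric quantity.

```matlab
% Requires the MATLAB Support Package for USB Webcams (webcam, snapshot).
cam = webcam;                    % connect to the first available webcam
img = snapshot(cam);             % grab one RGB frame (uint8)
clear cam                        % release the camera
gray = rgb2gray(img);            % luminance-weighted grayscale
brightness = mean2(gray);        % mean pixel level, 0 (black) to 255 (white)
fprintf('Mean brightness: %.1f / 255\n', brightness);
```

Looping this (snapshot, convert, average) gives a live brightness readout; see the answers below for why the number has no absolute optical units without calibration.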
Related Solutions
Welcome to the very confusing field of optical units. An image does not have units of intensity. See the table at the bottom of this page: https://en.wikipedia.org/wiki/Candela I'm surprised you couldn't find anything, because there is tons of information out there; unfortunately it will make your head spin unless you have a Ph.D. in optics (sometimes even if you do, speaking from personal experience).

The units of an image are like joules, or in photometric units lux*m^2*seconds, which is lumens*seconds, which is candela*steradians*seconds (to get it into all base SI units). Anyway, think of it as a measure of energy (or luminous energy).

Let's use regular radiometric units instead of photometric (luminous) units (which are restricted to the human visual range and a lot more complicated). Say you have optical power hitting your sensor: 10 watts over an area of 1 cm by 1 cm. Now the CCD well integrates those photons. Each pixel might be 5 microns by 5 microns, so you take watts per unit area multiplied by the pixel area; that's how many watts are integrated by that pixel. But the pixel only integrates for a certain number of seconds, and a watt is a joule per second, so you have watts*seconds = (joules/second)*seconds = joules. That's why I say it's like joules, or energy.
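The watts-to-joules arithmetic above can be written out directly. The sensor area, power, and pixel size are the illustrative numbers from the answer; the 10 ms exposure time is an assumed value added for the example:

```matlab
P        = 10;          % optical power hitting the sensor [W] (from the answer)
A_sensor = 0.01^2;      % illuminated area: 1 cm x 1 cm [m^2]
irradiance = P / A_sensor;       % [W/m^2]
A_pixel  = (5e-6)^2;    % one 5 um x 5 um pixel [m^2]
P_pixel  = irradiance * A_pixel; % watts integrated by one pixel [W]
t_exp    = 0.010;       % assumed integration (exposure) time [s]
E_pixel  = P_pixel * t_exp;      % watts * seconds = joules
fprintf('Energy per pixel: %.3g J\n', E_pixel);   % 2.5e-08 J
```

So the pixel value is proportional to an energy, which is why changing either the illumination or the exposure time changes the recorded number.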
OK, that's more than you wanted to know, so I won't even bother to go into the "intensity" of a light source, which is even more complicated. Even the "experts" don't agree. For example, the American Institute of Physics says that the "intensity" of a light source is W/steradian, yet luminous intensity is an SI base quantity (alongside length, mass, time, electric current, temperature, and amount of substance) and its unit is the candela. My late optics professor, Jim Palmer of the College of Optics at the University of Arizona, got so worked up over the sloppy usage that he wrote a paper on it: "Getting intense about intensity", Metrologia, 1993, vol. 30, pp. 371-372. I attach a partial screenshot of his paper, for educational purposes.
I do calibrated color imaging all the time - it's my "day job". You're doing it backwards, and incorrectly. You white balance FIRST and THEN determine skin color, not the other way around. And you can't do white balancing with just gray cards; you need something like an X-Rite ColorChecker chart. Otherwise, how can you tell if there is some color shift in your illumination? You might have a yellowish or bluish white and want to make sure that you have the true white you're looking for, for example D65.
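One common way to apply the white balance described above is to scale each channel so the chart's white patch comes out neutral. This is a sketch under assumed values: the image here is a synthetic stand-in, and `whiteRGB` is a hypothetical measured mean RGB of the white patch, not data from the answer:

```matlab
% Hypothetical linearized camera frame and measured white-patch RGB:
img = rand(240, 320, 3) * 0.9;   % stand-in for a captured image, values in [0, 0.9)
whiteRGB = [0.92 0.88 0.75];     % assumed mean RGB of the chart's white patch
target = max(whiteRGB);          % scale channels so the white patch becomes neutral
for c = 1:3
    img(:,:,c) = img(:,:,c) * (target / whiteRGB(c));
end
img = min(img, 1);               % clip back into the valid range
```

With a yellowish illuminant (as in `whiteRGB` above), the blue channel gets the largest gain, pulling the white patch back to equal R, G, and B.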
And I wouldn't go into sRGB space. How can you get standard RGB values when you can get almost any RGB out of the camera just by changing the exposure and other settings? I would go directly from your actual RGB to XYZ and then to LAB. I have a primitive gamut visualization routine (attached), but the best one is this one
Actually, the order is:
1. First do background correction to compensate for lens shading.
2. Snap images of a series of gray cards taking up the whole image to determine the opto-electronic conversion function ("gamma").
3. Image the ColorChecker chart, then background correct and gamma correct it.
4. Determine the RGB-to-XYZ transform.
5. Use the "book formulas" to go from XYZ to LAB, which gives the "true" colors.
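Steps 4 and 5 above can be sketched as a least-squares fit followed by the standard conversion. This is a minimal illustration with synthetic stand-in data: `rgb_meas` would really be the background- and gamma-corrected mean RGBs of the chart patches, and `xyz_ref` the chart vendor's published XYZ values; `xyz2lab` is the Image Processing Toolbox implementation of the "book formulas":

```matlab
% Synthetic stand-ins for the measured and reference patch data:
rgb_meas = rand(24, 3);                  % 24 patch mean RGBs (linearized, assumed)
T_true   = [0.4 0.2 0.1; 0.3 0.6 0.1; 0.1 0.1 0.9];
xyz_ref  = rgb_meas * T_true;            % pretend published XYZ values

% Step 4: least-squares 3x3 RGB -> XYZ transform from the chart patches.
T = rgb_meas \ xyz_ref;

% Step 5: apply the transform to a whole image, then convert XYZ -> LAB.
img = rand(120, 160, 3);                 % hypothetical corrected camera image
xyz = reshape(reshape(img, [], 3) * T, size(img));
lab = xyz2lab(xyz);                      % Image Processing Toolbox
```

With real data the fit won't be exact, but 24 patches are more than enough to determine the 9 entries of `T`; the residual tells you how well a linear transform models your camera.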