
Thread Subject: luminance

Subject: luminance

From: Jessica

Date: 13 Oct, 2008 14:26:04

Message: 1 of 5

Hi,

Does anyone know whether it is possible to measure the overall luminance of an image using Matlab?

Thanks!

Subject: luminance

From: Walter Roberson

Date: 13 Oct, 2008 20:02:21

Message: 2 of 5

Jessica wrote:
 
> Does anyone know whether it is possible to measure the overall luminance
> of an image using Matlab?

Yes, I know that it is *not* possible, not without additional information.

Take a single picture of a star; use any magnification and exposure
you want. Now, using just that one picture, tell us what the absolute
luminance of the star is.

If you could do that, you could revolutionize astronomy.


When you take a picture, you get out a group of sensor measurements.
However, just knowing the measurement readings doesn't tell you
what was being measured. Are the sensors linear or non-linear?
Are they full-spectrum or narrow-frequency spectrum or broad spectrum?

For example, a single pixel lit at intensity 42 (maximum 255) in
an image from a gamma-ray sensor photographing an extra-galactic object
represents *much* *much* more luminance than an entire image lit
at intensity 254 in a far-infrared sensor on long exposure of
a terrestrial object shot close-up.

In order to measure luminance, you have to have calibrated
the sensors, and you have to know the distance to the object
photographed, and you have to know about impediments (e.g.,
galactic dust clouds or terrestrial water clouds) that
are attenuating the reception; and you have to have a good
(and well-founded) model of the relationship between the
inherent luminance of the object and the luminance as measured
on the frequency bands your sensors are sensitive to. Is the
object emitting classical Boltzmann black-body radiation?
(Answer: probably not really: it probably has emission spectra
and absorption spectra...)

Subject: luminance

From: Jessica

Date: 13 Oct, 2008 21:13:02

Message: 3 of 5

Thanks for the response. If I am only looking to get a relative measure of luminance, would you have any suggestions? I'm trying to determine whether particular pictures are darker/lighter than other pictures (they are just pictures of people and shapes) when they appear on a computer screen.

Subject: luminance

From: ImageAnalyst

Date: 13 Oct, 2008 23:04:37

Message: 4 of 5

On Oct 13, 5:13 pm, "Jessica " <jyorzin...@ucdavis.edu> wrote:
> Thanks for the response. If I am only looking to get a relative
> measure of luminance, would you have any suggestions? I'm trying to
> determine whether particular pictures are darker/lighter than other
> pictures (they are just pictures of people and shapes) when they
> appear on a computer screen.

------------------------------------------
Jessica:
Walter is right again (as usual) if you want true, absolute luminance
in units of candela per square meter. If you just want the simple
"book formula" for comparing relative luminances, you might try the
formulas here:
http://www.easyrgb.com/index.php?X=MATH

Are you sure you really know the difference between all the different
optical terms, like what is the difference between luminance,
illuminance, luminous intensity, radiant intensity, etc.? You might
check out http://en.wikipedia.org/wiki/Luminance to brush up. For
example, if you have two objects in the scene that are the same
brightness, they have the same illuminance on the sensor, but they
could have different luminances if one is farther away than the
other. Even "intensity" has about 7 different meanings as it's used
in articles and popular usage, even though it does have a precise SI
definition. It can be complicated and confusing, and I'm not going to
get into it here.

In general, if a term starts with "i" it means received light, and
if it doesn't have the "i" it means emitted light: for example,
luminance and illuminance, radiance and irradiance, etc. Terms that
sound like "lum*" refer to energy weighted by the human visual
response, and terms that don't (like "rad*") refer to the entire
spectrum, even outside the visible, and don't weight the values by
the human eye's response.

It can get pretty tricky, but if you just want to measure relative
illuminance on the sensor and assume the objects subtend the same
solid angle, then you can compare them fairly well just using the
book formulas on easyrgb.com. You can use a variety of "intensity"
terms (I'm using the vague layman's definition here, not the SI
definition) such as I, L, or V. These are generally weighted sums of
the red, green, and blue values. It'll probably be good enough for
your situation if it's something like a class assignment.
Regards,
ImageAnalyst
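The weighted-sum "book formula" described above can be sketched quickly. This is an illustration only (written in Python rather than MATLAB, for brevity), using the ITU-R BT.601 weights 0.299/0.587/0.114, essentially the same weights MATLAB's rgb2gray uses; the easyrgb.com pages list several alternative formulas.

```python
# Illustration: relative "luminance" as a weighted sum of R, G, B
# (ITU-R BT.601 weights; MATLAB's rgb2gray uses essentially the same).
# Images here are nested lists of (R, G, B) tuples with values 0-255.

def relative_luminance(rgb):
    """Weighted sum of R, G, B approximating perceived lightness."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def mean_luminance(image):
    """Average relative luminance over all pixels of an image."""
    pixels = [px for row in image for px in row]
    return sum(relative_luminance(px) for px in pixels) / len(pixels)

# Two 4x4 test images: pure blue versus pure red, same raw pixel values.
blue_img = [[(0, 0, 255)] * 4 for _ in range(4)]
red_img  = [[(255, 0, 0)] * 4 for _ in range(4)]

print(mean_luminance(blue_img))   # ~29.1
print(mean_luminance(red_img))    # ~76.2
```

Note that the weights already encode a point made later in this thread: equal raw intensities count for far more perceived lightness in red than in blue.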

Subject: luminance

From: Walter Roberson

Date: 14 Oct, 2008 05:09:02

Message: 5 of 5

"Jessica " <jyorzinski@ucdavis.edu> wrote in message <gd0dku$4ik$1@fred.mathworks.com>...
> If I am only looking to get a relative measure of
> luminance, would you have any suggestions? I'm trying
> to determine whether particular pictures are darker/lighter
> than other pictures (they are just pictures of people and
> shapes) when they appear on a computer screen.

No, you can't calculate that either, not without additional
information.

Suppose for simplicity of argument that we are working with
grayscale images, integers in the range 0 through 255. And
suppose we have Image A, which is all 0 on the left side,
and all 128 on the right side. And suppose we have also
Image B, which is all 64 over the entire image. Now, which
image is "darker" or "lighter"?

If we say that Image A is lighter because its maximum
intensity reading of 128 is larger than the maximum
intensity reading of 64 for Image B, then let us introduce
Image C, which is all 0 except for a single pixel of
intensity 129. By the tentative definition of this
paragraph, we would have to conclude that Image C was
lighter on the grounds that it is now the one that has
the spot with the highest intensity reading -- and yet
our common-sense experience tells us that an image with
a single half-lit pixel is going to be "darker" than an
image in which half the pixels are each lit very nearly
as much as that single pixel. So by common experience,
we must conclude that we should not look -just- at maximum
pixel intensity.

If we then modify our hypothesis to describe lightness
in terms of -average- intensity readings (for images
of the same size) or in terms of -total- intensity
readings summed over all pixels (for images of different
sizes), then we would conclude that mathematically
Image A and Image B are equally light. But is that true?
Well, put them up on the screen and look at them: what do
you -observe-? It gets a bit difficult to say, doesn't it,
what with that large and very obvious bright area in
Image A compared to the larger, more washed-out area of
Image B?
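Images A, B, and C are easy to build and compare directly. Here is a small sketch (in Python rather than MATLAB, purely for illustration) showing that ranking by maximum intensity singles out Image C, while ranking by mean intensity ties A and B, just as argued above:

```python
# Illustration of the thought experiment: grayscale images as nested
# lists of 0-255 intensities, compared by maximum and by mean.

def make_image(rows, cols, fill):
    """Build an image; fill(r, c) gives the intensity at each pixel."""
    return [[fill(r, c) for c in range(cols)] for r in range(rows)]

# Image A: left half 0, right half 128.
A = make_image(4, 8, lambda r, c: 128 if c >= 4 else 0)
# Image B: all 64.
B = make_image(4, 8, lambda r, c: 64)
# Image C: all 0 except a single pixel at 129.
C = make_image(4, 8, lambda r, c: 129 if (r, c) == (0, 0) else 0)

def img_max(img):
    return max(max(row) for row in img)

def img_mean(img):
    return sum(sum(row) for row in img) / sum(len(row) for row in img)

# By maximum, C "wins" despite being almost entirely black:
print(img_max(A), img_max(B), img_max(C))     # 128 64 129
# By mean, A and B tie exactly, and C is nearly black:
print(img_mean(A), img_mean(B), img_mean(C))  # 64.0 64.0 4.03125
```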

So let me make this easier on you: to help
you decide, reach over to the controls on your monitor,
and turn the monitor contrast up to the maximum the
monitor can handle. Now which of the two images is
lighter? Image A, right? Okay, now turn your monitor
contrast down a long, long way. Which image is lighter
now? If your monitor controls can handle a full range of
contrast adjustments, then the right half of image A
will have its pixels only as bright as the individual
pixels of image B, but since image B has twice as many
pixels lit up to that brightness, our experience tells
us that image B will now be the "lighter" one.

So, by adjusting the monitor contrast, we can make either
Image A -or- Image B the lighter of the two images. Now,
when we adjusted the monitor contrast, did the intensity
readings stored in the matrix representing A or B change?
No, of course not. And what that tells us is that without
further information, there is no way to -calculate-
whether Image A or Image B is brighter, just given
their intensity matrices.

What is the inherent problem here? This: that when you
have a particular intensity reading in a matrix, you
do not know how bright the -monitor- is going to render
that intensity reading. And because of that, you have
no way of weighting the relative intensity readings of the
pixels in order to calculate which of the images will
*look* lighter or darker on the screen.

Typically, when you adjust the brightness and contrast
controls on a monitor, what you are adjusting is
something referred to as the "gamma curve". Conventional
cathode ray tube technology is such that you need
a minimum energy to make a point fluoresce at all, and
as you increase the voltage linearly from that point,
the photon emission rises as a power of the voltage
(the "gamma" exponent), much faster than linearly. So if
you want a linear photon emission out in response to
a linear intensity-reading increase, then the subsystem
that converts intensity readings into voltage has to
effectively apply the inverse power (gamma correction)
to the intensities. But often people don't really want
linear increases, so it is typically controlled by a
look-up table, usually calculated along a curve that is
variable from uncorrected (power-law brightness response)
to fully corrected (linear brightness response)... with
saturation being fairly common, as people seldom have
their controls adjusted properly. And a dirty
manufacturing secret: because the beam on a CRT is
not easily switched off completely, just moved around,
it can be really hard for a manufacturer to get a
true black colour at a pixel. The monitor I'm using
right now was pretty high quality for its day, but
even when it switches into power-saving mode, the entire
screen still has a pale glow.
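Such a gamma look-up table is easy to sketch (Python here, for illustration; the exponent 2.2 is a typical CRT value, an assumption rather than a figure from this thread):

```python
# Illustration: a display gamma curve as a 256-entry look-up table.
# The exponent 2.2 is a typical CRT value, assumed for this sketch.

def gamma_lut(gamma, levels=256):
    """Relative emitted brightness (0.0-1.0) for each stored intensity."""
    return [(i / (levels - 1)) ** gamma for i in range(levels)]

lut = gamma_lut(2.2)

# A stored intensity of 128 is about half of 255, yet the relative
# emitted brightness is only about a fifth of full output:
print(lut[128])   # ~0.22
```

This is why comparing raw stored intensities says little about what the monitor actually emits: the same matrix of numbers lands at very different brightnesses depending on the curve in effect.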

What do you need in order to know how the brightness
on your monitor changes with pixel intensity reading?
Well, effectively you have to calibrate the equipment
using a luminosity measurement device (and then lock the controls). And that's partly because your graphics card
has its own gamma tables that adjust the signals being
sent to the monitor, and then the monitor has -its-
gamma tables that control how the monitor will react
to those signals -- but you need to know the response
"end-to-end", not the response at any one point. -Some-
consumer-level monitors have a calibration device supplied;
a commercial calibration tool can be a bit pricey
(a few thousand dollars). It's one of those things where
"If you need it, you NEED it" -- like if you are producing
online proof copy for publishing purposes, then you need
to *know* that the colour you see on the screen will
match the colours that are actually printed.


Now I'm going to confuse the issue even more by switching
attention to colour images. Let us take Image E, which
has all of its pixels set to blue 255 and red and green 0,
and let us take Image F, which has all of its pixels set
to red 255 and green and blue 0. Which image is "lighter",
blue image E, or red Image F ? Same pixel intensities
for both, so the gamma response doesn't affect the
question, right? Oops, that's wrong -- colour CRTs have
separate gamma curves for Red, Green, and Blue! So,
let's move into a more comfortable question by assuming
that the Red, Green, and Blue response curves are
configured identically to each other... that should
fix -that- problem, right? Ulp! Not quite -- Red,
Green, and Blue phosphors on CRTs have different
minimum fluorescence energies and different brightness
exponents. Okay, so maybe buying that $2000 calibration
tool wasn't a waste of money after all: assume that
we've got everything all calibrated so that at maximum
intensity values, the different colour signals all
have the same photon emission rate -- and since the
response curves will be different for each of the
colours, assume that we have a complete table of
relative photon emission levels for each possible
intensity reading from 0 to 255, so that we could
take a colour intensity reading matrix and calculate
the total photon emission level. *Now* which is lighter,
the #$#@ blue Image E, or the #$@# red Image F?

Answer: under those conditions, of equal photon emission
rates for the blue and red images, humans will -perceive-
the red image to be brighter! Humans have three sets of
colour cones, with peak sensitivity in different colours,
and the brightness response for red is higher than the
brightness response for blue (but the shade discrimination is better for blue -- we can pick out differences in
blues more readily than we can pick out differences in
reds). Which feeds back into another reason why
monitors are often not set up for linear photon
output emission rates: so as to balance the -perceived-
brightnesses amongst the different colours.

Since you were asking about "luminance" I'll leave
you with a final bit of confusion: although blues
are perceived by humans to generally be darker colours,
blue photons are higher frequency (more energetic) than
red photons. If you fire up a gas burner, "red hot" is
-cooler- than "blue-hot". So if you start wanting to
know about the energy involved in an image, you
have to take into account that for humans, higher
energy can be -less- bright.
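That last point follows from the photon-energy relation E = h·c/λ. A quick sketch (Python, with typical red and blue wavelengths chosen for illustration, not figures from the thread):

```python
# Illustration: photon energy E = h*c/wavelength, comparing a typical
# "red" (~650 nm) photon with a typical "blue" (~450 nm) photon.

PLANCK = 6.626e-34      # Planck constant, J*s
LIGHT_SPEED = 2.998e8   # speed of light, m/s

def photon_energy_joules(wavelength_nm):
    """Energy of a single photon of the given wavelength, in joules."""
    return PLANCK * LIGHT_SPEED / (wavelength_nm * 1e-9)

red_photon = photon_energy_joules(650.0)
blue_photon = photon_energy_joules(450.0)

# Each blue photon carries roughly 44% more energy than a red one,
# even though humans perceive equal-emission blue as darker:
print(blue_photon / red_photon)   # ~1.44
```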

Now, what was the question again? :-)
