
Thread Subject:
How to calculate RELATIVE ENTROPY between two images?

Subject: How to calculate RELATIVE ENTROPY between two images?

From: Learner

Date: 5 Jan, 2012 10:37:08

Message: 1 of 17

Suppose I have two images.
After histeq, I want to know the relative entropy between the original and the histogram-equalized image.

How do I do that?

Subject: How to calculate RELATIVE ENTROPY between two images?

From: ImageAnalyst

Date: 5 Jan, 2012 13:04:56

Message: 2 of 17

On Jan 5, 5:37 am, "Learner " <farhan7...@gmail.com> wrote:
> Let suppose i have two images.
> after histeq, i want to know the relative entropy between original & HEed image..
>
> How to do that??

-----------------------------------------------------------------------
Do you have a formula for it?
Sorry but the algorithm for "relative entropy" isn't committed to my
memory.

Subject: How to calculate RELATIVE ENTROPY between two images?

From: Learner

Date: 5 Jan, 2012 13:56:08

Message: 3 of 17

ImageAnalyst <imageanalyst@mailinator.com> wrote in message <02ba33ff-bbf9-4aed-838b-0f50616fb9bc@k28g2000yqn.googlegroups.com>...
> On Jan 5, 5:37 am, "Learner " <farhan7...@gmail.com> wrote:
> > Let suppose i have two images.
> > after histeq, i want to know the relative entropy between original & HEed image..
> >
> > How to do that??
>
> -----------------------------------------------------------------------
> Do you have a formula for it?
> Sorry but the algorithm for "relative entropy" isn't committed to my
> memory.

relative entropy between two pdf's p(x) & q(x):
summation[p(x)*log{p(x)/q(x)}]
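
For concreteness, a minimal sketch of applying this formula to an image and its histogram-equalized version (imhist and histeq are Image Processing Toolbox functions; the file name is only a placeholder):

% Build pdfs p and q from the gray-level histograms of the two images,
% then apply the formula directly. 'original.png' is a placeholder name.
I = imread('original.png');       % original grayscale image
J = histeq(I);                    % histogram-equalized version
p = imhist(I);  p = p / sum(p);   % normalize counts to a pdf
q = imhist(J);  q = q / sum(q);
D = sum(p .* log(p ./ q))         % summation[p(x)*log{p(x)/q(x)}]
% Note: a bin where p is zero gives 0*(-Inf) = NaN and a bin where q is
% zero gives Inf, so real histograms usually need the zero handling
% discussed later in the thread.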

Subject: How to calculate RELATIVE ENTROPY between two images?

From: dpb

Date: 5 Jan, 2012 14:47:13

Message: 4 of 17

On 1/5/2012 7:56 AM, Learner wrote:
...

> relative entropy between two pdf's p(x) & q(x):
> summation[p(x)*log{p(x)/q(x)}]


Pretty much as written...

e=sum(p.*log(p./q))

Note the "dot" operators...look 'em up in online

doc punct

--

Subject: How to calculate RELATIVE ENTROPY between two images?

From: Learner

Date: 6 Jan, 2012 19:15:09

Message: 5 of 17

dpb <none@non.net> wrote in message <je4d5i$s6v$1@speranza.aioe.org>...
> On 1/5/2012 7:56 AM, Learner wrote:
> ...
>
> > relative entropy between two pdf's p(x) & q(x):
> > summation[p(x)*log{p(x)/q(x)}]
>
>
> Pretty much as written...
>
> e=sum(p.*log(p./q))
>
> Note the "dot" operators...look 'em up in online
>
> doc punct
>
> --
It is not working. Can you please show an example?

Subject: How to calculate RELATIVE ENTROPY between two images?

From: ImageAnalyst

Date: 6 Jan, 2012 19:24:34

Message: 6 of 17

He gave the example. Now it's your turn to show why it's not
working. Why do you say it didn't work? Surely you must have tried
it with some p and q, didn't you?

Subject: How to calculate RELATIVE ENTROPY between two images?

From: dpb

Date: 6 Jan, 2012 20:00:18

Message: 7 of 17

On 1/6/2012 1:15 PM, Learner wrote:
> dpb <none@non.net> wrote in message <je4d5i$s6v$1@speranza.aioe.org>...
....

>> e=sum(p.*log(p./q))
>>
>> Note the "dot" operators...look 'em up in online
...

> it is not working!!.. can u please show with an example?

For what definition of "not working", precisely?

 >> p=rand(3,1);q=rand(size(p));
 >> e=sum(p.*log(p./q))
e =
     2.3223
 >>

For a 2D image you'll need to force them into a vector or sum twice.

doc colon

for the former; see why the latter in

doc sum

to see Matlab's default behavior on arrays in sum() and most similar
functions.

--
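
As a quick illustration of the colon/double-sum point (using magic(3) as a stand-in for a 2-D image array):

% sum() on a 2-D array works column-wise, so either flatten the array
% with the colon operator or call sum twice.
A  = magic(3);
s1 = sum(A(:));      % flatten to a column vector, then sum
s2 = sum(sum(A));    % sum each column, then sum the row of column sums
isequal(s1, s2)      % logical 1 (true); both give 45 for magic(3)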

Subject: How to calculate RELATIVE ENTROPY between two images?

From: Learner

Date: 7 Jan, 2012 07:51:08

Message: 8 of 17

ImageAnalyst <imageanalyst@mailinator.com> wrote in message <642713a8-1e68-4452-8877-28ea64177359@u20g2000yqb.googlegroups.com>...
> He gave the example. Now it's your turn to show why it's not
> working. Why do you say it didn't work? Surely you must have tried
> it with some p and q, didn't you?

Sorry, I mean I applied it and the answer was:
entropy = NaN

I don't know why.

Subject: How to calculate RELATIVE ENTROPY between two images?

From: Learner

Date: 7 Jan, 2012 07:52:08

Message: 9 of 17

dpb <none@non.net> wrote in message <je7jsh$pko$1@speranza.aioe.org>...
> On 1/6/2012 1:15 PM, Learner wrote:
> > dpb <none@non.net> wrote in message <je4d5i$s6v$1@speranza.aioe.org>...
> ....
>
> >> e=sum(p.*log(p./q))
> >>
> >> Note the "dot" operators...look 'em up in online
> ...
>
> > it is not working!!.. can u please show with an example?
>
> For what definition of "not working", precisely?
>
> >> p=rand(3,1);q=rand(size(p));
> >> e=sum(p.*log(p./q))
> e =
> 2.3223
> >>
>
> For a 2D image you'll need to force them into a vector or sum twice.
>
> doc colon
>
> for the former; see why the latter in
>
> doc sum
>
> to see Matlab's default behavior on arrays in sum() and most similar
> functions.
>
> --

OK, I will try.

Subject: How to calculate RELATIVE ENTROPY between two images?

From: dpb

Date: 7 Jan, 2012 14:36:19

Message: 10 of 17

On 1/7/2012 1:51 AM, Learner wrote:
> ImageAnalyst <imageanalyst@mailinator.com> wrote in message
> <642713a8-1e68-4452-8877-28ea64177359@u20g2000yqb.googlegroups.com>...
>> He gave the example. Now it's your turn to show why it's not
>> working. Why do you say it didn't work? Surely you must have tried
>> it with some p and q, didn't you?
>
> sorry..
> i mean i applied & answer was :
> entropy=NaN.
> i donno why!!??

What's log(0) would be a starting guess, maybe???

--
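
A small demonstration of why a zero bin produces NaN:

% A single zero bin in p is enough to make the whole sum NaN.
p = [0.5 0.5 0];  q = [0.4 0.3 0.3];
sum(p .* log(p ./ q))    % 0*log(0) = 0*(-Inf) = NaN, so the sum is NaN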

Subject: How to calculate RELATIVE ENTROPY between two images?

From: ImageAnalyst

Date: 7 Jan, 2012 14:22:42

Message: 11 of 17

On Jan 7, 2:51 am, "Learner " <farhan7...@gmail.com> wrote:
> sorry..
> i mean i applied & answer was :
> entropy=NaN.
>
> i donno why!!??

------------------------------------------------------
Well, did your p or q have any zeros in it? I bet that's why.

Subject: How to calculate RELATIVE ENTROPY between two images?

From: ImageAnalyst

Date: 7 Jan, 2012 14:52:54

Message: 12 of 17

On Jan 7, 9:36 am, dpb <n...@non.net> wrote:
> On 1/7/2012 1:51 AM, Learner wrote:
>
> > ImageAnalyst <imageanal...@mailinator.com> wrote in message
> > <642713a8-1e68-4452-8877-28ea64177...@u20g2000yqb.googlegroups.com>...
> What's log(0) would be a starting guess, maybe???
------------------------------------------------------------------------
Funny - that's what I said a half hour ago but it's not showing up in
Google or the Mathworks yet. Weird - sometimes Google hangs onto
posts for hours and other times it posts them immediately.

He can get the non-zero parts of the array by doing something like
(untested)
p_nonZeroLocations = p > 0;
q_nonZeroLocations = q > 0;
both_nonZero = p_nonZeroLocations & q_nonZeroLocations;
Then:
e=sum(p(both_nonZero) .* log(p(both_nonZero) ./ q(both_nonZero)))

Let's see if this posts.
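
Wrapped into a small helper, the same idea might look like the sketch below (the function name and the normalization step are additions, not part of the post above):

function D = relativeEntropy(p, q)
% Relative entropy (KL divergence) between two histograms/pdfs,
% skipping bins where either distribution is zero, as suggested above.
p = p(:) / sum(p(:));               % force into column vectors and
q = q(:) / sum(q(:));               % normalize so each sums to 1
k = (p > 0) & (q > 0);              % keep only bins where both are nonzero
D = sum(p(k) .* log(p(k) ./ q(k)));
end

It could then be called as, for example, relativeEntropy(imhist(I), imhist(histeq(I))).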

Subject: How to calculate RELATIVE ENTROPY between two images?

From: dpb

Date: 7 Jan, 2012 15:31:38

Message: 13 of 17

On 1/7/2012 8:52 AM, ImageAnalyst wrote:
> On Jan 7, 9:36 am, dpb<n...@non.net> wrote:
>> On 1/7/2012 1:51 AM, Learner wrote:
>>
>>> ImageAnalyst<imageanal...@mailinator.com> wrote in message
>>> <642713a8-1e68-4452-8877-28ea64177...@u20g2000yqb.googlegroups.com>...
>> What's log(0) would be a starting guess, maybe???
> ------------------------------------------------------------------------
> Funny - that's what I said a half hour ago but it's not showing up in
> Google or the Mathworks yet. Weird - sometimes Google hangs onto
> posts for hours and other times it posts them immediately.

usenet is a wondrous anomaly... :) Your other post hadn't shown up here
at the time I posted, either.

> He can get the non-zero parts of the array by doing something like
> (untested)
> p_nonZeroLocations = p> 0;
> q_nonZeroLocations = q> 0;
> both_nonZero = p_nonZeroLocations& q_nonZeroLocations;
> Then:
> e=sum(p(both_nonZero) .* log(p(both_nonZero) ./ q(both_nonZero)))
>
> Let's see if this posts.

Indeed...

Interesting, though. I did a check of log(0)/log(0) here to confirm and
get a warning --

 >> log(0)/log(0)
Warning: Log of zero.
Warning: Log of zero.
ans =
    NaN
 >>

Doesn't that happen w/ newer versions? It would seem it should have been
apparent to the OP.

--

Subject: How to calculate RELATIVE ENTROPY between two images?

From: ImageAnalyst

Date: 7 Jan, 2012 16:07:26

Message: 14 of 17

On Jan 7, 10:31 am, dpb <n...@non.net> wrote:
> Indeed...
>
> Interesting, though.  I did a check of log(0)/log(0) here to confirm and
> get a warning --
>
>  >> log(0)/log(0)
> Warning: Log of zero.
> Warning: Log of zero.
> ans =
>     NaN
>  >>
>
> Doesn't that happen w/ newer versions as it would have seem to have been
> apparent to the OP?
--------------------------------------------------------------------
Here's an interesting experiment:
>> log(0)
ans =
  -Inf
>> log(99) / log(0)
ans =
     0
>> log(0) / log(0)
ans =
   NaN
So it appears that to get NaN, p and q must have both had 0 in the
same location. So the code I posted before should work:
p_nonZeroLocations = p > 0;
q_nonZeroLocations = q > 0;
both_nonZero = p_nonZeroLocations & q_nonZeroLocations;
Then:
e=sum(p(both_nonZero) .* log(p(both_nonZero) ./ q(both_nonZero)))

We've both posted code. I wonder why Learner won't post his code so
that we can help him. It's like pulling teeth to get him to let us
help him.

Subject: How to calculate RELATIVE ENTROPY between two images?

From: dpb

Date: 7 Jan, 2012 17:50:44

Message: 15 of 17

On 1/7/2012 10:07 AM, ImageAnalyst wrote:
> On Jan 7, 10:31 am, dpb<n...@non.net> wrote:
>> Indeed...
>>
>> Interesting, though. I did a check of log(0)/log(0) here to confirm and
>> get a warning --
...
>> Doesn't that happen w/ newer versions as it would have seem to have been
>> apparent to the OP?
> --------------------------------------------------------------------
> Here's an interesting experiment:
>>> log(0)
> ans =
> -Inf
>>> log(99) / log(0)
> ans =
> 0
>>> log(0) / log(0)
> ans =
> NaN
> So it appears that to get NaN, p and q must have both had 0 in the
> same location.

Yes. Interesting that you don't get any warnings for log(0), though. I
forget w/o looking up how the default warning level is set but appears
as though it's less strict in later releases than I have. Not sure
that's _a_good_thing_ (tm)...wonder if S Lord will stumble on this
thread and chime in on that point.

> So the code I posted before should work:
> p_nonZeroLocations = p> 0;
> q_nonZeroLocations = q> 0;
> both_nonZero = p_nonZeroLocations& q_nonZeroLocations;
> Then:
> e=sum(p(both_nonZero) .* log(p(both_nonZero) ./ q(both_nonZero)))
>
> We've both posted code. I wonder why Learner won't post his code so
> that we can help him. It's like pulling teeth to get him to let us
> help him.

Yes, you have the best solution to get an answer. Whether that's the
right answer I don't know; guess it depends on the purpose OP has in
computing the value in the first place.

It's often the case that folks somehow think clairvoyance is a trait of
respondents to their queries in cs-sm. I often point out that my
crystal ball is in the shop or murky even when available as I'm sure you
recall... :)

--

Subject: How to calculate RELATIVE ENTROPY between two images?

From: Steven_Lord

Date: 9 Jan, 2012 03:28:12

Message: 16 of 17



"dpb" <none@non.net> wrote in message news:jea0lh$nj3$1@speranza.aioe.org...
> On 1/7/2012 10:07 AM, ImageAnalyst wrote:
>> On Jan 7, 10:31 am, dpb<n...@non.net> wrote:

*snip*

>> So it appears that to get NaN, p and q must have both had 0 in the
>> same location.

Not necessarily:

log(NaN)/log(5)
log(0)/log(Inf)
log(Inf)/log(Inf)

all return NaN. log(0)/log(0) returns NaN because log(0) is -Inf and
dividing Inf by Inf (regardless of the signs) returns NaN as per section 7.2
of IEEE 754-2008. [Well, technically section 7.2 only talks about Inf/Inf,
but I believe it reasonable to generalize to combinations of +Inf and -Inf.]
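
A quick command-window check confirms those three cases (each evaluates to NaN):

log(NaN) / log(5)     % NaN / 1.6094  -> NaN
log(0)   / log(Inf)   % -Inf / Inf    -> NaN
log(Inf) / log(Inf)   %  Inf / Inf    -> NaN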

> Yes. Interesting that you don't get any warnings for log(0), though. I
> forget w/o looking up how the default warning level is set but appears as
> though it's less strict in later releases than I have. Not sure that's
> _a_good_thing_ (tm)...wonder if S Lord will stumble on this thread and
> chime in on that point.

That warning was removed a few releases ago -- in release R2010a, I think.

*snip*

> It's often the case that folks somehow think clairvoyance is a trait of
> respondents to their queries in cs-sm. I often point out that my crystal
> ball is in the shop or murky even when available as I'm sure you recall...
> :)

I swear the Mind Reading Toolbox is in the works -- after the last round of
tests turned the lab rats hyper-intelligent, they've been helping out with
the development ;)

--
Steve Lord
slord@mathworks.com
To contact Technical Support use the Contact Us link on
http://www.mathworks.com

Subject: How to calculate RELATIVE ENTROPY between two images?

From: Learner

Date: 10 Jan, 2012 09:31:08

Message: 17 of 17

ImageAnalyst <imageanalyst@mailinator.com> wrote in message <ae075d4f-dfae-48f6-b828-2d539fe4eefe@f33g2000yqh.googlegroups.com>...
> On Jan 7, 10:31 am, dpb <n...@non.net> wrote:

> We've both posted code. I wonder why Learner won't post his code so
> that we can help him. It's like pulling teeth to get him to let us
> help him.

Sorry for the delay; I haven't had frequent internet access these days because of vacations.
Many thanks, your formula works like a charm.

Thanks again, and many thanks as well to "dpb" and "Steve Lord".
