
Thread Subject:
question about the expected value of this distribution

Subject: question about the expected value of this distribution

From: anja.ende@googlemail.com

Date: 8 Mar, 2012 12:26:11

Message: 1 of 6

Hello all,

This is not a matlab question per se, but I was hoping someone here
might have a good idea.

I have a joint entropy like expression as follows:

F(B|A) = integral[-inf +inf] P(B>b|A) log P(B>b|A)

To implement this, I have a 2D array whose elements represent the joint
PDF (from it I basically compute the conditional survival function and
its log).

Now, what would be the expected value of this expression, i.e. E(F(B|A))?
Would it simply be a scaled sum over each of the rows and columns? I am
somehow getting very confused by this.

Thanks for any help you can give me.

Anja

Subject: question about the expected value of this distribution

From: Matt J

Date: 8 Mar, 2012 14:33:18

Message: 2 of 6

"anja.ende@googlemail.com" <anja.ende@googlemail.com> wrote in message <63344cb9-ead9-4235-bb13-de80516e0c66@q11g2000vbu.googlegroups.com>...
>
> I have a joint entropy like expression as follows:
>
> F(B|A) = integral[-inf +inf] P(B>b|A) log P(B>b|A)
================

Is the integration w.r.t. b? If not then what? And if so, shouldn't F simply be a function of the single random variable A?

F(A) = integral[-inf +inf] P(B>b|A) log P(B>b|A)

I'll assume so below.


> Now, what would be the expected value of this expression, i.e. E(F(B|A))?
> Would it simply be a scaled sum over each of the rows and columns? I am
> somehow getting very confused by this.
>
================

E(F(A))= sum F(A) P(A)

The probabilities P(A) can just be obtained as the marginal of your
joint distribution P(A,B).
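This recipe can be sketched numerically (a numpy sketch, assuming a discrete grid with rows indexing A, columns indexing b, and a unit grid spacing db; every variable name here is an illustrative assumption, not the OP's actual code):

```python
import numpy as np

# Illustrative discrete joint PDF grid: rows index A, columns index B.
rng = np.random.default_rng(1)
P_AB = rng.random((4, 6))
P_AB /= P_AB.sum()

P_A = P_AB.sum(axis=1)                             # marginal P(A): row sums
cond = P_AB / P_A[:, None]                         # P(B = b | A = a)
S = cond[:, ::-1].cumsum(axis=1)[:, ::-1] - cond   # P(B > b | A = a)

db = 1.0                                           # assumed grid spacing in b
# Integrand S log S, with the 0 log 0 = 0 convention where S vanishes.
integrand = np.where(S > 0, S * np.log(np.where(S > 0, S, 1.0)), 0.0)
F_A = integrand.sum(axis=1) * db                   # F(a) = integral S log S db

E_F = float(np.dot(F_A, P_A))                      # E(F(A)) = sum_a F(a) P(a)
```

Since each row of S lies in [0, 1], every F(a) is nonpositive, which gives a quick sanity check on the sign of the result.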

Subject: question about the expected value of this distribution

From: Steven_Lord

Date: 8 Mar, 2012 14:40:02

Message: 3 of 6



<anja.ende@googlemail.com> wrote in message
news:63344cb9-ead9-4235-bb13-de80516e0c66@q11g2000vbu.googlegroups.com...
> Hello all,
>
> This is not a matlab question per se, but I was hoping someone here
> might have a good idea.
>
> I have a joint entropy like expression as follows:
>
> F(B|A) = integral[-inf +inf] P(B>b|A) log P(B>b|A)

You should probably ask this question on a statistics newsgroup, like the
sci.stat.math newsgroup available via Google Groups (if your news server
doesn't carry it).

--
Steve Lord
slord@mathworks.com
To contact Technical Support use the Contact Us link on
http://www.mathworks.com

Subject: question about the expected value of this distribution

From: Roger Stafford

Date: 8 Mar, 2012 15:34:17

Message: 4 of 6

"anja.ende@googlemail.com" <anja.ende@googlemail.com> wrote in message <63344cb9-ead9-4235-bb13-de80516e0c66@q11g2000vbu.googlegroups.com>...
> Hello all,
>
> This is not a matlab question per se, but I was hoping someone here
> might have a good idea.
>
> I have a joint entropy like expression as follows:
>
> F(B|A) = integral[-inf +inf] P(B>b|A) log P(B>b|A)
>
> To implement this, I have a 2D array whose elements represent the joint
> PDF (from it I basically compute the conditional survival function and
> its log).
>
> Now, what would be the expected value of this expression, i.e. E(F(B|A))?
> Would it simply be a scaled sum over each of the rows and columns? I am
> somehow getting very confused by this.
>
> Thanks for any help you can give me.
>
> Anja
- - - - - - - - - -
  I don't think the notion of expected value applies to your F(B|A) quantity. B and A are stochastic but probabilities involving them are not. They are definite numerical quantities. If I flip a penny it is legitimate to ask for the expected number of heads, namely 1/2, but it is not legitimate to ask for the expected probability of throwing a head. That is not a quantity subject to statistical variation.

Roger Stafford

Subject: question about the expected value of this distribution

From: Roger Stafford

Date: 8 Mar, 2012 19:59:39

Message: 5 of 6

"Roger Stafford" wrote in message <jjajhp$1n9$1@newscl01ah.mathworks.com>...
> I don't think the notion of expected value applies to your F(B|A) quantity. ....
- - - - - - - - -
  I withdraw my comment about your expected value. I shouldn't try to answer posts when I haven't fully awakened.

Roger Stafford

Subject: question about the expected value of this distribution

From: anja.ende@googlemail.com

Date: 11 Mar, 2012 21:10:34

Message: 6 of 6

On Mar 8, 2:33 pm, "Matt J " <mattjacREM...@THISieee.spam> wrote:
> "anja.e...@googlemail.com" <anja.e...@googlemail.com> wrote in message <63344cb9-ead9-4235-bb13-de80516e0...@q11g2000vbu.googlegroups.com>...
>
> > I have a joint entropy like expression as follows:
>
> > F(B|A) = integral[-inf +inf] P(B>b|A) log P(B>b|A)
>
> ================
>
> Is the integration w.r.t. b? If not then what? And if so, shouldn't F simply be a function of the single random variable A?
>
> F(A) = integral[-inf +inf] P(B>b|A) log P(B>b|A)
>
> I'll assume so below.
>
> > Now, what would be the expected value of this expression, i.e. E(F(B|A))?
> > Would it simply be a scaled sum over each of the rows and columns? I am
> > somehow getting very confused by this.
>
> ================
>
> E(F(A))= sum F(A) P(A)
>
> The probabilities P(A) can just be obtained as the marginal of your
> joint distribution P(A,B).

Thank you very much for the reply Matt!
