
Thread Subject:
Approximate entropy

Subject: Approximate entropy

From: Aino

Date: 7 Jun, 2010 20:03:06

Message: 1 of 2

Hi all!

I am trying to write code to calculate approximate entropy (Pincus 1996). Since the measure seems somewhat arbitrary to me (personal opinion..), I compared my code with another one I found in this forum (http://www.mathworks.com/matlabcentral/newsreader/view_thread/259062#712823), but I didn't get exactly the same results. One difference is that my code takes the logarithm at a slightly later point than the other one does. Here is my code (cop is the signal):

function AppEnt=ApproxEntropy(cop,M,r)
% ApproxEntropy  Approximate entropy of the signal cop (Pincus 1996).
%   M : template length, r : tolerance (same units as cop).
%   For each m in {M, M+1} the code forms
%       Cim(i) = (number of templates within tolerance r of template i)/(N-m+1)
%       phi_m  = sum(log(Cim))/(N-m+1)
%   and returns AppEnt = phi_M - phi_(M+1). Self-matches (j == i) are
%   included, as in the original definition.

N=length(cop);
count1=0;              % coordinates of the current pair that differ by more than r
count2=0;              % templates j that match template i
CIM=[];                % holds phi_M and phi_(M+1)

for m=[M,M+1]
    Cim=NaN(1,N-m+1);
    
    for i=1:N-m+1
        seg_i=cop(i:i+m-1);              % template starting at i
        
        for j=1:N-m+1
            seg_j=cop(j:j+m-1);          % candidate template starting at j
            
            % count1 > 0 means the Chebyshev (max-abs) distance exceeds r
            for k=1:m
                if abs(seg_i(k)-seg_j(k))>r
                    count1=count1+1;
                end
            end
            
            if count1==0                 % all coordinates within r: a match
                count2=count2+1;
            end
            count1=0;
        end
        
        Cim(i)=count2/(N-m+1);           % fraction of templates matching template i
        count2=0;
    end
    
    CIM=cat(2,CIM,sum(log(Cim))/(N-m+1));   % phi_m: average of log(Cim)
end

AppEnt=CIM(1)-CIM(2);

Could you help me see if there is something wrong with it? Also, if I feed random noise into the code, it doesn't give out "a big number" (whatever that means).
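
For reference, this is the kind of check I mean (m = 2 and r around 0.2 times the signal's standard deviation are the values commonly suggested in the ApEn literature; the test signals here are just examples):

t  = (0:0.02:10)';
x1 = sin(2*pi*t);                      % very regular signal -> low ApEn expected
x2 = randn(size(t));                   % white noise -> clearly higher ApEn expected
ApproxEntropy(x1, 2, 0.2*std(x1))
ApproxEntropy(x2, 2, 0.2*std(x2))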

Also, can I use this as a cross approximate entropy if I change the seg_j to be from another signal?
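
For concreteness, the change I have in mind is something like the untested sketch below: templates from the first signal are compared against templates from a second signal of the same length, which is how cross-ApEn is usually described. Note that without the guaranteed self-match some Cim(i) can be zero, so log(Cim) can blow up.

function XAppEnt=CrossApproxEntropy(u,v,M,r)
% Untested sketch of cross approximate entropy between two equal-length
% signals u and v: templates are taken from u, candidate matches from v.
u=u(:);
v=v(:);
N=length(u);
CIM=[];

for m=[M,M+1]
    Cim=NaN(1,N-m+1);
    
    for i=1:N-m+1
        seg_i=u(i:i+m-1);
        count2=0;
        
        for j=1:N-m+1
            % Chebyshev (max-abs) distance between a template from u and one from v
            if max(abs(seg_i-v(j:j+m-1)))<=r
                count2=count2+1;
            end
        end
        
        Cim(i)=count2/(N-m+1);          % can be zero: there is no self-match here
    end
    
    CIM=cat(2,CIM,sum(log(Cim))/(N-m+1));
end

XAppEnt=CIM(1)-CIM(2);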

Thank you so much!

-Aino

Subject: Approximate entropy

From: Aino

Date: 10 Jun, 2010 13:30:23

Message: 2 of 2

Well, I read that Sample Entropy would be better than Approximate Entropy, so I switched to that. Now I am having big problems speeding it up, though, so I guess I will post about that next.
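
In case it is useful to anyone, this is roughly what I am working with now: a loop-light sketch of Sample Entropy in the style of Richman & Moorman (2000), with self-matches excluded and the same Chebyshev distance as above. It is only a sketch that I have not verified against a reference implementation.

function se=SampleEntropy(x,m,r)
% Sample Entropy sketch: se = -log(A/B), where B counts template matches of
% length m and A counts matches of length m+1, self-matches excluded.
x=x(:);
N=numel(x);
nT=N-m;                                 % same number of templates for both lengths
match=zeros(1,2);

for k=[m,m+1]
    T=zeros(nT,k);                      % each row is one template of length k
    for c=1:k
        T(:,c)=x(c:c+nT-1);
    end
    
    for i=1:nT-1
        % Chebyshev distance from template i to every later template
        d=max(abs(bsxfun(@minus,T(i+1:end,:),T(i,:))),[],2);
        match(k-m+1)=match(k-m+1)+sum(d<=r);
    end
end

se=-log(match(2)/match(1));

The bsxfun call compares one template against all later templates at once, which removes the two innermost loops (that is where most of the time was going in my loop version).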

Thanks!

-Aino


"Aino" <aino.tietavainen@removeThis.helsinki.fi> wrote in message <hujj9q$496$1@fred.mathworks.com>...
> Hi all!
>
> I am trying to make a code for calculating approximate entropy (Pincus 1996). Since the measure is somewhat random (personal opinion..) I compared my code with another one I found in this forum (http://www.mathworks.com/matlabcentral/newsreader/view_thread/259062#712823). I didn't get exactly the same results.. At least the logarithm is taken in a bit later point in the code than in the other code. Here's my code (cop is the signal):
>
> function AppEnt=ApproxEntropy(cop,M,r)
>
> N=length(cop);
> count1=0;
> count2=0;
> CIM=[];
>
> for m=[M,M+1]
> Cim=NaN(1,N-m+1);
>
> for i=1:N-m+1
> seg_i=cop(i:i+m-1);
>
> for j=1:N-m+1
> seg_j=cop(j:j+m-1);
>
> for k=1:m
> if abs(seg_i(k)-seg_j(k))>r
> count1=count1+1;
> end
> end
>
> if count1==0
> count2=count2+1;
> end
> count1=0;
> end
>
> Cim(i)=count2/(N-m+1);
> count2=0;
> end
> CIM=cat(2,CIM,sum(log(Cim))/(N-m+1));
> end
>
> AppEnt=CIM(1)-CIM(2);
>
> Could you help me see if there is something wrong with it? Also, if I put random noise in to the code, it doesn't give out "a big number" (what ever that means).
>
> Also, can I use this as a cross approximate entropy if I change the seg_j to be from another signal?
>
> Thank you so much!
>
> -Aino
