
Thread Subject: question on faster run-time with Parallel Computing toolbox

Subject: question on faster run-time with Parallel Computing toolbox

From: Aidy

Date: 17 Oct, 2011 22:01:30

Message: 1 of 7

Good day all,

I currently have an optimization algorithm implemented in MATLAB. The time it takes to arrive at a solution varies.

The run time varies because terminating the optimization depends on randomly selecting a set of data that is reliable enough to yield a solution. For example, my algorithm can end in as little as 1 iteration or take up to a maximum of 200 iterations (a cap that I set).
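In outline, the serial loop looks something like this (tryRandomSubset and isGoodEnough are placeholders standing in for my actual routines):

```matlab
maxIter = 200;                    % upper bound on restarts
for it = 1:maxIter
    sol = tryRandomSubset();      % draw a random data subset and solve
    if isGoodEnough(sol)          % may succeed on iteration 1...
        break;                    % ...or not until iteration 200
    end
end
```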

I have looked at MATLAB's Parallel Computing Toolbox, but I'm not sure whether it applies to my situation.

I would like to know whether it is possible to run my algorithm as parallel jobs, so that I can run the script multiple times and end the overall computation with the first job that terminates.

Is anything like this possible with the Parallel Computing Toolbox?

Thanks for any suggestions,
Aiden

Subject: question on faster run-time with Parallel Computing toolbox

From: Edric M Ellis

Date: 19 Oct, 2011 10:17:56

Message: 3 of 7

"Aidy " <aidenjobe@gmail.com> writes:

> I currently have an optimization algorithm implemented in MATLAB. The
> time it takes to arrive at a solution varies. The run time varies
> because terminating the optimization depends on randomly selecting a
> set of data that is reliable enough to yield a solution. For example,
> my algorithm can end in as little as 1 iteration or take up to a
> maximum of 200 iterations (a cap that I set).
>
> I have looked at MATLAB's Parallel Computing Toolbox, but I'm not sure
> whether it applies to my situation.
>
> I would like to know whether it is possible to run my algorithm as
> parallel jobs, so that I can run the script multiple times and end
> the overall computation with the first job that terminates.
>
> Is anything like this possible with the Parallel Computing Toolbox?

It is possible, but you'll probably need to use SPMD rather than
PARFOR. SPMD is a bit trickier to use. Here's a simple example: use
SPMD to generate random numbers in parallel, and stop when we find the
first one below some threshold. Assuming you already have a MATLABPOOL
open, you could proceed as follows:


thresh = 1e-4;
done = false; % we'll flip this when we're done
itCount = 0; % count number of iterations
spmd
    % SPMD block body executes in parallel on each worker
    while ~done
        myRand = rand(); % generate a single random number
        itCount = itCount + 1;
        myDone = myRand < thresh; % have I hit the end condition?
        done = gop( @or, myDone ); % has anyone hit the end condition?
        % gop( @or, val ) is effectively 'any' across multiple workers
        if done
            % find the overall minimum, send the value to lab 1
            globalMin = gop( @min, myRand, 1 );
        end
    end
end

% Extract the values from the 'Composite'
globalMin = globalMin{1}
itCount = itCount{1}

The GOP function is very useful for this sort of thing: it is a global
reduction operation, meaning it applies the supplied function to pairs
of values in turn to produce the overall value.
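For instance, assuming a pool of four workers, where each worker contributes its own labindex:

```matlab
spmd
    total  = gop( @plus, labindex );    % 1+2+3+4 = 10 on every worker
    anyBig = gop( @or, labindex > 3 );  % true everywhere: lab 4 satisfies it
end
```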

Cheers,

Edric.

Subject: question on faster run-time with Parallel Computing toolbox

From: Aidy

Date: 19 Oct, 2011 21:04:27

Message: 4 of 7

Thanks, Edric.

I have a question. Can I use SPMD with a typical "for" loop instead of "while"?

cheers,
aiden

Subject: question on faster run-time with Parallel Computing toolbox

From: Aidy

Date: 20 Oct, 2011 00:02:29

Message: 5 of 7

Another question, if I may, Edric:

If the for loop is usable with SPMD, am I right in assuming that "break" is also usable?

I ask because the break command is one of parfor's limitations.

cheers
aiden

Subject: question on faster run-time with Parallel Computing toolbox

From: Edric M Ellis

Date: 20 Oct, 2011 06:55:07

Message: 6 of 7

"Aidy " <aidenjobe@gmail.com> writes:

> if the for loop is usable with SPMD, I am assuming the "break" is also
> usable ? Am I right?

While you can, you still need to make sure all the workers 'break' at
the same time, i.e.

spmd
  for ii=1:100000
    x = someFcn();
    myDone = somePredicate(x);
    anyDone = gop(@or, myDone); % all workers have the same value
    if anyDone
      % ... so all workers 'break' at the same time.
      break;
    end
  end
end

Cheers,

Edric.

Subject: question on faster run-time with Parallel Computing toolbox

From: Aidy

Date: 20 Oct, 2011 07:40:31

Message: 7 of 7

Thanks, Edric.
