Asked by Leon
on 17 Nov 2012

Pick two points on a surface, (x1,y1) and (x2,y2). From its starting position the first one moves with velocity (vector) v = (v1,v2) and the other with velocity (vector) u = (u1,u2).

If the points are approaching each other, determine when and where they are at the moment they are closest. The program must also report if they are moving apart. Plot a graph of the distance between the two points over time.

The starting point of the first one is (0.4, 603) m and the starting point of the second one is (23, -908) m. The velocity of the first is vector v = (-0.1, -0.98) m/s and of the second vector u = (-0.2, 1.1) m/s.

PLEASE HELP ME WITH THIS. THANK YOU


Answer by Eoin
on 20 Nov 2012

It depends on how fancy you need to be...

you could do something like

```matlab
dt = 1;
R1 = [X1 Y1];  V = [V1 V2];
R2 = [X2 Y2];  U = [U1 U2];
oldD = Inf;                      % Inf so the first computed distance is always "closer"
while true
    R1 = R1 + V*dt;
    R2 = R2 + U*dt;
    D = sqrt(sum((R1-R2).^2));   % Euclidean distance: note the parentheses around (R1-R2)
    if D > oldD
        break;                   % stop when the distance between the points starts to increase
    end
    oldD = D;
end
```

More fancy would be to use an ordinary differential equation solver like ode45.
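Since both points move in straight lines at constant velocity, you can also skip the loop and solve for the time of closest approach in closed form. A sketch, using ΔR for the relative position at t = 0 and ΔV for the relative velocity:

$$d(t)^2 = \lVert \Delta R + t\,\Delta V \rVert^2, \qquad \Delta R = R_1 - R_2,\quad \Delta V = V - U,$$

which is a quadratic in $t$ minimized at

$$t^\ast = -\frac{\Delta R \cdot \Delta V}{\lVert \Delta V \rVert^2}.$$

The points are approaching at $t = 0$ exactly when $\Delta R \cdot \Delta V < 0$ (so $t^\ast > 0$); otherwise they are moving apart and the minimum distance is the initial one. The positions at closest approach are then just $R_1 + t^\ast V$ and $R_2 + t^\ast U$.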

Plotting is left for you ;)


## 2 Comments

## John Petersen

Do you need help with the equations or the code?

## Jan Simon

Is the surface flat? If so, the problem is to find the minimum distance between two lines.