Given two points **x** and **y** placed at opposite corners of an axis-aligned rectangle, find the minimal Euclidean distance between a third point **z** and the set of points within this rectangle.

For example, the two points

x = [-1,-1]; y = [1,1];

define a square centered at the origin. The distance between the point

z = [4,5];

and this square is

d = 5;

(the closest point in the square is at [1,1])

The distance between the point z = [0,2] and this same square is d = 1 (closest point at [0,1])

The distance between the point z = [0,0] and this same square is d = 0 (inside the square)

Notes:

- you can always assume that **x** < **y** (element-wise)
- the function should work for points x, y, z in an arbitrary n-dimensional space (with n > 1)
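One way to solve this is to clamp **z** to the box component-wise and measure the distance to the clamped point, which is the nearest point of the box. A minimal MATLAB sketch (the function name `boxDistance` is my own choice; it relies on the stated assumption that x < y element-wise):

```matlab
function d = boxDistance(x, y, z)
% Distance from point z to the axis-aligned box with opposite corners x and y.
% Clamping z to [x, y] in each coordinate yields the closest point of the box;
% the answer is the Euclidean distance from z to that clamped point.
p = min(max(z, x), y);   % closest point in the box to z (element-wise clamp)
d = norm(z - p);         % 0 whenever z lies inside the box
end
```

For the first example above, `boxDistance([-1 -1], [1 1], [4 5])` clamps z to `[1 1]` and returns `norm([3 4]) = 5`, matching `d = 5`. The same two lines work unchanged for any number of dimensions.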

Sam
on 8 Oct 2014

For the n-dimensional case it would be better to say that x and y lie at opposite vertices of an n-dimensional hypercuboid whose edges are parallel to the coordinate axes.

1 Comment

Alfonso Nieto-Castanon
on 31 Jan 2012

nice trick :)

1 player likes this problem