Roel H,
Agreed on both counts. The code has been updated and re-posted. In some quick testing, the updates you recommended significantly improved speed for very large matrices; thank you.
I do have a few remarks, though. For the largeMat case, it is better to use bsxfun instead of repmat, as it is more efficient (faster) for large matrices, which is clearly the case here. It may also be worth postponing the "sqrt" call until a maximum is found. This won't change the outcome, since sqrt is monotonic, but it requires fewer computations.
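The two suggestions above can be sketched outside MATLAB as well. The Python sketch below (names are my own, not from the submission) builds the all-pairs squared-distance matrix the way a bsxfun broadcast would, without the explicit copies repmat makes, and takes a single sqrt only after the max-of-mins is found:

```python
import math

def hausdorff_matrix(A, B):
    # All-pairs SQUARED distances -- analogous to what a single
    # bsxfun(@minus, ...) broadcast produces, with no repmat copies.
    D2 = [[sum((a - b) ** 2 for a, b in zip(p, q)) for q in B] for p in A]
    # Directed distances on squared values; sqrt is deferred.
    dAB = max(min(row) for row in D2)                                   # A -> B
    dBA = max(min(D2[i][j] for i in range(len(A)))                      # B -> A
              for j in range(len(B)))
    # One sqrt at the end: sqrt is monotonic, so deferring it leaves
    # the argmax unchanged and saves m*n square-root evaluations.
    return math.sqrt(max(dAB, dBA))
```

For example, with A = [(0, 0), (1, 0)] and B = [(0, 1)] the result is sqrt(2), the distance from (1, 0) to its nearest (and only) point in B.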
It was brought to my attention by Roey Baror of Tel-Aviv University that creating/outputting a matrix of distances between all points could quickly exhaust the system's memory for large inputs, such as high-resolution images. The update provides a secondary algorithm that calculates the Hausdorff Distance without storing the large matrix in memory, and automatically detects when this secondary algorithm is necessary.
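The memory-saving idea can be illustrated with a short Python sketch (my own, not the submission's MATLAB code): each point's nearest neighbour in the other set is found in a streaming pass, so the m-by-n distance matrix is never materialized and only constant extra memory is used:

```python
import math

def hausdorff_streaming(A, B):
    """Hausdorff distance without storing the full |A|-by-|B| matrix.

    Each point is compared against the other set one point at a time,
    so peak memory stays O(1) beyond the inputs themselves.
    """
    def directed_sq(P, Q):
        worst = 0.0
        for p in P:
            # squared distance from p to its nearest neighbour in Q
            nearest = min(sum((a - b) ** 2 for a, b in zip(p, q)) for q in Q)
            worst = max(worst, nearest)
        return worst

    return math.sqrt(max(directed_sq(A, B), directed_sq(B, A)))
```

The trade-off is the classic time/space one: the matrix version can exploit vectorized arithmetic, while this version touches each pair once in plain loops but never allocates the m*n array.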
May 2010
Added the matrix of distances as an output option. Fixed a bug that caused an error when one of the sets was a single point. Removed excess calls to "size" and "length".
15 Jun 2010
Generalizes the code to allow N-dimensional point sets. This update is inspired by file 27905, which has a good implementation of HD beyond 2-D sets of points.
30 Apr 2012
The code now automatically switches to a secondary algorithm when there is insufficient memory to compute and store a matrix containing distances between all constituent points. It also allows the user to manually choose the desired algorithm.
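The automatic switch might be driven by a simple size check, sketched below in Python. The threshold, names, and byte count here are hypothetical illustrations, not the submission's actual logic (which would query MATLAB's available memory):

```python
def choose_algorithm(m, n, bytes_per_entry=8, budget_bytes=2**30):
    # Hypothetical heuristic: an m-by-n double-precision distance
    # matrix needs m*n*8 bytes; fall back to the streaming algorithm
    # when that estimate exceeds the assumed 1 GiB memory budget.
    if m * n * bytes_per_entry <= budget_bytes:
        return "matrix"      # fast vectorized path
    return "streaming"       # memory-light secondary algorithm
```

A manual override, as the update describes, would simply bypass this check and call the requested path directly.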
04 Oct 2012
Based on user comments, the algorithm for large data sets was updated for performance.