Fastest way to find minimum distance of one point to points on a curve

Rather than an arbitrary distance, you could perhaps iterate until "out of range".

In your example, suppose you start with the point on the upper curve at the top-right of your line. Dropping vertically downwards from there, you get a distance of (by my eye) about 200um.

Now you can move right from here testing points until the horizontal distance is 200um. Beyond that, it's impossible to get a distance less than 200um.

Moving left, the distance goes down until you find the 150um minimum, then starts rising again. Once you're 150um to the left of your upper point, again, it's impossible to beat the minimum you've found.

If you'd gone left first, you wouldn't have had to go so far right, so as an optimization either follow the direction in which the distance falls, or else work out from the middle in both directions at once.
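A minimal C sketch of that scan, assuming the lower curve is an array of points sorted by x and that `start` indexes the point roughly below the query point (all names here are illustrative, not from the original post):

```c
#include <math.h>

typedef struct { double x, y; } Point;

/* Squared distance between two points; avoids sqrt in the inner loops. */
static double dist2(Point a, Point b)
{
    double dx = a.x - b.x, dy = a.y - b.y;
    return dx * dx + dy * dy;
}

/* Closest distance from p to the sorted array `lower`, starting at index
 * `start` (e.g. the point directly below p) and widening outwards until
 * the horizontal distance alone rules out any improvement. */
static double min_dist(Point p, const Point *lower, int n, int start)
{
    double best = dist2(p, lower[start]);
    for (int i = start + 1; i < n; i++) {      /* scan right */
        double dx = lower[i].x - p.x;
        if (dx * dx >= best) break;            /* out of range: stop */
        double d = dist2(p, lower[i]);
        if (d < best) best = d;
    }
    for (int i = start - 1; i >= 0; i--) {     /* scan left */
        double dx = p.x - lower[i].x;
        if (dx * dx >= best) break;
        double d = dist2(p, lower[i]);
        if (d < best) best = d;
    }
    return sqrt(best);
}
```

This scans one side to completion and then the other; the best value found on the first side tightens the cutoff for the second, so interleaving the two scans as suggested above would prune only marginally more.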

I don't know how many um 50 units is, so this might be slower or faster than what you have. It does avoid the risk of missing a lower value, though.

Since you're doing lots of tests against the same set of points on the lower curve, you can probably improve on this by ignoring the fact that the points form a curve at all. Stick them all in a k-d tree or similar, and search that repeatedly. It's called a nearest neighbor search.
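For illustration, here is a compact 2-d tree in C (a sketch only; a ready-made library is less fiddly). It builds the tree in place by median splits and, when searching, prunes any subtree whose splitting plane is already further away than the best distance found:

```c
#include <stdlib.h>

typedef struct { double x[2]; } KdPoint;

static int cmp_x(const void *a, const void *b)
{
    double d = ((const KdPoint *)a)->x[0] - ((const KdPoint *)b)->x[0];
    return (d > 0) - (d < 0);
}
static int cmp_y(const void *a, const void *b)
{
    double d = ((const KdPoint *)a)->x[1] - ((const KdPoint *)b)->x[1];
    return (d > 0) - (d < 0);
}

/* Build in place: sort the slice on the current axis, the median becomes
 * the node, the two halves become subtrees.  Axes alternate per level. */
static void kd_build(KdPoint *pts, int n, int axis)
{
    if (n <= 1) return;
    qsort(pts, n, sizeof *pts, axis == 0 ? cmp_x : cmp_y);
    int mid = n / 2;
    kd_build(pts, mid, 1 - axis);
    kd_build(pts + mid + 1, n - mid - 1, 1 - axis);
}

/* Nearest-neighbour query: *best holds the best squared distance so far;
 * subtrees wholly beyond the splitting plane get skipped. */
static void kd_nearest(const KdPoint *pts, int n, int axis,
                       const KdPoint *q, double *best)
{
    if (n <= 0) return;
    int mid = n / 2;
    double dx = q->x[0] - pts[mid].x[0];
    double dy = q->x[1] - pts[mid].x[1];
    double d = dx * dx + dy * dy;
    if (d < *best) *best = d;
    double split = q->x[axis] - pts[mid].x[axis];
    if (split < 0) {                      /* query is left of the plane */
        kd_nearest(pts, mid, 1 - axis, q, best);
        if (split * split < *best)        /* right side may still win */
            kd_nearest(pts + mid + 1, n - mid - 1, 1 - axis, q, best);
    } else {
        kd_nearest(pts + mid + 1, n - mid - 1, 1 - axis, q, best);
        if (split * split < *best)
            kd_nearest(pts, mid, 1 - axis, q, best);
    }
}
```

Build once with `kd_build(pts, n, 0)`, then for each query point run `double best = DBL_MAX; kd_nearest(pts, n, 0, &q, &best);` (DBL_MAX is in float.h) and take `sqrt(best)`. Each query is then O(log n) on average instead of a scan along the whole curve.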


It may help to identify this problem as a nearest neighbour search problem. That link includes a good discussion of the various algorithms used for this. If you are OK with using C++ rather than straight C, ANN looks like a good library for this.

It also looks as though this question has been asked before.


We can label the top curve y=t(x) and the bottom curve y=b(x), and define the closest-function x_b=c(x_t), mapping each point on the top curve to the closest point on the bottom one. This function is monotone non-decreasing, because two shortest segments never cross each other.
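Stated a little more formally (a sketch of the usual exchange argument):

```latex
% If x_1 < x_2 but c(x_1) > c(x_2), the two closest segments cross; by the
% triangle inequality, swapping their lower endpoints then shortens at
% least one of them, contradicting that both were closest.  Hence:
x_1 < x_2 \implies c(x_1) \le c(x_2)
```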

If you know that the distance function d(x_t,x_b) has only one local minimum for every fixed x_t (this happens if the curve is "smooth enough"), then you can save time by "walking" the curve (a C sketch follows the list):

- start with x_t = 0, x_b = 0
- while x_t <= x_max:
  - find the closest x_b by local search (increment x_b while the distance is decreasing)
  - add {x_t, x_b} to the result set
  - increment x_t
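A minimal C sketch of this walk, treating both curves as arrays of samples sorted by x (the types and names are illustrative):

```c
#include <stddef.h>

typedef struct { double x, y; } Pt;

static double d2(Pt a, Pt b)
{
    double dx = a.x - b.x, dy = a.y - b.y;
    return dx * dx + dy * dy;
}

/* For each sample on the top curve, record the index of the closest bottom
 * sample.  Valid under the single-local-minimum assumption above. */
static void walk(const Pt *top, size_t n, const Pt *bot, size_t m,
                 size_t *closest)
{
    size_t xb = 0;
    for (size_t xt = 0; xt < n; xt++) {
        /* local search: advance x_b while the distance keeps decreasing */
        while (xb + 1 < m && d2(top[xt], bot[xb + 1]) < d2(top[xt], bot[xb]))
            xb++;
        closest[xt] = xb;
    }
}
```

Because x_b only ever moves right, the pass touches each bottom sample at most once: O(n + m) overall instead of O(n·m).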

If you expect the curves to be smooth enough, but you cannot assume that and you want an exact result:

Walk the curve in both directions. Where the results agree, they are correct. Where they disagree, run a complete search between the two results (the leftmost and the rightmost local minima found). Sample the "ambiguous block" in an order (binary division) that allows the most pruning due to the monotonicity.
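A sketch of that reconciliation in C, continuing the `walk()` sketch above; the binary-division ordering is omitted here, and disagreements are simply settled by brute force between the two candidates:

```c
/* Same walk as above, but right-to-left; x_b only ever moves left.
 * Assumes n > 0 and m > 0. */
static void walk_rev(const Pt *top, size_t n, const Pt *bot, size_t m,
                     size_t *closest)
{
    size_t xb = m - 1;
    for (size_t xt = n; xt-- > 0; ) {
        while (xb > 0 && d2(top[xt], bot[xb - 1]) < d2(top[xt], bot[xb]))
            xb--;
        closest[xt] = xb;
    }
}

/* Where the two walks agree, the answer is proven; where they disagree,
 * search the whole ambiguous block between the two candidates. */
static void reconcile(const Pt *top, size_t n, const Pt *bot,
                      const size_t *fwd, const size_t *bwd, size_t *out)
{
    for (size_t i = 0; i < n; i++) {
        if (fwd[i] == bwd[i]) { out[i] = fwd[i]; continue; }
        size_t lo = fwd[i] < bwd[i] ? fwd[i] : bwd[i];
        size_t hi = fwd[i] < bwd[i] ? bwd[i] : fwd[i];
        size_t best = lo;
        for (size_t j = lo + 1; j <= hi; j++)
            if (d2(top[i], bot[j]) < d2(top[i], bot[best])) best = j;
        out[i] = best;
    }
}
```

The middle-ground variant described next amounts to replacing the inner exhaustive loop with a single comparison of `d2(top[i], bot[fwd[i]])` against `d2(top[i], bot[bwd[i]])`.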

As a middle ground:

Walk the curve in both directions. If the results disagree, take the better of the two. If you can guarantee at most two local minima for each fixed x_t, this produces the optimal solution. There are still some pathological cases where the optimal solution is not found: those where the optimal local minimum is flanked by two other local minima that are both worse, so each walk settles on one of the flanking ones. I dare say it is uncommon to find a case where the solution is far from optimal (assuming a smooth y=b(x)).