Robust DuplicateFreeQ for numerical data
There's nothing wrong with this, but for vectors you could check whether any two consecutive sorted values agree to within a relative tolerance, something like:
dupFreeQ[d_, tol_] := With[{s = Sort @ N @ d},
    FreeQ[0] @ Chop[Rest[s]/Most[s] - 1, tol]
]
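As a quick sanity check on a small list with an exact duplicate (my own illustration, not part of the timings below):

dupFreeQ[{1., 2., 2., 3.}, 10^-12]
(* False *)

dupFreeQ[{1., 2., 2.1, 3.}, 10^-12]
(* True *)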
A timing comparison against numericDuplicateFreeQ on a larger dataset:
data = RandomReal[{1, 1 + 10^-3}, 10^6];
dupFreeQ[data, 10^-14] //AbsoluteTiming
numericDuplicateFreeQ[data] //AbsoluteTiming
{0.246441, False}
{0.525652, False}
Adjusting the tolerance (with 10^6 points in an interval of width 10^-3, the smallest gaps are roughly of order 10^-15, so a tolerance of 10^-14 treats them as duplicates while a tighter one does not):
dupFreeQ[data, 5 10^-16] //AbsoluteTiming
{0.274373, True}
Another idea is to use Nearest:
Nearest[N @ data -> "Index", data] //OrderedQ //AbsoluteTiming
{0.278319, True}
vs:
data = RandomReal[{1, 1 + 10^-5}, 10^6];
Nearest[N @ data -> "Index", data] //OrderedQ //AbsoluteTiming
{0.519298, False}
This will be slow if there are lots of duplicates, though, and there is no tolerance control. On the other hand, I think this kind of approach will be more robust for higher-rank arrays, as sketched below.
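For instance, here is a minimal sketch of the same idea applied to a rank-2 array (a list of 2D points); the name duplicateFreePointsQ is my own, not from the post:

(* duplicate rows yield tied or out-of-order nearest indices, so OrderedQ fails *)
duplicateFreePointsQ[pts_] := OrderedQ[Nearest[N @ pts -> "Index", N @ pts]]

duplicateFreePointsQ[{{0., 0.}, {1., 0.}, {0., 1.}}]
(* True *)

duplicateFreePointsQ[{{0., 0.}, {1., 0.}, {0., 0.}}]
(* False *)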
Is anything wrong with OrderedQ[Sort@N@data, Less]?
SeedRandom[0]
data = RandomReal[1, 10^6];
Internal`$EqualTolerance = 2;
OrderedQ[Sort@N@data, Less] // RepeatedTiming
Internal`$EqualTolerance = 7;
OrderedQ[Sort@N@data, Less] // RepeatedTiming
{0.3581, True}
{0.1785, False}
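If you go this route, it may be safer to localize the tolerance change with Block instead of setting it globally; a sketch, where the name orderedFreeQ and the default of 7 digits are my own choices, not from the post:

(* tol is in decimal digits, the convention Internal`$EqualTolerance uses;
   the global setting is restored when Block exits *)
orderedFreeQ[d_, tol_: 7] :=
  Block[{Internal`$EqualTolerance = tol},
    OrderedQ[Sort @ N @ d, Less]
  ]

orderedFreeQ[data, 2]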