How does interval comparison work?

As specified in the Python documentation:

Comparisons can be chained arbitrarily, e.g., x < y <= z is equivalent to x < y and y <= z, except that y is evaluated only once (but in both cases z is not evaluated at all when x < y is found to be false).

Formally, if a, b, c, ..., y, z are expressions and op1, op2, ..., opN are comparison operators, then a op1 b op2 c ... y opN z is equivalent to a op1 b and b op2 c and ... y opN z, except that each expression is evaluated at most once.
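The "evaluated at most once" part is easy to see with a small sketch; the value() helper below is purely illustrative (it is not from the question) and only exists to print when it is called:

    def value():
        print("value() called")
        return 5

    # The middle expression takes part in two comparisons,
    # but it is evaluated only once.
    print(1 < value() <= 10)   # prints "value() called" once, then True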


Unlike most languages, Python supports chained comparison operators and evaluates them the way they would be evaluated in ordinary mathematics.

This line:

return min <= test <= max

is evaluated by Python like this:

return (min <= test) and (test <= max)
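Put together, a minimal runnable version of the idea looks like this; lo and hi are used instead of the original min and max so the built-ins are not shadowed, and the function name in_range is just illustrative:

    def in_range(lo, test, hi):
        # Chained comparison: True only when lo <= test AND test <= hi.
        return lo <= test <= hi

    print(in_range(1, 5, 10))   # True:  5 lies inside [1, 10]
    print(in_range(1, 50, 10))  # False: 50 lies outside [1, 10]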

Most other languages, however, would evaluate it like this:

return (min <= test) <= max
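You can reproduce that left-to-right grouping in Python by adding explicit parentheses. The boolean result of the first comparison (which behaves like 1 or 0) is then compared against the upper bound, which is rarely what you want:

    test, low, high = 50, 1, 10

    print(low <= test <= high)    # False: 50 is not in [1, 10]
    print((low <= test) <= high)  # True:  (low <= test) is True, i.e. 1, and 1 <= 10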