About the joint probability divided by the product of the probabilities

Since you say that $X$ and $Y$ are events, let us rename them $A$ and $B$ to avoid confusion with random variables.

Then, at least in the environmental, medical and life-science literature, $P(A\cap B)/(P(A)P(B))$ is called the observed-to-expected ratio (abbreviated o/e). The idea is that the numerator is the actual probability of $A\cap B$, while the denominator is what that probability would be if $A$ and $B$ were independent.

Obviously, the o/e ratio is $1$ if $A$ and $B$ are independent; it is greater than $1$ if $A$ is favored by $B$ or, equivalently, if $B$ is favored by $A$; and it is less than $1$ if the opposite holds.
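As a quick illustration, here is a minimal Python sketch of the three regimes; the probabilities are made up purely for the example:

```python
def oe_ratio(p_ab, p_a, p_b):
    """Observed-to-expected ratio P(A ∩ B) / (P(A) P(B))."""
    return p_ab / (p_a * p_b)

# Hypothetical probabilities with P(A) = 0.3 and P(B) = 0.4:
print(oe_ratio(0.12, 0.3, 0.4))  # 1.0   -> independent, since 0.12 = 0.3 * 0.4
print(oe_ratio(0.20, 0.3, 0.4))  # ~1.67 -> A is favored by B
print(oe_ratio(0.06, 0.3, 0.4))  # 0.5   -> A is disfavored by B
```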

In the statistical analysis of genomic sequences, the CpG o/e ratio is especially important: it is the frequency of the dinucleotide CG divided by the product of the frequencies of the letters C (cytosine) and G (guanine); see here for an example. The rough idea is that in non-functional portions of the genome, CpG o/e is much less than $1$, due to some well-known biological and chemical processes (a methylation-deamination of the cytosine when it sits right next to a guanine, if you want to know). By contrast, in portions of the genome called CpG islands, CpG o/e is only slightly smaller than $1$, or even greater than $1$, a fact which indicates a repression of these processes and, as a consequence, may signal some functional regions.
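For concreteness, here is a rough Python sketch of the plain frequency-based computation described above, run on a made-up toy string; real analyses work on long genomic windows and sometimes use slightly different normalizations:

```python
def cpg_oe(seq):
    """Frequency of the dinucleotide CG divided by the product of the
    frequencies of the letters C and G."""
    seq = seq.upper()
    n = len(seq)
    freq_c = seq.count("C") / n
    freq_g = seq.count("G") / n
    # Frequency of CG among the n - 1 overlapping dinucleotides.
    freq_cg = sum(1 for i in range(n - 1) if seq[i:i + 2] == "CG") / (n - 1)
    return freq_cg / (freq_c * freq_g)

print(cpg_oe("ATCGCGTACGATCCGA"))  # ≈ 3.41 on this toy string
```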


You could notice (assuming you know about conditional probabilities) that $$h(X,Y) = \frac{P(XY)}{P(X)\,P(Y)} = \frac{P(X\mid Y)}{P(X)} = \frac{P(Y\mid X)}{P(Y)}$$

Hence, for example, $h(X,Y) > 1 \Leftrightarrow P(X\mid Y) > P(X)$, which, informally, says that the occurrence of the event $Y$ increases the probability of the event $X$ occurring (and vice versa). And that is all. Remember, though, that here $X,Y$ are events, not variables; i.e., it does not make sense to say that, e.g., $h(X,Y) > 1$ for some variables $X,Y$ globally (so that we could say that the variables are "positively dependent" or something like that). For general measures of dependence between random variables (or of correlation, which is a related though weaker property), see here.
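A quick numerical sanity check of the identity and the equivalence above, with made-up values for $P(X)$, $P(Y)$ and $P(XY)$:

```python
p_x, p_y, p_xy = 0.3, 0.4, 0.2   # hypothetical P(X), P(Y), P(XY)

h = p_xy / (p_x * p_y)           # P(XY) / (P(X) P(Y))
p_x_given_y = p_xy / p_y         # P(X | Y)
p_y_given_x = p_xy / p_x         # P(Y | X)

# All three expressions agree (≈ 1.6667 here), and since h > 1
# we indeed have P(X | Y) > P(X).
print(h, p_x_given_y / p_x, p_y_given_x / p_y)
```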

Added: If we regard the events as two jointly distributed Bernoulli variables (identifying the event $X$ with the event that the variable equals $1$, so that $P(X=1)=p_X$), we can note that the covariance is $\operatorname{Cov}_{XY} = E(XY) - E(X)E(Y) = p_{XY} - p_X\,p_Y$, hence $p_X\,p_Y = p_{XY} - \operatorname{Cov}_{XY}$ and $$h(X,Y) = \frac{p_{XY}}{p_{XY}-\operatorname{Cov}_{XY}} = \frac{1}{1-\operatorname{Cov}_{XY}/p_{XY}}$$
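Again with the same hypothetical numbers, one can check the covariance identity directly:

```python
p_x, p_y, p_xy = 0.3, 0.4, 0.2   # hypothetical P(X), P(Y), P(XY)

h = p_xy / (p_x * p_y)
cov = p_xy - p_x * p_y           # Cov(X, Y) = E(XY) - E(X)E(Y) for Bernoullis

print(h, 1 / (1 - cov / p_xy))   # both ≈ 1.6667
```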