Show that a polynomial $f$ satisfying $f(\mathbb{Q})\subseteq\mathbb{Q}$ and $f(\mathbb{R}\setminus\mathbb{Q})\subseteq\mathbb{R}\setminus\mathbb{Q}$ must have the form $f(x)=ax+b$ for some $a,b\in\mathbb{Q}$.
First, note that $f$ has rational coefficients. For instance, you can prove this by Lagrange interpolation: if $\deg f=n$, pick $n+1$ rational points $x_0,\dots,x_n$; then $f$ is the unique polynomial of degree at most $n$ taking the values $f(x_0),\dots,f(x_n)$ at those points, and since all the $x_i$ and $f(x_i)$ are rational, the interpolation formula exhibits that polynomial with rational coefficients.
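Concretely, writing $x_0,\dots,x_n$ for the chosen rational points, the interpolation formula is
$$f(x)=\sum_{i=0}^{n} f(x_i)\prod_{\substack{0\le j\le n \\ j\ne i}}\frac{x-x_j}{x_i-x_j},$$
and expanding the right-hand side gives a polynomial with rational coefficients because every $x_i$ and $f(x_i)$ lies in $\mathbb{Q}$; the two sides agree because both are polynomials of degree at most $n$ taking the same values at the $n+1$ points $x_0,\dots,x_n$.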
Multiplying $f$ by a common denominator of its coefficients, we may assume $f$ has integer coefficients. So we just have to show that if $f$ is a polynomial with integer coefficients of degree $>1$, then there is some irrational number that $f$ sends to a rational number. That is, we want to show there exists a rational number $a$ such that $f(x)-a$ has an irrational root.
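For instance, if $f(x)=\tfrac12x^2+\tfrac13x$ (just an illustrative choice), multiplying by the common denominator $6$ gives $3x^2+2x$. This replacement affects neither hypothesis, since a nonzero rational multiple of a rational (resp. irrational) number is again rational (resp. irrational).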
Adding a constant and possibly multiplying by $-1$, we may assume the leading coefficient of $f$ is positive and the constant term is $0$. We can then guarantee that $f(x)-a$ has an irrational root by taking $a$ to be a sufficiently large prime number. Indeed, since the leading coefficient of $f$ is positive and $f(0)-a=-a<0$, the polynomial $f(x)-a$ has a real root for every $a>0$. On the other hand, by the rational root theorem, any rational root has the form $\pm b/c$, where $b$ is a divisor of $a$ (so $b=1$ or $b=a$, since $a$ is prime) and $c$ is a positive divisor of the leading coefficient of $f$. There are only finitely many values $f(\pm 1/c)$, so once $a$ exceeds all of them there are no roots of the form $\pm 1/c$. And for each fixed $c$, when $a$ is very large, $f(\pm a/c)-a$ is dominated by its leading term, which grows like $a^{\deg f}$ and hence faster than $a$ (here we use $\deg f>1$), so it cannot be $0$ either. So we can pick $a$ large enough that $f(x)-a$ has no rational roots at all; its real root must then be irrational.
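For a concrete instance of this argument, take $f(x)=2x^3-x$ (an arbitrary polynomial fitting the normalization above) and let $a$ be a large prime. The rational root theorem limits the possible rational roots of $f(x)-a$ to $\pm1$, $\pm\tfrac12$, $\pm a$, $\pm\tfrac a2$. The values $f(\pm1)=\pm1$ and $f(\pm\tfrac12)=\mp\tfrac14$ are fixed, so they miss $a$ once $a>1$, while
$$f(\pm a)-a=\pm2a^3\mp a-a \qquad\text{and}\qquad f\!\left(\pm\frac a2\right)-a=\pm\frac{a^3}{4}\mp\frac a2-a$$
are dominated by the cubic term and hence nonzero for large $a$. Since a cubic with positive leading coefficient always has a real root, $f(x)-a$ has a root, and that root must be irrational.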
As in Eric Wofsey's answer, we prove that $f$ has rational coefficients. Multiplying by a constant, we may then assume that $f$ has integer coefficients. Let its leading term be $ax^p$. Replacing $f(x)$ with $a^{p-1}f(x/a)$, we may assume that $f(x)$ is monic with integer coefficients.
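Explicitly, if $f(x)=ax^p+c_{p-1}x^{p-1}+\cdots+c_1x+c_0$ with integer coefficients, then
$$a^{p-1}f\!\left(\frac{x}{a}\right)=x^p+c_{p-1}x^{p-1}+c_{p-2}\,a\,x^{p-2}+\cdots+c_1a^{p-2}x+c_0a^{p-1},$$
which is monic with integer coefficients. Moreover, the hypotheses are preserved: $x\mapsto x/a$ and multiplication by $a^{p-1}$ send rationals to rationals and irrationals to irrationals.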
If $f$ has degree $>1$, then $f'(x)>1$ for all large enough $x$, say for $x\ge N$, where $N$ is an integer. Then $f(N)$ and $f(N+1)$ are integers, and $f(N+1)-f(N)>1$ by the mean value theorem. By the intermediate value theorem, $f(x)=f(N)+1$ for some $x\in(N,N+1)$. Since $f(x)$ is rational, the hypothesis forces $x$ to be rational, and then the rational root theorem applied to the monic integer polynomial $f(t)-f(N)-1$ forces $x$ to be an integer. But there is no integer in the open interval $(N,N+1)$, a contradiction.
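To see the mechanism on the simplest example, take $f(x)=x^2$. Then $f'(x)=2x>1$ for $x\ge1$, so $N=1$ works: $f(1)=1$, $f(2)=4$, and the intermediate value theorem gives some $x\in(1,2)$ with $f(x)=f(1)+1=2$, namely $x=\sqrt2$, an irrational number sent to the rational number $2$. This is exactly the kind of point the proof shows must exist whenever $\deg f>1$.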