Why can't we define more elementary functions?

Elementary functions are finite sums, differences, products, quotients, compositions, and $n$th roots of constants, polynomials, exponentials, logarithms, trig functions, and all of their inverse functions.

The reason they are defined this way is that someone, somewhere, thought these functions were useful, and other people agreed. Why, for example, don't we redefine the integers to include $1/2$? Is this any different from your question about $\mathrm{lax}$ (or rather $\operatorname{erf}(x)$)?

Convention is just that, and nothing more.


I would approach the question this way. We can think of our "library" of functions as being built up recursively: start with a few basic functions (polynomials, exponentials, logarithms, trig functions, etc.), and start composing, combining, integrating, etc. At each stage you have a collection of functions that have been defined "so far". What Liouville's theorem says is that:

**At any stage, there will be functions whose integrals consist of functions that have not yet been defined.**

So, for example, you can (if you want) add the function $\operatorname{lax}(x)$ to your library and consider it "elementary". But then, as soon as you start to think about the integral of $\operatorname{lax}(x)$, you discover that it can't be expressed using the original elementary functions. So give that integral a name too, and call it elementary; as soon as you do, you run into yet another integral that escapes the enlarged library. And so on.
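This library-growing process can be watched in a computer algebra system. Here is a minimal sketch in Python using SymPy (my choice of tool, not something from the answer), with $\operatorname{erf}$ standing in for the OP's function:

```python
# Sketch (assumes SymPy is installed): growing the function library.
# Integrating a purely elementary integrand can force a brand-new
# function (erf) into the library; integrating that new function
# then drags the new function along with it.
from sympy import symbols, exp, erf, integrate

x = symbols('x')

# exp(-x**2) is elementary, but its antiderivative is not:
# SymPy must reach for erf.
print(integrate(exp(-x**2), x))   # sqrt(pi)*erf(x)/2

# Promote erf to "elementary" and integrate it: the result still
# involves erf itself, not just the original elementary functions.
print(integrate(erf(x), x))       # x*erf(x) + exp(-x**2)/sqrt(pi)
```

So $\int \operatorname{erf}(x)\,dx$ can be written down, but only by using the newly admitted function again; it is not expressible in the original elementary library.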

Edited to add: A couple of commenters have asked for a reference to the bold-faced paraphrase of Liouville's Theorem above. I should clarify that I don't have a reference for it, and I'm not even sure such a theorem exists (or has been proven). The bold-faced paraphrase is my informal, intuitive interpretation of the meaning of Liouville's Theorem; it seemed to me that was what the OP was looking for. Note that my statement was deliberately vague (it includes "etc." in two different places). I suspect there is probably some more precise refinement of this paraphrase that is true, but I don't know what that refinement would be.


The motivation here is similar to that in elementary Galois theory, where you might study whether you can write the roots of a polynomial using the standard arithmetic operations along with the $n$th root function. You might wonder why anyone cares about solving polynomial equations using these restricted functions when you could just use any of a number of numerical methods, or define new functions.

But if you do investigate the restricted problem, a large amount of interesting mathematics arises. Solutions of polynomial equations by radicals correspond to solvable groups, which are of interest in their own right. The existence of this connection now justifies the study of the original restricted problem, and it gives insight into solving wider classes of problems (not to mention that historically this problem helped kickstart the whole field of abstract algebra).
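To make the correspondence concrete, here is a standard textbook example (my addition, not from the answer above):

```latex
% The quintic
%   x^5 - x - 1 = 0
% is irreducible over Q and has Galois group S_5, which is not solvable.
% By Galois' theorem its roots therefore cannot be written using
% arithmetic operations and nth roots -- even though a numerical root
% (x ~ 1.1673) is easy to compute.
\[
  x^5 - x - 1 = 0,
  \qquad
  \operatorname{Gal} \cong S_5 \quad (\text{not solvable}).
\]
```

This is exactly parallel to the integration story: the obstruction is not that we lack numerical methods, but that the restricted toolbox provably cannot express the answer.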

If you're writing software to symbolically integrate functions, say, then it'd be risky to ignore Liouville's theorem and the Risch algorithm. They'll give you insight even if you're solving a more general problem.