What did Alan Turing mean when he said he didn't fully understand dy/dx?

I'm guessing too, but my guess is that it has something to do with the fact that, beyond his initial introduction to calculus, Turing (in common with many of us) thinks of a function as the entity of which a derivative is taken, either as a whole or at a particular point. In Leibniz's notation, $y$ isn't explicitly a function. It's something that has previously been related to $x$, but it actually is a variable, or an axis of a graph, or the output of the function, not the function or the relation per se.

Defining $y$ as being related to $x$ by $y = x^2 + 3x$, and then writing $\frac{\mathrm{d}y}{\mathrm{d}x}$ for "the derivative of $y$ with respect to $x$", might quite reasonably seem unintuitive and confusing to Turing once he is in the habit of thinking about functions as complete entities. That's not to say he can't work out what the notation refers to (of course he can); he's remarking that he finds it difficult to grasp properly.

I don't know what notation Turing preferred, but Lagrange's notation is to define a function $f$ by $f(x) = x^2 + 3x$ and then write $f'$ for the derivative of $f$. The differentiation is then implicitly with respect to $f$'s single argument. There is no $y$ that we need to understand, nor any urge to explain what $\mathrm{d}y$ might be in terms of a rigorous theory of infinitesimals. The mystery is gone. But that notation handles the partial derivatives of multivariate functions awkwardly, so you pay your money and take your choice.
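For a concrete contrast using the running example: in Lagrange's notation one writes $$f(x) = x^2 + 3x, \qquad f'(x) = 2x + 3,$$ while in Leibniz's notation one writes $y = x^2 + 3x$ and then $$\frac{\mathrm{d}y}{\mathrm{d}x} = 2x + 3.$$ The computation is identical; the difference is that the first form presents differentiation unambiguously as an operation on the function $f$, while the second presents it as an operation on $y$, "something previously related to $x$".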


I'll try my hand at giving one possible rationale behind Turing's confusion over the notation $\tfrac{\mathrm{d}y}{\mathrm{d}x}$. The short answer is that he appears to take issue with the notation on the grounds that differentiation is a mapping between two function spaces, while $y$ looks like a variable.

To answer the first part of your question regarding depth and his meaning, I base my answer on the discussion in the linked webpage. Given the dating of the notes and the discussion in the pictures, I doubt that his objection concerns the actual interpretation in terms of differentials $\mathrm{d}x, \mathrm{d}y$; it is instead more pedantic in nature. Earlier in the discussion, he talks about indeterminates and the difference between an indeterminate and a variable. Later, he writes "What is the way out? The notation $\tfrac{\mathrm{d}}{\mathrm{d}x} f(x, y)_{x=y,y=x}$ hardly seems to help in this difficult case". From this I gather that his objection is primarily to the use of $y$ as the thing being differentiated. He states that $y = x^2 + 3x$ as if to say that you could alternatively just rearrange the equation in terms of $x$; however, in taking the derivative you get another function: a function $f$ becomes another function $g$ under the differentiation operator $D\colon f \mapsto g$. Thus $y$ can't be a variable; it must be a function if differentiation is to be well-defined. Think of the usual usage as something akin to abuse of notation.
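To make that point concrete (my illustration, not anything from Turing's notes), here is a minimal Python sketch in which differentiation is an operator that consumes a function and returns a function; the names `D`, `f`, `g` and the finite-difference step `h` are all assumptions of the sketch:

```python
def D(f, h=1e-6):
    """A numerical stand-in for the differentiation operator D: f -> f'.

    It maps a function to a function (here via a central difference
    quotient), which is the point: D acts on f, not on a variable y.
    """
    def g(x):
        return (f(x + h) - f(x - h)) / (2 * h)
    return g

f = lambda x: x**2 + 3*x  # the function itself, not the variable y
g = D(f)                  # the result is again a function ...
print(g(1.0))             # ... which we can then evaluate: approximately 5.0
```

Nothing here is specific to Python; the point is only that $D$ has type (function $\to$ function), which the bare symbol $y$ obscures.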

Regarding intuition and subtlety, Leibniz's notation does indeed provide both, although in truth $$ \dfrac{\mathrm{d}}{\mathrm{d}x} f(x) = g(x)$$ is clearer than using $y$. From a more intuitive point of view, you can think of the derivative as a variation in $f$ with respect to $x$, and it is this notion, more than the one involving secants and tangents, that matters. The subtlety of the notation is readily apparent in the chain rule, where the symbols appear to cancel even though you're not really cross-multiplying (for instance, the apparent cancellation fails for second-order derivatives, as spelled out below). The subtlety becomes more pronounced still when you abstract to vector calculus, and further to differentiable manifolds.
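To spell out that parenthetical caveat: for $z = z(y)$ and $y = y(x)$, the first-order chain rule $$\frac{\mathrm{d}z}{\mathrm{d}x} = \frac{\mathrm{d}z}{\mathrm{d}y}\,\frac{\mathrm{d}y}{\mathrm{d}x}$$ looks like a cancellation of $\mathrm{d}y$, but at second order the correct rule is $$\frac{\mathrm{d}^2 z}{\mathrm{d}x^2} = \frac{\mathrm{d}^2 z}{\mathrm{d}y^2}\left(\frac{\mathrm{d}y}{\mathrm{d}x}\right)^2 + \frac{\mathrm{d}z}{\mathrm{d}y}\,\frac{\mathrm{d}^2 y}{\mathrm{d}x^2},$$ which no naive cross-multiplication of the symbols would produce.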


I'm no expert on Alan Turing, and what follows will not directly answer your question, but it might give some context. From the link provided, I also found the page below, which might shed some more light on things:

Another page from Alan Turing's notebook

It says the following (my apologies for any errors in transcribing; I'm grateful for any corrections):

A formal expression $$ f(x) = \sum_{i=1}^n \alpha_i x^i $$ involving the `indeterminate' (or variable) $x$, whose coefficients $\alpha_i$ are numbers in a field $K$, is called a ($K$-)polynomial of formal degree $n$.

The idea of an `indeterminate' is distinctly subtle, I would almost say too subtle. It is not (at any rate as van der Waerden [link added by me] sees it) the same as variable. Polynomials in an indeterminate $x$, $f_1(x)$ and $f_2(x)$, would not be considered identical if $f_1(x)=f_2(x)$ [for] all $x$ in $K$, but the coefficients differed. They are in effect the array of coefficients, with rules for multiplication and addition suggested by their form

I am inclined to the view that this is too subtle and makes an inconvenient definition. I prefer the indeterminate $x$ [?] be just the variable.
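A standard example of the distinction the note draws (my illustration, not from the notebook): over the field $K = \mathbb{F}_2$, the polynomials $f_1(x) = x^2 + x$ and $f_2(x) = 0$ satisfy $f_1(a) = f_2(a)$ for both elements $a \in K$, since $0^2 + 0 = 0$ and $1^2 + 1 = 0$ in $\mathbb{F}_2$; yet their coefficient arrays differ, so as polynomials in an indeterminate they are distinct, while as functions on $K$ they are identical.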

I think one thing to keep in mind here is that, at the time, anything relating to `computability' was not as clear as it is today. After all, Turing (and Church & Co.) were just discovering the essential notions.

In particular, questions of intensionality vs. extensionality could have been an issue. It might be that Turing was pondering the difference between functions (and operations on functions) from a purely mathematical point of view (i.e., functions as extensional objects) vs. a computational point of view (i.e., functions as some form of formal description of a calculation process, which a priori cannot be looked at in an extensional way).
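As a small illustration of that contrast in modern terms (the function names are mine, purely for illustration): the two definitions below are extensionally equal, i.e. they agree at every argument, yet as formal descriptions of a calculation process they are distinct objects:

```python
def f_factored(x):
    return x * (x + 3)    # one description of the calculation

def f_expanded(x):
    return x**2 + 3*x     # a different description, same values

# Extensionally equal: the two agree at every tested argument ...
assert all(f_factored(n) == f_expanded(n) for n in range(100))
# ... but intensionally they are distinct programs, and in general
# equality of programs as functions is not decidable.
```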

All of this can still be seen in the context of the foundational crisis of mathematics (or at least strong echoes thereof). Related to this are, of course, questions of rigour, formalism, and denotation. This, in turn, is where your quote comes in. As others have outlined, Turing might have asked not only what $\frac{\mathrm{d}y}{\mathrm{d}x}$ is (from a formal point of view) but also what it denotes, and (to his frustration) found that the answer was not as clear as he wanted it to be.