The Halting Problem and Church's Thesis
The invocation of Church's thesis is not a religious move, but rather a warning to the reader that the author is describing informally an effective procedure that could be translated into a construction of a Turing machine (if one enjoyed such a thing). This is completely standard in computability theory. (And other branches of mathematics have a similar level of rigour, as pointed out by Jason Rute in the comments.)
We could ask whether we have to worry about the informal level of proof or about Church's thesis itself. The answer is that Church's thesis has been tested billions of times in practice: every time anyone thinks of an algorithm and then actually codes it up, that is a confirmation that they did not violate Church's thesis and that their sense of what makes an algorithm did not lead them astray. In any case, for the paranoid there is always the formalization of the halting problem.
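To see what that formalization looks like in practice, here is a sketch (my own illustration, not part of the original answer) of the diagonal argument at the heart of the halting problem, written in ordinary code. The names `halts` and `diagonal` are hypothetical: we assume, for contradiction, that a total decider `halts(f, x)` exists.

```python
# Sketch of the diagonal argument behind the halting problem.
# Assume, for contradiction, a total decider `halts(f, x)` that
# returns True exactly when calling f(x) would eventually halt.
# (No such decider exists; this code shows why.)

def diagonal(f, halts):
    """Halt iff, according to the supposed decider, f(f) does NOT halt."""
    if halts(f, f):
        while True:   # f(f) allegedly halts, so loop forever
            pass
    return "halted"   # f(f) allegedly loops, so halt

# Running diagonal on itself, diagonal(diagonal) halts exactly when
# halts(diagonal, diagonal) says it does not -- a contradiction.
# So no such total decider `halts` can exist.
```

The point of writing it out is exactly the one made above: the informal diagonal argument translates routinely into code, which is the kind of confirmation of Church's thesis that happens every time an algorithm is implemented.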
Let me point out that there is really a family of Church-Turing thesis assertions.
On the one hand, for what is sometimes described as the weak Church-Turing thesis, one imagines an idealized human agent, not constrained by resources of time or memory (or supplies of paper and pencil), but carrying out the kind of idealized computation that Turing had described, using paper-and-pencil calculations according to certain kinds of formal rules. The claim is that any such algorithm that could in principle be carried out by such an idealized human agent can in fact be carried out by a suitable Turing machine program. According to this version of the thesis, therefore, the Turing-machine account of computability has captured the correct notion of computability-in-principle for an idealized human agent.
I believe that it is the weak Church-Turing thesis that most mathematicians have in mind when speaking of the Church-Turing thesis. A convincingly large piece of evidence for at least this weak form of the thesis is the mathematical fact that all the various notions of formal computability have been proved equivalent: Turing machines, modified and expanded Turing machines such as multi-tape machines and machines with expanded alphabets or instruction sets, register machines and their variants, flowchart machines, machines based on idealized versions of ordinary programming languages such as C++, and so on. All these models can simulate each other, and this makes us think that we have correctly captured the notion of computability-in-principle with any one of them.
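One direction of these equivalences is easy to witness directly: a Turing machine can be simulated by a short program in an ordinary language. Here is a minimal sketch (my own illustration; the machine, the function `run_tm`, and the blank symbol `"_"` are choices of mine, not part of the original answer).

```python
# A minimal Turing machine simulator, illustrating one direction of the
# equivalences: any Turing machine is simulable in an ordinary language.

def run_tm(delta, tape, state="q0", head=0, max_steps=10_000):
    """Run a transition table delta: (state, symbol) -> (symbol, move, state),
    where move is -1 (left) or +1 (right). Returns the final tape as a dict."""
    tape = dict(enumerate(tape))
    for _ in range(max_steps):
        key = (state, tape.get(head, "_"))  # "_" is the blank symbol
        if key not in delta:                # no applicable rule: halt
            return tape
        symbol, move, state = delta[key]
        tape[head] = symbol
        head += move
    raise RuntimeError("step budget exhausted")

# Example machine: flip every bit until reaching a blank, then halt.
FLIP = {
    ("q0", "0"): ("1", +1, "q0"),
    ("q0", "1"): ("0", +1, "q0"),
}

result = run_tm(FLIP, "0110")
print("".join(result[i] for i in sorted(result)))  # prints "1001"
```

The converse simulations, in which a Turing machine interprets register-machine or high-level programs, are more laborious but entirely routine, and together they establish the equivalences mentioned above.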
And surely it is only the weak Church-Turing thesis that is used in the arguments that Rogers mentions. When one describes a computational procedure in the way that Rogers does, or anyone does in the entire field of computability theory, one is describing a computational procedure that could in principle be carried out by an idealized human agent using only paper and pencil with plenty of time and plenty of paper. So this is a use of the comparatively uncontested formulation of the Church-Turing thesis. As Andrej and the other commentators testify, there is nearly universal agreement on the accuracy of the weak Church-Turing thesis.
The point I want to emphasize, however, is that there is another, stronger version of the thesis, the strong Church-Turing thesis, which asserts that not only are the idealized paper-and-pencil computational procedures all simulable by Turing machines, but also that any algorithmic procedure we could in principle carry out in our physical universe, however strange, is simulable by Turing machines. This is a much stronger claim.
Frankly, the evidence for this stronger version of the Church-Turing thesis is considerably weaker, in light of the fact that the fundamental nature of physical reality is quite bizarre, including various quantum effects as well as relativistic effects such as time dilation. We don't actually have much reason to think that it is impossible in principle to take advantage of these for computational effect.
Quantum Turing machines may be a familiar example. Although we already know a lot about quantum Turing machines, in fact these do not violate the strong Church-Turing thesis, since they are in principle simulable by ordinary Turing machines (although the simulation may take much longer, so quantum machines may give rise to a new complexity theory, even though they give rise to the same computability theory). So there seems to be no hidden violation of the strong Church-Turing thesis arising there.
But meanwhile, there are other strange computational procedures, such as black hole computation, which seem possibly to offer a way to violate the strong Church-Turing thesis, by taking advantage of relativistic effects such as time dilation that occur in our actual physical world. See Philip Welch's article, The extent of computation in Malament-Hogarth spacetimes, for a great summary and mathematical analysis of this kind of thing.
Jack Copeland has particularly emphasized the distinction between the weak and the strong Church-Turing theses.
I would say that Church's thesis is more of a physical principle that formalizes the notion of "computable in finite time". One cannot rule out new scientific discoveries that would falsify Church's thesis.