How do we know what the integral of $\sin (x)$ is?

This is an interesting question, and I understand the broader implications, but I will focus on the claim that computing a (definite) integral from the basic definition, as a limit of Riemann sums, is intractable for all but the simplest functions.

Granted, computation via the Fundamental Theorem of Calculus is often the most expedient approach, but there comes a point where finding the antiderivative in terms of elementary functions is also intractable. Furthermore, the bar for computation via the basic definition is perhaps not as high as you seem to think.

Presumably in your exercise you computed something like

$$\int_0^1 t^2 dt = \lim_{n \to \infty}\frac{1}{n} \sum_{k=1}^n\left( \frac{k}{n}\right)^2 = \frac{1}{3}, $$ or, even more generally, $$\int_0^x t^2 dt = \lim_{n \to \infty}\frac{x}{n} \sum_{k=1}^n\left( \frac{kx}{n}\right)^2 = \frac{x^3}{3}, $$

and this was facilitated by knowing

$$\sum_{k=1}^n k^2 = \frac{n(n+1)(2n+1)}{6}.$$
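
To spell the step out, substituting this closed form into the first Riemann sum gives

$$\frac{1}{n} \sum_{k=1}^n\left( \frac{k}{n}\right)^2 = \frac{1}{n^3}\cdot\frac{n(n+1)(2n+1)}{6} = \frac{\left(1+\frac{1}{n}\right)\left(2+\frac{1}{n}\right)}{6} \longrightarrow \frac{1}{3} \quad \text{as } n \to \infty.$$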

Now consider your example, $\sin x$. I would assume you are aware of such basic properties as $\cos 2x = 1 - 2 \sin^2 x$ and $\lim_{x \to 0} \sin x / x = 1.$ Possibly less apparent is

$$\tag{1}\sum_{k=1}^n \sin (ky) = \frac{\sin \left(\frac{ny}{2} \right) \sin\left(\frac{(n+1)y}{2} \right)}{\sin\left(\frac{y}{2} \right)}.$$
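
For completeness, here is a sketch of one derivation, via the geometric sum $\sum_{k=1}^n (e^{iy})^k$ (assuming $y$ is not a multiple of $2\pi$, so the denominators are nonzero):

$$\sum_{k=1}^n (e^{iy})^k = e^{iy}\,\frac{e^{iny}-1}{e^{iy}-1} = e^{iy}\,\frac{e^{iny/2}\left(e^{iny/2}-e^{-iny/2}\right)}{e^{iy/2}\left(e^{iy/2}-e^{-iy/2}\right)} = e^{i(n+1)y/2}\,\frac{\sin\left(\frac{ny}{2}\right)}{\sin\left(\frac{y}{2}\right)},$$

and taking the imaginary part of both sides yields $(1)$.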

This identity can be derived in a number of ways; the sketch above takes the imaginary part of the geometric sum. As in your exercise, where you knew the closed form for the sum of the squares, you can use $(1)$ to compute

$$\int_0^x \sin t \, dt = \lim_{n \to \infty}\sum_{k=1}^n\sin \left(\frac{kx}{n} \right)\left(\frac{kx}{n} - \frac{(k-1)x}{n} \right) = \lim_{n \to \infty}\frac{x}{n}\sum_{k=1}^n\sin \left(\frac{kx}{n} \right).$$

Using $(1)$ with $y = x/n$, we have

$$\begin{align}\frac{x}{n}\sum_{k=1}^n\sin \left(\frac{kx}{n} \right) &= \frac{x}{n}\frac{\sin \left(\frac{x}{2} \right) \sin\left(\frac{x}{2} + \frac{x}{2n} \right)}{\sin\left(\frac{x}{2n} \right)} \\ &= \frac{x}{n}\frac{\sin \left(\frac{x}{2} \right) \left[\sin\left(\frac{x}{2}\right)\cos\left(\frac{x}{2n}\right)+ \sin\left(\frac{x}{2n}\right) \cos\left(\frac{x}{2}\right)\right]}{\sin\left(\frac{x}{2n} \right)} \\ &= \frac{x\sin \left(\frac{x}{2} \right) \cos \left(\frac{x}{2} \right) }{n} + \frac{2\sin^2 \left(\frac{x}{2} \right) \cos\left(\frac{x}{2n}\right) }{\sin\left(\frac{x}{2n} \right)/ \frac{x}{2n} }.\end{align}$$

Now, taking the limit as $n \to \infty$, we have $\frac{x}{2n} \to 0$, so the first term vanishes, $\cos\left(\frac{x}{2n}\right) \to 1$, and $\sin\left(\frac{x}{2n}\right)\big/\frac{x}{2n} \to 1$ (the basic limit quoted above); hence

$$\int_0^x \sin t \, dt = \lim_{n \to \infty}\frac{x}{n}\sum_{k=1}^n\sin \left(\frac{kx}{n} \right) = 2\sin^2 \left(\frac{x}{2}\right) = 1 - \cos x = \cos 0 - \cos x.$$
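
If you want a quick numerical sanity check of this limit, the following sketch (my own illustrative code, not part of the argument; the function name and the choice $x = 2$ are arbitrary) compares the Riemann sums against $1 - \cos x$:

```python
import math

def riemann_sum_sin(x, n):
    """Right-endpoint Riemann sum (x/n) * sum_{k=1}^n sin(k*x/n)."""
    return (x / n) * sum(math.sin(k * x / n) for k in range(1, n + 1))

x = 2.0  # arbitrary upper limit of integration
for n in (10, 100, 1000, 10000):
    approx = riemann_sum_sin(x, n)
    exact = 1 - math.cos(x)  # the closed form obtained above
    print(f"n = {n:6d}:  Riemann sum = {approx:.8f},  1 - cos(x) = {exact:.8f}")
```

The Riemann sums approach $1 - \cos x$ as $n$ grows, as expected.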


It may be a bit off topic, but here is how you can rigorously introduce those trigonometric functions and their derivatives (and hence, in this case, their antiderivatives as well) in a complex way (pun intended).

I am writing this partly from memory, based on what I read in Rudin, Real and Complex Analysis (I think it is in the prologue of the book).

First, define the following function:

$$\exp(z)=1+z+\dfrac {z^2} 2+\cdots+\dfrac {z^n} {n!}+\cdots$$ This function maps $\mathbb{C}$ to itself. (The series converges normally on every compact subset of $\mathbb{C}$.)
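
To justify the parenthetical claim: on any closed disk $|z| \le R$ the terms are dominated by a convergent numerical series,

$$\sup_{|z|\le R}\left|\frac{z^n}{n!}\right| \le \frac{R^n}{n!}, \qquad \sum_{n\ge 0}\frac{R^n}{n!} < \infty,$$

so the series indeed converges normally (hence uniformly) on every compact subset of $\mathbb{C}$.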

Using the Cauchy product of series and Newton's binomial formula, we can deduce the first key property of this function: $$\exp(a+b)=\exp(a) \exp(b).$$
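
Explicitly, the Cauchy product of the two series gives

$$\exp(a)\exp(b) = \sum_{n=0}^\infty \sum_{k=0}^n \frac{a^k}{k!}\,\frac{b^{n-k}}{(n-k)!} = \sum_{n=0}^\infty \frac{1}{n!}\sum_{k=0}^n \binom{n}{k} a^k b^{n-k} = \sum_{n=0}^\infty \frac{(a+b)^n}{n!} = \exp(a+b).$$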

Then, defining $e=\exp(1)$, we write $\exp(z)=e^z$, since $$e^{a+b}=\exp(a+b)=\exp(a) \exp(b)=e^ae^b.$$ (Note that this is just notation; "fortunately", it is consistent with the usual rules for computing with powers.)

Now let $x \in \mathbb{R}$. Since the series defining $\exp$ has real coefficients, $\overline{e^{ix}}=e^{-ix}$, so $|e^{ix}|^2=e^{ix} \overline{e^{ix}}=e^{ix}e^{-ix}=e^{ix-ix}=e^0=1$.

We then define $\cos(x)=\Re(e^{ix})$ and $\sin(x)=\Im(e^{ix})$, so that $e^{ix}=\cos(x)+i\sin(x)$.

But $(e^{ix})'=\left(1+ix+\dfrac {(ix)^2} 2+\cdots+\dfrac {(ix)^n} {n!}+\cdots \right)'$.

Since this series and the series of derivatives $\left(\dfrac {(ix)^k} {k!}\right)'$ converge normally on every compact subset of $\mathbb{R}$, we can differentiate term by term and deduce:

$$(e^{ix})'=ie^{ix}.$$
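
Written out, the term-by-term differentiation reads

$$(e^{ix})' = \sum_{k=1}^\infty \frac{k\, i^k x^{k-1}}{k!} = i\sum_{k=1}^\infty \frac{(ix)^{k-1}}{(k-1)!} = i e^{ix}.$$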

But then $(\cos(x))'+i(\sin(x))'=(e^{ix})'=ie^{ix}=i\left(\cos(x)+i\sin(x)\right)=-\sin(x)+i\cos(x)$.

Identifying real and imaginary parts, we finally find $(\cos(x))'=-\sin(x)$ and $(\sin(x))'=\cos(x)$.
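
In particular, since $(-\cos x)' = \sin x$, the Fundamental Theorem of Calculus gives

$$\int_0^x \sin t \, dt = \left[-\cos t\right]_0^x = \cos 0 - \cos x = 1 - \cos x,$$

in agreement with the Riemann-sum computation above.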

PS: Of course, a lot more can be said; I went straight to what I wanted to explain. I strongly recommend reading the prologue mentioned at the beginning of this answer; it is a very interesting read.


Expanding on the comments of Doug M and benguin, here is a very simplified version of the history.

Gregory/Barrow/Newton proved (more or less) the Fundamental Theorem of Calculus, with the integral interpreted as the area under the curve.

As for the formulas $\sin' = \cos$ and $\cos' = -\sin$, it is really difficult to say who proved them first. Maybe Roger Cotes? See The calculus of the trigonometric functions for details, and also How were derivatives of trigonometric functions first discovered?

Also very interesting: Some History of the Calculus of the Trigonometric Functions includes Archimedes' proof of our formula $$\int_0^\pi\sin = 2,$$ which can easily be generalized (Archimedes did not do this) to $$\int_0^\alpha\sin x\,dx = 1 - \cos\alpha.$$ The section about Pascal is equally interesting.