Really advanced techniques of integration (definite or indefinite)
Here are a few. The first one is included because it's not very well known and is not general, though the ones that follow are very general and very useful.
- A great but not very well known way to find a primitive of $f^{-1}$ in terms of a primitive $F$ of $f$ (very easy to prove: just differentiate both sides and use the chain rule) is: $$ \int f^{-1}(x)\, dx = x \cdot f^{-1}(x)-(F \circ f^{-1})(x)+C. $$
Examples:
$$ \begin{aligned} \displaystyle \int \arcsin(x)\, dx &= x \cdot \arcsin(x)- (-\cos\circ \arcsin)(x)+C \\ &=x \cdot \arcsin(x)+\sqrt{1-x^2}+C. \end{aligned} $$
$$ \begin{aligned} \int \log(x)\, dx &= x \cdot \log(x)-(\exp \circ \log)(x) + C \\ &= x \cdot \left( \log(x)-1 \right) + C. \end{aligned} $$
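If you like to double check such identities by machine, here is a minimal symbolic sketch (assuming sympy is available) that compares the formula, for $f=\sin$, with sympy's own antiderivative of $\arcsin$:

```python
# Symbolic sanity check of  int f^{-1}(x) dx = x*f^{-1}(x) - F(f^{-1}(x)) + C
# for f = sin (so f^{-1} = asin and F = -cos), using sympy.
import sympy as sp

x = sp.symbols('x')

f_inv = sp.asin(x)                       # f^{-1}
F = lambda t: -sp.cos(t)                 # a primitive F of f = sin

formula = x * f_inv - F(f_inv)           # x*f^{-1}(x) - (F o f^{-1})(x)
direct = sp.integrate(sp.asin(x), x)     # sympy's own antiderivative

# The two antiderivatives can differ by at most a constant; here the
# difference simplifies to exactly 0.
print(sp.simplify(formula - direct))
```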
- This one is better known, and extremely powerful: it's called differentiating under the integral sign. Most of the time it takes some ingenuity to see when and how to apply it, but that only makes it more interesting. The technique uses the simple fact that (under suitable hypotheses on $f$) $$ \frac{\mathrm d}{\mathrm d x} \int_a^b f \left({x, y}\right) \mathrm d y = \int_a^b \frac{\partial f}{\partial x} \left({x, y}\right) \mathrm d y. $$
Example:
We want to calculate the integral $\int_{0}^{\infty} \frac{\sin(x)}{x} dx$. To do that, we (perhaps counterintuitively) consider the more complicated integral $\int_{0}^{\infty} e^{-tx} \frac{\sin(x)}{x} dx$ instead.
Let $$ I(t)=\int_{0}^{\infty} e^{-tx} \frac{\sin(x)}{x} dx,$$ then $$ I'(t)=-\int_{0}^{\infty} e^{-tx} \sin(x) dx=\frac{e^{-t x} (t \sin (x)+\cos (x))}{t^2+1}\bigg|_0^{\infty}=\frac{-1}{1+t^2}.$$
Since both $I(t)$ and $-\arctan(t)$ are primitives of $\frac{-1}{1+t^2}$, they differ only by a constant, so $I(t)+\arctan(t)=C$. Letting $t\to \infty$, we have $I(t) \to 0$ and $\arctan(t) \to \pi/2$, hence $C=\pi/2$ and $I(t)=\frac{\pi}{2}-\arctan(t)$.
Finally, $$ \int_{0}^{\infty} \frac{\sin(x)}{x} dx = I(0) = \frac{\pi}{2}-\arctan(0) = \boxed{\frac{\pi}{2}}. $$
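For readers who want to verify these manipulations numerically, here is a small sketch (assuming mpmath is available) comparing $I(t)$ with $\pi/2-\arctan(t)$, including the borderline case $t=0$:

```python
# Numerical check of I(t) = pi/2 - arctan(t), using mpmath.
from mpmath import mp, mpf, quad, quadosc, exp, sin, atan, pi

mp.dps = 30  # working precision (decimal digits)

def I(t):
    # For t > 0 the integrand decays exponentially, so plain quadrature works.
    return quad(lambda x: exp(-t * x) * sin(x) / x, [0, mp.inf])

for t in [mpf('0.5'), mpf(2), mpf(10)]:
    print(t, I(t), pi / 2 - atan(t))   # the last two columns should agree

# t = 0 is the oscillatory Dirichlet integral itself; quadosc handles it.
print(quadosc(lambda x: sin(x) / x, [0, mp.inf], period=2 * pi), pi / 2)
```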
- This one is probably the most commonly used "advanced integration technique", and for good reason. It's referred to as the "residue theorem" and it states that if $\gamma$ is a counterclockwise simple closed curve and $a_1,\dots,a_n$ are the isolated singularities of $f$ inside it, then $\displaystyle \int_\gamma f(z)\, dz = 2\pi i \sum_{k=1}^n \operatorname{Res} ( f, a_k )$. It will be difficult to understand this one without some knowledge of complex analysis, but you can get the gist of it from the Wikipedia article.
Example:
We want to compute $\int_{-\infty}^{\infty} \frac{x^2}{1+x^4} dx$. The poles of our function $f(z)=\frac{z^2}{1+z^4}$ in the upper half plane are $a_1=e^{i \frac{\pi}{4}}$ and $a_2=e^{i \frac{3\pi}{4}}$. The residues of our function at those points are $$\operatorname{Res}(f,a_1)=\lim_{z\to a_1} (z-a_1)f(z)=\frac{e^{-i \frac{\pi}{4}}}{4},$$ and $$\operatorname{Res}(f,a_2)=\lim_{z\to a_2} (z-a_2)f(z)=\frac{e^{-i \frac{3\pi}{4}}}{4}.$$

Let $\gamma$ be the closed path around the boundary of the semicircle of radius $R>1$ in the upper half plane, traversed in the counter-clockwise direction. The residue theorem then gives us $${1 \over 2\pi i} \int_\gamma f(z)\,dz=\operatorname{Res}(f,a_1)+\operatorname{Res}(f,a_2)={1 \over 4}\left({1-i \over \sqrt{2}}+{-1-i \over \sqrt{2}}\right)={-i \over 2 \sqrt{2}},$$ so $\int_\gamma f(z)\,dz= {\pi \over \sqrt{2}}$. Splitting $\gamma$ into the segment $[-R,R]$ and the semicircular arc $z=Re^{it}$, $0\le t\le \pi$ (so that $dz=iRe^{it}\,dt$), we have: $$\int_\gamma f(z)\,dz = \int_{-R}^R \frac{x^2}{1+x^4} dx + \int_0^\pi {i (R e^{it})^3 \over 1+(R e^{it})^4} dt = {\pi \over \sqrt{2}}.$$

For the integral over the arc, $$ \begin{aligned} \left| \int_0^\pi {i (R e^{it})^3 \over 1+(R e^{it})^4} dt \right| &\leq \int_0^\pi \left| {i (R e^{it})^3 \over 1+(R e^{it})^4} \right| dt \\ &\leq \int_0^\pi {R^3 \over R^4-1} dt={\pi R^3 \over R^4-1}. \end{aligned} $$ Hence, as $R\to \infty$, we have ${\pi R^3 \over R^4-1} \to 0$, and therefore $\int_0^\pi {i (R e^{it})^3 \over 1+(R e^{it})^4} dt \to 0$. Finally, $$ \begin{aligned} \int_{-\infty}^\infty \frac{x^2}{1+x^4} dx &= \lim_{R\to \infty} \int_{-R}^R \frac{x^2}{1+x^4} dx \\ &= \lim_{R\to \infty} \left({\pi \over \sqrt{2}}-\int_0^\pi {i (R e^{it})^3 \over 1+(R e^{it})^4} dt\right) =\boxed{{\pi \over \sqrt{2}}}. \end{aligned} $$
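As a cross-check of the residue bookkeeping, here is a short symbolic sketch (assuming sympy is available); it uses the fact that for a simple pole of $p/q$ the limit $\lim_{z\to a}(z-a)\,p(z)/q(z)$ reduces to $p(a)/q'(a)$:

```python
# Cross-check 2*pi*i * (sum of residues) against the real integral, via sympy.
import sympy as sp

z, x = sp.symbols('z x')
p, q = z**2, 1 + z**4                        # f = p/q

a1 = sp.exp(sp.I * sp.pi / 4)                # poles of f in the upper half plane
a2 = sp.exp(3 * sp.I * sp.pi / 4)

res = lambda a: (p / q.diff(z)).subs(z, a)   # residue at a simple pole: p(a)/q'(a)
total = 2 * sp.pi * sp.I * (res(a1) + res(a2))
print(sp.simplify(sp.expand_complex(total)))             # expected: sqrt(2)*pi/2

# Direct evaluation of the real integral for comparison (same value, pi/sqrt(2)).
print(sp.integrate(x**2 / (1 + x**4), (x, -sp.oo, sp.oo)))
```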
- My final "technique" is the use of the mean value property for complex analytic functions, or Cauchy's integral formula in other words: $$ \begin{aligned} f(a) &= \frac{1}{2\pi i} \int_\gamma \frac{f(z)}{z-a}\, dz \\ &= \frac{1}{2\pi} \int_{0}^{2\pi} f\left(a+e^{ix}\right) dx. \end{aligned} $$
Example:
We want to compute the very messy looking integral $\int_0^{2\pi} \cos (\cos (x)+1) \cosh (\sin (x))\, dx$. We first notice that $$ \begin{aligned} &\hphantom{=} \cos [\cos (x)+1] \cosh [\sin (x)] \\ &=\Re\left\{ \cos [\cos (x)+1] \cosh [\sin (x)] -i\sin [\cos (x)+1] \sinh [\sin (x)] \right\} \\ &= \Re \left[ \cos \left( 1+e^{i x} \right) \right]. \end{aligned} $$ Then we have $$ \begin{aligned} \int_0^{2\pi} \cos [\cos (x)+1] \cosh [\sin (x)]\, dx &= \int_0^{2\pi} \Re \left[ \cos \left( 1+e^{i x} \right) \right] dx \\ &= \Re \left[ \int_0^{2\pi} \cos \left( 1+e^{i x} \right) dx \right] \\ &= \Re \left( \cos(1) \cdot 2 \pi \right)= \boxed{2 \pi \cos(1)}, \end{aligned} $$ where the last step is exactly the mean value property applied to $f=\cos$ at $a=1$.
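A quick numerical sanity check of that value (assuming scipy is available):

```python
# Compare the messy real integral with 2*pi*cos(1), using scipy's quad.
import numpy as np
from scipy.integrate import quad

integrand = lambda x: np.cos(np.cos(x) + 1.0) * np.cosh(np.sin(x))

value, abs_err = quad(integrand, 0.0, 2.0 * np.pi)
print(value, 2.0 * np.pi * np.cos(1.0))   # both approximately 3.394...
```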
You can also integrate by inverting the matrix representation of the differentiation operator with respect to a clever choice of basis, and then applying the inverse of the operator to the function you wish to integrate.
For example, consider the basis $\mathcal{B} = \{e^{ax}\cos bx, e^{ax}\sin bx \}$. Differentiating with respect to $x$ gives \begin{align*} \frac{d}{dx}e^{ax} \cos bx &= ae^{ax} \cos bx - be^{ax} \sin bx\\ \frac{d}{dx} e^{ax} \sin bx &= ae^{ax} \sin bx + be^{ax} \cos bx \end{align*}
and the matrix representation of the linear operator is
$$T = \begin{bmatrix} a & b\\ -b & a \end{bmatrix}$$
Solving something like $\int e^{ax}\cos bx\operatorname{d}\!x$ is then equivalent to calculating
$$T^{-1}\begin{bmatrix} 1\\ 0 \end{bmatrix}_{\mathcal{B}} = \frac{1}{a^{2} + b^{2}}\begin{bmatrix} a\\ b \end{bmatrix}_{\mathcal{B}}.$$
That is,
$$\int e^{ax}\cos bx\operatorname{d}\!x = \frac{a}{a^{2}+b^{2}}e^{ax}\cos bx + \frac{b}{a^{2} + b^{2}}e^{ax}\sin bx + C.$$
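Here is the same computation done mechanically, as a small sketch (assuming sympy is available) for the arbitrary concrete choice $a=2$, $b=3$:

```python
# Invert the matrix of d/dx in the basis {e^(ax) cos bx, e^(ax) sin bx}
# and read off the antiderivative's coordinates, using sympy; here a=2, b=3.
import sympy as sp

x = sp.symbols('x')
a, b = 2, 3

T = sp.Matrix([[a, b], [-b, a]])          # matrix of d/dx in the basis above
coords = T.inv() * sp.Matrix([1, 0])      # coordinates of the antiderivative of e^(ax) cos bx
print(coords.T)                           # [a/(a^2+b^2), b/(a^2+b^2)] = [2/13, 3/13]

antideriv = coords[0] * sp.exp(a*x) * sp.cos(b*x) + coords[1] * sp.exp(a*x) * sp.sin(b*x)
print(sp.simplify(sp.diff(antideriv, x) - sp.exp(a*x) * sp.cos(b*x)))   # expected: 0
```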
Another option is expanding the integrand into a series and integrating term by term. For example,
$$ \int{\frac{1}{1 + x^2}dx} = \int\sum_{i = 0}^\infty{(-1)^ix^{2i}}dx = \sum_{i = 0}^\infty(-1)^i\int{x^{2i}}dx = \sum_{i = 0}^\infty \frac{(-1)^ix^{2i+1}}{2i + 1}.$$
You might then make use of the fact that,
$$\sum_{i = 0}^\infty \frac{(-1)^ix^{2i+1}}{2i + 1} = \tan^{-1}{x}.$$
Of course, you need to be familiar with many different series, which comes with practice. In fact, most derivations of the series for $\arctan(x)$ actually use the method above, but it still serves as an example of the technique.
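To see the termwise integration carried out explicitly, here is a small symbolic sketch (assuming sympy is available) that truncates the geometric series and compares the result with the Taylor polynomial of $\arctan$:

```python
# Integrate a truncated geometric series term by term and compare with arctan.
import sympy as sp

x = sp.symbols('x')
N = 10   # truncation order (an arbitrary choice)

partial_sum = sum((-1)**i * x**(2*i) for i in range(N))   # sum of (-1)^i x^(2i)
termwise = sp.integrate(partial_sum, x)                   # sum of (-1)^i x^(2i+1)/(2i+1)

atan_poly = sp.series(sp.atan(x), x, 0, 2 * N).removeO()  # Taylor polynomial of arctan
print(sp.simplify(termwise - atan_poly))                  # expected: 0
```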
Another example of this comes through the Riemann zeta function:
With the substitution $u=kx$ in each term of the resulting sum,
$$\begin{align}\int_0^\infty\frac{x^s}{e^x-1}\ dx&=\int_0^\infty x^se^{-x}\left(\frac1{1-e^{-x}}\right)\ dx\\&=\int_0^\infty x^se^{-x}\sum_{k=0}^\infty e^{-kx}\ dx\\&=\sum_{k=0}^\infty\int_0^\infty x^se^{-(k+1)x}\ dx\\&=\sum_{k=1}^\infty\int_0^\infty x^se^{-kx}\ dx\\&=\sum_{k=1}^\infty\frac1{k^{s+1}}\int_0^\infty u^se^{-u}\ du\\&=\zeta(s+1)\Gamma(s+1).\end{align}$$
A beautiful, non-trivial example of expanding into a series and solving term by term.
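As with the earlier examples, the identity is easy to spot-check numerically; here is a minimal sketch (assuming mpmath is available), at the arbitrary value $s=2.5$:

```python
# Numerical spot check of  int_0^inf x^s/(e^x - 1) dx = zeta(s+1)*gamma(s+1)  at s = 2.5.
from mpmath import mp, quad, gamma, zeta, expm1, inf

mp.dps = 30
s = 2.5

lhs = quad(lambda x: x**s / expm1(x), [0, inf])   # expm1(x) = e^x - 1, stable near x = 0
rhs = zeta(s + 1) * gamma(s + 1)
print(lhs, rhs)   # the two values should agree to high precision
```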