Why is this integration method not valid?

You have not paid enough attention to the limits of your integration: the two integrals you are adding are not over the same interval.

The method to use here is quite similar to yours: it applies the same kind of trick, but correctly, without disturbing the interval of integration:

$$\frac{2\sin x}{\sin x + \cos x}=\frac {\sin x +\cos x}{\sin x +\cos x}+\frac {\sin x - \cos x}{\sin x + \cos x}$$
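With this split, the first term integrates to $x$, and the second term is $-\frac{d}{dx}\ln|\sin x+\cos x|$, so $\int\frac{\sin x}{\sin x+\cos x}\,dx=\frac12\bigl(x-\ln|\sin x+\cos x|\bigr)+C$. Here is a quick numerical sanity check of that antiderivative (a sketch; the `simpson` helper is my own minimal composite Simpson's rule, not part of the answer):

```python
import math

def simpson(f, a, b, n=1000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# Integrand, and the antiderivative obtained from the split:
# (sin x + cos x)/(sin x + cos x) integrates to x, while
# (sin x - cos x)/(sin x + cos x) = -(d/dx) ln|sin x + cos x|.
f = lambda x: math.sin(x) / (math.sin(x) + math.cos(x))
F = lambda x: 0.5 * (x - math.log(abs(math.sin(x) + math.cos(x))))

numeric = simpson(f, 0.0, math.pi / 2)
exact = F(math.pi / 2) - F(0.0)
```

Both agree with $\pi/4$, the well-known value of this integral over $[0,\pi/2]$.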


You might need a little more explanation of the interval issue.

Integrals are additive in two ways.

First, if the same function is integrated over disjoint intervals (more generally, measurable sets), then we can integrate over the union of the intervals, and the integral over the whole is the sum of the integrals over the parts.

Second, if we integrate different functions over the same interval (measurable set), the sum of the integrals is equal to the integral of the sum of the functions.

Apply your method to the integral of $x^2$ and use the substitution $y=-x$. The integral you get is the integral of $-x^2$. Adding the two, you find that twice the integral is zero, which is nonsense, because the function you are integrating is positive except at $x=0$. What has happened here is that the substitution swapped the limits of integration, and you need to reverse the sign to straighten them out.
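This is easy to see numerically (a sketch; `simpson` is a hypothetical little Simpson's-rule helper, and $[-1,1]$ is just a convenient interval):

```python
def simpson(f, a, b, n=1000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

I = simpson(lambda x: x * x, -1.0, 1.0)       # the honest integral: 2/3
I_bad = simpson(lambda y: -y * y, -1.0, 1.0)  # substituted y = -x but kept the old limits
print(I + I_bad)                              # ~0: "twice the integral is zero"

# Reversing the limits, as the substitution actually demands, fixes it:
I_good = simpson(lambda y: -y * y, 1.0, -1.0)
```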

Your method has both reversed the limits and translated them by $\frac{\pi}{2}$. You can't add the integrals in this case and expect to get the right answer.


$$\int_{a}^{b}\frac{\sin x}{\cos x + \sin x}\,dx\neq \int_{a}^{b}\frac{-\cos x}{\cos x + \sin x}\,dx$$ but $$\int_{a}^{b}\frac{\sin x}{\cos x + \sin x}\,dx= \int_{\pi/2-a}^{\pi/2-b}\frac{-\cos x}{\cos x + \sin x}\,dx$$
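A numerical check of both displayed claims (a sketch; `simpson` is a hypothetical Simpson's-rule helper, and $a=0.2$, $b=1$ are arbitrary sample limits on which $\cos x+\sin x$ stays positive):

```python
import math

def simpson(f, a, b, n=1000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

f = lambda x: math.sin(x) / (math.cos(x) + math.sin(x))
g = lambda x: -math.cos(x) / (math.cos(x) + math.sin(x))

a, b = 0.2, 1.0
lhs = simpson(f, a, b)
same_limits = simpson(g, a, b)                          # NOT equal to lhs
shifted = simpson(g, math.pi / 2 - a, math.pi / 2 - b)  # equal to lhs
```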


Let's apply your 'method' to a much simpler example:

Let $I = \int x\ dx$. Let $y = 1-x$. Then $I = \int (1-y)\ dx = \int (y-1)\ dy = \int (x-1)\ dx = I - \int 1\ dx$. So $0 = \int 1\ dx$. Haha...

So what went wrong? The reason you can't easily find a reference that says that indefinite integrals cannot be added is that there are different ways to interpret and manipulate the indefinite integral notation, some of which do permit such addition (and forbid other manipulations).

In particular, the notation you used reflects the view of an indefinite integral as an anti-derivative. That is, $x$ is a variable and $I$ is another variable such that $\frac{dI}{dx} = x$. Note that this already implies that you have $\int (1-y)\ dx = \int (y-1)\ dy + c$ for some constant $c$ (which you have no control over), but it also implies that $\int (y-1)\ dy$ is not the same as $\int (x-1)\ dx$, because $x,y$ are not dummy variables in this framework. After all, don't forget that you tied them together by the relation $y = 1-x$.
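The constant $c$ can be exhibited directly. In the sketch below, `A` and `B` are my own names for particular antiderivatives of the two sides; with $y = 1-x$, they differ by the constant $\tfrac12$ at every $x$:

```python
def A(x):
    """One antiderivative of the left side: ∫(1-y) dx = ∫ x dx = x**2/2."""
    return x * x / 2

def B(x):
    """One antiderivative of the right side: ∫(y-1) dy = y**2/2 - y,
    evaluated at y = 1 - x."""
    y = 1 - x
    return y * y / 2 - y

# The difference A - B is the same at every sample point: the constant c.
diffs = [A(x) - B(x) for x in (0.0, 0.3, 0.7, 2.0)]
```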

Another view is to treat indefinite integrals as definite integrals from some fixed point, in which case you cannot use your notation in the first place. In this view, given any real interval $D$ and integrable function $f : D \to \mathbb{R}$, we take "$\int f(x)\ dx$" to mean $\int_a^x f(t)\ dt$ where $a$ is some fixed real in $D$, and $x$ here is a free variable and the expression is meaningless except in a context where $x$ is defined. So for our little example, $\int (x-1)\ dx = \int_a^x (t-1)\ dt$ and $\int (y-1)\ dy = \int_a^y (t-1)\ dt$, so clearly we cannot claim they are equal for all $x,y$ satisfying $y = 1-x$.
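In this reading, the failure is easy to compute. Below, `J` is a hypothetical name for $\int_a^x (t-1)\,dt$ with base point $a=0$; evaluating it at $x$ and at $y=1-x$ gives genuinely different numbers:

```python
def J(x, a=0.0):
    """∫_a^x (t - 1) dt: an 'indefinite integral' read as a definite
    integral from the fixed base point a."""
    G = lambda u: u * u / 2 - u  # an antiderivative of t - 1
    return G(x) - G(a)

x = 0.3
y = 1 - x            # the relation y = 1 - x from the example
print(J(x), J(y))    # ≈ -0.255 vs ≈ -0.455: not equal
```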