+ C in integration by parts allows for major differences in answers?

This is a good observation. Yes, you could write $g(x)+C$ instead of $g(x)$, but the indefinite integral would be the same.

You would pick up an extra $Cf(x)$ in the first term, which would be canceled by the extra $-\int Cf'(x)\, dx$ in the second term.
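For a concrete illustration (with functions chosen purely for the example): take $f(x) = x$ and $g'(x) = \cos x$, and write $\sin x + C$ in place of $\sin x$. Then

$$\int x\cos x\, dx = x(\sin x + C) - \int (\sin x + C)\, dx = x\sin x + Cx - (-\cos x + Cx) = x\sin x + \cos x,$$

exactly what the choice $C=0$ gives, up to the usual constant of integration. The two $Cx$ terms are precisely the $Cf(x)$ and $\int Cf'(x)\,dx$ mentioned above, and they cancel.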

Edit: Answering your edit: you simply cannot write two different constants; the integration by parts theorem does not allow it. You need a single function $g(x)$ whose derivative is $g'(x)$. I advise you to take a look at the proof of the theorem here, as it is very easy and it will surely clarify things.


$$\dfrac{d}{dx}\left(f(x)g(x)-\int f'(x)g(x)dx\right) = f'(x)g(x)+f(x)g'(x)-\dfrac{d}{dx}\left(\int f'(x)g(x)dx\right)$$

By the Fundamental Theorem of Calculus, the last derivative is $f'(x)g(x)$, so:

$$\dfrac{d}{dx}\left(f(x)g(x)-\int f'(x)g(x)dx\right) = f(x)g'(x)$$

This shows that $f(x)g(x)-\int f'(x)g(x)dx$ is an antiderivative of $f(x)g'(x)$.

We also know that any two antiderivatives of the same function differ by at most a constant (a standard consequence of the Mean Value Theorem).

So:

$$\int f(x)g'(x)dx = f(x)g(x) - \int f'(x)g(x)dx$$

is a correct final form, and any other antiderivative you come up with will agree with it up to an additive constant. In other words, the form you came up with may well be valid; it simply differs from the standard form by at most a constant.

Here is an example of this:

$$\int \dfrac{1}{x}dx$$

Let $u = \dfrac{1}{x}, dv = dx$

Then $du = -\dfrac{1}{x^2}dx, v = x$

So, we have:

$$\int \dfrac{1}{x}dx = \dfrac{1}{x}\cdot x - \int \left(-\dfrac{1}{x^2}\right)x dx = 1 + \int \dfrac{1}{x}dx$$

It appears that you can then subtract $\int \dfrac{1}{x}dx$ from both sides and wind up with $0=1$, but in fact, this is just an example of two antiderivatives differing by at most a constant.
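To make this explicit (writing out the antiderivatives that the notation hides): the $\int \dfrac{1}{x}dx$ on the left stands for $\ln|x| + C_1$ and the one on the right for $\ln|x| + C_2$, so the equation really reads

$$\ln|x| + C_1 = 1 + \ln|x| + C_2,$$

which only says that $C_1 = C_2 + 1$; no contradiction appears once the constants are tracked.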


Short answer:

If you are worried about the constants of integration, then the integration by parts formula is most simply written as $$ \int f'(x) g(x) \,\mathrm{d}x = f(x) g(x) - \int f(x)g'(x) \,\mathrm{d}x + C. $$ This is, however, equivalent to the formula given in the question.

Details:

It may be worthwhile to recall exactly what the integration by parts theorem says, and where it comes from. First, recall that the notation $$ \int f(x)\,\mathrm{d}x $$ represents an antiderivative of $f$. There are a few ways of conceptualizing what this actually means. The usual definition is

Definition: Let $f : \mathbb{R} \to \mathbb{R}$ be a function[1], and suppose that there is a differentiable function $F : \mathbb{R} \to \mathbb{R}$ such that $F'(x) = f(x)$ for all $x\in\mathbb{R}$. Then $F$ is said to be an antiderivative of $f$.

Typically, the first theorem proved after this definition is introduced is that antiderivatives differ by at most a constant. That is,

Theorem: Let $f:\mathbb{R}\to\mathbb{R}$ be a function, and suppose that both $F$ and $G$ are antiderivatives of $f$. Then there exists some constant $C$ such that $$ F(x) = G(x) + C $$ for all $x \in \mathbb{R}$.

An important observation here is that antiderivatives are not unique. Indeed, while this is not stated above, it can be shown without too much difficulty that if $F$ is an antiderivative of $f$ and $C\in\mathbb{R}$ is any constant, then the function $x \mapsto F(x) + C$ is also an antiderivative of $f$.
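For example (a standard illustration, not part of the definition above): both $\sin x$ and $\sin x + 7$ are antiderivatives of $\cos x$, since the added constant disappears upon differentiation, so there is no single function that deserves to be called "the" antiderivative of $\cos x$.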

One other observation:

Lemma: Let $f,g:\mathbb{R}\to \mathbb{R}$ be two functions, and suppose that $F$ and $G$ are antiderivatives of $f$ and $g$, respectively. Then $F+G$ is an antiderivative of $f+g$.

In other words, one antiderivative of a sum is the sum of antiderivatives of the summands. This follows immediately from the definition of an antiderivative and the additivity of the derivative.
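As a quick illustration (with functions chosen only for the example): $\tfrac{x^2}{2}$ is an antiderivative of $x$ and $\sin x$ is an antiderivative of $\cos x$, so the lemma says that $\tfrac{x^2}{2} + \sin x$ is an antiderivative of $x + \cos x$, which is immediate upon differentiating.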

At this point, it is possible to state a version of the integration by parts formula:

Theorem: Let $f,g:\mathbb{R}\to\mathbb{R}$ be two differentiable functions, and suppose that $U' = f'g$ and that $V' = fg'$ (that is, suppose that $U$ and $V$ are antiderivatives of $f'g$ and $fg'$, respectively). Then there is a constant $C$ such that $$ U(x) = f(x)g(x) - V(x) + C $$ for all $x\in\mathbb{R}$.

Proof: By the product rule, $ (f\cdot g)' = f'g + fg'. $ That is, for any $x\in\mathbb{R}$, $$ (f\cdot g)'(x) = f'(x) g(x) + f(x) g'(x). $$ By the lemma, $U + V$ is an antiderivative of $f'g + fg' = (fg)'$. Since $fg$ is also an antiderivative of $(fg)'$, the two differ by at most a constant, so $$ fg = U + V - C \implies U = fg - V + C $$ for some constant $C$, which is the claimed result. $\square$
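To see that the constant in the statement cannot be dropped, consider a toy choice of functions (made up purely for illustration): take $f(x) = g(x) = x$, so that $f'g = fg' = x$. The functions $U(x) = \tfrac{x^2}{2} + 5$ and $V(x) = \tfrac{x^2}{2}$ are perfectly good antiderivatives, and indeed $$ U(x) = f(x)g(x) - V(x) + C \qquad \text{with } C = 5, $$ whereas the equality without the constant fails for this choice of $U$ and $V$.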

Notice that, in the above argument, $U$ and $V$ are any two antiderivatives of the indicated functions, which is precisely why the constant appears in the statement. To spell this out: as noted above, antiderivatives differ by (at most) a constant, and if $F$ is an antiderivative of $f$, then so too is $F+C$ for any $C\in\mathbb{R}$. This implies that any antiderivative of $(fg)'$ is of the form $$ fg + C_1 $$ for some constant $C_1$. Similarly, antiderivatives of $f'g$ and $fg'$ are of the form $$ U + C_2 \qquad\text{and}\qquad V + C_3, $$ respectively, where $C_2$ and $C_3$ are constants. In the integration by parts formula, this becomes $$ U + C_2 = (fg + C_1) - (V+C_3) \implies U = fg - V + (C_1 - C_2 - C_3) = fg - V + C, $$ where $C = C_1 - C_2 - C_3$ is just a constant. In other words, we can include a constant of integration in the integration by parts formula by writing $$ U = fg - V + C. $$ This notation is non-standard, but it emphasizes that we are working with specific antiderivatives of specific functions: $fg$ is a convenient choice of an antiderivative of $(fg)'$, while $U$ and $V$ are completely arbitrary choices of antiderivatives of $f'g$ and $fg'$. This notation also helps us to see where the constants come into play.

The more standard notation is to write the antiderivative of a function using the elongated "s", that is: $$ \int f(x)\,\mathrm{d}x. $$ In this notation, the above result becomes $$ \int f'(x) g(x) \,\mathrm{d}x = f(x) g(x) - \int f(x)g'(x) \,\mathrm{d}x + C. $$

Why does this look different from the formula in the question?

In the question, it is proposed (modulo a change in the roles of $f$ and $g$, which I have made to remain consistent with what I've written above) that the formula should be $$ \int f'(x)g(x)\,\mathrm{d}x = (f(x)+C)g(x) - \int (f(x)+C)g'(x)\,\mathrm{d}x. $$ Expand out the right-hand side to get \begin{align} &(f(x)+C)g(x) - \int (f(x)+C)g'(x)\,\mathrm{d}x \\ &\qquad= f(x)g(x) + Cg(x) - \int f(x)g'(x)\,\mathrm{d}x - \int Cg'(x)\,\mathrm{d}x \\ &\qquad= f(x)g(x) + Cg(x) - \int f(x)g'(x)\,\mathrm{d}x - C\int g'(x)\,\mathrm{d}x \\ &\qquad= f(x)g(x) + Cg(x) - \int f(x)g'(x)\,\mathrm{d}x - C(g(x)+D) && (\text{$D$ is a constant}) \\ &\qquad= f(x)g(x) - \int f(x)g'(x)\,\mathrm{d}x - CD. \end{align} But $CD$ is just a constant, so this is the same formula as derived above.
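As a concrete check of this cancellation (with functions chosen purely for illustration): take $f(x) = \sin x$, $g(x) = x$, and $C = 3$, so that the left-hand side is $\int x\cos x\,\mathrm{d}x$. The proposed formula gives $$ (\sin x + 3)x - \int (\sin x + 3)\,\mathrm{d}x = x\sin x + 3x - (-\cos x + 3x) = x\sin x + \cos x $$ (up to a constant of integration), which is exactly what the standard formula produces; the extra $3$ cancels completely.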


[1] Typically, we need some additional hypotheses on $f$, e.g. we might like $f$ to be continuous, or Riemann integrable, or something similar. It also isn't necessary to insist that $f$ be defined on $\mathbb{R}$—the assumption that the domain is an interval (and therefore connected) is sufficient to avoid most problems, and more complicated domains may be considered with a little bit of work. However, to avoid annoying technicalities, I will assume that $f$ is "sufficiently nice", and that it is defined on a "sufficiently nice" domain. There are important details that I am sweeping under the rug with the phrase "sufficiently nice", but they are not all that important in the current context.
