Mathematical misconceptions and how to combat them

Way too many to count. Of course, some of the most common ones are:

All functions are linear. So, $(a+b)^2 = a^2+b^2$, $\sqrt{a+b} = \sqrt{a}+\sqrt{b}$, $\sin(a+b) = \sin(a)+\sin(b)$, etc.
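
A quick way to puncture the "everything is linear" habit is to plug in concrete numbers. A minimal Python check (the choices $a=b=1$ and $a=b=\pi/2$ are just for illustration):

```python
import math

a, b = 1.0, 1.0
print((a + b) ** 2, a ** 2 + b ** 2)                   # 4.0 vs 2.0
print(math.sqrt(a + b), math.sqrt(a) + math.sqrt(b))   # ~1.41 vs 2.0

a = b = math.pi / 2
print(math.sin(a + b), math.sin(a) + math.sin(b))      # ~0.0 vs 2.0
```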

All things can be substituted: thus, since $\int\frac{1}{x}\,dx = \ln|x|+C$, then $\int\frac{1}{f(x)} \,dx = \ln|f(x)|+C$.
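
Differentiating the claimed antiderivative exposes the error: by the chain rule, $\frac{d}{dx}\ln|f(x)| = f'(x)/f(x)$, which equals $1/f(x)$ only when $f'(x)=1$. A small SymPy sketch, using $f(x)=x^2+1$ purely as an illustrative choice:

```python
import sympy as sp

x = sp.symbols('x')
f = x**2 + 1                        # an arbitrary illustrative choice of f(x)

claimed = sp.diff(sp.log(f), x)     # d/dx ln(f(x)) = 2*x/(x**2 + 1), not 1/(x**2 + 1)
actual = sp.integrate(1 / f, x)     # the real antiderivative: atan(x)

print(claimed)
print(sp.diff(actual, x))           # 1/(x**2 + 1), as it should be
```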

At bottom, all of these common misconceptions arise from not understanding that the symbols are supposed to have a meaning, and that the manipulations are not simply mindless rules. The first misconception you mention comes from not understanding what the decimal expansion of a number means (it describes the coefficients of a series, and it represents the number that is the limit of the partial sums). The second comes from mindless manipulation (as does the "every function is linear" problem).
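
For the decimal-expansion point, it can help to compute the partial sums explicitly. A minimal sketch, using $0.999\ldots = \sum_{k\ge 1} 9\cdot 10^{-k}$ (the example that comes up further down) and exact rational arithmetic:

```python
from fractions import Fraction

s = Fraction(0)
for k in range(1, 11):
    s += Fraction(9, 10 ** k)       # partial sum 0.99...9 with k nines
    print(k, s, 1 - s)              # the gap to 1 is exactly 10**(-k)

# every partial sum is strictly less than 1, but the number the expansion
# denotes is the limit of these partial sums, and that limit is 1
```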

Others arise because students try to memorize without understanding, and there is too much to memorize, much of it very similar to one another, e.g. (you'll notice a theme, but that's because I'm teaching series right now, so these are fresh):

  • The Divergence Test tells you that a series converges if the terms go to zero, and diverges if they don't (the harmonic series, in the sketch after this list, shows why the first half fails).

  • The Integral Test gives you the value of the series.

  • In the limit comparison test, you have to see if the limit is greater than $1$ or smaller than $1$.
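
The harmonic series is the standard counterexample to the first item: its terms $1/n$ go to zero, yet the partial sums grow without bound (roughly like $\ln n$). A crude numerical illustration in Python:

```python
partial = 0.0
for n in range(1, 10 ** 6 + 1):
    partial += 1.0 / n                      # the terms 1/n tend to 0 ...
    if n in (10, 10 ** 2, 10 ** 4, 10 ** 6):
        print(n, partial)                   # ... but the partial sums keep growing (~ ln n)
```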

I'm not sure what "the best way to combat them" is. After so many years, the Freshman Dream is alive and well, as are misconceptions about the nature of decimal expansions.

(I would say that your second example is not a "common misconception", but rather a fallacy that many people have a hard time spotting; it's not that people actually think that $1=2$, whereas they do actually think that $1$ and $0.999999\ldots$ are different, or that you can talk about "an infinite number of $9$s, and then a $0$" in a decimal expansion.)

See also http://www.math.vanderbilt.edu/~schectex/commerrs/


For an $m \times n$ matrix, $$\sum_{i=1}^{m} \sum_{j=1}^{n} a_{ij} = \sum_{j=1}^{n} \sum_{i=1}^{m} a_{ij}.$$

Thus (so the reasoning goes), $$\sum_{i=1}^{\infty} \sum_{j=1}^{\infty} a_{ij} = \sum_{j=1}^{\infty} \sum_{i=1}^{\infty} a_{ij},$$

or

$$\int \int f(x,y) \ dx \ dy = \int \int f(x,y) \ dy \ dx$$ always.
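
Both infinite versions fail without extra hypotheses (absolute convergence for the sums; Fubini/Tonelli-type conditions for the integral). Here is a sketch of two classical counterexamples, chosen purely for illustration: $a_{ij}=1$ if $j=i$, $-1$ if $j=i+1$, $0$ otherwise, and $f(x,y)=(x-y)/(x+y)^3$ on $[0,1]^2$:

```python
import sympy as sp

# Iterated sums need not commute: a_ij = 1 if j == i, -1 if j == i + 1, else 0.
def a(i, j):
    if j == i:
        return 1
    if j == i + 1:
        return -1
    return 0

N = 200  # the truncation is exact: every nonzero entry of each relevant row/column is captured
rows_first = sum(sum(a(i, j) for j in range(1, N + 2)) for i in range(1, N + 1))  # every row sums to 0
cols_first = sum(sum(a(i, j) for i in range(1, N + 1)) for j in range(1, N + 1))  # only column 1 is nonzero
print(rows_first, cols_first)                  # 0 vs 1

# Iterated integrals need not commute either (classical Fubini counterexample).
x, y = sp.symbols('x y', positive=True)
f = (x - y) / (x + y) ** 3
print(sp.integrate(f, (y, 0, 1), (x, 0, 1)))   # dy first, then dx:  1/2
print(sp.integrate(f, (x, 0, 1), (y, 0, 1)))   # dx first, then dy: -1/2
```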


$\text{Prob}(A \cap B) = \text{Prob}(A) \text{Prob}(B),$

which only holds when $A$ and $B$ are independent. For example, if you roll a fair die, let $A$ be the event that you roll a number up to $3$, and let $B$ be the event that you roll an even number. $\text{Prob}(A \cap B) = 1/6$, while $\text{Prob}(A) \text{Prob}(B)=1/4$.
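
This can be checked by enumerating the six equally likely outcomes; a small Python sketch using exact fractions:

```python
from fractions import Fraction

die = set(range(1, 7))                 # a fair die: six equally likely outcomes
A = {1, 2, 3}                          # "a number up to 3"
B = {2, 4, 6}                          # "an even number"

def prob(event):
    return Fraction(len(event & die), len(die))

print(prob(A & B))                     # 1/6
print(prob(A) * prob(B))               # 1/4, so A and B are not independent
```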