Why should I always enable compiler warnings?
C is, famously, a rather low-level language as HLLs go. C++, though it might seem to be a considerably higher-level language than C, still shares a number of its traits. And one of those traits is that the languages were designed by programmers, for programmers — and, specifically, programmers who knew what they were doing.
(For the rest of this answer I'm going to focus on C. Most of what I'll say also applies to C++, though perhaps not as strongly. As Bjarne Stroustrup has famously said, "C makes it easy to shoot yourself in the foot; C++ makes it harder, but when you do it blows your whole leg off.")
If you know what you are doing — really know what you are doing — sometimes you may have to "break the rules". But most of the time, most of us will agree that well-intentioned rules keep us all out of trouble, and that wantonly breaking those rules all the time is a bad idea.
But in C and C++, there is a surprisingly large number of things you can do that are "bad ideas", but which aren't formally "against the rules". Sometimes they're a bad idea some of the time (but might be defensible other times); sometimes they're a bad idea virtually all of the time. But the tradition has always been not to warn about these things — because, again, the assumption is that programmers know what they are doing, they wouldn't be doing these things without a good reason, and they'd be annoyed by a bunch of unnecessary warnings.
But of course not all programmers really know what they're doing. And, in particular, every C programmer (no matter how experienced) goes through a phase of being a beginning C programmer. And even experienced C programmers can get careless and make mistakes.
Finally, experience has shown not only that programmers do make mistakes, but that these mistakes can have real, serious consequences. If you make a mistake, and the compiler doesn't warn you about it, and somehow the program doesn't immediately crash or do something obviously wrong because of it, the mistake can lurk there, hidden, sometimes for years, until it causes a really big problem.
So it turns out that, most of the time, warnings are a good idea, after all. Even the experienced programmers have learned that (actually, it's "especially the experienced programmers have learned that"), on balance, the warnings tend to do more good than harm. For every time you did something wrong deliberately and the warning was a nuisance, there are probably at least ten times you did something wrong by accident and the warning saved you from further trouble. And most warnings can be disabled or worked around for those few times when you really want to do the "wrong" thing.
(A classic example of such a "mistake" is the test `if(a = b)`. Most of the time, this is truly a mistake, so most compilers these days warn about it — some even by default. But if you really want to both assign `b` to `a` and test the result, you can disable the warning by typing `if((a = b))`.)
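To make that concrete, here is a minimal C sketch (the file name is hypothetical; GCC or Clang with `-Wall` assumed) showing both the warning and the conventional way to silence it:

```c
/* assign.c, compile with: gcc -Wall assign.c */
#include <stdio.h>

int main(void) {
    int a = 0, b = 5;

    if (a = b)        /* -Wparentheses (enabled by -Wall): assignment used as a truth value */
        puts("always taken: a is now 5");

    if ((a = b))      /* the extra parentheses say "yes, I meant to assign" */
        puts("same effect, but no warning");

    return 0;
}
```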
The second question is, why would you want to ask the compiler to treat warnings as errors? I'd say it's because of human nature, specifically, the all-too-easy reaction of saying "Oh, that's just a warning, that's not so important, I'll clean that up later." But if you're a procrastinator (and I don't know about you, but I'm a world-class procrastinator) it's easy to put off the necessary cleanup for basically ever — and if you get into the habit of ignoring warnings, it gets easier and easier to miss an important warning message that's sitting there, unnoticed, in the midst of all the ones you're relentlessly ignoring.
So asking the compiler to treat warnings as errors is a little trick you can play on yourself to get around this human foible, to force yourself to fix the warnings today, because otherwise your program won't compile.
Personally, I'm not as insistent about treating warnings as errors — in fact, if I'm honest, I can say that I don't tend to enable that option in my "personal" programming. But you can be sure I've got that option enabled at work, where our style guide (which I wrote) mandates its use. And I would say — I suspect most professional programmers would say — that any shop that doesn't treat warnings as errors in C is behaving irresponsibly, is not adhering to commonly-accepted industry best practices.
Warnings consist of the best advice some of the most skilled C++ developers could bake into an application. They're worth keeping around.
C++, being a Turing complete language, has plenty of cases where the compiler must simply trust that you know what you are doing. However, there are many cases where the compiler can realize that you probably did not intend to write what you wrote. A classic example is `printf()` format specifiers that don't match the arguments, or a `std::string` passed to `printf` (not that that ever happens to me!). In these cases, the code you wrote is not an error. It is a valid C++ expression with a valid interpretation for the compiler to act on. But the compiler has a strong hunch that you simply overlooked something which is easy for a modern compiler to detect. These are warnings. They are things that are obvious to a compiler, using all the strict rules of C++ at its disposal, that you might have overlooked.
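To illustrate with plain C (the `std::string` case is the C++ analogue), here is a small sketch; the file name is hypothetical, and the diagnostics come from the `-Wformat` checks that `-Wall` enables in GCC and Clang:

```c
/* format.c, compile with: gcc -Wall format.c */
#include <stdio.h>

int main(void) {
    long big = 1234567890L;

    printf("%d\n", big);    /* -Wformat: %d expects int, but the argument is long */
    printf("%s\n", 42);     /* -Wformat: %s expects char *, but the argument is int */

    return 0;
}
```

Both calls are valid as far as the grammar is concerned, and both are undefined behaviour at run time; the warning is the only thing standing between you and a crash.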
Turning warnings off, or ignoring them, is like choosing to ignore free advice from those more skilled than you. It's a lesson in hubris that ends either when you fly too close to the sun and your wings melt, or when a memory corruption error occurs. Between the two, I'll take falling from the sky any day!
"Treat warnings as errors" is the extreme version of this philosophy. The idea here is that you resolve every warning the compiler gives you -- you listen to every bit of free advice and act on it. Whether this is a good model for development for you depends on the team and what kind of product you are working on. It's the ascetic approach that a monk might have. For some, it works great. For others, it does not.
On many of my applications we do not treat warnings as errors. We do this because these particular applications need to compile on several platforms with several compilers of varying ages. Sometimes we find it is actually impossible to fix a warning on one side without it turning into a warning on another platform. So we are merely careful. We respect warnings, but we don't bend over backwards for them.
Why should I enable warnings?
C and C++ compilers are notoriously bad at reporting some common programmer mistakes by default, such as:
- forgetting to initialise a variable
- forgetting to `return` a value from a function
- arguments in the `printf` and `scanf` families not matching the format string
- a function being used without being declared beforehand (C only)
These can be detected and reported, just usually not by default; this feature must be explicitly requested via compiler options.
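For instance, a sketch like the following (hypothetical file name, GCC assumed) contains two of the mistakes above; without warning options the missing `return` in particular goes unreported:

```c
/* mistakes.c, compile with: gcc -Wall mistakes.c */
#include <stdio.h>

int answer(int flag) {
    if (flag)
        return 42;
    /* -Wreturn-type (part of -Wall): control reaches end of non-void function */
}

int main(void) {
    /* atoi is called without a declaration in scope (no <stdlib.h>):
       GCC diagnoses this in C99-and-later modes, and C23 makes it an error */
    printf("%d %d\n", answer(1), atoi("7"));
    return 0;
}
```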
How can I enable warnings?
This depends on your compiler.
Microsoft C and C++ compilers understand switches like `/W1`, `/W2`, `/W3`, `/W4` and `/Wall`. Use at least `/W3`. `/W4` and `/Wall` may emit spurious warnings for system header files, but if your project compiles cleanly with one of these options, go for it. These options are mutually exclusive.
Most other compilers understand options like `-Wall`, `-Wpedantic` and `-Wextra`. `-Wall` is essential and all the rest are recommended (note that, despite its name, `-Wall` only enables the most important warnings, not all of them). These options can be used separately or all together.
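As a sketch of what `-Wextra` adds on top of `-Wall` (hypothetical file name, GCC assumed; in C, the signed/unsigned comparison check comes in via `-Wextra`):

```c
/* extra.c, compile with: gcc -Wall -Wextra extra.c */
#include <stdio.h>

int main(void) {
    unsigned int count = 10;

    /* -Wsign-compare (via -Wextra in C): i is converted to unsigned for the
       comparison, so -1 becomes a huge value and the loop body never runs */
    for (int i = -1; i < count; i++)
        printf("%d\n", i);

    return 0;
}
```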
Your IDE may have a way to enable these from the user interface.
Why should I treat warnings as errors? They are just warnings!
A compiler warning signals a potentially serious problem in your code. The problems listed above are almost always fatal; others may or may not be, but you want compilation to fail even if it turns out to be a false alarm. Investigate each warning, find the root cause, and fix it. In the case of a false alarm, work around it — that is, use a different language feature or construct so that the warning is no longer triggered. If this proves to be very hard, disable that particular warning on a case by case basis.
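For that case-by-case suppression, GCC and Clang both honour diagnostic pragmas (MSVC has `#pragma warning` for the same job). A sketch, assuming `-Wfloat-equal` has been enabled and the exact comparison below has been vetted as intentional:

```c
/* the exact floating-point comparison is intended here, so the warning
   is silenced for this one spot only and restored immediately after */
double safe_inverse(double x) {
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wfloat-equal"
    if (x == 0.0)
        return 0.0;
#pragma GCC diagnostic pop
    return 1.0 / x;
}
```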
You don't want to just leave warnings as warnings even if all of them are false alarms. It could be OK for very small projects where the total number of warnings emitted is less than 7. Anything more, and it's easy for a new warning to get lost in a flood of old familiar ones. Don't allow that. Just make your whole project compile cleanly.
Note this applies to program development. If you are releasing your project to the world in source form, then it might be a good idea not to supply `-Werror` or equivalent in your released build script. People might try to build your project with a different version of the compiler, or with a different compiler altogether, which may have a different set of warnings enabled. You may want their build to succeed. It is still a good idea to keep the warnings enabled, so that people who see warning messages can send you bug reports or patches.
How can I treat warnings as errors?
This is again done with compiler switches. `/WX` is for Microsoft; most others use `-Werror`. In either case, the compilation will fail if there are any warnings produced.
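A minimal sketch of the effect (hypothetical file name; the GCC invocation is shown, with the MSVC equivalent in the comment):

```c
/* werror.c
   gcc -Wall -Werror werror.c        (MSVC: cl /W4 /WX werror.c)
   -Werror/WX promotes the unused-variable warning below to an error,
   so the build fails; drop the switch and the same file compiles. */
int main(void) {
    int unused;    /* -Wunused-variable, part of -Wall */
    return 0;
}
```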
Is this enough?
Probably not! As you crank up your optimisation level, the compiler starts looking at the code more and more closely, and this closer scrutiny may reveal more mistakes. Thus, do not be content with the warning switches by themselves; always use them when compiling with optimisations enabled (`-O2` or `-O3`, or `/O2` if using MSVC).
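As a sketch of why this matters (hypothetical file name, GCC assumed): `-Wmaybe-uninitialized`, which `-Wall` enables, depends on data-flow analysis that GCC only performs when optimising, so the same file is typically silent at `-O0` yet warns at `-O2`:

```c
/* uninit.c, compile with: gcc -Wall -O2 uninit.c */
#include <stdio.h>

int main(void) {
    int x;                 /* initialised on only one branch */

    if (getchar() == 'y')
        x = 1;

    printf("%d\n", x);     /* -Wmaybe-uninitialized fires here, but only when
                              optimisation lets the compiler trace both paths */
    return 0;
}
```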