Which is the better option for dividing an integer by 2: x >> 1 or x / 2?
Does the first one look like dividing? No. If you want to divide, use x / 2. The compiler can optimise it to use a bit-shift where possible (this is called strength reduction), which makes it a useless micro-optimisation if you do it on your own.
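As a minimal, hypothetical sketch (not part of the original answer), this is the kind of function a compiler will strength-reduce. The exact output depends on the compiler and target, but with optimisations enabled (e.g. -O2), mainstream compilers such as GCC and Clang typically emit a single shift for the unsigned case and a shift plus a small fix-up for the signed case:

```c
/* Illustrative helpers; compile with optimisations and inspect the
 * generated assembly to see the division replaced by shifts. */
unsigned half_unsigned(unsigned x)
{
    return x / 2;   /* typically becomes a single logical right shift */
}

int half_signed(int x)
{
    return x / 2;   /* typically a shift plus a correction so that -5 / 2 == -2 */
}
```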
To pile on: there are so many reasons to favor using x = x / 2;
Here are some:
- it expresses your intent more clearly (assuming you're not twiddling register bits or doing something similarly bit-oriented)
- the compiler will reduce this to a shift operation anyway
- even if the compiler didn't reduce it and chose a slower operation than the shift, the likelihood that this ends up affecting your program's performance in a measurable way is vanishingly small (and if it does affect it measurably, then you have an actual reason to use a shift)
- if the division is going to be part of a larger expression, you're more likely to get the precedence right if you use the division operator (see the sketch after this list):

    x = x / 2 + 5;
    x = x >> 1 + 5;  // not the same as above: + binds more tightly, so this is x >> (1 + 5)

- signed arithmetic might complicate things even more than the precedence problem mentioned above
- to reiterate: the compiler will already do this for you anyway. In fact, it'll convert division by a constant into a series of shifts, adds, and multiplies for all sorts of numbers, not just powers of two. See this question for links to even more information about this.
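As referenced in the precedence item above, here is a minimal, hypothetical sketch (not from the original answer) showing how the two expressions diverge, because + binds more tightly than >> in C:

```c
#include <stdio.h>

int main(void)
{
    int x = 16;

    /* Division: parsed as (x / 2) + 5 */
    int with_division = x / 2 + 5;   /* 16 / 2 + 5 == 13 */

    /* Shift: + binds more tightly than >>, so this is x >> (1 + 5) */
    int with_shift = x >> 1 + 5;     /* 16 >> 6 == 0 */

    printf("x / 2 + 5  = %d\n", with_division);
    printf("x >> 1 + 5 = %d\n", with_shift);
    return 0;
}
```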
In short, you buy nothing by coding a shift when you really mean to multiply or divide, except maybe an increased possibility of introducing a bug. Compilers have been smart enough to optimize this kind of thing into a shift, when appropriate, for a very long time.
Use the operation that best describes what you are trying to do.
- If you are treating the number as a sequence of bits, use bitshift.
- If you are treating it as a numerical value, use division.
Note that they are not exactly equivalent. They can give different results for negative integers. For example:
-5 / 2 = -2
-5 >> 1 = -3
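To make the difference concrete, here is a small, hypothetical C program (not from the original answer) you can run. Note that right-shifting a negative value is implementation-defined in C; on the common two's-complement platforms that use an arithmetic shift, it rounds toward negative infinity, while integer division truncates toward zero:

```c
#include <stdio.h>

int main(void)
{
    int x = -5;

    /* Integer division truncates toward zero: -5 / 2 == -2 */
    printf("-5 / 2  = %d\n", x / 2);

    /* Right shift of a negative value is implementation-defined in C;
     * on typical arithmetic-shift implementations it rounds toward
     * negative infinity: -5 >> 1 == -3 */
    printf("-5 >> 1 = %d\n", x >> 1);

    return 0;
}
```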