Converting bytes to megabytes

Divide by 2 to the power of 20, since (1024 * 1024) bytes = 1 megabyte:

1024 * 1024 = 1,048,576
2^20 = 1,048,576
1,048,576 / 1,048,576 = 1

They are the same thing.
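
To make the arithmetic concrete, here is a minimal Python sketch; the function name bytes_to_megabytes is just an illustration, not something from the original answer:

    def bytes_to_megabytes(num_bytes):
        """Convert a byte count to megabytes using the binary definition (2^20)."""
        return num_bytes / (1024 * 1024)  # same as num_bytes / 2**20

    print(bytes_to_megabytes(1_048_576))  # 1.0
    print(bytes_to_megabytes(5_242_880))  # 5.0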


Traditionally, by megabyte we mean your second option: 1 megabyte = 2^20 bytes. Strictly speaking, that is not correct, because mega means 1,000,000. There is a newer standard name for 2^20 bytes, the mebibyte (http://en.wikipedia.org/wiki/Mebibyte), and it is gaining popularity.


There's an IEC standard that distinguishes the terms, e.g. Mebibyte = 1024^2 bytes but Megabyte = 1000^2 bytes (to be consistent with SI prefixes, as in kilograms, where k/M/... means 1000/1,000,000). In practice, most people in IT prefer Megabyte = 1024^2, while hard disk manufacturers prefer Megabyte = 1000^2 (because it makes disk sizes sound bigger than they are).
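
The gap between the two definitions grows with size. Here is a small sketch of how a "500 GB" drive looks under each convention (the variable names are illustrative):

    ADVERTISED_BYTES = 500 * 1000**3  # a "500 GB" drive, decimal (SI) gigabytes

    decimal_gb = ADVERTISED_BYTES / 1000**3  # 500.0 GB (manufacturer's figure)
    binary_gib = ADVERTISED_BYTES / 1024**3  # ~465.66 GiB (what the OS often reports)

    print(f"{decimal_gb:.2f} GB == {binary_gib:.2f} GiB")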

As a matter of fact, most people mix up the SI/IEC meaning (multiplier 1000) and the traditional meaning (multiplier 1024), so in general you shouldn't make assumptions about what people mean. For example, 128 kbit/s for MP3s usually means 128,000 bits per second, because the multiplier 1000 is customary with bits. But people will then call 2048 kbit/s equal to 2 Mbit/s, which mixes the two multipliers - confusing, eh?
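
A quick sketch of why that mixing is confusing: the same "2 Mbit/s" label maps to three different bit rates depending on which multipliers you assume:

    # Three readings of "2 Mbit/s", depending on the multipliers assumed:
    pure_si     = 2 * 1000 * 1000  # 2,000,000 bit/s (SI throughout)
    pure_binary = 2 * 1024 * 1024  # 2,097,152 bit/s (binary throughout)
    mixed       = 2 * 1024 * 1000  # 2,048,000 bit/s (2048 kbit/s with kbit = 1000 bit)

    print(pure_si, pure_binary, mixed)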

So as a general rule, don't trust bit/byte units at all ;)


BTW: Hard drive manufacturers don't count as authorities on this one!

Oh, yes they do (and the definition they take from the SI is the correct one). On a related issue, see this post on CodingHorror.