Java: Subtract '0' from char to get an int... why does this work?
chars are converted to ints implicitly:
public class Main {
    public static void main(String[] args) {
        String bar = "abc";
        int foo = bar.charAt(1) - '0'; // 'b' (98) - '0' (48) = 50
        int foob = bar.charAt(1);      // 'b' widens to its code point, 98
        System.err.println("foo = " + foo + " foob = " + foob);
    }
}
output: foo = 50 foob = 98
Maybe you declared int foo twice, and that is why it didn't work?
This is an old ASCII trick which works for any encoding that arranges the digits '0' through '9' sequentially, starting at '0'. In ASCII, '0' is a character with value 0x30 and '9' is 0x39. Basically, if you have a character that is a digit, subtracting '0' "converts" it to its numeric value.
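To make the trick concrete, here is a minimal sketch that parses a whole string of digits by subtracting '0' from each character (the class and variable names are illustrative, not from the original post):

public class DigitTrick {
    public static void main(String[] args) {
        String digits = "2024";
        int total = 0;
        for (int i = 0; i < digits.length(); i++) {
            // Works because '0' through '9' are consecutive code points:
            // e.g. '2' - '0' == 0x32 - 0x30 == 2
            total = total * 10 + (digits.charAt(i) - '0');
        }
        System.out.println(total); // prints 2024
    }
}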
I have to disagree with @Lukas Eder and suggest that it is a terrible trick, because the intent of the operation is far from obvious when reading the code.
If you are using Java and have a String that contains digits, and you want to convert that String to an int, I suggest you use Integer.parseInt(yourString).
This technique has the benefit of being obvious to the future maintenance programmer.
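For example (the class and variable names here are illustrative):

public class ParseExample {
    public static void main(String[] args) {
        String yourString = "123";
        // Readable, handles multi-digit numbers, and fails loudly
        // with a NumberFormatException on non-numeric input:
        int value = Integer.parseInt(yourString);
        System.out.println(value); // prints 123
    }
}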
'0' is a char too. It turns out that characters in Java have Unicode (UTF-16) values. When you use the - operator with characters, Java performs the operation on their integer values.
For instance: int x = 'A' - '0'; // x = 17, since 'A' is 65 and '0' is 48
That's a clever trick. chars are actually the same width as shorts (16 bits), though char is unsigned. When you have a char that represents an ASCII/Unicode digit (like '1') and you subtract the smallest digit character from it (i.e. '0'), you are left with the digit's numeric value (hence, 1).

Because char is 16 bits wide (effectively an unsigned short), you can safely cast it to an int, and the widening conversion is done automatically whenever arithmetic is involved.
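A small sketch of that automatic widening (the class name is illustrative):

public class Widening {
    public static void main(String[] args) {
        char c = '5';
        // The - operator promotes both char operands to int,
        // so no explicit cast is needed:
        int digit = c - '0';
        System.out.println(digit); // prints 5
        // An explicit widening cast is also always safe:
        int codePoint = (int) c;
        System.out.println(codePoint); // prints 53
    }
}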