"Double casting"
It is true that double casting would under most circumstances cause a ClassCastException, but there are a few special cases. As mentioned in other answers, it can be useful for dealing with primitives and generics, but it can also be used when the compiled class doesn't necessarily represent the runtime class. This is especially true when ASM bytecode class transformers are involved.
As an example, I will use the Mixin framework. The following code will be injected into the Foo class at runtime; I won't get into the specifics of how this works too much.
@Mixin(Foo.class)
public class MixinFoo {
    public Foo bar() {
        return (Foo) (Object) this;
    }
}
The compile-time type of this is MixinFoo, but the method gets inserted into Foo at runtime, so the runtime type is actually Foo. The compiler doesn't know that, so we still need to cast this to Foo, yet a direct cast from MixinFoo to Foo is itself a compiler error, because the two types are unrelated.
Casting to Object first and then to Foo (double casting) works around the compiler's objection. Just remember that if the method doesn't get applied at runtime, this WILL fail with a ClassCastException.
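To illustrate the compiler behaviour in isolation, here is a throwaway sketch with two unrelated classes (plain Java, no Mixin framework involved):

class Foo { }

class MixinFoo {
    Foo bar() {
        // return (Foo) this;        // compile error: MixinFoo and Foo are unrelated types
        return (Foo) (Object) this;  // compiles; throws ClassCastException at runtime
                                     // unless a bytecode transformer has made this a Foo
    }
}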
Of course, this is well beyond the skill level of intro Java.
Whilst "double casting" certainly isn't a common term and you shouldn't seem any sort of casting of references much, you ought to know what happens (a ClassCastException
).
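For example (a throwaway sketch; the variable names are just for illustration):

Object o = "hello";
Integer n = (Integer) o;              // compiles, but throws ClassCastException at runtime
Integer m = (Integer) (Object) "hi";  // the detour through Object doesn't help; still a CCE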
For completeness, there are some cases where it wouldn't CCE (rough sketches below):
- If the value is actually null.
- Stuff involving primitives, e.g. Object to Integer to int (unboxing), or int to byte (lossy) to char (positive).
- Changing generic type arguments: List<String> to Object to List<Integer>.
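Here are rough sketches of those three cases; the class and variable names are made up for illustration, and the generics case needs the usual unchecked-cast caveat:

import java.util.ArrayList;
import java.util.List;

public class DoubleCastCases {
    @SuppressWarnings("unchecked")
    public static void main(String[] args) {
        // 1. null survives any chain of reference casts without a CCE
        Integer nothing = null;
        String s = (String) (Object) nothing;   // a direct (String) cast wouldn't even compile

        // 2. primitives: no ClassCastException possible, only unboxing and lossy conversions
        Object boxed = 42;
        int i = (int) (Integer) boxed;          // Object -> Integer -> int (unboxing)
        char c = (char) (byte) 300;             // int -> byte (lossy: 44) -> char (',')

        // 3. generics: erasure means the cast through Object isn't checked at runtime
        List<String> strings = new ArrayList<>();
        List<Integer> ints = (List<Integer>) (Object) strings;  // no CCE here
        ints.add(1);                             // any CCE happens later, when reading back
    }
}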
Yeah, I'm pretty sure that's not a thing. There's no reason that double casting would ever be necessary - it's possible it might get rid of a compile warning about unsafe casting (in which case you're probably doing it wrong), but otherwise that's just not right.
I mean, there's automatic toString calling, e.g. println("" + i), but even then you don't need to cast to an Object first...
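Something like this (trivial sketch, assuming a local int i):

int i = 42;
System.out.println("" + i);             // the int is converted automatically, no (Object) cast needed
System.out.println(String.valueOf(i));  // the explicit way to spell the same conversion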
Edit: After reading Tom's answer I'm suddenly unsure about this answer - primitives and (particularly) generics can actually use this. I don't have the ability to test anything right now, but anyone reading this answer should definitely take a look at his (and probably upvote it).
I'm going to stick to the line that there are no (or at least extremely few and far between) good reasons to do this, however, and the provided example certainly has nothing to do with it.