Consider the following C# console application.
using System;

class Program
{
    enum A { A42 = 42 }

    static void Main()
    {
        object obj = 42;
        Console.WriteLine(obj.GetType().Name); // "Int32"
        Console.WriteLine(obj is int);         // "True"
        Console.WriteLine(obj is A);           // "False"
        A a = (A)obj;
        Console.WriteLine(a);                  // "A42"
    }
}
It compiles to the following:
.class private auto ansi beforefieldinit Program
       extends [mscorlib]System.Object
{
  .class auto ansi sealed nested private A
         extends [mscorlib]System.Enum
  {
    .field public specialname rtspecialname int32 value__
    .field public static literal valuetype Program/A A42 = int32(0x0000002A)
  } // end of class A

  .method private hidebysig static void Main() cil managed
  {
    .entrypoint
    // Code size       71 (0x47)
    .maxstack  2
    .locals init ([0] object obj,
                  [1] valuetype Program/A a)
    IL_0000:  ldc.i4.s   42
    IL_0002:  box        [mscorlib]System.Int32
    IL_0007:  stloc.0
    IL_0008:  ldloc.0
    IL_0009:  callvirt   instance class [mscorlib]System.Type [mscorlib]System.Object::GetType()
    IL_000e:  callvirt   instance string [mscorlib]System.Reflection.MemberInfo::get_Name()
    IL_0013:  call       void [mscorlib]System.Console::WriteLine(string)
    IL_0018:  ldloc.0
    IL_0019:  isinst     [mscorlib]System.Int32
    IL_001e:  ldnull
    IL_001f:  cgt.un
    IL_0021:  call       void [mscorlib]System.Console::WriteLine(bool)
    IL_0026:  ldloc.0
    IL_0027:  isinst     Program/A
    IL_002c:  ldnull
    IL_002d:  cgt.un
    IL_002f:  call       void [mscorlib]System.Console::WriteLine(bool)
    IL_0034:  ldloc.0
    IL_0035:  unbox.any  Program/A
    IL_003a:  stloc.1
    IL_003b:  ldloc.1
    IL_003c:  box        Program/A
    IL_0041:  call       void [mscorlib]System.Console::WriteLine(object)
    IL_0046:  ret
  } // end of method Program::Main

  .method public hidebysig specialname rtspecialname
          instance void .ctor() cil managed
  {
    // Code size       7 (0x7)
    .maxstack  8
    IL_0000:  ldarg.0
    IL_0001:  call       instance void [mscorlib]System.Object::.ctor()
    IL_0006:  ret
  } // end of method Program::.ctor
} // end of class Program

Although obj refers to a boxed Int32 rather than a boxed A, the unboxing conversion (A)obj succeeds at run time. This behavior is specified in Standard ECMA-335 "Common Language Infrastructure (CLI)", 6th Edition:
- §I.8.7 ("Assignment compatibility"): "The underlying type of a type T is the following: 1. If T is an enumeration type, then its underlying type is the underlying type declared in the enumeration's definition. 2. Otherwise, the underlying type is itself." Thus, the underlying type of both A and int32 is int32.
- §I.8.7: "The reduced type of a type T is the following: 1. If the underlying type of T is: […] c. int32, or unsigned int32, then its reduced type is int32." Thus, the reduced type of both A and int32 is int32.
- §I.8.7: "The verification type (§III.1.8.1.2.1) of a type T is the following: 1. If the reduced type of T is: […] c. int32 then its verification type is int32." Thus, the verification type of both A and int32 is int32.
- §III.1.8.1.2.3 ("Verification type compatibility"): "A type Q is verifier-assignable-to R (sometimes written R := Q) if and only if T is the verification type of Q, and U is the verification type of R, and at least one of the following holds: 1. T is identical to U." Thus, A is verifier-assignable to int32 and vice versa.
- §III.4.33 ("unbox.any – convert boxed value type to value"): "System.InvalidCastException is thrown if obj is not a boxed value type, typeTok is a Nullable<T> and obj is not a boxed T, or if the type of the value contained in obj is not verifier-assignable-to (§III.1.8.1.2.3) typeTok." Thus, InvalidCastException is not thrown.
However, if I change the initialization to "object obj = 42U;", then Microsoft .NET Framework 4.5.1 actually throws InvalidCastException. Why does that happen? If I understand correctly, the reduced type and the verification type of unsigned int32 are int32 as well, so the unbox.any instruction should not throw InvalidCastException in this case either.
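For reference, here is a minimal repro of the failing case as I understand it (the class name is mine); the InvalidCastException is what I observe on .NET Framework 4.5.1:

using System;

class UnsignedUnboxRepro
{
    enum A { A42 = 42 }

    static void Main()
    {
        object obj = 42U;                      // boxed UInt32 instead of Int32
        Console.WriteLine(obj.GetType().Name); // "UInt32"

        // This compiles to the same kind of unbox.any instruction with
        // typeTok = A, but with a boxed UInt32 it throws InvalidCastException
        // on .NET Framework 4.5.1, even though §I.8.7 gives unsigned int32
        // the same reduced type (int32) as int32 and A.
        A a = (A)obj;
        Console.WriteLine(a);
    }
}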