What is the difference between int, Int16, Int32 and Int64?

Each integer type has a different size and therefore a different range of values it can store:

   Type      Size      Range

   Int16     16-bit    -32,768 to +32,767
   Int32     32-bit    -2,147,483,648 to +2,147,483,647
   Int64     64-bit    -9,223,372,036,854,775,808 to +9,223,372,036,854,775,807
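As a quick sanity check, here is a minimal C# console sketch (the class name is illustrative) that prints these bounds using the MinValue and MaxValue constants each type exposes:

```csharp
using System;

class RangeDemo
{
    static void Main()
    {
        // Each fixed-size integer type exposes its bounds as constants.
        Console.WriteLine($"Int16: {Int16.MinValue} to {Int16.MaxValue}");
        Console.WriteLine($"Int32: {Int32.MinValue} to {Int32.MaxValue}");
        Console.WriteLine($"Int64: {Int64.MinValue} to {Int64.MaxValue}");
    }
}
```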

As stated by James Sutherland in his answer:

int and Int32 are indeed synonymous; int will be a little more familiar looking, Int32 makes the 32-bitness more explicit to those reading your code. I would be inclined to use int where I just need ‘an integer’, Int32 where the size is important (cryptographic code, structures) so future maintainers will know it’s safe to enlarge an int if appropriate, but should take care changing Int32 variables in the same way.

The compiled code will be identical either way: the difference is purely one of readability.
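To see that int really is just an alias for System.Int32, a small sketch (names are illustrative) can compare the two at runtime:

```csharp
using System;

class AliasDemo
{
    static void Main()
    {
        int a = 42;
        Int32 b = a;  // no conversion needed: int and Int32 are the same type

        // Both report the same runtime type, System.Int32.
        Console.WriteLine(a.GetType());                   // System.Int32
        Console.WriteLine(b.GetType());                   // System.Int32
        Console.WriteLine(typeof(int) == typeof(Int32));  // True
    }
}
```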
