
casting a char to an int

 
Ranch Hand
Posts: 101
I've read in a couple of my books that you can cast a char to an int, but I've seen no examples and I'm having a tough time understanding this concept. Would it be like this?
(int)c
And then how is c represented by an integer value? I mean, c is a letter, not a number. I'm confused here. Thanks!
 
Sheriff
Posts: 6920
The integer value will be the Unicode value of the character. Look up Unicode somewhere to find a definition of the possible values.
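For instance, a minimal sketch of the cast (the class and variable names are just illustrative):

    public class CharToInt {
        public static void main(String[] args) {
            char c = 'A';
            int code = (int) c;       // explicit cast; the result is the char's Unicode value
            System.out.println(code); // prints 65
        }
    }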
 
Greenhorn
Posts: 12
Hi,
An explicit cast is not actually necessary in this case. A char is 16 bits and an int is 32 bits, so an int can always hold a char's value and the conversion happens implicitly; the cast
(int) c
is redundant. But if you want to store the value of an int in a char, you do need an explicit cast:
c = (char) i; // if i is an int and c is a char
Also note that when converting a char to an int the result is always non-negative, because char is unsigned.
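A short sketch showing both directions (names are illustrative):

    public class CharIntConversions {
        public static void main(String[] args) {
            char c = 'A';
            int i = c;                // widening: no cast needed, a char always fits in an int
            char back = (char) i;     // narrowing: an explicit cast is required
            System.out.println(i);    // prints 65
            System.out.println(back); // prints A
        }
    }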
 
tyler jones
Ranch Hand
Posts: 101
Thanks guys!
 
Greenhorn
Posts: 11
I am still confused by this. Example:
String strA = "A";
char charA = strA.charAt(0);
int intA = charA;
When printing the value of intA you get 65, which is the ASCII decimal value of "A". Why isn't the value of intA a Unicode value?
 
Ranch Hand
Posts: 193
The first 128 Unicode characters are the same as the corresponding ASCII characters, so ASCII 65 == Unicode 65 == 'A'.
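You can check this overlap directly in Java (a quick sketch, class name is illustrative):

    public class AsciiOverlap {
        public static void main(String[] args) {
            System.out.println('A' == '\u0041'); // true: 'A' is Unicode code point U+0041
            System.out.println((int) 'A');       // prints 65, the same value as ASCII 'A'
        }
    }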
 
Greenhorn
Posts: 7
According to the Code Charts on the Unicode homepage, 0065 is the Unicode value for "e". The value for "A" is 0041 and the value for "a" is 0061.
So I'm just as confused...
 
Sheriff
Posts: 3341
65 is the decimal value for the ASCII and Unicode character 'A'. What you see on that page is the hexadecimal value 0x0041: 4 * 16 + 1 = 65.
Hope this helps.
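You can verify the hex-to-decimal conversion in Java itself, for example (class name is illustrative):

    public class HexCheck {
        public static void main(String[] args) {
            System.out.println(0x0041);                  // prints 65: hex 41 is 4 * 16 + 1
            System.out.println(Integer.toHexString(65)); // prints 41
            System.out.println((int) 'A');               // prints 65
        }
    }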
 