
A no-brainer, almost

 
Val Pecaoco
Ranch Hand
Posts: 156
In most (if not all) sample Java code snippets, we encounter a declaration like

int intVar;

Supposing you know that intVar's maximum value will be some fairly small number, say, 52, doesn't it make perfect sense to change its declaration from

int intVar;

to

short intVar; // perhaps "shortVar" would be more appropriate.

?? Since, as we all know, an int takes up more memory than a short.
Ex Animo Java!
-- Val
 
Mark Herschberg
Sheriff
Posts: 6037
We don't know that a short takes up less memory than an int. All we know is that the range of a short is smaller than the range of an int (or, more specifically, we know exactly what those ranges are).
If you want to write a JVM which uses half a meg of memory to store the value of a short, there's nothing in the spec to stop you. In fact, I suspect most JVMs allocate a full 32-bit word for everything from a boolean to an int.
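To make the range point concrete, here is a tiny sketch (my own example, not part of the original post) that prints the ranges the language spec guarantees, regardless of how a particular JVM stores the values:

```java
// My own illustration: the JLS fixes these ranges, whatever storage a JVM uses.
public class Ranges {
    public static void main(String[] args) {
        System.out.println("short: " + Short.MIN_VALUE + " .. " + Short.MAX_VALUE);
        // prints "short: -32768 .. 32767"
        System.out.println("int:   " + Integer.MIN_VALUE + " .. " + Integer.MAX_VALUE);
        // prints "int:   -2147483648 .. 2147483647"
    }
}
```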
We discussed this in great detail a year ago in the Performance forum at http://www.javaranch.com/cgi-bin/ubb/ultimatebb.cgi?ubb=get_topic&f=15&t=000098
--Mark
 
Val Pecaoco
Ranch Hand
Posts: 156
Hi Mark,
Isn't a short's "size" 16 bits and an int's 32 bits, while you're saying that "most JVMs allocate a full 32-bit word for everything from a boolean to an int"?
Secondly, doesn't "size" equate to memory? Drawing a simple comparison, a 32-bit processor can address up to 2^32 bytes of memory, while a 64-bit one can address up to 2^64 bytes. Or so I was taught in my undergraduate classes.
Ex Animo Java!
-- Val
 
Graeme Brown
Ranch Hand
Posts: 193
Using shorts and bytes can be error-prone, because the default type for integer arithmetic is int.
Consider the following:

short a = 1;
short b = 2;
short c = a + b;

It looks like it will compile, but because the intermediate value a + b is calculated and stored as an int, an explicit cast is required:

short c = (short) (a + b);

more code == more chance of error
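A runnable sketch of this cast pitfall (my own example; the variable names are mine, not from the original post). One wrinkle worth noting: compound assignment quietly hides the very cast that plain assignment demands:

```java
public class ShortCast {
    public static void main(String[] args) {
        short s = 1;
        // short t = s + 1;        // does not compile: s + 1 has type int
        short t = (short) (s + 1); // explicit cast required
        s += 1;                    // compiles: compound assignment implies the (short) cast
        System.out.println(s + " " + t); // prints "2 2"
    }
}
```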
 
Dave Vick
Ranch Hand
Posts: 3244
Val
It probably wouldn't save you any space anyway, because in Java all arithmetic is carried out in at least 32-bit precision. That means if one of the operands in an expression is a long, the entire operation is done in 64-bit precision (all other operands are widened to long first, then the operation is carried out); otherwise all operands are widened to int before the operation is carried out.
For a more detailed explanation, check out the JLS, Section 4.2.2.
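A quick sketch of that widening in action (my own example, not from the thread):

```java
public class Widening {
    public static void main(String[] args) {
        short a = 40, b = 12;
        // a and b are widened to int before the addition,
        // so the sum must be cast back down to short
        short sum = (short) (a + b);

        int i = 5;
        long big = 1L;
        long total = big + i; // i is widened to long because one operand is long

        System.out.println(sum + " " + total); // prints "52 6"
    }
}
```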
Also, with the sophistication of today's compilers, most efficiency changes you can make are done better by the compiler anyway, and making specific changes by hand might even hinder it.
Hope that clears it up for you
 
Val Pecaoco
Ranch Hand
Posts: 156
Hi guys,
Thanks for all the info. But I think we've got a problem, err... or a revelation, I might say: we're led to think that a "short's size is 16 bits, while that of an int is 32 bits", when in reality it doesn't turn out that way, to wit: "in Java all arithmetic is carried out in at least 32-bit precision". No Java tutorial I've encountered, even Sun's, nor any book I've read for that matter, quite clears that up (not even in sidebars - please!!!). And yes, there's the JLS - but do you think every Java greenhorn would think to look there first?!
"Maybe" yes, that clears it up for me, but what about the others - and the authors of Java books and tutorials?! What I can say to finally put this issue to rest is: everything should be cleared up at the very first opportunity (just as constructors and initializers should be about initialization).
No, I won't shift to C# - not in any way, ever! That's why I say --
Ex Animo Java!
-- Val
 
Roseanne Zhang
Ranch Hand
Posts: 1953
Actually, C# is an excellent programming language. It is very similar to Java, but it has overcome some weaknesses of Java. C# is more flexible, more efficient, and if you really need to, you can write pointers (unsafe code) and make your code as efficient as C.
The CLR is a good concept too; it seems more efficient and flexible than the JVM.
Of course, C# has its own weaknesses. For example, I don't like that C# does not have Java's checked-exception concept. I hate the Micro$oft monopoly. But I don't like the $un monopoly either. Variety and competition make the world better!
Open your mind; you are better off by just opening your mind. I'm a big Java fan, but not a blind one!
Visit JavaChina on the web
 
Roseanne Zhang
Ranch Hand
Posts: 1953
Even when I wear $unglasses
 
Val Pecaoco
Ranch Hand
Posts: 156
Hi Ms. Zhang,
It's kinda off-beat, but I couldn't resist relating my side of the fence.
I'm not saying that one language or another is superior; that is very subjective. As I see it, every programming language is "perfect in its own imperfections" - like everything else, humans included. But we have free will to compensate: we can never be perfect, but we can choose to be near-perfect.
Which brings us to my argument.
Java has been around since 1991, eventually picking up speed in features and popularity. While Java revolutionized the IT industry, M$ (please excuse my "syntax"; I seem impatient typing long words, which is why right now I prefer "int" to "short", even if the connotation says otherwise) of course tried to ride along with it, even licensing Java from Sun to come up with Visual J++. To really, really cut the long story "short", M$ got sued by Sun over it, which maybe hurt its ego (read: "I have had enough of this! I won't support Java in Windows XP!").
Ergo, if C# and the CLR are really such a novel idea, why didn't M$ think of it in the first place? Why did they wait some ten years after Java's birth to come up with it? Visual C++ was M$'s flagship IDE product then; why didn't they just improve C++ so that at least C++ programmers would be happy? C# is just M$ sour-graping after being denied Java, reminiscent of all the running they did with/against UNIX (Xenix, anyone?), Netscape, Apple, ... the US DOJ (?) ... and now Sun (who's next?!?!).
I just couldn't picture myself working with such a language. With Java, I can sleep, eat, drink (Java coffee at night), and work with my head held high - without any tinge of guilt (or sunglasses).
I guess I just opened my mind too much
Ex Animo Java!
-- Val
PS: A Sun monopoly? And what can you make of BEA, IBM, Oracle, etc., etc., etc., etc. ... and on UNIX, Linux, MacOS, and (ugh) Windows ... ???
 