
clarification on int division

 
jim gotti
Ranch Hand
Posts: 36
I'm currently starting from scratch to refresh some Java, having been away for a while.
I was reading over the HWS javanotes and came across this blurb; maybe someone can clear it up for me.

If N is an integer variable, then N/100 is an integer, and 1/N is equal to zero for any N greater than one! This fact is a common source of programming errors. You can force the computer to compute a real number as the answer by making one of the operands real: For example, when the computer evaluates 1.0/N, it first converts N to a real number in order to match the type of 1.0, so you get a real number as the answer.


Why is 1/N equal to 0 if N > 1, where N is an int?
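The claim in the blurb is easy to check at the keyboard; a minimal sketch (class name is just for illustration) comparing int division with the 1.0/N trick the notes describe:

```java
public class IntDivisionDemo {
    public static void main(String[] args) {
        int n = 100;
        System.out.println(1 / n);   // both operands are int, so the result is the int 0
        System.out.println(1.0 / n); // n is promoted to double, so the result is 0.01
    }
}
```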
 
Mike Gershman
Ranch Hand
Posts: 1272
An int variable can only hold an integer (-1, 0, 1, 2, etc.).

Mathematically, 1/2 is 0.5, but integer division always truncates toward zero, keeping only the whole-number part. Truncating 0.5 gives 0.
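That truncation rule can be seen directly (note that "toward zero" matters for negative results: -7/2 is -3, not -4, which is why truncation is not the same as rounding down):

```java
public class TruncationDemo {
    public static void main(String[] args) {
        System.out.println(1 / 2);   // 0  (0.5 truncated)
        System.out.println(7 / 2);   // 3  (3.5 truncated)
        System.out.println(-7 / 2);  // -3 (truncated toward zero, not rounded down to -4)
    }
}
```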
 
Oneal Shaha
Ranch Hand
Posts: 98
The simple logic behind it is that whenever you perform an operation like X/Y, where X and Y are both integers, and the answer is not a whole integer, the digits after the decimal point are simply removed.

So if the answer to 5/2 is 2.50, the .50 is removed and 2 is given as the answer in integer format.

In the case of 1/N with N > 1, the real answer will always be 0.####. All of the ####s are removed, and 0 is returned as a plain integer.
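The two cases above can be demonstrated in a few lines (class name is just for illustration):

```java
public class FractionDropDemo {
    public static void main(String[] args) {
        System.out.println(5 / 2); // prints 2 -- the .50 is dropped

        // 1/N for any int N > 1 is 0.something, so the int result is always 0
        for (int n = 2; n <= 5; n++) {
            System.out.println("1/" + n + " = " + (1 / n));
        }
    }
}
```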
 
fred rosenberger
lowercase baba
Bartender
Posts: 12542
Remember in grade school, when you first learned how to divide (or at least, this is how I learned to do it)?

you would work it out longhand:

23 divided by 5 is 4, with a remainder of 3.

integer division does exactly this, then just throws the remainder away.

so when you do 1 / 1, you get 1 remainder 0.

when you do 1 / 2, you get 0, remainder 1...remainder thrown away, therefore you get 0.

1 / 3 is 0 remainder 1, throw away, 0... etc.
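The grade-school picture maps directly onto Java's operators: / gives the quotient and % gives the remainder that gets thrown away. A small sketch (class name is just for illustration):

```java
public class RemainderDemo {
    public static void main(String[] args) {
        int quotient  = 23 / 5; // 4 -- the grade-school quotient
        int remainder = 23 % 5; // 3 -- the part that / throws away
        System.out.println(quotient + " remainder " + remainder);

        // same idea for 1 / 2: quotient 0, remainder 1
        System.out.println((1 / 2) + " remainder " + (1 % 2));
    }
}
```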
 