How do octal numbers compare to decimal numbers?

 
Ranch Hand
I understand that octal numbers are not widely used in computing because they are not as efficient as hexadecimal numbers at converting values into bytes (8 bits). How do octal numbers compare to decimal numbers? Are decimal numbers preferred to octal numbers at converting values into bytes? For example, the byte 01100000 can be represented in decimal (base 10) by 2 digits 96 and in octal (base 8) by 3 digits 140. Does this mean that decimal numbers are more efficient than octal numbers at representing bytes?
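[Editor's note: the digit counts in the question can be checked with the standard `java.lang.Integer` conversion methods; a quick sketch:]

```java
public class BaseDigits {
    public static void main(String[] args) {
        int b = 0b01100000; // the byte from the question, value 96

        // Standard-library conversions to each base:
        System.out.println(Integer.toString(b));      // prints "96"  (2 decimal digits)
        System.out.println(Integer.toOctalString(b)); // prints "140" (3 octal digits)
        System.out.println(Integer.toHexString(b));   // prints "60"  (2 hex digits)
    }
}
```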
 
author

Tess Jacobs wrote:I understand that octal numbers are not widely used in computing because they are not as efficient as hexadecimal numbers at converting values into bytes (8 bits). How do octal numbers compare to decimal numbers? Are decimal numbers preferred to octal numbers at converting values into bytes? For example, the byte 01100000 can be represented in decimal (base 10) by 2 digits 96 and in octal (base 8) by 3 digits 140. Does this mean that decimal numbers are more efficient than octal numbers at representing bytes?



Computers work in binary: everything is bits and bytes, and arithmetic (along with most logical operations) is done in base 2.

Octal, hexadecimal, and decimal representations exist for people, so that values are easier to read. And I'd guess that, for people, efficiency depends on what they are used to and what they are doing.

Henry



 
Bartender

Tess Jacobs wrote:Does this mean that decimal numbers are more efficient than octal numbers at representing bytes?


Visually: yes; however, as Henry says, it matters not one whit to your computer.

On another level though, decimal is almost always "most efficient" for us puny humans, because it's what we're used to.

I've been in the biz for more than 35 years, and I still find hex literals (theoretically more "efficient" than decimal) that have more than two digits in them hard to read, unless they're all '0's or 'F's.

Just one reason (maybe) why we still love our IPv4 addresses...

Winston
 
Henry Wong
author

Tess Jacobs wrote:How do octal numbers compare to decimal numbers?



IMO, the advantage of decimal numbers is that they are in the representation we are used to. We have been trained to work in decimal since birth, so when possible, that is probably the best representation to use.

The advantage of octal is that you can envision the bits better. Every digit is exactly three bits. And there is no way for a single bit change to affect two octal digits. This is not true for decimal, where certain bits can affect more than one digit. However, I still don't like octal as much, even with this advantage. If I need to envision the bits better, I prefer hexadecimal, or even binary, instead of octal. Of course, this is personal preference.

Henry
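[Editor's note: Henry's single-bit-change point can be demonstrated with a small illustrative snippet. Because each octal digit covers a fixed group of exactly three bits, flipping one bit changes exactly one octal digit, while the same flip can change every decimal digit:]

```java
public class BitFlip {
    public static void main(String[] args) {
        int v = 9;                  // binary 01001, octal 11
        int flipped = v ^ (1 << 4); // flip bit 4 -> binary 11001

        // Decimal: 9 -> 25, the representation changes completely.
        System.out.println(v + " -> " + flipped);

        // Octal: 11 -> 31, only the digit whose 3-bit group contains bit 4 changes.
        System.out.println(Integer.toOctalString(v) + " -> "
                + Integer.toOctalString(flipped));
    }
}
```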
 
Tess Jacobs
Ranch Hand

Henry Wong wrote:The advantage of octal is that you can envision the bits better. Every digit is exactly three bits. And there is no way for a single bit change to affect two octal digits. This is not true for decimal, where certain bits can affect more than one digit.



So octal numbers have the advantage of matching bits in a consistent grouping (multiples of 3, i.e. 3 bits per octal digit), whereas decimal numbers cannot match bits in any consistent grouping. Would this imply that octal numbers are more efficient at representing bytes than decimal numbers? The same way hexadecimal numbers are more efficient at representing bytes than octal numbers.
 
Author and all-around good cowpoke
Personally, I learned Z80 op codes in octal because they made sense that way and were easier to remember than hex.

 
Winston Gutkowski
Bartender

Tess Jacobs wrote:Would this imply that octal numbers are more efficient at representing bytes than decimal numbers? The same way hexadecimal numbers are more efficient at representing bytes than octal numbers.


It really depends on what you mean by "efficient" and what you're looking at. If you need to know a bit pattern (which will usually be because you're looking at bytes or ints for doing things like masking), I'd say that hex beats out octal, because it provides you with bits in groups of 4, not 3; and most binary types these days (and certainly in Java) are based on multiples of 4 bits.

However, as I say, even after 35 years I can't "visualise" that a hex 'B' is '1011'.

And for values? DECIMAL - every time.

Winston
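[Editor's note: the masking use case Winston mentions is one place hex lines up naturally with the bits, since each hex digit is exactly one nibble (4 bits); a minimal illustration:]

```java
public class NibbleMask {
    public static void main(String[] args) {
        int b = 0xB3; // binary 1011 0011

        // Each hex digit is one nibble, so the masks read off directly:
        int high = (b & 0xF0) >>> 4; // upper nibble: 0xB (binary 1011)
        int low  = b & 0x0F;         // lower nibble: 0x3 (binary 0011)

        System.out.println(Integer.toHexString(high)); // prints "b"
        System.out.println(Integer.toHexString(low));  // prints "3"
    }
}
```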
 
Tess Jacobs
Ranch Hand

Winston Gutkowski wrote:It really depends on what you mean by "efficient" and what you're looking at.


As humans, we tend to use a number system that we find the most intuitive, but I'm trying to look at it from the computer's perspective. Since all data (hex, octal or decimal) eventually gets converted to binary, I’m trying to find out which conversion process is fastest (most efficient) from the computer's perspective, for example, since a hex digit represents 4 bits and each computer word is a multiple of 4 bits, I’m sure that the computer will find it very easy converting a hex number to binary. However, when dealing with decimal and octal numbers, I’m trying to find out which one the computer will find easiest to convert to binary.
 
Henry Wong
author

Tess Jacobs wrote:
As humans, we tend to use a number system that we find the most intuitive, but I'm trying to look at it from the computer's perspective. Since all data (hex, octal or decimal) eventually gets converted to binary, I’m trying to find out which conversion process is fastest (most efficient) from the computer's perspective, for example, since a hex digit represents 4 bits and each computer word is a multiple of 4 bits, I’m sure that the computer will find it very easy converting a hex number to binary. However, when dealing with decimal and octal numbers, I’m trying to find out which one the computer will find easiest to convert to binary.



What conversion are you referring to? The conversion done by the compiler, or the conversion done before printing a value out to be read? In the first case, any difference is very unlikely to even be noticeable: the compiler is doing many things that are many times more complex than converting a string holding a number representation into a binary value. And that doesn't count the I/O that the compiler is doing.

For the second case, the computer has to wait for the person to read it, and hence, will definitely not be noticeable.

Henry
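[Editor's note: to illustrate the first case, in Java the base of a literal exists only in the source code; by runtime all of these are the same bit pattern, so there is no per-base conversion cost left to measure (binary literals and underscores assume Java 7+):]

```java
public class SameValue {
    public static void main(String[] args) {
        int a = 96;          // decimal literal
        int b = 0140;        // octal literal
        int c = 0x60;        // hexadecimal literal
        int d = 0b0110_0000; // binary literal (Java 7+)

        // The compiler turned every literal into the same 32-bit pattern:
        System.out.println(a == b && b == c && c == d); // prints "true"
    }
}
```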
 
Tess Jacobs
Ranch Hand
Many thanks for the clarification.
 
Winston Gutkowski
Bartender

Tess Jacobs wrote:As humans, we tend to use a number system that we find the most intuitive...


No we don't; we use base 10 because God (if you believe in him/her) gave us that number of fingers. And in computer terms it's a PITA, because it does NOT dovetail nicely with bases like 2, 8, and 16.

Which offers two possibilities:
1. God wasn't a programmer.
2. S/he wanted us to think.
Personally, I prefer the latter.

Winston
 
Tess Jacobs
Ranch Hand

Winston Gutkowski wrote:No we don't, we use base 10


I understand that some low-level programmers who work on things like microprocessors prefer hex to decimal. For example, given the number 54312, it's not easy to tell what binary number it represents. However, its hexadecimal equivalent, 0xd428, tells a low-level programmer right away that the bit pattern it represents is 1101 0100 0010 1000.

Tess Jacobs wrote:How do octal numbers compare to decimal numbers?



I’m still trying to wrap my head around computer number systems, so my original question wasn’t phrased correctly. I’d like to re-phrase it.

I understand that in the 60s, octal numbers were popular because systems such as IBM mainframes employed 12-bit, 24-bit, or 36-bit words (multiples of 3). In the 70s, when bytes became the common building block for computer systems (with the development of 8-bit microprocessors), the hexadecimal number system took over from octal because it could represent a byte exactly, i.e. using two hexadecimal digits. Today, octal numbers are not widely used in computers anymore because they are not as efficient as hexadecimal numbers at representing bytes. A byte can represent 256 numbers (0-255). To represent 255 as an octal number, three digits are needed (377). However, this is wasteful, as three octal digits can represent 512 numbers (0-511); the third digit is not being used to its full potential (from java.about.com/).

So, what I’m trying to find out is whether an octal number is better at representing bytes than a decimal number. Not from the human perspective but from the computer's perspective.

Henry Wong wrote:The advantage of octal is that you can envision the bits better. Every digit is exactly three bits. And there is no way for a single bit change to affect two octal digits. This is not true for decimal, where certain bits can affect more than one digit.


I suspect that this answers my question.
 
Marshal
No, an octal number does not represent bytes better than hex. Not since people gave up 6‑bit bytes. We now use two hex digits ≡ one byte. Round my way (Teesside) we regard octal as obsolete.

The problem you have is that you think you should be able to convert back and forth between octal, hex, and decimal. Don't. Use decimal for ordinary arithmetic and hex when you need to do something with the bit pattern, and forget that there is any correspondence. The JVM does not convert hex to binary or anything like that for arithmetic; it does everything in binary. Any such conversions are done by the compiler, or by methods like Integer#parseInt, Scanner#nextInt, or System.out#println(int).
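[Editor's note: the parsing methods Campbell names take an explicit radix, and the resulting int is identical whichever base the string was written in; a quick sketch:]

```java
public class RadixParse {
    public static void main(String[] args) {
        // The radix only matters while the text is being parsed:
        int fromDec = Integer.parseInt("255", 10);
        int fromOct = Integer.parseInt("377", 8);
        int fromHex = Integer.parseInt("ff", 16);

        // Afterwards, all three hold the same binary value:
        System.out.println(fromDec == fromOct && fromOct == fromHex); // prints "true"
        System.out.println(Integer.toBinaryString(fromDec));          // prints "11111111"
    }
}
```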
 