posted 1 year ago


Isaac,

A logarithmic scale is when the cost goes up by 1 every time you double the input size. For example, suppose I give you a sorted list of numbers and ask you to find the number 42. You'd start in the middle of the list and keep halving it until you find the number. This is O(log(n)).

Now suppose that sorted list had 100 numbers in it. If I said we were going to repeat this exercise with a 200-number sorted list, would you be complaining that I had doubled the work? (If so, it would be O(n).) But no: you only have one extra check, one more halving operation.
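The halving search described above can be sketched in Java. This is just an illustration; the class and method names are mine, not from any particular library. It counts how many "look at the middle" steps the search takes, so you can see that doubling the list adds only one step:

```java
// A sketch of the halving search described above: count how many
// times we cut a sorted range in half before locating the target.
public class BinarySearchSteps {

    // Returns how many "look at the middle" steps the search takes.
    static int countHalvings(int[] sorted, int target) {
        int lo = 0, hi = sorted.length - 1, steps = 0;
        while (lo <= hi) {
            steps++;
            int mid = (lo + hi) >>> 1;           // midpoint of current range
            if (sorted[mid] == target) {
                return steps;
            } else if (sorted[mid] < target) {
                lo = mid + 1;                    // discard the lower half
            } else {
                hi = mid - 1;                    // discard the upper half
            }
        }
        return steps;                            // target not present
    }

    public static void main(String[] args) {
        int[] small = new int[100];
        int[] big = new int[200];
        for (int i = 0; i < big.length; i++) {
            if (i < small.length) small[i] = i;
            big[i] = i;
        }
        // Doubling the list adds at most one extra halving step:
        System.out.println(countHalvings(small, 42));
        System.out.println(countHalvings(big, 42));
    }
}
```

Running it shows the 200-element search needs only one more halving than the 100-element one, which is the O(log(n)) behaviour in action.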

Stephan van Hulst (Saloon Keeper, Posts: 7993) posted 1 year ago

Note that the base of the logarithm goes up when you partition the work into more parts at each step.

Jeanne described a binary partition: on each step, the amount of work is halved, so the base of the logarithm is 2. If instead we divided the list into 3 parts, each step would divide the work by 3, and the base of the logarithm would be 3.
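A quick sketch of this in Java (names are mine, purely for illustration): splitting n items into b parts on each step takes roughly log base-b of n steps to get down to a single element.

```java
// A sketch of how the partition count relates to the base of the
// logarithm: splitting into `parts` pieces each step takes about
// log_parts(n) steps to shrink n items down to one.
public class PartitionSteps {

    // Number of times we must divide n items into `parts` pieces
    // before only one element remains.
    static int steps(int n, int parts) {
        int count = 0;
        while (n > 1) {
            n = (n + parts - 1) / parts;   // ceiling division
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(steps(100, 2)); // 7, since ceil(log2(100)) = 7
        System.out.println(steps(100, 3)); // 5, since ceil(log3(100)) = 5
    }
}
```

Dividing into 3 parts needs fewer steps than halving, which is exactly the base of the logarithm changing from 2 to 3.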

In time complexity, we don't really care about the base of the logarithm. That's why we write `log(n)` instead of `log2(n)` or `log3(n)`. Usually `log2` is implied, though, because `O(log(n))` is most often used when we're working with binary trees.

Campbell Ritchie (Marshal, Posts: 56595) posted 1 year ago

Stephan van Hulst wrote: . . . In time complexity, we don't really care about the base of the logarithm. . . .

If I remember correctly from my school maths many many years ago, logₓy = logₐy × logₓa. I might have got that wrong; please check. Since logₓa isn't going to change, that means that the ratios of the time complexities are independent of the base of the logarithm. If you work out something with logarithmic complexity, and it takes twice as long when worked out in log₂, it will also take twice as long when worked out in log₁₀. The actual figures will be different, but so are the units. It is rather like working out that something is 304.8 mm long with one calculation and then finding somebody else calculated its length at one foot.
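The point about ratios being base-independent can be checked numerically. This is a small sketch (the `log` helper below is mine, built from the change-of-base rule via `Math.log`):

```java
// A sketch of the point above: because log_x(y) = log_a(y) * log_x(a),
// the *ratio* of two logarithmic costs is the same in every base.
public class LogBase {

    // Logarithm of x in an arbitrary base, via change of base.
    static double log(double x, double base) {
        return Math.log(x) / Math.log(base);
    }

    public static void main(String[] args) {
        double n = 100, m = 10_000;  // two arbitrary problem sizes
        double ratioBase2  = log(m, 2)  / log(n, 2);
        double ratioBase10 = log(m, 10) / log(n, 10);
        // The two ratios agree (both are 2 here, since m = n squared),
        // even though the raw values log2(m) and log10(m) differ.
        System.out.println(ratioBase2);
        System.out.println(ratioBase10);
    }
}
```

The raw figures differ between bases, just as millimetres differ from feet, but the ratios come out the same.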
