Meaning of log

 
Swerrgy Smith
Ranch Hand
Posts: 96
Hi,

When I read about the complexity of algorithms like Quick Sort or Heap Sort, I usually see something like O(n*log(n)).
I would like to know: what is the base of the log? Is it base 10 or base e?

Thank you.
 
fred rosenberger
lowercase baba
Posts: 13091
In this case, it doesn't matter.

Seriously.

The purpose of Big-O notation is to give you an idea of how fast or complex an algorithm is. If you graph logs in base 10, then logs in base e, then logs in base 37.2 (or really ANY base), the curves look basically the same; they differ only by a constant vertical scale factor.
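To see this concretely (a quick illustrative sketch in Python, not from the original posts): the ratio between logarithms in any two bases is a fixed constant, so switching bases only rescales the curve vertically.

```python
import math

# The ratio log10(n) / ln(n) is the same constant for every n,
# namely 1/ln(10), so the two curves have identical shape.
for n in [10, 100, 10_000, 1_000_000]:
    ratio = math.log10(n) / math.log(n)
    print(n, round(ratio, 6))  # always about 0.434294
```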
 
Bartender
Posts: 825
Generally it's a logarithm of base 2, which usually means that the problem is divided in half at each step of the algorithm (divide and conquer).
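A minimal sketch of why halving gives a base-2 logarithm (illustrative code, with a made-up helper `halving_steps`, not from the thread):

```python
# Repeatedly halving a problem of size n takes about log2(n) steps,
# which is where the base-2 logarithm in divide-and-conquer comes from.
def halving_steps(n: int) -> int:
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

print(halving_steps(1024))       # 10, since 2**10 == 1024
print(halving_steps(1_000_000))  # 19, i.e. floor(log2(1,000,000))
```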
 
Henry Wong
author
Posts: 23956
Agree with Fred. It doesn't matter what base the logarithm is in. Just remember that any base can be converted to any other base by multiplying by a specific constant, and with big-O notation, the constant is not a concern.

Henry
 
Swerrgy Smith
Ranch Hand
Posts: 96

Henry Wong wrote:
Agree with Fred. It doesn't matter what base the logarithm is in. Just remember that any base can be converted to any other base by multiplying by a specific constant, and with big-O notation, the constant is not a concern.

What do you mean when you say the constant is not a concern?
For example, I think an algorithm with complexity of log(n) is better than an algorithm with complexity of 2*log(n) (2 is a constant).
 
Paul Clapham
Marshal
Posts: 28271
You can think that if you like, but it isn't right. An algorithm of complexity log(N) means that as N goes to infinity, the time taken to complete the algorithm approaches C*log(N) for some constant C. So "complexity 2*log(N)" means exactly the same thing as "complexity log(N)", which is why you never see algorithm experts talking about "complexity 2*log(N)".
 
Henry Wong
author
Posts: 23956
Swerrgy Smith wrote:
What do you mean when you say the constant is not a concern?
For example, I think an algorithm with complexity of log(n) is better than an algorithm with complexity of 2*log(n) (2 is a constant).

Paul Clapham wrote:
You can think that if you like, but it isn't right. An algorithm of complexity log(N) means that as N goes to infinity, the time taken to complete the algorithm approaches C*log(N) for some constant C. So "complexity 2*log(N)" means exactly the same thing as "complexity log(N)", which is why you never see algorithm experts talking about "complexity 2*log(N)".

To add to that, a big O of "(23 * N^2) + (34 * log(N)) + 96" is the same as a big O of "N^2". As N approaches infinity, the lower-growing terms tend to disappear. This is also why you never see algorithm experts using complex compound expressions for complexity.

Henry
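As an illustrative sketch (not from the thread), you can watch the lower-order terms of that compound expression vanish relative to N^2 as N grows:

```python
import math

# f is the compound expression from the post above.
def f(n: float) -> float:
    return 23 * n**2 + 34 * math.log(n) + 96

# Dividing by n**2 shows the ratio settling on the constant 23,
# so only the N^2 term matters for large N.
for n in [10, 1_000, 1_000_000]:
    print(n, f(n) / n**2)
```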
 
fred rosenberger
lowercase baba
Posts: 13091
Remember that Big-O is not supposed to tell you EXACTLY how fast an algorithm grows in complexity; it is really just a way to group algorithms. ANY algorithm that grows O(n^2), regardless of any constant, will always eventually grow faster than any algorithm that grows O(log(n)), regardless of the constants.

Even 0.00000001*(n^2) grows faster than 1000000000000000000*(log(n)), once n gets big enough.

And that's what we care about.
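A quick check of that claim with those exact constants (an illustrative sketch, not from the thread): the quadratic loses at moderate n but wins once n is large enough.

```python
import math

# Tiny constant on the quadratic, huge constant on the logarithm.
def quad(n: float) -> float:
    return 1e-8 * n**2

def logf(n: float) -> float:
    return 1e18 * math.log(n)

print(quad(1e10) > logf(1e10))  # False: the constants still dominate here
print(quad(1e14) > logf(1e14))  # True: the quadratic has taken over
```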
 
Rancher
Posts: 1044

Generally it's a logarithm of base 2

Or the natural logarithm, that is, base e, for the mathematically inclined. But as rightly pointed out in several posts, it does not matter in this context.

f(x) = e^x (e raised to the x-th power) is a nice function: it equals its own derivative. Mathematicians love that.
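That property is easy to verify numerically (an illustrative sketch, not from the thread), using a symmetric finite difference to approximate the derivative:

```python
import math

# Approximate the derivative of e^x at x = 1.5 and compare it
# with e^x itself; the two should agree very closely.
x, h = 1.5, 1e-6
numeric = (math.exp(x + h) - math.exp(x - h)) / (2 * h)
print(abs(numeric - math.exp(x)))  # a very small difference
```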