1. 10n + 3n2 + log n
2. n(n3 + 8)
Give the worst case order of growth: (what does that mean?)
3.
for (int i = 0; i < a.length; i++) {
    for (int j = i - 1; j > 1; j = j / 2) {
        // Do something with a[i], a[j]
    }
}
4.
for (int i = 0; i < a.length; i = i + 2) {
    for (int j = 0; j < 5; j++) {
        for (int k = i; k > i / 2; k--) {
            // Do something with a[i], a[j], a[k]
        }
    }
}
The n in the expression represents the size of the input, which determines the complexity.
So an expression containing only n is said to have a complexity of n, denoted O(n); n squared has complexity O(n^2), and log n has complexity O(log n).
So O(n^2) is more complex than O(n).
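To make that comparison concrete, here is a small sketch (added for illustration, not part of the original post) that counts the operations an O(n) loop and an O(n^2) loop perform as n grows:

```java
// Compares how operation counts grow for an O(n) loop and an O(n^2) loop.
public class GrowthDemo {
    static long linear(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) {
            ops++; // one unit of work per iteration
        }
        return ops;
    }

    static long quadratic(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                ops++; // n units of work for each of the n outer iterations
            }
        }
        return ops;
    }

    public static void main(String[] args) {
        for (int n = 10; n <= 1000; n *= 10) {
            System.out.println("n=" + n + "  linear=" + linear(n)
                    + "  quadratic=" + quadratic(n));
        }
    }
}
```

Doubling n doubles the linear count but quadruples the quadratic count, which is why O(n^2) eventually dominates regardless of constant factors.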
You need to study analysis and design of algorithms to understand this more deeply. However, I hope my reply can help you for a while.
Thank you.
Originally posted by Christy John:
Big O notation is a theoretical measure of the complexity of an algorithm. It usually depicts the amount of resources needed for its completion, either in terms of memory or computer time.
The n in the expression represents the size of the input, which determines the complexity.
So an expression containing only n is said to have a complexity of n, denoted O(n); n squared has complexity O(n^2), and log n has complexity O(log n).
So O(n^2) is more complex than O(n).
Christy, I am not sure if I agree with this.
I always thought that the "big O", or the "order of", is the measure of the growth in time to complete the task in relation to the number of items in the task.
For example, if I have to run a task with 1000 items, and it takes 5 minutes to do it, then running the task with 2000 items, should take 10 minutes to do it, if it has an order of N.
You can have a really complex algorithm, with an order of N. And you can have a really simple algorithm, with an order of N squared.
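Henry's point can be illustrated with a sketch (added for illustration, not from the thread): a trivially simple method that is O(n^2), next to a slightly more involved one that solves the same problem in O(n):

```java
import java.util.HashSet;
import java.util.Set;

// Code complexity and big-O order are independent: the simpler method
// below is O(n^2), while the one with more machinery is O(n).
public class OrderVsComplexity {
    // Very simple code, but O(n^2): compares every pair of elements.
    static boolean hasDuplicateSimple(int[] a) {
        for (int i = 0; i < a.length; i++) {
            for (int j = i + 1; j < a.length; j++) {
                if (a[i] == a[j]) return true;
            }
        }
        return false;
    }

    // More involved code, but O(n): one pass using a hash set.
    static boolean hasDuplicateLinear(int[] a) {
        Set<Integer> seen = new HashSet<>();
        for (int x : a) {
            if (!seen.add(x)) return true; // add() returns false if x was already present
        }
        return false;
    }

    public static void main(String[] args) {
        int[] withDup = {1, 2, 3, 2};
        int[] noDup = {1, 2, 3};
        System.out.println(hasDuplicateSimple(withDup) + " " + hasDuplicateLinear(withDup));
        System.out.println(hasDuplicateSimple(noDup) + " " + hasDuplicateLinear(noDup));
    }
}
```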
Henry
I don't think you can say that doubling the number of items doubles the time to complete the task. For one thing, the operating environment is not considered in big-O. There are many variables in program performance that are outside the control of the program being tested. Also, I don't think many recursive algorithms double in time as the number doubles; recursive Fibonacci comes to mind.
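Here is a sketch (added for illustration) of the Fibonacci example: counting the recursive calls shows the work growing far faster than linearly in n, so doubling n much more than doubles the time.

```java
// Naive recursive Fibonacci with a call counter. The number of calls
// grows roughly as phi^n, so it explodes rather than doubling with n.
public class FibCalls {
    static long calls = 0;

    static long fib(int n) {
        calls++;
        if (n < 2) return n;
        return fib(n - 1) + fib(n - 2);
    }

    static long callsFor(int n) {
        calls = 0;
        fib(n);
        return calls;
    }

    public static void main(String[] args) {
        System.out.println("fib(10): " + callsFor(10) + " calls");
        System.out.println("fib(20): " + callsFor(20) + " calls"); // far more than double
    }
}
```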
"Should array indices start at 0 or 1? My compromise of 0.5 was rejected without, I thought, proper consideration." Stan KellyBootle
Originally posted by Henry Wong:
... the "big O", or the "order of", is the measure of the growth in time to complete the task in relation to the number of items in the task.
Most often it's used for time complexity, but as Christy notes, sometimes it's applied to memory complexity.
John - in math, have you studied sequences and series? That is helpful background for this discussion.
Generally, with big-O problems, the formal process to solve the problem would be:
1. get a formula for the amount of time required to complete a given algorithm or code snippet. Generally, this means a formula for the number of times that the innermost loop of the code will execute.
2. simplify that formula by removing all but the most significant term, and remove any constant multiplier as well. So a formula like 10n^2 + 7n + 113 would simplify to 10n^2 and then to n^2 (where n^2 means n squared - not the Java meaning).
Knowing that the final formula will be simplified allows you to take many shortcuts in deriving it. E.g. you didn't need to worry about the 10, or the 7n, or the 113. You could skip those details in the first place if you know what you're doing. With practice, big-O problems can usually be solved with some quick mathematical intuition rather than formal proof. But this is difficult to develop except through practice.
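As a worked instance of step 1 (a sketch added for illustration): count exactly how many times the innermost statement of a classic triangular loop runs, then apply step 2 to simplify.

```java
// Step 1: count innermost executions of a triangular nested loop.
// The exact count is n(n-1)/2 = n^2/2 - n/2; step 2 drops the -n/2 term
// and the 1/2 multiplier, leaving O(n^2).
public class CountOps {
    static long count(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < i; j++) {
                ops++; // innermost statement
            }
        }
        return ops;
    }

    public static void main(String[] args) {
        for (int n = 10; n <= 1000; n *= 10) {
            System.out.println("n=" + n + "  exact=" + count(n)
                    + "  n^2/2=" + (long) n * n / 2);
        }
    }
}
```

As n grows, the exact count and n^2/2 converge in ratio, which is why the dropped terms don't matter for the order of growth.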
I suspect that questions 1 and 2 were supposed to relate only to the second step above - how to simplify the formula. That is, if we assume that
10n + 3n2 + log n and n(n3 + 8) are formulas for the time to execute, rather than code to be executed, then you just need to simplify them. However, this assumption was not stated in the problem. Also, I'm guessing 3n2 is supposed to be 3n^2 - is that right?
[ March 28, 2007: Message edited by: Jim Yingst ]
"I'm not back."  Bill Harding, Twister
Your answers look correct except for the third one. Specifically, look more carefully at the inner loop. How many times will it execute if i = 10? What if i = 100? Or i = 1000? How can you generalize this behavior?
[ March 28, 2007: Message edited by: Jim Yingst ]
"I'm not back."  Bill Harding, Twister
Sheriff
I think there's an implied "approximately" and/or "for large values of N" in most big-O discussions. If algorithmic complexity is not the most significant factor in determining execution time, then it makes little sense to worry about the big-O performance of the algorithm.
[David McCombs]: Also, I don't think many recursive algorithms double in time as the number doubles; recursive Fibonacci comes to mind.
Note that Henry did say "if it has an order of N."
"I'm not back."  Bill Harding, Twister
Sheriff
"I'm not back."  Bill Harding, Twister
Sheriff
All true so far...
[John] (meaning both execute n times).
No. Merely being dependent on n does not automatically mean executing n times. Sure, that's the most common case (for a loop from 0 to n-1), but it's not what we have here.
[John]: Now the equation of n for the inner loop might not be n/2. But it should still be an O(n^2) algorithm.
And yet, it's not. Again, I recommend that you write some code to see what happens. It doesn't matter what you think the answer should be - you need to discover what it is.
Note: it will be simpler if you focus on the inner loop only, at first:
How many times would this execute for i = 10? i = 100? Or i = 1000? Don't guess. Find out.
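One way to "find out" is a counting harness like the sketch below (added for illustration; it assumes the inner loop of question 3 was meant to read j = i - 1, since the operator appears to have been lost in the original post):

```java
// Counts the iterations of question 3's inner loop for a given i.
// Assumed loop header: for (int j = i - 1; j > 1; j = j / 2)
public class InnerLoopCount {
    static int iterations(int i) {
        int count = 0;
        for (int j = i - 1; j > 1; j = j / 2) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        for (int i = 10; i <= 1000; i *= 10) {
            System.out.println("i=" + i + " -> " + iterations(i) + " iterations");
        }
        // The count grows by a roughly constant amount each time i is
        // multiplied by 10 - the hallmark of O(log i), not O(i).
    }
}
```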
"I'm not back."  Bill Harding, Twister
look! it's a bird! it's a plane! It's .... a teeny tiny ad
The WEB SERVICES and JAXRS Course
https://coderanch.com/t/690789/WEBSERVICESJAXRS
