
Time complexity of recursion inside of for loop

 
Elif Sahin
Greenhorn
Posts: 19
I'm learning how to calculate time complexity. Can you help me with the time complexity of this bit of code? I've asked other people but didn't understand their explanations. It's supposed to be either O(3^vertices) or O(vertices^2), depending on the base condition. The difference between the two confuses me. Any help is appreciated, thanks in advance!
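From the replies below, the code in question had roughly this shape (a hypothetical reconstruction, not the original: find, control, n and vertices are the names used in the thread, and the check inside control() is only a placeholder):

// Hypothetical reconstruction of the structure under discussion, pieced together
// from the replies in this thread. Not the original code.
public class ColoringSketch {

    static int vertices = 4;                      // number of nodes in the graph
    static int[] colors = new int[vertices + 1];  // colors[i] = color assigned to node i

    // One loop over all vertices, so O(vertices) per call.
    static boolean control(int n) {
        for (int i = 1; i <= vertices; i++) {
            // placeholder for the real "does node n clash with node i?" check
        }
        return true;
    }

    // Tries each of 3 colors for node n, then recurses to node n + 1.
    static void find(int n) throws Exception {
        if (n == vertices) {
            throw new Exception("Solution found");   // the throw discussed later in the thread
        }
        for (int color = 1; color <= 3; color++) {   // the loop of 3 around the recursion
            colors[n] = color;
            if (control(n)) {
                find(n + 1);
            }
        }
    }

    public static void main(String[] args) {
        try {
            find(1);
        } catch (Exception e) {
            System.out.println(e.getMessage());      // prints "Solution found"
        }
    }
}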

 
Stephan van Hulst
Saloon Keeper
Posts: 9145
The growth rate of control() depends linearly on vertices. control() is called a constant number of times from a single non-recursive invocation of find(). find() calls itself until n matches vertices, so control() is being called vertices times. We conclude that the growth rate of find() is in O(vertices * vertices), because the number of calls to control() grows linearly in vertices, and the growth of control() itself is also linear in vertices.

I don't know where you got the other complexity class from; I can't see any exponential growth.
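Written out as a quick tally (with c standing for the constant number of control() calls made at each level of the recursion):

$$ \sum_{n=1}^{vertices} c \cdot vertices \;=\; c \cdot vertices^{2} \;\in\; O(vertices^{2}) $$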
 
Elif Sahin
Greenhorn
Posts: 19

Stephan van Hulst wrote:The growth rate of control() depends linearly on vertices. control() is called a constant number of times from a single non-recursive invocation of find(). find() calls itself until n matches vertices, so control() is being called vertices times. We conclude that the growth rate of find() is in O(vertices * vertices), because the number of calls to control() grows linearly in vertices, and the growth of control() itself is also linear in vertices.

I don't know where you got the other complexity class from; I can't see any exponential growth.



This makes a lot of sense to me. Thank you.

The control method has a loop that runs vertices times, so it is O(vertices).

find is a recursive function that stops when n reaches vertices. Assuming n starts from 1, it calls control for n = 1, 2, 3, ..., vertices.

find is called from a loop 3 times. Each recursive call involves this loop, and hence this makes it exponential: O(3^vertices).

So, finally, it is O(3^vertices) * O(vertices), which is O(3^vertices).

EDIT:

Since you have throw new Exception("Solution found");, it will blow up once n reaches vertices. So it is O(vertices^2) in that way.



This is the explanation I got, prior to yours. That was the reason I was confused and that's where the exponential growth came from.
 
Stephan van Hulst
Saloon Keeper
Posts: 9145
Right. The growth would have been exponential in vertices if you didn't check that n is smaller than vertices, but since you do, the growth is polynomial.
 
Elif Sahin
Greenhorn
Posts: 19

Stephan van Hulst wrote:Right. The growth would have been exponential in vertices if you didn't check that n is smaller than vertices, but since you do, the growth is polynomial.



Thank you so much.
 
Stephan van Hulst
Saloon Keeper
Posts: 9145
I apologize, I was wrong.

Growth is indeed exponential. Every call of find() in the recursion tree spawns three new calls to find(), one level lower. vertices determines the height of the recursion tree, so the complexity is O(3^vertices).

My confusion arose because I considered the loop in which find() is called to add only a constant factor, which could be ignored. However, it determines the base of the exponent. If the loop ran once instead of three times, the recursion tree would degenerate into a single stem, which would lead to the quadratic complexity we calculated earlier.
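Counting the nodes of that recursion tree makes the bound explicit: with three children per call and height vertices, the total number of calls is a geometric sum:

$$ \sum_{d=0}^{vertices-1} 3^{d} \;=\; \frac{3^{vertices}-1}{2} \;\in\; O(3^{vertices}) $$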
 
Stephan van Hulst
Saloon Keeper
Posts: 9145
I'm sorry for the added confusion.
 
Elif Sahin
Greenhorn
Posts: 19

Stephan van Hulst wrote:I apologize, I was wrong.

Growth is indeed exponential. Every call of find() in the recursion tree spawns three new calls to find(), one level lower. vertices determines the height of the recursion tree, so the complexity is O(3^vertices).

My confusion arose because I considered the loop in which find() is called to add only a constant factor, which could be ignored. However, it determines the base of the exponent. If the loop ran once instead of three times, the recursion tree would degenerate into a single stem, which would lead to the quadratic complexity we calculated earlier.



Okay, I think I actually understand it now. I'm very new to this, so thank you for taking the time to explain.
 
Elif Sahin
Greenhorn
Posts: 19

Stephan van Hulst wrote:I apologize, I was wrong.

Growth is indeed exponential. Every call of find() in the recursion tree spawns three new calls to find(), one level lower. vertices determines the height of the recursion tree, so the complexity is O(3^vertices).

My confusion arose because I considered the loop in which find() is called to add only a constant factor, which could be ignored. However, it determines the base of the exponent. If the loop ran once instead of three times, the recursion tree would degenerate into a single stem, which would lead to the quadratic complexity we calculated earlier.



Can you also explain why throwing the exception changes the growth to O(vertices^2)? Or does it?
 
Tim Cooke
Marshal
Posts: 4455
Can we take into consideration the cognitive complexity of writing code like this? Somebody smart once said...

Martin Fowler in the book Refactoring wrote:Any fool can write code that a computer can understand. Good programmers write code that humans can understand.


I would become very unpopular very quickly with the rest of my team if I presented code like this as part of a solution.
 
Stephan van Hulst
Saloon Keeper
Posts: 9145
Because the exception isn't caught, none of the calls to find() that would otherwise follow will ever happen. That means that the initial call to find() will terminate after the first recursive call where n == vertices holds, pruning all other branches of the recursion tree.

In short, when you use a throw statement to terminate your method call, the complexity will be O(vertices^2). If you use a return statement, the complexity will be O(3^vertices).
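A small standalone demonstration of that difference (not the original code; it only mimics the shape of find() and counts calls. In the real code every call also does O(vertices) work in control(), which contributes the remaining factor of vertices):

public class ThrowVersusReturn {

    static long calls = 0;

    // Base case throws: the uncaught exception unwinds through every pending
    // find() frame, so the callers' loops never continue past the first branch.
    static void findWithThrow(int n, int vertices) throws Exception {
        calls++;
        if (n == vertices) {
            throw new Exception("Solution found");
        }
        for (int i = 0; i < 3; i++) {
            findWithThrow(n + 1, vertices);
        }
    }

    // Base case returns: only the current call ends, the callers' loops keep
    // going and the whole ternary recursion tree is visited.
    static void findWithReturn(int n, int vertices) {
        calls++;
        if (n == vertices) {
            return;
        }
        for (int i = 0; i < 3; i++) {
            findWithReturn(n + 1, vertices);
        }
    }

    public static void main(String[] args) {
        int vertices = 10;

        calls = 0;
        try {
            findWithThrow(1, vertices);
        } catch (Exception e) {
            // expected: "Solution found"
        }
        System.out.println("throw:  " + calls + " calls");   // 10, i.e. linear in vertices

        calls = 0;
        findWithReturn(1, vertices);
        System.out.println("return: " + calls + " calls");   // 29524, i.e. (3^10 - 1) / 2
    }
}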
 
Master Rancher
Posts: 2758
Another way is simply to place a counter in the innermost loop, and run the code for several values of vertices (with n fixed), for a fixed number of vertices with varying n, and with both varying. You should then get a reasonable idea about the complexity.
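For example, something along these lines (a sketch only, not the original code; the counter sits where the innermost loop of control() would do its work):

public class CountSteps {

    static long steps = 0;

    // Stand-in for control(): its innermost loop is where the counter lives.
    static void control(int n, int vertices) {
        for (int i = 1; i <= vertices; i++) {
            steps++;
        }
    }

    // Stand-in for find(): a loop of three around the recursive call.
    static void find(int n, int vertices) {
        if (n == vertices) {
            return;
        }
        for (int color = 1; color <= 3; color++) {
            control(n, vertices);
            find(n + 1, vertices);
        }
    }

    public static void main(String[] args) {
        // Run for several values of vertices and watch how the count grows.
        for (int vertices = 2; vertices <= 12; vertices += 2) {
            steps = 0;
            find(1, vertices);
            System.out.println(vertices + " vertices -> " + steps + " steps");
        }
    }
}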
 
Elif Sahin
Greenhorn
Posts: 19

Stephan van Hulst wrote:Because the exception isn't caught, none of the calls to find() that would otherwise follow will ever happen. That means that the initial call to find() will terminate after the first recursive call where n == vertices holds, pruning all other branches of the recursion tree.

In short, when you use a throw statement to terminate your method call, the complexity will be O(vertices^2). If you use a return statement, the complexity will be O(3^vertices).



What if I were to call find() in a try block and then caught the exception? Would the complexity be O(3^vertices)?
 
Elif Sahin
Greenhorn
Posts: 19

Tim Cooke wrote:Can we take into consideration the cognitive complexity of writing code like this? Somebody smart once said...

Martin Fowler in the book Refactoring wrote:Any fool can write code that a computer can understand. Good programmers write code that humans can understand.


I would become very unpopular very quickly with the rest of my team if I presented code like this as part of a solution.



I am very sorry for that. But all my explanations and the rest of the code were in Turkish. I didn't have time to translate them all, so I stripped the code. I shouldn't have done that.
 
Tim Cooke
Marshal
Posts: 4455
It wasn't that. It was the mixing of iterative and recursive looping structures that would have offended my peers and embarrassed me.
 
Elif Sahin
Greenhorn
Posts: 19

Tim Cooke wrote:It wasn't that. It was the mixing of iterative and recursive looping structures that would have offended my peers and embarrassed me.



Oh. Well, I'm a newbie so I'll try to watch it. Thanks for the warning.
 
Stephan van Hulst
Saloon Keeper
Posts: 9145

Tim Cooke wrote:It wasn't that. It was the mixing of iterative and recursive looping structures that would have offended my peers and embarrassed me.


I'm not sure I share that feeling. Many tree traversal algorithms use a combination of these techniques. Here's a common algorithm for pre-order tree traversal:
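For instance (a generic sketch, not code from this thread; Node here is a hypothetical tree node holding a list of children):

import java.util.List;

class Node {
    int value;
    List<Node> children;

    Node(int value, List<Node> children) {
        this.value = value;
        this.children = children;
    }
}

class PreOrder {
    // Pre-order traversal: visit the node first, then recurse into each child.
    // The iteration over the children and the recursive call naturally go together.
    static void preOrder(Node node) {
        System.out.println(node.value);
        for (Node child : node.children) {
            preOrder(child);
        }
    }

    public static void main(String[] args) {
        Node tree = new Node(1, List.of(
                new Node(2, List.of()),
                new Node(3, List.of(new Node(4, List.of())))));
        preOrder(tree);
    }
}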
 
Elif Sahin
Greenhorn
Posts: 19

Elif Sahin wrote:

Stephan van Hulst wrote:Because the exception isn't caught, none of the calls to find() that would otherwise follow will ever happen. That means that the initial call to find() will terminate after the first recursive call where n == vertices holds, pruning all other branches of the recursion tree.

In short, when you use a throw statement to terminate your method call, the complexity will be O(vertices^2). If you use a return statement, the complexity will be O(3^vertices).



What if I were to call find() in a try block and then caught the exception? Would the complexity be O(3^vertices)?



Thank you so much for helping me. I appreciate it.
 
Stephan van Hulst
Saloon Keeper
Posts: 9145

Elif Sahin wrote:What if I were to call find() in a try block and then caught the exception? Would the complexity be O(3^vertices)?


It depends on where you catch the exception. If it's within the loop, so that the loop is allowed to finish, the complexity will be exponential. If it's outside the loop, the recursion tree degenerates and the complexity will be polynomial. Note that an early return from the loop would have the same effect as your throw statement has now, and it is a cleaner solution anyway. You shouldn't use exceptions to perform logic.

After analyzing your algorithm, it appears that its goal is to color every node of a graph with one of three colors, such that no node shares a color with any of its neighbors. Here's a version of your algorithm that looks clearer to me:
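Something along these lines (a sketch of what such a cleaner version might look like; the names Color, assignColors and isAllowed are assumptions of mine, not taken from the original code):

import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class GraphColoring {

    enum Color { RED, GREEN, BLUE }

    // Tries to give every node a color such that no node shares a color with a neighbor.
    // Returns true as soon as a complete assignment is found, which also ends the loop
    // in every caller. No exception is needed to stop the recursion.
    static boolean assignColors(List<Integer> nodes, Map<Integer, List<Integer>> neighbors,
                                Map<Integer, Color> coloring, int index) {
        if (index == nodes.size()) {
            return true;                              // every node has been colored
        }
        int node = nodes.get(index);
        for (Color color : Color.values()) {
            if (isAllowed(node, color, neighbors, coloring)) {
                coloring.put(node, color);
                if (assignColors(nodes, neighbors, coloring, index + 1)) {
                    return true;                      // early return prunes the other branches
                }
                coloring.remove(node);                // backtrack and try the next color
            }
        }
        return false;
    }

    static boolean isAllowed(int node, Color color, Map<Integer, List<Integer>> neighbors,
                             Map<Integer, Color> coloring) {
        for (int neighbor : neighbors.getOrDefault(node, List.of())) {
            if (color == coloring.get(neighbor)) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // A small triangle graph: 1-2, 2-3, 1-3.
        List<Integer> nodes = List.of(1, 2, 3);
        Map<Integer, List<Integer>> neighbors = Map.of(
                1, List.of(2, 3),
                2, List.of(1, 3),
                3, List.of(1, 2));
        Map<Integer, Color> coloring = new HashMap<>();
        if (assignColors(nodes, neighbors, coloring, 0)) {
            System.out.println(coloring);
        } else {
            System.out.println("No valid coloring found");
        }
    }
}

Returning a boolean lets every caller stop its loop as soon as a solution is found, which is the early return mentioned above.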
 
Stephan van Hulst
Saloon Keeper
Posts: 9145
Actually, the algorithm itself isn't so bad; the one I wrote is quite similar. The problem is mostly with your code style and how you handle edge cases. Here are some pointers:

  • Most importantly, use descriptive names. I think my example is much easier to read because I use fully written phrases for identifiers. The meaning of them is clear without having to analyze the context in which they're used.
  • Use verbs for method names, and nouns for variable names.
  • Use appropriate types. An integer is a poor representation of a color. Use enums for types that have a small finite number of instances.
  • Don't use exceptions for flow control. Exceptions are for when something goes wrong. Finding a solution is a normal outcome of your algorithm, not an exceptional outcome.
  • Prefer collection types over arrays. I think arrays are really great for representing fixed coordinate systems, but otherwise I stick to lists, sets, maps, queues, etc.
  • Consider all edge cases that an algorithm might encounter. For instance, your algorithm doesn't take into account that nodes might already be colored before it starts running.
  • Don't hardcode values. Your algorithm is hardwired so that it uses three colors. Instead, you can pass the data that it needs through method parameters.
 
Elif Sahin
Greenhorn
Posts: 19

Stephan van Hulst wrote:

  • Most importantly, use descriptive names. I think my example is much easier to read because I use fully written phrases for identifiers. The meaning of them is clear without having to analyze the context in which they're used.
  • Use verbs for method names, and nouns for variable names.
  • Use appropriate types. An integer is a poor representation of a color. Use enums for types that have a small finite number of instances.
  • Don't use exceptions for flow control. Exceptions are for when something goes wrong. Finding a solution is a normal outcome of your algorithm, not an exceptional outcome.
  • Prefer collection types over arrays. I think arrays are really great for representing fixed coordinate systems, but otherwise I stick to lists, sets, maps, queues, etc.
  • Consider all edge cases that an algorithm might encounter. For instance, your algorithm doesn't take into account that nodes might already be colored before it starts running.
  • Don't hardcode values. Your algorithm is hardwired so that it uses three colors. Instead, you can pass the data that it needs through method parameters.


These are great, I will definitely keep them in mind. I'm grateful for everything, thank you so much.