
The growth rate of `control()` depends linearly on `vertices`. `control()` is called a constant number of times from a single non-recursive invocation of `find()`. `find()` calls itself until `n` matches `vertices`, so `control()` is called `vertices` times. We conclude that the growth rate of `find()` is in `O(vertices * vertices)`, because the number of calls to `control()` grows linearly in `vertices`, and the growth of `control()` itself is also linear in `vertices`.

I don't know where you got the other complexity class from. I can't see any exponential growth.
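The code being analyzed was stripped from this thread, so as a purely hypothetical reconstruction from the names used above (`find()`, `control()`, `vertices`, and the three colors mentioned later), the shape under discussion is something like this — an edgeless graph is used here just so the sketch runs:

```java
// Hypothetical reconstruction of the code under discussion (the original was not preserved).
class ColoringSketch {

    static final int vertices = 4;
    // Edgeless graph for this sketch, so every color choice passes the check.
    static final boolean[][] adjacent = new boolean[vertices][vertices];
    static final int[] color = new int[vertices];

    // One pass over all vertices: linear in `vertices`.
    static boolean control(int n) {
        for (int i = 0; i < vertices; i++) {
            if (adjacent[n][i] && color[i] == color[n]) {
                return false;
            }
        }
        return true;
    }

    // Tries each of three colors for vertex n, recursing one level deeper per valid choice.
    static void find(int n) throws Exception {
        if (n == vertices) {
            throw new Exception("Solution found");
        }
        for (int c = 1; c <= 3; c++) {
            color[n] = c;
            if (control(n)) {
                find(n + 1);
            }
        }
    }
}
```

With this shape, the exception escapes on the first complete coloring, which is what makes the analysis above come out quadratic rather than exponential.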

*The mind is a strange and wonderful thing. I'm not sure that it will ever be able to figure itself out, everything else, maybe. From the atom to the universe, everything, except itself.*

Stephan van Hulst wrote:The growth rate of control() depends linearly on vertices. control() is called a constant number of times from a single non-recursive invocation of find(). find() calls itself until n matches vertices, so control() is called vertices times. We conclude that the growth rate of find() is in O(vertices * vertices), because the number of calls to control() grows linearly in vertices, and the growth of control() itself is also linear in vertices.

I don't know where you got the other complexity class from. I can't see any exponential growth.

This makes a lot of sense to me. Thank you.

The control method has a loop that runs vertices times, so it is O(vertices).

find is a recursive function that stops when n reaches vertices (assuming n starts from 1). It calls control for n = 1, 2, 3, ..., vertices.

find is called from a loop 3 times. Each recursive call involves this loop, and hence it becomes exponential: O(3^vertices).

So, finally, it is O(3^vertices) * O(vertices), which is O(3^vertices).

EDIT:

Since you have `throw new Exception("Solution found");`, it will blow up once n reaches vertices. So it is O(vertices^2) in that way.

This is the explanation I got, prior to yours. That was the reason I was confused and that's where the exponential growth came from.


Right. The growth would have been exponential in `vertices` if you didn't check that `n` is smaller than `vertices`, but since you do, the growth is polynomial.


Stephan van Hulst wrote:Right. The growth would have been exponential in vertices if you didn't check that n is smaller than vertices, but since you do, the growth is polynomial.

Thank you so much.


I apologize, I'm wrong.

Growth is indeed exponential. Every call of `find()` in the recursion tree spawns three new calls to `find()`, a level lower. `vertices` determines the height of the recursion tree, so the complexity is `O(3^vertices)`.

My confusion arose because I considered the loop in which `find()` is called to add a constant factor which could be ignored. However, it determines the base of the exponent. If the loop ran once instead of three times, the recursion tree would degenerate into a single stem, which would lead to the quadratic complexity we calculated earlier.
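This effect is easy to check by counting calls. The helper below is not from the thread — it is a minimal sketch that counts invocations of a recursive method for a configurable branching factor, assuming a full recursion tree (no pruning):

```java
class RecursionTreeDemo {

    // Total number of times the method is entered for a recursion tree of the given
    // height, where each non-leaf invocation loops `branching` times.
    static long countCalls(int height, int branching) {
        long calls = 1; // count this invocation
        if (height == 0) {
            return calls; // leaf: the n == vertices case
        }
        for (int c = 0; c < branching; c++) {
            calls += countCalls(height - 1, branching);
        }
        return calls;
    }
}
```

With a branching factor of 3 the count is the size of a full ternary tree, (3^(height+1) - 1) / 2, e.g. 121 calls for height 4; with a branching factor of 1 the tree degenerates into a single stem of height + 1 calls, which is where the quadratic total comes from once each call does O(vertices) work in `control()`.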



Stephan van Hulst wrote:I apologize, I'm wrong.

Growth is indeed exponential. Every call of find() in the recursion tree spawns three new calls to find(), a level lower. vertices determines the height of the recursion tree, so the complexity is O(3^vertices).

My confusion arose because I considered the loop in which find() is called to add a constant factor which could be ignored. However, it determines the base of the exponent. If the loop ran once instead of three times, the recursion tree would degenerate into a single stem, which would lead to the quadratic complexity we calculated earlier.

Okay, I think I understand it actually. I'm very new to this so thank you for taking your time to explain.

Stephan van Hulst wrote:I apologize, I'm wrong.

Growth is indeed exponential. Every call of find() in the recursion tree spawns three new calls to find(), a level lower. vertices determines the height of the recursion tree, so the complexity is O(3^vertices).

My confusion arose because I considered the loop in which find() is called to add a constant factor which could be ignored. However, it determines the base of the exponent. If the loop ran once instead of three times, the recursion tree would degenerate into a single stem, which would lead to the quadratic complexity we calculated earlier.

Can you also explain why throwing the exception changes the growth to O(vertices^2)? Or does it?

Can we take into consideration the cognitive complexity of writing code like this? Somebody smart once said...

Martin Fowler in the book Refactoring wrote:Any fool can write code that a computer can understand. Good programmers write code that humans can understand.

I would become very unpopular very quickly with the rest of my team if I presented code like this as part of a solution.

Tim Driven Development

Because the exception isn't caught, all future calls to `find()` will never happen. That means that the initial call to `find()` will terminate after the first recursive call where `n == vertices` holds, pruning all other branches of the recursion tree.

In short, when you use a `throw` statement to terminate your method call, the complexity will be `O(vertices^2)`. If you use a `return` statement, the complexity will be `O(3^vertices)`.
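The two cases can be demonstrated side by side. The sketch below is not the thread's code: it strips out the color check entirely (as if every choice were valid), so it isolates just the difference between an uncaught exception and a plain return:

```java
// Compares how much of the recursion tree runs with an uncaught
// exception versus a plain return at the bottom of the recursion.
class PruningDemo {

    static class SolutionFound extends RuntimeException {}

    static long calls;

    // Return-based variant: after reaching the bottom, the sibling branches
    // still run, so the full ternary tree is explored.
    static void findWithReturn(int n, int vertices) {
        calls++;
        if (n == vertices) {
            return;
        }
        for (int c = 0; c < 3; c++) {
            findWithReturn(n + 1, vertices);
        }
    }

    // Throw-based variant: the first time n == vertices holds, the exception
    // unwinds the whole stack, so only a single chain of calls ever runs.
    static void findWithThrow(int n, int vertices) {
        calls++;
        if (n == vertices) {
            throw new SolutionFound();
        }
        for (int c = 0; c < 3; c++) {
            findWithThrow(n + 1, vertices);
        }
    }
}
```

For `vertices = 5`, `findWithReturn` is entered (3^6 - 1) / 2 = 364 times, while `findWithThrow` is entered only 6 times before unwinding; with O(vertices) work per call, that is the O(3^vertices) versus O(vertices^2) distinction from the post above.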

Stephan van Hulst wrote:Because the exception isn't caught, all future calls to find() will never happen. That means that the initial call to find() will terminate after the first recursive call where n == vertices holds, pruning all other branches of the recursion tree.

In short, when you use a throw statement to terminate your method call, the complexity will be O(vertices^2). If you use a return statement, the complexity will be O(3^vertices).

What if I were to call find() in a try block and then catch the exception? Would the complexity be O(3^vertices)?

Tim Cooke wrote:Can we take into consideration the cognitive complexity of writing code like this? Somebody smart once said...

Martin Fowler in the book Refactoring wrote:Any fool can write code that a computer can understand. Good programmers write code that humans can understand.

I would become very unpopular very quickly with the rest of my team if I presented code like this as part of a solution.

I am very sorry for that. But all my explanations and the rest of the code were in Turkish. I didn't have time to translate them all, so I stripped the code. I shouldn't have done that.

It wasn't that. It was the mixing of iterative and recursive looping structures that would have offended my peers and embarrassed myself.

Tim Driven Development

Tim Cooke wrote:It wasn't that. It was the mixing of iterative and recursive looping structures that would have offended my peers and embarrassed myself.

Oh. Well, I'm a newbie so I'll try to watch it. Thanks for the warning.

Tim Cooke wrote:It wasn't that. It was the mixing of iterative and recursive looping structures that would have offended my peers and embarrassed myself.

I'm not sure I share that feeling. Many tree traversal algorithms use a combination of these techniques. Here's a common algorithm for pre-order tree traversal:
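The code block itself did not survive in this extract; the node type and names below are my own, but a standard pre-order traversal mixing a loop with recursion looks like this:

```java
import java.util.ArrayList;
import java.util.List;

// Simple n-ary tree node (hypothetical type, not from the original post).
class TreeNode {
    final String value;
    final List<TreeNode> children = new ArrayList<>();

    TreeNode(String value) {
        this.value = value;
    }
}

class PreOrderTraversal {

    // Visits the node itself first, then iterates over its children with a loop,
    // recursing into each one: iteration and recursion working together.
    static void preOrder(TreeNode node, List<String> visited) {
        visited.add(node.value);
        for (TreeNode child : node.children) {
            preOrder(child, visited);
        }
    }
}
```

The loop handles the breadth axis (siblings) and the recursion handles the depth axis (descendants), which is exactly the combination being defended here.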

Elif Sahin wrote:

Stephan van Hulst wrote:Because the exception isn't caught, all future calls to find() will never happen. That means that the initial call to find() will terminate after the first recursive call where n == vertices holds, pruning all other branches of the recursion tree.

In short, when you use a throw statement to terminate your method call, the complexity will be O(vertices^2). If you use a return statement, the complexity will be O(3^vertices).

What if I were to call find() in a try block and then catch the exception? Would the complexity be O(3^vertices)?

Thank you so much for helping me. I appreciate it.

Elif Sahin wrote:What if I were to call find() in a try block and then catch the exception? Would the complexity be O(3^vertices)?

It depends on where you catch the exception. If it's within the loop, so that the loop is allowed to finish, the complexity will be exponential. If it's outside the loop, the recursion tree degenerates and the complexity will be polynomial. Note that a return statement would behave the same as your throw statement does now if you performed an early return from the loop, which is a cleaner solution anyway. You shouldn't use exceptions to perform logic.
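To make the catch-placement point concrete, here is a small sketch of my own (again with the color check stripped out, so every branch is explored unless an exception prunes it), contrasting a catch inside the loop with a catch around the initial call:

```java
// Demonstrates how catch placement changes how much of the recursion tree runs.
class CatchPlacementDemo {

    static class SolutionFound extends RuntimeException {}

    static long calls;

    // Catch inside the loop: each sibling branch still runs after a
    // solution is found, so the full tree is explored -> exponential.
    static void findCatchInsideLoop(int n, int vertices) {
        calls++;
        if (n == vertices) {
            throw new SolutionFound();
        }
        for (int c = 0; c < 3; c++) {
            try {
                findCatchInsideLoop(n + 1, vertices);
            } catch (SolutionFound e) {
                // swallowed: the loop simply moves on to the next branch
            }
        }
    }

    // No catch inside: the caller wraps the *initial* call in a try block,
    // so the first solution unwinds the whole stack -> single chain of calls.
    static void findCatchOutsideLoop(int n, int vertices) {
        calls++;
        if (n == vertices) {
            throw new SolutionFound();
        }
        for (int c = 0; c < 3; c++) {
            findCatchOutsideLoop(n + 1, vertices);
        }
    }
}
```

For `vertices = 5`, catching inside the loop yields 364 entries into the method (the full ternary tree), while catching only around the initial call yields 6.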

After analyzing your algorithm, it appears that its goal is to color every node of a graph one of three colors, such that no node shares a color with one of its neighbors. Here's a version of your algorithm that looks clearer to me:
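Stephan's rewritten version was not preserved in this extract. The sketch below is my own reconstruction in the spirit of the advice that follows (enum for colors, collections instead of arrays, early return instead of an exception, colors passed in as a parameter, and tolerance for nodes that are already colored) — it is not his actual code:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Map;
import java.util.Set;

enum NodeColor { RED, GREEN, BLUE }

class GraphColoring {

    // Assigns a color to every node in `uncolored` such that no node shares a
    // color with a neighbor. Returns true when a complete coloring was found;
    // finding a solution is a normal outcome, signalled by a return value.
    static boolean colorGraph(Map<Integer, Set<Integer>> neighbors,
                              Map<Integer, NodeColor> coloring,
                              Deque<Integer> uncolored,
                              Set<NodeColor> availableColors) {
        if (uncolored.isEmpty()) {
            return true; // every node colored
        }
        int node = uncolored.pop();
        for (NodeColor candidate : availableColors) {
            if (isAllowed(node, candidate, neighbors, coloring)) {
                coloring.put(node, candidate);
                if (colorGraph(neighbors, coloring, uncolored, availableColors)) {
                    return true; // early return prunes the rest of the tree
                }
                coloring.remove(node); // backtrack and try the next color
            }
        }
        uncolored.push(node); // restore state for the caller
        return false;
    }

    // A color is allowed when no neighbor already has it; nodes that were
    // colored before the algorithm started are handled the same way.
    static boolean isAllowed(int node, NodeColor candidate,
                             Map<Integer, Set<Integer>> neighbors,
                             Map<Integer, NodeColor> coloring) {
        for (int neighbor : neighbors.getOrDefault(node, Set.of())) {
            if (coloring.get(neighbor) == candidate) {
                return false;
            }
        }
        return true;
    }
}
```

Because the early `return true` plays the role the exception played before, the successful branch still prunes the rest of the recursion tree, while the method stays an ordinary boolean search.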

Stephan van Hulst wrote:

Most importantly, use descriptive names. I think my example is much easier to read because I use fully written phrases for identifiers. The meaning of them is clear without having to analyze the context in which they're used. Use verbs for method names, and nouns for variable names.

Use appropriate types. An integer is a poor representation of a color. Use enums for types that have a small finite number of instances.

Don't use exceptions for flow control. Exceptions are for when something goes wrong. Finding a solution is a normal outcome of your algorithm, not an exceptional outcome.

Prefer collection types over arrays. I think arrays are really great for representing fixed coordinate systems, but otherwise I stick to lists, sets, maps, queues, etc.

Consider all edge cases that an algorithm might encounter. For instance, your algorithm doesn't take into account that nodes might already be colored before it starts running.

Don't hardcode values. Your algorithm is hardwired so that it uses three colors. Instead, you can pass the data that it needs through method parameters.

These are great, I will definitely keep them in mind. I'm grateful for everything, thank you so much.

It is sorta covered in the JavaRanch Style Guide.