
more recursions, by popular (Alba's) demand


nonevent99


Here are some more to practice on. Solve for the function as a big-O of n, and also give the run-time as a big-O of n.

 

function a(n) {

if (n <= 2) return 1;  // assumed base case (the original condition was cut off)

return a(n/3) + a(n/2);

}

 

function b(n) {

if (n <= 2) return 1;  // assumed base case

return a(n-1) + 2*a(n-2);

}

 

function c(n) {

if (n <= 2) return 1;  // assumed base case

return c(floor(log(n))) + e^n;

}

 

function d(n) {

if (n <= 2) return 1;  // assumed base case

return 2*d(n%(n/2));

}

 

function e(n) {

if (n <= 2) return 1;  // assumed base case

return e(n/2)* lg(n);

}

 

function f(n) {

if (n <= 2) return 1;  // assumed base case

return n*f(n-1);

}

 

function g(n) {

if (n <= 2) return 1;  // assumed base case

return g(n-1) + f(n-2) + e(n-3);

}

 

 

function h(n) {

if (n <= 2) return 1;  // assumed base case

return 2*h(n/2) + n*lg(n);

}

 

Good luck!

 


Wood, I don't think you got the runtime of b correct.

 

b(n) = b(n-1) + b(n-2)

 

If we build an execution tree, it is going to be a binary tree with height n - 1; the number of nodes in the tree gives us the number of calls, which is going to be on the order of 2^n. Am I missing an important part here? Try the real tree: with n = 6 I am getting 15 calls, and n = 5 yields 9 calls.

b(7)  = b(6) +  b(5) = 15 +  9 = 24
b(8)  = b(7) +  b(6) = 24 + 15 = 39
b(9)  = b(8) +  b(7) = 39 + 24 = 63
b(10) = b(9) +  b(8) = 63 + 39 = 102

That does not seem like linear growth to me. What do you think?
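
If you'd rather not draw the tree, here is a quick sketch that counts the calls (Python; the base case is a guess, since the original condition was cut off):

def calls_b(n):
    # count every invocation of b(n) = b(n-1) + 2*b(n-2),
    # assuming the lost base case is: if n <= 2, return 1
    if n <= 2:
        return 1
    return 1 + calls_b(n - 1) + calls_b(n - 2)

for n in range(3, 11):
    print(n, calls_b(n))

With that base case it prints 9 calls for n = 5 and 15 for n = 6, the same counts as above, and the numbers keep growing roughly like the Fibonacci sequence, i.e. exponentially, not linearly.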


There you go: 2^n. I find it really helpful to build a tree. Even in the other case, when you have to compute the big-O of the function, you have a tree, not binary anymore but ternary: (n-1), (n-2), (n-2). Don't know if this is going to work for you, but it definitely works for me. Screw the master theorem; as long as you know how to do summations correctly, you can always solve a recurrence by building a tree. I usually do a tree of height 3, because that's when the patterns become visible, and let the summation take over from there. Try it, it might work for you too.

 

Hope it helps.


function c(n) {

if (n <= 2) return 1;  // assumed base case

return c(floor(log(n))) + e^n;

}

run time O(log*(n))

function = e^n + e^log(n) + e^log(log(n)) + ... no clue about this series

 

function d(n) {

if (n <= 2) return 1;  // assumed base case

return 2*d(n%(n/2));

}

 

run time = O(1)

function = O(1)
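
A tiny check of that (Python; the base case is a guess, since the original condition was cut off). The point is that n % (n/2) is 0 or 1 for every n > 2, so d bottoms out after a single recursive call:

def d(n):
    # d(n) = 2*d(n % (n/2)), integer division; assumed base case: n <= 2 returns 1
    if n <= 2:
        return 1
    return 2 * d(n % (n // 2))

for n in [7, 10, 12345, 10**9 + 7]:
    print(n, n % (n // 2), d(n))   # the middle column is always 0 or 1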

 

function e(n) {

if (n <= 2) return 1;  // assumed base case

return e(n/2)* lg(n);

}

 

run time = O(lg(n))

function = lg(n) * lg(n/2) * lg(n/4) * ... =

= lg(n) * (lg(n) - 1) * (lg(n) - 2) * ... * (lg(n) - [lg(n) - 1]), so I guess we could say

function = O[lg(n)^lg(n)] ???? nonevent, can you confirm?


Well, pretty much the same logic as you posted.

 

The run time is O(log(n)) (by log(n) I mean log n base 2), since we keep dividing it by 2.

 

The function is

 

ln(n)*ln(n/2)*ln(n/4)*... for log(n)-1 times.

 

ln(n)*(ln(n)-ln(2))*(ln(n)-ln(4))*...*(ln(n)-(log(n)-1))

[ln(n)^2 - ln(n)*ln(2)]*(ln(n)-ln(4))*...

[ln(n)^3 - ln(n)^2*ln(4) - ln(n)^2*ln(2)]*...

 

So, if you multiply log(n)-1 such factors together, the leading term will be ln(n)^log(n), and we can bound the function by that.
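
If you want to sanity-check the bound numerically, a quick sketch (Python; the base case is an assumption, since the original condition was cut off):

import math

def e_val(n):
    # e(n) = e(n/2) * lg(n); assumed base case: n <= 2 returns 1
    if n <= 2:
        return 1.0
    return e_val(n / 2) * math.log2(n)

for k in [4, 8, 16]:
    n = 2 ** k
    print(n, e_val(n), math.log2(n) ** math.log2(n))   # the value vs. the (lg n)^(lg n) bound

The computed value stays below (lg n)^(lg n), which is consistent with bounding each of the log(n)-1 factors by the largest one.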

 

Does it make sense? Am I completely off?


It does make sense, but the one part where you're losing me is why you switch to ln(n).

 

For f(n) I guess you can put O(1), and your runtime is good.

Another case is if what nonevent is looking for is actually the value of f(n), that is, if this was a recurrence problem like this:

 

f(n) = n%3 (n =1,2,3)

f(n) = nf(n-1)

 

which is then going to be n!/3!, and then the recurrence starts to unroll.

 

The above might be the case if nonevent made a typo, instead of writing

 

function f(n) {

if (n <= 2) return 1;  // assumed base case

return n*f(n-1);

}

 

 


a(n) = a(n/3) + a(n/2);

 

Did you guys try plugging this into a piece of software and actually calculating it? It's instructive: a(n) definitely doesn't grow as fast as n. So you could say it's O(n), but there is a tighter bound.

 

If you guess that a(n) = O(n^p) for some p that we need to discover, then you can suppose that a(n) = k * n^p for really big n. Substituting, a(n) = k * (n/2)^p + k * (n/3)^p. If we set this equal to a(n) = k * n^p, the k cancels and p needs to solve 1/2^p + 1/3^p = 1. Trying out various values of p, I find that p = 0.78 (with k = 1) quantitatively matches what I get if I write a piece of software to compute a.

 

So a(n) is approximately Theta(n^0.78).
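
A sketch of what such a program might look like (Python; the base case is a guess, since the original condition was cut off):

from functools import lru_cache

@lru_cache(maxsize=None)
def a(n):
    # a(n) = a(n/3) + a(n/2), integer division; assumed base case: n <= 2 returns 1
    if n <= 2:
        return 1
    return a(n // 3) + a(n // 2)

def solve_p():
    # bisection for the exponent p that solves 1/2^p + 1/3^p = 1
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if 2 ** -mid + 3 ** -mid > 1:
            lo = mid
        else:
            hi = mid
    return lo

p = solve_p()
print("p =", p)
for n in [10**3, 10**5, 10**7]:
    print(n, a(n), n ** p)   # compare a(n) with n^p and with n itself

The bisection gives p just under 0.79, and the table makes it easy to see that a(n) grows like n^p (up to a constant that depends on the base case), not like n.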

 

Obviously, on the GRE we won't have recourse to software. But there is a lesson here: If the problem had been a(n) = 2*a(n/2), then obviously the answer would be a(n) = O(n). But here we only have a(n/3) + a(n/2). It turns out that this makes an asymptotic difference in the function's value.

 

I hope you think that's as cool as I do. ;)

 

The run-time should be log (n base 3) because that's what it takes to drive n down to 1.

 

 

 

function b(n) {

if (n <= 2) return 1;  // assumed base case

return b(n-1) + 2*b(n-2); // not a

}

 

Definitely O(2^(n-1)), as calculating a few will tell. Run time O(n).

 


Originally posted by AlbaLed

 

function c(n) {

if (n <= 2) return 1;  // assumed base case

return c(floor(log(n))) + e^n;

}

run time O(log*(n))

function = e^n + e^log(n) + e^log(log(n)) + ... no clue about this series

 

That looks right, so it's e^n + n + log(n) + log(log(n)) + ...

 

which I think can be bounded above by e^n + 2*n, yielding O(e^n).
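
A quick check of everything after the e^n term (Python sketch; the stopping point stands in for the truncated base case):

import math

def tail(n):
    # sum n + log(n) + log(log(n)) + ... down to the assumed stopping point
    total = 0.0
    while n > 2:
        total += n
        n = math.log(n)
    return total

for n in [10, 100, 10**6]:
    print(n, tail(n), 2 * n)   # the tail stays below 2*n, so the e^n term dominates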


nice catch on

a(n) = a(n/3) + a(n/2);

 

Do you know of any faster way to work on this problem, because you cannot assume a(n) ...

 

The run-time should be log (n base 3) because that's what it takes to drive n down to 1.

 

True for the leftmost part of the recursion, but what about the n/2 side? Isn't big-O supposed to capture the max, worst-case time? Even though it does not make any difference, because log3(n) is only a constant factor away from lg(n), I want to make sure I get your line of thinking.

 


I hope you think that's as cool as I do. ;)
Indeed!!

 

 

function b(n) {

if (n <= 2) return 1;  // assumed base case

return b(n-1) + 2*b(n-2); // not a

}

 

Definitely O(2^(n-1)), as calculating a few will tell. Run time O(n).

 

Look at AlbaLed's explanation and mine for the runtime. It is definitely not linear... I ran a program, and after n = 50 my Pentium IV at work was crawling! :D Why isn't it O(3^n)?

 

 


Originally posted by AlbaLed

 

nice catch on

a(n) = a(n/3) + a(n/2);

 

Do you know of any faster way to work on this problem, because you cannot assume a(n) ...

 

The run-time should be log (n base 3) because that's what it takes to drive n down to 1.

 

True for the leftmost part of the recursion, but what about the n/2 side? Isn't big-O supposed to capture the max, worst-case time? Even though it does not make any difference, because log3(n) is only a constant factor away from lg(n), I want to make sure I get your line of thinking.

 

 

You're probably right that there's an asymptotically tighter bound than log3(n). I don't know how to get it off the top of my head, though. The problem is that 3 isn't a power of 2. If somebody had said a(n) = a(n/2) + a(n/4), then I'd say that computing the left hand subtree of a(n) is twice as expensive as computing the right hand subtree. So R(n) = 2*R(n/4) + R(n/4) (where R = runtime) = 3*R(n/4), which has a nice clean solution using the master theorem.

 

Hm. Is it true that computing the left hand subtree is twice as expensive as the right hand? I'm not so good with trees.
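
One way to poke at that is to just count the calls (a Python sketch; the base case is a guess, since the original condition was cut off):

from functools import lru_cache

@lru_cache(maxsize=None)
def calls(n):
    # number of calls made when evaluating a(n) = a(n/2) + a(n/4)
    # (integer division; assumed base case: n <= 2 is a single call)
    if n <= 2:
        return 1
    return 1 + calls(n // 2) + calls(n // 4)

for n in [10**3, 10**6, 10**9]:
    # cost of the left subtree vs. twice the cost of the right subtree
    print(n, calls(n), calls(n // 2), 2 * calls(n // 4))

Comparing the last two columns shows how close the "left subtree costs twice the right subtree" approximation is for a given n.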

 

 


Originally posted by wood

 

I hope you think that's as cool as I do. ;)
Indeed!!

 

 

function b(n) {

if (n <= 2) return 1;  // assumed base case

return b(n-1) + 2*b(n-2); // not a

}

 

Definitely O(2^(n-1)), as calculating a few will tell. Run time O(n).

 

Look at AlbaLed's explanation and mine for the runtime. It is definitely not linear... I ran a program, and after n = 50 my Pentium IV at work was crawling! :D Why isn't it O(3^n)?

 

 

 

It's those kinds of cavalier statements that'll get me a 630 on this exam if I'm not careful! [V] The run time is essentially the same as the run time of b(n) = b(n-1) + b(n-2), i.e. computing the Fibonacci numbers. We hashed through that in another thread.

 

http://www.TestMagic.com/forum/topic.asp?TOPIC_ID=8613

 

 


Originally posted by AlbaLed

 

It does make sense, but the one part where you're losing me is why you switch to ln(n).

 

For f(n) I guess you can put O(1), and your runtime is good.

Another case is if what nonevent is looking for is actually the value of f(n), that is, if this was a recurrence problem like this:

 

f(n) = n%3 (n =1,2,3)

f(n) = nf(n-1)

 

which is then going to be n!/3!, and then the recurrence starts to unroll.

 

The above might be the case if nonevent made a typo, instead of writing

 

function f(n) {

if (n <= 2) return 1;  // assumed base case

return n*f(n-1);

}

 

 

 

Yup, typo. Definite typo. :drunk:

 

So you're both right!

