nonevent99 Posted November 1, 2003

Here are some more to practice on. Solve for the function as a big-O of n, and also give the run-time as a big-O of n.

function a(n) { if (n < 2) return 1; return a(n/3) + a(n/2); }
function b(n) { if (n < 2) return 1; return a(n-1) + 2*a(n-2); }
function c(n) { if (n < 2) return 1; return c(floor(log(n))) + e^n; }
function d(n) { if (n < 2) return 1; return 2*d(n%(n/2)); }
function e(n) { if (n < 2) return 1; return e(n/2)*lg(n); }
function f(n) { if (n < 2) return 1; return n*f(n-1); }
function g(n) { if (n < 2) return 1; return g(n-1) + f(n-2) + e(n-3); }
function h(n) { if (n < 2) return 1; return 2*h(n/2) + n*lg(n); }

Good luck!
AlbaLed Posted November 1, 2003

function b(n) { if (n < 2) return 1; return a(n-1) + 2*a(n-2); }

Is there a typo? Should a(n-1) be b(n-1) or not?
wood Posted November 1, 2003

I believe so...

function a(n) { if (n < 2) return 1; return a(n/3) + a(n/2); }
a(n) = O(n), run time = O(log n)

function b(n) { if (n < 2) return 1; return a(n-1) + 2*a(n-2); }
b(n) = O(3^n), run time = O(n)
AlbaLed Posted November 1, 2003

Wood, I don't think you got the runtime of b correct.

b(n) = b(n-1) + b(n-2)

If we build an execution tree, it is going to be a binary tree with height n - 1, and the number of nodes in the tree gives us the number of calls, which is going to be 2^n. Am I missing an important part here?

Try the real tree with n = 6; I am getting 15 calls, and n = 5 yields 9 calls.

b(7) = b(6) + b(5) = 24
b(8) = b(7) + b(6) = 24 + 15 = 39
b(9) = b(8) + b(7) = 39 + 24 = 63
b(10) = b(9) + b(8) = 63 + 39 = 102

That does not seem like linear growth to me. What do you think?
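The tree count above is easy to check by machine. A quick Python sketch (base cases at n <= 2 are an assumption; the original conditions were garbled): counting every call the naive recursion makes reproduces the 9 and 15 above, while the later hand sums run one short per level because each internal node is itself a call.

```python
# Count every call made by b(n) = b(n-1) + 2*b(n-2).
# Assumed base cases: b(1) and b(2) return immediately (one call each).
def calls(n):
    if n <= 2:
        return 1
    # one call for this node, plus both recursive subtrees
    return 1 + calls(n - 1) + calls(n - 2)

for n in range(5, 11):
    print(n, calls(n))
```

The counts (9, 15, 25, 41, 67, 109) clearly grow exponentially, not linearly.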
wood Posted November 1, 2003

You're right, AlbaLed. Somehow, I have trouble calculating the running time. So, in this case it is 2^(n-1) + 2^(n-2), right? We bound it to what, 2^n?
AlbaLed Posted November 1, 2003

There you go, 2^n. I find it really helpful to build a tree. Even in the other case, when you have to compute the big-O of the function, you have a tree; not binary anymore but ternary: (n-1), (n-2), (n-2). Don't know if this is going to work for you, but it definitely works for me. Screw the master theorem; as long as you know how to do summations correctly, you can always solve a recurrence by building a tree. I usually do a tree of height 3, because that's when the patterns become visible, and let the summation take over from there. Try it, it might work for you too. Hope it helps.
wood Posted November 1, 2003

AlbaLed... I usually try the iteration method or a tree. I swing between both, and because of that I sometimes lose time. Let me try the other ones...
wood Posted November 1, 2003

In c, is it supposed to be ln instead of log?

function c(n) { if (n < 2) return 1; return c(floor(ln(n))) + e^n; }

function d(n) { if (n < 2) return 1; return 2*d(n%(n/2)); }
d(n) = 2, run time = 2
AlbaLed Posted November 1, 2003

function c(n) { if (n < 2) return 1; return c(floor(log(n))) + e^n; }
run time = O(log*(n))
function = e^n + e^log(n) + e^log(log(n)) + ..... no clue about this series

function d(n) { if (n < 2) return 1; return 2*d(n%(n/2)); }
run time = O(1), function = O(1)

function e(n) { if (n < 2) return 1; return e(n/2)*lg(n); }
run time = O(lg(n))
function = lg(n)*lg(n/2)*lg(n/4)*... = lg(n)*(lg(n) - 1)*(lg(n) - 2)*...*(lg(n) - [lg(n) - 1]), so I guess we could say function = O[lg(n)^lg(n)]?

nonevent, can you confirm?
wood Posted November 1, 2003

Right, AlbaLed... I got the same thing and was about to post: O(ln(n)^log(n)).

Wood
AlbaLed Posted November 1, 2003

How did you get that? Enlighten us all, plz.
wood Posted November 1, 2003

Well, pretty much the same logic as you posted. The run time is O(log(n)) (by log(n) I mean log n base 2), since we keep dividing by 2. The function is ln(n)*ln(n/2)*ln(n/4)*... for log(n)-1 terms:

ln(n)*(ln(n)-ln(2))*(ln(n)-ln(4))*...*(ln(n)-ln(2^(log(n)-1)))

If you multiply it out, the leading term of the product is ln(n)^log(n), and we can bound the function by that. Does it make sense? Am I completely off?
wood Posted November 1, 2003

function f(n) { if (n < 2) return 1; return n*f(n-1); }
f(n) = ? For n >= 3, I'd say f(n) = 0?
run time = O(n)
AlbaLed Posted November 1, 2003

It does make sense, but the one part where you're losing me is why you switch to ln(n).

For f(n) I guess you can put O(1), and your runtime is good. Another case is if what nonevent is actually looking for is the number of calls to f(n). If this was a recurrence problem like this:

f(n) = n%3 (n = 1, 2, 3)
f(n) = n*f(n-1)

then it is going to be n!/3!, and then the recurrence starts to unroll. The above might be the case if nonevent made a typo instead of writing:

function f(n) { if (n < 2) return 1; return n*f(n-1); }
wood Posted November 2, 2003

Oops... my mistake. Somehow, I thought that it was ln() from the definition of e(). It is therefore O(log(n)^log(n)).
nonevent99 Posted November 2, 2003

a(n) = a(n/3) + a(n/2);

Did you guys try plugging this into a piece of software and actually calculating it? It's instructive. a(n) definitely doesn't grow as fast as n. So you could say it's O(n), but there is a tighter bound. If you guess that a(n) = O(n^p) for some p that we need to discover, then you can suppose that a(n) = k * n^p for really big n. Substituting, a(n) = k * (n/2)^p + k * (n/3)^p. If we set this equal to k * n^p, the k cancels, and p needs to solve 1/2^p + 1/3^p = 1. Trying out various values, I find that p = 0.78 (with k = 1) quantitatively matches what I get if I write a piece of software to compute a. So a(n) is approximately Theta(n^0.78).

Obviously, on the GRE we won't have recourse to software. But there is a lesson here: if the problem had been a(n) = 2*a(n/2), then obviously the answer would be a(n) = O(n). But here we only have a(n/3) + a(n/2). It turns out that this makes an asymptotic difference in the function's value. I hope you think that's as cool as I do. ;)

The run-time should be log base 3 of n, because that's what it takes to drive n down to 1.

function b(n) { if (n < 2) return 1; return b(n-1) + 2*b(n-2); } // not a

Definitely O(2^(n-1)), as calculating a few will tell. Run time O(n).
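The exponent can be pinned down without trial and error by solving 1/2^p + 1/3^p = 1 numerically. A Python sketch (integer division and a base case a(1) = 1 are my assumptions, not from the original post):

```python
from functools import lru_cache

def residual(p):
    # root of 2**-p + 3**-p - 1 = 0 gives the exponent
    return 2.0 ** -p + 3.0 ** -p - 1.0

# bisection: residual is decreasing, positive at p=0.5, negative at p=1.0
lo, hi = 0.5, 1.0
for _ in range(60):
    mid = (lo + hi) / 2.0
    if residual(mid) > 0:
        lo = mid
    else:
        hi = mid
p = (lo + hi) / 2.0
print(p)  # ~0.788

# sanity check against the recursion itself (memoized, integer division)
@lru_cache(maxsize=None)
def a(n):
    if n < 2:
        return 1
    return a(n // 3) + a(n // 2)

print(a(10**6) / (10**6) ** p)  # hovers near a constant
```

So the tighter bound is closer to Theta(n^0.79) than n^0.78, though both round to the same answer on an exam.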
nonevent99 Posted November 2, 2003

Originally posted by AlbaLed:

function c(n) { if (n < 2) return 1; return c(floor(log(n))) + e^n; }
run time = O(log*(n))
function = e^n + e^log(n) + e^log(log(n)) + ..... no clue about this series

That looks right, so it's e^n + n + log(n) + log(log(n)) + ..., which I think can be bounded above by e^n + 2*n, yielding O(e^n).
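That bound is easy to spot-check numerically. A Python sketch (I'm assuming natural log, which makes e^log(n) = n exactly, and a cutoff once the argument drops below 2):

```python
import math

def tail(n):
    # sum e^log(n) + e^log(log(n)) + ..., stopping once the argument < 2
    total, x = 0.0, math.log(n)
    while x >= 2:
        total += math.e ** x
        x = math.log(x)
    return total

for n in (10, 100, 10**6):
    print(n, tail(n), tail(n) <= 2 * n)  # the tail stays under 2n
```

Since everything after the e^n term sums to at most 2n, the whole series is indeed O(e^n).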
nonevent99 Posted November 2, 2003

function d(n) { if (n < 2) return 1; return 2*d(n%(n/2)); }

You guys got this one. Pretty much always equals 2 (the exception is the base case, n < 2).
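Quick to confirm by brute force in Python (the base case d(n) = 1 for n < 2 is my reading of the garbled original):

```python
def d(n):
    if n < 2:
        return 1
    # n % (n // 2) is always 0 or 1 for n >= 2,
    # so the recursion reaches the base case in one step
    return 2 * d(n % (n // 2))

print(all(d(n) == 2 for n in range(2, 10001)))  # True
```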
AlbaLed Posted November 2, 2003

Nice catch on a(n) = a(n/3) + a(n/2). Do you know of any faster way to work on this problem? Because you cannot assume the form of a(n) up front.

The run-time should be log base 3 of n, because that's what it takes to drive n down to 1.

True for the leftmost part of the recursion, but what about the n/2 side? Isn't big-O supposed to capture the max, worst-case time? Even though it does not make any difference, because log3(n) is only a constant factor away from lg(n), I want to make sure I get your line of thinking.
nonevent99 Posted November 2, 2003

This is probably the coolest of all the functions I posted.

n     e(n)     lg n
1     1        0
2     1        1
4     2        2
8     6        3
16    24       4
32    120      5
64    720      6
128   5040     7
256   40320    8
512   362880   9

Recognize the sequence?
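The e(n) column is the factorials: at n = 2^k, e(n) = k! = (lg n)!. A Python sketch that regenerates the table (assuming e(1) = 1 and evaluating only at powers of two, where lg is exact):

```python
import math

# e(n) = e(n/2) * lg(n), assumed base case e(1) = 1
def e(n):
    if n < 2:
        return 1
    return e(n // 2) * int(math.log2(n))

for k in range(10):
    n = 2 ** k
    print(n, e(n), k)  # matches the table: e(2**k) == k!
```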
wood Posted November 2, 2003

I hope you think that's as cool as I do. ;)

Indeed!!

function b(n) { if (n < 2) return 1; return b(n-1) + 2*b(n-2); } // not a

Definitely O(2^(n-1)), as calculating a few will tell. Run time O(n).

Look at AlbaLed's explanation and mine for the runtime. It is definitely not linear. I ran a program, and after 50 my Pentium IV at work was crawling! :D Why isn't it O(3^n)?
AlbaLed Posted November 2, 2003

n permute n-1, hahahaha :p
nonevent99 Posted November 2, 2003

Originally posted by AlbaLed:

Nice catch on a(n) = a(n/3) + a(n/2). Do you know of any faster way to work on this problem? ... True for the leftmost part of the recursion, but what about the n/2 side? Isn't big-O supposed to capture the max, worst-case time?

You're probably right that there's an asymptotically tighter bound than log3(n). I don't know how to get it off the top of my head, though. The problem is that 3 isn't a power of 2. If somebody had said a(n) = a(n/2) + a(n/4), then I'd say that computing the left-hand subtree of a(n) is twice as expensive as computing the right-hand subtree. So R(n) = 2*R(n/4) + R(n/4) (where R = runtime) = 3*R(n/4), which has a nice clean solution using the master theorem.

Hm. Is it true that computing the left-hand subtree is twice as expensive as the right-hand one? I'm not so good with trees.
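One way to settle it empirically is to count the calls the naive recursion actually makes. The count obeys C(n) = C(n/2) + C(n/3) + 1 (both subtrees, plus the node itself), and numerically it grows polynomially, roughly like n^0.79, rather than logarithmically. A Python sketch (integer division and a base case at n < 2 are assumptions on my part):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def count(n):
    # calls made by the naive a(n) = a(n//3) + a(n//2), base at n < 2
    if n < 2:
        return 1
    return 1 + count(n // 3) + count(n // 2)

for n in (10**3, 10**4, 10**5):
    print(n, count(n))  # grows roughly 6x per decade, i.e. like n**0.79
```

(The memoization only speeds up computing the count; the value returned is still the size of the full unmemoized call tree.) So log3(n) describes the depth of the tree, not the total work.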
nonevent99 Posted November 2, 2003

Originally posted by wood:

Look at AlbaLed's explanation and mine for the runtime. It is definitely not linear... Why isn't it O(3^n)?

It's those kinds of cavalier statements that'll get me a 630 on this exam if I'm not careful! [V] The run time is essentially the same as if we had to calculate the run time of b(n) = b(n-1) + b(n-2), i.e. computing the Fibonacci numbers. We hashed through that in another thread:
http://www.TestMagic.com/forum/topic.asp?TOPIC_ID=8613
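The Fibonacci connection also gives the tighter base: the call count satisfies C(n) = C(n-1) + C(n-2) + 1, so it grows like phi^n with phi, the golden ratio, about 1.618; that is, O(1.62^n) rather than O(2^n) or O(3^n). A Python sketch (base cases at n <= 2 assumed):

```python
# Call count of b(n) = b(n-1) + 2*b(n-2); the 2* multiplier adds no calls.
def calls(n):
    if n <= 2:
        return 1
    return 1 + calls(n - 1) + calls(n - 2)

phi = (1 + 5 ** 0.5) / 2
print(calls(25) / calls(24))  # ratio approaches phi ~ 1.618
```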
nonevent99 Posted November 2, 2003

Originally posted by AlbaLed:

The above might be the case if nonevent made a typo...

Yup, typo. Definite typo. :drunk: So you're both right!