a[n] = Θ(lg n)?


e(ho0n3

I posted this on PF but it didn't get very far:

 

Suppose {a[n]} is an increasing sequence and whenever m divides n, then

 

a[n] = a[n/m] + d

 

where m is a positive integer and d is a real number. Show that a[n] = Θ(lg n).

 

I can show that a[n] = O(lg n). All I need to do is show that a[n] = Ω(lg n). This is where I'm stuck. Any hints?


Actually those are sets. They are used in order analysis to determine how a particular function grows.

 

We say that g(x) = O(f(x)) iff there exist constants c > 0 and k >= 0 such that g(x) <= cf(x) for all x >= k. In other words, g(x) = O(f(x)) iff g(x) grows no faster than f(x), i.e. O(f(x)) is the set of functions that grow no faster than f(x). Example: x^2 + x + 3 = O(x^2). Also x = O(x^2), but x^3 != O(x^2).
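To make the definition concrete, here is a quick Python check of the example (a sketch; the witnesses c = 3 and k = 2 are one valid choice, not the only one):

```python
def g(x):
    # The example function: g(x) = x^2 + x + 3
    return x**2 + x + 3

# Witnesses for g(x) = O(x^2): c = 3, k = 2, since
# x^2 + x + 3 <= 3x^2 whenever 2x^2 >= x + 3, i.e. for all x >= 2.
assert all(g(x) <= 3 * x**2 for x in range(2, 10000))

# x = O(x^2) too (witnesses c = 1, k = 1).
assert all(x <= x**2 for x in range(1, 10000))
```

Of course the asserts only sample finitely many x; the inequality for all x >= 2 follows from 2x^2 >= x + 3 there.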

 

Similarly, we say that g(x) = Ω(f(x)) iff there exist constants c > 0 and k >= 0 such that g(x) >= cf(x) for all x >= k. In other words, Ω(f(x)) is the set of functions that grow no slower than f(x), practically the opposite of O(f(x)).
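For contrast, a quick Python check of the Ω side (my own example, not from the post above; the witnesses are one valid choice):

```python
# x^2 = Omega(x): take c = 1, k = 1, since x^2 >= x for all x >= 1.
assert all(x**2 >= x for x in range(1, 10000))

# But x != Omega(x^2): for any fixed c > 0, x >= c*x^2 fails once x > 1/c.
c = 0.001
assert not all(x >= c * x**2 for x in range(1, 10000))
```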

 

If g(x) = Ω(f(x)) and g(x) = O(f(x)) then g(x) = Θ(f(x)) and vice versa.

 

So basically what I need to show is that a[n] grows no faster and no slower than lg n.


x^2 + x + 3 = O(x^2)? I don't understand how those two can be equal; one is a function and the other is a set of functions.

 

Did you mean

[math]x^2+x+3 \in O(x^2)[/math]?

 

It's a new concept to me, and I'm finding it a little difficult to visualise. I will have another look at it later.


Oh yeah. I forgot to mention that it's a common convention to say

 

[math]x^2 + x + 3 = O(x^2)[/math]

 

instead of

 

[math]x^2 + x + 3 \in O(x^2)[/math]

 

The two are equivalent; the "=" form is just convention.


Where you have n/m, do you mean the floor/ceiling of n/m?

If so, I think you can use the "Master Theorem". This gives asymptotic bounds.

No, I don't mean the floor or the ceiling of n/m. Read the post carefully. Anyway, I would like to solve this without using the Master Theorem.


So do you mean it doesn't matter what the values of a[n] are when m doesn't divide n, as long as the property of it being an increasing sequence is maintained?

 

If so, then isn't a[n] lower bounded by the sequence {b[n]} with b[n] defined as:

 

b[n] = b[Floor[n/m]] + d

 

In that case you could still use the Master Theorem :rolleyes:

Even if you don't want to use the master theorem, does that help at all?


So do you mean it doesn't matter what the values of a[n] are when m doesn't divide n, as long as the property of it being an increasing sequence is maintained?

Yes. The recurrence is defined ONLY when m divides n.

 

If so, then isn't a[n] lower bounded by the sequence {b[n]} with b[n] defined as:

 

b[n] = b[Floor[n/m]] + d

Interesting, but first I would need to show that b[n] is less than or equal to a[n] for all n. I haven't attempted this yet so bear with me.

 

In that case you could still use the Master Theorem

I would but the book I got this problem from does not mention the Master Theorem anywhere.

 

Even if you don't want to use the master theorem, does that help at all?

We'll see.

 

Let b[1] = c

b[m] = c + d

b[m^2] = c + 2d

 

b[n] = c + Floor[Log_m[n]]*d

b[n] = Θ(1) + Θ(lg n)
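This closed form can be spot-checked in a few lines of Python (a sketch; m = 2, c = 5, d = 3 are arbitrary test values, and the integer log avoids floating-point log pitfalls):

```python
def b(n, m=2, c=5, d=3):
    # b[1] = c;  b[n] = b[floor(n/m)] + d  for n > 1
    return c if n == 1 else b(n // m, m, c, d) + d

def ilog(n, m):
    # Exact floor(log_m n) by repeated integer division
    k = 0
    while n >= m:
        n //= m
        k += 1
    return k

# Check the closed form b[n] = c + floor(log_m n) * d
assert all(b(n) == 5 + ilog(n, 2) * 3 for n in range(1, 5000))
```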

Note that the "d" in the recurrence of a[n] is not constant (it wouldn't make sense otherwise). I'll examine your suggestion more carefully later (I'm working and I shouldn't be doing this). Thanks.


I forgot to mention two important details: m > 1 and d > 0.

 

OK. I've managed to show that b[n] ≤ a[n] for all n assuming b[0] = a[0].

 

All I need to do now is show the following: Let f and g be functions of x such that f(x) ≤ g(x) for all x. For some function h, if f = O(h) then g = Ω(h). Proof: Define h such that f(x) ≤ h(x) ≤ g(x).

 

In order to prove that a[n] = Ω(lg n), let f(n) = b[n] - 1 so f(n) < b[n] ≤ a[n] and since f(n) = O(lg n), then a[n] = Ω(lg n).

 

Somehow I have a feeling that I've made an error somewhere. Can anybody check this for me please?


  • 1 month later...

I revive this old thread in order to resolve the matter in question once and for all. I have discussed this problem with a colleague of mine. He suggested that m and d are both constants, which I certainly agree with. The misunderstanding came from the wording of the problem (i.e. the problem does not explicitly state that d and m are constants). Here is a similar, albeit harder, ambiguous problem:

 

Suppose {a[n]} is an increasing sequence and that whenever m divides n, a[n] = ca[n/m] + d, where c and d are real numbers satisfying c > 1 and d > 0, and m is an integer satisfying m > 1. Show that a[n] = [math]\textstyle \Theta (n^{\log _m c})[/math].

 

Nowhere does the problem state that c, d, and m are constants, though one must assume this to solve the problem. Here is the solution:

 

Let n = m^k so a[n] = a[m^k]. Define b[k] = a[m^k]. The given recurrence can be written as b[k] = cb[k - 1] + d. "Expanding" the recurrence yields b[k] = c^k b[0] + c^(k-1) d + c^(k-2) d + ... + cd + d. Simplifying this expression yields

 

a[m^k] = b[k] = c^k a[1] + d(c^k - 1)/(c - 1)
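The expansion and the geometric-series simplification can be verified numerically; a minimal sketch with arbitrary test values (c = 2.5, d = 3, b[0] = 1 are my choices, not part of the problem):

```python
def b(k, c=2.5, d=3.0, b0=1.0):
    # b[k] = c*b[k-1] + d, with b[0] = a[1]
    return b0 if k == 0 else c * b(k - 1, c, d, b0) + d

def closed(k, c=2.5, d=3.0, b0=1.0):
    # b[k] = c^k*b[0] + d*(c^k - 1)/(c - 1)   (sum of the geometric series)
    return c**k * b0 + d * (c**k - 1) / (c - 1)

# Recursive definition and closed form agree up to float rounding
assert all(abs(b(k) - closed(k)) <= 1e-9 * closed(k) for k in range(25))
```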

 

where a[1] = b[0] (by definition). Since [math]\textstyle c^k = m^{\log _m c^k} = m^{k\log _mc} = n^{\log _m c}[/math] then [math]\textstyle a_{m^k} = \Theta (n^{\log _m c})[/math]. But what happens if n is not a power of m? Algebra says m^(k-1) < n ≤ m^k, so a[m^(k-1)] < a[n] ≤ a[m^k], or

 

c^(k-1) a[1] + d(c^(k-1) - 1)/(c - 1) < a[n] ≤ c^k a[1] + d(c^k - 1)/(c - 1)

 

Concentrating on the left-hand side, one can see that

 

c^(k-1) a[1] + d(c^(k-1) - 1)/(c - 1) ≥ c^(k-1) (assuming a[1] ≥ 1; the d term is nonnegative since c > 1 and d > 0) = [math]\frac{1}{c} m^{k \log _m c} \ge \frac{1}{c}n^{\log _m c} = \Omega(n^{\log _m c})[/math]

 

On the right-hand side, one can use algebra and derive

 

c^k a[1] + d(c^k - 1)/(c - 1) = [math]m^{k \log _m c}a_1 +\dots < cn^{\log _m c}a_1 + \dots = O(n^{\log _m c})[/math]

 

Therefore [math]\textstyle a_n = \Theta(n^{\log _m c})[/math]. What I have essentially proven here is a simpler version of the Master Theorem. The original problem (see the first post) is an even simpler version of the Master Theorem. How interesting!
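As a final sanity check, the sandwich bound for n between powers of m can be tested numerically (a sketch; m = 3, c = 2, d = 1, a[1] = 1 are my test values, and the constants 1/c and C below are the witnesses from the argument above):

```python
import math

m, c, d, a1 = 3, 2.0, 1.0, 1.0
alpha = math.log(c, m)              # the exponent log_m c

def closed(k):
    # a[m^k] = c^k*a[1] + d*(c^k - 1)/(c - 1)
    return c**k * a1 + d * (c**k - 1) / (c - 1)

C = c * (a1 + d / (c - 1))          # constant for the O side
for k in range(1, 10):
    for n in range(m**(k - 1) + 1, m**k + 1):
        # Omega side: a[n] > a[m^(k-1)] >= (1/c) * n^alpha
        assert closed(k - 1) >= (1 / c) * n**alpha * (1 - 1e-9)
        # O side: a[n] <= a[m^k] <= C * n^alpha
        assert closed(k) <= C * n**alpha * (1 + 1e-9)
```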

