Can you take the derivative of this?


Johnny5


In another thread, there was a stage where the following derivative was to be taken:

 

[math]

\frac{d}{dn} \log(n) = \frac{1}{n}

[/math]

 

The problem was that n was restricted to the natural numbers, not the real numbers. But if there is an error in the differential calculus itself, or with the limit concept in general, then perhaps there was no error in that step.

 

In this thread, I simply want to investigate whether or not you can take the 'derivative' above.

 

Let f(x) denote an arbitrary function of the variable x.

 

The difference operator is defined as follows:

 

[math] \Delta f(x) \equiv f(x+h) - f(x) [/math]

 

h is called the step size.

 

In the case here, x is restricted to the natural number system, and h=1. Thus, we have:

 

[math] \Delta f(n) \equiv f(n+1) - f(n) [/math]
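As an aside, the difference operator is easy to play with numerically. Here is a minimal Python sketch (the function name delta and the sample values are my own, not from the thread):

[code]
# A minimal sketch of the forward difference operator: Delta f(x) = f(x + h) - f(x).

def delta(f, x, h=1):
    """Forward difference of f at x, with step size h."""
    return f(x + h) - f(x)

# For a sequence we fix the step size at h = 1:
print(delta(lambda n: n**2, 3))  # (3 + 1)**2 - 3**2 = 7
[/code]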

 

 

Now, the derivative of an arbitrary function f(x) is defined as follows:

 

[math] \frac{df}{dx} \equiv \lim_{h \to 0} \frac{\Delta f(x)}{h} [/math]

 

I am going to do an example, for the case of a simple parabola.

 

Consider the graph of f(x) = x^2. The graph is a parabola.

 


 

In the case of a parabola with its vertex at the origin (0,0), we have this:

 

[math]

f(x) = x^2 \text{ and } \Delta f(x) \equiv f(x+h) - f(x)

[/math]

 

[math] \therefore [/math]

 

[math] f(x+h) = (x+h)^2 [/math]

 

[math] f(x+h)-f(x) = (x+h)^2 - x^2 [/math]

 

[math] \therefore [/math]

 

[math] \Delta f(x) = (x+h)^2 - x^2 [/math]

 

Therefore:

 

[math] \frac{\Delta f(x)}{h} = \frac{(x+h)^2 - x^2}{h} [/math]

 

Therefore:

 

[math] \lim_{h \to 0} \frac{\Delta f(x)}{h} = \lim_{h \to 0} \frac{(x+h)^2 - x^2}{h} [/math]

 

Therefore:

 

[math] \frac{df}{dx} = \lim_{h \to 0} \frac{(x+h)^2 - x^2}{h} [/math]

 

Therefore:

 

[math] \frac{df}{dx} = \lim_{h \to 0} \frac{(x^2+h^2+2xh - x^2)}{h} [/math]

 

[math] \therefore [/math]

 

[math] \frac{df}{dx} = \lim_{h \to 0} \frac{(h^2+2xh)}{h} [/math]

 

[math] \therefore [/math]

 

[math] \frac{df}{dx} = \lim_{h \to 0} \frac{h(h+2x)}{h} [/math]

 

[math] \therefore [/math]

 

[math] \frac{df}{dx} = \lim_{h \to 0} (h+2x) [/math]

 

[math] \therefore [/math]

 

[math] \frac{df}{dx} = 2x [/math]

 

This is the formula for the slope of the straight line tangent to x^2, for any x.

 

So that is an example of the concept of the derivative in action, in the case where f(x) is a smooth curve. A division-by-zero error was avoided: h could be factored out of the numerator, so it canceled out of the denominator before the limit was taken.
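A quick numerical illustration of this limit, in Python (a sketch; the sample point x = 3 is an arbitrary choice of mine):

[code]
# The difference quotient of f(x) = x**2 should approach 2*x as the real step h shrinks.
f = lambda x: x**2
x = 3.0
for h in [1.0, 0.1, 0.01, 0.001]:
    print(h, (f(x + h) - f(x)) / h)  # prints 7.0, 6.1, 6.01, 6.001 -> approaching 2*x = 6
[/code]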

 

Now, consider the case where f(x) is actually a sequence f(n) where [math] n \in \mathbb{N} [/math].

 

We still have to use the definition of derivative.

 

First let's examine f(n) = n^2.

 

We have:

 

[math] \Delta (n^2) = f(n+1)-f(n) = (n+1)^2 - n^2 = n^2+1+2n-n^2 = 1+2n [/math]

 

Now, in the definition of derivative, we have h going to zero, where the domain of h is the real numbers. But here, h is a constant, and is equal to 1.

 

But there is an alternative definition of derivative, which is this:

 

[math] \frac{df}{dx} \equiv \lim_{\Delta x \to 0} \frac{\Delta f(x)}{\Delta x} [/math]

 

Consider again, the definition of the difference operator:

 

[math]

\Delta f(x) \equiv f(x+h) - f(x)

[/math]

 

Consider the case where f(x) = x.

 

In this case we have:

 

[math]

\Delta f(x) = \Delta x = (x+h) - x = h

[/math]

 

So as you can see, the definitions are equivalent, that is, it doesn't matter whether we take the limit as h goes to zero, or the limit as delta x goes to zero.

 

So now, let us look at h, in the case of the sequence n. Here is the sequence, starting with n=1.

 

[math] (1,2,3,4,5,6,7,...) [/math]

 

 

Thus, in the case where f(n) = n, we have:

 

[math]

\Delta n = \Delta f(n) = f(n+1) - f(n) = (n+1)-n = 1

[/math]

 

So, applying the definition of the derivative of f(x), to a sequence f(n), we have:

 

[math]

\frac{df(n)}{dn} \equiv \lim_{\Delta n \to 0} \frac{\Delta f(n)}{\Delta n} = \lim_{\Delta n \to 0} \frac{\Delta n}{\Delta n} = 1

[/math]

 

Now, let us look at the case where f(n) = n^2, so we can compare it to the smooth parabola, f(x)=x^2. We have:

 

[math]

\frac{df}{dn} \equiv \lim_{\Delta n \to 0} \frac{\Delta f(n)}{\Delta n}

[/math]

 

Now, we have already seen that delta n =1, hence we have:

 

[math]

\frac{df}{dn} \equiv \lim_{\Delta n \to 0} \Delta f(n)

[/math]

 

And for the sequence n^2, namely (1,4,9,16,25,36,49,64,81,...) we have:

 

[math] \Delta f(n) = f(n+1)-f(n) = (n+1)^2-n^2 = n^2+1+2n-n^2 = 1+2n [/math]

 

So that we have:

 

[math]

\frac{df}{dn} = \lim_{\Delta n \to 0} (1+2n)

[/math]

 

As there is no delta n term, I assume the answer is:

 

[math]

\frac{df}{dn} = 1+2n

[/math]
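A quick Python check (my own sketch, not part of the original post) that the forward difference of n^2 really is 1 + 2n for the first few n:

[code]
# Compare f(n+1) - f(n) with 1 + 2n for f(n) = n**2.
for n in range(1, 6):
    print(n, (n + 1)**2 - n**2, 1 + 2*n)  # the last two columns agree for every n
[/code]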

 

So, I suppose now I can attempt to evaluate the 'derivative' of log n.


For anyone wanting to answer this, you should know Johnny wished to apply L'Hopital's rule to something for which L'Hopital's rule is not valid, in particular this involved differentiating the series

 

log(1)+log(2)+...+log(n)

 

with respect to n

 

where n is naturally a natural number.


For anyone wanting to answer this, you should know Johnny wished to apply L'Hopital's rule to something for which L'Hopital's rule is not valid, in particular this involved differentiating the series

 

log(1)+log(2)+...+log(n)

 

with respect to n

 

where n is naturally a natural number.

 

Can you prove that L'Hopital's rule is invalid here? Rigorously?

 

I actually think it would be neat if you could do what I did in the other thread. I mean, really, I suppose you could just put it into a computer and see what happens, as n tends to infinity, by steps of one, and that solves the whole issue I would think. Use Maple, or Mathematica.


L'Hopital's rule applies to the ratio f/g when f and g are smooth functions that have a Taylor/Laurent series in the neighbourhood in which we are interested. So of course L'Hopital is invalid for this situation by the very hypothesis of when it applies. If you are asking whether there is an analogue for functions that do not possess Laurent series, then that is a different theorem, which almost surely does not exist.
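For reference, here is one standard statement of the rule over the reals (paraphrased from memory, so treat the exact hypotheses with care); note that it concerns real differentiable functions, which is exactly what a sequence is not:

[math]
\text{If } f, g \text{ are differentiable near } a, \; g'(x) \neq 0 \text{ near } a, \; \lim_{x \to a} f(x) = \lim_{x \to a} g(x) = 0 \text{ or } \pm\infty, \text{ and } \lim_{x \to a} \tfrac{f'(x)}{g'(x)} \text{ exists, then } \lim_{x \to a} \tfrac{f(x)}{g(x)} = \lim_{x \to a} \tfrac{f'(x)}{g'(x)}
[/math]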


L'Hopital's rule applies to the ratio f/g when f and g are smooth functions that have a Taylor/Laurent series in the neighbourhood in which we are interested. So of course L'Hopital is invalid for this situation by the very hypothesis of when it applies.

 

That is incorrect reasoning, Matt; come on, you are a mathematician, deductive thought is what you are best at.

 

Here, I'll show you...

 

Let F denote the set of all functions to which L'Hopital's rule applies.

 

As you have said, L'Hopital's rule applies to cases where F = f/g, where f and g are smooth. No argument here.

 

But that does not necessarily mean that it cannot apply to sequences. I mean it may not apply, but without some analysis, based upon the very definition of 'derivative', you should not a priori dismiss the possibility. In fact, if you found a proof that it couldn't be used, that proof itself would be important, and have pedagogical value as well.

 

If I am wrong, then construct a proof, before I construct one proving the opposite.


As the statement of L'Hopital's rule does not allow for the kinds of functions you are talking about, I am perfectly correct to state it is not valid to use it. You can attempt to define a different rule with different hypotheses that generalizes L'Hopital, but it will not be the same theorem.


As the statement of L'Hopital's rule does not allow for the kinds of functions you are talking about, I am perfectly correct to state it is not valid to use it. You can attempt to define a different rule with different hypotheses that generalizes L'Hopital, but it will not be the same theorem.

 

Well, OK, but I don't recall ever seeing a proof of L'Hopital's rule in the first place.


Only valid for |x| less than 1 or alpha an integer, before you go too far down whatever line of enquiry this is.

 

No, it's valid for more than that, Matt.

 

And the line of inquiry is supposed to be deriving a series expression for log x. However, I do thank you for saving me time if the series expression is only valid for |x|<1 or alpha an integer. However, you are mistaken, because you can use that series to compute the square root of two, as I have done before, which means it is valid for x=1 and alpha=1/2.


OK, shall we get it absolutely correct? The radius of convergence is 1, so it converges absolutely for all |x|<1. It will diverge for all |x|>1, and may or may not converge when |x|=1, depending on alpha.

 

E.g. it diverges when x=1 and alpha is -1.


Having read your edited and very long initial post, it seems all you want is the difference f(n+1)-f(n), which is simply log((n+1)/n) in the case you care about. Look up difference equations. What use it is is unclear.


Having read your edited and very long initial post, it seems all you want is the difference f(n+1)-f(n), which is simply log((n+1)/n) in the case you care about. Look up difference equations. What use it is is unclear.

 

I'm not sure what I wanted to do next.

 

Perhaps you are right though, let me go back and look.

 

 

This is wanted:

 

[math]

\frac{df}{dn} \equiv \lim_{\Delta n \to 0} \frac{\Delta f(n)}{\Delta n}

[/math]

 

In the case where f(n) = log (n) = ln (n).

 

In the case of a sequence f(n), delta n = 1, hence we want:

 

[math]

\frac{df}{dn} \equiv \lim_{\Delta n \to 0} \Delta f(n)

[/math]

 

In the case where f(n) = ln (n)

 

For an arbitrary sequence f(n) we have:

 

[math]

\Delta f(n) \equiv f(n+1) - f(n)

[/math]

 

Hence

 

[math]

\Delta f(n) = \Delta \log(n) = \log(n+1) - \log(n) = \log\left(\frac{n+1}{n}\right) = \log\left(1+\frac{1}{n}\right)

[/math]

 

So that we have:

 

[math]

\frac{df}{dn} \equiv \lim_{\Delta n \to 0} \Delta f(n)

[/math]

 

Hence:

[math]

\frac{df}{dn} = \lim_{\Delta n \to 0} \log\left(\frac{n+1}{n}\right)

[/math]

 

As there is no delta n term, I presume the answer is:

 

[math]

\frac{df}{dn} = \log\left(\frac{n+1}{n}\right) = \log\left(1+\frac{1}{n}\right)

[/math]
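Incidentally, the two candidate answers are numerically close for large n, since log(1 + 1/n) behaves like 1/n; a quick Python comparison (my own sketch):

[code]
import math

# Compare the forward difference log(1 + 1/n) with the real-variable derivative 1/n.
for n in [1, 10, 100, 1000]:
    print(n, math.log(1 + 1/n), 1/n)  # the two columns agree ever more closely as n grows
[/code]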

 

Well the answer isn't 1/n, as I had before. But consider the very original question...

 

Someone wanted to evaluate the limit of:

 

[math] (n!)^{\frac{1}{n}} [/math]

 

as n increases without bound.

 

Assuming there is a limit L, we have:

 

[math] L \equiv \lim_{n \to \infty} (n!)^{\frac{1}{n}} [/math]
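Since putting it into a computer was suggested above, here is a minimal Python probe of that limit (a sketch, not a proof; log-gamma is used to avoid overflowing n!):

[code]
import math

# Evaluate n!**(1/n) for increasing n, via exp(log(n!)/n) = exp(lgamma(n+1)/n).
for n in [1, 10, 100, 1000, 10000]:
    print(n, math.exp(math.lgamma(n + 1) / n))
# The values keep growing without any sign of leveling off, which suggests divergence.
[/code]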


Applying it to the original question you're going to get that n!^{1/n} converges, which is a shame since it is easy to prove that it diverges, as I did in that original thread - a proof you ignored.

 

(why do you "presume" your answers?)


Applying it to the original question you're going to get that n!^{1/n} converges, which is a shame since it is easy to prove that it diverges, as I did in that original thread - a proof you ignored.

 

Well, I realize that, because the limit as n goes to infinity of log(1+1/n) = log(1) = 0.

 

So by L'Hopital's rule, which you question, which is OK... we will get

 

log(L) = 0

 

from which it will follow that L=1.

 

So yes, I see that too.

 

Why do you still say it diverges? Which step am I doing that you have a problem with?

 

Actually, we get this:

 

[math] e^0 = \frac{0^0}{0!} [/math]

 

from the power series expansion.


I say it diverges because it does diverge, as is elementary to show, and indeed it can be shown to be true because the series expansion of e^x converges.

 

Also, you take log(1)+log(2)+...+log(n)

 

and differentiate with respect to n, and you stated that the first terms vanish because there is no n in them, which is false, since I can rewrite the sum as

 

log(n) + log(n-1) + log(n-2) + ... + log(n-(n-2)) + log(n-(n-1))

 

in which every term has an n in it.

 

Plus, there is, of course, the fact that you've not proved your "generalized" L'Hopital is true, and this example shows it may be false. Not that it's clear you can differentiate log(n!) properly.


(why do you "presume" your answers?)

 

 

Because some of what I do is beyond the scope of what is "generally known".

 

When I said 'presume' I didn't really mean I'm not sure that this is the answer. What I actually mean is that I can control my definitions so well that I can force this to be the answer if I so desire. Yet it might conflict with some accepted mathematics someplace else, though what, I've no clue.

 

I've found so many errors in 'accepted' things, I just do everything on my own now.

 

For example, there is an error in Cantor's diagonal proof.


 

Also, you take log(1)+log(2)+...+log(n)

 

and differentiate with respect to n, and you stated that the first terms vanish because there is no n in them, which is false, since I can rewrite the sum as

 

log(n) + log(n-1) + log(n-2) + ... + log(n-(n-2)) + log(n-(n-1))

 

in which every term has an n in it.


 

This is worth having a look at, I'll do it later.

 

Regards


I think I have the "johnny l'hopital conjecture"

 

Your "derivative" of a sequence a_n is simply a_n -a_{n-1}, isn't it? You could use that do differentiate log(n!) directly by the way and the answer is log(n), which, if used in the above generalized l'hopital, would give a limit of log(n)/1 which diverges.

 

Conjecture:

Suppose that a_n and b_n are sequences and that a_n and b_n diverge. Then the ratio a_n/b_n tends to the same thing as (a_n-a_{n-1})/(b_n-b_{n-1}) when at least one of the sequences a_n - a_{n-1} or b_n - b_{n-1} converges (and both do not converge to zero).
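A quick numerical look at the motivating case, taking a_n = log(n!) and b_n = n (the variable names are mine; this only illustrates, it proves nothing):

[code]
import math

# a_n = log(n!), b_n = n, so a_n - a_{n-1} = log(n) and b_n - b_{n-1} = 1.
for n in [10, 100, 1000, 10000]:
    ratio = math.lgamma(n + 1) / n   # a_n / b_n
    diff_ratio = math.log(n)         # (a_n - a_{n-1}) / (b_n - b_{n-1})
    print(n, ratio, diff_ratio)
# Both columns diverge together; by Stirling, log(n!)/n is roughly log(n) - 1.
[/code]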


Can I suggest you first prove that n!^{1/n} diverges, since it is quite straightforward. It may even convince you of some things.

 

OK, one thing at a time. You say it diverges, and that you argued as much in another thread. I'll go have a look at how your proof ran first, then go from there.

 

Here is your argument, from the other thread.

 

That's a bit messy and hard to follow, but it can be proved directly and easily.

 

Let k be any positive integer, then n! > k^{n-k} for all n sufficiently large.

 

Proof: Assume n>k, then the RHS is k multiplied by itself n-k times, but the LHS is

 

n! = (k)!*(k+1)(k+2)...(k+(n-k)) > k!k^{n-k} > k^{n-k} as required

 

Thus n!^{1/n} > k^{(n-k)/n}

 

but the RHS tends to k as n tends to infinity, thus n!^{1/n} > k-1 for n sufficiently large. But k was arbitrary, thus n!^{1/n} must tend to infinity.

 

Actually that is rather unnecessary: since we know that the sum of k^n/n! is exp(k), it follows that k^n/n! tends to zero, and in particular k^n/n! < 1 for all n sufficiently large, but there's no harm in having two ways of proving something.

 

As for your proof you are abusing the equals sign a lot. Yes

 

[math] \frac{1}{n}\sum_{r=1}^{n}\log r [/math]

 

does diverge, but I don't see how you can use L'Hopital, which deals with real-valued differentiable functions, to conclude that.

 

Well, I had no trouble following your argument, which is good. You found a sequence which diverges and whose terms are less than the terms of the given sequence; therefore the given sequence must also diverge.

 

I followed your argument. It's also not something that would easily spring to mind.

 

What if the randomly chosen positive integer k is equal to one? What happens? Here is your proof with k=1 being the randomly chosen positive integer.

 

 

That's a bit messy and hard to follow, but it can be proved directly and easily.

 

Let 1 be any positive integer, then n! > 1^{n-1} for all n sufficiently large.

 

Proof: Assume n>1, then the RHS is 1 multiplied by itself n-1 times, but the LHS is

 

n! = (1)!*(1+1)(1+2)...(1+(n-1)) > 1!1^{n-1} > 1^{n-1} as required

 

Thus n!^{1/n} > 1^{(n-1)/n}

 

but the RHS tends to 1 as n tends to infinity, thus n!^{1/n} > 1-1 for n sufficiently large. But 1 was arbitrary, thus n!^{1/n} must tend to infinity.

 


 

Now, look at the following line of work that you would have, if the randomly chosen positive integer is 1.

 

n! = (1)!*(1+1)(1+2)...(1+(n-1)) > 1!1^{n-1} > 1^{n-1} as required

 

1!1^{n-1} >1^{n-1} as required

 

1! = 1

 

[math] \therefore [/math]

 

1^{n-1} >1^{n-1} as required

 

 

Let X = 1^{n-1}. Therefore, if the randomly chosen positive integer k is 1, then you have this:

 

X > X as required

 

Which is false. Therefore, your k wasn't arbitrary.

 

I'm not sure yet how that affects your argument, but your claim that k can be any arbitrary positive integer is false.

 

That might not matter, I'll have to think about it a little more.

 

Regards

 

PS: I think all you have to do is stipulate that k is a positive integer greater than 1, and then your whole argument works.


I think I have the "johnny l'hopital conjecture"

 

Your "derivative" of a sequence a_n is simply a_n -a_{n-1}' date=' isn't it? You could use that do differentiate log(n!) directly by the way and the answer is log(n), which, if used in the above generalized l'hopital, would give a limit of log(n)/1 which diverges.


 

Apparently that seems to be the case, but I didn't carry out as rigorous an analysis as I normally would, which is part of the reason why I said 'presume'. That is an indication that I'm not certain about my answer, since this is somewhat unfamiliar territory. It's been a decade since my advanced calc course, maybe even more. But I think, according to the analysis in this thread, that is what was obtained for the derivative of a sequence a(n). But as I said, I wasn't as thorough as I'd like, so I will now go and look at your argument that the sequence n!^{1/n} diverges.


Let me see if I can reproduce your argument in this post.

 

We want to investigate the behavior of

 

[math] (n!)^{\frac{1}{n}} [/math]

 

As n increases without bound.

 

The above formula actually represents a sequence, the domain of which is the set of natural numbers:

 

[math] \mathbb{N} = \{1,2,3,4,\ldots\} [/math]

 

So the first term of the sequence is:

 

[math] (1!)^{\frac{1}{1}} = 1!= 1 [/math]

 

The second term of the sequence is:

 

[math] (2!)^{\frac{1}{2}} = 2^{\frac{1}{2}} = \sqrt{2} =1.414...[/math]

 

The third term of the sequence is:

 

[math] (3!)^{\frac{1}{3}} = 6^{\frac{1}{3}} = 1.817... [/math]

 

The fourth term of the sequence is:

 

[math] (4!)^{\frac{1}{4}} = 24^{\frac{1}{4}} = 2.213... [/math]

 

So as you see, the terms of the sequence are increasing, as n increases.

 

We can represent the sequence as follows:

 

[math] (1,\sqrt{2},6^{\frac{1}{3}}, 24^{\frac{1}{4}},... (n!)^{\frac{1}{n}},...) [/math]

 

So we are out to determine the behavior of the sequence whose nth term is

 

[math] (n!)^{\frac{1}{n}} [/math]

 

as n tends to infinity.

 

In other words, we want to compute the following limit:

 

[math] \lim_{n \to \infty} (n!)^{\frac{1}{n}} [/math]

 

Here is Matt's argument that the sequence diverges.

 

[math] n! = n(n-1)(n-2)...(k+2)(k+1)k(k-1)(k-2)...(4)(3)(2)(1) [/math]

 

Let k denote an arbitrary positive integer.

 

Therefore:

 

[math] n! = n(n-1)...(k+1)(k!) [/math]

 

Or equivalently...

 

 

[math] n! = (k!)(k+1)(k+2)...(k+(n-k)) [/math]

 

Let me change to product notation.

 

[math] n! \equiv \prod_{j=1}^{j=n} j = (\prod_{j=1}^{j=k} j)(\prod_{j=k+1}^{j=n} j) = k! \prod_{j=k+1}^{j=n} j [/math]

 

Now,

 

[math] \prod_{j=k+1}^{j=n} j = (k+1)(k+2)...n [/math]

 

Also:

 

[math] \prod_{j=k+1}^{j=n} j = \prod_{j=k}^{j=n-1} (j+1) = \prod_{j=k-1}^{j=n-2} (j+2)[/math]

 

So that:

 

[math] \prod_{j=k+1}^{j=n} j = \prod_{j=1}^{j=n-k} (k+j) = (k+1)(k+2)...(k+(n-k)) [/math]

 

So as you can see, the number of terms on the RHS above is (n-k). That is, the number of terms in the iterated product is 1 + (greater index minus the lesser), i.e. 1 + ((n-k) - 1) = n - k.

 

And each of those terms is necessarily greater than k.

 

The point is that:

 

[math] (k+1)(k+2)\cdots(k+(n-k)) > \underbrace{k \cdot k \cdots k}_{n-k \text{ factors}} = k^{n-k} [/math]

 

So that:

 

[math] n! = k! \cdot (k+1)(k+2)\cdots(k+(n-k)) > k!\,k^{n-k} [/math]

 

Exactly as Matt has.

 

So it follows that:

 

[math] n! > k!k^{n-k} \geq k^{n-k} [/math]

 

So it follows that:

 

[math] (n!)^{\frac{1}{n}} \geq (k^{n-k})^{\frac{1}{n}} [/math]

 

Therefore:

 

[math] (n!)^{\frac{1}{n}} \geq k^{\frac{n-k}{n}} [/math]

 

k was a given (hence fixed) arbitrary natural number.

 

Taking the limit of both sides we have:

 

[math] \lim_{n \to \infty} (n!)^{\frac{1}{n}} \geq \lim_{n \to \infty} k^{\frac{n-k}{n}} [/math]

 

Now, focus on the RHS.

 

[math] \lim_{n \to \infty} k^{\frac{n-k}{n}} = \lim_{n \to \infty} k^{1-\frac{k}{n}} = \lim_{n \to \infty} k \left( k^{-\frac{k}{n}} \right) = k \lim_{n \to \infty} k^{-\frac{k}{n}}

[/math]

 

[math] k \lim_{n \to \infty} k^{-\frac{k}{n}} = k \lim_{n \to \infty} \frac{1}{k^{k/n}}

[/math]

 

In the limit as n approaches infinity, the exponent k/n above approaches zero, so that we have:

 

 

[math] k \lim_{n \to \infty} \frac{1}{k^{k/n}} = k \frac{1}{k^{0}} = k

[/math]

 

Thus, we have found that:

 

[math]

\lim_{n \to \infty} (n!)^{\frac{1}{n}} \geq \lim_{n \to \infty} k^{\frac{n-k}{n}} = k

[/math]

 

That is:

 

[math]

\lim_{n \to \infty} (n!)^{\frac{1}{n}} \geq k

[/math]

 

Since k is an arbitrary natural number, it follows that k can be an arbitrarily large natural number, and hence the sequence

 

[math] (n!)^{\frac{1}{n}} [/math]

 

can be arbitrarily large. Hence it is unbounded from above, so that the sequence diverges.

 

And this was Matt's argument, and it looks good.
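For what it's worth, the key inequality is easy to spot-check numerically; a Python sketch (the particular k and n values are arbitrary choices of mine):

[code]
import math

# Spot-check n! > k! * k**(n-k) > k**(n-k) for a few k and n with n > k,
# and hence n!**(1/n) >= k**((n-k)/n), which tends to k as n grows.
for k in [2, 5, 10]:
    for n in [20, 50]:
        assert math.factorial(n) > math.factorial(k) * k**(n - k) > k**(n - k)
        print(k, n, math.factorial(n) ** (1 / n), k ** ((n - k) / n))
[/code]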


Johnny, when I said, in my argument that you think is wrong, when k=1, that

 

n! >k^{n-k}

 

for all n sufficiently large, why do you ignore the fact that n! *is* strictly greater than 1 for all n sufficiently large? n greater than 2 in this case.

 

I did not state it was true for all n; it isn't; it is true for all n greater than some number dependent on k, but that is immaterial.

 

Perhaps the > should have been a >= sign; I honestly can't remember exactly what I intended to type, but it really is not important. Once more you miss the point of the argument and focus on unnecessary aspects, just like you missed the point about absolute convergence of power series. It is not important that the power series for (1+x)^t may converge for some t when x is 1, but that it diverges for |x|>1.


I followed your argument. It's also not something that would easily spring to mind.

 

Yes it is. It is in fact the obvious "just do it" proof, the direct way of showing that n! grows faster than any exponential. It is not the simplest proof, but the simpler proofs require more knowledge.

 

The second proof is much easier: since exp(k) has a power series valid for all k in the real numbers, it follows that k^n/n!, the n'th term, must converge to zero; that is, assuming k is positive,

 

k^n/n! < 1 for all n sufficiently large, i.e. k^n < n! for all n sufficiently large.

 

But that requires you to know Taylor series, radii of convergence, and d'Alembert's ratio test.

 

My proof can be followed merely by common sense and could be dreamt up by anyone who is prepared to think a little.
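Numerically, the collapse of k^n/n! is easy to see as well; a Python sketch (k = 10 is an arbitrary choice of mine):

[code]
import math

# Since the sum over n of k**n / n! equals exp(k) (a convergent series for every real k),
# the terms k**n / n! must tend to zero, so eventually k**n < n!.
k = 10.0
for n in [10, 50, 100, 150]:
    print(n, k**n / math.factorial(n))  # the terms collapse toward zero
[/code]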

