# Exponential problem

## Recommended Posts

Is there any way to solve an equation (without logarithms) such as:

$2^x=x^2$

and

$2^x=x^3$

where x is in the exponent and the base of two different terms? If so, how?

##### Share on other sites

I'd be intrigued to learn what methods you have *with* logarithms to solve these things.

##### Share on other sites

Those types of equations are called transcendental, in that the solutions cannot be derived algebraically. The only way to find the roots is by some form of root-finding estimation (e.g. interval bisection, Newton-Raphson, etc).
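To make the root-finding concrete, here is a minimal Python sketch of interval bisection applied to f(x) = 2^x − x^2. The bracket [-1, 0] is an assumption, chosen because f changes sign there; the tolerance is likewise arbitrary.

```python
def f(x):
    # The difference between the two sides of 2^x = x^2.
    return 2**x - x**2

def bisect(f, a, b, tol=1e-10):
    """Halve the bracket [a, b] until it is narrower than tol.
    Assumes f(a) and f(b) have opposite signs."""
    fa = f(a)
    while b - a > tol:
        m = (a + b) / 2
        if fa * f(m) <= 0:
            b = m            # sign change in [a, m]: root is in the left half
        else:
            a, fa = m, f(m)  # otherwise the root is in the right half
    return (a + b) / 2

root = bisect(f, -1.0, 0.0)
print(round(root, 6))  # ≈ -0.766665
```

Newton-Raphson converges much faster near a root, but bisection has the advantage of never leaving its bracket.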

##### Share on other sites

Oh okay. Thanks dave.

##### Share on other sites


Umm, just in case you actually did want the solutions to those examples: for 2^x = x^2, x = 2, 4, and, to 5 decimal places of accuracy, x ≈ -23/30.

As for 2^x = x^3, x ≈ 1.3735. Sorry I can't do better, lol.

##### Share on other sites

1. Graphical solution method: calculate the values of the LHS (left-hand side) and RHS (right-hand side) at several x points, join the obtained points with smooth curves, and you'll get rough estimates of the roots.

One can see that there are 3 roots for $x^2=2^x$
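A rough numerical version of the graphical method is to sample the difference of the two sides on a grid and report where its sign changes; the grid bounds and step below are assumptions that happen to cover all three real roots.

```python
def diff(x):
    # LHS minus RHS of x^2 = 2^x.
    return x**2 - 2**x

# Sample from -2 to 5 in steps of 0.25.
xs = [i / 4 for i in range(-8, 21)]
for a, b in zip(xs, xs[1:]):
    da, db = diff(a), diff(b)
    if da == 0:
        print(f"exact root at {a}")
    elif da * db < 0:
        print(f"root between {a} and {b}")
# Reports a sign change between -1.0 and -0.75,
# and exact roots at 2.0 and 4.0.
```

A finer grid narrows the bracket around the negative root, which can then be refined by bisection or Newton's method.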

2. The next stage may be an expansion in a Taylor series. Once you know the approximate locations of the roots, you can expand both the RHS and LHS in Taylor series around the estimated roots and solve the resulting algebraic equation.

For example, expanding the RHS of $x^2=2^x$ in a Taylor series around zero and truncating after the quadratic term gives

$2^x=e^{x\ln{2}}=1+x \ln{2} +\frac{x^2 (\ln{2})^2}{2}+\ldots =x^2$

This quadratic equation has 2 roots: -0.78 and 1.69
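Collecting terms of the truncated expansion on one side gives an ordinary quadratic, which a short sketch can solve directly (the numbers quoted above fall out of the standard quadratic formula):

```python
import math

# Truncating 2^x = exp(x*ln 2) after the quadratic term and moving
# everything to one side gives
#   (1 - (ln 2)^2 / 2) * x^2 - (ln 2) * x - 1 = 0.
L = math.log(2)
a = 1 - L**2 / 2
b = -L
c = -1.0
d = math.sqrt(b * b - 4 * a * c)
roots = sorted([(-b - d) / (2 * a), (-b + d) / (2 * a)])
print([round(r, 2) for r in roots])  # [-0.78, 1.69]
```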

As we can see, the quality of the solution deteriorates rapidly with the distance from the point of expansion.

While the "left" (negative) root is not bad compared to the root calculated by Ragib, $\approx -\frac{23}{30} \approx -0.7667$, the right one deviates significantly. To improve it, we should choose the point of expansion closer to the rough estimate of the root (obtained, e.g., by the graphical solution).

Question to Ragib: How did you calculate the roots?

##### Share on other sites

Aww, I was hoping no one would ask and would just assume I'm smart. I used a computer program to estimate it for me... although now that I see your reply, that seems like the smarter way to go after getting my computer estimates. My goal was 10 digits, but I was stumped on how to do better, other than maybe Newton's method, but I was really sleepy, lol. Newton's method would have taken too long anyway, I reckon.

##### Share on other sites

Obtaining 10 digits is easy, even with a simple calculator with exp and log buttons on it. Write out the formula for the Newton iteration, start off with Ragib's approximations, and within two or three iterations you have the answer to 10 digits of accuracy.
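For instance, here is a sketch of that Newton iteration for f(x) = 2^x − x^2, started from Ragib's approximation -23/30; each step replaces x by x − f(x)/f'(x), and a handful of steps reaches machine precision.

```python
import math

def f(x):
    return 2**x - x**2

def fprime(x):
    # Derivative of 2^x - x^2.
    return math.log(2) * 2**x - 2 * x

x = -23 / 30
for _ in range(5):
    x -= f(x) / fprime(x)
print(f"{x:.10f}")  # converges to x ≈ -0.7666647
```

The same loop refines the other two real roots when started near 2 or 4.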

This equation has an infinite number of solutions. Three are real, the others are complex.

##### Share on other sites

Infinite complex solutions? Would that only apply to Chebyshev polynomial approximations, not to the actual equation? Why would there be an infinite number of complex solutions, unless you mean there is a finite number of solutions that are repeated, like x = -1 when (x+1)^2 = 0?

##### Share on other sites

Any Chebyshev approximating polynomial of course has only a finite number of zeros. The fundamental theorem of algebra states that an N-th degree polynomial over the field of complex numbers has N roots. These roots need not be distinct, and each can be either real or complex.

For transcendental equations, the number of roots can be infinite. Look at the simple equation sin(x) = 0. This has an infinite number of roots; all are real, and they can be written as ..., -2*pi, -pi, 0, pi, 2*pi, 3*pi, ...

Look at the very simple equation exp(x) = 2 (better notation: $e^x = 2$). This also has an infinite number of solutions: ln(2) is only one of them. All others can be written as ln(2) + 2*k*pi*i, with k being an integer.
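The claim is easy to check numerically: for each integer k, exp(ln(2) + 2*k*pi*i) comes back to 2, since the imaginary part only rotates through full turns. A minimal sketch:

```python
import cmath
import math

# Verify that ln(2) + 2*k*pi*i solves exp(x) = 2 for several integers k.
for k in (-1, 0, 1, 2):
    x = complex(math.log(2), 2 * k * math.pi)
    print(k, cmath.exp(x))  # each result is 2, up to rounding error
```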

The equation exp(x) = 0 has no roots, unless you also count roots at infinity. It has an infinite number of roots at infinity. Any small perturbation brings back the roots to finite values, e.g. exp(x) = <some very small number> again has finite roots. Try to visualize for yourself what happens to all these roots of exp(x) = <some value>, when <some value> goes to zero.

A similar thing is true for exp(x) = <some number> * x². And this brings us closer to the equation, presented here initially.

For the equation 2^x = x^2, the situation is more complicated. This has three real roots, as is nicely shown by the graph, given a few posts earlier. However, there are infinitely many other finite roots, all of the form A + B*i, and for each of them, there also is a root A - B*i.
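One such complex root can be located with a simple fixed-point sketch: taking logarithms of x^2 = 2^x gives x*ln 2 = 2*log(x) + 2*pi*i*k for some integer branch k, and iterating that rearrangement converges when |x| is large enough. The branch k = -2 and the starting point 10 - 20j below are assumptions, picked to land on one particular complex root.

```python
import cmath
import math

# Fixed-point iteration x <- (2*log(x) + 2*pi*i*k) / ln 2,
# derived from x^2 = 2^x; branch k and the start are assumed.
k = -2
x = 10 - 20j
for _ in range(50):
    x = (2 * cmath.log(x) + 2j * math.pi * k) / math.log(2)

print(x)                 # a root near 9.1 - 21.5j
print(abs(2**x - x**2))  # residual is essentially zero
```

By the A ± B*i symmetry mentioned above, the conjugate (near 9.1 + 21.5j, branch k = +2) is a root as well.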

##### Share on other sites

Oh, sorry, I completely ignored complex solutions, and I showed utter stupidity in forgetting that an N-th degree polynomial has at most N solutions... and I completely forgot about using Euler's identity, which seems to be what you used when you said: "All others can be written as ln(2) + 2*k*pi*i, with k being an integer." Oh my, I'm embarrassed... thanks for the posts.

##### Share on other sites

No need to be embarrassed. I just wanted to point out that there is more to this than one would think at first glance.

##### Share on other sites

Can't you just brute-force it? After plugging in some numbers, it should become apparent that one side is larger than the other.

x^2 = 2^x

x = 1 --> 1 < 2

x = 2 --> 4 = 4

x = 3 --> 9 > 8

x = 4 --> 16 = 16

x = 5 --> 25 < 32

x = 6 --> 36 < 64

x = 7 --> 49 < 128

Pretty evident.

##### Share on other sites

I would not call that brute force; you only sample very coarsely. A better approach is what Tannin did with the graphs. That provides more insight. Still, it only gives insight into the real roots, but it does assure that all real roots are found.

Your coarse sampling method certainly does not work anymore when the functions become somewhat more complicated, and it also does not work for complex numbers.

##### Share on other sites

I'm assuming your "brute force" is just a cooler way of saying guess and check; I'll use it from now on. By the way, sure, it's pretty evident for simple integers, but I doubt that in your daily guess-and-checks you try -23/30...

##### Share on other sites

Yeah, but why try "brute force" or "guess-and-check" when iterative techniques like Newton's method will (typically) tend toward a solution? That way you aren't just guessing in the dark; your guesses will (most likely) improve each time. Brute force has got to be the absolute last resort.

##### Share on other sites

Usually you do need initial guesses to find the roots of an equation when Newton's method is used. Especially if a problem is new and it is a one-time situation, one first tries to obtain an initial guess near the root (for such purposes, a sketch of the graph can be a good tool) and then uses Newton's method to refine that root.

So, although the "guess-and-check" method is too crude, some guessing can be required in real-life problems. Usually, however, the guesses are educated guesses and not random shots in the dark.

This method, using initial guesses, is of course not useful for situations where thousands or millions of similar equations need to be solved over and over again. In such situations, one usually has to do more research on the properties of the equations to be solved, try to find general rules that give acceptable initial guesses, and then use Newton's method to refine the roots. This latter approach can be really difficult and requires a good deal of understanding of your problem.

##### Share on other sites

> im assuming your brute force is just a cooler way of saying guess and check.

No, guess and check (or trial and error) means that your incorrect result affects your next attempt. Brute force involves trying absolutely everything.
