PeterJ

Twin Primes Conjecture


The Twin Primes Conjecture


Some time ago, when my hasty first attempt went wrong, I promised to come back and give my ‘proof’ of the twin primes conjecture. As was rightly pointed out in the initial discussion, it is not a proof as a mathematician would use the word, so I’ll call it a heuristic argument. Or, if you like, it's just a thought.

I make no claims for it other than that I think it correct as far as it goes. All that matters to me is whether I have made mistakes. I’ll try to state it as briefly as possible. It is painfully simple. Please excuse the clumsy terminology. I have tried to be clear, but clarity is not one of my talents.

 

As I make use of the 6n numbers for the calculations, it simplifies things if the primes below 5 are ignored. So ‘primes’ here will mean ‘primes >3’.

 

The proposition: There is always a twin prime between the squares of consecutive primes.

 

Call p1, p2 consecutive prime numbers.

Call R the range p1² to p2².

Call RP the quantity of prime (>3) products in R (those that occur at 6n+/-1 and thus affect the distribution of twin primes).

Call RL the lower limit for twin primes in R.


RP is calculated as the total of 2R/(6p) for all primes below p2. Where for any p this produces a fraction, we round up. This is clumsy but it can be refined. It need only produce a lower limit, not an exact quantity. It becomes increasingly accurate as R grows larger.

 

RL is R/6 - RP. (Count the potential locations and deduct one for each 'relevant' prime product in R.)

 

The calculation will overestimate RP and thus underestimate RL. This is fine, as its accuracy increases with p. As long as it does not overestimate, it is usable.

 

If we do the calcs we find that RL increases with p. Thus the quantity of twin primes in R increases with p and may be arbitrarily large. It can never fall below zero.

 

End.

 

Okay, it’s clumsy, but I don’t think this really matters. It can be made less clumsy, but then it gets harder to present. At present I cannot see what is wrong with the basic approach. The increase in R for each increase in p grows exponentially, while RP as a proportion of R grows ever smaller. Even if the calculation is clumsy for the lower reaches of the number line, the trend is unmistakable.
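A rough sketch of the calculation in Python, in case it helps. The helper names are mine, not part of the argument, and I have read "all primes below p2" literally; treat it as a sanity check, not a definitive implementation:

```python
import math

def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = [False] * len(sieve[i * i::i])
    return [i for i, flag in enumerate(sieve) if flag]

def heuristic_RL(p1, p2):
    """RL = R/6 - RP, where RP totals ceil(2R/(6p)) over primes 3 < p < p2."""
    R = p2 ** 2 - p1 ** 2
    RP = sum(math.ceil(2 * R / (6 * p)) for p in primes_up_to(p2 - 1) if p > 3)
    return R // 6 - RP

print(heuristic_RL(5, 7))  # R = 24: 24/6 - ceil(1.6) = 4 - 2 = 2
```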


Or maybe not. I’ll wait and see. I have two questions. Have I made a mistake? How can I make the presentation tidier so that it’s easier to see what I’m getting at?

 

EDIT: Tidied it up a lot after the first comment.

Edited by PeterJ


LL is the crucial number. LL is calculated as 2(R/6p) for all primes up to P2. Where R/6p is a fraction we would round up. This is clumsy but it can be refined. It need only produce a lower limit, not a quantity. It becomes increasingly accurate as L grows larger.

 

I follow you until this sentence. I'm assuming R/6p is supposed to be R/(6p), but why is LL equal to twice this value? And what exactly do you mean when you say that LL is equal to that value for all primes up to P_2?

 

This second example shows that the calculation may underestimate LL. This is fine as its accuracy increases with L. As long as it does not overestimate it is useable.

 

What is L?

 

If we do the calcs. we find that LL increases with P. Thus the quantity of twin primes in R increases with P and may be arbitrarily large. It can never fall below zero.

 

What is P?

 

End.

 

Okay, it’s clumsy, but I don’t think this really matters. It can be made less clumsy but then it gets harder to present. At present I cannot see what is wrong with the basic approach. The increase in R for each increase in P grows exponentially, while Pr as a proportion of R grows ever smaller. Even if the calculation is clumsy for the lower reaches of the number line the trend is unmistakable.

 

Or maybe not. I’ll wait and see. I have two questions. Have I made a mistake? How can I make the presentation tidier so that it’s easier to see what I’m getting at?

 

I won't comment on the argument itself, since (as my previous comments show) I'm unclear as to what exactly the argument is--or rather, I see the argument, but I don't see the justification for certain steps building to it. Maybe I'm a derp. Regardless, clumsy is alright, though elegance is, of course, preferred. That said, if you provide even a kludgy proof of the twin prime conjecture, as long as it's valid, you'll be famous.

Edited by John


I follow you until this sentence. I'm assuming R/6p is supposed to be R/(6p), but why is LL equal to twice this value? And what exactly do you mean when you say that LL is equal to that value for all primes up to P_2?

Hi John. It's because there are two products of p at 6n+/-1 (therefore 'relevant') in every 6p numbers.

 

We only need to take account of the primes up to P^2 since products of larger primes will not fall in R.

 

 

What is L?

 

 

What is P?

 


Oh hell, L is a hangover from a previous attempt to sketch it out. I'll go back and change it to R.

 

Pr is the quantity of relevant prime products in R. I'm afraid I don't know the conventions for the use of the letters.


I see that Lu is the upper limit for twin primes in R, containing all numbers in R that can be expressed as 6n +/- 1. Pr is the number of composite numbers in R that can be expressed as 6n +/- 1, having no prime factors less than 5. LL is the lowest possible number of twin primes in R.

 

I still don't see how LL equals 2(R/(6p)), though, especially in light of the value given for LL in example 2, which is zero, meaning R must be zero, which is contradictory since R is 48.

Edited by John


Yes. Many apologies. The original post was very messy. I don't know how I made so many mistakes.

 

I've edited it and hope it's now more clear.

 

In short ...

 

R/6 gives the maximum qty of twin primes in R.

 

Totalling 2R/(6p) for each prime up to p2 gives the maximum qty of relevant products in R.

 

Taking one from the other gives a lower limit for twin primes in R.

 

It is a mechanical argument derived from the behaviour of the products of the primes.

 

I hope it's beginning to make sense. It can be made more accurate but this will do to get the principle established.

Edited by PeterJ


R/6 gives the maximum qty of twin primes in R.

 

Totalling 2R/(6p) for each prime up to p2 gives the maximum qty of relevant products in R.

 

Taking one from the other gives a lower limit for twin primes in R.

 

You should justify the second sentence for me. I don't see how that sum will give you the number of composite n in R such that n = 6x +/- 1.

 

But, assuming it is valid, the third statement strikes me as incorrect, or at least useless.

 

If I'm understanding you correctly, letting [math]L_l[/math] be the lower limit, and letting your [math]P_2[/math] be the [math]n[/math]th prime, what you end up with is [math]L_l = \frac{R}{6} - \sum_{i=3}^n \frac{2R}{6p_i} = \frac{R}{6}\left(1-\sum_{i=3}^n\frac{2}{p_i}\right)[/math] where [math]p_i[/math] is the [math]i[/math]th prime, e.g. [math]p_3 = 5[/math], [math]p_4 = 7[/math], etc. However, for [math]n \geq 6[/math], that fancy sum is greater than 1 (2/5+2/7+2/11+2/13 = 5112/5005), which means [math]L_l[/math] becomes negative, which doesn't tell us anything about whether there are infinitely many primes in some arbitrary interval, and also further supports the notion that the second statement is incorrect.
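The sum in question can be checked exactly with Python's fractions module (a quick verification of the arithmetic above, nothing more):

```python
from fractions import Fraction

# 2/5 + 2/7 + 2/11 + 2/13 already exceeds 1,
# so the bracketed factor R/6 * (1 - sum) goes negative
s = sum(Fraction(2, p) for p in (5, 7, 11, 13))
print(s)      # 5112/5005
print(s > 1)  # True
```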


Peter - could you run a numerical example so that we can be sure that we have the right idea.


John

 

Re that 2nd sentence

 

"Totalling 2R/(6p) for each prime up to p2 gives the maximum qty of relevant products in R."

 

This works because the products of a prime occur at 6n+/-1 twice in every 6p numbers. So R/(6p) x 2 gives the products of p that affect the distribution of twin primes.

 

I.e. the relevant products of 5 occur at 25, 35, 55, 65, 85, 95, ...

 

Using this it is possible to calculate the maximum number of prime products that can occur in R. We already know the maximum number of twin primes in R (assuming no prime products it's (R/6)-1). So from this we can produce a lower limit for TPs in R.
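A quick sketch of the pattern, for illustration (the cofactor form is the point: a relevant product of p is p times a number that is itself of the form 6n +/- 1):

```python
# Relevant products of p = 5 below 100: 5 * k where k = 5, 7, 11, 13, 17, 19
p = 5
relevant = [p * k for k in range(5, 20) if k % 6 in (1, 5)]
print(relevant)  # [25, 35, 55, 65, 85, 95]

# Each lies at 6np +/- p, i.e. 30n +/- 5: two per block of 6p = 30 numbers
print(all(m % (6 * p) in (p, 5 * p) for m in relevant))  # True
```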

 

I'm afraid I cannot follow your maths so can't comment on that.

 

Imatfaal - An example.

 

Where p1 = 5

 

R = 24 (49 - 25). This gives the max qty of TPs in R as 3, which is (24/6) - 1.

 

RP = 2 (relevant prime products in R, given by 2(R/(6p)), i.e. 2(24/30))

 

Ergo, where p1 = 5 there is at least one TP in R.

 

The trend is then always (on average) that for each increase in P1 there is an increase in the minimum qty of TPs in R.

 

It is only 'on average' because the calculation is very sloppy lower down the number line and will underestimate TPs.

 

Another way of saying this might be - If there is a TP in R for some pair of consecutive primes, then there must be a TP in R for any larger pair.

 

I would have liked to do the calc for TPs up to N, which would have been a more common approach, but I couldn't make that work.
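For comparison, the actual twin primes in this R can be listed directly (a quick brute-force check, not part of the argument itself):

```python
def is_prime(n):
    """Trial division; fine at this size."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

# candidate sites 6n +/- 1 strictly between 25 and 49 are n = 5, 6, 7
twins = [(6 * n - 1, 6 * n + 1) for n in range(5, 8)
         if is_prime(6 * n - 1) and is_prime(6 * n + 1)]
print(twins)  # [(29, 31), (41, 43)]
```

So the actual count here is 2, consistent with the claimed lower limit of at least one.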

Edited by PeterJ


John

 

Re that 2nd sentence

 

"Totalling 2R/(6p) for each prime up to p2 gives the maximum qty of relevant products in R."

 

This works because the products of a prime occur at 6n+/-1 twice in every 6p numbers. So R/(6p) x 2 gives the products of p that affect the distribution of twin primes.

 

I.e. the relevant products of 5 occur at 25, 35, 55, 65, 85, 95, ...

 

Using this it is possible to calculate the maximum number of prime products that can occur in R. We already know the maximum number of twin primes in R (assuming no prime products it's (R/6)-1). So from this we can produce a lower limit for TPs in R.

 

I'm afraid I cannot follow your maths so can't comment on that.

 

I understand the concept, just don't necessarily agree that it works as you claim. Multiplying by 2 is obvious. It's the R/(6p) bit that throws me. For instance, if p were 7, then 2R/(6p) = R/(3p), which in this case I believe would be 62/21. But there are more "relevant products" (with respect to 7--do we add this to R/(3(5)) = 62/15?) in this range than 62/21. I guess you could round, but even then, you must show that this process is correct for all ranges up to infinity.

 

Imatfaal - An example.

 

Where p1 = 5

 

R = 24 (49 - 25). This gives the max qty of TPs in R as 3, which is (24/6) - 1.

 

RP = 2 (relevant prime products in R, given by 2(R/(6p)), i.e. 2(24/30))

 

Response to this edited out, as it was redundant.

 

Ergo, where p1 = 5 there is at least one TP in R.

 

The trend is then always (on average) that for each increase in P1 there is an increase in the minimum qty of TPs in R.

 

It is only 'on average' because the calculation is very sloppy lower down the number line and will underestimate TPs.

 

Another way of saying this might be - If there is a TP in R for some pair of consecutive primes, then there must be a TP in R for any larger pair.

 

I would have liked to do the calc for TPs up to N, which would have been a more common approach, but I couldn't make that work.

 

This makes it seem like you're basically restating the twin prime conjecture in a more complicated way. The fact that, assuming a pair of twin primes in some range, there must be at least one pair of twin primes in a larger range, is trivial. And while the number of twin primes in an increasing range increases (at least to a certain point), the conjecture is that it increases to infinity, which you haven't shown.

 

I could still be missing something obvious, of course. Maybe someone else can shed some light, or I'll try to later when I have more time. :)

Edited by John


I am suffering from the same problems as John - either it's a restatement or it is unproven.

 

On the numerical side - I just cannot get my head around where you are getting RP as (2R)/(6P1)

 

Tell me where I am going wrong...

 

take two consecutive primes p1 and p2

 

R = Range = (p2)² - (p1)²

 

Maximum number of possible sites for twin primes (ie 6n+/-1) is TPmax = (R/6)-1

 

Maximum Number of those possible sites where either 6n-1 a/o 6n+1 are actually composite rather than prime RP = (2R)/(6p1)

 

Therefore there must be at least this many twin primes in the range LL = TPmax - RP

 

OK - hoping that is correct, some numbers

 

Let's take p1 as 31, and thus p2 is 37.

R = (p2)² - (p1)² = 37² - 31² = 1369 - 961 = 408

 

TPmax = (R/6)-1 = (408/6) -1 = 68-1 = 67

 

RP = (2R)/(6p1) = (2x408)/(6x31) = 816 / 186 = 4.39 and rounded up to RP = 5

 

LL = TPmax - RP = 67 - 5 = 62

 

Your lower limit for the number of twin primes between the square of 31 (961) and the square of 37 (1369) is 62 - whereas by counting the actual number is 11.

They are:

 

1019 1021

1031 1033

1049 1051

1061 1063

1091 1093

1151 1153

1229 1231

1277 1279

1289 1291

1301 1303

1319 1321

 

The problem is, of course, RP = (2R)/(6p1). This is only close to correct at low values of p1; at higher values it is woefully small. You are underestimating the number of sites 6n+/-1 which have composite numbers - and thus you are overestimating the number of twin primes.
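The count of 11 can be reproduced with a few lines of Python (plain trial division, nothing clever):

```python
def is_prime(n):
    """Trial division; fine for numbers this small."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

# twin-prime pairs strictly between 31**2 = 961 and 37**2 = 1369
pairs = [(a, a + 2) for a in range(961, 1368)
         if is_prime(a) and is_prime(a + 2)]
print(len(pairs))           # 11
print(pairs[0], pairs[-1])  # (1019, 1021) (1319, 1321)
```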

 

The twin prime conjecture is almost certainly correct - but this does not prove it as the maths is flawed.

 

Here's hoping I have interpreted your maths correctly. Cheers


I understand the concept, just don't necessarily agree that it works as you claim. Multiplying by 2 is obvious. It's the R/(6p) bit that throws me. For instance, if p were 7, then 2R/(6p) = R/(3p), which in this case I believe would be 62/21. But there are more "relevant products" (with respect to 7--do we add this to R/(3(5)) = 62/15?) in this range than 62/21. I guess you could round, but even then, you must show that this process is correct for all ranges up to infinity.

Not quite. If p = 7 then R = 72. R will always be a number divisible by 12.

 

 

 


This makes it seem like you're basically restating the twin prime conjecture in a more complicated way. The fact that, assuming a pair of twin primes in some range, there must be at least one pair of twin primes in a larger range, is trivial. And while the number of twin primes in an increasing range increases (at least to a certain point), the conjecture is that it increases to infinity, which you haven't shown.

 

I could still be missing something obvious, of course. Maybe someone else can shed some light, or I'll try to later when I have more time. :)

I see what you're saying. But these objections can be met. I just need to clarify what I'm saying.

 

 

I am suffering from the same problems as John - either it's a restatement or it is unproven.

 

On the numerical side - I just cannot get my head around where you are getting RP as (2R)/(6P1)

 

Tell me where I am going wrong...

 

RP would be R/(3p) (a neater way of putting it) for each p up to sqrt p1. This would exhaust the prime factors that we need to consider.

 

take two consecutive primes p1 and p2

 

R = Range = (p2)² - (p1)²

 

Maximum number of possible sites for twin primes (ie 6n+/-1) is TPmax = (R/6)-1

 

Maximum Number of those possible sites where either 6n-1 a/o 6n+1 are actually composite rather than prime RP = (2R)/(6p1)

The final formula should be R/(3p) for each prime up to p2. This might be an arbitrarily large series of calculations.

 

I'm hoping this clarification answers the problem in your example.

 

Thanks to you both for taking an interest. Is the idea clear yet? It should not look obviously wrong.


Not quite. If p = 7 then R = 72. R will always be a number divisible by 12.

 

That's what I get for replying in a hurry.

 

I see what you're saying. But these objections can be met. I just need to clarify what I'm saying.

 

RP would be R/(3p) (a neater way of putting it) for each p up to sqrt p1. This would exhaust the prime factors that we need to consider.

 

You're still making that assertion without justification. You divide R by 6, and take that to be the number of multiples of 6 in R. I can see that. Why does dividing (twice) the number of multiples of 6 by some other number clear out composite multiples of that number? If R = 72, for instance, then yes, there are 13 multiples of 6 in that range (I guess we only count 11, though, since 0 and 72 are the end points). Why does dividing 11 by 15 "clear out" multiples of 5?

 

The final formula should be R/(3p) for each prime up to p2. This might be an arbitrarily large series of calculations.

 

I'm hoping this clarification answers the problem in your example.

 

Thanks to you both for taking an interest. Is the idea clear yet? It should not look obviously wrong.

 

The total idea's been clear for a few posts. It's just those two steps that seem rather odd.

 

But assuming it's all valid, using 7 and 11, and knowing the process now, let's see.

 

R then equals 72; R/6 - 1 = 11; the sum of R/(3p) for p = 5, 7, 11 = 72/15 + 72/21 + 72/33 = 4008/385 = ~10.41, thus LL = 11 - 10.41 = 0.59

 

Alright. Now, for 11 and 13.

 

R then equals 48; R/6 - 1 = 7; the sum is then 48/15+48/21+48/33+48/39 = 40896/5005 = ~8.17, thus LL = 7 - 8.17 = -1.17

 

This is the problem that I pointed out a few posts ago, though the fraction is larger this time because I didn't factor the R/6 out of it. Since the lower limit is negative, then (using only this reasoning) the possibility exists that zero pairs of twin primes exist in the range.
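Keeping exact fractions makes the two calculations above easy to double-check (the function name is mine):

```python
from fractions import Fraction

def LL(p1, p2, ps):
    """R/6 - 1 minus the sum of R/(3p) over the given primes ps."""
    R = p2 ** 2 - p1 ** 2
    return Fraction(R, 6) - 1 - sum(Fraction(R, 3 * p) for p in ps)

print(LL(7, 11, (5, 7, 11)))       # 227/385, about 0.59
print(LL(11, 13, (5, 7, 11, 13)))  # -5861/5005, about -1.17
```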

Edited by John



You're still making that assertion without justification. You divide R by 6, and take that to be the number of multiples of 6 in R. I can see that. Why does dividing (twice) the number of multiples of 6 by some other number clear out composite multiples of that number? If R = 72, for instance, then yes, there are 13 multiples of 6 in that range (I guess we only count 11, though, since 0 and 72 are the end points). Why does dividing 11 by 15 "clear out" multiples of 5?

There are two products of 5 at 6n+/-1 in every 30 numbers. So 72/30 x 2 gives the total. (Or 72/15 - which gives not quite the same result but may be better.)

 

 

But assuming it's all valid, using 7 and 11, and knowing the process now, let's see.

 

R then equals 72; R/6 - 1 = 11; the sum of R/(3p) for p = 5, 7, 11 = 72/15 + 72/21 + 72/33 = 4008/385 = ~10.41, thus LL = 11 - 10.41 = 0.59

 

Alright. Now, for 11 and 13.

 

R then equals 48; R/6 - 1 = 7; the sum is then 48/15+48/21+48/33+48/39 = 40896/5005 = ~8.17, thus LL = 7 - 8.17 = -1.17

 

This is the problem that I pointed out a few posts ago, though the fraction is larger this time because I didn't factor the R/6 out of it. Since the lower limit is negative, then (using only this reasoning) the possibility exists that zero pairs of twin primes exist in the range.

 

Where R = 72 the only prime factors we need consider are 5 and 7. No need to count 11 and 13. So 48/15 + 48/21 = 7. Here R/6 - 1 = 11, so 11 - 7 = 4. So there are at least 4 twin primes in R where p1 = 7.


There are two products of 5 at 6n+/-1 in every 30 numbers. So 72/30 x 2 gives the total. (Or 72/15 - which gives not quite the same result but may be better.)

 

Again, the multiplication by 2 makes sense. I'm asking about the division by 6p. Also, 2(72/30) and 72/15 are the same number, so there's no difference in the results they provide.

 

Where R = 72 the only prime factors we need consider are 5 and 7. No need to count 11 and 13. So 48/15 + 48/21 = 7. Here R/6 - 1 = 11, so 11 - 7 = 4. So there are at least 4 twin primes in R where p1 = 7.

 

I only included up to 11 in the first example, because your earlier comments led me to believe we needed to total R/(3p) for p greater than 3 up to and including p_2. If we only include primes greater than 3 and less than p_2 (i.e., primes from 5 to p_1), then using imatfaal's example above, we end up with LL being [408/6 - 1] - [408/15+408/21+408/33+408/39+408/51+408/57+408/69+408/87+408/93] = 67 - 99.6 = -32.6 (rounding to the nearest tenth), which again is a negative number.
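The same arithmetic in code form, for the record (summing R/(3p) over the primes from 5 to p_1 = 31):

```python
# p1 = 31, p2 = 37
R = 37 ** 2 - 31 ** 2  # 408
primes = (5, 7, 11, 13, 17, 19, 23, 29, 31)
LL = R / 6 - 1 - sum(R / (3 * p) for p in primes)
print(round(LL, 1))  # -32.6
```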

Edited by John


Again, the multiplication by 2 makes sense. I'm asking about the division by 6p. Also, 2(72/30) and 72/15 are the same number, so there's no difference in the results they provide.

There seems to be a difference in their result if we are rounding up, as we are here. The second version can produce a slightly greater total. But this can be ignored.

 

The division by 6p works like this. By using the 6n numbers as a metric it is possible to map the remaining products of the primes against it. One inviolable rule is that the products of p > 3 that affect the distribution of twin primes (i.e. occur at 6n +/- 1) always occur at 6np +/- p. Thus there are always two relevant products of p in every 6p numbers. All other products can be ignored.

 

Thus there can never be more than R/(6p) x 2 products of p in R.

 

I can't see quite what you're doing in your second para. You would only need to divide R by 15 and 21. There could be no relevant products of primes >7 in the range.

Edited by PeterJ


I can't see quite what you're doing in your second para. You would only need to divide R by 15 and 21. There could be no relevant products of primes >7 in the range.

 

You've now given, by my count, four different processes here.

 

1. Sum 2(R/(6p)) for 3 < p <= p_2, where p is prime.

 

2. Sum 2(R/(6p)) for 3 < p <= p_1, where p is prime.

 

3. Sum 2(R/(6p)) for 3 < p <= sqrt(p_1), where p is prime.

 

4. Sum 2(R/(6p)) for p = 5 and p = 7.

 

The first three may all just be the result of mistakes on either of our parts, but the last one seems to be cherry picking values to make the math work.

 

In any case, if we only include p = 5 and p = 7, then (again using imatfaal's example) we get [408/6 - 1] - [408/15+408/21] = 67 - 46.63 = 20.37, whereas imatfaal only found 11 in the range. Is LL supposed to be the lower limit for twin prime pairs (as I'm assuming here) or the lower limit for primes that are members of twin prime pairs (in which case the subtraction step seems off)?


There seems to be a difference in their result if we are rounding up, as we are here. The second version can produce a slightly greater total. But this can be ignored.

 

The division by 6p works like this. By using the 6n numbers as a metric it is possible to map the remaining products of the primes against it. One inviolable rule is that the products of p > 3 that affect the distribution of twin primes (i.e. occur at 6n +/- 1) always occur at 6np +/- p. Thus there are always two relevant products of p in every 6p numbers. All other products can be ignored.

 

Thus there can never be more than R/(6p) x 2 products of p in R.

 

But you are basing your idea on the number of positions 6n that twin primes might be around, minus the number of those positions that cannot be a twin prime because either 6n-1 or 6n+1 is composite; this you believe gives you a lower limit on the number of twin primes. But every time we run the calculation for the number of positions that cannot be a twin prime, it needs tweaking!

 

If p1 = 13 and p2 = 17, then what R/(prime*3) do I need to take into consideration?

 

 

 

I can't see quite what you're doing in your second para. You would only need to divide R by 15 and 21. There could be no relevant products of primes >7 in the range.

 

I have run a quick Excel number crunch.

 

 

damn I hate this interface - shows up fine on preview

 

Hopefully the screendump shows up.

[attached screenshot of the spreadsheet]


Oh man. How can this have become so complicated? I made some mistakes and that certainly didn't help. My apologies. But I have described just one process and have not cherry-picked.

 

I'll try to put it straightforwardly. We count the products of primes >3 in R. We do not need to count the products of prime factors larger than sqrt p2, because there aren't any. We then count how many 6n numbers there are in R. We then deduct the former from the latter. That is, for each product that 'lands' in R at 6n+/-1, we assume that it prevents one twin prime from occurring. (In reality two products may occur side by side at 6n+/-1 and only prevent one twin prime from occurring, but we are always taking the worst-case scenario, because we want a lower limit on the twin primes in R, not the actual quantity of them.)

 

I hope this makes things more clear. I know I'm coming at it from an odd angle.

 

The calculation becomes ever more lengthy as p grows larger, but there's no need to calculate anything except to establish the principle. I'm not sure there cannot be any exceptions, but I'd be astonished if the quantity of twin primes in R did not increase with p1.

(Would it be possible, and a good idea, for someone to delete the two or three early posts that referred back to my original messy terminology, which disappeared when I re-edited my first post? They'll completely confuse anybody starting out at the beginning of this discussion. Just a thought.)


Edited by PeterJ


I'll try to put it straightforwardly. We count the products of primes >3 in R. We do not need to count the products of prime factors larger than sqrt p2, because there aren't any. We then count how many 6n numbers there are in R. We then deduct the former from the latter. That is, for each product that 'lands' in R at 6n+/-1, we assume that it prevents one twin prime from occurring. (In reality two products may occur side by side at 6n+/-1 and only prevent one twin prime from occurring, but we are always taking the worst-case scenario, because we want a lower limit on the twin primes in R, not the actual quantity of them.)

 

But the square root of p_2 in our go-to example (p_1 = 31, p_2 = 37) is a little over six, and clearly there are primes larger than six that have multiples of the form 6n +/- 1 in R. Even primes larger than p_2 itself have such multiples (e.g. if you look at 41, it has multiples 205 = 6(34) + 1 and 287 = 6(48) - 1).


Oh man. How can this have become so complicated? I made some mistakes and that certainly didn't help. My apologies. But I have described just one process and have not cherry-picked.

No, you described what you had at the beginning as a simple mathematical process that works for two consecutive primes p1 and p2. But it does not work even for small primes, let alone for big primes, or in general.

 

I'll try to put it straightforwardly. We count the products of primes >3 in R.

COUNT!!! That is the very problem. It is easy to count when p1 =13 and p2 =17; it is more difficult to count when p1 = 1667 and p2 = 1669 (still possible though - 35 twins and 1076 composites); but it is logically and mathematically impossible to count when p1=p1 and p2=p2 - you need to be able to calculate and predict.

 

 

We do not need to count the products of prime factors larger than sqrt p2, because there aren't any. We then count how many 6n numbers there are in R. We then deduct the former from the latter. That is, for each product that 'lands' in R at 6n+/-1, we assume that it prevents one twin prime from occurring. (In reality two products may occur side by side at 6n+/-1 and only prevent one twin prime from occurring, but we are always taking the worst-case scenario, because we want a lower limit on the twin primes in R, not the actual quantity of them.)

 

I hope this makes things more clear. I know I'm coming at it from an odd angle.

 

The calculation becomes ever more lengthy as p grows larger, but there's no need to calculate anything except to establish the principle. I'm not sure there cannot be any exceptions, but I'd be astonished if the quantity of twin primes in R did not increase with p1.

You have shown no principle. You cannot go from a particular example to a universal condition.

 

The twin prime conjecture is that there are an infinite number of pairs of twin primes - that is an interesting question. What you are saying is either trivial (6n+/-1 is either a twin prime site or has at least one prime composite) or completely unproven (that there is a formula that can give the number of twin primes between consecutive primes).

 

For your idea to be a proof of the twin prime conjecture you need to show the following

1. That there exists a formula (or set of formulae) that, given p1 and p2, will provide the number of prime composites for any known pair of consecutive primes.

2. That this formula is extendible to any consecutive primes - i.e., does the formula keep working?

 

At present you haven't given a formula that works for single digit primes - what you have is a method that relies on checking if numbers are prime. But the idea of the method is to show that numbers are prime - if this were in the philosophy section you would already have rejected any argument founded like this as a fallacy based on begging the question.

 

(Would it be possible, and a good idea, for someone to delete the two or three early posts that referred back to my original messy terminology, which disappeared when I re-edited my first post? They'll completely confuse anybody starting out at the beginning of this discussion. Just a thought.)

 

 

 

We understand what you mean - but it is just trivial; as John said way above, you are just re-stating the twin prime conjecture.

 

The twin prime conjecture is almost certainly true - but we need to prove it. I can show what you have claimed (that between the squares of consecutive primes there exists at least one twin prime) is true for the first 1600 primes - took me about 5 minutes in Excel. With some time I could make a program that would check higher and higher - and I have no doubt whatsoever that I would not find a counter-example. But counting and checking does not make a proof - a proof needs to show that for all possible...
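A Python version of that spreadsheet check, for anyone who wants to rerun it. This is my own sketch, not the original Excel sheet, and it only tests a finite range - which is exactly the point: it is evidence, not proof.

```python
def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = [False] * len(sieve[i * i::i])
    return [i for i, flag in enumerate(sieve) if flag]

ps = primes_up_to(200)                   # consecutive primes to test
prime_set = set(primes_up_to(200 ** 2))  # covers everything below 199**2

def twin_between_squares(p1, p2):
    """True if some twin pair (a, a + 2) lies between p1**2 and p2**2."""
    return any(a in prime_set and a + 2 in prime_set
               for a in range(p1 ** 2, p2 ** 2 - 1))

# start at ps[2] = 5, as in the discussion above
print(all(twin_between_squares(ps[i], ps[i + 1])
          for i in range(2, len(ps) - 1)))  # True
```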


But the square root of p_2 in our go-to example (p_1 = 31, p_2 = 37) is a little over six, and clearly there are primes larger than six that have multiples of the form 6n +/- 1 in R. Even primes larger than p_2 itself have such multiples (e.g. if you look at 41, it has multiples 205 = 6(34) + 1 and 287 = 6(48) - 1).

Yes, but they've already been counted as the products of primes below sqrt p2.

 

205 was counted as a product of 5. If the prime factor is larger than p2^2 then it cannot add any products in R that have not been counted already.

 

No, you described what you had at the beginning as a simple mathematical process that works for two consecutive primes p1 and p2. But it does not work even for small primes, let alone for big primes, or in general.

 

COUNT!!! That is the very problem. It is easy to count when p1 =13 and p2 =17; it is more difficult to count when p1 = 1667 and p2 = 1669 (still possible though - 35 twins and 1076 composites); but it is logically and mathematically impossible to count when p1=p1 and p2=p2 - you need to be able to calculate and predict.

I can count and predict. It's what I'm doing. Why is this not obvious?

 

You have shown no principle. You cannot go from a particular example to a universal condition.

I have, but obviously not effectively.

 

Damn. Text box gone again.

 

" I can show what you have claimed (that between the squares of consecutive primes exist at least one twin prime) is true for the first 1600 primes - took me about 5 minutes on excel. With some time I could make a program that would check higher and higher - and I have no doubt whatsoever that I would not find a counter-example. But counting and checking does not make a proof - a proof needs to show that for all possible."

 

I did not suggest that counting makes a proof. I don't think you understand what I'm saying yet, since all these objections miss the mark.

 

You can count it for the first million primes; it makes no difference. The mechanism is invariable. There is no need to check R for lots of different values. It's true where p = 5 and true forever after.

 

Sorry, I'm getting tetchy. Maybe you are as well. I think we are having a communication breakdown but I'm not sure why.

 

The first version of this I sent to a mathematician at the University of Bristol (editor of a maths journal), who told me it was correct but unrigorous. No quibbles or doubts about it. This is virtually the same calculation, but it should now be rigorous. It cannot suddenly have become a load of nonsense.

 

 

 

 

 



"Yes, but they've already been counted as the products of primes below p2."

 

"205 was counted as a product of 5. If all the prime factors of a number exceed p2, the number itself exceeds p2^2 and lies outside R, so such a prime cannot add any products in R that have not been counted already."

 

I see your point about 205. However, 287's only factors (besides 1 and 287) are 7 and 41, both of which are greater than sqrt(37).

 

"I can count and predict. It's what I'm doing. Why is this not obvious?"

 

"I have, but obviously not effectively."

 

"Damn. Text box gone again."

 

" I can show what you have claimed (that between the squares of consecutive primes exist at least one twin prime) is true for the first 1600 primes - took me about 5 minutes on excel. With some time I could make a program that would check higher and higher - and I have no doubt whatsoever that I would not find a counter-example. But counting and checking does not make a proof - a proof needs to show that for all possible."

 

"I did not suggest that counting makes a proof. I don't think you understand what I'm saying yet, since all these objections miss the mark."

 

"You can count it for the first million primes; it makes no difference. The mechanism is invariable. There is no need to check R for lots of different values. It's true where p = 5 and true forever after."

 

For my part, I'm not certain the argument holds water, and I think imatfaal's example of 31, 37 has been shown to be a counterexample, unless there's still some part of the argument we're missing.

 

"Sorry, I'm getting tetchy. Maybe you are as well. I think we are having a communication breakdown but I'm not sure why."

 

"The first version of this I sent to a mathematician at the University of Bristol (editor of a maths journal), who told me it was correct but unrigorous. No quibbles or doubts about it. This is virtually the same calculation, but it should now be rigorous. It cannot suddenly have become a load of nonsense."

 

No need to get upset. If the twin prime conjecture were easy to prove, everyone would be doing it. :P Even if our objections are actually silly, any proposed proof of the twin prime conjecture will be put through the wringer by the mathematical community, so be prepared if you take it that far.

 

As for Dr. A. Mathematician, he's one guy. Maybe the argument is actually correct, or maybe he's missing something too, especially if it was presented to him less rigorously than it is in this thread.

 

Regardless of all that, you must show that, assuming your argument is correct, the lower limit eventually reaches infinity. Simply getting larger isn't enough, as convergent infinite series show.
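That last point deserves emphasis: a quantity can increase at every step and still stay bounded. A one-line illustration with the partial sums of 1/2 + 1/4 + 1/8 + ..., which grow forever yet never reach 1:

```python
# Strictly increasing does not imply unbounded: these partial
# sums rise at every step but are always below 1.
s = 0.0
for n in range(1, 50):
    s += 0.5 ** n
    assert s < 1.0
print(s)  # just under 1, however many terms we add
```

So showing that the lower limit RL grows is not, on its own, showing that it grows without bound.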


"I see your point about 205. However, 287's only factors (besides 1 and 287) are 7 and 41, both of which are greater than sqrt(37)."

 

Where p1 = 31, 287 is not in R. The definition of R is p1^2 to p2^2, i.e. 961 to 1369.

 

"For my part, I'm not certain the argument holds water, and I think imatfaal's example of 31, 37 has been shown to be a counterexample, unless there's still some part of the argument we're missing."

 

"No need to get upset. If the twin prime conjecture were easy to prove, everyone would be doing it. :P Even if our objections are actually silly, any proposed proof of the twin prime conjecture will be put through the wringer by the mathematical community, so be prepared if you take it that far."

 

Not upset, but frustrated. It is such a simple idea, but somehow I cannot get it across. My fault, I'm sure.

 

"As for Dr. A. Mathematician, he's one guy. Maybe the argument is actually correct, or maybe he's missing something too, especially if it was presented to him less rigorously than it is in this thread."

 

"Regardless of all that, you must show that, assuming your argument is correct, the lower limit eventually reaches infinity. Simply getting larger isn't enough, as convergent infinite series show."

 

It was presented rigorously. But he found that the original calculation left open the theoretical possibility of a highest twin prime, even though the likelihood was approximately zero.

 

How can a limit reach infinity? Here it just grows larger. As p grows larger the limit cannot fall or stay the same.

 



Peter - when in doubt and getting frustrated use numbers.

 

1. p1=13 p2=17

 

2. Range = 289 - 169 = 120

 

3. Number of instances of 6n = 120/6 - 1 = 19

 

4. By my reckoning you need to account for composites made with

 

5s (235 is 5 x 47)

7s (259 is 7 x 37)

11s (209 is 11 x 19)

13s (221 is 13 x 17)

 

BTW those examples are all vital (if I have grasped it) as there are no other composites at those 6n which would apply.

 

120 / (3 x 5) = 8

120 / (3 x 7) = 5.71

120 / (3 x 11) = 3.64

120 / (3 x 13) = 3.08

 

If I have done something wrong with the sums now is the time for you to correct.

 

So what next????

 

Which numbers add up to make the number of composites that I then deduct from the number of sites?
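The raw counts in this post are easy to reproduce in code. A sketch for p1 = 13, p2 = 17, on the assumed reading that a "site" is a multiple of 6 with both neighbours 6n - 1 and 6n + 1 strictly inside R:

```python
def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

p1, p2 = 13, 17
lo, hi = p1 * p1, p2 * p2                  # R runs from 169 to 289
sites = [m for m in range(lo + 2, hi - 1)  # ensures lo < m - 1 and m + 1 < hi
         if m % 6 == 0]
twins = [(m - 1, m + 1) for m in sites
         if is_prime(m - 1) and is_prime(m + 1)]
print(len(sites))  # 19, matching 120/6 - 1
print(len(twins))  # 7 actual twin prime pairs in R
print(twins)
```

So of the 19 sites, 7 carry twin primes; the remaining 12 are spoiled by at least one composite neighbour.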


"Peter - when in doubt and getting frustrated use numbers."

 

etc

 

 

 

It's trying to use the damn things that's frustrating. I like the ideas but can't do the arithmetic.

 

But your sums are spot on. This is the calculation, almost. A problem with your version is that you divide by 3p instead of using 6p and doubling the result. It looks the same on paper but it's not in practice. My way gives a lower total of 19. So 19 products and 19 locations gives 0 as the lower limit for twin primes.

 

Before I say anything about why the result is zero, which is a massive underestimate, can I just check something with you? It will save a lot of time if the result is not what I expect. When I did the spreadsheet calcs they came out fine, but it's all a bit messy for very small primes.

 

Do you have a spreadsheet set up? It sounded like it. Could you do me a favour and run the calculation once (but using 6p as the divisor of R) for two much larger primes (not so large that it becomes a nuisance)? Your result will be trustworthy. My calcs worked, but maybe I built in an error. I did them a long time ago.

 

If the result is less than one then you will have demonstrated mathematically that I am an idiot. My approach would still be sound, but the calculation would be a failure.
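In lieu of a spreadsheet, here is a sketch of the requested run in Python, using one reading of the head-post formula (RP = the sum of ceil(2R/6p) over primes 5 <= p < p2, and RL = R/6 - RP); the choice of rounding and of exactly which primes to include are assumptions. It prints the estimated lower limit next to the true twin count for a few pairs:

```python
def primes_up_to(n):
    """Sieve of Eratosthenes: all primes <= n."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = [False] * len(sieve[i * i::i])
    return [i for i, is_p in enumerate(sieve) if is_p]

def estimated_lower_limit(p1, p2):
    """RL = R/6 - RP, with RP = sum of ceil(2R/6p) over primes 5 <= p < p2."""
    R = p2 * p2 - p1 * p1
    RP = sum((2 * R + 6 * p - 1) // (6 * p)      # integer ceiling division
             for p in primes_up_to(p2 - 1) if p >= 5)
    return R // 6 - RP                           # R is always divisible by 6

def actual_twin_count(p1, p2):
    """Twin prime pairs strictly between p1^2 and p2^2."""
    prime = set(primes_up_to(p2 * p2))
    return sum(1 for t in range(p1 * p1 + 1, p2 * p2 - 2)
               if t in prime and t + 2 in prime)

for p1, p2 in [(13, 17), (31, 37), (101, 103)]:
    print(p1, p2, estimated_lower_limit(p1, p2), actual_twin_count(p1, p2))
```

If RL comes out below one even where twins demonstrably exist, that tells us the estimate of RP is too coarse for the purpose, not that twins are absent.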

