Infinite coin flip probability


Lord Antares

Let us say you have a ruler laid out on the floor, with the spacings arranged such that 1 increment = 1 meter. You are standing at 0.

You start flipping a coin an infinite number of times. When you get heads, you move 1 meter forwards (positively) on the ruler. When you get tails, you move 1 meter backwards. If you flip tails at 0, you stay at zero. Let's say that you flipped the coin once and it landed on heads, so you are now standing at 1 meter. With a potentially infinite number of coin tosses, what are the odds of eventually arriving at 0? How different would it be if the ruler was infinite as opposed to finite?

 

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

 

Some of my thoughts on this:

 

We can immediately see that the odds will always be at least 50% and will progressively get higher: you already have a 50% chance of getting tails on the first toss, which would be ''game over'', but if you get heads, you still have a chance for it to turn around later. Now, I think I have discovered how to calculate this for a finite number of throws. I've counted them manually for the first several coin tosses and the odds are (in increasing number of coin tosses) 1/2, 2/4, 5/8, 10/16, 22/32. So I think you always multiply the denominator by 2, and for the numerator you do this: n x 2, n x 2 + 1, n x 2, n x 2 + 2, n x 2, n x 2 + 3, etc. I don't know how to express this mathematically, but I hope someone can tell me if this is correct.
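The hand counts above can be checked by brute force. A minimal Python sketch (an editorial illustration, not part of the original post), taking heads as +1, tails as −1, and starting at position 1:

```python
from itertools import product
from fractions import Fraction

def odds_by_flip(n):
    """Exact fraction of length-n flip sequences (heads=+1, tails=-1)
    that touch 0 at some point, starting from position 1."""
    hits = 0
    for seq in product((+1, -1), repeat=n):
        pos = 1
        for step in seq:
            pos += step
            if pos == 0:        # reached zero: count this sequence once
                hits += 1
                break
    return Fraction(hits, 2 ** n)

print([odds_by_flip(n) for n in range(1, 6)])
# equal to 1/2, 2/4, 5/8, 10/16, 22/32 after reduction
```

The enumeration reproduces the manual counts exactly (2/4 and 10/16 appear in reduced form as 1/2 and 5/8).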

How I would calculate this for an infinite number of flips is beyond me.

 

Also, the odds have to be slightly higher for a finite ruler. Let's say the ruler is 700 meters long. In the unlikely event that we flip 699 more heads than tails (without dipping back to 0 along the way, of course, because that would be game over), we would find ourselves at 700 meters. Now, if we flip any more heads, we will not be able to go further, whereas on an infinite ruler we could go beyond 700, from where returning to 0 would be even less likely than from 700.

I think it would also be easier to calculate this for an infinite ruler because I would assume it follows a linear rule, whereas the finite one doesn't.

 

Is this problem more complicated than I think? If it isn't, I would appreciate it if you would explain what you are doing in your calculations, because I am a big layman.

Edited by Lord Antares

This is a description of the Gambler's Ruin, which has been thoroughly studied, but it is far from easy, so it's no surprise you haven't got it on the first go. The Gambler's Ruin is essentially a series of bets a gambler makes against a bank with infinite money: heads, the gambler wins £1; tails, the gambler loses £1. Even if the coin is unbiased, and no matter the gambler's starting stake (as long as it's finite), the gambler will lose all his money with probability one, because the bank can always keep playing but the gambler cannot play after hitting zero, as he can't afford to bet.

 

Specifically, what you are after is called the ruin probability. The proof is quite involved (P26) depending on how much maths you already know; we'd have to go through some preliminary learning first if you want to derive it together.
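For reference (an editorial sketch, not part of the original post): the classical fair-coin result is that a gambler starting with stake k, against a bank that ends the game at total N, is ruined with probability 1 − k/N, which tends to 1 as the bank grows without bound. A rough Monte Carlo check:

```python
import random

def ruin_probability(start, bank, trials=20_000, rng=random.Random(0)):
    """Estimate the chance a fair +/-1 walk from `start` hits 0 before `bank`.
    Classical fair-coin result: 1 - start/bank."""
    ruined = 0
    for _ in range(trials):
        pos = start
        while 0 < pos < bank:          # play until ruin or breaking the bank
            pos += rng.choice((1, -1))
        ruined += (pos == 0)
    return ruined / trials

print(ruin_probability(1, 10))   # should come out close to 1 - 1/10 = 0.9
```

Raising `bank` pushes the estimate towards 1, which is the infinite-bank case discussed above.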


This is a description of the Gambler's Ruin, which has been thoroughly studied, but it is far from easy, so it's no surprise you haven't got it on the first go. The Gambler's Ruin is essentially a series of bets a gambler makes against a bank with infinite money: heads, the gambler wins £1; tails, the gambler loses £1. Even if the coin is unbiased, and no matter the gambler's starting stake (as long as it's finite), the gambler will lose all his money with probability one, because the bank can always keep playing but the gambler cannot play after hitting zero, as he can't afford to bet.

 

This is true as long as the number of coin flips is infinite, right? Yes, I think you could equate this to my question. Getting to 0 on the ruler would equate to the gambler losing all his money. Positive infinity is the bank, so we would have to use my infinite ruler example. Is this right?

 

So if these cases are equivalent and the probability for the gambler case is 1, then it must be 1 for my example as well. But this means that the money necessarily needs to tip in the bank's favor, right? Suppose we, by a stroke of luck, start at 1 dollar and end up at 335, and from then on we continue to flip into infinity. You are saying that the flips would always eventually lead the gambler to 0 dollars. It is not possible that the money will fluctuate between, say, 76 and 645? This is just an issue with using infinity as the number of flips, right? Is there any intuitive explanation of how this leads to that conclusion? I would have assumed that flipping a coin an infinite amount of times will necessarily lead you to 50%.

 

Specifically, what you are after is called the ruin probability. The proof is quite involved (P26) depending on how much maths you already know; we'd have to go through some preliminary learning first if you want to derive it together.

 

Unfortunately, you will find that my math knowledge is feeble. I am just trying to use logic to see where it can lead me, but if the problem is as complicated as you say, I'm afraid you would have to waste a substantial amount of time for me to understand it. I didn't know the problem was a complicated one.

Edited by Lord Antares

 

This is true as long as the number of coin flips is infinite, right? Positive infinity is the bank, so we would have to use my infinite ruler example. Is this right?

 

Yes.

 

 

Positive infinity is the bank, so we would have to use my infinite ruler example. Is this right?

 

Yes, though there may be a slight difference in set-up between your finite ruler and my finite bank balance: when you reach the end of your ruler, you say you go back on a tail and stay still on a head, whereas when the bank has run out of money, the gambler just walks away laughing and the game stops.

 

 

So if these cases are equivalent and the probability for the gambler case is 1, then it must be 1 for my example as well. But this means that the money necessarily needs to tip in the bank's favor, right? Suppose we, by a stroke of luck, start at 1 dollar and end up at 335, and from then on we continue to flip into infinity. You are saying that the flips would always eventually lead the gambler to 0 dollars. It is not possible that the money will fluctuate between, say, 76 and 645? This is just an issue with using infinity as the number of flips, right?

 

It's not possible that the money will indefinitely fluctuate between any two values (except the lowest and highest values if those exist), but the probability of it fluctuating between two values for some number of flips can be calculated.

 

 

Is there any intuitive explanation of how this leads to that conclusion? I would have assumed that flipping a coin an infinite amount of times will necessarily lead you to 50%.

 

There is nothing intuitive about infinity as far as I can tell. All I can say is that if an outcome is possible (no matter how unlikely), then given infinite repetitions that outcome will occur. Like monkeys writing Shakespeare: notwithstanding practical limitations, they will reproduce all the sonnets.

 

 

Unfortunately, you will find that my math knowledge is feeble. I am just trying to use logic to see where it can lead me, but if the problem is as complicated as you say, I'm afraid you would have to waste a substantial amount of time for me to understand it. I didn't know the problem was a complicated one.

 

So is mine, don't worry about it. I find interest and effort to be more important to learning than any innate ability.

 

You were making a start in the finite case, but here's something to think about: what is the probability of landing on step 0 given any even number of steps (and that you start on step 1)?

 

 

P.S.

 

What about programming? We can explore these concepts using Monte Carlo simulations.
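For instance (an editorial sketch of the kind of simulation meant here, not code from the thread), one can estimate how the probability of reaching 0 from the starting point at 1 creeps towards 1 as the number of allowed flips grows:

```python
import random

def frac_reaching_zero(n_flips, trials=10_000, rng=random.Random(1)):
    """Fraction of fair +/-1 walks starting at 1 that touch 0 within n_flips."""
    hits = 0
    for _ in range(trials):
        pos = 1
        for _ in range(n_flips):
            pos += rng.choice((1, -1))
            if pos == 0:         # reached zero within the allowed flips
                hits += 1
                break
    return hits / trials

for n in (1, 5, 25, 125, 625):
    print(n, frac_reaching_zero(n))
```

The estimates rise with n (the exact values for n = 1 and n = 5 are 1/2 and 11/16, matching the hand counts earlier in the thread), approaching 1 only in the limit.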

Edited by Prometheus

 

Yes, though there may be a slight difference in set up when you have a finite ruler and i have a finite bank balance: when you reach the end of your ruler you say i go back on a tail and stay still on a head, whereas when the bank has run out of money the gambler just walks away laughing and the game stops.

Yes, exactly. So it differs a bit. But that just increases the probability of going back to 0 compared to an infinite ruler, because there is a limit to how far you can get away from 0, right?

 

It's not possible that the money will indefinitely fluctuate between any two values (except the lowest and highest values if those exist), but the probability of it fluctuating between two values for some number of flips can be calculated.

It is a bit weird to think about. If you pick any number of coin flips, however high, there will always be a chance that it will fluctuate between two non-end numbers. But if that number is infinity, the chance is 0. This both does and doesn't make sense to me.

 

There is nothing intuitive about infinity as far as I can tell. All I can say is that if an outcome is possible (no matter how unlikely), then given infinite repetitions that outcome will occur. Like monkeys writing Shakespeare: notwithstanding practical limitations, they will reproduce all the sonnets.

Hm. I'm thinking that this scenario is different from the Shakespeare monkey one. I may be wrong, but here's why I think so:

 

The Shakespeare monkey process can reset: the monkey types drivel until it randomly types out Hamlet. So it would be more like drawing from an infinite pool of lotto numbers. Eventually, all numbers will be pulled out, but there is no extra dimension of moving along the ruler. Saying that zero on the ruler must eventually be reached means that it is certain that there will eventually be more tails than heads! There must always eventually be one more tails than heads for it to reach zero. I'm not sure how this impacts the answer.

 

EDIT: of course there would be fluctuation between a higher number of tails and higher number of heads, but consider starting at 500 meters. It would have to be guaranteed that you would, at some point, have 500 more tails than heads.

So is mine, don't worry about it. I find interest and effort to be more important to learning than any innate ability.

Absolutely agreed.

 

You were making a start in the finite case, but here's something to think about: what is the probability of landing on step 0 given any even number of steps (and that you start on step 1)?

I thought I covered that in the OP. I took a shot at how to calculate the probabilities for finite flips. No one has yet told me whether it is wrong or not.

 

P.S.

 

 

What about programming? We can explore these concepts using Monte Carlo simulations.

I know nothing about programming, so this is beyond me.

Edited by Lord Antares

 

Yes, exactly. So it differs a bit. But that just increases the probability of going back to 0 compared to an infinite ruler, because there is a limit to how far you can get away from 0, right?

 

Yes. There are loads of subtle ways to change the set-up that can change the probabilities drastically.

 

 

It is a bit weird to think about. If you pick any number of coin flips, however high, there will always be a chance that it will fluctuate between two non-end numbers. But if that number is infinity, the chance is 0. This both does and doesn't make sense to me.

 

 

Consider boundaries a and b that are, say, 10 steps apart, with us starting midway between them. What's the likelihood of hitting a or b after one step? Zero, obviously. So too for 2, 3 and 4 steps. But after 5 steps there's a 1/16 chance of us hitting a or b. After seven steps it's more likely still, and more and more likely for successive steps until, in the limit (i.e. infinite steps), the probability is 1.
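That 1/16 can be verified by exhaustive enumeration (an editorial sketch; the boundaries sit 5 steps either side of the start, so only an all-heads or all-tails run of 5 can hit one):

```python
from itertools import product
from fractions import Fraction

def hit_boundary_within(steps, half_width=5):
    """Exact P(a fair +/-1 walk from 0 reaches +/-half_width within `steps`)."""
    hits = 0
    for seq in product((1, -1), repeat=steps):
        pos = 0
        for s in seq:
            pos += s
            if abs(pos) == half_width:   # touched boundary a or b
                hits += 1
                break
    return Fraction(hits, 2 ** steps)

print(hit_boundary_within(4))   # 0: a boundary 5 away is unreachable in 4 steps
print(hit_boundary_within(5))   # 1/16: the two straight runs out of 32 paths
```

Pushing `steps` higher shows the probability climbing, consistent with the limit argument above.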

 

 

 

Hm. I'm thinking that this scenario is different from the Shakespear monkey one. I may be wrong, but here's why I think so:

 

Yes, it's a bit different to Shakespeare's monkeys (though you can model both with a Markov chain). I was just trying to convey the point that any event with a fixed positive probability of occurring, however small, will occur if given infinite chances.

 

 

EDIT: of course there would be fluctuation between a higher number of tails and higher number of heads, but consider starting at 500 meters. It would have to be guaranteed that you would, at some point, have 500 more tails than heads.

 

Yes, that's right. We can quantify things like the expected number of flips needed to get to this state. That number of flips may be very large, but since we have infinite flips, it will happen sooner or later.

 

 

I thought I covered that in the OP. I took a shot at how to calculate the probabilities for finite flips. No one has yet told me whether it is wrong or not.

 

You are trying to calculate the probability of arriving at zero on step number n in that set-up, aren't you? You tried this manually? Did you notice anything odd about how many steps it took to get to zero?

Edited by Prometheus

 

Consider boundaries a and b that are, say, 10 steps apart, with us starting midway between them. What's the likelihood of hitting a or b after one step? Zero, obviously. So too for 2, 3 and 4 steps. But after 5 steps there's a 1/16 chance of us hitting a or b. After seven steps it's more likely still, and more and more likely for successive steps until, in the limit (i.e. infinite steps), the probability is 1.

 

I understand this. It's just a bit strange; infinity doesn't follow the usual rules of probability at all. For example, if we get a billion heads in a row, we are still guaranteed to reach zero, even though there is only a 50% chance of going in the direction of zero on any single flip. And in principle nothing stops the walk from fluctuating between two given numbers forever. I understand why this is so (because we are dealing with an infinite number of flips), but it is just a bit strange.

 

 

Yes, it's a bit different to Shakespeare's monkeys (though you can model both with a Markov chain). I was just trying to convey the point that any event with a fixed positive probability of occurring, however small, will occur if given infinite chances.

 

True.

 

 

You are trying to calculate the probability of arriving at zero on step number n in that set-up aren't you?

 

Yes.

 

 

 

You tried this manually?

 

Yes. Bear in mind that I know almost nothing about mathematics and any conclusions I derive are from my own attempt at understanding it. I am somehow very drawn to probability and odds and I often try to calculate them for some situations where it would be useful. So any educated comments about why I'm wrong or right are helpful to me. I also have many questions.

 

 

 

Did you notice anything odd about how many steps it took to get to zero?

 

Are you referring to the fact that the more steps there are, the higher the probability that it reaches zero? So for infinite steps, the probability is 1? Or do you mean something else?

 

Thanks for replying. Not many members seem to be interested in probability.

Edited by Lord Antares

...but it is just a bit strange.

 

Yeah, things can get strange with infinity. That's why it's hard to get an intuitive feel for it.

 

 

Yes. Bear in mind that I know almost nothing about mathematics and any conclusions I derive are from my own attempt at understanding it. I am somehow very drawn to probability and odds and I often try to calculate them for some situations where it would be useful.

 

Nothing wrong with doing it the old-fashioned way. There are stories of gamblers who had a good idea of certain probabilities just from playing so much, and then sought confirmation from mathematicians.

 

 

Are you referring to the fact that the more steps there are, the higher the probability that it reaches zero? So for infinite steps, the probability is 1? Or do you mean something else?

 

 

I was hinting at the phenomenon that you will never reach step zero, starting from step one, on an even number of attempts. Try it.


 

I was hinting at the phenomenon that you will never reach step zero, starting from step one, on an even number of attempts. Try it.

 

Oh, that. Well, obviously, since there always needs to be one more tails than heads to reach zero, and if there is one more tails than heads, the resulting number of flips has to be odd.

 

So do you think my original question is too hard to answer if I don't have the preliminary knowledge of mathematics? I will be honest and say that I don't know most symbols used in long equations.


Well, let's try your method and see. I'm starting from step one and trying to find the probability of being on step 0 after n moves, also assuming a 50/50 chance of moving backwards or forwards.

 

Trying it for up to seven steps I get the sequence of probabilities for n = 1, 2, ..., 7 to be 1/2, 0, 1/8, 0, 1/16, 0, 1/32. So it seems like a pattern is emerging, but I'm doing this 'by hand', so we must always be wary of mistakes. This is different to your sequence, which doesn't seem right to me. Assuming this pattern continues (we should really do some more steps, but even by seven it was getting tedious, and I don't think it will actually hold), we might say the probability that we get to step 0 on move 9 is 1/64. We want to find a formula that returns 0 for even n and returns certain powers of 2 for odd n. We then want to test it.

 

A slightly better way might be to notice where this pattern is coming from: for any move n, the number of possible paths (to anywhere, not just 0) is [math]2^n[/math]. Then we want to 'keep' just those paths that lead back to zero. So for n = 7, we have [math]2^7 = 128[/math], and I counted 4 paths that lead to zero out of these 128, giving us [math]\frac{4}{128} = \frac{1}{32}[/math]. I can't think of a simple way to count all the paths that reach zero on move n (without landing on zero in any preceding move) though.
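One systematic way to count such paths (an editorial sketch, not from the thread): keep a tally of how many surviving paths sit at each position above zero, and absorb a path the moment it steps onto 0. Dividing the count for step n by [math]2^n[/math] gives the probability of first reaching zero on exactly move n, and running it further is a useful check on the hand counts.

```python
def first_passage_counts(max_n):
    """counts[n] = number of +/-1 paths from position 1 that first
    reach 0 on exactly step n (position 0 is absorbing)."""
    counts = {}
    alive = {1: 1}                         # position -> surviving path count
    for n in range(1, max_n + 1):
        nxt = {}
        for pos, ways in alive.items():
            for new_pos in (pos + 1, pos - 1):
                if new_pos == 0:
                    counts[n] = counts.get(n, 0) + ways   # absorbed at 0
                else:
                    nxt[new_pos] = nxt.get(new_pos, 0) + ways
        alive = nxt
    return counts

print(first_passage_counts(5))   # {1: 1, 3: 1, 5: 2} -> 1/2, 1/8, 2/32 = 1/16
```

This agrees with the hand-counted 1/2, 1/8, 1/16 at n = 1, 3, 5, and scales to any n without manual path counting.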

 

Does any of this make sense/help? Essentially you are adopting an empirical approach to calculating probabilities, which is prone to mistakes, cannot easily be generalised (if we want to make the flips p/q instead of 50/50, for instance), and ultimately cannot be proved for all n without using the other methods anyway.


Ah, yes. I was doing a different thing. I was giving odds for flips UP TO n. That is also why they didn't change for an even number of flips (that is, both the numerator and the denominator were multiplied by two, so it stays the same). So there wasn't an error as such, but I did a calculation for the wrong thing.

 

Yes, this helps as you gave the odds for the exact n, which is what we were talking about. Of course, I knew I was giving them for up to n, but that was just a misunderstanding.

 

 

 

Essentially you are adopting an empirical approach to calculating probabilities, which is prone to mistakes, cannot easily be generalised (if we want to make the flips p/q instead of 50/50, for instance), and ultimately cannot be proved for all n without using the other methods anyway.

 

Yes, exactly. That's why I was looking for a mathematical answer.

 

 

and i counted 4 paths that lead to zero out of these 128

 

Did you count them manually or did I miss something?


Ah, yes. I was doing a different thing. I was giving odds for flips UP TO n. That is also why they didn't change for an even number of flips (that is, both the numerator and the denominator were multiplied by two, so it stays the same). So there wasn't an error as such, but I did a calculation for the wrong thing.

 

I see now. Not necessarily wrong then, just different. Although if we then sum my sequence up to each n shouldn't we recover your sequence?

 

Also, we know from that link that the sequence should sum to one - we could use this knowledge as one test of any closed form solution we make.

 

 

Did you count them manually or did I miss something?

 

Yes, I did it manually, which is why I only went up to 7. For this method to really work you need a computer to perform the experiments: no errors (assuming the code is OK) and much quicker. This is essentially what Monte Carlo simulation is; it's used all over the place when analytic solutions to problems aren't forthcoming for whatever reason. Even if analytic solutions are available, it's often quicker to simulate the results, but arguably you lose understanding that way.


 

Although if we then sum my sequence up to each n shouldn't we recover your sequence?

 

 

But we do! I also thought there was a mistake somewhere, but that is only because I forgot to count your zeroes. Where I gave 10/16, I was comparing it to your 11/16, whereas I had to compare it with my 22/32, which agrees. So, my sequence is above, yours below:

 

1/2, 2/4, 5/8, 10/16, 22/32, 44/64, 91/128

 

1/2, 1/2, 5/8, 5/8, 11/16, 11/16, 23/32

 

But wait, there is an error in the last step. I think I know what it is, but I'm in a hurry to leave for work, so I don't have time to check it.

I think I should add +1, +2, +4, +8 etc. and not +1, +2, +3, +4...

 

If we use this new formula, I get 92/128 on the last step, which corresponds with your 23/32.


  • 1 month later...

Concerning our talk about coin flips and infinite tosses, it seems that we were both right for different reasons. You said that the probability of reaching 0 in an infinite number of tosses is 1. Today I found out that, while this seems to be correct, it doesn't mean what I thought it did.

Wikipedia states that it is possible that zero is never reached (the article specifically talks about tails never being flipped in an infinite number of tosses), which I was thinking might be the case! Since you can always get further away on each flip, it is technically possible that we never reach 0 on the ruler, even with an infinite number of flips. That is strange to think about, as infinity is supposed to cover all possible situations and conditions, but it still makes sense to me that this can be true, even though it's a bit tricky to think about.

 

Here is the article:

 

https://en.wikipedia.org/wiki/Almost_surely

 

And here is the quote specifically concerning coin flips:

 

[Image: screenshot of the quoted Wikipedia passage on infinitely repeated coin flips]

 

So basically, the article states that a probability of 1 doesn't necessarily mean that the event MUST happen, which makes no sense to me.

It cannot possibly be mathematically correct to say that p = 1 is an ''almost certain'' event.

 

Is this another approach to probability, which I am not aware of? It makes no sense to me so it might be a more subjective (hence, less mathematical?) approach.

Input on this would be greatly appreciated.

 

EDIT: But wait just a minute. If, mathematically speaking, the probability of never flipping tails in an infinite number of coin flips is technically not 100% (but rather unfathomably close to it), isn't the probability for my problem around 0.5, rather than 1?

Edited by Lord Antares

Is this another approach to probability, which I am not aware of?

It's the standard approach to probability. Think of it this way. First think about 3 flips. The odds are 1/8 of getting any particular possibility: hhh, hht, hth, htt, thh, tht, tth, ttt. The odds of any particular sequence are the same.
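In code (an editorial sketch of the same enumeration):

```python
from itertools import product

# All possible outcomes of 3 fair coin flips, each equally likely.
seqs = [''.join(s) for s in product('ht', repeat=3)]
print(seqs)            # ['hhh', 'hht', 'hth', 'htt', 'thh', 'tht', 'tth', 'ttt']
print(1 / len(seqs))   # each particular sequence has probability 1/8 = 0.125
```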

 

Now what if there are infinitely many flips? The odds of any particular sequence are the same. Zero. But some sequence must occur. Therefore it's possible for probability zero events to occur.

 

This is basic to standard probability theory.

 

https://en.wikipedia.org/wiki/Probability_axioms

 

Another way of imagining this is to pick a random real number in the unit interval. The probability of picking any particular real number is zero, but all the real numbers are there and some real number must get picked.

 

In infinitary probability theory, probability 0 events may occur and probability 1 events might not. That's what the phrase "almost all" or "almost surely" means.

Edited by wtf

Another way of imagining this is to pick a random real number in the unit interval. The probability of picking any particular real number is zero, but all the real numbers are there and some real number must get picked.

 

Thank you, this helped me understand it. I am going through the linked article, but I am illiterate in the mathematical language so it will take time until I've understood the whole of it.

 

But basically this is because we're dealing with infinity. We have an infinite sample of real numbers, and therefore picking any number will yield a probability of 1/infinity, which is 0. That is a good point and it makes sense to me.

 

But is this different from my problem? Flipping a coin for an infinite number of tosses and never getting tails has an ''almost zero'' chance, as you said, but I am thinking it should be a different probability than for my infinite ruler. So, is the probability of getting to 0 on an infinite ruler ''almost one'', or something else?


Flipping a coin for an infinite amount of tosses and never getting tails has ''almost zero'' chance, as you said

No, I didn't say that. All heads has zero probability. Exactly zero. It's just that probability zero events can occur. In fact, since the probability of any particular sequence whatsoever is exactly zero, a probability zero event MUST occur.

 

I haven't followed the thread so I can't comment on your original question. I only jumped in to clarify probability zero events.

Edited by wtf

Aha, I get it. I just expressed myself wrongly. So, 0 = almost never, rather than almost 0 = almost never. The latter is an invalid term, right? I confused my terminology.

 

 

I haven't followed the thread so I can't comment on your original question. I only jumped in to clarify probability zero events.

 

That's alright. I found the input interesting and learned something new, as I hadn't thought about it before.


wtf, your several posts together make a clearly understandable simplification of a difficult subject, viz. probability. +1

 

Now here is a question.

 

You have demonstrated that a random event can have a probability of 0, but can a random event have a probability of 1?

 

I have a box of 10 chocolates, all different.

I take one out, at random, and eat it.

I take another out at random and eat that one.

and so on until I have eaten 9 chocolates.

 

Making a 10th selection is now a deterministic certainty so can it be random?

 

Lord Antares, can you spot the difference between the chocolates situation and flipping a coin?

Edited by studiot

I'm thinking there is a deeper point to your post. Surely the 10th selection, taken in isolation, is not random. Or, viewed from the standpoint of probability, it always has probability 1.

I'm not sure if the question was directed to wtf or me.


Don't be greedy.

 

:)

 

There were two questions, one each.

 

How about trying yours?

 

The answer has a bearing on your question "is there more than one probability?"

Edited by studiot

Ah, I replied before the edit, so I wasn't sure what you were getting at.

 

 

Lord Antares, can you spot the difference between the chocolates situation and flipping a coin?

 

Of course. The point that you're trying to make is that probability changes based on whether the ''pool of possibilities'' diminishes or not after each attempt, right?

So, even a finite number of coin flips always has the unchanged probability of 0.5 for either heads or tails (assuming that the coin is purely mathematical and random, of course).

 

When you take the chocolates out, the pool diminishes by 1 each time, so the probability of any particular one being taken out is 1/10 on the first draw, 1/9 on the second... eventually arriving at 1/1. So yes, there are different kinds of probability.

 

It is similar with lotto numbers, as drawing one ball eliminates the chance of it being drawn again, diminishing the ''pool of possibilities'', so the odds change after each draw.
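The shrinking pool can be made concrete (a small editorial sketch; the chocolate labels are arbitrary):

```python
import random

box = list(range(10))            # 10 distinct chocolates
probs = []
while box:
    probs.append(1 / len(box))   # chance of drawing any particular remaining one
    box.pop(random.randrange(len(box)))   # draw without replacement

print(probs[0], probs[-1])       # 0.1 on the first draw; the last draw is certain: 1.0
```

Contrast this with coin flips, where the chance stays 0.5 on every toss no matter what came before.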

 

By the way, a side question: are the terms ''probability'', ''odds'' and ''chance'' synonymous? I'm not a native English speaker.


But is this different from my problem? Flipping a coin for an infinite number of tosses and never getting tails has an ''almost zero'' chance, as you said, but I am thinking it should be a different probability than for my infinite ruler. So, is the probability of getting to 0 on an infinite ruler ''almost one'', or something else?

 

It's no different for the infinite ruler problem: we will get to zero with probability one, which is the same as saying almost surely (we don't say ''almost one''; if the probability were, say, 0.9999, we might). But as wtf explained better than I could, it might still not happen. What property of the ruler are you thinking makes the situation different?


 

What property of the ruler are you thinking makes the situation different?

 

Comparing the infinite ruler to the simple coin-flip example (never flipping tails): in the latter, the situation resets with every new flip. The tosses are independent; when you flip heads, the chance of flipping tails on the next turn doesn't decrease or increase. I understand how the probability of never getting tails is zero, yet that doesn't absolutely mean it won't happen. That makes sense to me.

 

However, in the infinite ruler example, the probability of getting to zero changes based on previous tosses. If you flipped heads 699 times in a row, you would be at 700, making it less likely that you would eventually return to zero. This is all fine and dandy; however, infinity complicates the issue. I am not sure whether the probability is 1 (the opposite of the previous example's 0), or whether it is somehow changed by the fact that previous tosses impact the result. It makes sense to me that it would be, but it also makes sense that it wouldn't. I am not certain.

