
C++ is dying


bascule


Please note: the title is a joke.

 

Shadow asked a number of questions about why I don't like C++ and don't think new programmers should learn it. That said, I do think there are cases where C++ is useful, and depending on what you want to program it might be a good choice.

 

So hopefully we can get into that a little here instead of disturbing yourdad's thread.

 

Why would so many people be using it if it were truly such a terrible language?

 

There are a few reasons:

 

1) Many companies are heavily invested in C++ with massive legacy codebases. Transitioning to another language is too costly.

 

2) C++ remains a great language for developing the core functionality of applications which you intend to port to multiple performance-critical environments (e.g. mobile or embedded devices). A C++ compiler is generally available for most platforms. However, C++ remains a poor choice for developing portable libraries as compared to C: there is no standard for C++ name mangling, so interfacing between C++ libraries produced by different compilers requires you to define a C API anyway (there's a minimal sketch of this after the list).

 

3) C++ remains a great fit for the requirements of developing video games, providing a nice balance between keeping code structure manageable and staying "bare to the metal" in order to outcompete other games in areas like visual effects and physics.

 

4) C++ is a great language for developing application virtual machines for other languages. Most of the popular application virtual machines in use today are written in C++. Again, here C++ strikes a nice balance between keeping code structure manageable while remaining "bare to the metal", and C++ does provide a nice set of tools for testing and debugging.

 

5) C++ has immense inertia in terms of programmer mindshare. It's a language a lot of people start with simply because it is popular. However, these programmers fall victim to what Paul Graham has dubbed the Blub Paradox: because they are trapped in a C++ mindset, and C++ is a relatively low-level language, it is hard for them to grasp the advantages offered by higher-level languages.
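
To make the name mangling point in 2) concrete, here's a minimal sketch of the usual workaround: hide the C++ behind an extern "C" facade so the exported symbols aren't mangled. (The Counter class and the counter_* functions are made up purely for illustration.)

// counter.cpp -- C++ implementation hidden behind a C API.
// Since there is no standard C++ ABI, exporting the class directly
// would tie the library to one compiler's name mangling.
class Counter {
public:
    Counter() : value_(0) {}
    void increment() { ++value_; }
    int value() const { return value_; }
private:
    int value_;
};

// Only these unmangled symbols are exported; any compiler (or any
// language with a C FFI) can link against them.
extern "C" {
    void* counter_create()           { return new Counter; }
    void  counter_destroy(void* c)   { delete static_cast<Counter*>(c); }
    void  counter_increment(void* c) { static_cast<Counter*>(c)->increment(); }
    int   counter_value(void* c)     { return static_cast<Counter*>(c)->value(); }
}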

 

Growing as in more people aren't using it, or growing as in updates to the language?

 

The former. Fewer people are using C++. That's because in the domains where C++ traditionally excelled, there are now a number of languages with similar levels of performance that offer more features which reduce complexity and improve maintainability and ease of debugging.

 

Perhaps the foremost of these is Java. I am no fan of Java, but that's merely a matter of personal taste. C# is another alternative which, while it doesn't quite match C++ on performance, offers a richer set of features.

 

If the latter, you've got C++0x which is supposed to be released in a couple of years.

 

C++ is a terribly difficult language to grow. The changes proposed do fix some long-standing flaws, but they do so by further complicating a language that is already dizzyingly complex to begin with.

 

Call this bascule's corollary to Feynman, but if someone tells you they fully understand C++, they probably don't understand C++.

 

If the former...

 

[Image: TIOBE Programming Community Index trend graph (tpci_trends.png)]

 

The TIOBE graph shows some very interesting trends: the real growth in programming mindshare is in niche languages, which are eroding the mindshare of the more popular languages like Java, C, and C++. This can be considered an example of what's known as the long tail effect.

 

I don't expect the popularity of C to diminish substantially any time soon. C is very much essential to a number of tasks, particularly systems programming. We see C waning, but not dramatically. C++, on the other hand, is slowly on its way out. It came, it served its purpose, there are lots of people with huge codebases in it, but like COBOL, the sheer virtue of having large existing codebases isn't going to be enough to keep the language popular.

 

I think over time we will see C++'s role in game development diminish, particularly as higher level languages are able to make better use of things like CUDA, vector units, and massively multicore processors than C++ programmers can. Indeed game developers are rather perplexed as to how to properly leverage something like the Cell processor, as writing correct concurrent programs in C++ is painstakingly difficult.

 

Game development is shifting: the performance bottleneck is no longer making your game run as fast as possible on one CPU and one GPU, but making use of all the available CPU cores as much as possible. That isn't a problem which C++ is able to solve very well.

 

The reason niche languages are growing is because they provide a better conceptual model for getting things done. A better abstraction means less code, faster development, and easier debugging.

 

The only advantage C++ brings to the table besides its maturity, popularity, and diversity of tools is the simple fact that it lets you write large portions of your program in C and potentially inline assembly. It's a language you use for speed. However, in my personal list of priorities, speed will always take the back seat to correctness and maintainability. Writing correct programs in C++ isn't impossible, but it's substantially harder than writing correct programs in languages like Java or Python.


Thanks bascule, that helps a lot. I'll have a look at the links you posted when I get back. But I was wondering: what languages do you use, for what purpose (as in, what do you mainly use them for) and why those languages? 'Cause I "finished" learning C++ about a year ago, and for some time now have been thinking about what language I should start learning. Java was an obvious choice, and I started delving into it a little but I didn't last long. I'm going to pick it up again eventually, and maybe even take a crack at assembly, but that's just going "up" and "down" from C(++). And I'd like to go right :)


what languages do you use, for what purpose (as in, what do you mainly use them for) and why those languages?

 

I use a number of languages:

 

Ruby: This is my choice for a general purpose prototyping and automation language. I use Ruby because it allows me to accomplish things quickly with a minimal amount of code. It also allows me to build relatively complex, well-structured systems while, again, keeping code size to a minimum without losing readability, which keeps the code maintainable.

 

Erlang: at this point in time I don't use Erlang for anything practical; I've been prototyping my own language on top of it. I learned it because it provides an excellent approach to concurrency, which I consider the most pressing problem facing programmers today.

 

JavaScript: I'm not a fan of JavaScript, but it's not a completely terrible language (I prefer it to C++, for example) and it's an incredibly practical one due to its ubiquity in browser environments, Flash (as ActionScript), and other places (e.g. widget frameworks).

 

C: a language I used on its own for far too long, but C still remains the king of interfacing. The wealth of libraries available in C trumps virtually any other language's, and obviously C is a great language for writing your "hot spots" in.

 

'Cause I "finished" learning C++ about a year ago

 

Does that mean you gave up on learning additional features?

 

for some time now have been thinking about what language I should start learning. Java was an obvious choice, and I started delving into it a little but I didn't last long. I'm going to pick it up again eventually, and maybe even take a crack at assembly, but that's just going "up" and "down" from C(++). And I'd like to go right :)

 

Java is a step in the right direction from C++, but it's not a very big step.

 

I'd suggest a language like Python. I suggest Python over Ruby because it has a larger community and thus has a wider variety of libraries.


Does that mean you gave up on learning additional features?

 

Well, to be honest I'll probably completely ignore the update when it comes out. What I meant to say is that I've finished learning most of the language, as in the stuff you find in most tutorials, and I've delved a little into the standard library. I don't know any other major things in C++ that one can learn. In other words, I think I have all the knowledge I'll need for your basic problem.

 

The problem I have with Ruby, Python and others is that I just can't get over interpreted languages. I had a brief look at the two back when I "finished" learning plain C, and it was something so completely alien to my style of thinking that I dropped both of them within 5 minutes. I can't imagine making an application in them, because I just...I don't know, I just lack the imagination to think "interpreter".


I don't know any other major things in C++ that one can learn.

 

Templates? RTTI? STL? Boost?

 

The problem I have with Ruby, Python and others is that I just can't get over interpreted languages. I had a brief look at the two back when I "finished" learning plain C, and it was something so completely alien to my style of thinking that I dropped both of them within 5 minutes.

 

I'm not sure what you're getting at with "interpreted languages." Interpreted vs compiled describes the execution model, not the language itself.

 

There are a number of areas where Python and Ruby differ from a language like C++. Both Python and Ruby are dynamic languages. They are hybrid languages which incorporate imperative elements (à la C++), but they both also borrow heavily from functional programming languages. And of course, both Python and Ruby are garbage collected.


Templates? RTTI? STL? Boost?

 

Yes to 1 and 3, no to 2 and 4, since I'd never heard of them until now. I had a quick look at RTTI and... well... what's it good for? But Boost looks cool.

 

And I confused two things in my head: I meant dynamic, not interpreted. As in, you type in the commands and they execute immediately. I just can't even imagine how one would make, I don't know, a program that accepts an integer x and outputs the first x prime numbers.


I just can't even imagine how one would make, I don't know, a program that accepts an integer x and outputs the first x prime numbers.

 

Well, just for fun, here's a program I wrote in a dynamic language called Erlang which does almost that (it finds all the prime numbers between 2 and X). I'm sure this just looks like gibberish, but I think it effectively demonstrates how little code it takes to accomplish a task like that in a high level language:

 

sieve(X) -> sieve(lists:seq(2, X), []).

sieve([], L) -> lists:reverse(L);
sieve([Prime|T], L) -> sieve([X || X <- T, X rem Prime /= 0], [Prime|L]).

 

This program uses an algorithm similar to the Sieve of Eratosthenes. It's powered by a language feature known as a list comprehension, which Python users may be familiar with and the maths people around here might know as set-builder notation.

 

I think it'd be fun if you implemented a program with the same functionality in a low level imperative language and we compared results :D


sieve(X) -> sieve(lists:seq(2, X), []).

sieve([], L) -> lists:reverse(L);
sieve([Prime|T], L) -> sieve([X || X <- T, X rem Prime /= 0], [Prime|L]).

 

Is that the best you can do? Use a man's language!

 

[math](\sim R \in R \circ.\times R)/R\leftarrow 1 \downarrow \iota R[/math]

 

Sometimes inscrutability isn't all it's cracked up to be.


sieve(X) -> sieve(lists:seq(2, X), []).

sieve([], L) -> lists:reverse(L);
sieve([Prime|T], L) -> sieve([X || X <- T, X rem Prime /= 0], [Prime|L]).

 

Nice. I have no doubt that, were I to do the same thing in C++, it would take a lot more code. But there is another important factor I forgot to state earlier, and that's fun. At my age and with the range of programs I am able to make, I almost never program for the result. I've only done that twice in my life. All the other programs I've made are absolutely useless, and I've never opened them twice. But it was fun. And you have to admit that implementing the Sieve of Eratosthenes in C++ would be a lot more fun than just writing "sieve" on three lines. I think it also explains the divergence of our views quite nicely; I doubt you do much programming for fun anymore, although I admit that's just a guess.

 

[math] (\sim R \in R \circ.\times R)/R\leftarrow 1 \downarrow \iota R [/math]

 

I beg your pardon?


[math](\sim R \in R \circ.\times R)/R\leftarrow 1 \downarrow \iota R[/math]

I beg your pardon?

http://en.wikipedia.org/wiki/APL_(programming_language)#Examples

Scroll down to the final example.

The following expression finds all prime numbers from 1 to R. In both time and space, the calculation is O(R²).

(~R∊R∘.×R)/R←1↓⍳R

From right to left, this means:

  1. ιR creates a vector containing integers from 1 to R (if R = 6 at the beginning of the program, ιR is 1 2 3 4 5 6)
  2. Drop first element of this vector (↓ function), i.e. 1. So 1↓ιR is 2 3 4 5 6
  3. Set R to the vector (←, assignment primitive)
  4. Generate outer product of R multiplied by R, i.e. a matrix which is the multiplication table of R by R (∘.× function)
  5. Build a vector the same length as R with 1 in each place where the corresponding number in R is in the outer product matrix (∈, set inclusion function), i.e. 0 0 1 0 1
  6. Logically negate the values in the vector (change zeros to ones and ones to zeros) (∼, negation function), i.e. 1 1 0 1 0
  7. Select the items in R for which the corresponding element is 1 (/ function), i.e. 2 3 5
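
(To make those steps concrete, here's a rough C++ rendering of the same O(R²) outer-product idea. It's a sketch for illustration only, not a sensible way to find primes.)

#include <iostream>
#include <vector>

// Keep each candidate in 2..r that never appears in the candidates'
// multiplication table -- the same idea as (~R∊R∘.×R)/R←1↓⍳R.
std::vector<int> apl_primes(int r) {
    std::vector<int> candidates;              // R ← 1↓⍳R, i.e. 2..r
    for (int i = 2; i <= r; ++i) candidates.push_back(i);

    std::vector<bool> in_table(r + 1, false); // membership in R∘.×R
    for (size_t i = 0; i < candidates.size(); ++i)
        for (size_t j = 0; j < candidates.size(); ++j) {
            long long p = (long long)candidates[i] * candidates[j];
            if (p <= r) in_table[(size_t)p] = true;
        }

    std::vector<int> primes;                  // the compress step (/)
    for (size_t i = 0; i < candidates.size(); ++i)
        if (!in_table[candidates[i]]) primes.push_back(candidates[i]);
    return primes;
}

int main() {
    std::vector<int> p = apl_primes(30);      // 2 3 5 7 11 13 17 19 23 29
    for (size_t i = 0; i < p.size(); ++i) std::cout << p[i] << ' ';
    std::cout << std::endl;
}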

APL has been derisively called a "write-only language". The problem with any powerful language is it invites the author to write completely inscrutable code. The authors of such code ofttimes learn that being extremely clever can be pretty stupid when they are asked to modify that write-only code six months later.

 

It looks like Erlang is an even better candidate for obfuscation than is C. So, is there an Erlang equivalent to the IOCCC? Answer: Yep. There is.


[math](\sim R \in R \,^{\circ}.\times R)/R\leftarrow 1 \downarrow \iota R[/math]

I beg your pardon?

 

Apparently some people on the talk page of the Wikipedia APL article don't like the primes program cited in the main article: it's hard to understand and it employs bad practices. The following, from the talk page, is much better in that it is much easier to understand and it does follow good programming practices:

 

[math](2=+/[1]0=R\,^{\circ}.|R)/R\leftarrow\iota N[/math]


It looks like Erlang is an even better candidate for obfuscation than is C. So, is there an Erlang equivalent to the IOCCC? Answer: Yep. There is.

 

Erlang has a reputation for ugly syntax, which is a real shame because it has some beautiful semantics.


Nice. I have no doubt that, were I to do the same thing in C++, it would take a lot more code. But there is another important factor I forgot to state earlier, and that's fun. At my age and with the range of programs I am able to make, I almost never program for the result. I've only done that twice in my life. All the other programs I've made are absolutely useless, and I've never opened them twice. But it was fun. And you have to admit that implementing the Sieve of Eratosthenes in C++ would be a lot more fun than just writing "sieve" on three lines. I think it also explains the divergence of our views quite nicely; I doubt you do much programming for fun anymore, although I admit that's just a guess.

 

I love programming for fun; however, I typically program to accomplish a particular goal, not just for programming's sake. With a language like Ruby I can come up with a nifty idea for a web site and have a working prototype finished in a matter of days.

 

When I was younger I certainly loved trying to take an idea and map it directly to the Von Neumann machine model, and being able to do that was fun in and of itself. The straw that broke the camel's back for me was trying to implement a large, complex system in C. I spent 4 years trying to develop a program (unsuccessfully) and eventually came to the realization that C was a poor language for modeling complex problems.


Well, I don't think I'm at the complex problem modeling part of my programming life just yet. I still have lots to learn, as is illustrated by this topic. Erlang sounds interesting; I might have a look at it. I don't really mind ugly syntax, on the contrary; the faster it is to type, the happier I am. TBH, I also love unreadable code, but I often fall into the hole you described with APL. Who knows, maybe I'll end up using something like Brain**** :D

For now, I think I'm going to stick with C++, since it serves my purposes well enough. I'll glance at Java further, see if I can manipulate myself into learning it properly, and after that it's either assembly or Erlang. Probably the latter, since my indifference towards syntax has its limits. Or would you recommend something else?


I think over time we will see C++'s role in game development diminish, particularly as higher level languages are able to make better use of things like CUDA, vector units, and massively multicore processors than C++ programmers can. Indeed game developers are rather perplexed as to how to properly leverage something like the Cell processor, as writing correct concurrent programs in C++ is painstakingly difficult.

 

Huh????????:confused:

 

http://www.blachford.info/computer/Cell/Cell3_v2.html

 

"The primary language for developing on the Cell is expected to be C with normal thread synchronisation techniques used for controlling the execution on the different cores. C++ is also supported to a degree and other languages are also in development (including apparently, Fortran)."

 

http://www.google.ca/search?hl=en&q=Cell+processor+PS3&btnG=Google+Search&meta=&aq=f&oq=


Huh????????:confused:

 

http://www.blachford.info/computer/Cell/Cell3_v2.html

 

The primary language for developing on the Cell is expected to be C with normal thread synchronisation techniques used for controlling the execution on the different cores.

 

Writing correct programs using "normal thread synchronisation techniques" is extremely difficult, hence the final sentence in my statement:

 

Indeed game developers are rather perplexed as to how to properly leverage something like the Cell processor, as writing correct concurrent programs in C++ is painstakingly difficult.

 

You can substitute C for C++ there (although C++ actually provides some niceties for making mutex locking a little more automatic).
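
For what it's worth, this is the kind of nicety I mean: scope-based locking, i.e. Boost's scoped_lock today, or std::lock_guard in C++0x. A minimal sketch using the C++0x names:

#include <mutex>

std::mutex counter_mutex;
long counter = 0;

void increment() {
    // The guard acquires the mutex here and releases it automatically
    // when it goes out of scope, even if an exception is thrown, so you
    // can't forget the matching unlock() the way you can in plain C.
    std::lock_guard<std::mutex> guard(counter_mutex);
    ++counter;
}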

 

It all goes back to the general issue that writing correct concurrent programs is hard. There are a number of languages trying to address concurrency, like Erlang, Haskell, and Clojure. All of these are functional languages with rather novel approaches to concurrency which don't rely on "normal thread synchronisation techniques". That said, Erlang wouldn't work at all on something like the Cell, since its additional cores aren't general purpose CPUs and the ring-based interconnect makes message passing-based concurrency perform poorly. Erlang does, however, work wonderfully on CPUs with large numbers of general purpose cores and a mesh interconnect, like the Tile64.

 

One of my favorite quotes about the difficulties of traditional approaches to threading comes from the book Java Concurrency in Practice:

 

At this writing, multicore processors are just now becoming inexpensive enough for midrange desktop systems. Not coincidentally, many development teams are noticing more and more threading-related bug reports in their projects. In a recent post on the NetBeans developer site, one of the core maintainers observed that a single class had been patched over 14 times to fix threading-related problems. Dion Almaer, former editor of TheServerSide, recently blogged (after a painful debugging session that ultimately revealed a threading bug) that most Java programs are so rife with concurrency bugs that they work only "by accident". Indeed, developing, testing and debugging multithreaded programs can be extremely difficult because concurrency bugs do not manifest themselves predictably. And when they do surface, it is often at the worst possible time--in production, under heavy load.

I do agree, C++ is doomed for the recycling bin. I'm still placing my bets on something we haven't seen yet though................not short term(<10 years) but soon enough. Something fuzzy -ish a whole new platform based on 'revolutionary' recycled techno guber...........

 

with something a little more like this as the programming language with labels and jmp commands

[math](2=+/[1]0=R\,^{\circ}.|R)/R\leftarrow\iota N[/math]

 

...............obviously 'High Level' english prepackaged user dev kits will be available!:P


I do agree, C++ is doomed for the recycling bin. I'm still placing my bets on something we haven't seen yet though................not short term(<10 years) but soon enough. Something fuzzy -ish a whole new platform based on 'revolutionary' recycled techno guber...........

 

I don't foresee a revolution so much as an evolution of existing concepts and making them more usable.

 

with something a little more like this as the programming language with labels and jmp commands

 

Ummm... no, nothing of the sort. Those are extremely imperative concepts which fail in a concurrent environment.

 

To address the issue of concurrent programs, I see a complete abandonment of "normal thread synchronisation techniques" and a transition to evolutions of existing concepts, particularly:

 

  • Software transactional memory
  • Communicating sequential processes
  • Process calculus / join calculus

 

I think languages that handle concurrency well will combine two or more of these within the same environment.


I don't foresee a revolution so much as an evolution of existing concepts and making them more usable.

 

That's why it's 'revolutionary' recycled techno guber! :D

Recycled being crucial to the statement.

 

Communicating sequential processes are an absolute must, or you'll end up with Alien Hand Syndrome.................. :eek:

 

It's been a while since I've dedicated any serious thought to this topic, but I do have something I wish to say, so anyone reading this bear with me.

 

I said something fuzzy-ish because I still expect a sudden departure from the digital era and a new or revamped arrival of analogue to fuzzy computing networks that will be primarily structured upon mathematics. The current processor scheme of moving information from one place to another and doing some basic functions to it will eventually be replaced by something more differential. Instead of keyframing a set of points for an animation, bones and nodes will be assigned mathematical functions or differential equations upon which they will act. Textures will become 3D algorithmic representations. Fonts will all be resized through a numerical model. I know you are looking at the problem from the realistic, here-and-now perspective, but the reality I see is that it's limited in that it will only ever produce a flash movie sequence and never true fluid dynamics. I want to and am going to read more about the Turing test competitors as they are probably the closest to practical functional neural networks. I believe in the reality of Grand Unified Parallel Processing and denounce properly sequenced quantitative serial binary????? I think............:P

 

Database information will still be handled much the same as it is now, but with the amalgamation of weighting (analogue priority sequencing).

 

I mentioned user end dev kits. I see software development being taken out of the hands of programmers (already true) and given to designers. Software development will be done in studios not quite unlike the current game editors, 'sans code.' The home user will have the ability to make software through the same dev suites and customize it to their needs. Flash will one day get rid of ActionScript and will deliver more ready-made solutions with an abundance of resizing options (centre but do not resize upon the user's change of resolution, check check check).

 

Programming will be given to engineers, and there will be a more noticeable split between programmers and designers, or Software Engineers and Computer Science majors with a hint of interior decorator.......... I could be wrong, but I think you like to be one of those who does both. I know I am. I took Game Design and was a little bit upset when they told me I had to choose between designing fun games and programming them.

 

There were mentions in this thread as to the best language to learn right now going into the field, and I think more now than ever anyone in their twenties had better learn a real man's language, and that's math. The second thing to learn (not a language) is organization. These will get any user/programmer through any current and future language, period, no matter how dynamic the syntax or gritty the processing structure.

 

:)

 

huh, I didn't want to go to the gym anyway..............


I said something fuzzy-ish because I still expect a sudden departure from the digital era and a new or revamped arrival of analogue to fuzzy computing networks that will be primarily structured upon mathematics. The current processor scheme of moving information from one place to another and doing some basic functions to it will eventually be replaced by something more differential.

 

I suggest you read about the Church-Turing Thesis. What you're describing isn't computation.

 

While I can see a future where the physical operation of a processor becomes more probabilistic and error prone, you are not going to get away from the notion of a computer being something which has a state it can deterministically act upon. Otherwise, what you have isn't a computer.

 

I believe in the reality of Grand Unified Parallel Processing and denounce properly sequenced quantitative serial binary????? I think............:P

 

I think distributed computing is the future, and that future will come in the form of communicating sequential processes. That's how all distributed computer systems work: they are merely networks of communicating Von Neumann machines. The idea of communicating sequential processes is also immensely effective in leveraging multicore computers.
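
To make "communicating sequential processes" a bit more concrete, here's a toy sketch in C++ using C++0x-style threads: two threads that share no mutable state and interact only by passing messages over a channel. (The Channel class is made up for illustration; a real CSP channel would typically be bounded as well.)

#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>

// A minimal unbounded channel: the only point of contact between threads.
template <typename T>
class Channel {
public:
    void send(T value) {
        { std::lock_guard<std::mutex> lock(m_); q_.push(value); }
        cv_.notify_one();
    }
    T receive() {
        std::unique_lock<std::mutex> lock(m_);
        cv_.wait(lock, [this] { return !q_.empty(); });
        T value = q_.front();
        q_.pop();
        return value;
    }
private:
    std::queue<T> q_;
    std::mutex m_;
    std::condition_variable cv_;
};

int main() {
    Channel<int> ch;
    // The producer and consumer share nothing; they only communicate.
    std::thread producer([&ch] {
        for (int i = 1; i <= 5; ++i) ch.send(i * i);
        ch.send(-1);                                  // sentinel: end of stream
    });
    std::thread consumer([&ch] {
        for (int msg = ch.receive(); msg != -1; msg = ch.receive())
            std::cout << "got " << msg << std::endl;
    });
    producer.join();
    consumer.join();
    return 0;
}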

 

In fact, one of the key things limiting the advancement of multicore computer designs is the fact that multicore CPUs aren't being designed around the communicating sequential processes model. Most multicore CPUs are designed to run a single system image, which adds all sorts of arcane technical requirements to the design, like cache coherency.

 

Database information will still be handled much the same as it is now, but with the amalgamation of weighting (analogue priority sequencing).

 

I foresee the traditional relational database disappearing and getting replaced by simple key value stores, particularly ones based around ideas like distributed hash tables.
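
For anyone unfamiliar, the core trick behind most distributed hash tables is consistent hashing: hash both node names and keys onto the same ring, and a key lives on the first node at or after its hash. A toy sketch (the node names and the FNV-1a hash choice are just for illustration):

#include <iostream>
#include <map>
#include <string>

// Toy consistent-hash ring: nodes and keys hash into the same 32-bit
// space; a key is owned by the first node clockwise from its hash.
class Ring {
public:
    void add_node(const std::string& name) { ring_[hash(name)] = name; }
    const std::string& node_for(const std::string& key) const {
        std::map<unsigned, std::string>::const_iterator it =
            ring_.lower_bound(hash(key));
        if (it == ring_.end()) it = ring_.begin();   // wrap around the ring
        return it->second;
    }
private:
    static unsigned hash(const std::string& s) {     // FNV-1a, good enough here
        unsigned h = 2166136261u;
        for (size_t i = 0; i < s.size(); ++i) {
            h ^= (unsigned char)s[i];
            h *= 16777619u;
        }
        return h;
    }
    std::map<unsigned, std::string> ring_;
};

int main() {
    Ring ring;
    ring.add_node("node-a");
    ring.add_node("node-b");
    ring.add_node("node-c");
    // Adding or removing a node only remaps the keys in one arc of the
    // ring, which is why DHT-based stores can scale out so gracefully.
    std::cout << ring.node_for("user:42") << std::endl;
    return 0;
}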

 

I see software development being taken out of the hands of programmers (already true) and given to designers. Software development will be done in studios not quite unlike the current game editors, 'sans code.' The home user will have the ability to make software through the same dev suites and customize it to their needs. Flash will one day get rid of ActionScript and will deliver more ready-made solutions with an abundance of resizing options (centre but do not resize upon the user's change of resolution, check check check).

 

It sounds like you want a dataflow language. I'm almost reluctant to link this because its creator is a crackpot, but perhaps this is what you're thinking of:

 

http://www.rebelscience.org/Cosas/quicksort.htm

 

COSA is an idea for a "language" (presently unimplemented) where you program with a bunch of drag and drop components with little pins you can plug together.

 

I do not think this is a very practical idea, particularly for developing complex applications.

 

There were mentions in this thread as to the best language to learn right now going into the field, and I think more now than ever anyone in their twenties had better learn a real man's language, and that's math.

 

Math is not a programming language. You're comparing apples to oranges.

 

That said, I've found there are people who think of programming languages more like math (they're typically attracted to functional languages) and people who think of languages as working more like natural language (they're typically attracted to imperative languages). I suppose I lean more towards the natural language side of things. I like my language to be flexible rather than rigid. I'd rather it have a large, expressive grammar than a tiny, minimalistic one. I like having procedures that do things because I tell them to, rather than functions which take only arguments as inputs and do nothing but return a value.

 

Over time I have come to think more like a functional programmer, but I will never think about programming as if it were math.


1 month later...

I'm still looking for the holy grail, a portable language supporting functional/imperative/OO styles. And of course it has to produce fast programs, which is quite important when you're writing Monte Carlo simulations.

 

Anyway, for now I'll stick with C/C++/Python. C++ has to be one of the most inelegant programming languages, but building a class in C++ to deal with tridiagonal stochastic matrices is easy, and it leads to a very clean syntax (and it's fast, too).


I'm still looking for the holy grail, a portable language supporting functional/imperative/OO styles. And of course it has to produce fast programs, which is quite important when you're writing Monte Carlo simulations.

 

It sounds like Scala or OCaml would fit your needs.


How does either OCaml or Scala outperform C++, or make the creation of Monte Carlo simulations any better/easier than when done in C++?

 

Note: I am quite interested in these sorts of applications and, despite any prior arguing on behalf of C++, am truly looking for all the facts.


How does either OCaml or Scala outperform C++, or make the creation of Monte Carlo simulations any better/easier than when done in C++?

 

Well, my comments were specifically in response to the requirements of a hybrid functional/imperative/OO language as laid out by PhDP.

 

The resulting code will probably be slower than its C++ counterpart, but will be free of overflows of all types and will likely be shorter and more maintainable.

 

In certain cases it will be implicitly parallelizable too.

