what's a good programming language to learn?


ecoli

I think the issue between D H and bascule here is merely a difference of willingness to use what already works to its full potential vs. willingness to learn new concepts and techniques to leverage them to their potential. The former requires less initial effort on the part of the programmer (less learning curve), but may mean uglier code and other disadvantages in the long run. The latter requires more learning but can make programming and debugging faster at the expense of some execution time.

 

Tradeoffs. What are you willing to lose?


A few questions to help you narrow things down, ecoli:

  • What domain are you interested in? I do not mean stochastic modeling; that is far too broad. I mean something like atmospheric modeling, chemical modeling, biological systems, ...

Biological systems for sure. Probably infectious disease transmission and/or evolution.

 

Also I'm interested in social science research and economics.

 

 

  • Academia or industry?
academic

 

  • Does one, maybe two, languages dominate in that field? If so, you know what you eventually need to learn.

I have no idea, I'm a complete newb.

 

It would also behoove you to learn something of the art of computer programming. Scientists and engineers for the most part are quite lousy at programming because they have either learned it on their own or have learned it with the aid of other (equally inept) scientists and engineers.

 

would you recommend a class then? That could be possible


would you recommend a class then? That could be possible

 

Structure and Interpretation of Computer Programs (a.k.a. SICP) is almost universally regarded as the best introductory text on programming within the computer science community.

 

If you click the link you will find their complete video lecture series available for download, taken from an actual class taught at MIT.


Biological systems for sure. Probably infectious disease transmission and/or evolution. Also I'm interested in social science research and economics.

You are in college. There are lots of people there who use computers to do exactly what you are talking about. Rather than asking us what a good language to learn would be, ask them. Ask your advisor. Go snooping around in labs and ask the RAs what they are using. Find out who is publishing papers in this domain at your school and ask those authors.

 

Don't ask us. All you will do is provoke religious wars and get a bunch of wrongheaded answers.

 

would you recommend a class then? That could be possible

Absolutely.


I think the issue between D H and bascule here is merely a difference of willingness to use what already works to its full potential vs. willingness to learn new concepts and techniques to leverage them to their potential. The former requires less initial effort on the part of the programmer (less learning curve), but may mean uglier code and other disadvantages in the long run. The latter requires more learning but can make programming and debugging faster at the expense of some execution time.

 

Tradeoffs. What are you willing to lose?

That is an insult. I am more than willing to learn new concepts. I try to learn a new language every year or so. I can easily list twenty-plus languages that I have learned over the last thirty years. I did not count several completely unmemorable ones whose names I can no longer remember.

 

I'm a fan of writing correct programs quickly.

... and sloppily, and more or less by yourself. You are in academia, and you are in a computer science department. Sloppiness and working in very small teams are almost a given. Some elements of computer science used to address issues of reliability, maintainability, understandability, verifiability, traceability, cost, and above all, developing and sustaining a large set of knowledgeable workers. Some computer science departments still do concern themselves with such issues. Many no longer do; those boring, real-world concerns are now addressed by a rather new discipline, software engineering.

 

I think the issue between D H and bascule ...

The above gets at what the real issue is. It is a modern version of Snow's Two Cultures. Bascule and I are from two very, very different cultures.

 

I worry about reliability and all that crap. Bascule worries about slapping crap out quickly. Both concerns are crap, but hey, it's the crap we have to worry about.

 

I worry about competing with other companies. Bascule worries about competing with others in academia. These are very different kinds of pressure that lead to very different world views. Different cultures.

 

I worry about being able to hire scientists and engineers who have a rather limited concept of computer programming, computer science, and software engineering. Bascule worries about keeping up with the state of the art in computer science. Again, very different cultures.

 

I was a part of the 1980-1987 AI revival. I learned several AI languages and made some inroads into applying AI at NASA (one of my programs helped keep a Shuttle flight flying after a major on-board failure). I was also caught up in the AI winter that followed that revival. One reason for that AI winter was that most scientists and engineers could not grok Lisp, rule-based reasoning, or, heaven forbid, backward chaining. The people coming out of schools who could understand logic programming could do Blocks World just fine but were for the most part completely incompetent when it came to real-world applications. There never was a sufficient mass of people who could bridge the different cultures and produce success stories. AI went into a massive decline because of a lack of success.

 

Academic computer science and real-world science and engineering are very, very different cultures. Bascule comes from the former while I come from the latter.


You are in college. There are lots of people there who use computers to do exactly what you are talking about. Rather than asking us what a good language to learn would be, ask them. Ask your advisor. Go snooping around in labs and ask the RAs what they are using. Find out who is publishing papers in this domain at your school and ask those authors.

I currently work in a department that uses zero computation, aside from the odd statistical test. I'll poke around, however.

 

Don't ask us. All you will do is provoke religious wars and get a bunch of wrongheaded answers.

I haven't read an answer here that I would characterize as "wrong" yet.


That is an insult. I am more than willing to learn new concepts. I try to learn a new language every year or so. I can easily list twenty-plus languages that I have learned over the last thirty years. I did not count several completely unmemorable ones whose names I can no longer remember.

No insult intended. I suppose your point is that while you have learned all these other languages, you find that the traditional stuff works best and there's no need for Haskell or any of the other "new" languages brought up by bascule?

 

What is really called for here is a direct comparison: get some programmers expert in C/C++, some expert in Haskell or Erlang or some other similar language, and give them a task. Time how long it takes both for development and program execution and see what the tradeoffs are. (For added realism, then give them a new task that involves updating the old program to new requirements, and see how easily the code can be updated.)

 

My suspicion is that the "new" languages make programming faster (to an extent) but program speed slower, unless you're sufficiently clever. But I'd have to try it.

 

 

... and sloppily, and more or less by yourself. You are in academia, and you are in a computer science department.

From what I recall, bascule works for an Internet company. But I'll let bascule speak for himself in that department.


I haven't read an answer here that I would characterize as "wrong" yet.

Honestly I don't think you can pick a "wrong" language to learn. All will be useful to you in some way, and learning one language will make it infinitely easier to learn the next -- my knowledge of the workings of Java comes from the object model in PHP, for example. There are so many similar concepts across languages that learning one will help prepare you for learning more.

 

So take a look through the languages mentioned here, look up some code in those languages, and try reading a tutorial or two from each. (Don't let the look of the code scare you. I know Lisp scared me when I first saw it, but it's actually pretty cool.) If you go into anything computational you'll end up learning several languages anyway, so it doesn't really matter which one you start with.


My suspicion is that the "new" languages make programming faster (to an extent) but program speed slower, unless you're sufficiently clever. But I'd have to try it.

 

But that's beside the point; again, it has very little to do with the actual value of the languages. The languages bascule named might be great, but nobody uses them in science. Scientists have to work in groups, they have to publish articles, and they very often have to rely on libraries. Who cares if your Haskell program is better than my C++ program? You'll probably need to write a version in C for publication or to cooperate with other people. It's already annoying when scientists publish articles filled with words nobody has used in decades; at least they don't write their programs in some obscure language.

 

In my field (population genetics/molecular evolution), programs are written in C and C++, and a minority are written in Java. Theoretical scientists very often use Python for simple programs... and I encounter R a lot when I have to deal with phylogenetics and statistics. That's about it. With the exception of R, all these languages are derived from C.

 

That's the crushing force of inertia bascule talked about, which isn't so bad in truth. Communication is very important in science, and it would be messy if we were to follow all the fashions in computer science. I wish more was done with managed languages (Java & C#); they're fast enough for anything that doesn't need to run on a supercomputer. And honestly, there's much hype surrounding functional languages these days, and I personally like the concepts, but I doubt we're going to see a truly functional language in the TIOBE top 10 in the next few years.

 

Landau wrote an article about computational physics, but I think it's useful for other scientists as well:

 

In my view, Java's attention to precision, useful error messages, and object-orientation make it good for scientific computing, while its universality, free compilers, and use in the commercial sector make it popular with students. However, it is not as efficient or as well supported for HPC and parallel processing as are FORTRAN and C, the latter two having highly developed compilers and many more scientific subroutine libraries available. FORTRAN, in turn, is still the dominant language for HPC, with FORTRAN 90/95 being a surprisingly nice, modern, and effective language; but alas, it is hardly taught by any CS departments, and compilers can be expensive. C, and its object-oriented brother C++, are good for HPC, have good free compilers and libraries available, but may be too flexible with precision and memory access for beginners to learn good scientific programming practices. Python, a new, interpreted language that has garnered a small but devoted following, is free, user friendly, good for beginning programming, and has a nice 3D graphics library.

 

Link to the complete article: http://physics.orst.edu/~rubin/Papers/CP-2.pdf


I'm a fan of writing correct programs quickly.

 

... and sloppily, and more or less by yourself. You are in academia, and you are in a computer science department. Sloppiness and working in very small teams are almost a given.

 

You seem to be rather confused about what I do.

 

I don't work in a computer science department. I work for an Internet TV company. While I work with a small team (three others besides myself working on our core product), there is absolutely nothing sloppy about what I do. I am not in academia. I am in the real world, developing products that have real-world deployments.

 

We have a rather rigorous approach to development with extensive emphasis on testing. We do both unit testing and integration testing, with a continuous integration testing tool checking all our code every time we check it in.

 

Our software is modularized and packaged for automated deployment. Error reports in production are emailed to us automatically.

 

The above gets at what the real issue is. It is a modern version of Snow's Two Cultures. Bascule and I are from two very, very different cultures.

 

What culture is it that you work in exactly? I suspect the environment I work in is much more rigorous and demanding than yours.

 

I worry about reliability and all that crap.

 

We absolutely worry about reliability, which is one of the great things about working in a high level language: many of the problems which make systems unreliable are abstracted away.

 

We're working on internet-connected televisions and set top boxes. These are the sorts of devices people turn on and expect to work, all the time. There is very little room for error. If there is a problem with our service then the customer will call one of our partners' call centers, which costs them money.

 

Our SLA stipulates 99.9% uptime for our services; 0.1% of the 8,760 hours in a year works out to under nine hours of downtime per year, total. That isn't exactly easy to achieve.

 

Bascule worries about slapping crap out quickly.

 

Time to market is of immense concern to us. However, rapid development doesn't mean "slapping crap out quickly". It means using a more rigorous, test-driven approach to building your programs and keeping your debugging workflow as simple as possible. It also means working in a high level language that prevents the sorts of bugs that occur constantly in low level environments.

 

I worry about competing with other companies. Bascule worries about competing with others in academia.

 

We have many competitors, and Ruby on Rails gives us a decided advantage in terms of developing quickly. To quote Paul Graham, we are "beating the averages" by working in a high level language, although instead of Lisp we are using Ruby.


The languages bascule named might be great, but nobody uses them in science.

 

Au contraire:

 

http://www.haskell.org/pipermail/glasgow-haskell-users/2009-April/017050.html

 

That's just one example...


  • 2 weeks later...

This was a pretty interesting article on why C and C++ are terrible languages for scientific/numerical computing:

 

http://scienceblogs.com/goodmath/2006/11/the_c_is_efficient_language_fa.php?

 

Data parallel arrays were certainly one example... you can do this on an architecture like CUDA, if you code for CUDA.

 

With a language like Haskell you can get data parallelism which abstractly compiles to a GPU through a package like Obsidian.

 

And therein lies yet another argument for functional programming: when new technology like CUDA comes along, scientists whose code is written in C/C++ will need to put forth tons of effort to get it to leverage the CUDA architecture, and then their code will be bound to the CUDA architecture.

 

With a language like Haskell, your code is independent from whatever crazy data parallel execution environments hardware designers can dream up. The compiler can figure it out automagically, whereas with a language like C/C++ you're left to your own devices.
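 

To make that concrete, here is a minimal sketch of the "describe the computation, let the runtime decide how to split the work" idea. It uses plain Haskell with the parallel package's Control.Parallel.Strategies on the CPU rather than Obsidian or any GPU back end, and the function and numbers are made up purely for illustration:

import Control.Parallel.Strategies (parListChunk, rdeepseq, using)

-- An embarrassingly data-parallel job: apply a pure function to every
-- element of a list. The algorithm says nothing about how the work is
-- divided; the evaluation strategy on the marked line decides that.
step :: Double -> Double
step x = sin x * cos x + sqrt (abs x)

main :: IO ()
main = do
  let xs = [0, 0.01 .. 10000] :: [Double]
      -- evaluate the mapped list in parallel, in chunks of 4096 elements
      ys = map step xs `using` parListChunk 4096 rdeepseq
  print (sum ys)

Compile with ghc -threaded and run with +RTS -N, and the same map is spread across however many cores are available; a GPU back end would plug in at the same level of abstraction, without touching the map itself.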


  • 4 weeks later...
  • 1 month later...
This was a pretty interesting article on why C and C++ are terrible languages for scientific/numerical computing:

 

http://scienceblogs.com/goodmath/2006/11/the_c_is_efficient_language_fa.php?

 


 

http://msdn.microsoft.com/en-us/concurrency/default.aspx

 

http://blogs.msdn.com/nativeconcurrency/archive/2008/10/28/visual-studio-2010-ctp-available-including-the-concurrency-runtime-parallel-pattern-library-and-asynchronous-agents-library.aspx

Oooh, parallel vectors! Maybe they'll have fixed some resizing issues as well.

 

I'm sure this separation won't be the defining end of C++!


 

Note that nothing presented on either of those sites will help you target an architecture like CUDA.


So... ecoli, what was your choice after all?

 

I had to deal with both Java and C++ in the last couple of days, and I must say that programming in Java was so much easier (thanks to NetBeans). I will likely have to teach basic programming pretty soon, and I will likely go with Java: I want the students to be able to concentrate on the science problem they have to solve, not spend their time worrying about memory and pointers.


  • 3 weeks later...

PHP isn't close to C at all. PHP's closest relatives are shell scripting languages such as Perl, awk, and sed. PHP contains some functions with names similar to C functions; however, PHP remains a dynamically typed (barely), garbage-collected language with many extreme cases of type coercion. It's the only language I know where "3 dog night" + 2 = 5.

 

That said, PHP is probably a very bad language to start with. It's a horrible amalgamation of different languages fraught with bad design decisions by its creators.


How is Prolog? Has anyone tried that around here?

 

I think logic languages are an interesting thought experiment but generally impractical for most applications. I've seen some people attempt herculean tasks with logic languages and get extremely frustrated when they can't quite pull them off.
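 

For anyone curious, here is a toy sketch of backward chaining (the inference style Prolog is built around), written in Haskell rather than an actual logic language; the facts and rules are invented for illustration, and real Prolog adds variables and unification on top of this:

-- Backward chaining over propositional facts and rules: a goal holds if it
-- is a known fact, or if some rule concludes it and every premise of that
-- rule can itself be proved.
type Fact = String

data Rule = Rule { conclusion :: Fact, premises :: [Fact] }

facts :: [Fact]
facts = ["has_feathers tweety", "lays_eggs tweety"]

rules :: [Rule]
rules =
  [ Rule "is_bird tweety" ["has_feathers tweety", "lays_eggs tweety"]
  , Rule "can_fly tweety" ["is_bird tweety"]
  ]

prove :: Fact -> Bool
prove goal =
  goal `elem` facts
    || any (all prove . premises) [r | r <- rules, conclusion r == goal]

main :: IO ()
main = print (prove "can_fly tweety")   -- True

The appeal is that the program is little more than the rules themselves; the frustration described above starts when a real problem no longer fits that shape.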


  • 2 weeks later...

I was a programmer for 20 years... well, more or less -- titles included: application programmer; systems programmer; systems/network administrator; Manager of QC; Software Engineer; Software Architect; MIS.

 

IMHO the long answer is... it depends on what you wish to accomplish.

 

The short answer... or, best general, all around answer... or if I had to name just one is... C# (with Java a close second).

 

Again, IMHO.

Rusty

