what's a good programming language to learn?


ecoli


On 12/22/2015 at 3:43 PM, fiveworlds said:

 

JavaScript is really hard to learn and rather poorly documented.

This is a good place to start learning JavaScript: https://developer.mozilla.org/en-US/docs/Learn/JavaScript/First_steps

If you're interested in documentation, here's the complete reference documentation: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference



I have been trying to learn a programming language as well. I have heard that web development work is in much higher demand. I am going to learn PHP, MySQL, jQuery, Java and more. What's your opinion of those programming languages?


On 7/24/2018 at 8:51 AM, Akshar99 said:

I have been trying to learn a programming language as well. I have heard that web development work is in much higher demand. I am going to learn PHP, MySQL, jQuery, Java and more. What's your opinion of those programming languages?

Web development is not really ordinary programming. In web programming, the client is running a browser, such as Chrome or Firefox, which sends page requests to a web server, and the web developer composes HTML pages, using CSS and other notational aids, to describe what the browser should render.
The main programming is then the JavaScript you send to the browser to handle mouse events and things like that. You are really composing in developer languages instead of doing traditional programming.

But for web development, PHP, MySQL, and jQuery are good. JavaScript is more than just the jQuery library, so it is also good. But Java is not for web development. It is more of an enterprise business programming language, except on Android devices. JavaScript is not really related to Java at all.
Some people like JavaScript so much that they also use it on the server side, with Node.js, but that is not the common convention.

You can also use a framework that does most of the work for you, like Drupal, WordPress, etc.
If you do start with a framework, it pays to eventually understand what the framework is doing, so that you can hand-tailor or modify it as necessary later.
I don't consider frameworks sufficient or flexible enough on their own, but they are fast for getting something up and running, and can be instructive.


11 minutes ago, Rigby5 said:

But Java is not for web development. 

Really?


Java has strong support for web development. While Java on the desktop, with the notable exception of Eclipse RCP based application was never a huge success, Java is frequently used at the server side.

http://www.vogella.com/tutorials/JavaWebTerminology/article.html

 


1 hour ago, Strange said:

No, Java is not frequently used.

There are large enterprise shops that already have a team supporting Java, so they also want to do all their web development in Java; so some web development is done in Java, but not much. In fact, most places I have worked, like Intel, HP, and even parts of IBM, do not allow any Java installation. They consider it a security risk and too memory-hungry.

A single person maintaining their own learning site should not get bogged down installing and maintaining Java, which would not help them at all in learning programming or developing web sites.

From your own link it is obvious that Java is a poor choice, as it is more complex than necessary compared to something simple like PHP.

{...

1.4. Java web or Java EE container

Java web applications are typically not running directly on the server. Java web applications are running inside a web container on the server.

The container provides a runtime environment for Java web applications. The container is for Java web applications what the JVM (Java Virtual Machine) is for local running Java applications. The container itself runs in the JVM.

In general, Java distinguishes two containers: the web container and the Java EE container. Typical web containers in the Java world are Tomcat or Jetty. A web container supports the execution of Java servlets and JavaServer Pages. A Java EE container supports additional functionality, for example, distribution of server load.

Most of the modern Java web frameworks are based on servlets. Popular Java web frameworks are GWT, JavaServer Faces, Struts and the Spring framework. These web frameworks usually require as a minimum container a web container.

...}


On 4/13/2009 at 7:07 PM, ecoli said:

Eventually, I want to be able to do some computationally-heavy modeling work, which obviously requires programming knowledge.

 

Is there a specific language that would be good for this type of interest? Where can a newb start learning about building the tools to develop skills for developing stochastic-type models?

 

Thanks!

I started off programming with Python. I found it a relatively easy language to get a grip on: the syntax wasn't difficult to learn, and it also has a wide variety of benefits in the real world. It is a very in-demand language, so there's no shortage of work. I'd certainly recommend it as a language to learn if you've never programmed before!



I think you must learn the C language because it is the base of all languages. Without it, you can't understand the concepts of other languages. If you are good at C, then all the other languages will be easy for you.


29 minutes ago, martingail11 said:

I think you must learn the C language because it is the base of all languages.

Not sure that I agree. Is C really the base of all languages, including functional programming languages (for instance ML)?

32 minutes ago, martingail11 said:

Without it, you can’t understand the concepts of other languages.

I believe it is quite possible to understand the concepts of, for instance, Java or C# without first learning C.


 


1 hour ago, martingail11 said:

I think you must learn the C language because it is the base of all languages.

Nonsense.

1 hour ago, martingail11 said:

Without it, you can’t understand the concepts of other languages.

I am not sure that learning C will help very much with Lisp, Haskell, Prolog, COBOL or even Fortran.

On the other hand, if you learn one procedural language, then some principles of other procedural languages will be clear. But it will make learning a functional or declarative language more difficult.


On 8/29/2019 at 11:35 AM, martingail11 said:

I think you must learn the C language because it is the base of all languages. Without it, you can't understand the concepts of other languages. If you are good at C, then all the other languages will be easy for you.

C is basically portable assembly: the closest thing to programming actual processors (which are all broadly alike, differing only in minor details), not a model of one. It has only the abstractions considered absolutely necessary, enforced or conservatively added as a recommended alternative, preferably in a separate system, to keep the core language clean.

Its simple interface to current and historic hardware makes it the fastest language. Its simple structure, restricted to functions calling other functions with one or no return value, with pointers used to solve the rest, makes it the language easiest to connect with code written in other languages.

That's what you get: a language carefully designed to be portable across different hardware, platforms and processors, without sacrificing memory use or speed. Of course, the price is that you'll have to do everything yourself, and I don't just mean writing the code. You'll have to deal with preprocessor syntax, compiling, linking, and so on.

C takes the Lego-building approach even further than Unix, with its advantages and disadvantages. It has a collection of standard libraries, supposed to contain the functionality considered necessary for everyday programming but not found in a processor's instruction set. And it's worthless.

The freely downloadable libraries that attempt to replace it are better, but a generic library rather defeats the idea of C. You're better off using a language with these things already incorporated into the syntax, object-oriented programming probably being the most important example.

In C, you're riding the lightning as long as you don't call a function you can't trust.

I wouldn't recommend C as a first language, mainly because there are no decent input/output functions in its standard library. How are you supposed to learn a language practically when you can't communicate as a user with your test programs?

Some of them are mastodon functions, almost a language in themselves, too complicated to learn and frequent sources of bugs. Input functions require you to allocate a memory block for them, restricting the length of a line to a fixed size. Some are described as having an undefined return value in certain cases; some, even worse, as having undefined behaviour, for example when a user types too many digits to a scanf() expecting an integer.

So what's the proper way to do it? Well, you can't allocate an infinite number of bytes, or even a block that grows and shrinks freely in memory. It could potentially, or eventually will, hit another block allocated by another program or by your own, and would have to be reallocated (moved to another place in memory).

In C, the generic way to solve it is to decide on a reasonable number of bytes to allocate initially, for example 256 bytes for the input of a line of characters, and a multiplier, for example 2, and write the code yourself. Should the input (or really anything else exceeding a fixed limit; not many things are restricted nowadays, like the maximum of 255 rupees in Zelda) reach the end of your block, you multiply its size, send a pointer to it and tell the operating system that you need more space, reallocating if necessary.
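The allocate-and-double strategy described above might look like this in practice; `read_line`, the 256-byte start and the x2 multiplier are illustrative choices from this post, not fixed rules:

```c
#include <stdio.h>
#include <stdlib.h>

/* Read a line of arbitrary length: start with a small block and
   double it with realloc() whenever the input reaches the end.
   Returns a malloc'd string (caller frees), or NULL on EOF/error. */
static char *read_line(FILE *in)
{
    size_t cap = 256, len = 0;          /* initial allocation; grows x2 */
    char *buf = malloc(cap);
    if (!buf) return NULL;

    int c;
    while ((c = fgetc(in)) != EOF && c != '\n') {
        if (len + 1 == cap) {           /* block full: ask for twice the space */
            char *bigger = realloc(buf, cap * 2);
            if (!bigger) { free(buf); return NULL; }
            buf = bigger;
            cap *= 2;
        }
        buf[len++] = (char)c;
    }
    if (len == 0 && c == EOF) { free(buf); return NULL; }
    buf[len] = '\0';
    return buf;
}
```

Note that realloc() may move the block, which is exactly the "reallocated (moved to another place in memory)" case above, so the code must keep using the pointer realloc() returns.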

Other languages, even C++ with its vector type, do this automatically for you, and probably choose the initial allocation and the multiplier better than you would in all but the extreme cases. I suppose you could say that C is for the extreme cases.

As a language it has some value in standing on solid ground, and in the confidence which that brings, but I estimate that at about 20%, with the other 80% just bringing solutions from different companies together. That may be very interesting for someone sitting on a standardization committee, but not so much for a programmer. After a week of reading up on processors and hardware, you could write your own C language.

If you find a good history book, starting with the C64 and its chips doing work simultaneously with the processor, the Amiga taking it further with its blitter and copper, multitasking and other things made safe with virtual memory implemented in the actual hardware, and much more, let me know.


On 4/13/2009 at 8:07 PM, ecoli said:

Eventually, I want to be able to do some computationally-heavy modeling work, which obviously requires programming knowledge.

Is there a specific language that would be good for this type of interest? Where can a newb start learning about building the tools to develop skills for developing stochastic-type models?

Thanks!

1. If you find one or more applications already more or less specialized in what you need, with an easy-to-understand, fast-to-use, yet powerful interface, check what extensions they offer. Advanced search options and macros are excellent time savers, but eventually you'll get tired of checkboxes, or of saving files with one of your apps and opening them with the other. Or you'll need a macro that calls one of two other macros, which one depending on a condition that includes basic arithmetic in the logic.

All of this can, of course, be fixed in more or less (often less) portable ways with a solution external to your limited applications. Other programmers could give you thirty alternatives: some suggesting installing Cygwin or even Linux itself to use a fifteen-line shell script, some coming up with a creative way to make the Windows Task Scheduler do it for you, some recommending AutoHotkey or AutoIt, languages made for controlling other applications, if necessary by simply clicking on certain coordinates on the screen as a last resort.

But only five will be bored enough, or excited enough about their language, to answer you, since they all know it's a one-time, temporary answer to a long-term problem: you've outgrown your box, or didn't want it in the first place. Should an application have its own language, or be designed upon an existing one to make it automatable, I recommend learning it. You'll always have use for any language when you learn or read code from another, either contemplating why the two differ when they do, or accepting something as a standard when they don't.

2. Learning a first procedural scripting language (after which learning others is "only" a matter of memorizing new names and syntax, or of opening your mind to some useful paradigms, for example object-oriented programming, recursion, or whatever Lisp programmers tell you will change your entire view of programming, which it probably will) is quite a large step, but it is made easier if you have a natural ability for logic, and even made fun if you're able to isolate and appreciate the details you can use in other situations.

Personally I like all forms of BASIC, a language so bad and rudimentary that I get my confidence from writing code in it better than the language itself. But I'm far from a productive programmer; I find patching more interesting. Python is probably the opposite: a well-designed procedural scripting language with the major paradigms added, and with a load of packages because of its popularity, that easily and effectively does anything you want.

There are other scripting languages, and I don't think I'm way off if I claim that Python's popularity is mostly random, or because of its failure (?) to build things in a radically different way from top to bottom. JavaScript has its simplified object orientation, Perl has its many ways to do things and its variables named $_ and so on, making it easy to write very compact code, Tcl has its simplified syntax that a child can understand. Python is boring.

3. As a _complement_ to C, no language is more suited as a scripting language than Tcl, assuming you define scripting as something fast and easy to both learn and use, and accept that the compiled or interpreted code will on average be about ten to twenty times slower than other scripting languages (more of a guess than a good estimate; check benchmarks yourself). As an _alternative_ to C, there are plenty of choices, all of them unholy combinations of programming and scripting.

In Tcl, the number of digits in an integer is limited only by memory running out. That has been built in since version 8.5, including the ** operator that does "to the power of" on integers. In JavaScript you'll have to determine whether a variable or operation could exceed the processor's largest type, which depends on what machine the script is running on. Or load and use a bignum library, making your code wordier (but not less efficient), possibly something like `bignum.add(x, y);`. Or assume that everyone has a 64-bit processor by now and that 18,446,744,073,709,551,616 is enough for anything.
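For comparison, here is roughly what checking against the processor's largest type looks like when integers are fixed-size, as in C; `add_checked` is a hypothetical name, and this is one portable formulation among several:

```c
#include <stdint.h>

/* With fixed-size integers, overflow must be checked by the programmer.
   Signed overflow is undefined behaviour in C, so the check compares
   against the limits *before* adding rather than testing the result. */
static int add_checked(int64_t a, int64_t b, int64_t *sum)
{
    if ((b > 0 && a > INT64_MAX - b) ||
        (b < 0 && a < INT64_MIN - b))
        return -1;              /* result would not fit in 64 bits */
    *sum = a + b;
    return 0;
}
```

An arbitrary-precision language like Tcl performs a check of this kind (or something equivalent) on every operation, which is the overhead discussed below; in C you pay for it only where you write it.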

The only disadvantage of Tcl is the overhead. Should the two numbers be small enough to be stored and added with a single processor instruction rather than an algorithm, I'm sure that Tcl does it more efficiently than JavaScript, but the freedom (abstraction) of not caring about when or whether an integer passes the system's limit, going beyond it or back below it again, means that Tcl has to check first every time.

I haven't read the source code of Tcl, but I figure that the most efficient approach in a simplified case could be a bit/byte/word attached to every integer. Checking that with one instruction and conditionally jumping with another is already 3 instructions rather than 1, and that has only solved storage, not operations on two arbitrary integers. Perhaps it's all done with an algorithm internally, in which case ten to twenty times slower than other languages is a reasonable guess.

Part of why I love Tcl is that decisions like these are taken seriously, not just plastered onto a language that got its 15 or 1500 minutes of fame because of one new concept. Part of it is portability made so easy that it almost becomes interesting, between character sets, line endings and more. Part of it is a minimal instruction set and a minimal syntax, with a script being defined as a string containing one or more commands and a command as a string containing its name followed by zero or more arguments. I'll praise the documentation when it changes the definition of a script to the correct _zero_ or more commands.

This makes quoting correctly, including avoiding the traps, the big issue when learning Tcl. And when you've mastered that, the command-driven system sometimes tries to resemble shell programming too much, and isn't entirely clean. Some functions treat the character ~ specially; some need -- to signal the end of switches (which all start with -). This is true in other languages too, but it should have been thrown out and burned somewhere a long time ago in a flexible scripting language with the slogan "everything is a string".

The most common problem that new programmers stumble upon and ask about is caused by treating a list as a string or vice versa. A list is one of Tcl's data types. You may treat it as a string, reading from or writing to it, but you shouldn't assume anything more about the internal structure of the storage of its elements than that a space separates them, used in source code for convenience by typing {ding dong 123} rather than building the list with a command. Don't use string functions on lists, and don't use list functions on strings.

The expression "everything is a string" from Tcl's early days probably meant that the language was type-free and did conversions automatically. Perhaps everything actually _was_ stored internally as a string of characters. `set a 2` is no different from `set a "2"`: after the parser has removed the quotes, the same two arguments are sent to the set command. And 42 in an arithmetic expression is no different from "4$a" when $a holds 2.

JavaScript especially, with its flexible OOP, solves this in a more beautiful and sophisticated way. Integers and strings are considered two different types of data, with conversion methods between them built into the language, and easily extended or replaced. Should the third argument to your arithmetic function be "eat my shorts", it's up to you what to do with it. The default method will probably cause an error that propagates down through the stack of calls until one of them catches it.

Tcl will do the same, and it has the same abilities to handle it in any way, but they are more complicated in syntax and program flow, often using one or more commands from the extreme outlands of the language, because of its original assumption that every string (text) corresponds to an integer. Add other data types to that, for example converting a clown object to a bridge object, and you'll realize that JavaScript is better suited for writing larger programs.

