
Everything posted by bascule

  1. bascule replied to iNow's topic in Politics
  2. bascule replied to iNow's topic in Politics
    For those of you who didn't see it, Jon Stewart just performed a brilliant parody of Glenn Beck: http://www.thedailyshow.com/watch/thu-march-18-2010/conservative-libertarian Central to his argument against Beck: slippery slopes slip both ways. Beck paints doom and gloom of a Nazi/Stalinist totalitarian state while never mentioning the possibility of a theocratic totalitarian state. While saying progressive values "lead to" the former, he never addresses the possibility that religious values may "lead to" the latter. Oh yes, and libertarians are lying Aryan Berts! (I guess I'm one of them, although I'm a liberaltarian, so I'm still a lying Aryan but at least I'm not EVIL BERT)
  3. bascule replied to iNow's topic in Politics
    It seems the FOX failure has crept into the forums as well!
  4. bascule replied to iNow's topic in Politics
    FOX News: we have technical difficulties with our satellite link when Democrats speak
  5. Oh my, when I tried to read that it broke my brain.
  6. I've finished The Difference Engine, which was, well... a good read, albeit strange. I'm now reading The Diamond Age by Neal Stephenson, my second Stephenson book after Snow Crash. It's a story about how the advent of molecular nanotechnology will affect the world, sort of like a fictional future based on Eric Drexler's Engines of Creation.
  7. I've begun reading (for the second time, because I didn't finish it the first time) The Difference Engine by William Gibson and Bruce Sterling. This book is one of the progenitors of the whole idea of "steampunk", and posits an alternate past in which Charles Babbage managed to bring on the computer revolution in the 19th century through the use of "engines": mechanical computers.
  8. Personally I think PHP is a terrible first language. In general it's messy and incohesive, and its semantics are rather ad hoc as opposed to being planned. It has some of the worst type coercion I've ever seen ("3 dog night" + 2 = 5?!!!). Object orientation was tacked on after the fact and is not used by the standard library. The standard library is enormous, with many functions that duplicate each other, and determining which version is best for the task at hand is often difficult (should I use mysql_real_escape_string instead of mysql_escape_string?). Worse, writing things yourself in PHP is often better than using the hooks into C libraries provided by the standard library, which is totally counterintuitive (I assume because the runtime used to be even slower than it is now). The PHP interpreter is also extremely slow (in some cases slower than the standard Ruby interpreter), yet the language is not featureful and expressive like Ruby, so you're really getting the worst of both worlds: a slow, low-quality language. I would suggest learning another language like Python or Ruby first, then moving to PHP later.
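    As a rough illustration of what that coercion is doing (sketched in Python, since that's what I'd suggest learning first; the helper here is purely hypothetical, not anything PHP actually exposes):

        import re

        def php_style_add(a, b):
            # Mimic PHP's loose '+': coerce a string to its leading numeric prefix, or 0 if none.
            def coerce(x):
                if isinstance(x, str):
                    m = re.match(r"\s*[+-]?\d+", x)
                    return int(m.group()) if m else 0
                return x
            return coerce(a) + coerce(b)

        print(php_style_add("3 dog night", 2))  # 5 -- whereas Python itself would raise a TypeError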
  9. Well, that said, I can't help but link to what I think is the greatest source of video lectures on computer programming: the Sussman/Abelson SICP lectures. While the course is aimed at MIT freshmen, I wouldn't recommend it for absolute beginners. If you are a well-seasoned imperative programmer who's wondering what all this functional stuff is about, though, this is a great series of lectures and I would love for you to watch it.
  10. Eh? Time limit? Python is free and certainly doesn't have any kind of time limit. What exactly did you install?
  11. I think logic languages are an interesting thought experiment but generally impractical for most applications. I've seen some people attempt herculean tasks with logic languages and get extremely frustrated when they can't quite pull them off.
  12. PHP isn't close to C at all. PHP's closest relatives are shell scripting languages such as Perl, awk, sed, etc. PHP contains some functions with names similar to C functions, but PHP remains a dynamically typed (barely), garbage-collected language with many extreme cases of type coercion. It's the only language I know where "3 dog night" + 2 = 5. That said, PHP is probably a very bad language to start with. It's a horrible amalgamation of different languages, fraught with bad design decisions by its creators.
  13. The CUDA SDK is aimed at C, although as I said earlier, it can be targeted by any language with a compiler that can output CUDA opcodes, like Obsidian in Haskell.
  14. Note that nothing presented on either of those sites will help you target an architecture like CUDA.
  15. Yes, MIT recently switched to Python as the introductory language for its CS courses (for purely practical reasons, too)
  16. bascule replied to iNow's topic in Politics
    Barack Obama got a regular cheeseburger, extra well done, and he didn't get good old American ketchup, he got... mustard, and not just regular mustard... spicy mustard... DIJON MUSTARD. And Hannity found out: http://gawker.com/5244126/obama-orders-burger-with-elitist-european-condiment?skyline=true&s=i Enjoy your FANCY BURGER, MR. PRESIDENT! Apparently it isn't just Hannity playing up the Grey Poupon angle: http://mediamatters.org/research/200905070031 As a fan of spicy/Dijon mustard, I'm not sure what the deal is, although I prefer the store brand to Grey Poupon. Obama just wants a SPICY BURGER. Apparently to Fox News that's a cardinal sin.
  17. This was a pretty interesting article on why C and C++ are terrible languages for scientific/numerical computing: http://scienceblogs.com/goodmath/2006/11/the_c_is_efficient_language_fa.php Data-parallel arrays were certainly one example... you can do this on an architecture like CUDA, if you code for CUDA. With a language like Haskell you can get data parallelism which abstractly compiles to a GPU through a package like Obsidian. And therein lies yet another argument for functional programming: when new technology like CUDA comes along, scientists whose code is written in C/C++ will need to put forth tons of effort to get it to leverage the CUDA architecture, and then their code will be bound to the CUDA architecture. With a language like Haskell, your code is independent of whatever crazy data-parallel execution environments hardware designers can dream up. The compiler can figure it out automagically, whereas with a language like C/C++ you're left to your own devices.
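    To illustrate the general point (this is just the map abstraction across CPU cores, not CUDA or Obsidian): because the work is expressed as a pure map over the data, the exact same function can be run serially or handed to a parallel map without touching the algorithm. A toy Python sketch:

        from multiprocessing import Pool

        def step(x):
            # Pure per-element computation: no shared state, so the mapping
            # can be distributed however the runtime sees fit.
            return x * x + 1.0

        data = [float(i) for i in range(1000)]
        serial = list(map(step, data))      # ordinary sequential map

        if __name__ == "__main__":
            with Pool() as pool:            # same function, mapped across CPU cores
                parallel = pool.map(step, data)
            assert serial == parallel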
  18. "... and sloppily, and more or less by yourself. You are in academia, and you are in a computer science department. Sloppiness and working in very small teams are almost a given." You seem to be rather confused about what I do. I don't work in a computer science department. I work for an Internet TV company. While I work with a small team (three others besides myself on our core product), there is absolutely nothing sloppy about what I do. I am not in academia. I am in the real world, developing products that have real-world deployments. We have a rather rigorous approach to development with extensive emphasis on testing. We do both unit testing and integration testing, with a continuous integration tool checking all our code every time we check it in. Our software is modularized and packaged for automated deployment. Error reports in production are emailed to us automatically. What culture is it that you work in, exactly? I suspect the environment I work in is much more rigorous and demanding than yours. We absolutely worry about reliability, which is one of the great things about working in a high-level language: many of the problems which make systems unreliable are abstracted away. We're working on internet-connected televisions and set-top boxes. These are the sorts of devices people turn on and expect to work, all the time. There is very little room for error. If there is a problem with our service then the customer will call one of our partners' call centers, which costs them money. Our SLA stipulates 99.9% uptime for our services (i.e. less than nine hours of downtime per year, total). That isn't exactly easy to achieve. Time to market is of immense concern to us. However, rapid development doesn't mean "slapping crap out quickly". It means using a more rigorous, test-driven approach to building your programs and keeping your debugging workflow as simple as possible. It also means working in a high-level language that prevents the sorts of bugs that occur constantly in low-level environments. We have many competitors, and Ruby on Rails gives us a decided advantage in terms of developing quickly. To quote Paul Graham, we are "beating the averages" by working in a high-level language, although instead of Lisp we are using Ruby. Merged post follows: Consecutive posts merged. Au contraire: http://www.haskell.org/pipermail/glasgow-haskell-users/2009-April/017050.html That's just one example...
  19. Structure and Interpretation of Computer Programs (a.k.a. SICP) is almost universally regarded as the best introductory programming text within the computer science community. If you click the link you will find the complete video lecture series available for download, taken from an actual class taught at MIT.
  20. I'm a fan of writing correct programs quickly. Utility is in the eye of the beholder. Ever use YouTube? You can thank Python for that one. One of the more interesting apps I've seen demonstrated at our local Ruby group is a web application for controlling the power grid. That's some utility... literally. I'm a polyglot who doesn't fall victim to the Blub Paradox:
  21. http://shootout.alioth.debian.org/u32/benchmark.php?test=nbody&lang=all Interesting to see that Fortran only barely ekes out a lead over Scala... Haskell is only a little more than half as slow as Fortran. To reiterate from my previous post: well, Haskell was the only language you listed which I was advocating for numerical computing, so let's look at some of it:
        do m <- foldr (.+.) (Vec 0 0 0) `fmap`
                  (mapM momentum . take (nbodies - 1) . iterate next $ next planets)
           setVec (vel planets) $ (-1/solar_mass) *. m
          where momentum !p = liftM2 (*.) (mass p) (getVec (vel p))
    Hmm, maps and folds... not particularly procedural at all. In fact, that's pretty much quintessential functional programming right there. Colloquially (thanks to Google) this approach is typically described as a "MapReduce" and is an extremely easy operation to parallelize. There are ways to write imperative/procedural-style code in Haskell, using things like the state monad. This program does not use them. Merged post follows: Consecutive posts merged. Here's an interesting success story using Haskell for scientific modeling: http://www.haskell.org/pipermail/glasgow-haskell-users/2009-April/017050.html
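    For anyone who doesn't read Haskell, the same map-then-fold shape looks roughly like this in Python (the bodies and vector helpers below are made up purely for illustration):

        from functools import reduce

        def add_vec(a, b):
            # The fold step: component-wise vector addition.
            return tuple(x + y for x, y in zip(a, b))

        def momentum(body):
            # The map step: mass * velocity for a single body.
            mass, velocity = body
            return tuple(mass * v for v in velocity)

        bodies = [(1.0, (0.1, 0.2, 0.3)), (2.0, (0.0, -0.1, 0.4))]

        # Map each body to its momentum, then fold the results together.
        total = reduce(add_vec, map(momentum, bodies), (0.0, 0.0, 0.0))
        print(total)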
  22. It depends which ones you're talking about. Functional languages like Haskell and OCaml perform just fine: http://shootout.alioth.debian.org/u32/benchmark.php?test=all&lang=all&box=1 Neither of those languages is interpreted. Both are compiled and execute as native code, and both provide unboxed integers and floats, making them great for numerical computing. I'm not particularly a fan of Scala as a language, but I've certainly heard of it being used in scientific computing. Having worked with a scientific model, I can say its operation followed this basic pattern: [math]t_1 = F(t_0)[/math] [math]t_2 = F(t_1)[/math] [math]t_3 = F(t_2)[/math] Or, that is to say, the state at each timestep is the result of a functional transformation of the previous timestep, an approach which maps quite well to functional languages (a toy sketch of this pattern appears at the end of this post). Scientists think procedurally because they were taught to think procedurally. Again, it's a cultural thing. And moreover, the systems being modeled are described in terms of pure functions, are they not? Is it honestly easier to take a declarative description in the form of pure functions and translate it into an imperative procedure? Or is that just what everyone is used to... Yes, and COBOL was written for the business community, but even they eventually figured out it's retarded. Merged post follows: Consecutive posts merged. Yes, depending on what you're doing, Java can be faster than C++ by virtue of its ability to automatically profile and recompile itself at runtime. And that's exactly what I said above: scientists are stuck in the 1980s when it comes to programming languages, simply because there's so much inertia dragging them down into low-level imperative land. Fortunately I've seen gradual movement away from Java. My university was very much a Java school in the past but has since become far more polyglot-oriented. MIT is switching to Python for its introduction to programming classes. Java is, sadly, in an "uncanny valley" of abstraction: not high-level enough to model problems well, and not low-level enough that students actually understand what's going on behind the scenes. This makes Java a particularly bad language to start out with. It's easy to think of modeling speed solely in terms of execution performance, but there are many hidden time costs to consider. Can your model resume where it left off if it crashes? If not, all the time that was spent running the model is wasted every time it crashes. How long is the debugging cycle? When you buy a new cluster, how much time is spent getting scientists up to speed on it? How often do you switch compilers, and how much time does it take to work out the kinks whenever you make a compiler change? How long does it take to add new features to the model, or merge existing codebases? Where are the bottlenecks in your model, and how difficult is it to address them? One model I worked on had sinusoidal CPU usage because it would alternate between periods of being CPU bound and I/O bound. Nobody could fix this because nobody knew how: the codebase was a gigantic mound of spaghetti and trying to make changes to core behavior was horrible black voodoo. This is the traditional "I'll write it in C... FOR SPEED!" trap, when really the amortized speed is a lot slower thanks to the amount of time it takes to address problems stemming from the language's low-level nature. You can laugh at Ruby if you want and try to argue it has no place in scientific computing, but really it's an issue of right-tool-for-the-job.
While I would never advocate it or any other interpreted language for numerical computing (or even a JITed language with boxed integers/floats), Ruby is an excellent automation language. These maps are generated using a system which was automated using Ruby: http://www.ngdc.noaa.gov/dmsp/night_light_posters.html I used Ruby frequently at my previous (scientific computing) job as an automation language for everything from scheduling jobs and letting scientists automate parameter tweaking for their runs to serving the data we generated through our web site. Again, right tool for the job. Yep, and that's exactly what I said before: Understand that this is the only good argument for using Fortran. You use Fortran because it's what everyone else is and has been using and what your legacy code is written in. It's not as if you have magical requirements that are different from anywhere else in the computing world which Fortran as a language fulfills. You're just doing numerical computing, and for that there are any number of languages that fit the bill. Financial analysts do numerical computing too, and guess what they're using? C++ and OCaml. Fortran is an old, archaic language riddled with a number of problems. No computer scientist working outside of scientific computing would ever even consider Fortran for new development. I think most competent computer scientists would do everything in their power to avoid Fortran. All that said: I can understand the rationale for Fortran (culture and legacy code). I can understand the rationale for C++ (higher level identity modeling and avoiding pointer hell). But if you're a scientist programming directly in C, sorry, you're just being silly. Find a better language, like this one perhaps: http://www.sac-home.org/
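A toy Python sketch of that timestep pattern (F and the initial state here are placeholders, not any real model): the whole run is just repeated application of a pure function to the previous state.

    def F(state):
        # One timestep: a pure function from the previous state to the next (toy dynamics).
        return {"t": state["t"] + 1, "temperature": state["temperature"] * 0.99 + 0.3}

    def run(initial_state, steps):
        # t1 = F(t0), t2 = F(t1), ... expressed as repeated application.
        state = initial_state
        history = [state]
        for _ in range(steps):
            state = F(state)
            history.append(state)
        return history

    print(run({"t": 0, "temperature": 15.0}, steps=3)[-1])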
  23. Fortran, C, and C++ SUCK. Why on earth would you ever argue these languages are easier for scientists to understand? Having supported an atmospheric model developed in Fortran and C for 5 years of my life, I can personally attest to the nightmares these languages bring to scientists. Scientists don't care about incompatibilities between Fortran compilers, and are quite confused when they crank up the resolution of their grids and suddenly find their programs no longer run because they've exceeded some hardcoded maximum array sizes in the compilers we were using. As scientists are well-versed in math, and scientific models are essentially translations of mathematical functions, I would think functional languages would in fact be easier for them to understand. Functional languages like OCaml have been immensely successful on Wall Street, where they're used for financial modeling. I think cultural reasons are to blame for why they aren't used more often in scientific computing: scientists aren't typically programmers by nature; rather, programming is something they must learn in order to do science. Scientists don't often develop models from scratch, but work with legacy, decades-old models written in Fortran, C, and C++, so there's immense inertia to stick with low-level imperative languages, even though the way those languages model problems is much, much different from how scientists use math to describe them, and the languages themselves are difficult to use, error prone, and require substantially more code. Honestly, I think if scientists were exposed to functional programming first, they'd have a much easier time. The only reason scientists would have a hard time understanding functional languages is that they were taught a language like C or Fortran first.
  24. Personally I'm a Ruby fan. I like languages with a purely expression-based grammar (like Lisp). Python breaks things apart into expressions and statements, which is a bit weird and very imperative-feeling to me. Ruby also has these things called "blocks", which are basically syntactic sugar for passing an anonymous function as an argument to another method. They're completely awesome, but something you really have to use to "get" (a rough Python analogue is sketched below).
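    The closest everyday Python equivalent is simply passing a function (often a lambda) into another function, though it's clunkier than Ruby's block syntax; the each helper below is a toy re-implementation for illustration only:

        def each(items, fn):
            # Rough analogue of Ruby's Enumerable#each taking a block:
            # call the supplied anonymous function once per item.
            for item in items:
                fn(item)

        each(["a", "b", "c"], lambda s: print(s.upper()))  # prints A, B, C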
  25. Lots of people turn to the GoF (Gang of Four) design patterns when trying to deal with problems which arise because their language lacks the features to address them directly. The Factory pattern is perhaps the most infamous in Java, and there are some comically bad usage examples to be found, like "Factory Factories": http://ws.apache.org/xmlrpc/apidocs/org/apache/xmlrpc/server/RequestProcessorFactoryFactory.html?rel=html It's not like the Factory pattern doesn't have its uses. I use it in Ruby, particularly when writing test cases (I use factories to build my fixture data). Java forces you to use it far more often than you should, both because it lacks metaclasses and because its constructors are effectively final (they can't be overridden or dispatched polymorphically). Templates can be... ok. Boost tries to do them right. The STL is absolutely horrid. For declarative code generation, which is what templates try to do to a certain extent, I think it's much nicer to use a language with first-class metaprogramming. Python certainly has this: rather than having a wacky "template" language which is its own animal, you can use Python to generate Python. Python is certainly a conceptual playground for a lot of features you haven't been exposed to. I'd suggest experimenting with a functional approach, trying out features like list comprehensions and map/filter/reduce. Avoid using for loops when you don't have to, for there are better ways (a few starter examples are sketched at the end of this post)! Yes to the latter; however, Python has a sequential memory model. There is Stackless Python to address this, but that won't help you scale across multiple CPU cores, and Stackless Python is both slow and not a "first class" Python platform. If you really want to break out of the Java/C++ mold you should try to learn Lisp. You probably won't ever write anything useful in Lisp, but it will change your perspective on programming, and many of the ideas you pick up will translate into languages like Python (or Ruby, or any functional language). I'd suggest downloading PLT Scheme: http://download.plt-scheme.org/drscheme/ and following along with the SICP videos: http://groups.csail.mit.edu/mac/classes/6.001/abelson-sussman-lectures/ You can just watch the first one, and Abelson will teach you Lisp in about an hour: http://groups.csail.mit.edu/mac/classes/6.001/abelson-sussman-lectures/videos/Lecture-1a.mpg Python is another language you can pick up quickly.
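    For instance, here are the kinds of small experiments I mean (toy data, nothing more):

        from functools import reduce

        numbers = range(1, 11)

        # List comprehension: build a new list declaratively instead of with a for loop.
        squares_of_evens = [n * n for n in numbers if n % 2 == 0]

        # The same result using map and filter with anonymous functions.
        squares_again = list(map(lambda n: n * n, filter(lambda n: n % 2 == 0, numbers)))

        # reduce: fold a sequence down to a single value.
        total = reduce(lambda acc, n: acc + n, squares_of_evens, 0)

        print(squares_of_evens, squares_again, total)  # [4, 16, 36, 64, 100] twice, then 220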
