Is the Right Tool a Matter of Preference?

A place to discuss the implementation and style of computer programs.

Moderators: phlip, Moderators General, Prelates

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Wed Jul 11, 2012 8:58 am UTC

I'm not really trying to make much of a judgment regarding the fitness of C++ for certain applications as opposed to C, just pointing out that C++ isn't inherently more divorced from the hardware than C.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

User avatar
Jplus
Posts: 1721
Joined: Wed Apr 21, 2010 12:29 pm UTC
Location: Netherlands

Re: Is the Right Tool a Matter of Preference?

Postby Jplus » Sun Jul 22, 2012 11:33 pm UTC

Funny to see that the discussion is almost permanently on the verge of derailing into a religious war. :)

sourmìlk wrote:I'm going to clarify something:

I certainly don't mean to imply that there are tools one shouldn't even learn how to use. Even if functional programming is the stupidest choice ever for my purposes, I certainly agree that it's worth learning. (I've been taking a look at it, and it's cool, but all the recursion is making me a bit uncomfortable. I empathize with the stack.) The question is [1] whether there is a best tool for a job, [2] whether I can determine what that tool is, and [3] whether that tool is always the best tool for the job regardless of who is using it.

My answers are pretty much in agreement with what Sc4Freak (and some others) has said so far:
  1. There is usually a small subset of languages that are more fit for the job than all others, but the composition of that set depends partly on who you are and the conditions you're working under. Basically it's the intersection of the languages that have favourable properties for the job, the languages you know or are willing to learn, and the languages not ruled out by your circumstances.
  2. Yes, because you know the job, you know who you are, you know your circumstances and you know what language from the subset (see above) you feel most attracted to in the particular case.
  3. To some extent the favourable properties for a job are objective, and so are mappings from those properties to languages. Because of that, if person A and person B have to do the same job in different circumstances, they will generally still agree on a common pool of languages from which to choose their personal subsets of languages that are "most fit" for the task. The smaller the common pool, the more likely A's subset is to intersect with B's subset.
As an example, take your situation. For developing a game (taking both engine and other aspects into account), some favourable properties include support for object orientation, speed, memory efficiency and bindings to a rendering API. We can sort-of objectively agree that C++, C# and Delphi are among the languages that have these properties. (We can even sort-of objectively maintain that among these three, C++ has better efficiency than C# and better OO support than Delphi). That doesn't mean, however, that a team can't have valid reasons to use Python, Haskell or Ada. Those are still very favourable choices compared to e.g. Forth, J or MATLAB!
In your case, however, the choice is clear: you know C++, you like it, and there's really no reason why you couldn't use it.

As for your empathising with the stack: perhaps you should read a bit about tail call optimisation.

Also, I want to compliment you for your level of open-mindedness in this thread. :)
"There are only two hard problems in computer science: cache coherence, naming things, and off-by-one errors." (Phil Karlton and Leon Bambrick)

coding and xkcd combined

(Julian/Julian's)

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Sun Jul 22, 2012 11:38 pm UTC

Did I come off as open-minded? You must be misinterpreting what I've said. I'm never open-minded and nothing you can say will convince me otherwise. Really, though, I think it's more self-doubt than open-mindedness :P

Also, I have read about tail call optimization, and I am satisfied with it for tail calls, but... well enough about my speculative feelings about functional programming efficiency, how efficient is it actually compared to imperative programming?
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

User avatar
ahammel
My Little Cabbage
Posts: 2135
Joined: Mon Jan 30, 2012 12:46 am UTC
Location: Vancouver BC
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby ahammel » Sun Jul 22, 2012 11:59 pm UTC

sourmìlk wrote:Also, I have read about tail call optimization, and I am satisfied with it for tail calls, but... well enough about my speculative feelings about functional programming efficiency, how efficient is it actually compared to imperative programming?

Play around with this for a while.

Long story short: Haskell is generally a bit slower than C(++) or Java, generally a bit faster than C#.
He/Him/His/Alex
God damn these electric sex pants!

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Mon Jul 23, 2012 12:02 am UTC

That is an awesome resource.

Dammit, Clojure is substantially less memory efficient than C. I'm scrapping that then. Although it's actually not much worse than just Java, which is as fast as C, but which uses more memory. I'm confused. How can it be that these languages are as fast as C while using so much more memory? Does that mean the memory is only a problem when you're using it all up, though there are about as many calculations either way? And if I'm doing anything seriously performance dependent, it seems I must use C or C++, because everything else is at least 2 to 4 times as slow. Or am I misreading this all? It's very discouraging.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

User avatar
Sc4Freak
Posts: 673
Joined: Thu Jul 12, 2007 4:50 am UTC
Location: Redmond, Washington

Re: Is the Right Tool a Matter of Preference?

Postby Sc4Freak » Mon Jul 23, 2012 12:39 am UTC

Yes, generally speaking memory is only a problem if you run out. There are some related issues which can impact performance - such as cache coherence - but strictly speaking that's not a direct consequence of high memory usage.

EDIT: Kinda OT, but I'd actually be interested to see where C# lies on the spectrum. It's a shame those benchmarks only have the Mono implementation of C#. Mono is... not known for its performance. The MS implementation running on Windows would do a lot better, but I've yet to see comprehensive benchmarks like those.
Last edited by Sc4Freak on Mon Jul 23, 2012 12:43 am UTC, edited 2 times in total.

User avatar
ahammel
My Little Cabbage
Posts: 2135
Joined: Mon Jan 30, 2012 12:46 am UTC
Location: Vancouver BC
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby ahammel » Mon Jul 23, 2012 12:40 am UTC

sourmìlk wrote:That is an awesome resource.

Yes it is :D

sourmìlk wrote:Dammit, Clojure is substantially less memory efficient than C. I'm scrapping that then. Although it's actually not much worse than just Java, which is as fast as C, but which uses more memory. I'm confused. How can it be that these languages are as fast as C while using so much more memory?

A good programmer is better at memory management than a garbage collection algorithm, I guess.

sourmìlk wrote:Does that mean the memory is only a problem when you're using it all up, though there are about as many calculations either way? And if I'm doing anything seriously performance dependent, it seems I must use C or C++, because everything else is at least 2 to 4 times as slow. Or am I misreading this all? It's very discouraging.

If 'seriously performance dependent' means 'every byte and every microsecond is precious', then yeah, C(++) is probably your best bet. Is that generally your situation?
He/Him/His/Alex
God damn these electric sex pants!

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Mon Jul 23, 2012 12:45 am UTC

I'm making a game engine and have been making small games, and so that performance-oriented environment is the one I'm used to. So yeah, performance is very important to me. I mean, I guess for just learning the things I don't need it to run like assembly, but it would be nice if I could get C-like performance in something that isn't C or C++.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

User avatar
ahammel
My Little Cabbage
Posts: 2135
Joined: Mon Jan 30, 2012 12:46 am UTC
Location: Vancouver BC
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby ahammel » Mon Jul 23, 2012 1:10 am UTC

Fortran? :wink:

I've never heard anybody say "I started writing this in C, but it was running too slow, so I switched to x instead." Turns out you can get a compiler to spit out pretty fast code if you hack on it for forty years.

Sc4freak wrote:Kinda OT, but I'd actually be interested to see where C# lies on the spectrum. It's a shame those benchmarks only have the Mono implementation of C#. Mono is... not known for its performance. The MS implementation running on Windows would do a lot better, but I've yet to see comprehensive benchmarks like those.

They make the source available. You could always compile it yourself and give it a spin.
He/Him/His/Alex
God damn these electric sex pants!

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Mon Jul 23, 2012 1:23 am UTC

Well I can see why you couldn't really get faster than C (as a lot of it is just higher level names and syntax for assembly), but I thought maybe you could get something about as fast :|

On a barely related note, I just wrote some mathematical vector operation functions in clojure, and it's kind of fun. I just can't shake the feeling that I'm not actually programming in a functional style, but just importing imperative programming to a functional language.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

Ben-oni
Posts: 278
Joined: Mon Sep 26, 2011 4:56 am UTC

Re: Is the Right Tool a Matter of Preference?

Postby Ben-oni » Mon Jul 23, 2012 6:17 am UTC

sourmìlk wrote:I'm making a game engine and have been making small games, and so that performance-oriented environment is the one I'm used to. So yeah, performance is very important to me. I mean, I guess for just learning the things I don't need it to run like assembly, but it would be nice if I could get C-like performance in something that isn't C or C++.

It's probably worth mentioning that like memory, speed only matters when you run out (of clock cycles).

In game development, you're probably spending most of your time in rendering, and there's no reason why that has to be in the same language as the game logic itself, which doesn't have to be performance optimized. (Actually, if you're using OpenGL or any other graphics library, they're not.) If GC is an issue (that is, you want to maintain a continuous frame-rate without interruption), put all rendering calls into one thread and the logic in another.

As for writing rendering engines... many programmers have found that when they use high-level languages, they can spend more time focusing on algorithm improvements when they're freed from the shackles of low-level concerns like memory management. Consider rendering a 3D environment. You need to know which objects to render, so you filter what's in the field of view. Then you test to see what's hidden behind other objects. When you only have a few thousand objects, that's fine. No problem. But you want detail, so you create more objects that need to be rendered. A hierarchical structure is established so that you don't need to test objects unless their parent is in the FOV. But you find that you have far more objects than you can deal with that don't fit into a natural composite structure. Say you're rendering a tree, the kind with bark. Each leaf will render independently of the branches that may lead to it. That's a lot of detail to tell whether different bits are occluded or not, and you probably don't want to test every leaf on the tree just to see if it's in the FOV. So you decide you want to apply some digital geometry to the task, to be able to make broad sweeping gestures and filter bits and pieces out. These are some complicated algorithms, but we can refer to Knuth and he'll help us out. Now that we know what the potential candidates are, we divide them into two groups: close and far. Render the closest pieces, and then filter out everything occluded by them. Again the digital geometry. Hard stuff, very math-intensive, but it's doable. Then there are rendering modes and smooth transitions, and lots more complicated stuff.

It's really nice being able to do that stuff in a language that gets out of your way and lets you think just about the problem at hand.

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Mon Jul 23, 2012 7:32 am UTC

This is a good point. If I ever get around to actually finishing a lot of the basic rendering stuff, I can make some bindings for other languages and have a lot of fun screwing around with them.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

EvanED
Posts: 4331
Joined: Mon Aug 07, 2006 6:28 am UTC
Location: Madison, WI
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby EvanED » Mon Jul 23, 2012 7:37 am UTC

sourmìlk wrote:I'm confused. How can it be that these languages are as fast as C while using so much more memory?

OK, time for a quick lesson on how good GCs work:

GC need not be as slow as people say, at least in comparison to "naive" manual memory management or smart-pointer-style reference counting. The reason for this is that malloc and free are actually kinda slow. Not really slow or anything, but slower than you may think. By contrast, with a type of GC known as a copying collector (which I'll talk about in a minute), allocation is dirt cheap* and freeing is zero cost, as it never explicitly happens. (This is ignoring finalize.) What this means is that for many workloads, the extra cost of the GC running vs the extra cost of malloc/free turns into somewhat of a wash -- and for manual allocation to really win out decidedly you need to be able to put in the (usually substantial) extra effort to do pool allocation, so you can deallocate millions of objects in one fell swoop, and perhaps use a faster malloc. (There are other non-obvious performance benefits to GC as well. For instance, GC'd programs are basically the only programs you'll see where cache locality can improve over time -- with the caveat that the periods of better cache locality are interrupted by the GC itself basically ruining your cache as it reads all over the heap. :-))

Non-copying collectors (like Boehm, or what you get with reference counting) aren't able to take advantage of the same optimizations that a copying collector can, and wind up doing the work of both a manual allocator's malloc/free and the GC part; my impression is to a large extent, this is where the "GC is slow!" mantra comes from.

*
Spoiler:
Essentially, malloc() with a copying GC could look like this:

Code: Select all

char* current_ptr;  /* next free address in the active half of the heap */
char* end;          /* end of the active half */

void* malloc(size_t size) {
    if (current_ptr + size > end) {
        /* ... call the GC ... */
    }
    void* block = current_ptr;
    current_ptr += size;  /* just bump the pointer past the new block */
    return block;
}

Basically it just keeps a pointer to the next address of free memory, bumps that pointer, and returns the old value. You just need a check that you aren't going past the end of the current heap. (You could probably even avoid that check with fancier tricks, but it would probably not be worth it.)

This contrasts with traditional mallocs which, at least sometimes, look through free lists and such, and frees that have to do coalescing of adjacent blocks.


However, to get this, the GC counter-intuitively has to copy objects around. The simplest kind of copying collector to describe is probably what's called a semispace collector. Basically, you divide the heap into two parts. Only one part is in use at any given time. (For reasons which will become apparent, we'll call the in-use part the "from" part of the heap, and the other the "to" part.) The program runs, allocating memory from the from part of the heap until it runs out, at which point the GC runs. The GC then basically starts traversing the graph of reachable objects starting with references that are on the stack or held in globals. As the GC encounters an object, it copies it from the from part of the heap to the to part and updates the reference that caused it to be moved to point to its new address. It also leaves behind its new address at the old location, and if the GC later encounters another reference to that object, it just replaces the old value with the new one. When this is done, the roles of the from and to parts of the heap are reversed and the program continues.

Note that we've now essentially doubled the heap requirement of the program by doing this though, since now we have both spaces.

One interesting characteristic of this is the GC need not ever touch dead objects, only live ones. (Again, ignoring finalizers.) This means that dead objects don't add any time to collection at all -- which is why I say that free has no cost. (This is not quite true of course: allocating a lot of garbage will cause the GC to be triggered more frequently. But the more garbage there is, the more efficient the GC will be. This contrasts with normal allocation, where the total cost of free is proportional to the number of blocks that you free.)

To get a really good-performing GC, you really need to push the idea further, which will give you a heap with many different regions called "generations". The idea behind this is that objects are allocated in generation 0, and the GC copies from generation 0 to generation 1, from generation 1 to generation 2, etc. using the same idea as the semispace collector. But it will only collect a region when it fills up (or if an older generation fills up) -- e.g. if generation 1 fills and triggers GC, it won't collect generation 2. This takes advantage of the fact that objects usually are either short-lived (in which case there's a good chance they'll be dead before the GC even runs and the GC won't spend any time on them) or will last a while, in which case we can leave it sit for many collections of newer generations (instead of just copying it back and forth like a semispace collector would). Note that getting generational collectors right takes some effort, as you need to make sure that pointers from older generations to newer generations are honored by the GC. There are a variety of tricks for this. ("Remembered sets" is one such key word.)

But it also means that now we've divided up the heap into more segments, which means a further decrease in memory efficiency.

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Mon Jul 23, 2012 7:52 am UTC

That's very interesting, but it still appears only to work if you aren't running out of memory. If you're running out of memory, then languages that use more memory are necessarily going to be slower, ya?
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

EvanED
Posts: 4331
Joined: Mon Aug 07, 2006 6:28 am UTC
Location: Madison, WI
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby EvanED » Mon Jul 23, 2012 8:02 am UTC

sourmìlk wrote:That's very interesting, but it still appears only to work if you aren't running out of memory. If you're running out of memory, then languages that use more memory are necessarily going to be slower, ya?

Yep. Programming in a GC'd language is a wager that assuming your users have enough memory is worth the decrease in development time you get from working in a GC'd language.

(And, I believe, in the longer term, the increased ability to use formal methods to verify program behaviors. There's already some traction in industry for tool support for this, and I think it will necessarily become more prevalent as systems become even more complicated and critical. (Heck, for a long time "games" was a really good example of a genre of software where I felt languages like C and C++ fit really well, because they both require soft-realtime high-performance and because it didn't used to matter too much if there was an occasional bug -- the stakes were low. But with most games including multiplayer now and with things like WoW where people put ungodly amounts of time into building characters and stuff, the latter is often not even true any more, and if it were up to me I'd say the performance hit of a memory safe language is worth it for the increased assurance against security bugs. And memory safe means GC, both practically and theoretically speaking.) And though my advisor would probably kick me out for saying this, I don't see verification in C or C++ as being very successful. It's hard enough when you can assume memory safety; I think it's basically hopeless otherwise. :-))

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Mon Jul 23, 2012 8:06 am UTC

EvanED wrote:If it were up to me I'd say the performance hit of a memory safe language is worth it for the increased assurance against security bugs.

When games use up all the memory and you're suddenly demanding twice as much, I think you might change that opinion.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

EvanED
Posts: 4331
Joined: Mon Aug 07, 2006 6:28 am UTC
Location: Madison, WI
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby EvanED » Mon Jul 23, 2012 8:09 am UTC

sourmìlk wrote:
EvanED wrote:If it were up to me I'd say the performance hit of a memory safe language is worth it for the increased assurance against security bugs.

When games use up all the memory and you're suddenly demanding twice as much, I think you might change that opinion.

When that happens I'll go spend $100 and get more RAM. :-) That's easily worth never hearing the phrase "buffer overflow exploit" again.

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Mon Jul 23, 2012 8:14 am UTC

You will never have enough. Also apparently it's not 2x, but 20x as much :|
And demanding that customers shell out an extra $100 on hardware so that you can program your game in a garbage collected language is not a reasonable solution. Also, how much RAM do you have that $100 is twice as much? I'm pretty sure that 8 gigabytes only costs around $60 these days.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

EvanED
Posts: 4331
Joined: Mon Aug 07, 2006 6:28 am UTC
Location: Madison, WI
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby EvanED » Mon Jul 23, 2012 8:22 am UTC

sourmìlk wrote:You will never have enough. Also apparently it's not 2x, but 20x as much :|

More like 3-5x, according to this, which in the absence of other information I'll trust more than the shootout. :-)

These results quantify the time-space tradeoff of garbage collection: with five times as much memory, an Appel-style generational collector with a non-copying mature space matches the performance of reachability-based explicit memory management. With only three times as much memory, the collector runs on average 17% slower than explicit memory management. However, with only twice as much memory, garbage collection degrades performance by nearly 70%.

And demanding that customers shell out an extra $100 on hardware so that you can program your game in a garbage collected language is not a reasonable solution.

And shipping software with unnecessary bugs and security holes is?

Also, how much RAM do you have that $100 is twice as much? I'm pretty sure that 8 gigabytes only costs around $60 these days.

Eh, I just pulled a number out of my ass. Just helps my point anyway. :-) (I have 6 or 8 GB, depending on the computer.)

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Mon Jul 23, 2012 8:28 am UTC

EvanED wrote:
These results quantify the time-space tradeoff of garbage collection: with five times as much memory, an Appel-style generational collector with a non-copying mature space matches the performance of reachability-based explicit memory management. With only three times as much memory, the collector runs on average 17% slower than explicit memory management. However, with only twice as much memory, garbage collection degrades performance by nearly 70%.


I don't know if you've ever played a game at 10% - 30% the framerate that it is usually at, but it's unplayable.

And shipping software with unnecessary bugs and security holes is?

Compared to shipping something that won't even run? Yeah. But at least with an unmanaged language it's possible to write bug-free code. With a garbage collected language it's impossible to write fast code (if you're using up all available memory).

Eh, I just pulled a number out of my ass. Just helps my point anyway. :-) (I have 6 or 8 GB, depending on the computer.)

I have 8 gigabytes as well, and even using 64-bit Java, that's not enough to get my computer on Minecraft to run without lagging horribly.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

Ben-oni
Posts: 278
Joined: Mon Sep 26, 2011 4:56 am UTC

Re: Is the Right Tool a Matter of Preference?

Postby Ben-oni » Mon Jul 23, 2012 8:37 am UTC

sourmìlk wrote:I don't know if you've ever played a game at 10% - 30% the framerate that it is usually at, but it's unplayable.

Not true! Games are generally designed to scale up when they run on better hardware. This is actually a point for high-level languages: it's easier to design behavior that adjusts gracefully to resource availability.

Compared to shipping something that won't even run? Yeah. But at least with an unmanaged language it's possible to write bug-free code. With a garbage collected language it's impossible to write fast code (if you're using up all available memory).

This is BS and you know it. Games often have to run on a wide array of machines, so publishers specify minimum system requirements.

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Mon Jul 23, 2012 8:43 am UTC

Ben-oni wrote:Not true!

O.o

I'm really not sure what video games you've played where that's been the case for you.

This is BS and you know it. Games often have to run on a wide array of machines, so publishers specify minimum system requirements.

Okay, yes, the other option is to dumb everything down enough that it doesn't require the same amount of resources, but if you have to reduce memory requirements by a factor of four, you can end up making a totally different game. That's severely limiting.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

User avatar
Jplus
Posts: 1721
Joined: Wed Apr 21, 2010 12:29 pm UTC
Location: Netherlands

Re: Is the Right Tool a Matter of Preference?

Postby Jplus » Mon Jul 23, 2012 9:20 am UTC

sourmìlk wrote:Dammit, Clojure is substantially less memory efficient than C. I'm scrapping that then. Although it's actually not much worse than just Java, which is as fast as C, but which uses more memory. I'm confused. How can it be that these languages are as fast as C while using so much more memory? Does that mean the memory is only a problem when you're using it all up, though there are about as many calculations either way? And if I'm doing anything seriously performance dependent, it seems I must use C or C++, because everything else is at least 2 to 4 times as slow. Or am I misreading this all? It's very discouraging.

I don't know about you, but when I view the results of the benchmarks game I'm under the distinct impression that ATS, Ada and Fortran are all about as fast as C and C++ (more so than Java, actually). Ada and Fortran have barely any memory overhead compared to C (despite Ada being at least as safe as Java), and ATS has no memory overhead compared to C at all, while its memory-safety is about perfect. It's a shame LuaJIT is out of the game; it used to have speed like Java but memory usage similar to Fortran, while still using mark-and-sweep garbage collection. Also note Pascal, which is slower than Java but uses substantially less memory than C and ATS.

There are many solutions to safety. ATS uses theorem proving, Ada uses a special approach to memory pools and syntax. In C++ you have the option to use rich typing. RELIGIOUS WARS WARNING: I've always found that "safety" is a lousy excuse for Java to be a resource hog, especially since the JVM is as full of security leaks as a stray dog is of fleas. ;)
"There are only two hard problems in computer science: cache coherence, naming things, and off-by-one errors." (Phil Karlton and Leon Bambrick)

coding and xkcd combined

(Julian/Julian's)

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Mon Jul 23, 2012 12:03 pm UTC

Well I suppose I could consider Ada as an alternative for a lot of things then. Mostly I was hoping for a functional language with C-like performance so I could find an excuse to use one, but I rarely ever consider performance an acceptable sacrifice for making the programming job easier. I don't see why I should be allowed to make a worse product because it's easier to make one. I guess the only time I can think of when this is overridden is if programming a more efficient program is impossible given time constraints.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

User avatar
Xanthir
My HERO!!!
Posts: 5413
Joined: Tue Feb 20, 2007 12:49 am UTC
Location: The Googleplex
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby Xanthir » Mon Jul 23, 2012 12:56 pm UTC

sourmìlk wrote:I don't see why I should be allowed to make a worse product because it's easier to make one. I guess the only time I can think of when this is overridden is if writing a more efficient program is impossible given time constraints.

This. There are many definitions of "worse". Wasting time on screwing around with your memory in exchange for speed means less time hunting non-memory-related bugs and adding more features and polish. If you have sufficient money and time, you can make that "less" be "enough", but that's not always possible.
(defun fibs (n &optional (a 1) (b 1)) (take n (unfold '+ a b)))

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Mon Jul 23, 2012 12:59 pm UTC

I don't think it needs to be a zero-sum game: in a memory intensive application, an unmanaged language doesn't force you to choose between efficiency and cleanliness, but a managed one does.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

Ben-oni
Posts: 278
Joined: Mon Sep 26, 2011 4:56 am UTC

Re: Is the Right Tool a Matter of Preference?

Postby Ben-oni » Mon Jul 23, 2012 7:54 pm UTC

sourmìlk wrote:Well I suppose I could consider Ada as an alternative for a lot of things then. Mostly I was hoping for a functional language with C-like performance so I could find an excuse to use one, but I rarely ever consider performance an acceptable sacrifice for making the programming job easier. I don't see why I should be allowed to make a worse product because it's easier to make one. I guess the only time I can think of when this is overridden is if writing a more efficient program is impossible given time constraints.

There are two aspects to efficiency: complexity and scale. Most programmers know to focus on Big-O (in terms of both time and space) rather than scale. Make the program as algorithmically efficient as possible, then optimize. Use rapid prototyping to get the program off the ground fast, then find what needs to be improved, optimize the algorithms, then rewrite the most critical loops in C. And under no circumstances shalt thou waste time fiddling with working code until then. Horrors shall be visited upon those who violate the Sixth Commandment: "Simplify only after the function is correct."

In a professional setting, a programmer's time is worth money. A programmer needs to spend his time on the current requirements, which rarely include efficiency. Be fast and correct, not clever. Cleverness is rarely a good attribute for code to possess.
Last edited by Ben-oni on Mon Jul 23, 2012 8:02 pm UTC, edited 1 time in total.

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Mon Jul 23, 2012 8:00 pm UTC

So sayeth the almighty Knuth. But scale doesn't matter in terms of the end user's experience. And ultimately the only part of a program that matters is the actual compiled executable (or bytecode file etc.) itself.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

User avatar
headprogrammingczar
Posts: 3072
Joined: Mon Oct 22, 2007 5:28 pm UTC
Location: Beaming you up

Re: Is the Right Tool a Matter of Preference?

Postby headprogrammingczar » Mon Jul 23, 2012 9:07 pm UTC

sourmìlk wrote:So sayeth the almighty Knuth. But scale doesn't matter in terms of the end user's experience. And ultimately the only part of a program that matters is the actual compiled executable (or bytecode file etc.) itself.


That's like saying "it's not the horsepower that matters, but the car itself". Maybe you can tolerate driving a bus, but I like being able to do a U-turn and stop on a downhill.
Someday, someone will need to handle a larger dataset than you anticipated. If you write your program well, growth in computer performance keeps pace with the demand for larger problem sizes. If you write it poorly, the gap between demanded power and available power only widens.

For an example of this in action, see the gap between US internet speeds and hard drive capacities. The time it takes to do a full online backup of a modern drive increases every year.
<quintopia> You're not crazy. you're the goddamn headprogrammingspock!
<Weeks> You're the goddamn headprogrammingspock!
<Cheese> I love you

User avatar
Jplus
Posts: 1721
Joined: Wed Apr 21, 2010 12:29 pm UTC
Location: Netherlands

Re: Is the Right Tool a Matter of Preference?

Postby Jplus » Tue Jul 24, 2012 12:51 am UTC

Xanthir wrote:
sourmìlk wrote:I don't see why I should be allowed to make a worse product because it's easier to make one. I guess the only time I can think of when this is overridden is if writing a more efficient program is impossible given time constraints.

This. There are many definitions of "worse". Wasting time on screwing around with your memory in exchange for speed means less time hunting non-memory-related bugs and adding more features and polish. If you have sufficient money and time, you can make that "less" be "enough", but that's not always possible.

Honestly, I think this issue is grossly overrated (because there are so many other tradeoffs that determine how many features and how much polish you can add). Whenever C++ is compared to <insert language with GC here>, the discussion always ends up dwelling on the costs of manual memory management. While it's true that messing around with pointers and bare arrays introduces lots of opportunities for bugs and security holes, it's not like a C++ programmer has to do such things all the time. In general, "manual management versus GC" is a false dichotomy; there are many approaches to safety, and languages without GC do tend to have some other kind of solution. Besides, the focus on pointer issues totally neglects the fact that (tracing) GC has some very important downsides as well, which can affect project costs just as much.
  • It negatively impacts efficiency. While in itself this can be a good trade for security, as is always emphasised in this kind of discussion, efficiency is important in lots of situations. Like sourmìlk mentioned, better efficiency in a game opens opportunities, e.g. for prettier graphics.
  • It introduces nondeterminism. In the best case, it only causes unpredictable pauses (which can be partially alleviated with incremental or concurrent GC, but that negatively affects overall efficiency). In the worst case, it makes it virtually impossible to reason about finalising code for the release of resources other than memory, which brings me to the next point:
  • It doesn't work for resources other than memory (files, sockets, devices...). In addition, the destructor mechanism (which does work for all types of resources) is severely handicapped by the nondeterminism of tracing GCs. So while GC takes away your worries about memory, it often also forces you to revert to managing all other types of resources manually.
  • There is a tradeoff between effective GC and fast GC. Some optimisations that improve GC speed knowingly cause it to leak some memory. (EDIT: to make things worse, when using finalisers this might indirectly cause your program to leak other kinds of resources as well, nondeterministically!)
So to return to my earlier point that there are many ways to manage resources, here's what I think is a more complete overview and fairer comparison of different approaches.
  • Manage everything manually everywhere, like in C. Obviously this is the most error-prone as well as the most labour-intensive strategy. Nonetheless, with enough effort it can produce safe and bug-free software. Zero overhead unless you manage your resources clumsily (which admittedly is likely to happen once in a while).
  • Destructors (semi-automatic): manually define how to manage a resource in one place, then have it applied everywhere else automatically, like in C++. It's still possible to make mistakes, but much less than in fully manual management, and the mistakes are concentrated at a single location in the code. It works for any resource and there is still zero associated overhead thanks to the stack.
  • Tracing GC, like in Java. Memory management is fully automatic. On the other hand, for all other resources the programmer has to choose between fully manual management or nondeterministic semi-automatic management (finalisers). Programmer effort and susceptibility to bugs compared to destructors therefore depend on how much work is done with resources other than memory. Overhead is substantial, either in memory, in time, or both.
  • Reference counting GC, like in Python. As with tracing GC, memory management is fully automatic, but this approach plays nice with destructors. The tradeoff is that it has more associated overhead in time than tracing GC (though potentially less overhead in memory).
  • Other types of GC: I'm not knowledgeable about them, but it's safe to assume that they'll have their own advantages and disadvantages. Generally however, GC by itself is never a solution for resources other than memory.
  • Access types, like in Ada. This is semi-automatic, in a way that is somewhat reminiscent of destructors. I'm guessing that it's slightly safer but also takes slightly more effort for custom resource management. I don't know about the associated overhead but it's probably not much.
  • Theorem proving, like in ATS. The compiler proves that the code is correct, using hints that the programmer needs to manually insert everywhere. This is perfectly safe, but it's also very taxing on the programmer's brains. It is however guaranteed free of any runtime overhead.
  • ... (other approaches that I don't know about).
Now we're talking. It all depends on your circumstances. :)
"There are only two hard problems in computer science: cache coherence, naming things, and off-by-one errors." (Phil Karlton and Leon Bambrick)

coding and xkcd combined

(Julian/Julian's)

User avatar
WarDaft
Posts: 1583
Joined: Thu Jul 30, 2009 3:16 pm UTC

Re: Is the Right Tool a Matter of Preference?

Postby WarDaft » Tue Jul 24, 2012 9:23 am UTC

sourmìlk wrote:I don't think it needs to be a zero-sum game: in a memory intensive application, an unmanaged language doesn't force you to choose between efficiency and cleanliness, but a managed one does.


But it must be a zero-sum game. For X dollars invested in a project, you get Y hours of coding on it. The more efficiently those coding hours are spent, the better the project. Functional and garbage-collected (which is universally implied by functional, AFAIK) languages are known to be a more efficient use of programmers' time. If it takes you half as long to build the needed features, then you can use the other half of the time making the product better - which just so happens to include making it faster and more resource-efficient.

Never mind how much easier it makes parallel programming. My laptop is a 4 core HT, and I can use all of that with two lines of Haskell.
All Shadow priest spells that deal Fire damage now appear green.
Big freaky cereal boxes of death.

User avatar
sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby sourmìlk » Tue Jul 24, 2012 9:45 am UTC

With a garbage-collected language, infinite coding time will never achieve the same efficiency as with an unmanaged one in cases where memory is scarce.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

User avatar
Jplus
Posts: 1721
Joined: Wed Apr 21, 2010 12:29 pm UTC
Location: Netherlands

Re: Is the Right Tool a Matter of Preference?

Postby Jplus » Tue Jul 24, 2012 10:47 am UTC

WarDaft wrote:Functional and garbage-collected (which is universally implied by functional, AFAIK) languages are known to be a more efficient use of programmers' time.

I've heard this claim many times, but I've never seen it backed by actual numbers. It's a dubious claim, given that GC is only a solution for memory, not for other resources, and that while functional style tends to take slightly less typing, it also tends to take slightly more thinking. It also totally neglects the many other factors that affect programmer efficiency, such as availability of libraries, syntax, typing discipline, coding style and workflow.

I wouldn't be surprised if there are circumstances in which Fortran is a more efficient use of the programmer's time than Haskell. Heck, I can think of one: when you're a physicist and you just need to crunch a lot of numbers in linear algebra.
"There are only two hard problems in computer science: cache coherence, naming things, and off-by-one errors." (Phil Karlton and Leon Bambrick)

coding and xkcd combined

(Julian/Julian's)

User avatar
Jplus
Posts: 1721
Joined: Wed Apr 21, 2010 12:29 pm UTC
Location: Netherlands

Re: Is the Right Tool a Matter of Preference?

Postby Jplus » Tue Jul 24, 2012 10:56 am UTC

sourmìlk wrote:Well I suppose I could consider Ada as an alternative for a lot of things then. Mostly I was hoping for a functional language with C-like performance so I could find an excuse to use one, but I rarely ever consider performance an acceptable sacrifice for making the programming job easier. [...]

I forgot to mention this, but ATS is functional.
"There are only two hard problems in computer science: cache coherence, naming things, and off-by-one errors." (Phil Karlton and Leon Bambrick)

coding and xkcd combined

(Julian/Julian's)

Ben-oni
Posts: 278
Joined: Mon Sep 26, 2011 4:56 am UTC

Re: Is the Right Tool a Matter of Preference?

Postby Ben-oni » Tue Jul 24, 2012 12:30 pm UTC

Jplus wrote:
WarDaft wrote:Functional and garbage-collected (which is universally implied by functional, AFAIK) languages are known to be a more efficient use of programmers' time.

I've heard this claim many times, but I've never seen it backed by actual numbers. It's a dubious claim, given that GC is only a solution for memory, not for other resources, and that while functional style tends to take slightly less typing, it also tends to take slightly more thinking. It also totally neglects the many other factors that affect programmer efficiency, such as availability of libraries, syntax, typing discipline, coding style and workflow.

I wouldn't be surprised if there are circumstances in which Fortran is a more efficient use of the programmer's time than Haskell. Heck, I can think of one: when you're a physicist and you just need to crunch a lot of numbers in linear algebra.

I'm not sure there can be objective studies concerning this, but functional tools at least allow programmers to create things they could not otherwise have thought of. It's not all about declarative style and compile time assurances. We wouldn't have things like monads if not for functional programming. As for real-life programming, you've never done OpenGL programming until you've done it in Haskell the way it was meant to be done. Spend just one day on that, and I think you'll be convinced that functional programming increases productivity.

As for your particular concerns, solving problems that have already been solved is easy. Trivially easy. The language doesn't matter anymore. Once the solution is seen, anyone can do it. Work on true research solving problems you've never encountered before, and the existence of a vast array of standard libraries will seem quite inadequate.

User avatar
WarDaft
Posts: 1583
Joined: Thu Jul 30, 2009 3:16 pm UTC

Re: Is the Right Tool a Matter of Preference?

Postby WarDaft » Tue Jul 24, 2012 6:26 pm UTC

Of course memory management makes coding faster. Imagine C, exactly as it is, except now it's memory managed. All the time you've ever spent debugging memory leaks and segfaults just basically vanished. How is that not making you more productive?

I wouldn't be surprised if there are circumstances in which Fortran is a more efficient use of the programmer's time than Haskell. Heck, I can think of one: when you're a physicist and you just need to crunch a lot of numbers in linear algebra.
I sincerely doubt that; Haskell excels at math. I haven't looked into it, but if there aren't libraries to do things like matrix multiplication (which, yes, I know is a primitive operation in FORTRAN), then it's a one-time cost to make one, and no one else will ever need to again because they can just type "cabal install linear-algebra". It will also be vastly more extensible, so sooner or later someone will probably generalize it to algebra over arbitrary kinds of traversable data or some such.

Also, about needing more thinking: thinking makes you learn, and so think faster and more effectively in the future. People have demonstrated time and again that the upper bound on this has not been reached. Typing bazillions of lines of code, however, will only ever make you type so fast, and once you reach the point where you never need to pause while typing up your concepts, the language is slowing you down. Note also that in functional languages you don't have to use the high-level concepts that demand more thought: you can easily use very simple things and mimic the control structures of imperative languages - you just have to use slightly different syntax. Beyond that, though, you have more control over how things work, and if you notice patterns you can exploit them much more easily.

There is also no real upper bound on the level of abstraction you can build effectively in a functional language, which lets you do even more with less. Imagine if, say, Haskell and C/C++ switched places in usage levels throughout history. All of the time spent on making better C/C++ compilers was instead spent on making better Haskell compilers. All the time spent on C/C++ libraries was instead spent on Haskell libraries. We're talking a nearly two order of magnitude increase in time invested. How do you think they would compare then? Do you think there would be as many arguments for using C/C++ in that world as there are for using Haskell in this one? More? Less?
All Shadow priest spells that deal Fire damage now appear green.
Big freaky cereal boxes of death.

User avatar
Sc4Freak
Posts: 673
Joined: Thu Jul 12, 2007 4:50 am UTC
Location: Redmond, Washington

Re: Is the Right Tool a Matter of Preference?

Postby Sc4Freak » Tue Jul 24, 2012 7:05 pm UTC

WarDaft wrote:Of course memory management makes coding faster. Imagine C, exactly as it is, except now it's memory managed. All the time you've ever spent debugging memory leaks and segfaults just basically vanished. How is that not making you more productive?

Because there's no such thing as a free lunch. C is a systems programming language - if a kernel is written in "managed C" which then turns out to be too slow, does the engineering effort to optimize it outweigh the benefit? What about the nondeterminism you've introduced, how many problems is that going to cause? What about your real-time guarantees - do you just throw those out the window?

There has always been the argument that high-level languages often trade performance for programmer time (which then arguably allows for more time spent performing optimizations). But that's always a tradeoff. No doubt managed languages may make you more productive in the short term, but automatic memory management isn't free and can have associated long-term costs.

User avatar
WarDaft
Posts: 1583
Joined: Thu Jul 30, 2009 3:16 pm UTC

Re: Is the Right Tool a Matter of Preference?

Postby WarDaft » Tue Jul 24, 2012 7:47 pm UTC

That is rather dependent on the implementation of the memory management, so I can't give blanket answers to that specific a question. They are all good reasons to have a memory-management model the programmer can have input into, however - just not free rein over it. That's actually one thing I feel is really lacking in most functional languages: why can't I tell it that this should be discarded ASAP, offer it a proof that certain things just aren't needed, or that something should stick around for a while longer than it otherwise would have been allowed to? Don't actually allow any errors, only optimization.
All Shadow priest spells that deal Fire damage now appear green.
Big freaky cereal boxes of death.

Ben-oni
Posts: 278
Joined: Mon Sep 26, 2011 4:56 am UTC

Re: Is the Right Tool a Matter of Preference?

Postby Ben-oni » Tue Jul 24, 2012 8:53 pm UTC

Sc4Freak wrote:There has always been the argument that high-level languages often trade performance for programmer time (which then arguably allows for more time spent performing optimizations). But that's always a tradeoff. No doubt managed languages may make you more productive in the short term, but automatic memory management isn't free and can have associated long-term costs.

I was just reading a blog yesterday about this... Back in the day (the argument goes) CGI scripts were written in C. This was all very good, because the internet was a new and exciting place and most pages were static HTML anyway. And then the perl hackers came along and started hacking together CGI scripts in perl. The C programmers railed against the very idea: "You can't write efficient code in perl!" they screamed. But as it turns out, perl is a language designed for text processing, and the internet was built upon text, so it worked pretty well. And while the C folks were bitterly resisting perl, the perl (and python, and php) folks were building the internet.

So stick with C if you want, but in the real world, high-level languages are more profitable.

User avatar
Xeio
Friends, Faidites, Countrymen
Posts: 5101
Joined: Wed Jul 25, 2007 11:12 am UTC
Location: C:\Users\Xeio\
Contact:

Re: Is the Right Tool a Matter of Preference?

Postby Xeio » Tue Jul 24, 2012 9:12 pm UTC

sourmìlk wrote:With a garbage-collected language, infinite coding time will never achieve the same efficiency as with an unmanaged one in cases where memory is scarce.
Doesn't matter: nobody has infinite development time. They do, however, happen to have oodles of memory (especially given the memory requirements of your average application).

