Algol vs Fortran was RE: VHDL vs Verilog
derschjo at mail.msu.edu
Tue Feb 9 16:22:07 CST 2010
On Feb 9, 2010, at 1:58 PM, Dave McGuire <mcguire at neurotica.com> wrote:
> On Feb 9, 2010, at 3:49 PM, Josh wrote:
>>> Yes, to a large degree! You're talking about it as if it happens
>>> ONCE, and you know it doesn't. Would you care to estimate how
>>> many of those vtable lookups happen when someone simply clicks a
>>> mouse in a modern OS? I don't know for certain, but I'm willing
>>> to bet that it's thousands.
>> I'm not going to speculate on things I have no knowledge of. Even
>> if it were thousands, that would mean an overhead of tens of
>> thousands of instructions. On a modern CPU, in a user-interaction
>> scenario, that doesn't even begin to be noticeable to the user.
> You're speaking from the standpoint of running one program, once,
> alone on one computer. That's not how (most) computers are used.
> How many processes are running, right now, on your Windows box? 146
> right now on my Mac, *nearly seven hundred* on the central computer
> here (a big Sun). Those "this code is only 20% slower"
> inefficiencies that allowed us to get back on the golf course
> fifteen minutes sooner do add up.
So you're clicking every button in every process running on your
machine constantly? My point is that for *many* operations (like
user interactions, which are typically gated on human response time)
virtual call overhead is acceptable.
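To put a rough picture on what we're actually arguing about, here's a
toy sketch of my own (not code from any real GUI toolkit):

    // A virtual call costs a load of the vtable pointer, a load of the
    // function pointer, and an indirect call.
    struct Widget {
        virtual void onClick() = 0;
        virtual ~Widget() {}
    };

    struct Button : Widget {
        void onClick() {}          // whatever the handler actually does
    };

    void handleClick(Widget &w) {
        w.onClick();               // dispatched through the vtable at run time
    }

On current hardware that's a couple of extra loads and an indirect
branch per call. Multiply it by a few thousand on a mouse click and
you're still talking microseconds, which is why I say it's lost in the
noise for interactive work.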
>> If it's millions of calls in a tight loop doing some heavy
>> calculation then it will have an impact; but anyone calling a
>> virtual function in such a scenario is doing it wrong.
> ...which happens ALL THE TIME. I've seen (and fixed) code like
> that in every programming job I've ever had. Loop strength
> reduction is something that nearly all optimizing compilers do, but
> the fact that compilers have optimizers doesn't give us free license
> to write sloppy code. There will always be situations in which the
> compiler can't reduce the strength of a loop, and they will always
> get right by you when you're writing that code if you don't pay
> attention.
So again, the problem is the programmers, not the programming paradigm.
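To be concrete about the "doing it wrong" case, here's a contrived
example of mine (not anything from Dave's code or from a real
codebase):

    #include <cstddef>
    #include <vector>

    struct Accumulator {
        virtual double add(double a, double b) const { return a + b; }
        virtual ~Accumulator() {}
    };

    // Wrong tool for the job: an indirect call on every element.
    double sumSlow(const std::vector<double> &v, const Accumulator &acc) {
        double total = 0.0;
        for (std::size_t i = 0; i < v.size(); ++i)
            total = acc.add(total, v[i]);
        return total;
    }

    // The obvious fix: keep the hot loop free of dynamic dispatch.
    double sumFast(const std::vector<double> &v) {
        double total = 0.0;
        for (std::size_t i = 0; i < v.size(); ++i)
            total += v[i];
        return total;
    }

The language happily lets you write either one. Blaming the paradigm
for the first version is blaming the tool rather than the workman.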
>>> ...what happens when someone uses an Integer instead of an int? A
>>> whole object gets created when all one likely needed was a memory
>>> location.
>> Depends greatly on the language. Don't confuse one implementation
>> with ALL OO programming languages. In C++ an integer maps to a
>> register. Same in C#. Same in objc. Java does this differently.
> I'm going to go dig into a C++ implementation and look at the in-
> memory composition of an integer object. I sure hope you're right.
C++ definitely has no concept of an integer object. (It offers no
built-in object types, not even a base Object class.)
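You can watch this directly in C++ itself; here's my own five-line
illustration (nothing from OS X or Outlook):

    #include <cstdio>

    struct PlainInt  { int v; };                          // just the payload
    struct ObjectInt { virtual ~ObjectInt() {} int v; };  // payload + vtable pointer

    int main() {
        // On a typical 64-bit compiler this prints 4, 4 and 16.
        std::printf("%lu %lu %lu\n",
                    (unsigned long)sizeof(int),
                    (unsigned long)sizeof(PlainInt),
                    (unsigned long)sizeof(ObjectInt));
        return 0;
    }

The moment you ask for polymorphism you pay for a vtable pointer and
alignment padding; if you don't ask for it, an int is just a machine
word.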
>>> And how about template programming. I've never seen executables so
>>> big as the ones in which templates were overused.
>> Template programming is another paradigm altogether, it's basically
>> C++ specific and it has very little to do with OO. (It's also an
>> abomination along with most of C++.)
> C# has a form of templates, if memory serves. I believe they're
> called "generics". Name one non-OO language that has such a
> construct. I don't know of any.
C# generics and C++ template metaprogramming are nowhere near the same
thing. They both let you easily define reusable container objects
(and for that use, they are efficient). C++ templates actually
provide a Turing-complete language (an ugly one) that runs at compile
time. You can do clever things with it, and you can also do horrible
things with it.
C++ metaprogramming is very much a paradigm unto itself.
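The canonical toy example of what I mean, computed entirely by the
compiler (my own illustration, and about the tamest metaprogram there
is):

    #include <cstdio>

    // The compiler, not the program, does the arithmetic here.
    template <unsigned N> struct Factorial {
        enum { value = N * Factorial<N - 1>::value };
    };
    template <> struct Factorial<0> {
        enum { value = 1 };
    };

    int main() {
        std::printf("%d\n", (int)Factorial<10>::value);   // prints 3628800
        return 0;
    }

It's clever, and it's also very easy to abuse.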
>>> Note well, however, that I'm talking about more than just the
>>> number of instructions required to accomplish a given task. Sure,
>>> that in itself has bad side effects when you think about what it
>>> does to the instruction cache hit rates...the principle of
>>> locality of reference is blown out the window. But what about
>>> memory utilization? How big, in bytes, is an Integer compared to
>>> an int? Ok, the difference may be only a few bytes, but what
>>> about the program (which would be "most of them") with tens of
>>> thousands of them? (I'm typing this on a Mac, into Mail.app,
>>> which is currently eating 1.73GB of memory)
>> I believe that in Objective-C, ints are still registers, no magical
>> Integer objects here. Sounds like Mail.app is poorly written. I'm
>> running Outlook here (written in a mix of C and C++) and it's using
>> 100MB (with an inbox size of 10GB...).
> Again I'm going to try to find the in-memory representation of
> those integer objects. Regardless, however, this was just an
> example...I'm sure you see my point. You're suggesting that OO
> programming involves no runtime overhead over procedural/imperative
> languages when run on processors whose architecture is arguably
Never said that. My argument has been, and will continue to be, that
the performance impact is not nearly as bad as you are implying.
>>>> You keep talking about how OO programming is the reason that
>>>> software today is so inefficient but you offer no data to back it
>>>> up other than "it doesn't map to the hardware."
>>> I'm sorry, but knowing how processors work, it's pretty obvious to
>>> me. The data that backs it up is the many programs (some of which
>>> are operating systems, OS X perhaps especially!) that I use every
>>> day, written in OO languages, which are far slower than they
>>> should be given the hardware they're running on. YOU know how
>>> processors work too, I know you do, so I know you see my point.
>> This is only a valid argument if you have an OS X written in plain
>> C and an OS X written in OO that you can do a real comparison
>> between. Anything else is speculation. OO does have its overheads;
>> I disagree that they are anywhere near as bad as you claim them to
>> be. Speculating that OS foo is far slower than it
>> "should be" based on anecdotal evidence is not proof.
> It may be, at least in part, speculation...but with lots of
> experience to back it up. Quite simply, almost everything I've seen
> written in C++ and Java (even with native compilation) is slow, and
> most everything I've seen written in C, assembler, and Forth is fast.
I could argue that I've also seen the exact opposite, but I'm not
sure what that would prove.
> One such example in which the functionality is similar is groff vs.
> nroff. Big speed difference between the two on similar hardware
> performing similar functions.
> Speculating that OS foo is far slower than it "should be" is
> something that I think I can get a pretty good feel for, having used
> dozens of operating systems on dozens of types of computers over
> dozens of years. You're suggesting that my argument is completely
> illegitimate because I'm not willing to spend the next two weeks
> cooking up some sort of a benchmark suite to prove to you, by the
> numbers, something that I've never heard anyone else disagree with,
I'm suggesting that you are exaggerating the performance impact and
that you keep basing these projections on feelings.
>>> In an ideal world, one in which all programmers were competent, OO
>>> languages wouldn't be such a problem. So I guess what I really
>>> mean is, "Bad programmers are even more detrimental to computing
>>> when armed with OO languages".
>> You really think these same programmers would somehow write better
>> code if only they would stop using OO?
> Yes, absolutely. Most OO languages give bad programmers more code-
> inflating features to misunderstand and abuse. If they don't know
> how to write good code in C, which is a tiny, very fast, very low-
> overhead, very simple language with very few features, how can they
> be expected to write good code in C++, C# or Java, which are
> anything but?
I suppose we'll have to agree to disagree here.
> Handing an idiot a loaded rifle is dangerous. Handing an idiot a
> loaded rifle with a loaded grenade launcher is MORE dangerous.
Yes. Programming languages are just like firearms.
If you want to discuss this further, feel free to contact me
off-list; I'm already feeling guilty for dragging this thread wayyyy
off-topic.
> Dave McGuire
> Port Charlotte, FL