Algol vs Fortran was RE: VHDL vs Verilog
oltmansg at bellsouth.net
Tue Feb 9 19:18:32 CST 2010
I agree with this. I think C++ is a powerful tool, but as the joke goes, "When you shoot yourself in the foot in C, you blow a hole in your foot. When you shoot yourself in the foot in C++, your foot is obliterated." I'm sure the presentation of the original is funnier. ;)
On Feb 9, 2010, at 7:00 PM, George Currie wrote:
> geoffrey oltmans wrote:
>> I'm not so sure that the overhead for C++ objects is quite as bad as everyone's making it out to be. Most of it boils down to simple pointer math, and in other cases the compiler itself can abstract away some of the offset calculations. If the variables lived on the stack anyway, you'd have to do these offset calculations to get at the data as well.
> I think the bigger point is that there are _many_ places in modern apps where TONS of time is being wasted, such that the overhead of vtables (or more importantly, the lost functionality from doing away with them) is relatively minor (it depends in large part on the app, of course). Heck, in a modern OS, every time I type a character, like in this email, there are about a bazillion things that happen (proper font/face chosen, rendering, line formatting checked, spell checking, text identification and highlighting (e.g. urls), etc, etc, etc) compared to a character mode editor on a CLI. All of that is obviously going to drain huge numbers of cycles. Now you could certainly make that entire chain more efficient by minimizing the effects of the vtable, but it's the features themselves that factor into the overall feeling that the modern OS is less responsive (or at least no more responsive) than those of days gone by.
> I happen to work at a job where we _do_ count clock cycles and we'll spend much time shaving microseconds from our code. We even study and understand the effects of processor groupings in modern multiprocessor/core computers. That said, we still code in C++ but we understand when it's worthwhile to shave, when it's not worthwhile getting that detailed and when it's worthwhile to change the algorithm vs trying to shave a few nanoseconds by avoiding vtables or aligning your memory accesses.
> I grew up doing assembler and C. That said, the biggest issue with software today (imho) isn't really speed (the folks at places like Intel and nVidia will help us there); it's really stability and functionality. The goal of a lot of the bloat is to attempt to help in those areas (with varying levels of success, of course).
More information about the cctalk mailing list