Reproduction micros

Peter Corlett abuse at cabal.org.uk
Mon Jul 25 08:06:58 CDT 2016


On Mon, Jul 25, 2016 at 05:54:51AM -0600, ben wrote:
[...]
> The other factor is that the 3 big computers at the time the Dragon Book
> came out were the IBM 360/370, PDP-10 and PDP-11, thus you favored
> register-style code generators. Later you got the Pascal-style one-pass
> generators that are still with us.

Those backend designs are completely obsolete now. Modern machines have more
grunt, so compilers can afford more complex algorithms to eke out more
performance.

> After that, the Intel hardware mess and the growth of (to me useless)
> languages like C++ or Java, or of features like objects, have made
> everything so complex that only a few people can even understand just what
> a compiler is doing.

A C++ object is a C struct, and a C++ method is a C function that has an
invisible first parameter that is a pointer to the object. The generated
machine code is bitwise identical. Dynamic dispatch muddies the water a bit,
and is equivalent to (but not usually implemented as) a function pointer in the
struct. This is not exactly rocket science.
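To make that concrete, here is a minimal sketch of the equivalence. The names
are mine and purely illustrative:

    // C++ version: a struct with a method.
    struct Point {
        int x, y;
        int sum() const { return x + y; }
    };

    // What the compiler effectively emits: a plain struct plus a free
    // function taking the hidden 'this' pointer as an explicit argument.
    struct PointC { int x, y; };
    int PointC_sum(const PointC *self) { return self->x + self->y; }

    int main() {
        Point p{1, 2};
        PointC q{1, 2};
        // p.sum() compiles to much the same code as PointC_sum(&q).
        return p.sum() - PointC_sum(&q);  // 0
    }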

There are many more features in C++, and newcomers blinded by the shinies tend
to overuse them, but with good judgement it is possible to make a C++ program
*easier* to read than the equivalent C by tucking all the boilerplate out of
the way.

> PS: Has computer science really advanced since the late 1970's? What about
> hardware?

Is that a rhetorical question? I'm going to answer it anyway :)

One of my favourite bits of CS is Andrew Tridgell's PhD thesis, from 1999. His
advance was an algorithm that very efficiently runs a sliding window across
data using rolling checksums, and computes deltas in a network-efficient
manner. This is the business end of rsync(1), and it can also be used for data
compression.
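The core trick is that the weak checksum can be slid along by one byte in O(1)
rather than recomputed from scratch. A sketch of that rolling update, in the
spirit of the one in the thesis (the details here are mine, not rsync's actual
code):

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    struct Rolling {
        uint32_t a = 0, b = 0;   // running sums
        size_t len = 0;          // window length

        void init(const uint8_t *buf, size_t n) {
            a = b = 0; len = n;
            for (size_t i = 0; i < n; ++i) {
                a += buf[i];                       // plain sum of bytes
                b += (uint32_t)(n - i) * buf[i];   // position-weighted sum
            }
        }
        // Slide the window one byte: drop 'out', take in 'in', in O(1).
        void roll(uint8_t out, uint8_t in) {
            a = a - out + in;
            b = b - (uint32_t)len * out + a;
        }
        uint32_t digest() const { return (a & 0xffff) | (b << 16); }
    };

    int main() {
        std::vector<uint8_t> data{'h','e','l','l','o',' ','w','o','r','l','d'};
        const size_t w = 4;
        Rolling r;
        r.init(data.data(), w);
        for (size_t i = 0; i + w < data.size(); ++i) {
            r.roll(data[i], data[i + w]);
            std::printf("window at %zu: %08x\n", i + 1, (unsigned)r.digest());
        }
    }

rsync uses the cheap rolling checksum to find candidate block matches, and only
confirms candidates with a strong hash.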

On compilation itself, there are a few interesting advances. Static Single
Assignment (SSA) form was invented at IBM in the 1980s; ISTR it's covered in my
1986 edition of the Dragon Book. The essence of the idea is that each variable
is assigned exactly once; since the values are then immutable, many very
worthwhile optimisations become possible that wouldn't be as simple were the
values liable to change.
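For example, a sketch of the transformation in C++ terms (not actual compiler
IR, just an illustration):

    int before(int a) {
        int x = a;      // 'x' is reassigned below, so what "x" means
        x = x + 2;      // depends on where in the code you are looking.
        x = x * 3;
        return x;
    }

    int after(int a) {
        // SSA-style rewrite: every name is assigned exactly once, so
        // each one is a fixed fact the optimiser can rely on anywhere.
        const int x1 = a;
        const int x2 = x1 + 2;
        const int x3 = x2 * 3;
        return x3;
    }

    int main() { return before(1) == after(1) ? 0 : 1; }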

Functional programming is finally mainstream. Yes, Lisp has been around since
the 1950s, but the idea has been refined somewhat over the 2000s. Modern
functional languages emphasise immutability (which Lisp didn't really have),
which makes programs easier to reason about on multi-threaded and
multi-processor machines because one no longer needs to worry about shared
mutable state.
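A minimal C++ sketch of why immutability helps here; the functional languages
give you this behaviour by default:

    #include <cstdio>
    #include <memory>
    #include <thread>
    #include <vector>

    int main() {
        // Build the data once, then freeze it behind a const pointer.
        auto data = std::make_shared<const std::vector<int>>(
            std::vector<int>{1, 2, 3, 4});

        // Both threads read the same immutable structure: no locks and
        // no races, because nobody can write to it.
        auto sum = [data] {
            long s = 0;
            for (int v : *data) s += v;
            std::printf("sum = %ld\n", s);
        };
        std::thread t1(sum), t2(sum);
        t1.join();
        t2.join();
    }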

There are *loads* of novel algorithms and techniques out there, but it seems to
take decades for them to appear in algorithm textbooks.


