For reference, Shirriff describes aspects of the 8086 microcode here:
"How the 8086 processor's microcode engine works"
<https://www.righto.com/2022/11/how-8086-processors-microcode-engine.html>
It includes a reference to Wilkes's 1951 concept.
Since the S/360 used microcode, I suspect PALM used some form of
microcode as well (PALM was developed at or near Boca Raton c. 1971, but not
much is known about it -- we have its instruction set documented as early as
1972, and the "M" in PALM is said to stand for Microcode). It's just unclear
what they really had going on in those SLT modules.
But back to the original Wang question: I still can't find high resolution
images of its CPU board. In a 1991 discussion of the 2200B (linked below),
there is also mention of the system not really having an instruction set;
BASIC was all it could do, since it was "hard wired."
https://groups.google.com/g/alt.folklore.computers/c/lb0DzzDja-Y
I would think that by 1991 they knew what a ROM was and would have called it
that, so I'm still curious whether we have it right on how that system worked.
People may have had that impression about the IBM 5100, but we've shown it
has a DSP (a kind of diagnostic mode) where you can code in PALM machine code
(or even load "binary blobs" of previously stored PALM code from tape [there's
a video on it here; towards the end I load an IBM 5100 port of the one
Corti originally did for the 5110:
https://www.youtube.com/watch?v=e2GYWyZyfpE
]), and the earliest PALM instruction docs we have are from 1972. That doc
doesn't describe how many cycles each instruction takes, but I think the
IBM 5100 SLM docs do imply instructions take a variable number of cycles.
On Mon, May 5, 2025 at 4:18 PM Brian L. Stuart via cctalk <
cctalk(a)classiccmp.org> wrote:
On Mon, May 05, 2025 at 04:09:29PM +0100, David Wade
via cctalk wrote:
I think "Microprogramming" as a technique has been around as long as we
have had computers. Couldn't setting up ENIAC to behave like a stored
program computer in 1948 be described as "Microprogramming"?
That's an interesting question, but I'd say yes. I base that
on the idea that microprogramming is essentially programming
one universal machine to emulate another universal machine
with the purpose of using the programming model of the second
machine as one that is more convenient than that of the
first. Of course, it doesn't look at all like the microprogramming
we're used to, but I'd say it still applies.
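In modern terms, that idea of programming one universal machine to present the programming model of another can be sketched as a tiny interpreter; here is a minimal, purely illustrative Python example (the guest instruction set and register names are all invented, not from any real machine):

```python
# A "host" machine (plain Python) interpreting a made-up "guest"
# instruction set -- the essence of using one machine to provide
# another machine's programming model. All opcodes are illustrative.

def run(program):
    regs = {"A": 0, "B": 0}   # the guest machine's registers
    pc = 0                    # guest program counter
    while pc < len(program):
        op, *args = program[pc]
        if op == "LOAD":      # LOAD reg, value
            regs[args[0]] = args[1]
        elif op == "ADD":     # ADD dst, src  (dst += src)
            regs[args[0]] += regs[args[1]]
        elif op == "HALT":
            break
        pc += 1
    return regs

# Guest program: A = 2; B = 3; A = A + B
print(run([("LOAD", "A", 2), ("LOAD", "B", 3), ("ADD", "A", "B"), ("HALT",)]))
# {'A': 5, 'B': 3}
```

The convenience argument is visible even at this scale: the guest program is written against a register model the host machine doesn't natively have.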
> The Zuse Z1 from 1936(!) was microcoded, too. It implemented, for
> example, floating-point arithmetic and conversion instructions
> (binary<-->decimal).
>
> Christian
I'd add that we can go back even farther. Babbage included
a mechanism on the analytical engine for the more complex
operations that was effectively microcode. It was implemented
with a cylinder (referred to as a barrel) that you could
screw blocks into. A set of levers was pressed against
a line of block positions along the length of the cylinder,
and the presence or absence of a block would determine
whether the connected mechanism was engaged. Then the cylinder
was turned one step and the process repeated. The whole
thing ends up being a lot like typical horizontal microcode.
BLS
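The barrel-and-block mechanism described above does map neatly onto horizontal microcode: each row along the cylinder is one microinstruction, with one bit per control line and no encoding. A minimal Python sketch of that idea (the control-line names and the microprogram are invented for illustration, not taken from any real machine):

```python
# Illustrative sketch of horizontal microcode, in the spirit of
# Babbage's barrel: each row is a full set of control bits, and a
# block (1 bit) at a position means "engage that mechanism".
# All names here are made up for illustration.

CONTROL_LINES = ["latch_a", "latch_b", "alu_add", "write_back"]

# One microprogram: each step is one row of the barrel
# (horizontal microcode: one bit per control line, no decoding).
ADD_MICROPROGRAM = [
    # latch_a latch_b alu_add write_back
    [1, 0, 0, 0],   # step 0: latch first operand
    [0, 1, 0, 0],   # step 1: latch second operand
    [0, 0, 1, 0],   # step 2: engage the adder
    [0, 0, 0, 1],   # step 3: write the result back
]

def run_barrel(microprogram):
    """Turn the barrel one row at a time, reporting which
    mechanisms each row engages."""
    trace = []
    for row in microprogram:
        engaged = [name for name, bit in zip(CONTROL_LINES, row) if bit]
        trace.append(engaged)
    return trace

print(run_barrel(ADD_MICROPROGRAM))
# [['latch_a'], ['latch_b'], ['alu_add'], ['write_back']]
```

Turning the cylinder one step corresponds to advancing to the next microinstruction, exactly as Brian describes.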