off topic - capatob - saratov2 computer Russian pdp8

Paul Koning paulkoning at comcast.net
Sun Jan 6 13:46:13 CST 2019



> On Jan 6, 2019, at 2:34 PM, Bob Smith via cctalk <cctalk at classiccmp.org> wrote:
> 
> With the widespread introduction of 16-bit machines, the definition
> of a byte as an 8-bit unit was accepted because ASCII supported
> character sets for multiple languages; before the 8-bit standard
> there were 6-bit and 7-bit variations of the character sets.
> Gee, what were Teletypes, like the Model 15, 19, 28?  Oh yeah, 5-level,
> or 5-bit... with no parity.

I think some of this discussion suffers from not going far enough back in history.

"Byte" was a term used a great deal in the IBM System/360 series, where it meant 8 bits; similarly "halfword" meant 16 bits.  But as was pointed out, mainframes of that era had lots of different word sizes: 27, 32, 36, 48, 60...  Some of them (perhaps not all) also used the term "byte" to mean something different.  On the PDP-10 it had a well-defined meaning: any part of a word, as operated on by the "byte" instructions -- which the VAX called "bit field instructions".  6- and 9-bit sizes were common for characters, and "byte" without further detail could have meant any of those.  In the CDC 6000 series, characters were 6 or 12 bits, and either of those could be a "byte".

"Nybble" is, as far as I can tell, a geek joke term rather than a widely used standard term.  "Halfword" is 16 bits on the IBM 360 and VAX, 18 bits on the PDP-10, and unused on the CDC 6000.  Then there are other subdivisions with uncommon names, like "parcel" (15 bits, CDC 6000 series: the unit handled by the instruction issue path).

ASCII was originally a 7-bit code.  There were other 7-bit codes at that time, like the many variations of Flexowriter codes; there were also 6-bit codes (found in typesetting systems and related equipment such as news wire service data feeds) and 5-bit codes (Telex codes, again in many variations).

	paul


