Ancient computers

yeah but actually, if you ever want to watch something on how computers work, i've found a playlist by Sebastian Lague that's inspired by Ben Eater's series and by another one called "From NAND to Tetris"
it's only four videos, but it simplifies everything by a lot
(sorry, i know this is a little off-topic)

One of the things to be grateful for this Christmas season (oh wait, the "Grateful" Season was last month on Thanksgiving :|) is that computer engineers, over the span of almost a century, have made computers so small and powerful that no one needs to toil away for hours at a time inputting 0's and 1's and replacing vacuum tubes in a computer.

yeah, i cannot bear to imagine what it'd be like if we had to do all of that to get one simple task done with technology...

When ENIAC first came out, there was a common saying that every time ENIAC performed a computation, the streetlights of Philadelphia would dim for a few moments, mainly because ENIAC drew an enormous amount of power, around 150 kW, at a time when electricity on that scale was still a novelty.

man, that's crazy
just thinking about how long people needed to spend on that thing
and also just the fact that it basically caused brownouts is absolutely insane

It was just a rumour, but it damn well might have seemed true.

I've seen it and followed along. (Had to use WINE to run it. :( )

There were already transistors when I started using computers, but I did enter ones and zeros into switches on the computer's console. I learned on an IBM 7094 (back then, three-digit model numbers were vacuum tube machines and four-digit numbers were transistor machines, so the 7090 was the same architecture as the 709, and the 7094 was almost the same but with a few differences). And then I got to use a DEC PDP-10. Different architecture, but still switches on the console.
[photos of a PDP-10]

What was it like, standing in the midst of a PDP-10, the raw power of the machine whirring around you, entering data in raw binary, no abstraction weighing you down (except UNIX, probably)?

Well, to be fair, we didn't usually have to enter data in binary. Those keys on the console were mainly for the bootstrap process by which you got the machine to load its operating system from (in the early days) a magnetic tape. (These things [photo] above the console are readers/writers for tiny tapes, I dunno, maybe 5-inch diameter? There are also a couple of regulation-size tape drives, the kind you see in old movies, off to the right in the picture.) The console keys were also useful if you were a real system wizard trying to debug a crashed system (as opposed to just rebooting it, which is what you did if there weren't any wizards around). For other things, we could enter programs in real programming languages just like today.

These machines predate Unix quite a bit. There were three main rival operating systems: TOPS-10, the one made by DEC, who built the hardware; TENEX, the one from BBN (Bolt, Beranek, and Newman, the people who built the Arpanet, the experimental predecessor to the Internet); and ITS, the Incompatible Timesharing System (why called that? Look it up), from the MIT AI Lab, the best of the bunch, some of us thought, but not widely used outside MIT. There was also WAITS, the Western AI Timesharing System, from the Stanford AI Lab, but that's not in the list because it was a mod of TOPS-10 rather than an independent system, and because nobody used it outside of SAIL except for IRCAM, L'Institut de Recherche et Coordination Acoustique/Musique, in Paris, which used a modified WAITS because I'm the one they hired to run their system. :~)

Still, though, I find programming languages like FORTRAN and ASM very convoluted compared to what we have today. Nowadays you can just point a C++ compiler at a bare computer chip and program it in C++, and BAM! You've saved hours compared to programming in ASM.

I assume you used ASM as well.

Well, an assembler, yes. Not that one, of course.

The PDP-10 had the best ever machine language for people actually programming directly in assembler. All 16 dyadic Boolean operators, for example. Big families of instructions, e.g., the halfword transfers Hxyzw, where x (one bit) is

  • L = from left half
  • R = from right half

, y (one bit) is

  • L = to left half
  • R = to right half

, z (two bits) says what to put in the other halfword of the destination:

  • 𝜑 = leave unchanged
  • Z = set to zeros
  • O = set to all ones
  • E (for Extended) = set to the leftmost bit of the source halfword, good for two's complement numbers

, and w (two bits) is transfer from where to where:

  • 𝜑 = from memory to register
  • I (for Immediate) = from 18 bits of the instruction itself to register
  • M = from register to memory
  • B = from memory to both

where 𝜑 means "no letter," i.e. the empty string. So HLRE means "move the left halfword of the memory location operand to the right halfword of the register operand, extending the leftmost bit to the left halfword of the result," which turns an 18-bit two's complement value in the left half of a memory location into a 36-bit two's complement value in the register.
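For the curious, here's a little sketch in C of what HLRE does, modeling the 36-bit word with a uint64_t; the names HALF, WORD, and hlre are invented for the example, not anything from DEC:

    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Toy model of the PDP-10's HLRE: a 36-bit word in a uint64_t,
       with each halfword 18 bits wide. */
    #define HALF ((uint64_t)0777777)      /* 18 bits of ones */
    #define WORD ((HALF << 18) | HALF)    /* 36 bits of ones */

    static uint64_t hlre(uint64_t src)
    {
        uint64_t half = (src >> 18) & HALF;            /* L: take the left half */
        uint64_t ext  = (half >> 17) & 1 ? HALF : 0;   /* E: copy its sign bit */
        return ((ext << 18) | half) & WORD;            /* R: into the right half */
    }

    int main(void)
    {
        uint64_t mem = HALF << 18;   /* -1 as 18-bit two's complement, left half */
        uint64_t reg = hlre(mem);
        assert(reg == WORD);         /* -1 as 36-bit two's complement */
        printf("%012llo\n", (unsigned long long)reg);  /* prints 777777777777 */
        return 0;
    }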

So people who programmed in assembler loved this architecture. Sadly, various things made it hard to make the machine run fast, e.g., some of these halfword instructions, among others, required the machine to read the memory location, do some computation, and then write the result back to the same memory location, and this turns out to make it hard to pipeline the implementation. And eventually nobody programmed in assembler any more (except for game programmers who would write little bits of assembler code for the parts of the program that were run very often and therefore had to be super fast), especially once there were higher level languages such as C designed for system programming. So then architectures that were uglier but easier to pipeline, such as the early RISC ones, beat out the elegant designs.

That sounds like a joy compared to RISC-V assembly!

Have you seen the KIM-1?

Oh yeah.

Sorry! I just meant earlier: "What was it like working with a room-sized computer?"

No problem! Actually in the really old days you weren't allowed in the machine room; you left your deck of punched cards in another room, and the operator would pick it up and bring it in to the machine room. I'm not sure if they were afraid that klutzy programmers would bump into the console switches by accident, or that off-leash programmers would mess with the switches on purpose!

No, I've never heard of the KIM-1 (except now I have, by way of Wikipedia). But of course the Apple II and the Atari 800 were also 6502-based.

I'm actually doing a website about the invention of the internet for nhd.org!

Cool -- be sure to look up the Finger protocol.
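The whole protocol is just a one-line query sent over TCP port 79 (RFC 1288). Here's a minimal client sketch in C using POSIX sockets; the default host below is only a placeholder:

    #include <netdb.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        const char *host = argc > 1 ? argv[1] : "example.com"; /* placeholder */
        const char *user = argc > 2 ? argv[2] : ""; /* empty query lists users */

        struct addrinfo hints = {0}, *res;
        hints.ai_socktype = SOCK_STREAM;
        if (getaddrinfo(host, "79", &hints, &res) != 0)
            return 1;

        int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
        if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0)
            return 1;

        /* The entire protocol: one query line, terminated by CRLF */
        char query[256];
        snprintf(query, sizeof query, "%s\r\n", user);
        write(fd, query, strlen(query));

        /* The server answers with free-form text and closes the connection */
        char buf[1024];
        ssize_t n;
        while ((n = read(fd, buf, sizeof buf)) > 0)
            fwrite(buf, 1, (size_t)n, stdout);

        close(fd);
        freeaddrinfo(res);
        return 0;
    }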

I think the KIM-1 was made by MOS to demo the 6502.