I totally love this early principle [0] for RAM: literally writing electrons onto a 2D surface and then reading them back with the same electron gun.
[0] https://en.wikipedia.org/wiki/Williams_tube
Fascinating that the Williams tube (and the Atanasoff–Berry computer's electrostatic drum) is in some ways much closer to modern DRAM and flash than the ferrite cores that were so revolutionary back then (or acoustic delay lines, for that matter).
In 1940, John Vincent Atanasoff wrote a very good report about the design of the Atanasoff-Berry computer (“Computing Machine for the Solution of Large Systems of Linear Algebraic Equations”).
In this report he analyzed the possible solutions for implementing a computer memory, enumerating five kinds: mechanical bi-stable devices, electro-mechanical bi-stable devices (e.g. polarized relays), ferromagnetic materials, SRAM (vacuum-tube flip-flops), and DRAM (capacitors).
His list covered very well the most important kinds of memory used later.
The report “First Draft of a Report on the EDVAC”, written by John von Neumann in 1945, added memories made with delay lines to these alternatives (an idea presumably taken from discussions with the ENIAC team). Moreover, the von Neumann report also added a second variant of DRAM besides the one using discrete capacitors, i.e. DRAM made with a charge-storing image tube like those used at the time in video cameras (“iconoscope memory”).
Therefore the idea behind the Williams tube was taken directly from the von Neumann report. It is not known whether the idea originated with von Neumann himself or whether he got it from someone else.
See also acoustic delay lines. Basically the same, but instead of using phosphor they use mercury or piano wire - but never gin. Gin was proposed by some fanciful idiot [1] but never used.
[1] https://en.wikipedia.org/wiki/Delay-line_memory#cite_ref-3
> Was the modified ENIAC less of a computer than the Manchester Baby because its program was in ROM and could not be changed by the computer?
To me, the most remarkable property of a computer is that data and code are interchangeable. This makes it possible for the same computer to run different programs, run programs that transform programs, and so forth. It's the same fundamental concept that today means that one can "download" an app and then run it.
(See also: Lisp, which is equally remarkable in the software space for the same reason)
> Look at it this way: many modern microprocessors, especially small ones for embedded control, have their programs in ROM. If they are modern-style computers, then so was the modified ENIAC.
What makes them modern-style computers, though, is that they are capable of having their firmware flashed - or at least the development versions can, while the software is being engineered. If the final product only runs from ROM, it has lost the essence of a general-purpose computer, which is the fundamental and very remarkable invention that we actually celebrate.
"No-one would claim that a modern Harvard Architecture computer with its program stored in ROM isn’t a stored-program computer. So does ENIAC take the prize from Baby as the first electronic stored-program computer?"
I do actually claim that modern Harvard-architecture computers such as the AVR8 or PIC microcontrollers are not stored-program computers. You can't store a program in them and then execute it. To be fair, some MCUs can change their own flash, so the difference can be subtle - in that case the processor is used either normally or as part of the ISP (in-system programming) circuit at different times.
For very simple stored-program machines, the ability to modify running code is needed for Turing completeness. In a computer like the Baby, how would you add two arrays? It had no index registers, so you would need to increment the instructions that load from and store to the arrays every time you go through the loop. I agree that this isn't an issue on a machine with only 32 words of memory in total, but it is a key idea in theory.
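To make that concrete, here is a minimal sketch of the self-modification trick in Python rather than actual SSEM code; the opcode names, addresses, and array contents are invented for illustration. Because there is no index register, the loop body bumps the address fields of its own LOAD/ADD/STORE instructions to step through the arrays.

```python
# Toy von Neumann machine: code and data share one memory, so the
# program can rewrite the address fields of its own instructions.
# Illustrative sketch only - not real SSEM/Baby code.

N = 3
mem = {}                                   # unified memory: address -> value or instruction

# data: array A at 100.., array B at 200.., result C at 300..
for i, (a, b) in enumerate(zip([1, 2, 3], [10, 20, 30])):
    mem[100 + i], mem[200 + i] = a, b

# code: the loop body lives at addresses 0..5
mem[0] = ("LOAD", 100)                     # acc  = mem[100]
mem[1] = ("ADD", 200)                      # acc += mem[200]
mem[2] = ("STORE", 300)                    # mem[300] = acc
mem[3] = ("INCADDR", 0)                    # bump the address field of instruction 0
mem[4] = ("INCADDR", 1)                    # bump the address field of instruction 1
mem[5] = ("INCADDR", 2)                    # bump the address field of instruction 2

acc = 0
for _ in range(N):                         # loop control kept implicit in Python for brevity
    for pc in range(6):
        op, operand = mem[pc]
        if op == "LOAD":
            acc = mem[operand]
        elif op == "ADD":
            acc += mem[operand]
        elif op == "STORE":
            mem[operand] = acc
        elif op == "INCADDR":              # instructions are just data: rewrite one in place
            target_op, addr = mem[operand]
            mem[operand] = (target_op, addr + 1)

print([mem[300 + i] for i in range(N)])    # -> [11, 22, 33]
```

With an index register (or a Harvard split that forbids writing to code memory), the same loop would keep the element index in a register and leave the instructions untouched.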
Of course, a Harvard computer can simulate a von Neumann one (see the AVR8 simulating an ARM in order to boot Linux, where it does indeed store programs and then run them). In fact, a popular way to implement CISC computers was to build a tiny Harvard machine running a single program in its "microcode ROM" that emulated the computer you actually wanted to use.
I have never seen an MCU that cannot write its own flash memory with a new program. This is how everybody does firmware updates. At most, some MCUs have the option to blow fuses that make part of the memory immutable, but during program development the memory is still writable and used as such.
Ancient MCUs with true ROM or UV-erasable EPROM might be claimed not to be stored-program computers, but that claim is completely false for any MCU with flash memory.
The Manchester Baby was just a test device, not intended to be really useful, but its successor, the Manchester Mark 1, which became operational the next year, in 1949, already had index registers.
Nevertheless, in the next few years many computers were built that lacked index registers, despite the precedent of the Manchester Mark 1, so instruction modification during program execution was mandatory on them. After 1954, few, if any, computers remained without index registers, so instruction modification was no longer necessary.
The first ATmega, the ATmega603, needed an external device to write to its flash program memory. The following models removed this restriction.
You can play with an online simulator of this machine here: https://davidsharp.com/baby/online/index.html
The simulator was originally written in Java, and the browser version is powered by CheerpJ, a WebAssembly-based JVM (https://cheerpj.com/).
I had to write a simulator in Java for the SSEM/Manchester Baby as part of my university studies back in the 2000s. It was a fun task, and it's nice to read more about it.
> In the ENIAC the form of the [sic] that [program] storage (decimal) was quite different to that of the data that ENIAC operated on (binary).
Huh? ENIAC was a decimal machine all the way through.