When was the first mainframe made?

Sure, it was fast. But it had very little storage. More than that, it had to be reprogrammed by re-wiring it, which could take hours or even days, and it was inherently unreliable because it used so many vacuum tubes. Besides being failure-prone, vacuum tubes also used a lot of power, required a lot of space, and generated a lot of heat.

Clearly, minimizing their use would have multiple advantages. The EDVAC featured two important conceptual changes, one of which was revolutionary, that seem very obvious today. First, it operated on binary rather than decimal numbers. Second, rather than rewiring the machine every time you wanted to change the "program," the EDVAC introduced the idea of storing the program in memory, just as if it were data.

This is what we do today. We do not, after all, have separate RAM areas for applications and for their data (although L1 caches typically operate this way). The processor knows, based on the context in which the memory was accessed, whether it is data or an executable instruction. In addition, memory no longer consisted of vacuum tubes; instead, bits were stored as acoustic pulses circulating through tubes of mercury. The mercury delay line was far more efficient in terms of the electronics necessary to store data, which made much larger amounts of memory feasible and more reliable.
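The stored-program idea is easy to demonstrate. The sketch below is a toy in Python with an invented instruction set (it does not model the EDVAC or any real machine): instructions and data share one memory array, and only the context of each access, an instruction fetch versus an operand read, distinguishes them.

```python
# Toy stored-program machine: code and data live in the SAME memory.
# The opcodes (LOAD, ADD, STORE, HALT) are invented for illustration.

memory = [
    ("LOAD", 4),   # addr 0: load the value at address 4 into the accumulator
    ("ADD", 5),    # addr 1: add the value at address 5
    ("STORE", 6),  # addr 2: write the accumulator to address 6
    ("HALT", 0),   # addr 3: stop
    2,             # addr 4: data
    3,             # addr 5: data
    0,             # addr 6: result goes here
]

def run(mem):
    acc, pc = 0, 0
    while True:
        op, arg = mem[pc]      # fetched -> this cell is treated as an instruction
        pc += 1
        if op == "LOAD":
            acc = mem[arg]     # read as an operand -> this cell is treated as data
        elif op == "ADD":
            acc += mem[arg]
        elif op == "STORE":
            mem[arg] = acc
        elif op == "HALT":
            return mem

run(memory)
print(memory[6])  # → 5
```

Because the program is just memory contents, changing the "wiring" is now as simple as writing different values into the array, which is exactly the advantage over the ENIAC's plugboards.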

It was a binary stored-program computer, which could be programmed much more quickly than the ENIAC could. It was also much smaller, weighing less than nine tons, and consumed "only" 56 kilowatts of power. Even still, our two heroes were not done yet.

The dynamic duo, however, wanted to explore the commercial opportunities that this new technology offered, which was not possible with university-sponsored research. So they incorporated their own company in 1947, calling it the Eckert-Mauchly Computer Corp., and developed a computer that built on their ideas for the EDVAC and even superseded them.

The UNIVAC was the first-ever commercial computer, 46 units of which were sold to businesses and government after its introduction. All machines before it were unique, meaning only one of each was ever made. Eckert and Mauchly correctly concluded that a computer could be used not only for computations, but also for data processing, while many of their contemporaries found the idea of using the same machine for solving differential equations and paying bills to be absurd.

On a lower level, the UNIVAC consisted of about 5,200 vacuum tubes (almost all in the processor), weighed 29,000 pounds, consumed 125 kW, and ran at a whopping 2.25 MHz. It was capable of doing around 465 multiplications per second and could hold 1,000 words in its mercury delay-line memory. Each word of memory could contain two instructions, an 11-digit number and sign, or 12 alphabetical characters.
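A fixed-size word holding either instructions, a signed number, or text may sound abstract, so here is a toy Python illustration of interpreting one 12-character word three different ways depending on context. The instruction layout shown (two 6-character halves) is a simplification for illustration, not the UNIVAC's exact encoding.

```python
# One 12-character "word", three context-dependent interpretations.
# The formats here are illustrative, not the UNIVAC's real encodings.

def as_two_instructions(word):
    """Split the word into two 6-character instruction halves."""
    return word[:6], word[6:]

def as_signed_number(word):
    """Treat the word as a sign character followed by 11 digits."""
    sign = -1 if word[0] == "-" else 1
    return sign * int(word[1:])

def as_text(word):
    """Treat the word as 12 plain characters."""
    return word

print(as_two_instructions("B00100A00101"))  # → ('B00100', 'A00101')
print(as_signed_number("-00000000042"))     # → -42
print(as_text("HELLO WORLD "))              # → 'HELLO WORLD '
```

As with the stored-program concept itself, nothing in the word marks which interpretation is "correct"; the machine's context decides.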

Its processing speed was roughly on par with the ENIAC's. But in virtually every other way, it was better. On top of this, the "Automatic" in its name alluded to how little human effort it required to run: all data was stored on and read from a metal tape drive, as opposed to having to manually load programs from paper tapes or punched cards each time they were to be run.

There were other niceties that made their appearance on the UNIVAC as well, like buffers (similar to a cache) between the relatively fast delay lines and the relatively slow tape drives, extra bits for data checking, and the aforementioned ability to operate on both numbers and alphabetical characters. This, and the fact that it was the first commercially available computer, gave Remington Rand (which had bought EMCC) a very strong position in the burgeoning electronic computer industry.
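The buffering idea is still with us: batch transfers to a slow device through a fast intermediate buffer so the processor is not stalled on every word. The sketch below is a modern Python toy, not UNIVAC hardware; the 60-word block size matches accounts of the UNIVAC's tape format, but everything else is illustrative.

```python
# Hedged sketch of buffered output: collect words in a fast buffer,
# then move them to the slow "tape" one whole block at a time.

BLOCK = 60        # words per block (the UNIVAC moved tape data in 60-word blocks)
buffer = []       # fast intermediate storage (the "cache-like" buffer)
tape = []         # stand-in for the slow tape drive; each entry is one block

def write_word(word):
    """Cheap per-word operation: just append to the fast buffer."""
    buffer.append(word)
    if len(buffer) == BLOCK:
        flush()

def flush():
    """Expensive operation: one slow transfer moves a whole block."""
    if buffer:
        tape.append(list(buffer))
        buffer.clear()

for w in range(120):      # write 120 words...
    write_word(w)
print(len(tape))          # → 2 (...which cost only two slow tape transfers)
```

The same pattern, under names like write-back caching or I/O buffering, underlies everything from disk controllers to `stdio`.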

But what was IBM doing at this time? While most of our esteemed readers have a good idea of IBM's dominance in the world of computing from the mid to late 20th century, what may be less known is where it started, how and why it happened, and how it progressed. IBM's answer to the UNIVAC was the 701. Its memory was not stored in a mercury delay line, but in 3" vacuum tubes referred to as "Williams tubes," in deference to their inventor, Freddie Williams.

Although they were more reliable than normal vacuum tubes, they still proved to be the greatest source of unreliability in the computer. However, one benefit was that all bits of a word could be retrieved at once, as opposed to the UNIVAC's mercury delay lines, where memory was read bit by bit. The 701 could also execute almost 17,000 additions and subtractions, as well as most other instructions, per second, which was remarkable for the time. IBM's eight-million-byte tape drive was also very good: it could stop and start much faster than the UNIVAC's and was capable of reading or writing 12,500 digits per second.

It allowed data to be quickly read from anywhere on the disk and could be attached not just to the 305 RAMAC, but to IBM's other computers, including the 650, which we will look at next. As most of you no doubt realize, this technology is the progenitor of the hard disks that are very much with us today. Also, as mentioned, the disk was only part of IBM's response; the 650 was the other. While IBM's more direct response to the UNIVAC was the 701 and later the 702, it also was working on a lower-end machine known as the 650 Magnetic Drum Data Processing Machine, so named because it employed a rotating drum that spun at 12,500 revolutions per minute and could store 2,000 10-digit numbers.

It was positioned somewhere between the big mainframes like the 701 and UNIVAC and the punched-card machines used at the time, the latter of which still dominated the market.

While the 701 generated most of the excitement, the 650 earned most of the money and did much more to establish IBM as a player in the electronic computer industry. In total, over 2,000 of these machines were built and leased. While this greatly exceeded the 701's and UNIVAC's deployment, it was paltry compared to the number of punched-card accounting machines that IBM sold during the same period.

Although very reliable by computer standards, it still used vacuum tubes and thus was inherently less reliable than IBM's electromechanical accounting machines. On top of this, it was considerably more expensive.

Finally, the peripherals for the machine were mediocre at best. So, right up to the end of the 1950s, IBM's dominant machine was the punched-card 407 Accounting Machine. To displace it, the computer would need better peripherals and had to become more reliable and faster, while costing less. Our next machine is not the computer that finally banished the 407 into obsolescence--at least not directly--but many of the technologies that were developed for it did.

The Whirlwind project was ironic. It went way over budget, took much longer than intended, and was never used in its intended role, yet it was arguably one of the most important technological achievements in the computer field. In 1944, when the US Navy gave MIT's Jay Forrester the Whirlwind project, he was told to create a simulator to train aircraft pilots rather than have them learn by actually being in a plane. This intended use was very significant in that it required what we now call a "real-time system," as the simulator had to react quickly enough to simulate reality.

While other engineers were developing machines that could process 1,000 to 10,000 instructions per second, Forrester had to create a machine capable of a minimum of 50,000 instructions per second. On top of this, because it was a real-time system, reliability had to be significantly higher than in other systems of that time. The project dragged on for many years, long after World War II had ended.

By that time, the idea of using it for a flight simulator had disappeared, and for a while, no one was quite sure what the machine was being developed to do. That is, until the Soviets detonated their first nuclear bomb in 1949 and the U.S. realized it needed a way to defend against a nuclear air attack. One part of this was to develop computer-based command-and-control centers. The Whirlwind had a new life, and with so much at stake, funding would never be a problem.

Memory, however, was a problem. The mercury delay line that others were using was far too slow, so Forrester decided to try a promising technology: electrostatic storage tubes. One problem he faced was that they did not yet exist, so a lot of development work had to be put in before he would have a working product. And once they were completed, electrostatic storage tubes were deemed unreliable, and their storage capacity was very disappointing.

Consequently, Forrester, who was always looking for better technology, started work on what would later be called "core memory." It was very fast, very reliable, and did not even require electrical refreshes to hold its values. We'll talk more about core memory later, but suffice it to say, it was an extremely important breakthrough that quickly became the standard for well over a decade.

He took this idea to IBM, which was tasked with figuring out how to create this computer. Finally, in 1944, the device was completed. It was only capable of doing three additions or subtractions per second and was a bit of a letdown to the community. However, it represented the first fully automated computing machine and was a preview of the mainframes that would eventually change the world.

The ENIAC, designed by John Mauchly and J. Presper Eckert, was retired in 1955, and estimates show that it did more calculations by itself in the 10 years it was working than all of humankind had done up to that point. Its successor, the EDVAC, incorporated numerous logical improvements that were discovered during the production of the ENIAC, like high-speed serial-access memory.

The EDVAC was completed in 1949, began operating in 1951, and by 1960 was running over 20 hours a day, with error-free run time averaging eight hours. COBOL (Common Business-Oriented Language), attributed to computing pioneer Grace Hopper, is still widely used today in legacy applications deployed on the mainframe. Mainframes are often synonymous with IBM, and to paint a picture of mainframe inventors and the history of mainframes, it is important to look at Big Blue.

As mentioned previously, IBM had a hand in the first ever mainframe computer, and their influence and designs were a constant from then on. Magnetic storage — While early mainframes were based on vacuum tubes for storing data, a major innovation came to the mainframe world with the development of what was called core memory.

In place of vacuum tubes, core memory stores information magnetically. A modern high-RPM magnetic hard disk it was not, but core memory provided a faster and more reliable way to store and retrieve data than vacuum tubes. Core memory was first used in the early 1950s and soon replaced vacuum tubes entirely. (COBOL, mentioned earlier, beats even C for longevity; C originated in the early 1970s.) IBM is the name most closely associated with mainframes but, historically, the mainframe commercial ecosystem was more diverse.

More than a half-dozen companies — including Univac, General Electric, and RCA — also sold mainframes during the first few decades of mainframe computing. Linux for mainframes — Also worth noting is that, while mainframes ran special mainframe operating systems for the first decades of their history, this changed by the late 1990s. Starting in 1999, IBM began working to let Linux-based operating systems run on mainframes in place of mainframe-native systems.
