History of Computers

This chapter provides a quick overview of computer history, reinforced by the two PBS films “Inventing the Future” and “The Paperback Computer”, which are available on videotape. It identifies some of the technological advances you will see in the documentaries. There are two things you should especially keep in mind:

  • The progression in hardware representation of a bit of data:
    1. Vacuum tubes (1950s) – one bit the size of a thumb.
    2. Transistors (1950s and 1960s) – one bit the size of a fingernail.
    3. Integrated circuits (1960s and 70s) – thousands of bits the size of a hand.
    4. Silicon computer chips (1970s and on) – millions of bits the size of a fingernail.
  • The progression of the ease of use of computers:
    1. Almost impossible to use except by very patient geniuses (1950s).
    2. Programmable by highly trained people only (1960s and 1970s).
    3. Useable by just about anyone (1980s and on).

First Computer

The ENIAC (Electronic Numerical Integrator and Computer), designed by John W. Mauchly and J. Presper Eckert at the University of Pennsylvania, was the first important computer of its kind. Unlike prior automatic calculators and computers, the ENIAC used a word of ten decimal digits instead of binary. With over 18,000 vacuum tubes, it was the first machine to employ more than 2,000 of them; the tubes, along with the equipment needed to keep them cool, occupied a floor area of approximately 167 square meters (1,800 square feet). The machine included punch-card input and output, one multiplier, one divider/square rooter, and twenty adders, which used decimal “ring counters” as both adders and quick-access (0.0002-second) read/write storage.

The executable instructions making up a program were stored in separate units of the ENIAC, which were plugged together to form a route through the machine for the flow of computations. For each new problem, these connections, along with the function tables and switches, had to be reset by hand. This “wire-it-yourself” instruction technique was cumbersome, and the ENIAC could be called programmable only in a loose sense; nevertheless, it was efficient at running the specific programs for which it had been configured. The ENIAC is widely regarded as the first successful high-speed electronic digital computer and was in service from 1946 to 1955. In 1971, however, a patent dispute arose over the originality of ENIAC’s basic digital concepts: it was alleged that another American physicist, John V. Atanasoff of Iowa State College, had already used the same ideas in a small vacuum-tube device he built in the late 1930s. In 1973 the court upheld Atanasoff’s claim, invalidating the ENIAC patent, and Atanasoff earned the recognition he deserved.

Hardware Progression

Two technologies created in the 1950s would advance the computer industry and set the stage for the computer revolution. The first was the transistor. Invented in 1947 by William Shockley, John Bardeen, and Walter Brattain of Bell Labs, the transistor was destined to replace the vacuum tube in computers, radios, and other electronics. In 1906 the American physicist Lee De Forest had invented the triode vacuum tube, which was used in almost all computers and computing devices of the era.

The vacuum tube, roughly the size of a human thumb, works by heating a filament inside the tube until it glows cherry red, which consumes a large amount of energy. Heating the filament releases electrons into the tube, and this flow of electrons can be controlled by other elements within the tube. De Forest’s triode governed the flow of electrons to a positively charged plate inside the tube. A zero could then be represented by the absence of an electron current to the plate, and a one by the presence of a small but detectable current.

Vacuum tubes were inefficient, took up a lot of space, and had to be replaced frequently. Computers of the 1940s and 1950s had 18,000 tubes in them, and housing all those tubes and cooling the rooms against the heat they produced was not cheap. The transistor promised to solve all of these problems, and it did. Transistors had problems of their own, however. The main one was that a transistor, like other electronic components, had to be soldered into place; as circuits became more sophisticated, the connections between individual transistors grew more intricate and numerous, increasing the risk of faulty wiring. In 1958, Jack St. Clair Kilby of Texas Instruments solved this problem when he created the first integrated circuit, often called a chip. A chip is a collection of tiny transistors that are connected together when the transistors are manufactured, which largely eliminated the need to solder large numbers of transistors; only the connections to other electronic components still had to be made. Besides saving space, the machine’s speed increased because the electrons had a shorter distance to travel.

Mainframes to PCs

During the 1960s, large mainframe computers became far more common in large companies and in the US military and space program. IBM established itself as the undisputed market leader in selling these huge, expensive, error-prone, and difficult-to-use machines. The personal computer boom truly began in 1977, when Steve Jobs and Steve Wozniak presented the first Apple II at the First West Coast Computer Faire in San Francisco. For only $1,298, the Apple II included a built-in BASIC programming language, color graphics, and 4 KB of memory (about 4,100 characters). Programs and data could be stored on an ordinary audio cassette recorder. By the end of the fair, Wozniak and Jobs had secured 300 orders for the Apple II, and the company took off from there.

The TRS-80, a home computer manufactured by Tandy Radio Shack, was also released in 1977. In its second incarnation, the TRS-80 Model II, it came with 64,000 characters of memory and a disk drive for storing programs and data. At that time only Apple and TRS had machines with disk drives. With the introduction of the disk drive, personal computer applications took off, as the floppy disk was a far more convenient publishing medium for distributing software.

IBM, which until then had been making mainframes and minicomputers for medium to large businesses, realized it had to get into the action and began developing the Acorn, which eventually became known as the IBM PC. The PC was the first computer designed for the home market with a modular design, allowing new parts to be added to the architecture easily. Surprisingly, most of the components came from outside IBM, since building it with IBM parts would have been too costly for the home computer market. When first released, the PC came with 16,000 characters of memory, a keyboard from an IBM electric typewriter, and a connection for a tape cassette player, all for $1,265.

Apple and IBM both released new models in 1984. Apple introduced the original Macintosh, the first widely available computer with a graphical user interface (GUI) and a mouse. Because it was easy to use, the GUI made the machine significantly more attractive to home computer users, and sales of the Macintosh soared. IBM, working hard to catch up with Apple, introduced the 286-AT, which, with applications like the Lotus 1-2-3 spreadsheet and Microsoft Word, quickly became the favorite of business users.
That brings us up to about ten years ago. Today people have powerful home computers and personal graphics workstations. The average home computer is many orders of magnitude more powerful than a machine like the ENIAC. The computer revolution has been the fastest-growing technology in human history.
