A History Of The Computer

For better or worse, computers are here to stay and they’re everywhere. They calculate our grocery bills while keeping track of store inventory, play traffic cop to millions of phone calls, and let us conduct banking transactions from virtually anywhere in the world.

But how did the computer come to be what it is today? To fully appreciate the impact of computers on our lives, it helps to know what a computer is in the first place. That involves some understanding of its evolution – how it came to be.

In terms of a date, the history of computers has no clear beginning point. By some accounts, computing devices go back as far as the early cave men who put stones in a pile to help them count.

Other popular candidates for first computer status include:

The Abacus – a device with sliding beads on a rack.
The Slide Rule – a mechanical device, composed of a ruler with sliding insert, marked with various number scales, which facilitates such calculations as division, multiplication, finding roots and finding logarithms.
The Calculator – a device that could perform a variety of computations including multiplication and division.

Historical accounts of the computer typically begin with the evolution of the calculator, which is basically the search for a device that could perform the four basic arithmetic functions of addition, subtraction, multiplication and division. Today’s computer, however, is much more than this. It has become a device that, among other things, collects and manipulates data and helps us to communicate with others.

The Early History – Mechanical Computation: 1600 – 1820

The Abacus – emerged about 5,000 years ago in Asia Minor and is still in use today; it is often cited as the first computer. The device allows users to make computations using a system of sliding beads arranged on a rack.
The Slide Rule – In 1621 William Oughtred, an English mathematician, invents the first circular slide rule – the first analog computing device.
The Pascaline – In 1642, Blaise Pascal invents what he calls a numerical wheel calculator – a brass rectangular box, also called a Pascaline. It uses eight movable dials to add sums up to eight figures long. Pascal’s device used a base of ten to accomplish this: as one dial moved ten notches, or one complete revolution, it moved the next dial – which represented the tens column – one place. When the tens dial moved one revolution, the dial representing the hundreds place moved one notch, and so on (a short sketch of this carry mechanism appears after this list).
The Mechanical Multiplier – In 1694, a German mathematician and philosopher, Gottfried Wilhelm von Leibniz, improved the Pascaline by creating a machine that could also multiply. By studying Pascal’s original notes and drawings, Leibniz developed a stepped-drum gear design, which offered an elongated version of the simple flat gear.
The Arithmometer – Around 1820, Charles Xavier Thomas de Colmar invents the arithmometer, which performs all four arithmetic functions. With its enhanced versatility, the arithmometer was widely used up until the First World War. Although later inventors refined Colmar’s calculator, he, together with Pascal and Leibniz, helped define the age of mechanical computation.
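
The decimal carry that Pascal built into his dials can be illustrated with a short sketch. The following Python snippet is a minimal illustration of the idea, not a model of the actual gearing; the function name and eight-dial layout are assumptions made for the example.

```python
# Minimal sketch of the Pascaline's decimal carry: each "dial" holds a digit 0-9,
# and a full revolution of one dial advances the next dial by one notch.

def add_on_dials(dials, amount, position=0):
    """Add `amount` to the dial at `position`, rippling carries to higher dials.

    `dials` is a list of digits, least-significant dial first.
    """
    dials = dials[:]                      # work on a copy
    dials[position] += amount
    for i in range(position, len(dials)):
        if dials[i] < 10:
            break
        carry, dials[i] = divmod(dials[i], 10)
        if i + 1 < len(dials):
            dials[i + 1] += carry         # the next dial moves one notch per full revolution
    return dials

# Eight dials, all at zero: add 7 to the units dial, 5 to the tens dial, then 6 more units.
dials = [0] * 8
dials = add_on_dials(dials, 7)            # units: 7
dials = add_on_dials(dials, 5, 1)         # tens: 5  -> value is now 57
dials = add_on_dials(dials, 6)            # 57 + 6 = 63, carrying from units to tens
print(dials)                              # [3, 6, 0, 0, 0, 0, 0, 0]
```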

A history of the computer can be partitioned into six major periods:

1600 – 1820 Mechanical Computation
1820 – 1940 The Punched Card
1940 – 1948 Vacuum Tubes
1948 – 1960 The Transistor
1960 – 1965 The Integrated Circuit
1965 – 1970 The Silicon Chip

The next two phases of the computer’s evolution show us that while the first generation of computers looked like nothing more than devices that did our counting for us, they were also devices for collecting and manipulating data. For the next century and a half – 1800 to 1948 – some of the most interesting and fundamental concepts of computer science were formulated and applied.

1804 Joseph Jacquard – invents a loom for weavers controlled by cards punched with holes, which let some strands of thread pass while blocking others. The idea of the punched card was later adapted by Charles Babbage as the first mechanical method for entering information into a computer.

1822 Charles Babbage – frustrated by all the errors he found while examining calculations for the Royal Astronomical Society, Babbage noticed that machines were best at performing tasks repeatedly without a mistake, while mathematics – particularly the production of mathematical tables – often required just such simple repetition of steps. The solution was to design a machine that could meet the needs of mathematics. His first attempt was the Difference Engine, followed 10 years later by the Analytical Engine, which was to be powered by steam. Although it was never built, the Analytical Engine included five concepts crucial to future computers:

  • An input device
  • A storage facility to hold numbers for processing
  • A processor or number calculator
  • A control unit to direct tasks to be performed
  • An output device.

1833 Augusta Ada – comes up with the idea that the Analytical Engine could be programmed using a single set of cards for repeating instructions. This is the first time the concept of computer programming was suggested, and she is considered the first computer programmer. In the 1980s, the U.S. Defense Department named the programming language Ada in her honor.

1887 Herman Hollerith – invents the first tabulating machine to use punched cards to count electronically. Hollerith brings his punch card reader into the business world, founding the Tabulating Machine Company in 1896, which later becomes IBM in 1924.

1931 Vannevar Bush – develops a calculator for solving differential equations, known as the differential analyzer.

1936 Alan Turing – writes his seminal paper describing a hypothetical digital computer, now referred to as the Turing Machine.

1938 John V. Atanasoff and Clifford Berry – envision an all-electronic computer that applies Boolean algebra to computer circuitry. Building on the work of George Boole, who showed that logical statements can be expressed as either true or false, Atanasoff and Berry extended this concept to electronic circuits in the form of on or off.
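
The link between Boole’s two-valued logic and on/off circuitry can be made concrete with a few lines of code. This is a minimal sketch in Python – the gate functions below are illustrative stand-ins, not drawn from Atanasoff and Berry’s actual design.

```python
# Boole's insight: a logical statement is either true or false.
# The electronic extension: represent true/false as a circuit that is on (1) or off (0).

def AND(a, b):
    return a & b      # on only if both inputs are on

def OR(a, b):
    return a | b      # on if either input is on

def NOT(a):
    return 1 - a      # invert the signal

def half_adder(a, b):
    """Add two one-bit numbers using nothing but the gates above."""
    total = AND(OR(a, b), NOT(AND(a, b)))   # XOR expressed with AND/OR/NOT
    carry = AND(a, b)
    return total, carry

print(half_adder(1, 1))   # (0, 1): one plus one is "10" in binary
```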

1945 John von Neumann – develops the stored program concept. His idea was to store not only the data to be processed in computer memory, but also the instructions used to process the data. This idea is considered to be among the most important in all of computer science.

The First Computers

The development of the computer was strongly influenced by World War II, when computers were used to calculate missile trajectories, decipher codes, and much more.

The Vacuum Tube

ABC 1939 – the first digital computer, designed by John Atanasoff and Clifford Berry.
Z3 1941 – invented by German engineer Konrad Zuse to design airplanes and missiles.
Colossus 1943 – was developed by the British to decode German messages.
Mark I 1944 – the first American general purpose computer controlled by programs. During development of the Mark II, a relay inside the computer failed and researchers found a moth beaten to death inside its contacts. This is thought to be the origin of the terms bug and debugging.
ENIAC 1946 – a room sized computer with 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints.
UNIVAC 1951 – developed by J. Presper Eckert and John Mauchly, and built around a central processing unit.
IBM 701 1953 – IBM’s first commercially available scientific computer.

The Transistor

Invented in 1948, the transistor radically changed the way we saw the computer. By replacing the large and cumbersome vacuum tube used in televisions and radios, it allowed the size of the computer to shrink, and it has been shrinking ever since.

IBM 1401 1960 – one of the first commercially successful computers to use transistors in place of vacuum tubes.

The Integrated Circuit

Though transistors were clearly an improvement over the vacuum tube, they still generated a great deal of heat, which damaged the computer’s sensitive internal parts. The integrated circuit eliminated this problem. Jack Kilby, an engineer with Texas Instruments, developed the integrated circuit (IC) in 1958, combining three electronic components onto a single small disc of semiconductor material. Scientists later managed to fit even more components onto a single semiconductor chip, and computers became ever smaller as more components were squeezed onto the chip. Another third-generation development was the operating system – a central program that monitored and coordinated the computer’s memory and allowed the machine to run many different programs at once.

In 1965 the IBM System/360 pioneered the use of integrated circuits, instead of discrete transistors, to store data and process instructions. The name 360 referred to the 360 degrees of a compass, suggesting applications beyond purely business or purely scientific use.

The Silicon Chip

With the arrival of the integrated circuit, the silicon chip was not far behind. Using large-scale integration techniques, hundreds of components could be put on a single chip, and better methods of integration eventually meant millions of components on a single chip. In 1969, Ted Hoff began work on the idea of placing all of the processing circuits of a computer on a single chip. His development of this idea became the microprocessor of today.

Developed in 1971, the Intel 4004 took the integrated circuit one step further by locating all the components of a computer (central processing unit, memory, and input and output controls) on a single micro-sized chip. Instead of using an integrated circuit which had to be manufactured to fit a special purpose, one microprocessor could be manufactured and then programmed to meet any number of demands. Soon everyday household items such as microwave ovens, television sets and automobiles with electronic fuel injection incorporated microprocessors.

In 1981, IBM introduced its personal computer, the IBM PC, for use in the home, office and schools. The number of personal computers more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were in use.

Two Computers on One Silicon Chip – Core Duo

In 2006, after years of unsuccessfully trying to get a faster, more energy-efficient chip from IBM and Freescale Semiconductor, Apple Computer finally decided to go with Intel’s latest generation of chip technology, the Core Duo processor. This means not only that we will be able to run Windows on a Macintosh computer, but also that we will have one of the fastest computers ever made.

Unable to develop a G5 chip suitable for a notebook, IBM could not meet the needs of Apple’s new iMacs and MacBooks. In 2003, Intel had introduced its Centrino notebook platform, featuring a chip that could boost battery life by minimizing power demand without hurting performance. In 2006 it introduced its latest generation chip, the Core Duo, which features two computing engines on a single piece of silicon. This was the final straw for Apple. The new Intel Macs should be two to three times faster than the iMac G5, and with Mac OS X plus Intel’s dual-core processor under the hood they should blow away anything from the PC world.

All, however, is not rosy. There is the issue of backward compatibility, and the possibility that PC users might run pirated versions of OS X on their cheaper non-Apple computers. Apple introduced Rosetta, which allows Intel-based Macs to run older applications, but it has avoided the issue of how it will tie its operating system to its hardware.

A Summary: Intel 4004 to Core Duo

The Intel 4004 (1971) was the first commercially available microprocessor – a 4-bit chip.

8086 (1978, 8 MHz) – a 16-bit microprocessor.
80286 (1982, up to 20 MHz) – a 16-bit microprocessor with a 24-bit address bus.
80386 (1985, 33 MHz) – a 32-bit microprocessor.
80486 (1989, 50 MHz) – the first to offer an 8KB primary cache, which reduced the need to access the system’s slower main memory, along with an integrated math coprocessor for floating-point calculations and clock doubling; later versions doubled the primary cache to 16KB.
Pentium (1993, up to 200 MHz) – a superscalar design that doubled the data bus width to 64 bits and doubled the primary cache to 32KB.
Pentium Pro (1995, 200 MHz) – introduced a RISC-style core along with an integrated Level 2 (secondary) cache on its own bus.
Pentium II (1997, up to 450 MHz) – moved the L2 cache onto a small circuit board containing the processor and 512KB of secondary cache; this assembly, called an SEC or single-edge cartridge, was designed to fit into a 242-pin slot (Slot 1).
Pentium III (1999, up to 600 MHz) – put both the L1 and L2 caches on the processor. It also introduced SSE, a SIMD extension that enables one instruction to perform the same operation on several pieces of data at the same time (a conceptual sketch follows this list); other improvements stemmed from the new 0.18 micron process.
Pentium 4 (2001, up to 2200 MHz) – introduced the NetBurst microarchitecture, whose internal pipeline of 20 stages, versus the 10 stages of the P6 microarchitecture, allowed much higher clock speeds. It also introduced a 400 MHz system bus.
Core Duo (2006) – offers two computational cores on a single piece of silicon with low power consumption at less than 25 watts, along with enhanced security features designed to help businesses defend themselves against theft of sensitive information.
Core i3/i5/i7 (2008–2010) – the Core i7 (2008), Core i5 (2009) and Core i3 (2010) processors are launched by Intel for high-end, mainstream and entry-level desktop and laptop computers, succeeding the Core Duo line.
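
The SIMD idea noted for the Pentium III above – one instruction performing the same operation on several pieces of data at once – can be illustrated with a short sketch. The Python code below uses NumPy only as a conceptual stand-in for SIMD-style vector operations; it is not the SSE instruction set itself, and the example data is made up.

```python
import numpy as np

# Scalar style: one operation per piece of data, applied one at a time.
prices = [19.99, 5.25, 3.10, 42.00]
discounted = []
for p in prices:
    discounted.append(p * 0.9)        # four separate multiplications

# SIMD style, conceptually: one operation applied to all the data at once.
# NumPy performs the whole multiplication in a single vectorized call, which
# modern CPUs typically carry out with SIMD instructions such as SSE.
prices_vec = np.array(prices)
discounted_vec = prices_vec * 0.9     # one "instruction", four results

print(discounted)
print(discounted_vec)
```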

While the history of the computer can be characterized as a search for devices that can do our calculations for us, and as a search for methods of collecting and manipulating data, it can also be characterized as a search for new methods of communicating with others. A computer, then, is a device that has enabled us to do all of these things. To do them, we have discovered, it must be a device which accepts input data, performs operations or computations on the data in a prearranged sequence or program, and provides the result as an output or action. This broad definition includes everything from purely mechanical devices to the sophisticated computer we use today.
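
That broad definition – accept input, process it according to a prearranged sequence or program, provide an output – can be captured in a few lines of code. This is a minimal sketch of the idea in Python, not a description of any particular machine; the "program" here is simply a fixed sequence of operations chosen for the example.

```python
# A computer in miniature: accept input data, run it through a prearranged
# sequence of operations (the program), and provide the result as output.

def program(data):
    """The prearranged sequence: square each number, then sum the results."""
    squared = [x * x for x in data]   # operation 1
    return sum(squared)               # operation 2

def run(input_data):
    result = program(input_data)      # processing
    print("Output:", result)          # output
    return result

run([1, 2, 3, 4])                     # prints: Output: 30
```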

Defining the Computer

A computer is a lot like a car. Both respond to you when you do something to them: a car goes faster when you step on the accelerator, and a computer loads Windows when you turn it on. Both are machines, which means that without human input they cannot function. Both depend on human input, and this is what makes the end user the most important component of what we shall call a computer system.

It is a principle of any computer system that an end user must act before the computer can respond. Consequently, doing something is always better than doing nothing. However, this act cannot be arbitrary. The end user cannot do anything she pleases; she has to do the correct thing or the computer will not respond as expected. This brings us to one of the greatest barriers to becoming a competent end user. An end user is in the strange position of knowing that she has to do something while not knowing what to do. This unknown is a major source of fear, which inhibits the learning process.

The end user lives with the fear that she could do something that short-circuits the computer system. To get over this fear, the end user has to realize – and be comforted by the thought – that mistakes are necessary and that there is nothing she can do that cannot be corrected. Short of throwing the computer in the bathtub, the end user should be encouraged to treat the computer like a new toy with a lot of buttons: she has to start pushing buttons and taking risks. This will be a frustrating experience, but it is the first step to becoming an end user. Do not think of the computer system as just an object or thing, but as a process that involves two events: the action of an end user and the response of the machine to that action.

A computer system, therefore, is a series of inputs and outputs. An event like the clicking of a mouse or the pressing of a key on a keyboard is the input, while the output could be a printout or the loading of Windows. This process, or system, is made possible by computer programming – which is an entirely different story.