Wednesday, February 4, 2009

History of Computers

Computers, as the term suggests, were originally developed for computing and calculations. However, present-day computers are more commonly used for non-numeric purposes such as graphics, music, and word processing than for numeric calculations.
A computer is an electronic device which can receive, store, and process data according to a set of instructions (known as a computer program) stored in it, to generate output.

The data could be almost anything: the names, register numbers and marks of students in a class; a song, a picture or a document; the temperature and pressure at different parts of a spacecraft; the blood pressure, ECG and other vital parameters of a patient in the critical care unit of a hospital; or the photo, fingerprint and other details of a suspected criminal.

Whatever the data may be, the computer stores and handles all of it as just numbers or digits. Hence such computers are known as Digital Computers. Almost all computers we encounter are digital computers.

There are other types of computers, like Analogue Computers and Hybrid Computers (a combination of Analogue and Digital). But the use of such computers is so specialized and limited that only specialists need to know about them. In our discussions the term ‘Computer’ will always mean Digital Computers only.

Computers store all data as numbers in electronic circuits inside them. Since electronic circuits have two distinct states (ON and OFF) which are easily produced and detected, a number system based on two digits is used for representing numbers in the computer. This system of numbers, which has only two digits, is known as the binary system, and a binary digit is called a bit (short for binary digit).
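As a quick illustration (a minimal Python sketch added here for clarity, not part of the original text), here is how a few decimal numbers look when written with only the two binary digits:

    # Print a few decimal numbers alongside their binary representations.
    # Each binary digit (bit) corresponds to one ON/OFF state in a circuit.
    for n in [1, 2, 5, 13, 255]:
        print(f"{n:>3} in decimal = {n:>8b} in binary")

For example, 13 is 1101 in binary, since 13 = 1x8 + 1x4 + 0x2 + 1x1.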

In the long history of mankind, number systems and calculation are rather recent inventions (only a few thousand years old). Before efficient number systems developed, people used sticks or stones to count their cattle or their own numbers. One stone represented one cow or one man. Addition and subtraction were done by moving the stones, and multiplication and division were done by repeated addition or subtraction. In fact, the word ‘calculate’ comes from the Latin word ‘calculus’, which means stone. The calculations which early men did with their stones were amazing. They could even calculate and correctly predict solar and lunar eclipses using their counting stones.
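The ‘repeated addition’ idea mentioned above is easy to express in code; here is a minimal Python sketch (an illustration added here, not from the original text):

    # Multiplication by repeated addition, much as early counters
    # effectively worked with their stones.
    def multiply(a, b):
        total = 0
        for _ in range(b):
            total += a          # add 'a' to the pile, 'b' times
        return total

    print(multiply(7, 6))       # 42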

Moving stones to calculate led to the development of the abacus. In the hands of an expert, an abacus is a very powerful calculating device. In 1946 a contest was held in Japan between an American using a calculator and a Japanese using an abacus, and the Japanese won. The invention of paper and the spread of the Hindu-Arabic numerals (0,1,2,3,4,5,6,7,8,9) in Europe around the 12th century led to the downfall of the abacus.

In ancient times, in large stores, there used to be a counting man called a ‘counter’. He sat in front of the store and had a table with counting lines and a set of counting stones. He moved the stones along the counting lines and thus made the calculations. Even today the term “counter” is used to denote the place where bills are prepared and money is received.

Finding the counting tables too heavy to be moved from one place to another, someone with an inventive mind drew the counting chart on a piece of thick woollen cloth and would spread it on a table while making calculations. The French word for that kind of cloth was ‘bure’; hence the table for spreading such a calculating cloth got the name ‘bureau’, and men working at the counting tables came to be called “bureaucrats”.

In 1614 John Napier, a Scotsman, invented and published logarithms. Logarithms made calculations easier, as multiplication and division could be done by addition and subtraction of the logarithms of numbers. Napier also introduced the decimal point to separate numbers into integer and decimal parts, and invented a mechanical device for multiplication and division. These devices were originally made from bone and got the nickname “Napier’s Bones”.
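The trick behind logarithms is the identity log(a x b) = log(a) + log(b): a hard multiplication becomes an easy addition followed by an antilogarithm lookup. A minimal Python sketch (added here for illustration, not part of the original text):

    import math

    a, b = 37.0, 52.0

    # Multiply by adding logarithms and then taking the antilogarithm.
    product = 10 ** (math.log10(a) + math.log10(b))
    print(product)   # approximately 1924.0
    print(a * b)     # 1924.0, for comparison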

In 1642, a 19-year-old Frenchman, Blaise Pascal, designed and built a mechanical calculating machine to help his father, a taxman who had to do a lot of calculations in his official work. This machine, built using gears and ratchets, was the first reliable calculator. Pascal is also credited with the invention of the hydraulic press. In 1968 a computer language was named after him.

In 1671, Gottfried Wilhelm Leibniz, a German, made a calculating machine as an improvement over Pascal’s design. He was a great mathematician, and he studied the binary number system, which is the basis of modern computers.

An invention which had great influence on the development of computers is the punched-card control of weaving looms. Punched cards allow only specific needles to pass through the holes, and by using a series of such cards, specific patterns could be woven. Thus these cards work as a control program for the loom. Though he was not the original inventor of this idea, Joseph Marie Jacquard perfected this card control of looms, and such looms are known as Jacquard looms. A series of such punched cards in fact formed a stored program for the loom.

The punched cards used in Jacquard looms led to the use of punched cards for storing data. They were successfully used in the census of America near the end of the 19th century. Punched cards were in wide use for feeding data and programs to computers till the early 1970s, when floppy diskettes came into being.

The English mathematician Charles Babbage laid the theoretical foundation of the modern computer. In 1822, he made a mechanical device to print tables of squares. The machine was named the “Difference Engine”. In 1833 he got a contract for constructing an automatic machine with far more capability. The machine was designed to store one thousand 50-digit numbers for calculations and to make decisions depending on the results. (It is this decision-making feature that distinguishes a computer from a calculator.) This engine was named the ‘Analytical Engine’. It was grandly conceived to be powered by a steam engine; it could punch results on a copper plate and would ring a bell when its data was exhausted.

The Analytical Engine was a purely mechanical device. Babbage worked on perfecting it for 40 years, till his death. The machine he designed was never built, because at that time no one could achieve the machining accuracy which such a complex engine required. In fact, it is believed that Charles Babbage was at least half a century ahead of his time; had he lived 50 years later, he could have seen his design taking shape and working. Though the Analytical Engine was never built, the ideas behind it are still relevant in modern computers. The idea of storing data internally was Babbage’s. The calculating part of the machine was called the “mill” (instead of grinding grain into flour, it ground numbers into results). Stamping out the results on a copper plate is the forerunner of printing out results in modern computers.

Though Babbage spent all his money and time on developing and perfecting the Analytical Engine, he had lost everything and was a poor man towards the end of his life. But one lady admired and supported him till his death: Lady Ada Augusta (Ada Lovelace), daughter of the famous English poet Lord Byron. She helped Babbage much in the design of the engine and is considered to be the first computer programmer. In 1979, a computer language called Ada was named after her; the U.S. Department of Defense adopted Ada as the standard language for its official software projects. A computer language was also named Babbage, and it was in use in Great Britain.

A Swedish printer, Georg Scheutz, built a practical but smaller version of Babbage’s machine. One such engine was purchased by an observatory in New York in 1855 to print astronomical tables. This was America’s first ‘computer’. Another Scheutz machine was built for the British Government to print insurance tables.

The English mathematician George Boole also made a significant contribution to the development of the computer. Boole laid the foundation for what is now known as Boolean Algebra, which is based on the binary system (a number system with only the two digits ‘0’ and ‘1’). Like Babbage, Boole was far ahead of his time: his algebra of logic did not find any practical use for nearly 100 years.
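To give a flavour of Boolean algebra (a small Python sketch added for illustration, not from the original text), the entire algebra works on just the two values 0 and 1, with operations such as AND, OR and NOT:

    # Truth table for the basic Boolean operations on the values 0 and 1.
    print("a b | a AND b | a OR b | NOT a")
    for a in (0, 1):
        for b in (0, 1):
            print(f"{a} {b} |    {a & b}    |   {a | b}    |   {1 - a}")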

In 1890, William S. Burroughs designed and built a successful calculator. He later added a printer to his calculator so that entries and calculations would be printed on a paper tape. He formed a company, the Burroughs Corporation, which became a major supplier of computers (and later merged into Unisys).

Cash registers with built-in calculators then came into being. The first cash register was built by James Ritty in 1878. Four years later, Jacob H. Eckert purchased Ritty’s company and added the cash drawer and bell to the cash register. This company was later purchased by John H. Patterson, who renamed the firm the National Cash Register Company. NCR is a leading international company today.

Necessity is the mother of invention, so goes the saying. Automated data processing also came into being because of the need to tabulate the data of the 1890 census of America. The results of the 1880 census were prepared manually, and it took seven and a half years for the final results to arrive. It was feared that, due to the large increase in population, the results of the 1890 census would not be completed before the next census. A competition for the data processing work was held, and the Electric Tabulating System was the winner. The idea of electric tabulation was conceived by a mechanical engineer, Herman Hollerith, in 1887.

In the Hollerith system, cards were punched with holes representing information such as the number of people in a household, their gender and their ages. The data cards were then placed under a card reader, which had rows of needles. Where there were holes, the needles would pass through and complete an electric circuit, causing the corresponding counting dial to advance by one digit. The cards were placed one by one manually. Using this Electric Tabulating System, the 1890 census could be completed in just two and a half years.

Hollerith founded a firm called the ‘Tabulating Machine Company’, which later merged with several others to become, in 1924, the world-famous International Business Machines, or IBM for short. Today IBM is the largest manufacturer of data processing equipment.

In 1905, the U.S. Census Bureau entrusted the work of developing a tabulating machine of its own to one of its engineers, James Powers. In 1911, he left the Census Bureau and started his own firm, the Powers Accounting Machine Company. Later this company became the tabulating machine division of the Remington Rand Corporation, which manufactured the first commercial computer, the UNIVAC I, in 1951 to analyse the 1950 census data.

A general-purpose calculator using relays and mechanical parts was built at Harvard University in 1944. It was designed by Howard Aiken and was called the “Automatic Sequence Controlled Calculator”, or Harvard Mark I. It was a big machine, 51 feet long and 8 feet high, and contained 760,000 parts connected by 500 miles of wire. It could add two 23-digit numbers in 0.3 seconds and multiply them in 5 seconds. It could perform all basic arithmetic operations and also calculate mathematical functions such as logarithms and sines. Although named a calculator, it was really a computer, as it had decision-making capability. A faster, all-relay successor, the Mark II, was built in 1947.

John V. Atanasoff and Clifford Berry had built a prototype electronic computing device in 1942, but ENIAC (Electronic Numerical Integrator and Computer) is considered the first general-purpose electronic computer in the world. It was commissioned in 1946 at the Moore School of Electrical Engineering, University of Pennsylvania. It was constructed by a physicist, John W. Mauchly, and an electrical engineer, J. Presper Eckert. It was a huge machine, weighing 30 tons and filling a large hall (30 x 50 feet). It had over 18,000 electronic valves and used 150 kW of power. Its cost was $400,000 at that time. When commissioned, it was 5,000 times faster than its nearest competitor, the Harvard Mark I. But by present standards it was an extremely slow machine; even a cheap computer of today is several thousand times faster. It used to fail frequently because of the large number of parts and the enormous amount of heat produced.

In one famous instance (on the relay-based Harvard Mark II, in 1947), the cause of a malfunction was found to be a moth short-circuiting a relay; when the moth was removed, the machine started functioning normally. From then on, the process of finding and rectifying defects in circuits came to be called, humorously, ‘debugging’. This term is now used for correcting software problems, and software problems themselves are called ‘bugs’.

John von Neumann is often called the father of modern computers. He developed many concepts in the design of computers which are still being applied. He pointed out the advantage of the binary over the decimal system in the construction of computers. Another important concept was the stored-program concept, in which the machine instructions are stored internally in the computer along with the data. Around 1950, a computer built on these ideas, the Electronic Discrete Variable Automatic Computer (EDVAC), was completed.

In 1949, Cambridge University, England, built an electronic computer called EDSAC (Electronic Delay Storage Automatic Calculator). This was one of the earliest stored-program computers.

In 1947, Brattain, Shockley and Bardeen of Bell Labs invented the transistor. Transistors are semiconductor solid-state devices which, unlike electronic valves, require no heater. Using them instead of valves to build computer circuits drastically reduced the size and power consumption of computers. Computers built using transistors are known as Second Generation Computers, and computers built using valves are called First Generation Computers.

Among the first fully transistorized computers was the IBM 7090, introduced in 1959; this marked the beginning of the second generation of computers. The next major development in hardware was the Integrated Circuit (IC). The first integrated circuit was built by Jack Kilby of Texas Instruments in 1958. Over the years the number of devices on a single chip went on increasing, and various generations of ICs came into being.


The IBM System/360 (1964) was among the first computers to use ICs. It also introduced the concept of the byte, which is a group of 8 bits; till then, bits and words had been used to describe memory size.
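Since a byte has 8 bits, it can represent 2^8 = 256 distinct values (0 to 255); a tiny Python sketch (added here for illustration, not from the original text):

    # A byte is 8 bits, so it can hold 2**8 = 256 different values.
    print(2 ** 8)          # 256
    print(f"{255:08b}")    # 11111111 -- the largest value in one byte
    print(f"{65:08b}")     # 01000001 -- e.g. the ASCII code for the letter 'A'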

In 1965, the PDP-8, the first minicomputer, was introduced by DEC (Digital Equipment Corporation).

In 1971, Intel, the IC manufacturer, produced the first microprocessor chip, the Intel 4004 (work on it had begun in 1969). A microprocessor is virtually “a computer on a chip”: it contains all the essential elements of a computer on a single wafer of silicon no larger than your thumbnail.

1972 saw the first 8-bit microprocessor, the Intel 8008. IBM had introduced the floppy disk in 1971, and in 1975 the first popular microcomputer, the Altair 8800, was introduced.

For thousands of years the challenge had been to build a good computing machine. Now such machines are available to all at affordable cost. The challenge now is to learn how to use the computer in the best manner.
