A computer is a general-purpose device that can be programmed to carry out a set of arithmetic or logical operations automatically. Computers are used for word processing, images, music, video, games, research, the internet, programming, business accounting, and just about anything and everything else nowadays.
The First Computer: ABACUS
The abacus is a device used for simple arithmetic. It consists of beads that slide along rods, and people perform their calculations by moving the beads. It is commonly credited to the Chinese around 2400 BC, although some claim the Babylonians invented it. It reached Europe and the Americas through overseas trade, and it is an ancestor of the computer.
1600s - present
At first, calculators handled only the four basic arithmetic operations; they were later improved and used for all kinds of mathematical work. In fact, the earliest calculators were abacuses. Wilhelm Schickard invented the first calculator that could perform the four operations in 1623, and it was used for astronomical and cartographic work. Blaise Pascal designed a calculator of his own in 1642, and Gottfried Leibniz invented another in the 1670s. They were all different from one another, and none of them came into widespread use.
1940s - 1950s
FIRST GENERATION COMPUTERS
The first electronic computers used vacuum tubes, and they were huge and complex. The first general-purpose electronic computer was ENIAC (Electronic Numerical Integrator And Computer). It was digital, although it operated in decimal rather than binary, and it could be reprogrammed to solve a full range of computing problems. It was programmed using plugboards and switches, and it supported input from an IBM card reader and output to an IBM card punch. It occupied 167 square meters, weighed 27 tons, and consumed about 150 kilowatts of power, using thousands of vacuum tubes along with crystal diodes, relays, resistors, and capacitors.
1955 - 1960s
SECOND GENERATION COMPUTERS
The second generation of computers came about thanks to the invention of the transistor, which then started replacing vacuum tubes in computer design. Transistor computers consumed far less power, produced far less heat, and were much smaller compared to the first generation, albeit still big by today’s standards.
THIRD GENERATION COMPUTERS
The invention of the integrated circuit (IC), also known as the microchip, paved the way for computers as we know them today. Building circuits out of single pieces of silicon, a semiconductor, made them much smaller and more practical to produce, and it started the ongoing process of integrating an ever larger number of transistors onto a single chip. During the sixties, microchips started making their way into computers, but the process was gradual, and second-generation computers held on for some time.
1971 - present
FOURTH GENERATION COMPUTERS
The first microchip-based central processing units consisted of multiple chips for different CPU components. The drive for ever greater integration and miniaturization led to single-chip CPUs, in which all of the necessary CPU components were placed on a single microchip, called a microprocessor. The first single-chip CPU, or microprocessor, was the Intel 4004.