It might be hard to believe, but computers have been in existence for thousands of years. The earliest computers were nothing like what we think of as a computer today, though. The word “computer” was actually first recorded in 1613, but it was used to describe a person making calculations. It was not until the end of the 1800s that it started to be used to describe a machine.
Computers developed from a variety of different devices capable of computing. The first was the Sumerian abacus, the first “machine” used to count and calculate. This was followed by the Greek Antikythera mechanism, an ancient forerunner of the computer built around 80 BC. Slide rules, electronic calculators, and the astrolabe also played a role in the eventual development of the modern computer. Another notable episode occurred during the 10th century, when a French monk returned from Spain with accounts of a machine reportedly able to answer yes-or-no questions.
The Mechanical Calculator
The Renaissance era was an important time in the development of the computer. In 1642, Blaise Pascal invented the mechanical calculator, a device capable of performing addition and subtraction directly; later machines extended this to all four arithmetic operations. The mechanical calculator became the basis of the modern computer in two different ways. First, it was the effort to build ever more powerful calculators that led to the development of the computer. Second, the market for low-cost electronic calculators led Intel to produce the first commercially available microprocessor.
Toward the end of the 19th century, Herman Hollerith invented a method for recording data on a machine-readable medium. His machine used punched cards and led to the invention of the tabulator and the keypunch machine. Together, these three inventions laid the foundation for modern information processing. The 1890 U.S. Census was processed with Hollerith's mechanized equipment, and his company later became one of the firms that merged to form IBM.
Methods of computing continued to evolve throughout the 20th century. Analog computers, invented during the first half of the century, used direct mechanical and electrical models for computation; they were not programmable and had little versatility. In 1936, Alan Turing formalized the concepts of algorithm and computation with his invention, the Turing machine, which served as a blueprint for electronic digital computing. Turing is considered one of the 100 most influential people of the century, and many believe modern computers would not exist without his work.
Modern computers use integrated circuits, which have billions of times the capability of the original computers while taking up only a fraction of the space. Today, computers are small enough to carry around, and they are built into cellular phones and music and video devices.