# A Very Brief History of Computer Science

### Before 1900

People have been using mechanical devices to aid calculation for thousands of years.

For example, the abacus probably existed in Babylonia (present-day Iraq) about 3000 B.C.E. The ancient Greeks developed some very sophisticated analog computers. In 1901, an ancient Greek shipwreck was discovered off the island of Antikythera. Inside was a salt-encrusted device, now called the Antikythera mechanism, consisting of rusted metal gears and pointers.



John Napier (1550-1617), the Scottish inventor of logarithms, invented Napier's rods (sometimes called "Napier's bones") to simplify the task of multiplication. In 1641 the French mathematician and philosopher Blaise Pascal (1623-1662) built a mechanical adding machine.

Similar work was done by Gottfried Wilhelm Leibniz (1646-1716), who also advocated use of the binary system for doing calculations. Around 1623, Wilhelm Schickard (1592-1635) built a mechanical calculating machine; a brief description of the device is contained in two letters to Johannes Kepler. Unfortunately, at least one copy of the machine burned up in a fire, and Schickard himself died of bubonic plague in 1635, during the Thirty Years' War. Joseph-Marie Jacquard (1752-1834) invented a loom that could weave complicated patterns described by holes in punched cards.

Charles Babbage (1791-1871) worked on two mechanical devices: the Difference Engine and the far more ambitious Analytical Engine (a precursor of the modern digital computer), but neither worked satisfactorily. Babbage was a bit of an eccentric -- one biographer calls him an "irascible genius" -- and was probably the model for Daniel Doyce in Charles Dickens' novel Little Dorrit. A little-known fact about Babbage is that he invented the science of dendrochronology -- tree-ring dating -- but never pursued his invention.

In his later years, Babbage devoted much of his time to the persecution of street musicians (organ-grinders). One of Babbage's friends, Ada Augusta Byron, Countess of Lovelace (1815-1852), is sometimes called the "first programmer" because of a report she wrote on Babbage's machine.

The programming language Ada was named for her. William Stanley Jevons (1835-1882), a British economist and logician, built a machine in 1869 to solve logic problems. It was "the first such machine with sufficient power to solve a complicated problem faster than the problem could be solved without the machine's aid." Herman Hollerith (1860-1929) invented the modern punched card for use in a machine he designed to help tabulate the 1890 census.

Work on calculating machines continued. Some special-purpose calculating machines were built. For example, in 1919, E. O. Carissan (1880-1925), a lieutenant in the French infantry, designed and had built a marvelous mechanical device for factoring integers and testing them for primality. The Spaniard Leonardo Torres y Quevedo (1852-1936) built some electromechanical calculating devices, including one that played simple chess endgames.

In 1928, David Hilbert (1862-1943) posed three questions: is mathematics complete, is it consistent, and is it decidable? This last question was called the Entscheidungsproblem.

### 1900 - 1939: The Rise of Mathematics

In 1931, Kurt Gödel (1906-1978) showed that every sufficiently powerful formal system is either inconsistent or incomplete. Also, if an axiom system is consistent, this consistency cannot be proved within itself.

The third question remained open, with 'provable' substituted for 'true'. In 1936, Alan Turing (1912-1954) provided a solution to Hilbert's Entscheidungsproblem by constructing a formal model of a computer -- the Turing machine -- and showing that there were problems such a machine could not solve. One such problem is the so-called "halting problem": given a program and an input, decide whether the program eventually halts on that input.

### 1940's: Wartime Brings the Birth of the Electronic Digital Computer

The calculations required for ballistics during World War II spurred the development of the general-purpose electronic digital computer.

At Harvard, Howard H. Aiken (1900-1973) built the Mark I electromechanical computer in 1944, with the assistance of IBM. Military code-breaking also led to computational projects.


Alan Turing was involved in the breaking of the code behind the German machine, the Enigma, at Bletchley Park in England. The British built a computing device, the Colossus, to assist with code-breaking. At Iowa State University in 1939, John Vincent Atanasoff (1904-1995) and Clifford Berry designed and built an electronic computer for solving systems of linear equations, but it never worked properly. Atanasoff discussed his invention with John William Mauchly (1907-1980), who later, with J. Presper Eckert, designed and built the ENIAC, an electronic general-purpose digital computer.

Exactly what ideas Mauchly got from Atanasoff is not completely clear, and whether Atanasoff or Mauchly and Eckert deserve credit as the originators of the electronic digital computer was the subject of legal battles and ongoing historical debate. John von Neumann's report, "First Draft of a Report on the EDVAC", was very influential and contains many of the ideas still used in most modern digital computers, including a mergesort routine.
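The mergesort idea works by splitting the input in half, sorting each half recursively, and merging the two sorted halves. A minimal Python sketch of the technique (modern illustrative code, of course, not the routine from the EDVAC report):

```python
def merge(left, right):
    """Merge two already-sorted lists into one sorted list."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])   # at most one of these
    out.extend(right[j:])  # extends is non-empty
    return out

def mergesort(xs):
    """Sort by recursively splitting and merging halves."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    return merge(mergesort(xs[:mid]), mergesort(xs[mid:]))
```

The merge step does linear work per level of recursion, giving the O(n log n) running time that makes the method attractive.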

Meanwhile, in Germany, Konrad Zuse (1910-1995) built the first operational, general-purpose, program-controlled calculator, the Z3, in 1941; because of the scarcity of material during the war, Zuse used discarded movie film as punched tape. In 1945, Vannevar Bush published a surprisingly prescient article in the Atlantic Monthly about the ways information processing would affect the society of the future. The invention of the transistor in 1947 by John Bardeen (1908-1991), Walter Brattain (1902-1987), and William Shockley (1910-1989) transformed the computer and made possible the microprocessor revolution.

For this discovery they won the 1956 Nobel Prize in physics. Shockley later became notorious for his racist views. Earlier, in 1947, Grace Murray Hopper found the first computer "bug" -- a real one -- a moth that had gotten into the Harvard Mark II. Edsger Dijkstra (1930-2002) invented an efficient algorithm for shortest paths in graphs as a demonstration of the ARMAC computer in 1956. He also invented an efficient algorithm for the minimum spanning tree in order to minimize the wiring needed for the X1 computer.
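Dijkstra's shortest-path algorithm repeatedly settles the not-yet-settled node closest to the source and relaxes that node's outgoing edges. A sketch in Python, using a binary heap as the priority queue (a later refinement, not part of the 1956 original; the dictionary graph encoding here is just one convenient choice):

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths for non-negative edge weights.

    graph: dict mapping each node to a list of (neighbor, weight) pairs.
    Returns a dict of shortest distances from source to each reachable node.
    """
    dist = {source: 0}
    heap = [(0, source)]           # (distance, node) candidates
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue               # stale entry; u already settled closer
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd       # relax edge (u, v)
                heapq.heappush(heap, (nd, v))
    return dist
```

With a binary heap the running time is O((V + E) log V); Dijkstra's original array-scanning formulation was O(V^2), which is actually better for dense graphs.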

Dijkstra is famous for his caustic, opinionated memos, including his scathing assessments of several programming languages. In a famous paper that appeared in the journal Mind in 1950, Turing introduced the Turing Test, one of the first efforts in the field of artificial intelligence.

He proposed a definition of "thinking" or "consciousness" using a game: a tester would try to distinguish, through written conversation alone, a human from a computer. If this distinction could not be made, then it could be fairly said that the computer was "thinking". In 1952, Alan Turing was arrested for "gross indecency" after a burglary led to the discovery of his affair with Arnold Murray.

Overt homosexuality was taboo in 1950's England, and Turing was forced to take estrogen "treatments" which rendered him impotent and caused him to grow breasts. On June 7, 1954, despondent over his situation, Turing committed suicide by eating an apple laced with cyanide. The term "computer science" was in fact coined by George Forsythe, a numerical analyst.

The first computer science department was formed at Purdue University in 1962. The first person to receive a Ph.D. from a computer science department was Richard Wexelblat, at the University of Pennsylvania, in December 1965. Operating systems saw major advances. The 1960's also saw the rise of automata theory and the theory of formal languages. Big names here include Noam Chomsky and Michael Rabin.


Chomsky later became well-known for his theory that language is "hard-wired" in human brains, and for his criticism of American foreign policy. Proving correctness of programs using formal methods also began to be more important in this decade. The work of Tony Hoare played an important role. Hoare also invented Quicksort.
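Quicksort picks a pivot element, partitions the remaining elements into those smaller and those larger, and recurses on each group. A short Python sketch of the idea (this list-building version is simpler than, and not equivalent to, Hoare's original in-place partitioning scheme):

```python
def quicksort(xs):
    """Sort by partitioning around a pivot and recursing on each side."""
    if len(xs) <= 1:
        return list(xs)
    pivot = xs[len(xs) // 2]
    smaller = [x for x in xs if x < pivot]
    equal   = [x for x in xs if x == pivot]
    larger  = [x for x in xs if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)
```

On average the pivot splits the input roughly in half, giving O(n log n) expected time, though a consistently bad pivot choice degrades this to O(n^2).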


Engelbart invented the computer mouse c. 1968. A rigorous mathematical basis for the analysis of algorithms began with the work of Donald Knuth (b. 1938), author of the multi-volume The Art of Computer Programming. The theory of databases saw major advances with the work of E. F. Codd on relational databases; Codd won the Turing Award in 1981. Unix, a very influential operating system, was developed at Bell Laboratories by Ken Thompson (b. 1943) and Dennis Ritchie (1941-2011). Ritchie developed C, an influential programming language, and with Brian Kernighan wrote the classic book describing it. Other new programming languages, such as Pascal (invented by Niklaus Wirth) and Ada (developed by a team led by Jean Ichbiah), arose.

The first RISC architecture was begun by John Cocke in 1975, at the Thomas J. Watson Laboratories of IBM. Similar projects started at Berkeley and Stanford around this time. The 1970's also saw the rise of the supercomputer: Seymour Cray designed the CRAY-1, first shipped in March 1976, which could perform 160 million operations in a second. The Cray X-MP came out in 1982. Cray Research was later taken over by Silicon Graphics.

There were also major advances in algorithms and computational complexity.


In 1971, Steve Cook published his seminal paper on NP-completeness, and shortly thereafter, Richard Karp showed that many natural combinatorial problems were NP-complete. In 1979, three graduate students in North Carolina developed a distributed news server which eventually became Usenet. The first computer viruses were developed c. 1981. In 1981, the first truly successful portable computer, the Osborne I, was marketed. In 1984, Apple first marketed the Macintosh computer, and the PC and clone market began to expand.

Biological computing, with the recent work of Len Adleman on doing computations via DNA, has great promise.


Quantum computing gets a boost with the discovery by Peter Shor that integer factorization can be performed efficiently on a (theoretical) quantum computer. The "Information Superhighway" links more and more computers worldwide. The WWW, originally conceived and developed for high-energy physics collaborations requiring instantaneous information sharing between physicists working in different universities and institutes all over the world, has also become a convenient way to buy and sell goods and services. Computers get smaller and smaller; the birth of nanotechnology.