Plyojump - Computer History
Computer History - Pre-1945
An example of an abacus, a digital computer driven by fingers instead of electricity, which has been used for over 2,000 years.
The Antikythera Mechanism - A Roman-Era Analog Computer (2,000 years ago)
Excellent article on the mechanism
The Antikythera Mechanism is the only surviving example of ancient computing technology. Sometime around 2,100 years ago, Greek scientists and engineers created this device. The extreme complexity of the mechanism, as well as scattered references in classical texts, indicate that many such devices must have existed. They have not survived to our era since they were made of brass, which often was melted down and recycled.
The mechanism, as originally found in a Roman-era shipwreck.
The mechanism was finally reconstructed after using advanced X-ray and other scanning techniques to examine it.
Side view of the gears in the mechanism.
Reconstruction of the front gearset of the mechanism. Note the small balls representing the moon and sun. Additional lost gears probably indicated the positions of the five known planets at that time.
The complete mechanism, with dials indicating calendar data and important festival dates (e.g. the Olympics). The right image shows detail of the back panel - the large calendar dial, and the smaller dial recording times of Olympic festivals.
Images of the reconstructed Antikythera mechanism - a complex analog computer which predicted the motions of the stars and planets during the Roman Empire.
- YouTube video of a working model of the Antikythera Mechanism
- Copy of the Antikythera Mechanism implemented as LEGO!
- Decoding an ancient computer - Greek technology tracked the heavens
- More movies - animation and working models in Athens museum
Computing's First Dark Age (400AD-1000AD in Europe)
During this period few technical advances were made, due to the collapse of the Roman Empire and the Han empire in China. In China, computing machines were re-discovered in the early 1100s, but were wiped out during the Mongol Invasion led by Genghis Khan.
The Re-Discovery of Mechanical Computation (1500-1800 in Europe)
The astrolabe, a fairly simple mechanical computing device, was perfected in the medieval Islamic world and later used by European navigators, including the English, to establish mastery over the seas. It is used to calculate the angle of the sun, moon, and planets above the horizon to determine one's position on Earth - a sort of proto-GPS.
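The "proto-GPS" idea can be illustrated with the simplest such calculation: at local solar noon, latitude = 90° - the sun's observed altitude + the solar declination (valid for an observer north of the subsolar point). This minimal sketch is an illustration of the principle, not the astrolabe's full mechanics; the function name and example values are chosen here for demonstration.

```python
# Latitude from the noon sun's altitude - the kind of angle-to-position
# calculation an astrolabe (and later a sextant) mechanized.
# Formula (observer north of the subsolar point):
#   latitude = 90 - observed_altitude + solar_declination
def latitude_from_noon_sun(observed_altitude_deg, solar_declination_deg):
    return 90.0 - observed_altitude_deg + solar_declination_deg

# Example: at an equinox (declination ~0°), a noon sun 60° above the
# horizon puts the observer at latitude 30° north.
print(latitude_from_noon_sun(60.0, 0.0))  # 30.0
```

A navigator would read the altitude off the astrolabe's sighting arm and look up the day's declination in printed tables - the instrument replaced the arithmetic, not the observation.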
From the 1400s to the 1800s, increasingly complex mechanical clocks and simple computers were constructed. Unlike computers built after 1800, they could not be programmed. However, they could perform simple arithmetic calculations. Pascal's calculator could do addition and subtraction.
Pascal's Calculator (1642)
This is a mechanical calculator that could add and subtract, built by the philosopher Blaise Pascal in the 1640s. The system of cranks and gears is comparable in complexity to the Antikythera mechanism from some 1,700 years earlier.
The mathematician Gottfried Wilhelm von Leibniz built a better gear-driven calculating instrument, called the Stepped Reckoner. It was faster than Pascal's design, and could also multiply and divide.
However, most scientists and engineers used a much simpler analog computer called a Slide Rule. Below is an example. Slide Rules were common well into the 1970s.
Jacquard Loom - the first industrial robot (1801)
The Jacquard Loom is a mechanical loom invented by Joseph Marie Jacquard in 1801. It is controlled by cards with punched holes, each row of which corresponds to one row of the design. Changing the cards changes the design of the textile produced.
The punched cards used to encode data are to the right in the image above. Note their similarity to the IBM punched card system used 150 years later (see below).
Another close-up of the punched-card system on the Jacquard Loom - in effect the first automatic programmable robot.
This image is actually a textile created on a Jacquard Loom in 1839, using 24,000 punched cards (Wikipedia).
Charles Babbage's (1791-1871) Difference Engines (circa 1830)
English scientist Charles Babbage began to extend computing in the 1820s and 1830s. During his lifetime, he designed several mechanical computers much more complex than any made before. The most advanced of these systems included all the components found in a modern digital computer. However, they were mechanical instead of electrical - turned by crank, or by a steam engine, as in the larger Analytical Engine.
Babbage had an extraordinary career apart from his computing machines, and held many patents for new technology in the 1800s. He was also a codebreaker - he cracked the supposedly unbreakable Vigenère cipher. Code creation and codebreaking formed a major part of the drive to automate computing in the 19th and 20th centuries.
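The Vigenère cipher Babbage broke is simple to state: each plaintext letter is shifted by the corresponding letter of a repeating keyword. A minimal sketch (uppercase letters only, for illustration):

```python
# Vigenère cipher: shift each letter by the matching (repeating) key letter.
# The repetition of the key is exactly the weakness Babbage exploited.
def vigenere(text, key, decrypt=False):
    out = []
    for i, ch in enumerate(text):
        shift = ord(key[i % len(key)]) - ord('A')
        if decrypt:
            shift = -shift
        out.append(chr((ord(ch) - ord('A') + shift) % 26 + ord('A')))
    return ''.join(out)

ct = vigenere("ATTACKATDAWN", "LEMON")
print(ct)                                    # LXFOPVEFRNHR
print(vigenere(ct, "LEMON", decrypt=True))   # ATTACKATDAWN
```

Because the key repeats, identical plaintext fragments a key-length apart encrypt identically - the statistical foothold that let Babbage (and later Kasiski) recover the key length and break the cipher.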
An early form of mechanical computer developed by Babbage in the 1820s as a "test system" for his difference engines.
Difference Engine No. 2 (reconstruction from 1830s design)
A more complex difference engine designed by Babbage. This design was never fully completed, due to Babbage's fights with various individuals and with the British government, which partly sponsored the effort.
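The "difference" in Difference Engine refers to the method of finite differences: for an nth-degree polynomial the nth differences are constant, so a whole table of values can be produced using nothing but repeated addition - exactly what gear trains do well. A short sketch of the method (the function name is mine, not Babbage's terminology):

```python
# Method of finite differences, as mechanized by a difference engine:
# seed the machine with f(0) and its initial differences, then each
# turn of the crank adds each difference column into the one above it.
def tabulate(initial_differences, count):
    """initial_differences = [f(0), Δf(0), Δ²f(0), ...]; returns f(0..count-1)."""
    diffs = list(initial_differences)
    values = []
    for _ in range(count):
        values.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# f(x) = x²: f(0) = 0, Δf(0) = 1, Δ²f = 2 (constant)
print(tabulate([0, 1, 2], 6))  # [0, 1, 4, 9, 16, 25]
```

No multiplication is ever needed, which is why the same idea worked in brass and steel long before electronics.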
The Difference Engine in Action
The Difference Engine implemented as LEGO!
The Scheutz Difference Engine (1850s)
Other people built functional difference engines during the 1800s. Here is an example of a successful engine from the 1850s.
Father and son Georg and Edvard Scheutz built a difference engine, based on the ideas of British computing pioneer Charles Babbage, with the numbers impressed into papier-mâché or metal strips. These were then used as moulds from which multiple copies of the results could be printed. The engine was bought in 1859, and in 1864 the General Register Office used it to produce the 'English Life Table', a set of life assurance tables.
Babbage's Analytical Engine (fragment)
Babbage's Analytical Engine was never fully built due to cost overruns and the inventor's cranky personality. Study of the designs shows that the system would have worked, and would have been comparable to electromechanical computers built 100 years later at the end of WWII. The Harvard Mark I, built in the early 1940s, has many features borrowed from the Analytical Engine's design. If it had been built, the Analytical Engine would have incorporated memory, a central processor or "mill", an input system, and even a graphical plotter for producing charts and images from the computations.
The image on the left shows part of the "mill", or computing portion, of Babbage's engine. The right image shows the punched-card system storing data along with part of the processing unit. This component was built by Henry Provost Babbage, the youngest son of Charles Babbage, after his father's death.
Ada Byron (Lovelace) 1815-1852
Ada Byron is sometimes called the world's first programmer. She was one of the few people who fully understood the potential of Babbage's machines, and also realized that the code, or software operating them, was as important as their hardware.
Ada Lovelace was the only legitimate child of the poet Lord Byron. From 1832, when she was seventeen, her remarkable mathematical abilities began to emerge, and she knew and corresponded with many scientists of her day, including Mary Somerville, noted researcher and scientific author of the 19th century, who introduced her to Charles Babbage on 5 June 1833.
Ada Lovelace met and corresponded with Charles Babbage on many occasions, including socially and in relation to Babbage's Difference Engine and Analytical Engine. Babbage was impressed by Lovelace's intellect and writing skills. He called her "The Enchantress of Numbers".
During a nine-month period in 1842-43, Lovelace translated Italian mathematician Luigi Menabrea's memoir on Babbage's newest proposed machine, the Analytical Engine. To the article she appended a set of notes. The notes include the world's first computer program, designed for Babbage's Analytical Engine.
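That first program, in Lovelace's Note G, computed Bernoulli numbers. A modern sketch of the same calculation follows - it uses the standard recurrence rather than her exact sequence of engine operations, so treat it as an illustration of what she programmed, not a transcription of her table:

```python
from fractions import Fraction
from math import comb

# Bernoulli numbers via the standard recurrence:
#   B_0 = 1;   B_m = -1/(m+1) * Σ_{k=0}^{m-1} C(m+1, k) * B_k
def bernoulli(m):
    B = [Fraction(1)]
    for n in range(1, m + 1):
        B.append(-Fraction(1, n + 1) *
                 sum(comb(n + 1, k) * B[k] for k in range(n)))
    return B[m]

print(bernoulli(1), bernoulli(2), bernoulli(4))  # -1/2 1/6 -1/30
```

Lovelace's version had to express the same loop as a fixed sequence of mill operations and variable-card assignments - which is why her notes are considered the first program rather than just the first calculation.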
Lovelace is now widely credited as being the first computer programmer. As the unofficial patron saint of programmers, she is celebrated on Ada Lovelace Day, held each year in October.
Ada died at age thirty-six - what would have happened if she had lived, and Babbage had completed his design for the Analytical Engine and she could have tested her programs?
More in Wikipedia - http://en.wikipedia.org/wiki/Ada_Lovelace
Ada's 1842 article containing the first computer program, designed for Babbage's Analytical engine (HEAVY math).
Another good book (Amazon)
Computing's Second "Dark Age"
Between the end of the 19th century and the period just before WWII, very little work on machine computing was done. In some ways, it constitutes a "dark age" of computing. Interest in computing re-emerged in the 1930s as the world headed for a global war.
Mechanical calculators make a comeback
For the 1890 US census, Herman Hollerith developed an electro-mechanical computer to tabulate statistics.
The left image shows the Hollerith punched-card system used in the 1890 census, and a 1920s era punched-card system that created data for these mechanical calculators.
Hollerith later founded the Tabulating Machine Company, which through mergers became IBM, the main computing company of the 20th century. IBM produced ever more sophisticated "partial" computers based on the Hollerith model, but stopped short of creating a fully functional digital computer. The Hollerith tabulator below (from 1928) was typical of mechanical computing machinery in the 1920s and 1930s. To the right is an example of the 80-column punched card for storing data - a descendant of the punched cards used by the Jacquard Loom and Babbage's Analytical Engine.
A 1928 Hollerith machine.
More on the Hollerith machines and IBM's early history at - http://www.columbia.edu/acis/history/tabulator.html
The Rise of Electronic Computers
By the 1930s, it was practical to implement computing machinery using electricity instead of mechanical parts. This allowed calculation speed to jump 1000-fold, and ushered in the modern computer age.
Vannevar Bush and the Differential Analyzer (1930)
In 1930, Vannevar Bush introduced the Differential Analyzer, one of the first large-scale electrically driven computing machines in the United States. Bush worked on developing machines that could automate human thinking. This first machine was an analog device rather than a digital one - a sort of electric slide rule that did not use symbol processing the way a modern digital computer (and Babbage's engines) does.
Atanasoff-Berry Computer (1937)
Unlike the Differential Analyzer, the Atanasoff-Berry computer used digital symbols to compute. It used electricity, along with vacuum tube technology to reach speeds impossible for a mechanical computer. However, it was severely limited in the types of computations it could do.
Harvard Mark I - The first modern digital computer (1944)
Developed by Howard H. Aiken, built at IBM and shipped to Harvard in February 1944. The machine was directly inspired by Babbage's Analytical Engine - in some ways it is the realization of Babbage's vision. Howard Aiken saw himself as something close to a reincarnated Babbage, and even invited Babbage's grandson to participate in the first conference on computers held after the war.
It is a hybrid electrical-mechanical device, with a long shaft synchronizing the operation of its parts. It was made up of 78 adding machines and desk calculators connected by almost 500 miles of wire. It could perform three additions per second. It was 51 feet long and 8 feet high, and had over one million parts.
The Mark I was also unique in not being an experimental system - it was immediately put into use by the US military. Part of the complex computations needed to produce the first atomic bomb were performed on the Mark I. One of its remarkable features was its reliability - unlike the early electronic computers, it could compute 20 hours a day without breaking down. The first programmers of the Mark I were computing pioneers Richard Milton Bloch, Robert Campbell, and Grace Hopper.
Postwar, there was a serious rivalry between Aiken and IBM. IBM had built the machine, but did not fully understand what Aiken had created. The Mark I was superior to any of IBM's machines.
Grace Murray Hopper
Of the Mark I programmers - probably the first people to create and run useful software in a data center - Grace Hopper stood out. With a doctorate in mathematics, Hopper joined the Navy in WWII and was assigned to Aiken's group. She worked closely with Aiken, and in some ways the old Babbage/Byron hardware/software duo had reappeared. Hopper went on to define many features of modern computing, including the free sharing of code and high-level programming languages. Her creation of software "compilers" was critical in that it brought large numbers of new programmers into the profession in the 1950s. COBOL, the early high-level programming language that grew out of her FLOW-MATIC around 1959, is still in use - a large share of the world's business transactions are still processed by COBOL code. In part due to Hopper (and Ada Lovelace before her), computing was seen as a profession open to women, and by the 1960s computing was the preferred "high-tech" choice of many women.
An image of the first "computer bug". On September 9, 1947, Grace Hopper was working on the Harvard University Mark II and discovered that a problem with the program was actually a moth trapped between the points of Relay #70, in Panel F. When it was fixed, word got out that the program had been "debugged", and a new word in computer jargon was born.
The Navy was so proud that it eventually promoted her to Rear Admiral, and named a guided missile destroyer (seen here launching) after her!
ACE (1945)
ACE (Automatic Computing Engine): Alan Turing presented a detailed paper to the National Physical Laboratory (NPL) Executive Committee, giving the first reasonably complete design of a stored-program computer. However, because of the strict and long-lasting secrecy around his wartime work at Bletchley Park, he was prohibited (having signed the Official Secrets Act) from explaining that he knew his ideas could be implemented in an electronic device. The full ACE was never built as designed, but a scaled-down version, the Pilot ACE, ran its first program in 1950.
Turing also published the first formal descriptions of digital computing. All modern computers are a form of "Turing Machine" following the principles he described. He also helped to found the science of mathematical biology.
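The "Turing Machine" in that formal description is startlingly simple: an unbounded tape of symbols, a read/write head, and a table of state transitions. A minimal sketch (the rule format and example machine are my own choices for illustration):

```python
# A minimal Turing machine: a tape, a head, and a transition table
# mapping (state, symbol) -> (symbol_to_write, move, next_state).
def run(tape, rules, state="start", blank=" "):
    cells = dict(enumerate(tape))   # sparse tape, unbounded in both directions
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip()

# Example machine: flip every bit, halting at the first blank cell.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}
print(run("1011", rules))  # 0100
```

Turing's insight was that one such machine can simulate any other given a description of its rule table on the tape - the "universal machine" that every stored-program computer embodies.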
Alan Turing unfortunately did not live to contribute to the later growth of computing. In the 1950s he was prosecuted for his sexual orientation, leading to a court-ordered hormone treatment (chemical castration), and ultimately to his suicide in 1954. At the time in England there was acute public anxiety about spies and homosexual entrapment by Soviet agents, which contributed to the climate of intolerance.
Colossus (1943)
Developed to break German encrypted communications, ten of these machines were built in Britain during the war. For secrecy, most were destroyed, along with their plans, after the war ended.
Movie showing the Colossus rebuild:
Closeup of the Lorenz machine - a mechanical encryption device used in Germany, whose code Colossus was developed to "crack". Superiority in computing was one factor in the Allies' winning of WWII.
John von Neumann and the architecture of the modern digital computer
Vannevar Bush and the Memex (1945)
Bush later wrote an astounding article in 1945 which fully envisioned the World Wide Web 50 years before it actually appeared. Like Babbage's Analytical Engine, it was never built. He called his "web browser" a Memex. The design used electrical wires and microfilm files filling a desk to create the equivalent experience of web surfing - in particular, "hyperlinking" one page of data to another. There are two screens, one for visual display of information, and another - a touch-sensitive pad - for writing and drawing. Bush also imagined a user would have a small camera connected to the system, similar to webcams today.
Link to the original Atlantic Monthly article - http://www.theatlantic.com/doc/194507/bush