How Punch Cards And Vacuum Tubes Gave Us iPhones

Think about how heavily we depend on computers. When was the last time the internet went out in your office? How much got done? That’s right, almost nothing, because nearly everything runs on computers and the internet these days. That dependence isn’t a bad thing in itself, but it can certainly be frustrating when there’s an outage. When everything is working properly, though, we can accomplish more in less time than the generation before us could.

Computer technology has been developing at a breakneck pace over the last century, but the ideas behind computers came even earlier. Binary code, on which most early computer programming was based, was described by Gottfried Leibniz back in 1703. Punch cards, which controlled many early computers, were first used in 1801 on the Jacquard loom, an automated fabric loom. Then in 1843 Ada Lovelace published the first computer program, making her the first computer programmer.

There were of course other inventions that helped usher in the computer age. Electricity did not become common in homes until the 1930s, and even then only in major urban areas; it didn’t reach some rural areas until the 1950s. In that context it’s no wonder computer science didn’t really take off until the 1940s. First came the Bombe, the electromechanical machine Alan Turing designed to break the Nazis’ Enigma cipher, an effort credited by some estimates with shortening the war by two years. Then came ENIAC, a room-sized machine of vacuum tubes and punch cards that is widely regarded as the first general-purpose electronic computer.

Memory moved onto spinning magnetic drums, and vacuum tubes gave way to transistors. Then in 1958 Jack Kilby of Texas Instruments demonstrated the first integrated circuit, an entire circuit built on a single piece of semiconductor, which paved the way for the microchip. That breakthrough allowed computers to get smaller and more powerful at an exponential pace.

Moore’s Law was proposed in 1965 by Gordon Moore, who went on to co-found Intel. He predicted that the number of transistors on an integrated circuit would double roughly every year (a pace he later revised to every two years), packing more computing power into less space at lower cost. The prediction held for over five decades, giving us the smaller, cheaper computer chips we all take for granted today.
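
To get a feel for what that doubling implies, here is a minimal Python sketch (not from the original article) that projects transistor counts under an assumed two-year doubling period, using the roughly 2,300 transistors of the 1971 Intel 4004 as an illustrative starting point.

```python
# Illustrative projection of Moore's Law as simple exponential growth.
# The starting chip (Intel 4004, ~2,300 transistors in 1971) and the
# two-year doubling period are assumptions chosen for this example.

def projected_transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Estimate transistor count assuming one doubling every `doubling_years`."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Even from that tiny starting point, the projection reaches tens of billions of transistors by the 2020s, which is roughly the order of magnitude of today’s largest chips.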

The 1970s became the decade of the personal computer. Atari’s home Pong console arrived in 1975, and the Apple II was introduced in 1977. From there, new operating systems revolutionized how people interacted with their computers. Then in 1989 Tim Berners-Lee invented the World Wide Web.

Now technology is evolving in ways no one a hundred years ago could have imagined. Unmanned drones deliver toilet paper to your door, movies stream to your television instead of being picked out at the video store, and we all carry phones more powerful than the computers that guided the first lunar missions. Each of these advances builds on the ones before it, opening up endless possibilities for the future, including virtual reality and augmented reality. Learn more about the history of computer science in the infographic below!

[Infographic: The Evolution of Computer Science]