Computers

Key Terms

Apple computer
binary system
bit
Babbage
byte
gigabyte
GUI (graphical user interface)
hardware
IBM
Intel
integrated circuit (IC)
kilobyte
megabyte
Microsoft
personal computer (PC)
software
transistor

What do computers do?

Computers are complex devices with many separate parts that all work together to produce an incredible machine.  Most machines are manufactured to do only one specific task.  A computer, however, is unlike any other machine you know because it can do many different things, and do them very well.  You can think of a computer as an "all-purpose" electronic device because the user determines what it is used for.  The same computer can be used to write a report, browse the web, play a game, edit photos, or keep track of a budget.

How is this possible? That is what we will try to help you understand in this unit.

A very brief history of computers

No one person can be given credit for the invention of the computer.  This is because the desktop units we use today have evolved on many levels, by many people, at different times.  A computer 50 years ago was an entirely different machine from the units on the market today.  In fact, before 1935 a computer wasn't even a machine; it was a person who performed mathematical calculations.  The story of the computer is still being written as new ideas replace old technology.  The computers of 50 years from now will look nothing like the units we have today.

Charles Babbage (1791-1871) had a vision well ahead of his time when he tried to build a steam-powered calculating machine.  Babbage was part of the Industrial Revolution and saw how steam was used to replace manual devices.  Why not replace manual calculations with a machine?  He successfully built a "difference engine" which could be programmed to produce perfect squares (1, 4, 9, 16, etc.).  His machine could be re-programmed to perform different calculations, if required.  With the help of Augusta Ada Lovelace, Babbage began working on a much more complex machine which he called the Analytical Engine.  Babbage worked on the physical aspects of the machinery (what we now call hardware), and Lovelace wrote code to make it perform specific tasks (what we now call software).  The Analytical Engine was never completed, and the idea of mechanical computing died soon after Babbage's death.

Enter the age of electronics

One of the problems with Babbage's machine was the numbering system it used.  It used the familiar digits we all use today (0, 1, 2, 3, 4, 5, 6, 7, 8, 9).  Our whole numbering system is based on 10 digits (I wonder why?) and is known as base 10.  In grade school you learned how to add, subtract, multiply, and divide with this system.  This system may work great for people, but it is terribly difficult for machines to work with.  A much better solution is a numbering system with only two digits: 0 and 1.  This is the binary system (also known as base 2).  Oddly enough, any number you can display in base 10 can also be displayed in base 2, and any operation you can do in base 10 can also be done in the binary system.

Binary is a perfect choice for a machine because numbers can be represented by switch settings (0 = off, 1 = on), and mathematical operations (addition, for example) can be done by passing these numbers through different gates.  Gates carry out logical operations on numbers.  If the gates are set up in one configuration, the result is addition.  The best part is that a gate can itself be constructed from a system of switches which alters the path of electric current.  That is, both the numbers and the operations performed on the numbers can be handled with on/off switches.  This is why computers love the binary system.  Don't worry too much if this isn't 100% clear right now.  Later in this unit we will show you how this all works.  For now, trust me that for a computer, binary is good.
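
If you are curious, here is a minimal sketch in Python of that idea: 0 and 1 are treated as switch settings, and addition is built out of nothing but AND, OR, and XOR gates.  The function names are simply ones chosen for this illustration.

```python
# A sketch of the idea: numbers are patterns of on/off switches (0/1),
# and simple logic gates combine them to perform addition.

def half_adder(a, b):
    """Add two single bits using only logic gates."""
    total = a ^ b        # XOR gate produces the sum bit
    carry = a & b        # AND gate produces the carry bit
    return total, carry

def full_adder(a, b, carry_in):
    """Add two bits plus the carry from the previous column."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2   # OR gate merges the two possible carries

def add_binary(x, y, width=8):
    """Add two whole numbers one column at a time, just like pencil-and-paper base 2."""
    result, carry = 0, 0
    for i in range(width):
        bit_x = (x >> i) & 1          # read switch i of x (0 = off, 1 = on)
        bit_y = (y >> i) & 1
        s, carry = full_adder(bit_x, bit_y, carry)
        result |= s << i              # set switch i of the answer
    return result

print(bin(5), "+", bin(3), "=", bin(add_binary(5, 3)))   # 0b101 + 0b11 = 0b1000 (which is 8)
```

Real hardware does the same thing, except the "switches" are transistors and the gates are wired together permanently.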

Konrad Zuse (1910-1995) took the next step when he built machines that performed mathematical operations using electrically controlled switches.  His early computers used mechanical relays (electromagnets) as the switching devices.  They worked, but were very slow because they required many mechanical switches.  A colleague suggested to Zuse that he replace the mechanical relays with vacuum tubes.  It worked, and the computer became 1000 times faster.

During WWII, the need for high-speed computing produced ENIAC.  Sponsored by the Defense Department, this monster took up an entire room, used over 17,000 vacuum tubes, and broke down every few minutes.  It wasn't completed until after the war ended, but it proved a valuable tool for high-speed computing.  However, it was programmed by manually rewiring it for every new problem it had to solve (which often took weeks).  The solution came from John von Neumann, who suggested that programs (software) be stored in memory just as data is.  Imagine having to take your desktop computer in for rewiring every time you wanted to do something different on it!

Things got interesting with the invention of two devices: the transistor (1948) and the integrated circuit, or IC (1958).  The transistor replaced the vacuum tube as the switching device.  This made computers smaller, cooler, more reliable, and cheaper to run, but how do you solder all the wires to these tiny devices?  Answer: You don't!  The IC was a way to form tiny transistors directly on a small chip, and it came with a bonus: the wiring between the transistors was created as part of the same manufacturing process.

An IC (integrated circuit).  Credit: David Carron, Wikimedia Commons

The IC was not widely used at first because it was extremely expensive.  That didn't stop NASA, which needed a portable computer as a guidance system to orbit and land on the moon.  Its research and development dollars brought the cost down as ICs became smaller and more powerful.  Intel soon became the leading chip manufacturer in the world.  IBM entered the business computer market quite late, but quickly became dominant in the industry.  Up to the mid-1970s, only big business and the government owned computers.

Computers reach the home

Two young computer enthusiasts, Steve Jobs and Steve Wozniak, saw the opportunity to open the home computing market.  They built and sold home computers under the company name Apple.  Soon afterwards, IBM introduced a personal computer (PC) to the public.  Bill Gates (Microsoft) entered the picture here because IBM needed an operating system for its PC.  Microsoft wrote the operating system for the IBM PC, but really made its money by licensing its software to companies that developed PC clones (machines that worked exactly like the IBM PC).  Soon IBM was pushed out of the home computer market by the competition, but we still call any IBM clone a "PC" today.

Apple introduced the Macintosh computer (Mac) in 1984.  The Mac offered a graphical user interface (GUI) ... which is a fancy way of saying you clicked on pictures (icons) and used a mouse to get things done.  Microsoft soon followed suit by introducing Windows in 1985.  You are showing your age if you remember the days when computers only worked by typing commands from the keyboard.  Nothing revolutionary (on the hardware side) has happened since then.  Computers have gotten faster, cheaper, and easier to use.  Peripherals have gotten fancier (flat screens, wireless, more portable, etc.), but nothing revolutionary.  What has changed dramatically is the way computers talk to each other (the Internet), along with a software revolution that changed the way we use them.

Computer Lingo

The digital age has brought us a whole new vocabulary.  Let's start with the basics:

bit - a single binary digit, either 0 or 1 (one switch setting)
byte - a group of 8 bits (enough to hold one character of text)
kilobyte (KB) - roughly one thousand bytes
megabyte (MB) - roughly one million bytes
gigabyte (GB) - roughly one billion bytes
terabyte (TB) - roughly one trillion bytes

We usually use these units to describe the storage capacity of a device (how much data it can hold).

Device - Capacity
3½" data disk (obsolete) - 1.44 MB
Thumb drive / memory card (SD cards) - typically 8 to 128 GB, but can go higher
Data / music CD - 700 MB
Typical hard drive - 600 GB to 1 TB
DVD - 4.7 GB (single layer), 8.5 GB (dual layer)
Blu-ray disc™ - 25 GB (single layer), 50 GB (dual layer)
Typical RAM (memory) - 2 to 32 GB
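
To get a feel for what these capacities mean, here is a rough Python sketch of the prefix arithmetic; it uses the decimal meanings of the prefixes, and the file sizes are only assumptions chosen for the example.

```python
# Storage-prefix arithmetic using decimal prefixes (1 KB = 1,000 bytes).
# The file sizes below are assumptions chosen only for this illustration.

KB = 1_000            # kilobyte = one thousand bytes
MB = 1_000 * KB       # megabyte = one million bytes
GB = 1_000 * MB       # gigabyte = one billion bytes
TB = 1_000 * GB       # terabyte = one trillion bytes

song = 4 * MB         # a typical MP3 file (assumed size)
photo = 3 * MB        # a typical smartphone photo (assumed size)

print("Songs on a 64 GB thumb drive:", (64 * GB) // song)          # 16000
print("Photos on a 1 TB hard drive:", (1 * TB) // photo)           # 333333
print("700 MB CDs needed to hold 1 TB:", (1 * TB) // (700 * MB))   # 1428
```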

The same prefixes can be used to describe the clock speed of a computer (how fast it is).  For example, a 3 GHz computer means the central processing unit (CPU) turns over data at a rate of 3 gigahertz ... which is 3 billion times per second.
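
Here is a similar back-of-the-envelope sketch for clock speed, using the 3 GHz figure from the example above.

```python
# Hertz (Hz) means cycles per second, so gigahertz is billions of cycles per second.
GHz = 1_000_000_000

clock = 3 * GHz                                      # the 3 GHz CPU from the example
print("Cycles per second:", clock)                   # 3000000000 (three billion)
print("Time for one cycle:", 1 / clock, "seconds")   # about 3.3e-10 s, i.e. a third of a nanosecond
```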

©2001, 2004, 2007, 2009, 2016 by Jim Mihal - All rights reserved
No portion may be distributed without the expressed written permission of the author