Friday, July 29, 2011

Generations of Computers

Each generation of computers is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, and more efficient and reliable devices. Read about each generation and the developments that led to the devices we use today.

First Generation - 1940-1956: Vacuum Tubes

The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. A magnetic drum, also referred to simply as a drum, is a metal cylinder coated with magnetic iron-oxide material on which data and programs can be stored. Magnetic drums were once used as a primary storage device but have since been relegated to auxiliary storage.
The tracks on a magnetic drum are assigned to channels located around the circumference of the drum, forming adjacent circular bands that wind around the drum. A single drum can have up to 200 tracks. As the drum rotates at a speed of up to 3,000 rpm, the device's read/write heads deposit magnetized spots on the drum during the write operation and sense these spots during a read operation. This action is similar to that of a magnetic tape or disk drive.
They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions. First generation computers relied on machine language to perform operations, and they could only solve one problem at a time. Machine languages are the only languages understood by computers. While easily understood by computers, machine languages are almost impossible for humans to use because they consist entirely of numbers. Programmers, therefore, use either high-level programming languages or assembly language. An assembly language contains the same instructions as a machine language, but the instructions and variables have names instead of being just numbers.
Programs written in high-level programming languages are translated into assembly language or machine language by a compiler. Assembly language programs are translated into machine language by a program called an assembler (assembly language compiler).
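To make that translation chain concrete, here is a small sketch (an illustration added here, not part of the original article): one line of C, with comments showing the kind of assembly a compiler might emit for it. The instructions shown are invented for illustration only, since every CPU family has its own machine language.

/* Illustration only: a compiler translates the C statement below into
   assembly/machine code. The commented instructions are a made-up,
   x86-style example; real output varies by CPU and compiler. */
#include <stdio.h>

int main(void)
{
    int price = 40;
    int tax = 2;
    int total = price + tax;  /* a compiler might emit something like:
                                     mov eax, [price]
                                     add eax, [tax]
                                     mov [total], eax              */
    printf("total = %d\n", total);
    return 0;
}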
Every CPU has its own unique machine language. Programs must therefore be rewritten or recompiled to run on different types of computers. Input was based on punched cards and paper tape, and output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercially produced computer; the first unit was delivered to the U.S. Census Bureau in 1951.
ENIAC is an acronym for Electronic Numerical Integrator And Computer, the first general-purpose electronic digital computer, developed by Army Ordnance to compute World War II ballistic firing tables. The ENIAC, weighing 30 tons, using 200 kilowatts of electric power and consisting of 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors, was completed in 1945. In addition to ballistics, the ENIAC's fields of application included weather prediction, atomic-energy calculations, cosmic-ray studies, thermal ignition, random-number studies, wind-tunnel design, and other scientific uses. The ENIAC soon became obsolete as the need arose for faster computing speeds.

Second Generation - 1956-1963: Transistors

Transistors replaced vacuum tubes and ushered in the second generation of computers. A transistor is a device composed of semiconductor material that amplifies a signal or opens or closes a circuit. Invented in 1947 at Bell Labs, transistors have become the key ingredient of all digital circuits, including computers. Today's microprocessors contain hundreds of millions, even billions, of microscopic transistors.
Prior to the invention of transistors, digital circuits were composed of vacuum tubes, which had many disadvantages. They were much larger, required more energy, dissipated more heat, and were more prone to failures. It's safe to say that without the invention of transistors, computing as we know it today would not be possible.
The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
The first computers of this generation were developed for the atomic energy industry.

Third Generation - 1964-1971: Integrated Circuits

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, made of semiconductor material, which drastically increased the speed and efficiency of computers.
Silicon is a nonmetallic chemical element in the carbon family of elements. Silicon - atomic symbol "Si" - is the second most abundant element in the earth's crust, surpassed only by oxygen. Silicon does not occur uncombined in nature. Sand and almost all rocks contain silicon combined with oxygen, forming silica. When silicon combines with other elements, such as iron, aluminum or potassium, a silicate is formed. Compounds of silicon also occur in the atmosphere, natural waters, many plants and in the bodies of some animals.
Silicon is the basic material used to make computer chips, transistors, silicon diodes and other electronic circuits and switching devices because its atomic structure makes the element an ideal semiconductor. Silicon is commonly doped, or mixed, with other elements, such as boron, phosphorus and arsenic, to alter its conductive properties.
A chip is a small piece of semiconducting material (usually silicon) on which an integrated circuit is embedded. A typical chip is less than ¼ square inch and can contain millions of electronic components (transistors). Computers consist of many chips placed on electronic boards called printed circuit boards. There are different types of chips. For example, CPU chips (also called microprocessors) contain an entire processing unit, whereas memory chips contain blank memory.
A semiconductor is a material that is neither a good conductor of electricity (like copper) nor a good insulator (like rubber). The most common semiconductor materials are silicon and germanium. These materials are then doped to create an excess or lack of electrons.
Computer chips, both for CPU and memory, are composed of semiconductor materials. Semiconductors make it possible to miniaturize electronic components, such as transistors. Not only does miniaturization mean that the components take up less space, it also means that they are faster and require less energy.
Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

Fourth Generation - 1971-Present: Microprocessors

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. A microprocessor is a silicon chip that contains a CPU. In the world of personal computers, the terms microprocessor and CPU are used interchangeably. At the heart of all personal computers and most workstations sits a microprocessor. Microprocessors also control the logic of almost all digital devices, from clock radios to fuel-injection systems for automobiles.
Three basic characteristics differentiate microprocessors:
  • Instruction Set: The set of instructions that the microprocessor can execute.
  • Bandwidth: The number of bits processed in a single instruction.
  • Clock Speed: Given in megahertz (MHz), the clock speed determines how many instructions per second the processor can execute.
For bandwidth and clock speed, the higher the value, the more powerful the CPU. For example, a 32-bit microprocessor that runs at 50MHz is more powerful than a 16-bit microprocessor that runs at 25MHz.
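As a rough back-of-the-envelope check on that example (an added sketch, assuming the crude simplification of one instruction per clock cycle, which real processors do not strictly obey), multiplying word size by clock speed compares the two chips' raw throughput:

/* Crude comparison sketch: word size (bits) x clock (Hz).
   Assumes one instruction per cycle purely for illustration. */
#include <stdio.h>

int main(void)
{
    long long fast = 32LL * 50 * 1000000;  /* 32-bit CPU at 50 MHz */
    long long slow = 16LL * 25 * 1000000;  /* 16-bit CPU at 25 MHz */
    printf("raw bit throughput: %lld vs %lld (%lldx)\n",
           fast, slow, fast / slow);       /* prints a 4x difference */
    return 0;
}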
What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer - from the central processing unit and memory to input/output controls - on a single chip.
CPU is an abbreviation of central processing unit, pronounced as separate letters. The CPU is the brains of the computer. Sometimes referred to simply as the processor or central processor, the CPU is where most calculations take place. In terms of computing power, the CPU is the most important element of a computer system.
On large machines, CPUs require one or more printed circuit boards. On personal computers and small workstations, the CPU is housed in a single chip called a microprocessor.
Two typical components of a CPU are:
  • The arithmetic logic unit (ALU), which performs arithmetic and logical operations.
  • The control unit, which extracts instructions from memory and decodes and executes them, calling on the ALU when necessary.
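To see how these two units work together, here is a toy sketch (an addition for illustration; it is not modeled on any real processor): the loop plays the role of the control unit, fetching and decoding each instruction from a small array standing in for memory, and an alu() helper performs the arithmetic when called on.

/* Toy fetch-decode-execute loop: the while loop acts as the control
   unit, the alu() function as the arithmetic logic unit. Illustrative
   only; no real CPU works from a five-entry C array. */
#include <stdio.h>

enum { LOAD, ADD, SUB, PRINT, HALT };      /* a tiny instruction set */

struct instr { int op; int operand; };

static int alu(int op, int acc, int value) /* the "ALU" */
{
    return (op == ADD) ? acc + value : acc - value;
}

int main(void)
{
    struct instr memory[] = {              /* a five-instruction program */
        { LOAD, 7 }, { ADD, 5 }, { SUB, 2 }, { PRINT, 0 }, { HALT, 0 }
    };
    int pc = 0, acc = 0, running = 1;

    while (running) {                      /* control unit: fetch ...   */
        struct instr i = memory[pc++];
        switch (i.op) {                    /* ... decode and execute    */
        case LOAD:  acc = i.operand;                        break;
        case ADD:
        case SUB:   acc = alu(i.op, acc, i.operand);        break;
        case PRINT: printf("acc = %d\n", acc);              break;
        case HALT:  running = 0;                            break;
        }
    }
    return 0;                              /* the program prints acc = 10 */
}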
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.

Fifth Generation - Present and Beyond: Artificial Intelligence

Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today.
Artificial Intelligence is the branch of computer science concerned with making computers behave like humans. The term was coined in 1956 by John McCarthy at Dartmouth College. Artificial intelligence includes:
  • Games Playing: programming computers to play games such as chess and checkers
  • Expert Systems: programming computers to make decisions in real-life situations (for example, some expert systems help doctors diagnose diseases based on symptoms)
  • Natural Language: programming computers to understand natural human languages
  • Neural Networks: Systems that simulate intelligence by attempting to reproduce the types of physical connections that occur in animal brains
  • Robotics: programming computers to see and hear and react to other sensory stimuli
Currently, no computers exhibit full artificial intelligence (that is, are able to simulate human behavior). The greatest advances have occurred in the field of games playing. The best computer chess programs are now capable of beating humans. In May 1997, an IBM super-computer called Deep Blue defeated world chess champion Garry Kasparov in a chess match.
In the area of robotics, computers are now widely used in assembly plants, but they are capable only of very limited tasks. Robots have great difficulty identifying objects based on appearance or feel, and they still move and handle objects clumsily.
Natural-language processing offers the greatest potential rewards because it would allow people to interact with computers without needing any specialized knowledge. You could simply walk up to a computer and talk to it. Unfortunately, programming computers to understand natural languages has proved to be more difficult than originally thought. Some rudimentary translation systems that translate from one human language to another are in existence, but they are not nearly as good as human translators.
There are also voice recognition systems that can convert spoken sounds into written words, but they do not understand what they are writing; they simply take dictation. Even these systems are quite limited -- you must speak slowly and distinctly.
In the early 1980s, expert systems were believed to represent the future of artificial intelligence and of computers in general. To date, however, they have not lived up to expectations. Many expert systems help human experts in such fields as medicine and engineering, but they are very expensive to produce and are helpful only in special situations.

Timeline of Computers

The idea of any time line is to present developments in a particular field as a continuum, a cascade of inevitability leading up to the present. A closer look at our highlights from computer history will show this long and winding road to have been traveled in fits and starts. What if Pascal hadn't gotten religion and retreated from science? Or if Babbage had completed his Analytical Engine? Or if Konrad Zuse had been able to get all the spare parts he needed? We'll never know whether the flow of computer developments would have been greatly accelerated. Looking back, the course of computer history hardly seems inevitable after all.

1642

In France, mathematics genius Blaise Pascal devises the first true calculating machine. Using eight rotating gears and wheels, the Pascaline performs addition and subtraction.

Blaise Pascal
1679


Gottfried Leibniz
Gottfried Wilhelm Leibniz, German philosopher, historian and scientist, perfects the binary system of notation. In a few centuries this system of 1s and 0s will prove invaluable in machine computations.

1801

French silk weaver Joseph-Marie Jacquard invents a loom with punched cardboard cards for controlling woven patterns. The Jacquard loom modernizes the textile industry and will become the model for Babbage's use of punched cards.

Punched Cardboard Card
1822

In London, Charles Babbage begins work on his Difference Engine, a calculating machine that performs mathematical functions (with sines, cosines and logarithms) to six decimal places. The hundreds of gears, shafts and counters weigh two tons. Seventy years later William Burroughs will use these principles in constructing the first successful adding machines.

Charles Babbage
1833


Analytical Engine
Babbage designs the first general-purpose computer, the Analytical Engine, to which the modern computer bears a remarkable resemblance. The Engine has five parts: the mill, or calculating unit, the store (memory), an input device, a control section and a printer. The input and the control section are fed by punched cards.

1842

Babbage's friend Lady Augusta Ada Lovelace documents his major work in her "Observations on Mr. Babbage's Analytical Engine." She will also write the first program, streamlining operations with such instructions as "Here follows a repetition of operations 13 to 23."

Ada Lovelace

1847


With two landmark theses, English mathematician George Boole sets up a system called Boolean algebra, wherein logical problems are solved like algebraic problems. Boole's theories will form the bedrock of computer science.

George Boole
1855

The first computer prize, a gold medal, is awarded at the Paris Exhibition to the Scheutz Difference Engine. Swedish engineer Georg Scheutz devised this simplified version of Babbage's machine after reading Lady Lovelace's "Observations." By this time painfully frustrated with his own slow progress, Babbage is in the audience when the medal is awarded to the younger inventor.

Georg Scheutz

difference engine
1859

The Registrar's Office in England commissions a Scheutz Difference Engine for calculating actuarial tables to predict life expectancy. This is the first use of the new technology by a government agency.

1890

Dr. Herman Hollerith completes the first electromechanical counting machine, the Hollerith tabulator, in which punched cards are used in data processing for the first time.

Hollerith Tabulator
The U.S. government buys the Hollerith tabulator to compute the census. The machine completes the job in just 6 weeks as against previous 10-year preparation periods. (The U.S. population is 62,622,250.)

Scientific American
1895


slot machine
Charles Fey, a young mechanic, opens arcade history by creating the slot machine. For $20 he sells the "Liberty Bell" to a San Francisco saloon, where it sits on the bar, accepts and pays out nickels, and is a huge success. (Symbols are bells, horseshoes, hearts, diamonds, spades and one star.)

1896


Herman Hollerith
Herman Hollerith forms the Tabulating Machine Company to accommodate the demand for his counting machines. Eventually, the firm will take on a new identity as the Computing-Tabulating-Recording Company (CTR).

counting machine
1913

Thomas J. Watson leaves National Cash Register, where he coined his legendary THINK slogan, to assume presidency of the now ailing CTR.

Thomas Watson
1924

With Watson at the helm, Hollerith's fledgling company finally emerges as IBM (International Business Machines).

1925

The "modern era of computation" begins at Massachusetts Institute of Technology, where electrical engineer Vannevar Bush and colleagues devise a large-scale analog calculator. Though mostly mechanical, the calculator has electric motors that store number values as voltages in its thermionic tubes. For this invention, some consider Bush the true father of computing.

Vannevar Bush
Bell Telephone Laboratories is founded in New York City; it will later move to Murray Hill, N.J.

Bell Labs
1935


Konrad Zuse
In Germany, inventor Konrad Zuse decides to use the binary system in his computer designs. Binary numbers are far easier for a machine to calculate with than decimals.

1936

English mathematician Alan M. Turing publishes his "On Computable Numbers," one of the single most important papers in the development of computer science.

Alan Turing
Zuse designs the Z1, a computer with keyboard input, mechanical switches for storing numbers, and a row of light bulbs to flash answers. The Z1 reads its instructions from punched film and is thus often called the first working programmable computer.

Z1 computer
1938


Helmut Schreyer
In Germany, Helmut Schreyer completes a doctoral thesis in engineering demonstrating how electronic vacuum tubes can be used as basic units for ultra-high-speed digital computers.

1939


HP
Hewlett-Packard is founded in Palo Alto, Calif.

1940


George Stibitz
George Stibitz rents a telephone link from Dartmouth to his computer at Bell Labs in New Jersey and demonstrates long-distance computing in an address to the Dartmouth Mathematical Society in Hanover, N. H.


1942


Vannevar Bush
Vannevar Bush completes his second model of the analog calculator, subsequently used to help devise artillery firing tables for the U.S. government.

1943

Colossus, the world's first programmable electronic computer, begins operations in December at Bletchley Park in England. Designed by engineer Tommy Flowers and a team of Bletchley Park codebreakers to decipher encrypted German teleprinter traffic, Colossus will help win the war for the Allies.

Enigma
To help the war effort, math professor Grace Murray Hopper enters the U.S. Naval Reserve and embarks on the first modern programming career. Upon graduation, she is assigned to the Bureau of Ordnance Computation Project at Harvard.
Grace Hopper
1944


Howard Aiken
Mark I
The Harvard Mark I, designed primarily by Prof. Howard Aiken (with funding by Thomas Watson and IBM), launches today's computer industry. The Mark I is the world's first fully automatic computer and the first machine to fulfill Babbage's dream.

1945

In wartime Germany, unable to obtain material for circuits to control his computers, Konrad Zuse creates the first programming language, Plankalkul, for both numerical and nonnumerical problems.

John von Neumann
In America, John von Neumann posits his five characteristics of computing: 1) fully electronic execution, 2) the binary number system, 3) an internal memory, 4) a stored program, and 5) universality, i.e., a machine that can perform more than one task.

1946

Electrical engineer J. Presper Eckert and physicist John Mauchly complete ENIAC, the first general-purpose electronic computer, at the University of Pennsylvania's Moore School of Electrical Engineering.

Eckert and Mauchly form the first commercial computer firm, the Electronic Control Company (later the Eckert-Mauchly Corporation), to manufacture electronic computers.

1947

Bell Labs scientists John Bardeen, Walter Houser Brattain and William Bradford Shockley revolutionize the young computer industry by inventing the transistor.

transistor
1949


Claude Shannon
M.I.T.'s Claude Shannon switches on computer game history when he demonstrates how to outline problems using game-playing machines, then builds a chess-playing machine called Caissac.

EDSAC (Electronic Delay Storage Automatic Computer) makes its first calculation on May 6. Built by Maurice Wilkes at Cambridge University, England, EDSAC performs one computation in three milliseconds. Wilkes is the first inventor to have a subroutine library in mind while designing a computer.

edsac
1950

On an 8x8 board, Alan Turing writes the first computer program to simulate chess.

chess
Kurt Vonnegut, Jr., writes about "EPICAC" in one of the first love stories involving a computer.

epicac
The American military begins to use computers to simulate operations in its "war games."

1951

The first nonspecialist computer magazine, Computers and People (originally titled Computers and Automation), comes on the market.

John Pinkerton completes the first business computer, LEO, for the Lyons teashop company in England. LEO will be used for administrative purposes, not for scientific calculation.

Eckert and Mauchly complete UNIVAC I (Universal Automatic Computer), the first computer specifically designed for commercial operations, and deliver it to the U.S. Census Bureau for tabulating the 1950 census.

While working on UNIVAC I, Grace Hopper meets the need for faster programming by devising a set of instructions that tells the machine how to convert symbolic code into its own language. This is the A-0 compiler, the first of its kind.

1952

IBM, the world's largest purveyor of punched card office machines, shifts to the manufacture of electronic computers.

John Diebold's "Automation: The Advent of the Automatic Factory" leads off a string of studies that will explore the computer's impact on employment and leisure time.

automation
1954

FORTRAN is born, through a paper titled "Specifications for the IBM Mathematical Formula Translating System, FORTRAN," written by IBM's Programming Research Group.

1955

At RCA Labs in Princeton, N.J., Harry Olson and Herbert Belar complete the RCA Electronic Music Synthesizer, the first of its kind.

M.I.T.'s Whirlwind I introduces the first computer graphics: primitive interactive line drawings on two display consoles.

Whirlwind
The first formal computer user group, SHARE, meets in the basement of Rand Corporation headquarters in Santa Monica, Calif. The members, including government, research, aviation and computer organizations, gather to exchange "homegrown" software in the absence of instructions for the IBM 704.

1956


Silicon Valley
The 45-mile stretch of high-tech creativity known as Silicon Valley etches itself on the landscape of California's Santa Clara Valley.

Bardeen, Brattain and Shockley receive the Nobel Prize for their invention of the transistor. Shockley, who had left Bell Labs in 1955, founds Shockley Transistor Corporation, one of the first of the Silicon Valley firms. Engineers from Shockley Transistor will form their own major electronics firms, such as Fairchild Semiconductor.

Bardeen, Brattain and Shockley
1957

At his marriage in Amsterdam, programming expert Edsger Dijkstra fills in his profession on the license as "programmer." Finding this unacceptable on the grounds that no such profession exists, city authorities erase his entry and substitute "theoretical physicist."

Lejaren Hiller arranges the first computer-composed music, Illiac Suite for String Quartet.

In Maynard, Mass., Ken Olsen starts Digital Equipment Corporation (DEC) as a mail-order parts business.

1958


Route 128
Computer firms spring up along Route 128, north of Boston.

Jack Kilby
Texas Instruments' Jack St. Clair Kilby develops the first working model of the integrated circuit.

IC

At Control Data Corporation, Seymour Cray designs the CDC 1604, the first fully transistorized supercomputer.

CDC 1604
1959

At Fairchild Semiconductor, Robert Noyce and Jean Hoerni develop the planar process, in which circuit components are interconnected by photoengraving on a flat, polished wafer, usually silicon. With integrated circuits, computers grow smaller and much more powerful.

CODASYL (Committee on Data Systems Languages), representing government, military and industry, meets to decide on a common language for business data processing. COBOL, for Common Business Oriented Language, is published within months, whereupon the Defense Department stipulates that all its suppliers must use the language.

1960

The term "software" becomes widely accepted throughout the computer industry.

1961

The National Institutes of Health Clinical Center in Bethesda, Md., implements the first computerized patient-monitoring system.

patient monitoring system
1962

Dr. Edward O. Thorp's best-selling Beat the Dealer describes using a computer to work out the odds at blackjack. Thorp's system is so successful that several casinos bar him from the game.

Beat the Dealer
Disk file storage is initiated with the IBM 1440 series. The 14-inch disks look like phonograph records, are arranged in stacks of six and store three million characters.

With a $30 million investment and an IBM 7090, American Airlines launches SABRE, the first computerized airline reservation system. One of the largest commercial databases in operation, SABRE allows customers to book reservations and rent cars. By 1968 it will handle over 100,000 calls per day from passengers, travel agents and other airlines.

Ivan Sutherland, a doctoral candidate at M.I.T.'s Lincoln Laboratory, designs Sketchpad, a line-drawing system for draftsmen. Using a cathode ray display tube, the system features an electronic stylus, or light pen, to display calculations at any stage of design. Soon after, another M.I.T. researcher, Timothy Johnson, develops a collateral program to display three-dimensional drawings.

sketchpad
1963


Joseph Weizenbaum
M.I.T.'s Dr. Joseph Weizenbaum develops Eliza, a program that simulates conversation between psychotherapist and patient.

General Motors Research Labs produces the first computer-designed auto part: the trunk lid for 1965 Cadillacs. The computer system is DAC-1 (Design Augmented by Computer), whose screen displays an image that can be modified with a light pen.

After more than 73,000 hours of steadfast service, UNIVAC I is retired to the Smithsonian Institution.

univac-1

1964


Sara Lee, maker of frozen pastries, opens the first fully automated factory. The Deerfield, Ill., plant uses a Honeywell 610 computer to change equipment speeds and oven temperatures and to determine what products are needed in filling orders.

In Texas v. Hancock, a programmer who stole his employer's computer software, worth about $5 million, is convicted and sentenced to five years. This constitutes the first computer crime leading to criminal prosecution.

1965


John Kemeny and Thomas Kurtz
On May 1, at four A.M. in a room at Dartmouth College, John Kemeny and Thomas E. Kurtz run their first program in BASIC (Beginners' All-Purpose Symbolic Instruction Code), a language for nonprofessional computer users.

Harris-Intertype Corporation introduces three models of a computer designed specifically for typesetting. All of them justify automatically, and the top-end version offers near-perfect hyphenation.

Several Wall Street firms turn to computers for securities analysis and accounting.

DEC produces the first "mini" computer, incorporating many features of a large computer but with smaller storage capacity and a slower processing speed.

Schools begin to use computers for science simulation, math quizzes and educational games.

1966

In the first federal case involving criminal use of computers, U.S. v. Bennett, a bank programmer is convicted of adjusting a computer to ignore all his overdraft checks.

match
Operation Match, one of the early computer dating services, opens in Cambridge, Mass.

Texas Instruments unveils the first solid-state hand-held calculator. It has no electronic display, but prints out answers on a strip of heat-sensitive paper.

calculator
1967


MacHack VI
The chess-playing MacHack VI is entered by Richard Greenblatt in the Massachusetts state championship, becoming the first program to compete successfully against human chess players.

Computerworld, one of the most comprehensive weekly newspapers geared to the computer industry, begins publication.

1968

The movie 2001: A Space Odyssey plays across the country, introducing the mutinous computer HAL.

Switched-On Bach
W. Carlos' Switched-On Bach, an album of fugues, preludes and two-part inventions played on a Moog Synthesizer, is a big hit.

Gordon Moore
Gordon Moore and Robert Noyce leave Fairchild Semiconductor to form Intel (Integrated Electronics) Corporation.

1969


cpu
M.E. Hoff, Jr., a young engineer at Intel, takes charge of the Busicom project involving the manufacture of chips for a Japanese calculator firm. His improvements on the design result in a central processing unit of 2,250 microminiaturized transistors on a chip less than 1/6" long and 1/8" wide. The Intel 4004 is the first microprocessor.

Intel 4004
1970


Ralph Baer
Ralph Baer, a division manager at Sanders Associates in New Hampshire, originates the home video game when he develops an electronic unit with hand controls that sends broadcast signals to a TV set.

1971

Magnavox buys the patent rights to Baer's TV/hand-control invention, then sells the sublicensing rights to Atari and other manufacturers.

TV game
Left with a stock of unsold chips, Intel puts the 4004 microprocessor in its catalog. To everyone's surprise, the chip takes the industry by storm and paves the way for most of the advances of the decade.

1972

IBM announces the System/32, a desk-size unit that contains all the computer hardware.

Intel develops the 8008 microprocessor, originally designed for the Computer Terminal Corporation (now Datapoint) CRT. The 8008 ultimately satisfies all customer requirements except in the area of speed.

In a move to reduce clutter and clatter in the newsroom, the Augusta (Ga.) Chronicle and Herald install CRTs for use in writing and editing stories.

CRT editor
Atari founder Nolan Bushnell invents and markets Pong, considered by many the first milestone in video game history.

Pong
Diablo Systems of Hayward, Calif., develops the first automatic printer for data processing systems. The "daisy wheel" Hytype Printer I features a glass-reinforced nylon disk and can print 30 characters per second; integrated circuits do much of the work.

The Summer Olympics in Munich, Germany, are the first games to use computers as "primary" judges of times and finishes. The computer companies involved are Gebr. Junghans GmbH and Compagnie des Montres Longines Francillon S.A.

1973

The National Computer Conference is held at the New York Coliseum June 4-8, replacing the fall and spring joint conferences.

Intel turns out the 8080 microprocessor, which is 20 times faster than the original 4004 chip.

Shugart Associates of Sunnyvale, Calif., ships its first 8" floppy disks. Replacing punched cards as a data entry medium, the reusable plastic/oxide disks weigh less than two ounces and store programs and files.

Truong Trong Thi
Truong Trong Thi, a Frenchman of Vietnamese origin, introduces the first commercially available microcomputer system, based on the Intel 8008, but fails to secure adequate distribution.

1974

The July cover story of Radio-Electronics magazine tells how to "Build the Mark-8, Your Personal Minicomputer" (with an Intel 8008 microprocessor).

Computer magazines now range from Computer Law and Tax Reporter, which documents legal battles in data processing, to Creative Computing, one of the first magazines devoted to recreational use of computers.

In the first experiment with bank computer terminals, two branches of the Lincoln, Neb., Hinky Dinky grocery chain install computer terminals for bank deposits and withdrawals. In six weeks First Federal Savings & Loan takes in 672 new accounts.

Two leading designers at Intel leave to form Zilog, another microprocessor firm. They develop the Z80 chip, which competes directly with Intel's new 8080.

1975

The January 1975 issue of Popular Electronics features a cover story on the MITS Altair, the first widely available personal computer.

William Gates and Paul Allen
In a five-week period, Harvard student William Gates and associate Paul Allen adapt BASIC to fit the microcomputer. Having wrested the new computers from the hands of a small group of assembly language programmers, they form Microsoft to market their version of the language.

Objective Design of Tallahassee, Fla., offers Encounter, the first commercial personal computer game, in assembly language on paper tape.

1976

The New York Times starts to convert to electronic editing and typesetting on a Harris 2550 system.

With a surplus of calculator chips, Commodore enters the personal computer market through MOS Technology, the metal-oxide semiconductor chip maker it has recently acquired.

The first Adventure game is programmed by Will Crowther and expanded by Don Woods at Stanford University.

Byte magazine
The number of computer magazines grows to include Byte: The Small Systems Journal (aimed at the "personal computer" amateur and professional), the quarterly Computer Graphics and Art, and Dr. Dobb's Journal of Computer Calisthenics and Orthodontia for the microcomputer hobbyist.

1977

Storage systems become smaller, more powerful and more convenient. Micropolis Corporation of Northridge, Calif., announces the Metafloppy, a family of integrated 5 1/4" floppy disk systems with the storage capacity of 8" disks.

The newsweekly Computerworld begins a Microcomputing section to handle the flood of information on micros.

Apple markets the Apple II, ultimately to become the personal computer equivalent of the Volkswagen.

Apple II

TRS-80
Radio Shack unveils its fully assembled microcomputer, the TRS-80 Model 1, with keyboard, CRT and cassette unit. The whole system, which offers some graphics and can be programmed in BASIC, sells for $599.95.

eye diagnosis
CRTs come under suspicion when two New York Times copy editors are diagnosed as having cataracts. Tested for radiation, the machines are ultimately cleared. This is the first of many complaints linking eye irritations and CRTs.

Commodore International enters the personal computer field with the PET (personal electronic transactor).

ComputerLand, among the largest of today's computer retailers, opens its first store.

Originally developed for computerized astrology machines, CP/M (control program for microcomputers) is offered by Gary Kildall and his Digital Research company. CP/M will soon become a standard for business applications on personal computers.

Digital Research
1978


Daniel Bricklin and Robert Frankston
Fed up with time-consuming projections using a calculator and spreadsheet, first-year Harvard Business School student Daniel Bricklin teams up with Robert Frankston at M.I.T. to create VisiCalc, an electronic spreadsheet that can recalculate all related numbers when one variable changes. They pool their finances and with $16,000 found Software Arts in Wellesley, Mass.

Seymour Rubinstein
Seymour Rubinstein, formerly of IMSAI, founds MicroPro International and commissions John Barnaby to write the word processing program that will become WordStar.

speak & spell
Texas Instruments produces its Speak & Spell toy, the first widespread offering of digital speech synthesis.

Epson America in Anaheim, Calif., introduces its 80-column dot-matrix printer, which becomes a runaway best seller.

dot-matrix printer
1979

Personal Software markets VisiCalc, soon called the "smash hit of software." The first version works only on the Apple II and thus boosts that computer's sales. VisiCalc is credited with taking micros out of the home and making them "serious."

Adam Osborne
Publisher Adam Osborne sells his company to McGraw-Hill and founds Osborne Computer in Hayward, Calif.

video games
Video games appear everywhere: in restaurants, gas stations, bars. With threatening names like Centipede and Space Invaders, the quarter-gobbling dwarfs cause concern among parents.

electronic services
The Source offers an electronic service enabling home computer owners to read newspapers, get stock info, check airline schedules and browse through restaurant guides. Similar services will include CompuServe and Dow Jones News/Retrieval.

1980

Shugart Associates markets the 5 1/4" Winchester disk drive, which stores 30 times as much data as a standard small floppy and transfers the information 20 times faster.

Texas Instruments unveils its first personal computer, the TI 99/4, based on a 16-bit processor and list-priced at $1,200. With modifications and aggressive marketing, this computer eventually lists for $99 before almost bankrupting the company.

Radio Shack introduces the TRS-80 Color Computer for recreation and education.

TRS-80 color
1981

Four eighth graders at Manhattan's private Dalton School use its terminals to link up with other computers. By trial and error, they gain entry into several Canadian companies' computers, temporarily destroying certain data and preventing legitimate users from accessing the systems. The FBI and Royal Canadian Mounted Police join forces and catch the 13-year-olds after a week of their long-distance raids. No charges are pressed despite a loss of several thousand dollars' worth of computer time.

VIC-20
Commodore introduces the VIC-20, destined to be the first home computer model to sell more than one million units. Waiting in the wings is the more powerful Commodore 64, the first popularly priced machine to have 64K of memory built in.

Osborne 1
Osborne Computer unveils the Osborne 1, the first portable micro. Its 24 pounds hold a disk operating system that can handle word processing and electronic spreadsheets.

Zork, a "second-generation" adventure game capable of responding to complex sentences, is introduced by Infocom. Originally written in a proprietary language on a minicomputer, the game is quickly converted by Infocom into versions for virtually every popular personal computer model.

The six-year-old personal computer industry passes the $1.5 billion mark.

At ENIAC's thirty-fifth birthday celebration in Philadelphia, the trail-blazing machine is pitted against a Radio Shack TRS-80 and commanded to square all integers from 1 to 10,000. The young micro wins handily, completing the exercise in a third of a second vs. ENIAC's six seconds.

Computer camps become popular among kids (and some adults).

computer camp

Clive Sinclair
Watchmaker Timex Inc. contracts with England's Clive Sinclair to market the Timex/Sinclair 1000, the first fully assembled under-$100 computer in the U.S.

IBM PC
The IBM PC debuts, with a memory that can store more than 250 pages of data and a system that can complete about 700,000 additions per second. The PC is as powerful as anything on the market, which shifts dramatically toward the industry's giant.

IBM chooses Microsoft's MS-DOS operating system for its PC. When other hardware manufacturers hop on the IBM-compatible bandwagon, MS-DOS becomes the new standard for business applications programs.

In a lean Christmas shopping season, computer video games (with TV hook-ups) are huge hits. The favorites are Intellivision and Atari.

1982

According to a study by Prof. Sanford Weinberg of St. Joseph's University, Philadelphia, at least 30 percent of daily users of computers have some degree of "cyberphobia," or fear of computers. Victims range from high blood pressure sufferers to the policeman who shot the computer console in his car. Another Weinberg study shows cyberphiliacs (compulsive computer programmers) to be no better off: they are usually friendless and single.

Jimmy Carter
Jimmy Carter becomes the first former President to write his memoirs with a word processor. Like many tyro computer users, he hits a wrong key and deletes an entire chapter.

As the video game craze reaches fever pitch, 15-year-old Steve Juraszek of Arlington Heights, Ill., plays Defender for 16 hours, 34 minutes, on the same quarter. His score: 15,963,100.

Breaking with its annual "Man of the Year" tradition, Time magazine names the computer "Machine of the Year" and features it on the cover.

Over 17,000 software packages are now available to run on Apple computers.

1983

Lotus 1-2-3, the first integrated software package for personal computers, hits the market. Lotus founder Mitchell D. Kapor packages an electronic spreadsheet, information management and graphics on one 5 1/4" disk.

Lotus 123
Radio Shack brings out a book-size computer: the Radio Shack Model 100. The tiny machine weighs about four pounds, has built-in word processing and communications software, and costs just under $800. Other companies quickly following with book-size computers are Sharp and Nippon, taking advantage of the power and size made possible by CMOS chips.

Radio Shack 100
Apple puts out the Lisa, a computer based on the 32-bit Motorola 68000 microprocessor, featuring high-res graphics, on-screen windows for multi-program use and a mouse for controlling cursor position and data entry. The initial $10,000 offering price is prohibitive, but Lisa establishes the state of the art for personal computers.

Computerized burglaries become so popular among teens that the FBI conducts a huge "sting" operation to round up micro-criminals in 13 cities. (Computers in brokerage houses, hospitals and the Defense Department had been raided mostly through GTE's Telenet, based in Vienna, Va.) Word of the FBI crackdown is flashed to other hackers across the country via computer bulletin boards.

Less than two years after introducing inexpensive portable computers, Osborne files for reorganization under chapter 11 of the federal bankruptcy law. In the highly competitive microcomputer market, other high-tech firms founder. Texas Instruments and Mattel leave the home computer business, eventually followed by Timex.

Pronto
New York's Chemical Bank makes the first large-scale launch of a home banking system. Its Pronto service is soon offered through 200 banks across the U.S.

Hewlett-Packard unveils the HP-150, the first personal computer to offer a touch screen.


President Ronald Reagan helps unemployed steelworker Ronald D. Bricker get his first job interview in a year. Bricker goes to work as computer repair technician for Radio Shack, realizes he is earning less than if he were collecting unemployment insurance and gladly returns to the steel mill when his old job becomes available.

A Korean Air Lines Boeing 747 with 269 people on board is shot down by a Soviet fighter plane for straying into Soviet air space. Western aviation experts blame a one-digit human error by the crew in programming the plane's navigational computer, enough to account for its being 300 miles off course.

navigation computer
The movie WarGames, in which a young hacker gains entry to a Defense Department computer and plays "global thermonuclear war," explores the adolescent fantasy of possessing ultimate power in the adult world.

war games
Reared by Doug Engelbart, the mouse input device makes its popular debut with the launching of Apple's Lisa and its adoption for IBM PC software.

mouse
As marketing takes over from engineering, Pepsi-Cola v.p. John Sculley becomes president of Apple.

Coleco Adam
Toymaker Coleco announces its Adam, the first inexpensive home computer system with built-in word processing capabilities. By Christmas Eve Adam has disappointing sales; what saves the company from bankruptcy are Cabbage Patch dolls, their names computer-generated.

IBM brings out the PCjr, a home-oriented, lower-priced encore to its PC.

PCjr
1984

VisiCorp and Software Arts sue each other over marketing rights to the pioneering VisiCalc program in a move that could reflect an end to the cooperative era in the software industry.

Orwell
George Orwell's 1984, thought by many to be a prophetic indictment of the computer age, is found to contain no mention of computers.

As competition heats up, commercial TV becomes a battleground for the personal computer wars. Apple and Kaypro ads go on to win "Cleos," the "Oscars" of TV commercials.

tv ad
tv ad
Apple signals a new generation of personal computers with its powerful, compact Macintosh, whose 3 1/2" disks store more than the 5 1/4" disks used in most micros. With its mouse and pull-down menus and windows, it is truly "Lisa for the masses."

Computer magazine titles reach 450, the largest number ever devoted to a single subject. An ensuing shakeout decimates the ranks of computer publications.

During a period of refinement and consolidation, the biggest news in software is Lotus Symphony, the five-in-one integrated successor to 1-2-3, along with "thought processing" programs like ThinkTank from Living Videotext and the integrated Framework from Ashton-Tate.

…and beyond


Still in its relative infancy, the computer seems to be infinitely perfectible. The march toward computopia is hardly linear (for every step forward there are several steps sideways and back), but there is a clear progression in the direction of greater intelligence and sophistication. Which raises, ever more insistently, the fundamental question: Can computers replace us? Only time will tell ...