
The Evolution Of Computer | Generations of Computer

The development of computers has been a remarkable journey spanning several centuries, defined by a series of inventions and advancements made by brilliant scientists and engineers. Because of their work, we now use the latest technology in our computer systems.

Now we have laptops, desktop computers, notebooks, and more that make our lives easier, and most importantly, they let us communicate with anyone from almost anywhere in the world.

So, in today’s blog, I want you to explore with me the journey that computers have made, thanks to our scientists.

Note: If you haven’t read our History of Computer blog yet, please read it first and then come back here.

Let’s look at the evolution of computers, generation by generation.

COMPUTER GENERATIONS

Computer generations are essential to understanding the evolution of computing technology. They divide computer history into periods marked by substantial advances in hardware, software, and computing capabilities. The first period began around 1940 with the first generation of computers. Let us see…


Generations of computer

Computers are classified into five generations:

  • First Generation Computer (1940-1956)
  • Second Generation Computer (1956-1963)
  • Third Generation Computer (1964-1971)
  • Fourth Generation Computer (1971-Present)
  • Fifth Generation Computer (Present and Beyond)

1. FIRST GENERATION COMPUTER: Vacuum Tubes (1940-1956)


The first generation of computers is characterized by the use of vacuum tubes. The vacuum tube was developed in 1904 by the British engineer John Ambrose Fleming. A vacuum tube is an electronic device used to control the flow of electric current in a vacuum, and it was also used in CRT (Cathode Ray Tube) TVs, radios, etc.


The first general-purpose programmable electronic computer was ENIAC (Electronic Numerical Integrator and Computer), which was completed in 1945 and introduced to the public on February 14, 1946. It was built by two American engineers, J. Presper Eckert and John W. Mauchly, at the University of Pennsylvania.


The ENIAC filled a room roughly 30 by 50 feet, weighed about 30 tons, and contained around 18,000 vacuum tubes, 70,000 resistors, and 10,000 capacitors. It required about 150,000 watts of electricity, which made it very expensive to operate.

Later, Eckert and Mauchly developed the first commercially successful computer, UNIVAC (UNIVersal Automatic Computer), in 1951.

Examples are ENIAC (Electronic Numerical Integrator and Computer), EDVAC (Electronic Discrete Variable Automatic Computer), and UNIVAC-1 (Universal Automatic Computer-1).

ADVANTAGES:

  • These computers were built using vacuum tubes.
  • Computers of this generation had a simple architecture.
  • They could calculate data in milliseconds.
  • They were used mainly for scientific purposes.

DISADVANTAGES

  • These computers were very costly.
  • They were very large and took up a lot of space and electricity.
  • Their speed was very slow.
  • They were not suitable for commercial use.
  • They generated a lot of heat.
  • Cooling was needed to operate these computers because they heated up very quickly.

2. SECOND GENERATION COMPUTER: Transistors (1956-1963)


The second generation of computers is characterized by the use of transistors. The transistor was developed in 1947 by three American physicists: John Bardeen, Walter Brattain, and William Shockley.


A transistor is a semiconductor device used to amplify or switch electronic signals, or to open or close a circuit. It was invented at Bell Labs, and transistors became the key ingredient of all digital circuits, including computers.

The invention of the transistor replaced the bulky vacuum tubes of the first generation of computers.

Transistors perform the same functions as vacuum tubes, except that electrons move through solid semiconductor material instead of through a vacuum. Transistors are made of semiconducting materials, and they control the flow of electricity.

These computers were smaller, faster, and less expensive than first-generation computers. The second generation also introduced high-level programming languages, including FORTRAN (1956), ALGOL (1958), and COBOL (1959).

Examples are the PDP-8 (Programmed Data Processor-8), the IBM 1400 series, the IBM 7090 series, and the CDC 3600 (Control Data Corporation 3600 series).


ADVANTAGES:

  • They were smaller in size compared to first-generation computers.
  • They used less electricity.
  • They did not heat up as much as first-generation computers.
  • They had better speed.

DISADVANTAGES:

  • They were still costly and not very versatile.
  • They were still too expensive for widespread commercial use.
  • Cooling was still needed.
  • Punch cards were used for input.
  • These computers were built for particular purposes.

3. THIRD GENERATION COMPUTER: Integrated Circuits (1964-1971)


The third generation of computers is characterized by the use of integrated circuits, developed in 1958-59 by two American engineers, Jack Kilby and Robert Noyce. An integrated circuit is a set of electronic circuits on a small, flat piece of semiconductor material, normally silicon. Transistors were miniaturized and placed on silicon chips, which drastically increased the efficiency and speed of computers.


These ICs (integrated circuits) are popularly known as chips. A single IC has many transistors, resistors, and capacitors built on a single slice of silicon.

This development made computers smaller in size and lower in cost, with larger memory and faster processing. These computers were very fast, efficient, and reliable.

High-level languages such as Pascal, PL/I, FORTRAN II to IV, COBOL, ALGOL-68, and BASIC (Beginner's All-purpose Symbolic Instruction Code) were developed during this period.

Examples are the NCR 395 (National Cash Register 395), the IBM 360 and 370 series, and the Burroughs B6500.

ADVANTAGES:

  • These computers were smaller in size compared to previous generations.
  • They consumed less energy and were more reliable.
  • They were more versatile.
  • They produced less heat compared to previous generations.
  • They were used for commercial as well as general purposes.
  • They used fans for heat discharge to prevent damage.
  • This generation increased the storage capacity of computers.

DISADVANTAGES:

  • A cooling system was still needed.
  • They were still very costly.
  • Sophisticated technology was required to manufacture integrated circuits.
  • The IC chips were not easy to maintain.
  • Performance degraded when running very large applications.

4. FOURTH GENERATION OF COMPUTER: Microprocessor (1971-Present)


The fourth generation of computers is characterized by the use of the microprocessor, which was invented in 1971 by four engineers: Marcian (Ted) Hoff, Masatoshi Shima, Federico Faggin, and Stanley Mazor. The first microprocessor was the Intel 4004 CPU.


A microprocessor contains all the circuits required to perform arithmetic, logic, and control functions on a single chip. Because of microprocessors, fourth-generation computers offer far more data processing capacity than equivalent-sized third-generation computers. The development of the microprocessor made it possible to place the entire CPU (central processing unit) on a single chip, and computers built this way are also known as microcomputers. The personal computer is a fourth-generation computer. This is also the period when computer networks evolved.
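To make the idea of "arithmetic, logic, and control on a single chip" a little more concrete, here is a minimal sketch in Python (purely illustrative; the three-instruction machine below is made up for this example and is not any real microprocessor's instruction set) of an ALU driven by a simple fetch-decode-execute control loop:

```python
# Minimal sketch of what a microprocessor does: fetch, decode, execute.
# The three-instruction machine below is hypothetical, not a real chip's ISA.

def alu(op, a, b):
    """Arithmetic/logic unit: performs one arithmetic or logic operation."""
    if op == "ADD":
        return a + b
    if op == "SUB":
        return a - b
    if op == "AND":
        return a & b
    raise ValueError(f"unknown ALU operation: {op}")

def run(program, registers):
    """Control unit: steps through the program and drives the ALU."""
    pc = 0                                  # program counter
    while pc < len(program):
        op, dst, src1, src2 = program[pc]   # fetch and decode one instruction
        registers[dst] = alu(op, registers[src1], registers[src2])  # execute
        pc += 1
    return registers

# Example: compute r2 = r0 + r1, then r3 = r2 - r0.
print(run([("ADD", "r2", "r0", "r1"),
           ("SUB", "r3", "r2", "r0")],
          {"r0": 7, "r1": 5, "r2": 0, "r3": 0}))
```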

Examples are the Apple II and the Altair 8800.

ADVANTAGES:

  • These computers are smaller in size and much more reliable than earlier generations of computers.
  • The heating issue in these computers is almost negligible.
  • No air conditioning is required for a typical fourth-generation computer.
  • All types of high-level languages can be used in this generation.
  • These computers are used for general purposes.
  • They are cheaper and portable.

DISADVANTAGES:

  • Fans are required to operate these kinds of computers.
  • The latest technology is needed to manufacture microprocessors and complex software.
  • These computers are highly sophisticated.
  • Advanced technology is also required to make the ICs (integrated circuits).

5. FIFTH GENERATION OF COMPUTERS (Present and beyond)

This generation of computers is based on AI (Artificial Intelligence) technology. Artificial intelligence is the branch of computer science concerned with making computers behave like humans and allowing them to make their own decisions. Currently, no computer exhibits full artificial intelligence (that is, the ability to fully simulate human behavior).


In the fifth generation of computers, VLSI (Very Large Scale Integration) and ULSI (Ultra Large Scale Integration) technology are used, and the speed of these computers is extremely high. This generation introduced machines with hundreds of processors that can all work on different parts of a single program. The development of even more powerful computers is still in progress, and it has been predicted that such computers will be able to communicate with their users in natural spoken language.
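As a rough illustration of the "hundreds of processors working on different parts of a single program" idea, here is a minimal parallel-processing sketch in Python (illustrative only; the workload, chunk sizes, and number of worker processes are made up for this example):

```python
# Minimal sketch of parallel processing: one job split across several processes.
from multiprocessing import Pool

def sum_of_squares(chunk):
    """The work one processor does on its part of the problem."""
    start, end = chunk
    return sum(n * n for n in range(start, end))

if __name__ == "__main__":
    # Split the range 0..1,000,000 into 4 chunks, one per worker process.
    chunks = [(i * 250_000, (i + 1) * 250_000) for i in range(4)]
    with Pool(processes=4) as pool:
        partial_results = pool.map(sum_of_squares, chunks)
    # Combining the partial results gives the same answer as a single processor.
    print(sum(partial_results))
```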

In this generation, computers also use high-level languages such as C, C++, Java, etc.

Examples are desktop computers, laptops, notebooks, MacBooks, etc. These are the computers we use today.

ADVANTAGES:

  • These computers are smaller in size and more compatible.
  • These computers are much cheaper.
  • They are used for general purposes.
  • Higher technology is used.
  • Progress is being made toward true artificial intelligence.
  • Advancements in parallel processing and superconductor technology.

DISADVANTAGES:

  • They tend to be sophisticated and complex tools.
  • They push the limits of transistor density.

Frequently Asked Questions

How many computer generations are there?

There are mainly five generations:

First Generation Computer (1940-1956), Second Generation Computer (1956-1963), Third Generation Computer (1964-1971), Fourth Generation Computer (1971-Present), and Fifth Generation Computer (Present and Beyond).

Which things were invented in the first generation of computers?

Vacuum Tubes

What is the fifth generation of computers?

The fifth generation of computers is based on Artificial Intelligence. It is predicted that these computers will be able to communicate with their users in natural spoken language.

What is the latest computer generation?

The latest generation of computers is the fifth generation, which is based on Artificial Intelligence.

Who is the inventor of the Integrated Circuit?

Robert Noyce and Jack Kilby.

What is the full form of ENIAC ?

ENIAC stands for “Electronic Numerical Integrator and Computer”.




Modern (1940s-present)

History of Computers

Chandler Little


Modern technology first started evolving when electricity came into everyday use. One of the biggest inventions of the 20th century was the computer, and it has gone through many changes and improvements since its creation. The last two decades have shown more advancement in this technology than in almost any other invention. Computers have advanced almost every level of learning in our lives, and it looks like they will only keep making an impact through the coming decades. Computers in today's society have become a focal point of everyday life and will continue to be so for the foreseeable future. During the evolution of computers, many people have helped with their creation and development, but some people's contributions have been left out because of their social status or because they were not given credit in the field.

Computers have come a long way from their creation. The first computer was designed in 1822 by Charles Babbage. It was a mechanical machine built from gears and levers rather than electronics, and it was far larger and heavier than the computers we see today. For example, most laptops weigh in a range of two to eight pounds. A picture of one of the first computers can be seen below in figure 1.

There has also been a large amount of movement in the data storage sector of computers. The very first hard drive was created in 1956; it had a capacity of 5MB and weighed in at 550 pounds. Today hard drives are becoming smaller, and we see them weighing a couple of ounces to a couple of pounds. As files have become more complex, the need for more space in computers has increased drastically. Today we see games take up to 100GB of storage. To give a sense of how big the difference between 5MB and 100GB is: 5MB is only 0.005GB. The hard drives we have today reach sizes of 10TB and larger, and a TB is 1000GB. The evolution of the hard drive can be seen in figure 2.

As the world of computers keeps progressing, the general trend is to make them smaller while still delivering a generational step in improvement. With these large improvements, the daily tasks of users like teachers, researchers, and doctors become quicker and easier to accomplish. Alongside these great advancements in hardware, we are witnessing advancements in software as well: strides in staying connected to others through social media, messaging platforms, and other means of communication. With all of these advancements in hardware and software, the hope is that we don't become too reliant on computers, because a large power outage or an EMP attack could cripple our way of life.
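As a quick check on the storage comparisons above, here is a tiny Python sketch (illustrative only; it simply restates the capacities quoted in the text using decimal units, where 1GB = 1000MB and 1TB = 1000GB):

```python
# Restating the storage comparisons above in decimal units
# (1 GB = 1000 MB, 1 TB = 1000 GB), as used in the text.

MB_PER_GB = 1000
GB_PER_TB = 1000

first_hard_drive_mb = 5     # the 1956 hard drive: 5 MB
modern_game_gb = 100        # a large modern game: 100 GB
modern_drive_tb = 10        # a large modern hard drive: 10 TB

print(first_hard_drive_mb / MB_PER_GB)                   # 0.005 GB
print(modern_game_gb * MB_PER_GB / first_hard_drive_mb)  # 20000.0 (times the 1956 drive)
print(modern_drive_tb * GB_PER_TB)                       # 10000 GB in a 10 TB drive
```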


The evolution of computers has been happening at a fast rate, and when this happens, people's contributions get left out. The main demographic left out of the history of computing is women. Grace Hopper is one of the most influential people in the field, but her work is rarely shown in the classroom. In the 1950s, Grace Hopper was a senior mathematician on her team working on UNIVAC (UNIVersal Automatic Computer). There she created the very first compiler (Cassel, 2016). This was a massive accomplishment for anyone in the field of computing because it established the idea that programming languages are not tied to a specific computer but can be used on any computer. This single feature was one of the main driving forces that made computing as robust and powerful as it is today. Grace Hopper's work needs to be talked about in classrooms, not only in engineering courses but in general classes as well. Students need to hear that a woman was a driving force behind the evolution of computing. Talking about this may encourage more women to join the computing field, because right now only 25% of jobs in the computing sector are held by women (Cassel, 2016). With a more diverse workforce in computing, we can see the creation of new ideas and features that were never thought of before.

During the evolution of computers, many people have been left out of the story of their creation, development, and algorithms. With the push for gender equality in the coming years, the disparity between the credit given to women and the credit given to men should shrink to a negligible amount. As computers continue to evolve, the world of STS will need to evolve with them to adapt to the changes in technology. If not, some of the great creations in the computer sector will be neglected; the most notable example here is VR (Virtual Reality), with its high entry-level price and the motion sickness that can come along with it.

How has the advancement in tech improved your life?

REFERENCES

A brief history of computers – unipi.it. (n.d.). Retrieved November 7, 2022, from http://digitaltools.labcd.unipi.it/wp-content/uploads/2021/05/A-brief-history-of-computers.pdf

Kleiman, K., & Saklayen, N. (2018, April 19). These 6 pioneering women helped create modern computers. ideas.ted.com. Retrieved September 26, 2021, from https://ideas.ted.com/how-i-discovered-six-pioneering-women-who-helped-create-modern-computers-and-why-we-should-never-forget-them/

Cassel, L. (2016, December 15). Op-Ed: 25 Years After Computing Pioneer Grace Hopper's Death, We Still Have Work to Do. USNews.com. advance-lexis-com.libproxy.clemson.edu/api/document?collection=news&id=urn:contentItem:5MDD-4VS1-JCKG-J4GB-00000-00&context=1516831

Thompson, C. (2019, June 1). The gendered history of human computers. Smithsonian.com. Retrieved September 26, 2021, from https://www.smithsonianmag.com/science-nature/history-human-computers-180972202/

Zimmermann, K. A. (2017, September 7). History of computers: A brief timeline. LiveScience. Retrieved September 26, 2021, from https://www.livescience.com/20718-computer-history.html

Women in Computing and Women in Engineering honored for promoting girls in STEM. (2017, May 26). US Official News. https://advance-lexis-com.libproxy.clemson.edu/api/document?collection=news&id=urn:contentItem:5NMW-SXG1-DXCW-D04G-00000-00&context=1516831

IMAGES

“Gene Amdahl’s first computer.” by Erik Pitti is licensed under CC BY 2.0

“First hard drives” by gabrielsaldana is licensed under CC BY 2.0

To the extent possible under law, Chandler Little has waived all copyright and related or neighboring rights to Science Technology and Society a Student Led Exploration , except where otherwise noted.


The Evolution of the Computer: How Did We Get to Where We Are Today?

The computer has become a mainstay in many people's lives. But were you previously aware of these interesting facts?

Computers have become essential parts of our modern lives. We use them for work, school, shopping, entertainment, and almost everything else.

But where did it all start for these groundbreaking devices, and what does the future hold for them? Here's how the computer has changed over time.

The First Automatic Computing Engine Didn't Look Like You'd Expect

The first automatic computing engine was designed in the 19th century and was called the Babbage Difference Engine. However, it was conceptualized beforehand by Johann Helfrich von Müller.

Müller was a German engineer who sketched a proposed structure for this computer on paper in the late 18th century. Unfortunately for him, though, technology hadn't yet reached the point where he could build the device himself.

The Babbage Difference Engine was huge, weighing around five tons. It was designed to solve complex mathematical problems that took a long time to solve manually.

Some argue that the Electronic Numerical Integrator And Computer (ENIAC) was the first actual computer. Over a century after the design of the Babbage Difference Engine, John Mauchly and J. Presper Eckert invented this device. They quickly filed for a patent, allowing them to take credit for the first computer.

The First Computer Monitor Was Created in 1973

Neither the Babbage Machine nor the ENIAC looks anything like the computers we use today. The first computer designed around a monitor was invented in 1973 and was called the Xerox Alto.

The Xerox Alto was developed by PARC, an American company, as a research system. Its user-friendly features added to its groundbreaking stance in the electronics industry. Even a child could operate this computer, which was pretty much impossible for the Babbage Difference Engine or ENIAC.


The Alto paved the way for modern-day Graphic User Interfaces (GUIs) with its easy-to-use graphics software. The Alto's screen used a bitmap display, a rudimentary computer output device, but pretty impressive for the 1970s. It even had its own mouse, though it looked rather different to the ones we use today.

The First Publicly Available Laptop Came Just Nine Years After the First Monitor

The Osborne 1 was the first laptop ever made available to the public. It was created by Adam Osborne, a British author and software publisher, in 1981. This laptop had its own display monitor, like the Xerox Alto, but it was only five inches wide. It had 64K of random access memory (RAM), 4K of read-only memory (ROM), and two floppy disk drives.

While this laptop certainly caused a lot of discussion in the industry upon its release, it wasn't exactly convenient as a portable computer. Initially, the Osborne 1 didn't have a battery. Therefore, you needed to plug it into an outlet at all times during use. Some years after its initial release, developers began including a battery in the laptop. However, this only provided one hour of wireless usage.

Not many people could afford the Osborne 1 when it was first released. The original price of the Osborne 1 in 1981 was $1,795. This is a shockingly high price point, given that standard laptops today can cost anywhere between $800 and $2,000—and have several more features. But considering that the Osborne 1 was the first ever laptop you could buy, can we really be mad at the price?

Modern Computers Have Pushed Boundaries We Could Never Have Previously Imagined

The computer has come an incredibly long way over the 21st century. We've seen drastic improvements to picture quality, memory storage, and battery life—plus several other elements. But what does the future hold for computers?

There are many things that people want to see in future computers: all-day battery life, even faster processing speeds, and even better graphics. Well, tech companies are already looking into improving these aspects. Lenovo has already released a laptop with a dual-display feature. Meanwhile, some computers now on the market—such as the Dell XPS 13 and the HP Spectre x360—have incredible 4K displays.

But more than anything, the quantum computer is now taking the stage as the future of computing. Quantum computers are pretty different from those you'll find in a store. These machines use the properties of quantum physics to perform operations. The key difference between traditional and quantum computers is the way they store data. Conventional computers use bits, whereas quantum computers use quantum bits—or "qubits".


Quantum computers can therefore generate solutions to huge problems with the help of complex learning algorithms. They can also consider and process multiple outcomes and combinations simultaneously, making them super-fast in their operation. The exciting part of this technology is that, in ten years, these computers could even provide solutions to substantial global crises, such as the COVID-19 pandemic and climate change. Such capabilities would potentially make quantum computers life-saving, which would be a huge technological step for humanity.
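To make the difference between bits and qubits slightly more concrete, here is a toy single-qubit simulation in Python (purely illustrative; it is a tiny state-vector sketch, not a real quantum computer or any particular quantum programming library):

```python
# Toy sketch: a classical bit is exactly 0 or 1, while a qubit holds two
# amplitudes and only yields 0 or 1 (with certain probabilities) when measured.
import math
import random

classical_bit = 1                 # a conventional bit: always exactly 0 or 1

# A qubit state |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
# Here we use an equal superposition of 0 and 1.
a = 1 / math.sqrt(2)
b = 1 / math.sqrt(2)

def measure(amp0, amp1):
    """Collapse the qubit: return 0 or 1 with probabilities |amp0|^2 and |amp1|^2."""
    return 0 if random.random() < abs(amp0) ** 2 else 1

counts = {0: 0, 1: 0}
for _ in range(1000):
    counts[measure(a, b)] += 1
print(counts)   # roughly 500 zeros and 500 ones
```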

There’s No Knowing What Computers Could End Up Doing, but It’s Incredible to Think About

With technology advancing so massively from decade to decade, one can only imagine what our computers will be able to do in 30 or 40 years.

With our home computers' current capabilities and the promising possibilities of quantum computers, we can only assume that computers will continue to change the world even more than they already have.


A model of a Babbage-style Difference Engine at the Computer History Museum. Photo by Cory Doctorow.

A brief history of computers

by Chris Woodford. Last updated: January 19, 2023.

Computers truly came into their own as great inventions in the last two decades of the 20th century. But their history stretches back more than 2500 years to the abacus: a simple calculator made from beads and wires, which is still used in some parts of the world today. The difference between an ancient abacus and a modern computer seems vast, but the principle—making repeated calculations more quickly than the human brain—is exactly the same.

Read on to learn more about the history of computers—or take a look at our article on how computers work .

Photo: A model of one of the world's first computers (the Difference Engine invented by Charles Babbage) at the Computer History Museum in Mountain View, California, USA. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

Cogs and Calculators

It is a measure of the brilliance of the abacus, invented in the Middle East circa 500 BC, that it remained the fastest form of calculator until the middle of the 17th century. Then, in 1642, aged only 18, French scientist and philosopher Blaise Pascal (1623–1662) invented the first practical mechanical calculator, the Pascaline, to help his tax-collector father do his sums. The machine had a series of interlocking cogs (gear wheels with teeth around their outer edges) that could add and subtract decimal numbers. Several decades later, in 1671, German mathematician and philosopher Gottfried Wilhelm Leibniz (1646–1716) came up with a similar but more advanced machine. Instead of using cogs, it had a "stepped drum" (a cylinder with teeth of increasing length around its edge), an innovation that survived in mechanical calculators for 300 years. The Leibniz machine could do much more than Pascal's: as well as adding and subtracting, it could multiply, divide, and work out square roots. Another pioneering feature was the first memory store or "register."

Apart from developing one of the world's earliest mechanical calculators, Leibniz is remembered for another important contribution to computing: he was the man who invented binary code, a way of representing any decimal number using only the two digits zero and one. Although Leibniz made no use of binary in his own calculator, it set others thinking. In 1854, a little over a century after Leibniz had died, Englishman George Boole (1815–1864) used the idea to invent a new branch of mathematics called Boolean algebra. [1] In modern computers, binary code and Boolean algebra allow computers to make simple decisions by comparing long strings of zeros and ones. But, in the 19th century, these ideas were still far ahead of their time. It would take another 50–100 years for mathematicians and computer scientists to figure out how to use them (find out more in our articles about calculators and logic gates ).
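To see how binary code and Boolean algebra let a machine "make simple decisions by comparing long strings of zeros and ones," here is a minimal sketch in Python (the numbers are arbitrary and chosen only for illustration):

```python
# Binary code: any decimal number can be written using only the digits 0 and 1.
x = 54
y = 29
print(format(x, "08b"))   # 00110110
print(format(y, "08b"))   # 00011101

# Boolean algebra: combine those strings of bits to make simple decisions.
print(format(x & y, "08b"))       # bitwise AND: 00010100
print(format(x | y, "08b"))       # bitwise OR:  00111111
print((x > 50) and not (y > 50))  # a decision built from Boolean logic: True
```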

Artwork: Pascaline: Two details of Blaise Pascal's 17th-century calculator. Left: The "user interface": the part where you dial in numbers you want to calculate. Right: The internal gear mechanism. Picture courtesy of US Library of Congress .

Engines of Calculation

Neither the abacus, nor the mechanical calculators constructed by Pascal and Leibniz really qualified as computers. A calculator is a device that makes it quicker and easier for people to do sums—but it needs a human operator. A computer, on the other hand, is a machine that can operate automatically, without any human help, by following a series of stored instructions called a program (a kind of mathematical recipe). Calculators evolved into computers when people devised ways of making entirely automatic, programmable calculators.

Photo: Punched cards: Herman Hollerith perfected the way of using punched cards and paper tape to store information and feed it into a machine. Here's a drawing from his 1889 patent Art of Compiling Statistics (US Patent#395,782), showing how a strip of paper (yellow) is punched with different patterns of holes (orange) that correspond to statistics gathered about people in the US census. Picture courtesy of US Patent and Trademark Office.

The first person to attempt this was a rather obsessive, notoriously grumpy English mathematician named Charles Babbage (1791–1871). Many regard Babbage as the "father of the computer" because his machines had an input (a way of feeding in numbers), a memory (something to store these numbers while complex calculations were taking place), a processor (the number-cruncher that carried out the calculations), and an output (a printing mechanism)—the same basic components shared by all modern computers. During his lifetime, Babbage never completed a single one of the hugely ambitious machines that he tried to build. That was no surprise. Each of his programmable "engines" was designed to use tens of thousands of precision-made gears. It was like a pocket watch scaled up to the size of a steam engine , a Pascal or Leibniz machine magnified a thousand-fold in dimensions, ambition, and complexity. For a time, the British government financed Babbage—to the tune of £17,000, then an enormous sum. But when Babbage pressed the government for more money to build an even more advanced machine, they lost patience and pulled out. Babbage was more fortunate in receiving help from Augusta Ada Byron (1815–1852), Countess of Lovelace, daughter of the poet Lord Byron. An enthusiastic mathematician, she helped to refine Babbage's ideas for making his machine programmable—and this is why she is still, sometimes, referred to as the world's first computer programmer. [2] Little of Babbage's work survived after his death. But when, by chance, his notebooks were rediscovered in the 1930s, computer scientists finally appreciated the brilliance of his ideas. Unfortunately, by then, most of these ideas had already been reinvented by others.

Artwork: Charles Babbage (1791–1871). Picture from The Illustrated London News, 1871, courtesy of US Library of Congress .

Babbage had intended that his machine would take the drudgery out of repetitive calculations. Originally, he imagined it would be used by the army to compile the tables that helped their gunners to fire cannons more accurately. Toward the end of the 19th century, other inventors were more successful in their effort to construct "engines" of calculation. American statistician Herman Hollerith (1860–1929) built one of the world's first practical calculating machines, which he called a tabulator, to help compile census data. Then, as now, a census was taken each decade but, by the 1880s, the population of the United States had grown so much through immigration that a full-scale analysis of the data by hand was taking seven and a half years. The statisticians soon figured out that, if trends continued, they would run out of time to compile one census before the next one fell due. Fortunately, Hollerith's tabulator was an amazing success: it tallied the entire census in only six weeks and completed the full analysis in just two and a half years. Soon afterward, Hollerith realized his machine had other applications, so he set up the Tabulating Machine Company in 1896 to manufacture it commercially. A few years later, it changed its name to the Computing-Tabulating-Recording (C-T-R) company and then, in 1924, acquired its present name: International Business Machines (IBM).

Photo: Keeping count: Herman Hollerith's late-19th-century census machine (blue, left) could process 12 separate bits of statistical data each minute. Its compact 1940 replacement (red, right), invented by Eugene M. La Boiteaux of the Census Bureau, could work almost five times faster. Photo by Harris & Ewing courtesy of US Library of Congress .

Bush and the bomb

Photo: Dr Vannevar Bush (1890–1974). Picture by Harris & Ewing, courtesy of US Library of Congress .

The history of computing remembers colorful characters like Babbage, but others who played important—if supporting—roles are less well known. At the time when C-T-R was becoming IBM, the world's most powerful calculators were being developed by US government scientist Vannevar Bush (1890–1974). In 1925, Bush made the first of a series of unwieldy contraptions with equally cumbersome names: the New Recording Product Integraph Multiplier. Later, he built a machine called the Differential Analyzer, which used gears, belts, levers, and shafts to represent numbers and carry out calculations in a very physical way, like a gigantic mechanical slide rule. Bush's ultimate calculator was an improved machine named the Rockefeller Differential Analyzer, assembled in 1935 from 320 km (200 miles) of wire and 150 electric motors . Machines like these were known as analog calculators—analog because they stored numbers in a physical form (as so many turns on a wheel or twists of a belt) rather than as digits. Although they could carry out incredibly complex calculations, it took several days of wheel cranking and belt turning before the results finally emerged.

Impressive machines like the Differential Analyzer were only one of several outstanding contributions Bush made to 20th-century technology. Another came as the teacher of Claude Shannon (1916–2001), a brilliant mathematician who figured out how electrical circuits could be linked together to process binary code with Boolean algebra (a way of comparing binary numbers using logic) and thus make simple decisions. During World War II, President Franklin D. Roosevelt appointed Bush chairman first of the US National Defense Research Committee and then director of the Office of Scientific Research and Development (OSRD). In this capacity, he was in charge of the Manhattan Project, the secret $2-billion initiative that led to the creation of the atomic bomb. One of Bush's final wartime contributions was to sketch out, in 1945, an idea for a memory-storing and sharing device called Memex that would later inspire Tim Berners-Lee to invent the World Wide Web . [3] Few outside the world of computing remember Vannevar Bush today—but what a legacy! As a father of the digital computer, an overseer of the atom bomb, and an inspiration for the Web, Bush played a pivotal role in three of the 20th-century's most far-reaching technologies.

Photo: "A gigantic mechanical slide rule": A differential analyzer pictured in 1938. Picture courtesy of and © University of Cambridge Computer Laboratory, published with permission via Wikimedia Commons under a Creative Commons (CC BY 2.0) licence.

Turing—tested

The first modern computers.

The World War II years were a crucial period in the history of computing, when powerful gargantuan computers began to appear. Just before the outbreak of the war, in 1938, German engineer Konrad Zuse (1910–1995) constructed his Z1, the world's first programmable binary computer, in his parents' living room. [4] The following year, American physicist John Atanasoff (1903–1995) and his assistant, electrical engineer Clifford Berry (1918–1963), built a more elaborate binary machine that they named the Atanasoff Berry Computer (ABC). It was a great advance—1000 times more accurate than Bush's Differential Analyzer. These were the first machines that used electrical switches to store numbers: when a switch was "off", it stored the number zero; flipped over to its other, "on", position, it stored the number one. Hundreds or thousands of switches could thus store a great many binary digits (although binary is much less efficient in this respect than decimal, since it takes up to ten binary digits to store a three-digit decimal number). These machines were digital computers: unlike analog machines, which stored numbers using the positions of wheels and rods, they stored numbers as digits.

The first large-scale digital computer of this kind appeared in 1944 at Harvard University, built by mathematician Howard Aiken (1900–1973). Sponsored by IBM, it was variously known as the Harvard Mark I or the IBM Automatic Sequence Controlled Calculator (ASCC). A giant of a machine, stretching 15m (50ft) in length, it was like a huge mechanical calculator built into a wall. It must have sounded impressive, because it stored and processed numbers using "clickety-clack" electromagnetic relays (electrically operated magnets that automatically switched lines in telephone exchanges)—no fewer than 3304 of them. Impressive they may have been, but relays suffered from several problems: they were large (that's why the Harvard Mark I had to be so big); they needed quite hefty pulses of power to make them switch; and they were slow (it took time for a relay to flip from "off" to "on" or from 0 to 1).

Photo: An analog computer being used in military research in 1949. Picture courtesy of NASA on the Commons (where you can download a larger version.

Most of the machines developed around this time were intended for military purposes. Like Babbage's never-built mechanical engines, they were designed to calculate artillery firing tables and chew through the other complex chores that were then the lot of military mathematicians. During World War II, the military co-opted thousands of the best scientific minds: recognizing that science would win the war, Vannevar Bush's Office of Scientific Research and Development employed 10,000 scientists from the United States alone. Things were very different in Germany. When Konrad Zuse offered to build his Z2 computer to help the army, they couldn't see the need—and turned him down.

On the Allied side, great minds began to make great breakthroughs. In 1943, a team of mathematicians based at Bletchley Park near London, England (including Alan Turing) built a computer called Colossus to help them crack secret German codes. Colossus was the first fully electronic computer. Instead of relays, it used a better form of switch known as a vacuum tube (also known, especially in Britain, as a valve). The vacuum tube, each one about as big as a person's thumb (earlier ones were very much bigger) and glowing red hot like a tiny electric light bulb, had been invented in 1906 by Lee de Forest (1873–1961), who named it the Audion. This breakthrough earned de Forest his nickname as "the father of radio" because their first major use was in radio receivers , where they amplified weak incoming signals so people could hear them more clearly. [5] In computers such as the ABC and Colossus, vacuum tubes found an alternative use as faster and more compact switches.

Just like the codes it was trying to crack, Colossus was top-secret and its existence wasn't confirmed until after the war ended. As far as most people were concerned, vacuum tubes were pioneered by a more visible computer that appeared in 1946: the Electronic Numerical Integrator And Calculator (ENIAC). The ENIAC's inventors, two scientists from the University of Pennsylvania, John Mauchly (1907–1980) and J. Presper Eckert (1919–1995), were originally inspired by Bush's Differential Analyzer; years later Eckert recalled that ENIAC was the "descendant of Dr Bush's machine." But the machine they constructed was far more ambitious. It contained nearly 18,000 vacuum tubes (nine times more than Colossus), was around 24 m (80 ft) long, and weighed almost 30 tons. ENIAC is generally recognized as the world's first fully electronic, general-purpose, digital computer. Colossus might have qualified for this title too, but it was designed purely for one job (code-breaking); since it couldn't store a program, it couldn't easily be reprogrammed to do other things.

Photo: Sir Maurice Wilkes (left), his collaborator William Renwick, and the early EDSAC-1 electronic computer they built in Cambridge, pictured around 1947/8. Picture courtesy of and © University of Cambridge Computer Laboratory, published with permission via Wikimedia Commons under a Creative Commons (CC BY 2.0) licence.

ENIAC was just the beginning. Its two inventors formed the Eckert Mauchly Computer Corporation in the late 1940s. Working with a brilliant Hungarian mathematician, John von Neumann (1903–1957), who was based at Princeton University, they then designed a better machine called EDVAC (Electronic Discrete Variable Automatic Computer). In a key piece of work, von Neumann helped to define how the machine stored and processed its programs, laying the foundations for how all modern computers operate. [6] After EDVAC, Eckert and Mauchly developed UNIVAC 1 (UNIVersal Automatic Computer) in 1951. They were helped in this task by a young, largely unknown American mathematician and Naval reserve named Grace Murray Hopper (1906–1992), who had originally been employed by Howard Aiken on the Harvard Mark I. Like Herman Hollerith's tabulator over 50 years before, UNIVAC 1 was used for processing data from the US census. It was then manufactured for other users—and became the world's first large-scale commercial computer.

Machines like Colossus, the ENIAC, and the Harvard Mark I compete for significance and recognition in the minds of computer historians. Which one was truly the first great modern computer? All of them and none: these—and several other important machines—evolved our idea of the modern electronic computer during the key period between the late 1930s and the early 1950s. Among those other machines were pioneering computers put together by English academics, notably the Manchester/Ferranti Mark I, built at Manchester University by Frederic Williams (1911–1977) and Thomas Kilburn (1921–2001), and the EDSAC (Electronic Delay Storage Automatic Calculator), built by Maurice Wilkes (1913–2010) at Cambridge University. [7]

Photo: Control panel of the UNIVAC 1, the world's first large-scale commercial computer. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

The microelectronic revolution

Vacuum tubes were a considerable advance on relay switches, but machines like the ENIAC were notoriously unreliable. The modern term for a problem that holds up a computer program is a "bug." Popular legend has it that this word entered the vocabulary of computer programmers sometime in the 1950s when moths, attracted by the glowing lights of vacuum tubes, flew inside machines like the ENIAC, caused a short circuit, and brought work to a juddering halt. But there were other problems with vacuum tubes too. They consumed enormous amounts of power: the ENIAC used about 2000 times as much electricity as a modern laptop. And they took up huge amounts of space. Military needs were driving the development of machines like the ENIAC, but the sheer size of vacuum tubes had now become a real problem. ABC had used 300 vacuum tubes, Colossus had 2000, and the ENIAC had 18,000. The ENIAC's designers had boasted that its calculating speed was "at least 500 times as great as that of any other existing computing machine." But developing computers that were an order of magnitude more powerful still would have needed hundreds of thousands or even millions of vacuum tubes—which would have been far too costly, unwieldy, and unreliable. So a new technology was urgently required.

The solution appeared in 1947 thanks to three physicists working at Bell Telephone Laboratories (Bell Labs). John Bardeen (1908–1991), Walter Brattain (1902–1987), and William Shockley (1910–1989) were then helping Bell to develop new technology for the American public telephone system, so the electrical signals that carried phone calls could be amplified more easily and carried further. Shockley, who was leading the team, believed he could use semiconductors (materials such as germanium and silicon that allow electricity to flow through them only when they've been treated in special ways) to make a better form of amplifier than the vacuum tube. When his early experiments failed, he set Bardeen and Brattain to work on the task for him. Eventually, in December 1947, they created a new form of amplifier that became known as the point-contact transistor. Bell Labs credited Bardeen and Brattain with the transistor and awarded them a patent. This enraged Shockley and prompted him to invent an even better design, the junction transistor, which has formed the basis of most transistors ever since.

Like vacuum tubes, transistors could be used as amplifiers or as switches. But they had several major advantages. They were a fraction the size of vacuum tubes (typically about as big as a pea), used no power at all unless they were in operation, and were virtually 100 percent reliable. The transistor was one of the most important breakthroughs in the history of computing and it earned its inventors the world's greatest science prize, the 1956 Nobel Prize in Physics . By that time, however, the three men had already gone their separate ways. John Bardeen had begun pioneering research into superconductivity , which would earn him a second Nobel Prize in 1972. Walter Brattain moved to another part of Bell Labs.

William Shockley decided to stick with the transistor, eventually forming his own corporation to develop it further. His decision would have extraordinary consequences for the computer industry. With a small amount of capital, Shockley set about hiring the best brains he could find in American universities, including young electrical engineer Robert Noyce (1927–1990) and research chemist Gordon Moore (1929–). It wasn't long before Shockley's idiosyncratic and bullying management style upset his workers. In 1957, eight of them—including Noyce and Moore—left Shockley Transistor to found a company of their own, Fairchild Semiconductor, just down the road. Thus began the growth of "Silicon Valley," the part of California centered on Palo Alto, where many of the world's leading computer and electronics companies have been based ever since. [8]

It was in Fairchild's California building that the next breakthrough occurred—although, somewhat curiously, it also happened at exactly the same time in the Dallas laboratories of Texas Instruments. In Dallas, a young engineer from Kansas named Jack Kilby (1923–2005) was considering how to improve the transistor. Although transistors were a great advance on vacuum tubes, one key problem remained. Machines that used thousands of transistors still had to be hand wired to connect all these components together. That process was laborious, costly, and error prone. Wouldn't it be better, Kilby reflected, if many transistors could be made in a single package? This prompted him to invent the "monolithic" integrated circuit (IC) , a collection of transistors and other components that could be manufactured all at once, in a block, on the surface of a semiconductor. Kilby's invention was another step forward, but it also had a drawback: the components in his integrated circuit still had to be connected by hand. While Kilby was making his breakthrough in Dallas, unknown to him, Robert Noyce was perfecting almost exactly the same idea at Fairchild in California. Noyce went one better, however: he found a way to include the connections between components in an integrated circuit, thus automating the entire process.

Photo: An integrated circuit from the 1980s. This is an EPROM chip (effectively a forerunner of flash memory , which you could only erase with a blast of ultraviolet light).

Mainframes, minis, and micros

Photo: An IBM 704 mainframe pictured at NASA in 1958. Designed by Gene Amdahl, this scientific number cruncher was the successor to the 701 and helped pave the way to arguably the most important IBM computer of all time, the System/360, which Amdahl also designed. Photo courtesy of NASA .

Photo: The control panel of DEC's classic 1965 PDP-8 minicomputer. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

Integrated circuits, as much as transistors, helped to shrink computers during the 1960s. In 1943, IBM boss Thomas Watson had reputedly quipped: "I think there is a world market for about five computers." Just two decades later, the company and its competitors had installed around 25,000 large computer systems across the United States. As the 1960s wore on, integrated circuits became increasingly sophisticated and compact. Soon, engineers were speaking of large-scale integration (LSI), in which hundreds of components could be crammed onto a single chip, and then very large-scale integration (VLSI), when the same chip could contain thousands of components.

The logical conclusion of all this miniaturization was that, someday, someone would be able to squeeze an entire computer onto a chip. In 1968, Robert Noyce and Gordon Moore had left Fairchild to establish a new company of their own. With integration very much in their minds, they called it Integrated Electronics or Intel for short. Originally they had planned to make memory chips, but when the company landed an order to make chips for a range of pocket calculators, history headed in a different direction. A couple of their engineers, Federico Faggin (1941–) and Marcian Edward (Ted) Hoff (1937–), realized that instead of making a range of specialist chips for a range of calculators, they could make a universal chip that could be programmed to work in them all. Thus was born the general-purpose, single chip computer or microprocessor—and that brought about the next phase of the computer revolution.

Personal computers

By 1974, Intel had launched a popular microprocessor known as the 8080 and computer hobbyists were soon building home computers around it. The first was the MITS Altair 8800, built by Ed Roberts. With its front panel covered in red LED lights and toggle switches, it was a far cry from modern PCs and laptops. Even so, it sold by the thousand and earned Roberts a fortune. The Altair inspired a Californian electronics wizard named Steve Wozniak (1950–) to develop a computer of his own. "Woz" is often described as the hacker's "hacker"—a technically brilliant and highly creative engineer who pushed the boundaries of computing largely for his own amusement. In the mid-1970s, he was working at the Hewlett-Packard computer company in California, and spending his free time tinkering away as a member of the Homebrew Computer Club in the Bay Area.

After seeing the Altair, Woz used a 6502 microprocessor (made by an Intel rival, Mos Technology) to build a better home computer of his own: the Apple I. When he showed off his machine to his colleagues at the club, they all wanted one too. One of his friends, Steve Jobs (1955–2011), persuaded Woz that they should go into business making the machine. Woz agreed so, famously, they set up Apple Computer Corporation in a garage belonging to Jobs' parents. After selling 175 of the Apple I for the devilish price of $666.66, Woz built a much better machine called the Apple ][ (pronounced "Apple Two"). While the Altair 8800 looked like something out of a science lab, and the Apple I was little more than a bare circuit board, the Apple ][ took its inspiration from such things as Sony televisions and stereos: it had a neat and friendly looking cream plastic case. Launched in April 1977, it was the world's first easy-to-use home "microcomputer." Soon home users, schools, and small businesses were buying the machine in their tens of thousands—at $1298 a time. Two things turned the Apple ][ into a really credible machine for small firms: a disk drive unit, launched in 1978, which made it easy to store data; and a spreadsheet program called VisiCalc, which gave Apple users the ability to analyze that data. In just two and a half years, Apple sold around 50,000 of the machine, quickly accelerating out of Jobs' garage to become one of the world's biggest companies. Dozens of other microcomputers were launched around this time, including the TRS-80 from Radio Shack (Tandy in the UK) and the Commodore PET. [9]

Apple's success selling to businesses came as a great shock to IBM and the other big companies that dominated the computer industry. It didn't take a VisiCalc spreadsheet to figure out that, if the trend continued, upstarts like Apple would undermine IBM's immensely lucrative business market selling "Big Blue" computers. In 1980, IBM finally realized it had to do something and launched a highly streamlined project to save its business. One year later, it released the IBM Personal Computer (PC), based on an Intel 8088 microprocessor, which rapidly reversed the company's fortunes and stole the market back from Apple.

The PC was successful essentially for one reason. All the dozens of microcomputers that had been launched in the 1970s—including the Apple ][—were incompatible. All used different hardware and worked in different ways. Most were programmed using a simple, English-like language called BASIC, but each one used its own flavor of BASIC, which was tied closely to the machine's hardware design. As a result, programs written for one machine would generally not run on another one without a great deal of conversion. Companies who wrote software professionally typically wrote it just for one machine and, consequently, there was no software industry to speak of.

In 1976, Gary Kildall (1942–1994), a teacher and computer scientist, and one of the founders of the Homebrew Computer Club, had figured out a solution to this problem. Kildall wrote an operating system (a computer's fundamental control software) called CP/M that acted as an intermediary between the user's programs and the machine's hardware. With a stroke of genius, Kildall realized that all he had to do was rewrite CP/M so it worked on each different machine. Then all those machines could run identical user programs—without any modification at all—inside CP/M. That would make all the different microcomputers compatible at a stroke. By the early 1980s, Kildall had become a multimillionaire through the success of his invention: the first personal computer operating system. Naturally, when IBM was developing its personal computer, it approached him hoping to put CP/M on its own machine. Legend has it that Kildall was out flying his personal plane when IBM called, so missed out on one of the world's greatest deals. But the truth seems to have been that IBM wanted to buy CP/M outright for just $200,000, while Kildall recognized his product was worth millions more and refused to sell. Instead, IBM turned to a young programmer named Bill Gates (1955–). His then tiny company, Microsoft, rapidly put together an operating system called DOS, based on a product called QDOS (Quick and Dirty Operating System), which they acquired from Seattle Computer Products. Some believe Microsoft and IBM cheated Kildall out of his place in computer history; Kildall himself accused them of copying his ideas. Others think Gates was simply the shrewder businessman. Either way, the IBM PC, powered by Microsoft's operating system, was a runaway success.
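As a rough sketch of the idea behind CP/M (a software layer between user programs and each machine's hardware, so the same program can run unmodified on different machines), here is a minimal, hypothetical illustration in Python; the class and function names are invented for this example and are not CP/M's real interfaces:

```python
# Minimal sketch of the operating-system-as-intermediary idea: user programs
# call a common OS layer, and only that layer is rewritten per machine.
# All names here are invented for illustration; they are not real CP/M APIs.

class MachineA:
    def write_char(self, ch):
        print(ch, end="")            # imagine this driving machine A's video hardware

class MachineB:
    def write_char(self, ch):
        print(ch.lower(), end="")    # imagine this driving machine B's serial terminal

class TinyOS:
    """The hardware-dependent layer: one small port of this per machine."""
    def __init__(self, hardware):
        self.hardware = hardware

    def print_string(self, text):
        for ch in text:
            self.hardware.write_char(ch)
        print()

def user_program(os):
    """A user program written once, against the OS interface only."""
    os.print_string("HELLO FROM THE SAME PROGRAM")

# The same unmodified program runs on two different machines:
user_program(TinyOS(MachineA()))
user_program(TinyOS(MachineB()))
```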

Yet IBM's victory was short-lived. Cannily, Bill Gates had sold IBM the rights to one flavor of DOS (PC-DOS) and retained the rights to a very similar version (MS-DOS) for his own use. When other computer manufacturers, notably Compaq and Dell, started making IBM-compatible (or "cloned") hardware, they too came to Gates for the software. IBM charged a premium for machines that carried its badge, but consumers soon realized that PCs were commodities: they contained almost identical components—an Intel microprocessor, for example—no matter whose name they had on the case. As IBM lost market share, the ultimate victors were Microsoft and Intel, who were soon supplying the software and hardware for almost every PC on the planet. Apple, IBM, and Kildall made a great deal of money—but all failed to capitalize decisively on their early success. [10]

Photo: Personal computers threatened companies making large "mainframes" like this one. Picture courtesy of NASA on the Commons.

The user revolution

Fortunately for Apple, it had another great idea. One of the Apple II's strongest suits was its sheer "user-friendliness." For Steve Jobs, developing truly easy-to-use computers became a personal mission in the early 1980s. What truly inspired him was a visit to PARC (Palo Alto Research Center), a cutting-edge computer laboratory then run as a division of the Xerox Corporation. Xerox had started developing computers in the early 1970s, believing they would make paper (and the highly lucrative photocopiers Xerox made) obsolete. One of PARC's research projects was an advanced $40,000 computer called the Xerox Alto. Unlike most microcomputers launched in the 1970s, which were programmed by typing in text commands, the Alto had a desktop-like screen with little picture icons that could be moved around with a mouse: it was the very first graphical user interface (GUI, pronounced "gooey")—an idea conceived by Alan Kay (1940–) and now used in virtually every modern computer. The Alto borrowed some of its ideas, including the mouse , from 1960s computer pioneer Douglas Engelbart (1925–2013).

Photo: During the 1980s, computers started to converge on the same basic "look and feel," largely inspired by the work of pioneers like Alan Kay and Douglas Engelbart. Photographs in the Carol M. Highsmith Archive, courtesy of US Library of Congress, Prints and Photographs Division.

Back at Apple, Jobs launched his own version of the Alto project to develop an easy-to-use computer called PITS (Person In The Street). This machine became the Apple Lisa, launched in January 1983—the first widely available computer with a GUI desktop. With a retail price of $10,000, over three times the cost of an IBM PC, the Lisa was a commercial flop. But it paved the way for a better, cheaper machine called the Macintosh that Jobs unveiled a year later, in January 1984. With its memorable launch ad for the Macintosh, inspired by George Orwell's novel 1984 and directed by Ridley Scott (director of the dystopic movie Blade Runner), Apple took a swipe at IBM's monopoly, criticizing what it portrayed as the firm's domineering—even totalitarian—approach: Big Blue was really Big Brother. Apple's ad promised a very different vision: "On January 24, Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like '1984'." The Macintosh was a critical success and helped to invent the new field of desktop publishing in the mid-1980s, yet it never came close to challenging IBM's position.

Ironically, Jobs' easy-to-use machine also helped Microsoft to dislodge IBM as the world's leading force in computing. When Bill Gates saw how the Macintosh worked, with its easy-to-use picture-icon desktop, he launched Windows, an upgraded version of his MS-DOS software. Apple saw this as blatant plagiarism and filed a $5.5 billion copyright lawsuit in 1988. Four years later, the case collapsed with Microsoft effectively securing the right to use the Macintosh "look and feel" in all present and future versions of Windows. Microsoft's Windows 95 system, launched three years later, had an easy-to-use, Macintosh-like desktop and MS-DOS running behind the scenes.

Photo: The IBM Blue Gene/P supercomputer at Argonne National Laboratory: one of the world's most powerful computers. Picture courtesy of Argonne National Laboratory published on Wikimedia Commons in 2009 under a Creative Commons Licence .

From nets to the Internet

Standardized PCs running standardized software brought a big benefit for businesses: computers could be linked together into networks to share information. At Xerox PARC in 1973, electrical engineer Bob Metcalfe (1946–) developed a new way of linking computers "through the ether" (empty space) that he called Ethernet. A few years later, Metcalfe left Xerox to form his own company, 3Com, to help companies realize "Metcalfe's Law": computers become useful the more closely connected they are to other people's computers. As more and more companies explored the power of local area networks (LANs), so, as the 1980s progressed, it became clear that there were great benefits to be gained by connecting computers over even greater distances—into so-called wide area networks (WANs).
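The intuition behind Metcalfe's Law is easy to quantify: the number of possible point-to-point links in a network of n machines grows roughly with the square of n. A minimal Python sketch, with purely illustrative network sizes:

```python
def possible_links(n):
    """Number of distinct pairwise connections among n computers: n(n-1)/2."""
    return n * (n - 1) // 2

for n in (2, 10, 100, 1_000):
    print(f"{n:>5} computers -> {possible_links(n):>7,} possible links")
```

Doubling the number of connected machines roughly quadruples the number of possible connections, which is why linking computers together, first into LANs and then into WANs, made each machine more valuable.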

Photo: Computers aren't what they used to be: they're much less noticeable because they're much more seamlessly integrated into everyday life. Some are "embedded" into household gadgets like coffee makers or televisions . Others travel round in our pockets in our smartphones—essentially pocket computers that we can program simply by downloading "apps" (applications).

Today, the best known WAN is the Internet—a global network of individual computers and LANs that links up hundreds of millions of people. The history of the Internet is another story, but it began in the 1960s when four American universities launched a project to connect their computer systems together to make the first WAN. Later, with funding from the Department of Defense, that network became a bigger project called ARPANET (Advanced Research Projects Agency Network). In the mid-1980s, the US National Science Foundation (NSF) launched its own WAN called NSFNET. The convergence of all these networks produced what we now call the Internet later in the 1980s. Shortly afterward, the power of networking gave British computer programmer Tim Berners-Lee (1955–) his big idea: to combine the power of computer networks with the information-sharing idea Vannevar Bush had proposed in 1945. Thus was born the World Wide Web—an easy way of sharing information over a computer network, which made possible the modern age of cloud computing (where anyone can access vast computing power over the Internet without having to worry about where or how their data is processed). It's Tim Berners-Lee's invention that brings you this potted history of computing today!


Other websites

There are lots of websites covering computer history. Here are just a few favorites worth exploring!

  • The Computer History Museum : The website of the world's biggest computer museum in California.
  • The Computing Age : A BBC special report into computing past, present, and future.
  • Charles Babbage at the London Science Museum : Lots of information about Babbage and his extraordinary engines. [Archived via the Wayback Machine]
  • IBM History : Many fascinating online exhibits, as well as inside information about the part IBM inventors have played in wider computer history.
  • Wikipedia History of Computing Hardware : covers similar ground to this page.
  • Computer history images : A small but interesting selection of photos.
  • Transistorized! : The history of the invention of the transistor from PBS.
  • Intel Museum : The story of Intel's contributions to computing from the 1970s onward.

There are some superb computer history videos on YouTube and elsewhere; here are three good ones to start you off:

  • The Difference Engine : A great introduction to Babbage's Difference Engine from Doron Swade, one of the world's leading Babbage experts.
  • The ENIAC : A short Movietone news clip about the completion of the world's first programmable electronic computer.
  • A tour of the Computer History Museum : Dag Spicer gives us a tour of the world's most famous computer museum, in California.


Text copyright © Chris Woodford 2006, 2023. All rights reserved.


The History and Evolution of Computers

The "abacus" was the first non-electric computer. The abacus, also called a counting frame, is a rack of sliding beads and/or pebbles used for performing arithmetic. It allowed users to compute numbers by adding, subtracting, multiplying, and dividing, and it required no electricity at all. It was developed approximately 5,000 years ago. The Mesopotamians,

Egyptians, Greeks, Romans, Chinese, Indians, Japanese, Koreans, Native Americans, and Russians all had their own versions of the abacus (different materials were used, better suited to their own environments). 2. What were the major innovations of first-generation computers over the mechanical era? The major innovations of first-generation computers included the use of electronic switches: with the help of vacuum tubes, these electronic switches could open and close approximately 1,000 times faster than mechanical switches. Input to these first-generation computers was primarily through "punch cards".

Information was stored on magnetic tape made of a thin magnetizable coating on a very long and narrow strip of plastic. The memory of these computers was equivalent to about 20 words. Since these computers used a large number of vacuum tubes, a "small computer" was about the size of an entire room. A large amount of power was necessary to run these machines. The ENIAC, a machine built by John Mauchly and J. Presper Eckert, had 29 power supplies to keep it up and running. The ENIAC had 18,000 vacuum tubes, which generated a very large amount of heat.

A very coordinated and elaborate fan system was designed to cool it down. First-generation computers had no memory (no ability to store data that could be retrieved at a later time, e.g., RAM, hard disks, etc.). 3. What were the major innovations of second-generation computers? During the second generation of computers, many changes were seen, including the use of transistors, which replaced the vacuum tubes of the first generation. These transistors were very small and considerably faster than vacuum tubes, resulting in more compact computers without any loss of speed.

For the first time in history, high-level programming languages were created. Because these languages used common English words, they made it easier for programmers to create complex programs. 4. What is a mainframe? "Mainframe" computers are known as very large and expensive first- and second-generation computers. They were used by major companies to handle corporate and business information and were capable of handling hundreds to thousands of users at a time.

These computers can still be used today in banks to handle all the transactions that are happening at the same time. A regular computer of that era would not be able to handle that many transactions and processes and would probably overheat because of the overload of information and data. 5. What were the major innovations of third-generation computers? One major area in which third-generation computers improved on their predecessors was speed and power (performance). Other innovations included integrated circuits with built-in transistors numbering in the thousands.

With integrated circuits that had wires connecting to one computer chip, that single chip was faster and more powerful than the computers from the previous generations. These computer chips allowed more information to be stored than ever before, and all of it could be accessed quickly at any time. This made computers of this generation stand out greatly against others. Operating systems for computers became mainstream, and they were developed to easily control the overall activity of the computer.

By using these operating systems, the computer hardware could be controlled efficiently by the software. 6. What were the major innovations of fourth-generation computers? The "Big Bang" of this generation was the development of the "microprocessor". It was a single chip that could process data as well as store information in memory for later retrieval, and it could produce output data. This innovation led to smaller and more compact computers available at much lower prices, which ultimately led to the making of the "PC" (or Personal Computer).

These computers were designed for specific purposes, for people who had a real need for them. Computers of the previous generations weren't really designed for a purpose other than experiments and tests. Because of this, many people saw real value in these computers and purchased them. More programming languages were created, which provided much easier-to-use software for the "average Joe" and resulted in word processing and spreadsheet software, as well as some early games of the time. These computers used large floppy disks and/or cassette tapes.

7. What were the major innovations of fifth-generation computers? Fifth-generation computers are widely known today for "parallel processing". Parallel processing means that more than one processor can work on different parts of a task at the same time, which results in a major speed improvement: most applications have many different parts, and when different processors each take up their own individual part, the result is much faster performance. The use of networked computers can also increase speed and allow users to complete tasks much faster.
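As a rough illustration of that idea, here is a minimal Python sketch that splits one task, summing a large range of integers, into four parts and hands each part to a separate worker process. The chunk boundaries and worker count are arbitrary choices for the example, not tied to any particular generation of hardware.

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum the integers in [start, stop) -- one 'part' of the overall task."""
    start, stop = bounds
    return sum(range(start, stop))

if __name__ == "__main__":
    # Split one big job (summing 0..99,999,999) into four independent parts.
    chunks = [(0, 25_000_000), (25_000_000, 50_000_000),
              (50_000_000, 75_000_000), (75_000_000, 100_000_000)]
    with Pool(processes=4) as pool:   # four workers run their parts at the same time
        total = sum(pool.map(partial_sum, chunks))
    print(total)                      # same answer as sum(range(100_000_000))
```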

As the prices dropped for PC’s ( Personal Computers ) more people started buying them and thus improving their software ecosystem with multiple new applications and programs. In this generation he Internet grew help make it what it is today in the sixth-generation. The use of the internet allowed many people to connect over long distances. 8. What is a microprocessor? An integrated circuit that contains all the functions of a central processing unit of a computer but into single integrated circuit (though can sometimes be found in multiple circuits). In other words, a COP].

It can also be defined as a chip of silicon (chosen for its semiconductive properties) that contains a CPU. In most cases, the higher the frequency the CPU runs at, the faster the microprocessor, though this is not always the whole story. For example, the fastest computer in the world as of Thursday, September 5, 2013, was the Tianhe-2 (MilkyWay-2), made up of a very large number of Intel Xeon E5-2692 processors, all running at a frequency of 2.2 GHz. This computer is in China and belongs to the National University of Defense Technology. My Intel i3 550 runs at 3.2 GHz, and my friend's overclocked Intel i7 runs at 4.4 GHz, but my main point is that these computers are no match for that Xeon system, which achieves far greater overall performance despite its lower clock speed. So a higher CPU frequency doesn't necessarily mean a faster computer, but it usually helps; a rough calculation after this paragraph makes the point concrete.
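A back-of-the-envelope way to see why clock speed alone is misleading: rough peak throughput scales with cores times clock rate times work per cycle. The Python sketch below is illustrative only; the 4-core/4.4 GHz figure stands in for a desktop chip, and the roughly 3.12 million cores at 2.2 GHz is the commonly quoted core count for Tianhe-2, treated here as an assumption rather than a benchmark.

```python
def peak_ops_per_second(cores, ghz, ops_per_cycle=1):
    """Very rough peak throughput: cores x clock rate x operations per cycle."""
    return cores * ghz * 1e9 * ops_per_cycle

one_fast_desktop_chip = peak_ops_per_second(cores=4, ghz=4.4)
tianhe2_scale_system = peak_ops_per_second(cores=3_120_000, ghz=2.2)  # assumed core count
print(f"{one_fast_desktop_chip:.2e} vs {tianhe2_scale_system:.2e} ops/second")
# The slower-clocked system wins by orders of magnitude because it has far more cores.
```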

9. How did microprocessors change computers? Microprocessors changed computers in a way that made them faster, more reliable, and more energy efficient, as well as more compact than the "mechanical processors" of the older generations. By limiting the number of moving parts in a processor and using mainly a single integrated circuit, microprocessors used less power, packed in large numbers of transistors (which made them faster), became more reliable through built-in memory (cache), and became more compact (smaller, taking up less space), which also made them cheaper and more affordable for the consumer market. 10. What were the major innovations of sixth-generation computers? One of the major innovations of sixth-generation computers is that the speed of the central processing unit (CPU) increased greatly. Improvements to computer networking included the huge and rapid growth of Wide Area Networks (WANs).

Eventually, as the 21st century arrived, the number of transistors and the overall speed of microprocessors roughly doubled every 18 months or so (this is essentially "Moore's Law"), computer parts became very cheap and more affordable, and computers became more compact and portable, so that we now see much faster ones in our latest computers, tablets, cell phones, smartwatches, and even TVs.
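To make the doubling rule concrete, here is a small Python sketch that projects a transistor count forward under an assumed 18-month doubling period, starting from the roughly 2,300 transistors of the Intel 4004 (1971). The projected figures are illustrative of the compounding effect, not actual chip specifications.

```python
def projected_transistors(start_count, years, doubling_period_years=1.5):
    """Project a transistor count forward, doubling every ~18 months (Moore's Law)."""
    return start_count * 2 ** (years / doubling_period_years)

# Starting from the Intel 4004's roughly 2,300 transistors in 1971:
for year in (1971, 1981, 1991, 2001, 2011, 2021):
    count = projected_transistors(2_300, year - 1971)
    print(f"{year}: ~{count:,.0f} transistors")
```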

11. What trends can you identify over the first six generations of computing? Technology kept advancing over the years, and every time a new generation of computers was made, people eventually outgrew it and decided to make a better version. People always keep inventing and innovating, and they will go on until they hit a physical limit. We never stick to one type of computer; we try to find and make a better version of it in order to suit our demanding needs.

What I actually saw is that things moved from mechanical to electronic; we want to do more with the least amount of effort possible. For example, we used the abacus in our early history of math and computing, then all of a sudden the 20th and 21st centuries saw major improvements that changed the world, how we communicate, and how we interact. Storage space and the speed of reading and writing to that storage kept improving, so that it doesn't take very long to complete a desired task.

Over the generations, people wanted more speed, more portability, and less power consumption, which are the things that most distinguished each generation from the others. 12. What competencies would a "computer expert" need at each generation? A "computer expert" would first need to know how the computer came to be. They would need to know the parts of computers: how they used to work, how they work now, and how they might work in the future. For the first generation, they would need to know how vacuum tubes work and how they affect performance.

For the second generation, they would mainly need to know how transistors work, as transistors were one of the main innovations of the generation. For the third generation, they would need to know much more about transistors, capacitors, and resistors, as these were central to computer chips (which were mainstream at that time). For the fourth generation, a "computer expert" would need a good knowledge of how a microprocessor works and of data input/output.

They would also need some basic knowledge of software coding (as programming was just taking off at that time). For the fifth and sixth generations, I believe a computer expert would need to know all about the Internet and HTML, as well as other graphics and web coding used online. They should be able to create software for computers to enhance the user experience, in addition to everything listed above.


History of computers: A brief timeline

The history of computers began with primitive designs in the early 19th century and went on to change the world during the 20th century.


The history of computers goes back over 200 years. First theorized by mathematicians and entrepreneurs, mechanical calculating machines were designed and built during the 19th century to solve increasingly complex number-crunching challenges. The advancement of technology enabled ever more complex computers by the early 20th century, and computers became larger and more powerful.

Today, computers are almost unrecognizable from designs of the 19th century, such as Charles Babbage's Analytical Engine — or even from the huge computers of the 20th century that occupied whole rooms, such as the Electronic Numerical Integrator and Calculator.  

Here's a brief history of computers, from their primitive number-crunching origins to the powerful modern-day machines that surf the Internet, run games and stream multimedia. 

19th century

1801: Joseph Marie Jacquard, a French merchant and inventor, invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1821: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. Funded by the British government, the project, called the "Difference Engine," fails due to the lack of technology at the time, according to the University of Minnesota.

1843: Ada Lovelace, an English mathematician and the daughter of poet Lord Byron, writes the world's first computer program. According to Anna Siffert, a professor of theoretical mathematics at the University of Münster in Germany, Lovelace writes the first program while translating a paper on Babbage's Analytical Engine from French into English. "She also provides her own comments on the text. Her annotations, simply called 'notes,' turn out to be three times as long as the actual transcript," Siffert wrote in an article for The Max Planck Society. "Lovelace also adds a step-by-step description for computation of Bernoulli numbers with Babbage's machine — basically an algorithm — which, in effect, makes her the world's first computer programmer." Bernoulli numbers are a sequence of rational numbers often used in computation.
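For readers curious what such an algorithm looks like in a modern language, here is a short Python sketch that computes the first few Bernoulli numbers from the standard recurrence. It is only an illustration of the kind of calculation Lovelace described, not a transcription of her actual program for the Analytical Engine.

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(n):
    """Return B_0 .. B_n as exact fractions, using the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1 (convention B_1 = -1/2)."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        B[m] = -sum(Fraction(comb(m + 1, k)) * B[k] for k in range(m)) / (m + 1)
    return B

print(bernoulli_numbers(8))
# Values: B_0..B_8 = 1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30
```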

Babbage's Analytical Engine

1853: Swedish inventor Per Georg Scheutz and his son Edvard design the world's first printing calculator. The machine is significant for being the first to "compute tabular differences and print the results," according to Uta C. Merzbach's book, " Georg Scheutz and the First Printing Calculator " (Smithsonian Institution Press, 1977).

1890: Herman Hollerith designs a punch-card system to help calculate the 1890 U.S. Census. The machine saves the government several years of calculations, and the U.S. taxpayer approximately $5 million, according to Columbia University. Hollerith later establishes a company that will eventually become International Business Machines Corporation (IBM).

Early 20th century

1931: At the Massachusetts Institute of Technology (MIT), Vannevar Bush invents and builds the Differential Analyzer, the first large-scale automatic general-purpose mechanical analog computer, according to Stanford University . 

1936: Alan Turing , a British scientist and mathematician, presents the principle of a universal machine, later called the Turing machine, in a paper called "On Computable Numbers…" according to Chris Bernhardt's book " Turing's Vision " (The MIT Press, 2017). Turing machines are capable of computing anything that is computable. The central concept of the modern computer is based on his ideas. Turing is later involved in the development of the Turing-Welchman Bombe, an electro-mechanical device designed to decipher Nazi codes during World War II, according to the UK's National Museum of Computing . 
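As a flavour of what "computing anything computable" means in practice, here is a tiny Python sketch of a single-tape Turing machine simulator. The machine, alphabet, and example rule table (which simply flips the bits of its input) are invented for illustration and are not drawn from Turing's paper.

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10_000):
    """Simulate a single-tape Turing machine.
    rules maps (state, symbol) -> (new_state, symbol_to_write, head_move)."""
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, cells[head], move = rules[(state, symbol)]
        head += move
    return "".join(cells[i] for i in sorted(cells))

# Example machine: scan right, flipping 0 <-> 1, and halt at the first blank cell.
flip_bits = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
print(run_turing_machine(flip_bits, "10110"))   # prints 01001_
```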

1937: John Vincent Atanasoff, a professor of physics and mathematics at Iowa State University, submits a grant proposal to build the first electric-only computer, without using gears, cams, belts or shafts.

original garage where Bill Hewlett and Dave Packard started their business

1939: David Packard and Bill Hewlett found the Hewlett Packard Company in Palo Alto, California. The pair decide the name of their new company by the toss of a coin, and Hewlett-Packard's first headquarters are in Packard's garage, according to MIT . 

1941: German inventor and engineer Konrad Zuse completes his Z3 machine, the world's earliest digital computer, according to Gerard O'Regan's book " A Brief History of Computing " (Springer, 2021). The machine was destroyed during a bombing raid on Berlin during World War II. Zuse fled the German capital after the defeat of Nazi Germany and later released the world's first commercial digital computer, the Z4, in 1950, according to O'Regan. 

1941: Atanasoff and his graduate student, Clifford Berry, design the first digital electronic computer in the U.S., called the Atanasoff-Berry Computer (ABC). This marks the first time a computer is able to store information on its main memory, and is capable of performing one operation every 15 seconds, according to the book " Birthing the Computer " (Cambridge Scholars Publishing, 2016)

1945: Two professors at the University of Pennsylvania, John Mauchly and J. Presper Eckert, design and build the Electronic Numerical Integrator and Calculator (ENIAC). The machine is the first "automatic, general-purpose, electronic, decimal, digital computer," according to Edwin D. Reilly's book "Milestones in Computer Science and Information Technology" (Greenwood Press, 2003). 

Computer technicians operating the ENIAC

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor . They discover how to make an electric switch with solid materials and without the need for a vacuum.

1949: A team at the University of Cambridge develops the Electronic Delay Storage Automatic Calculator (EDSAC), "the first practical stored-program computer," according to O'Regan. "EDSAC ran its first program in May 1949 when it calculated a table of squares and a list of prime numbers ," O'Regan wrote. In November 1949, scientists with the Council of Scientific and Industrial Research (CSIR), now called CSIRO, build Australia's first digital computer called the Council for Scientific and Industrial Research Automatic Computer (CSIRAC). CSIRAC is the first digital computer in the world to play music, according to O'Regan.
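EDSAC's demonstration programs are easy to reproduce today. The short Python sketch below computes a table of squares and a list of primes in the same spirit; the limits of 10 and 30 are arbitrary choices for the example, not EDSAC's actual parameters.

```python
def squares(limit):
    """Table of squares, in the spirit of EDSAC's first demonstration program."""
    return [n * n for n in range(1, limit + 1)]

def primes(limit):
    """List of primes by simple trial division."""
    found = []
    for n in range(2, limit + 1):
        if all(n % p for p in found if p * p <= n):
            found.append(n)
    return found

print(squares(10))   # [1, 4, 9, 16, 25, 36, 49, 64, 81, 100]
print(primes(30))    # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```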

Late 20th century

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL, which stands for COmmon Business-Oriented Language, according to the National Museum of American History. Hopper is later dubbed the "First Lady of Software" in her posthumous Presidential Medal of Freedom citation. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954: John Backus and his team of programmers at IBM publish a paper describing their newly created FORTRAN programming language, an acronym for FORmula TRANslation, according to MIT .

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby is later awarded the Nobel Prize in Physics for his work.

1968: Douglas Engelbart reveals a prototype of the modern computer at the Fall Joint Computer Conference, San Francisco. His presentation, called "A Research Center for Augmenting Human Intellect" includes a live demonstration of his computer, including a mouse and a graphical user interface (GUI), according to the Doug Engelbart Institute . This marks the development of the computer from a specialized machine for academics to a technology that is more accessible to the general public.

The first computer mouse, invented in 1963 by Douglas C. Engelbart

1969: Ken Thompson, Dennis Ritchie and a group of other developers at Bell Labs produce UNIX, an operating system that made "large-scale networking of diverse computing systems — and the internet — practical," according to Bell Labs. The team behind UNIX continued to develop the operating system using the C programming language, which they also optimized.

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.

1971: A team of IBM engineers led by Alan Shugart invents the "floppy disk," enabling data to be shared among different computers.

1972: Ralph Baer, a German-American engineer, releases Magnavox Odyssey, the world's first home game console, in September 1972 , according to the Computer Museum of America . Months later, entrepreneur Nolan Bushnell and engineer Al Alcorn with Atari release Pong, the world's first commercially successful video game. 

1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.

1975: The cover of the January issue of "Popular Electronics" highlights the Altair 8800 as the "world's first minicomputer kit to rival commercial models." After seeing the magazine issue, two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.

1976: Steve Jobs and Steve Wozniak co-found Apple Computer on April Fool's Day. They unveil Apple I, the first computer with a single-circuit board and ROM (Read Only Memory), according to MIT.

1977: The Commodore Personal Electronic Transactor (PET) is released onto the home computer market, featuring an MOS Technology 8-bit 6502 microprocessor, which controls the screen, keyboard and cassette player. The PET is especially successful in the education market, according to O'Regan.

Apple I computer 1976

1977: Radio Shack begins its initial production run of 3,000 TRS-80 Model 1 computers — disparagingly known as the "Trash 80" — priced at $599, according to the National Museum of American History. Within a year, the company takes 250,000 orders for the computer, according to the book "How TRS-80 Enthusiasts Helped Spark the PC Revolution" (The Seeker Books, 2007).

1977: The first West Coast Computer Faire is held in San Francisco. Jobs and Wozniak present the Apple II computer at the Faire, which includes color graphics and features an audio cassette drive for storage.

1978: VisiCalc, the first computerized spreadsheet program, is introduced.

1979: MicroPro International, founded by software engineer Seymour Rubenstein, releases WordStar, the world's first commercially successful word processor. WordStar is programmed by Rob Barnaby, and includes 137,000 lines of code, according to Matthew G. Kirschenbaum's book " Track Changes: A Literary History of Word Processing " (Harvard University Press, 2016).

1981: "Acorn," IBM's first personal computer, is released onto the market at a price point of $1,565, according to IBM. Acorn uses the MS-DOS operating system from Windows. Optional features include a display, printer, two diskette drives, extra memory, a game adapter and more.

A worker using an Acorn computer by IBM, 1981

1983: The Apple Lisa, standing for "Local Integrated Software Architecture" but also the name of Steve Jobs' daughter, according to the National Museum of American History ( NMAH ), is the first personal computer to feature a GUI. The machine also includes a drop-down menu and icons. Also this year, the Gavilan SC is released and is the first portable computer with a flip-form design and the very first to be sold as a "laptop."

1984: The Apple Macintosh is announced to the world during a Superbowl advertisement. The Macintosh is launched with a retail price of $2,500, according to the NMAH. 

1985: As a response to the Apple Lisa's GUI, Microsoft releases Windows in November 1985, the Guardian reported. Meanwhile, Commodore announces the Amiga 1000.

1989: Tim Berners-Lee, a British researcher at the European Organization for Nuclear Research ( CERN ), submits his proposal for what would become the World Wide Web. His paper details his ideas for Hyper Text Markup Language (HTML), the building blocks of the Web. 

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which at the time is struggling financially.  This investment ends an ongoing court case in which Apple accused Microsoft of copying its operating system. 

1999: Wi-Fi, the abbreviated term for "wireless fidelity," is developed, initially covering a distance of up to 300 feet (91 meters), Wired reported.

21st century

2001: Mac OS X, later renamed OS X then simply macOS, is released by Apple as the successor to its standard Mac Operating System. OS X goes through 16 different versions, each with "10" as its title, and the first nine iterations are nicknamed after big cats, with the first being codenamed "Cheetah," TechRadar reported.  

2003: AMD's Athlon 64, the first 64-bit processor for personal computers, is released to customers. 

2004: The Mozilla Corporation launches Mozilla Firefox 1.0. The Web browser is one of the first major challenges to Internet Explorer, owned by Microsoft. During its first five years, Firefox exceeded a billion downloads by users, according to the Web Design Museum . 

2005: Google buys Android, a Linux-based mobile phone operating system.

2006: The MacBook Pro from Apple hits the shelves. The Pro is the company's first Intel-based, dual-core mobile computer. 

2009: Microsoft launches Windows 7 on July 22. The new operating system features the ability to pin applications to the taskbar, scatter windows away by shaking another window, easy-to-access jumplists, easier previews of tiles and more, TechRadar reported .  

Apple CEO Steve Jobs holds the iPad during the launch of Apple's new tablet computing device in San Francisco

2010: The iPad, Apple's flagship handheld tablet, is unveiled.

2011: Google releases the Chromebook, which runs on Google Chrome OS.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

2016: The first reprogrammable quantum computer was created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures."

2019: A team at Google became the first to demonstrate quantum supremacy — creating a quantum computer that could feasibly outperform the most powerful classical computer — albeit for a very specific problem with no practical real-world application. They described the computer, dubbed "Sycamore," in a paper published that same year in the journal Nature. Achieving quantum advantage – in which a quantum computer solves a problem with real-world applications faster than the most powerful classical computer – is still a ways off.

2022: The first exascale supercomputer, and the world's fastest, Frontier, went online at the Oak Ridge Leadership Computing Facility (OLCF) in Tennessee. Built by Hewlett Packard Enterprise (HPE) at a cost of $600 million, Frontier uses nearly 10,000 AMD EPYC 7453 64-core CPUs alongside nearly 40,000 AMD Radeon Instinct MI250X GPUs. This machine ushered in the era of exascale computing, which refers to systems that can perform more than one exaFLOP (10^18 floating-point operations per second), the unit used to measure their performance. Frontier is currently the only machine capable of reaching such levels of performance, and it is being used as a tool to aid scientific discovery.
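To put that number in perspective, here is a tiny Python sketch comparing Frontier's reported peak to an assumed 100-gigaFLOPS laptop; the laptop figure is a rough placeholder, not a measured benchmark.

```python
EXAFLOP = 1e18                    # 10**18 floating-point operations per second
frontier_peak = 1.102 * EXAFLOP   # Frontier's reported peak performance
laptop_estimate = 100e9           # assumed ~100 GFLOPS for a typical laptop

print(f"Frontier is roughly {frontier_peak / laptop_estimate:,.0f}x faster "
      f"than a 100-GFLOPS laptop at peak.")
```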

What is the first computer in history?

Charles Babbage's Difference Engine, designed in the 1820s, is considered the first "mechanical" computer in history, according to the Science Museum in the U.K . Powered by steam with a hand crank, the machine calculated a series of values and printed the results in a table. 

What are the five generations of computing?

The "five generations of computing" is a framework for assessing the entire history of computing and the key technological advancements throughout it. 

The first generation, spanning the 1940s to the 1950s, covered vacuum tube-based machines. The second then progressed to incorporate transistor-based computing between the 50s and the 60s. In the 60s and 70s, the third generation gave rise to integrated circuit-based computing. We are now in between the fourth and fifth generations of computing, which are microprocessor-based and AI-based computing.

What is the most powerful computer in the world?

As of November 2023, the most powerful computer in the world is the Frontier supercomputer. The machine, which can reach a performance level of up to 1.102 exaFLOPS, ushered in the age of exascale computing in 2022 when it went online at Tennessee's Oak Ridge Leadership Computing Facility (OLCF).

There is, however, a potentially more powerful supercomputer waiting in the wings in the form of the Aurora supercomputer, which is housed at the Argonne National Laboratory (ANL) outside of Chicago.  Aurora went online in November 2023. Right now, it lags far behind Frontier, with performance levels of just 585.34 petaFLOPS (roughly half the performance of Frontier), although it's still not finished. When work is completed, the supercomputer is expected to reach performance levels higher than 2 exaFLOPS.

What was the first killer app?

Killer apps are widely understood to be those so essential that they are core to the technology they run on. There have been many through the years, from Word for Windows in 1989 to iTunes in 2001 to social media apps like WhatsApp in more recent years.

Several pieces of software may stake a claim to be the first killer app, but there is a broad consensus that VisiCalc, a spreadsheet program created by VisiCorp and originally released for the Apple II in 1979, holds that title. Steve Jobs even credits this app for propelling the Apple II to become the success it was, according to co-creator Dan Bricklin .

  • Fortune: A Look Back At 40 Years of Apple
  • The New Yorker: The First Windows
  • " A Brief History of Computing " by Gerard O'Regan (Springer, 2021)




A Short History of Computers


The history of computers is weird and wonderful. What started as an abstract philosophical quest ended up setting the course for society for over a century and continues to be one of the most profound parts of modern life. The goal of this chapter is to trace an outline of where computing started, where it has been, and where it is now.




Bartlett, J. (2023). A Short History of Computers. In: Programming for Absolute Beginners. Apress, Berkeley, CA. https://doi.org/10.1007/978-1-4842-8751-4_2



Computer Paragraph

The Evolution Of Computers: A Brief History And Future Trends Of Computer Paragraph

Computer Paragraph: Computers have come a long way since their inception in the 19th century. From mechanical calculators to quantum computing, computers have evolved to become a ubiquitous part of modern society. In this article, we will discuss the evolution of computers and their future trends.

In this blog, we include short computer paragraphs of different lengths (100, 200, 250, and 300 words), suitable for students from class 1 up to class 12.

The Evolution Of Computers

Early Computers

The first mechanical computers were invented in the 19th century, but it wasn’t until the 1940s that electronic computers were developed. The first electronic computers were large, expensive, and slow, but they paved the way for modern computing. Pioneers such as Charles Babbage and Alan Turing made significant contributions to the development of early computers.

Personal Computers

In the 1980s, personal computers emerged, and they quickly became a part of homes and businesses. The rise of personal computers changed the way we work and communicate, leading to the development of the internet and social media. Companies such as Apple and Microsoft played a significant role in the development of personal computers.

Mobile Computing

The emergence of mobile devices such as smartphones and tablets has revolutionized computing. Mobile computing has made it possible to work and communicate on the go, leading to the rise of remote work and e-commerce. The future of mobile computing is likely to include more wearable technology such as smartwatches and augmented reality devices.

Artificial Intelligence And Quantum Computing

Artificial intelligence (AI) has the potential to transform every aspect of modern life, from healthcare to transportation. With the development of AI, computers can process and analyze vast amounts of data, leading to more accurate predictions and insights. Quantum computing is a technology in its infancy, but it has the potential to revolutionize computing by solving problems that classical computers cannot solve.

Computers have come a long way since their inception, and they are likely to continue to evolve rapidly. With the development of new technologies such as AI and quantum computing, computers will become even more powerful and ubiquitous in our lives. As we look towards the future of computing, we must be mindful of the ethical considerations surrounding these technologies and work towards creating a world where technology is used for the greater good.


FAQs On Computer Paragraphs

Question 1. What is a computer in 250 words?

Answer: A computer is an electronic device that processes and stores data. It can perform various tasks based on the instructions given to it. A computer consists of hardware components such as a central processing unit (CPU), memory, input/output devices, and storage devices. The software programs and operating system installed on a computer enable it to perform complex tasks, making it an indispensable tool in today’s digital age.

Question 2. What are the 10 lines of computers?

Answer: A computer is an electronic device that can perform a variety of tasks by following a set of instructions.

  • It is capable of processing, storing, and retrieving large amounts of data quickly and accurately.
  • Computers come in different sizes and types, ranging from desktops and laptops to tablets and smartphones.
  • They consist of various hardware components, such as a central processing unit (CPU), memory, storage, and input/output devices.
  • A computer operates using software programs that allow it to perform specific functions, such as word processing, internet browsing, and gaming.
  • Computers can connect to networks and the internet, allowing users to access and share information globally.
  • They have revolutionized the way we work, communicate, and learn, making tasks faster, easier, and more efficient.
  • Artificial intelligence and machine learning have further advanced the capabilities of computers, enabling them to perform complex tasks that were once only possible for humans.
  • With the increasing reliance on technology, computers have become an essential part of everyday life, used in education, healthcare, finance, entertainment, and many other fields.
  • The rapid pace of technological advancement means that computers will continue to evolve, shaping the future of society and transforming the way we live and work.

Question 3. What is computer 80 words?

Answer: A computer is an electronic device that can perform various operations and calculations at high speed. It can store, retrieve, and process data in a way that humans find useful. Computers come in various forms, from desktops to laptops, tablets, and smartphones. They are essential tools in modern society and are used in many fields, including business, education, healthcare, entertainment, and more. With the advancement of technology, computers continue to evolve and improve, making tasks more efficient and accessible.

Question 4. What is a computer paragraph in 100 words?

Answer: A computer is an electronic device that can process data and perform tasks according to a set of instructions or programs. It consists of several hardware components such as the processor, memory, storage, and input/output devices. Computers have revolutionized the way we work, communicate, and access information. They are used in various fields such as education, healthcare, entertainment, and business. With the advancement of technology, computers have become faster, more efficient, and smaller in size, making them more accessible and affordable to people worldwide.

Question 5. What is a computer in 150 words?

Answer: A computer is an electronic device that can process, store and retrieve data quickly and accurately. It is made up of hardware components such as the motherboard, CPU, RAM, storage devices, and input/output devices, as well as software components such as the operating system and applications. Computers can perform a variety of tasks, including word processing, browsing the internet, playing games, and running complex simulations.

They have become an integral part of our daily lives, used for work, entertainment, education, and communication. The evolution of computers has been rapid, with advancements in technology allowing for smaller, faster, and more powerful devices. With the increasing reliance on technology, it is important to understand how computers work and how to use them effectively.

Computer Evolution, Its Future and Societal Impact Research Paper


Today, computers have become an integral component of human life. One wonders how life would be without computers. Randell holds, "In reality, computers, as they are known and used today, are still relatively new" (45). In spite of computers having existed since the abacus, it is the contemporary computers that have had a significant impact on human life. The current computers have progressed through numerous generations to what we have today. The ongoing technological advancement is bound to result in the development of supercomputers in the future (Randell 47). Computer engineers look forward to the development of miniature, powerful computers that will have a significant impact on the society. This paper will discuss the evolution of computers. It will also discuss the future of computers and its potential repercussions on the society.

Modern-day computers have evolved through four generations. The first generation of computers occurred between 1940 and 1956. The computers manufactured during this period were big and used magnetic drums as memory (Randell 49). Additionally, the computers used vacuum tubes as amplifiers and switches. The use of vacuum tubes led to the computers emitting a lot of heat. The computers did not use an advanced programming language. Instead, they relied on a simple programming language known as machine language.

The second generation of computers dated between 1956 and 1963. The computers used transistors instead of vacuum tubes. As a result, they did not consume a lot of power. Furthermore, the use of transistors helped to minimize the amount of heat that the computers released (Randell 50). These computers were more efficient than their forerunners. The elimination of vacuum tubes led to a reduction of the size of the computer. The second generation computers comprised a magnetic storage and a core memory.

The third generation of computers dated between 1964 and 1971. The computers developed during this period were superior in speed. They used integrated circuits. The integrated circuits comprised many tiny transistors embedded on silicon chips. The integrated circuits enhanced the efficiency of the computer. Besides, they contributed to the development of small, cheap computers (Zabrodin and Levin 747). The previous generations of computers used printouts and punch cards. However, the third generation computers used monitors and keyboards.

The fourth generation computers were developed between 1971 and 2010. The computers were designed at a time when humanity had realized tremendous technological growth. Thus, it was easy for computer manufacturers to put millions of transistors on one circuit chip. Besides, the manufacturers developed the first microprocessor, known as the Intel 4004 chip (Zabrodin and Levin 748). The development of a microprocessor marked the beginning of the production of personal computers. By the early 1980s, numerous brands of personal computers were already in the market. They included International Business Machines (IBM), Apple II, and the Commodore PET. Computer engineers also came up with the graphical user interface (GUI), which enhanced computer usage (Zabrodin and Levin 749). They also improved the storage capability, primary memory, and speed of the computer.

The current computers use semiconductors, electric power, and metals. There are speculations that future computers will use light, DNA, or atoms. Moore's Law hints that future computers will shift from quartz to quantum. Computer scientists continue to increase the number of transistors that a microprocessor holds. With time, a microprocessor will comprise multiple atomic circuits. That will usher in the era of quantum computers, which will utilize the power of molecules and atoms to execute commands (Ladd et al. 47). The quantum computers will use qubits to run operations. A quantum computer will ease the computation of complicated problems. Unfortunately, such computers will be unstable. People will need to ensure that they do not interfere with the quantum state of the computer. Interfering with the quantum state will affect the computing power of the computer.

Lajoie and Derry claim, “Perhaps the future of computers lies inside us” (23). Computer scientists are in the process of developing machines that use DNA to execute commands. The collaboration between biologists and computer scientists could see the creation of the next generation of computers. Scientists argue, “DNA has the potential to perform calculations many times faster than the world’s most powerful human-built computers” (Lajoie and Derry 31). Therefore, in future, scientists may look for ways to develop computers that exploit the computing powers of the DNA. Scientists have already come up with the means to apply DNA molecules to execute complicated mathematical problems (Lajoie and Derry 34). Indeed, it is a matter of time before computer scientists use DNA to develop biochips to enhance the power of computers. DNA computers will have a storage capacity that can hold a lot of data.

The development of sophisticated computers will have a myriad of effects on human life. The future computers will have an intelligence that is akin or superior to that of humans. Presently, some computers can read multiple books in a second. Besides, some computers have the capacity to respond to questions asked in natural language. Google is working on a project to develop an artificial intelligence that can read and comprehend different documents (Russell and Norvig 112). Such an artificial intelligence will serve as a source of information. People will no longer require reading books or going to school. Besides, it will render insignificant the need for human interactions. People will use computers to get answers to all their problems.

The development of sophisticated computers will also result in many people losing their jobs. Once computer scientists build a computer with intelligence akin to that of humans, intelligent robots will emerge that can perform most human jobs. Some robots already assist in the production of goods (Doi 201); in the future, robots may construct roads, work in supermarkets, and prepare meals in restaurants, so that human labor is no longer needed for such tasks. At the same time, the development of supercomputers will have positive impacts on the provision of quality healthcare. Computers will be able to perform blood tests, measure cholesterol levels, and diagnose allergies (Doi 203), and they will examine people’s DNA to determine genetic risks and forecast possible illnesses. Such computers will help to improve the quality of healthcare and reduce deaths that result from erroneous diagnoses.

Computer development has evolved over time, resulting in personal computers that are both small and efficient, and computer scientists continue to develop more sophisticated machines. In the future, computers may use DNA, light, and atoms to process data; quantum computers are already under development; and collaboration between computer scientists and biologists may yield biochips built from human DNA. The development of such computers will not only enhance the provision of quality healthcare but may also diminish the need for schools and face-to-face human interaction.

Doi, Kunio. “Computer-Aided Diagnosis in Medical Imaging: Historical Review, Current Status and Future Potential.” Computerized Medical Imaging and Graphics 31.5 (2007): 198-211. Print.

Ladd, Thaddeus, Fedor Jelezko, Raymond Laflamme, Yasunobu Nakamura, Christopher Monroe, and Jeremy O’Brien. “Quantum Computers.” Nature 464.1 (2010): 45-53. Print.

Lajoie, Susanne, and S. Derry. Computers as Cognitive Tools. New York: Routledge, 2009. Print.

Randell, Brian. The Origins of Digital Computers. New York: Routledge, 2013. Print.

Russell, Stuart, and Peter Norvig. Artificial Intelligence: A Modern Approach. London: Prentice Hall, 2003. Print.

Zabrodin, Aleksey and Vladimir Levin. “Supercomputers: Current State and Development.” Automation and Remote Control 68.5 (2009): 746-749. Print.


Essay on History of Computer

Students are often asked to write an essay on History of Computer in their schools and colleges. And if you’re also looking for the same, we have created 100-word, 250-word, and 500-word essays on the topic.

Let’s take a look…

100 Words Essay on History of Computer

Early Beginnings

Computers didn’t always look like the laptops or smartphones we use today. One of the earliest computing devices was the abacus, which dates back to around 2400 BC and used beads to help people calculate.

First Mechanical Computer

In 1822, Charles Babbage, a British mathematician, designed a mechanical computer called the “Difference Engine.” It was supposed to perform mathematical calculations.

The Birth of Modern Computers

The first modern computers were created in the 1930s and 1940s. They were huge, often filling an entire room, and they used vacuum tubes to process information.

Personal Computers

In the 1970s, companies like Apple and IBM started making personal computers. This made it possible for people to have computers at home.

Remember, computers have come a long way and continue to evolve!


250 Words Essay on History of Computer

Introduction

The history of computers is a fascinating journey, tracing back several centuries. It illustrates human ingenuity and evolution from primitive calculators to complex computing systems.

Early Computers

The concept of computing dates back to antiquity. The abacus, developed around 2400 BC, is often considered the earliest computing device. In the 19th century, Charles Babbage conceptualized and designed the Analytical Engine, a general-purpose mechanical computer that used punch cards for instructions.

Birth of Modern Computers

The 20th century heralded the era of modern computing. The first programmable computer, the Z3, was built by Konrad Zuse in 1941. However, it was the Electronic Numerical Integrator and Computer (ENIAC), developed in 1946, that truly revolutionized computing with its electronic technology.

Personal Computers and the Internet

The 1970s and 1980s saw the advent of personal computers (PCs). The Apple II, introduced in 1977, and IBM’s PC, launched in 1981, brought computers to the masses. The 1990s marked the birth of the internet, transforming computers into communication devices and information gateways.

Present and Future

Today, computers have become an integral part of our lives, from smartphones to supercomputers. They are now moving towards quantum computing, promising unprecedented computational power.

In summary, the history of computers is a testament to human innovation, evolving from simple counting devices to powerful tools that shape our lives. As we look forward to the future, the potential for further advancements in computing technology is limitless.

500 Words Essay on History of Computer

The Dawn of Computing

The history of computers dates back to antiquity with devices like the abacus, used for calculations. However, the concept of a programmable computer was first realized in the 19th century by Charles Babbage, an English mathematician. His design, known as the Analytical Engine, is considered the first general-purpose computer, although it was never built.

The first half of the 20th century saw the development of electro-mechanical computers. The most notable was the Mark I, developed by Howard Aiken at Harvard University in 1944. It was the first machine to automatically execute long computations.

During the same period, the ENIAC (Electronic Numerical Integrator and Computer) was developed by John Mauchly and J. Presper Eckert at the University of Pennsylvania. Completed in 1945, it was the first general-purpose electronic computer. However, it was programmed by physically rewiring the machine rather than by stored instructions, so it was not programmable in the modern sense.

The Era of Transistors

The late 1940s marked the invention of the transistor, which revolutionized the computer industry. Transistors were faster, smaller, and more reliable than their vacuum tube counterparts. The first transistorized computer was built at the University of Manchester in 1953.

The 1950s and 1960s saw the development of mainframe computers, like IBM’s 700 series, which dominated the computing world for the next two decades. These machines were large and expensive, but they allowed multiple users to access the computer simultaneously through terminals.

Microprocessors and Personal Computers

The invention of the microprocessor in the 1970s marked the beginning of the personal computer era. The Intel 4004, released in 1971, was the first commercially available microprocessor. This development led to the creation of small, relatively inexpensive machines like the Apple II and the IBM PC, which made computing accessible to individuals and small businesses.

The Internet and Beyond

The 1980s and 1990s brought about the rise of the internet and the World Wide Web, expanding the use of computers into every aspect of modern life. The advent of graphical user interfaces, such as Microsoft’s Windows and Apple’s Mac OS, made computers even more user-friendly.

Today, computers have become ubiquitous in our society. They are embedded in everything from our phones to our cars, and they play a critical role in fields ranging from science to entertainment. The history of computers is a story of continuous innovation and progress, and it is clear that this trend will continue into the foreseeable future.

That’s it! I hope the essay helped you.


Happy studying!



The History of Computing: A Very Short Introduction


This book describes the central events, machines, and people in the history of computing, and traces how innovation has brought us from pebbles used for counting, to the modern age of the computer. It has a strong historiographical theme that offers a new perspective on how to understand the historical narratives we have constructed, and examines the unspoken assumptions that underpin them. It describes inventions, pioneers, milestone systems, and the context of their use. It starts with counting, and traces change through calculating aids, mechanical calculation, and automatic electronic computation, both digital and analogue. It shows how four threads—calculation, automatic computing, information management, and communications—converged to create the ‘information age’. It examines three master narratives in established histories that are used as aids to marshal otherwise unmanageable levels of detail. The treatment is rooted in the principal episodes that make up canonical histories of computing.


