The 5 Generations of Computers and Their Features

Each of the five generations of computers is characterized by a major technological development that fundamentally changed the way computers operate.

Computers play an important role in almost every aspect of human life, but computers as we know them today are very different from the initial models.

A computer of the 1950s, United States.

But what is a computer? A computer can be defined as an electronic device that performs arithmetic and logic operations.

Another common definition describes a computer as a device or machine that processes raw data and turns it into information.

To understand the basic operation of a computer, it is necessary to define data, processing and information.

Data is a collection of basic elements with no particular order or structure; by itself, it has no meaning.

Processing is the operation by which information is extracted from data. Information, finally, is the end product of any processing job.
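
As a rough illustration of these three terms, here is a minimal Python sketch (the readings and names are invented for the example): a list of raw readings is the data, computing their average is the processing, and the printed average is the information.

```python
# Hypothetical example: raw readings (data) mean nothing on their own.
readings = [21.5, 19.8, 22.1, 20.4]  # data: just numbers, no context

# Processing: extract something meaningful from the raw data.
average = sum(readings) / len(readings)

# Information: the end product of processing, now meaningful to a person.
print(f"Average temperature: {average:.1f} degrees")
```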

The earliest forerunner of the modern computer dates back to 1833, when Charles Babbage designed the Analytical Engine, a mechanical general-purpose calculating machine.

Over the following century, such designs evolved into reliable electronic machines able to perform calculations far faster. Thus the first generation of computers was born with the ENIAC machine.

First generation (1945-1956)

The vacuum tube is the defining technology of the first generation of computers; vacuum tubes are sealed glass tubes containing electrodes.

These tubes were used to build the circuits of the first computers. In addition, these machines used magnetic drums for memory.

The vacuum tube was invented in 1906 by the electrical engineer Lee De Forest. During the first half of the 20th century, it was the main technology used to build radios, televisions, radars, X-ray machines, and other electronic devices.

First-generation machines were generally programmed through wired control panels (plugboards) or by instructions coded on paper tape.

They were very expensive, they consumed a lot of electricity, they generated a lot of heat and they were huge (they often occupied entire rooms).

The first operational electronic computer was called ENIAC and used 18,000 vacuum tubes. It was built in the United States, at the University of Pennsylvania and was about 30.5 meters long.

It was used mainly for war-related calculations, including computations connected to the development of the atomic bomb.

The Colossus machine was also built during these years to help the British during the Second World War. It was used to decipher encrypted enemy messages and contained 1,500 vacuum tubes.

While these first-generation machines were programmable, their programs were not stored internally. This would change as stored-program computers developed.

First-generation computers depended on machine language (1GL), the lowest-level programming language, which the hardware understands directly in order to perform operations.

They could solve only one problem at a time and operators could take weeks to program a new problem.

Second generation (1956-1963)

The second generation of computers replaced the vacuum tubes with the transistors. Transistors allowed computers to be smaller, faster, cheaper, and more energy-efficient. Magnetic disks and tapes were often used to store data.

Although transistors still generated enough heat to damage computer components, they were a clear improvement over the previous technology.

Second-generation computers relied on cooling technology, saw broader commercial adoption, and were used mainly for specific scientific and business purposes.

These second-generation computers left behind cryptic binary machine language in favor of assembly language (2GL). This change allowed programmers to specify instructions with words (mnemonics) rather than numeric codes.
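
To get a feel for the difference between the first two language generations, here is a minimal Python sketch of a toy assembler; the opcodes and mnemonics are invented for illustration, not taken from any real machine. It shows how an assembler turns human-readable words (2GL) into the numeric machine code (1GL) that the hardware actually executes.

```python
# Invented instruction set, for illustration only; real machines differ.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(program):
    """Translate assembly mnemonics (2GL) into numeric machine code (1GL)."""
    machine_code = []
    for line in program:
        parts = line.split()
        mnemonic, operands = parts[0], [int(p) for p in parts[1:]]
        machine_code.append(OPCODES[mnemonic])   # mnemonic becomes an opcode
        machine_code.extend(operands)            # operands are copied as numbers
    return machine_code

# A tiny assembly program: load address 10, add address 11, store at 12, halt.
source = ["LOAD 10", "ADD 11", "STORE 12", "HALT"]
print(assemble(source))  # [1, 10, 2, 11, 3, 12, 255]
```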

During this time, high-level programming languages were also being developed. Second-generation computers were also the first machines to store instructions in their memory.

By then, memory technology had evolved from magnetic drums to magnetic-core memory.

Third generation (1964-1971)

The hallmark of the third generation of computers was integrated-circuit technology. An integrated circuit is a single small device that contains many transistors.

Transistors became smaller and were placed on silicon chips, known as semiconductors. Thanks to this change, third-generation computers were faster and more efficient than their second-generation predecessors.

During this time, computers used third-generation languages (3GL), or high-level languages. Later examples of such languages include Java and JavaScript.

The new machines of this period introduced a new approach to computer design: the concept of a single computer architecture spanning a range of machines, so that a program written for one machine in the family could run on the others.

Another change of this period was that users now interacted with computers through keyboards, mice, and monitors, by way of an interface and an operating system.

Thanks to this, a machine could run several applications at the same time, with a central system in charge of memory.

IBM created the most important computer of this period: the IBM System/360. Another of the company's models was reportedly 263 times faster than ENIAC, illustrating how far the field had advanced.

Because these machines were smaller and cheaper than their predecessors, computers became accessible to a general audience for the first time.

During this time, computers became general-purpose machines. This was important because earlier machines had been built for specific tasks in specialized fields.

Fourth generation (1971-present)

The fourth generation of computers is defined by microprocessors. This technology allows thousands of integrated circuits to be built on a single silicon chip.

This advance made it possible for what once occupied an entire room to fit in the palm of a hand.

In 1971 the Intel 4004 was developed, placing all the components of the computer, from the central processing unit and memory to the input and output controls, on a single chip. This marked the beginning of the generation of computers that extends to this day.

In 1981, IBM created a new computer capable of performing 240,000 additions per second. In 1996, Intel went further with a machine capable of 400,000,000 additions per second. In 1984, Apple introduced the Macintosh, which ran its own graphical operating system rather than Microsoft's.

Fourth-generation computers became more powerful, more compact, more reliable, and more affordable. As a result, the Personal Computer (PC) revolution was born.

In this generation, real-time systems, distributed operating systems, and time-sharing came into use. The internet was also born during this period.

Microprocessor technology is found in all modern computers, because the chips can be manufactured in large quantities at low cost.

Processor chips serve as central processing units, and memory chips provide random-access memory (RAM). Both kinds of chip contain millions of transistors placed on their silicon surface.

These computers also use fourth-generation languages (4GL), whose statements resemble sentences in human language.
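
One common way to see the contrast is to compare a procedural 3GL-style loop with a declarative 4GL-style statement such as SQL, which reads almost like an English sentence. The Python sketch below uses an invented sales table purely as an example; both approaches produce the same result.

```python
import sqlite3

# Hypothetical sales figures, used only to illustrate the comparison.
sales = [("north", 120), ("south", 95), ("north", 80)]

# 3GL style: tell the computer *how* to compute the result, step by step.
total = 0
for region, amount in sales:
    if region == "north":
        total += amount
print(total)  # 200

# 4GL style (SQL): state *what* result is wanted, almost in plain English.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
db.executemany("INSERT INTO sales VALUES (?, ?)", sales)
print(db.execute("SELECT SUM(amount) FROM sales WHERE region = 'north'").fetchone()[0])  # 200
```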

Fifth generation (present-future)

Fifth-generation devices are based on artificial intelligence. Most of these machines are still in development, but some applications already make use of artificial intelligence; voice recognition is one example.

The use of parallel processing and superconductors is helping to make artificial intelligence a reality.

In the fifth generation, the technology has resulted in microprocessor chips containing 10 million electronic components.

This generation is based on parallel-processing hardware and artificial-intelligence software. Artificial intelligence is an emerging field of computer science that explores the methods needed to make computers think like humans.

It is estimated that quantum computing and nanotechnology will radically change the face of computers in the future.

The goal of fifth-generation computing is to develop devices that can respond to natural-language input and are capable of learning and organizing themselves.

The idea is that the fifth-generation computers of the future will understand spoken words and mimic human reasoning. Ideally, these machines will also respond to their environment using different types of sensors.

Scientists are working to make this a reality, trying to create a computer with genuine intelligence with the help of advanced programs and technology. Such a breakthrough would revolutionize the computers of the future.


