What Is a Computer, and the History of Microprocessor Development
A computer is a tool used to process data according to a formulated procedure. The word "computer" originally described a person whose job was to perform arithmetic calculations, with or without mechanical aids, but the term was later transferred to the machine itself. Originally, information processing was almost exclusively concerned with arithmetic problems, but modern computers are used for many tasks unrelated to mathematics.
Under such a broad definition fall tools like the slide rule and mechanical calculators ranging from the abacus onward, up to all contemporary electronic computers. A better term for this broad meaning of "computer" is "that which processes information" or "information processing system." Today's computers are increasingly sophisticated, but they were not always as small, powerful, and light as they are now. Computer history is commonly divided into five generations.
How Many Computer Generations Are There?
1. First Generation:
With the onset of the Second World War, the countries involved in the war tried to develop computers to exploit their strategic potential. This increased funding for computer development and accelerated advances in computer engineering. In 1941, Konrad Zuse, a German engineer, built a computer, the Z3, to design airplanes and missiles.
The Allies also made progress in developing computing power. In 1943, the British completed a secret code-breaking computer called Colossus to decode the encrypted messages used by the Germans. Colossus did not greatly influence the development of the computer industry, for two reasons. First, it was not a general-purpose computer; it was designed only to crack secret codes. Second, its existence was kept secret until a decade after the war ended.
American efforts at that time produced another advance. Howard H. Aiken (1900-1973), a Harvard engineer working with IBM, succeeded in producing an electronic calculator for the US Navy. The calculator was half the length of a football field and contained 500 miles of wiring.
The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I, was an electronic relay computer. It used electromagnetic signals to move mechanical components. The machine was slow (each calculation took 3-5 seconds) and inflexible (the sequence of calculations could not be changed), but it could perform basic arithmetic as well as more complex equations.
Another computer development of that period was the Electronic Numerical Integrator and Computer (ENIAC), created through a collaboration between the United States government and the University of Pennsylvania. Consisting of 18,000 vacuum tubes, 70,000 resistors, and 5 million solder joints, it was a very large machine that consumed 160 kW of power.
Designed by John Presper Eckert (1919-1995) and John W. Mauchly (1907-1980), ENIAC was a general-purpose computer that worked 1,000 times faster than the Mark I.
In the mid-1940s, John von Neumann (1903-1957) joined the University of Pennsylvania team in an effort to develop computer design concepts that would remain in use in computer engineering for the next 40 years. In 1945, von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory that held both programs and data.
This technique allowed the computer to stop at some point and later resume its work. The key to the von Neumann architecture is the central processing unit (CPU), which allows all computer functions to be coordinated through a single source. In 1951, the UNIVAC I (Universal Automatic Computer I), built by Remington Rand, became the first commercial computer to use the von Neumann architectural model.
Both the United States Census Bureau and General Electric owned UNIVACs. One of UNIVAC's most impressive feats was correctly predicting the victory of Dwight D. Eisenhower in the 1952 presidential election.
First-generation computers were characterized by operating instructions created specifically for a particular task. Each computer had its own binary-coded program, called a "machine language," which made computers difficult to program and limited their speed. Other hallmarks of the first generation were the use of vacuum tubes (which made the computers of that era very large) and magnetic drums for data storage.
2. Second Generation:
In 1948, the invention of the transistor greatly influenced the development of computers. The transistor replaced the vacuum tube in televisions, radios, and computers, and as a result the size of electronic machines shrank drastically.
Transistors came into use in computers starting in 1956. Another invention, magnetic-core memory, helped drive the development of second-generation computers that were smaller, faster, more reliable, and more energy-efficient than their predecessors. The first machines to use these new technologies were supercomputers: IBM built one called Stretch, and Sperry-Rand built one called LARC.
These computers, developed for atomic energy laboratories, could handle large amounts of data, a capability that atomic scientists badly needed. The machines were very expensive and tended to be too complex for business computing needs, which limited their popularity. Only two LARCs were ever installed and used: one at Lawrence Radiation Labs in Livermore, California, and the other at the US Navy Research and Development Center in Washington, D.C. Second-generation computers replaced machine language with assembly language, which uses short mnemonic abbreviations in place of binary code.
In the early 1960s, successful second-generation computers began to appear in business, in universities, and in government. These second-generation machines were fully transistorized, and they had the components we associate with computers today: printers, disk storage, memory, operating systems, and programs.
One important example of a computer of this era was the IBM 1401, which was widely accepted in industry. By 1965, almost all large businesses were using second-generation computers to process financial information.
The programs stored in the computer, together with the programming languages they were written in, gave computers flexibility. This flexibility boosted performance at a reasonable price for business use: a computer could print customer invoices and, moments later, run a product design or calculate a payroll. Several programming languages began to emerge at that time.
The programming languages Common Business-Oriented Language (COBOL) and Formula Translator (FORTRAN) came into common use. These languages replaced complicated machine code with words, sentences, and mathematical formulas that are easier for humans to understand, making it far easier for a person to program a computer. A variety of new careers emerged (programmer, systems analyst, and computer systems expert), and the software industry also began to emerge and grow during this second generation of computers.
3. Third Generation:
Although transistors outperformed vacuum tubes in many ways, they generated considerable heat, which could damage a computer's internal parts. Quartz rock eliminated this problem. Jack Kilby, an engineer at Texas Instruments, developed the integrated circuit (IC) in 1958. An IC combined three electronic components on a small silicon disc made from quartz sand.
Scientists later managed to fit even more components onto a single chip, called a semiconductor. As a result, computers became ever smaller as more components were squeezed onto each chip. Another third-generation advance was the operating system, which allowed machines to run many different programs at once under a main program that monitored and coordinated the computer's memory.
4. Fourth Generation:
After the IC, the development goal became clear: shrink the size of circuits and electrical components. Large Scale Integration (LSI) could fit hundreds of components on a single chip. By the 1980s, Very Large Scale Integration (VLSI) packed thousands of components onto a single chip.
Ultra Large Scale Integration (ULSI) increased that number into the millions. The ability to put so many components on a chip half the size of a coin drove down the price and size of computers, and it also increased their power, efficiency, and reliability.
The Intel 4004 chip, made in 1971, advanced the IC by putting all the components of a computer (central processing unit, memory, and input/output control) on one very small chip. Previously, ICs had been built to perform one specific task; now a microprocessor could be manufactured and then programmed to meet any desired requirement. Soon, everyday products such as microwave ovens, televisions, and cars with electronic fuel injection (EFI) were equipped with microprocessors.
Such developments made it possible for ordinary people to use computers. Computers were no longer the preserve of big corporations or government agencies. By the mid-1970s, computer makers were offering their products to the general public. These machines, called minicomputers, were sold with software packages that were easy for laypeople to use; the most popular packages of the time were word processors and spreadsheets. In the early 1980s, video game consoles such as the Atari 2600 drew consumer attention to more sophisticated, programmable home computers.
In 1981, IBM introduced the Personal Computer (PC) for use in homes, offices, and schools. The number of PCs in use jumped from 2 million units in 1981 to 5.5 million units in 1982; ten years later, 65 million PCs were in use. Computers continued to evolve toward smaller sizes, from machines that sit on a desk (desktop computers) to computers that fit in a bag (laptops) and even computers that can be held in the hand (palmtops).
The IBM PC competed with the Apple Macintosh in the computer market. The Macintosh became famous for popularizing the graphical interface on its computers while its rivals were still text-based, and it also popularized the use of the mouse.
Today we know the lineage of IBM-compatible CPUs: the IBM PC/486, Pentium, Pentium II, Pentium III, and Pentium IV (a series of CPUs made by Intel), as well as AMD's K6, Athlon, and others. All of these belong to the fourth generation of computers.
As the use of computers in the workplace proliferated, new ways of tapping their potential were continuously developed. As these small computers grew more powerful, they could be linked together in networks to share memory, software, and information, and to communicate with one another. Computer networks let individual computers collaborate electronically to complete a task. Using direct cabling (a Local Area Network, or LAN) or telephone lines, such networks can grow very large.
5. Fifth Generation:
Defining the fifth generation of computers is difficult because this stage is still very young. An imaginative example of a fifth-generation computer is the fictional HAL9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. HAL performs all the functions desired of a fifth-generation computer: with artificial intelligence (AI), it can reason well enough to hold conversations with humans, use visual input, and learn from its own experience.
Although the realization of HAL9000 may still be far off, many of its functions have already been achieved. Some computers can accept spoken instructions and can imitate human reasoning. The ability to translate between foreign languages is also possible. This capability sounds simple, but it turns out to be far more complicated than expected, because programmers have discovered that human understanding depends heavily on context and meaning rather than on direct word-for-word translation.
Many advances in computer design and technology are making fifth-generation computers possible. Two major engineering advances stand out. The first is parallel processing, which will replace the von Neumann model: instead of a single CPU handling every step, a system will coordinate many CPUs working simultaneously. The second is superconductor technology, which allows electricity to flow without resistance and can thereby accelerate the flow of information.
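As an illustration of the parallel-processing idea (a minimal sketch in Python, chosen here only for illustration and unrelated to any fifth-generation hardware), the snippet below splits one task across several worker processes so that multiple CPUs can compute simultaneously instead of a single CPU handling every step:

```python
# A minimal sketch of parallel processing: several CPU cores work on
# pieces of one task at the same time, instead of one CPU performing
# every step sequentially as in the classic von Neumann pattern.
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum the integers in [start, end) -- one piece of the whole task."""
    start, end = bounds
    return sum(range(start, end))

if __name__ == "__main__":
    # Split the range 0..1,000,000 into four chunks, one per worker process.
    chunks = [(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
    with Pool(processes=4) as pool:
        # Each worker computes its chunk at the same time as the others.
        results = pool.map(partial_sum, chunks)
    print(sum(results))  # 499999500000, the same answer a sequential sum gives
```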
Japan is the country best known for promoting fifth-generation jargon and the fifth-generation computer project, and it established ICOT (the Institute for New Generation Computer Technology) to realize it. Many reports say the project failed, but other accounts suggest that a successful fifth-generation computer project would bring a new paradigm shift in computing worldwide.
Number Systems in the World of Computers
In the world of computers we recognize four number systems: binary, octal, decimal, and hexadecimal. Binary numbers, or binary digits (bits), consist of the digits 0 and 1. Octal numbers use the digits 0, 1, 2, 3, 4, 5, 6, and 7. Decimal numbers use the digits 0 through 9, and hexadecimal numbers use 0 through 9 plus the letters A, B, C, D, E, and F.
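As a concrete illustration (a minimal sketch in Python, chosen only because the article itself names no language), the snippet below writes the same value in all four number systems and converts between them using Python's built-in base prefixes and conversion functions:

```python
# The same value written in each of the four number systems.
# Python literal prefixes: 0b = binary, 0o = octal, 0x = hexadecimal.
value_binary  = 0b101010   # binary: 101010
value_octal   = 0o52       # octal: 52
value_decimal = 42         # decimal: 42
value_hex     = 0x2A       # hexadecimal: 2A

# All four literals denote the same number.
assert value_binary == value_octal == value_decimal == value_hex

# Converting a decimal integer into the other bases (as strings).
print(bin(42))   # '0b101010'
print(oct(42))   # '0o52'
print(hex(42))   # '0x2a'

# Converting a digit string in a given base back to a decimal integer.
print(int("101010", 2))   # 42
print(int("52", 8))       # 42
print(int("2A", 16))      # 42
```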