First Generation
With the onset of the Second World War, the countries involved sought to develop computers to exploit their potential strategic importance. This increased funding for computer development and accelerated technical progress. In 1941, Konrad Zuse, a German engineer, built a computer, the Z3, to help design airplanes and missiles.
The Allies also made progress in the development of computing power. In 1943, the British completed a secret code-breaking computer called Colossus to crack the codes used by Germany. Colossus did not significantly affect the development of the computer industry for two reasons. First, Colossus was not a general-purpose computer; it was designed only to decode secret messages. Second, the existence of the machine was kept secret until decades after the war ended.
In the mid-1940s, John von Neumann (1903-1957) joined the University of Pennsylvania team, initiating concepts in computer design that would still be in use in computer engineering 40 years later. In 1945, von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC) with a memory that held both programs and data. This technique allowed a computer to stop at some point and later resume its work. The key element of the von Neumann architecture is the central processing unit (CPU), which allowed all computer functions to be coordinated through a single source. In 1951, the UNIVAC I (Universal Automatic Computer I), built by Remington Rand, became the first commercial computer to use the von Neumann architecture model.
First-generation computers were characterized by operating instructions made specifically for a particular task. Each computer had a different binary-coded program called a machine language. This made the computers difficult to program and limited their speed. Other features of first-generation computers were the use of vacuum tubes (which made the computers of that era very large) and magnetic drums for data storage.
Second Generation
In 1948, the invention of the transistor greatly influenced the development of computers. The transistor replaced the vacuum tube in televisions, radios, and computers, drastically reducing the size of electronic machines. Transistors came into use in computers in 1956. Together with another advance, magnetic-core memory, transistors made second-generation computers smaller, faster, more reliable, and more energy-efficient than their predecessors. The first machines to take advantage of this new technology were supercomputers: IBM built a supercomputer named Stretch, and Sperry-Rand built one named LARC. These computers, both developed for atomic energy laboratories, could handle large amounts of data, a capability much in demand by atomic scientists. The machines were very expensive, however, and tended to be too complex for business computing needs, which limited their popularity. Only two LARCs were ever installed and used: one at the Lawrence Radiation Labs in Livermore, California, and the other at the U.S. Navy Research and Development Center in Washington, D.C. Second-generation computers replaced machine language with assembly language, a language that uses short abbreviations in place of binary codes. By the early 1960s, second-generation computers were appearing successfully in business, in universities, and in government. These machines were built with transistors, and they also had components associated with computers today: printers, disk storage, memory, operating systems, and programs.
Third Generation
Although transistors were in many respects superior to vacuum tubes, they generated substantial heat, which could damage a computer's internal parts. Quartz rock eliminated this problem. Jack Kilby, an engineer at Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components on a small silicon disc made from quartz. Scientists later managed to fit more components onto a single chip, called a semiconductor. As a result, computers became ever smaller as more components were squeezed onto each chip. Another third-generation development was the operating system, which allowed the machine to run many different programs at once, with a central program that monitored and coordinated the computer's memory.
Fourth Generation
After the IC, the only direction to go was down, shrinking the size of circuits and electrical components. Large Scale Integration (LSI) could fit hundreds of components onto one chip. By the 1980s, Very Large Scale Integration (VLSI) put thousands of components on a single chip. Along with the proliferation of computer usage in the workplace, new ways to harness their potential developed. As small computers grew more powerful, they could be connected together in networks to share memory, software, and information, and to communicate with each other. Computer networks allow individual computers to collaborate electronically to complete a task. Using direct cabling (a Local Area Network, or LAN) or telephone lines, such networks can become very large.
Fifth Generation
Defining the fifth-generation computer is quite difficult because the field is still very young. A famous imaginative example of a fifth-generation computer is the fictional HAL 9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. HAL displays all the functions desired of a fifth-generation computer: with artificial intelligence (AI), HAL could reason well enough to hold conversations with humans, use visual input, and learn from its own experience.
From the two definitions above, a computer system can be defined as an electronic network consisting of software and hardware that performs a specific task (receiving input, processing the input, storing instructions, and providing output in the form of information). It can also be understood as the set of elements involved in carrying out an activity using a computer.
Computers help people in their daily work, in tasks such as word processing, number processing, and image processing. The ultimate goal of a computer system is to process data into information, and to do so it needs the elements that make it up: hardware, software, and brainware. The hardware is the physical equipment itself; the software consists of programs containing the instructions for performing particular processes; and the brainware is the people involved in operating and managing the computer system. Some descriptions add a fourth element, the instruction set, the repertoire of commands the machine understands. These elements must be interconnected and form a single unit. Hardware without software means nothing; it is just a dead object. And hardware and software together still do nothing if there is no one to operate them. A simple example: who will start the system if there is no human?
And what would the computer execute if there were no software? The study of computer systems is closely associated with hardware, but it also touches briefly on programming. At first, all operations in a computer system were handled by a single user, who made all the settings for hardware and software. As operating systems developed, however, this arrangement was handed over to the operating system: all resource management is now governed by the operating system. Managing hardware and software is closely linked to protecting them. Where protection of hardware and software, needed to keep the system running stably, was once done directly by the user, the operating system is now responsible for much of it. The operating system must regulate the use of all the hardware resources required by the system to avoid undesirable outcomes. As resource sharing within a system increases, the operating system must intelligently decide which requests take precedence; if this regulation goes badly, hardware protection can fail. Multiprogramming, which keeps several programs in memory at the same time, increases utilization through concurrent use of resources, but it also creates problems, because only one program can actually run at any instant, and many processes can be affected by a disturbance in a single program. For example, if a hard disk is a resource needed by a wide variety of programs, congestion from so many programs sending disk requests at the same time could overheat the disk and damage it.
This is where hardware protection comes in. The operating system must provide maximum protection, so that if one program misbehaves it does not interfere with the operating system or the other programs that are running. In general, a computer system consists of a CPU and device controllers connected through a bus that provides access to memory. Each device controller is typically responsible for a specific piece of hardware. Each device and the CPU can operate concurrently, competing for access to memory. Having several pieces of hardware active at once can cause synchronization problems, so a memory controller is added to synchronize access to memory.
In more advanced computer systems, the architecture is more complex. To improve performance, multiple buses are used, each bus being a data path between several different devices. In this arrangement the RAM, processor, and GPU (AGP/VGA) are connected by a high-speed primary bus, better known as the FSB (Front Side Bus), while other devices are connected by slower buses, with lower-speed buses connecting to faster ones until they reach the main bus. For communication between buses, a bridge is used.
Bus synchronization responsibilities indirectly also affect memory synchronization, which is performed by a bus controller known as a bus master. The bus master controls the flow of data so that at any one time the bus carries data from only a single device. In practice, the bridge and the bus master are brought together in a chipset. When the computer is turned on, a process known as booting, it runs the bootstrap program, a simple program stored in ROM on a CMOS (Complementary Metal Oxide Semiconductor) chip. Modern CMOS chips are usually of the EEPROM (Electrically Erasable Programmable Read Only Memory) type, a non-volatile memory (its contents are not lost when the power is turned off) that can be written and erased with electronic pulses. This bootstrap program is better known as the BIOS (Basic Input Output System). The main bootstrap program, usually located on the motherboard, examines the major hardware and runs the initialization program stored in each piece of hardware, known as firmware. The main bootstrap program then finds and loads the operating system kernel into memory and proceeds with initializing the operating system. From this point on, the operating system waits for events, and each event determines what the operating system does next (event-driven). In modern computers these events are usually signaled by a software or hardware interrupt, so the OS is called interrupt-driven. A hardware interrupt is normally delivered via a specific signal, while software raises an interrupt by invoking a system call, also known as a monitor call. A system or monitor call causes a trap, a special interrupt generated by software because of a problem or a request for operating system services; a trap is also often referred to as an exception. Each time an interrupt occurs, a piece of code known as an ISR (Interrupt Service Routine) determines the action to be taken.
The action to take can be determined in two ways: by polling, in which the computer checks the devices one by one to find the source of the interrupt, or by an ISR lookup using the addresses stored in an array known as the interrupt vector, which the system consults whenever an interrupt occurs. The interrupt architecture must be able to save the address of the interrupted instruction. On older computers this address was stored in a particular fixed place, whereas on newer computers it is stored on the stack along with the state information of the moment.