
Tuesday, 13 December 2011

Computer Generations Part 2


Computer Hardware Question 6: What is a fourth-generation computer?
Answer: During the 1970s, the downsizing of mainframes and minicomputers continued. By the late 1970s, most businesses were using computers for at least part of their data-management needs.
However, the demand for smaller and faster machines meant that even the integrated circuits of the third generation had to be made more compact. Fourth-generation computers are based on large-scale integration (LSI) of circuits. New chip-manufacturing methods meant that tens of thousands, and later hundreds of thousands, of circuits could be integrated onto a single chip, an approach known as very large-scale integration (VLSI).

Nevertheless, at that time, computing was still mostly seen as a time-sharing process. One mainframe or minicomputer could serve many users, each with a terminal connected to the computer by wire. But during this period, a new concept of "personal" computing was being developed. Surprisingly, this new type of computer was not being developed by the well-established computer companies. It was electronics hobbyists and a few fledgling electronics companies that began to create computing devices built around small, limited processors known as microprocessors. These microprocessors were built into small computers, known as microcomputers, that were designed to be used by one person at a time. For that reason, most businesses did not at first recognize their value. To users who had grown up with expensive room-sized mainframes that served an entire organization, the idea of a small computer serving only one user at a time seemed more like a toy. Many believed that these new "personal" computers would remain a hobby for "electronics nuts." But this view was soon to change as new microprocessor designs began to deliver considerable computing power in a very small package.
Although several scientists were working with microprocessor technology during this period, the best-known team worked for the Intel Corporation. The team of Ted Hoff, Jr., Federico Faggin, and Stan Mazor was expanding on the sophisticated electronics being used in very small Japanese calculators. They reduced all the processing power needed for basic computing to a set of four small circuits, or chips, one of which was to become known as the Intel 4004 microprocessor. Several special-purpose microprocessors followed, and in 1974 Intel produced the 8080, its first general-purpose microprocessor.
During this period, Steve Jobs and Steve Wozniak began putting together kit computers in Jobs' garage. These personal computers sold very well, and their endeavor eventually became Apple Computer, Inc., the most successful of the early microcomputer companies.
But it was the world's largest computer company that legitimized the personal computer (PC). In 1981, the International Business Machines (IBM) Corporation introduced its own microcomputer. Its widespread acceptance by the business community set off a flood of copycat PCs. Over the next few years, nearly every company with a stake in electronics produced a microcomputer, most of them very similar to the IBM PC.
During the 1980s, with the spread of specialized software, personal computers found a role in almost all organizations. As many businesses purchased an IBM PC (or one of its work-alike "clones"), it gradually became a de facto standard for PC design. This much-needed standardization meant that programs that ran on one brand of microcomputer would also run on other similar PCs that used the same microprocessor.
Computer programming methods continued to evolve during the fourth generation, as new high-level programming languages were developed that were both easier to use and more closely tailored to specific computing tasks.

Computer Hardware Question 7: What types of computers will we use in the future?
Answer: Many believe that we are entering a fifth generation of computing: a period of smaller, faster computers that we can talk to, computers that incorporate new software methods known as artificial intelligence (AI). AI methods give computers the ability to make decisions based on evidence from the past rather than on a fixed set of programmed procedures. If computers can be taught the decision-making rules used by human experts, expert systems based on AI methods can be developed to take over some human tasks; a toy sketch of this rule-based approach appears below.
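To make the expert-system idea concrete, here is a minimal sketch of a rule-based system in Python. Everything in it is invented for illustration; the facts, rules, and the infer function are assumptions for this example, not part of any real expert-system product. Each rule pairs a set of required facts with a conclusion, and the program keeps applying rules until no new conclusions appear, a technique known as forward chaining.

# A toy knowledge base: each rule is (required facts, conclusion).
# The medical content here is made up purely for illustration.
RULES = [
    ({"fever", "cough"}, "possible flu"),
    ({"possible flu", "fatigue"}, "recommend rest and fluids"),
    ({"chest pain"}, "recommend seeing a doctor"),
]

def infer(facts):
    # Forward chaining: apply rules until no new conclusions appear.
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for required, conclusion in RULES:
            # A rule "fires" when all its required facts are known.
            if required <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(infer({"fever", "cough", "fatigue"}))
# Prints a set that includes 'possible flu' and 'recommend rest and fluids'.

The point is not the medical content but the mechanism: the system's "knowledge" lives entirely in the rules, which human experts can write and extend, while the inference procedure itself never changes.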
Others believe that the emergence of the internet and enhanced communications systems (including wireless) will make the concept of computer generations irrelevant. The overriding trends in computer evolution - smaller, faster, more powerful - continue today. Today's microcomputers are far faster and more capable than any earlier-generation computer; today's PCs are more powerful than most of the huge mainframes of the past. But mainframes and minicomputers are also more powerful, and they now work in close concert with PCs rather than with the dumb terminals that used to be attached to large computers.
Each new generation of computers is faster, includes more memory and storage, and runs constantly improving operating systems. Software development methods are improving just fast enough to keep up with these new computing capabilities, and new user-interface designs are making computers easier to use despite their growing power.
Perhaps the most important of today's trends is that computers and the internet are becoming part of our daily lives. As computers continue to be used in marketing, retailing, and banking, we will grow ever more accepting of their presence. As computers are incorporated into other machines, we may find ourselves operating a computer when we drive, buy a can of soda, fill a tank of gas, or grab a bite to eat. And as the computer's presence grows in our society, it will become far easier to use. As this history of computing has demonstrated, it is the needs of humans that continually drive the development of new computers and new computing technologies.
