Science Fair Project Encyclopedia
IBM PC compatible
IBM PC compatible refers to the class of computers which make up the vast majority of smaller computers (microcomputers) on the market today. They are based (without IBM's participation) on the original IBM PC, use the Intel x86 architecture (or an architecture made to emulate it), and are capable of using interchangeable commodity hardware. These computers used to be referred to as PC clones; nowadays they are simply called PCs.
The origins of this platform came with IBM's decision in 1980 to bring a personal computer to market as quickly as possible, in response to Apple Computer's rapid success in the burgeoning microcomputer market (not yet generally known as the 'personal computer' market). On 12 August 1981, the first IBM PC went on sale. There were several operating systems available for it, but the one people remember is DOS, which was also the cheapest. IBM licensed DOS from Microsoft; IBM's version was called PC-DOS and was sold as an 'add-on' to the IBM PC. IBM's agreement also allowed Microsoft to sell its own version, MS-DOS, for non-IBM platforms. In creating the platform, IBM used only one proprietary component: the BIOS.
Columbia Data Products copied the IBM PC and produced the first 'compatible' (i.e. compatible with the IBM PC standard) PC in 1982. Compaq Computer Corp. produced its first IBM PC compatible a few months later in 1982: the Compaq Portable, which was also the first sewing-machine-sized portable PC. Compaq could not directly copy the BIOS as a result of the court decision in Apple v. Franklin, but it could reverse-engineer the IBM BIOS and then write its own BIOS using clean room design. Compaq became a very successful PC manufacturer and was bought by Hewlett-Packard in 2002.
Simultaneously, many manufacturers such as Xerox, Digital, and Sanyo introduced PCs that were, although x86- and MS-DOS-based, not completely hardware-compatible with the IBM PC. While such decisions seem foolish in retrospect, it is not always appreciated just how fast the rise of the IBM clone market was, and the degree to which it took the industry by surprise. Later, in 1987, IBM itself would launch the PS/2 line of personal computers, which was only software-compatible with the PC architecture; it, too, was hugely unsuccessful.
Microsoft's intention, and the mindset of the industry from 1981 to as late as the mid-1980s, was that application writers would write to the APIs in MS-DOS, and in some cases to the firmware BIOS, and that these components would form what would now be called a hardware abstraction layer. Each computer would have its own OEM version of MS-DOS, customized to its hardware. Any piece of software written for MS-DOS would run on any MS-DOS computer, regardless of variations in hardware design.
During this time MS-DOS was sold only as an OEM product. There was no Microsoft-branded MS-DOS, MS-DOS could not be purchased directly from Microsoft, and the manual's cover had the corporate color and logo of the PC vendor. Bugs were to be reported to the OEM, not to Microsoft. However, in the case of the clones, it soon became clear that the OEM versions of MS-DOS were virtually identical, except perhaps for the provision of a few utility programs.
MS-DOS provided adequate support for character-oriented applications, such as those that could have been implemented on a minicomputer and a Digital VT100 terminal. Had the bulk of commercially important software fallen within these bounds, hardware compatibility might not have mattered. However, from the very beginning, many significant pieces of popular commercial software wrote directly to the hardware, for a variety of reasons:
- Communications software directly accessed the UART chip, because the MS-DOS API and the BIOS did not provide full support for the chip's capabilities.
- Graphics capability was not taken seriously; it was considered an exotic or novelty function. MS-DOS had no API for graphics, and the BIOS included only the most rudimentary graphics functions (such as changing screen modes and plotting single points); having to make a BIOS call for every point drawn or modified also added considerable overhead, making the BIOS interface notoriously slow. Because of this, line drawing, arc drawing, and blitting had to be performed by the application, usually by bypassing the BIOS and accessing video memory directly. Games, of course, used graphics, and performed any machine-dependent trick their programmers could think of in order to gain speed. Thus games were machine-dependent, and games turned out to be important in driving PC purchases.
- Even for staid business applications, speed of execution was a significant competitive advantage. This was shown dramatically by Lotus 1-2-3's competitive knockout of the rival spreadsheet Context MBA. The latter, now almost forgotten, preceded Lotus to market, included more functions, was written in Pascal, and was highly portable. It was also too slow to be really usable on a PC. Lotus 1-2-3 was written in pure assembly language and performed some machine-dependent tricks; it was so much faster that Context MBA was dead as soon as Lotus arrived.
- Disk copy-protection schemes, popular at the time, accessed the disk drive hardware directly precisely in order to write nonstandard data patterns, patterns that were illegal from the point of view of the OS and therefore could not be produced by standard OS calls.
- The microcomputer programming culture at the time was hacker-like, and enjoyed discovering and exploiting undocumented properties of the system.
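The graphics point above is easy to see in the arithmetic. The following is a minimal, hypothetical sketch (the helper names are invented for illustration) using the VGA's later 320x200, 256-colour mode, whose framebuffer is a flat 64000-byte region: drawing through the BIOS meant one software interrupt per pixel, while direct access is one offset computation followed by a tight loop of memory writes.

```c
#include <stdint.h>

/* Hypothetical sketch: in the VGA's 320x200, 256-colour mode the
   framebuffer is a flat 64000-byte region at segment 0xA000, so a
   pixel's byte offset is simply y*320 + x. Drawing through the BIOS
   meant one INT 10h software interrupt per pixel; writing the buffer
   directly turns the same operation into plain memory stores. */

#define SCREEN_W 320

/* Byte offset of pixel (x, y) inside the linear framebuffer. */
static uint16_t pixel_offset(int x, int y)
{
    return (uint16_t)(y * SCREEN_W + x);
}

/* Fill a horizontal run directly, as a game or demo would have done
   ('vram' stands in for the real buffer at 0xA000:0000). */
static void hline_direct(uint8_t *vram, int x, int y, int len,
                         uint8_t colour)
{
    uint8_t *p = vram + pixel_offset(x, y);
    for (int i = 0; i < len; i++)
        p[i] = colour;
}
```

On real hardware the `vram` pointer would address video memory at segment 0xA000; here an ordinary buffer stands in for it so the arithmetic can be followed.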
At first, other than Compaq's models, few "compatibles" really lived up to their claim; "95% compatibility" was seen as excellent. Gradually vendors discovered not only how to emulate the IBM BIOS, but also where they needed to use identical hardware chips to perform key functions within the system. Reviewers and users developed suites of programs to test compatibility, generally including Lotus 1-2-3 and Microsoft Flight Simulator, the two most popular "stress tests." Meanwhile, IBM damaged its own franchise by failing to appreciate the importance of "IBM compatibility" when it introduced products such as the IBM Portable (essentially a Compaq Portable knockoff) and, later, the PCjr, which had significant incompatibilities with the mainline PCs. Eventually, the Phoenix BIOS and similar commercially available products permitted computer makers to build essentially 100%-compatible clones without having to reverse-engineer the IBM PC BIOS themselves.
By the mid-to-late 1980s buyers began to regard PCs as commodity items, and became skeptical as to whether the security blanket of the IBM name warranted the price differential. Meanwhile the incompatible Xeroxes and Digitals and Wangs were left in the dust. Nobody cared that they ran MS-DOS; the issue was that they did not run off-the-shelf software written for IBM compatibles.
The declining influence of IBM
Since 1982, IBM PC compatibles have conquered both the home and business markets of commodity computers so that the only notable remaining competition comes from Apple Macintosh computers with a market share of only a few per cent. Meanwhile, IBM has long since lost its leadership role in the market for IBM PC compatibles (this may have had to do with the failure of other manufacturers to adopt the new features of the IBM PS/2); currently the leading players include Dell and Hewlett-Packard. Despite advances in computer technology, all current IBM PC compatibles remain very much compatible with the original IBM PC computers, although most of the components implement the compatibility in special backward compatibility modes used only during a system boot.
One of the strengths of the PC compatible platform is its modular design: if a component became obsolete, only that component had to be upgraded, not the whole computer, as was the case with many of the microcomputers of the time. As long as applications used operating system calls and did not write to the hardware directly, existing applications would continue to work. However, MS-DOS (the dominant operating system of the time) lacked calls for much multimedia hardware, and the BIOS was also inadequate. Various attempts to standardise the interfaces were made, but in practice many of these attempts were either flawed or ignored. Even so, there were many expansion options, and the PC compatible platform advanced much faster than competing platforms of the time.
"IBM PC Compatible" becomes "Wintel"
In the 1990s, IBM's influence on PC architecture became increasingly irrelevant. Instead of focusing on staying compatible with the IBM-PC, vendors began to focus on compatibility with the evolution of Microsoft Windows. No vendor dares to be incompatible with the latest version of Windows, and Microsoft's annual WinHEC conferences provide a setting in which Microsoft can lobby for and in some cases dictate the pace and direction of the hardware side of the PC industry. The term "IBM PC Compatible" is on the wane. Ordinary consumers simply refer to the machines as "PCs," while programmers and industry writers are increasingly using the term "Wintel architecture" ("Wintel" being a contraction of "Windows" and "Intel") to refer to the combined hardware-software platform.
The breakthrough in entertainment software
The original IBM PC was not designed with games in mind. The monochrome graphics and very simple sound made it unsuitable for multimedia applications. That, and the fact that it was priced out of the entertainment market, made it seem unlikely that the PC platform would be used for games.
As the technology of the PC advanced, games started to appear for it. At first these were inferior to the games for other platforms, but thanks to the modular design, the technology behind the PC advanced rapidly, and what PC games lacked in multimedia capabilities they made up for in raw speed. A few years later, VGA cards started to appear, offering 256-colour graphics from a palette of 262,144 colours (64 levels each of red, green and blue). At around the same time, sound cards started to appear, replacing the beeps of the PC speaker with much richer sound.
Even once the PC had hardware superior to the competing platforms of the time, it still was not taken seriously as a games machine. This may have been due to the higher price, to the fact that video game consoles rather than personal computers were now starting to attract gamers, or to the fact that the hardware was very awkward to program for and required different drivers for each piece of multimedia hardware.
The PC platform did not manage to create a cult following as the other platforms had done. There was a demo scene on the PC, but it was small, did not appear until many years after the original IBM PC, and its demos were few and far between. As a result, there were few programmers who knew how to squeeze the full performance out of the machine.
One thing that PCs did have in their favour was raw processing power, which made them suitable for 3D games. The PC made its breakthrough as a games machine when Doom was released in 1993, thanks to its outstanding graphics and gameplay. Because networking hardware was widespread on PCs, Doom also offered multiplayer support across a network, something few games offered at the time. Doom finally established the PC as a games machine.
Design flaws and more compatibility issues
Although the original IBM PC was designed for expandability, even its designers could not anticipate the hardware developments of the 1980s. By the late 1980s IBM, the creator of the IBM PC, no longer had much say in the platform's evolution, and many other companies were trying to push their own standards.
To make things worse, IBM, Intel and Microsoft introduced several design flaws which created hurdles for the development of the PC compatible platform. One example was the DOS 640k barrier (memory below 640k is known as conventional memory). It arose partly from the way IBM mapped the memory of the PC, and the memory management of DOS (the most widely used operating system) dealt with it in a way that made things worse. To expand PCs beyond one megabyte, EMS (expanded memory) was devised: extra memory was bank-switched into a page frame below the one-megabyte boundary. Once Intel released the 80286 processor, an alternative memory management scheme, XMS (extended memory), was introduced, giving access to the memory above one megabyte directly. EMS and XMS were originally incompatible, so anyone writing software that used more than one megabyte had to support both systems.
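The 640k barrier falls straight out of the 8086's addressing arithmetic, which can be sketched in a few lines (an illustrative sketch; the function name is invented):

```c
#include <stdint.h>

/* Real-mode (8086) address translation: a 16-bit segment and a 16-bit
   offset combine into a 20-bit linear address, segment*16 + offset.
   2^20 bytes = 1 MB is therefore the hard ceiling; IBM's memory map
   reserved everything from 0xA0000 upward for video RAM and ROMs,
   leaving DOS programs the infamous 640k of conventional memory. */

static uint32_t linear_address(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + offset;
}
```

Note that 0xFFFF:0xFFFF computes to 0x10FFEF, just past the one-megabyte mark; an 8086 wraps this address around to low memory, but a 286 or later with the A20 line enabled can reach it, giving the small "high memory area" that XMS drivers later exploited.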
Graphics cards suffered from their own incompatibilities. Once graphics cards advanced to SVGA level, the standard for accessing them was no longer clear. At the time, PC programming involved a memory model with 64KB memory segments. The standard VGA graphics modes used screen memory that fitted into a single memory segment; SVGA modes required more memory, so accessing the full screen memory was tricky. Each manufacturer developed its own way of accessing the screen memory, and even its own numbering of the new graphics modes. Manufacturers therefore had to supply software device drivers that allowed the SVGA modes to be used by programs accessing the graphics card at the driver level. Unfortunately, there was no driver standard that all manufacturers followed. An attempt at a standard, the VESA BIOS Extensions, was made, but not all manufacturers adhered to it. To make things worse, the manufacturers' drivers often had bugs, and application developers had to write their own drivers to work around the cards with buggy ones.
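The segment problem can be sketched numerically. Assuming a 640x480 mode at one byte per pixel (307200 bytes of screen memory, against a 64 KB window), a driver has to work out which bank each pixel falls in; the function names here are invented for illustration:

```c
#include <stdint.h>

/* Sketch of SVGA banked addressing (hypothetical 640x480 mode, 8 bits
   per pixel, 64 KB bank window). A full frame needs 307200 bytes, far
   more than one 64 KB real-mode segment, so screen memory is exposed
   one 64 KB "bank" at a time; a card-specific (or, later, VESA)
   bank-switch call maps the chosen bank into the window. */

#define SVGA_W    640
#define BANK_SIZE 65536UL

/* Which bank pixel (x, y) lives in. */
static uint16_t pixel_bank(int x, int y)
{
    uint32_t addr = (uint32_t)y * SVGA_W + x;
    return (uint16_t)(addr / BANK_SIZE);
}

/* The pixel's offset inside the mapped 64 KB window. */
static uint16_t pixel_window_offset(int x, int y)
{
    uint32_t addr = (uint32_t)y * SVGA_W + x;
    return (uint16_t)(addr % BANK_SIZE);
}
```

Since 65536 is not a multiple of the 640-byte scanline, even a single horizontal line can straddle two banks, forcing a bank switch mid-line; with every manufacturer exposing a different bank-switch mechanism, portable SVGA code was painful to write.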
Programming the PC was a nightmare. It put many hobbyists off, and may have been responsible for the slow take-off of the PC as a multimedia platform. Developing for the PC required a large test suite of hardware combinations to make sure the software was compatible with as many PC configurations as possible. Eventually a new programming interface, DPMI, was devised; it offered a flat memory model and made life for programmers easier.
Meanwhile, consumers were overwhelmed by the many different combinations of hardware on offer. To give the consumer some idea of what sort of PC would be needed to run a given piece of software, the Multimedia PC (MPC) standard was set in 1990. A PC that met the minimum MPC standard could be marketed as an MPC, and software that ran on a minimal MPC-compliant PC was guaranteed to run on any MPC. The MPC level 2 and MPC level 3 standards were set later, but the term "MPC compliant" never caught on. After MPC level 3 in 1996, no further MPC standards were set.
The rise of Windows
Microsoft announced Windows 1.0 in November 1983, but wasn't able to get it out the door until 1985. It wasn't successful. The same happened in 1987 with the launch of Windows 2.0, followed by the launch of Windows/286 and Windows/386 in 1988.
It is probably the lack of success of these early versions of Windows that threw IBM and Microsoft together to produce their version of the future, OS/2, in 1987 (it was launched alongside IBM's PS/2). At the launch, Bill Gates is quoted as saying "DOS is dead". OS/2 had been written from scratch by Microsoft and IBM (with IBM taking the lion's share of the work) and was vastly superior to the DOS-based Microsoft Windows.
But OS/2 had a problem (amongst many, it turned out): it was written for the 80286 processor. The 80386 had been launched the year before and, according to Gordon Moore, Intel had told IBM that the 386 would be ready in time for OS/2's shipping; Moore says IBM didn't believe him and carried on writing OS/2 for the 286. When 386 machines arrived in September 1986, OS/2 was left disastrously underpowered.
Then, making things worse, IBM and Microsoft didn't deliver OS/2's various 'extra bits' (namely Presentation Manager, the Windows-like front end for OS/2). Despite Microsoft and IBM saying "DOS was dead", users wholeheartedly stuck with it.
Thus by 1990 the market, and the technology on the PC platform, was ready for something new. Microsoft was still working with IBM when it launched Windows 3.0 and, according to Gates, it sold twice as many copies as Microsoft had expected. Windows 3.0 sat 'on top' of DOS: users would load DOS on their machine and then load Windows, which allowed them to swap between DOS and Windows rather than picking just one environment, making a gradual move to Windows possible.
Windows 3.0 resembled Apple Computer's Macintosh system software (Microsoft went as far as hiring Apple employees for its Windows design team) and revolutionised the way users 'used' their PCs. Where in the past users had typed commands into the MS-DOS command line interface (CLI), they now had a graphical user interface (GUI) in which a mouse pointed at small pictures of tasks (icons) to 'make things happen'. Windows 3.0 was followed by Windows 3.1 in 1992, and eventually Microsoft, realising that users wanted to network their PCs, included standard network protocols in the newer 3.11 version.
With the two companies still working together in the early 1990s, the success of Windows 3.0 and the relative failure of OS/2 caused some friction. According to Gates, IBM told Microsoft that it should drop Windows and work wholly on OS/2. Microsoft declined and eventually the two split; Microsoft took with it its code for OS/2 3.0, codenamed OS/2 NT (for New Technology). OS/2 NT would mutate into Windows NT and hence into today's Windows XP.
Windows NT was launched in 1993 as a parallel development to Windows for DOS. Aimed at the server market, it was supposed to be a fully professional system that wouldn't rely on DOS. At this time take-up was very small: the system was power-hungry and had few applications.
Development of the traditional Windows platform continued, adding more features, standardised protocols and broader hardware support; thus in 1995 Windows 95 was born. Before Windows 95, games and gaming were a totally MS-DOS experience: users had to tolerate rebooting into DOS, fiddling with memory (see the 640k barrier) and reconfiguring their PC every time they wanted to load a game. Windows 95 provided a system called DirectX, which gave programmers a standard API for video and sound card calls from Windows, revolutionising the games arena. For the first time, a PC programmer could benefit from Windows 95's memory management capabilities and extended functionality, and have API access to the graphics and sound cards, of which there were many versions and drivers. 3D graphics were possible from within Windows (for those with 3dfx cards), and networked multiplayer 3D games were now within the reach of almost every programmer.
Windows 95 was replaced by Windows 98 in 1998, and then by Windows 98SE (Second Edition) in 1999. It was Microsoft's intention to merge its Windows NT and Windows 9x lines (as the various versions from Windows 95 to ME were called) and phase out the Windows 9x operating systems. Microsoft originally intended to finish the 9x line with Windows 98SE, but when it became apparent that the NT line needed more power than the average 9x-era PC could deliver, the phasing-out was delayed and Microsoft launched an 'interim' version of Windows: Windows ME (Millennium Edition) in 2000.
In February 2000 the latest version of Windows NT, called Windows 2000, was released, and it finally began to show signs that NT could exist on the PC desktop. In October 2001 Windows XP was launched to replace all previous versions of Windows; at the time of writing (February 2004) it has had two service pack updates and isn't expected to be replaced by the next version of Windows (codenamed Longhorn) until 2006.
Challenges to Wintel domination
The success of Windows drove nearly all rival commercial operating systems into near-extinction and ensured that the PC was the dominant computing platform: a manufacturer that wrote software only for the Wintel platform could still reach the vast majority of computer users. By the mid-to-late 1990s, introducing a rival operating system had become too risky a commercial venture; experience had shown that even an operating system superior to Windows would be a failure.
However, a free operating system, Linux, was being developed by enthusiasts. Because they were doing it for fun, they were not concerned with commercial risk. Although Microsoft's programmers worked for a living and Linux's contributors worked in their spare time, Linux became a first-class product: the sheer number of contributors allowed a development effort comparable to that of the Microsoft programmers. After a couple of years, Linux had become a very powerful operating system and, because it was free, it spread widely.
By the late 1990s, Linux was being taken seriously and was seen as an example of what the open source movement could achieve. While it initially lacked software and was incompatible with Windows, Linux did solve one of the main problems with Windows: stability. Despite this, Windows still remains the dominant operating system.
On the hardware front, Intel licensed its technology so that other manufacturers could make x86-compatible CPUs; in other cases, companies such as AMD and Cyrix produced compatible CPUs of their own design. Towards the end of the 1990s, AMD was taking a huge chunk of the PC CPU market, and it even came to play a significant role in directing the evolution of the x86 platform when its Athlon processors were released in 1999, well over a year before the comparable Intel Pentium 4 architecture.
DirectX, while solving many of the problems of programming PCs, was only available on Windows. OpenGL, available on several platforms including Windows, offered a means of rapidly developing cross-platform 3D applications.
The PC today
- main article is at personal computer
The original IBM PC is long forgotten and the term "PC compatible" is rarely used. Processor speeds and memory sizes are many orders of magnitude greater than on the original IBM PC, yet any well-behaved program for the original IBM PC that does not call the hardware directly can still run on a modern PC. Some say that the desire for backward compatibility may have hindered the development of the PC, but many believe the ability to run legacy software is what helped keep the PC alive.
The modular design makes it possible to choose every component of a PC from a variety of different manufacturers and to buy only what is needed for the tasks the computer is intended to carry out. Upgrades are easy. It is also possible to choose the operating system to run on the PC, and what software to run.
Software and hardware compatibility amongst different PCs is no longer a major issue. Other platforms exist today (most notably the Apple Macintosh), but they are a minority.
Thanks to intuitive user interfaces and the information-gathering and communications capabilities of the Internet, the computer has finally escaped from the domain of computer professionals and computer hobbyists, and has become mainstream.
The design of computer cases has become more elaborate, and users can modify the cases themselves (a practice known as case modding), but the plain beige box design that has been around since the '80s is still common.
A PC can come in one of the following configurations:
- Desktop: a computer that sits on top of a desk. Portability is not part of the design, so desktop computers tend to be too heavy and too large to carry. This has the advantage that the components do not need to be miniaturised, and are therefore cheaper.
- Portable: not long after the first IBM PC came out, Compaq produced the Compaq Portable, one of the first portable PC compatible computers. Weighing in at 28 pounds, it was more of a "luggable" than a "portable".
- Laptop (also known as a notebook): a PC that has been miniaturised so that it is easy to carry and can fit into a small space. It uses a flat-screen LCD which folds onto the keyboard to create a slab-shaped object. Carrying a laptop around is easy, but the increased portability comes at a cost: to reduce size and mass, a special design with smaller, more expensive components is used. The design is more integrated and therefore less expandable, although the RAM and the hard drive can usually be upgraded. Laptops are also battery powered, so as well as being smaller, the components need to have low power usage.
- Sub-notebook: in 1996, Toshiba produced the Libretto range of sub-notebooks (mini-notebooks). The first model (the Libretto 20) had a volume of 821.1 cm³ and weighed just 840 g. Unlike PDAs, they were fully PC compatible. Several models were produced in the Libretto range; Librettos are no longer made.
- Computer hardware
- Computer software
- IBM PC
- Personal computer
- History of computing hardware (1960s-present)
- PC speaker
- The PC Guide! – Contains detailed historical and technical information. Many resources about PCs and some links.
- WiredGuide Resources Page – A collection of links to PC resources.
- http://server.physics.miami.edu/~chris/pc_resources.html – A collection of links to useful PC programs, utilities and Web sites.
Buying a PC
- Pricewatch – Compare prices of PC hardware from different vendors.
- Dave's Guide to buying a Home Computer
Building a PC
- Build your own PC
- http://www.buildyourowncomputer.net/ – Learn to build your own computer.
- My Super PC – How To Build A PC - A Computer Building Guide.
The contents of this article are licensed from www.wikipedia.org under the GNU Free Documentation License.