Pacific Connection (English)

In Search of the Killer App: 64-bit Computing on the Desktop

In 1971, Intel introduced the world's first single-chip processor: the 4004. It was a 4-bit CPU. In 1972, it was succeeded by Intel's 8-bit 8008. Two years later, National Semiconductor came up with the first 16-bit chip, the PACE. It would take another eight years, until 1982, before a 32-bit processor debuted, in Hewlett-Packard's HP 9000 engineering minicomputer. Intel shipped its 386 chip-featuring the architecture that would come to dominate computing-in 1985. Since then, 32-bit processors have helped revolutionize desktop computing. The increased speed and memory access have made the graphical user interface possible. Icons, graphics, digital photography, the Web as we know it-none of this would have been very practical in a 16-bit world.

That's where we are today on the desktop. Intel, which dominates both the workstation and server markets, remains primarily a maker of 32-bit CPUs. The same is true of AMD and Apple, with its use of IBM's PowerPC. With up to 4 gigabytes of memory access, 32-bit systems do just fine for most applications. Most operating systems are 32-bit. If the 32-bit CPU were a 32-lane road, most highway engineers would tell you that it was wide enough to accommodate most traffic on most days.

So do users really need a 64-bit highway? Is the cost worth it? Or, to put it another way, does 64-bit computing have a killer application that will drive sales of 64-bit CPUs? Or will 64-bit desktop computing occur primarily in niche markets-among processor-intensive applications? As it turns out, the two major CPU makers-Intel and AMD-have very different answers to that question. Intel, so far at least, is moving slowly and cautiously toward the 64-bit desktop. AMD has pounced. Perhaps that's not surprising. With its large market share, Intel has much to lose if it moves too quickly, while AMD has everything to gain.

The two strategies are perhaps best understood through the processors the two companies make. The Intel Itanium, first introduced in May 2001, is a complete departure from the x86 instruction set. Co-developed with Hewlett-Packard, the Itanium and its successor, the Itanium 2 (introduced in July 2002), are primarily targeted at server applications. AMD introduced its 64-bit Opteron for the server market in April 2003. The Athlon 64-aimed at desktop machines-came out last September. Unlike the Itanium, these two chips support the x86 instruction set without the need for intervening software, allowing 32-bit applications to run in native mode. Meanwhile, Intel is rumored to be testing an x86/64-bit hybrid architecture (which, at this writing, it won't confirm). Some observers predict the company will have no choice but to release a hybrid chip, perhaps later this year.

Intel would seem to have little to be embarrassed about so far. The company has a huge market share-around 80 percent, by its own estimate. In a presentation to securities analysts last November, Intel President Paul Otellini declared this the "year of the Itanium" and said that the company would ship more than 100,000 of the CPUs in 2003. Those processors are found in servers and workstations manufactured by 40 different vendors. Some 1,000 applications now support the Itanium. Meanwhile, Intel's other products-the Pentium and Xeon-are also selling quite well compared with the competition. So what's the problem?

"Until or unless Intel offers an alternative, they have to push Itanium," said Rick Whittington, an analyst for American Technology Research, in a New York Times interview. "But what Paul Otellini is saying is a fiction. The customers are not voting his ballot." One potential customer to go the other way is Sun, which has announced a family of Sun Fire servers running 64-bit Opterons. The adoption of the AMD technology represents a begrudging acknowledgement by Sun that its SPARC 64-bit RISC processor will not be enough to carry the company forward. Part of the reason for Sun's choice is undoubtedly cost: AMD's processors are cheaper than Intel's.

"For AMD, it's a huge win for us in the enterprise," says Hal Speed, who heads AMD's Strategic Initiatives Group. "We've had IBM, but Sun brings a unique approach to building out a solution set with software and hardware. Being included in that, with the quality of the customers, opens doors for us."

AMD: bridging to 64-bit

Unlike the 32-bit market, where the x86 instruction set provided a common point of reference, 64-bit processors have all gone their own way. And that includes the Itanium on one hand and the Opteron and Athlon on the other. "The two architectures are nothing alike," says Speed. He says that the divergence started with Intel. "The Itanium is a departure from x86, and when Intel went to Itanium, they broke a lot of the backward compatibility with today's 32-bit applications. At the time, HP and Intel collaborated on a proprietary solution to compete against Alpha and SPARC. Our approach is vastly different. We've maintained backward compatibility. The 32-bit x86 runs natively-there is no emulation layer. Ours is a migration from x86. We've expanded our registers to 64 bits and added a handful of new instructions, while maintaining compatibility with the 32-bit world."

The two architectures have required two different versions of Microsoft's Windows XP 64-Bit Edition. The AMD version came later and, at this writing, is still in beta. "This is still early," Speed says. "We've got Microsoft compilers and debuggers in beta form. The new version of Visual Studio will have full support for the AMD 64 instruction set." He says that the visual debugger, compiler, and the rest of the suite are targeted primarily at Longhorn development. Indeed, Speed thinks that Longhorn-Microsoft's planned XP replacement-is key to 64-bit computing on client machines. "The current beta will be productized in the second half of 2004. Both the server and client versions are built off the [Windows] Server 2003 code base-which is also, I think, what they do for Itanium. By contrast, Longhorn will be built from the ground up to support AMD 64 across all the OS products: server, client, tablet."

Does that mean that applications will come in separate Itanium and AMD 64 versions? "I'll give you my answer," says Speed. "I think you are going to have only one version, and it's going to be ours. The industry has shown that niche, proprietary solutions do not succeed. You can buy an AMD 64-bit processor today and run a 32-bit OS and 32-bit apps. There's no penalty." The advantage, says Speed, is a mixed environment of 32-bit and 64-bit applications. You could have a 64-bit Oracle database running alongside 32-bit Apache software, all under the same environment with, presumably, no performance degradation-because the 32-bit apps are running in native mode.

As you might expect, Speed says that the client-side applications most likely to go 64-bit are professional-level content creation tools-special effects software such as Alias Systems' Maya and Discreet's effects and compositing products. Over time, he says, 64-bit will trickle down to the "professional/consumer" market. "The other killer app is, of course, gaming-like Unreal Tournament 2004 from Epic Games and Far Cry from Germany's Crytek. From what I can tell, the PC games are outpacing the current level of console games."

"Ultimately, 64-bit will be employed in home entertainment, to transcode rich video data. Today, a hard-disk personal video recorder cannot time-shift an HD signal, from HDTV or high-definition DVDs. A computer with an Athlon 64 could. Speed notes that Microsoft has produced a high-definition (1080-progressive scan) version of Terminator 2: Judgment Day Windows Media 9 Series. The resulting resolution is nearly three times that of a conventional DVD. But Microsoft does not insist on a 64-bit processor on the desktop. A 3.0GHz or greater processor for will do, along with an AGP4x based NVIDIA or ATI video adapter card with at least 32 MB of RAM.

Speeding video encoding

If you look around for 64-bit desktop applications, you quickly realize that we are in the foundation stage. There are 32-bit operating systems with 64-bit extensions. There are 64-bit development tools coming onto the market. But there are few applications. When AMD launched its Athlon 64 processor, its 32 "launch partners" were overwhelmingly PC integrators. Finding application developers who are even willing to talk about 64-bit on the desktop can be a tough business.

One application developer porting to AMD 64 is DivX, whose video codec performs MPEG-4-compatible video compression. In September, the company announced it would support the Athlon 64. DivX creator and company co-founder Jerome Rota says that the principal performance advantage will be seen in encoding speed. Video compression typically compares one frame with adjacent frames, looking for features that are common to both. This process, called "motion estimation," is a "very repetitive, very stupid job, but it takes a lot of time," says Rota, "about 70 to 90 percent of the total encoding time."
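To make the idea concrete, here is a minimal sketch of full-search block matching, the textbook form of motion estimation. It is not DivX's actual code; the function names, the 16x16 block size, and the search range are illustrative assumptions. The nested loops show why the job is "repetitive" and why it dominates encoding time.

#include <stdint.h>
#include <stdlib.h>
#include <limits.h>

/* Sum of absolute differences between a 16x16 block of the current frame
   and a candidate block in the reference frame. Both pointers address
   8-bit luma samples in planes that are `stride` bytes wide. */
static unsigned sad_16x16(const uint8_t *cur, const uint8_t *ref, int stride)
{
    unsigned sad = 0;
    for (int y = 0; y < 16; y++)
        for (int x = 0; x < 16; x++)
            sad += abs(cur[y * stride + x] - ref[y * stride + x]);
    return sad;
}

/* Full search: test every offset within +/- `range` pixels of the block's
   position and keep the one with the lowest SAD. The caller is assumed to
   keep the search window inside a padded reference frame. This brute-force
   loop is the "repetitive, stupid job" that eats most of the encoding time. */
static void find_motion_vector(const uint8_t *cur, const uint8_t *ref,
                               int stride, int range,
                               int *best_dx, int *best_dy)
{
    unsigned best = UINT_MAX;
    *best_dx = 0;
    *best_dy = 0;
    for (int dy = -range; dy <= range; dy++)
        for (int dx = -range; dx <= range; dx++) {
            unsigned sad = sad_16x16(cur, ref + dy * stride + dx, stride);
            if (sad < best) {
                best = sad;
                *best_dx = dx;
                *best_dy = dy;
            }
        }
}

A production encoder layers heuristics such as diamond search, early exits, and SIMD on top of this, but the access pattern stays the same: for every block, a neighborhood of the reference frame is read over and over-which is why keeping more frame data close to the processor pays off.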

One of the main reasons DivX decided to create a 64-bit version of the encoder was to speed this process up. The first generation takes advantage of the larger integrated memory associated with 64-bit CPUs-what Rota refers to as "cache memory"-allowing more frames to be put in memory for analysis. "We can move more data to get a substantial speed increase in motion estimation." Throughput numbers vary, but Rota figures it will be at least twice as fast.

On the other hand, 64-bit is not as much of an advantage on the decode side. Not that a 64-bit processor doesn't help, but Rota says that the role played by the video card can make a big difference. "We've seen a huge difference in performance on the same CPU, depending on the card in use," he says. "An integrated, low-end video card will require a lot of CPU time to move the data through the bus to the video card." He says that for high-definition playback, especially, you need not only a good graphics card but a strong CPU to push data through this pipe. But high-end 32-bit CPUs, in the 3.2 gigahertz range, will also do the job.

Rota says that recompiling 32-bit code for a 64-bit processor represents only the first stage of 64-bit development. To tap the larger possibilities of the processor, you must re-architect the code. "Right now, we are just taking the 32-bit code and tweaking it a little. But there are many more things we can do. For example, 64-bit supports larger memory sizes. This could be extremely interesting for compression at the professional level. If you can fit your entire source material into memory, you can see a huge gain in terms of motion estimation. The difference could be tremendous." He explains that motion estimation is currently done by examining the two adjacent frames-one before and one ahead of the frame in question. In the future, algorithms will look further ahead, and ultimately, the possibilities really open up if you can put the entire video source in memory. That possibility exists with 64-bit computing because of its massive addressable memory.
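Some rough arithmetic shows why "entire source material into memory" is a 64-bit proposition. The figures below are illustrative assumptions-a 90-minute clip at 1920x1080, 8-bit 4:2:0 sampling, 24 frames per second, none of it published by DivX-and the point is simply that the uncompressed total lands hundreds of gigabytes past the 4 GB ceiling of a 32-bit address space.

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* Illustrative assumptions: HD frames, 8-bit 4:2:0, 24 fps, 90 minutes. */
    const uint64_t width = 1920, height = 1080;
    const uint64_t bytes_per_frame = width * height * 3 / 2;  /* luma + chroma */
    const uint64_t frames = 24ULL * 60 * 90;
    const uint64_t total = bytes_per_frame * frames;

    printf("One frame:    %llu bytes\n", (unsigned long long)bytes_per_frame);
    printf("Whole clip:   %llu bytes (~%llu GiB)\n",
           (unsigned long long)total, (unsigned long long)(total >> 30));
    printf("32-bit limit: %llu bytes\n", (unsigned long long)(1ULL << 32));
    return 0;
}

A single 32-bit process cannot even map that much data, let alone the working buffers around it, so algorithms that range over the whole clip only become practical with 64-bit addressing and enough physical RAM behind it.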

Rota thinks that the possibilities for 64-bit CPUs are largely unexplored. "When 32-bit first came out, people said-who needs that? It was extremely fast, and it supported 4 GB of memory. But who cared? But really quickly, people discovered some pretty important uses-like the graphical operating system. The same potential is happening again with 64-bit. I have 2 gigs of memory on my machine, and I can already see the advantage of having more. But that means not just adding memory, but re-thinking the application and the way you do things in code. With the current DivX architecture, it doesn't make that much sense to have more memory. If you sat down with an engineer and said-ok, what if we had unlimited memory?-it opens up possibilities. The answers will require some time to think through, to take to another level. But to truly take advantage of the 64-bit architecture, you need to think anew."

Itanium: targeting RISC chips

If AMD is aiming for the desktop with its 64-bit CPUs, Intel's Itanium family remains firmly entrenched on servers. And not just any servers, but those at the high end. In the Intel universe, Pentiums are for desktops, Pentium M processors are for mobile computers, Xeon CPUs are for entry- and mid-range servers, and Itaniums are aimed at servers costing roughly $7,000 to $10,000 or more. You can buy a Hewlett-Packard workstation with an Itanium 2 running HP-UX, Red Hat Linux Advanced Workstation 2.1 for Itanium, or Microsoft Windows XP 64-Bit Edition 2003. Starting price: around $3,880. But these are exceptions to the rule.

"Intel has more than 80 percent market share in the server market-but this is primarily in servers costing $10,000 and less," says Michael Graf, Intel's Itanium product line manager,. "Those servers that cost more have historically been the domain of proprietary RISC CPUs. And while the unit volume in this space is relatively small, they do account for about 50 percent of total server spending. This is really the purpose of Itanium-to go after this higher-end segment."

So what about PCs? Graf says the obvious: that for the time being, at least, 32-bit satisfies most end users. And for the time being, that claim is hard to argue with. Word processing, email, web searches-the kind of everyday stuff run on most PCs-doesn't really need more computational power. Most people and businesses won't pay for 64-bit computing unless there is a tangible payoff, a killer application. DivX's Rota can imagine an opportunity for a super-speed video codec, but how many people do video compression in their homes?

Asked how Intel's strategy compares with AMD's, Graf says that "AMD has a product that runs 32-bit software and occasionally runs 64-bit code. Itanium is optimized for 64-bit computing, and you can run 32-bit applications on the platform." (In other words, 32-bit emulation is good enough.) "So with Itanium, we're saying we're primarily going to run 64-bit, and occasionally run 32-bit. Our prime thrust is to optimize the code to run in native Itanium mode, but we recognize that end-users have got a lot of applications in their environment-and we're providing the capability.

"We believe there is a class of end-user that values high-end 64-bit computing. That's the user we're trying to satisfy with Itanium. We've got a broad suite of OSs, compilers, tools and applications optimized to run in 64-bit mode. Recognizing users may have some legacy applications they were not able to port over to Itanium in time-we're offering 32-bit support to bridge over to Itanium."

What about games and 3D graphics? Graf believes that 64-bit doesn't have that much payoff, at least yet. "Maybe it will some day, but we certainly don't see it today." And what about Longhorn? "Microsoft is supporting Itanium today with multiple 64-bit operating systems: production level versions of Windows XP and Windows Server 2003. Longhorn is another 64-bit operating system, but I don't see where it will have a big impact given what's already out in production."

Some observers have pointed out that Intel's multi-threading technology for x86 processors (which Intel calls "Hyper-Threading") has effectively extended the life of 32-bit processors by giving them added performance. Graf says that Intel "will introduce multi-threading to the Itanium with our Montecito product in 2005-but we haven't decided what to name it."

Apple's fastest-the G5

There's one other player in the 64-bit desktop space: Apple. The top-of-the-line G5 is a 64-bit machine that gives Apple the kind of cachet that Volkswagen gets by owning Porsche and Lamborghini. Even if you never buy one, it sounds good.

Apple has raised some eyebrows by claiming that the G5 is the fastest personal computer ever made. But there's no doubt it is fast. (For an Apple, the G5 is also big: the dual-processor G5 weighs around 45 pounds.) Apple makes the case through a series of benchmark tests it commissioned from the firm VeriTest, using the industry-standard SPEC CPU 2000 benchmarks, which measure both single-processor speed and overall system throughput. Based on the tests, which also included a Dell Dimension 8300 and a two-processor Dell Precision 650 running Red Hat Linux 9.0 Professional, Apple claims the G5 is faster than both the fastest Pentium 4 and a dual-processor Xeon workstation. The G5 won three out of four comparisons, most dramatically in floating-point system throughput.

In any case, the speed difference is not overwhelming, and reviewers who have tested the machine generally say it runs as fast as the fastest Intel-based PCs-but not necessarily faster in every instance. When PC Magazine compared performance between the G5 and a Dell Precision 650 workstation running dual 3.06-GHz Xeon processors, the results were mixed. Apple beat out the Dell running Adobe Acrobat and Sorenson Squeeze, a video compression tool. The Dell system was the victor running Adobe Photoshop 7 and NewTek LightWave 3D, a 3-D modeling application.

It is unlikely that any 64-bit processor will wind up dominating the benchmarks. Just as with 32-bit processors, the winner will depend on the type of test, the machine in use, and the software involved. Both AMD and Intel point out that their 64-bit chips run faster for reasons beyond bandwidth. Integrated "on-die" memory, sometimes referred to as "cache memory," can reduce the bottleneck encountered when addressing external memory over a bus. Intel says it will have 9 MB of integrated memory on the Itanium 2, and 24 MB with the Montecito in 2005. The number of on-board registers and the way memory is handled also contribute to computational power. So the real test of 64-bit on the desktop will be up to application developers. To the extent that they can identify and seize an opportunity available on 64-bit, at least some customers will follow. That has already happened on the server side. But on the desktop, the killer application has yet to be created.
