
^/\^ PeaK /\^/\



An Overview of 3D Graphics/Hardware Features ... why all the fuss

1.0 3D Bottlenecks and Benchmarks

The task of creating the illusion of a 3D world on a computer monitor is very compute intensive, and inevitably either the CPU or the video graphics chipset (or both) bogs down. That bogging down needs to be quantified, and benchmarks were invented to gauge how much performance an advanced graphics card adds... or fails to add. For computers, the benchmarking task was typically a well defined task, like calculating a number, but how do you benchmark the task of creating an illusion? Graphics cards use different algorithms to arrive at a screen image... the quality of the result and the time/speed taken must both be considered together, equally.

I was browsing for 3D benchmarks and came across Michael's 3D Benchmark Links. He stated that 3DBench was not a very good benchmark because of its dependence on the rating of the CPU used. To the extent that we would all like to be able to judge the merits of one subsystem... say the CPU... independent of another... say the graphics card... I agree. One of the things Steve Jobs hyped when he was building his NeXT computer was that performance is always limited by (and thereby sensitive to) the slowest subsystem. You could say that sensitivity implies the existence of a bottleneck. The two most influential bottlenecks in our quest for creating an illusion are the CPU/motherboard and the graphics card.

1.1 Benchmarking an Illusion

3D hardware gives you better performance in the traditional computerspeak sense of better response and less time lag, while addressing an issue which most of you can judge on your own but have not had to in the area of computers: visual realism and accuracy. Any hardware, such as graphics cards or board architectures (such as AGP... a combined software/operating system/board solution), that aids in this task of increasing realism is on the right track. PC World recently wrote one of the few good reviews on 3D and inadvertently described the advantages of 3D for computer gaming by saying:

AGP, or "Accelerated Graphics Port", synonymous with increased visual quality, is a hardware solution that frees you from needing expensive (circa 1996) local memory (found on the graphics card) by providing medium-bandwidth access to main memory for the "3D wallpaper patterns", or textures, used to cover all objects in the 3D illusion. More wallpaper/larger memory allows for richer, more varied and more detailed patterns to represent your 3D illusion. Just replace every instance of AGP above with the word 3D to explain what 3D hardware should be delivering. We do not need frame rates much beyond 15 fps for smooth play... there are more important things to address for 3D realism once the framerate is above a certain level.
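To put some numbers on that memory appetite, here is a back-of-the-envelope sketch... the texture counts and sizes are my own illustrative figures, not from any vendor spec, but they show how quickly "wallpaper" eats a card's local memory:

```python
# Back-of-the-envelope texture memory budget (illustrative numbers only).

def texture_bytes(width, height, bytes_per_texel, mipmapped=True):
    """Bytes for one texture; a full mip chain adds roughly 1/3 extra."""
    base = width * height * bytes_per_texel
    return (base + base // 3) if mipmapped else base

# A hypothetical circa-1997 scene: fifty 256x256 textures at 16 bits/texel.
scene = 50 * texture_bytes(256, 256, 2)
print(round(scene / (1024 * 1024), 1), "MB")  # 8.3 MB -- beyond a 4 MB card
```

Even this modest scene overflows the local memory of most mid-90s cards, which is exactly the pressure AGP's main-memory texturing was meant to relieve.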

2.0 3D Background and Terminology

With this as a backdrop, I'd like to give you some links which cover the above ground and allow you to explore it on your own.

Most of you have heard of or perhaps seen DOOM... the father of the genre of first-person games. Blocky low-resolution images on a 486 CPU took a backseat to the concept of being in a virtual world, interacting with one another. To make this work, a virtual description of the world is coded into the program to describe objects. These objects are then interpreted by the 3D program based upon their surroundings (lighting and shadows) and rendered onto the flat computer screen (2D) based upon their perspective/relative position away from an imaginary viewer, which happens to be you. When you record a scene with a camcorder, the objects are real and the image captured by your camera (2D) behaves according to the physics of
  1. object position
  2. relative object position
  3. transparency
  4. surface properties
  5. environment
Doom addressed 1] and 2] and downplayed the roles of 3], 4] and 5]. With the Pentium processor, new video chipsets addressed the last three areas:
 
Property             3D Hardware
position             perspective correction, bilinear filtering
relative position    z-buffer
transparency         lens effects, alpha blending
surface properties   shading
environment          fogging
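The perspective mapping and z-buffer bookkeeping listed above can be sketched in a few lines... this is a toy model of my own to show the idea, not any card's actual pipeline:

```python
# Toy perspective projection plus z-buffer depth test (illustrative only).

def project(x, y, z, focal=1.0):
    """Map a 3D point to 2D screen coordinates via the perspective divide."""
    return (focal * x / z, focal * y / z)

def plot(zbuffer, px, py, z, color, framebuffer):
    """Draw the pixel only if it is nearer than what is already there."""
    if z < zbuffer[(px, py)]:        # closer objects win (relative position)
        zbuffer[(px, py)] = z
        framebuffer[(px, py)] = color

zbuf = {(0, 0): float("inf")}
fb = {}
plot(zbuf, 0, 0, 5.0, "red", fb)     # far object drawn first
plot(zbuf, 0, 0, 2.0, "blue", fb)    # nearer object overwrites it
print(fb[(0, 0)])                    # blue
```

Note how draw order stops mattering once the z-buffer arbitrates per pixel... that is precisely the bookkeeping Doom-era software renderers avoided and dedicated hardware made cheap.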

3.0 The need for 3D Standards

Microsoft Windows performance used to be hampered by the CPU because it had a lot of information and work to handle. With the advent of the "accelerated" chip, the CPU was no longer the bottleneck because its task was simplified to issuing higher-level and less data-intensive calls. Prior to Windows, each and every DOS application had to be patched to take advantage of the hardware features. Software development was a little less hectic then, and companies such as ATI hired people to do these sorts of things, called native ports. With games, this is not so easy, and manufacturers do these things co-operatively with game developers. The times they are a-changin'...

As the market matured and applications started writing to standard application interfaces (i.e. Windows), graphics manufacturers only had to map the Windows interface onto their hardware... the software stayed the same (regardless of the hardware) and a level playing field was established. Video graphics cards distinguished themselves through a combination of "marrying driver expertise to hardware features". Winstone and Winbench test this marriage. One piece of hardware may actually be superior to another but be hampered by poor drivers... which begs the question: should a benchmark be "truly" driver independent? It all depends on your audience. If what you want are practical, low-cost, "green earth" solutions, then it is to everybody's advantage to have a level playing field that starts with hardware and low-level drivers and demands the existence of defacto high-level application 3D interfaces. Otherwise all this 3D activity is targeted towards niche techno-geeks like myself, who post benchmark results for native ports of applications such as VQuake so that other techno-geeks can have some semblance of community. AcK!

3.1 Lack of standards and Native Ports

Popular games such as MechWarrior and Quake are ported to specific video cards until the playing field is made even by standard interfaces such as Direct3D and OpenGL. At present, the defacto platform for games is DOS, as gamers are a different breed from the business types that embraced Microsoft Windows. Gamers are sophisticated and opt for immersion over neat icons.

It is unfair to compare a special native port of a game on one piece of hardware to gauge "intrinsic" hardware performance... however, it does allow the industry to compete in more ways for the attention of gamers, and those who succeed at it should be applauded for upping the software ante. Believe me, if the market wants the fragmentation created by "special ports", other hardware vendors will come onside (as they have with MechWarrior) and new non-defacto standards will begin to emerge. That is, the standard will be to have standard ports. Confused?

The defacto standard practice in the 3D industry amongst the big guns is to port MechWarrior over to specific 3D hardware. Maybe we should use MechWarrior (circa 1996) as a benchmark :) I know of some well-respected hardware reviewers/webmasters arguing for native ports of Quake/GLQuake (circa 1997), and in 1998 someone big is urging us to use Turok as a benchmark... whatever happened to MS Monster Truck Madness? This reflects the continued growth of games performance being tied to the identity of a PC, which has led 3D to garner the "heart and mind" of the graphics card industry. So if the person coding the port had a "bad" day while doing the "Creative" port for the nth time... guess who does and who doesn't benefit. Just don't expect to see any useful benchmarks until APIs are figured out.

3.2 Direct3D...an ill-birthed Standard Interface

3D standard interfaces are designed to minimize the amount of software required to support the 3D features of a given piece of 3D hardware. As in any design, bad decisions are soon flushed out as the design gets exercised. Direct3D was a badly documented, complicated design whose flaws in portability were masked. The concept of hardware shortcomings being adaptively addressed by software gave control to the game developer, but also the responsibility of providing this capability. In Microsoft's haste to get hardware vendors on their side, the lack of a baseline 3D hardware platform meant that developing an application, such as a game, was a moving target. The only recourse was that software could, given enough time, be tested on enough platforms to ensure that a game ran on most hardware.

3.2.1 Compatibility Bits

Things are not entirely "rosy" in Direct3D Land in terms of application developers being able to write their application once, due to Microsoft's ill-conceived concept of compatibility bits, as pointed out in Game Developer March '97 and quoted below:

Alex St. John conceived DirectX/Direct3D and was recently fired from Microsoft for politicking incorrectly on behalf of Direct3D. The gist of it is that John got it right and Alex conceded... his firing is the first of many manoeuvres by Microsoft to keep a unified voice internally and, I guess, externally. The following is an excerpt from the folks at Ziff-Davis that discusses some of the application and benchmarking issues/weaknesses/traps inherent in the Direct3D implementation: It is not ethical for hardware vendors to benchmark their products under different conditions from their competition to inflate performance specs and exploit the unsuspecting 3D customer. Misrepresenting hardware capability effectively caused 3D benchmarks like Winbench to give products with non-implemented 3D capability the same score as products with it correctly implemented, if they both had the same framerate. The effect is two-fold, as the misreporting of bits also resulted in a faster framerate due to the simpler processing. These sorts of games happened in the 2D days and are happening again. Caveat Emptor.
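To make the capability-bit problem concrete, here is a toy sketch. The flag names are invented for illustration and do not match real Direct3D caps structures... the point is only how a driver that lies about a feature bit gets benchmarked as if the feature were there:

```python
# Hypothetical capability bits (names invented for illustration;
# real Direct3D caps flags are different).
CAP_PERSPECTIVE = 1 << 0
CAP_BILINEAR    = 1 << 1
CAP_FOG         = 1 << 2

def supports(caps, feature):
    """What a benchmark sees: whatever the driver chooses to report."""
    return bool(caps & feature)

honest_card  = CAP_PERSPECTIVE | CAP_BILINEAR
shady_driver = CAP_PERSPECTIVE | CAP_BILINEAR | CAP_FOG  # claims fog it lacks

# A benchmark that trusts the reported bits scores both cards on "fog" --
# one doing the work, the other skipping it and posting faster frames.
print(supports(shady_driver, CAP_FOG))   # True, even though the silicon can't
```

A benchmark that only reads the bits and counts frames cannot tell these two apart, which is exactly the two-fold effect described above.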

3.3 OpenGL...a better 3D interface

John Carmack has stirred the pot and should make things interesting for 3D by throwing his support behind a "real visualization" API versus a "gaming texturing" API, in the form of OpenGL... the very thing that made the likes of Terminator 2 possible.

OpenGL has been the workhorse standard in the workstation world for years. It is robust and, yes... it runs on Linux (more recent link here). John Carmack's (of DOOM fame) offering of an OpenGL version of Quake, and his views on 3D programming, have forced the industry to clarify the tradeoff between an API's ease of use and its speed. Card vendors, I think, will need to support both Direct3D and OpenGL. Please see the PeAK OpenGL pages located here.

Currently, most chip manufacturers, including ATI, keep the 3D aspects of their chips under close wraps for proprietary reasons. It is not until open and defacto standard application interfaces emerge, such as Direct3D or OpenGL, that a level playing field will exist for chip and card vendors to distinguish their products to the guy on the street by marrying drivers for these common interfaces to their hardware.

4.0 Fourth Generation 3D

Multimedia/convergence of DVD, HDTV, TV-out, TV-in, and broadcast TV support are important issues to consider for a video graphics card, but I will stick to 3D for now. The quest for more realism/immersion in 3D graphics and visuals, without incurring a significant response lag, has demanded the following features:

  1. Higher resolutions: 24 or 32 bit Z-buffer, larger local memories (16-32 MB), double/triple buffering
  2. Better/More filtering: trilinear filtering
  3. Better throughput: pixel caches, 1.6 GByte/second memory bandwidth, 128-bit wide internal buses, 128-bit wide external buses.
  4. Workstation class OpenGL Graphics: Mature OpenGL support for workstation class applications.
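Some rough arithmetic shows why throughput numbers like 1.6 GByte/second are not marketing fluff. The overdraw factor and per-pixel byte counts below are my own assumptions for illustration, not measured figures:

```python
# Rough frame-buffer traffic estimate (illustrative arithmetic only).

def traffic_mb_per_sec(width, height, fps, color_bytes=4, z_bytes=4,
                       overdraw=2.0):
    """Color write plus z read/write per pixel, scaled by overdraw."""
    per_pixel = color_bytes + 2 * z_bytes        # write color, read+write z
    per_frame = width * height * per_pixel * overdraw
    return per_frame * fps / (1024 * 1024)

print(round(traffic_mb_per_sec(1024, 768, 30)), "MB/s")  # 540 MB/s
```

And that is before a single texel is fetched... add texturing traffic and the appetite for 1.6 GB/s buses and 128-bit paths becomes obvious.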

4.1 Trilinear Filtering on the Rage 128
A good synopsis of all these 3D terms can be found at the 3Dimensional 128 site. There have been many threads going around about "trilinear filtering" on the Rage 128, the throughput penalty for enabling it, and "kluges" by other companies to get enhanced filtering using a technique called LOD dithering to approximate it... the problem is that this technique is passed off as "trilinear filtering". Here are the two best responses that I have seen. They are by the "G" brothers, Geoffrey and Dan, who can be found in many of the 3D forums cutting to the chase on 3D issues.

Date: Thu Dec 17'98 - 1:02pm
Author: Geoffrey G. (gbgitch@southwind.net)
Subject: Nope. (in response to "Re: Urban Legend?", posted by Dave Steele on Thu Dec 17'98 - 7:21am)

Date: Thu Dec 17'98 - 3:21pm
Author: Dan G. (dangitch@southwind.net)
Subject: Re: Nope. (in response to "Nope.", posted by Geoffrey G. on Thu Dec 17'98 - 1:02pm)
I think the thing I will take away from this is that LOD dithering is a way of getting an alternate, and possibly better, visual with some known tradeoffs. The problem is that it is passed off as something else. Caveat Emptor.
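For the curious, here is a toy sketch of what bilinear and trilinear filtering actually compute... single-channel texels of my own invention (real hardware works on RGBA and handles texture edges), but the blending structure is the point:

```python
# Sketch of bilinear vs trilinear filtering (toy single-channel texels).

def lerp(a, b, t):
    return a + (b - a) * t

def bilinear(tex, u, v):
    """Weighted average of the four texels surrounding (u, v)."""
    x0, y0 = int(u), int(v)
    fx, fy = u - x0, v - y0
    top = lerp(tex[y0][x0], tex[y0][x0 + 1], fx)
    bot = lerp(tex[y0 + 1][x0], tex[y0 + 1][x0 + 1], fx)
    return lerp(top, bot, fy)

def trilinear(mip_near, mip_far, u, v, level_frac):
    """Bilinear sample from two adjacent mip levels, then blend between
    them.  LOD dithering instead picks just ONE level per pixel in a
    dither pattern -- cheaper, but not the same computation."""
    return lerp(bilinear(mip_near, u, v),
                bilinear(mip_far, u / 2, v / 2), level_frac)

mip0 = [[0, 10], [10, 20]]        # tiny toy mip levels
mip1 = [[5, 5], [5, 5]]
print(bilinear(mip0, 0.5, 0.5))   # 10.0
```

The second bilinear pass and the inter-level blend are where the throughput penalty comes from, and they are exactly what LOD dithering skips.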
 

5.0 DX8 mini-FAQ


5.1 DirectX History Lesson

There used to be, and still is, a tight binding force in the 3D community known as WinBench.
Through a freak sequence of events, it came to be that hardware reviews were reduced to timedemo runs of the latest hot game, and the lustre of WinBench tarnished.

My prediction is that WinBench will make a return as the "benchmark" to gauge 3D hardware features fairly, in terms of both performance and quality. They have earned the respect of vendors due to an experienced core group of graphics-savvy programmers who are the defacto self-appointed guardians/consumer watchdogs of 3D maturation.
They are one of the few entities capable of creating metrics to help gauge the validity of 3D and separate out the hype in the hardware/driver/application pipeline. Another unique aspect of theirs is the capability to debug/uncover/find "tricks" that are employed by drivers to work around (sometimes illegally) deficiencies in the hardware. Unlike most test sites that stand on the coattails of frame rate counters (found in nearly all games today), ZDNet is capable of distilling and writing sophisticated code to produce dissected, focused views of hardware. These synthetic benchmarks dissect the goods on both the hardware features and the ability of the driver to expose them. WinBench has concentrated on bug fixes for 2001.

  • "Bench Capades"...cheats,
        1. You mean my name itself makes a difference in my report card?
        2. Approximating quality and performance...
        3. Thesis: games are a better indicator of real-world performance than synthetic benchmarks
        4. Blame Canada!!!
            1. ATI's 2-consecutive-Vsync flipping driver (Rage Pro Turbo) serves as Tom's fodder for the demise of WinBench and Tom's Hardware claim to fame. It was Monster Truck Madness, Turok, and now Mercedes-Benz Trucking... WinBench has its place in this world, but when you are far ahead of the pack, you often get overlooked.
        That is it for the history lesson. Better benchmarks can be arrived at by bypassing either "some" of the rendering or a good chunk of it (stuttering, anyone?). Today, many have commented on lower benchmark scores with the Radeon but better game play with smoother framerates. I hope some of the cheats in synthetic benchmarks and real-world games give you an understanding of why this can be so.

     


    Email: peakrchau@yahoo.ca | HOME



    ...as of May 15 2004