It's allllmost happening: sometime tonight or tomorrow the NDA on the Half-Life 2 benchmarks lifts, and then we'll see all the sites publish their own numbers. In the meantime, though, plenty has already happened.

http://www.anandtech.com/showdoc.html?i=1862

“You'll see my own numbers tomorrow night at midnight, but we've been given the go-ahead to reveal a bit of information about Half-Life 2. I'll keep it brief and to the point and will explain it in greater detail tomorrow night:

- Valve is pissed at all of the benchmarking “optimizations” they've seen in the hardware community;
- Half-Life 2 has a special NV3x codepath that was necessary to make NVIDIA's architecture perform reasonably under the game;
- Valve recommends running GeForce FX 5200 and 5600 cards in DX8 mode in order to get playable frame rates;
- even with the special NV3x codepath, ATI is the clear performance leader under Half-Life 2, with the Radeon 9800 Pro hitting around 60 fps at 10x7. The 5900 Ultra is noticeably slower with the special codepath and is horrendously slower under the default DX9 codepath;
- the Radeon 9600 Pro performs very well - it is a good competitor to the 5900 Ultra;
- ATI didn't need these special optimizations to perform well and Valve insists that they have not optimized the game specifically for any vendor.”

http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/001.htm

“In his speech, Gabe outlined several key points of both personal and professional frustration:

Valve takes serious issue with “optimizations” from NVIDIA as of late

In no uncertain terms, Valve is highly disappointed with current NV3x hardware as a high-performance DX9 accelerator

Valve will have Half-Life 2 treat NV3x hardware as DX8.1 hardware for the default settings.

According to Gabe, the rumors and speculation of ATI paying them off is nothing but bull - he said Valve's top priority has everything to do with them wanting HL2 players to have the best experience. After doing some early-on benchmarking between NVIDIA and ATI, the choice was clear from Valve's standpoint of who to partner with.

Microsoft's DirectX team was on-hand to give full blessing to Valve's upcoming HL2 benchmark - in fact it's being referred to as the most complete DX9 benchmark to date.”

http://www.gamersdepot.com/interviews/gabe/001.htm

“GD: Why DirectX9 over OpenGL 2.0?

Gabe: OpenGL 2.0 isn't as mature as DX9.

GD: What's your relationship with NVIDIA been like in light of all the recent ATI press over HL2?

Gabe: Valve and NVIDIA both know that we have a lot of shared customers, and we've invested a lot more time optimizing that rendering path to ensure the best experience for the most customers.”

http://techreport.com/etc/2003q3/valve/index.x?pg=1

“NVIDIA's NV3x-derived chips are way off the pace set by the ATI DirectX 9-class cards. The low-end GeForce FX 5200 Ultra and mid-range GeForce FX 5600 Ultra are wholly unplayable. The high-end GeForce FX 5900 Ultra ekes out just over 30 fps, well behind ATI's mid-range Radeon 9600 Pro (and yes, that score is from a 9600 Pro, not a stock 9600—the slide was mis-labeled). The Radeon 9800 Pro is twice the speed of the GeForce FX 5900 Ultra.

However, NVIDIA has claimed the NV3x architecture would benefit greatly from properly optimized code, so Newell detailed Valve's sojourn down that path. The company developed a special codepath for the NV3x chips, distinct from its general DirectX codepath, which included everything from partial-precision hints (telling the chip to use 16-bit floating-point precision rather than the default 32-bit in calculating pixel shader programs) to hand-optimized pixel shader code.

The “mixed mode” NV3x codepath yielded mixed results, with a fairly decent performance gain on the FX 5900 Ultra, but not near enough of a boost on the FX 5200 Ultra and FX 5600 Ultra.

Oddly enough, even using the DX8 codepath, the previous-generation GeForce Ti 4600 outperformed the brand-new GeForce FX 5600 Ultra.”
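The partial-precision hints Newell describes amount to letting the chip run pixel shader math in 16-bit instead of 32-bit floating point, trading accuracy for speed. As a rough numerical illustration (plain Python, not shader code; the constant 0.1 is just an arbitrary example value) of how much accuracy a 16-bit float gives up:

```python
import struct

def roundtrip(value: float, fmt: str) -> float:
    """Round a Python double to an IEEE 754 format and back.
    fmt 'e' = half precision (fp16), fmt 'f' = single precision (fp32)."""
    return struct.unpack('<' + fmt, struct.pack('<' + fmt, value))[0]

# 0.1 is not exactly representable in either format, but fp16's
# 10-bit mantissa loses several more decimal digits than fp32's 23 bits.
fp32 = roundtrip(0.1, 'f')
fp16 = roundtrip(0.1, 'e')

print(f"fp32: {fp32!r}  error: {abs(fp32 - 0.1):.2e}")
print(f"fp16: {fp16!r}  error: {abs(fp16 - 0.1):.2e}")
```

The gap in rounding error is why the hints can help NV3x performance without visibly breaking every shader, yet still degrade ones that accumulate many operations.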

http://firingsquad.gamers.com/hardware/hl2_performance_preview_part1/

“While showing a very significant increase in frame rates for the FX5900 card, the 5600 and 5200 Ultra get only a slight performance boost. Remember, on the slide on page 1, the FX5900 at full DX9 clocked in at only about 31 FPS. While it is all well that NVIDIA's performance has increased, these performance gains will be moot as new DX9 functionality will allow for fewer partial-precision functions.

So how much has Valve done to please its NVIDIA-based video-card-owning customers? It has spent five times as much time optimizing the NV3X path as it has the DX9 path. Valve themselves were alarmed at the performance difference and went further to say that ATI did not need such specific optimizations performed.

The easy thing for Valve to do (and it saves lots of time) is to treat NV3X as DX8 hardware, meaning it's up to us gamers to turn on DX9 for the game ourselves. Also, by doing so, running DX8 on a GeForce FX 5200/5600 will give playable framerates. Another downside of having the mixed mode equivalent (two different optimizations) for any single title is that future developers using this engine will have to tailor their code to both paths, which means more budget required, which some studios do not have.

This is NVIDIA's official statement: “The optimizations for Half-Life 2 shaders are in the 50 series of drivers which we made available to reviewers on Monday [Sept. 8, 2003]. Any Half-Life 2 comparison based on the 45 series driver is invalid. NVIDIA 50 series of drivers will be available well before the release of Half-Life 2.””

And finally, an article comparing nVidia and ATI in DirectX 9 games:
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm

So it seems none of this is entirely settled yet, since new drivers are still to come from nVidia; it will all become clearer over the next few days.