Just For Information
Better architecture:
Read this article at Xbitlabs. It clearly shows the weaknesses and strengths of the RV770 compared to the G200. If you are short on time, here are a few highlights:
1- **Fill Rate: 48XX cards can't match GTX2XX** The new Nvidia solutions have a serious advantage over the ATI Radeon HD 4800 family: they have twice as many RBE/ROP units. However, this parameter doesn't really affect gaming performance any more. It may have some influence only when anti-aliasing is enabled at resolutions over 1920x1200. In fact, you shouldn't base any conclusions on this test, because it has little connection with reality and only indicates that the G200 is today's fastest monolithic GPU when it comes to fill rate (see the rough sketch after this list). Gaming performance is affected by many more factors, above all the GPU's computational capacity.
2- **Pixel Shaders and Physics: 48XX cards open a can of whoop-ass on GTX2XX cards**
The Xbitmark test illustrates remarkably well both raw mathematical GPU performance and architectural efficiency. It includes shaders made of pure mathematical instructions as well as shaders with a lot of texture samples.
This is a complete fiasco for the GeForce GTX 200. The Nvidia GeForce GTX 280 lost to the ATI Radeon HD 4870 in every test. Moreover, in some of them, such as Plaid Fabric, 27-Pass Fur and Dynamic Branching, it took a really bad beating. The Nvidia GeForce GTX 260 also lost to the ATI Radeon HD 4850, though not as badly, and even got ahead of its competitor in the NPR test (hatch, 8 textures).
So, the major weakness of the new Nvidia G200 stands revealed: its computational unit running at 1 GHz+ doesn't always bring victory. Remember that the computational processors are what determine performance in contemporary games, which use a lot of shader effects with numerous complex calculations. And it is very hard to compete with the new ATI solutions and their 800 ALUs.
3- **Geometry Performance: G200 can't match RV770** Overall, the Nvidia GeForce GTX 200 family cannot boast much in the geometry benchmarks. In most cases the G200 demonstrated considerably lower potential than the ATI RV770 GPU, despite its gigantic chip size and unprecedented transistor count. The new Nvidia GPU is evidently not well balanced: with a die size approaching 600 sq. mm, the G200 spends silicon on extremely large texturing and raster blocks at the expense of computational capacity.
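To put the fill-rate point from item 1 in perspective, here is a minimal back-of-envelope sketch in Python. The ROP counts and core clocks are the commonly published launch specs for the reference GTX 280 and HD 4870, quoted from memory rather than from the Xbitlabs article, and the results are theoretical peaks, not measured performance.

```python
# Theoretical pixel fill rate = ROP count x core clock.
# Specs below are the commonly quoted launch figures (assumed, approximate).
specs = {
    "GeForce GTX 280": {"rops": 32, "core_mhz": 602},
    "Radeon HD 4870":  {"rops": 16, "core_mhz": 750},
}

for name, s in specs.items():
    gpixels = s["rops"] * s["core_mhz"] / 1000.0  # billions of pixels per second
    print(f"{name}: ~{gpixels:.1f} GPixel/s theoretical fill rate")
```

That roughly 60% paper advantage is all the synthetic fill-rate test measures, and per the article it only starts to matter with anti-aliasing at very high resolutions.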
Now, the problem with synthetic benchmarks is that they don't matter much to many people, who will simply look at how the cards perform in games. Still, these tests can predict how long a card will last: with most games becoming more shader-intensive and leaning on computational power rather than pure fill rate, the HD 48XX cards are the better bet if you don't want to replace your card after six months. The rough throughput comparison below makes that gap concrete.
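Here is the same kind of sketch for peak shader throughput, the quantity that actually decides performance in shader-heavy games. Again, the ALU counts, shader clocks and per-clock FLOP assumptions (GT200 credited with its MAD+MUL dual issue) are the commonly quoted launch specs, not figures from the article, so treat them as approximate.

```python
# Peak programmable shading = ALUs x shader clock x FLOPs per ALU per clock.
# Specs and FLOP counting are assumptions based on commonly quoted launch figures.
specs = {
    "GeForce GTX 280": {"alus": 240, "shader_mhz": 1296, "flops_per_clock": 3},  # MAD + MUL
    "Radeon HD 4870":  {"alus": 800, "shader_mhz": 750,  "flops_per_clock": 2},  # MAD
}

for name, s in specs.items():
    gflops = s["alus"] * s["shader_mhz"] * s["flops_per_clock"] / 1000.0
    print(f"{name}: ~{gflops:.0f} GFLOPS peak shader throughput")
```

Even with the optimistic FLOP counting for GT200, the HD 4870 comes out roughly 30% ahead on paper, which is consistent with the Xbitmark results above.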
We don't want decoding… Where is this coming from? News yes, facts yes. Fanboyism no…
I never said we don't want decoding. I, speaking for myself, want it badly. VC-1 decoding is still broken on Nvidia cards, and that's a shame because it is such a basic thing, just like idle power usage is broken on the 48XX cards. In that same article, Xbitlabs wrote that the G2XX has incomplete support for VC-1 decoding. Pure facts. No fanboyism.
COH is one of the better benchmarks out there. This is from personal experience with 30+ game benchmarks and 100+ cards. Why don't you shut it, instead of voicing your so-called expert opinion about game titles?
For you it may be. But this is one game where Nvidia cards perform disproportionately well. It's like Call of Juarez, a game where the 38XX series cards pull ahead of Nvidia cards even though we know the overall situation is completely different.
NVIDIA getting more developers for TWIMTBP than ATi gets for GITG is just better marketing and brand value, and NV should be praised for that.
Better PR and marketing really does work.
You are absolutely right, and I used to believe that too. But after the DX 10.1 fiasco I changed my mind. A company has some responsibility towards its consumers, and deliberately holding back technology, even when it can bring them tangible benefits, is not the way to serve them. AMD introduced DX 10.1 with the 38XX cards, well before the G200 cards came to market, so Nvidia had plenty of time to work on support for this API. Why did Nvidia not implement support for the new API? As AT wrote, the current Nvidia cards, all derived from the G80 architecture, probably have crippled support for DX 10.1.
Instead of fixing that support and bringing its consumers better technology, the company has since relied on arm-twisting game developers and blatant PR lies.
I have nothing against Nvidia, but the G2XX has been a complete fiasco for them. And that is not fanboyism but the truth.