Nvidia pissed off by the Radeon 2


#1

I just posted the article Nvidia pissed off by the Radeon 2.

Sources say that Nvidia is already shaking in its pants, so they will be releasing a GeForce 3 MX, Ultra, Ultra Advanced and a GeForce 4.

Ultra specs:

GPU clock of 300 MHz (Ultra and…

Read the full article here:  [http://www.cdfreaks.com/news/2320-Nvidia-pissed-of-the-Radeon-2.html](http://www.cdfreaks.com/news/2320-Nvidia-pissed-of-the-Radeon-2.html)

Feel free to add your comments below. 

Please note that reactions from the main site will be synced below.

#2

Nvidia is working on a solution for their Radeon 2 problem??? Is this correct? Shouldn’t it be ATI? :+


#3

Wow! The GeForce 3 is an expensive card, but imagine the pricing of the GeForce 4!!! Question is: should you buy the GeForce 3 or wait for number 4!?


#4

Anyhow, this is good news. I was already thinking about buying a GeForce 3; now I’ll wait. Prices will/must drop now.


#5

Don’t worry, Nvidia will win out in the end. ATI is just having a go at obtaining a monopoly of the graphics card arena. The best thing would be for Nvidia to do a hostile takeover of ATI and close the plant down, first obtaining all the blueprints of their hardware and copyrighting all material so no other company such as ATI (under a new name) can produce such hardware. Mind you, why doesn’t Bill Gates buy both companies out? He would have a monopoly of the graphics market then. There would never be another contender again; that man has so much money he could wipe out the entire production plants of other graphics card designers. :4 Sigh! Another war rages on; the constant battle for domination will never end. :8


#6

ATI is the last company that has a real shot at challenging Nvidia, and hopefully competition breeds innovation.


#7

ATI and Nvidia do not compete directly. ATI’s goal, as stated in the Toronto Star newspaper this morning, is increased market share. ATI wants to have a greater market share than Nvidia in the PC market as well as the laptop market. Nvidia is starting to make graphics chips for laptops to take away from ATI’s substantial market share. Nvidia designs their graphics cards for maximum performance, while ATI aims for high performance with maximum image quality. Radeon cards have better image quality than GeForce cards. ATI has shortened their release cycle from 12-15 months to 6-8 months to compete with Nvidia’s release cycle. The graphics chip market will follow the same path as the CPU market: AMD vs Intel, where the chips do different things well, just as with the Radeon and GeForce graphics cards. There will not be a clear winner on the company side.


#8

This competition is good for everyone, it’s what helps keep the prices in check. With 3dfx out of the game it’s really just ATi and Nvidia at the gate with ATi heavily involved with the OEMs (ATi’s having a majority share of that market with NV chipping away at it) and NV the opposite with the retail. Whether anyone likes it or not ATi is going to be in the game for a while, if only ATi could improve their driver quality with the Radeon then maybe they could take the performance crown. :slight_smile:


#9

Someday all of us gamers have to buy a new graphics card … and no matter how long you wait, it’s always the wrong moment to buy :+ Well … I think GF4 pricing will be around 1500 to 2000 DM …


#10

So renegadestorm, you obviously are an employee of ATI to make such remarks, or are being paid lots of dosh to slag the GeForce chip. A little word of advice: until you can produce a 3D chip that can rival both Nvidia and ATI, I would stay quiet. You have neither the financial backing nor a clue how to design such a golden egg. Take a cold shower :4


#11

BitBoys Oy of Finland have the financial backing and a big mouth when it comes to their technology. They’ve been promising the biggest and the best memory bandwidth technology and ‘3D’ experience since late 1997 (which is nothing new; chip makers have been integrating memory since we’ve had on-die caches), but are yet to even show any signs of actual silicon. Too bad Sony or ArtX (acquired by ATi) don’t release graphics chips for the PC. The Sony GS for PS2 only runs at 150 MHz yet has a super-wide 2560-bit bus, and the ArtX/ATi ‘Flipper’ chip for GameCube runs at 162 MHz with 3 MB of on-chip DRAM; both have enough poly performance to churn out the effects, with the exception of maybe per-pixel shading. Of course both are pushing and being pushed by some very proprietary technology, but it’s still on par or better than some of the latest ‘advances’ in graphics accelerators.


#12

It’s disturbing when you see so many people with their mouths firmly attached to Nvidia. The simple fact is their cards may be the best, but they are also the most expensive, and if they did buy out ATI or Matrox etc., the prices would soar. I buy Matrox for this exact reason: I’m not supporting another monopoly where the consumer loses.


#13

Crudely put, Golga. :r WTG, applying economic pressure to the market to effect change. However, using the best card should cost the most, if you want the best. Using the second best is usually the best value, if you want value. Using BrandX can make a statement, if you are making a statement. One costs more money, the next costs in less performance, the last costs in both. Go figure, eh? Just how much price and performance is a statement worth? It’s a totally subjective value; who knows?


#14

The bad thing about ATI is its drivers. If any of you have looked at the new games that come out, it takes a while for ATI patches to appear. The GeForce series is the fastest line of video cards out there right now. Furthermore, as expensive as they may seem, in 3-6 months they become very, very cheap (almost half price). I’ve also heard that Nvidia is moving away from the GeForce series and working on another video card (I forget the code name, not GeForce), coming out in approximately 2003: 1 GHz GPU, 256 or 512 DDR RAM, and many other benefits. I’m not sure if ATI will be able to compete against Nvidia’s speedy graphics cards. Most people look for speed first and image quality second (although Nvidia is not that far behind in image quality).