AMD further delays flagship HD 7990 dual-GPU graphics card

#1

AMD further delays flagship HD 7990 dual-GPU graphics card.

AMD seems to have problems competing with Nvidia and has further delayed its upcoming flagship GPU.


Read the full article here: [http://www.myce.com/news/amd-further-delays-flagship-hd-7990-dual-gpu-graphic-card-62636/](http://www.myce.com/news/amd-further-delays-flagship-hd-7990-dual-gpu-graphic-card-62636/)


Please note that reactions from the main site will be synced below.

#2

I have bought about five ATI (now AMD) cards over the last 15 years. I tried an NVIDIA card a decade ago and had nothing but problems, so back it went. I see people in the forums complaining about the lousy AMD drivers, but I have never had any problems. AMD will get my money until one of us dies. I’m currently looking at upgrading from an HD 4850 to an HD 7850.


#3

Foxconn (Hon Hai) should buy AMD.

Hon Hai has cash, enough cash to spend on building new plants for LCD panels and chips.

AMD has brands and technologies.


#4

It’s a pity that this has been delayed, as a flagship launch is normally associated with price drops on the lesser cards and some bargains to be had.

[B]Wombler[/B]


#5

I’m still running an HD 4670. Still going strong.


#6

Is there much to be excited about in PC gaming these days?
Multi-GPU chips will give way to multi-core CPU chips… there’s no real reason to have two physical CPUs…

Of course it’s a natural, if lame, evolution of the dual-graphics-card compromise…
However, I believe that if innovation were truly run by people who give a crap about Moore’s law and all that, we would have transitioned DIRECTLY to multi-core GPU graphics solutions instead of bothering with more physical hardware: build GPUs that are maybe almost twice the physical size, but with multiple cores built in! If you remember, some graphics card GPUs were 2.5" and eventually shrunk down… there’s no reason why we can’t go back to physically bigger chips for now and then shrink the process in bi-yearly revisions.

Lately, this was AMD’s market to lose… I also had an Nvidia card back in the 90s… what I didn’t like was that their “beta” drivers were much more of a hassle than ATI’s (now AMD’s), where end-users had a vested interest in seeing that things went right… consumers were on their side… nobody wants to do development work for Intel… that bloated, greedy company, with its overpriced products and market manipulation (on prices, and found guilty by the EU of using kickbacks to push AMD’s products out of the market).

That said, AMD dropped the ball on CPUs in 2010, so my system is an Intel i5-750 2.66 GHz quad-core… I’m hoping AMD will have a good solution when I’m ready to upgrade, but the way things look, they might not even be around 2-5 years from now, lol…


#7

[QUOTE=tmc8080;2645746]Is there much to be excited about in PC gaming these days?[/QUOTE]

Well, I have a subscription to PC Gamer magazine and I’d have to say, “yes”. There are some great looking games coming out this year and in 2013. One I’m really excited about is Last Light. It’s a post-apocalyptic action survival horror first-person shooter. Screenshots look beautiful.

[QUOTE=tmc8080;2645746]Multi-GPU chips will give way to multi-core CPU chips… there’s no real reason to have two physical CPUs.[/QUOTE]

Don’t you know that many hands make light work?

[QUOTE=tmc8080;2645746]I’m hoping AMD will have a good solution when I’m ready to upgrade, but the way things look, they might not even be around 2-5 years from now, lol…[/QUOTE]

It’ll be a sad day for all of us if that happens. Innovation will go down and prices will go up on both the CPU and GPU fronts. Intel and NVIDIA will bunny-hump our wallets like there’s no tomorrow.


#8

[QUOTE=tmc8080;2645746]

nobody wants to do development work for Intel… that bloated, greedy company, with its overpriced products and market manipulation (on prices, and found guilty by the EU of using kickbacks to push AMD’s products out of the market).

[/QUOTE]

Posts like that are what killed AMD. AMD executives depended too much on fanatics, and they were becoming fanatics themselves. CDFreaks was one of the millions of websites full of them from about 2000 to about 2008.


#9

I remember “scene” and hacker groups writing their own drivers and graphics utilities for ATI graphics cards… and once in a blue moon for NVIDIA as well, just so certain games would work correctly… talk about fanatical…

Those Street Fighter games just would NOT be denied!


#10

[QUOTE=tmc8080;2645746]
Multi-GPU chips will give way to multi-core CPU chips… there’s no real reason to have two physical CPUs…[/QUOTE]
Doubtful. GPUs are orders of magnitude faster at massively parallelised stream number crunching than generic CPUs. There’d be a huge performance or picture-quality hit in moving to multi-core CPU rendering, at least in the near term.
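
To make that concrete, here’s a minimal sketch (illustrative only, not tied to any card in this thread) of the same stream operation written as a serial CPU loop and as a CUDA kernel; the per-element independence is exactly what lets a GPU throw thousands of threads at it:

```cpp
#include <cuda_runtime.h>

// CPU version: one core walks the array element by element.
void saxpy_cpu(int n, float a, const float *x, float *y) {
    for (int i = 0; i < n; ++i)
        y[i] = a * x[i] + y[i];
}

// GPU version: each element gets its own lightweight thread.
__global__ void saxpy_gpu(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;                     // ~1M elements, e.g. pixels
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // memory visible to CPU and GPU
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy_gpu<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();                   // wait for the kernel to finish

    cudaFree(x);
    cudaFree(y);
    return 0;
}
```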


#11

Well, a bit of an error… I meant multi-core GPUs, obviously…
This is possible… I mean, if they’re going to have two GPUs running in parallel, why wouldn’t a multi-core GPU running in parallel be MORE efficient… down the line…?

Much like HAMR hard drive technology was four years ago, I’m pretty much saying it’s next-next-gen stuff… not for the very next product series launch… (3-5 years minimum), once they can obviously make something better than the two best GPUs they currently have on a single-chip process. Memory circuits and multi-core parallel chip design are about rendering images that are 2K, 3K, 4K pixels and beyond at 60+ frames/sec… and, for mind-blowing purposes, multi-monitor support for those ultra-resolution screens!
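
For scale, a quick back-of-the-envelope calculation of the pixel throughput those screens demand (the resolution and refresh rate are just the figures mentioned above):

```cpp
#include <stdio.h>

int main() {
    long long w = 3840, h = 2160, fps = 60;  // one 4K monitor at 60 Hz
    long long pixels_per_sec = w * h * fps;  // 3840 * 2160 * 60 ≈ 498 million
    printf("4K @ 60 Hz:       %lld pixels/s\n", pixels_per_sec);
    printf("3x 4K multi-head: %lld pixels/s\n", 3 * pixels_per_sec);
    return 0;
}
```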


#12

[QUOTE=tmc8080;2646136]Well, a bit of an error… I meant multi-core GPUs, obviously…
This is possible… I mean, if they’re going to have two GPUs running in parallel, why wouldn’t a multi-core GPU running in parallel be MORE efficient… down the line…?[/QUOTE]
There’s been a bunch of zealots harping on about video processing on a generic multi-core CPU for years, and then there’s the cross-over with GPGPUs (the AMD E350 series / AMD APU series / Intel Core i3/i5/i7, generations 2-3), where a CPU has a low-power GPU embedded in the design. That embedded GPU is, of course, relatively low-output, and it’s typically bandwidth-limited (due to the shared memory bus) and detail/texture-limited (due to shared memory size).
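
A rough comparison of why that shared bus hurts, using typical figures of the era (dual-channel DDR3-1333 for the integrated case, the HD 7970’s 384-bit GDDR5 for the discrete case; both numbers are assumptions for illustration, not measurements):

```cpp
#include <stdio.h>

int main() {
    // Integrated GPU: shares dual-channel DDR3-1333 with the CPU.
    // 2 channels x 8 bytes per transfer x 1333e6 transfers/s.
    double shared_gbs = 2 * 8 * 1333e6 / 1e9;       // ~21.3 GB/s, minus whatever the CPU uses
    // Discrete card: 384-bit GDDR5 at 5.5 GT/s, all of it for the GPU.
    double discrete_gbs = (384 / 8) * 5.5e9 / 1e9;  // ~264 GB/s
    printf("Shared DDR3 bus: %.1f GB/s\n", shared_gbs);
    printf("Dedicated GDDR5: %.1f GB/s\n", discrete_gbs);
    return 0;
}
```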

On efficiency… dual core is inherently less efficient, due to collisions/delays on shared busways, memory accesses, etc.

But yes, two (multiple) cores are typically better than a single core, on the proviso that the single core isn’t clocked at double the clock rate of the dual cores.
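
That proviso is essentially Amdahl’s law. A tiny sketch, assuming a made-up 90% parallel fraction purely for illustration:

```cpp
#include <stdio.h>

int main() {
    double p = 0.9;  // ASSUMED parallel fraction of the workload (illustrative only)
    // Amdahl's law: speedup = 1 / ((1 - p) + p / cores)
    double two_cores   = 1.0 / ((1.0 - p) + p / 2.0);  // ~1.82x over one core
    double doubled_clk = 2.0;                          // one core at 2x clock, ignoring memory stalls
    printf("2 cores, same clock: %.2fx\n", two_cores);
    printf("1 core, 2x clock:    %.2fx\n", doubled_clk);
    return 0;
}
```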

On the power side, dual cores on a single die are typically more energy-efficient than two separate single cores too :)