Sony may increase PS3 power with external processor

[newsimage]http://static.rankone.nl/images_posts/2010/12/8XJfYI.jpg[/newsimage]A patent recently filed by Sony Computer Entertainment Inc. of Tokyo, Japan, shows that the company is considering releasing an external processing component to give the system a power boost, helping it keep up with the demands of increasingly sophisticated software.


Read the full article here: [http://www.myce.com/news/sony-may-increase-ps3-power-with-external-processor-37661/](http://www.myce.com/news/sony-may-increase-ps3-power-with-external-processor-37661/)


Please note that the reactions from the complete site will be synced below.

Reminds me of the old days

The only ports on the PS3 Slim are two USB 2.0 ports. I have to imagine they'll put out another revision of the console that allows expansion.

yeah, it’s like the Sega 32X!

[QUOTE=Blu-rayFreak;2563096]yeah, it’s like the Sega 32X![/QUOTE] You have gotta see this. :bigsmile:

[QUOTE=Mr. Belvedere;2563101]You have gotta see this. :bigsmile:[/QUOTE]
OMG, friggin hilarious!

[QUOTE=Mr. Belvedere;2563101]You have gotta see this. :bigsmile:[/QUOTE]

Thanks for sharing, I was LMAO!

Seriously? WTF?

I guess they could implement this via the SATA controller for the HDD & have the HDD share it … or something similar …

Implementing a multiprocessor through a USB 2.0 port would be … laughable :iagree:

http://en.wikipedia.org/wiki/File:Mega_Drive_II_(PAL)_%2B_Mega-CD_II_(PAL)_%2B_32X_(PAL).jpg

How about …

  • PS3 programming experts dropping into games houses to explain how to program the damn Cell processor properly, rather than programming it as a separate CPU & GPU and relying on the cruddy video card … five years after the GPU was supposed to be retired!!! Seriously … the GPU was included to ease the initial transition to proper Cell programming, but games development houses are still writing games with the GPU as the primary graphics processor!

  • Sony developing/implementing a better programming kit that automatically converts code written for a vanilla CPU/GPU into code suited to the Cell processor, with physics offloaded to the cruddy video card.

If you read the summary on the 11th page of http://www.freepatentsonline.com/20100312969.pdf, you'll see it talks about a reconfigurable interface. I'm betting they're talking about the gigabit Ethernet port. The gigabit part is good for local networks only, as residential internet speeds rarely truly go above 12 Mbps. So I would guess it would be reconfigured to some proprietary interface that could handle both high-speed data and networking, and the Ethernet connection would probably be moved off to the add-on device. But the summary also talks about the potential for the interface to span multiple ports.

Just my observation and thoughts on it,

Eric

[QUOTE=icefloe01;2563593]If you read the summary on the 11th page of http://www.freepatentsonline.com/20100312969.pdf, you'll see it talks about a reconfigurable interface. I'm betting they're talking about the gigabit Ethernet port. The gigabit part is good for local networks only, as residential internet speeds rarely truly go above 12 Mbps. So I would guess it would be reconfigured to some proprietary interface that could handle both high-speed data and networking, and the Ethernet connection would probably be moved off to the add-on device. But the summary also talks about the potential for the interface to span multiple ports.

Just my observation and thoughts on it,

Eric[/QUOTE]
Transmitting data at gigabit speed to another processor?
Lol!
Ethernet is really inefficient, and the delays (OMFG) … Basing processor offloading on a gigabit Ethernet port that, if you're lucky, might get up to 400 Mb/s on a good day (depending on the network chip and interface) is not really conducive to a real-time co-processor.

Gigabit Ethernet isn't really gigabit, and gigabit is not as fast as everyone thinks … it's just that common internet speeds of 4-30 Mbps are pitifully slow :slight_smile:
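To be fair, the framing overhead on the wire itself is small; the real losses come from the TCP/IP stack, the NIC, and the bus. A quick back-of-the-envelope check (assuming standard 1500-byte frames, no jumbo frames):

```python
# Theoretical payload efficiency of gigabit Ethernet framing.
# Per 1500-byte payload: 8 B preamble/SFD + 14 B header + 4 B FCS
# + 12 B inter-frame gap = 38 B of on-the-wire overhead.
PAYLOAD = 1500
OVERHEAD = 8 + 14 + 4 + 12
LINE_RATE_MBPS = 1000

efficiency = PAYLOAD / (PAYLOAD + OVERHEAD)
print(f"framing efficiency: {efficiency:.1%}")          # ~97.5%
print(f"max Ethernet goodput: {LINE_RATE_MBPS * efficiency:.0f} Mb/s")  # ~975 Mb/s

# Subtract IP (20 B) and TCP (20 B) headers for application-level goodput:
app = LINE_RATE_MBPS * (PAYLOAD - 40) / (PAYLOAD + OVERHEAD)
print(f"max TCP goodput: {app:.0f} Mb/s")               # ~949 Mb/s
```

So a decent setup could in theory push ~950 Mb/s of payload; the ~400 Mb/s real-world figure is down to the network chip and software overhead, which is exactly why it's a poor link for a latency-sensitive co-processor.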

The SATA port is configured to run at 1.5 Gb/s consistently and will manage a reliable 100 MB/s (800 Mb/s).

The way I see it, they can use both USB ports (2x480 Mb/s, ~480 Mb/s reliably), the SATA port (1500 Mb/s, ~1000 Mb/s reliably) and the Ethernet port (1000 Mb/s, ~400 Mb/s reliably) for a total of about 1880 Mb/s, and still end up without enough processing power for new games, because the Cell CPU is busy coordinating communications with the new co-processor through the various ports …
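The tally of those "reliable" per-port figures (my rough estimates, not measurements) works out like this:

```python
# Sum of the estimated "reliable" throughput per port (Mb/s).
# These figures are the rough estimates from the post above,
# not measured numbers.
reliable_mbps = {
    "USB 2.0 (both ports combined)": 480,
    "SATA (1.5 Gb/s link)": 1000,
    "Gigabit Ethernet": 400,
}
total = sum(reliable_mbps.values())
print(f"total: {total} Mb/s (~{total // 8} MB/s)")  # total: 1880 Mb/s (~235 MB/s)
```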

It will be interesting to see, regardless.

[QUOTE=debro;2563667]Transmitting data at gigabit speed to another processor?
Lol!
Ethernet is really inefficient, and the delays (OMFG) … Basing processor offloading on a gigabit Ethernet port that, if you're lucky, might get up to 400 Mb/s on a good day (depending on the network chip and interface) is not really conducive to a real-time co-processor.

Gigabit Ethernet isn't really gigabit, and gigabit is not as fast as everyone thinks … it's just that common internet speeds of 4-30 Mbps are pitifully slow :slight_smile:

The SATA port is configured to run at 1.5 Gb/s consistently and will manage a reliable 100 MB/s (800 Mb/s).

The way I see it, they can use both USB ports (2x480 Mb/s, ~480 Mb/s reliably), the SATA port (1500 Mb/s, ~1000 Mb/s reliably) and the Ethernet port (1000 Mb/s, ~400 Mb/s reliably) for a total of about 1880 Mb/s, and still end up without enough processing power for new games, because the Cell CPU is busy coordinating communications with the new co-processor through the various ports …

It will be interesting to see, regardless.[/QUOTE]

The key word, though, is reconfigurable: the hardware they have in place may be capable of much higher speeds and different communication protocols, especially if they come up with their own. For which, of course, Sony is absolutely notorious. Neither ATRAC nor Betamax (regardless of their superiority) ever went mainstream, and people still complain about Memory Sticks. But that's neither here nor there. The point is, whatever port they use, they're gonna make it do what [I]they[/I] want, regardless of what it was inherently designed for.