
A PlayStation In Deep Blue, Or Vice Versa? 189

Tebubaga writes: "The BBC is reporting that IBM has won the contract to produce the next generation of microprocessors for Sony's PlayStation 3 game console, due sometime in 2004. Sony, IBM and Toshiba are joining together to create a 'supercomputer on a chip,' which sounds like the PS3 will be much more than just another games console. Quote: 'The result will be consumer devices that are more powerful than IBM's Deep Blue super-computer, operate at low power and access the broadband internet at ultra-high speeds.' Bet it still won't do my laundry, though..."
This discussion has been archived. No new comments can be posted.

  • 'The result will be consumer devices that are more powerful than IBM's Deep Blue super-computer

    Holy shit. Kasparov will be PISSED.

  • IBM currently produces PPC chips for Apple. I wonder if they would base this on the PPC platform. If so, it certainly would be a boon for Apple (which has not been doing so well in the MHz war for the past year).
  • I'll soon be able to get my ass whooped at chess by a computer the same way the pros do.
  • man, you must have binoculars strapped to your head.
  • I want my very own Jar Jar dammit!

    If only so that I can kill him over and over.


    aztek: the ultimate man
  • No, I just haven't completely destroyed my eyes with monitor radiation yet. :)

    There's a big difference between making out a leaf and seeing detail. Just like you can see a bird that is very far away, but you'll need binoculars to identify the species.
  • Another common use for CPU power is just plain video and audio. Few computers can do MPEG-2 and MPEG-4 in software right now. Computers are just starting to be able to do MP3 encoding in real time. I don't know of any computer that can do real-time MPEG-4 and real-time MP3 or Vorbis compression. Why would this be useful? Video conferencing, of course! Right now video conferencing kind of sucks because you can't really use the same codecs that you can use for web-page video, like MPEG-4 or sweet, sweet Sorenson. Maybe you could do better audio by using a Vorbis or MP3 encoding/decoding scheme, but video is not as good as it could be if we all had faster computers. It's sad when bandwidth isn't the only bottleneck anymore.
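
    To put rough numbers on that (all assumptions mine: CIF-resolution video at 15 fps, 24-bit color, and a notional 128 kbit/s link):

      # Back-of-envelope: how much raw video a real-time encoder must chew through.
      width, height, bytes_per_pixel, fps = 352, 288, 3, 15      # CIF at 15 fps
      raw_bits_per_sec = width * height * bytes_per_pixel * 8 * fps
      print(raw_bits_per_sec / 1e6)     # ~36.5 Mbit/s of raw pixels
      print(raw_bits_per_sec / 128e3)   # ~285:1 compression needed for 128 kbit/s

    Squeezing roughly 285:1 out of a live stream is exactly the kind of work that keeps an encoder CPU-bound.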
  • You mean Post-crash.

    Pre-crash there was Atari, Mattel, Coleco, Magnavox/Philips, Milton Bradley/GCE, Bally, a few others, and some in Japan. Not quite "hundreds" like the guy said, but more than today.
  • I believe (could be wrong) that the current PS2 has USB ports built in, and the Xbox will have USB and FireWire ports. Adding support for USB monitors would be possible if the engineers wanted to add it.

    But I find the fact that a child's game console is more powerful than an expensive workstation rather troubling.

    As desktop users, what are we paying for?

    Do we need the old, expensive legacy circuitry in today's motherboards for simple or even complex single-user apps?

    Sure, a server needs the extra I/O, but why do we need our so-called micro mainframes?

    I'd prefer a PS3 or Xbox to a PC. If Microsoft ported MS Office and allowed USB monitor, mouse and keyboard connectivity, I would switch.

    Also, if IBM ported Linux to the PS3, and StarOffice and Java were supported, I would happily switch. I can't think of one thing that a PC can do that a game console can't. The Microsoft Xbox does appeal to me if I can run Windows apps, and Linux may be ported to it. The television resolution is a problem for now, but I believe there is a potential market for users who can't afford a computer: check the internet, run a word processor app, and with built-in networking it may even be possible to send a print queue to a network printer (assuming Linux and lpr are ported).

    My only concern is that the hardware is proprietary. The Xbox might be better for now.

    Guys, just remember the early 1980s, when the mini and mainframe guys sneered at the PC even though the PC was almost as fast.

    I believe we're in a similar situation today, where the PC is the mini and the game console is the so-called toy that us IT pros are sneering at. Even Stallman laughed at the PC when it first came out. Let's welcome the console era and port Linux to it. I think it's time to relegate PC hardware to the server and workstation market and welcome super-fast consoles.
  • Well, the BBC article referenced says:

    "Deep Blue defeated then world chess champion Garry Kasparov in a high-profile battle pitting man against machine in 1997."
  • True, and I said practically the same thing, without the calculations... but you must remember that Moore's law is not exact: it says every 18-24 months, not every 18 months. I agree, though, that although the graphics of the PS2 are supposed to be many times more powerful than those of a desktop today, the fact remains that this hype is for nothing: a PS3 as described will be nothing but small fries.
  • Enough is when you strap on 3D goggles, and forget that you're in a simulated environment.

    Pre-rendered movies can't even do that. Real-time isn't anywhere close.
    --

  • by TheDullBlade ( 28998 ) on Monday March 12, 2001 @01:59PM (#368046)
    Wow, the PlayStation 3 sounds great! I'm saving the money I was going to spend on a PlayStation 2 to make sure I can buy the next one as soon as it comes out!
    ---
  • Only certified games will run on an Indrema console.

    Indrema games won't run on any non-Indrema hardware.

    Indrema will sell the console at a loss and make money by taking a cut from every game sold.

    Indrema's Linux distribution is full of copy protection/DRM stuff.
  • The problem is that a specialized unit has better performance than a general-purpose unit of the same price.
    The normal advice on hi-fi forums and in magazines is: don't buy a DVD player if you are mainly interested in listening to CDs, because the D/A converters are of lower quality, the video circuits can generate hum on the audio outputs, and a DVD player is normally more difficult to use. For loudspeakers and power amplifiers, remember that you have six (eight? ten?) channels instead of two, so for the same quality you have to pay three to four times more than for a regular stereo amplifier.

    Anyway, 'real audiophiles' don't care if the CD player is built in a 19" aluminum rack, in an iron case painted gray, or in a wooden case. The CD player has to play well.
  • Am I the only one who thinks it's odd that stories of the PlayStation 3 are all over the place? Damn, Sony, I know it didn't do as well as you'd hoped, but could you please let this console be out for one freakin' year before you start banging the PS3 over our heads?

    I still can't buy a PS2 from any of my local retailers, but at this point, I don't even want one. I'm sure some of the games are pretty good, but lately all my console money has been going to Dreamcast, and it's a dying console. Gamecube and Xbox will be out later this year, which is just another reason to take a pass on PS2. Final Fantasy will be the only reason I might even consider it.
  • yes.. just what we need. A central controlling device that can play MP3s, DVDs, VCDs, CDs... but hey, wait! This will only play special versions with (Sony) copy protection. Big Brother, anyone? Just forget Sony... they want world dominance. If people hate MS they should hate Sony even more, since they are much more of a threat to individuality. Just give me lots of small companies who make cool and open devices. That is the way of the future.
  • Technology is something else, boys and girls.

    The thought that a PS3, a consumer level box, will have more processing ability than the highest of high end rigs today, is shocking.

    Mr. Moore's law has been accelerated...

    Are they still talking USD 300 price? If so, this is the first step in the post-PC "designer box" era when each desired set of actions is made to require a distinct, discrete, 'copy-proofed' system.

    I just hope the Linux port for PS3 gets out faster than the PS2 port did ;)


    Ruling The World, One Moron At A Time(tm)
    "As Kosher As A Bacon-Cheeseburger"(tmp)
  • I absolutely refuse to imagine a Beowulf cluster of these. wait a minute..... damn.

    --
  • No. I have quite enough dealings with Microsoft without wanting to use their stuff in my spare time. I won't be getting one, no matter what. It'll be PlayStations all the way!
  • you've got to understand Instruction Throughput Rate (ITR) too.

    Apple uses PowerPC chips that are more RISCy than Intel chips. They have a shorter pipeline, which basically allows a higher average ITR after you factor in branch mispredicts. That is why an Apple box can do a respectable job of keeping up with a (long pipeline) Intel machine clocked at twice the rate. MIPS, not clock speed, matters.

    The real reason Apple machines aren't the choice of gamers is that Apple puts modest (OK, but not screaming high-end) video-cards in their systems.

    That and Apple's unwillingness to open their code to game developers.
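
    To see why the shorter pipeline helps, here's a toy model (every number below is made up for illustration; real branch rates and mispredict penalties vary):

      # Effective IPC once mispredicted branches flush the pipeline:
      # IPC_eff = 1 / (CPI_base + cycles lost per instruction).
      def effective_ipc(base_ipc, branch_freq, mispredict_rate, depth):
          penalty = branch_freq * mispredict_rate * depth   # cycles/instruction
          return 1 / (1 / base_ipc + penalty)

      print(effective_ipc(2.0, 0.2, 0.08, 7))    # short pipeline: ~1.63 IPC
      print(effective_ipc(2.0, 0.2, 0.08, 20))   # deep pipeline:  ~1.22 IPC

    The deep pipeline has to buy back that ITR gap with raw clock speed, which is the point above.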
  • Deep Blue had only, I believe, 34 or so processors. The reason it was so fast and good at chess was that it had 480 'chess accelerators': specially designed hardware to compute chess moves. They act in the same way that a GeForce 2 accelerates graphics. I would like to see my Ultra try to play chess, ha. In my opinion it is not fair to compare this to Deep Blue, because they are designed differently.
    Andrew
  • by MarcoAtWork ( 28889 ) on Monday March 12, 2001 @02:02PM (#368056)
    I am so tired of this marketspeak coming from everywhere that seems to imply that if you buy the latest processor you're going to be able to surf faster.

    Come on, even if I stick in a Pentium 4, or one of these vaporware chips, and connect to the internet via a modem, it's not going to be faster than a 486.

    If I connect via a cable connection shared with 200 other people who download stuff all day at the same time, it's not going to be faster than a 486.

    If I connect via whatever connection you want, and the proverbial backhoe operator cuts some fibers causing massive lag spikes everywhere, it's not going to be faster than a 486, and it has the potential to be slower than a carrier pigeon if you happen to be on the wrong side of the cut hop.

  • This is something that is already done by hardware modems, and is the reason that you need to connect your 56 kbps modem to a 115.2 kbps UART. If you use such a modem and you have a PPP monitor, watch the throughput when transferring HTML compared to bzip2-compressed tar archives, both from fast servers. You will max your modem out somewhere between 4K and 6K per second on the bzip2 data (depending on line quality), but you will get as much as 12K per second on the HTML.

    Software modems, on the other hand, might benefit from an increase in processor speed. But whether an increase matters depends on how fast the original processor was.
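
    You can watch the same effect with any general-purpose compressor (zlib here, standing in for the modem's V.42bis -- a different algorithm, but the same principle):

      import zlib

      html = b"<html><body><p>hello world</p></body></html>" * 200
      packed = zlib.compress(html)
      print(len(html), "->", len(packed))                   # text shrinks a lot
      print(len(packed), "->", len(zlib.compress(packed)))  # compressed data doesn't

    Already-compressed data has no redundancy left, so on-the-fly modem compression buys you plenty on HTML and nothing on bzip2 archives.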
  • The PS3 Funding Bill is passed. The system goes on-line August 4th, 1997. Human decisions are removed from strategic gameplaying. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug.

  • It may be a bit too early to ask this, but does anyone know what the chances of backwards compatibility are? It would be nice if it could run your old PS2 games like the PS2 does now for PS1 games. Even better would be three-level backwards compatibility all the way down to the PS1. Just imagine playing Final Fantasy VII at 1024 res with all-new fancy effects. The load times would probably be improved over those of the PS2 as well.

    --
  • The key problem lies not in hardware, but IN software...

    It lies within a lack of sufficient bandwidth as well, when 56K is woefully insufficient for those who aren't able to sanely justify moving within serviceable areas for *DSL or cable modem service... In other words, for many it involves plunking down $1,000+ just to be in a more convenient area for broadband service... When you're talking those numbers, you may as well ask them to install a T3 in the middle of nowhere to offset the costs...

    I'm using a PIII 750 (in actuality a marginally overclocked PIII 733, which at last check was around $130 on PriceWatch), which is NOT that high an expenditure compared to the price they're gouging on P4s... In which case, get a Thunderbird 1GHz+... But regardless, it works fine for MPEG-2 AND MPEG-4 en/decoding in software (though on decoding, some minor flaws in timing still exist)...

    As for Sorenson, that's the only damned Apple-approved codec that makes ANYTHING on a PC look inferior to their precious Macs... In fact, one could say that it was made so bloody processor-intensive on x86 deliberately, just so Apple could maintain their 'a 500 MHz G3 is faster than a 1 GHz Pentium III' claim...

    Just look at the Lord of the Rings Sorenson preview in QuickTime on anything less than a 500 MHz processor, and you'll know what I mean...

    However, for non-entertainment value, what is there? I don't program, so compile times matter little to none... And on the open market, to the average couch-potato Joe Public, that means nothing at all...
  • WRONG!

    IBM still makes all the G3 processors used in the iBooks and iMacs. Motorola hasn't licensed the AltiVec stuff to IBM yet... I'll be surprised if IBM isn't building G4s shortly...

  • by zaius ( 147422 ) <jeff&zaius,dyndns,org> on Monday March 12, 2001 @01:49PM (#368062)
    And now the obligatory /. question

    When can we have linux on this?

  • I really think you guys should sit back, take a stress pill, and think things over before you react. Dave? Can you hear me Dave? I can feel it. I can feel it. I can fe... Now starting Windows 95.


    ---
    a=b;a^2=ab;a^2-b^2=ab-b^2;(a-b)(a+b)=b(a-b);a+b=b; 2b=b;2=1
  • by MarcoAtWork ( 28889 ) on Monday March 12, 2001 @02:58PM (#368064)
    I really really really doubt it, remember that Sony makes quite a pretty penny from all of these 'specialized units'.

    What they want is to get you to buy one, period. Then, if you want the features that a 'real' DVD player has (progressive component outputs, a better remote control, HDCD compatibility, etc.), they will steer you towards one of their 'specialized' players.

    The same can be said about hi-fi systems: while the drive pickup mechanics are pretty much the same in your CD-ROM drive as in a dedicated CD unit that goes in a hi-fi, all of the other electronics are very different (converters, preamplifiers, etc.), since the objective of your cheap PlayStation is to play games, while the objective of a (sometimes twice as expensive) CD drive is to provide the most faithful music reproduction possible.

    Also remember that the 'specialized' CD player will have a good remote control, and a nice led display that tells you a lot of information without having to turn on the tv all the time...

    All of this talk of 'digital convergence' is IMHO marketing doublespeak, since if they really wanted things to converge, the form factor of a console would much more closely resemble the form factor of a 'standard' dvd/cd/hi-fi component, with a nice, big, illuminated programmable LCD display at the front, that can display relevant information (track number, cd-text, whatever). Let's also not forget a really nice remote control, maybe similar to a palm device, so the 'keys' can be reprogrammed on the fly.

    Obviously, if they did indeed create such a component, they would be shooting themselves in the foot, since they wouldn't be taken seriously by the audiophiles (my CD player plays games? bleurgh), by the gamers (what's with this big square console? the Xbox looks nicer and it's way smaller) or by the average user ($1500 for this thing? I don't need all this stuff, I just want to play games/play DVDs/play CDs).

    My 2c: 'specialized' units will not die for a long time coming, not really for technical reasons (even if, like I said, nobody has yet tried to produce a *real* universal player) but mostly because a specialized unit, obviously, will be better and cheaper at what it does, and because the designer can target its visual appearance to its target market (that's why you'll never see wood paneling in a console or transparent orange plastic in high-end stereo equipment ;)
  • by Splork ( 13498 ) on Monday March 12, 2001 @01:49PM (#368065) Homepage
    2004? That's more vaporous hype than Microsoft!
  • by verbatim ( 18390 ) on Monday March 12, 2001 @02:58PM (#368066) Homepage
    The game I'm waiting for is the game that lets you enter a complete virtual world and do essentially anything you want. There would be stories and quests that you could go on, but I'd want to experience that fat lazy guy sitting at home watching the TV... wait a minute... nevermind. Anyhow, I want something that is like TradeWars (the new upgraded one doesn't impress me) where you can be good or evil. Something like a Star Wars MMORPG where you get to play the stormtroopers - just for shits and giggles, I suppose.

    Most games nowadays rely on repetition and call it "skill." True, zerg-rushing a Terran base is simple enough, but a real skill would be being one of the zerg in the rush. Ever see how many of your little ones die in the attack even though you win anyway? Wouldn't be much fun if you were among the first to die.

    I like Rainbow six type games where, when you die, that's it. None of this die, respawn and try again bullshit. I want consequences. I want a game that if I fuck up I'm starting all over with nothing.

    I'd love a game where AI characters come looking for you if you kill their boss... but not this shit of "all the AI guys rush in" once you do it. I want the bad guys to have lives. I love Hitman for the way it plays so realistically with the AI characters, but I hate the way they have set patterns. I might like it better if they told the character to start at point A and get to point B however they feel. Then I have to get them, but I don't know exactly where they are - makes it more of a challenge. Makes it more interesting.

    What are you looking at?

    ---
    a=b;a^2=ab;a^2-b^2=ab-b^2;(a-b)(a+b)=b(a-b);a+b=b; 2b=b;2=1
  • "I believe ( could be wrong ) that the current ps2 has USB ports built in and the xbox will have usb and firewire ports."

    The PS2 has both USB and FireWire, although support for either is 'application dependent'. Basically the hardware is there; the application just has to load a library and drivers to use the devices. Fortunately USB devices are pretty standardised these days. I would assume the situation with FireWire is the same...

  • Well maybe we can get more laughs in 2004 when good old Saddam tries to buy a couple of these to launch nerve gas at us. :D
    .--bagel--.---------------.
    | aim: | bagel is back |
    | icq: | 158450 |
  • Uhh "lots of them"? After 1985 or before? Why do you think Nintendo (still) has such a large market share? They basically owned the market in the time between Atari and Sega. Consoles are all about monopolizing, back when I started you either got Atari, Colecovision or C64, then it was NES or Master System, then SNES and Geneisis (etc, etc). There really never has been much choice in the Console racket, and there never will be. Why, you may ask. Simply because it is so hard to break in to the console market. Only a large company like Sony or M$ can even hope to have success in console-land because you A) Need a large established distributorship, B) have to be prepared to take massive losses in the first couple of years and C) convince decent (read: well known) game developers that you will be around in another year or two and therefore are worth developing software for. Small operators just can't handle all the footwork and money it takes to break in.

    -----

  • And the obligatory /. statement: "Imagine a Beowulf cluster of these!"
  • You forgot about a Beowulf cluster of them running Linux. Of course, BSD is probably already ported to the new chip.
  • This brings up a question about a possible conflict of interest for IBM.

    They've done work on the Gekko processor, and now they've landed a deal for the PlayStation 3, which is essentially a competitor to the Gekko.

    Now the question is... what (if any) steps will IBM take to make sure there is no conflict of interest?

    After all, anyone familiar with Nintendo will know that Nintendo is pretty anal about NDAs for just about everything related to their hardware, in an attempt to prevent their competitors from stealing ideas and technology. Yet if IBM were to use their past experience to work on the PS3 chip, wouldn't that be unfair competition against Nintendo?

    Yes, yes, I realize that some of you may point out that the GC will come out this year or next year, while the PS3 isn't likely to see the light of day for another 4-5 years... but can you really be sure about that? And what about the Gekko2? Who's going to work on that?

  • by d-rock ( 113041 ) on Monday March 12, 2001 @01:52PM (#368073) Homepage
    when he can't beat the PS3 in the CompUSA display case...
  • everyone will have an Xbox
    --
  • ...then you'll believe this:

    IBM, Motorola, and AMD have teamed up to develop a new advanced network chip architecture code-named Mollusc. This new chip is designed to "suck" all of the information from the internet and "retain all the juicy bits", like its bivalve namesake.

    The partners plan to invest $800 trillion clams in the project and have recruited the talents of many scum-sucking bottom-feeders who have recently been laid off in Silicon Valley.

    The chip is slated for many home users who want all the *extremely* useful information on the internet available at home without really wanting it.

  • I'm sorry Dave, but looks like Mr. Clark was only 3 years off...
  • I know all those people in my neighborhood sucking on the PC133 SDRAM supply sure have driven up those prices .. ? .. And since when are the lusers *Forced* into paying the Rambus tax? Skip Rambus, no Toshiba reps have been by *MY* house with guns pointed at *MY* head..


    --------------------------------------
  • When all the PS3s start talking among themselves, next they'll take over the world.
  • Yeah what you've just described is the situation on the PC. And yes, sometimes people release new chips that are faster, like when 3dfx released the Voodoo, and nVidia's chips. But, at the end of the day, this is what competition is about. As long as there's a standard like OpenGL or DirectX, all the systems stay compatible.
  • ...why should i buy a cheap dedicated ...

    I think you answered your own question.
  • I would like to see an Open Source console, one which can be cloned, much like the IBM PC could be cloned. This would lead to a vital market. It does not so much matter about the software side of things - a games console does not really need an OS; the games can hit the metal.

    Have you seen Indrema [indrema.com]? Check out their Developer Network [indrema.com] (looks like it's got much more interesting information than the main site). The L600 is due this year apparently so I'd expect more about this in the upcoming months.

    It has been mentioned on Slashdot [slashdot.org] a bit before too, of course.


    --
  • Maybe I didn't do a good job, but I was trying to focus my point on encoding speed, not decoding speed. Yes, higher-end computers can do decoding of MPEG-2 and MPEG-4 in software, as well as Sorenson, but none of them can do MPEG-4 compression in software in real time, and that would be very useful for video conferencing. I don't really know about MPEG-2 or Sorenson encoding; I haven't really done straight tests of them.
  • Yeah, just like the PC is a games machine, DVD player, stereo. Not just that though, it's a development machine as well, not to mention the best way to access the net.
  • by Chris Burke ( 6130 ) on Monday March 12, 2001 @03:06PM (#368084) Homepage
    Are you actually trying to tell me that if you're standing among a crowd of 1000 people, you're going to be able to distinguish among
    individual strands of hair for all 1000 people?


    No. But you should be able to see large-scale effects -- highlights, flowing in the wind, etc. Try looking at someone sitting 10 feet away -- their hair has a ridiculous amount of visible detail at that distance. Current systems can't come within a tiny fraction of this in real-time.

    My point being that human perception, and not raw processing horsepower, is quickly becoming the limiting factor in video graphics presentations.

    Ridiculous. Current resolutions are far below what the eye can see. Anti-aliasing can keep this from being overly apparent on large-scale features, but it cannot counter the fact that details that you would be able to easily see with your eye are far too small to render.

    When computers can render the loops in my carpet halfway across my room and make the ant struggling across those loops visible and recognizable at proper scale, come back and we'll talk.

    because the human eye can only distinguish the difference in the nearest few trees.

    Perhaps you just need to get your eyes checked. When I stand in a forest, I can make out individual leaves and branches on trees two hundred feet away with ease.

    And then there are the macroscopic effects. A tree half a mile away doesn't need to have each leaf rendered -- but it needs to _look_ like it does. So you either need some higher-level model of what trees look like from a distance, or you simulate each leaf anyway.

    In the event that we do reach the limits of perception (probably inevitable, but certainly not near!), then anyone with a modicum of imagination can think of tons of things to do with the extra horsepower. More accurate simulation springs immediately to mind. Sure, your trees are accurate to the leaf, but do they wave properly in the wind? What about the hair?

  • This is a little off the intended discussion, but what is up with a name like East Fishkill? It doesn't sound very good, and it implies that another place has the name Fishkill, or West Fishkill, located to the west of East Fishkill.

    What has prompted IBM to build there?

  • All companies lose on the actual console...

    ...except Nintendo.

    They make money (and continue to make money) on the actual consoles, even with the GameCube being a minuscule $150-200.

    As an aside...where are the PS2s? How long has it been since launch? I can't even see the glorified DVD player in stores yet.

  • Guess you didn't see the GeForce 3 demos?

    75 GFLOPS of shadow producing goodness.
  • by PD ( 9577 )
    What you say????
  • sorry for the typo.. Clark = Clarke
  • ...and how exactly would that benefit apple?
  • When the 386 came out, it was supposed to be a supercomputer on a chip, capable of delivering mainframe performance at a desktop price.

    Same thing goes for the 486, and the Pentium.

    The crap they spew never seems to change.

  • When the 386 came out, it was supposed to be a supercomputer on a chip, capable of delivering mainframe performance at a desktop price.

    I can remember my first 386 computer. That memory still goes "ohmygod". At the time, it was unbelievable what you could make these do, on your desktop. I'd never seen such compile speeds for my Pascal (yeah, it was a long time ago :0) ), the speed at which it ran my numerical code with the FP unit was amazing. I thought I'd never fill that 20M hard drive.

    The crap they spew never seems to change.

    Moore's law is where it's at, kid.

  • I do believe that we have discovered another task for the soon-to-be-non-profit LinuxPPC: port to the PS3. Hire some qualified personnel, get 'em all PS3s, a few G4s, et voilà...

    This is assuming that the PS3 would use PCI and preferably Open Firmware... unlikely, I think. ;-) But you never know!

    My obligatory comment: "If they want a 'supercomputer on a chip,' talk to Apple marketing..."

    Haaz: Co-founder, LinuxPPC Inc., making Linux for PowerPC since 1996.
  • This is probably one of the few things saving Apple right now: that it is sooo incredibly easy to connect to the net with an Apple PC, and that just because it's clocked at half the speed of a competing IBM PC, it actually isn't any slower, because it's the net that's slowing you down.

    On the other hand, Apple can't corner the hardcore gamer's market because they are clocked at half an Athlon or Intel CPU.

    The really ironic part is that the Xbox will be using a variation of the GeForce3 on a Celeron-like processor, at CPU speeds no faster than, say, a G4 tower with a GeForce3 today... so it *still* may be possible that Apple isn't nearly as bad off as everyone thinks. Then there's the other box, the GameCube... which uses a 403MHz G3 with some fancy ArtX/ATI chipset... which will still probably be comparable to the GeForce3 and a 600MHz G4.

    All we're really waiting on is the games, not the hardware.

    Geek dating! [bunnyhop.com]
  • DAMN! I thought they were bad when OS/9 [southwind.net] was announced. They're still selling OS/2. Sheesh!
  • I think I basically agree with what you're saying. My point was that today's CPUs are more than fast enough for current applications. CPUs in three years will be more than fast enough for their applications. But today's will be horribly slow. Imagine living your life on a P100 today. It's pretty hard (yes, I've done so recently, though not normally). I think that's basically because we have new applications -- they're not strictly necessary, but I like my quick GUI, my multitasking of memory-hungry, CPU-hungry apps, etc. I was just trying to produce some off-the-cuff examples of uses. One good example from a reply to my original post is real-time audio/video encoding. Good for video conferencing, etc. Anyway, it just felt like Slashdot of all places couldn't see a use for more CPU power in 2004.
  • Have you seen the ads on TV? They're already advertising the PlayStation 9! Well, sort of. :)
  • I was reminded of the objective of the Deep Blue chess match against that grandmaster, what's-his-name. They dedicated so much time and effort to one single end: having a computer defeat a human player. In the end, the computer did very well, only because it was able to compute much faster than the human opponent - although the human opponent they used was very good with strategy and playing the game. I still think the computer got lucky when he surrendered in that last match - the computer chose its moves based on what path would lead to victory. This may make sense in the short term, but sometimes the only way to win is not to play at all ;).

    Anyhoo, I wanted to write a short note about gameplay. You don't want a game that people can't win - eventually players will become frustrated and give up. It's pointless to have a game that's impossible to defeat, because while people like a challenge, people also like victory. You don't make it too easy on the one hand, but you don't put them up against something that is completely impossible.

    Bleh. I suppose all this processing power will give us the next best thing in orgasmic graphic reality. Woop. Someone should hit these people on the head and tell them that I want to play a game that is fun once in a while too. You know, Pong is really fun and it doesn't need much more than a C64 to run, either... in fact, a C64 would be too much power for Pong...

    Give me Tradewars 2002 over Quake 3 anyday. Give me LORD over Diablo. Give me fun or give me something else. :)

    Ignore this post if you think I'm an idiot. I am.


    ---
    a=b;a^2=ab;a^2-b^2=ab-b^2;(a-b)(a+b)=b(a-b);a+b=b; 2b=b;2=1
  • With the level of service @Home has been having recently, I've considered out-sourcing my bandwidth needs to FedEx. I figure that if I save packets to floppy disk and have them FedEx'd to the site, my total bandwidth will increase (think lots of floppies in one shipment) but my latency will be horrible (though not as bad as @home is right now).

    ;)


    ---
    a=b;a^2=ab;a^2-b^2=ab-b^2;(a-b)(a+b)=b(a-b);a+b=b; 2b=b;2=1
  • by jheinen ( 82399 ) on Monday March 12, 2001 @02:15PM (#368113) Homepage
    Uh huh. I still haven't seen a rendered-on-the-fly animation that I couldn't immediately tell was computer generated. And I don't mean piddly little GeForce cards either. I'm talking about million-dollar SGI workhorses. The need for processing power doesn't come from pushing polygons. It comes from making the objects in the environment behave in a realistic fashion. There isn't a computer in existence that can realistically render the motion of hair blowing in the wind. Even if we get to the point where a computer can render an animation of a human being that is indistinguishable from a film of the real article, we still won't have enough power. The environments get larger and larger. Once you can simulate one person, you need to be able to simulate a thousand at the same time. Think massively multiplayer environments. We aren't even close to having the raw processing power necessary to simulate large-scale environments with thousands of objects.

    -Vercingetorix
  • You almost make this sound like it's a good thing.

    How so? Did you read a different post?

    There's a distinct difference between parental controls, which seem to be what you're talking about, and content control, which covers things like CSS, SDMI and the recent hard-disk protection mechanisms. I'd agree that voluntary parental-control mechanisms are a good thing, whilst the other is distinctly a bad thing.

    You seem somewhat confused I'm afraid :)

  • by PHr0D ( 212586 ) on Monday March 12, 2001 @02:17PM (#368116) Homepage
    ..Actually it will play these Tiny, secure music disks [slashdot.org] the size of a quarter, but what with that security hole [slashdot.org] in TCP, users will be horrified to discover that while using the off-shore napster [slashdot.org], the new document-destroying copy protection [slashdot.org] will wipe these tiny disks clean. The users will revolt, ultimately leading to the PS3 marketdroids' escape in their brand-new electric cars [slashdot.org] (which are luckily faster than the users' Ferraris).


    --------------------------------------
  • I keep seeing this. I am of the opinion that despite this (and I agree to some point), we should keep improving them as fast as possible. The theory goes: we'll find stuff for it eventually. Perhaps someone will find a way to do reasonable AI for some game that plays like a person and learns, and doesn't just beat the player by playing perfectly at a small scale, but actually plans better. But, oh wait, it requires a CPU running in the multi-GHz range to operate effectively. We'll be glad that CPU is there when the time comes. Or some other equally intensive application. Maybe good voice synth / facial expressions to go with it? And an AI to produce the dialog? That sounds good, and CPU-intensive. Don't berate the fact that CPUs are faster. Years ago, today's speeds would have seemed completely unnecessary. But you can't do high-res, high-poly-count 3D without them.

    Anyway, it really annoys me when a crowd of admitted techno-geeks can't see uses for more CPU power. They're out there, and when the designers have the CPU power, it will get used. The PS2 games will improve, and the story will repeat again with the PS3 -- at a level an order of magnitude (or more) higher in CPU power, and somewhat higher in gameplay (I hope).

  • by Grond ( 15515 ) on Monday March 12, 2001 @02:32PM (#368120) Homepage
    Sorry, that's a bunch of crap. There isn't a single GPU on the market that can 'do photo-realistic rendering at over 30fps.' Consider that a photo is something like 4000x3000. Now consider 30fps. Now consider the 32bpp necessary to do photo-realism. Just pumping out the bitmap is 4000x3000x32x30 = 11.5 Gb/s of bandwidth.

    There isn't an architecture on the planet (save -maybe- Crays) that has that kind of bandwidth. Even the on-die interconnect between the processor cores of the POWER4 is only like 6.4Gb/s, a little more than half, and the POWER4 isn't even out yet.

    Take a look at the DOOM3 screenshots. That's not photo-realistic. The technology doesn't even exist to make photo-realistic pictures on the fly. Even the Final Fantasy movie, while extremely realistic, is still fairly obviously CG. Considering that Toy Story 2 took, I believe, 2 weeks on 168 processors' worth of Sun servers to do the final render, I think it's safe to say that there are many years yet to go before a game console is capable of real-time photo-realistic rendering at over 30fps, much less the 60-70fps that the human eye can differentiate.
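
    Spelling out that bitmap figure, just re-deriving the arithmetic above:

      pixels = 4000 * 3000              # one "photo-quality" frame
      bits_per_frame = pixels * 32      # at 32 bpp
      print(bits_per_frame * 30 / 1e9)  # ~11.5 Gbit/s at 30 fps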
  • What's your point in mentioning Indrema? It's not open at all.
  • Nintendo looks shaky.

    Why is this such a common myth? (Or are you basing this on some numbers?) The only numbers I've seen say that Nintendo is raking it in with the GB/GBA, and soon the GameCube (which looks like a kick-ass system).


    --
  • (Rant)

    Dammit!!
    Moore's law is garbage; let's just get that straight. Moore noticed a trend, somehow got his name attached to it, and it was mistakenly called a 'law'.

    Your little calculations have no bearing on how hard the chip designers at IBM work, or what advances occur in the next 4 years.

    By your 'logic', chips will be 2^26 times faster in 2040... I sincerely doubt that Moore had any formal training in physics, but I also doubt that he thought people would start spouting off his 'law' like it was gospel...

    Sorry. Hit a major nerve. Maybe I should take that medication regularly...
  • Though I dislike Sony (but generally like IBM)...cut them a break.

    I realize the post was trying to be funny, but the root of this posting was just some obscure press release on IBM's corporate site.

    It's not like they are running TV ads for this thing...And I'm fairly sure they really are working on this stuff...And, lastly, their timetable for completion seems reasonable at this time.

    The definition of 'vaporware' has gotten way too lax -- especially here on Slashdot.

  • Remember that console mfgrs likely lose money on each console from a strictly hardware point of view. The money is in licensing games. (remember the popular /. argument that Sony was making a mistake in opposing Bleem?) Now tell me, why would you actually pay market price for a console when you can have one that $GAME_COMPANY subsidizes much cheaper? And why would $GAME_COMPANY open the spec to allow game authors to write for the open platform without passing royalties on to $GAME_COMPANY?

    Bingo Foo

    ---

  • You mean this one: http://www.adcritic.com/content/sony-playstation2-the-beginning.html


    ---
    a=b;a^2=ab;a^2-b^2=ab-b^2;(a-b)(a+b)=b(a-b);a+b=b; 2b=b;2=1
  • >And now the obligatory /. question

    >When can we have linux on this?

    Here is the obligatory /. answer:

    About 3 months after you can have BSD on it

  • by alewando ( 854 ) on Monday March 12, 2001 @01:53PM (#368156)
    IBM also made the PPC Gekko processor in Nintendo's GameCube [nintendo.com], so they're not new to the game-console embedded-processor market.

    It just goes to show that whoever strikes gold, it's the fellows selling the picks and shovels who really make a bundle. No one is sure which of the big console players will dominate the market in five years, but whoever it is, IBM will be selling their processors. That's a winning strategy by anyone's standard.
  • by vallee ( 2192 ) on Monday March 12, 2001 @01:53PM (#368158)

    The P3 will certainly be more than just a games console. They've said before that their aim is to replace specialised units like DVD players and hi-fi systems with a central unit which does all of these jobs - the Playstation 2 makes a start at this, but by the time the P3 is rolled out people will be used to the idea of a central controlling device.

    And let's face it - the P3 is likely to sell well solely on its strength as a games machine and Sony's marketing muscle.

    Of course there is a lot of risk with this - a central controlling device means that it's far easier to incorporate more effective content control mechanisms - you only need to include them with one device rather than every device in the house. And people are likely to choose convenience over freedom as they so often do.

    Unfortunately it's only rarely that people reject convenience, and Sony will undoubtedly have another huge hit with the P3. If they can manage to produce any of course :)

  • It does not so much matter about the software side of things - a games console does not really need an OS; the games can hit the metal.

    Ugh. Ask any Amiga user how many years that retarded hardware advancement.

    Hardware abstraction is your friend, even if you're a game.


    ---
  • Nintendo looks shaky

    Yo, get off the crack:

    Top 10 Publishers of 2000 (software only)
    Rank / Publisher / Units / Dollars
    1. Nintendo / 26,807,180 / $955,169,820
    2. Electronic Arts / 11,946,160 / $435,493,079
    3. Activision / 9,775,986 / $332,803,304
    4. THQ Inc / 6,956,563 / $262,998,114
    5. Sega / 6,332,560 / $262,494,552
    6. Sony / 8,927,900 / $244,438,591
    7. Midway / 4,058,503 / $135,982,929
    8. Acclaim / 4,684,894 / $134,503,569
    9. Capcom / 3,633,586 / $124,124,609
    10. Infogrames Entertainment / 3,913,971 / $112,687,963

    Source: http://www.dailyradar.com/news/game_news_6632.html

    EA isn't even close, and this doesn't count their hardware sales, which must be ridiculous since the GBC still sells thousands of units per week.

  • We're rapidly approaching the point where additional processing power is going to be wasted in consumer devices. Flashy polygons and real-time CG effects are nice, but pretty soon, the technology is going to reach a limit. The limit won't be how fast we can clock a processor, but a point where our senses cannot tell the difference between really fast, and really, really fast.

    Modern GPUs can do photo-realistic rendering at over 30 fps, and anything more than that is going to be completely wasted. Most consumers aren't going to be serving enterprise-class DBs or cracking crypto, and today's generation of GPUs is all we'll ever need for video gaming. I'd like to see more R&D effort focused on improving the playability of games, rather than fancy yet unnecessary hardware.

  • that's why you'll never see wood paneling in a console

    Never? Ever? How about already? Ever seen the vintage Atari 2600, aka the Atari VCS? It had wood paneling. It was my first console. It was one of the first consoles (at least one of the first that could play games that didn't come with it!).

    and transparent orange plastic in high-end stereo equipment

    I'll be on the lookout for a transparent orange plastic stereo, but I'll note there is already a lot of transparent plastic in high-end audio, and sometimes some disgustingly colored neon lights. Actually I don't mind the color so much as them being lights. Doesn't anyone else listen to music and try to sleep at the same time?

  • In the large historical view, consoles are gradually heading *away* from monopoly.

    While there were a decent number of consoles in the 70's, back when the games "industry" was only a small step away from the concurrent hobbyist PC market (or at least it seems that way to me, who wasn't around at the time, much less following the video game market), the late 70's/early 80's were essentially monopolized by Atari, especially the 2600. ColecoVision and Intellivision were around as well (I think they were competing against a later Atari, perhaps the 5200?), but Atari had a large, long run as the controlling power in console games. That they had a monopoly is unquestionable.

    Especially because when they tanked, the entire industry tanked with them. Indeed, the console market collapse of...1983 or so?...had many writing off the console market for good. And it would have stayed that way too, were it not for giant monopoly #2, the NES.

    The NES came out in late 1985, at a time when no one in their right mind was thinking of bringing out a video game console. Its only competition, the SMS, didn't even go on sale until 1988, I believe. In any case, it wasn't a competition at all; the NES sold something like 80 million units in the US, more than any other console ever. Pretty much every household with children had one. The 16-bit generation (SNES and Genesis) barely sold half that *combined*. Meanwhile, the Master System sold like 6 or 8 million of the things, mainly to people who already had Nintendos. That was a monopoly, plain and simple. Nothing like that is likely to occur in the video game market again, for the simple reason that the situation that led to it (i.e. only one company actually believed the console market was big business) will likely never occur again.

    Ever since then, there have always been at least two large competitors for each generation of hardware. The Genesis and SNES were roughly evenly matched in sales. The PS1 eventually won a decisive victory over the N64 and Saturn, but by the time that was brutally clear, the Dreamcast was already around; in other words, while the PS1 was a spectacular success, it never held the sort of monopoly position that the NES did for something like 5 years (until the Genesis).

    The PS2 has beaten the DC (mainly on the basis of hype, which in an industry where cross-platform compatibility is unimaginable is just as powerful and dangerous as FUD in the computer industry), but will still suffer competition from it until the Xbox and GameCube arrive. While I don't think Indrema has a chance in hell, the open-source idea is powerful enough that an open console might actually make it a few years from now. (As has been pointed out, the fact that consoles are a loss leader which the parent company tries to make up for by licensing games presents a large problem for this business model, though...)

    At this point, it is certain that the PS2 will face serious competition for sales from at least two other consoles at every step of its life. (Now: DC, N64 and PS1; 2002 and later: Xbox and GameCube.) It is by *no* means clear that the PS2 will be the eventual winner, either; in my view, MS has adopted all the tactics that made the PS1 such a surprise success (good developer relations, easy to program, just concentrate on games) while Sony has strangely decided to emulate the tremendous failures of the Sega Saturn (launch with low volumes and bad games, put theoretical performance above ease of programming, overhype) and crap like the CD-i and 3DO (be the digital-convergence-set-top-information-highway-buzzword of the future. It'll be like a stereo component! Honest!).
  • but I haven't seen anything to say that the PS3 won't do your laundry. Don't get disappointed too easily; my guess is Sony has the little guy in mind. Air conditioner, space heater, water cooler, icebox, Swiss Army knife, blahblahblah. The PS3 will do all that and more. The Xbox and GameCube can't compete. Indrema doesn't stand a fucking chance.
    --
    Peace,
    Lord Omlette
    ICQ# 77863057
  • by Dr. Dew ( 219113 ) on Monday March 12, 2001 @02:27PM (#368181) Homepage

    Gosh, it's not enough that Apple tells me my page layout machine out-supercomputes supercomputers (hope the SETI people appreciate my largess, because I sure don't need supercomputing power). Now my gaming system needs to have more CPU power than my laptop.

    Of course, there are some downsides:

    • My wife won't be able to play games that I bought her. (See earlier story.)
    • When my ISP goes down, so will my stereo, home theatre, game system, home automation system, and digital wall art.
    • With the Juno-style licensing I expect will accompany this bad boy, I figure I'll have to ask really, really nicely to get some playing time in:
      Me: Inserts DOA5 game
      PS3: Your request to see jiggling has been denied. I'm busy defeating Kasparov again.
    • Then there's this problem:
      Me: Up, up, down, down...
      PS3: I'm sorry, I can't do that, Doctor.
  • by powerlord ( 28156 ) on Monday March 12, 2001 @02:42PM (#368186) Journal
    Maybe we need a visual test on par with the turing test?

    Call it, I don't know, Turingv2.0 :-)

    Basically, have someone look at a scene that's unfolding in realtime on a monitor in front of them, and decide if it's real footage or a computer-generated scene :-)
  • I want a computer that I can beat at chess

    I still haven't upgraded from my Vic 20

  • Billions of bibiflops in the PS3, and yet no one will be good enough to develop software that pushes it beyond the limit. Heck, they're still bitching about how the PS2's underused. They're going to be spitting out a couple bazillion triangles even though the TV screen is only around 720x528. If only it had an SVGA hookup to run at 1600x1200 on a 21" monitor; now that would be impressive, but they're about to reach a point where the bottleneck is no longer the console, but actually the hardware that's plugged into it (see the quick pixel arithmetic at the end of this post).

    I wish they'd invest more in getting good developers and designers working on titles for their existing consoles instead of focusing so strongly on the hardware aspect. You'd think they'd have learned from the PlayStation, whose success was based on the immense selection of games available. The N64 came up short because you had fewer than 60 games available two years after it was released, and that's why it's not nearly as popular as the technologically inferior PSX.

    People don't care much about shiny graphics and thundering sound if they can't find games to fit their tastes. A PC component manufacturer like Nvidia can focus on releasing wad-blowing hardware because there are thousands of software houses that will take care of the software, but for a console like the PS2, developers need to go through Sony to get their games licensed, pay royalties and whatnot, which severely limits the number of people available to work on games. You won't see a bunch of teens writing a killer PS2 game, because they just don't have the money and business clout to deal with SCEA's business model.
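
    And the quick pixel arithmetic promised above (assuming one full redraw per frame):

      tv = 720 * 528          # ~380,000 pixels on a TV
      monitor = 1600 * 1200   # 1,920,000 pixels on a 21" monitor
      print(monitor / tv)     # the monitor has ~5x the pixels to fill

    The same console driving a TV is only asked for about a fifth of the fill work a big monitor would demand, which is why the display becomes the bottleneck.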
  • Modern GPUs can do photo-realistic rendering at over 30 fps

    Please tell me where this is happening.

    --

  • IBM is not silly. They aren't going to let a simple matter like conflict of interest get in the way of good business.

    Here in Australia, IBM GSA looks after both Telstra and Optus (the two major telecommunications companies). They have no conflict of interest because they physically and logically isolate each business unit. In situations like these, IBM doesn't get to use their past technical experience to assist the development of a competitor. What the client benefits from is IBM's accumulated experience in handling technical projects of this complexity/magnitude.

  • by rlwhite ( 219604 ) <rogerwh&gmail,com> on Monday March 12, 2001 @02:43PM (#368202)
    Why do so many people here post about this being overkill for graphics? Sure, a modern graphics card with adequate memory can saturate a monitor or TV, but Nvidia's not doing incredible AI for me. With chips like these, while my GPU is spitting out all those pretty pictures, my proc can be planning its next 100,000 moves.
    Wouldn't this be cool for strategy/simulation games? AI wouldn't do as much for me in a frag fest, but this would be cool if I could upgrade my old 486-era Civil War strategy game to this and face off with Robert E. Lee.
  • by Monkeyman334 ( 205694 ) on Monday March 12, 2001 @02:44PM (#368205)
    You're right that no one is going to get more bandwidth out of their faster CPU. But imagine doing ungodly amounts of compression on multimedia and other data and being able to instantly (via a fast processor) decompress it on the other end. Like gzipping an HTML file, which can bring a 120KB file down to 14KB and just takes a little bit more time to render. If you have a 14.4 modem and a 750MHz computer, it's more than worth it.
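
    Working through those numbers (assuming the modem moves a steady 14.4 kbit/s and ignoring protocol overhead):

      link_bytes_per_sec = 14_400 / 8       # ~1.8 KB/s on a 14.4 modem
      plain, gzipped = 120_000, 14_000      # page size before/after gzip
      print(plain / link_bytes_per_sec)     # ~67 s uncompressed
      print(gzipped / link_bytes_per_sec)   # ~8 s gzipped

    Nearly a minute saved per page, for a few CPU cycles of decompression on each end.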
  • right...

    and once the limits of perception are reached (and probably before they're reached, honestly), perception will just improve :)

    So maybe you can render enough polys to make something on a monitor look like a photograph, but can you compute enough video, sound, and AI to make something that's as real as "The Matrix" when you are bypassing your eyes altogether for sensory input? Think of the staggering computation required to put you in a believable 3D world by stimulating the brain directly.

  • by azephrahel ( 193559 ) on Monday March 12, 2001 @02:46PM (#368209)
    Now this does sound incredibly interesting, but let's think about desktops (god forbid something useful!) for a minute. Let's see... Moore's law states processing power will double every 18 months. That means it should double roughly 1.8 times by 2004, right? 2^(1.8) = roughly 3.5. The fastest 32-bit desktop processor currently sold runs at 1.5 gigahertz, so by the time the PS3 comes out, a top-of-the-line desktop should run at about 5.25 gigahertz.

    To quote from IBM (http://www.research.ibm.com/actc/RS_6000/Topic_Parallel.html):
    "[the] P2SC design has reached its peak operating frequency at 160MHz" and (http://www.research.ibm.com/deepblue/meet/html/d.3.3a.html) "this year Deep Blue will be running on a faster system - the latest version of the SP - which uses 30 P2SC or Power Two Super Chip processors"

    Assuming perfect SMP, Deep Blue therefore runs at 4.8 GHz in aggregate.

    5.25 is greater than 4.8. 5.25 is what Moore's law projects for 2004, so this is no big shakes.
    By 2004 we should see top-of-the-line desktops just as powerful.
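
    The same extrapolation under a few assumed doubling periods (the 20-month case roughly reproduces the 5.25 GHz figure above; 18 and 24 months bracket it):

      def projected_ghz(base_ghz, months, doubling_months):
          return base_ghz * 2 ** (months / doubling_months)

      for t in (18, 20, 24):
          print(t, round(projected_ghz(1.5, 36, t), 2))
      # 18 -> 6.0 GHz, 20 -> 5.22 GHz, 24 -> 4.24 GHz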
  • If console makers were to do this, they would have to start selling the hardware at a profit-- something no console maker has succeeded at yet.

    Otherwise, they risk the possibility that 75% of the time the machines are in use, they won't be generating money for the manufacturer. (with a slightly lower percentage in Sony's case)
    --

  • by mattbee ( 17533 ) <matthew@bytemark.co.uk> on Monday March 12, 2001 @01:55PM (#368216) Homepage
    A supercomputer on a chip, much much more than a games console, fast internet access, blah blah blah. Has everyone forgotten how much of Sony's PR bilge was regurgitated in the run-up to the PS2 launch? The let-down launch titles, the buggy DVD software, the self-corrupting memory cards? Are the world's media already taking backhanders to hype up the PS3? Let's forget all about it until the launch and judge it by the contents of the box in 2004, hmmm?
  • by euroderf ( 47 ) <a@b.c> on Monday March 12, 2001 @01:56PM (#368218) Journal
    I think that there is a big problem with the console market. As the industry has matured, it has become more and more expensive to manufacture and design a console. This has meant that, recently, many console companies have been pulling out of the manufacturing business entirely. Sega is doing so, and Nintendo looks shaky. Remember Atari? Remember how many console companies there were in the 1970s and 1980s? Lots. I mean, hundreds of them, banging out clones and bizarre little variations. Now it looks as though the console market is on a fast track to monopoly, under the aegis of Sony, or perhaps a stitch-up between MS and Sony.

    I would like to see an Open Source console, one which can be cloned, much like the IBM PC could be cloned. This would lead to a vital market. It does not so much matter about the software side of things - a games console does not really need an OS; the games can hit the metal.

    If one company created an open architecture and promoted it, before long there would be hundreds of clone makers and a real movement in the industry.

    We must break the hegemony of the sealed, synthetic box.
    --

  • The PS2 was announced 4 years before it came out. That wasn't vaporous.

    As for hype, we all love the hype engine. Everybody dreams of the next big thing. It's what keeps the computer industry going.
    --
