
OnLive Gaming Service Gets Lukewarm Approval 198

Vigile writes "When the OnLive cloud-based gaming service was first announced back in March of 2009, it was met with equal parts excitement and controversy. While the idea of playing games on just about any kind of hardware thanks to remote rendering and streaming video was interesting, the larger question remained: how would OnLive solve the latency problem? With the closed beta currently underway, PC Perspective put the OnLive gaming service to the test by comparing the user experience of OnLive-based games against the same titles installed locally. The end result appears to be that while games with less demanding input timing, like Burnout: Paradise, worked pretty well, games that require fast, twitch-based input, like UT3, did not."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Friday January 22, 2010 @06:20AM (#30857758)

    The guy logged in using credentials 'borrowed' from an authorised beta tester, from more than twice the recommended distance from the server, acknowledged multiple high-latency (due to distance) notifications, and the best he could do was damn the service with faint praise.

    • Because guess what? In the real world, people live all over. Onlive isn't going to be able to say "Just move closer to one of our data centers," at least not if they want to pitch themselves as the "cheaper than buying a graphics card" option. Sounds to me like they've been controlling who gets into the beta to try and create an overly rosy impression. This guy was a more realistic test, a person who doesn't happen to be near their few locations.

      That's just the reality of this. If it is to work well it can't only work well for a few people in a few locations.

      • by BobMcD ( 601576 ) on Friday January 22, 2010 @11:09AM (#30859548)

        Because guess what? In the real world, people live all over. Onlive isn't going to be able to say "Just move closer to one of our data centers," at least not if they want to pitch themselves as the "cheaper than buying a graphics card" option. Sounds to me like they've been controlling who gets into the beta to try and create an overly rosy impression. This guy was a more realistic test, a person who doesn't happen to be near their few locations.

        That's just the reality of this. If it is to work well it can't only work well for a few people in a few locations.

        Imagine a movie-listings website for the greater New York City area. Now imagine someone from Wyoming complaining that the theater in Cheyenne isn't listed on that site. The response that person would get is the same that your objection gets:

        If you don't live within our covered area, feel free to use another service. We have plenty of customers within our area and we have decided not to cover yours.

        Not every business on the planet expects to serve every customer on the planet, and yet somehow they can still turn profits.

        That makes it not as attractive as they hyped it to be, especially against powerful $100 graphics cards (the low-to-mid range of graphics cards is great these days) and $200 game consoles.

        I think one of us has missed something. Either you're right, and OnLive expects this to kill all other gaming everywhere, or I'm right in that this is a supplemental service to gaming that adds a remote component for those customers that want it and can access it.

      • I think no matter what, we can all agree that OnLive is not for serious, regular gamers, because they already have the hardware, even if it is outdated, and are used to playing on their local systems.

        Since the system has flaws it will likely not persuade them to try the service.

        However, those who do not game due to the entry cost of a console or PC might find the idea neat and it could catch on.

        It might turn some casual or non-gamers into regular gamers who only know the gaming experience through OnLive.

    • The guy logged in using credentials 'borrowed' from an authorised beta tester,....

      I know most people take Beta/OpenBeta NDAs as just 'who cares' documents, but there is a reason they exist, and assuming they were signed and stuff, they are (IANAL) binding.

        I'd be surprised if "PC Perspective" didn't get a C&D already.

    • Let's not forget his belief that "750,000 bps" = 1 Mbps, not 7.5Mbs (or closer to 1MB/sec, not 1Mb...) . And to your point, there's a reason closed betas are closed. In addition to latency due to distance, this is also still a beta - you can be assured that they are tweaking for performance and other issues on a regular basis. Any numbers you get from such a beta are pretty meaningless.
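
      A quick, hedged illustration of the bits-vs-bytes confusion the thread keeps tripping over; the 750,000 bps figure is the one quoted in the comment above, while the 8 Mbps value is only an assumed round number for a roughly 1 MB/s stream (Python):

          def describe(bits_per_second):
              return (f"{bits_per_second:,} bps = {bits_per_second / 1e6:.2f} Mbps "
                      f"= {bits_per_second / 8 / 1e6:.2f} MB/s")

          print(describe(750_000))    # the figure quoted from the review
          print(describe(8_000_000))  # an assumed ~1 MB/s video stream, for comparison
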
  • I think the results are as expected, or even slightly better than expected (at least from my viewpoint). It shows that something like OnLive will be workable in the future with slightly faster internet access.

    My problems with OnLive are not related to the technical side. Even though I am mostly a casual gamer (at least since I gave up WoW) and I could profit from pay-per-hour, I am not sure I would like this. It would require a lot of trust on my side, which OnLive has yet to earn.

    CU, Martin

    • This is an issue with latency, not bandwidth; even if you had a dedicated 100Mb line, a 100ms ping would make it unplayable.

      I doubt future improvements will be able to speed up the hardware that processes signals by a huge margin, and presumably they're not going to change the speed of light any time soon; the only real thing they can do is put the data centres closer to the end-user to improve performance.
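
      To put rough numbers on the speed-of-light point, here is a minimal sketch; the distances are arbitrary and the propagation speed is the usual assumption of about two-thirds of c in fibre:

          C_FIBRE_KM_PER_S = 200_000  # assumed ~0.67c propagation speed in fibre

          for km in (500, 1500, 4000):
              rtt_ms = 2 * km / C_FIBRE_KM_PER_S * 1000
              print(f"{km:5d} km from the data centre: ~{rtt_ms:.0f} ms round trip "
                    f"before any serialization, queueing or server time")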

      • by slim ( 1652 )

        the only real thing they can do is put the data centres closer to the end-user to improve performance

        They're doing that. But Perlman claims that they're also:
          - Developing smarter routing algorithms
          - Tuning at the IP packet level, to increase speed on domestic routers etc. (I guess this is largely about getting the MTU right, dynamically)
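
        As a rough illustration of why getting the MTU right matters for a UDP video stream, here is a sketch of header overhead and packets-per-frame at a few MTU sizes; the 50 kB frame size is an assumption, not an OnLive figure, and the headers are plain IPv4 + UDP:

            HEADERS = 20 + 8        # IPv4 + UDP headers, no options
            FRAME_BYTES = 50_000    # assumed size of one encoded video frame

            for mtu in (576, 1200, 1500):
                payload = mtu - HEADERS
                packets = -(-FRAME_BYTES // payload)    # ceiling division
                overhead = packets * HEADERS / FRAME_BYTES * 100
                print(f"MTU {mtu}: {packets} packets per frame, "
                      f"~{overhead:.1f}% header overhead")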

        • by Svartalf ( 2997 )

          Heh... That's a used car salesman's pitch of things.

          "Smarter" routing algorithms have to be applied to each and every router in the mix that might see the traffic for that to work. Do you see the ISP's ripping every Cisco and Juniper out to accomodate them?

          "Tuning" at the "IP packet level"? Perlman said this?

          There's a magic size that will increase bandwidth to its peak. Smaller stuff means you don't get as much through because of latencies, etc. Larger stuff, you end up getting more and more bandwidth wit

      • Re: (Score:3, Informative)

        by mseeger ( 40923 )

        Latency (for a line that is not overbooked) depends on bandwidth and packet size. With the same packet size, ten times the bandwidth reduces the latency by nearly a factor of ten (on a single line).

        Overall latency depends on the sum of the latencies of each line on the way, plus a bit extra for each router. The routers' extra delay is not the issue. The number of hops can be influenced by a service provider like OnLive through peering agreements. Something OnLive cannot influence is the last mile to the customer. Usually 3

          Latency (for a line that is not overbooked) depends on bandwidth and packet size. With the same packet size, ten times the bandwidth reduces the latency by nearly a factor of ten (on a single line).

          I'm confused as to how this statement holds up. Currently I have a 16Mb or so line at home with pings of, say, about 100ms to a server. If I were to upgrade to a 50Mb line, would my ping go down to 32ms?

          Surely latency is how long the signal takes to get to a server and back; how would allowing more signals to go back and forth at once increase the speed at which they get to the server and back?

          Car analogy:
          If, say, the speed limit on a motorway were 70mph and there was no congestion on the road, why

          • Re:As expected (Score:4, Informative)

            by mseeger ( 40923 ) on Friday January 22, 2010 @08:54AM (#30858434)
            Hi,

            If, say, the speed limit on a motorway were 70mph and there was no congestion on the road, why would adding extra lanes to the motorway increase how fast I get to my destination?

            You have the car analogy wrong. A packet of 100 bytes is not a single car; it consists of 800 cars (bits). If you increase the number of lanes, more cars can travel at once. Each car still travels at the same speed (of light), but by allowing more cars on the road at the same time, the delivery (the packet) spread over 800 cars gets delivered faster.

            The time a packet takes to get transmitted is roughly packet size / bandwidth.

            Say you have a 10Mbps line and a 1000-byte packet. That is 8,000 bits / 10,000,000 bits/s = 0.0008 s, or 0.8ms (one way). So the latency through the line will be roughly 1.6ms for the round trip. If you go to 100Mbps or even Gigabit Ethernet, the time goes down by a factor of ten at each step.

            But there are some side effects: sometimes packets are packed into larger packets to fill the line better, which increases the latency. When the speed of the line is high, the time the OS needs to send and receive the packets has more influence on the latency. Latency may also occur in your provider's network because the provider overbooks the service (selling access for more cars than the lanes allow, and therefore creating congestion).

            To see whether your line is the chokepoint, use Traceroute [wikipedia.org] to see where the latency happens. If the latency already occurs close to you, a faster line may improve it. Also look for features from your provider such as "fastpath".

            CU, Martin

            P.S. This is a very short overview of the topic. A complete coverage would come as a book. BTW the books have already been written: W. Richard Stevens: TCP/IP Illustrated [kohala.com].
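
            A minimal sketch of the arithmetic above, so the numbers are easy to check; this covers serialization delay only, ignoring propagation, queueing and router overhead:

                PACKET_BYTES = 1000

                for mbps in (10, 100, 1000):
                    delay_ms = PACKET_BYTES * 8 / (mbps * 1_000_000) * 1000
                    print(f"{mbps:5d} Mbps: {delay_ms:.3f} ms to clock one "
                          f"{PACKET_BYTES}-byte packet onto the wire (one way)")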

            • Electrical signals don't travel at the speed of light. Ethernet is generally ~.5C. Not entirely sure what the speed for fiber is, but it is also a bit less than C. Just for giggles, I should point out that the sunlight that you see is also not reaching you traveling at C, since C is defined as the speed of light in a vacuum. I'm aware that for all intents and purposes, it's easy enough to consider them all as traveling at C since the difference in time is so minute over terrestrial distances. It just get

              • Re: (Score:3, Informative)

                by mseeger ( 40923 )

                Quibbler :-) You wanted it....

                For our example, 0.5C is sufficiently close to C to call it "speed of light" :-). As you point out, the "speed of light" is not the same as C. I can find materials where the speed of light is below 0.5C. So saying that the electric signal travels at the speed of light is correct, since I didn't mention which material I would be measuring the speed of light in....

                Point, game and match :-)

                CU, Martin

                P.S. I have references to materials reducing the speed of light to 17m/s (38mph for you imperial bastards) without significant absorption.

                • Re: (Score:3, Funny)

                  by BobMcD ( 601576 )

                  P.S. I have references to materials reducing the speed of light to 17m/s (38mph for you imperial bastards) without significant absorption.

                  Why am I suddenly wearing a black helmet and breathing through a respirator?

                  • Re: (Score:3, Funny)

                    by mseeger ( 40923 )

                    Why am I suddenly wearing a black helmet and breathing through a respirator?

                    If you're calculating in inches, pounds, gallons or miles: that's worse. I can forgive any honest villain, but not non-metric....

            • So what you're saying is: The internet is not a dump truck?

              A packet of 100 bytes is not a single car; it consists of 800 cars (bits). If you increase the number of lanes, more cars can travel at once. Each car still travels at the same speed (of light), but by allowing more cars on the road at the same time, the delivery (the packet) spread over 800 cars gets delivered faster.

              Your reasoning doesn't make sense. First off, if latency were mainly dependent on the signal speed in the cables, it would be only a few milliseconds and would certainly be negligible for the last mile.
              Second, internet communication is serial, not parallel, so the car analogy makes no sense.

              • by mseeger ( 40923 )

                > First off, if latency were mainly dependent on the signal speed in the cables, it would be only a few milliseconds and would certainly be negligible for the last mile.

                Please note:

                1. I never said that the signaling speed would affect latency.
                2. I said the bandwidth affects the latency of the line involved.
                3. My analogy was: if you assume the internet connection is a highway where cars travel at the same speed (highway: 70mph, internet: signaling speed), then the bandwidth represents the number of lanes

          • What you have to remember about ping is that it more or less tests the minimum time. The payload of an ICMP packet is very small. With video data like this, you have much more payload, so you not only have to count the transit time from the datacenter, but also how long that amount of data will take to transfer at your line speed.

            For example, say each video frame is roughly 50kbytes. If you had a line that only carried 50kbytes per second, then it would take a full second for you to receive a frame, even if the latency on that li
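
            A small sketch of that effect, using the assumed 50 kB frame from the example above and a few arbitrary line rates:

                FRAME_KB = 50  # assumed size of one encoded frame, as in the example

                for line_mbps in (1, 5, 10):
                    transfer_ms = FRAME_KB * 8 * 1000 / (line_mbps * 1_000_000) * 1000
                    print(f"{line_mbps:2d} Mbps line: ~{transfer_ms:.0f} ms just to receive "
                          f"one {FRAME_KB} kB frame, on top of the round-trip time")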

      • by Svartalf ( 2997 )

        It's an issue of both.

        With the ping being bad, it sucks for the end user.
        With the sheer amount of bandwidth needed, there's no way the feed ends could keep up with more than a couple hundred to a couple thousand users.

        An OC-48 is only really able to handle about 1,200 or so realistically.

        If you overbook the bandwidth or server resources, you will degrade things accordingly.
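
        The OC-48 figure is easy to sanity-check with a back-of-the-envelope sketch; the per-stream bitrate and usable fraction below are assumptions for illustration, not OnLive numbers:

            OC48_MBPS = 2488        # OC-48 line rate is roughly 2.5 Gbps
            STREAM_MBPS = 2.0       # assumed average bitrate of one game stream
            USABLE_FRACTION = 0.85  # assumed headroom for overhead and bursts

            streams = int(OC48_MBPS * USABLE_FRACTION / STREAM_MBPS)
            print(f"~{streams} concurrent streams per OC-48 at {STREAM_MBPS} Mbps each")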

    • It would require a lot of trust on my side, which OnLive has yet to earn

      Trust? In what way, beyond what you would give *any* online merchant to whom you provide your credit card info?

      • Re:As expected (Score:4, Insightful)

        by mseeger ( 40923 ) on Friday January 22, 2010 @11:53AM (#30860110)

        Trust? In what way, beyond what you would give *any* online merchant to whom you provide your credit card info?

        It's like with an honest politician: I trust them to stay bought....

        Something I value with my games is being able to take an old savegame and try something new. If I don't "own" the game but have just purchased a service, the game or the savegame may disappear.

        If e.g. Amazon takes my money and won't deliver my copy of Mass Effect 2, I have a good chance of getting my money back. But if I pay OnLive to play Mass Effect 2 and they remove the game from their list, my "invested" time and some of my money are gone. If this happens 1-2 years after the purchase, there is nothing I can do that will have any effect.

        Someone taking my credit card credentials and using them fraudulently is a known process that I know how to handle.

        CU, Martin

  • The menu video seems to be available, but the in-game videos now give:

    "This video is no longer available due to a copyright claim by OnLive, Inc..."

  • by Anonymous Coward

    What is it with all this 'cloud' stuff?

    I've got half a terabyte of storage, a pretty good graphics card with shader support and a nippy CPU.

    When there are raytracing cards with inbuilt physics, I'll enjoy a slightly more realistic gaming experience on my local machine, thanks.

    Until then, I'll have to go with pretty realistic and the only significant cause of latency being my old neurons.

    GOML.

  • by Rogerborg ( 306625 ) on Friday January 22, 2010 @06:54AM (#30857890) Homepage

    Read: "excitement (from clueless arts majors masquerading as tech journalists) and hilarity (from anyone with even a remote shred of knowledge of the technologies involved)".

    Look, this tech may - may - be workable for SimWarConquer, but for anything that's reaction based? No. Not going to happen. There is no technobabble solution to latency, and anyone who tells you otherwise wants your credit card number.

    • And yet this review - from a sceptic - says it pretty much works. While it's in beta. From a location that would have been excluded from the beta if he'd gone through proper channels.

      • Re: (Score:3, Insightful)

        by Rogerborg ( 306625 )

        Mmm. Two out of three games were reported as playable, with noticeable latency compared to a local version, one being "right on the edge" of playability. The best experience came from a game that's largely about learning the track, which makes it not a reaction game per se.

        Notice that this was while playing over a wired-to-the-wall connection (who still uses those?) and with a low 85ms ping to the server.

        I'm also assuming that there was a certain degree of tolerance for a novel experience. Once people

        • 85ms is not a high ping at all. If OnLive considers that to be outside of their acceptable range, they've got a nasty surprise coming when they try to open it to the public. A lot of people are going to have a ping that high or higher.

          I've got a reasonably good connection here, business-class 12/1.5 Mbps cable. It represents a mid-to-high-end home connection. Minimum latency on the connection is in the realm of 25ms. That is about what it takes to get out past the CMTS and so on. For good sites, I'm usually in the 40-

  • by EdZ ( 755139 ) on Friday January 22, 2010 @06:59AM (#30857918)
    I came here expecting to see a belated "First!" post followed by a joke about lag.
  • I must admit, I've not actually played it, but if it's anything like the other Burnout games, millisecond reaction times are kind of important. It may be that he was having a hard time picking up on lag instinctively because of the analogue controls, but I doubt the added reaction time would stand up in serious play.

  • While I've been mildly interested in OnLive, my biggest excitement over this was confirmation that a streamed remote desktop session with really good responsiveness (say, LAN-like) could be had soon. I even started poking around for similar systems that actually stream the desktop as a 2Mbps or similar video stream with interactivity, but alas, it seems like no one is working on this issue.

    So I'm open to suggestions: is there any currently existing remote desktop server/client system that actually streams the des

    • by British ( 51765 )

      I did this back in 1996. I was wardialing, and found somebody's open PcAnywhere connection. So I connected to it and attempted to play Solitaire over 33.6 dialup. Needless to say, it was a pain dragging those cards around.

  • by El_Muerte_TDS ( 592157 ) on Friday January 22, 2010 @07:22AM (#30858034) Homepage

    Gamers would also no longer have to worry about patches and software updates to their gaming titles - one of those annoyances that PC gamers often cite on their way to moving to a console.

    I recently bought a PS3 with some games. When I started it I was welcomed with "You need to install the latest PS3 firmware now!". So I had to wait for it to install and reboot. Then I inserted a game and wanted to play, but I was welcomed with "Updates have been found for this game and need to be installed". Which is pretty much identical to the PC, except that there you often have a choice about whether to install the patch.

    • They really should just change the message to "These enhancements to your gaming experience are Exciting and Mandatory!".
    • Also PC patching is becoming much easier in most cases. It is getting rather popular for games to have auto updaters, or a button you can click that will check and download what you need. Then of course if you buy a game from a digital service like Impulse or Steam, those will check automatically and keep your stuff up to date.

      Really, the patching thing is becoming largely a non-issue. I don't see "no patches" as a major selling point for this.

    • To me that's more a good thing than not. Even when a new patch has a serious issue (we all know it happens) it's a LOT harder to support a game when you have 16 levels of patch out there, and you never know whether your bug reports are coming from a completely unpatched instance or one with all the latest updates.
    • by slim ( 1652 )

      You told us there would be a 480ms round trip. That would make it unusable.

      This guy found it was pretty much OK, despite being further from the servers than recommended.

      • by Tei ( 520358 )

        That's your read?
        Mine is that 480ms is "pretty much OK" for a console driving game, but horrible for a PC FPS.

        The "Playing UT3 with OnLive" video shows how it feels: moving like a drunk.

        Anyway, probably lots of games are still fun with 480ms. So yes, as another game system, it seems OnLive will work (for a subset of gamers).

    • Actually we're looking at 800KB/s (7-8Mbps). The author is apparently confused in his writeup.
  • ... because I couldn't even stream the videos without jitter. :)

  • by jbb999 ( 758019 ) on Friday January 22, 2010 @07:41AM (#30858160)
    The major problem isn't overall latency, it's little spikes of latency on an otherwise good line. A moment of 100ms lag on an otherwise good line doesn't matter for online games because of client prediction; at worst it's a tiny moment where the controls don't seem responsive. It's not a problem for normal video either, because they can buffer 250ms or 500ms or 1000ms of video without any problem. But here they can't do any significant buffering, or the latency will be too much to play with, and even 100ms of sudden latency will cause the picture to lag or freeze or jump. It might only happen occasionally, but I suspect people won't put up with it. And they can't do anything about it either: even if your ISP is only 10% loaded on its lines and routers, there will be times when all of that 10% sends packets at the same moment and they get queued in a router somewhere, just for a tiny time. Tiny little amounts of jitter like this are normal and expected, and to be honest I think they will be the downfall of this project, because there is no real way to deal with them. But I guess we'll see :)
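
    A toy simulation of that trade-off, with entirely made-up delay numbers, just to show why a thin client can't buffer its way out of jitter without paying for it in input lag:

        import random

        random.seed(1)
        BASE_MS = 35        # assumed steady one-way network delay
        SPIKE_PROB = 0.02   # assumed fraction of frames that hit a queueing spike
        FRAMES = 10_000

        delays = [BASE_MS + (random.uniform(20, 150) if random.random() < SPIKE_PROB else 0)
                  for _ in range(FRAMES)]

        for buffer_ms in (0, 25, 50, 100):
            late = sum(d > BASE_MS + buffer_ms for d in delays)
            print(f"buffer {buffer_ms:3d} ms: {100 * late / FRAMES:4.1f}% of frames miss "
                  f"their deadline, at the cost of {buffer_ms} ms of extra input lag")
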
    • Mod parent up. I hadn't even thought of this, but he's right. The video will have occasional jerkiness not found in either remote noninteractive video or local gaming video and it's going to be very jarring.

      • by slim ( 1652 )

        I too would expect this jerkiness, and it's surprising therefore that the review doesn't mention it.

        Perlman goes on about "psycho-perceptual" techniques, and I wonder whether that's enough to deal with the situation.

        For example, a really smart decoder could detect that the scene in general is panning, or zooming (e.g. a racing game, going in a straight line), or rotating, or some combination of these. Indeed the encoder could include hints in the stream. The game engine could even feed extra information to
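
        Purely as a sketch of what "detect that the scene is panning" could mean on the decoder side, here is generic phase correlation with NumPy; this is an illustrative technique, not anything OnLive is known to use, and the frames are assumed to be 2-D grayscale arrays:

            import numpy as np

            def estimate_global_shift(prev_frame, next_frame):
                # Phase correlation: the peak of the inverse FFT of the normalized
                # cross-power spectrum lies at the dominant (dy, dx) translation.
                f_prev = np.fft.fft2(prev_frame)
                f_next = np.fft.fft2(next_frame)
                cross = np.conj(f_prev) * f_next
                cross /= np.abs(cross) + 1e-9
                corr = np.fft.ifft2(cross).real
                dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
                h, w = prev_frame.shape
                # Shifts past the halfway point wrap around to negative values.
                if dy > h // 2:
                    dy -= h
                if dx > w // 2:
                    dx -= w
                return dx, dy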

  • 1 MB/sec... (Score:4, Interesting)

    by V50 ( 248015 ) on Friday January 22, 2010 @07:54AM (#30858218) Journal

    There are still large areas of North America stuck with either stone-age dial-up (in 20-freakin'-10) or slow, expensive satellite, like mine (I cry myself to sleep over my 1200ms latency). This is absolutely a no-go there, obviously.

    Now, in better places, I'm sort of out of the loop. Whenever I've spent time in cities, either visiting my brother in Ottawa or living in London (Ontario, not the good one) for a few months at a time, it's been my experience that even connections that are supposed to get up to 1MB/sec would be lucky to get that in practice, especially at peak times. Furthermore, the sheer number of lag spikes, connection hiccups, and general times when the internet craps out for no apparent reason makes it seem like you'd be dealing with one frustration after another. The number of times I see people get DC'd on World of Warcraft seems to back up my theory: staying connected and maintaining a constant connection at 5KB/s or so (for WoW) is difficult enough; doing the same for a (whopping?) 1MB/s while keeping latency under 100ms would be hellish.

    So is my experience with the Internet indicative of the general population, or have I just had the misfortune of having terrible service? Can people really keep 1MB/s sustained, without lag hiccups, DCs, lost packets, etc, while keeping under 100ms?

    • Re: (Score:3, Insightful)

      by Sycraft-fu ( 314770 )

      Well yes, some people can. However, in general you need to live in an area with good access anyhow. Also, you are going to need a good deal better than the minimum you want. If you want to sustain 1Mbps without a problem, you probably need a 10Mbps line. The more headroom you've got, the easier it is to maintain what you need.

      That is not to say it is problem-free; even good connections will drop sometimes or have problems. Also, of course, better connections tend to cost more money. Not a huge deal,

  • This is an obvious pump and dump scheme, unless they have somehow unlocked technology previously unseen and unknown by mankind, and have done so for the purpose of playing video games.

    • My assumption has always been that their business model would be to sell the kit to ISPs, who would install the servers at local exchanges and charge for gaming as a premium service. That would seem to solve most of the problems. The physics only really becomes a problem when you assume central servers away from subscribers.

  • How will they get user maps and mods on this?

    If all gaming were to go this way, that would kill that.

    Also, what about free games and small game developers who may be shut out of this?

  • He says that this may be in cable boxes some day, but with the VOD control lag that you see nowadays, that may not work out so well. Cable companies also don't have a lot of bandwidth for this; some don't even have room for everyone who wants to use VOD at peak times.

  • Capped/metered ISPs will kill this: a 1MB/s minimum is too high for many capped plans. Even comcrap's 250GB cap will limit this.
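
    The cap arithmetic is easy to make concrete; a minimal sketch assuming a constant 1 MB/s stream against the 250 GB/month cap mentioned above:

        STREAM_MB_PER_S = 1.0   # assumed average stream rate
        CAP_GB = 250            # monthly cap mentioned above

        hours_until_cap = CAP_GB * 1000 / STREAM_MB_PER_S / 3600
        print(f"~{hours_until_cap:.0f} hours of play per month before hitting the cap")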

  • by Purity Of Essence ( 1007601 ) on Friday January 22, 2010 @10:43PM (#30866412)

    Doubters should really watch the Columbia University presentation. It's entertaining and very technical and will probably address your every concern. Too many genuine experts here don't know what they are talking about because they are ignorant of the way OnLive actually works. It's more clever than you probably think.

    YouTube Mirror
    http://www.youtube.com/watch?v=2FtJzct8UK0 [youtube.com]

    Original
    http://tv.seas.columbia.edu/videos/545/60/79 [columbia.edu]
