Xbox One Reputation System Penalizes Gamers Who Behave Badly 183

New submitter DroidJason1 writes: "Microsoft has added new 'player reputation scores' to each Xbox Live member's Gamercard. The scores are represented by icons consisting of the colors green, yellow, and red. The more hours you play fairly online without being reported as abusive by other players, the better your reputation will be. Good players are given a green color, while those that 'need work' are yellow and those that need to be avoided are red. Microsoft says, 'If players do not heed warnings and continue to have a negative impact on other players and the Xbox Live community, they will begin to experience penalties. For example, people with an “Avoid Me” rating will have reduced matchmaking pairings and may be unable to use certain privileges such as Twitch broadcasting.' They add that the system will adjust for false reports."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Bullying (Score:5, Insightful)

    by ruir ( 2709173 ) on Thursday March 27, 2014 @05:22AM (#46591099)
    A new form of teen bullying: giving bad scores to the classmates you do not like...
    • OMG FAG LOL (Score:4, Interesting)

      by Anonymous Coward on Thursday March 27, 2014 @05:28AM (#46591117)

      And not to mention anyone who beats you in-game is CLEARLY cheating.

      Have you seen any alternatives to moderation/meta-moderation schemes that exclude this? It seems like the only real alternatives to actual diligent curation (which works but is labor-intensive) are either living with bullying and chilling effects a la Reddit, or accepting that the SNR is higher from trolls a la 4chan.

      How do you overcome this for an automated service? Is this like asking "How do you cure cancer?"

      • Re:OMG FAG LOL (Score:5, Insightful)

        by JaredOfEuropa ( 526365 ) on Thursday March 27, 2014 @05:42AM (#46591163) Journal
        I'm not too worried about trolls, but I've seen plenty of abuse and accusations of cheating hurled at "skillers" in games like BF4. All too easy to hit the "report" button in frustration after the same guy headshots you for the 6th time in a round. And the crowdsourcing effect will not work here to filter out abuse; I expect strong players to consistently attract such reports in online games.

        One way to counter this to some degree is to spot-check reports, and apply heavy penalties to players making false accusations. It still is a lot of work, and I doubt whether an operator could make the distinction between a rage-report and an inaccurate report made in good faith.
        • Re:OMG FAG LOL (Score:5, Informative)

          by Sneftel ( 15416 ) on Thursday March 27, 2014 @05:49AM (#46591181)

          They're not basing the reputation system on reports of cheating, though. As you pointed out, it's difficult, and hopelessly subjective, to tell the difference between a really good player and a cheater, so expert oversight is necessary to interpret those flags. (The good news is, automated analytics are getting remarkably good at telling the difference. It's an arms race, of course, but not as lopsided as it once was.) Rather, this system is for tagging griefers.

          • Re:OMG FAG LOL (Score:5, Informative)

            by Frobnicator ( 565869 ) on Thursday March 27, 2014 @07:47AM (#46591505) Journal

            The system is not about cheating. The system is primarily about profanity and abuse.

            They have been tinkering with it since it came out.

            Also, they haven't released what specific metrics they are using, but they have already mentioned factors: account playing statistics, complaints per hour played, positive feedback messages, friend requests, negative feedback messages, "Avoid This Player" marks, gamercard mutes, gamercard blocked communications, and filed complaints and reports. Couple all of them together and you will likely see some patterns quickly. They also mention that the system will have human involvement, and that you will not be dinged for being skilled, nor for people targeting you; those last two points in particular seem to require a human in the loop.

            My guess is that they start with simple statistical analysis to identify players trending downward with a steady stream of "block communications", "avoid this player", and "mute" flags. All of these are specifically mentioned on their site [xbox.com]. After algorithmic identification, I'm guessing one of their army of community managers (real live human beings who are employed to listen to the vitriol and enforce the rules) would get a notice to monitor the chat when the player starts to play. If they hear a stream of profanity, they click the check box marked "profanity"; if they hear taunting, harassment, or other abuse, they pick the corresponding check box. With a real live human involved, they can handle people who were wrongly accused.
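The two-stage pipeline guessed at above (statistical screening first, then a human listener) might be sketched roughly as follows. This is purely illustrative: every field name, window size, and threshold here is invented, not anything Microsoft has published.

```python
# Hypothetical sketch of the first stage described above: flag accounts whose
# rate of negative signals (mutes, blocks, "avoid this player" marks) per hour
# played is both high and rising, then queue them for human review.

def negative_rate(window):
    """Negative signals per hour for one observation window."""
    hours = max(window["hours_played"], 0.1)   # avoid division by zero
    signals = window["mutes"] + window["blocks"] + window["avoid_marks"]
    return signals / hours

def needs_human_review(history, threshold=2.0):
    """Queue a player when their negative-signal rate is high and climbing.

    `history` is a list of weekly windows, oldest first.
    """
    if len(history) < 2:
        return False
    rates = [negative_rate(w) for w in history]
    rising = rates[-1] > rates[0]              # reputation trending downward
    return rising and rates[-1] > threshold

player = [
    {"hours_played": 10, "mutes": 2, "blocks": 1, "avoid_marks": 0},
    {"hours_played": 8,  "mutes": 9, "blocks": 5, "avoid_marks": 6},
]
print(needs_human_review(player))  # True: a steady climb past the threshold
```

The ratio form matters: it implements "complaints per hour played" rather than raw counts, so a heavily played account is not penalized simply for being seen by more people.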

          • While I haven't seen the reporting function work well in any environment, DotA 2 does a decent job with the reverse for positive reputation. They implemented commendations you give out at the end of a game, and you get a limited number for a set time period. The options are Teamwork, Forgiving, Leadership, and Friendly (I think). All of these are more focused on cooperation than skill. It serves as some incentive and gives some a positive reputation.

            Of course, this doesn't counteract griefing/harassment.

            • Final Fantasy 14 did something similar: you could give out positive reputation to the good players in the randomly generated dungeon parties (you got 1 point to give, and 3 other party members to choose between). After certain amounts of positive reputation you got certain in-game rewards (a title, eventually a mount thingy); there were no negative reputation options. Less than a week after this option was launched, party play improved significantly, but even then there were still occasiona

        • If you had read the article, you'd have seen they specifically said that reporting people of higher skill because you are a "sore loser" will be handled. OTOH, we will have to see if the system will be able to handle all the ways "grief-reporting" can be done.

          I do think this kind of system is a step in the right direction because so many people playing multiplayer games are total douches and they need to be dealt with somehow.

          • because so many people playing multiplayer games are total douches and they need to be dealt with somehow.

            I prefer just not to play online. I've tried a few times and concluded that it is not worth it.
            • I prefer just not to play online.

              So how should someone go about finding other players in the same city who are willing to play offline multiplayer with him? The offline multiplayer scene in the 1990s relied on after-school play dates, but the kids who did that have since grown up.

              • Explaining better: I do not play any games that require multiplayer. For me it's a hassle; there's no fun in trying to play with punks who view the game as an obsessive and savage competition rather than simple entertainment.
                • Ever try playing single-player games in co-op mode with some real-life friends? It can enhance the experience dramatically.

        • Re: (Score:2, Troll)

          by pla ( 258480 )
          All to easy to hit the "report" button in frustration after the same guy headshoots you for the 6th time in a round.

          Clearly you missed the intent of this, then.

          Shooting people, even in-game, naturally counts as antisocial behavior. In order to keep a positive rating, you need to all sit around the battlefield and sing Kumbaya. You can expect a Mario clone as MS's next big hit.

          Of course, I think Microsoft underestimates their user base - As the real outcome of this rating system, people will compet
          • Shooting people, even in-game, naturally counts as antisocial behavior. In order to keep a positive rating, you need to all sit around the battlefield and sing Kumbaya. You can expect a Mario clone as MS's next big hit.

            I wouldn't think so, as Super Mario Bros. is insanely violent. Goombas were living peacefully until the Toads invaded. Goombas hired Koopa Troop to freeze the invaders in blocks and detain the princess of the Toads who has the antidote. Terrorist Mario squashes the innocent Goombas.

            For nonviolent gaming, you need to look at something more similar to The Sims or Animal Crossing or Harvest Moon. Microsoft tried that under the name Viva Pinata.

        • If the guy is headshotting you 6x in a row on a normal everyday (console) server, then yeah, he should be reported. He's abusing the system/newbs whether cheating or not. The way BF4 handles netcode and lag among players is horrendous, and if someone's abusing that then they should be kicked. Sorry, it just makes for a better game for everyone, and Dice should do a better job of balancing teams and weapons. (Pistol beats a full-auto machine gun, wha?)

          I play BF4 some (unfortunately). If I start getting tagged by t
          • "If I start getting tagged by the same guy over and over, and he's taking out multiples at a time when they are shooting him w/ a gun/tank/whatever, you can sure bet I'll be the one "abusing the system" by reporting them."

            If assholes had assholes, you'd be the asshole's asshole's asshole.

        • I'm not too worried about trolls, but I've seen plenty of abuse and accusations of cheating hurled at "skillers", in games like BF4.

          Exactly. I get disappointed if I do not get at least one cheating accusation every few hours of online play; it means I am having a bad day.

          The problem is just the way I play FPS games where I generally charge round the map, taking slightly obscure routes and firing in very short bursts without reloading until I need to (I roughly count the number of shots I fire in my head). This only works because I generally have pretty quick reactions and am good at recognising where enemies are most likely to come from base

        • In my experience, on a server with 60 players there's usually only 1 or 2 players that will point you out. I doubt that would change your rating.

        • by mjwx ( 966435 )

          I'm not too worried about trolls, but I've seen plenty of abuse and accusations of cheating hurled at "skillers" in games like BF4. All too easy to hit the "report" button in frustration after the same guy headshots you for the 6th time in a round. And the crowdsourcing effect will not work here to filter out abuse; I expect strong players to consistently attract such reports in online games.

          Any ratings system that depends on the "wisdom of the crowd" of gamers is doomed to fail. As an average gamer, I get abuse and accusations of cheating thrown at me by below-average gamers (seriously, I'm usually in the middle of the team when the scores are tallied). It depends too much on gamers being reasonable and rational actors, and anyone who's played any online FPS knows this is not the case.

          Trolls will ruin a reputation system based on reports because it takes a lot of effort to reverse a fraudulen

        • Strong players are going to be playing against other strong players, which are less likely to call them cheats.

          Crowdsourcing most definitely does filter it out; this is a GREAT example of how crowdsourcing can stop bullying in its tracks. The 'crowd' won't rate a normal, non-abusive person badly. They'll only get a few bad reports from a few select people.

          Those same people who gave bad reports to you for no reason probably did it to others as well.

          When you have no complaints from most people you play, and

      • I wonder if a machine-learning approach could be used. Train the system to align with curators' assessments of abusive behaviour by gamers.

        Or just add a bunch of heuristics. Speech-to-text to pick out homophobic insults would go a long way. Sure, gamers could 'get wise', but if the end result is morons politely insulting each other, that still sounds like a win.
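A minimal sketch of the heuristic half of that suggestion, operating on chat text assumed to have already come out of a speech-to-text step. The term list, weights, and threshold are placeholders invented for illustration; a real filter would need to handle obfuscated spellings and context.

```python
# Crude keyword heuristic over transcribed voice chat, as suggested above.
# The term list, weights, and threshold are placeholders only.

ABUSE_TERMS = {"idiot": 1, "moron": 1, "trash": 1}  # placeholder word list

def abuse_score(transcript):
    """Sum the weights of abusive terms found in a transcript."""
    words = transcript.lower().split()
    return sum(ABUSE_TERMS.get(w.strip(".,!?"), 0) for w in words)

def flag_for_review(transcript, threshold=3):
    """Flag a chat line for human review once its score crosses a threshold."""
    return abuse_score(transcript) >= threshold

print(flag_for_review("you idiot moron trash"))  # True
print(flag_for_review("good game everyone"))     # False
```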

      • by arth1 ( 260657 )

        or accepting that the SNR is higher from trolls ala 4chan.

        If the signal to noise ratio is high, it means there's far more signal than noise.
        Either say "lower", or use "noise ratio".

      • by TheLink ( 130905 )

        Have you seen any alternatives to moderation/meta-moderation schemes that exclude this?

        I haven't seen any implemented, but a possible way is to keep the opposing groups apart.

        If "A" doesn't like "B", just make it less likely that "A" and "B" end up in the same match (or see each other's posts).

        If "A" starts getting too picky "A" might end up in fewer matches. If "B" really is an asshole, then "B" might end up in fewer matches too.

        I've proposed this before to an MMO and also a related "Points of View" method for reviewing products: http://slashdot.org/comments.p... [slashdot.org]

        After a while you might end up w
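The keep-them-apart idea can be sketched as a matchmaking score that counts disliked pairs in each candidate lobby and prefers lobbies with fewer of them. The dislike data and scoring rule below are invented for illustration.

```python
# Sketch of pairing avoidance: track who has disliked whom, and score
# candidate lobbies so those containing a disliked pair are chosen less often.
import itertools

dislikes = {("A", "B"), ("C", "A")}  # (who, dislikes whom); invented data

def pair_penalty(p, q):
    """1 if either player has disliked the other, else 0."""
    return 1 if (p, q) in dislikes or (q, p) in dislikes else 0

def match_score(lobby):
    """Lower is better: count disliked pairs in a candidate lobby."""
    return sum(pair_penalty(p, q) for p, q in itertools.combinations(lobby, 2))

candidates = [["A", "B", "D"], ["A", "D", "E"], ["C", "A", "D"]]
best = min(candidates, key=match_score)
print(best)  # ['A', 'D', 'E'], the only lobby with no disliked pair
```

A side effect, as noted above, is that a player who dislikes everyone shrinks their own pool of acceptable matches, which is itself a mild penalty for over-reporting.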

    • by gl4ss ( 559668 )

      and uh, wtf, Twitch broadcasting is a privilege?

      sure, Twitch could ban broadcasters.. but fuck.. what's next, can't view YouTube with an account that 10 of your classmates decided to give bad ratings to??

    • Re:Bullying (Score:5, Interesting)

      by Swistak ( 899225 ) on Thursday March 27, 2014 @06:08AM (#46591235) Homepage
      I see this point brought up every time I discuss reputation systems. There's quite a bit of game theory behind it, but it can be done. And there actually are systems that implement it (LoL, for example; Stack Overflow and Quora in the non-gaming world).

      When creating these systems you don't simply ban someone after one or a few reports. The way most of them work is: calculate a trust score T for the reporting player. New players have this set very low; the more accurate a player's reports turn out to be, the higher their trust. Additionally, the more reports a user sends, the less each one "weighs" (this basically makes the assholes who report everyone with a negative K/D ratio for "feeding" meaningless, and is the reason I was never banned ;))
      Once the number of reports * trust outweighs the player's karma (which he usually collects in small amounts for each game where he's not reported, and for each accurate report he makes), he gets banned.
      That's a bit simplified; in reality you build a neural network with feedback (that's how most of these systems are implemented). Initially you hire people to "teach" the network, eliminate the initial threat, and build trust in a group of players. Once you have a big enough group of trusted players, they themselves are used to further train the network, detect new useful players, and ban the bad ones. A lot depends on the initial training phase, but I've personally seen one community manager turn her community into a self-moderating machine; after a year she didn't even have to do much banning herself. Each message that didn't conform to standards was almost immediately met with a polite response explaining why it was inappropriate and a request not to continue the topic, by the users themselves!
      So yes, these systems do work (at least the good ones), and no, reports do not become your personal moderation/harassment tool; smart people already thought of that.
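A toy version of the scheme the parent describes: per-reporter trust, down-weighting of prolific reporters, and action only when the weighted report total outweighs the target's karma. All numbers are invented; a real system would tune them from data.

```python
# Trust-weighted reporting, a sketch of the scheme described above.
# Invented constants throughout.

def report_weight(reporter):
    """Reporter trust, discounted the more reports this reporter files."""
    return reporter["trust"] / (1 + reporter["reports_filed"] / 50)

def should_ban(target_karma, reports):
    """Ban only when the weighted sum of reports outweighs the target's karma."""
    weighted = sum(report_weight(r) for r in reports)
    return weighted > target_karma

spammer = {"trust": 0.2, "reports_filed": 400}  # low trust, reports everyone
veteran = {"trust": 0.9, "reports_filed": 10}   # accurate, sparing reporter

# 20 spam reports barely register against a player with modest karma...
print(should_ban(5.0, [spammer] * 20))  # False
# ...while a handful of trusted reports can cross the threshold.
print(should_ban(1.0, [veteran] * 3))   # True
```

This captures the two protections in the comment: rage-reporters dilute their own weight by volume, and a player in good standing accumulates karma that a few bad-faith reports cannot overcome.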
      • Re:Bullying (Score:5, Interesting)

        by Swistak ( 899225 ) on Thursday March 27, 2014 @07:21AM (#46591419) Homepage
        I'd like to extend the above answer a little. The systems in games like Smite and LoL have actually gotten so good that the number of false positives is so low they are almost non-existent and can be handled thoroughly on a case-by-case basis. I play Smite a lot in my free time, and I see how the system works from the outside. I cannot count how many times I was threatened with being reported, and even if only half of those threats were followed through, I have probably earned over 100 "intentional feeding" reports by now, and I'm still playing without even one temporary ban. At the same time, I've seen a number of players disappear from the leaderboard after I reported them for harassment (there was actual harassment: mother-calling, even death threats). It didn't happen right after my report, but a few days and a few more matches later, all of the haters sooner or later got a permaban.

        So reputation systems have come a long, long way from where they used to be; false positives are no longer a big problem. The biggest issue now is reaction time (the time between a player starting to spew vitriol and the moment he's prevented from playing). Ideally it should not be a few days (as it is now in most cases); someone having a bad day shouldn't mean a bad day for every person he's teamed up with.

        One of the solutions might be "incremental" banning, by disabling some features, which some games already do (and Microsoft is doing in this case). One of the better examples is voice chat muting; I cannot recall which game is doing it. The way it works is that the more people mute an asshole, the more likely he is to start muted in the first place. His teammates might decide to unmute him, but there's no longer the risk of "Better not fuck up, morons, I need this win" welcoming you to the match.

        I'm looking forward to further advancements in these systems, as playing team games on the internet is still quite annoying these days, especially since you often get matched with people who don't speak English and/or whom you cannot just smack for being an idiot like you would if you played football together.
        • especially since you often get matched with people who don't speak English

          That's not a "griefing" issue as much as "matchmaking UI needs a language preference" issue.

      • by Thruen ( 753567 )
        As a former Live subscriber, I don't believe Microsoft has taken such care with their system. The reputation system already existed before; it wasn't as obvious and didn't penalize you as much, but it existed. I was a big fan of the Halo series; I played them online until maybe a year ago. I wasn't always the best player, but I wasn't the worst either. I played to enjoy it; I wasn't running around trolling and talking smack (save a couple excited moments), and I didn't leave games if I could avoid it. When I
        • Uh, are you sure it wasn't 80% of the feedback you got? Was feedback obligatory?

          • by Thruen ( 753567 )
            No, feedback is not obligatory; it's not even immediately apparent that you can do it without looking for it. It'd be nice if you knew anything about the way Live works before commenting like that; Google can help. As for actually receiving 80% negative feedback, while I can't positively say so without polling everyone I'd ever played with, it doesn't seem reasonable to think that roughly twelve of the fifteen other players in every single game I'd ever played took the time to single out an average player who do
      • compared to forums (Score:5, Interesting)

        by kevlar_rat ( 995996 ) on Thursday March 27, 2014 @08:41AM (#46591801) Homepage Journal
        This is fascinating. I run a website [squte.com] that applies a user reputation system to Usenet, a medium notorious for flame wars (it's where the words 'troll' and 'flame' come from, after all), so I'm aware of some of the theory, but it seems games have gone further than forums.
        The algorithm I use is much simpler: the 'trust' metric is identical to the user's karma, presuming that users who act sensibly will also moderate sensibly. It works very well and filters out >95% of flames and trolls.
        To those who ask how to stop reporting being abused, it's actually simple:
        * Weight reports by the number of reports. If a user only reports one other person per thousand, their reports carry more weight than if they report every other user.
        * As you said, have a 'trust' factor that weights the reports. In the case of my site, this is just their karma score: if they get reported a lot as an arse, they are more likely to be an arse in the way they themselves report.
        * Make reporting really easy. The more data you have from legit users, the more your algorithm has to work with.
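The first two bullets can be combined into a single per-report weight. The figures below are invented for illustration, not the site's actual algorithm.

```python
# Sketch of the two weighting rules above: a report counts for more when the
# reporter reports rarely, scaled by the reporter's karma standing in for trust.

def report_weight(reporter):
    """Weight one report by the reporter's selectivity and karma."""
    rate = reporter["reports_made"] / max(reporter["interactions"], 1)
    selectivity = 1 - min(rate * 100, 1)   # reporting everyone -> ~0 weight
    return reporter["karma"] * selectivity

careful = {"karma": 0.8, "reports_made": 1, "interactions": 1000}
trigger_happy = {"karma": 0.8, "reports_made": 300, "interactions": 1000}

print(round(report_weight(careful), 2))        # 0.72, one report per thousand
print(round(report_weight(trigger_happy), 2))  # 0.0, reports every third user
```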
        • by Swistak ( 899225 )
          I talked with her a lot about this, and she mentioned that while the correlation between good users and good moderators is quite high, there's a large number of users she calls "cryptohaters" and I call hypocrites: in public they'll advocate peace and understanding, but given an anonymous medium like down/up votes, or power (like mod rights), they will hate, downvote, and silence their opponents with post removals etc. That's why I think a separate "trust" metric makes sense.
      • LoL actually has a tribunal that rewards players for taking their time to handle cases (1 minute per case at most). The judges are also validated against other already validated judges so it's easy to keep the judges competent.

        What Xbox is doing is much simpler and in volume will work.
        ++---+++--- => this player gets a yellow color
        -----------++- => this player gets a red color
        etc.

        Simple and effective if you ask me. If everybody starts reporting people who won when they lost then I say turn the internet o
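The tally idea maps directly to code. The colour thresholds below are invented, but the two example strings are the parent's own, and they land on the colours the parent assigns them.

```python
# Classify a player's recent feedback string by the fraction of negative marks.
# Thresholds are invented for illustration.

def reputation_color(feedback):
    """feedback: string of '+' (positive) and '-' (negative) marks."""
    if not feedback:
        return "green"
    negative = feedback.count("-") / len(feedback)
    if negative < 0.3:
        return "green"
    if negative < 0.7:
        return "yellow"
    return "red"

print(reputation_color("++---+++---"))     # yellow (6 of 11 marks negative)
print(reputation_color("-----------++-"))  # red (12 of 14 marks negative)
```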

    • It's not bullying, it's just the meta-game.
    • Re:Bullying (Score:5, Interesting)

      by pehrs ( 690959 ) on Thursday March 27, 2014 @07:04AM (#46591361)
      Having been involved in the design of a similar system a few years back, I found this remarkably easy to handle.

      What you do is cluster people based on their opinions, and add a fading of old opinions. People who share a good opinion of each other are in the same cluster; people who dislike each other are in different clusters. So, what happens in the end is that the "nice" people end up in a few big "nice people" clusters, and you get lots of small clusters of jerks. In the system we designed we actually provided individualised feedback to the users, as in "From the perspective of your cluster, this person has good/neutral/bad standing". In practice it didn't take long before people with good behaviour were efficiently separated from the rest.

      Giving bad score to lots of people needlessly quickly gets you kicked out of the "good people" cluster. Congratulations, you now get to play with the rest of bullies.

      Of course, this is just basic computer science and statistics...
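One possible reading of that clustering scheme, as a sketch: link players whose mutual opinions are still positive after old opinions fade, then take connected components with a union-find. The data, half-life, and cutoff are invented, and the real system was presumably more elaborate.

```python
# Cluster players by mutual positive opinion, with old opinions faded out.
# Invented data and constants; a sketch only.

def faded(opinion, age_weeks, half_life=8):
    """Exponentially decay an opinion score with its age."""
    return opinion * 0.5 ** (age_weeks / half_life)

def clusters(players, opinions):
    """opinions: {(a, b): (score, age_weeks)}; mutual positive links merge."""
    parent = {p: p for p in players}

    def find(p):                      # union-find with path halving
        while parent[p] != p:
            parent[p] = parent[parent[p]]
            p = parent[p]
        return p

    for a in players:
        for b in players:
            ab, ba = opinions.get((a, b)), opinions.get((b, a))
            if ab and ba and faded(*ab) > 0.2 and faded(*ba) > 0.2:
                parent[find(a)] = find(b)   # both still like each other: merge

    groups = {}
    for p in players:
        groups.setdefault(find(p), set()).add(p)
    return sorted(map(frozenset, groups.values()), key=len, reverse=True)

players = ["ann", "bob", "cat", "jerk"]
opinions = {
    ("ann", "bob"): (1.0, 1), ("bob", "ann"): (1.0, 1),
    ("bob", "cat"): (0.8, 2), ("cat", "bob"): (0.9, 2),
    ("ann", "jerk"): (-1.0, 1),   # one-sided dislike, no link formed
    ("jerk", "ann"): (1.0, 40),   # old flattery has faded below the cutoff
}
print(clusters(players, opinions))  # one nice-people cluster, jerk alone
```

Requiring the link to be mutual is what makes the scheme resistant to report-bombing: a jerk rating a good player badly only severs his own link to the nice cluster.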
    • by Chas ( 5144 )

      Exactly.

      These type of rep systems exist already.

      Pretty much NONE of them work as intended, and devolve into griefing tools.

    • MeowMeowBeenz [wikipedia.org] now available on Xbox One.
    • If only there was a website that had a system whereby the users would self moderate and allow the more Interesting, Insightful or Funny comments to be more visible.
    • RIOT has successfully implemented a penalty system in League of Legends. As far as I'm concerned it is the most respectful gaming environment I've ever seen. I'm very happy this is being implemented. If a few people who don't like you report you it won't matter as many won't rate you or will rate you as a good teammate.

    • This is so true.

      My brother is VERY good at Counter-Strike-type games. Just a natural, I guess. Headshotting mofos all day, finding choke points and headshotting entire teams as they run out, etc. And no, not camping, as your score tends to suck if you camp.

      He stopped playing CS. Why? Because he kept getting kicked and then banned for "cheating", which was a code word for being too good. (I know for a fact he was not, because I watched him a few times.)

      This will be no different.

      Basically MS, in their infinite wisdom,
    • Use Slashdot's moderation and meta-moderation style system. One irate idiot cannot negatively impact the score of a post or a user's reputation. Multiple people need to report the same thing for a score to hold; then the meta-moderators determine whether the score assigned to a post was justified.

      I assume Microsoft won't allow a single person's review of another user to hold much weight until multiple users are reporting the same thing. Likewise, I assume that users with a good reputation down voting a bad

    • That is trivial to filter out and is already something implemented.

      A few bad marks don't really do that much damage to your reputation especially from a small group of people.

      If EVERYONE you play with gives you a bad rating, then you end up with a shitty rep.

      Dealing with karma whores is trivial, and that's all this is.

  • They just added karma to xbox accounts?

    • Xbox One Colourful Karma.

    • There's been a "karma" system for Xbox Live accounts pretty much since the launch of the 360. You look at somebody's gamer card and they have a star rating out of 5 clearly viewable. The change here is that, for the first time, they're making it have actual consequences.

      A lot of the posts in this thread so far are about the potential for abuse. I've played on Xbox Live on and off since the days of the original Xbox and have seen the old "consequence free" system in operation for a while. By and large, my

    • "They just added karma to xbox accounts?"

      Yes. If thieves and rogues steal too much from Paladins, they get a record.

  • by captainpanic ( 1173915 ) on Thursday March 27, 2014 @05:38AM (#46591151)

    There is absolutely no way that soar losers will totally abuse this.

    Also, there is no way that people will get upset buying an expensive gaming system, and subsequently being unable to play with the 'green' accounts because of some highly subjective moderation system.

    • by captainpanic ( 1173915 ) on Thursday March 27, 2014 @05:41AM (#46591159)

      Also: I get the feeling that European English speaking people swear a lot more than in the USA, and I wonder if this will be reflected in the moderation.

      • Also: I get the feeling that European English speaking people swear a lot more than in the USA, and I wonder if this will be reflected in the moderation.

        I too %*&!#$! wonder if this will be *(@&#$&%@ reflected in the @$&!%(#!%$&! moderation.

      • Re: (Score:3, Funny)

        by Anonymous Coward

        I'm guessing it stems from the absence of bleeps whenever someone on TV refers to a body part or a vaguely defined swear word, Europeans just aren't that afraid of bodily functions or the full range of the language. How's that for grand sweeping generalization?

      • Really? The Brits use "bollocks, bugger, bloody" vs the US using "bullshit, fuck, fucking." And what, exactly, is a swear word? Which do you rate worse, cocksucker or knob-gobbler?
        • Which do you rate worse, cocksucker or knob-gobbler?

          Obviously, I live a sheltered life, since I have never heard "knob-gobbler" before...

          That said, knob-gobbler is too funny to be a swear word, and I think everyone should use it instead of cocksucker.

          • Obviously, I live a sheltered life, since I have never heard "knob-gobbler" before...

            Yes, yes you do ... I think I've known of that one for at least 30 years.

            That said, knob-gobbler is too funny to be a swear word, and I think everyone should use it instead of cocksucker.

            And Tits [segall.net], wow. Tits doesn't even belong on the list, you know. It's such a friendly sounding word.

      • by AmiMoJo ( 196126 ) * on Thursday March 27, 2014 @08:46AM (#46591847) Homepage Journal

        I was playing Phantasy Star years ago and chatting away, eating crisps (potato chips in American). Someone asked me what all the noise was and I said I was masticating. The guy went apeshit, ranting on about how children play the game and so forth. I had to call him a wanker and mute him after that.

      • by mjwx ( 966435 )

        Also: I get the feeling that European English speaking people swear a lot more than in the USA, and I wonder if this will be reflected in the moderation.

        In my experience, they swear less.

        But they are a lot more creative with their insults than Americans.

    • I would expect it wouldn't rate every complaint equally.
      If you account for factors such as the number of complaints from a user versus their score, and you find a trend of complaints when they were beaten versus complaints when they played well, you can identify the bad-faith users and weight their complaints accordingly.
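That correlation check might be sketched like this: discount complaints from users whose complaints only ever follow a loss, since those track frustration rather than misbehaviour. The data and weighting rule are invented for illustration.

```python
# Weight a user's complaints down when they correlate with losing.
# Invented data throughout; a sketch only.

def complaint_weight(history):
    """history: list of (complained, lost_that_game) pairs for one user."""
    losses_when_complaining = [lost for complained, lost in history if complained]
    if not losses_when_complaining:
        return 1.0
    loss_rate = sum(losses_when_complaining) / len(losses_when_complaining)
    return 1.0 - loss_rate   # complains only when beaten -> weight 0

sore_loser = [(True, True), (True, True), (False, False), (True, True)]
balanced   = [(True, True), (True, False), (False, True), (False, False)]

print(complaint_weight(sore_loser))  # 0.0
print(complaint_weight(balanced))    # 0.5
```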

    • by korbulon ( 2792438 ) on Thursday March 27, 2014 @07:23AM (#46591425)
      Soar losers won't give a flying fuck.
    • They work. There's plenty of proof of them working in games such as League of Legends. I've seen the progression of their system. I used to have to deal with a dick every second game, now I'm lucky if I get one every 100 games. I'd say the system works perfectly.

      As for it being subjective, I disagree. If people don't want to play with you, there's nothing subjective about that. Go play against the AI if you're too much of a dick to be reasonable when gaming.

  • You mean I can't join a game of Counter Strike: Source, team flash someone for 27 rounds until an admin shows up, then quit, change my name, rejoin and repeat? What's the point of playing then?
  • Good players will get ganged up on by the fricking kiddies and smeared. It's why I refuse to play any public multiplayer stuff anymore.
    The last time I did any of that was back when Modern Warfare was released. Two friends of mine and I were utterly owning maps by using real tactics; all three of us were hit with complaints by the kiddies who want to be tubers or campers.

    I only do private games with friends anymore, tired of the utter scum that is the public gaming crowd on Xbox.

  • It's about time... (Score:5, Insightful)

    by egarland ( 120202 ) on Thursday March 27, 2014 @08:04AM (#46591593)

    XBox has long been known as the most potent example of the Greater Internet Fuckwad Theory [penny-arcade.com]. Adding a bit of accountability for being a horrible person is overdue.

  • When I read the headline, I was kind of hoping the XBone reputation system was going to give little electroshocks to kids when they act out in front of company, pick on their sister or don't lift the toilet seat.

  • by EXTomar ( 78739 ) on Thursday March 27, 2014 @09:39AM (#46592177)

    It is all well and good to give users controls over their communication features, but trying to enforce a reputation system like this is just another tool for bad guys to behave like bad guys. If a group of 4 bullies wants to make someone's day miserable, they form up, join a game, and focus on one player using all the tools available, and a reputation system like this is just the thing they need: one player getting 4 warnings looks more serious than 4 different players each getting a warning from one player.

    What they and successful systems do instead is establish a "trust relation". If you are matched in a team with some complete stranger, then neither of you have "trust" and neither should do "trusted" actions with each other. If you form a party, you automatically trust them more than a stranger and access more "trusted" features. If another player is in your "friends" list and formed a party with you then you have a high level of "trust" with that player and should be allowed a lot of "trusted" features with them.

    There do need to be moderation tools, and they should be as automatic as possible, but "reputation" systems seem to be built upon a flawed premise: that complete strangers can judge each other fairly, when it turns out there is little reason to trust what either of them has to say about the other.

    • First, if this is going to be at all successful, the implementers are going to watch and see what happens with the system in real-world use. Unless they happened on the perfect system right out of the gate, there will be room for improvement. Second, they may have more than one category, not visible to the user. Maybe "complainers" (people who always report the people who kill them) and "bullies" (campers and griefers) will both get a red rating over time. (Yes, the guy who responds to everyone who's better

  • Should have saved this for April 1st to go with "Dice holdings apologizes for beta and promises to deploy IPv6"

    "The system also adjusts for false reports from people that might intentionally report someone of greater skill or for other griefing purposes."

    Well then nothing to worry about. I suppose this system also implements RFC3514 on every game packet to ensure fair play.

  • Whoever designed this system has apparently never been on the internet or played a game. I'm amazing at MW3. According to everyone I play with, I'm clearly cheating. In reality, I am not. This system is bullshit.
  • People are more likely to report problem players than the gamers who quietly participate and cause no problems, and simple disputes can cause negative reports. However, I would like to see some crowdsourced attempts in other games to flag players for typical trolling: racism, griefing, TKing, and suspected hacking. Admins should *definitely* have the ability to flag people as known problem children, and I'd love to see these flags pop up on player names when I log in.

    In games like Battlefield griefers have

  • Do you remember the shit you said as a kid? Kids are now interacting with adults. The physical bodies that would face threats in the real world if such things were said have been removed, replaced by the appearance of grizzled soldiers. This removes the instinctive tolerance we have towards the biting, pulling, poking and verbal abuse of our young, enabling their already vitriolic comments to become more so. Combined with the illusion of maturity that comes with swearing like "adults", this adds a multiplier
