A long, long time ago in an MMO far, far away, long before the NGE and other changes, there was a sandbox MMO that boasted an entirely player-based economy. When SWG launched, and during the first couple of years of its life, there was practically no such thing as loot – and what loot there was tended to be either entirely useless or purely decorative (bantha dolls!). Weapons, armor, harvesters, houses, and everything else: if you needed it, someone had to make it. Crafting was big business, involving careful resource location and good experimentation skills, and the better you were at it, the better and therefore the more desirable your product would be. Items decayed over time, in some cases falling apart or becoming useless after a lot of use or neglect.
This degradation of items and the eventual need to replace them is generally called “item decay”, and many players absolutely detest it. However, in the right game with the right kind of economy, I think it’s not merely a useful part of that economy but an essential one. The problem is that the typical implementation of item decay is more a money sink than an actual component of the economy, which makes it at best a tedious routine (get back to home base, repair) and at worst a downright annoyance. For the record, I don’t like that kind of item decay either, especially since in most cases the effect of an item’s condition is nil until said condition reaches 0%, at which point the item suddenly becomes useless until repaired or replaced. Some games implement incremental effects, but that’s really just another layer of pain in the ass. It doesn’t add to anyone’s enjoyment of the game, including my own.
Most MMOs I’ve experienced can be somewhat loosely classed as loot-based or player-based, where items are generated primarily either through looting or through crafting. Most games use a combination of both, but the vast majority of MMOs these days use a loot-based system. (Tokens and other loot-purchase systems also fall under this category.) Star Wars: Galaxies and a few other games, however, use or have used a primarily player-based system, and in those cases item decay – if properly designed and implemented – makes a lot of sense. If everything is crafted by players, you have to design a system for repeat business, especially in the later stages of a game’s life where there aren’t that many new characters being created.
Personally I’m a huge fan of player-based economies, especially in sandbox-style games where that sort of design makes a lot of sense. The problem is that item decay itself still remains a fairly sizeable annoyance; carrying repair kits and spares and whatnot could turn into an inventory nightmare in pre-NGE SWG. That doesn’t, however, invalidate the usefulness of item decay itself: it’s the implementation that may be at fault. One could, for instance, have items with a finite lifespan – the player wouldn’t have to bother with repairing anything, but they would know that their weapon was set to last, say, 10 days’ worth of playtime. While the initial reaction to that might be resoundingly negative, the initial reaction of players to anything new tends to be resoundingly negative anyway, and we usually not only get used to the change but eventually also come to appreciate it.
What item decay shouldn’t be is a giant annoyance, or something that penalizes the absent or terminally absent-minded player. Having to log in to pay maintenance on harvesters – if you didn’t want to see thousands of credits worth of investment go up in flames – was a pain in the backside; but when that system was changed to just shutting down the harvester (coupled with, if memory serves, a much slower decay rate), keeping one’s harvesters alive stopped being a second job and became something that was costly if forgotten, but not excruciatingly painful. That’s how item decay should work.
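The shutdown-instead-of-destruction rule could be sketched in a few lines of code. To be clear, this is a hypothetical illustration of the design principle, not SWG's actual mechanics: the class name, fields, and upkeep numbers are all invented here.

```python
# Sketch of "graceful" decay: an unpaid harvester shuts down instead of
# being destroyed, so a forgetful owner loses output, not investment.
# All names and rates are invented for illustration, not taken from SWG.

from dataclasses import dataclass

@dataclass
class Harvester:
    credits: int          # maintenance funds deposited by the owner
    upkeep_per_tick: int  # cost charged each maintenance tick
    running: bool = True

    def tick(self) -> None:
        """Charge one tick of upkeep; shut down (don't destroy) if unpaid."""
        if self.credits >= self.upkeep_per_tick:
            self.credits -= self.upkeep_per_tick
            self.running = True   # restarts automatically once refunded
        else:
            self.running = False  # costly if forgotten, but recoverable

h = Harvester(credits=25, upkeep_per_tick=10)
for _ in range(4):
    h.tick()
# Two ticks are paid (25 -> 15 -> 5); after that the harvester idles
# with its remaining credits intact instead of going up in flames.
```

The design point is simply that the failure state is reversible: the punishment for forgetting is lost production time, not a destroyed asset.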
In loot-based games, item decay is really just something designed to funnel a little money (multiplied by thousands of players) out of an economy where money is constantly being magically created. In sandbox and player-based games, it serves a more useful function in keeping the economy vital, and it can be implemented without driving everyone crazy in the process. Part of the problem is that it’s difficult to mix loot- and player-based systems – in most MMOs, one is usually very subordinate to the other, and the loot-based systems usually end up winning out.
I suspect that may be a function of popularity. Many players loathe crafting and, by extension, seem to loathe crafters (though from what I’ve seen, gouging is far more common among loot-sellers than it is among most serious crafters). Interestingly, however, games with a strong crafting component and community (EQ2, for instance) tend to suffer less from this trend than games where crafting is a sideline or an afterthought. Strong crafting communities will tend to balance themselves out pretty quickly when it comes to pricing and availability, and there’s almost always a way to find what you need even if you can’t make it yourself.
Tangentially, one might even argue that the need to interact with other people to obtain goods and services tends to strengthen the stickability and community of a given game, although the trend these days seems to be to allow people to do almost everything by themselves. As an often solo player I value being able to do a number of things alone, but I don’t think a little social interaction is too much to ask, especially if the game systems make it easy. In UO and SWG, for instance, players could have vendors – if I can sell my wares in my own shop even when I’m not online, I can build my brand and still keep my customers happy, even the ones who don’t ever want to actually talk to me. (I had a few very loyal customers in SWG who were just like that.)
But you can’t build that kind of long-lasting economy and community if you don’t remove things from the game over time, or you just end up feeding mudflation and the eventual loss of any useful custom. Consumables are one way to ensure repeat business (potions, ammo, etc.), but in and of themselves they can’t be the only way. Loot-based games do the same sort of thing by implementing levels and tiers for items – your stuff won’t fall apart, but it will ultimately become useless as you outlevel it or as you grow past it in terms of the content you’re tackling. It may not be item decay per se, but it serves very much the same purpose, as well as keeping players on that ever-upward item treadmill.
As I said, I prefer sandbox games and I prefer a more up-front version of item replacement. I don’t necessarily want to be constantly looking for the next and better piece of gear, and I’m not entirely certain that mechanic is as popular as designers seem to think it is. There are other ways to ensure player retention. One of them is to create dynamic, community-driven worlds, and if that means my gear or my house suffers damage over time and I eventually have to consider replacing it, then so be it.
I’m also quite sure many of you won’t agree. Feel free to sound off below – I think this is more a case of opinion and context than right or wrong, and I’ve always been interested in reading other players’ reasons for preferring one system or the other. I’m safe; my flame-retardant suit is finally back from the cleaners!