I enjoy discussing the business aspects of the game industry, and I want to share my knowledge with those who might not understand why post launch monetization is an important part of funding video game development. To be clear: there are valid debates to be had regarding monetization strategies, but this article is meant to be educational, not purely a defense of such practices.
Expenses and You!
When accounting for what a game costs to develop and support, developers have two primary categories of cost: Investment Expense & Normal Expense.
Investment expense is, generally speaking, the cost the company incurs to build the game up until the release date. It typically includes all the code, servers, voice acting, tools, and assets used to create the product. It's called investment because the company is claiming the product adds value to the company (it becomes an asset), so the company amortizes the investment over a certain number of years. Here's an example: Company XYZ creates a game called “Call of Battle Heroes”. It costs them approximately 100 million dollars to create “Call of Battle Heroes” over the four years it was in development. Since the company can claim that “Call of Battle Heroes” adds value to the company, there is some special accounting to consider. First, they add 100 million of “value” to the company. Second, they account for the investment expense over a time period (e.g., 10 years). Each year, Company XYZ takes 10 million dollars in expenses to account for the development of “Call of Battle Heroes”. While it is a bit more complex than this, the takeaway is that the impact of investment expense is “softened” by the fact that it adds “value” to the company and is accounted for over a time period. All companies (even your favorite small indie developer) can claim investment and balance the value payoff over a long period of time. (Under GAAP, amortization is a non-cash expense on the income statement.)
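To make the arithmetic concrete, here's a minimal sketch of the straight-line amortization described above, using the article's hypothetical “Call of Battle Heroes” numbers (real GAAP treatment of capitalized development costs is more involved than a single division):

```python
def yearly_amortization(total_investment, years):
    """Annual expense when an investment is amortized straight-line."""
    return total_investment / years

# The article's hypothetical numbers: $100M capitalized, spread over 10 years
investment = 100_000_000
print(yearly_amortization(investment, 10))  # 10000000.0 -> $10M of expense per year
```

The point is simply that the $100M hit never lands in a single year; each year's income statement only carries the $10M slice.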
If this is confusing, it’s because it’s not intuitive, especially if you aren’t familiar with how value and expense are accounted for under GAAP. And in case it wasn’t clear: yes, the company still incurred expenses (paid salaries, bought licenses, bought servers) to build the game. That expense is balanced out by the theoretical value the game added to the company.
Normal expense is the cost the company incurs to support/update/patch the game after it goes live. In my example, it includes any additional code, patches, assets, voice acting, promotions, servers, marketing, etc., needed to support “Call of Battle Heroes” after its launch date. Normal expense must be claimed immediately as an expense on the income statement (normally falling under operating expenses). There are special situations where the company can claim investment again, but for the sake of simplicity today, we’ll consider everything after the game goes “live” to be normal expense. The takeaway here is that the company cannot amortize normal expense by claiming it as “value” over ten years. (E.g., if a developer spends five million dollars to update/patch a game during a financial quarter, they have to report that five million dollars immediately.)
Why Does Expense Type Matter?
In the olden days (before approximately 2010)…
…a game was developed primarily as an investment, then released to the market. Before the days of games being supported as services, post launch support was not a major expense item. (Yes, there were exceptions, but I’m keeping this simple so it doesn’t drag on for days.) Developers would release patches and updates, but for the most part, post launch support was nowhere near as expensive as the original investment. And since the value of the game was being accounted for over 5-10 years, it was easier to “balance the books” and show a healthy profit.
In the modern era of game development…
…the expansion of multiplayer, MMOs, co-op, and live service games has caused post launch expenses to increase many times over. The modern gaming consumer expects their favorite game to be supported properly after launch (this includes single player games as well). Developers & publishers now deal with both the original development investment cost (e.g., 100 million amortized over 10 years) and a ballooning budget to support the game’s post launch content/patches/fixes as normal expense. Post launch support varies depending on the game, but the general rule of thumb is that about 5-20% of the original budget (e.g., 5-20% of the original 100 million) goes to supporting the game each year. Many of these games are scheduled to run for ten years (or more), so there has to be a plan for a positive ROI/ROE (return on investment/expense), year over year.
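The 5-20% rule of thumb can be turned into a quick back-of-the-envelope range (the percentages and the $100M budget are the article's example figures, not data from any real project):

```python
def yearly_support_range(original_budget, low=0.05, high=0.20):
    """Return (low, high) estimates of yearly post-launch support cost,
    using the 5-20%-of-original-budget rule of thumb."""
    return original_budget * low, original_budget * high

low, high = yearly_support_range(100_000_000)
print(f"${low:,.0f} to ${high:,.0f} per year")  # $5,000,000 to $20,000,000 per year
```

Over a planned ten-year lifespan, even the low end of that range adds up to another $50M on top of the original investment.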
Remember, after the game goes live, nearly all money spent to support the game is considered normal expense; it has to be accounted for immediately as an operating expense. In some cases, especially in the AAA industry, the normal expense of operating the game can actually be higher than the amortized per-year investment expense. The general rule of thumb for calculating expenses is to use between $5,000 and $20,000 per month, per employee.
The math is quite simple: to estimate the expense of supporting a game for a year, take [RATE] x [NUMBER OF EMPLOYEES] x 12 months. For example: $8,000 x 75 employees x 12 months = $7.2 million per year. In locations such as Silicon Valley, the rate is always at the top end of the scale ($15-20k). You can do the math yourself, but you likely already realized we are talking about costs that can run upwards of 20-30 million dollars per year to support a AAA game.
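Here's that formula as a tiny sketch. The first call uses the article's own example figures; the second uses a made-up Silicon Valley-sized team purely for illustration:

```python
def yearly_support_cost(monthly_rate_per_employee, num_employees):
    """Estimated yearly expense: [RATE] x [NUMBER OF EMPLOYEES] x 12 months."""
    return monthly_rate_per_employee * num_employees * 12

# The article's example: $8,000/month x 75 employees x 12 months
print(yearly_support_cost(8_000, 75))    # 7200000 -> the $7.2M/year example

# Hypothetical: a larger team at the top of the per-employee range
print(yearly_support_cost(18_000, 150))  # 32400000 -> low-$30M territory
```

Scaling either the rate or the head count quickly lands you in the 20-30+ million dollar range mentioned above.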
Note: Post launch expenses, in some cases, can be just as daunting as the original expense. This is especially true in cases where the “launch” version was only a portion of the game, or the game needs to be rebuilt. If there are 300 people working on a game, and those same people keep working on it after launch, there isn’t much difference between launch and support. Some of these massive game “rebuilds” (e.g., Destiny 2, FFXIV) may get split between investment and normal expense.
This is the part where a lot of people (particularly YouTubers) say something like: “Company XYZ made 20 billion dollars last year, they can afford it.” There is a reason company XYZ made billions of dollars last year: they expense each of their games independently. To make sure that products are being tracked for a profit (or loss), each game has its own budget. The project team that manages and develops a game must work within that budget. Just because the company (whether it be a publisher or independent developer) made 80 bazillion dollars does not mean that profit is then redistributed evenly within the organization. Each game has to “support” itself (usually) and show that it is turning a profit, else what is the point? No game, product, or service can run “in the red” (aka losing money) for longer than a couple of years (or so). This is true in both large and small developers/publishers. No game gets a free ticket to run a negative ROI/ROE unless there is some special reason the company is keeping it afloat. One last point here: some revenue has to be reinvested back into the company to support new projects.
The one exception companies will make when considering a loss is to keep servers & authentication systems live far longer than the game is able to pay for itself with sales. The general thought process here is that taking those servers offline would harm the company financially by tarnishing the brand. But, as we all know, those services are eventually disabled.
I often hear that the company “should just sell additional copies of the game instead of using microtransactions.” Yes, copies of the game will continue to sell over time, but there is an extremely predictable fall-off rate that developers/publishers can model when trying to determine future sales. After the first year of sales, games generally don’t sell enough to pay for themselves if they require sustained expenses (note: GTAV is the exception to everything and should not be used to model game sales). Additionally, as more and more games are bundled into services like Xbox Game Pass and Origin Access, simply measuring game units sold becomes irrelevant. The takeaway here is that developers can no longer count on simply selling “copies” of the game to pay for all the expenses that come with it.
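As an illustration of the fall-off described above, here is a toy exponential-decay model of monthly unit sales. The launch-month figure and the decay rate are invented for demonstration, not industry data; the point is only that the curve is highly modelable:

```python
def monthly_units(launch_month_units, decay_rate, month):
    """Units sold in a given month after launch (month 0 = launch),
    assuming a constant month-over-month fall-off."""
    return launch_month_units * (1 - decay_rate) ** month

launch = 1_000_000   # hypothetical launch-month units
decay = 0.25         # hypothetical 25% month-over-month fall-off

year_one = sum(monthly_units(launch, decay, m) for m in range(12))
year_two = sum(monthly_units(launch, decay, m) for m in range(12, 24))
print(f"Year 1 units: {year_one:,.0f}")
print(f"Year 2 units: {year_two:,.0f}")
```

Under these made-up numbers, year-two sales come out to roughly 3% of year one, which is why a game with sustained yearly expenses can't lean on continued copy sales alone.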
What do microtransactions have to do with any of this?
If “Call of Battle Heroes” has to support itself for 3-10 years, there must be a way to pay for those yearly expenses that does not rely on selling additional licenses / copies of the game.
Developers/publishers have multiple options when it comes to supporting post launch expenses (this is a general list, not meant to be 100% exhaustive):
- Charging for post launch DLC – This was the go-to strategy for many years in the video game industry. In order to support a game post launch, run the servers, etc., developers would directly charge the customer a fee to access the new content. While this option is simple and straightforward, it also has a lot of problems. This method often splits the player base, dividing groups of customers/players into individual buckets that are often not compatible with each other. Additionally, charging for DLC often has the adverse effect of taking people out of the game because they no longer want to pay for new content. Developers are moving away from this strategy because the downsides have proven unpopular in the market. Interestingly, The Witcher 3 (praise Geraldo) used this model quite successfully. I strongly believe there is still room in the market for this type of post launch monetization, in the right game.
- Purchasable cosmetics & optional content – This strategy is basically a hybrid approach which attempts to monetize non-essential chunks of the game. Think of something like Starcraft 2, where both cosmetics and content are purchasable, but are in no way required to experience the core game. Destiny 2 Forsaken also falls into this category where the core of the game is playable, but a person can also purchase cosmetics and a “season pass” for optional content.
- First, let’s talk about the optional cosmetics: With loot boxes falling out of favor with some developers, this appears to be the “next best thing” in terms of recurring monetization. The “problem” with purchasable cosmetics is that they create a spike in incoming cash-flow that falls off once people buy the cosmetic item(s) they want. Cosmetic stores also have to be constantly updated with new skins/costumes/color-palettes/etc. to re-incentivize consumers to spend more money. Look at something like Blizzard’s Heroes of the Storm: In the last year the cosmetic store has been constantly updated in order to generate recurring revenue. [Update: As I write this, Blizzard has reduced support for HoTS.]
- Second, let’s talk optional content: This is a good strategy if the game is built around it. If a developer can create optional content within their game without making part of the player base feel alienated (left out, abandoned), then this monetization strategy can work. The issue here is that it’s often very risky to go down this path. Optional content sometimes requires just as much development time/effort as the non-optional content and can be very costly in and of itself. If the content cannot pay for itself, or the player base feels like the optional content isn’t worth it, this will backfire (monetarily and politically).
- Loot boxes that drop cosmetics only – Loot boxes, every gamer’s favorite term. Let’s make sure we are on the same page with what I mean when I say “loot boxes that drop cosmetics only.” I am referring to any system where the consumer pays real money (or buys a premium currency) to purchase a digital box (pack, crate, bag, whatever) that drops an assortment of cosmetic devices for use within the game. For example, in Overwatch, you can pay real money to buy loot boxes that drop non game-altering content such as skins, voices, and stickers. The content in this category CANNOT affect the mechanical actions of the game or make the player “more powerful.”
- Why would a developer use cosmetic loot boxes, especially when gaming communities are turning against them? For a moment, let’s put the “gambling” aspect aside and just talk about the revenue model. Loot boxes create more recurring & predictable profit generation than simply selling straight purchasable cosmetics. Loot boxes help smooth out the revenue model in a way that creates predictable revenue over time, as opposed to the peaks & valleys of the other monetization models. Since the consumer is not guaranteed the cosmetic they want, they will generally spend more consistently to obtain the cosmetics in the game. Developers can also use “free” loot boxes as a reward loop in game (e.g., play 5 hours, get a free loot box).
- Loot boxes filled with game-altering items – To be clear, this is the type of loot box where the contents directly alter the game, whether that be through more items, cards, characters, or weapons. In this system, players spend real money (or buy a premium currency) to purchase a digital box that contains various items that make the player more powerful (in some way). These power-altering items often come in rarities, with the best items in the game being extremely rare (< 1% of rolls). This type of loot box was popularized by mobile games but can be found in non-mobile games as well. Examples of this system include any & all mobile “gacha” games, FIFA (ultimate team), and Hearthstone (card packs). You may recall that Star Wars Battlefront 2 was originally going to use a loot box, game-altering system, but it was ultimately removed due to consumer backlash.
- If I sit here and debate the legalities and morality of this type of system, this article would never end. Instead, I’m simply going to cover why a developer would choose to use this system.
- This type of monetization is extremely popular in mobile development because mobile games are often free to install. The developer makes no money up front, but instead uses the loot box system to entice players to buy “power” and roll for better items/upgrades/characters. Similar to cosmetic loot boxes, this system is lucrative and creates more predictable revenue over time. Mobile developers are not the only people utilizing such a system. Card games, such as Hearthstone, also use a version of this system to supply cards to the player (similar to buying a pack of Magic cards in “real life”).
- This system is also very controversial because it takes advantage of the willingness of people to buy “power” with money.
- Some other monetization method – Pay to skip, Pay for premium, Pay for special treatment, etc.
- I am not going to spend a lot of time on this section, but generally speaking, this is when a developer strikes some type of hybrid system where the player can pay real money (or use a premium currency) to skip over a waiting period or avoid the “grind.”
If a developer wants a predictable influx of cash to support the project, cosmetic items & cosmetic loot boxes are some of the most effective and least controversial solutions. They don’t affect the mechanics of the game, and thanks to some great data analytics in the industry, it’s relatively easy to predict how much revenue they will generate over a time period (depending on the size of the player base). There are a lot of formulas in the industry now that essentially read: “If your player base is X, you can expect to make Y from microtransactions.” It’s not a perfect science, but it works. (“Ask A Dev” had a great response to a question regarding “whales” and who’s spending all the money.)
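A formula of the “player base X → revenue Y” shape might look like the sketch below. The conversion rate and spend-per-payer figures are hypothetical placeholders, not real industry benchmarks:

```python
def monthly_mtx_revenue(active_players, paying_rate, arppu):
    """Rough monthly microtransaction revenue estimate.

    active_players: monthly active players
    paying_rate:    fraction of players who spend anything
    arppu:          average revenue per paying user, per month
    """
    return active_players * paying_rate * arppu

# Hypothetical: 2M monthly players, 5% of whom spend an average of $12/month
print(f"${monthly_mtx_revenue(2_000_000, 0.05, 12.0):,.0f}/month")  # $1,200,000/month
```

Real models are far more granular (cohorts, retention curves, whale segmentation), but the basic structure is the same: measure the player base, apply historical spend rates, and project forward.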
It’s critical to understand that games today, especially live multiplayer & service based games such as The Division, absolutely must show how they will maintain a profit year over year. Selling additional copies of the game is not enough to cover the expense of supporting the product year over year.
I’m not saying that cosmetics and cosmetic-type loot boxes are the only answer. There is a valid debate to be had over what solution provides the most value and for whom (consumer or company). Most often the answer varies from game to game. I just want you (the reader) to understand why those options have become the go-to solution to support post launch expenses. I’m sure we can all have a healthy debate over the other options. Personally, if a game I enjoy is constantly updated for 3-10 years, and it’s supported by a cosmetics store, I’ll take that option. I’ve had industry employees confirm (directly to me) that live service games live and die by their microtransactions. No microtransactions, no game.
Common misconceptions (and other popular talking points)…
- “Game companies should just be happy with the profits they make.” — This is just not how business works. There is no legal (or moral) limit to how much money a person or company can make. What might seem wrong (or greedy) to you might be how Blizzard supports their next venture. That’s how a business operates – some of the profits have to be reinvested into the company. There is absolutely no good way to determine what is right or wrong in terms of revenue. Is CEO pay too high? I don’t know. Maybe? The CEO’s salary is a drop in the bucket compared to what it costs to run a major AAA developer. Your favorite indie developer who makes the special game you love? They still did it so they can make a profit (and hopefully enough to invest the money back into other games). It is surprisingly easy for a company to burn through billions of dollars. A company that employs hundreds, if not thousands, of people can suddenly find itself underwater (financially) with only a couple of big budget mistakes/failures. You might be surprised to know that many big developers and publishers post two quarters of loss every financial year (to be fair, this is changing with services).
- “This is all a result of exploding budgets and companies wasting money. They wouldn’t need microtransactions if they budgeted better.” — Ehhh. Yes and no. It costs anywhere from $5,000 to $20,000 per person, per month, to run a project (potentially even higher in high cost areas). Even if a developer removes some bloat, uses a ready-made engine, or plans better, there is still no getting around the fact that professional game development is vastly more expensive now than it was in the past, no matter how many YouTube channels try to convince people otherwise. Top tier developers & artists expect a salary over six figures (in the high cost areas), while entry level positions expect to see reasonable starting salaries between $50k-90k. The only practical options to reduce expenses are to make smaller games (something the industry is talking about now), employ fewer people (layoffs), or reduce salaries (no).
- “Microtransactions are bad enough, but even cosmetic loot boxes need to be regulated because they are a form of gambling.” — This is the current popular argument against loot boxes, and it’s probably the one that will elicit the most debate. I’ve seen gaming communities shift their arguments to this talking point since it’s the one that is gaining the most traction with the press and various governmental bodies. This is also the hardest one to have an open & honest debate about, since people will point to different facts and figures (and feelings) to support their argument. I would need to dedicate an entire article to the morality of loot boxes, and I just don’t have the room to do that here. Instead, I will simply go over some facts and personal opinion regarding this method.
- Gambling, legally, is the wrong term to apply to loot boxes (in the United States at least) because something of perceived value is always returned to the buyer. (Before you type up a 5000 word essay about why this argument is flawed, I already understand the arguments about loot boxes being “psychologically akin to gambling” since they contain items of various rarities. “Psychologically akin to gambling” is a loaded phrase and for today, I’d rather get to the rest of the article.)
- Despite many YouTubers and media sites saying so, the UK Gambling Commission did not equate loot boxes to gambling.
- “Think of the children” and gambling regulation arguments don’t hold a lot of water with me, personally, because 1) children should not have the ability to spend money on their parents’ PC/console, and 2) free market societies, generally speaking, don’t regulate commerce as long as the impact is personal, reasonable, and avoidable. (We still allow people to buy alcohol, and its harmful effects far outweigh the risk of people buying Overwatch loot boxes.)
- Cosmetic loot boxes are not just a tool for massive publishers/developers. They can provide a predictable, steady stream of revenue to even the smallest of developers. I don’t want to see that option taken away from them. When the US government passed the Dodd-Frank financial rules, small community banks & credit unions were impacted more than their large counterparts. While Dodd-Frank was an important (and necessary) piece of legislation, large institutions floated on just fine, while small banks struggled to deal with the regulations.
- I can see the potential for loot boxes that provide transactional value outside of the game to be regulated in some manner. There is an argument to be made that those items have real-world value.
I’m absolutely not here to tell you what to think about microtransactions, DLC, and loot boxes. You have to make up your own mind and then apply your hard-earned money where you believe it is best spent. If there is a developer doing something you like, go support them. If you don’t want to purchase loot boxes? Don’t buy them. If you want that fancy new costume for your hero and you believe it’s worth the money, go ahead and buy it. What I want people to take away from this is an understanding of how the industry really works. Building & supporting games is vastly more expensive than it used to be, and truthfully, much of that is because expectations have changed. (Remember when voice acting was rare?)
It’s easy to label the large publishers/developers as simply trying to “nickel & dime” the consumer, but most of the time, that simply isn’t the case. The project teams & leaders behind the game must demonstrate how they are generating a profit (or at least breaking even), and microtransactions are, more often than not, the way to make that happen.
Do some publishers abuse microtransactions? Absolutely, yes. Do some publishers/developers need to feel the sting of the market when they get overly aggressive? Again, yes! Battlefront 2 is a near perfect example where the market reacted negatively to overly aggressive, consumer un-friendly tactics, and EA rightfully paid for it. The solution to overly aggressive monetization tactics already exists – don’t buy products/services you don’t want to support.
In conclusion, I hope this helps everyone understand what drives microtransactions and post launch monetization (outside of a pure profit motive). You don’t have to agree with all of it, and I’m sure your take is different from mine, but we can at least have a respectful conversation about it.
Do you have an opinion on microtransactions, particularly cosmetics, or loot boxes? Let us know in the comments or on Twitter!