Last week it was reported that Geeknet Inc. was being bought out by retailer Hot Topic for $16 a share, or $37 million in cash.
However, we have since learned that the deal was scuttled because Geeknet, ThinkGeek’s parent company, received a better offer from GameStop.
GameStop offered $20 per share, and Hot Topic walked away. GameStop’s $20-per-share offer also includes $37 million in cash and comes to a total valuation of $140 million.
Geeknet must pay Hot Topic a three percent “break-up fee,” which GameStop has agreed to reimburse.
For customers, this means ThinkGeek merchandise will be available to pick up in GameStop stores.
The press release also mentions the potential of offering GameStop PowerUp Rewards members “exclusive, unique and cutting edge merchandise related to their favorite entertainment.”
The deal should be concluded by the end of GameStop’s second financial quarter of 2015, which closes in August.
For a while now, people have been wondering what the next Wii would be called, with the smart money on the number 2. However, it seems that the new console, dubbed the Nintendo NX, has a few surprises under the bonnet.
According to Nikkei, Nintendo is planning an Android-based console so that game developers will be able to port their games over with relative ease.
This could also indicate that games developed for the Nintendo NX would extend to other Android-powered devices, such as smartphones and tablets, and that those devices would play nicely with the console.
Game developers have been abandoning the Wii U in droves, so this might actually help Nintendo get back into the race.
Android-powered consoles have appeared before, but they died horribly in the marketplace.
There’s something genuinely surreal about sitting down to write an article about region locking in 2015. It feels archaic and almost nostalgic; I might as well be writing something about blowing into cartridge ports to get games to work, or bemoaning the long load times for cassettes. Yet here we are. Years into the era of digital distribution, long after we reached the point where it became technically harder to prevent customers from accessing games from anywhere in the world than it is to permit the same, region locking is back in the news. Thanks, Nintendo.
The focus of this week’s headlines is the Humble Bundle promotion which Nintendo is running for a number of indie titles on 3DS and Wii U. It’s a great deal for some excellent games and is raising money for a solid cause; plus it’s wonderful to see console platform holders engaging with the Humble Bundle approach, which has been so successful at bringing indie games (and other creative works) to wider audiences on the PC. It ought to be a win, win, win for Nintendo, gamers and indie developers alike.
Unfortunately, though, the bundle only works in the Americas; North America and some bits of Central and South America. Customers elsewhere are entirely locked out, a matter which has been a source of deep frustration not only to those customers, but also seemingly to Nintendo’s own staff working on the project. The result is that what ought to have been a straightforward PR win for the company has turned bittersweet; there has been more widespread news coverage of the region locking debacle in the past few days than there has been for the bundle itself.
Although this is a terrible shame for the developers involved – and I sincerely hope that Nintendo can pull its thumb out of its backside and launch an international version of the bundle in short order – no sympathy is due to Nintendo in this situation. It’s a problem entirely of the company’s own making; the firm made a deliberate and conscious decision to embrace region locking even as the internationalisation of digital distribution made that look increasingly ridiculous, and until that stubbornly backwards piece of decision making is reversed, it’s going to continue causing PR problems for the firm, not to mention genuine problems for its most devoted customers.
Remember, after all, that the rest of the gaming world has ditched region locking en masse – Sony gave it up with the PS3, even making it painless to use digital content from different regions by creating multiple accounts on the same console, while Microsoft made region locking optional on Xbox 360 (making a bit of a mess where some publishers enforced it and others didn’t) before ditching it entirely on the Xbox One. At the same time Nintendo, ever the merry contrarians, went the opposite direction, not only maintaining region locking on the Wii and Wii U, but even extending it to the 3DS – in contrast to the company’s prior handheld consoles, which had been region free.
The idiocy of a region locked handheld is staggering; these are systems which are quite simply at their best when you’re traveling, yet lo and behold, Nintendo don’t want you to buy any games if you go on holiday or on a business trip. The excuses trotted out were mealy-mouthed corporate dishonesty from start to finish; it was all about protecting customers, honest, and respecting local customs and laws. Utter tosh. Had those things been a genuine issue, they would have been an issue in the previous decades when Nintendo managed to sell handheld consoles without region locking; they would also have been an issue for Sony and Microsoft when they removed region locking from their systems.
In truth, there’s only one reason for region locking in this day and age – price control – and Nintendo’s calculation must have been that they had more to lose from the possibility, real or imagined, of people buying cheaper 3DS games from countries overseas, than they had to lose from annoying a chunk of their customer base, be they keen gamers who wanted to try out titles unlikely to be released in their regions, expats who want to play games brought from their home countries or parents who find that a game bought in the airport on the way home from holiday results not in a pacified, happy child on the flight but in an angry, upset child with a game that won’t work.
In Nintendo’s defence, Satoru Iwata has recently been musing publicly about dropping region locking from the Nintendo NX, whenever that turns up. That the company is clearly planning to move down that path does rather confirm that it’s been fibbing about its motivations for region locking all along, of course, which might be why Iwata is being cautious in his statements; it’s a shame if such face-saving is the reason for Nintendo failing to keep up with industry moves in this regard, because the company is going to keep being periodically beaten with this stick until the problem is fixed.
Admittedly, there would be problems with removing region locking from its existing consoles – not least that Nintendo’s agreements with publishers probably guarantee the region locking system, so even if it could be patched out of the 3DS and Wii U with a software update, that can’t happen legally due to the contracts it would breach. What Nintendo could and should do, however, is to offer gamers a gesture of good faith on the matter by dropping region locking from all its first-party software from now on – and perhaps emulating Xbox 360 era Microsoft by making it optional for third-party publishers as well. I can envisage no legal barrier to that approach; it would earn the company enormous kudos for responding to its audience and dealing with the problem, and would cost them precisely nothing. There aren’t that many easy PR wins floating around the industry right now; Nintendo should leap on this chance to show itself to be on the customers’ side.
Wheels turn slowly in Kyoto, though, and it’s probably too much to expect the company to react in a startup-like way to the region locking issue. In some ways it’s Nintendo’s strength that it reacts slowly and thoughtfully rather than jumping on every bandwagon, but in recent years, it’s also been a weakness far too many times – and the thoroughly wonderful software that the company has been turning out in the past few years, perhaps the finest line-up it’s produced in decades, has been regularly undermined by bad decisions in marketing and positioning of its platforms, many of which can be traced to a failure to understand where the market is and where it’s moving.
Region locking isn’t the biggest problem. Fixing it would be cheap and easy but would hardly be a panacea for Nintendo’s issues – but it’s a problem that’s symptomatic, emblematic even, of the broader problems Nintendo has with putting its customers first and applying the same care and attention to its corporate aspects which it always applies to its software development. Fix a problem like this in a proactive, rapid way, and we might all start to believe that the company has what it takes to get back on top.
At Sony’s 2015 Investor Relations Day today, Sony Computer Entertainment president and global CEO Andrew House detailed the company’s strategy for the coming year, including how it will address some shortcomings.
House began his presentation on a positive note, talking up PlayStation 4 as “the fastest selling hardware platform in our history,” showing better-than expected growth and pushing PlayStation Plus subscriptions to twice what they were in fiscal year 2013. He said the company has a competitive advantage for the moment, and laid out three ways it hopes to maintain that. In addition to next year’s launch of the Project Morpheus virtual reality headset and continued cost reduction efforts, House said the company needs quality software.
“We are working very hard to continue very strong support from third-party pubs and devs,” House said. “Our first-party lineup is a little sparse this year, so I think this places even greater emphasis on getting good third-party support.”
That doesn’t necessarily mean exclusive third-party support. To date, House said Sony has been primarily trying to get multiplatform developers to simply take advantage of features the PS4 has over the competition, like SharePlay, or maybe include extra content in the PS4 version or give players early access to add-on content. Third-party exclusives are still an option, just not a frequently used one.
“I will admit that these are, in the current publishing landscape, few and far between, but we were able to announce a full exclusive around a franchise like Street Fighter so that Street Fighter 5 is a complete exclusive for PlayStation 4,” House said, adding, “Although given publishing dynamics and development costs, those are increasingly difficult to secure.”
House also talked about the decline in Sony’s other platforms. As much as the PS4’s growth has exceeded expectations, so too has the PlayStation 3’s decline. House said the system’s price simply isn’t as competitive in the market as the PlayStation 2 and PSone were after their successors launched, and added that the shift toward more connected console experiences has also made less capable offerings less attractive.
House also cast a dim view of the company’s handheld business. While he noted that the Vita platform remains “strong and vibrant” in Asia and Japan, his outlook for the current fiscal year included declines in the US and Europe. Additionally, he referred to the PlayStation Vita and its microconsole counterpart the PlayStation TV as “legacy platforms” when discussing a write-off of hardware components for the two.
“I would characterize 2015 as the beginning of a harvest period for the PlayStation 4 platform,” House said. “The beginning of a harvest period. That being said, we are also undertaking to invest in the future, and 2015 will also be a year of investment.”
That investment will be focused on a few areas. There’s the Morpheus, of course, as well as continued spend on original PlayStation entertainment content like the TV show Powers (which was recently greenlit for a second season). On top of that, House said Sony would be investing in the expansion of its PlayStation Vue television streaming platform and a continued re-architecture of its PlayStation Network with an eye toward increasing stability and reducing maintenance downtime.
Hackers in Brazil have discovered a new exploit for the PS4 that enables them to bypass the DRM on games and other software.
A couple of weeks ago, a number of electronics stores in Brazil were advertising the means to copy and run ripped retail games on the console.
Little was known about the hack at the time, but information gradually began to trickle out from customers and make its way around the web. Please see below for commentary from Lancope.
Gavin Reid, VP of threat intelligence at Lancope, said that Sony was engaged in an arms race against groups that benefit from the ability to copy and share games.
The hack originates from a Russian website and has been brought to the public by Brazilian retailers. It isn’t really a jailbreak for the PS4, nor is it a homebrew technique.
The hackers took a retail PS4 with several games installed and dumped its entire game database and operating system (including the NAND/BIOS), then transferred that image onto another PS4 via a Raspberry Pi.
The entire process costs about $100 to $150 for 10 games, plus $15 per additional game.
“Open source groups like the homebrew community, with more altruistic motivations of extending the functionality of the console, sit alongside groups selling modified consoles specifically to play copied games and, of course, the resale of the games themselves at a fraction of the actual cost. This has happened historically with all of the major consoles. It would be highly unlikely not to continue with the PS4,” he said.
The deal that helped Crytek recover from its recent financial difficulties was with Amazon, according to a report from Kotaku.
The online retail giant signed a licensing deal for CryEngine, Crytek’s proprietary game engine. Sources within the company put the deal’s value at between $50 million and $70 million, and suggested that Amazon may be using it as the bedrock for a proprietary engine of its own.
However Amazon uses the technology, the importance of the deal for Crytek cannot be overstated. Last summer, it became apparent that all was not well at the German developer. Employees hadn’t been fully paid in months, leading to an alleged staff walkout at its UK office, where a sequel to Homefront was in development. Koch Media acquired the Homefront IP and its team shortly afterwards.
When the company’s management eventually addressed the rumors, it had already secured the financing necessary to take the company forward. No details of the deal were offered, but it’s very likely that Crytek got the money it needed from Amazon.
We have contacted Crytek to confirm the details, but it certainly fits with the perception that Amazon could emerge as a major creator of game content. It has snapped up some elite talent to do just that, it acquired Twitch for a huge sum of money, and it has been very open about where it plans to fit into the overall market.
Nintendo has formed a comprehensive new alliance with DeNA that will make every one of the company’s famous IPs available for mobile development.
The bedrock of the deal is a dual stock purchase, with each company buying ¥22 billion ($181 million) of the other’s treasury shares. That’s equivalent to 10 per cent of DeNA’s stock, and 1.24 per cent of Nintendo. The payments will complete on April 2, 2015.
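As a back-of-the-envelope check, the reported percentages can be inverted to estimate what the cross-shareholding implies about each company’s size (these are derived estimates from the figures above, not official valuations):

```python
# Rough implied valuations from the reported cross-shareholding.
# Inputs come from the article; everything derived is an estimate.
investment_yen = 22e9    # each company buys ¥22bn of the other's treasury shares
dena_stake = 0.10        # that buys 10% of DeNA...
nintendo_stake = 0.0124  # ...but only 1.24% of Nintendo

implied_dena_value = investment_yen / dena_stake          # about ¥220bn
implied_nintendo_value = investment_yen / nintendo_stake  # about ¥1.77tn

print(f"Implied DeNA value:     ¥{implied_dena_value / 1e9:.0f}bn")
print(f"Implied Nintendo value: ¥{implied_nintendo_value / 1e12:.2f}tn")
```

The same ¥22 billion buys very different slices of each firm, which underlines how much larger Nintendo is than its new partner, roughly eight times over on these numbers.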
What this will ultimately mean for the consumer is Nintendo IP on mobile, “extending Nintendo’s reach into the vast market of smart device users worldwide.” There will be no ports of existing Nintendo games, according to information released today, but, “all Nintendo IP will be eligible for development and exploration by the alliance.” That includes the “iconic characters” that the company has guarded for so long.
No details on the business model that these games and apps will be released under were offered, though Nintendo may well be reluctant to adopt free-to-play at first. The information provided to the press emphasised the “premium” experiences Nintendo currently offers on platforms like Wii U and 3DS. Admittedly, that could be interpreted in either direction.
However, Nintendo and DeNA are planning an online membership service that will span Nintendo consoles, PC and smart devices. That will launch in the autumn this year.
This marks a significant change in strategy for Nintendo, which has been the subject of reports about plans to take its famous IPs to mobile for at least a year. Indeed, the company has denied the suggestion on several occasions, even as it indicated that it did have plans to make mobile a part of its core strategy in other ways.
Analysts have been offering their reflections on the deal, with the response from most being largely positive.
“Nintendo’s decision to partner with DeNA is a recognition of the importance of the games app audience to the future of its business,” said IHS head of gaming Piers Harding-Rolls. “Not only is there significant revenue to be made directly from smartphone and tablet consumers for Nintendo, app ecosystems are also very important in reaching new customers to make them aware of the Nintendo brand and to drive a new and broader audience to its dedicated console business. Last year IHS data shows that games apps were worth $26 billion in consumer spending globally, with handheld console games worth only 13 per cent of that total at $3.3 billion.
“The Nintendo-DeNA alliance is a good fit and offers up a number of important synergies for two companies that are no longer leaders in their respective segments.
“DeNA remains one of the leading mobile games companies in Japan and, we believe, shares cultural similarities with Nintendo, especially across its most popular big-brand content. The alliance gives Nintendo access to a large audience in its home market, which remains very important to its overall financial performance. Japanese consumers spend significantly more per capita on mobile games than in any other country and it remains the biggest market for both smartphone and handheld gaming. While the partnership gives Nintendo immediate potential to grow its domestic revenues through this audience, gaining access to DeNA’s mobile expertise is important too to realise this potential.
“This alliance makes commercial sense on many levels – the main challenge will be knitting together the cultures of both companies and aligning the speed of development and iteration that is needed in the mobile space with Nintendo’s more patient and systematic approach to games content production. How the new games are monetised may also provide a challenge considering the general differences in models used in retail for Nintendo and through in-app purchases for DeNA.”
In a livestreamed press conference regarding the DeNA deal, Nintendo’s Satoru Iwata reassured those in attendance that the company was still committed to “dedicated video game systems” as its core business. To do that, he confirmed that the company was working on a new console, codenamed “NX”.
“As proof that Nintendo maintains strong enthusiasm for the dedicated game system business let me confirm that Nintendo is currently developing a dedicated game platform with a brand new concept under the development codename NX,” he said.
“It is too early to elaborate on the details of this project but we hope to share more information with you next year.”
There’s not a lot to argue with in the consensus view that Valve had the biggest and most exciting announcement of GDC this year, in the form of the Vive VR headset it’s producing with hardware partner HTC. It may not be the ultimate “winner” of the battle between VR technologies, but it’s done more than most to push the whole field forwards – and it clearly sparked the imaginations of both developers and media in San Francisco earlier this month. Few of those who attended GDC seem particularly keen to talk about anything other than Vive.
From Valve’s perspective, that might be just as well – the incredibly strong buzz around Vive meant that it eclipsed Valve’s other hardware-related announcement at GDC, the unveiling of new details of the Steam Machines initiative. Ordinarily, it might be an annoying (albeit very high-quality) problem to have one of your announcements completely dampen enthusiasm for the other; in this instance, it’s probably welcome, because what trickled out of GDC regarding Steam Machines is making this look like a very stunted, unloved and disappointing project indeed.
To recap briefly; Steam Machines is Valve’s attempt to create a range of attractive, small-form-factor PC hardware from top manufacturers carrying Valve’s seal of approval (hence being called “Steam Machines” and quite distinctly not “PCs”), running Valve’s own gaming-friendly flavour of the Linux OS, set up to connect to your living room TV and controlled with Valve’s custom joypad device. From a consumer standpoint, they’re Steam consoles; a way to access the enormous library of Steam content (at least the Linux-friendly parts of it) through a device that’s easy to buy, set up and control, and designed from the ground up for the living room.
That’s a really great idea, but one which requires careful execution. Most of all, if it’s going to work, it needs a fairly careful degree of control; Valve isn’t building the machines itself, but since it’s putting its seal of approval on them (allowing them to use the Steam trademark and promoting them through the Steam service), it ought to have the power to enforce various standards related to specification and performance, ensuring that buyers of Steam Machines get a clear, simple, transparent way to understand the calibre of machine they’re purchasing and the gaming performance they can expect as a result.
Since the announcement of the Steam Machines initiative, various ways of implementing this have been imagined; perhaps a numeric score assigned to each Machine allowing buyers to easily understand the price to performance ratio on offer? Perhaps a few distinct “levels” of Steam Machine, with some wiggle room for manufacturers to distinguish themselves, but essentially giving buyers a “Good – Better – Best” set of options that can be followed easily? Any such rating system could be tied in to the Steam store itself, so you could easily cross-reference and find out which system is most appropriate for the kind of games you actually want to play.
In the final analysis, it would appear that Valve’s decision on the myriad possibilities available to it in this regard is the worst possible cop-out, from a consumer standpoint; the company’s decided to do absolutely none of them. The Steam Machines page launched on the Steam website during GDC lists 15 manufacturers building the boxes; many of those manufacturers are offering three models or more at different price and performance levels. There is absolutely no way to compare or even understand performance across the different Steam Machines on offer, short of cross-referencing the graphics cards, processors, memory types and capacities and drive types and capacities used in each one – and if you’ve got the up-to-date technical knowledge to accurately balance those specifications across a few dozen different machines and figure out which one is the best, then you’re quite blatantly going to be the sort of person who saves money by buying the components separately and wouldn’t buy a Steam Machine in a lifetime.
“Valve seems to have copped out entirely from the idea of using its new systems to make the process of buying a gaming PC easier or more welcoming for consumers”
In short, unless there’s a pretty big rabbit that’s going to be pulled out of a hat between now and the launch of the first Steam Machines in the autumn, Valve seems to have copped out entirely from the idea of using its new systems to make the process of buying a gaming PC easier or more welcoming for consumers – and in the process, appears to have removed pretty much the entire raison d’etre of Steam Machines. The opportunity for the PC market to be grown significantly by becoming more “console-like” isn’t to do with shoving PC components into smaller boxes; that’s been happening for years, occasionally with pretty impressive results. Nor is it necessarily about reducing the price, which has also been happening for some years (and which was never going to happen with Steam Machines anyway, as Valve is of no mind to step in and become a loss-leading platform holder).
Rather, it’s about lowering the bar to entry, which remains dizzyingly high for PC gaming – not financially, but in knowledge terms. A combination of relatively high-end technical knowledge and of deliberate and cynical marketing-led obfuscation of technical terminology and product numbering has meant that the actual process of figuring out what you need to buy in order to play the games you want at a degree of quality that’s acceptable is no mean feat for an outsider wanting to engage (or re-engage) with PC games; it’s in this area, the simplicity and confidence of buying a system that you know will play all the games marketed for it, that consoles have an enormous advantage over the daunting task of becoming a PC gamer.
Lacking any guarantee of performance or simple way of understanding what sort of system you’re buying, the Steam Machines as they stand don’t do anything to make that process easier. Personally, I ought to be slap bang in the middle of the market for a Steam Machine; I’m a lapsed PC gamer with a decent disposable income who is really keen to engage with some of the games coming out in the coming year (especially some of the Kickstarted titles which hark back to RPGs I used to absolutely adore), but I’m totally out of touch with what the various specifications and numbers mean. A Steam Machine that I could buy with the confidence that it would play the games I want at decent quality would be a really easy purchase to justify; yet after an hour flicking over and back between the Steam Machines page launched during GDC and various tech websites (most of which assume a baseline of knowledge which, in my case, is a good seven or eight years out of date), I am no closer to understanding which machine I would need or what kind of price point is likely to be right for me. Balls to it; browser window full of tabs looking at tech spec mumbo-jumbo closed, PS4 booted up. Sale lost.
This would be merely a disappointment – a missed opportunity to lower the fence and let a lot more people enjoy PC gaming – were it not for the extra frisson of difficulty posed by none other than Valve’s more successful GDC announcement, the Vive VR headset. You see, one of the things that’s coming across really clearly from all the VR technology arriving on the market is that frame-rate – silky-smooth frame-rate, at least 60FPS and preferably more if the tech can manage it – is utterly vital to the VR experience, making the difference between a nauseating, headache-inducing mess and a Holodeck wet dream. Suddenly, the question of PC specifications has become even more important than before, because PCs incapable of delivering content of sufficient quality simply won’t work for VR. One of the appealing things about a Steam Machine ought to be the guarantee that I’ll be able to plug in a Vive headset and enjoy Valve’s VR, if not this year then at some point down the line; yet lacking any kind of certification that says “yes, this machine is going to be A-OK for VR experiences for now”, the risk of an expensive screw-up in the choice of machine to buy seems greater than ever before.
I may be giving Steam Machines a hard time unfairly; it may be that Valve is actually going to slap the manufacturers into line and impose a clear, transparent way of measuring and certifying performance on the devices, giving consumers confidence in their purchases and lowering the bar to entry to PC gaming. I hope so; this is something that only Valve is in a position to accomplish and that is more important than ever with VR on the horizon and approaching fast. The lack of any such system in the details announced thus far is bitterly disappointing, though. Without it, Steam Machines are nothing more than a handful of small form-factor PCs running a slightly off-kilter OS; of no interest to hobbyists, inaccessible to anyone else, and completely lacking a compelling reason to exist.
Valve has developed its own Vulkan graphics driver for Intel GPUs on Linux, which it intends to open-source.
The Vulkan API is still being debated and will not be finalized until later this year, but Valve has been developing its own Intel GPU reference driver for Vulkan to help early adopters bootstrap their code.
During its presentation at GDC 2015, Valve announced that the Intel Linux driver will be open-sourced, but it hasn’t provided a time frame for doing so.
Valve also confirmed that the Source 2 Engine supports the alpha Vulkan API today and that Vulkan will be supported across the board on Steam Machines.
Intel graphics hardware might not be the sexiest, but there is a lot of it out there. It is also easy to target for an open-source driver given Intel’s extensive hardware specifications / programming documentation.
A Vulkan driver for Intel graphics on Linux that game developers actually use bodes well for the quality of support this new API will enjoy, and high performance was most likely a priority in the driver’s development by Valve and, presumably, its partners at LunarG.
Nintendo is heading back to black, with the company’s financial announcements this week revealing that it’s expecting to post a fairly reasonable profit for the full year. For a company that’s largely been mired in red ink since the end of the glory days of the Wii, that looks like pretty fantastic news; but since I was one of the people who repeatedly pointed out in the past when Nintendo’s quarterly losses were driven by currency fluctuations, not sales failures, it’s only fair that I now point out that quite the reverse is true. The Yen has fallen dramatically against the Dollar and the Euro in recent months, making Nintendo’s overseas assets and sales much more valuable in its end-of-year results – and this time, that’s covering over the fact that the company has missed its hardware sales targets for both the 3DS and the Wii U.
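The translation effect is easy to illustrate with a toy calculation (the revenue figure and exchange rates below are purely illustrative, not Nintendo’s actual numbers; the rates are only roughly in line with the 2011–12 highs and early-2015 lows for the yen):

```python
# Hypothetical example of how a weaker yen flatters yen-denominated results.
overseas_revenue_usd = 100e6  # assume $100m of overseas sales (illustrative)

strong_yen_rate = 80.0   # roughly ¥80 per dollar, as in 2011-12
weak_yen_rate = 120.0    # roughly ¥120 per dollar, as in early 2015

revenue_strong = overseas_revenue_usd * strong_yen_rate  # ¥8.0bn
revenue_weak = overseas_revenue_usd * weak_yen_rate      # ¥12.0bn

# Identical dollar sales, but a 50% jump once translated into yen.
print(f"At ¥80/$:  ¥{revenue_strong / 1e9:.1f}bn")
print(f"At ¥120/$: ¥{revenue_weak / 1e9:.1f}bn")
```

The same overseas sales are worth half as much again in yen terms once the currency weakens, which is exactly the flattering effect on Nintendo’s reported results described above.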
In short, all those “Nintendo back in profit” headlines aren’t really worth anything more than the “Nintendo makes shock loss” headlines were back when the Yen was soaring to all-time highs a few years ago. The company is still facing the same tough times this week that it was last week; the Wii U is still struggling to break 10 million units and the 3DS is seeing a major year-on-year decline in its sales, having faltered significantly after hitting the 50 million installed base mark.
In hardware terms, then, Nintendo deserves all the furrowed brows and concerned looks it’s getting right now. Part of the problem is comparisons with past successes, of course; the Wii shipped over 100 million units, and the DS, an absolute monster of a console, managed over 150 million. In reality, while the Wii U is having a seriously hard time in spite of its almost universally acclaimed 2014 software line-up, the 3DS isn’t doing badly at all; but it can’t escape comparison with its record-breaking older sibling, naturally enough.
Plenty of commentators reckon they know the answer to Nintendo’s woes, and they’ve all got the same answer; the company needs to ditch hardware and start selling its games on other platforms. Pokemon on iOS! Smash Bros on PlayStation! Mario Kart on Xbox! Freed from the limited installed base of Nintendo’s own hardware – and presumably, in the case of handheld titles, freed to experiment with new business models like F2P – the company’s games would reach their full potential, the expensive hardware division could be shut down and everyone at Nintendo could spend the rest of their lives blowing their noses on ¥10,000 notes.
I’m being flippant, yes, but there’s honestly not a lot more depth than that to the remedies so often proposed for Nintendo. I can’t help but find myself deeply unconvinced. For a start, let’s think about “Nintendo’s woes”, and what exactly is meant by the doom and gloom narrative that has surrounded the company in recent years. That the Wii U isn’t selling well is absolutely true; it’s doing better than the Dreamcast did, to pick an ominous example, but unless there’s a major change of pace the console is unlikely ever to exceed the installed base of the GameCube. Indeed, if you treat the Wii as a “black swan” in Nintendo’s home console history, a flare of success that the company never quite figured out how to bottle and repeat, then the Wii U starts to look like a continuation of a slow and steady decline that started with the Nintendo 64 (a little over thirty million consoles sold in total) and continued with the GameCube (a little over twenty million). That the 3DS is struggling to match the pace and momentum of the DS is also absolutely true; it’s captured a big, healthy swathe of the core Nintendo market but hasn’t broken out to the mass market in the way that the DS did with games like Brain Training.
Yet here’s a thing; in spite of the doom and gloom around downward-revised forecasts for hardware, Nintendo was still able to pull out a list of this year’s million-plus selling software that would put any other publisher in the industry to shame. The latest Pokemon games on 3DS have done nearly 10 million units; Super Smash Bros has done 6.2 million on 3DS and 3.4 million on the Wii U. Mario Kart 8 has done almost five million units, on a console that’s yet to sell 10 million. Also selling over a million units in the last nine months of 2014 on 3DS we find Tomodachi Life, Mario Kart 7 (which has topped 11 million units, life to date), Pokemon X and Y (nearly 14 million units to date), New Super Mario Bros 2 (over 9 million), Animal Crossing: New Leaf (nearly 9 million) and Kirby: Triple Deluxe. The Wii U, in addition to Mario Kart 8 and Super Smash Bros, had million-plus sellers in Super Mario 3D World and Nintendo Land.
That’s 12 software titles from a single publisher managing to sell over a million units in the first three quarters of a financial year – a pretty bloody fantastic result that only gets better if you add in the context that Nintendo is also 2014’s highest-rated publisher in terms of critical acclaim. Plus, Nintendo also gets a nice cut of any third-party software sold on its consoles; granted, that probably doesn’t sum up to much on the Wii U, where third-party games generally seem to have bombed, but on the 3DS it means that the company is enjoying a nice chunk of change from the enormous success of Yokai Watch, various versions of which occupied several slots in the Japanese software top ten for 2014, among other successful 3DS third-party games.
Aha, say the advocates of a third-party publisher approach for Nintendo, that’s exactly our point! The company’s software is amazing! It would do so much better if it weren’t restrained by only being released on consoles that aren’t all that popular! Imagine how Nintendo’s home console games would perform on the vastly faster-selling PS4 (and imagine how great they’d look, intones the occasional graphics-obsessive); imagine how something like Tomodachi Life or Super Smash Bros would do if it was opened up to the countless millions of people with iOS or Android phones!
Let’s take those arguments one at a time, because they’re actually very different. Firstly, home consoles – a sector in which there’s no doubt that Nintendo is struggling. The PS4 has got around twice the installed base of the Wii U after only half the time on the market; it’s clear where the momentum and enthusiasm lies. Still, Super Smash Bros and Mario Kart 8 managed to sell several million copies apiece on Wii U; in the case of Mario Kart 8, around half of Wii U owners bought a copy. Bearing in mind that Nintendo makes way more profit per unit from selling software on its own systems than it would from selling it on third-party consoles (where it would, remember, be paying a licensing fee to Sony or Microsoft), here’s the core question; could it sell more copies of Mario Kart 8 on other people’s consoles than it managed on its own?
If you think the answer to that is “yes”, here’s what you’re essentially claiming; that there’s a large pent-up demand among PlayStation owners for Mario Kart games. Is there really? Can you prove that, through means other than dredging up a handful of Reddit posts from anonymous people saying “I’d play Nintendo games if they were 1080p/60fps on my PS4”? To me, that seems like quite a big claim. It’s an especially big claim when you consider the hyper-competitive environment in which Nintendo would be operating on the PS4 (or Xbox One, or both).
Right now, a big Nintendo game launching on a Nintendo console is a major event for owners of that console. I think Nintendo launches would still be a big event on any console, but there’s no doubt that the company would lose focus as a third-party publisher – sure, the new Smash Bros is out, but competing for attention, pocket money and free time against plenty of other software. It’s not that I don’t think Nintendo games could hold their own in a competitive market, I merely don’t wish to underestimate the focus that Nintendo acquires by having a devoted console all of its own underneath the TVs of millions of consumers – even if it’s not quite the number of millions it would like.
How about the other side of the argument, then – the mobile games aspect? Nintendo’s position in handheld consoles may not be what it used to be, but the 3DS has roundly trounced the PlayStation Vita in sales terms. Sure, iPhones and high-end Android devices have much bigger installed bases (Apple shifted around 75 million iPhones in the last quarter, while the lifetime sales of the 3DS are only just over 50 million), but that comparison isn’t necessarily a very useful one. All 50 million 3DS owners bought an expensive device solely to play games, and the lifetime spend on game software of each 3DS owner runs into hundreds of dollars. The “average revenue per user” calculation for Pokemon on the 3DS is easy; everyone paid substantial money for the game up front.
By comparison, lots and lots of iOS and Android users never play games at all, and many of those who play games never pay for them. That’s fine; that’s the very basis of the F2P model, and games using that model effectively can still make plenty of money while continuing to entertain a large number (perhaps even a majority) of players who pay nothing. Still, the claim that moving to smartphones is a “no-brainer” for Nintendo is a pretty huge one, taken in this context. The market for premium, expensive software on smartphones is very limited and deeply undermined by F2P; the move to F2P for Nintendo titles would be creatively difficult for many games, and even for ones that are a relatively natural fit (such as Pokemon), it would be an enormous commercial risk. There’s a chance Nintendo could get it right and end up with a Puzzle & Dragons sized hit on its hands (which is what it would take to exceed the half a billion dollars or so the company makes from each iteration of Pokemon on 3DS); there’s also an enormous risk that the company could get it wrong, attracting criticism and controversy around poor decisions or misjudged sales techniques, and badly damage the precious Pokemon brand itself.
In short, while I’m constantly aware that the market seems to be changing faster than Nintendo is prepared to keep up with, I’m not convinced that any of the company’s critics actually have a better plan right now than Satoru Iwata’s “stay the course” approach. If you believe that PlayStation fans will flock to buy Nintendo software on their console, you may think differently; if you think that the risk and reward profile of the global iOS market is a better bet than the 50-odd million people who have locked themselves in to Nintendo’s 3DS platform and shown a willingness to pay high software prices there, then similarly, you’ll probably think differently. Certainly, there’s some merit to the idea that Nintendo ought to be willing to disrupt its own business in order to avoid being disrupted by others – yet there’s a difference between self-disruption and just hurling yourself headlong into disaster in the name of “not standing still”.
There’s a great deal that needs to be fixed at Nintendo; its marketing and branding remains a bit of a disaster, its relationships with third-party studios and publishers are deeply questionable and its entire approach to online services is incoherent at best. Yet this most fundamental question, “should Nintendo stay in the hardware business”, remains a hell of a lot tougher than the company’s critics seem to believe. For now, beleaguered though he may seem, Iwata still seems to be articulating the most convincing vision for the future of the industry’s most iconic company.
Over the last few years, the industry has seen budget polarization on an enormous scale. The cost of AAA development has ballooned, and continues to do so, pricing out all but the biggest warchests, while the indie and mobile explosions are rapidly approaching the point of inevitable over-saturation and consequential contraction. Stories about the plight of mid-tier studios are ten-a-penny, with the gravestones of some notable players lining the way.
For a company like Ninja Theory, in many ways the archetypal mid-tier developer, survival has been a paramount concern. Pumping out great games (Ninja Theory has a collective Metacritic average of 75) isn’t always enough. Revitalizing a popular IP like DMC isn’t always enough. Working on lucrative and successful external IP like Disney Infinity isn’t always enough. When the fence between indie and blockbuster gets thinner and thinner, it becomes ever harder to balance upon.
Last year, Ninja Theory took one more shot at the upper echelons. For months the studio had worked on a big budget concept which would sit comfortably alongside the top-level, cross-platform releases of the age: a massive, multiplayer sci-fi title that would take thousands of combined, collaborative hours to exhaust. Procedurally generated missions and an extensive DLC structure would ensure longevity and engagement. Concept art and pre-vis trailers in place, the team went looking for funding. Razor was on its way.
Except the game never quite made it. Funding failed to materialize, and no publisher would take the project on. It didn’t help that the search for a publishing deal arrived almost simultaneously with the public announcement of Destiny. Facing an impossible task, the team abandoned the project and moved on with other ideas. Razor joined a surprisingly large pile of games that never make it past the concept stage.
Sadly, it’s not a new story. In fact, at the time, it wasn’t even a news story. But this time Ninja Theory’s reaction was different. This was a learning experience, and learning experiences should be shared. Team lead and co-founder Tameem Antoniades turned the disappointment not just into a lesson, but a new company ethos: involve your audience at an early stage, retain control, fund yourself, aim high, and don’t compromise. The concept of the Independent AAA Proposition, enshrined in a GDC presentation given by Antoniades, was born.
Now the team has a new flagship prospect, cemented in this fresh foundation. In keeping with the theme of open development and transparency, Hellblade is being created with the doors to its development held wide open, with community and industry alike invited to bear witness to the minutiae of the process. Hellblade will be a cross-platform game with all of the ambition for which Ninja Theory is known, and yet it is coming from an entirely independent standpoint. Self-published and self-governed, Hellblade is the blueprint for Ninja Theory’s future.
“We found ourselves as being one of those studios that’s in the ‘squeezed middle’,” project lead Dominic Matthews says. “We’re about 100 people, so we kind of fall into that space where we could try to really diversify and work on loads of smaller projects, but indie studios really have an advantage over us, because they can do things with far lower overheads. We have been faced with this choice of, do we go really, really big with our games and become the studio that is 300 people or even higher than that, and try to tick all of these boxes that the blockbuster AAA games need now.
“We don’t really want to do that. We tried to do that. When we pitched Razor, which we pitched to big studios, that ultimately didn’t go anywhere. That was going to be a huge game; a huge game with a service that would go on for years and would be a huge, multiplayer experience. Although I’m sure it would have been really cool to make that, it kind of showed to us that we’re not right to try to make those kinds of games. Games like Enslaved – trying to get a game like that signed now would be impossible. The way that it was signed, there would be too much pressure for it to be…to have the whole feature set that justifies a $60 price-tag.
“That $60 price-tag means games have to add multiplayer, and 40 hours of gameplay minimum, and a set of characters that appeal to as many people as they possibly can. There’s nothing wrong with games that do that. There’s some fantastic games that do, AAA games. Though we do think that there’s another space that sits in-between. I think a lot of indie games are super, super creative, but they can be heavily stylised. They work within the context of the resources that people have.
“We want to create a game that’s like Enslaved, or like DMC, or like Heavenly Sword. That kind of third-person, really high quality action game, but make it work in an independent model.”
Cutting out the middle-man is a key part of the strategy. But if dealing with the multinational machinery of ‘big pubs’ is what drove Ninja Theory to make such widespread changes, there must surly have been some particularly heinous deals that pushed it over the edge?
“I think it’s just a reality of the way that those publisher/developer deals work,” Matthews says. “In order for a publisher to take a gamble on your game and on your idea, you have to give up a lot. That includes the IP rights. It’s just the realities of how things work in that space. For us, I think any developer would say the same thing, being able to retain your IP is a really important thing. So far, we haven’t been able to do that.
“With Hellblade, it’s really nice that we can be comfortable in the fact that we’re not trying to appeal to everyone. We’re not trying to hit unrealistic forecasts. Ultimately, I think a lot of games have unrealistic forecasts. Everyone knows that they’re unrealistic, but they have to have these unrealistic forecasts to justify the investment that’s going into development.
“Ultimately, a lot of games, on paper, fail because they don’t hit those forecasts. Then the studios and the people that made those games, they don’t get the chance to make any more. It’s an incredibly tough market. Yes, we’ve enjoyed working with our publishers, but that’s not to say that the agreements that developed are all ideal, because they’re not. The catalyst to us now being able to do this is really digital distribution. We can break away from that retail $60 model, where every single game has to be priced that way, regardless of what it is.”
Matthews believes that publishers, driven into funding only games that will comfortably shift five or six million units, have no choice but to stick to the safe bets, a path that eventually winnows down diversity to the point of stagnation, where only a few successful genres ever end up getting made: FPS, sports, RPG, maybe racing. Those genres become less and less distinct, while simultaneously shoe-horning in mechanics that prove popular elsewhere and shunning true innovation.
While perhaps briefly sustainable, Matthews sees that as a creative cul-de-sac. Customers, he feels, are too smart to put up with it.
“Consumers are going to get a bit wary of games that have hundreds of millions of dollars spent on them”
“I think consumers are going to get a bit wary. Get a bit wary of games that have hundreds of millions of dollars spent on them. I think gamers are going to start saying, ‘For what?’
“The pressures are for games to appeal to more and more people. It used to be if you sold a million units, then that was OK. Then it was three million units. Now it’s five million units. Five million units is crazy. We’ve never sold five million units.”
It’s not just consumers who are getting wise, though. Matthews acknowledges that the publishers also see the dead-end approaching.
“I think something has to be said for the platform holders now. Along with digital distribution, the fact that the platform holders are really opening their doors and encouraging self-publishing and helping independent developers to take on some of those publishing responsibilities, has changed things for us. I think it will change things for a lot of other developers.
“Hellblade was announced at the Gamescom PlayStation 4 press conference. My perception of that press conference was that the real big hitters in that were all independent titles. It’s great that the platform holders have recognised that. There’s a real appetite from their players for innovative, creative games.
“It’s a great opportunity for us to try to do things differently. Like on Hellblade, we’re questioning everything that we do. Not just on development, but also how we do things from a business perspective as well. Normally you would say, ‘Well, you involve these types of agencies, get these people involved in this, and a website will take this long to create.’ The next thing that we’re doing is, we’re saying, ‘Well, is that true? Can we try and do these things a different way,’ because you can.
“There’s definitely pressure for us to fill all those gaps left by a publisher, but it’s a great challenge for us to step up to. Ultimately, we have to transition into a publisher. That’s going to happen at some point, if we want to publish our own games.”
While the Sony PlayStation 4 has been selling very well, it seems that Christmas was not really its season.
Sony said that the PlayStation 4 has sold more than 18.5 million units since the new generation of consoles launched. While that is good and makes the PS4 the fastest-selling PlayStation to date, there was no sales peak at Christmas.
You would think that the PS4 would sell well at Christmas as parents were forced to do grievous bodily harm to their credit cards to shut their spoilt spawn up during the school holidays. But apparently not.
Apparently, the weapon of choice against precious snowflakes being bored was an Xbox One which saw a Christmas spike in sales.
Sony said that its new numbers are pretty much on target; the console sold at the expected rate of two million units per month.
Redmond will be happy with that result even if it still has a long way to go before it matches the PlayStation 4 on sales.
Recently, my smartphone started acting up. I think the battery is on the way out; it does bizarre things, like shutting itself off entirely when I try to take a picture on 60 per cent battery, or suddenly dropping from fully charged to giving me “10 per cent remaining, plug me in or else” warnings for no reason at all. I can get it fixed free of charge, but it’s an incredibly frustrating, bothersome thing, especially given how much money I’ve paid for this phone. Most of us have probably had an experience like this with a piece of hardware; a shoddy washing machine that mangled your favorite shirt, a shiny new LCD screen with an intensely irritating dead pixel, an Xbox 360 whose Red Ring of Death demanded a lengthy trip back to the service center. There are few of us who can’t identify with the utter frustration of having a consumer product that you’ve paid good money for simply fail to do its job properly. Sure, it’s a #FirstWorldProblem for the most part (unless it’s something like a faulty airbag in your Honda, obviously), but it’s intensely annoying and certainly makes you less likely to buy anything from that manufacturer again.
Given that we could all probably agree that a piece of hardware being faulty is utterly unacceptable, I’m not sure why software seems to get a free pass sometimes. Sure, there are lots of consumers who complain bitterly about buggy games, but by and large games with awful quality control problems tend to get slapped with labels like “flawed but great”, or have their enormous faults explained in a review only to see the final score reflect none of those problems. It’s not just the media that does this (and for what it’s worth, I don’t think this is corruption so much as an ill-considered aspect of media culture itself); for every broken game, there are a host of consumers out there ready to defend it to the hilt, for whatever reason.
I raise this problem because, while buggy games have always been with us – often hilariously, especially back in the early days of the PlayStation – the past year or so has seen a spate of high-profile, problematic games being launched, suggesting that even some of the industry’s AAA titles are no longer free from truly enormous technical issues. The technical problems that have become increasingly prevalent in recent years are causing genuine damage to the industry; from the botched online launches of games like Driveclub and Battlefield through to the horrendous graphical problems that plague some players of Assassin’s Creed Unity, they are giving consumers terrible experiences of what should be high points for the medium, creating a loud and outspoken group of disgruntled players who act to discourage others, and helping to drive a huge wedge between media (who, understandably, want to talk about the experience and context of a game rather than its technical details) and consumers (who consider a failure to address glaring bugs to be a sign of collusion between media and publishers, and a failure on the part of the media to serve their audience).
We can all guess why this is happening. I don’t wish in any way to underplay how complex and difficult it is to develop bug-free software; I write software tools to assist in my research work, and given how often those simple tools, developed by two or three people at most, have me tearing my hair out at 3am as I search for the single misplaced character that’s causing the whole project to behave oddly, I am absolutely the last person in the world who is going to dismiss the difficulty involved in debugging something as enormous and complex as a modern videogame. Debugging games has inevitably become harder as team sizes and technical complexity have grown; that’s to be expected.
However, just because something is harder doesn’t mean it shouldn’t be happening, and that’s the second part of this problem. Games are developed to incredibly tight schedules, sometimes even tighter today (given the culture of annual updates to core franchises) than they were in the past. Enormous marketing budgets are preallocated and planned out to support a specific release date. The game can’t miss that date; if there are show-stopping bugs, the game will just have to ship with those in place, and with a bit of luck they’ll be able to fix them in time to issue a day-one digital patch (and if your console isn’t online, tough luck).
Yet this situation is artificial in itself. It’s entirely possible to structure your company’s various divisions around the notion that a game will launch when it’s actually ready, and ensure that you only turn out high-quality software; Nintendo, in particular, manages this admirably. Certainly, some people criticise the company for delaying software and it does open up gaps in the release schedule, but compared to the enormous opprobrium which would be heaped upon the company if it turned out a Mario Kart game where players kept falling through the track, or a Legend of Zelda where Link’s face kept disappearing, leaving only eyes and teeth floating ghoulishly in negative space (sleep well, kids!), an occasional delay is a corporate cultural decision that makes absolute sense – not only for Nintendo, but for game companies in general.
It doesn’t even have to go as far as delaying games on a regular basis. There is a strong sense that some of the worst offenders in terms of buggy games simply aren’t taking QA seriously, which is something that absolutely needs to be fixed – and if not, deserves significant punishment from consumers and critics alike. Quality control has a bit of an image problem; there’s a standard stereotype of a load of pizza-fuelled youngsters in their late teens testing games for a few years as they try to break into a “real” games industry job. The image doesn’t come from thin air; for some companies, this is absolutely a reality. It is, however, utterly false to think that every company sees its QA in those terms. For companies that take QA seriously, it’s a division that’s respected and well-treated, with its own career progression tracks, all founded on the basic understanding that a truly good QA engineer is worth his or her weight in gold.
Not prioritising your QA department – not ensuring that it’s a division that’s filling up with talented, devoted people who see QA as potentially being a real career and not just a stepping stone – is exactly the same thing as not prioritising your consumers. Not building time for proper QA into your schedules, or failing to enact processes which ensure that QA is being properly listened to and involved, is nothing short of a middle finger raised to your entire consumer base – and you only get to do that so many times before your consumers start giving the gesture right back to you and your precious franchises.
Media does absolutely have a role to play in this – one to which it has, by and large, not lived up. Games with serious QA problems do not deserve critical acclaim. I understand fully that reviewers want to engage with more interesting topics than technical issues, but I think it’s worth thinking about how film reviewers would treat a movie with unfinished special effects or audio mixed such that voices can’t be heard; or perhaps how music reviewers would treat an album with a nasty recording hiss in the background, or with certain tracks accidentally dropping out or skipping. Regardless of the good intentions of the creative people involved in these projects, the resulting product would be slammed, and rightly so. It’s perhaps the very knowledge of the drubbing that they would receive that means that such awful movies and albums almost never see the light of day (and when they do, they become legendary in their awfulness; consider the unfinished CGI at the end of “The Scorpion King”, which remains a watchword for terrible special effects many years later).
Game companies, by contrast, seem to feel unpleasantly comfortable with releasing games that don’t work and aren’t properly tested. Certain technical aspects probably contribute to this; journalists may be wary of slamming a game for bugs that may be fixed in a day-one patch, for instance. Yet it seems that there’s little choice but to make the criteria stricter in this regard. If media and consumers alike do not take to punishing companies severely for failing to pay proper respect to QA procedures for their games, this problem will only worsen as firms realize that they can get away with launching unfinished software.
We all want a world where technical issues are nothing but a footnote in the discussion of games; that will be the ultimate triumph of game technology, when it truly becomes transparent. We do not, however, live in that time yet, and the regular launches of games that don’t live up to even the most basic standards of quality are something nobody should be asked to tolerate. The move by some websites to stop reviewing online games until the servers are live and populated with real players is a good start; but the overall tolerance for bugs and willingness to forgive publishers for such transgressions (“we know the last game was a buggy mess, but we’re still going to publish half a dozen puff pieces that will push our readers to pre-order the sequel!”) needs to be fixed. If we want to talk about the things that are important about games (and we do!), it’s essential that we fix the culture that ignores QA and technical issues first.
For independent developers, the last decade has been an endless procession of migratory possibilities. The physical world was defined by compromise, dependence and strategically closed doors, but the rise of digital afforded freedom and flexibility in every direction. New platforms, new business models, new methods of distribution and communication; so many fresh options appeared in such a brief window of time that knowing where and when to place your bet was almost as important as having the best product. For a few years, right around 2008, there was promise almost everywhere you looked.
That has changed. No matter how pregnant with potential they once seemed, virtually every marketplace has proved unable to support the spiralling number of new releases. If the digital world is one with infinite shelf-space for games, it has offered no easy solutions on how to make them visible. Facebook, Android, iOS, Xbox Live Arcade, the PlayStation Network; all have proved to be less democratic than they first appeared, their inevitable flaws exposed as the weight of choice became heavier and heavier. As Spil Games’ Eric Goossens explained to me at the very start of 2014: “It just doesn’t pay the bills any more.”
Of course, Goossens was talking specifically about indie development of casual games. And at that point, with 2013 only just receding from view, I would probably have named one exception to the trend, one place where the balance between volume and visibility gave indies the chance to do unique and personal work and still make a decent living. That place would have been Steam, and if I was correct in my assessment for even one second, it wasn’t too long before the harsher reality became clear.
After less than five months of 2014 had passed, Valve’s platform had already added more new games than in the whole of the previous year. Initiatives like Greenlight and Early Access were designed to make Steam a more open and accessible platform, but they were so effective that some of what made it such a positive force for indies was lost in the process. Steam’s culture of deep-discounting has become more pervasive and intense in the face of this chronic overcrowding, stirring up impassioned debate over what some believe will be profound long-term effects for the perceived value of PC games. Every discussion needs balance, but in this case the back-and-forth seemed purely academic: for a lot of developers steep discounts are simply a matter of survival, and precious few could even entertain the notion of focusing on the greater good instead.
And the indie pinch was felt beyond Steam’s deliberately weakened walls. Kickstarter may be a relatively new phenomenon – even for the hyper-evolving landscape of the games industry – but it faced similar problems in 2014, blighted by the twin spectres of too much content and not enough money to go around. Anecdotally, the notion that something had changed was lurking in the background at the very start of the year, with several notable figures struggling to find enough backers within the crowd. The latter months of 2014 threw up a few more examples, but they also brought something close to hard evidence that ‘peak Kickstarter’ may already be behind us – fewer successful projects, lower funding targets, and less money flowing through the system in general. None of which was helped by a handful of disappointing failures, each one a blow for the public’s already flagging interest in crowdfunding. Yet another promising road for indies had become more treacherous and uncertain.
So are indies heading towards a “mass extinction event”? Overcrowding is certainly a key aspect of the overall picture, but the act of making and releasing a game is only getting easier, and the allure of development as a career choice seems to grow with each passing month. It stands to reason that there will continue to be a huge number of games jostling for position on every single platform – more than even a growing market can sustain – but there’s only so much to be gained from griping about the few remaining gatekeepers. If the days when simply being on Steam or Kickstarter made a commercial difference are gone, and if existing discovery tools still lack the nuance to deal with all of that choice, then it just shifts the focus back to where it really belongs: talent, originality, and a product worth an investment of time and money.
At GDC Europe this summer, I was involved in a private meeting with a group of Dutch independent game developers, all sharing knowledge and perspective on how to find success. We finished that hour agreeing on much the same thing. There are few guarantees in this or any other business, but the conditions have also never been more appropriate for personality and individuality to be the smartest commercial strategy. The world has a preponderance of puzzle-platformers, but there’s only one Monument Valley. We’re drowning in games about combat, but This War of Mine took a small step to the left and was greeted with every kind of success. Hell, Lucas Pope made an entire game about working as a border control officer and walked away with not just a hit, but a mantelpiece teeming with the highest honours.
No matter how crowded the market has become, strong ideas executed with care are still able to rise above the clamour, no huge marketing spend required. As long as that’s still possible, indies have all of the control they need.
Sources are citing a rating spotted in the Australian Classification Board’s database that seems to point to an upcoming remastered edition of Borderlands for Xbox One and PlayStation 4. So far, neither publisher 2K nor franchise developer Gearbox has confirmed the listing.
The new remastered version is expected to be called simply “Borderlands Remastered Edition”, but with no confirmation from 2K or Gearbox it is difficult to say what it might contain, or whether it is simply a converted and compiled version of the first three games for Xbox One and PlayStation 4.
Bottom line: if it is in fact a compiled remastered release of the first three games, this could actually be a good thing for those who own the new consoles.