Was Crytek Saved By Amazon?

April 9, 2015 by Michael  
Filed under Gaming

The deal that helped Crytek recover from its recent financial difficulties was with Amazon, according to a report from Kotaku.

The online retail giant signed a licensing deal for CryEngine, Crytek’s proprietary game engine. Sources within the company put the deal’s value at between $50 million and $70 million, and suggested that Amazon may be using it as the bedrock for a proprietary engine of its own.

However Amazon uses the technology, the importance of the deal for Crytek cannot be overstated. Last summer, it became apparent that all was not well at the German developer. Employees hadn’t been fully paid in months, leading to an alleged staff walkout at its UK office, where a sequel to Homefront was in development. Koch Media acquired the Homefront IP and its team shortly after.

When the company’s management eventually addressed the rumors, it had already secured the financing necessary to take the company forward. No details of the deal were offered, but it’s very likely that Crytek got the money it needed from Amazon.

We have contacted Crytek to confirm the details, but it certainly fits with the perception that Amazon could emerge as a major creator of game content. It has snapped up some elite talent to do just that, it acquired Twitch for a huge sum of money, and it has been very open about where it plans to fit into the overall market.

Courtesy-GI.biz

Is Nintendo Going Mobile?

March 18, 2015 by Michael  
Filed under Mobile

Nintendo has formed a comprehensive new alliance with DeNA that will make every one of the company’s famous IPs available for mobile development.

The bedrock of the deal is a dual stock purchase, with each company buying ¥22 billion ($181 million) of the other’s treasury shares. That’s equivalent to 10 per cent of DeNA’s stock, and 1.24 per cent of Nintendo. The payments will complete on April 2, 2015.
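
Those percentages also imply straightforward back-of-the-envelope valuations – if ¥22 billion buys 10 per cent of DeNA and 1.24 per cent of Nintendo, the deal values DeNA at roughly ¥220 billion and Nintendo at roughly ¥1.77 trillion. A minimal sketch of that arithmetic in C, derived only from the figures above (my own illustration, not numbers from either company):

```c
#include <stdio.h>

/* Implied valuations from the announced cross-shareholding:
 * each side buys 22 billion yen of the other's treasury shares. */
int main(void) {
    const double stake_yen = 22e9;           /* each purchase, in yen */
    const double dena_fraction = 0.10;       /* 10% of DeNA           */
    const double nintendo_fraction = 0.0124; /* 1.24% of Nintendo     */

    printf("Implied DeNA value:     ~%.0f billion yen\n",
           stake_yen / dena_fraction / 1e9);
    printf("Implied Nintendo value: ~%.0f billion yen\n",
           stake_yen / nintendo_fraction / 1e9);
    return 0;
}
```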

What this will ultimately mean for the consumer is Nintendo IP on mobile, “extending Nintendo’s reach into the vast market of smart device users worldwide.” There will be no ports of existing Nintendo games, according to information released today, but, “all Nintendo IP will be eligible for development and exploration by the alliance.” That includes the “iconic characters” that the company has guarded for so long.

No details on the business model that these games and apps will be released under were offered, though Nintendo may well be reluctant to adopt free-to-play at first. The information provided to the press emphasised the “premium” experiences Nintendo currently offers on platforms like Wii U and 3DS. Admittedly, that could be interpreted in either direction.

However, Nintendo and DeNA are planning an online membership service that will span Nintendo consoles, PC and smart devices. That will launch in the autumn this year.

This marks a significant change in strategy for Nintendo, which has been the subject of reports about plans to take its famous IPs to mobile for at least a year. Indeed, the company has denied the suggestion on several occasions, even as it indicated that it did have plans to make mobile a part of its core strategy in other ways.

Analysts have been offering their reflections on the deal, with the response from most being largely positive.

“Nintendo’s decision to partner with DeNA is a recognition of the importance of the games app audience to the future of its business,” said IHS head of gaming Piers Harding-Rolls. “Not only is there significant revenue to be made directly from smartphone and tablet consumers for Nintendo, app ecosystems are also very important in reaching new customers to make them aware of the Nintendo brand and to drive a new and broader audience to its dedicated console business. IHS data shows that games apps were worth $26 billion in consumer spending globally last year, with handheld console games worth only 13 per cent of that total at $3.3 billion.

“The Nintendo-DeNA alliance is a good fit and offers up a number of important synergies for two companies that are no longer leaders in their respective segments.

“DeNA remains one of the leading mobile games companies in Japan and, we believe, shares cultural similarities with Nintendo, especially across its most popular big-brand content. The alliance gives Nintendo access to a large audience in its home market, which remains very important to its overall financial performance. Japanese consumers spend significantly more per capita on mobile games than in any other country and it remains the biggest market for both smartphone and handheld gaming. While the partnership gives Nintendo immediate potential to grow its domestic revenues through this audience, gaining access to DeNA’s mobile expertise is important too to realise this potential.

“This alliance makes commercial sense on many levels – the main challenge will be knitting together the cultures of both companies and aligning the speed of development and iteration that is needed in the mobile space with Nintendo’s more patient and systematic approach to games content production. How the new games are monetised may also provide a challenge considering the general differences in models used in retail for Nintendo and through in-app purchases for DeNA.”

In a livestreamed press conference regarding the DeNA deal, Nintendo’s Satoru Iwata reassured those in attendance that the company was still committed to “dedicated video game systems” as its core business. To do that, he confirmed that the company was working on a new console, codenamed “NX”.

“As proof that Nintendo maintains strong enthusiasm for the dedicated game system business let me confirm that Nintendo is currently developing a dedicated game platform with a brand new concept under the development codename NX,” he said.

“It is too early to elaborate on the details of this project but we hope to share more information with you next year.”

Courtesy-GI.biz

Is Valve’s Steam Machine A Flop?

March 17, 2015 by Michael  
Filed under Gaming

There’s not a lot to argue with in the consensus view that Valve had the biggest and most exciting announcement of GDC this year, in the form of the Vive VR headset it’s producing with hardware partner HTC. It may not be the ultimate “winner” of the battle between VR technologies, but it’s done more than most to push the whole field forwards – and it clearly sparked the imaginations of both developers and media in San Francisco earlier this month. Few of those who attended GDC seem particularly keen to talk about anything other than Vive.

From Valve’s perspective, that might be just as well – the incredibly strong buzz around Vive meant that it eclipsed Valve’s other hardware-related announcement at GDC, the unveiling of new details of the Steam Machines initiative. Ordinarily, it might be an annoying (albeit very high-quality) problem to have one of your announcements completely dampen enthusiasm for the other; in this instance, it’s probably welcome, because what trickled out of GDC regarding Steam Machines is making this look like a very stunted, unloved and disappointing project indeed.

To recap briefly; Steam Machines is Valve’s attempt to create a range of attractive, small-form-factor PC hardware from top manufacturers carrying Valve’s seal of approval (hence being called “Steam Machines” and quite distinctly not “PCs”), running Valve’s own gaming-friendly flavour of the Linux OS, set up to connect to your living room TV and controlled with Valve’s custom joypad device. From a consumer standpoint, they’re Steam consoles; a way to access the enormous library of Steam content (at least the Linux-friendly parts of it) through a device that’s easy to buy, set up and control, and designed from the ground up for the living room.

That’s a really great idea, but one which requires careful execution. Most of all, if it’s going to work, it needs a fairly careful degree of control; Valve isn’t building the machines itself, but since it’s putting its seal of approval on them (allowing them to use the Steam trademark and promoting them through the Steam service), it ought to have the power to enforce various standards related to specification and performance, ensuring that buyers of Steam Machines get a clear, simple, transparent way to understand the calibre of machine they’re purchasing and the gaming performance they can expect as a result.

Since the announcement of the Steam Machines initiative, various ways of implementing this have been imagined; perhaps a numeric score assigned to each Machine allowing buyers to easily understand the price to performance ratio on offer? Perhaps a few distinct “levels” of Steam Machine, with some wiggle room for manufacturers to distinguish themselves, but essentially giving buyers a “Good – Better – Best” set of options that can be followed easily? Any such rating system could be tied in to the Steam store itself, so you could easily cross-reference and find out which system is most appropriate for the kind of games you actually want to play.
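
To make that idea concrete, here is a purely hypothetical sketch in C of what a tiered certification would buy the consumer – a single comparison instead of a spec-sheet cross-reference. Every name in it is invented for illustration; nothing like this exists in Steam today.

```c
#include <stdio.h>

/* Purely hypothetical sketch of the kind of certification Valve could
 * impose: every certified Steam Machine carries a performance tier, and
 * every store listing carries the tier it needs. All names invented. */
typedef enum { TIER_GOOD = 1, TIER_BETTER = 2, TIER_BEST = 3 } tier_t;

typedef struct {
    const char *name;
    tier_t tier;            /* certified performance level */
} steam_machine;

typedef struct {
    const char *title;
    tier_t required;        /* minimum tier for acceptable performance */
} store_listing;

/* The buyer (or the store page) makes one comparison instead of
 * cross-referencing GPUs, CPUs, memory types and drives. */
static int will_run_well(const steam_machine *m, const store_listing *g) {
    return m->tier >= g->required;
}

int main(void) {
    steam_machine box = { "Hypothetical Machine A", TIER_BETTER };
    store_listing game = { "Hypothetical AAA Title", TIER_BEST };

    printf("%s on %s: %s\n", game.title, box.name,
           will_run_well(&box, &game) ? "plays well" : "below spec");
    return 0;
}
```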

In the final analysis, it would appear that Valve’s decision on the myriad possibilities available to it in this regard is the worst possible cop-out, from a consumer standpoint; the company’s decided to do absolutely none of them. The Steam Machines page launched on the Steam website during GDC lists 15 manufacturers building the boxes; many of those manufacturers are offering three models or more at different price and performance levels. There is absolutely no way to compare or even understand performance across the different Steam Machines on offer, short of cross-referencing the graphics cards, processors, memory types and capacities and drive types and capacities used in each one – and if you’ve got the up-to-date technical knowledge to accurately balance those specifications across a few dozen different machines and figure out which one is the best, then you’re quite blatantly going to be the sort of person who saves money by buying the components separately and wouldn’t buy a Steam Machine in a lifetime.

“Valve seems to have copped out entirely from the idea of using its new systems to make the process of buying a gaming PC easier or more welcoming for consumers”

In short, unless there’s a pretty big rabbit that’s going to be pulled out of a hat between now and the launch of the first Steam Machines in the autumn, Valve seems to have copped out entirely from the idea of using its new systems to make the process of buying a gaming PC easier or more welcoming for consumers – and in the process, appears to have removed pretty much the entire raison d’etre of Steam Machines. The opportunity for the PC market to be grown significantly by becoming more “console-like” isn’t to do with shoving PC components into smaller boxes; that’s been happening for years, occasionally with pretty impressive results. Nor is it necessarily about reducing the price, which has also been happening for some years (and which was never going to happen with Steam Machines anyway, as Valve is of no mind to step in and become a loss-leading platform holder).

Rather, it’s about lowering the bar to entry, which remains dizzyingly high for PC gaming – not financially, but in knowledge terms. A combination of relatively high-end technical knowledge and of deliberate and cynical marketing-led obfuscation of technical terminology and product numbering has meant that the actual process of figuring out what you need to buy in order to play the games you want at a degree of quality that’s acceptable is no mean feat for an outsider wanting to engage (or re-engage) with PC games; it’s in this area, the simplicity and confidence of buying a system that you know will play all the games marketed for it, that consoles have an enormous advantage over the daunting task of becoming a PC gamer.

Lacking any guarantee of performance or simple way of understanding what sort of system you’re buying, the Steam Machines as they stand don’t do anything to make that process easier. Personally, I ought to be slap bang in the middle of the market for a Steam Machine; I’m a lapsed PC gamer with a decent disposable income who is really keen to engage with some of the games coming out in the coming year (especially some of the Kickstarted titles which hark back to RPGs I used to absolutely adore), but I’m totally out of touch with what the various specifications and numbers mean. A Steam Machine that I could buy with the confidence that it would play the games I want at decent quality would be a really easy purchase to justify; yet after an hour flicking over and back between the Steam Machines page launched during GDC and various tech websites (most of which assume a baseline of knowledge which, in my case, is a good seven or eight years out of date), I am no closer to understanding which machine I would need or what kind of price point is likely to be right for me. Balls to it; browser window full of tabs looking at tech spec mumbo-jumbo closed, PS4 booted up. Sale lost.

This would be merely a disappointment – a missed opportunity to lower the fence and let a lot more people enjoy PC gaming – were it not for the extra frisson of difficulty posed by none other than Valve’s more successful GDC announcement, the Vive VR headset. You see, one of the things that’s coming across really clearly from all the VR technology arriving on the market is that frame-rate – silky-smooth frame-rate, at least 60FPS and preferably more if the tech can manage it – is utterly vital to the VR experience, making the difference between a nauseating, headache-inducing mess and a Holodeck wet dream. Suddenly, the question of PC specifications has become even more important than before, because PCs incapable of delivering content of sufficient quality simply won’t work for VR. One of the appealing things about a Steam Machine ought to be the guarantee that I’ll be able to plug in a Vive headset and enjoy Valve’s VR, if not this year then at some point down the line; yet lacking any kind of certification that says “yes, this machine is going to be A-OK for VR experiences for now”, the risk of an expensive screw-up in the choice of machine to buy seems greater than ever before.
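
To put rough numbers on that frame-rate requirement (my own arithmetic, not anything Valve or HTC has published): the time available to render each frame shrinks quickly as the refresh target climbs, which is exactly why an under-specced machine and VR don’t mix.

```c
#include <stdio.h>

/* Frame-time budgets implied by common refresh targets.
 * Simple arithmetic for illustration, not a Valve or HTC spec. */
int main(void) {
    const double targets_fps[] = { 60.0, 90.0, 120.0 };
    for (int i = 0; i < 3; i++) {
        double budget_ms = 1000.0 / targets_fps[i]; /* ms per frame */
        printf("%.0f FPS -> %.2f ms per frame\n", targets_fps[i], budget_ms);
    }
    return 0;
}
```

At 60FPS the renderer gets roughly 16.7ms per frame; at 90FPS, barely 11ms. Miss that budget in VR and you get precisely the nauseating judder described above.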

I may be giving Steam Machines a hard time unfairly; it may be that Valve is actually going to slap the manufacturers into line and impose a clear, transparent way of measuring and certifying performance on the devices, giving consumers confidence in their purchases and lowering the bar to entry to PC gaming. I hope so; this is something that only Valve is in a position to accomplish and that is more important than ever with VR on the horizon and approaching fast. The lack of any such system in the details announced thus far is bitterly disappointing, though. Without it, Steam Machines are nothing more than a handful of small form-factor PCs running a slightly off-kilter OS; of no interest to hobbyists, inaccessible to anyone else, and completely lacking a compelling reason to exist.

Courtesy-GI.biz

Valve Develops Vulkan GPU Driver For Linux

March 10, 2015 by Michael  
Filed under Computing

Valve has developed its own Vulkan graphics driver for Intel GPUs on Linux, which it intends to open-source.

The Vulkan API is still being argued about and will not be finalized until later this year, but Valve has been developing its own Intel GPU reference driver for Vulkan to help early adopters bootstrap their code.

During its presentation at GDC 2015, Valve announced that the Intel Linux driver will be open-sourced, but it hasn’t provided a time-frame for doing so.

Valve also confirmed that the Source 2 Engine supports the alpha Vulkan API today and that Vulkan will be supported across the board on Steam Machines.

Intel graphics hardware might not be the sexiest, but there is a lot of it out there. It is also easy to target for an open-source driver given Intel’s extensive hardware specifications and programming documentation.

An Intel Vulkan driver for Linux that game developers can use from day one bodes well for solid support of the new API, and high performance was most likely a priority in its development by Valve and its likely partners at LunarG.
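
For a sense of what developers will eventually write against such a driver, here is a minimal instance-creation sketch in C. The API was not final at the time of Valve’s announcement, so this follows the Khronos Vulkan headers as they eventually shipped, rather than anything specific to Valve’s pre-release driver.

```c
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void) {
    /* Describe the application to the loader/driver. */
    VkApplicationInfo app = {0};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.pApplicationName = "hello-vulkan";
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo info = {0};
    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    /* Create the Vulkan instance - the entry point to the whole API. */
    VkInstance instance;
    if (vkCreateInstance(&info, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "vkCreateInstance failed\n");
        return 1;
    }

    /* Ask the driver how many physical GPUs it exposes. */
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, NULL);
    printf("Vulkan devices found: %u\n", count);

    vkDestroyInstance(instance, NULL);
    return 0;
}
```

On a Linux box with a Vulkan loader and driver installed, something like gcc hello.c -lvulkan should build it.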

Courtesy-Fud

Is Nintendo On The Right Course?

February 3, 2015 by Michael  
Filed under Gaming

Nintendo is heading back to black, with the company’s financial announcements this week revealing that it’s expecting to post a fairly reasonable profit for the full year. For a company that’s largely been mired in red ink since the end of the glory days of the Wii, that looks like pretty fantastic news; but since I was one of the people who repeatedly pointed out in the past when Nintendo’s quarterly losses were driven by currency fluctuations, not sales failures, it’s only fair that I now point out that quite the reverse is true. The Yen has fallen dramatically against the Dollar and the Euro in recent months, making Nintendo’s overseas assets and sales much more valuable in its end-of-year results – and this time, that’s covering over the fact that the company has missed its hardware sales targets for both the 3DS and the Wii U.

In short, all those “Nintendo back in profit” headlines aren’t really worth anything more than the “Nintendo makes shock loss” headlines were back when the Yen was soaring to all-time highs a few years ago. The company is still facing the same tough times this week that it was last week; the Wii U is still struggling to break 10 million units and the 3DS is seeing a major year-on-year decline in its sales, having faltered significantly after hitting the 50 million installed base mark.

In hardware terms, then, Nintendo deserves all the furrowed brows and concerned looks it’s getting right now. Part of the problem is comparisons with past successes, of course; the Wii shipped over a hundred million units and the DS, an absolute monster of a console, managed over 150 million. In reality, while the Wii U is having a seriously hard time in spite of its almost universally acclaimed 2014 software line-up, the 3DS isn’t doing badly at all; but it can’t escape comparison with its record-breaking older sibling, naturally enough.

Plenty of commentators reckon they know the answer to Nintendo’s woes, and they’ve all got the same answer; the company needs to ditch hardware and start selling its games on other platforms. Pokemon on iOS! Smash Bros on PlayStation! Mario Kart on Xbox! Freed from the limited installed base of Nintendo’s own hardware – and presumably, in the case of handheld titles, freed to experiment with new business models like F2P – the company’s games would reach their full potential, the expensive hardware division could be shut down and everyone at Nintendo could spend the rest of their lives blowing their noses on ¥10,000 notes.

I’m being flippant, yes, but there’s honestly not a lot more depth than that to the remedies so often proposed for Nintendo. I can’t help but find myself deeply unconvinced. For a start, let’s think about “Nintendo’s woes”, and what exactly is meant by the doom and gloom narrative that has surrounded the company in recent years. That the Wii U isn’t selling well is absolutely true; it’s doing better than the Dreamcast did, to pick an ominous example, but unless there’s a major change of pace the console is unlikely ever to exceed the installed base of the GameCube. Indeed, if you treat the Wii as a “black swan” in Nintendo’s home console history, a flare of success that the company never quite figured out how to bottle and repeat, then the Wii U starts to look like a continuation of a slow and steady decline that started with the Nintendo 64 (a little over thirty million consoles sold in total) and continued with the GameCube (a little over twenty million). That the 3DS is struggling to match the pace and momentum of the DS is also absolutely true; it’s captured a big, healthy swathe of the core Nintendo market but hasn’t broken out to the mass market in the way that the DS did with games like Brain Training.

Yet here’s a thing; in spite of the doom and gloom around downward-revised forecasts for hardware, Nintendo was still able to pull out a list of this year’s million-plus selling software that would put any other publisher in the industry to shame. The latest Pokemon games on 3DS have done nearly 10 million units; Super Smash Bros has done 6.2 million on 3DS and 3.4 million on the Wii U. Mario Kart 8 has done almost five million units, on a console that’s yet to sell 10 million. Also selling over a million units in the last nine months of 2014 on 3DS we find Tomodachi Life, Mario Kart 7 (which has topped 11 million units, life to date), Pokemon X and Y (nearly 14 million units to date), New Super Mario Bros 2 (over 9 million), Animal Crossing: New Leaf (nearly 9 million) and Kirby: Triple Deluxe. The Wii U, in addition to Mario Kart 8 and Super Smash Bros, had million-plus sellers in Super Mario 3D World and Nintendo Land.

That’s 12 software titles from a single publisher managing to sell over a million units in the first three quarters of a financial year – a pretty bloody fantastic result that only gets better if you add in the context that Nintendo is also 2014’s highest-rated publisher in terms of critical acclaim. Plus, Nintendo also gets a nice cut of any third-party software sold on its consoles; granted, that probably doesn’t sum up to much on the Wii U, where third-party games generally seem to have bombed, but on the 3DS it means that the company is enjoying a nice chunk of change from the enormous success of Yokai Watch, various versions of which occupied several slots in the Japanese software top ten for 2014, among other successful 3DS third-party games.

Aha, say the advocates of a third-party publisher approach for Nintendo, that’s exactly our point! The company’s software is amazing! It would do so much better if it weren’t restrained by only being released on consoles that aren’t all that popular! Imagine how Nintendo’s home console games would perform on the vastly faster-selling PS4 (and imagine how great they’d look, intones the occasional graphics-obsessive); imagine how something like Tomodachi Life or Super Smash Bros would do if it was opened up to the countless millions of people with iOS or Android phones!

Let’s take those arguments one at a time, because they’re actually very different. Firstly, home consoles – a sector in which there’s no doubt that Nintendo is struggling. The PS4 has got around twice the installed base of the Wii U after only half the time on the market; it’s clear where the momentum and enthusiasm lies. Still, Super Smash Bros and Mario Kart 8 managed to sell several million copies apiece on Wii U; in the case of Mario Kart 8, around half of Wii U owners bought a copy. Bearing in mind that Nintendo makes way more profit per unit from selling software on its own systems than it would from selling it on third-party consoles (where it would, remember, be paying a licensing fee to Sony or Microsoft), here’s the core question; could it sell more copies of Mario Kart 8 on other people’s consoles than it managed on its own?

If you think the answer to that is “yes”, here’s what you’re essentially claiming; that there’s a large pent-up demand among PlayStation owners for Mario Kart games. Is there really? Can you prove that, through means other than dredging up a handful of Reddit posts from anonymous people saying “I’d play Nintendo games if they were 1080p/60fps on my PS4”? To me, that seems like quite a big claim. It’s an especially big claim when you consider the hyper-competitive environment in which Nintendo would be operating on the PS4 (or Xbox One, or both).

Right now, a big Nintendo game launching on a Nintendo console is a major event for owners of that console. I think Nintendo launches would still be a big event on any console, but there’s no doubt that the company would lose focus as a third-party publisher – sure, the new Smash Bros is out, but competing for attention, pocket money and free time against plenty of other software. It’s not that I don’t think Nintendo games could hold their own in a competitive market, I merely don’t wish to underestimate the focus that Nintendo acquires by having a devoted console all of their own underneath the TVs of millions of consumers – even if it’s not quite the number of millions they’d like.

How about the other side of the argument, then – the mobile games aspect? Nintendo’s position in handheld consoles may not be what it used to be, but the 3DS has roundly trounced the PlayStation Vita in sales terms. Sure, iPhones and high-end Android devices have much bigger installed bases (Apple shifted around 75 million iPhones in the last quarter, while the lifetime sales of the 3DS are only just over 50 million), but that comparison isn’t necessarily a very useful one. All 50 million 3DS owners bought an expensive device solely to play games, and the lifetime spend on game software of each 3DS owner runs into hundreds of dollars. The “average revenue per user” calculation for Pokemon on the 3DS is easy; everyone paid substantial money for the game up front.

By comparison, lots and lots of iOS and Android users never play games at all, and many of those who play games never pay for them. That’s fine; that’s the very basis of the F2P model, and games using that model effectively can still make plenty of money while continuing to entertain a large number (perhaps even a majority) of players who pay nothing. Still, the claim that moving to smartphones is a “no-brainer” for Nintendo is a pretty huge one, taken in this context. The market for premium, expensive software on smartphones is very limited and deeply undermined by F2P; the move to F2P for Nintendo titles would be creatively difficult for many games, and even for ones that are a relatively natural fit (such as Pokemon), it would be an enormous commercial risk. There’s a chance Nintendo could get it right and end up with a Puzzle & Dragons sized hit on its hands (which is what it would take to exceed the half a billion dollars or so the company makes from each iteration of Pokemon on 3DS); there’s also an enormous risk that the company could get it wrong, attracting criticism and controversy around poor decisions or misjudged sales techniques, and badly damage the precious Pokemon brand itself.

In short, while I’m constantly aware that the market seems to be changing faster than Nintendo is prepared to keep up with, I’m not convinced that any of the company’s critics actually have a better plan right now than Satoru Iwata’s “stay the course” approach. If you believe that PlayStation fans will flock to buy Nintendo software on their console, you may think differently; if you think that the risk and reward profile of the global iOS market is a better bet than the 50-odd million people who have locked themselves in to Nintendo’s 3DS platform and shown a willingness to pay high software prices there, then similarly, you’ll probably think differently. Certainly, there’s some merit to the idea that Nintendo ought to be willing to disrupt its own business in order to avoid being disrupted by others – yet there’s a difference between self-disruption and just hurling yourself headlong into disaster in the name of “not standing still”.

There’s a great deal that needs to be fixed at Nintendo; its marketing and branding remains a bit of a disaster, its relationships with third-party studios and publishers are deeply questionable and its entire approach to online services is incoherent at best. Yet this most fundamental question, “should Nintendo stay in the hardware business”, remains a hell of a lot tougher than the company’s critics seem to believe. For now, beleaguered though he may seem, Iwata still seems to be articulating the most convincing vision for the future of the industry’s most iconic company.

Courtesy-GI.biz

Do Game Developers Have Unrealistic Expectations?

January 22, 2015 by Michael  
Filed under Gaming

Over the last few years, the industry has seen budget polarization on an enormous scale. The cost of AAA development has ballooned, and continues to do so, pricing out all but the biggest warchests, while the indie and mobile explosions are rapidly approaching the point of inevitable over-saturation and consequential contraction. Stories about the plight of mid-tier studios are ten-a-penny, with the gravestones of some notable players lining the way.

For a company like Ninja Theory, in many ways the archetypal mid-tier developer, survival has been a paramount concern. Pumping out great games (Ninja Theory has a collective Metacritic average of 75) isn’t always enough. Revitalizing a popular IP like DMC isn’t always enough. Working on lucrative and successful external IP like Disney Infinity isn’t always enough. When the fence between indie and blockbuster gets thinner and thinner, it becomes ever harder to balance upon.

Last year, Ninja Theory took one more shot at the upper echelons. For months the studio had worked on a big budget concept which would sit comfortably alongside the top-level, cross-platform releases of the age: a massive, multiplayer sci-fi title that would take thousands of combined, collaborative hours to exhaust. Procedurally generated missions and an extensive DLC structure would ensure longevity and engagement. Concept art and pre-vis trailers in place, the team went looking for funding. Razor was on its way.

Except the game never quite made it. Funding failed to materialize, and no publisher would take the project on. It didn’t help that the search for a publishing deal arrived almost simultaneously with the public announcement of Destiny. Facing an impossible task, the team abandoned the project and moved on with other ideas. Razor joined a surprisingly large pile of games that never make it past the concept stage.

Sadly, it’s not a new story. In fact, at the time, it wasn’t even a news story. But this time Ninja Theory’s reaction was different. This was a learning experience, and learning experiences should be shared. Team lead and co-founder Tameem Antoniades turned the disappointment not just into a lesson, but a new company ethos: involve your audience at an early stage, retain control, fund yourself, aim high, and don’t compromise. The concept of the Independent AAA Proposition, enshrined in a GDC presentation given by Antoniades, was born.

Now the team has a new flagship prospect, cemented in this fresh foundation. In keeping with the theme of open development and transparency, Hellblade is being created with the doors to its development held wide open, with community and industry alike invited to bear witness to the minutiae of the process. Hellblade will be a cross-platform game with all of the ambition for which Ninja Theory is known, and yet it is coming from an entirely independent standpoint. Self-published and self-governed, Hellblade is the blueprint for Ninja Theory’s future.

“We found ourselves as being one of those studios that’s in the ‘squeezed middle’,” project lead Dominic Matthews says. “We’re about 100 people, so we kind of fall into that space where we could try to really diversify and work on loads of smaller projects, but indie studios really have an advantage over us, because they can do things with far lower overheads. We have been faced with this choice of, do we go really, really big with our games and become the studio that is 300 people or even higher than that, and try to tick all of these boxes that the blockbuster AAA games need now.

“We don’t really want to do that. We tried to do that. When we pitched Razor, which we pitched to big studios, that ultimately didn’t go anywhere. That was going to be a huge game; a huge game with a service that would go on for years and would be a huge, multiplayer experience. Although I’m sure it would have been really cool to make that, it kind of showed to us that we’re not right to try to make those kinds of games. Games like Enslaved – trying to get a game like that signed now would be impossible. The way that it was signed, there would be too much pressure for it to be…to have the whole feature set that justifies a $60 price-tag.

“That $60 price-tag means games have to add multiplayer, and 40 hours of gameplay minimum, and a set of characters that appeal to as many people as they possibly can. There’s nothing wrong with games that do that. There’s some fantastic games that do, AAA games. Though we do think that there’s another space that sits in-between. I think a lot of indie games are super, super creative, but they can be heavily stylised. They work within the context of the resources that people have.

“We want to create a game that’s like Enslaved, or like DMC, or like Heavenly Sword. That kind of third-person, really high quality action game, but make it work in an independent model.”

Cutting out the middle-man is a key part of the strategy. But if dealing with the multinational machinery of ‘big pubs’ is what drove Ninja Theory to make such widespread changes, there must surely have been some particularly heinous deals that pushed it over the edge?

“I think it’s just a reality of the way that those publisher/developer deals work,” Matthews says. “In order for a publisher to take a gamble on your game and on your idea, you have to give up a lot. That includes the IP rights. It’s just the realities of how things work in that space. For us, I think any developer would say the same thing, being able to retain your IP is a really important thing. So far, we haven’t been able to do that.

“With Hellblade, it’s really nice that we can be comfortable in the fact that we’re not trying to appeal to everyone. We’re not trying to hit unrealistic forecasts. Ultimately, I think a lot of games have unrealistic forecasts. Everyone knows that they’re unrealistic, but they have to have these unrealistic forecasts to justify the investment that’s going into development.

“Ultimately, a lot of games, on paper, fail because they don’t hit those forecasts. Then the studios and the people that made those games, they don’t get the chance to make any more. It’s an incredibly tough market. Yes, we’ve enjoyed working with our publishers, but that’s not to say that the agreements that developed are all ideal, because they’re not. The catalyst to us now being able to do this is really digital distribution. We can break away from that retail $60 model, where every single game has to be priced that way, regardless of what it is.”

Matthews believes that publishers, driven into funding only games that will comfortably shift five or six million units, have no choice but to stick to the safe bets, a path that eventually winnows down diversity to the point of stagnation, where only a few successful genres ever end up getting made: FPS, sports, RPG, maybe racing. Those genres become less and less distinct, while simultaneously shoe-horning in mechanics that prove popular elsewhere and shunning true innovation.

While perhaps briefly sustainable, Matthews sees that as a creative cul-de-sac. Customers, he feels, are too smart to put up with it.

“Consumers are going to get a bit wary of games that have hundreds of millions of dollars spent on them”

“I think consumers are going to get a bit wary. Get a bit wary of games that have hundreds of millions of dollars spent on them. I think gamers are going to start saying, ‘For what?’

“The pressures are for games to appeal to more and more people. It used to be if you sold a million units, then that was OK. Then it was three million units. Now it’s five million units. Five million units is crazy. We’ve never sold five million units.”

It’s not just consumers who are getting wise, though. Matthews acknowledges that the publishers also see the dead-end approaching.

“I think something has to be said for the platform holders now. Along with digital distribution, the fact that the platform holders are really opening their doors and encouraging self-publishing and helping independent developers to take on some of those publishing responsibilities has changed things for us. I think it will change things for a lot of other developers.

“Hellblade was announced at the Gamescom PlayStation 4 press conference. My perception of that press conference was that the real big hitters in that were all independent titles. It’s great that the platform holders have recognised that. There’s a real appetite from their players for innovative, creative games.

“It’s a great opportunity for us to try to do things differently. Like on Hellblade, we’re questioning everything that we do. Not just on development, but also how we do things from a business perspective as well. Normally you would say, ‘Well, you involve these types of agencies, get these people involved in this, and a website will take this long to create.’ The next thing that we’re doing is, we’re saying, ‘Well, is that true? Can we try and do these things a different way,’ because you can.

“There’s definitely pressure for us to fill all those gaps left by a publisher, but it’s a great challenge for us to step up to. Ultimately, we have to transition into a publisher. That’s going to happen at some point, if we want to publish our own games.”

Courtesy-GI.biz

Were PS4 Sales Flat Over The Holiday?

January 7, 2015 by Michael  
Filed under Gaming

While the Sony PlayStation 4 has been selling very well, it seems that Christmas was not really its season.

Sony said that the PlayStation 4 has sold more than 18.5 million units since the new generation of consoles launched. While that is good and makes the PS4 the fastest selling PlayStation to date, there was no Christmas peak.

You would think that the PS4 would sell well at Christmas as parents were forced to do grievous bodily harm to their credit cards to shut their spoilt spawn up during the school holidays. But apparently not.

Apparently, the weapon of choice against precious snowflakes being bored was an Xbox One which saw a Christmas spike in sales.

Sony said that its new numbers are pretty much on target; it sold at the expected rate of 2 million units per month.

Redmond, meanwhile, will be happy with its Christmas spike, even if the Xbox One still has a long way to go before it matches the PlayStation 4 on sales.

Courtesy-Fud

Are Buggy Games Getting A Pass?

December 23, 2014 by Michael  
Filed under Gaming

Recently, my smartphone started acting up. I think the battery is on the way out; it does bizarre things, like shutting itself off entirely when I try to take a picture on 60 per cent battery, or suddenly dropping from fully charged to giving me “10 per cent remaining, plug me in or else” warnings for no reason at all. I can get it fixed free of charge, but it’s an incredibly frustrating, bothersome thing, especially given how much money I’ve paid for this phone. Most of us have probably had an experience like this with a piece of hardware; a shoddy washing machine that mangled your favorite shirt, a shiny new LCD screen with an intensely irritating dead pixel, an Xbox 360 whose Red Ring of Death demanded a lengthy trip back to the service center. There are few of us who can’t identify with the utter frustration of having a consumer product that you’ve paid good money for simply fail to do its job properly. Sure, it’s a #FirstWorldProblem for the most part (unless it’s something like a faulty airbag in your Honda, obviously), but it’s intensely annoying and certainly makes you less likely to buy anything from that manufacturer again.

Given that we could all probably agree that a piece of hardware being faulty is utterly unacceptable, I’m not sure why software seems to get a free pass sometimes. Sure, there are lots of consumers who complain bitterly about buggy games, but by and large games with awful quality control problems tend to get slapped with labels like “flawed but great”, or have their enormous faults explained in a review only to see the final score reflect none of those problems. It’s not just the media that does this (and for what it’s worth, I don’t think this is corruption so much as an ill-considered aspect of media culture itself); for every broken game, there are a host of consumers out there ready to defend it to the hilt, for whatever reason.

I raise this problem because, while buggy games have always been with us – often hilariously, especially back in the early days of the PlayStation – the past year or so has seen a spate of high-profile, problematic games being launched, suggesting that even some of the industry’s AAA titles are no longer free from truly enormous technical issues. The technical problems that have become increasingly prevalent in recent years are causing genuine damage to the industry; from the botched online launches of games like Driveclub and Battlefield through to the horrendous graphical problems that plague some players of Assassin’s Creed Unity, they are giving consumers terrible experiences of what should be high points for the medium, creating a loud and outspoken group of disgruntled players who act to discourage others, and helping to drive a huge wedge between media (who, understandably, want to talk about the experience and context of a game rather than its technical details) and consumers (who consider a failure to address glaring bugs to be a sign of collusion between media and publishers, and a failure on the part of the media to serve their audience).

We can all guess why this is happening. I don’t wish in any way to underplay how complex and difficult it is to develop bug-free software; I write software tools to assist in my research work, and given how often those simple tools, developed by two or three people at most, have me tearing my hair out at 3am as I search for the single misplaced character that’s causing the whole project to behave oddly, I am absolutely the last person in the world who is going to dismiss the difficulty involved in debugging something as enormous and complex as a modern videogame. Debugging games has inevitably become harder as team sizes and technical complexity have grown; that’s to be expected.

However, just because something is harder doesn’t mean it shouldn’t be happening, and that’s the second part of this problem. Games are developed to incredibly tight schedules, sometimes even tighter today (given the culture of annual updates to core franchises) than they were in the past. Enormous marketing budgets are preallocated and planned out to support a specific release date. The game can’t miss that date; if there are show-stopping bugs, the game will just have to ship with those in place, and with a bit of luck they’ll be able to fix them in time to issue a day-one digital patch (and if your console isn’t online, tough luck).

Yet this situation is artificial in itself. It’s entirely possible to structure your company’s various divisions around the notion that a game will launch when it’s actually ready, and ensure that you only turn out high-quality software; Nintendo, in particular, manages this admirably. Certainly, some people criticise the company for delaying software and it does open up gaps in the release schedule, but compared to the enormous opprobrium which would be heaped upon the company if it turned out a Mario Kart game where players kept falling through the track, or a Legend of Zelda where Link’s face kept disappearing, leaving only eyes and teeth floating ghoulishly in negative space (sleep well, kids!), an occasional delay is a corporate cultural decision that makes absolute sense – not only for Nintendo, but for game companies in general.

It doesn’t even have to go as far as delaying games on a regular basis. There is a strong sense that some of the worst offenders in terms of buggy games simply aren’t taking QA seriously, which is something that absolutely needs to be fixed – and if not, deserves significant punishment from consumers and critics alike. Quality control has a bit of an image problem; there’s a standard stereotype of a load of pizza-fuelled youngsters in their late teens testing games for a few years as they try to break into a “real” games industry job. The image doesn’t come from thin air; for some companies, this is absolutely a reality. It is, however, utterly false to think that every company sees its QA in those terms. For companies that take QA seriously, it’s a division that’s respected and well-treated, with its own career progression tracks, all founded on the basic understanding that a truly good QA engineer is worth his or her weight in gold.

Not prioritising your QA department – not ensuring that it’s a division that’s filling up with talented, devoted people who see QA as potentially being a real career and not just a stepping stone – is exactly the same thing as not prioritising your consumers. Not building time for proper QA into your schedules, or failing to enact processes which ensure that QA is being properly listened to and involved, is nothing short of a middle finger raised to your entire consumer base – and you only get to do that so many times before your consumers start giving the gesture right back to you and your precious franchises.

Media does absolutely have a role to play in this – one to which it has, by and large, not lived up. Games with serious QA problems do not deserve critical acclaim. I understand fully that reviewers want to engage with more interesting topics than technical issues, but I think it’s worth thinking about how film reviewers would treat a movie with unfinished special effects or audio mixed such that voices can’t be heard; or perhaps how music reviewers would treat an album with a nasty recording hiss in the background, or with certain tracks accidentally dropping out or skipping. Regardless of the good intentions of the creative people involved in these projects, the resulting product would be slammed, and rightly so. It’s perhaps the very knowledge of the drubbing that they would receive that means that such awful movies and albums almost never see the light of day (and when they do, they become legendary in their awfulness; consider the unfinished CGI Scorpion King at the end of “The Mummy Returns”, which remains a watchword for terrible special effects many years later).

Game companies, by contrast, seem to feel unpleasantly comfortable with releasing games that don’t work and aren’t properly tested. Certain technical aspects probably contribute to this; journalists may be wary of slamming a game for bugs that may be fixed in a day-one patch, for instance. Yet it seems that there’s little choice but to make the criteria stricter in this regard. If media and consumers alike do not take to punishing companies severely for failing to pay proper respect to QA procedures for their games, this problem will only worsen as firms realize that they can get away with launching unfinished software.

We all want a world where technical issues are nothing but a footnote in the discussion of games; that will be the ultimate triumph of game technology, when it truly becomes transparent. We do not, however, live in that time yet, and the regular launches of games that don’t live up to even the most basic standards of quality is something nobody should be asked to tolerate. The move by some websites to stop reviewing online games until the servers are live and populated with real players is a good start; but the overall tolerance for bugs and willingness to forgive publishers for such transgressions (“we know the last game was a buggy mess, but we’re still going to publish half a dozen puff pieces that will push our readers to pre-order the sequel!”) needs to be fixed. If we want to talk about the things that are important about games (and we do!), it’s essential that we fix the culture that ignores QA and technical issues first.

Courtesy-GI.biz

Are Indie Developers Dying Out?

December 22, 2014 by Michael  
Filed under Gaming

For independent developers, the last decade has been an endless procession of migratory possibilities. The physical world was defined by compromise, dependence and strategically closed doors, but the rise of digital afforded freedom and flexibility in every direction. New platforms, new business models, new methods of distribution and communication; so many fresh options appeared in such a brief window of time that knowing where and when to place your bet was almost as important as having the best product. For a few years, right around 2008, there was promise almost everywhere you looked.

That has changed. No matter how pregnant with potential they once seemed, virtually every marketplace has proved unable to support the spiralling number of new releases. If the digital world is one with infinite shelf-space for games, it has offered no easy solutions on how to make them visible. Facebook, Android, iOS, Xbox Live Arcade, the PlayStation Network; all have proved to be less democratic than they first appeared, their inevitable flaws exposed as the weight of choice became heavier and heavier. As Spil Games’ Eric Goossens explained to me at the very start of 2014: “It just doesn’t pay the bills any more.”

Of course, Goossens was talking specifically about indie development of casual games. And at that point, with 2013 only just receding from view, I would probably have named one exception to the trend, one place where the balance between volume and visibility gave indies the chance to do unique and personal work and still make a decent living. That place would have been Steam, and if I was correct in my assessment for even one second, it wasn’t too long before the harsher reality became clear.

After less than five months of 2014 had passed, Valve’s platform had already added more new games than in the whole of the previous year. Initiatives like Greenlight and Early Access were designed to make Steam a more open and accessible platform, but they were so effective that some of what made it such a positive force for indies was lost in the process. Steam’s culture of deep-discounting has become more pervasive and intense in the face of this chronic overcrowding, stirring up impassioned debate over what some believe will be profound long-term effects for the perceived value of PC games. Every discussion needs balance, but in this case the back-and-forth seemed purely academic: for a lot of developers steep discounts are simply a matter of survival, and precious few could even entertain the notion of focusing on the greater good instead.

And the indie pinch was felt beyond Steam’s deliberately weakened walls. Kickstarter may be a relatively new phenomenon – even for the hyper-evolving landscape of the games industry – but it faced similar problems in 2014, blighted by the twin spectres of too much content and not enough money to go around. Anecdotally, the notion that something had changed was lurking in the background at the very start of the year, with several notable figures struggling to find enough backers within the crowd. The latter months of 2014 threw up a few more examples, but they also brought something close to hard evidence that ‘peak Kickstarter’ may already be behind us – fewer successful projects, lower funding targets, and less money flowing through the system in general. None of which was helped by a handful of disappointing failures, each one a blow for the public’s already flagging interest in crowdfunding. Yet another promising road for indies had become more treacherous and uncertain.

So are indies heading towards a “mass extinction event”? Overcrowding is certainly a key aspect of the overall picture, but the act of making and releasing a game is only getting easier, and the allure of development as a career choice seems to grow with each passing month. It stands to reason that there will continue to be a huge number of games jostling for position on every single platform – more than even a growing market can sustain – but there’s only so much to be gained from griping about the few remaining gatekeepers. If the days when simply being on Steam or Kickstarter made a commercial difference are gone, and if existing discovery tools still lack the nuance to deal with all of that choice, then it just shifts the focus back to where it really belongs: talent, originality, and a product worth an investment of time and money.

At GDC Europe this summer, I was involved in a private meeting with a group of Dutch independent game developers, all sharing knowledge and perspective on how to find success. We finished that hour agreeing on much the same thing. There are few guarantees in this or any other business, but the conditions have also never been more appropriate for personality and individuality to be the smartest commercial strategy. The world has a preponderance of puzzle-platformers, but there’s only one Monument Valley. We’re drowning in games about combat, but This War of Mine took a small step to the left and was greeted with every kind of success. Hell, Lucas Pope made an entire game about working as a border control officer and walked away with not just a hit, but a mantelpiece teeming with the highest honours.

No matter how crowded the market has become, strong ideas executed with care are still able to rise above the clamour, no huge marketing spend required. As long as that’s still possible, indies have all of the control they need.

Courtesy-GI.biz

Is Borderlands Headed To The Xbox One And PS4?

December 17, 2014 by Michael  
Filed under Gaming

Sources are citing a rating spotted in the Australian classification database that seems to point to an upcoming Remastered Edition of Borderlands for Xbox One and PlayStation 4. So far this has remained unconfirmed by publisher 2K and franchise developer Gearbox.

The new remastered version is expected to be called simply “Borderlands Remastered Edition”, but with no confirmation from 2K and Gearbox it is difficult to say what it might contain, or whether it is simply a converted and compiled version of the first three games for the Xbox One and PlayStation 4.

Bottom line: if it is in fact a compiled remastered release of the first three games, this could actually be a good thing for those that own the new consoles.

Courtesy-Fud

Sony Hires FireEye To Assist With Data Breach

December 2, 2014 by Michael  
Filed under Computing

Sony Pictures Entertainment has hired FireEye’s Mandiant forensics unit to clean up a cyber attack that knocked out the studio’s computer network nearly a week ago, and resulted in three movies ending up online.

The FBI is also investigating the incident. Sony’s network went down last Monday after computers displayed a red skull and the phrase “Hacked By #GOP,” which reportedly stands for Guardians of Peace. Emails to Sony have been bouncing back with messages asking senders to call employees because the system was “experiencing a disruption.”

Mandiant is an incident response firm that helps victims of breaches identify the extent of attacks, clean up networks and restore systems. The firm has handled some of the largest breaches uncovered to date, including the 2013 holiday attack on Target. Sony is investigating to determine whether hackers working on behalf of North Korea launched the attack in retribution for the studio’s backing of the film “The Interview”, which is to be released on Dec. 25 in the United States and Canada.

The movie is a comedy about a CIA attempt to assassinate North Korean leader Kim Jong Un, who is such a funny guy. The Pyongyang government denounced the film as “undisguised sponsoring of terrorism, as well as an act of war” in a letter to UN Secretary-General Ban Ki-moon.

Courtesy-Fud

Was Sony’s PlayStation Network Hacked Again?

November 25, 2014 by Michael  
Filed under Gaming

Sony has denied the claims of DerpTrolling, a hacker group which claimed it had raided the databases of the PSN, along with a number of other online services.

The group had published a list of emails and passwords for PSN, Windows Live Mail and 2K Games accounts online, and claimed to be prepared to release more, but Sony says the credentials came from sources other than a breach of its network.

“We have investigated the claims that our network was breached and have found no evidence that there was any intrusion into our network,” the company wrote in a declaration to Joystiq. “Unfortunately, Internet fraud including phishing and password matching are realities that consumers and online networks face on a regular basis. We take these reports very seriously and will continue to monitor our network closely.”

Courtesy-GI.biz

Is World Of Warcraft On The Rise?

November 24, 2014 by Michael  
Filed under Gaming

Blizzard is happy, and why shouldn’t it be? World of Warcraft subscriptions are up, and the increase can be traced to the recently released expansion pack, Warlords of Draenor, which has driven subscriptions to 10 million.

Warlords of Draenor sold over 3.3 million copies on its first day alone, and growth has been seen in all major territories since release. The numbers do include players using the one month of free game time that comes with the expansion pack. WoW subscriptions had climbed to 7.4 million last quarter after a period of decline.

Of course, the release of Warlords of Draenor has not been without its problems, but Blizzard says it is working around the clock to address them. Players have been offered free game time as compensation.

Courtesy-Fud

Will Sunset Overdrive Make It To The PC?

November 21, 2014 by Michael  
Filed under Gaming

Microsoft has already seen a number of Xbox One exclusive titles ported to the PC. Both Dead Rising 3 and Ryse have made the jump, and we are now hearing once more that Sunset Overdrive is heading to the PC, with Forza Horizon 2 possibly following.

This is not the first time we have heard rumors of Sunset Overdrive coming to the PC. An ad that suggested as much was downplayed at the time by Insomniac as a mistake. Now both Sunset Overdrive and Forza Horizon 2 have shown up on Amazon France as coming to the PC.

Phil Spencer has suggested that Microsoft will have more to say about the PC in 2015, and that it would be a good thing for PC gamers. The reality is that Microsoft has not pushed PC game development in a long time, as it chose to focus on titles for the Xbox and Xbox 360. With the Xbox One being closer in design to the PC, porting a title is easier, and Microsoft of course wants to be a player in this space.

We will have to wait and see what actually happens, but should Sunset Overdrive and Forza Horizon 2 make their way to the PC, it will be a good thing for PC gamers. Then again it could just be nothing more than a mistake.

Courtesy-Fud

Should AMD And nVidia Get The Blame For Assassin’s Creed’s PC Issues?

November 19, 2014 by Michael  
Filed under Gaming

Ubisoft is claiming that the reason its latest Assassin’s Creed game ran so badly is certain AMD and Nvidia configurations. Last week Ubisoft was panned for releasing a game which was clearly not ready, and it originally blamed AMD for the faults. Now Ubisoft has amended the original forum post to acknowledge problems on Nvidia hardware as well.

Originally the post read “We are aware that the graphics performance of Assassin’s Creed Unity on PC may be adversely affected by certain AMD CPU and GPU configurations. This should not affect the vast majority of PC players, but rest assured that AMD and Ubisoft are continuing to work together closely to resolve the issue, and will provide more information as soon as it is available.”

However, there is no equivalent Nvidia-centric post on the main forum, and no mention of the fact that owners of any Nvidia card which is not a GTX 970 or 980 are affected as well. What is amazing is that, with the problems so widespread, Ubisoft did not see them in its own testing before sending the game out to the shops. Unless it only played the game on an Nvidia GTX 970 and did not bother to test it on a console, it is inconceivable that it could not have seen them.

Courtesy-Fud