
The Witcher Franchise Tops 25 Million Units

April 12, 2017 by  
Filed under Gaming

The Witcher 3: Wild Hunt continues to pay off for CD Projekt. The Polish publisher today reported its financial results for calendar year 2016, and the hit 2015 role-playing game loomed large over another successful campaign for the company.

CD Projekt said its revenues “continued to be dominated by ongoing strong sales” of The Witcher 3 and its two expansions. While the base game and its first expansion debuted in 2015, the second and final expansion pack, Blood and Wine, arrived last May and helped drive revenues of 583.9 million PLN ($148.37 million). That was down almost 27 percent year-over-year, but still well beyond the company’s sales figures prior to 2015. Net profits were likewise down almost 27 percent, with the company posting a bottom-line gain of 250.5 million PLN ($63.65 million).

The company also announced a new milestone for the Witcher franchise, saying the three games have now cumulatively topped 25 million copies sold, a number that doesn’t include The Witcher 3 expansion packs. That suggests 2016 saw roughly 5 million copies sold over the 20 million reported in CD Projekt’s 2015 year-end financials.
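
As a quick sanity check on those figures, a few lines of arithmetic recover the implied 2015 baselines and the exchange rate used. This is purely illustrative: “almost 27 percent” is approximate, so the derived numbers are rough.

```python
# Back-of-the-envelope check of the CD Projekt figures quoted above.
rev_2016_pln = 583.9e6       # reported 2016 revenue, PLN
profit_2016_pln = 250.5e6    # reported 2016 net profit, PLN
decline = 0.27               # "down almost 27 percent" year-over-year

# value_2016 = value_2015 * (1 - decline), so invert to get 2015
rev_2015_pln = rev_2016_pln / (1 - decline)        # ~800M PLN implied
profit_2015_pln = profit_2016_pln / (1 - decline)  # ~343M PLN implied

pln_per_usd = 583.9 / 148.37   # exchange rate implied by the article

units_2016 = 25e6 - 20e6       # 25M cumulative now vs 20M at end of 2015

print(f"Implied 2015 revenue: {rev_2015_pln / 1e6:.0f}M PLN")
print(f"Implied 2015 profit:  {profit_2015_pln / 1e6:.0f}M PLN")
print(f"Implied FX rate:      {pln_per_usd:.2f} PLN/USD")
print(f"2016 franchise sales: ~{units_2016 / 1e6:.0f}M copies")
```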

Even if this year saw overall sales take a dip for CD Projekt, its GOG.com online retail storefront still managed to post its best year ever. The company reported GOG.com revenues of 133.5 million PLN ($33.92 million), up 15% year-over-year.

CD Projekt is currently testing its Gwent free-to-play card game in closed beta, and intends to open it up to the public this spring. It is also working on its next AAA game, Cyberpunk 2077, though it has no release date as yet.

Courtesy-GI.biz

Can Violence In A Game Promote Safety?

March 30, 2017 by  
Filed under Gaming

When the original Doom was released in 1993, its unprecedentedly realistic graphic violence fueled a moral panic among parents and educators. Over time, the game’s sprite-based gore has lost much of its impact; to modern eyes, that previous sentence likely sounds absurd.

Given what games have depicted in the nearly quarter century since Doom, that level of violence is no longer shocking so much as it is quaint, perhaps even endearing. So when it came time for id Software to reboot the series with last year’s critically acclaimed remake of Doom, one of the things the studio had to consider was exactly how violent it should be, and to what end.

Speaking with GamesIndustry.biz at the Game Developers Conference last month, the Doom reboot’s executive producer and game director Marty Stratton and creative director Hugo Martin acknowledged that the context of the first Doom’s violence had changed greatly over the years. And while the original’s violence may have been seen as horrific and shocking, they wanted the reboot to skew closer to cartoonishly entertaining or, as they put it, less Saw and more Evil Dead 2.

“We were going for smiles, not shrieks,” Martin said, adding, “What we found with violence is that more actually makes it safer, I guess, or just more acceptable. It pushes it more into the fun zone. Because if it’s a slow trickle of blood out of a slit wrist, that’s Saw. That’s a little bit unsettling, and sort of a different type of horror. If it’s a comical fountain of Hawaiian Punch-looking blood out of someone’s head that you just shot off, that’s comic book. That’s cartoonish, and that’s what we wanted.”

“They’re demons,” Stratton said. “We don’t kill a single human in all of Doom. No cursing, no nudity. No killing of humans. We’re actually a pretty tame game when you think about it. I’ve played a lot of games where you just slaughter massive amounts of human beings. I think if we had to make some of the decisions we make about violence and the animations we do and if we were doing them to humans, we would have completely different attitudes when we go into those discussions. It’s fun to sit down in a meeting and think about all the ways it would be cool to rip apart a pinky demon or an imp. But if we had the same discussions about, ‘How am I going to rip this person in half?’ or rip his arm off and beat him over the head with it, it takes on a different connotation that I don’t know would be as fun.”

That balancing act between horror and comedy paid off for the reboot, but it was by no means the only line last year’s Doom had to straddle. There was also the question of what a modern Doom game would look like. The first two Doom games were fast-paced shooters, while the third was a much slower horror-tinged game where players had to choose between holding a gun or a flashlight at the ready. Neither really fit into the recent mold of AAA shooters, and the developers knew different people would have very different expectations for a Doom game in 2016.

As Stratton explained, “At that point, we went to, ‘What do we want? What do we think a Doom game should be moving forward?’ As much as we always consider how the audience is going to react to the game–what they’re thinking, and what we think they want–back in the very beginning, it was, ‘What do we think Doom should be, and what elements of the game do we want to build the future of Doom on?’ And that’s really where we came back to Doom 1, Doom II, the action, the tone, the attitude, the personality, the character, the irreverence of it… those were all key words that we threw up on the board in those early days. And then mechanically, it was about the speed. It was about unbelievable guns, crazy demons, really being very honest about the fact that it was Doom. It was unapologetic early on, and we built from there.”

It helped that they had a recent example of how not to bring Doom into the current generation. Prior to the Doom reboot, id Software had been working on Doom 4, which Stratton said was a good game, but just didn’t feel like Doom. For one, it cast players as a member of a resistance army rather than a one-marine wrecking crew. It was also slower from a gameplay perspective, utilizing a cover-based system shared by numerous modern shooters designed to make the player feel vulnerable.

“None of us thought that the word ‘vulnerable’ belonged in a proper Doom game,” Martin said. “You should be the scariest thing in the level.”

Doom 4 wasn’t a complete write-off, however. The reboot’s glory kill system of over-the-top executions actually grew out of a Doom 4 feature, although Stratton said they made it “faster and snappier.”

Of course, not everything worked as well. At one point the team tried giving players a voice in their ears to help guide them through the game, a pretty standard first-person shooter device along the lines of Halo’s Cortana. Stratton said while the device works well for other franchises, it just didn’t feel right for Doom, so it was quickly scrapped.

“We didn’t force anything,” Stratton said. “If something didn’t feel like Doom, we got rid of it and tried something that would feel like Doom.”

That approach paid off well for the game’s single-player mode, but Stratton and Martin suggested they weren’t quite as thrilled with multiplayer. Both are proud of the multiplayer (which continues to be worked on) and confident they delivered a high-quality experience with it, but each had his misgivings. Stratton said if he could change one thing, it would have been to re-do the multiplayer progression system and give more enticing or better placed “hooks” to keep players coming back for game after game. Martin wished the team had messaged what the multiplayer would be a little more clearly, saying too many expected a pure arena shooter along the lines of Quake 3 Arena, when that was never the development team’s intent.

Those issues aside, it’s clear the pair feel the new wrinkles and changes they made to the classic Doom formula paid off more often than not.

“Lots worked,” Stratton said. “That’s probably the biggest point of pride for us. The game really connected with people. We always said we wanted to make something that was familiar to long-time fans, felt like Doom from a gameplay perspective and from a style and tone and attitude perspective. And I think we really accomplished that at a high level. And I think we made some new fans, which is always what you’re trying to do when you have a game that’s only had a few releases over the course of 25 years… You’re looking to bring new people into the genre, or into the brand, and I think we did that.”

Courtesy-GI.biz

Will AMD’s Polaris Based RX 500 Launch April 18th?

March 27, 2017 by  
Filed under Computing

According to reports, the upcoming AMD Radeon RX 500 series, which should be based on Polaris GPUs, could be slightly delayed, with the new launch date set for April 18th.

While earlier information suggested that the Polaris 10-based Radeon RX 570/580 would be coming on April 4th, with the Polaris 11-based RX 550/560 refresh following a week later on April 11th, a new report from Chinese site Mydrivers.com, spotted by eTeknix.com, suggests that the launch date has been pushed back to April 18th.

As we’ve written before, the new Radeon RX 500 series will be based on the existing AMD Polaris GPU architecture but should have somewhat higher clocks and improved performance-per-watt. The flagship Vega GPU-based Radeon RX Vega should be coming at a later date, most likely at the Computex 2017 show, which starts on May 30th.

Unfortunately, precise details regarding the upcoming Radeon RX 500 series are still unknown, but hopefully these clock and performance improvements will allow AMD to compete with Nvidia’s mainstream lineup.

Courtesy-Fud

AMD’s Vega Benchmarks Continue To Leak

March 23, 2017 by  
Filed under Computing

An alleged SiSoft benchmark leak spotted recently has revealed a bit more information about what appears to be at least one version of the upcoming AMD Vega GPU.

According to the benchmark data, which was originally spotted by Videocardz.com, the GPU features 64 CUs for a total of 4096 Stream Processors and comes with 8GB of VRAM on a 2048-bit memory interface (two HBM2 stacks, each with 4GB and a 1024-bit memory interface).
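
Those numbers hang together with AMD’s GCN convention of 64 stream processors per Compute Unit. A minimal sanity check, assuming that convention carries over to Vega:

```python
# Cross-check of the leaked Vega configuration described above.
cus = 64
sp_per_cu = 64                        # GCN convention: 64 SPs per CU
stream_processors = cus * sp_per_cu   # 4096, matching the leak

hbm2_stacks = 2
bus_bits_per_stack = 1024             # each HBM2 stack has a 1024-bit bus
gb_per_stack = 4

total_bus_bits = hbm2_stacks * bus_bits_per_stack   # 2048-bit interface
total_vram_gb = hbm2_stacks * gb_per_stack          # 8GB of VRAM

print(stream_processors, total_bus_bits, total_vram_gb)  # 4096 2048 8
```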

Despite the obviously wrong 344MHz GPU clock reading, the results are quite impressive, outperforming the Geforce GTX 1080 in the same benchmark. Of course, these are compute-only results, probably obtained on an alpha driver, so it is still too early to talk about real-world performance, but they at least give a general idea of the GPU.

Earlier rumors suggested that there will be at least two versions of the Vega GPU and it is still not clear if this is the slower or the faster one.

As confirmed by AMD earlier, its Radeon RX Vega graphics cards should be coming in the first half of this year, with the most likely launch at the Computex 2017 show, which opens its doors on May 30th.

Courtesy-Fud

Windows On ARM Ready To Debut

March 20, 2017 by  
Filed under Computing

After months of rumors, Windows is finally fully functional on ARM-based chips and the Wintel alliance is in tatters.

Microsoft officials told Bloomberg that the company is committed to using ARM chips in machines running its cloud services.

Microsoft will use the ARM chips in a cloud server design that its officials will detail at the US Open Compute Project Summit today. Microsoft has been working with both Qualcomm and Cavium on the version of Windows Server for ARM.

Microsoft joined the Open Compute Project (OCP) in 2014, and is a founding member of and contributor to the organisation’s Switch Abstraction Interface (SAI) project.

The OCP publishes open hardware designs intended to be used to build cheaper datacentres. The OCP has released specs for motherboards, chipsets, cabling, and common sockets, connectors, and open networking and switches.

Vole’s cloud server specification is a 12U shared server chassis capable of housing 24 1U servers. Microsoft is also releasing its Chassis Manager under the open-source Apache license.

Project Olympus is the codename for Vole’s next-generation cloud hardware design, which it contributed to the OCP last autumn.

Vole is also expected to use ARM processors in its Olympus systems, which will be headed to its data centres by Christmas.

The winner appears to be Qualcomm, which says it is working on a variety of cloud workloads to run on the Microsoft Azure cloud platform powered by Qualcomm Centriq 2400 server solutions.

Qualcomm said it had been working with Vole for several years on ARM-based server enablement and has onsite engineering at Microsoft to collaboratively optimise a version of Windows Server, for Microsoft’s internal use in its data centres, on Qualcomm Centriq 2400-based systems.

There’s no word from Microsoft on when it will begin offering Windows Server on ARM to external customers or partners, but that is only a matter of time. With lower power consumption on offer, the case for Intel in the server room becomes weaker, and if ARM designs become more established thanks to Microsoft’s blessing, it is unlikely that anyone will want Intel there.

Courtesy-Fud

Is AMD Rebranding The Radeon Series Next Month?

March 10, 2017 by  
Filed under Computing

According to the latest report, AMD’s upcoming RX 500 series will be made up entirely of RX 400 series rebrands, based on Polaris 10 and Polaris 11 as well as a possible new Polaris 12 GPU, while the Radeon RX Vega won’t launch before late May.

According to a report by Martin Fischer of Heise.de, which matches what we have been hearing and have been able to confirm with our own sources, AMD is planning to launch its Radeon RX 500 series in early April. While this would usually sound interesting, the entire lineup will be comprised of RX 400 series rebrands, albeit with slightly higher GPU clocks.

According to the report, the Polaris 10-based RX 580 and RX 570 should be launching on April 4th, while the Radeon RX 560 and RX 550, based on Polaris 11, should launch on April 11th. According to a similar report from Videocardz.com, the Radeon RX 550 could be based on a new Polaris 12 chip, while the rebranded Radeon RX 560 could get the fully-enabled Polaris 11 GPU with 1024 Stream Processors, similar to the RX 460 seen in China.

The big question is the newly named Radeon RX Vega, which, according to AMD, should be available in the first half of this year, making the Computex 2017 show, which starts on May 30th, the most likely launch venue. It appears that the Nvidia GTX 1080 Ti will have to wait for its competitor, as simple rebrands just won’t be enough.

Courtesy-Fud

Are Low Profile Radeon RX 460 Forthcoming?

February 23, 2017 by  
Filed under Computing

MSI has unveiled yet another HTPC-friendly graphics card, a low-profile Radeon RX 460 that will come in both 2GB and 4GB versions.

Featuring a dual-slot, low-profile, dual-fan cooler and a low-profile PCB to match, both the MSI RX 460 2GT LP and 4GT LP graphics cards will work at reference clocks of 1090MHz GPU base and 1200MHz GPU Boost, with GDDR5 memory working at 1750MHz on a 128-bit memory interface.

It also comes with one DVI and one HDMI display output.

In case you missed it, the Radeon RX 460 is based on AMD’s Polaris 11 GPU with 896 Stream Processors, 48 TMUs and 16 ROPs and should pack enough punch for a decent casual gaming experience.
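
Plugging the quoted clocks and unit counts into the usual rule-of-thumb formulas gives the card’s theoretical throughput. This is a rough sketch: the GDDR5 quad-data-rate multiplier and the two-FLOPs-per-SP-per-clock figure are standard conventions rather than numbers quoted here, and real-world performance depends on much more than these totals.

```python
# Theoretical figures for the MSI RX 460 LP cards described above.
mem_clock_hz = 1750e6     # quoted GDDR5 clock
gddr5_rate = 4            # GDDR5 transfers 4 bits per pin per clock cycle
bus_width_bits = 128

bandwidth_gbs = mem_clock_hz * gddr5_rate * bus_width_bits / 8 / 1e9
# 1750MHz * 4 * 16 bytes = 112 GB/s

stream_processors = 896
boost_clock_hz = 1.2e9    # quoted 1200MHz GPU Boost clock
fp32_tflops = stream_processors * 2 * boost_clock_hz / 1e12
# 896 SPs * 2 FLOPs/clock * 1.2GHz ≈ 2.15 TFLOPS

print(f"{bandwidth_gbs:.0f} GB/s memory bandwidth, {fp32_tflops:.2f} TFLOPS FP32")
```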

Unfortunately, neither the price nor the availability date has been revealed, but we are sure these two will appear in retail/e-tail soon at around US $100/€100.

Courtesy-Fud

Why Are The NPD Games Sales Kept Private?

February 22, 2017 by  
Filed under Gaming

When I first began my career in the games industry I wrote a story about an impending digital download chart.

It was February 2008 and Dorian Bloch – who was leader of UK physical games data business Chart-Track at the time – vowed to have a download Top 50 by Christmas.

That chart never arrived, and it wasn’t for want of trying. Digital retailers, including Steam, refused to share the figures and insisted it was down to the individual publishers and developers to do the sharing (in contrast to the retail space, where the stores are the ones that do the sharing). This led to an initiative in the UK where trade body UKIE began using its relationships with publishers to pull together a chart. However, after some initial success, the project ultimately fell away once the sheer scale of the work involved became apparent.

Last year in the US, NPD managed to get a similar project going, producing what is thus far the only public chart that combines physical and digital data from accurate sources. However, although many big publishers are contributing to the figures, there remain some notable absentees, as well as a lack of smaller developers and publishers.

In Europe, ISFE is just ramping up its own project and has even begun trialling charts in some territories (behind closed doors); however, it currently lacks physical retail data in most major markets. This overall lack of information has seen a rise in the number of firms trying to plug the hole in our digital data knowledge. Steam Spy uses a Web API to gather data from Steam user profiles to track download numbers, a job it does fairly accurately (albeit not all of the time).
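
To give a flavour of how a Steam Spy-style estimate works, here is a deliberately simplified sketch: sample random public profiles through Steam’s Web API and extrapolate from the ownership rate. The API key, account-population figure and sample size below are placeholders, and the real service additionally has to cope with rate limits, private-profile bias and sampling error.

```python
import random
import requests  # third-party HTTP library

STEAM_API_KEY = "YOUR_KEY_HERE"      # placeholder; requires a Valve-issued key
BASE_STEAMID = 76561197960265728     # the first SteamID64
TOTAL_ACCOUNTS = 300_000_000         # assumed account population (illustrative)

def owns_app(steam_id, app_id):
    """True/False if the profile's library is visible; None if hidden."""
    resp = requests.get(
        "https://api.steampowered.com/IPlayerService/GetOwnedGames/v1/",
        params={"key": STEAM_API_KEY, "steamid": steam_id,
                "include_played_free_games": 1},
        timeout=10,
    )
    games = resp.json().get("response", {}).get("games")
    if games is None:
        return None                  # private profile (or empty library)
    return any(g["appid"] == app_id for g in games)

def estimate_owners(app_id, sample_size=1000):
    hits = visible = 0
    for _ in range(sample_size):
        sid = BASE_STEAMID + random.randrange(TOTAL_ACCOUNTS)
        owned = owns_app(sid, app_id)
        if owned is not None:
            visible += 1
            hits += owned
    # Naive extrapolation: ownership rate among visible profiles,
    # scaled to the assumed total population.
    return TOTAL_ACCOUNTS * hits / max(visible, 1)
```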

SuperData takes point-of-sale and transaction information from payment service providers, plus some publishers and developers, which means it can track actual spend. It’s strong on console, but again, it’s not 100% accurate. The mobile space has a strong player in App Annie collecting data, although developers in the space find the cost of accessing this information high.

It feels unusual to be having this conversation in 2017. In a market that is now predominantly digital, the fact we have no accurate way of measuring our industry seems absurd. Film has almost daily updates of box office takings, the music market even tracks streams and radio plays… we don’t even know how many people downloaded Overwatch, or where Stardew Valley would have charted. So what is taking so long?

“It took a tremendous amount of time and effort from both the publisher and NPD sides to make digital sales data begin to flow,” says Mat Piscatella, NPD’s US games industry analyst. NPD’s monthly digital chart is the closest the industry has come to accurate market data in the download space.

“It certainly wasn’t like flipping a switch. Entirely new processes were necessary on both sides – publishers and within NPD. New ways of thinking about sales data had to be derived. And at the publishers, efforts had to be made to identify the investments that would be required in order to participate. And of course, most crucially, getting those investments approved. We all had to learn together, publishers, NPD, EEDAR and others, in ways that met the wants and needs of everyone participating.

“Over time, most of the largest third-party publishers joined the digital panel. It has been a remarkable series of events that have gotten us to where we are today. It hasn’t always been smooth; and keep in mind, at the time the digital initiative began, digital sales were often a very small piece of the business, and one that was often not being actively managed. Back then, publishers may have been letting someone in a first-party operation, or brand marketing role post the box art to the game on the Sony, Microsoft and Steam storefronts, and that would be that. Pricing wouldn’t be actively managed, sales might be looked at every month or quarter, but this information certainly was not being looked at like packaged sales were. The digital business was a smaller, incremental piece of the pie then. Now, of course, that’s certainly changed, and continues to change.”

SuperData’s Joost van Dreunen sees several reasons for the reluctance. “For one, the majors are publicly traded firms, which means that any shared data presents a financial liability. Across the board the big publishers have historically sought to protect the sanctity of their internal operations because of the long development cycles and high capital risks involved in AAA game publishing. But, to be honest, it’s only been a few years that especially legacy publishers have started to aggregate and apply digital data, which means that their internal reporting still tends to be relatively underdeveloped. Many of them are only now building the necessary teams and infrastructure around business intelligence.”

Indeed, both SuperData and NPD believe that progress – as slow as it may be – has been happening. And although some publishers are still holding out or refusing to get involved, that resolve is weakening over time. “For us, it’s about proving the value of participation to those publishers that are choosing not to participate at this time,” Piscatella says. “And that can be a challenge for a few reasons. First, some publishers may believe that the data available today is not directly actionable or meaningful to its business. The publisher may offer products that have dominant share in a particular niche, for example, which competitive data as it stands today would not help them improve.

“Second, some publishers may believe that they have some ‘secret sauce’ that sharing digital sales data would expose, and they don’t want to lose that perceived competitive advantage. Third, resources are almost always stretched thin, requiring prioritisation of business initiatives. For the most part, publishers have not expanded their sales planning departments to keep pace with all of the overwhelming amount of new information and data sources that are now available. There simply may not be the people power to effectively participate, forcing some publishers to pass on participating, at least for now.

“So I would certainly not classify this situation as companies ‘holding out’ as you say. It’s that some companies have not yet been convinced that sharing such information is beneficial enough to overcome the business challenges involved. Conceptually, the sharing of such information seems very easy. In reality, participating in an initiative like this takes time, money, energy and trust. I’m encouraged and very happy so much progress has been made with participating publishers, and a tremendous amount of energy is being applied to prove that value to those publishers that are currently not participating.”

NPD’s achievement is significant because it has managed to convince a good number of bigger publishers, and those with particularly successful IP, to share figures. And this has long been seen as a stumbling block, because for those companies performing particularly well, the urge to share data is reduced. I’ve heard countless comments from sales directors who have said that ‘sharing download numbers would just encourage more competitors to try what we’re doing.’ It’s why van Dreunen has noted that “as soon as game companies start to do well, they cease the sharing of their data.”

Indeed, it is often fledgling companies, and indie studios, that need this data more than most. It’s part of the reason behind the rise of Steam Spy, which prides itself on helping smaller outfits.

“I’ve heard many stories about indie teams getting financed because they managed to present market research based on Steam Spy data,” boasts Sergey Galyonkin, the man behind Steam Spy. “Just this week I talked to a team that got funded by Medienboard Berlin-Brandenburg based on this. Before Steam Spy it was harder to do proper market research for people like them.

“Big players know these numbers already and would gain nothing from sharing them with everyone else. Small developers have no access to paid research to publish anything.

“Overall I’d say Steam Spy helped to move the discussion into a more data-based realm and that’s a good thing in my opinion.”

The games industry may be unusually backward when it comes to sharing its digital data, but there are signs of a growing willingness to be more open. A combination of trade body and media pressure has convinced some larger publishers to give it a go. Furthermore, publishers are starting to feel obligated to share figures anyway, especially when the likes of SuperData and Steam Spy are putting out information whether publishers want them to or not.

Indeed, although the chart Dorian promised me nine years ago is still AWOL, there are at least some figures out there today that give us a sense of how things are performing.

“When we first started SuperData six years ago there was exactly zero digital data available,” van Dreunen notes. “Today we track the monthly spending of 78 million digital gamers across platforms, in spite of heavy competition and the reluctance from publishers to share. Creating transparency around digital data is merely a matter of market maturity and executive leadership, and many of our customers and partners have started to realize that.”

He continues: “The current inertia comes from middle management that fears new revenue models and industry changes. So we are trying to overcome a mindset rather than a data problem. It is a slow process of winning the confidence and trust of key players, one at a time. We’ve managed to broker partnerships with key industry associations, partner with firms like GfK in Europe and Kadokawa Dwango in Japan to offer a complete market picture, and win the trust of big publishers. As we all move into the next era of interactive entertainment, the need for market information will only increase, and those that have shown themselves willing to collaborate and take a chance are simply better prepared for the future.”

NPD’s Piscatella concludes: “The one thing I’m most proud of, and impressed by, is the willingness of the participating publishers in our panel to work through issues as they’ve come up. We have a dedicated, positive group of companies working together to get this information flowing. Moving forward, it’s all about helping those publishers that aren’t participating understand how they can benefit through the sharing of digital consumer sales information, and in making that decision to say ‘yes’ as easy as possible.

“Digital selling channels are growing quickly. Digital sales are becoming a bigger piece of the pie across the traditional gaming market. I fully expect participation from the publishing community to continue to grow.”

Courtesy-GI.biz

Will Politics Bring Down The Gaming Industry?

February 20, 2017 by  
Filed under Gaming

If you’re someone who makes a living from videogames – as most readers of this site are – then political developments around the world at the moment should deeply concern you. I’m sure, of course, that a great many of you are concerned about things ranging from President Trump’s Muslim travel ban to the UK Parliament’s vote for “Hard Brexit” or the looming elections in Holland and France simply on the basis of being politically aware and engaged. However, there’s a much more practical and direct way in which these developments and the direction of travel which they imply will impact upon us. Regardless of personal ideology or beliefs, there’s no denying that the environment that seems to be forming is one that’s bad for the medium, bad for the industry, and will ultimately be bad for the incomes and job security of everyone who works in this sector.

Video games thrive in broadly the same conditions as any other artistic or creative medium, and those conditions are well known and largely undisputed. Creative mediums benefit from diversity; a wide range of voices, views and backgrounds being represented within a creative industry feeds directly into a diversity of creative output, which in turn allows an industry to grow by addressing new groups of consumers. Moreover, creative mediums benefit from economic stability, because when people’s incomes are low or uncertain, entertainment purchases are often among the first to fall.

Once upon a time, games had such strong underlying growth that they were “recession proof,” but this is no longer the case. Indeed, it was never entirely an accurate reading anyway, since broader recessions undoubtedly did slow down – though not reverse – the industry’s growth. Finally, as a consequence of the industry’s broad demographic reach, expansion overseas is now the industry’s best path to future growth, and that demands continued economic progress in the developing world to open up new markets for game hardware and software.

What is now happening on a global basis threatens all of those conditions, and therefore poses a major commercial threat to the games business. That threat must be taken especially seriously given that many companies and creators are already struggling with the enormous challenges that have been thrown up by the messy and uneven transition towards smart devices, and the increasing need to find new revenue streams to support AAA titles whose audience has remained largely unchanged even as development budgets have risen. Even if the global economic system looked stable and conditions were ideal for creative industries, this would be a tough time for games; the prospect of restrictions on trade and hiring, and the likelihood of yet another deep global recession and a slow-down in the advances being made by developing economies, make this situation outright hazardous.

Consider the UK development industry. For well over a decade, if you asked just about any senior figure in the UK industry what the most pressing problem they faced was, they’d give you the same answer: skills shortages. Hiring talented staff is tough in any industry, but game development demands highly skilled people from across a range of fields, and assembling that kind of talent isn’t cheap or easy – even when you have access to the entire European Union as a hiring base, as UK companies did. Now UK companies face having to fill their positions from a much smaller pool of talent, and hiring from abroad will be expensive, complex and, in many cases, simply impossible.

The US, too, looks like it may tighten visa regulations for skilled hires from overseas, which will have a hugely negative impact on game development there. There are, of course, many skilled creatives who work within the borders of their own country, but the industry has been built on labour flows; centres of excellence in game development, like the UK and parts of the US, are sustained and bolstered by their ability to attract talent from overseas. Any restriction on that will impact the ability of companies to create world-class games – it will make them poorer creatively and throw hiring roadblocks in the path of timely, well-polished releases.

Then there’s the question of trade barriers; not only tariffs, which seem likely to make a comeback in many places, but non-tariff barriers in terms of diverse regulations and standards that will make it harder for companies to operate across national borders. The vast majority of games are multinational efforts; assets, code, and technology are created in different parts of the world and brought together to create the final product. Sometimes this is because of outsourcing, other times it’s because of staff who work remotely, and very often it’s simply because a certain piece of technology is licensed from a company overseas.

If countries become more hostile to free trade, all of that will become more complex and expensive. And that’s even before we start to think about what happens to game hardware, from consoles that source components from across Asia before assembly in China or Japan, to PC and smart device parts that flow out of China, Korea, Taiwan and, increasingly, from developing nations in South-East Asia. If tariff barriers are raised, all of those things will get a lot more expensive, limiting the industry’s consumer base at the most damaging time possible.

Such trade barriers – be they tariff or non-tariff barriers – would disproportionately impact developing countries. Free trade and globalisation have had negative externalities, unquestionably, but by and large they have contributed to an extraordinary period of prosperity around the world, with enormous populations of people being lifted out of poverty in recent decades and many developing countries showing clear signs of a large emerging middle class. Those are the markets game companies desperately want to target in the coming decade or so. In order for the industry to continue to grow and prosper, the emerging middle class in countries like India, Brazil and Indonesia needs to be cultivated as a new wave of game consumers, just as many markets in Central and Eastern Europe were a decade ago.

The current political attacks on the existing order of world trade threaten to cut those economies off from the system that has allowed them to grow and develop so quickly, potentially hurling them into deep recession before they have an opportunity to cement stable, sustainable long-term economic prosperity. That’s an awful prospect on many levels, of course (it goes without saying that many of the things under discussion threaten human misery and catastrophe that far outweighs the impact on the games business), but one consequence will likely be a hard stop to the games industry’s capacity to grow in the coming years.

It’s not just developing economies whose consumers are at risk from a rise of protectionism and anti-trade sentiments, however. If we learned anything from the 2008 crash and the recession that followed, it should be that the global economy largely runs not on cash, but on confidence. The entire edifice is built on a set of rules and standards that are designed to give investors confidence; the structure changes over time, of course, but only slowly, because stability is required to allow people to invest and to build businesses with confidence that the rug won’t be tugged out from underneath them tomorrow. From the rhetoric of Donald Trump to the hardline Brexit approach of the UK, let alone the extremist ideas of politicians like Marine le Pen and Geert Wilders, the current political movement deeply threatens that confidence. Only too recently we’ve seen what happens to ordinary consumers’ job security and incomes when confidence disappears from the global economy; a repeat performance now seems almost inevitable.

Of course, the games industry isn’t in a position to do anything about these political changes – not alone, at least. The same calculations, however, apply to a wide variety of industries, and they’re all having the same conversations. Creative industries are at the forefront for the simple reason that they will be the first to suffer should the business environment upon which they rely turn negative, but in opposing those changes, creative businesses will find allies across a wide range of industries and sectors.

Any business leader that wants to throw their weight behind opposing these changes on moral or ethical grounds is more than welcome to, of course – that’s a laudable stance – but regardless of personal ideology, the whole industry should be making its voice heard. The livelihoods of everyone working in this industry may depend on the willingness of the industry as a whole to identify these commercial threats and respond to them clearly and powerfully.

Courtesy-GI.biz

Is The Gaming Industry Due For An Overhaul?

February 16, 2017 by  
Filed under Gaming

Physical retailers are calling for a change in how video game pre-orders are conducted.

They are speaking to publishers and platform holders about the possibility of selling games before the release date. Under the proposal, consumers could pick up the disc one to three weeks before launch, but it would remain ‘locked’ until launch day.

The whole concept stems from the pre-loading service available in the digital space. Today, consumers can download a game via Steam, Xbox Live and PSN before it’s out, and the game becomes unlocked at midnight on launch day for immediate play (after the obligatory day one patch).
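
Conceptually the mechanism is simple: the bulky encrypted game data travels early (as a download or, under this proposal, a disc), and only a small decryption key is withheld until launch. Here is a toy sketch of that unlock check; every name, endpoint and date in it is invented for illustration.

```python
import datetime

# Hypothetical launch moment: 00:01 UTC on launch day.
LAUNCH_UTC = datetime.datetime(2017, 6, 1, 0, 1, tzinfo=datetime.timezone.utc)

def request_unlock_key(user_token):
    """Stand-in for a storefront/licensing endpoint: the pre-loaded (or
    pre-bought) encrypted data is useless until this call succeeds."""
    now = datetime.datetime.now(datetime.timezone.utc)
    if now < LAUNCH_UTC:
        raise PermissionError(f"Title unlocks at {LAUNCH_UTC.isoformat()}")
    return fetch_content_key(user_token)

def fetch_content_key(user_token):
    # Placeholder for the real licensing-server call that returns the
    # decryption key (and typically triggers the day-one patch).
    return b"\x00" * 32
```

A ‘locked’ retail disc would work the same way, with the key request (or day-one patch) happening the first time the game is run.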

It makes sense to roll this out to other distribution channels. The idea of going into a shop to order a game, and then returning a month later to buy it, always seemed frankly antiquated.

Yet it’s not only consumer friendly, it’s potentially retailer and publisher friendly, too.

For online retailers, the need to hit an embargo is costly – games need to be turned around rapidly to get them into consumers’ hands on day one.

For mainstream retailers, it would clear up a lot of confusion. These stores are not naturally built for pre-ordering product, with staff that are more used to selling bananas than issuing pre-order receipts. The fact you can immediately take the disc home would help – it could even boost impulse sales.

Meanwhile, specialist retailers will be able to make a longer ‘event’ of the game coming out, and avoid the situation of consumers cancelling pre-orders or simply not picking up the game.

Yet when retail association ERA approached some companies about the prospect of doing this, it struggled to find much interest from the publishing community. So what’s the problem?

There are a few challenges.

There are simple logistical obstacles. Games often go Gold just a few weeks before they’re launched, and then it’s over to the disc manufacturers, the printers, the box makers and the distributors to get that completed code onto store shelves. This process can take two weeks in itself. Take the recent Nioh. That game was available to pre-download just a few days before launch – so how difficult would it be to get that into a box, onto a lorry and into a retailer in advance of release?

It also benefits some retailers more than others – particularly online ones, and those with strong distribution channels.

For big games, there’s a potential challenge when it comes to bandwidth. If those that pre-ordered Call of Duty all go online straight away at 12:01, that would put a lot of pressure on servers.
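
Some rough arithmetic with invented numbers shows the scale of the problem. If a million pre-order customers each pull a 10GB unlock-plus-patch in the first hour after midnight:

```python
players = 1_000_000    # assumed simultaneous pre-order customers
payload_gb = 10        # assumed unlock/day-one-patch size per player
window_s = 3600        # everyone downloads within the first hour

total_pb = players * payload_gb / 1e6                  # 10 petabytes total
sustained_gbps = players * payload_gb * 8 / window_s   # ~22,222 Gbps
print(f"{total_pb:.0f} PB served at ~{sustained_gbps:,.0f} Gbps sustained")
```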

Piracy may also be an issue, because it makes the code available ahead of launch.

The end of the midnight launch may be happening anyway, but not for all games. If consumers can get their game without standing in the cold for 2 hours, then they will. And those lovely marketable pictures of snaking queues will be a thing of the past.

None of these obstacles are insurmountable. Getting games finished earlier ahead of launch is something most big publishers are trying to do anyway, and this mechanism would help force that issue. Of course, the disc doesn’t actually have to contain a game at all. It can be an unlock mechanism for a download, which would allow the discs to be ready far in advance of launch. That strategy is significantly riskier, however, especially considering the consumer reaction to the same model proposed by Xbox back in 2013.

As for midnight events, there are still ways to generate that big launch ‘moment’. Capcom released Resident Evil 7 with an experiential haunted house that generated lots of media attention and attracted a significant number of fans. Pokémon last year ran a big fan event for Sun and Moon, complete with a shop, activities, signing opportunities and the chance to download Mew.

So there are other ways of creating launch theatre than inviting consumers to wait outside a shop. If anything, having the game available in advance of launch will enable these theatrical marketing events to last longer. And coupled with influencer activity, it would actually drive pre-release sales – not just pre-release demand.

However, the reality is this will work for some games and not for others, and here lies the heart of the challenge.

Pre-ordering is already a relatively complex matter, so imagine what it’ll be like if some games can be taken home in advance and others can’t. How many instances can we expect of people complaining that ‘their disc doesn’t work’?

If this is going to work, it needs cross-industry support, which isn’t going to happen. This is a business that can’t even agree on a digital chart, don’t forget.

What we may well see is someone giving this concept a go. Perhaps a digital native publisher, like Blizzard or Valve, who can make it part of their PR activity.

Because if someone like that can make the idea work, then others will follow.

Courtesy-GI.biz

Is Microsoft Taking Windows To The Cloud?

February 2, 2017 by  
Filed under Computing

Software king of the world Microsoft is planning a cut-down version of Windows 10 which will operate in the cloud.

Dubbed the Composable Shell (CSHELL), the software is a single, unified ‘adaptive shell’ for Windows 10, and it is part of Vole’s cunning plan to create a universal Windows 10 version.

This means a standardised framework to scale and adapt the OS to any type of device, display size or user experience, including smartphones, PCs, tablets, consoles, large touchscreens and more.
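
Little is public about how CSHELL actually works, but the general idea of an adaptive shell can be illustrated with a toy sketch; the device classes and breakpoints below are invented for illustration, not Microsoft’s.

```python
def pick_shell_composition(diagonal_inches, has_touch, is_console):
    """Choose a UI composition for the device the shell finds itself on."""
    if is_console:
        return "tv-shell"        # controller-driven, 10-foot UI
    if diagonal_inches < 7:
        return "phone-shell"     # single column, large touch targets
    if diagonal_inches < 13 and has_touch:
        return "tablet-shell"    # touch-first, optional windowing
    return "desktop-shell"       # taskbar and overlapping windows

print(pick_shell_composition(5.5, True, False))    # phone-shell
print(pick_shell_composition(27.0, False, False))  # desktop-shell
```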

Stage one is apparently a Cloud Shell, a cut-down version of Windows designed for the modern computing world.

Cloud Shell should be out there in 2017 and it will be connected to the Windows Store and Universal Windows Platform app framework.

This would fit with Vole’s plans to bring the full version of Windows 10 to mobile devices with ARM-based processors, which it announced in December.

A ‘lightweight’ version of Windows hints at a ‘thin client’-style approach, which has been touted as a viable business tool for the last 20 years but has never really taken off.

Courtesy-Fud

Is Sony Really Committed To The PSVR?

February 1, 2017 by  
Filed under Gaming

The positive reviews pouring in from all corners for Capcom’s Resident Evil 7 are a welcome validation of the firm’s decision to go in quite a radically new direction with the series, but Capcom isn’t the only company that will be happy (and perhaps a little relieved) by the response to the game. A positive reaction to RE7 is also hugely important for Sony, because this is the first real attempt at proving that PSVR is a worthy platform for full-scale, AAA games, and much of the credibility of the nascent VR platform rests on RE7.

Although some of the sentiment in reviews of the game suggests that the VR mode is interesting but somewhat flawed, and several reviewers have expressed a preference for playing on a normal screen, the game’s VR aspect undoubtedly fascinates consumers and seems to be implemented well enough to justify their interest. In the process, it also justifies Sony’s investment in the title – the company did a deal that secured a year-long VR exclusivity window for PSVR – and Capcom’s own faith in the burgeoning medium, which undoubtedly played a large role in the decision to switch the entire game over to a first-person perspective.

The critical success of RE7, and the likely commercial success that will follow, comes at a crucial juncture for PSVR. Although the hardware was well-reviewed at launch and remains more or less supply-constrained at retail – you certainly can’t get your hands on one without paying a hefty re-seller premium in Japan at the moment, and believe me I’ve tried – there’s an emerging narrative about the VR headset that’s distinctly negative and pessimistic. Plenty of op-eds and videos have popped up in recent weeks comparing PSVR to previous Sony peripheral launches like PlayStation Eye and PlayStation Move; hardware that was launched with a lot of heavy marketing support but which the giant company rapidly seemed to lose interest in, condemning it to a few years of token, declining software support before being quietly shelved altogether.

It’s worth noting, of course, that neither Eye nor Move actually died off entirely – in fact, both of these technologies have made their way into PSVR itself, with the headset essentially being an evolution of a number of previous Sony technologies that have finally found a decent application in VR. However, there’s no question but that Sony has a bad track record with peripherals, and those interested in the future of PSVR should absolutely be keeping a close eye on the company to see if there are any signs of it repeating its past behaviour patterns.

Most of what’s being written now, however, feels premature. PSVR had a pretty solid launch line-up, with good support from across the industry; just this week it got its first truly big third-party AAA title, which is receiving excellent reviews, and later in the year it’s got some big launches like GT Sport on the way. The pace of software releases slumped after the launch window, but that’s not unusual for any platform. There’s nothing about PSVR that you can point to right now and declare as evidence of Sony’s focus shifting away; it feels like editorials claiming this are doing so purely on the basis of Sony’s track record, not the facts as they exist now.

If you really want to know how PSVR is shaping up, there are two key things to watch out for in the near future. The first will be any data that’s released regarding the performance of RE7’s VR mode; is it popular? Is it being played widely? Does it become a part of the broad conversation about the game? Much of this latter aspect is down to Sony and Capcom’s marketing of course; there’s an opportunity to push the VR aspect of RE7 as a genuinely unique experience with appeal even beyond the usual gaming audience, and if that can be capitalised upon, it will likely secure PSVR’s future to a large degree. What’s crucial, though, is that every other company in the industry will be watching RE7 like hawks; if proper, well-integrated PSVR support seems to be a major selling factor or a popular feature, you can be guaranteed that other publishers will start to look at their major franchises with a view to identifying which of them would suit a similar “traditional display plus optional VR” approach.

The other thing to watch for, unsurprisingly, is what Sony does at E3 and other major gaming events this spring. This is really where we’ll see the proof of the company’s focus – or lack of same. There’s still plenty of time to announce VR titles for the back half of this year, which is likely to be the crucial point for PSVR; by the time we slip into the second half of 2017, the hardware will no longer be supply constrained and the early adopters buying for novelty will be all but exhausted. That’s the point in time where PSVR’s software line-up really needs to come together coherently, to convince the next wave of potential purchasers that this is a platform worth investing in. If it fails that test, PSVR will join Move and Eye in the graveyard of Sony’s failed peripherals; success will turn it into a cornerstone of the PS4 for the coming years.

So keep a close eye on E3. Part of this is just down to optics; how much time and focus does the firm devote to PSVR on stage at its conference? If it’s not very much, if the PSVR section feels rushed or underemphasised, that will send a strong message that Sony is back to its old bad habits and has lost interest in its latest peripheral already. A strong, confident PSVR segment would convince consumers and the industry alike that the headset isn’t just another easily abandoned gimmick; better yet if this is backed up by plenty of the big games being announced having PSVR functionality built into them, so the device can be referred back to repeatedly during the conference rather than being confined to its own short segment.

It’s more than just optics though; the reality is that PSVR, like any platform, needs software, and Sony needs to lead the way by showing that it’s truly devoted to its own hardware. It may seem a little unfair that people are already keen to declare PSVR to be stumbling due to lack of attention, and well, it is a little unfair – but nobody should be surprised that people are seeing a pattern here that Sony itself clearly established with its behaviour towards previous peripherals. That’s the reputation the firm has, unfortunately, created for itself; that makes it all the more important that it should convince the world of its commitment to PSVR when the time comes.

Courtesy-GI.biz

Is EA Slowly Moving Away From Appearing At E3?

January 20, 2017 by  
Filed under Gaming

It would appear that the trend of big publishers hosting their own events will continue in 2017. Last year’s E3 show floor was missing booths from the likes of Electronic Arts, Activision Blizzard, Disney and Wargaming. For its part, EA decided it could better serve the fans by hosting its own event next door to E3, and now the publisher has confirmed that EA Play will be making a return for the second year in a row, but it won’t be as close to the Los Angeles Convention Center.

EA Play will be held from June 10-12 at the Hollywood Palladium, which is around seven miles away. “Whether in person or online, EA Play 2017 will connect fans around the world to EA’s biggest new games through live broadcasts, community content, competitions and more. Those that can attend in Hollywood will experience hands-on gameplay, live entertainment and much more. For anyone joining digitally around the world, EA Play will feature livestreams, deeper looks into EA’s upcoming games and experiences, and content from some of the best creators in the community,” the company stated in a press release.

Furthermore, a spokesperson confirmed to GamesIndustry.biz that EA will indeed be skipping out on having a major E3 presence. “EA Play was such a powerful platform for us last year to connect with our player community. We learned a ton, and we wanted to build on everything we loved about last year’s event to make EA Play 2017 even better,” EA corporate communications VP John Reseburg said.

“So after an extensive search, we’ve selected the Hollywood Palladium as a place where we can bring our vision of creativity, content and storytelling to life, and build an even more powerful experience to connect with players, community leaders, media and partners. EA Play 2017 will originate from Hollywood, with more ways for players around the world to connect and experience the excitement.”

It’ll be interesting to see what the other major publishers do about E3 this year. We’ll be sure to keep you posted.

Courtesy-Fud

Can The Xbox One S Succeed Without Exclusives?

January 18, 2017 by  
Filed under Gaming

As with many game cancellations, it’s likely we’ll never know exactly why Platinum Games’ Xbox One exclusive Scalebound has been dropped by Microsoft. For a game that’s been in development for several years at a top-flight studio, helmed by one of the most accomplished directors working in the industry today, to be cancelled outright is a pretty big deal. Even acknowledging that most of the cost of launching a game lies in marketing budgets, not development costs, this still represents writing off a fairly huge financial investment – not to mention the hard-to-quantify costs to the image and reputation of the Xbox brand. This isn’t the kind of decision that’s made rapidly or taken lightly – and though the reasons remain obscure, we can guess that a mix of factors was considered.

For one thing, it’s fairly likely that the game wasn’t living up to expectations. Scalebound was ambitious, combining unusual RPG aspects with a style of action Platinum Games (usually masters of the action genre) hadn’t attempted before, and throwing four-player co-op into the mix as well. There are a lot of things in that mix that could go wrong; plenty of fundamental elements that just might not gel well, that might look good on paper but ultimately fail to provide the kind of compelling, absorbing experience a AAA console exclusive needs. These things happen, even to the most talented of creative teams and directors.

For another thing, though, it’s equally likely that Microsoft’s decision stems in part from some issues internal to the publisher. Since Scalebound went into development in 2013, the Xbox division has been on a long, strange journey, and has ended up in a very different place to the one it anticipated when it inked its deal with Platinum three years ago. When Microsoft signed on to publish Scalebound, it was gearing up to launch an ambitious successor to the hugely successful Xbox 360 which would, it believed, expand upon the 360’s audience by being an all-purpose entertainment box, a motion-controlled device as much media hub and high-tech TV viewing system as game console.

By the time Scalebound was cancelled this week, much of that ambition had been scrapped, PS4 had soared off into the sunset leaving Microsoft trailing in a very distant second place, and Xbox One has become instead one link in a longer chain, a single component of an Xbox and Xbox Live brand and platform that extends across the Windows 10 ecosystem and which will, later this year, also encompass a vastly upgraded console in the form of Scorpio.

It only stands to reason that the logic which led to the signing of a game before this upheaval would no longer apply in the present environment. While quality issues around Scalebound cannot be dismissed – if Microsoft felt that it had a truly great game on its hands, it would have proceeded with it regardless of any strategic calculation – the implications of Scalebound’s cancellation for the broader Xbox strategy are worthy of some thought. Actually, it’s not so much Scalebound itself – which is just one game, albeit a very high profile one – as the situation in which its cancellation leaves the Xbox in 2017, and the dramatic defocusing of exclusive software which the removal of Scalebound from the release list throws into sharp relief.

A quick glance down 2017’s release calendar suggests that there remain only two major Xbox One exclusive titles due to launch this year – Halo Wars 2 and Crackdown 3. The console remains well supported with cross-platform releases, of course, but in terms of reasons for a player to choose Xbox One over the more successful PS4, or indeed for an existing PS4 owner to invest in an Xbox One as a second console (a vital and often overlooked factor in growing the install base mid-cycle), things are very sparse. By contrast, the PS4 has a high profile exclusive coming out just about every few weeks – many of them from Sony’s first-party studios, but plenty of others coming from third parties. Platinum Games’ fans will note, no doubt, that Sony’s console will be getting a new title from the studio – NieR: Automata – only a few months after Scalebound’s cancellation.

The proliferation of multiplatform games means that Xbox One owners won’t be starved of software – this is no Wii U situation. Existing owners, and those who bought into the platform after the launch of the Xbox One S last year, will probably be quite happy with their system, but the fact remains that with the exception of the two titles mentioned above and a handful of indie games (some of which do look good), the Xbox One this year is going to get by on a subset of the PS4’s release schedule.

That’s not healthy for the future of the platform. The strong impression is that third parties have largely abandoned Xbox One as a platform worth launching exclusive games on, and unlike Sony during the PS3’s catch-up era, Microsoft’s own studios and publishing deals have not come forward to take up the slack in its console’s release schedule. This isn’t all down to Scalebound, of course; Scalebound is just the straw that breaks the camel’s back, making this situation impossible to ignore.

Why have things ended up this way? There are two possible answers, and the reality is probably a little from column A and a little from column B. The first answer is that Microsoft’s strategy for Xbox has changed in a way which makes high-profile (and high-cost) exclusive software less justifiable within the company. That’s especially true of high-profile games that won’t be on Windows 10 as well as Xbox One; one of the ways in which the Xbox division has secured its future within Microsoft in the wake of the company’s reorganisation under CEO Satya Nadella is by positioning itself as a key part of the Windows 10 ecosystem.

Pushing Xbox One exclusive software flies in the face of that strategic positioning; new titles Microsoft lines up for the future will be cross-platform between Windows and Xbox, and that changes publishing priorities. It’s also worth noting that the last attempt Microsoft made to plug the gap in its exclusive software line-up didn’t go down so well and hasn’t been repeated; paying for a 12-month exclusivity window for the sequel to the (multiplatform) Tomb Raider reboot just seems to have annoyed people and didn’t sell a notable number of Xbox Ones.

The second answer, unsurprisingly, revolves around Scorpio. It’s not unusual for a console to suffer a software drought before its successor appears on the market, so with Scorpio presumably being unveiled at E3 this year, the Xbox One release list could be expected to dry up. The wrinkle in this cloth is that Scorpio isn’t meant to be an Xbox One replacement. What little information Microsoft has provided about the console thus far has been careful to position it as an evolution of the Xbox One platform, not a new system. What that means in practice, though, hasn’t been explained or explored. Microsoft’s messaging on Scorpio is similar to the positioning of PS4 Pro – an evolutionary upgrade whose arrival made no difference to software release schedules – but at the same time suggests a vastly more powerful system, one whose capabilities will far outstrip those of Xbox One to an extent more reminiscent of a generational leap than an evolutionary upgrade.

The question is whether Microsoft’s anaemic slate of exclusive releases is down, in part, to a focus on getting big titles ready for Scorpio’s launch window. If so, it feels awfully like confirmation that Scorpio – though no doubt sharing Xbox One’s architecture and thus offering perfect backwards compatibility – is really a new console with new exclusive software to match. If it’s not the case, however, then along with clearing up the details of Scorpio, this year’s E3 will have to answer another big question for Microsoft; where is all your software?

2017 needs to just be a temporary dip in the company’s output, or all its efforts on Scorpio will be for naught. Seamus Blackley, Ed Fries, Kevin Bachus and the rest of the original Xbox launch team understood something crucial all the way back in the late nineties when they were preparing to enter Microsoft into the console business; software sells hardware. If you don’t have the games, nothing else matters. Whatever the reasons for 2017’s weak offering from Xbox, we must firmly hope that that lesson hasn’t been forgotten in the corridors of Redmond.

Courtesy-GI.biz

HDMI v2.1 To Support 8K

January 9, 2017 by  
Filed under Around The Net

The HDMI Forum has officially announced the upcoming release of the HDMI v2.1 specification, bringing a range of higher video resolutions, Dynamic HDR support and more.

According to the provided details, the new HDMI v2.1 specification will be backward compatible with earlier versions. In addition to higher video resolutions and refresh rates, including 8K@60Hz and 4K@120Hz, it will also bring support for Dynamic HDR, a Game Mode variable refresh rate (VRR), eARC support for advanced audio formats, and a new 48G cable that will provide 48Gbps of bandwidth, which is key for 8K HDR support.

The full list of resolutions and refresh rates starts with 4K at 50/60Hz and 100/120Hz and climbs all the way up to 10K at both 50/60Hz and 100/120Hz.

The big surprise is the new Game Mode VRR, which is similar to AMD FreeSync and Nvidia G-Sync and is meant to provide stutter-, tearing- and lag-free gaming on both consoles and the PC.

Another big novelty is the all-new 48G cable, which will provide enough bandwidth for higher resolutions with HDR. Old cables will be compatible with some of the new features, but for 8K/10K with HDR you will need the new HDMI v2.1 cable.
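
The arithmetic behind the new cable is straightforward. Ignoring blanking intervals and link-coding overhead, uncompressed video bandwidth is roughly width × height × refresh rate × bits per pixel; at an HDR-typical 10 bits per colour channel with full 4:4:4 chroma, that is 30 bits per pixel. A quick sketch:

```python
def raw_gbps(width, height, hz, bits_per_pixel):
    """Uncompressed video bandwidth, ignoring blanking and coding overhead."""
    return width * height * hz * bits_per_pixel / 1e9

print(raw_gbps(3840, 2160, 60, 30))    # 4K60 HDR:  ~14.9 Gbps
print(raw_gbps(3840, 2160, 120, 30))   # 4K120 HDR: ~29.9 Gbps
print(raw_gbps(7680, 4320, 60, 30))    # 8K60 HDR:  ~59.7 Gbps

# 4K120 HDR already exceeds HDMI 2.0's ~18Gbps link; 8K60 HDR at full
# 4:4:4 exceeds even 48Gbps, which is where chroma subsampling (4:2:0
# at 15bpp is ~29.9 Gbps) or compression enters the picture.
```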

According to the HDMI Forum, the new HDMI v2.1 specification will be available to all HDMI 2.0 Adopters, who will be notified when it is released in early Q2 2017.

Courtesy-Fud
