Are Low-Profile Radeon RX 460s Forthcoming?

February 23, 2017 by  
Filed under Computing

MSI has unveiled yet another HTPC-friendly graphics card, a low-profile Radeon RX 460 that comes in both 2GB and 4GB versions.

Featuring a dual-slot, low-profile, dual-fan cooler and a low-profile PCB to match, both the MSI RX 460 2GT LP and 4GT LP graphics cards run at the reference 1090MHz GPU base and 1200MHz GPU Boost clocks, with GDDR5 memory working at 1750MHz on a 128-bit memory interface.

Each card comes with one DVI and one HDMI display output.

In case you missed it, the Radeon RX 460 is based on AMD’s Polaris 11 GPU with 896 Stream Processors, 48 TMUs and 16 ROPs and should pack enough punch for a decent casual gaming experience.
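For readers who like to translate those specs into raw numbers, here is a rough back-of-the-envelope sketch (a minimal Python example; it assumes the usual quad-pumped GDDR5 data rate and two FP32 operations per stream processor per clock, so treat the figures as approximations rather than official ratings):

```python
# Rough throughput estimates from the reference RX 460 figures quoted above.
# Assumptions: GDDR5 transfers data at 4x its 1750MHz command clock (7Gbps
# effective per pin), and each stream processor does 2 FP32 ops per clock.

memory_clock_mhz = 1750
bus_width_bits = 128
stream_processors = 896
boost_clock_mhz = 1200

effective_rate_gbps = memory_clock_mhz * 4 / 1000            # ~7 Gbps per pin
bandwidth_gb_s = effective_rate_gbps * bus_width_bits / 8    # ~112 GB/s

peak_tflops = stream_processors * 2 * boost_clock_mhz / 1e6  # ~2.15 TFLOPS

print(f"Memory bandwidth: ~{bandwidth_gb_s:.0f} GB/s")
print(f"Peak FP32 rate:   ~{peak_tflops:.2f} TFLOPS")
```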

Unfortunately, neither the price nor the availability date has been revealed, but we are sure these two cards will appear in retail/e-tail soon at around US $100/€100.

Courtesy-Fud

Why Are NPD Games Sales Figures Kept Private?

February 22, 2017 by  
Filed under Gaming

When I first began my career in the games industry I wrote a story about an impending digital download chart.

It was February 2008 and Dorian Bloch – who was leader of UK physical games data business Chart-Track at the time – vowed to have a download Top 50 by Christmas.

That chart never materialised, and it wasn’t for want of trying. Digital retailers, including Steam, refused to share the figures and insisted it was down to the individual publishers and developers to do the sharing (in contrast to the retail space, where the stores are the ones that do the sharing). This led to an initiative in the UK where trade body UKIE began using its relationships with publishers to pull together a chart. However, after some initial success, the project ultimately fell away once the sheer scale of the work involved became apparent.

Last year in the US, NPD managed to get a similar project going and is thus far the only public chart that combines physical and digital data from accurate sources. However, although many big publishers are contributing to the figures, there remain some notable absentees and a lack of smaller developers and publishers.

In Europe, ISFE is just ramping up its own project and has even begun trialling charts in some territories (behind closed doors); however, it currently lacks physical retail data in most major markets. This overall lack of information has seen a rise in the number of firms trying to plug the hole in our digital data knowledge. Steam Spy uses a Web API to gather data from Steam user profiles to track download numbers – a job it does fairly accurately (albeit not all of the time).
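To give a flavour of how that kind of profile sampling can work, here is a minimal, hypothetical Python sketch. It uses Valve’s public GetOwnedGames Web API endpoint, but the sampling and extrapolation logic is purely illustrative – it is not Steam Spy’s actual methodology, and the API key and app ID are placeholders:

```python
# Illustrative only: estimate what share of sampled public Steam profiles own
# a given app, using the official Steam Web API. Not Steam Spy's actual code.
import random
import requests

API_KEY = "YOUR_STEAM_WEB_API_KEY"   # placeholder
APP_ID = 413150                      # placeholder app ID (Stardew Valley)

def owns_app(steam_id, app_id):
    """Return True/False for public profiles, None if the library is hidden."""
    resp = requests.get(
        "https://api.steampowered.com/IPlayerService/GetOwnedGames/v0001/",
        params={"key": API_KEY, "steamid": steam_id, "format": "json"},
        timeout=10,
    )
    games = resp.json().get("response", {}).get("games")
    if games is None:                 # private profile or empty library
        return None
    return any(g["appid"] == app_id for g in games)

# 64-bit SteamIDs for individual accounts start at a known base offset,
# so random account numbers can be sampled on top of it.
BASE_STEAM_ID = 76561197960265728
sample = [BASE_STEAM_ID + random.randrange(10**9) for _ in range(1000)]

results = [owns_app(sid, APP_ID) for sid in sample]
visible = [r for r in results if r is not None]
if visible:
    share = sum(visible) / len(visible)
    print(f"~{share:.2%} of sampled public profiles own app {APP_ID}")
```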

SuperData takes point-of-sale and transaction information from payment service providers, plus some publishers and developers, which means it can track actual spend. It’s strong on console, but again, it’s not 100% accurate. The mobile space has a strong player in App Annie collecting data, although developers in the space find the cost of accessing this information high.

It feels unusual to be having this conversation in 2017. In a market that is now predominantly digital, the fact we have no accurate way of measuring our industry seems absurd. Film has almost daily updates of box office takings, the music market even tracks streams and radio plays… we don’t even know how many people downloaded Overwatch, or where Stardew Valley would have charted. So what is taking so long?

“It took a tremendous amount of time and effort from both the publisher and NPD sides to make digital sales data begin to flow,” says Mat Piscatella, NPD’s US games industry analyst. NPD’s monthly digital chart is the furthest the industry has come to accurate market data in the download space.

“It certainly wasn’t like flipping a switch. Entirely new processes were necessary on both sides – publishers and within NPD. New ways of thinking about sales data had to be derived. And at the publishers, efforts had to be made to identify the investments that would be required in order to participate. And of course, most crucially, getting those investments approved. We all had to learn together, publishers, NPD, EEDAR and others, in ways that met the wants and needs of everyone participating.

“Over time, most of the largest third-party publishers joined the digital panel. It has been a remarkable series of events that have gotten us to where we are today. It hasn’t always been smooth; and keep in mind, at the time the digital initiative began, digital sales were often a very small piece of the business, and one that was often not being actively managed. Back then, publishers may have been letting someone in a first-party operation, or brand marketing role post the box art to the game on the Sony, Microsoft and Steam storefronts, and that would be that. Pricing wouldn’t be actively managed, sales might be looked at every month or quarter, but this information certainly was not being looked at like packaged sales were. The digital business was a smaller, incremental piece of the pie then. Now, of course, that’s certainly changed, and continues to change.”

SuperData’s Joost van Dreunen sees several reasons why some publishers still hold back. “For one, the majors are publicly traded firms, which means that any shared data presents a financial liability. Across the board the big publishers have historically sought to protect the sanctity of their internal operations because of the long development cycles and high capital risks involved in AAA game publishing. But, to be honest, it’s only been a few years that especially legacy publishers have started to aggregate and apply digital data, which means that their internal reporting still tends to be relatively underdeveloped. Many of them are only now building the necessary teams and infrastructure around business intelligence.”

Indeed, both SuperData and NPD believe that progress – as slow as it may be – has been happening. And although some publishers are still holding out or refusing to get involved, that resolve is weakening over time.

“For us, it’s about proving the value of participation to those publishers that are choosing not to participate at this time,” Piscatella says. “And that can be a challenge for a few reasons. First, some publishers may believe that the data available today is not directly actionable or meaningful to its business. The publisher may offer products that have dominant share in a particular niche, for example, which competitive data as it stands today would not help them improve.

“Second, some publishers may believe that they have some ‘secret sauce’ that sharing digital sales data would expose, and they don’t want to lose that perceived competitive advantage. Third, resources are almost always stretched thin, requiring prioritisation of business initiatives. For the most part, publishers have not expanded their sales planning departments to keep pace with all of the overwhelming amount of new information and data sources that are now available. There simply may not be the people power to effectively participate, forcing some publishers to pass on participating, at least for now.

“So I would certainly not classify this situation as companies ‘holding out’ as you say. It’s that some companies have not yet been convinced that sharing such information is beneficial enough to overcome the business challenges involved. Conceptually, the sharing of such information seems very easy. In reality, participating in an initiative like this takes time, money, energy and trust. I’m encouraged and very happy so much progress has been made with participating publishers, and a tremendous amount of energy is being applied to prove that value to those publishers that are currently not participating.”

NPD’s achievement is significant because it has managed to convince a good number of bigger publishers, and those with particularly successful IP, to share figures. And this has long been seen as a stumbling block, because for those companies performing particularly well, the urge to share data is reduced. I’ve heard countless comments from sales directors who have said that ‘sharing download numbers would just encourage more competitors to try what we’re doing.’ It’s why van Dreunen has noted that “as soon as game companies start to do well, they cease the sharing of their data.”

Indeed, it is often fledgling companies, and indie studios, that need this data more than most. It’s part of the reason behind the rise of Steam Spy, which prides itself on helping smaller outfits.

“I’ve heard many stories about indie teams getting financed because they managed to present market research based on Steam Spy data,” boasts Sergey Galyonkin, the man behind Steam Spy. “Just this week I talked to a team that got funded by Medienboard Berlin-Brandenburg based on this. Before Steam Spy it was harder to do a proper market research for people like them.

“Big players know these numbers already and would gain nothing from sharing them with everyone else. Small developers have no access to paid research to publish anything.

“Overall I’d say Steam Spy helped to move the discussion into a more data-based realm and that’s a good thing in my opinion.”

The games industry may be unusually backward when it comes to sharing its digital data, but there are signs of a growing willingness to be more open. A combination of trade body and media pressure has convinced some larger publishers to give it a go. Furthermore, publishers are starting to feel obligated to share figures anyway, especially when the likes of SuperData and Steam Spy are putting out information whether publishers want them to or not.

Indeed, although the chart Dorian promised me nine years ago is still AWOL, there are at least some figures out there today that give us a sense of how things are performing.

“When we first started SuperData six years ago there was exactly zero digital data available,” van Dreunen notes. “Today we track the monthly spending of 78 million digital gamers across platforms, in spite of heavy competition and the reluctance from publishers to share. Creating transparency around digital data is merely a matter of market maturity and executive leadership, and many of our customers and partners have started to realize that.”

He continues: “The current inertia comes from middle management that fears new revenue models and industry changes. So we are trying to overcome a mindset rather than a data problem. It is a slow process of winning the confidence and trust of key players, one at a time. We’ve managed to broker partnerships with key industry associations, partner with firms like GfK in Europe and Kadokawa Dwango in Japan, to offer a complete market picture, and win the trust of big publishers. As we all move into the next era of interactive entertainment, the need for market information will only increase, and those that have shown themselves willing to collaborate and take a chance are simply better prepared for the future.”

NPD’s Piscatella concludes: “The one thing I’m most proud of, and impressed by, is the willingness of the participating publishers in our panel to work through issues as they’ve come up. We have a dedicated, positive group of companies working together to get this information flowing. Moving forward, it’s all about helping those publishers that aren’t participating understand how they can benefit through the sharing of digital consumer sales information, and in making that decision to say “yes” as easy as possible.

“Digital selling channels are growing quickly. Digital sales are becoming a bigger piece of the pie across the traditional gaming market. I fully expect participation from the publishing community to continue to grow.”

Courtesy-GI.biz

Will Politics Bring Down The Gaming Industry?

February 20, 2017 by  
Filed under Gaming

If you’re someone who makes a living from videogames – as most readers of this site are – then political developments around the world at the moment should deeply concern you. I’m sure, of course, that a great many of you are concerned about things ranging from President Trump’s Muslim travel ban to the UK Parliament’s vote for “Hard Brexit” or the looming elections in Holland and France simply on the basis of being politically aware and engaged. However, there’s a much more practical and direct way in which these developments and the direction of travel which they imply will impact upon us. Regardless of personal ideology or beliefs, there’s no denying that the environment that seems to be forming is one that’s bad for the medium, bad for the industry, and will ultimately be bad for the incomes and job security of everyone who works in this sector.

Video games thrive in broadly the same conditions as any other artistic or creative medium, and those conditions are well known and largely undisputed. Creative mediums benefit from diversity; a wide range of voices, views and backgrounds being represented within a creative industry feeds directly into a diversity of creative output, which in turn allows an industry to grow by addressing new groups of consumers. Moreover, creative mediums benefit from economic stability, because when people’s incomes are low or uncertain, entertainment purchases are often among the first to fall.

Once upon a time, games had such strong underlying growth that they were “recession proof,” but this is no longer the case. Indeed, it was never entirely an accurate reading anyway, since broader recessions undoubtedly did slow down – though not reverse – the industry’s growth. Finally, as a consequence of the industry’s broad demographic reach, expansion overseas is now the industry’s best path to future growth, and that demands continued economic progress in the developing world to open up new markets for game hardware and software.

What is now happening on a global basis threatens all of those conditions, and therefore poses a major commercial threat to the games business. That threat must be taken especially seriously given that many companies and creators are already struggling with the enormous challenges that have been thrown up by the messy and uneven transition towards smart devices, and the increasing need to find new revenue streams to support AAA titles whose audience has remained largely unchanged even as development budgets have risen. Even if the global economic system looked stable and conditions were ideal for creative industries, this would be a tough time for games; the prospect of restrictions on trade and hiring, and the likelihood of yet another deep global recession and a slow-down in the advances being made by developing economies, make this situation outright hazardous.

Consider the UK development industry. For well over a decade, if you asked just about any senior figure in the UK industry what the most pressing problem they faced was, they’d give you the same answer: skills shortages. Hiring talented staff is tough in any industry, but game development demands highly skilled people from across a range of fields, and assembling that kind of talent isn’t cheap or easy – even when you have access to the entire European Union as a hiring base, as UK companies did. Now UK companies face having to fill their positions from a much smaller pool of talent, and hiring from abroad will be expensive, complex and, in many cases, simply impossible.

The US, too, looks like it may tighten visa regulations for skilled hires from overseas, which will have a hugely negative impact on game development there. There are, of course, many skilled creatives who work within the borders of their own country, but the industry has been built on labour flows; centres of excellence in game development, like the UK and parts of the US, are sustained and bolstered by their ability to attract talent from overseas. Any restriction on that will impact the ability of companies to create world-class games – it will make them poorer creatively and throw hiring roadblocks in the path of timely, well-polished releases.

Then there’s the question of trade barriers; not only tariffs, which seem likely to make a comeback in many places, but non-tariff barriers in terms of diverse regulations and standards that will make it harder for companies to operate across national borders. The vast majority of games are multinational efforts; assets, code, and technology are created in different parts of the world and brought together to create the final product. Sometimes this is because of outsourcing, other times it’s because of staff who work remotely, and very often it’s simply because a certain piece of technology is licensed from a company overseas.

If countries become more hostile to free trade, all of that will become more complex and expensive. And that’s even before we start to think about what happens to game hardware, from consoles that source components from across Asia before assembly in China or Japan, to PC and smart device parts that flow out of China, Korea, Taiwan and, increasingly, from developing nations in South-East Asia. If tariff barriers are raised, all of those things will get a lot more expensive, limiting the industry’s consumer base at the most damaging time possible.

Such trade barriers – be they tariff barriers or non-tariff barriers – would disproportionately impact developing countries. Free trade and globalisation have had negative externalities, unquestionably, but by and large they have contributed to an extraordinary period of prosperity around the world, with enormous populations of people being lifted out of poverty in recent decades and many developing countries showing clear signs of a large emerging middle class. Those are the markets game companies desperately want to target in the coming decade or so. In order for the industry to continue to grow and prosper, the emerging middle class in countries like India, Brazil and Indonesia needs to be cultivated as a new wave of game consumers, just as many markets in Central and Eastern Europe were a decade ago.

The current political attacks on the existing order of world trade threaten to cut those economies off from the system that has allowed them to grow and develop so quickly, potentially hurling them into deep recession before they have an opportunity to cement stable, sustainable long-term economic prosperity. That’s an awful prospect on many levels, of course (it goes without saying that many of the things under discussion threaten human misery and catastrophe that far outweighs the impact on the games business), but one consequence will likely be a hard stop to the games industry’s capacity to grow in the coming years.

It’s not just developing economies whose consumers are at risk from a rise of protectionism and anti-trade sentiments, however. If we learned anything from the 2008 crash and the recession that followed, it should be that the global economy largely runs not on cash, but on confidence. The entire edifice is built on a set of rules and standards that are designed to give investors confidence; the structure changes over time, of course, but only slowly, because stability is required to allow people to invest and to build businesses with confidence that the rug won’t be tugged out from underneath them tomorrow. From the rhetoric of Donald Trump to the hardline Brexit approach of the UK, let alone the extremist ideas of politicians like Marine le Pen and Geert Wilders, the current political movement deeply threatens that confidence. Only too recently we’ve seen what happens to ordinary consumers’ job security and incomes when confidence disappears from the global economy; a repeat performance now seems almost inevitable.

Of course, the games industry isn’t in a position to do anything about these political changes – not alone, at least. The same calculations, however, apply to a wide variety of industries, and they’re all having the same conversations. Creative industries are at the forefront for the simple reason that they will be the first to suffer should the business environment upon which they rely turn negative, but in opposing those changes, creative businesses will find allies across a wide range of industries and sectors.

Any business leader that wants to throw their weight behind opposing these changes on moral or ethical grounds is more than welcome to, of course – that’s a laudable stance – but regardless of personal ideology, the whole industry should be making its voice heard. The livelihoods of everyone working in this industry may depend on the willingness of the industry as a whole to identify these commercial threats and respond to them clearly and powerfully.

Courtesy-GI.biz

Is The Gaming Industry Due For An Overhaul?

February 16, 2017 by  
Filed under Gaming

Physical retailers are calling for a change in how video game pre-orders are conducted.

They are speaking to publishers and platform holders about the possibility of selling games before the release date. Under the proposal, consumers could pick up the disc one to three weeks before launch, but it would remain ‘locked’ until launch day.

The whole concept stems from the pre-loading service available in the digital space. Today, consumers can download a game via Steam, Xbox Live and PSN before it’s out, and the game becomes unlocked at midnight on launch day for immediate play (after the obligatory day one patch).

It makes sense to roll this out to other distribution channels. The idea of going into a shop to order a game, and then returning a month later to buy it, always seemed frankly antiquated.

Yet it’s not only consumer friendly, it’s potentially retailer and publisher friendly, too.

For online retailers, the need to hit an embargo is costly – games need to be turned around rapidly to get them into consumers’ hands on day one.

For mainstream retailers, it would clear up a lot of confusion. These stores are not naturally built for pre-ordering product, with staff that are more used to selling bananas than issuing pre-order receipts. The fact you can immediately take the disc home would help – it could even boost impulse sales.

Meanwhile, specialist retailers will be able to make a longer ‘event’ of the game coming out, and avoid the situation of consumers cancelling pre-orders or simply not picking up the game.

Yet when retail association ERA approached some companies about the prospect of doing this, it struggled to find much interest from the publishing community. So what’s the problem?

There are a few challenges.

There are simple logistical obstacles. Games often go Gold just a few weeks before they’re launched, and then it’s over to the disc manufacturers, the printers, the box makers and the distributors to get that completed code onto store shelves. This process can take two weeks in itself. Take the recent Nioh. That game was available to pre-download just a few days before launch – so how difficult would it be to get that into a box, onto a lorry and into a retailer in advance of release?

It also benefits some retailers more than others – particularly online ones, and those with strong distribution channels.

For big games, there’s a potential challenge when it comes to bandwidth. If those that pre-ordered Call of Duty all go online straight away at 12:01, that would put a lot of pressure on servers.

Piracy may also be an issue, because it makes the code available ahead of launch.

The end of the midnight launch may be happening anyway, but not for all games. If consumers can get their game without standing in the cold for 2 hours, then they will. And those lovely marketable pictures of snaking queues will be a thing of the past.

None of these obstacles are insurmountable. Getting the game finished earlier before launch is something that most big games publishers are trying to do, and this mechanism will help force that issue. Of course, the disc doesn’t actually have to contain a game at all. It can be an unlock mechanism for a download, which will allow the discs to be ready far in advance of launch. That strategy is significantly riskier, especially considering the consumer reaction to the same model proposed by Xbox back in 2013.

As for midnight events, there are still ways to generate that big launch ‘moment’. Capcom launched Resident Evil 7 with an experiential haunted house that generated lots of media attention and attracted a significant number of fans. Pokémon last year ran a big fan event for Sun and Moon, complete with a shop, activities, signing opportunities and the chance to download Mew.

So there are other ways of creating launch theatre than inviting consumers to wait outside a shop. If anything, having the game available in advance of launch will enable these theatrical marketing events to last longer. And coupled with influencer activity, it would actually drive pre-release sales – not just pre-release demand.

However, the reality is this will work for some games and not for others, and here lies the heart of the challenge.

Pre-ordering is already a relatively complex matter, so imagine what it’ll be like if some games can be taken home in advance and others can’t? How many instances can we expect of people complaining that ‘their disc doesn’t work’?

If this is going to work, it needs cross-industry support, which isn’t going to happen. This is a business that can’t even agree on a digital chart, don’t forget.

What we may well see is someone giving this concept a go. Perhaps a digital native publisher, like Blizzard or Valve, who can make it part of their PR activity.

Because if someone like that can make the idea work, then others will follow.

Courtesy-GI.biz

Is Microsoft Taking Windows To The Cloud?

February 2, 2017 by  
Filed under Computing

Software king of the world Microsoft is planning a cut down version of Windows 10 which will operate in the cloud.

Dubbed the Composable Shell (CSHELL), the software is a single, unified ‘adaptive shell’ for Windows 10, and it is part of Vole’s cunning plan to create a universal Windows 10 version.

This will mean we will see a standardised framework to scale and adapt the OS to any type of device, display size or user experience, including smartphones, PCs, tablets, consoles, large touchscreens, and more.

Stage one is apparently a Cloud Shell, a cut-down version of Windows designed for the modern computing world.

Cloud Shell should be out there in 2017 and it will be connected to the Windows Store and Universal Windows Platform app framework.

This would fit with Vole’s plans to bring the full version of Windows 10 to mobile devices with ARM-based processors, which it announced in December.

A ‘lightweight’ version of Windows could hint at a ‘thin client’-style approach which has been touted as a viable business tool for the last 20 years but has never really taken off.

Courtesy-Fud

Is Sony Really Committed To The PSVR?

February 1, 2017 by  
Filed under Gaming

The positive reviews pouring in from all corners for Capcom’s Resident Evil 7 are a welcome validation of the firm’s decision to go in quite a radically new direction with the series, but Capcom isn’t the only company that will be happy (and perhaps a little relieved) by the response to the game. A positive reaction to RE7 is also hugely important for Sony, because this is the first real attempt at proving that PSVR is a worthy platform for full-scale, AAA games, and much of the credibility of the nascent VR platform rests on RE7.

Although some of the sentiment in reviews of the game suggests that the VR mode is interesting but somewhat flawed, and several reviewers have expressed a preference for playing on a normal screen, the game’s VR aspect undoubtedly fascinates consumers and seems to be implemented well enough to justify their interest. In the process, it also justifies Sony’s investment in the title – the company did a deal that secured a year-long VR exclusivity window for PSVR – and Capcom’s own faith in the burgeoning medium, which undoubtedly played a large role in the decision to switch the entire game over to a first-person perspective.

The critical success of RE7, and the likely commercial success that will follow, comes at a crucial juncture for PSVR. Although the hardware was well-reviewed at launch and remains more or less supply-constrained at retail – you certainly can’t get your hands on one without paying a hefty re-seller premium in Japan at the moment, and believe me I’ve tried – there’s an emerging narrative about the VR headset that’s distinctly negative and pessimistic. Plenty of op-eds and videos have popped up in recent weeks comparing PSVR to previous Sony peripheral launches like PlayStation Eye and PlayStation Move; hardware that was launched with a lot of heavy marketing support but which the giant company rapidly seemed to lose interest in, condemning it to a few years of token, declining software support before being quietly shelved altogether.

It’s worth noting, of course, that neither Eye nor Move actually died off entirely – in fact, both of these technologies have made their way into PSVR itself, with the headset essentially being an evolution of a number of previous Sony technologies that have finally found a decent application in VR. However, there’s no question but that Sony has a bad track record with peripherals, and those interested in the future of PSVR should absolutely be keeping a close eye on the company to see if there are any signs of it repeating its past behaviour patterns.

Most of what’s being written now, however, feels premature. PSVR had a pretty solid launch line-up, with good support from across the industry; just this week it got its first truly big third-party AAA title, which is receiving excellent reviews, and later in the year it’s got some big launches like GT Sport on the way. The pace of software releases slumped after the launch window, but that’s not unusual for any platform. There’s nothing about PSVR that you can point to right now and declare as evidence of Sony’s focus shifting away; it feels like editorials claiming this are doing so purely on the basis of Sony’s track record, not the facts as they exist now.

If you really want to know how PSVR is shaping up, there are two key things to watch out for in the near future. The first will be any data that’s released regarding the performance of RE7’s VR mode; is it popular? Is it being played widely? Does it become a part of the broad conversation about the game? Much of this latter aspect is down to Sony and Capcom’s marketing of course; there’s an opportunity to push the VR aspect of RE7 as a genuinely unique experience with appeal even beyond the usual gaming audience, and if that can be capitalised upon, it will likely secure PSVR’s future to a large degree. What’s crucial, though, is that every other company in the industry will be watching RE7 like hawks; if proper, well-integrated PSVR support seems to be a major selling factor or a popular feature, you can be guaranteed that other publishers will start to look at their major franchises with a view to identifying which of them would suit a similar “traditional display plus optional VR” approach.

The other thing to watch for, unsurprisingly, is what Sony does at E3 and other major gaming events this spring. This is really where we’ll see the proof of the company’s focus – or lack of same. There’s still plenty of time to announce VR titles for the back half of this year, which is likely to be the crucial point for PSVR; by the time we slip into the second half of 2017, the hardware will no longer be supply constrained and the early adopters buying for novelty will be all but exhausted. That’s the point in time where PSVR’s software line-up really needs to come together coherently, to convince the next wave of potential purchasers that this is a platform worth investing in. If it fails that test, PSVR will join Move and Eye in the graveyard of Sony’s failed peripherals; success will turn it into a cornerstone of the PS4 for the coming years.

So keep a close eye on E3. Part of this is just down to optics; how much time and focus does the firm devote to PSVR on stage at its conference? If it’s not very much, if the PSVR section feels rushed or underemphasised, that will send a strong message that Sony is back to its old bad habits and has lost interest in its latest peripheral already. A strong, confident PSVR segment would convince consumers and the industry alike that the headset isn’t just another easily abandoned gimmick; better yet if this is backed up by plenty of the big games being announced having PSVR functionality built into them, so the device can be referred back to repeatedly during the conference rather than being confined to its own short segment.

It’s more than just optics though; the reality is that PSVR, like any platform, needs software, and Sony needs to lead the way by showing that it’s truly devoted to its own hardware. It may seem a little unfair that people are already keen to declare PSVR to be stumbling due to lack of attention, and well, it is a little unfair – but nobody should be surprised that people are seeing a pattern here that Sony itself clearly established with its behaviour towards previous peripherals. That’s the reputation the firm has, unfortunately, created for itself; that makes it all the more important that it should convince the world of its commitment to PSVR when the time comes.

Courtesy-GI.biz

Is EA Slowly Moving Away From Appearing At E3?

January 20, 2017 by  
Filed under Gaming

It would appear that the trend of big publishers hosting their own events will continue in 2017. Last year’s E3 show floor was missing booths from the likes of Electronic Arts, Activision Blizzard, Disney and Wargaming. For its part, EA decided it could better serve the fans by hosting its own event next door to E3, and now the publisher has confirmed that EA Play will be making a return for the second year in a row, but it won’t be as close to the Los Angeles Convention Center.

EA Play will be held from June 10-12 at the Hollywood Palladium, which is around seven miles away. “Whether in person or online, EA Play 2017 will connect fans around the world to EA’s biggest new games through live broadcasts, community content, competitions and more. Those that can attend in Hollywood will experience hands-on gameplay, live entertainment and much more. For anyone joining digitally around the world, EA Play will feature livestreams, deeper looks into EA’s upcoming games and experiences, and content from some of the best creators in the community,” the company stated in a press release.

Furthermore, a spokesperson confirmed to GamesIndustry.biz that EA will indeed be skipping out on having a major E3 presence. “EA Play was such a powerful platform for us last year to connect with our player community. We learned a ton, and we wanted to build on everything we loved about last year’s event to make EA Play 2017 even better,” EA corporate communications VP John Reseburg said.

“So after an extensive search, we’ve selected the Hollywood Palladium as a place where we can bring our vision of creativity, content and storytelling to life, and build an even more powerful experience to connect with players, community leaders, media and partners. EA Play 2017 will originate from Hollywood, with more ways for players around the world to connect and experience the excitement.”

It’ll be interesting to see what the other major publishers do about E3 this year. We’ll be sure to keep you posted.

Courtesy-Fud

Can The Xbox One S Succeed Without Exclusives?

January 18, 2017 by  
Filed under Gaming

As with many game cancellations, it’s likely we’ll never know exactly why Platinum Games’ Xbox One exclusive Scalebound has been dropped by Microsoft. For a game that’s been in development for several years at a top-flight studio, helmed by one of the most accomplished directors working in the industry today, to be cancelled outright is a pretty big deal. Even acknowledging that most of the cost of launching a game lies in marketing budgets, not development costs, this still represents writing off a fairly huge financial investment – not to mention the hard-to-quantify costs to the image and reputation of the Xbox brand. This isn’t the kind of decision that’s made rapidly or taken lightly – and though the reasons remain obscure, we can guess that a mix of factors was considered.

For one thing, it’s fairly likely that the game wasn’t living up to expectations. Scalebound was ambitious, combining unusual RPG aspects with a style of action Platinum Games (usually masters of the action genre) hadn’t attempted before, and throwing four-player co-op into the mix as well. There are a lot of things in that mix that could go wrong; plenty of fundamental elements that just might not gel well, that might look good on paper but ultimately fail to provide the kind of compelling, absorbing experience a AAA console exclusive needs. These things happen, even to the most talented of creative teams and directors.

For another thing, though, it’s equally likely that Microsoft’s decision stems in part from some issues internal to the publisher. Since Scalebound went into development in 2013, the Xbox division has been on a long, strange journey, and has ended up in a very different place to the one it anticipated when it inked its deal with Platinum three years ago. When Microsoft signed on to publish Scalebound, it was gearing up to launch an ambitious successor to the hugely successful Xbox 360 which would, it believed, expand upon the 360’s audience by being an all-purpose entertainment box, a motion-controlled device as much media hub and high-tech TV viewing system as game console.

By the time Scalebound was cancelled this week, much of that ambition had been scrapped, PS4 had soared off into the sunset leaving Microsoft trailing in a very distant second place, and Xbox One has become instead one link in a longer chain, a single component of an Xbox and Xbox Live brand and platform that extends across the Windows 10 ecosystem and which will, later this year, also encompass a vastly upgraded console in the form of Scorpio.

It only stands to reason that the logic which led to the signing of a game before this upheaval would no longer apply in the present environment. While quality issues around Scalebound cannot be dismissed – if Microsoft felt that it had a truly great game on its hands, it would have proceeded with it regardless of any strategic calculation – the implications of Scalebound’s cancellation for the broader Xbox strategy are worthy of some thought. Actually, it’s not so much Scalebound itself – which is just one game, albeit a very high profile one – as the situation in which its cancellation leaves the Xbox in 2017, and the dramatic defocusing of exclusive software which the removal of Scalebound from the release list throws into sharp relief.

A quick glance down 2017’s release calendar suggests that there remain only two major Xbox One exclusive titles due to launch this year – Halo Wars 2 and Crackdown 3. The console remains well supported with cross-platform releases, of course, but in terms of reasons for a player to choose Xbox One over the more successful PS4, or indeed for an existing PS4 owner to invest in an Xbox One as a second console (a vital and often overlooked factor in growing the install base mid-cycle), things are very sparse. By contrast, the PS4 has a high profile exclusive coming out just about every few weeks – many of them from Sony’s first-party studios, but plenty of others coming from third parties. Platinum Games’ fans will note, no doubt, that Sony’s console will be getting a new title from the studio – NieR: Automata – only a few months after Scalebound’s cancellation.

The proliferation of multiplatform games means that Xbox One owners won’t be starved of software – this is no Wii U situation. Existing owners, and those who bought into the platform after the launch of the Xbox One S last year, will probably be quite happy with their system, but the fact remains that with the exception of the two titles mentioned above and a handful of indie games (some of which do look good), the Xbox One this year is going to get by on a subset of the PS4’s release schedule.

That’s not healthy for the future of the platform. The strong impression is that third parties have largely abandoned Xbox One as a platform worth launching exclusive games on, and unlike Sony during the PS3’s catch-up era, Microsoft’s own studios and publishing deals have not come forward to take up the slack in its console’s release schedule. This isn’t all down to Scalebound, of course; Scalebound is just the straw that breaks the camel’s back, making this situation impossible to ignore.

Why have things ended up this way? There are two possible answers, and the reality is probably a little from column A and a little from column B. The first answer is that Microsoft’s strategy for Xbox has changed in a way which makes high-profile (and high-cost) exclusive software less justifiable within the company. That’s especially true of high-profile games that won’t be on Windows 10 as well as Xbox One; one of the ways in which the Xbox division has secured its future within Microsoft in the wake of the company’s reorganisation under CEO Satya Nadella is by positioning itself as a key part of the Windows 10 ecosystem.

Pushing Xbox One exclusive software flies in the face of that strategic positioning; new titles Microsoft lines up for the future will be cross-platform between Windows and Xbox, and that changes publishing priorities. It’s also worth noting that the last attempt Microsoft made to plug the gap in its exclusive software line-up didn’t go down so well and hasn’t been repeated; paying for a 12-month exclusivity window for the sequel to the (multiplatform) Tomb Raider reboot just seems to have annoyed people and didn’t sell a notable number of Xbox Ones.

The second answer, unsurprisingly, revolves around Scorpio. It’s not unusual for a console to suffer a software drought before its successor appears on the market, so with Scorpio presumably being unveiled at E3 this year, the Xbox One release list could be expected to dry up. The wrinkle in this cloth is that Scorpio isn’t meant to be an Xbox One replacement. What little information Microsoft has provided about the console thus far has been careful to position it as an evolution of the Xbox One platform, not a new system. What that means in practice, though, hasn’t been explained or explored. Microsoft’s messaging on Scorpio is similar to the positioning of PS4 Pro – an evolutionary upgrade whose arrival made no difference to software release schedules – but at the same time suggests a vastly more powerful system, one whose capabilities will far outstrip those of Xbox One to an extent more reminiscent of a generational leap than an evolutionary upgrade.

The question is whether Microsoft’s anaemic slate of exclusive releases is down, in part, to a focus on getting big titles ready for Scorpio’s launch window. If so, it feels awfully like confirmation that Scorpio – though no doubt sharing Xbox One’s architecture and thus offering perfect backwards compatibility – is really a new console with new exclusive software to match. If it’s not the case, however, then along with clearing up the details of Scorpio, this year’s E3 will have to answer another big question for Microsoft; where is all your software?

2017 needs to just be a temporary dip in the company’s output, or all its efforts on Scorpio will be for naught. Seamus Blackley, Ed Fries, Kevin Bachus and the rest of the original Xbox launch team understood something crucial all the way back in the late nineties when they were preparing to enter Microsoft into the console business; software sells hardware. If you don’t have the games, nothing else matters. Whatever the reasons for 2017’s weak offering from Xbox, we must firmly hope that that lesson hasn’t been forgotten in the corridors of Redmond.

Courtesy-GI.biz

HDMI v2.1 To Support 8K

January 9, 2017 by  
Filed under Around The Net

The HDMI Forum has officially announced the upcoming release of the HDMI v2.1 specification, bringing a range of higher video resolutions, Dynamic HDR support and more.

According to the provided details, the new HDMI v2.1 specification will be backward compatible with earlier versions. In addition to higher video resolutions and refresh rates, including 8K@60Hz and 4K@120Hz, it will also bring support for Dynamic HDR, Game Mode variable refresh rate (VRR), eARC support for advanced audio formats, and a new 48G cable that provides 48Gbps of bandwidth, which is key for 8K HDR support.

The full list of resolutions and refresh rates starts with 4K at 50/60Hz and 100/120Hz and climbs all the way up to 10K resolution at both 50/60Hz and 100/120Hz.

The big surprise is the new Game Mode VRR, which is similar to AMD FreeSync and Nvidia G-Sync and is meant to provide stutter-, tearing- and lag-free gaming on both consoles and the PC.

Another big novelty is the all new 48G cable, which will provide enough bandwidth for higher resolutions with HDR. The old cables will be compatible with some of the new features, but for 8K/10K with HDR, you will need the new HDMI v2.1 cable.
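As a rough illustration of why the bandwidth jump matters, here is a hedged back-of-the-envelope calculation (a minimal Python sketch assuming uncompressed 10-bit RGB for HDR and ignoring blanking and encoding overhead, so real link requirements will differ somewhat):

```python
# Uncompressed video bandwidth estimate: pixels per second x bits per pixel.
def raw_gbps(width, height, refresh_hz, bits_per_channel=10, channels=3):
    return width * height * bits_per_channel * channels * refresh_hz / 1e9

modes = {
    "4K @ 60Hz":  (3840, 2160, 60),
    "4K @ 120Hz": (3840, 2160, 120),
    "8K @ 60Hz":  (7680, 4320, 60),
}

for name, (w, h, hz) in modes.items():
    print(f"{name}: ~{raw_gbps(w, h, hz):.0f} Gbps uncompressed")

# Approximate output:
#   4K @ 60Hz:  ~15 Gbps  (fits within HDMI 2.0's 18Gbps)
#   4K @ 120Hz: ~30 Gbps  (needs the new 48G link)
#   8K @ 60Hz:  ~60 Gbps  (exceeds even 48Gbps uncompressed, which is why the
#                          highest modes are expected to lean on chroma
#                          subsampling and/or stream compression)
```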

According to HDMI Forum, the new HDMI v2.1 specification will be available to all HDMI 2.0 Adopters which will be notified when it is released in early Q2 2017.

Courtesy-Fud

Will The ARM And Windows 10 Union Hurt Intel In 2017?

December 22, 2016 by  
Filed under Computing

Next year will see the arrival of ARM-based Windows machines, and while the beasts have not been that popular in the past, it is expected that this time will be different.

The original Windows-on-ARM experiment, otherwise known as Windows RT, failed mostly because it was confined to tablets and Vole had not really got any mojo in that market at the time.

Now Microsoft’s OEM partners believe that the company will be more successful this time around and less exclusive. Several companies have signed up to create ARM-based notebooks and tablets running Windows 10.

This is because Windows 10 on ARM will support the entire library of millions of standard Win32 desktop apps via an optimised emulation engine.

Qualcomm’s Snapdragon 835 will be the first candidate to receive Windows 10 in 2017, likely starting with the Redstone 3 release expected in late 2017. Microsoft has demoed this chip in action and it provided good performance and decent battery life.

It could also be the key to Microsoft getting into smartphones. It is all rumour and speculation at the moment, but it is a pretty good bet. It will also not be a great thing for Intel, which will have to beat AMD off with a stick as the hardware markets shrink.

Courtesy-Fud

What Is On The Horizon For Gaming Consoles?

December 21, 2016 by  
Filed under Gaming

With the exception of the last generation of consoles, which saw a roughly eight-year run on the market, the traditional console cycle has averaged around five to six years. This time around, however, perhaps influenced by the wave of high-end graphics cards necessitated by VR, both Microsoft and Sony are testing the market with so-called mid-cycle upgrades. The PS4 Pro and next year’s Scorpio offer gamers the chance to play in 4K – a resolution that until recently was only possible in the realm of PC gaming. Perhaps more importantly, the inclusion of HDR gaming offers a new level of visual fidelity that brings a much wider color gamut to players.

Factoring in the recent release of the Xbox One S, and the PS4 Slim model as well, we’ve never seen this many new consoles launched to the market so near to the start of a new console cycle – both PS4 and Xbox One released only three years ago – which begs the question: is the traditional console cycle now dead?

2016 and 2017 with Scorpio will certainly prove to be an interesting test for the market. It’s far too early to judge the reception to PS4 Pro, but Xbox One S has been selling moderately well, even allowing Xbox One to outsell PS4 for a few months.

NPD analyst Mat Piscatella, who joined the data firm with years of publishing experience at Activision and WBIE, commented, “I think I’d call it more ‘evolved’ than ‘died’… Nintendo seems to have already been there for years, at least in the portable space. We have to wait and see what they will do with the Switch. But the iterative model certainly will be tested over the next 12-18 months.”

“Nintendo’s past approach in the portable space has proven that the iterative model can be successful,” he continued. “However, the PS4 Pro and Scorpio (we assume) will bring much more significant performance upgrades at higher upfront costs to the consumer.

“Adding iterative hardware into a cycle also creates the need for a much more demanding set of go to market strategies while making successful execution of those strategies more critical than ever before.  Balancing game development resources, the hardware R&D challenges surrounding an iterative launch, the detailed planning that will be necessary to properly align the supply chain from production to retail to ensure the proper mix and stock volumes in channel, ensuring the pricing and price promotion programs are right, all while communicating marketing messages that speak to the different customer sets effectively… this will certainly be an ongoing challenge.”

While some have speculated that we’ll now see new consoles literally every year, mimicking the lightspeed pace at which smartphones get upgraded, the market dynamics for consoles and the impact of new hardware on developers make that a much more difficult proposition.

Consequently, Piscatella doesn’t think we’ll see more than one new console iteration per cycle. “I don’t see a more rapid deployment as feasible due primarily to development challenges. Making video games is hard, and ensuring a game is optimized for two versions of a console is challenging enough. Getting to 3 or 4 versions of the game for the same console base seems to me as though it would bring diminishing returns,” he explained.

SuperData’s Joost van Dreunen believes that the typical console cycle will remain intact, but unlike Piscatella, he sees the platform holders iterating continuously.

“The constant technical upgrades that we see in mobile have effectively changed consumer expectations and the broader market for interactive entertainment,” he noted. “So as Sony and Microsoft are pursuing their respective long-term VR/AR agendas, they now also have to keep in touch with what’s happening outside of their secret labs. This means we’re probably looking at the release of a new hardware architecture every 5-7 years, and allowing for annual iterations of key components throughout the lifecycle,” he said.

“This blend of internally developing proprietary hardware and adopting externally emerging trends that are popular with consumers is a powerful mix, and has allowed console gaming to thrive when many wrote it off. And given the current success of both platforms, and the imminent arrival of Nintendo’s bid, I don’t expect the console gaming market to soften any time soon.”

While the longstanding five-plus year console cycle was a boon for developers to work with and optimize for a set specification, the iterative approach that we’re likely to see in hardware moving forward brings advantages as well. As Piscatella explained, introducing console iterations can help to reduce cycle stagnation and boost consoles’ cycle tail, while also “encouraging pubs/devs to invest in scalable development environments, to hopefully avoid the dramatic steps up in development costs seen previously with new hardware deployment.”

Furthermore, by offering multiple configurations that still adhere to the same architecture (Xbox One, One S, Scorpio), there’s an opportunity for “price/benefit ratios appealing to both enthusiast and mass market audiences. Marketing approaches no longer have to take a one-size-fits-all approach,” Piscatella said.

The iterative release schedule would appear to make sense for the console manufacturers, enabling them to maximize returns and keep average pricing elevated. And so long as the ecosystem and gaming experiences are kept consistent across the numerous variations of the consoles, Sony and Microsoft don’t really care which version of their hardware a player owns – as long as that player remains invested in Xbox Live or PlayStation Network, that’s a win for Microsoft or Sony. The platform is less important than the digital ecosystem nowadays, especially with digital sales rising rapidly.

“The console market is at its heart a consumer electronics market,” van Dreunen remarked. “But increasingly it is incorporating tactics from the fashion industry, where we see an accelerated adoption of trends that emerged outside of the ateliers and studios of salaried designers. Console manufacturers have been actively pursuing digital distribution and free-to-play, both of which first gained traction on PC and mobile. Full game downloads now represent about 27% of holiday sales, up from just 5% in 2012, adding just under $7 billion a year to the console market. Titles like FIFA, GTA Online, and Call of Duty, do really well in terms of digital sales, and have managed to improve margins and player base longevity. Further facilitating this trend will be as important as making the hardware better.”

With the digital ecosystem taking precedence, it’s no surprise that Sony has made PlayStation Now streaming titles work on Windows PCs, and with the remote play feature, customers can enjoy PS4 titles on a PC or Mac as well. Microsoft, of course, which has a deep investment in the PC space with Windows 10, has extended the Xbox ecosystem across devices with Xbox Play Anywhere, enabling certain titles to be played, with progress intact, on a PC or Xbox console.

Whether you’re playing on Windows 10 or Xbox One doesn’t matter to Xbox boss Phil Spencer. In fact, he’s not even concerned about whether you upgrade to Scorpio next year.

“For us in the console [industry], the business is not selling the console,” Spencer told me back at E3. “The business is more of an attached business to the console install base. So if you’re an Xbox One customer and you bought that console 3 years ago, I think you’re a great customer. You’re still using the device. That’s why we focus on monthly active users. That’s actually the health of our ecosystem because it’s really you want this large install base of people that are active in your network buying games, playing games… So our model’s not really built around selling you a new console every one or two years. The model is almost the exact opposite. If I can keep you with the console you have, keep you engaged in buying and playing games, that’s a good business.”

If frequent hardware iteration is indeed the new reality for the console market, publishers couldn’t be happier. “I actually see it…as an incredibly positive evolution of the business strategy for players and for our industry and definitely for EA. The idea that we would potentially not have an end of cycle and a beginning of cycle I think is a positive place for our industry to be and for all of the commercial partners as well as players,” EA global publishing chief Laura Miele said during the company’s EA Play conference.

Take-Two CEO Strauss Zelnick agreed that it “would be a very good thing for us” not to have to worry about hardware and who owns which console. He likened it to TV. “When you make a television show you don’t ask yourself ‘what monitor is this going to play on?’ It could play on a 1964 color television or it could play on a brand-new 4K television, but you’re still going to make a good television show.”

“We will get to the point where the hardware becomes a backdrop,” he said. That may very well be true. At the point when high fidelity games are playable literally anywhere and on any device, will we even have consoles? Much like Netflix, the networks and content providers/curators will live on, but hardware may not matter.

And as happy as publishers appear to be about the new world of console iteration, NPD’s Piscatella did point to a possible cause for concern. “Here’s the real question. If consumers are purchasing multiple iterations of the same console over the course of a generation, does this potentially increase or decrease the amount of money that consumer will spend on software and associated content?” he asked.

“And I think this is an open question… will they spend more because they’ve reinvested in the ecosystems? Or will they spend less because they’ve just output more money on to the iterative hardware? We’ll see how the market answers that question over the next 12-18 months.”

Courtesy-GI.biz

Are VR Games Profitable?

December 13, 2016 by  
Filed under Gaming

Dean Hall, CEO of RocketWerkz and previously lead designer of DayZ, has spoken openly on Reddit about the harsh financial realities of VR development, explaining that without the subsidies provided by platform exclusives and other mechanisms, the medium would currently be largely unviable.

In an extended post which has garnered over 200 comments, Hall proclaimed that there was simply “no money” in VR game development, explaining that even though his VR title Out of Ammo had sold better than expected, it remained unprofitable.

Hall believes that many consumer expectations from the mature and well-supported PC market have carried over to VR, with customers not fully comprehending the challenges involved with producing content for such a small install base.

“From our standpoint, Out of Ammo has exceeded our sales predictions and achieved our internal objectives,” Hall explained. “However, it has been very unprofitable. It is extremely unlikely that it will ever be profitable. We are comfortable with this, and approached it as such. We expected to lose money and we had the funding internally to handle this. Consider then that Out of Ammo has sold unusually well compared to many other VR games.”
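
Hall’s point is easier to see with some rough numbers. The sketch below is purely illustrative, every figure in it is an assumption rather than anything Hall or RocketWerkz have disclosed, but it shows how quickly a modest install base runs into payroll reality once the storefront takes its cut:

    // Hypothetical break-even sketch for a small VR title. All inputs are
    // made-up illustrative figures, not numbers from Hall or RocketWerkz.
    #include <cstdio>

    int main() {
        const double units = 20000.0;              // copies sold
        const double price = 14.99;                // sale price in USD
        const double store_cut = 0.30;             // typical storefront share
        const double developers = 5.0;             // team size
        const double months = 8.0;                 // development time
        const double cost_per_dev_month = 10000.0; // salary, tools, overheads

        const double net_revenue = units * price * (1.0 - store_cut);
        const double dev_cost = developers * months * cost_per_dev_month;

        std::printf("net revenue: ~$%.0f\n", net_revenue); // ~$210,000
        std::printf("dev cost:    ~$%.0f\n", dev_cost);    // ~$400,000
        return 0;
    }

Even a healthy 20,000 sales under these assumptions recoups roughly half of the development cost, which is the arithmetic behind Hall’s “money to make payroll” remark.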

Pointing out that going cross-platform to ameliorate that small install base is not as simple for VR as it is for console development, Hall went on to talk about the realities of funding VR games, and what that means for the studios involved.

“Where do you get money to develop your games? How do you keep paying people? The only people who might be profitable will be microteams of one or two people with very popular games. The traditional approach has been to partner with platform developers for several reasons:

“The most common examples of this are the consoles. At launch, they actually have very few customers, and if the initial games released for them had not been bundled and/or under (timed or otherwise) exclusivity deals, the console would not have the games it does. Developers have relied on this funding in order to make games.

“How are the people who are against timed exclusives proposing that development studios pay for the development of the games?

“There is no money in it. I don’t mean ‘money to go buy a Ferrari’. I mean ‘money to make payroll’. People talk about developers who have taken Oculus/Facebook/Intel money like they’ve sold out and gone off to buy an island somewhere. The reality is these developers made these deals because it is the only way their games could come out.

“Here is an example. We considered doing some timed exclusivity for Out of Ammo, because it was uneconomical to continue development. We decided not to because the money available would just help cover costs. The amount of money was not going to make anyone wealthy. Frankly, I applaud Oculus for fronting up and giving real money out with really very little expectations in return other than some timed-exclusivity. Without this subsidization there is no way a studio can break even, let alone make a profit.

“Some will point to GabeN’s email about fronting costs for developers, however I’ve yet to know anyone who’s got that, has been told about it, or knows how to apply for this. It also means you need to get to a point you can access this. Additionally, HTC’s “accelerator” requires you to set up your studio in specific places – and these specific places are incredibly expensive areas to live and run a studio. I think Valve/HTC’s no subsidy/exclusive approach is good for the consumer in the short term – but terrible for studios.

“As a result I think we will see more and more microprojects, and then more and more criticism that there are not more games with more content.”

In addition to the financial burdens, Hall says that there are other pressures too. For example, in his experience VR development burns people out very quickly indeed, with the enthusiasm of most, including himself, waning after a single project.

“I laugh now when people say or tweet me things like ‘I can’t wait to see what your next VR game will be!’ Honestly, I don’t think I want to make any more VR games. Our staff who work on VR games all want to rotate off after their work is done. Privately, developers have been talking about this but nobody seems to feel comfortable talking about it publicly – which I think will ultimately be bad.”

It’s not a universal opinion among VR developers, however: there was opposition to Hall’s points both within and beyond the thread. Sam Watts, Operations Lead at Make Real, had the following to say.

“I think the reality of that thread is a direct result of a perceived gold rush by developers of all sizes to a degree, since analyst predictions around sales volumes of units were far higher than the reality towards the end of the year. There have been waves of gold rush perceptions with VR over the past few years, mostly around each release of new hardware expecting the next boom to take the technology into the mainstream, which has mostly failed to materialise.

“For us it became clear that the rise of VR would be gradual rather than explosive when in 2015, it was revealed that the Oculus Rift and HTC Vive would be released in 2016 and that the gold rush would be on hold.”

Watts also sees a healthier VR ecosystem on the way, one where big publishers might be more willing to invest in the sort of budgets which console games are used to.

“Whilst typical AAA budgets aren’t yet being spent on VR (to our knowledge) it doesn’t mean AAA isn’t dipping its toe in the water. The main leader is Ubisoft, which created a small VR R&D team that eventually became the Eagle Flight devs. They have avoided what many early VR developers were worried about from an AAA approach to VR by prototyping, iterating on design, making mistakes, learning from them and working out what does and doesn’t work in VR, even creating a now widely popular comfort option, the reduced peripheral vision black tunnel effect. They didn’t just storm in late to the party, throwing AAA megabucks at the problem and assuming money would make great games.

“I know Oculus, Steam, Sony and Razer are still funding games titles for development in 2017, I would hope to see this continue beyond to ensure the continued steady adoption and rise of VR as a new gaming platform moving forwards. This will help continue to improve the quality of content offering on the platforms to ensure full gaming experiences that gamers want to buy and return to, rather than just a series of short tech demos, are available, helping establish the medium and widen the net.”

Courtesy-GI.biz

Will Windows 10 On ARM Finally Materialize?

December 9, 2016 by  
Filed under Computing

Microsoft dropped a bomb on December 7th. At WinHEC it announced that next-generation Qualcomm Snapdragon processors will have full Windows 10 support. Yes, this time they will run every Windows x86 application, via an emulator.

It looks like 2017 will be a fun year. Qualcomm has all of a sudden got support for Windows 10 on its mobile computing devices, which will enable new anytime, anywhere connected device form factors. What Qualcomm and Microsoft are saying is that you can expect some tablet/notebook devices powered by SoCs that come from neither Intel nor AMD.

This will help the synergy between mobile devices and computers, and may well be the right way to deliver the Windows “continuum” idea.

The Windows 10 devices powered by Snapdragon are expected to support all aspects of Microsoft’s latest operating system, including Microsoft Office, the Microsoft Edge browser, Windows 10 gaming titles like Crysis 2 and World of Tanks, Windows Hello, and pen and touchscreen support. Qualcomm Snapdragon-powered devices are expected to support Universal Windows Platform (UWP) apps and Win32 apps through emulation, providing users with a wide selection of full-featured applications. There is no official compatibility list, but most things should work, if not all of them.

This is definitely better than Windows RT, Microsoft’s earlier attempt at Windows on ARM, a platform that simply confused the market because it would not run x86 applications. Now that problem is solved.
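
One practical question for developers is how an x86 application can tell whether it is running natively or under this emulation layer. A minimal sketch, assuming a recent Windows 10 SDK that exposes the IsWow64Process2 API (which arrived after this announcement), might look like this:

    #include <windows.h>
    #include <cstdio>

    int main() {
        // IsWow64Process2 reports the architecture of the calling process and
        // the native architecture of the machine. An x86 build running under
        // emulation on an ARM64 device sees an x86 process machine and an
        // ARM64 native machine.
        USHORT processMachine = IMAGE_FILE_MACHINE_UNKNOWN;
        USHORT nativeMachine = IMAGE_FILE_MACHINE_UNKNOWN;

        if (!IsWow64Process2(GetCurrentProcess(), &processMachine, &nativeMachine)) {
            std::printf("IsWow64Process2 failed: %lu\n", GetLastError());
            return 1;
        }

        if (nativeMachine == IMAGE_FILE_MACHINE_ARM64 &&
            processMachine != IMAGE_FILE_MACHINE_UNKNOWN) {
            std::printf("x86 code running under emulation on an ARM64 device\n");
        } else {
            std::printf("running natively\n");
        }
        return 0;
    }

Emulated code pays some translation overhead, so a check like this is mostly useful for telemetry, or for steering users toward a native build where one exists.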

Terry Myerson, executive vice president of the Windows and Devices Group at Microsoft, said:

“We are excited to bring Windows 10 to the ARM ecosystem with our partner, Qualcomm Technologies. We continue to look for ways to empower our customers to create wherever they are. Bringing Windows 10 to life with a range of thin, light, power-efficient and always-connected devices, powered by the Qualcomm Snapdragon platform, is the next step in delivering the innovations our customers love – touch, pen, Windows Hello, and more – anytime, anywhere.”

Cristiano Amon, executive vice president, Qualcomm Technologies, Inc. and president, QCT, said:

“Qualcomm Snapdragon processors offer some of the world’s most advanced mobile computing features, including Gigabit LTE connectivity, advanced multimedia support, machine learning and superior hardware security features, all while supporting thin, fan-less designs and long battery life. With full compatibility with the Windows 10 ecosystem, the Qualcomm Snapdragon platform is expected to support mobility to cloud computing and redefine how people will use their compute devices.”

The first devices running the full Windows 10 experience on Snapdragon processors are expected to be commercially available in the second half of 2017. From what we understand, this cooperation will not be limited to the Snapdragon 835, and it looks like all future chips might end up getting support for Windows 10. We will have to wait until the second half of next year to see which company will be the first to launch a device powered by Snapdragon.

It will be interesting to see how big a performance penalty there is for emulating applications written for x86 on the ARM architecture, as emulation always costs some performance. But Qualcomm and Microsoft would not have embarked on this venture if the results were not generally usable. This announcement also adds a lot of fuel to talk of a Snapdragon 835-powered Surface phone, or at least a Surface device, at some point.

We have a feeling that might come from Microsoft itself or from one of the big OEMs: think Dell, HP and Lenovo.

Courtesy-Fud

Does AMD Have A New Graphics Card En Route?

December 9, 2016 by  
Filed under Computing

A mystery AMD GPU has been spotted in the Ashes of the Singularity benchmark database, which is in line with previous rumors that AMD is preparing a new graphics card, possibly called the RX 490.

The Radeon RX 490 has already been spotted online a few times and there have been quite a few rumors that AMD is working on a new graphics card. According to the newest benchmark results, the graphics card, with device ID 687F:C18, performs very close to the Nvidia GeForce GTX 1080.

While it is currently anyone’s guess, the mystery graphics card spotted in these benchmarks could be a dual-GPU Polaris card, as the score is in line with what you would get from two Radeon RX 480 graphics cards in CrossFire. On the other hand, the Ashes of the Singularity benchmark normally detects multi-GPU configurations, and it did not report one in these results.

We have already managed to confirm that AMD’s Radeon Technologies Group should launch its new Vega GPU architecture, with a Vega 10 GPU, in the first half of 2017, with possible briefings later this month. Bear in mind that some sources suggest that Vega could launch as early as Q1 2017.

The AMD Vega 10 GPU is expected to hit 24 TFLOPs of half-precision and 12 TFLOPs of single-precision compute performance. It is expected to pack 4096 Stream Processors and come with 16GB of HBM2 memory.
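
Those two figures hang together with the quoted shader count. As a rough sanity check, assuming the usual two FLOPs per Stream Processor per clock for FP32 and packed FP16 math running at twice the FP32 rate (the clock below is derived from the rumored numbers, not an announced specification):

    #include <cstdio>

    int main() {
        // Peak FLOPs for a GCN-style GPU: 2 ops per Stream Processor per clock.
        const double stream_processors = 4096.0; // rumored Vega 10 shader count
        const double fp32_tflops = 12.0;         // rumored single-precision peak

        const double implied_clock_ghz =
            fp32_tflops * 1000.0 / (2.0 * stream_processors); // ~1.46 GHz
        const double fp16_tflops = 2.0 * fp32_tflops;         // packed math: 24

        std::printf("implied clock: ~%.2f GHz\n", implied_clock_ghz);
        std::printf("FP16 peak:     ~%.0f TFLOPs\n", fp16_tflops);
        return 0;
    }

In other words, 12 TFLOPs of FP32 from 4096 Stream Processors implies a clock of roughly 1.46GHz, and the 24 TFLOPs half-precision figure simply assumes double-rate FP16.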

The aforementioned benchmark result might easily be a sample of the Vega GPU, but that would be a big surprise. The results have since been pulled from the benchmark database, but Techpowerup.com managed to grab screenshots of all of them.

As you already know, AMD is hosting its big “New Horizon” event on December 13, where we expect it to preview its new Zen CPU architecture as well as new AM4 desktop motherboards, and hopefully preview, or at least mention, a new graphics card.

Courtesy-Fud

Is The PSVR Really Selling?

December 2, 2016 by  
Filed under Gaming

As the numbers from Black Friday and Thanksgiving weekend continue to trickle in, many analysts are examining how the holiday sales picture is coming together this year. While The NPD Group is not ready to give its full assessment just yet, the firm did note to GamesIndustry.biz that digital promotions on PlayStation Network and Xbox Live were much more aggressive this year and may have impacted the retail channel. Digital aside, the sector that seemed to struggle the most is virtual reality, according to SuperData, which said VR has been the “biggest loser.”

Thanks to “notably fewer units sold than expected due to a relatively fragmented title line-up and modest marketing effort,” VR headsets are now expected to sell even fewer units than previously thought. SuperData’s revised forecast for 2016 calls for under 750k PlayStation VR units sold (their previous estimate was 2.6 million), with Google’s Daydream selling just 261k (down from 450k). Previous estimates for HTC Vive, Oculus Rift and Gear VR remain unchanged at 450k, 355k and 2.3 million, respectively.

As you can see, expectations for PSVR have seen the most dramatic shift. Stephanie Llamas, director of research and insights at SuperData, explained to us, “PSVR had the best opportunity to benefit from the holidays but their supply inconsistencies and lack of marketing have put them behind their potential. They did not offer any first-party deals this weekend, restock bundles or market the device, pushing instead for the PS4 Pro. They have also pointed out that VR looks even better on a Pro than a standard or slim PS4, so the message to most gamers is: Get the Pro now, then the PSVR later. As a result, we won’t see them break 1M shipments until well into the new year.”

Llamas added that Sony may be deliberately limiting PSVR supply until it can do a better job with supporting the platform. “Had Sony pushed the PSVR the way they’ve been pushing their other new hardware, the demand would have certainly fulfilled a supply of over 2 million. However, given its quiet release it’s clear they’re being cautious before fully investing in the tech. Without the ‘killer app’ and the slow, steady release of AAA content, they will release less than 1 million devices until they have content they feel confident will bring in the praise they want. They can afford to take it slow since they have no competition for now, so their supply and sales will rise steadily into 2017 as opposed to riding the seasonal wave,” she said.

As for Oculus, Llamas believes they’ve taken a risk by possibly splitting their own user base. “The Rift’s Touch controllers are an opportunity for Oculus to penetrate, but not many headsets have moved, especially with their round-about deal where purchasers earned $100 Oculus credit rather than just getting $100 off. Oculus’s hardware release strategy has also slowed them down and split their user base, so developers are having to make some choices around whether they should develop for both Touch and non-Touch users. This means development has slowed and is becoming another barrier to growth,” she remarked.

Looking at the non-VR games market, Nintendo may actually prove to be the biggest winner, thanks both to updates to Pokémon GO and to its NES Mini selling out. “On mobile we recorded a spike in earnings as players made the most of the Thanksgiving special for Pokémon GO. The game’s ability to stay in the forefront of people’s minds as we approach the release date for Super Mario Run may prove beneficial for Nintendo, which has yet to make a convincing claim on the $38 billion mobile games market,” said Joost van Dreunen.

Overall digital game sales this holiday are down 2% from 2015 so far, but the impact of digital has grown tremendously in just a few years. “In 2012 full game downloads accounted for only 6% of total unit sales around the Thanksgiving holiday in the United States. For 2016E that number was four times higher at 24%,” van Dreunen said.

The other big contributor to the slow holiday start has been heavier discounting, according to Wedbush Securities’ Michael Pachter. “We saw greater discounting of high-profile new video games this Black Friday compared to last year. Last year’s top sellers, Activision Blizzard’s Call of Duty: Black Ops III, Bethesda Softworks’ Fallout 4, and EA’s Star Wars Battlefront, saw sticky pricing on Black Friday, with the $60 price point remaining largely intact. While discounting of sports games happens each year, many other titles that maintain pricing on Black Friday were listed at discounts of 40% or more this weekend,” he observed.

“For example, Walmart had EA’s Battlefield 1 and Titanfall 2 at $27, and Microsoft’s Gears of War 4 and Take-Two’s Mafia III at $35. Walmart also had Activision Blizzard’s Call of Duty: Infinite Warfare Legacy Edition, which includes Modern Warfare Remastered, for $57, a $23 discount. Discounting of Call of Duty: Infinite Warfare began earlier in the week, with widespread discounts of roughly $20 for the different versions of the game. Hardware discounting for the PS4 and Xbox One was largely consistent with 2015, as $50 discounts were commonplace.”

Pachter also agreed that the “pace of the mix shift to digital full game downloads continues to be brisk,” but we probably won’t know whether digital sales fully made up for retail declines until we get the complete NPD report for 2016 sometime in January.

Courtesy-GI.biz
