
Is AMD Out Of The Woods?

May 9, 2017 by  
Filed under Computing

AMD reported an 18.3 percent jump in quarterly revenue but the chipmaker’s second-quarter gross margins forecast raised some concerns.

AMD said it expected adjusted gross margins to be about 33 percent in the current quarter, compared with 34 percent in the first quarter.

But the cocaine nose jobs of Wall Street were not impressed. Chipmakers who want to be profitable trade on gross margin, and there is concern that AMD’s is too low to make much dosh.

AMD launched a few of its Ryzen range of desktop processors in the first quarter and plans to unveil its Naples chips targeting the server market in the second quarter.

The Ryzen chips helped boost the company’s revenue in the first quarter ended April 1.

Chief Executive Officer Lisa Su said that “all of the feedback that we’ve gotten so far from both our customers and from end-users has been very strong.”

However, total revenue was weighed down by the business that supplies graphics processors used in gaming consoles such as the Xbox One and the PlayStation 4.

Revenue in the business rose 5 percent to $391 million, but came in below analysts’ average estimate of $442.1 million, according to financial data and analytics firm FactSet.

AMD also forecast low double-digit percentage revenue growth for the full year.

Revenue rose to $984 million in the first quarter, from $832 million a year earlier. AMD’s net loss narrowed to $73 million from $109 million.
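The reported growth and margin numbers can be reproduced with a few lines of arithmetic. A quick sketch (the ~$9.8 million figure for one point of margin is an illustration derived from the reported revenue, not an AMD disclosure):

```python
# Quarterly revenue figures as reported (USD millions).
rev_q1_2017 = 984
rev_q1_2016 = 832

# Year-over-year growth: (984 - 832) / 832, i.e. about 18.3 percent.
growth_pct = round((rev_q1_2017 - rev_q1_2016) / rev_q1_2016 * 100, 1)

# What one percentage point of gross margin is worth at this revenue level:
# gross profit = revenue * gross margin, so the forecast drop from 34% to 33%
# costs roughly 1% of revenue in gross profit.
margin_point_usd_m = round(rev_q1_2017 * 0.01, 1)

print(growth_pct)         # 18.3
print(margin_point_usd_m) # 9.8
```

This is why Wall Street watches the margin line so closely: at nearly a billion dollars of quarterly revenue, a single point of gross margin is close to ten million dollars of gross profit.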

Courtesy-Fud

Nvidia Shows Off GameWorks Technology

May 1, 2017 by  
Filed under Gaming

Nvidia has revealed a few more details about its GameWorks Flow technology, which should provide realistic combustible fluid, fire and smoke simulation.

Following in the footsteps of Nvidia Turbulence and FlameWorks technologies, the new GameWorks Flow library provides both DirectX 11 and DirectX 12 implementations and can run on any recent DirectX 11- and DirectX 12-capable GPUs.

GameWorks Flow uses an adaptive sparse voxel grid, which should provide maximum flexibility with minimal memory impact. It is also optimized for Volume Tiled Resources, which allows volume textures to be used as three-dimensional tiled resources.

Nvidia has released a neat simulation video of the GameWorks Flow implementation in DirectX 12, showing fire and the combustion process. The adaptive sparse voxel grid is used both for the fire itself and to compute self-shadowing on the smoke, increasing realism and visual impact.

Hopefully, game developers will manage to implement Nvidia’s GameWorks Flow without a significant impact on performance.

Courtesy-Fud

Are Motherboard Shipments Decreasing?

May 1, 2017 by  
Filed under Computing

With the global decline in PC shipments finally showing signs of slowing, motherboard vendors still expect overall volumes to fall in 2017, with some estimates putting the drop at nearly 10 percent from last year.

Last month, a market research report from Global Information Inc showed the global volume of motherboard shipments in Q4 2016 dropping 5.2 percent from Q3 and 13.6 percent year-over-year. Total shipments for 2016 were estimated at less than 50 million units, in line with forecasts made at the beginning of the year. As the fourth quarter approached, vendors said that sales of Kaby Lake motherboards were not living up to expectations, while the overall market remained in a state of weak demand. The report covered vendors including AMD, ECS, Foxconn, Gigabyte, Intel, Jetway, Microstar, Pegatron, QCI, T&I, and Wistron.

Notebooks, exchange rates and component shortages to blame

According to the latest report, three problems are limiting motherboard vendors’ ability to grow sales. First, sources within the motherboard industry have pointed out that notebooks have gradually taken market share from the build-it-yourself PC market, mainly as a result of “better specifications, smaller form factors, and cheaper prices”. Second, vendors have been hit by a large exchange-rate swing over the past two years, with the rate moving from 6.2 in April 2015 to 6.8 in April 2017. Finally, rising component prices and various component shortages have also made production operations more difficult. So in order to remain profitable, some vendors have focused on reducing shipments and shifting their focus to other product segments, including gaming notebooks and mobile devices.

Sources within the industry note that even though Intel’s Kaby Lake processor lineup and 200-series chipsets have not sold in the volumes anticipated, the imminent threat of AMD’s Ryzen 5 and 7 lineups appears to have stimulated price cuts across the board to prop up platform sales. Many retailers have now begun offering more serious discounts on processors bundled with compatible motherboards, and this trend is expected to continue with the release of AMD’s Ryzen 3 and Intel’s 300-series and X299 chipsets later this year.

Courtesy-Fud

Will ARM On Windows Take Off This Year?

April 28, 2017 by  
Filed under Computing

Qualcomm has dropped a huge hint that we will see ARM based PCs in the shops in the fourth quarter.

Qualcomm said the first cellular laptop with Windows 10 and its ARM-based Snapdragon 835 will come by the end of the year.

Steve Mollenkopf, CEO of Qualcomm, said that the Snapdragon 835 will expand into mobile PC designs running Windows 10, scheduled to launch in the fourth quarter.

Apparently Qualcomm and Microsoft are working flat out to get ARM-based Windows 10 PCs working. If they pull it off, you should get a thin-and-light device that can be used as a tablet or a laptop.

Most of the design cues will come from smartphones and it is being dubbed a cellular PC by Qualcomm and Microsoft.

The device will always be connected to a cellular network with a high-speed modem, much like a smartphone. It will have other wireless connectivity features such as Bluetooth 5 and possibly WiGig, which are integrated into the Snapdragon 835 chipset.

The cellular PC could also have long battery life, considering the Snapdragon 835 was designed for smartphones. It will be 4K video capable thanks to the Snapdragon 835’s powerful Adreno 540 GPU.

So far no major PC maker has announced an ARM-based Windows PC, and we are not expecting a flood of the beasts. Suppliers will be cautious because ARM-based Windows PCs have not fared well before; Windows RT tablets were somewhat mocked.

Dell and HP have expressed interest in cellular PCs but need time to test them. HP wants to see if there’s enough demand for such a device before making a decision.

Microsoft has demonstrated Photoshop running on Snapdragon 835 but it is not clear how much other software will be out there.

Courtesy-Fud

The Witcher Franchise Goes 25 Million Units

April 12, 2017 by  
Filed under Gaming

The Witcher 3: Wild Hunt continues to pay off for CD Projekt. The Polish publisher today reported its financial results for calendar year 2016, and the hit 2015 role-playing game loomed large over another successful campaign for the company.

CD Projekt said its revenues “continued to be dominated by ongoing strong sales” of The Witcher 3 and its two expansions. While the base game and its first expansion debuted in 2015, the second and final expansion pack, Blood and Wine, arrived last May and helped drive revenues of 583.9 million PLN ($148.37 million). That was down almost 27 percent year-over-year, but still well beyond the company’s sales figures prior to 2015. Net profits were likewise down almost 27%, with the company posting a bottom line gain of 250.5 million PLN ($63.65 million).

The company also announced a new milestone for the Witcher franchise, saying the three games have now cumulatively topped 25 million copies sold, a number that doesn’t include The Witcher 3 expansion packs. That suggests 2016 saw roughly 5 million copies sold over the 20 million reported in CD Projekt’s 2015 year-end financials.
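The PLN/USD conversions in the report are internally consistent; dividing each PLN figure by its USD counterpart recovers the same implied exchange rate. A quick check on the numbers as printed (the implied rate is derived from the article, not an official quote):

```python
# Figures as reported: (PLN millions, USD millions).
reported = {
    "revenue":     (583.9, 148.37),
    "net_profit":  (250.5, 63.65),
    "gog_revenue": (133.5, 33.92),
}

# Each pair implies roughly the same rate of about 3.94 PLN per USD,
# so the conversions were done consistently.
implied_rates = {name: pln / usd for name, (pln, usd) in reported.items()}
for name, rate in implied_rates.items():
    print(f"{name}: {rate:.3f} PLN/USD")

# The franchise milestone also follows from the two reported totals:
copies_2016_m = 25 - 20  # roughly 5 million copies sold during 2016
```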

Even though 2016 saw overall sales dip for CD Projekt, its GOG.com online retail storefront still managed to post its best year ever. The company reported GOG.com revenues of 133.5 million PLN ($33.92 million), up 15% year-over-year.

CD Projekt is currently testing its Gwent free-to-play card game in closed beta, and intends to open it up to the public this spring. It is also working on its next AAA game, Cyberpunk 2077, though it has no release date as yet.

Courtesy-GI.biz

Can Violence In A Game Promote Safety?

March 30, 2017 by  
Filed under Gaming

When the original Doom was released in 1993, its unprecedentedly realistic graphic violence fueled a moral panic among parents and educators. Over time, the game’s sprite-based gore has lost a bit of its impact, and that previous sentence likely sounds absurd.

Given what games have depicted in the nearly quarter century since Doom, that level of violence is no longer shocking so much as quaint, perhaps even endearing. So when it came time for id Software to reboot the series with last year’s critically acclaimed remake of Doom, one of the things the studio had to consider was exactly how violent it should be, and to what end.

Speaking with GamesIndustry.biz at the Game Developers Conference last month, the Doom reboot’s executive producer and game director Marty Stratton and creative director Hugo Martin acknowledged that the context of the first Doom’s violence had changed greatly over the years. And while the original’s violence may have been seen as horrific and shocking, they wanted the reboot to skew closer to cartoonishly entertaining or, as they put it, less Saw and more Evil Dead 2.

“We were going for smiles, not shrieks,” Martin said, adding, “What we found with violence is that more actually makes it safer, I guess, or just more acceptable. It pushes it more into the fun zone. Because if it’s a slow trickle of blood out of a slit wrist, that’s Saw. That’s a little bit unsettling, and sort of a different type of horror. If it’s a comical fountain of Hawaiian Punch-looking blood out of someone’s head that you just shot off, that’s comic book. That’s cartoonish, and that’s what we wanted.”

“They’re demons,” Stratton said. “We don’t kill a single human in all of Doom. No cursing, no nudity. No killing of humans. We’re actually a pretty tame game when you think about it. I’ve played a lot of games where you just slaughter massive amounts of human beings. I think if we had to make some of the decisions we make about violence and the animations we do and if we were doing them to humans, we would have completely different attitudes when we go into those discussions. It’s fun to sit down in a meeting and think about all the ways it would be cool to rip apart a pinky demon or an imp. But if we had the same discussions about, ‘How am I going to rip this person in half?’ or rip his arm off and beat him over the head with it, it takes on a different connotation that I don’t know would be as fun.”

That balancing act between horror and comedy paid off for the reboot, but it was by no means the only line last year’s Doom had to straddle. There was also the question of what a modern Doom game would look like. The first two Doom games were fast-paced shooters, while the third was a much slower horror-tinged game where players had to choose between holding a gun or a flashlight at the ready. Neither really fit into the recent mold of AAA shooters, and the developers knew different people would have very different expectations for a Doom game in 2016.

As Stratton explained, “At that point, we went to, ‘What do we want? What do we think a Doom game should be moving forward?’ As much as we always consider how the audience is going to react to the game–what they’re thinking, and what we think they want–back in the very beginning, it was, ‘What do we think Doom should be, and what elements of the game do we want to build the future of Doom on?’ And that’s really where we came back to Doom 1, Doom II, the action, the tone, the attitude, the personality, the character, the irreverence of it… those were all key words that we threw up on the board in those early days. And then mechanically, it was about the speed. It was about unbelievable guns, crazy demons, really being very honest about the fact that it was Doom. It was unapologetic early on, and we built from there.”

It helped that they had a recent example of how not to bring Doom into the current generation. Prior to the Doom reboot, id Software had been working on Doom 4, which Stratton said was a good game, but just didn’t feel like Doom. For one, it cast players as a member of a resistance army rather than a one-marine wrecking crew. It was also slower from a gameplay perspective, utilizing a cover-based system shared by numerous modern shooters designed to make the player feel vulnerable.

“None of us thought that the word ‘vulnerable’ belonged in a proper Doom game,” Martin said. “You should be the scariest thing in the level.”

Doom 4 wasn’t a complete write-off, however. The reboot’s glory kill system of over-the-top executions actually grew out of a Doom 4 feature, although Stratton said they made it “faster and snappier.”

Of course, not everything worked as well. At one point the team tried giving players a voice in their ears to help guide them through the game, a pretty standard first-person shooter device along the lines of Halo’s Cortana. Stratton said while the device works well for other franchises, it just didn’t feel right for Doom, so it was quickly scrapped.

“We didn’t force anything,” Stratton said. “If something didn’t feel like Doom, we got rid of it and tried something that would feel like Doom.”

That approach paid off well for the game’s single-player mode, but Stratton and Martin suggested they weren’t quite as thrilled with multiplayer. Both are proud of the multiplayer (which continues to be worked on) and confident they delivered a high quality experience with it, but they each had their misgivings about it. Stratton said if he could change one thing, it would have been to re-do the multiplayer progression system and give more enticing or better placed “hooks” to keep players coming back for game after game. Martin wished the team had messaged what the multiplayer would be a little more clearly, saying too many expected a pure arena shooter along the lines of Quake 3 Arena, when that was never the development team’s intent.

Those issues aside, it’s clear the pair feel the new wrinkles and changes they made to the classic Doom formula paid off more often than not.

“Lots worked,” Stratton said. “That’s probably the biggest point of pride for us. The game really connected with people. We always said we wanted to make something that was familiar to long-time fans, felt like Doom from a gameplay perspective and from a style and tone and attitude perspective. And I think we really accomplished that at a high level. And I think we made some new fans, which is always what you’re trying to do when you have a game that’s only had a few releases over the course of 25 years… You’re looking to bring new people into the genre, or into the brand, and I think we did that.”

Courtesy-GI.biz

Will AMD’s Polaris Based RX 500 Launch April 18th?

March 27, 2017 by  
Filed under Computing

According to reports, the upcoming AMD Radeon RX 500 series, which should be based on Polaris GPUs, could be slightly delayed, with the new launch date set for April 18th.

While earlier information suggested that the Polaris 10-based Radeon RX 570/580 should be coming on April 4th, with the Polaris 11-based RX 550/560 refresh coming a week later on April 11th, a new report from Chinese site Mydrivers.com, spotted by eTeknix.com, suggests that the launch date has been pushed back to April 18th.

As we’ve written before, the new Radeon RX 500 series will be based on the existing AMD Polaris GPU architecture but should have somewhat higher clocks and improved performance-per-watt, while the flagship Vega-based Radeon RX Vega should come at a later date, most likely at the Computex 2017 show starting on May 30th.

Unfortunately, precise details regarding the upcoming Radeon RX 500 series are still unknown, but hopefully the performance and clock improvements will allow AMD to compete with Nvidia’s mainstream lineup.

Courtesy-Fud

AMD’s Vega Benchmarks Continue To Leak

March 23, 2017 by  
Filed under Computing

An alleged SiSoft benchmark leak spotted recently reveals a bit more information about what appears to be at least one version of the upcoming AMD Vega GPU.

According to the data provided by the benchmark result, which was originally spotted by Videocardz.com, the GPU features 64 CUs for a total of 4096 Stream Processors and comes with 8GB of VRAM on a 2048-bit memory interface (two HBM2 stacks, each with 4GB and a 1024-bit memory interface).
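Those numbers hang together: on AMD’s GCN architecture each compute unit (CU) contains 64 stream processors, and each HBM2 stack brings a 1024-bit interface, so the leaked configuration can be derived directly. A sketch using publicly known GCN/HBM2 parameters (the 4GB-per-stack figure is taken from the leak itself):

```python
SP_PER_CU = 64          # stream processors per GCN compute unit
HBM2_STACK_BUS = 1024   # bits of memory interface per HBM2 stack
HBM2_STACK_GB = 4       # capacity per stack in this leaked configuration

def vega_config(cus: int, stacks: int) -> dict:
    """Derive the leaked Vega specs from CU count and HBM2 stack count."""
    return {
        "stream_processors": cus * SP_PER_CU,
        "memory_bus_bits": stacks * HBM2_STACK_BUS,
        "vram_gb": stacks * HBM2_STACK_GB,
    }

print(vega_config(64, 2))
# {'stream_processors': 4096, 'memory_bus_bits': 2048, 'vram_gb': 8}
```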

Despite the obviously wrong 344MHz GPU clock, the results are quite impressive, outperforming the Geforce GTX 1080 in the same benchmark. Of course, these are only compute results, probably obtained on an alpha driver, so it is still too early to talk about real-world performance, but they at least give a general idea of the GPU.

Earlier rumors suggested that there will be at least two versions of the Vega GPU and it is still not clear if this is the slower or the faster one.

As confirmed by AMD earlier, its Radeon RX Vega graphics cards should be coming in the first half of this year, with the most likely launch at Computex 2017 show which opens its doors on May 30th.

Courtesy-Fud

Windows On ARM Ready To Debut

March 20, 2017 by  
Filed under Computing

After months of rumors, Windows is finally fully functional on ARM based chips and the Wintel alliance is in tatters.

Microsoft officials told Bloomberg that the company is committed to using ARM chips in machines running its cloud services.

Microsoft will use the ARM chips in a cloud server design that its officials will detail at the US Open Compute Project Summit today. Microsoft has been working with both Qualcomm and Cavium on the version of Windows Server for ARM.

Microsoft joined the Open Compute Project (OCP) in 2014, and is a founding member of and contributor to the organisation’s Switch Abstraction Interface (SAI) project.

The OCP publishes open hardware designs intended to be used to build cheaper datacentres. The OCP has released specs for motherboards, chipsets, cabling, and common sockets, connectors, and open networking and switches.

Vole’s cloud server specification is a 12U shared server chassis capable of housing 24 1U servers. Microsoft is also releasing its Chassis Manager under the open-source Apache license.

Project Olympus is the codename for Vole’s next-generation cloud hardware design, which it contributed to the OCP last autumn.

Vole is also expected to use ARM processors in its Olympus systems, which should be headed to its data centres by Christmas.

The winner appears to be Qualcomm, which says it is working on a variety of cloud workloads to run on the Microsoft Azure cloud platform powered by Qualcomm Centriq 2400 server solutions.

Qualcomm said it had been working with Vole for several years on ARM-based server enablement and has onsite engineering at Microsoft to collaboratively optimise a version of Windows Server, for Microsoft’s internal use in its data centres, on Qualcomm Centriq 2400-based systems.

There’s no word from Microsoft on when it will begin offering Windows Server on ARM to external customers or partners, but that is only a matter of time. With lower power consumption on offer, the case for Intel in the server room weakens, and if ARM designs become established with Microsoft’s blessing, it is unlikely that anyone will want Intel there.

Courtesy-Fud

Is AMD Rebranding The Radeon Series Next Month?

March 10, 2017 by  
Filed under Computing

According to the newest report, AMD’s upcoming RX 500 series will be made up entirely of RX 400 series rebrands based on Polaris 10 and Polaris 11, plus a possible new Polaris 12 GPU, while the Radeon RX Vega won’t launch before late May.

According to a report by Martin Fischer of Heise.de, which matches what we have been hearing and have been able to confirm with our sources, AMD is planning to launch its Radeon RX 500 series in early April. While this would usually sound interesting, the entire lineup will be comprised of RX 400 series rebrands, albeit with slightly higher GPU clocks.

According to the report, the Polaris 10-based RX 580 and RX 570 should launch on April 4th, while the Radeon RX 560 and RX 550, based on Polaris 11, should launch on April 11th. According to a similar report from Videocardz.com, the Radeon RX 550 could be based on a new Polaris 12 chip, while the rebranded Radeon RX 560 could get the fully-enabled Polaris 11 GPU with 1024 Stream Processors, similar to the RX 460 variant seen in China.

The big question is the newly named Radeon RX Vega, which, according to AMD, should be available in the first half of this year, making a launch at the Computex 2017 show, which starts on May 30th, most likely. It appears that the Nvidia GTX 1080 Ti will have to wait for its competitor, as simple rebrands just won’t be enough.

Courtesy-Fud

Are Low Profile Radeon RX 460 Forthcoming?

February 23, 2017 by  
Filed under Computing

MSI has unveiled yet another HTPC-friendly graphics card, a low-profile Radeon RX 460 that comes in both 2GB and 4GB versions.

Featuring a dual-slot, low-profile, dual-fan cooler and a low-profile PCB to match, both the MSI RX 460 2GT LP and 4GT LP graphics cards will work at the reference 1090MHz GPU base and 1200MHz Boost clocks, with GDDR5 memory at 1750MHz on a 128-bit memory interface.
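From those memory numbers the card’s peak bandwidth follows directly. Assuming the quoted 1750MHz is the GDDR5 base clock (GDDR5 transfers four bits per pin per base clock cycle, so 1750MHz works out to an effective 7Gbps per pin), a quick calculation gives:

```python
def gddr5_bandwidth_gbs(base_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s for GDDR5.

    GDDR5 moves 4 bits per pin per base clock cycle, so the effective
    per-pin data rate in Gbps is 4x the base clock in GHz.
    """
    effective_gbps_per_pin = base_clock_mhz * 4 / 1000  # 1750 MHz -> 7 Gbps
    return effective_gbps_per_pin * bus_width_bits / 8  # bits -> bytes

print(gddr5_bandwidth_gbs(1750, 128))  # 112.0 GB/s
```

That 112 GB/s is in line with the reference RX 460, as expected for a card running reference memory clocks.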

It also comes with one DVI and one HDMI display output.

In case you missed it, the Radeon RX 460 is based on AMD’s Polaris 11 GPU with 896 Stream Processors, 48 TMUs and 16 ROPs and should pack enough punch for a decent casual gaming experience.

Unfortunately, pricing and availability have not been revealed, but we are sure these two will appear in retail/e-tail soon at around US $100/€100.

Courtesy-Fud

Why Are The NPD Games Sales Kept Private?

February 22, 2017 by  
Filed under Gaming

When I first began my career in the games industry I wrote a story about an impending digital download chart.

It was February 2008 and Dorian Bloch – who was leader of UK physical games data business Chart-Track at the time – vowed to have a download Top 50 by Christmas.

The chart never materialised, though it wasn’t for want of trying. Digital retailers, including Steam, refused to share the figures and insisted it was down to the individual publishers and developers to do the sharing (in contrast to the retail space, where the stores are the ones that do the sharing). This led to an initiative in the UK where trade body UKIE began using its relationships with publishers to pull together a chart. However, after some initial success, the project ultimately fell away once the sheer scale of the work involved became apparent.

Last year in the US, NPD managed to get a similar project going and thus far offers the only public chart that combines physical and digital data from accurate sources. However, although many big publishers are contributing to the figures, there remain some notable absentees and a lack of smaller developers and publishers.

In Europe, ISFE is just ramping up its own project and has even begun trialling charts in some territories (behind closed doors); however, it currently lacks physical retail data in most major markets. This overall lack of information has seen a rise in the number of firms trying to plug the hole in our digital data knowledge. Steam Spy uses a Web API to gather data from Steam user profiles to track download numbers – a job it does fairly accurately (albeit not all of the time).
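The statistical idea behind Steam Spy is straightforward: sample public profiles at random, count how many own a given game, and scale up to the whole population with a margin of error. A toy sketch of that estimator, using simulated data (the population size, ownership rate, and sample are invented for illustration; this is not Steam’s actual API):

```python
import random

def estimate_owners(sample_ownership: list, population: int) -> tuple:
    """Estimate total owners from a random sample of public profiles.

    sample_ownership: booleans, True if the sampled profile owns the game.
    population: total number of profiles the sample was drawn from.
    Returns (point_estimate, 95% interval half-width), in profile counts.
    """
    n = len(sample_ownership)
    p = sum(sample_ownership) / n
    # Normal-approximation 95% confidence interval for a proportion.
    half_width = 1.96 * (p * (1 - p) / n) ** 0.5
    return p * population, half_width * population

# Simulated "Steam": 10 million profiles, 2% of which own the game.
random.seed(42)
POPULATION = 10_000_000
TRUE_RATE = 0.02
sample = [random.random() < TRUE_RATE for _ in range(100_000)]

estimate, margin = estimate_owners(sample, POPULATION)
print(f"estimated owners: {estimate:,.0f} ± {margin:,.0f}")
```

With a 100,000-profile sample, the estimate lands within about ±9,000 of the true 200,000 owners, which is why Steam Spy can be "fairly accurate" without seeing anywhere near every profile.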

SuperData takes point-of-sale and transaction information from payment service providers, plus some publishers and developers, which means it can track actual spend. It’s strong on console, but again, it’s not 100% accurate. The mobile space has a strong player in App Annie collecting data, although developers in the space find the cost of accessing this information high.

It feels unusual to be having this conversation in 2017. In a market that is now predominantly digital, the fact we have no accurate way of measuring our industry seems absurd. Film has almost daily updates of box office takings, the music market even tracks streams and radio plays… we don’t even know how many people downloaded Overwatch, or where Stardew Valley would have charted. So what is taking so long?

“It took a tremendous amount of time and effort from both the publisher and NPD sides to make digital sales data begin to flow,” says Mat Piscatella, NPD’s US games industry analyst. NPD’s monthly digital chart is the furthest the industry has come to accurate market data in the download space.

“It certainly wasn’t like flipping a switch. Entirely new processes were necessary on both sides – publishers and within NPD. New ways of thinking about sales data had to be derived. And at the publishers, efforts had to be made to identify the investments that would be required in order to participate. And of course, most crucially, getting those investments approved. We all had to learn together, publishers, NPD, EEDAR and others, in ways that met the wants and needs of everyone participating.

“Over time, most of the largest third-party publishers joined the digital panel. It has been a remarkable series of events that have gotten us to where we are today. It hasn’t always been smooth; and keep in mind, at the time the digital initiative began, digital sales were often a very small piece of the business, and one that was often not being actively managed. Back then, publishers may have been letting someone in a first-party operation, or brand marketing role post the box art to the game on the Sony, Microsoft and Steam storefronts, and that would be that. Pricing wouldn’t be actively managed, sales might be looked at every month or quarter, but this information certainly was not being looked at like packaged sales were. The digital business was a smaller, incremental piece of the pie then. Now, of course, that’s certainly changed, and continues to change.”

SuperData’s Joost van Dreunen sees several reasons for publishers’ historical reluctance: “For one, the majors are publicly traded firms, which means that any shared data presents a financial liability. Across the board the big publishers have historically sought to protect the sanctity of their internal operations because of the long development cycles and high capital risks involved in AAA game publishing. But, to be honest, it’s only been a few years that especially legacy publishers have started to aggregate and apply digital data, which means that their internal reporting still tends to be relatively underdeveloped. Many of them are only now building the necessary teams and infrastructure around business intelligence.”

Indeed, both SuperData and NPD believe that progress – as slow as it may be – has been happening. And although some publishers are still holding out or refusing to get involved, that resolve is weakening over time. “For us, it’s about proving the value of participation to those publishers that are choosing not to participate at this time,” Piscatella says. “And that can be a challenge for a few reasons. First, some publishers may believe that the data available today is not directly actionable or meaningful to its business. The publisher may offer products that have dominant share in a particular niche, for example, which competitive data as it stands today would not help them improve.

“Second, some publishers may believe that they have some ‘secret sauce’ that sharing digital sales data would expose, and they don’t want to lose that perceived competitive advantage. Third, resources are almost always stretched thin, requiring prioritisation of business initiatives. For the most part, publishers have not expanded their sales planning departments to keep pace with all of the overwhelming amount of new information and data sources that are now available. There simply may not be the people power to effectively participate, forcing some publishers to pass on participating, at least for now.

“So I would certainly not classify this situation as companies ‘holding out’ as you say. It’s that some companies have not yet been convinced that sharing such information is beneficial enough to overcome the business challenges involved. Conceptually, the sharing of such information seems very easy. In reality, participating in an initiative like this takes time, money, energy and trust. I’m encouraged and very happy so much progress has been made with participating publishers, and a tremendous amount of energy is being applied to prove that value to those publishers that are currently not participating.”

NPD’s achievement is significant because it has managed to convince a good number of bigger publishers, and those with particularly successful IP, to share figures. This has long been seen as a stumbling block, because for those companies performing particularly well, the urge to share data is reduced. I’ve heard countless comments from sales directors who have said that ‘sharing download numbers would just encourage more competitors to try what we’re doing.’ It’s why van Dreunen has noted that “as soon as game companies start to do well, they cease the sharing of their data.”

Indeed, it is often fledgling companies, and indie studios, that need this data more than most. It’s part of the reason behind the rise of Steam Spy, which prides itself on helping smaller outfits.

“I’ve heard many stories about indie teams getting financed because they managed to present market research based on Steam Spy data,” boasts Sergey Galyonkin, the man behind Steam Spy. “Just this week I talked to a team that got funded by Medienboard Berlin-Brandenburg based on this. Before Steam Spy it was harder to do a proper market research for people like them.

“Big players know these numbers already and would gain nothing from sharing them with everyone else. Small developers have no access to paid research to publish anything.

“Overall I’d say Steam Spy helped to move the discussion into a more data-based realm and that’s a good thing in my opinion.”

The games industry may be unusually backward when it comes to sharing its digital data, but there are signs of a growing willingness to be more open. A combination of trade body and media pressure has convinced some larger publishers to give it a go. Furthermore, publishers are starting to feel obligated to share figures anyway, especially when the likes of SuperData and Steam Spy are putting out information whether publishers want them to or not.

Indeed, although the chart Dorian promised me nine years ago is still AWOL, there are at least some figures out there today that give us a sense of how things are performing.

“When we first started SuperData six years ago there was exactly zero digital data available,” van Dreunen notes. “Today we track the monthly spending of 78 million digital gamers across platforms, in spite of heavy competition and the reluctance from publishers to share. Creating transparency around digital data is merely a matter of market maturity and executive leadership, and many of our customers and partners have started to realize that.”

He continues: “The current inertia comes from middle management that fears new revenue models and industry changes. So we are trying to overcome a mindset rather than a data problem. It is a slow process of winning the confidence and trust of key players, one at a time. We’ve managed to broker partnerships with key industry associations, partner with firms like GfK in Europe and Kadokawa Dwango in Japan to offer a complete market picture, and win the trust of big publishers. As we all move into the next era of interactive entertainment, the need for market information will only increase, and those that have shown themselves willing to collaborate and take a chance are simply better prepared for the future.”

NPD’s Piscatella concludes: “The one thing I’m most proud of, and impressed by, is the willingness of the participating publishers in our panel to work through issues as they’ve come up. We have a dedicated, positive group of companies working together to get this information flowing. Moving forward, it’s all about helping those publishers that aren’t participating understand how they can benefit through the sharing of digital consumer sales information, and in making that decision to say ‘yes’ as easy as possible.

“Digital selling channels are growing quickly. Digital sales are becoming a bigger piece of the pie across the traditional gaming market. I fully expect participation from the publishing community to continue to grow.”

Courtesy-GI.biz

Will Politics Bring Down The Gaming Industry?

February 20, 2017 by  
Filed under Gaming

If you’re someone who makes a living from videogames – as most readers of this site are – then political developments around the world at the moment should deeply concern you. I’m sure, of course, that a great many of you are concerned about things ranging from President Trump’s Muslim travel ban to the UK Parliament’s vote for “Hard Brexit” or the looming elections in Holland and France simply on the basis of being politically aware and engaged. However, there’s a much more practical and direct way in which these developments and the direction of travel which they imply will impact upon us. Regardless of personal ideology or beliefs, there’s no denying that the environment that seems to be forming is one that’s bad for the medium, bad for the industry, and will ultimately be bad for the incomes and job security of everyone who works in this sector.

Video games thrive in broadly the same conditions as any other artistic or creative medium, and those conditions are well known and largely undisputed. Creative mediums benefit from diversity; a wide range of voices, views and backgrounds being represented within a creative industry feeds directly into a diversity of creative output, which in turn allows an industry to grow by addressing new groups of consumers. Moreover, creative mediums benefit from economic stability, because when people’s incomes are low or uncertain, entertainment purchases are often among the first to fall.

Once upon a time, games had such strong underlying growth that they were “recession proof,” but this is no longer the case. Indeed, it was never entirely an accurate reading anyway, since broader recessions undoubtedly did slow down – though not reverse – the industry’s growth. Finally, as a consequence of the industry’s broad demographic reach, expansion overseas is now the industry’s best path to future growth, and that demands continued economic progress in the developing world to open up new markets for game hardware and software.

What is now happening on a global basis threatens all of those conditions, and therefore poses a major commercial threat to the games business. That threat must be taken especially seriously given that many companies and creators are already struggling with the enormous challenges that have been thrown up by the messy and uneven transition towards smart devices, and the increasing need to find new revenue streams to support AAA titles whose audience has remained largely unchanged even as development budgets have risen. Even if the global economic system looked stable and conditions were ideal for creative industries, this would be a tough time for games; the prospect of restrictions on trade and hiring, and the likelihood of yet another deep global recession and a slow-down in the advances being made by developing economies, make this situation outright hazardous.

Consider the UK development industry. For well over a decade, if you asked just about any senior figure in the UK industry what the most pressing problem they faced was, they’d give you the same answer: skills shortages. Hiring talented staff is tough in any industry, but game development demands highly skilled people from across a range of fields, and assembling that kind of talent isn’t cheap or easy – even when you have access to the entire European Union as a hiring base, as UK companies did. Now UK companies face having to fill their positions with a much smaller pool of talent to draw from, and hiring from abroad will be expensive, complex and, in many cases, simply impossible.

The US, too, looks like it may tighten visa regulations for skilled hires from overseas, which will have a hugely negative impact on game development there. There are, of course, many skilled creatives who work within the borders of their own country, but the industry has been built on labour flows; centres of excellence in game development, like the UK and parts of the US, are sustained and bolstered by their ability to attract talent from overseas. Any restriction on that will impact the ability of companies to create world-class games – it will make them poorer creatively and throw hiring roadblocks in the path of timely, well-polished releases.

Then there’s the question of trade barriers; not only tariffs, which seem likely to make a comeback in many places, but non-tariff barriers in terms of diverse regulations and standards that will make it harder for companies to operate across national borders. The vast majority of games are multinational efforts; assets, code, and technology are created in different parts of the world and brought together to create the final product. Sometimes this is because of outsourcing, other times it’s because of staff who work remotely, and very often it’s simply because a certain piece of technology is licensed from a company overseas.

If countries become more hostile to free trade, all of that will become more complex and expensive. And that’s even before we start to think about what happens to game hardware, from consoles that source components from across Asia before assembly in China or Japan, to PC and smart device parts that flow out of China, Korea, Taiwan and, increasingly, from developing nations in South-East Asia. If tariff barriers are raised, all of those things will get a lot more expensive, limiting the industry’s consumer base at the most damaging time possible.

Such trade barriers – be they tariff barriers or non-tariff barriers – would disproportionately impact developing countries. Free trade and globalisation have had negative externalities, unquestionably, but by and large they have contributed to an extraordinary period of prosperity around the world, with enormous populations of people being lifted out of poverty in recent decades and many developing countries showing clear signs of a large emerging middle class. Those are the markets game companies desperately want to target in the coming decade or so. In order for the industry to continue to grow and prosper, the emerging middle class in countries like India, Brazil and Indonesia needs to be cultivated as a new wave of game consumers, just as many markets in Central and Eastern Europe were a decade ago.

The current political attacks on the existing order of world trade threaten to cut those economies off from the system that has allowed them to grow and develop so quickly, potentially hurling them into deep recession before they have an opportunity to cement stable, sustainable long-term economic prosperity. That’s an awful prospect on many levels, of course (it goes without saying that many of the things under discussion threaten human misery and catastrophe that far outweighs the impact on the games business), but one consequence will likely be a hard stop to the games industry’s capacity to grow in the coming years.

It’s not just developing economies whose consumers are at risk from a rise of protectionism and anti-trade sentiments, however. If we learned anything from the 2008 crash and the recession that followed, it should be that the global economy largely runs not on cash, but on confidence. The entire edifice is built on a set of rules and standards that are designed to give investors confidence; the structure changes over time, of course, but only slowly, because stability is required to allow people to invest and to build businesses with confidence that the rug won’t be pulled out from underneath them tomorrow. From the rhetoric of Donald Trump to the hardline Brexit approach of the UK, let alone the extremist ideas of politicians like Marine Le Pen and Geert Wilders, the current political movement deeply threatens that confidence. Only too recently we’ve seen what happens to ordinary consumers’ job security and incomes when confidence disappears from the global economy; a repeat performance now seems almost inevitable.

Of course, the games industry isn’t in a position to do anything about these political changes – not alone, at least. The same calculations, however, apply to a wide variety of industries, and they’re all having the same conversations. Creative industries are at the forefront for the simple reason that they will be the first to suffer should the business environment upon which they rely turn negative, but in opposing those changes, creative businesses will find allies across a wide range of industries and sectors.

Any business leader that wants to throw their weight behind opposing these changes on moral or ethical grounds is more than welcome to, of course – that’s a laudable stance – but regardless of personal ideology, the whole industry should be making its voice heard. The livelihoods of everyone working in this industry may depend on the willingness of the industry as a whole to identify these commercial threats and respond to them clearly and powerfully.

Courtesy-GI.biz

Is The Gaming Industry Due For An Overhaul?

February 16, 2017 by  
Filed under Gaming

Physical retailers are calling for a change in how video game pre-orders are conducted.

They are speaking to publishers and platform holders about the possibility of selling games before the release date. Consumers could pick up the disc one to three weeks before launch, but it would remain ‘locked’ until launch day.

The whole concept stems from the pre-loading service available in the digital space. Today, consumers can download a game via Steam, Xbox Live and PSN before it’s out, and the game becomes unlocked at midnight on launch day for immediate play (after the obligatory day one patch).

It makes sense to roll this out to other distribution channels. The idea of going into a shop to order a game, and then returning a month later to buy it, always seemed frankly antiquated.

Yet it’s not only consumer friendly, it’s potentially retailer and publisher friendly, too.

For online retailers, the need to hit an embargo is costly – games need to be turned around rapidly to get them into consumers’ hands on day one.

For mainstream retailers, it would clear up a lot of confusion. These stores are not naturally built for pre-ordering product, with staff that are more used to selling bananas than issuing pre-order receipts. The fact you can immediately take the disc home would help – it could even boost impulse sales.

Meanwhile, specialist retailers will be able to make a longer ‘event’ of the game coming out, and avoid the situation of consumers cancelling pre-orders or simply not picking up the game.

Yet when retail association ERA approached some companies about the prospect of doing this, it struggled to find much interest from the publishing community. So what’s the problem?

There are a few challenges.

There are simple logistical obstacles. Games often go Gold just a few weeks before they’re launched, and then it’s over to the disc manufacturers, the printers, the box makers and the distributors to get that completed code onto store shelves. This process can take two weeks in itself. Take the recent Nioh. That game was available to pre-download just a few days before launch – so how difficult would it be to get that into a box, onto a lorry and into a retailer in advance of release?

It also benefits some retailers more than others – particularly online ones, and those with strong distribution channels.

For big games, there’s a potential challenge when it comes to bandwidth. If those that pre-ordered Call of Duty all go online straight away at 12:01, that would put a lot of pressure on servers.

Piracy may also be an issue, because it makes the code available ahead of launch.

The end of the midnight launch may be happening anyway, but not for all games. If consumers can get their game without standing in the cold for 2 hours, then they will. And those lovely marketable pictures of snaking queues will be a thing of the past.

None of these obstacles are insurmountable. Getting games finished further ahead of launch is something that most big publishers are already trying to do, and this mechanism would help force the issue. Of course, the disc doesn’t actually have to contain a game at all. It could be an unlock mechanism for a download, which would allow discs to be ready far in advance of launch. That strategy is significantly riskier, though, especially considering the consumer reaction when Xbox proposed a similar model back in 2013.

As for midnight events, there are still ways to generate that big launch ‘moment’. Capcom released Resident Evil 7 with an experiential haunted house experience that generated lots of media attention and attracted a significant number of fans. Pokémon last year ran a big fan event for Sun and Moon, complete with a shop, activities, signing opportunities and the chance to download Mew.

So there are other ways of creating launch theatre than inviting consumers to wait outside a shop. If anything, having the game available in advance of launch will enable these theatrical marketing events to last longer. And coupled with influencer activity, it would actually drive pre-release sales – not just pre-release demand.

However, the reality is this will work for some games and not for others, and here lies the heart of the challenge.

Pre-ordering is already a relatively complex matter, so imagine what it’ll be like if some games can be taken home in advance and others can’t. How many instances can we expect of people complaining that ‘their disc doesn’t work’?

If this is going to work, it needs cross-industry support, which isn’t going to happen. This is a business that can’t even agree on a digital chart, don’t forget.

What we may well see is someone giving this concept a go. Perhaps a digital native publisher, like Blizzard or Valve, who can make it part of their PR activity.

Because if someone like that can make the idea work, then others will follow.

Courtesy-GI.biz

Is Microsoft Taking Windows To The Cloud?

February 2, 2017 by  
Filed under Computing

Software king of the world Microsoft is planning a cut down version of Windows 10 which will operate in the cloud.

Dubbed the Composable Shell (CSHELL), the software is a single, unified ‘adaptive shell’ for Windows 10, and it is part of Vole’s cunning plan to create a universal Windows 10 version.

This will mean we will see a standardised framework to scale and adapt the OS to any type of device, display size or user experience, including smartphones, PCs, tablets, consoles, large touchscreens, and more.

Stage one apparently means a Cloud Shell which is a cut-down version of Windows designed for the modern computing world.

Cloud Shell should be out there in 2017 and it will be connected to the Windows Store and Universal Windows Platform app framework.

This would fit with Vole’s plans to bring the full version of Windows 10 to mobile devices with ARM-based processors, which it announced in December.

A ‘lightweight’ version of Windows could hint at a ‘thin client’-style approach which has been touted as a viable business tool for the last 20 years but has never really taken off.

Courtesy-Fud
