Is The Gaming Industry Going Through A Nostalgic Summer?

July 12, 2017 by  
Filed under Gaming

I had been repeating that this summer offers little for games outside of some decent Nintendo titles.

“You keep forgetting Crash Bandicoot,” said my retail friend.

I laughed. “Sure, it’s a nice piece of nostalgia,” I reasoned. “But it’s hardly going to set the market alight.”

“Pre-orders are brilliant,” came the reply. “We’ve upped our order twice. I think it’s going to be the biggest game of the summer.”

I shouldn’t be surprised. We’ve written extensively about the marketplace’s current love of nostalgia, and that trend only seems to be accelerating. In the last two weeks alone, we’ve seen the news that original Xbox games are coming to Xbox One, the reveal of the Sega Forever range of classics for smartphones, and now the best-selling SNES Mini.

The trend isn’t new. Classic re-releases have been standard for over a decade. However, the recent surge in nostalgia can be traced back to the onset of Kickstarter and the indie movement, which brought with it a deluge of fan-pleasing sequels, remakes and spiritual successors.

The trend reached the mainstream around the 20th anniversary of PlayStation, with Sony tapping into that latent love for all things PS1. And today, nostalgia is a significant trend in video games. Look at this year’s line-up: Sonic Mania, Yooka-Laylee, Super Bomberman, Wipeout, Crash Bandicoot, Thimbleweed Park, Micro Machines, Metroid II… even Tekken, Mario Kart and Resident Evil have found their way to the top of the charts (even if they never really went away).

It’s not just software, either. Accessories firms, hardware manufacturers and merchandise makers are all getting in on the act. I even picked up a magazine last week (on the shelves of my local newsagent) dedicated to the N64. This is the industry we live in.

Nostalgia has manifested itself in several different ways. We’ve seen re-releases (Xbox Originals, Sega Forever, NES Mini, Rare Replay), we’ve seen full remakes and updates (Crash Bandicoot, Final Fantasy VII, Resident Evil 2), plus sequels and continuations (Elite Dangerous, Shenmue 3). We’ve seen a plethora of spiritual successors (Yooka-Laylee, Bloodstained, Thimbleweed Park) and we have also witnessed old-fashioned game elements re-introduced into modern titles (split-screen multiplayer, for instance).

It’s not just games. We’ve recently seen nostalgia-tinged TV such as Twin Peaks, Stranger Things and X-Files, plus the cinematic return of Ghostbusters, Baywatch, and Jurassic Park. Yet this trend isn’t so new for film and TV (or music, either). And that’s because they’re older mediums. The demand for nostalgia tends to come from those aged 30 or above, and with video games being such a young industry, we’re only starting to see the manifestation of this now.

It’s perhaps also more significant in games because of just how different the experiences of the 1990s are to what we have today. In terms of tech, visuals, genre and connectivity, video games have moved so quickly. We simply don’t get many games like Crash Bandicoot or Wipeout anymore, which makes the demand for them even more acute.

Can it last forever? Or is this destined to be another gaming gold mine that gets picked to death? It’s difficult to say. Nostalgia isn’t like MMOs or futuristic shooters. This isn’t a genre, but an emotion: a ‘sentimental longing for a period in the past’. In theory, the clamour for old games and genres should get broader. In ten years’ time, those brought up on a diet of DS and Wii will be approaching 30. They’ll be reminiscing about the times they spent on Wii Sports and Viva Pinata. And the nostalgia wheel turns again.

Nevertheless, what we’re starting to see now is changing expectations among consumers. No longer are they backing every Kickstarter that promises to resurrect a long-lost concept (sorry Project Rap Rabbit), and they will not tolerate a nostalgic release that fails to deliver (sorry Mighty No.9). Lazy ports or half-hearted efforts will not win you any fans. If you want good examples of how to do it, look at Nintendo with the inclusion of Star Fox 2 in the SNES Mini, or the documentaries hidden in Rare Replay, or the special PS1-style case that Sony created for the new Wipeout. This is the games industry and the same rules apply. You cannot get away with rubbish.

Of course, big companies can’t live off nostalgia alone. Nintendo can’t build a business from just re-selling us Super Mario World (even if it seems to try sometimes). These moments of retro glory can often be fleeting. Will a new lick of paint on Crash Bandicoot revitalise the brand and deliver it back to the mainstream? It’s not impossible, but unlikely. More often than not you see a brief surge in gamers reminiscing over a time gone by, and then the IP drifts back to the era from which it was plucked. Musical comebacks are often short-lived and movie remakes are, typically, poorly received.

Yet there are exceptions every now and then. Major UK 1990s pop group Take That made its big comeback in 2006, but it did so with a modernised sound that has seen the band return to the top of the charts and stay there for over 10 years. In 2005, the BBC’s Doctor Who returned after 16 years. It was faster paced and far more current, and it remains a permanent fixture on Saturday night TV.

And last year’s Pokémon Go, which stayed true to the IP whilst delivering it in a new way and through new technology, has elevated that brand to heights not seen since the late 1990s.

“Nostalgia is a seductive liar that insists things were far better than they seemed. To be successful with it in the commercial world, you need to keep that illusion alive”

They say nostalgia is a seductive liar that insists things were far better than they seemed. To be truly successful with it in the commercial world, you need to keep that illusion alive. You must create something that looks and sounds like it comes from a different era, but actually plays well in the modern age. And that’s true whether it’s Austin Powers or Shovel Knight.

Indeed, nostalgia isn’t always about the past, it can help take us into the future. One unique example comes in what Nintendo did with The Legend of Zelda: A Link Between Worlds. The company altered the traditional Zelda formula with that 3DS game, and made it more palatable to fans by dressing it in the same world as 1991’s A Link To The Past. It worked, and set the company up to take an even larger risk with its seminal Breath of the Wild.

If the SNES Mini taught us anything, it’s that the clamour for all things 1990s remains strong. Developers and publishers who were smart enough to keep hold of their code from that era may well reap the benefits.

However, there’s a broader market opportunity here than just cashing in on past success. There’s a chance to resurrect IP, bring back lost genres, and even rejuvenate long-standing brands in need of innovation.

It’s a chance for the games industry to take stock and look to its past before embarking on its future.

Courtesy-GI.biz

GTA V Still Riding High In England

July 6, 2017 by  
Filed under Gaming

GTA V unit sales dropped 10% this week (in terms of boxed sales), and yet the game still returned to the top of the UKIE/GfK All-Formats Charts.

It was a very poor week for games retail in general, with just 171,389 boxed games sold across the whole market. The lack of new releases is the main reason for the drop, and that’s a situation that won’t be getting any better during the course of the summer.

The only new games in the Top 40 are 505 Games’ Dead by Daylight at No.16, Final Fantasy XIV: Stormblood at No.23 and Ever Oasis at No.28.

Although the data shows a difficult week, there were a few positives. Dirt 4, after a disappointing first week, is showing some resilience. The Codemasters game is now at No.2, although sales did drop 49% week-on-week.

Mario Kart 8 Deluxe is back at No.5 with a 45% jump in sales, driven by an increase in available Switch stock, while The Legend of Zelda: Breath of the Wild had a 68% sales jump (but still sits outside of the Top Ten at No.12).

And Ubisoft’s Tom Clancy’s Ghost Recon: Wildlands returns to the Top Ten after a 31% sales boost, driven by price activity at games retail.

Elsewhere, Horizon: Zero Dawn, which was No.1 last week, has dropped down to No.8. The game had been on sale for several weeks, but now it has returned to a premium price point. Tekken 7 has dropped to No.10, while Wipeout Omega Collection, which was No.1 just three weeks ago, has now fallen to No.14.

Courtesy-GI.biz

Is E3 Leaving Los Angeles?

June 27, 2017 by  
Filed under Gaming

The organizers behind the Electronic Entertainment Expo are considering taking the show away from its traditional home at the Los Angeles Convention Center.

During a roundtable interview, ESA CEO Mike Gallagher said his organisation might explore other possible locations if the center fails to upgrade and modernise its facilities, GameSpot reports.

The exec specifically hopes to see increased floor space and a smoother route between the West and South halls, currently separated by a lengthy corridor. If these expectations are not met, E3 may be hosted in another venue – and, by extension, away from Los Angeles.

E3 2018 is already booked in for June 12th to 14th next year, once again at the convention center. The venue will also host E3 2019, but no decision has been made for 2020.

The ESA has previously attempted to hold E3 at an alternative location. In 2007, the show became the E3 Media and Business Summit and was held in Santa Monica. This was part of an attempt to make it more industry focused, capping the attendance to shut out bloggers and non-industry professionals, as well as bringing the costs down for exhibitors.

However, the experiment proved to be unpopular and E3 has been held in the LA Convention Center ever since 2008.

In stark contrast to its 2007 decision, E3 officially opened its doors to the public for the first time this year, selling 15,000 tickets to consumers who wanted to attend the show.

GameSpot reports the ESA has now revealed attendance for this year’s event came in at 68,400 – boosted in part by those public tickets. The increase of roughly 36% over last year’s 50,300 brings attendance figures close to the 70,000 peak seen in 1998 and 2005, according to IGN.

The ESA has yet to confirm whether it will sell public tickets for E3 2018. Gallagher said his team is gathering feedback from attendees – both industry and consumer – before confirming how the show will be structured next year.

Courtesy-GI.biz

Is Grand Theft Auto V The Best Selling Video Game Ever?

June 12, 2017 by  
Filed under Gaming

Grand Theft Auto V has sold more copies in the US than any other release over the past 22 years.

That’s according to NPD Group analyst Mat Piscatella, who tweeted that Rockstar’s masterpiece is the region’s best-selling game since the market research firm first began tracking.

“Not surprising, but still amazing,” he wrote.

That’s not to say GTA V has overtaken some previous champion, GamesBeat reports – just an interesting factoid Piscatella was keen to share.

As the analyst says, it comes as no surprise. The latest Grand Theft Auto has sold more than 80m units around the world to date – despite originally launching way back in 2013 on the Xbox 360 and PS3.

Subsequent PC, Xbox One and PS4 releases have driven sales further, as have the regular updates for the game’s Grand Theft Auto Online multiplayer mode.

The latter was a significant contributor to the financial performance of Rockstar parent Take-Two, which reported revenues of $1.78bn for the year ended March 31st. Earlier this week, CEO Strauss Zelnick noted this success has come despite his belief the company has been restrained with in-game purchases and is currently “undermonetising” its users.

All eyes are on Rockstar’s next release Red Dead Redemption 2, which was recently delayed to 2018. The original was a huge worldwide hit, although it is perhaps unlikely the sequel can match the success of Grand Theft Auto V.

Courtesy-GI.biz

Square Enix Is Giving IO Interactive The Boot

May 23, 2017 by  
Filed under Gaming

Square Enix is dropping IO Interactive, the Danish studio behind the long-running Hitman franchise.

In a statement released today, the Japanese publisher said the decision was part of a strategy to “focus our resources and energies on key franchises and studios.”

The withdrawal was in effect as of the end of the last financial year, on March 31, 2017, and resulted in a ¥4.9 billion ($43 million) extraordinary loss on the company’s balance sheet.

Square Enix has already started discussions with potential new investors, the company said. “Whilst there can be no guarantees that the negotiations will be concluded successfully, they are being explored since this is in the best interests of our shareholders, the studio and the industry as a whole.”

IO Interactive was acquired by Eidos in 2003, just before it launched Hitman: Contracts, the third game in what was already its signature franchise. Eidos was acquired by Square Enix in 2009, and it has launched four games in the time since: Mini Ninjas, Kane & Lynch 2: Dog Days, Hitman: Absolution, and Hitman, last year’s episodic take on its most celebrated IP.

The bold new structure implemented in Hitman saw the game’s missions being released separately on digital platforms, with various live events and challenges taking place between the release of each one. Square Enix originally planned to give the entire series a boxed retail release, but that never materialised. It has never disclosed official numbers regarding the sales figures for Hitman, either as a series or for individual episodes.

However, the series’ ambition was widely appreciated within the games press – it was named 11th best game of 2016 by Eurogamer, for example, and was Giant Bomb’s overall Game of the Year. When we talked to IO studio head Hannes Seifert last year, he described the pride his team felt at the “new feeling” the game created, and made it clear that plans for Hitman extended far beyond a single season of episodes.

“When we say an ever expanding world of assassination, it means we don’t have to take everything that’s out there, throw it away and make a new game,” he said. “We can actually build on that. Just imagine after two or three seasons, you enter at that point in time, the amount of content will just blow your mind. That’s where we want to be.”

Seifert stepped down as IO’s studio head in February this year. He was replaced by Hakan Abrak, IO’s former studio production director.

Courtesy-GI.biz

Will Digital Video Game Sales Grow This Year?

May 18, 2017 by  
Filed under Gaming

The growth of full game downloads in the console space has surprised EA, the firm says.

The company told investors during its Q&A – as transcribed by Seeking Alpha – that full game downloads accounted for 33% of unit sales. That’s considerably ahead of the firm’s previous estimate of 29%, and 9% higher than the figure it posted last year.

The firm says the chief driver was “the continuing evolution of consumer behavior, but some of the out-performance was driven by the shift from Star Wars Battlefront to Battlefield 1, as well as the digital performance of our catalog.”

It expects full game downloads will account for 38% of its console unit sales during 2017.

However, EA’s CFO Blake Jorgensen anticipates that for the whole industry the figure will be even higher – around 40%. This is because EA’s big titles, such as FIFA, often perform strongly in markets with slower digital uptake.

“In terms of full-game downloads, the number surprised us because we had thought that it’d be around the 5% year-over-year growth,” he said. “Some of that may simply be the consumer is shifting faster than we know or we expected. The trends can sometimes jump in dramatic ways and maybe we’re starting to see that overall shift. And some of it could be product-related. We do think the industry will end calendar year 2017 probably above 40%. We will most likely lag that as we have historically because FIFA is such a large product and it is so global that we are operating in markets where either the ability to purchase digitally, or the ability to download based on bandwidth speeds, are compromised and thus we tend to skew a little lower on FIFA than we do on the rest of our portfolio. So we’ve always lagged the industry slightly, but we are excited about the potential that you’re seeing the consumer possibly shift quicker to digital than we’d originally anticipated.”

EA remains optimistic about the console space. It says that at the end of last year the combined install base of PS4 and Xbox One was 79m, and that it would grow to 105m by the end of 2017. This figure does not include Nintendo Switch, although EA is bullish about Nintendo, too.

“We have a tremendous relationship with Nintendo and have done for many, many years and are excited by the fact that they have come out very strong and are bringing in a whole new player base into the ecosystem,” said EA CEO Andrew Wilson. “We continue to be bullish on it and are looking at other titles that we might bring to the Switch. Our console number that we quoted does not include the Switch at this point, so anything that Nintendo does is additive to that number.”

There were a few additional takeaway points from EA’s financials. The publisher said that the traditional DLC model is becoming “less important” as it moves further into live services. We’ve already seen EA evolve its DLC model with Titanfall 2, which is giving away all of its DLC for free.

EA also revealed that its new EA Motive studio in Montreal has 100 staff, and the publisher expects that number will grow to 150.

Courtesy-GI.biz

Can Big Game Developers Keep Innovation Alive

May 12, 2017 by  
Filed under Gaming

The games industry has gone through a series of major transitions and changes over the past couple of decades – changes to the platforms people play on, the way they pay for and interact with games and even to the audiences that are actually playing. Each of those has brought along a series of challenges which the industry has had to surmount or circumvent; none of them, arguably, is a perfectly solved problem. Meanwhile, though, there have also been a handful of challenges running in the background – consistent issues that are even more fundamental to the nature of the games business, less exciting and sexy than the latest great transition but no less in need of clever solutions. Education and skills is one example; tax regimes and the industry’s relationship with governments is another.

Perhaps chief among those issues, though, is one which ties in to a common problem across a wide variety of industries, creative and otherwise. It’s the problem of innovation; specifically, the question of how to make innovation work in the context of a large corporation. The conventional wisdom of modern capitalism is that innovation bubbles up from small start-ups; unencumbered by the institutional, structural and cultural constraints that large, established companies operate within, they’re free to create new things and execute original ideas. As firms grow bigger, they lose that nimbleness and flexibility. Projects become wrapped up in internal politics, in the stifling requirements of handling shareholder relationships, and all too often, in the innovator’s dilemma – the unwillingness to pursue fresh innovation for fear that it’ll disrupt one of your proven cash cows.

As a result, we see a structure in which innovation happens at small start-ups, which large companies tap into through acquisitions. We see this in the games industry too, in the form of big publishers acquiring innovative and successful developers. Such acquisitions usually come with golden handcuffs for the key talent, requiring them to work for their firm’s new owners for a certain amount of time – after which they’re free to go off and create something new, small and innovative again (with a few million quid in their back pocket, to boot). This creates a cycle, and a class of serial innovators who repeatedly build up new, successful small companies to sell to larger, innovation-starved firms.

For many large companies, this isn’t an entirely satisfactory situation. Surely, they reason, there must be some way for a company to scale up without losing the capacity to innovate? Yet for the most part, the situation holds; big companies can create great products, but they are generally iterative and derivative, only very rarely being major, disruptive breaks from what was offered before. There are just too many barriers a game or a product needs to get through; too much politics to navigate, too many layers of management stumped by new ideas or worried about how something hard to explain will play to investors who only want to hear descriptions like “it’s like GTA, but with elements of Call of Duty”, or “it’s like an iPhone, but with a better camera”.

The desire to find some way to bottle the start-up lightning and deploy it within existing corporations runs deep, though, and it’s resulted in a number of popular initiatives over the years. Perhaps the most famous of recent years is the buzz around Eric Ries’ book The Lean Start-Up, a guide to effective business practices for start-up companies which extolled a launch-early, iterate-fast approach. Though it had some impact in the start-up world, The Lean Start-Up seemed to find its most receptive audience among executives at large corporations keen to find some way to create “internal start-ups” – silos within their companies which would function like incubators, replicating the conditions which allowed start-ups in the wild to innovate and iterate rapidly.

For the most part, those efforts didn’t work. The reality is that a start-up inside a company isn’t the same as a start-up in the wild. It doesn’t have the same constraints or the same possibilities available to it; its staff remain employees of a large corporation and thus cannot expect the same rewards, or be exposed to the same decision-making environment, as staff at a start-up. Even something as basic as success or failure can’t be measured in the same way, and in place of experienced venture capitalists (often the final-stage Pokémon evolution of the serial innovators described above) as investors and advisors, an internal start-up finds itself being steered and judged by executives who have often spent a lifetime working within precisely the corporate structure they now claim to wish to subvert. It’s hardly surprising that this doesn’t work very often, either within games or in any other sector.

We haven’t talked about Hearthstone yet. Let’s talk about Hearthstone.

Hearthstone is Blizzard’s card battling game, available across a variety of platforms. It’s a spin-off from the Warcraft franchise, and last year it made somewhere in the region of $350 million (according to estimates from SuperData). This week it topped 70 million unique users, and though the company doesn’t release concurrent user figures, it claims to have set a new record for those following the release of its latest expansion pack in April. It also remains one of the most popular games in the world for streaming. It’s a hell of a success story, and it’s also, in essence, a counterpoint to the notion that big companies can’t do small, innovative things. Hearthstone was prototyped and built by a small team within Blizzard, and ever since its launch it has embraced a distinctly start-up approach – iterating quickly and doing its experimentation in public through features like the “Tavern Brawl”, a sandbox that allows developers to test new mechanics and ideas that might make their way into the main game if they work well.

Given Hearthstone’s commercial success and the relatively small team and infrastructure behind it (relative, that is, to a behemoth like World of Warcraft), it’s probably Blizzard’s most profitable game. The question is, can other publishers and developers learn from what Blizzard did here? There’s a tendency with Blizzard success stories to simply attribute them to some intangible, indefinable “Blizzard Magic”, some sparkling pixie dust which is sprinkled liberally on all of their games but which can only be mined from the secret goblin tunnels under the company’s Irvine campus. In reality, though, Blizzard is simply a very creative and phenomenally well-managed company – one which has, in many respects, placed the solving of the whole question of how to innovate within a large company environment at the very heart of how it structures and defines itself.

One of the most famous things that people in the industry know about Blizzard is that the company is ruthless in its willingness to take an axe to projects that don’t live up to its standards. StarCraft: Ghost never saw the light of day after years in development; Titan, the planned MMO follow-up to World of Warcraft, was similarly ditched (with a core part of its team going on to rapidly develop the enormously successful Overwatch as their “rebound project”). What that means is that Blizzard has developed something within its internal culture that a lot of other firms in the industry lack; a capacity to coolly, rationally judge its own work on a purely creative and qualitative level, and to make very tough decisions without being overly swayed by internal politics, sunk-cost fallacies or other such calculations.

It’s instructive to listen to comments from people who worked on cancelled projects at Blizzard, even at a high level; while it was no doubt an emotional and difficult experience for them, their comments in hindsight usually express genuine agreement with the decision. There appears to be a culture that allows the company to judge projects without extending that judgment to the individuals who worked on them; I don’t doubt that this is an imperfect system and that there’s still plenty of friction around these decisions, but by and large, it seems to work.

“There is no magic pixie dust involved in the success of games like Hearthstone (or Overwatch, for that matter). This is a model that can be replicated elsewhere… it’s not dissimilar to the structure of a company like Supercell”

That creates an environment in which a start-up style approach can actually thrive. Small, creative teams can work on innovative games, rapidly prototyping and being effectively judged for their quality along the way. After only a couple of cycles of internal culling and restarting, surviving projects can be pushed out to the market as a kind of “minimum viable product”; not a thinly disguised prototype, but the minimum required to be a viable Blizzard game. Polished, fun and interesting, but designed as a springboard from which the team can go on to iterate and innovate in a way that’s informed by feedback from a real audience, rather than as an expensively developed, monolithic product.

Not every company can accomplish this; it’s not just Blizzard’s exacting standards of quality that permit it, there are also important factors like the company’s opaqueness to investors (which allows it to make products for the market rather than making products for shareholders) and its ability to bootstrap new games with IP from existing franchises (the Nintendo model, in essence) to consider. There is, however, no magic pixie dust involved in the success of games like Hearthstone (or Overwatch, for that matter). This is a model that can be replicated elsewhere, given the right approach and the right people in decision-making roles. In fact, it’s a model that does exist elsewhere; it’s not dissimilar to the structure of a company like Supercell, for example, which helps to explain why Supercell is one of the only mobile developers that’s been able to “bottle its lightning” and consistently develop hit titles. It’s also close, though slightly different in structure, to the way Nintendo has shifted towards working in recent years, which has resulted in titles like Splatoon.

Big companies can be creative; they can be innovative, daring, clever and even disruptive. Hearthstone shows this at work within Blizzard, and it’s also present in a select but distinguished line-up of other game companies that have made it a priority to nurture innovation and to create a culture where good taste and creative excellence are celebrated above all else. For many companies, this would be a radical shift – requiring a change in priorities, in structure and even in staffing – but in the long run, such a shift might end up a lot cheaper than having to pull out your wallet every couple of years to buy the next innovative start-up that came up with an idea your own firm couldn’t conceive of.

Courtesy-GI.biz

Are Motherboard Shipments Decreasing?

May 1, 2017 by  
Filed under Computing

With the global decline in PC shipments finally showing signs of slowing, motherboard vendors are still expecting a correlated decline in overall volume in 2017, with some estimates putting the drop at nearly 10 percent from last year.

Last month, a market research report from Global Information Inc showed the global volume of motherboard shipments in Q4 2016 dropping 5.2 percent from Q3 and 13.6 percent year-over-year. Total shipments for 2016 were estimated at less than 50 million units, a figure that had been forecast as early as the beginning of the year. As the fourth quarter approached, vendors said that sales of Kaby Lake motherboards were not living up to expectations, while the overall market remained in a state of weaker demand. The report covered vendors including AMD, ECS, Foxconn, Gigabyte, Intel, Jetway, Microstar, Pegatron, QCI, T&I, and Wistron.

Notebooks, exchange rates and component shortages to blame

According to the latest report, three problems are affecting the ability of motherboard vendors to increase sales numbers. First, sources within the motherboard industry have pointed out that notebooks have gradually taken market share from the build-it-yourself PC market, mainly as a result of “better specifications, smaller form factors, and cheaper prices”. Second, the vendors have experienced a large exchange rate increase over the past two years, from 6.2 in April 2015 to 6.8 in April 2017. Finally, rising component prices and various component shortages have also contributed to difficulties in production operations. So in order to remain profitable, some vendors have focused on reducing shipments and shifting their focus to other product segments, including gaming notebooks and mobile devices.

Sources within the industry note that even while Intel’s Kaby Lake processor lineup and 200-series chipsets have not sold as much volume as anticipated, it is possible that the imminent threat of AMD’s Ryzen 5 and 7 lineups has continued to stimulate price cuts across the board to keep up platform sales. Many retailers have now begun offering more serious price cuts on processors bundled with compatible motherboards, and this trend is expected to continue with the release of AMD’s Ryzen 3 and Intel’s 300-series and X299 chipsets later this year.

Courtesy-Fud

Intel Releases The Atom C3000

March 8, 2017 by  
Filed under Computing

As Intel’s Atom division gets buried under a scandal over its C2000s only having an 18 month life expectancy, Intel has released its new Atom C3000 range to replace them.

To be fair to Intel, the C3000 is a superior product. At the top end it will have 16 core CPUs which are designed for enterprise servers. According to Intel they will have features borrowed from the Xeon line, such as hardware virtualization, and RAS (reliability, availability, and serviceability).

These chips will head to the NAS and IoT markets and can deal with several parallel data streams. Not as fast as Kaby Lake or Broadwell, they will be positioned as reliable workhorses in the network. That is, if they don’t repeat the problems of the flawed C2000 Atom series, which had quality control issues that Intel claims it has fixed.

The new line is scheduled to launch in the second half of 2017. The Intel ARK shows a dual-core Atom C3338 with Denverton cores fabbed at 14nm. That SoC will get a 1.5 GHz base frequency and 2.2 GHz boost.

Each pair of Denverton cores will have two megabytes of level two cache. The new chips support up to 128 gigabytes of DDR4-2400 memory and include support for 10Gb Ethernet as well as 16 lanes of PCI-Express 3.0 connectivity. Here are the highlights.

Thermal design points down to 8.5 watts.

Enhanced performance from 2 to 16 cores and frequencies from 1.5 GHz to 2.2 GHz.

Built-in hardware virtualisation to enable dynamic provisioning of services as communication service providers extend network functions virtualization to the network edge. 

Intel x86 64-bit software support.

Integrated Intel QuickAssist technology with up to 20 Gbps of compression/encryption throughput.

4 x 10 GbE integrated Intel Ethernet to enable high-speed connectivity to the network.

ECC memory for data integrity and system reliability through automatic data correction.

Flexible I/O lanes providing up to 16 SATA 3.0, 16 PCIe 3.0, and 4 USB 3.0.

Extended temperature range and long-life support for dense network, storage, industrial IoT and autonomous driving environments.

Courtesy-Fud

Is The Intel C2000 Chip Flaw A Disaster In The Making?

February 21, 2017 by  
Filed under Computing

It is starting to look like Intel’s Atom C2000 chip fiasco has spread to another networking manufacturer.

The fatal clock timing flaw that causes switches, routers and security appliances to die after about 18 months of service is apparently a feature of some Juniper products.

Cisco was the first vendor to post a notice about the problem earlier this month saying the notice covers some of the company’s most widely deployed products, such as certain models of its Series 4000 Integrated Services Routers, Nexus 9000 Series switches, ASA security devices and Meraki Cloud Managed Switches.

Juniper is telling its customers something similar:

“Although we believe the Juniper products with this component are performing normally as of February 13, 2017, the [listed] Juniper products could after the product has been in operation for at least 18 months begin to exhibit symptoms such as the inability to boot, or cease to operate. Recovery in the field is not possible. Juniper product with this supplier’s component were first placed into service on January 2016. Juniper is working with the component supplier to implement a remediation. In addition, Juniper’s spare parts depots will be purged and updated with remediated products.”

The products in the warning comprise 13 Juniper switches, routers and other products including the MPC7E 10G, MPC7E (multi rate), MX2K-MPC8E, EX 920 Ethernet switches and PTX3000 integrated photonic line card.

So far neither Cisco nor Juniper have blamed Intel for the fault. However, Chipzilla did describe a flaw on its Atom C2000 chip which is under the bonnet of shedloads of net gear.

Intel said that problems with its Atom chip will hurt Intel’s 2016 Q4 earnings. CFO Robert Swan said that Intel was seeing a product quality issue in the fourth quarter with slightly higher expected failure rates under certain use and time constraints.

Swan said the issue would be addressed with a minor design fix, which Intel was working with its clients to implement.

Intel had hoped it would see the back of its short-lived low-power Atom chips for servers. They were used in micro servers but also in networking equipment from several companies.

HPE and Dell are keeping quiet about the clock technology, though both are rumoured to use it. They might be hoping that Intel will come up with a fix so they can pretend it never happened.

Courtesy-Fud

Is The Gaming Industry Due For An Overhaul?

February 16, 2017 by  
Filed under Gaming

Physical retailers are calling for a change in how video game pre-orders are conducted.

They are speaking to publishers and platform holders over the possibility of selling games before the release date. Consumers would be able to pick up the disc one to three weeks before launch, but it would remain ‘locked’ until launch day.

The whole concept stems from the pre-loading service available in the digital space. Today, consumers can download a game via Steam, Xbox Live and PSN before it’s out, and the game becomes unlocked at midnight on launch day for immediate play (after the obligatory day one patch).

It makes sense to roll this out to other distribution channels. The idea of going into a shop to order a game, and then returning a month later to buy it, always seemed frankly antiquated.

Yet it’s not only consumer friendly, it’s potentially retailer and publisher friendly, too.

For online retailers, the need to hit an embargo is costly – games need to be turned around rapidly to get them into consumers’ hands on day one.

For mainstream retailers, it would clear up a lot of confusion. These stores are not naturally built for pre-ordering product, with staff that are more used to selling bananas than issuing pre-order receipts. The fact you can immediately take the disc home would help – it could even boost impulse sales.

Meanwhile, specialist retailers will be able to make a longer ‘event’ of the game coming out, and avoid the situation of consumers cancelling pre-orders or simply not picking up the game.

Yet when retail association ERA approached some companies about the prospect of doing this, it struggled to find much interest from the publishing community. So what’s the problem?

There are a few challenges.

There are simple logistical obstacles. Games often go Gold just a few weeks before they’re launched, and then it’s over to the disc manufacturers, the printers, the box makers and the distributors to get that completed code onto store shelves. This process can take two weeks in itself. Take the recent Nioh. That game was available to pre-download just a few days before launch – so how difficult would it be to get that into a box, onto a lorry and into a retailer in advance of release?

It also benefits some retailers more than others – particularly online ones, and those with strong distribution channels.

For big games, there’s a potential challenge when it comes to bandwidth. If those that pre-ordered Call of Duty all go online straight away at 12:01, that would put a lot of pressure on servers.

Piracy may also be an issue, because it makes the code available ahead of launch.

The end of the midnight launch may be happening anyway, but not for all games. If consumers can get their game without standing in the cold for 2 hours, then they will. And those lovely marketable pictures of snaking queues will be a thing of the past.

None of these obstacles are insurmountable. Getting the game finished earlier before launch is something that most big games publishers are trying to do, and this mechanism will help force that issue. Of course, the disc doesn’t actually have to contain a game at all. It can be an unlock mechanism for a download, which will allow the discs to be ready far in advance of launch. That strategy is significantly riskier, especially considering the consumer reaction to the same model proposed by Xbox back in 2013.

As for midnight events, there are still ways to generate that big launch ‘moment’. Capcom released Resident Evil 7 with an experiential haunted house event that generated lots of media attention and attracted a significant number of fans. Pokémon last year ran a big fan event for Sun and Moon, complete with a shop, activities, signing opportunities and the chance to download Mew.

So there are other ways of creating launch theatre than inviting consumers to wait outside a shop. If anything, having the game available in advance of launch will enable these theatrical marketing events to last longer. And coupled with influencer activity, it would actually drive pre-release sales – not just pre-release demand.

However, the reality is this will work for some games and not for others, and here lies the heart of the challenge.

Pre-ordering is already a relatively complex matter, so imagine what it’ll be like if some games can be taken home in advance and others can’t? How many instances can we expect of people complaining that ‘their disc doesn’t work’?

If this is going to work, it needs cross-industry support, which isn’t going to happen. This is a business that can’t even agree on a digital chart, don’t forget.

What we may well see is someone giving this concept a go. Perhaps a digital native publisher, like Blizzard or Valve, who can make it part of their PR activity.

Because if someone like that can make the idea work, then others will follow.

Courtesy-GI.biz

Is Sony Really Committed To The PSVR?

February 1, 2017 by  
Filed under Gaming

The positive reviews pouring in from all corners for Capcom’s Resident Evil 7 are a welcome validation of the firm’s decision to go in quite a radically new direction with the series, but Capcom isn’t the only company that will be happy (and perhaps a little relieved) by the response to the game. A positive reaction to RE7 is also hugely important for Sony, because this is the first real attempt at proving that PSVR is a worthy platform for full-scale, AAA games, and much of the credibility of the nascent VR platform rests on RE7.

Although some of the sentiment in reviews of the game suggests that the VR mode is interesting but somewhat flawed, and several reviewers have expressed a preference for playing on a normal screen, the game’s VR aspect undoubtedly fascinates consumers and seems to be implemented well enough to justify their interest. In the process, it also justifies Sony’s investment in the title – the company did a deal that secured a year-long VR exclusivity window for PSVR – and Capcom’s own faith in the burgeoning medium, which undoubtedly played a large role in the decision to switch the entire game over to a first-person perspective.

The critical success of RE7, and the likely commercial success that will follow, comes at a crucial juncture for PSVR. Although the hardware was well-reviewed at launch and remains more or less supply-constrained at retail – you certainly can’t get your hands on one without paying a hefty re-seller premium in Japan at the moment, and believe me I’ve tried – there’s an emerging narrative about the VR headset that’s distinctly negative and pessimistic. Plenty of op-eds and videos have popped up in recent weeks comparing PSVR to previous Sony peripheral launches like PlayStation Eye and PlayStation Move; hardware that was launched with a lot of heavy marketing support but which the giant company rapidly seemed to lose interest in, condemning it to a few years of token, declining software support before being quietly shelved altogether.

It’s worth noting, of course, that neither Eye nor Move actually died off entirely – in fact, both of these technologies have made their way into PSVR itself, with the headset essentially being an evolution of a number of previous Sony technologies that have finally found a decent application in VR. However, there’s no question but that Sony has a bad track record with peripherals, and those interested in the future of PSVR should absolutely be keeping a close eye on the company to see if there are any signs of it repeating its past behaviour patterns.

Most of what’s being written now, however, feels premature. PSVR had a pretty solid launch line-up, with good support from across the industry; just this week it got its first truly big third-party AAA title, which is receiving excellent reviews, and later in the year it’s got some big launches like GT Sport on the way. The pace of software releases slumped after the launch window, but that’s not unusual for any platform. There’s nothing about PSVR that you can point to right now and declare as evidence of Sony’s focus shifting away; it feels like editorials claiming this are doing so purely on the basis of Sony’s track record, not the facts as they exist now.

If you really want to know how PSVR is shaping up, there are two key things to watch out for in the near future. The first will be any data that’s released regarding the performance of RE7’s VR mode; is it popular? Is it being played widely? Does it become a part of the broad conversation about the game? Much of this latter aspect is down to Sony and Capcom’s marketing of course; there’s an opportunity to push the VR aspect of RE7 as a genuinely unique experience with appeal even beyond the usual gaming audience, and if that can be capitalised upon, it will likely secure PSVR’s future to a large degree. What’s crucial, though, is that every other company in the industry will be watching RE7 like hawks; if proper, well-integrated PSVR support seems to be a major selling factor or a popular feature, you can be guaranteed that other publishers will start to look at their major franchises with a view to identifying which of them would suit a similar “traditional display plus optional VR” approach.

The other thing to watch for, unsurprisingly, is what Sony does at E3 and other major gaming events this spring. This is really where we’ll see the proof of the company’s focus – or lack of same. There’s still plenty of time to announce VR titles for the back half of this year, which is likely to be the crucial point for PSVR; by the time we slip into the second half of 2017, the hardware will no longer be supply constrained and the early adopters buying for novelty will be all but exhausted. That’s the point in time where PSVR’s software line-up really needs to come together coherently, to convince the next wave of potential purchasers that this is a platform worth investing in. If it fails that test, PSVR will join Move and Eye in the graveyard of Sony’s failed peripherals; success will turn it into a cornerstone of the PS4 for the coming years.

So keep a close eye on E3. Part of this is just down to optics; how much time and focus does the firm devote to PSVR on stage at its conference? If it’s not very much, if the PSVR section feels rushed or underemphasised, that will send a strong message that Sony is back to its old bad habits and has lost interest in its latest peripheral already. A strong, confident PSVR segment would convince consumers and the industry alike that the headset isn’t just another easily abandoned gimmick; better yet if this is backed up by plenty of the big games being announced having PSVR functionality built into them, so the device can be referred back to repeatedly during the conference rather than being confined to its own short segment.

It’s more than just optics though; the reality is that PSVR, like any platform, needs software, and Sony needs to lead the way by showing that it’s truly devoted to its own hardware. It may seem a little unfair that people are already keen to declare PSVR to be stumbling due to lack of attention, and well, it is a little unfair – but nobody should be surprised that people are seeing a pattern here that Sony itself clearly established with its behaviour towards previous peripherals. That’s the reputation the firm has, unfortunately, created for itself; that makes it all the more important that it should convince the world of its commitment to PSVR when the time comes.

Courtesy-GI.biz

Is EA Slowly Moving Away From Appearing At E3?

January 20, 2017 by  
Filed under Gaming

It would appear that the trend of big publishers hosting their own events will continue in 2017. Last year’s E3 show floor was missing booths from the likes of Electronic Arts, Activision Blizzard, Disney and Wargaming. For its part, EA decided it could better serve the fans by hosting its own event next door to E3, and now the publisher has confirmed that EA Play will be making a return for the second year in a row, but it won’t be as close to the Los Angeles Convention Center.

EA Play will be held from June 10-12 at the Hollywood Palladium, which is around seven miles away. “Whether in person or online, EA Play 2017 will connect fans around the world to EA’s biggest new games through live broadcasts, community content, competitions and more. Those that can attend in Hollywood will experience hands-on gameplay, live entertainment and much more. For anyone joining digitally around the world, EA Play will feature livestreams, deeper looks into EA’s upcoming games and experiences, and content from some of the best creators in the community,” the company stated in a press release.

Furthermore, a spokesperson confirmed to GamesIndustry.biz that EA will indeed be skipping out on having a major E3 presence. “EA Play was such a powerful platform for us last year to connect with our player community. We learned a ton, and we wanted to build on everything we loved about last year’s event to make EA Play 2017 even better,” EA corporate communications VP John Reseburg said.

“So after an extensive search, we’ve selected the Hollywood Palladium as a place where we can bring our vision of creativity, content and storytelling to life, and build an even more powerful experience to connect with players, community leaders, media and partners. EA Play 2017 will originate from Hollywood, with more ways for players around the world to connect and experience the excitement.”

It’ll be interesting to see what the other major publishers do about E3 this year. We’ll be sure to keep you posted.

Courtesy-Fud

HDMI v2.1 To Support 8K

January 9, 2017 by  
Filed under Around The Net

The HDMI Forum has officially announced the upcoming release of the HDMI v2.1 specification, bringing a range of higher video resolutions, Dynamic HDR support and more.

According to the provided details, the new HDMI v2.1 specification will be backward compatible with earlier versions. In addition to higher video resolutions and refresh rates, including 8K@60Hz and 4K@120Hz, it will also bring support for Dynamic HDR, Game Mode variable refresh rate (VRR), eARC support for advanced audio formats and the new 48G cable, which will provide 48Gbps of bandwidth and is key for 8K HDR support.

The full list of resolutions and refresh rates starts with 4K at 50/60Hz and 100/120Hz and climbs all the way up to 10K resolution at both 50/60Hz and 100/120Hz.

The big surprise is the new Game Mode VRR, which is similar to AMD FreeSync and Nvidia G-Sync, and is meant to provide stutter-, tearing- and lag-free gaming on both consoles and the PC.

Another big novelty is the all new 48G cable, which will provide enough bandwidth for higher resolutions with HDR. The old cables will be compatible with some of the new features, but for 8K/10K with HDR, you will need the new HDMI v2.1 cable.
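
To put that 48Gbps figure in context, here is a rough back-of-envelope sketch (an illustration of the raw numbers only, not the spec's own calculation) of the uncompressed pixel data rate for an 8K picture at 60Hz with 10-bit colour, written as a small Python snippet:

# Rough, uncompressed pixel-data estimate for 8K @ 60Hz with 10-bit RGB.
# It ignores blanking intervals, link encoding overhead and compression,
# so the real HDMI signalling rate differs; this is only a ballpark figure.
width, height = 7680, 4320        # 8K UHD frame
fps = 60                          # refresh rate
bits_per_pixel = 3 * 10           # three colour channels at 10 bits each
raw_gbps = width * height * fps * bits_per_pixel / 1e9
print(round(raw_gbps, 1))         # roughly 59.7 Gbit/s

That raw figure already exceeds even the 48Gbps link, which is one reason the very highest 8K/10K HDR modes are expected to lean on chroma subsampling or display stream compression on top of the new cable.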

According to the HDMI Forum, the new HDMI v2.1 specification will be available to all HDMI 2.0 adopters, who will be notified when it is released in early Q2 2017.

Courtesy-Fud

Are VR Games Profitable?

December 13, 2016 by  
Filed under Gaming

Dean Hall, CEO of RocketWerkz and previously lead designer of DayZ, has spoken openly on Reddit about the harsh financial realities of VR development, explaining that without the subsidies provided by platform exclusives and other mechanisms, the medium would currently be largely unviable.

In an extended post which has garnered over 200 comments, Hall proclaimed that there was simply “no money” in VR game development, explaining that even though his VR title Out of Ammo had sold better than expected, it remained unprofitable.

Hall believes that many consumer expectations from the mature and well-supported PC market have carried over to VR, with customers not fully comprehending the challenges involved with producing content for such a small install base.

“From our standpoint, Out of Ammo has exceeded our sales predictions and achieved our internal objectives,” Hall explained. “However, it has been very unprofitable. It is extremely unlikely that it will ever be profitable. We are comfortable with this, and approached it as such. We expected to lose money and we had the funding internally to handle this. Consider then that Out of Ammo has sold unusually well compared to many other VR games.”

Pointing out that making cross-platform VR to ameliorate that small install base is not as simple as cross platform console development, Hall went on to talk about the realities of funding VR games, and what that meant for the studios involved.

“Where do you get money to develop your games? How do you keep paying people? The only people who might be profitable will be microteams of one or two people with very popular games. The traditional approach has been to partner with platform developers for several reasons:

“The most common examples of this are the consoles. At launch, they actually have very few customers and the initial games release for them, if not bundled and/or with (timed or otherwise) exclusivity deals – the console would not have the games it does. Developers have relied on this funding in order to make games.

“How are the people who are against timed exclusives proposing that development studios pay for the development of the games?

“There is no money in it. I don’t mean ‘money to go buy a Ferrari’. I mean ‘money to make payroll’. People talk about developers who have taken Oculus/Facebook/Intel money like they’ve sold out and gone off to buy an island somewhere. The reality is these developers made these deals because it is the only way their games could come out.

“Here is an example. We considered doing some timed exclusivity for Out of Ammo, because it was uneconomical to continue development. We decided not to because the money available would just help cover costs. The amount of money was not going to make anyone wealthy. Frankly, I applaud Oculus for fronting up and giving real money out with really very little expectations in return other than some timed-exclusivity. Without this subsidization there is no way a studio can break even, let alone make a profit.

“Some will point to GabeN’s email about fronting costs for developers, however I’ve yet to know anyone who’s got that, has been told about it, or knows how to apply for this. It also means you need to get to a point you can access this. Additionally, HTC’s “accelerator” requires you to set up your studio in specific places – and these specific places are incredibly expensive areas to live and run a studio. I think Valve/HTC’s no subsidy/exclusive approach is good for the consumer in the short term – but terrible for studios.

“As a result I think we will see more and more microprojects, and then more and more criticism that there are not more games with more content.”

In addition to the financial burdens, Hall says that there are other pressures too. For example, in his experience VR development burns people out very quickly indeed, with the enthusiasm of most, including himself, waning after a single project.

“I laugh now when people say or tweet me things like ‘I can’t wait to see what your next VR game will be!’ Honestly, I don’t think I want to make any more VR games. Our staff who work on VR games all want to rotate off after their work is done. Privately, developers have been talking about this but nobody seems to feel comfortable talking about it publicly – which I think will ultimately be bad.”

“For us it became clear that the rise of VR would be gradual rather than explosive when in 2015, it was revealed that the Oculus Rift and HTC Vive would be released in 2016 and that the gold rush would be on hold”

Sam Watts, Make Real

It’s not a universal opinion among VR developers, however: there was opposition to Hall’s points both within and beyond the thread. Sam Watts, Operations Lead at Make Real, had the following to say.

“I think the reality of that thread is a direct result of a perceived gold rush by developers of all sizes to a degree, since analyst predictions around sales volumes of units were far higher than the reality towards the end of the year. There have been waves of gold rush perceptions with VR over the past few years, mostly around each release of new hardware expecting the next boom to take the technology into the mainstream, which has mostly failed to materialise.

“For us it became clear that the rise of VR would be gradual rather than explosive when in 2015, it was revealed that the Oculus Rift and HTC Vive would be released in 2016 and that the gold rush would be on hold.”

Watts also sees a healthier VR ecosystem on the way, one where big publishers might be more willing to invest in the sort of budgets which console games are used to.

“Whilst typical AAA budgets aren’t yet being spent on VR (to our knowledge) it doesn’t mean AAA isn’t dipping their toe in the water. The main leader being Ubisoft who created a small VR R&D team that eventually became the Eagle Flight devs. They have avoided what many early VR developers were worried about with the AAA approach to VR by prototyping, iterating on design, making mistakes, learning from them and working out what does and doesn’t work in VR, even creating a now widely popular comfort option of the reduced peripheral vision black tunnel effect. They didn’t just storm in late to the party, throwing AAA megabucks around at the problem, assuming money would make great games.

“I know Oculus, Steam, Sony and Razer are still funding games titles for development in 2017, I would hope to see this continue beyond to ensure the continued steady adoption and rise of VR as a new gaming platform moving forwards. This will help continue to improve the quality of content offering on the platforms to ensure full gaming experiences that gamers want to buy and return to, rather than just a series of short tech demos, are available, helping establish the medium and widen the net.”

Courtesy-GI.biz
