AMD’s Summit Ridge Processor Details Leaked

January 29, 2015 by Michael  
Filed under Computing

AMD’s first 14nm processors are codenamed Summit Ridge and they are reportedly based on an all-new architecture dubbed Zen.

Information on the new architecture and the Summit Ridge design is still very sketchy. According to Sweclockers, the chips will feature up to eight CPU cores, support for DDR4 memory and TDPs of up to 95W.

Summit Ridge will use a new socket, designated FM3. The FM designation suggests we are looking at A-series APUs, yet there is no word on graphics, and the eight-core design points to proper FX-series CPUs – we simply do not know at this point. It is also possible that Summit Ridge is a Vishera FX replacement, but on an FM socket rather than an AM socket.

Of course, the Zen architecture should end up in more than one product line, namely APUs and Opteron server parts. The new architecture has been described as a “high-performance” design and will be manufactured on the Samsung-GlobalFoundries 14nm node.

As for the launch date, don’t hold your breath – the new parts are expected to show up in the third quarter of 2016, roughly 18 months from now.

Courtesy-Fud

AMD’s Fiji GPU Goes High Bandwidth

January 16, 2015 by Michael  
Filed under Computing

New evidence from two LinkedIn profiles of AMD employees suggests that AMD’s upcoming Radeon R9 380X graphics card, which is expected to be based on the Fiji GPU, will actually use High-Bandwidth Memory.

Spotted by a member of the 3D Center forums, the two LinkedIn profiles mention the R9 380X by name and describe it as the world’s first 300W 2.5D discrete GPU SoC using stacked-die High-Bandwidth Memory and a silicon interposer. LinkedIn is an odd place for a leak, but employee profiles are rather more reliable than mere rumors.

The first in line is the profile of Ilana Shternshain, an ASIC physical design engineer who has worked on the PlayStation 4 SoC, the Radeon R9 290X and the R9 380X, the last of which is described as the “largest in ‘King of the hill’ line of products.”

The second LinkedIn profile belongs to AMD system architect manager Linglan Zhang, who was involved in developing “the world’s first 300W 2.5D discrete GPU SOC using stacked die High Bandwidth Memory and silicon interposer.”

Earlier rumors suggest that AMD might launch the new graphics cards early this year, as the company is under heavy pressure from Nvidia’s recently released, and upcoming, Maxwell-based graphics cards.

Courtesy-Fud

Will 20nm GPUs Ever Make It To Market?

January 14, 2015 by Michael  
Filed under Computing

Let us be clear: 20nm GPUs won’t be coming at all. Even though Nvidia, Qualcomm, Samsung and Apple are all doing 20nm SoCs, there won’t be any 20nm GPUs.

From what we know, AMD and Nvidia won’t ever release 20nm GPUs, as yields on the node are so bad that manufacturing them would make no sense. It is simply not economically viable to replace 28nm production with 20nm.

This means the real next big thing will arrive with 16nm FinFET from TSMC and 14nm FinFET from GlobalFoundries/Samsung. In the meantime, AMD is working on Caribbean Islands and Fiji, while Nvidia has been working on its next chip too.

That doesn’t mean you cannot pull off a small miracle at 28nm. Nvidia did exactly that in September 2014 with Maxwell, proving that optimisation on the same manufacturing process can make a big difference when a new node is not an option.

Despite the lack of 20nm chips, we still think the next-generation Nvidia and AMD parts will bring enough innovation to make you want to upgrade, whether to play the latest games on FreeSync or G-Sync monitors or at 4K/UHD resolutions.

Courtesy-Fud

Will The Xbox One Go Virtual Reality This Year?

January 6, 2015 by Michael  
Filed under Gaming

While we can’t get a real handle on when Microsoft might reveal the VR headset it has been working on, we have learned from our sources that the project is well under way and that selected developers already have early prototypes.

It is hard to say when Microsoft might actually reveal the new VR headset and technology, but GDC or E3 would seem the likely venues. We do know that Microsoft is targeting 2015 to move the VR headset into mass production, and it is thought that we will see versions for both the Xbox One and the PC, though we expect the PC version to arrive a little after the Xbox One version.

Rumor has it that the same development team that worked on the Surface tablet has taken on this project as well.

Courtesy-Fud

Is Free-To-Play Here To Stay?

December 4, 2014 by Michael  
Filed under Gaming

Detractors of free-to-play have been having a good few weeks, on the surface at least. There’s been a steady drip-feed of articles and statements implying that premium-priced games are gaining ground on mobile and tablet devices, with parents in particular increasingly wary of F2P game mechanics; a suggestion from SuperData CEO Joost van Dreunen that the F2P audience has reached its limits; and, to top it off, a move by Apple to replace the word “Free” with a button labelled “Get” in the App Store, a response to EU criticism of the word Free being applied to games with in-app purchases.

Taken individually, each of these things may well be true. Premium-priced games may indeed be doing better on mobile devices than before; parents may indeed be demonstrating a more advanced understanding of the costs of “free” games, and reacting negatively to them. Van Dreunen’s assertion that the audience for F2P has plateaued may well be correct, in some sense; and of course, the EU’s action and Apple’s reaction is unquestionable. Yet to collect these together, as some have attempted, and present them as evidence of a turning tide in the “battle” between premium and free games, is little more than twisting the facts to suit a narrative in which you desperately want to believe.

Here’s another much-reported incident which upsets the apple cart; the launch of an add-on level pack for ustwo’s beautiful, critically acclaimed and much-loved mobile game Monument Valley. The game is a premium title, and its level pack, which added almost as much content as the original game again, cost $2. This charge unleashed a tide of furious one-star reviews slamming the developers for their greed and hubris in daring to charge $2 for a pack of painstakingly crafted levels.

This is a timely and sobering reminder of just how deeply ingrained the “content is free” ethos has become on mobile and tablet platforms. To remind you; Monument Valley was a premium game. The furious consumers who viewed charging for additional content as a heinous act of money-grubbing were people who had already paid money for the game, and thus belong to the minority of mobile app customers willing to pay for stuff up front; yet even within this group the scope of their willingness to countenance paying for content is extremely limited (and their ire at being forced to do so is extraordinary).

Is this right? Are these consumers desperately wrong? It doesn’t matter, to be honest; it’s reality, and every amateur philosopher who fancies himself the Internet’s Immanuel Kant can talk about theories of “right” pricing and value in comment threads all day long without making a whit of difference to that reality. Mobile consumers (and increasingly, consumers on other platforms) are used to the idea that they get content for free, through fair means or foul. We could argue the point about whether this is an economic inevitability in an era of almost-zero reproduction and distribution costs, as some commentators believe, but the ultimate outcome is no longer in question. Consumers, the majority of them at least, expect content to be free.

F2P, for all that its practitioners have misjudged and overstepped on many occasions, is a fumbling attempt to answer an absolutely essential question that arises from that reality; if consumers expect content to be free, what will they pay for? The answer, it transpires, is quite a lot of things. Among the customers who wouldn’t pay $2 for a level pack are probably a small but significant number who wouldn’t have blinked an eye at dropping $100 on in-game currency to speed up their ability to access and complete much the same levels, and a much more significant percentage who would certainly have spent roughly that $2 or more on various in-game purchases which didn’t unlock content, per se, but rather smoothed a progression curve that allowed access to that content. Still others might have paid for customisation or for merchandise, digital or physical, confirming their status as a fan of the game.

I’m not saying necessarily that ustwo should have done any of those things; their approach to their game is undoubtedly grounded in an understanding of their market and their customers, and I hope that the expansion was ultimately successful despite all the griping. What I am saying is that this episode shows that the problem F2P seeks to solve is real, and the notion that F2P itself is creating the problem is naive; if games can be distributed for free, of course someone will work out a way to leverage that in order to build audience, and of course consumers will become accustomed to the idea that paying up front is a mug’s game.

If some audiences are tiring of F2P’s present approach, that doesn’t actually remove the problem; it simply means that we need new solutions, better ways to make money from free games. Talking to developers of applications and games aimed at kids reveals that while there’s a sense that parents are indeed becoming very wary of F2P – both negative media coverage and strong anti-F2P word of mouth among parents seem to be major contributing factors – they have not, as some commentators suggest, responded by wanting to buy premium software. Instead, they want free games without any in-app purchases; they don’t buy premium games and either avoid or complain bitterly about in-app purchases. Is this reasonable? Again, it barely matters; in a business sense, what matters is figuring out how to make money from this audience, not questioning their philosophy of value.

Free has changed everything, yet that’s not to argue with the continued importance of premium software either. I agree with SuperData’s van Dreunen that there’s a growing cleavage between premium and free markets, although I suspect that the audience itself overlaps significantly. I don’t think, however, that purchasers of premium games are buying quite the same thing they once were. Free has changed this as well; the emergence and rapid rise of “free” as the default price point has meant that choosing to pay for software is an action that exists in the context of abundant free alternatives.

On a practical level, those who buy games are paying for content; in reality, though, that’s not why they choose to pay. There are lots of psychological reasons why people buy media (often it’s to do with self-image and self-presentation to peers), and now there’s a new one; by buying a game, I’m consciously choosing to pay for the privilege of not being subjected to free software monetisation techniques. If I pay $5 for a game, a big part of the motivation for that transaction is the knowledge that I’ll get to enjoy it without F2P mechanisms popping up. Thus, even the absence of F2P has changed the market.

This is the paradigm that developers at all levels of the industry need to come to terms with. Charging people for content is an easy model to understand, but it’s a mistaken one; people don’t really buy access to content. People buy all sorts of other things that are wrapped up, psychologically, in a content purchase, but are remarkably resistant to simply buying content itself.

“I think there’s a bright future for charging premium prices for games – even on platforms where Free otherwise dominates, although it will always be niche there”

There’s so much of it out there for free – sure, only some through legitimate means, but again, this barely matters. The act of purchase is a complex net of emotions, from convenience (I could pirate this but buying it is easier) and perceived risk (what if I get caught pirating? What if it’s got a virus?), through to self-identity (I buy this because this is the kind of game people like me play) and broadcast identity (I buy this because I want people to know I play this kind of game), through to peer group membership (I buy this because it’s in my friends’ Steam libraries and I want to fit in) or community loyalty (I buy this because I’m involved with a community around the developer and wish to support it); and yes, avoidance of free-game monetisation strategies is a new arrow in that quiver. Again, actually accessing content is low on the list, if it’s even there at all, because even if that specific content isn’t available for free somewhere (which it probably is), there’s so much other free content out there that anyone could be entertained endlessly without spending a cent.

In this context, I think there’s a bright future for charging premium prices for games – even on platforms where Free otherwise dominates, although it will always be niche there – but to harness this, developers should try to understand what actually motivates people to buy and recognise the disconnect between what the developer sees as value (“this took me ages to make, that’s why it’s got a price tag on it”) and what the consumer actually values – which could be anything from the above list, or a host of other things, but almost certainly won’t be the developer’s sweat and tears.

That might be tough to accept; but like the inexorable rise of free games and the continuing development of better ways to monetise them, it’s a commercial reality that defies amateur philosophising. You may not like the audience’s attitude to the value of content and unwillingness to pay for things you consider to be valuable – but between a developer that accepts reality and finds a way to make money from the audience they actually have, and the developer who instead ploughs ahead complaining bitterly about the lack of the ideal, grateful audience they dream of, I know which is going to be able to pay the bills at the end of the month.

Courtesy-GI.biz

Amazon’s Zocalo Goes Mobile

November 24, 2014 by Michael  
Filed under Around The Net

Amazon Web Services (AWS) has announced two much-needed boosts to its fledgling Zocalo productivity platform, making the service mobile and allowing for file capacities of up to 5TB.

The service, which is designed to do for AWS what Drive does for Google and what Office 365 does for Microsoft’s software rental business, has gained mobile apps for the first time as Zocalo appears on the Google Play store and Apple App Store.

Amazon also mentions availability on the Kindle store, but we’re not sure about that bit. We assume it means the Amazon App Store for Fire tablet users.

The AWS blog says that the apps allow the user to “work offline, make comments, and securely share documents while you are in the air or on the go.”

A second announcement brings Zocalo into line with the AWS S3 storage on which it is built. Users will receive an update to their Zocalo sync client which will enable file capacities up to 5TB, the same maximum allowed by the Amazon S3 cloud.

To facilitate this, multi-part uploads will allow users to resume an upload from where it left off after a break, deliberate or accidental.
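Zocalo’s own sync client handles this behind the scenes, but since the service is built on S3, the underlying mechanism is easy to picture. Below is a minimal sketch of a multipart upload using the boto3 Python SDK; the bucket and file names are our own placeholders, and this illustrates S3’s general mechanism rather than anything Zocalo-specific.

```python
import boto3

s3 = boto3.client("s3")
bucket, key = "example-bucket", "big-file.bin"  # hypothetical names

# Start a multipart upload; S3 hands back an UploadId that identifies it.
upload_id = s3.create_multipart_upload(Bucket=bucket, Key=key)["UploadId"]

parts = []
part_size = 512 * 1024 * 1024  # 512MB; 10,000 such parts reach the 5TB cap
with open("big-file.bin", "rb") as f:
    part_number = 1
    while True:
        chunk = f.read(part_size)
        if not chunk:
            break
        # Each part uploads independently, so an interrupted transfer can
        # resume by resending only the parts that never completed.
        resp = s3.upload_part(Bucket=bucket, Key=key, UploadId=upload_id,
                              PartNumber=part_number, Body=chunk)
        parts.append({"PartNumber": part_number, "ETag": resp["ETag"]})
        part_number += 1

# Stitch the parts back together into a single S3 object.
s3.complete_multipart_upload(Bucket=bucket, Key=key, UploadId=upload_id,
                             MultipartUpload={"Parts": parts})
```

Because every part carries its own number and ETag, a client that loses its connection can ask which parts arrived and resend only the gaps, which is what makes 5TB transfers practical over unreliable links.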

Zocalo was launched in July as the fight for enterprise storage productivity hots up. The service can be trialled for 30 days free of charge, offering 200GB each for up to 50 users.

Rival services from companies including the aforementioned Microsoft and Google, as well as Dropbox and Box, coupled with aggressive price cuts across the sector, have led to burgeoning wars for the hearts and minds of IT managers as Microsoft’s Office monopoly begins to wane.

Courtesy-TheInq

Should AMD And nVidia Get The Blame For Assassin’s Creed’s PC Issues?

November 19, 2014 by Michael  
Filed under Gaming

Ubisoft is claiming that its latest Assassin’s Creed game ran so badly because of AMD and Nvidia hardware configurations. Last week Ubisoft was panned for releasing a game that was clearly not ready, and it originally blamed AMD for the faults. Now the company has amended its original forum post to acknowledge problems on Nvidia hardware as well.

Originally the post read “We are aware that the graphics performance of Assassin’s Creed Unity on PC may be adversely affected by certain AMD CPU and GPU configurations. This should not affect the vast majority of PC players, but rest assured that AMD and Ubisoft are continuing to work together closely to resolve the issue, and will provide more information as soon as it is available.”

However, there is no equivalent Nvidia-centric post on the main forum, and no mention of the problems facing owners of any Nvidia card that is not a GTX 970 or 980. What is amazing is that, with the problems so widespread, Ubisoft did not see them in its own testing before sending the game out to the shops. Unless it only played the game on an Nvidia GTX 970 and did not bother to test it on a console, it is inconceivable that it could have missed them.

Courtesy-Fud

Amazon Goes With Intel Xeon Inside

November 18, 2014 by Michael  
Filed under Computing

Amazon has become the latest vendor to commission a customized Xeon chip from Intel to meet its exact compute requirements, in this case powering new high-performance C4 virtual machine instances on the AWS cloud computing platform.

Amazon announced at the firm’s AWS re:Invent conference in Las Vegas that the latest generation of compute-optimized Amazon Elastic Compute Cloud (EC2) virtual machine instances offer up to 36 virtual CPUs and 60GB of memory.

“These instances are designed to deliver the highest level of processor performance on EC2. If you’ve got the workload, we’ve got the instance,” said AWS chief evangelist Jeff Barr, detailing the new instances on the AWS blog.

The instances are powered by a custom version of Intel’s latest Xeon E5 v3 processor family, identified by Amazon as the Xeon E5-2666 v3. This runs at a base speed of 2.9GHz, and can achieve clock speeds as high as 3.5GHz with Turbo boost.
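For anyone itching to try one, launching an instance is a single API call. Here is a minimal sketch using the boto3 Python SDK; we are assuming the top-end 36-vCPU size is exposed as c4.8xlarge, and the AMI ID is a placeholder you would swap for a real image in your region.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch the biggest C4 size; c4.8xlarge is assumed to map to the
# 36 vCPU / 60GB configuration described above. The ImageId below is
# a placeholder, not a real AMI.
response = ec2.run_instances(
    ImageId="ami-00000000",
    InstanceType="c4.8xlarge",
    MinCount=1,
    MaxCount=1,
)

instance = response["Instances"][0]
print("Launched", instance["InstanceId"], "-", instance["State"]["Name"])
```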

Amazon is not the first company to commission a customized processor from Intel. Earlier this year, Oracle unveiled new Sun Server X4-4 and Sun Server X4-8 systems with a custom Xeon E7 v2 processor.

That processor is capable of dynamically switching core count, clock frequency and power consumption without the need for a system-level reboot, in order to deliver an elastic compute capability that adapts to the demands of the workload.

However, these are just the vendors that have gone public; Intel claims it is delivering over 35 customized versions of the Intel Xeon E5 v3 processor family to various customers.

This is an area the chipmaker seems to be keen on pursuing, especially with companies like cloud service providers that purchase a great many chips.

“We’re really excited to be working with Amazon. Amazon’s platform is the landing zone for a lot of new software development and it’s really exciting to partner with those guys on a SKU that really meets their needs,” said Dave Hill, senior systems engineer in Intel’s Datacenter Group.

Also at AWS re:Invent, Amazon announced the Amazon EC2 Container Service, adding support for Docker on its cloud platform.

Currently available as a preview, the EC2 Container Service is designed to make it easy to run and manage distributed applications on AWS using containers.

Customers will be able to start, stop and manage thousands of containers in seconds, scaling from one container to hundreds of thousands across a managed cluster of Amazon EC2 instances, the firm said.
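Amazon has not published final details of the preview, but conceptually the service revolves around registering a task definition (which container image to run and what resources it needs) and then asking a cluster to run it. A rough sketch of that flow with the boto3 Python SDK might look like the following; the family, image and cluster names are our own examples.

```python
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

# A task definition describes the Docker container(s) to run: the image,
# plus the CPU shares and memory each container needs.
ecs.register_task_definition(
    family="hello-world",  # hypothetical task family name
    containerDefinitions=[{
        "name": "web",
        "image": "nginx:latest",  # any image from a Docker registry
        "cpu": 256,
        "memory": 128,
        "essential": True,
    }],
)

# Ask the scheduler to place one copy of the task on the cluster's EC2
# instances; scaling out is a matter of raising 'count'.
ecs.run_task(cluster="default", taskDefinition="hello-world", count=1)
```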

Courtesy-TheInq

Mozilla Goes Oculus Rift

November 14, 2014 by Michael  
Filed under Around The Net

Mozilla is continuing its 10th birthday celebrations with the launch of a virtual reality (VR) website.

MozVR will be a portal to sites compatible with the Oculus Rift VR helmet, accessible by a VR-enabled version of the Firefox browser.

The site is designed to act as a sharing platform for VR web experiences as well as a place where developers can get hold of resources to help create their own.

MozVR has been built to be a “native VR” site and navigating around from site to site is completely immersive, described by the developers as like being teleported from place to place.

All the tools needed to create VR websites are open source, as you would expect from Mozilla, and have been posted to GitHub, including the full source code, a collection of tools and a range of tutorials.

Mozilla has contributed its own experience to the site in the form of Talk Chat Show Thing, the world’s first VR talk show, presented from the roof of Mozilla’s offices in San Francisco.

MozVR will also render correctly in VR versions of Chromium, the open source version of Google Chrome, giving Mozilla a significant foothold in a burgeoning early-adopter market.

In March of this year, Facebook purchased Oculus Rift maker Oculus VR, which continues to be run as a separate subsidiary.

The move caused animosity among developers and early adopters, who felt that Facebook was an inappropriate home for a cutting-edge device that had originally been crowdfunded through Kickstarter.

Courtesy-TheInq

Will Big Huge Game Be Able To Make A Comeback?

October 30, 2014 by Michael  
Filed under Gaming

Brian Reynolds has bought the rights to Big Huge Games from the State of Rhode Island at auction, reopened the studio and teamed up with Nexon to deliver a new mobile title called DomiNations.

The game might be inspired by a lot of other games, but the basic idea is that you are the leader of a Stone Age tribe and you have to guide your tribe through civilization and human history. You can form alliances, trade with friends, and raid your enemies.

Reynolds has not said what is next for the new Big Huge Games, but if DomiNations is successful it could fund more complex projects for console or PC, according to our sources.

Courtesy-Fud

Does Samsung Fear A Processor War?

October 15, 2014 by Michael  
Filed under Computing

Samsung Electronics CEO Kwon Oh-hyun has said he is not worried about a price war in the semiconductor industry next year, even though the firm is rapidly expanding its production volume.

“We’ll have to wait and see how things will go next year, but there definitely will not be any game of chicken,” Kwon said, according to Reuters, suggesting the firm will not take chip rivals head on.

Samsung has reported strong profits for 2014 owing to better-than-expected demand for PC and server chips. Analysts have also forecast similar results for the coming year, so things are definitely looking good for the company.

It emerged last week that Samsung will fork out almost $15bn on a new chip facility in South Korea, representing the firm’s biggest investment in a single plant.

Samsung hopes the investment will bolster profits in its already well-established and successful semiconductor business, and help to maintain its lead in memory chips and grow beyond the declining sales of its smartphones.

According to sources, Samsung expects its chip production capacity to increase by a “low double-digit percentage” once the facility begins production, which sits oddly with the CEO’s claim that the firm is not looking for a price war.

Last month, Samsung was found guilty of involvement in a price fixing racket with a bunch of other chip makers stretching back over a decade, and was fined €138m by European regulators.

An antitrust investigation into chips used in mobile device SIM cards found that Infineon, Philips and Samsung colluded to artificially manipulate the price of SIM card chips.

Courtesy-TheInq

Will The Chip Industry Take A Fall?

October 14, 2014 by Michael  
Filed under Computing

Microchip Technology has managed to scare Wall Street by warning of an industry downturn. This follows rumours that a number of US semiconductor makers with global operations are seeing demand for chips fall in regions ranging from Asia to Europe.

Microchip Chief Executive Steve Sanghi warned that the correction will spread more broadly across the industry in the near future. Microchip expects to report sales of $546.2 million for its fiscal second quarter ending in September. The company had earlier forecast revenue in a range of $560 million to $575.9 million. Semiconductor companies’ shares are volatile at the best of times and news like this is the sort of thing that investors do not want to hear.

Trading in Intel, which is due to report third-quarter results tomorrow, was 2.6 times the usual volume. Micron, which makes dynamic random access memory, or DRAM, was the third-most traded name in the options market. All this suggests that the market is a bit spooked, and whether it goes into a nosedive will depend largely on what Chipzilla tells the world tomorrow.

Courtesy-Fud

Will Sony’s Morpheus Succeed?

September 30, 2014 by Michael  
Filed under Gaming

PS4 is going gangbusters, 3DS continues to impress, Steam and Kickstarter have between them overseen an extraordinary revitalisation of PC gaming, and mobile gaming goes from strength to strength; yet it’s absolutely clear where the eager eyes of most gamers are turned right now. Virtual reality headsets are, not for the first time, the single most exciting thing in interactive entertainment. At the Tokyo Game Show and its surrounding events, the strongest contrast to the huge number of mobile titles on display was the seemingly boundless enthusiasm for Sony’s Morpheus and Oculus’ Rift headsets; at Oculus’ own conference in California the same week, developers were entranced by the hardware and its promise.

VR is coming; this time, it’s for real. Decades of false starts, disappointments and dodgy Hollywood depictions will finally be left behind. The tech and the know-how have finally caught up with the dreams. Immersion and realism are almost within touching distance, a deep, involved experience that will fulfil the childhood wishes of just about every gamer and SF aficionado while also putting clear blue water between core games and more casual entertainment. The graphical fidelity of mobile devices may be rapidly catching up to consoles, but the sheer gulf between a VR experience and a mobile experience will be unmistakeable.

That’s the promise, anyway. There’s no question that it’s a promise which feels closer to fulfilment than ever before. Even in the absence of a final consumer product or even a release date, let alone a killer app, the prototypes and demos we’ve seen thus far are closer to “true” virtual reality than many of us had dared to hope. Some concerns remain; how mainstream can a product that relies on strapping on a headset to the exclusion of the real world actually become? (I wouldn’t care to guess on this front, but would note that we already use technology in countless ways that would have seemed alien, anti-social or downright weird to people only a generation ago.) Won’t an appreciable portion of people get motion sickness? (Perhaps; only widespread adoption will show us how widespread this problem really is.) There’s plenty to ponder even as the technology marches inexorably closer.

One thing I found myself pondering around TGS and Oculus Connect was the slightly worrying divergence in the strategies of Sony and Oculus. A year or even six months ago, it felt like these companies, although rivals, were broadly marching in lock step. Morpheus and Rift felt like very similar devices – Rift was more “hobbyist” yet a little more technically impressive, while Morpheus was more clearly the product of an experienced consumer products company, but in essence they shared much of the same DNA.

Now, however, there’s a clear divergence in strategy, and it’s something of a concern. Shuhei Yoshida says that Morpheus is 85% complete (although anyone who has worked in product development knows that the last 10% can take a hell of a lot more than 10% of the effort to get right); Sony is seemingly feeling reasonably confident about its device and has worked out various cunning approaches to make it cost effective, from using mobile phone components through to repurposing PlayStation Move as a surprisingly effective VR control mechanism.

By contrast, Oculus Connect showed off a new prototype of Rift which is still clearly in a process of evolution. The new hardware is lighter and more comfortable – closer to being a final product, in short – but it’s also still adding new features and functionality to the basic unit. Oculus, unlike Sony, still doesn’t feel like a company that’s anywhere close to having a consumer product ready to launch. It’s still hunting for the “right” level of hardware capabilities and functionality to make VR really work.

I could be wrong; Oculus could be within a year of shipping something to consumers, but if so, they’ve got a damned funny way of showing it. Based on the tone of Oculus Connect, the firm’s hugely impressive technology is still in a process of evolution and development. It barely feels any closer to being a consumer product this year than it did last year, and its increasingly complex functionality implies a product which, when it finally arrives, will command a premium price point. This is still a tech company in a process of iteration, discovering the product they actually want to launch; for Luckey, Carmack and the rest of the dream team assembled at Oculus, their VR just isn’t good enough yet, even though it’s moving in the right direction fast.

Sony, by contrast, now feels like it’s about to try something disruptive. It’s seemingly pretty happy with where Morpheus stands as a VR device; now the challenge is getting the design and software right, and pushing the price down to a consumer friendly level by doing market-disruptive things like repurposing components from its (actually pretty impressive) smartphones. Again, it’s possible that the mood music from both companies is misleading, but right now it feels like Sony is going to launch a reasonably cost-effective VR headset while Oculus is still in the prototyping phase.

These are two very different strategic approaches to the market. The worrying thing is that they can’t both be right. If Oculus is correct and VR still needs a lot of fine-tuning, prototyping and figuring out before it’s ready for the market, then Sony is rushing in too quickly and risks seriously damaging the market potential of VR as a whole with an underwhelming product. This risk can’t be overstated; if Morpheus launches first and it makes everyone seasick, or is uncomfortable to use for more than a short period of time, or simply doesn’t impress people with its fidelity and immersion, then it could see VR being written off for another decade in spite of Oculus’ best efforts. The public are fickle and VR has cried wolf too many times already.

If, on the other hand, Sony is correct and “good enough” VR tech is pretty much ready to go, then that’s great for VR and for PS4, but potentially very worrying for Oculus, who risk their careful, evolutionary, prototype after prototype approach being upended by an unusually nimble and disruptive challenge from Sony. If this is the case (and I’ve heard little but good things about Morpheus, which suggests Sony’s gamble may indeed pay off) then the Facebook deal could be either a blessing or a curse. A blessing, if it allows Oculus to continue to work on evolving and developing VR tech, shielding them from the impact of losing first-mover advantage to Sony; a curse, if that failure to score a clear win in the first round spooks Facebook’s management and investors and causes them to pull the plug. That’s one that could go either way; given the quality of the innovative work Oculus is doing, even if Sony’s approach proves victorious, everyone should hope that the Oculus team gets an opportunity to keep plugging away.

It’s exciting and interesting to see Sony taking this kind of risk. These gambles don’t always pay off, of course – the company placed bets on 3D TV in the PS3 era which never came to fruition, for example – but that’s the nature of innovation and we should never criticise a company for attempting something truly interesting, innovative and even disruptive, as long as it passes the most basic of Devil’s Advocate tests. Sony has desperately needed a Devil’s Advocate in the past – Rolly, anyone? UMD? – but Morpheus is a clear pass, an interesting and exciting product with the potential to truly turn around the company’s fortunes.

I just hope that in the company’s enthusiasm, it understands the absolute importance of getting this right, not just being first. This is a quality Sony was famed for in the past; rather than trying to be first to market in new sectors, it would ensure that it had by far the best product when it launched. This is one of the things which Steve Jobs, a huge fan of Sony, copied from the company when he created the philosophies which still guide Apple (a company that rarely innovates first, but almost always leapfrogs the competition in quality and usability when it does adopt new technology and features). For an experience as intimate as VR – complete immersion in a headset, screens mere centimetres from your eyes – that’s a philosophy which must be followed. When these headsets reach the market, what will be most important isn’t who is first; it isn’t even who is cheapest. The consumer’s first experience must be excellent – nothing less will do. Oculus seems to get that. Sony, in its enthusiasm to disrupt, must not lose sight of the same goal.

Courtesy-GI.biz

Will Oculus Go Into The Mobile Space?

September 25, 2014 by Michael  
Filed under Gaming

We attended the first ever Oculus Connect conference. With the beats and chatter of a cocktail reception just next door, Max Cohen is being brutally honest about the company’s mobile-based virtual reality headset.

“I can spend ten minutes talking about the problems with this device. We’re not afraid of them,” the VP of mobile says with a smile.

“It overheats if you run it too long. It is 60 Hertz low persistence, which means some people will notice flicker. The graphical quality is obviously a lot less than the PC. Battery life is a concern. There’s no positional tracking.

“We could try to say this is the be-all end-all of VR. We’d be lying. That’s a bad thing. We would hurt where we can get to the be-all end-all of VR. Everyone, Samsung, Facebook, Oculus, we’re all aligned with making a damn good product that we put out in the market and then working on improving it. Really soon, maybe even sooner than you think, we’ll get to that amazing VR experience for everyone.”

“Samsung, Facebook, Oculus, we’re all aligned with making a damn good product”

Cohen’s talking about the Gear VR, the Samsung-backed headset that offers a more portable and accessible entry into the virtual reality world for developers and users alike. It’s John Carmack’s passion project at the company, and clearly it’s Cohen’s too.

“The first thing they did was to put me in the HD prototype with the Tuscany demo. I was floored, of course,” he remembers.

“Then I got to see the Valve room and then he showed me this mobile project. It was running on a Galaxy S4 at the time. It crashed a little bit. There were a lot of problems with it, but I just thought this was so amazing. I went back and was talking to a friend of mine who’s an entrepreneur. He said it’s rare that you have the opportunity to work on transformational hardware, and that’s really what this was.”

The story of the Gear VR is a simple one; Oculus went to the Korean company hoping to work with them on screens for the PC-based Rift and found Samsung had been working on a headset you could simply slide a Samsung Galaxy phone into to experience virtual reality. Now the companies are working together on both devices, with Samsung fielding calls from Carmack on a regular basis.

“It’s a collaboration. It’s not we tell them what to do or they tell us what to do,” Cohen continues. “We’re the software platform, so when you put that on, you’re in Oculus, but that wouldn’t be possible without maximizing the hardware. Carmack and our team works very closely with their engineering team. They make suggestions about UI as well. We’re working together to make the best possible experience. If it wasn’t collaborative, this thing just honestly wouldn’t function because this is really hard to do.”

The focus of Oculus Connect isn’t the media or sales or even recruitment, but developers: supporting them, showing them the technology, offering them advice on the new territory that is virtual reality. Cohen, like everyone else I speak to over the weekend, believes developers and their content are absolutely key to the success of the hardware.

“At the end of the day, we want to make the developers’ lives as easy as possible so they can make cool content.”

“Facebook invested in the platform. They didn’t buy it. What they did is they’re also committing money to make sure it’s successful on an ongoing basis”

That content will be supported by an app store, and Cohen wants it to be a place where developers can make a living, rather than just a showcase of free demos. Jason Holtman, former director of business development at Valve, is overseeing its creation.

“We’re going to launch initially with a free store, but maybe a month later, follow along with commerce,” says Cohen.

“At the end of the day, as great as doing the art for free and sharing that is, we will have a hundred times more content when people can actually monetize it. This is a business. There’s nothing wrong with that. People need to be able to feed themselves. Our job is to make the platform as friendly for developers as we can so that it’s painless. You don’t have to worry about a bunch of overhead.”

There’s a sense that the Facebook money, that headline-grabbing $2 billion, has given the team the luxury of time and the chance to recruit the people they need to make sure this time virtual reality lives up to its promises. Other than that, Facebook seems to be letting Oculus just get on with it.

“That’s the thing… a lot of people, with the Facebook acquisition, asked how that would impact us and the answer is it hasn’t, in terms of our culture, and Facebook’s actually supportive of the way Oculus is because we know that content makes or breaks a platform,” says Cohen.

“They invested in the platform. They didn’t buy it. What they did is they’re also committing money to make sure it’s successful on an ongoing basis. We could have continued to raise a lot of venture capital. It would have been very expensive to do it right. Now we have replaced our board of directors with Facebook, but that’s completely fine. They are helping us. They are accelerating our efforts.”

No one at Oculus is talking about release dates for consumer units yet, and Cohen is no different. It’s clear that he and the team are hungry for progress as he talks about skipping minor updates and making major advances. He talks about “awesome” ideas that he’s desperate to get to, and pushing the envelope, but what matters most is getting it right.

“I think everyone understands that with a little bit more magic, VR can be ubiquitous. Everyone needs it. I think a lot of people understand what we need to do to get there, but it takes hard work to actually solve those things. Oculus and Facebook have lined up the right team to do it, but I want us to actually have time to do that,” says Cohen.

“We’re not trying to sell millions now. We’re trying to get people and early adopters, tech enthusiasts and all that interested in it.”

Courtesy-GI.biz

Will AMD’s FreeSync Appear In Early 2015?

September 22, 2014 by Michael  
Filed under Computing

Last week in San Francisco we spent some time with Richard Huddy, AMD’s chief gaming scientist, to get a glimpse of what is going on in the world of AMD graphics. Of course we touched on Mantle, AMD’s future in graphics and FreeSync, the company’s alternative to Nvidia’s G-Sync.

Now, a week later, AMD is ready to announce that scaler manufacturers MStar, Novatek and Realtek are readying silicon for DisplayPort Adaptive-Sync and AMD’s Project FreeSync. They should be done by the end of the year, with monitors shipping in Q1 2015.

FreeSync will prevent frame tearing. The graphics card often pushes more (or fewer) frames than the monitor can draw, and this lack of synchronisation between render rate and refresh rate produces quite annoying tears across the frame.
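To make the mismatch concrete, here is a toy Python model of ours (not AMD’s) of a fixed 60Hz display scanning out while a GPU renders at a wobbly 45-90fps. With vsync off, any frame that completes mid-scanout means the screen shows the top of one frame and the bottom of another – a tear; an adaptive-sync display instead waits and starts each refresh when the frame is ready.

```python
import random

# Toy illustration of tearing on a fixed-refresh display (not a real
# driver model). The display scans out every 1/60s; the GPU finishes
# frames on its own schedule.
REFRESH = 1.0 / 60.0
SIM_LENGTH = 0.1  # simulate 100ms

random.seed(1)
frame_times, t = [], 0.0
while t < SIM_LENGTH:
    t += random.uniform(1 / 90, 1 / 45)  # GPU wobbles between 45 and 90fps
    frame_times.append(t)

scan_start = 0.0
while scan_start < SIM_LENGTH:
    scan_end = scan_start + REFRESH
    # With vsync off, a frame completed *during* scanout is displayed
    # immediately, splitting the screen between two frames: a tear.
    torn = any(scan_start < ft < scan_end for ft in frame_times)
    print(f"scanout {scan_start*1000:5.1f}-{scan_end*1000:5.1f} ms:",
          "TEAR" if torn else "clean")
    # An adaptive-sync display would instead begin this scanout only
    # when the next frame was ready, so no frame ever lands mid-scan.
    scan_start = scan_end
```

Run it and nearly every scanout reports a tear, which is exactly the artefact Adaptive-Sync removes by letting the monitor wait for the GPU.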

FreeSync will allow Radeon gamers to synchronise display refresh rates with GPU frame rates, enabling tear-free and stutter-free gaming along with low input latency. We still do not have the specs or names of the new monitors, but we can confirm that they will use robust DisplayPort receivers from MStar, Novatek and Realtek in 144Hz QHD 2560×1440 panels, and in UHD 3840×2160 panels at up to 60Hz.

It took Nvidia quite some time to get G-Sync monitors off the ground. We expect the first 4K G-Sync monitors to ship shortly, while QHD 2560×1440 ones have been available for a few months. Since these are gaming monitors with a 144Hz refresh rate they don’t come cheap, but they are nice to look at and deserve a high-end graphics card such as the GeForce GTX 980, or a few of them.

Radeon lovers will get FreeSync, but the monitors will take a bit more time: AMD promises Project FreeSync-ready monitors through a media review program in Q1 2015, and doesn’t actually tell us much about retail/etail availability.

Courtesy-Fud