Intel Gives A Boost To Its Exascale Computing Effort

February 20, 2015 by Michael  
Filed under Computing

Intel’s exascale computing efforts have received a boost with the extension of the company’s research collaboration with the Barcelona Supercomputing Center.

Begun in 2011 and now extended to September 2017, the Intel-BSC work is currently looking at scalability issues with parallel applications.

Karl Solchenbach, director of Intel’s Innovation Pathfinding Architecture Group in Europe, said it was important to improve the scalability of threaded applications on many-core nodes through the OmpSs programming model.

The collaboration has developed a methodology to measure these scalability effects separately. “An automatic tool not only provides a detailed analysis of performance inhibitors, but also allows a projection to a higher number of nodes,” said Solchenbach.
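To make that projection idea concrete, here is a toy sketch. This is not BSC’s methodology or tooling; the two-parameter model and the numbers are invented purely to illustrate how separately measured factors (a serial fraction, a communication penalty) can be combined and extrapolated to higher node counts:

```python
# Toy sketch of projecting parallel speedup from separately measured
# factors (illustrative only: BSC's Extrae/Paraver/Dimemas toolchain
# uses detailed trace-driven models, not this two-parameter formula).

def projected_speedup(nodes, serial_fraction=0.02, comm_overhead_per_node=0.001):
    """Amdahl-style projection degraded by a per-node communication
    penalty; both parameters are assumed example values that a real
    tool would extract from application traces."""
    parallel_fraction = 1.0 - serial_fraction
    ideal = 1.0 / (serial_fraction + parallel_fraction / nodes)
    comm_efficiency = 1.0 / (1.0 + comm_overhead_per_node * nodes)
    return ideal * comm_efficiency

for n in (64, 256, 1024, 4096):
    print(f"{n:5d} nodes -> projected speedup {projected_speedup(n):7.1f}")
```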

BSC has long built its own HPC tools, and has given Intel an instrumentation package (Extrae), a performance data browser (Paraver) and a simulator (Dimemas) to play with.

Charlie Wuischpard, VP and GM of High Performance Computing at Intel, said that the Barcelona work is pretty big in scale for Chipzilla.

“A major part of what we’re proposing going forward is work on many core architecture. Our roadmap is to continue to add more and more cores all the time.”

“Our Knights Landing product that is coming out will have 60 or more cores running at a slightly slower clock speed but give you vastly better performance,” he said.

Courtesy-Fud

Do “CORE” Gamers Exist?

February 10, 2015 by Michael  
Filed under Gaming

A year or two ago, it seemed that doom and gloom reigned over the prospects for “core” gaming. With smartphones and tablets becoming this decade’s ubiquitous gaming devices, casual and social games ascendant and free-to-play established as just about the only effective way to make money from the teeming masses swarming to gaming for the first time, dire predictions abounded about the death of game consoles, the decline of paid-for games and the dwindling importance of “core” gamers to the games industry at large.

This week’s headlines speak of a different narrative – one that’s become increasingly strong as we’ve delved into what 2015 has to offer. Sony’s financial figures look pretty good, buoyed partially by the weakness of the Yen but notably also by the incredible success of the PlayStation 4 – a console which more aggressive commentators were reading funeral rites for before it was even announced. Both of the PS4’s competitors, incidentally, ended 2014 (and began 2015) in a stronger sales position than they were in 12 months previously, with next-gen home consoles overall heading for the 40 million sales mark in pretty much record time.

Then there’s the software story of the week: the startling sales of Grand Theft Auto V, which, thanks to ten million sales of the PS4 and Xbox One versions of the game, have now topped 45 million units. That’s an incredible figure, one which suggests that this single game has generated well over $2 billion in revenue thus far; the GTA franchise as a whole must, at this point, be one of the most valuable entertainment franchises in existence, comparable in revenue terms to the likes of Star Wars or the Marvel Cinematic Universe.

Look, this is basically feel-good stuff for the games business; “hey guys, we’re doing great, our biggest franchise is right up there with Hollywood’s finest and these console sales are a promise of a solid future”. Stories like this used to turn up all the time back when games were genuinely struggling to be recognised as a valid and important industry alongside TV, music and film. Nowadays, that struggle has been internalised; it’s worth stepping back every now and then from the sheer enormity of figures like Apple and Samsung’s smartphone sales, or Puzzle & Dragons’ revenue (comparable to GTAV’s, but whether that means the game can birth a successful franchise or sustain itself long-term is another question entirely), or the number of players engaged with top F2P games, to remind ourselves that there’s still huge success happening in the “traditional” end of the market.

The take-away, perhaps, is that this isn’t a zero-sum game. The great success of casual and social games, first on Facebook and now on smartphones, isn’t that they’ve replaced core games, cannibalising the existing high-value market; it’s that they’ve acquired a whole new audience for themselves. Sure, there’s overlap, but there’s little evidence to suggest that this overlap results in people engaging less with core games; I, for one, have discovered that many smartphone F2P games have a core loop that fits nicely into the match-making and loading delays for Destiny’s Crucible.

That’s not to say that changes to the wider business haven’t resonated back through the “core” games space. The massive success of a game like GTAV has a dark side; it reflects the increasing polarisation of the high-end games market, in which successful games win bigger than ever, but games which fail to become enormous hits find themselves failing utterly. There’s no mid-market any more; you’re either a complete hit or a total miss. Developers have lamented the loss of the “AA” market (as distinct from the “AAA” space) for some time; that loss is becoming increasingly keenly felt as enormous budgets, production values and financial pressures come to bear on a smaller and smaller line-up of top-tier titles. Several factors drove the death of AA, with production costs and team sizes being major issues, but the rise of casual games and even of increasingly high-quality indie titles undoubtedly played a role – creating whole new market sectors that cost far less to consumers than AA titles had done.

It’s not just success that’s been polarised by this process; it’s also risk. At the high-end of the market, risk is simply unacceptable, such are the enormous financial figures at play. Thus it’s largely left to the low-end – the indie scene, the flood of titles appearing on the App Store, on Steam and even on the likes of PlayStation Vita – to take interesting risks and challenge gaming conventions. Along the way, some of the talented creators involved in these scenes are either trying to engage new audiences, or to engage existing audiences in new ways; sometimes experimenting with gameplay and interactivity, sometimes with narrative and art style, sometimes with business model or distribution.

All of which leads me to explain why I keep writing “core” games, with inverted commas around “core”; because honestly, I’m increasingly uncertain what this term means. It used to refer to specific genres, broadly speaking those considered to have special resonance for geeky guys; gory science fiction FPS games, high fantasy RPGs, complex beat-’em-ups and shoot-’em-ups, graphic survival horror titles, war-torn action games. Then, for a while, the rise of F2P seemed to make the definition of “core” shimmer and reform itself; now it meant “games people pay for up front, and the kind of people who pay for those games”.

Now? Now, who knows what “core” really means? League of Legends is certainly something you have to be pretty damn deeply involved with to enjoy, but it’s free-to-play; so is Hearthstone, which is arguably not quite so “core” but still demands a lot of attention and focus. There are great games on consoles – systems whose owners paid hundreds of dollars for a devoted gaming machine – which are free-to-play. There are games on mobile phones that cost money up front and are intricate and engrossing. There are games you can download for free on your PC, or pick up for a few dollars on Steam, that explore all sorts of interesting and complex niches of narrative, of human experience and of the far-flung corners of what it means to play a “game”. Someone who sits down for hours unravelling the strands of a text adventure written in Twine; are they “core”? Someone who treats retro gaming like a history project, travelling back through the medium’s tropes and concepts to find their origin points; are they “core”? How about Frank Underwood in House of Cards, largely uninterested in games but picking up a violent shooter to work out frustrations on his Xbox in the evenings; is he a “core gamer”?

Don’t get me wrong; this fuzzing of the lines around the concept of “core” is, to my mind, a vital step in the evolution of our medium. That the so-called “battle” between traditional business models and F2P, between AAA studios and indies, between casual and core, was not a zero-sum game and could result in the expansion of the entire industry, not the destruction of one side or another, has been obvious from the outset. What was less obvious and took a little more time to come to pass was that not only would each of those sides not detract from the others; they would actually learn from one another and help to fuel one another’s development. New creative outlooks, new approaches to interactivity, new thoughts on social and community aspects of gaming, new ideas about business models and monetisation; these all mingle with one another and help to make up for the creative drought at the top of the AAA industry (and increasingly, at the top of the F2P industry, too) by providing a steady feed of new concepts and ideas from below.

It’s fantastic and very positive that the next-gen consoles are doing well and that GTAV has sold so many copies (dark thoughts regarding the polarisation of AAA success aside); but it’s wrong, I think, to just look at this as being “hey, core gaming is doing fine”. Games aren’t made up of opposed factions, casual at war with core; it’s a spectrum, attracting relevant audiences from across the board. Rather than pitting GTAV against Puzzle & Dragons, I’d rather look at the enormous success of both games as being a sign of how well games are doing overall; rather than stacking sales of next-gen consoles against sales of smartphones and reheating old arguments about dedicated game devices vs multi-purpose devices, I’d rather think about the enormous addressable audience that represents overall. As the arguments about casual or F2P gaming “destroying” core games start to fade out, let’s take this opportunity to rid ourselves of some of our more meaningless distinctions and categories for good.

Courtesy-GI.biz


Will The Xbox One Go Virtual Reality This Year?

January 6, 2015 by Michael  
Filed under Gaming

While we can’t get a real handle on when Microsoft might reveal the VR headset it has had in development, we have learned from our sources that it is well into development and that select developers already have early prototypes.

It is hard to say when Microsoft might actually reveal the new VR headset and technology, but GDC or E3 would seem the likely events for an introduction. We do know that Microsoft is targeting 2015 to move the VR headset into mass production, and it is thought that we will see versions for both the Xbox One and PC, though we expect the PC version to come a little after the Xbox One version.

Rumor has it that the same development team that worked on the Surface tablet are the team that has taken on this project as well.

Courtesy-Fud

Microsoft Opens Up Halo

December 16, 2014 by Michael  
Filed under Gaming

Project Orleans, the cloud engine that powers Xbox hits Halo: Reach and Halo 4, is going open source.

The engine, which has also played a vital role in the development of Microsoft’s Azure cloud computing platform, will be released under an MIT licence next year by Microsoft Open Technologies, after being trailed at this year’s Microsoft Build conference.

This is the latest in a long line of open-source announcements by Microsoft this year as the company tries to reinvent itself for the age where its stranglehold on the market has reduced and a wide variety of non-proprietary alternatives exist.

At the same Build conference, the company also announced that it will open source the .NET framework, on which most Windows applications depend.

The project, as described by the team itself, is “an implementation of an improved actor model that borrows heavily from Erlang and distributed objects systems, adds static typing, message indirection and actor virtualisation, exposing them in an integrated programming model”.

The team added that, whereas Erlang is a pure functional language with its own custom virtual machine, the Orleans programming model “directly leverages .NET and its object-oriented capabilities”.
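Orleans itself is a C#/.NET framework, so the snippet below is not Orleans code; it is a minimal, hedged sketch in Python of the actor idea described above, in which each actor owns its state and drains a mailbox one message at a time, so no locks are needed around that state:

```python
import asyncio

# Minimal actor-model sketch (illustrative only: Orleans "grains" are
# C#/.NET objects with virtualised activations, not Python coroutines).

class CounterActor:
    def __init__(self):
        self.count = 0                  # state owned by this actor alone
        self.mailbox = asyncio.Queue()  # incoming messages

    async def run(self):
        while True:
            message, reply = await self.mailbox.get()
            if message == "increment":  # messages are handled serially
                self.count += 1
                reply.set_result(self.count)

    async def send(self, message):
        reply = asyncio.get_running_loop().create_future()
        await self.mailbox.put((message, reply))
        return await reply              # wait for the actor's answer

async def main():
    actor = CounterActor()
    asyncio.create_task(actor.run())
    for _ in range(3):
        print(await actor.send("increment"))  # prints 1, 2, 3

asyncio.run(main())
```

Orleans layers the static typing, message indirection and actor virtualisation mentioned in the quote above on top of this basic pattern.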

One example available to try is an analysis of Twitter sentiment, gauging reaction to a given hashtag based on the language around it and creating visual representations of the mood of the web.

The code will be available as an extension to Visual Studio 2012 or 2013, with samples and supporting documentation already available, including for the Azure implementations. Non-Azure users can grab a free trial version before they buy.

Courtesy-TheInq

Was Sony’s Playstation Network Hacked Again?

November 25, 2014 by Michael  
Filed under Gaming

Sony has denied the claims of DerpTrolling, a hacker group which claimed it had raided the databases of the PSN, along with a number of other online services.

The group had published a list of emails and passwords for PSN, Windows Live Mail and 2K Games accounts online, and claimed to be prepared to release more, but Sony says the credentials came from sources other than a hack of its network.

“We have investigated the claims that our network was breached and have found no evidence that there was any intrusion into our network,” the company wrote in a declaration to Joystiq. “Unfortunately, Internet fraud including phishing and password matching are realities that consumers and online networks face on a regular basis. We take these reports very seriously and will continue to monitor our network closely.”

Courtesy-GI.biz

Amazon’s Zocalo Goes Mobile

November 24, 2014 by Michael  
Filed under Around The Net

Amazon Web Services (AWS) has announced two much-needed boosts to its fledgling Zocalo productivity platform, making the service mobile and allowing for file capacities of up to 5TB.

The service, which is designed to do for AWS what Drive does for Google and what Office 365 does for Microsoft’s software rental business, has gained mobile apps for the first time as Zocalo appears on the Google Play store and Apple App Store.

Amazon also mentions availability on the Kindle store, but we’re not sure about that bit. We assume it means the Amazon App Store for Fire tablet users.

The AWS blog says that the apps allow the user to “work offline, make comments, and securely share documents while you are in the air or on the go.”

A second announcement brings Zocalo into line with the AWS S3 storage on which it is built. Users will receive an update to their Zocalo sync client which will enable file capacities up to 5TB, the same maximum allowed by the Amazon S3 cloud.

To facilitate this, multipart uploads will allow users to resume an upload from where it left off after a break, deliberate or accidental.
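Zocalo’s sync client handles the mechanics internally, but since the service is built on S3, the underlying mechanism is presumably S3’s own multipart upload API. A minimal sketch using boto3, where the bucket, key and part size are invented for illustration:

```python
import boto3

# Sketch of an S3 multipart upload, the mechanism that makes resumable,
# multi-terabyte uploads possible (bucket/key/part size are assumptions).
s3 = boto3.client("s3")
bucket, key = "example-bucket", "big-file.bin"
part_size = 64 * 1024 * 1024  # parts must be at least 5MB (except the last)

upload = s3.create_multipart_upload(Bucket=bucket, Key=key)
parts = []
with open("big-file.bin", "rb") as f:
    part_number = 1
    while True:
        chunk = f.read(part_size)
        if not chunk:
            break
        # Each part is independent: after an interruption, list_parts()
        # reports which parts already succeeded, so a client can resume
        # from the first missing part instead of starting over.
        resp = s3.upload_part(Bucket=bucket, Key=key, PartNumber=part_number,
                              UploadId=upload["UploadId"], Body=chunk)
        parts.append({"PartNumber": part_number, "ETag": resp["ETag"]})
        part_number += 1

s3.complete_multipart_upload(Bucket=bucket, Key=key,
                             UploadId=upload["UploadId"],
                             MultipartUpload={"Parts": parts})
```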

Zocalo was launched in July as the fight for enterprise storage productivity hots up. The service can be trialled for 30 days free of charge, offering 200GB each for up to 50 users.

Rival services from companies including the aforementioned Microsoft and Google, as well as Dropbox and Box, coupled with aggressive price cuts across the sector, have led to burgeoning wars for the hearts and minds of IT managers as Microsoft’s Office monopoly begins to wane.

Courtesy-TheInq

Amazon Goes With Intel Xeon Inside

November 18, 2014 by Michael  
Filed under Computing

Amazon has become the latest vendor to commission a customized Xeon chip from Intel to meet its exact compute requirements, in this case powering new high-performance C4 virtual machine instances on the AWS cloud computing platform.

Amazon announced at the firm’s AWS re:Invent conference in Las Vegas that the latest generation of compute-optimized Amazon Elastic Compute Cloud (EC2) virtual machine instances offer up to 36 virtual CPUs and 60GB of memory.

“These instances are designed to deliver the highest level of processor performance on EC2. If you’ve got the workload, we’ve got the instance,” said AWS chief evangelist Jeff Barr, detailing the new instances on the AWS blog.

The instances are powered by a custom version of Intel’s latest Xeon E5 v3 processor family, identified by Amazon as the Xeon E5-2666 v3. This runs at a base speed of 2.9GHz, and can achieve clock speeds as high as 3.5GHz with Turbo Boost.

Amazon is not the first company to commission a customized processor from Intel. Earlier this year, Oracle unveiled new Sun Server X4-4 and Sun Server X4-8 systems with a custom Xeon E7 v2 processor.

The processor is capable of dynamically switching core count, clock frequency and power consumption without the need for a system level reboot, in order to deliver an elastic compute capability that adapts to the demands of the workload.

However, these are just the vendors that have gone public; Intel claims it is delivering over 35 customized versions of the Intel Xeon E5 v3 processor family to various customers.

This is an area the chipmaker seems to be keen on pursuing, especially with companies like cloud service providers that purchase a great many chips.

“We’re really excited to be working with Amazon. Amazon’s platform is the landing zone for a lot of new software development and it’s really exciting to partner with those guys on a SKU that really meets their needs,” said Dave Hill, senior systems engineer in Intel’s Datacenter Group.

Also at AWS re:Invent, Amazon announced the Amazon EC2 Container Service, adding support for Docker on its cloud platform.

Currently available as a preview, the EC2 Container Service is designed to make it easy to run and manage distributed applications on AWS using containers.

Customers will be able to start, stop and manage thousands of containers in seconds, scaling from one container to hundreds of thousands across a managed cluster of Amazon EC2 instances, the firm said.
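The preview-era API may well have differed, but as a rough sketch of the register-then-run workflow the firm describes, using current boto3 call names (the cluster name, task family and image are invented):

```python
import boto3

# Rough sketch of the EC2 Container Service workflow: register a task
# definition for a Docker image, then run N copies across a cluster.
# (Cluster/family/image names are invented; the 2014 preview API may
# have differed from today's boto3 interface used here.)
ecs = boto3.client("ecs")

ecs.register_task_definition(
    family="web-demo",
    containerDefinitions=[{
        "name": "web",
        "image": "nginx:latest",  # any Docker image
        "cpu": 256,
        "memory": 512,
    }],
)

# Start ten containers; the scheduler places them on the cluster's
# EC2 instances, and the same call pattern scales to many more.
ecs.run_task(cluster="demo-cluster", taskDefinition="web-demo", count=10)
```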

Courtesy-TheInq

Mozilla Goes Oculus Rift

November 14, 2014 by Michael  
Filed under Around The Net

Mozilla is continuing its 10th birthday celebrations with the launch of a virtual reality (VR) website.

MozVR will be a portal to sites compatible with the Oculus Rift VR helmet, accessible by a VR-enabled version of the Firefox browser.

The site is designed to act as a sharing platform for VR web experiences as well as a place where developers can get hold of resources to help create their own.

MozVR has been built as a “native VR” site, and navigating from site to site is completely immersive, described by the developers as like being teleported from place to place.

All the tools to create VR websites are open source, as you would expect from Mozilla, and have been posted to GitHub, including the full source code, a collection of tools and a range of tutorials.

Mozilla has contributed its own experience to the site in the form of Talk Chat Show Thing, the world’s first VR talk show, presented from the roof of Mozilla’s offices in San Francisco.

MozVR will also render correctly in VR versions of Chromium, the open source version of Google Chrome, giving Mozilla a significant foothold in a burgeoning early-adopter market.

In March of this year, Facebook purchased Oculus Rift maker Oculus VR, which continues to be run as a separate subsidiary.

The move caused animosity among developers and early adopters, who felt that Facebook was an inappropriate home for the cutting-edge device, which had originally been crowdfunded through Kickstarter.

Courtesy-TheInq

Will Big Huge Games Be Able To Make A Comeback?

October 30, 2014 by Michael  
Filed under Gaming

Brian Reynolds has bought the rights to Big Huge Games from the State of Rhode Island at auction, reopened the studio and teamed up with Nexon to deliver a new mobile title called DomiNations.

The game might be inspired by a lot of games, but the basic idea is that you are the leader of a Stone Age tribe and you have to guide your tribe through civilisation and human history. You can form alliances, trade with friends, and raid your enemies.

Reynolds has not said what is next for the new Big Huge Games, but if DomiNations is successful, it could fund more complex projects for console or PC according to our sources.

Courtesy-Fud

Does Samsung Fear A Processor War?

October 15, 2014 by Michael  
Filed under Computing

Samsung Electronics CEO Kwon Oh-hyun has said he is not worried about a price war in the semiconductor industry next year, even though the firm is rapidly expanding its production volume.

“We’ll have to wait and see how things will go next year, but there definitely will not be any game of chicken,” said Kwon, according to Reuters, suggesting the firm will not take chip rivals head on.

Samsung has reported strong profits for 2014 owing to better-than-expected demand for PCs and server chips. Analysts have also forecast similar results for the coming year, so things are definitely looking good for the company.

It emerged last week that Samsung will fork out almost $15bn on a new chip facility in South Korea, representing the firm’s biggest investment in a single plant.

Samsung hopes the investment will bolster profits in its already well-established and successful semiconductor business, and help to maintain its lead in memory chips and grow beyond the declining sales of its smartphones.

According to sources, Samsung expects its chip production capacity to increase by a “low double-digit percentage” after the facility begins production, which sits somewhat at odds with the CEO’s claim that it is not looking for a price war.

Last month, Samsung was found guilty of involvement in a price-fixing racket with a bunch of other chip makers stretching back over a decade, and was fined €138m by European regulators.

An antitrust investigation into chips used in mobile device SIM cards found that Infineon, Philips and Samsung colluded to artificially manipulate the price of SIM card chips.

Courtesy-TheInq

Is Master Chief Returning To Halo 5?

October 8, 2014 by Michael  
Filed under Gaming

In a recent interview with OXM, Mike Colter, who plays Agent Locke in Halo: Nightfall, claimed that his character is the primary character people will be playing in Halo 5. That is not to say that Master Chief will not have a significant role in Halo 5 as well.

Part of the campaign will apparently be Locke’s search for Master Chief. We still don’t know whether Locke is friend or foe, so it seems clear that the relationship between the two will be a big part of the story in Halo 5, according to our sources.

It is hard to say how accurate this all is, but we do know that we don’t have much longer to wait: the Nightfall series starts airing on the Halo Channel on November 11th.

Courtesy-Fud

Will Sony’s Morpheus Succeed?

September 30, 2014 by Michael  
Filed under Gaming

PS4 is going gangbusters, 3DS continues to impress, Steam and Kickstarter have between them overseen an extraordinary revitalisation of PC gaming, and mobile gaming goes from strength to strength; yet it’s absolutely clear where the eager eyes of most gamers are turned right now. Virtual reality headsets are, not for the first time, the single most exciting thing in interactive entertainment. At the Tokyo Game Show and its surrounding events, the strongest contrast to the huge number of mobile titles on display was the seemingly boundless enthusiasm for Sony’s Morpheus and Oculus’ Rift headsets; at Oculus’ own conference in California the same week, developers were entranced by the hardware and its promise.

VR is coming; this time, it’s for real. Decades of false starts, disappointments and dodgy Hollywood depictions will finally be left behind. The tech and the know-how have finally caught up with the dreams. Immersion and realism are almost within touching distance, a deep, involved experience that will fulfil the childhood wishes of just about every gamer and SF aficionado while also putting clear blue water between core games and more casual entertainment. The graphical fidelity of mobile devices may be rapidly catching up to consoles, but the sheer gulf between a VR experience and a mobile experience will be unmistakeable.

That’s the promise, anyway. There’s no question that it’s a promise which feels closer to fulfilment than ever before. Even in the absence of a final consumer product or even a release date, let alone a killer app, the prototypes and demos we’ve seen thus far are closer to “true” virtual reality than many of us had dared to hope. Some concerns remain; how mainstream can a product that relies on strapping on a headset to the exclusion of the real world actually become? (I wouldn’t care to guess on this front, but would note that we already use technology in countless ways that would have seemed alien, anti-social or downright weird to people only a generation ago.) Won’t an appreciable portion of people get motion sickness? (Perhaps; only widespread adoption will show us how widespread this problem really is.) There’s plenty to ponder even as the technology marches inexorably closer.

One thing I found myself pondering around TGS and Oculus Connect was the slightly worrying divergence in the strategies of Sony and Oculus. A year or even six months ago, it felt like these companies, although rivals, were broadly marching in lock step. Morpheus and Rift felt like very similar devices – Rift was more “hobbyist” yet a little more technically impressive, while Morpheus was more clearly the product of an experienced consumer products company, but in essence they shared much of the same DNA.

Now, however, there’s a clear divergence in strategy, and it’s something of a concern. Shuhei Yoshida says that Morpheus is 85% complete (although anyone who has worked in product development knows that the last 15% can take a hell of a lot more than 15% of the effort to get right); Sony is seemingly feeling reasonably confident about its device and has worked out various cunning approaches to make it cost effective, from using mobile phone components through to repurposing PlayStation Move as a surprisingly effective VR control mechanism.

By contrast, Oculus Connect showed off a new prototype of Rift which is still clearly in a process of evolution. The new hardware is lighter and more comfortable – closer to being a final product, in short – but it’s also still adding new features and functionality to the basic unit. Oculus, unlike Sony, still doesn’t feel like a company that’s anywhere close to having a consumer product ready to launch. It’s still hunting for the “right” level of hardware capabilities and functionality to make VR really work.

I could be wrong; Oculus could be within a year of shipping something to consumers, but if so, they’ve got a damned funny way of showing it. Based on the tone of Oculus Connect, the firm’s hugely impressive technology is still in a process of evolution and development. It barely feels any closer to being a consumer product this year than it did last year, and its increasingly complex functionality implies a product which, when it finally arrives, will command a premium price point. This is still a tech company in a process of iteration, discovering the product they actually want to launch; for Luckey, Carmack and the rest of the dream team assembled at Oculus, their VR just isn’t good enough yet, even though it’s moving in the right direction fast.

Sony, by contrast, now feels like it’s about to try something disruptive. It’s seemingly pretty happy with where Morpheus stands as a VR device; now the challenge is getting the design and software right, and pushing the price down to a consumer friendly level by doing market-disruptive things like repurposing components from its (actually pretty impressive) smartphones. Again, it’s possible that the mood music from both companies is misleading, but right now it feels like Sony is going to launch a reasonably cost-effective VR headset while Oculus is still in the prototyping phase.

These are two very different strategic approaches to the market. The worrying thing is that they can’t both be right. If Oculus is correct and VR still needs a lot of fine-tuning, prototyping and figuring out before it’s ready for the market, then Sony is rushing in too quickly and risks seriously damaging the market potential of VR as a whole with an underwhelming product. This risk can’t be overstated; if Morpheus launches first and it makes everyone seasick, or is uncomfortable to use for more than a short period of time, or simply doesn’t impress people with its fidelity and immersion, then it could see VR being written off for another decade in spite of Oculus’ best efforts. The public are fickle and VR has cried wolf too many times already.

If, on the other hand, Sony is correct and “good enough” VR tech is pretty much ready to go, then that’s great for VR and for PS4, but potentially very worrying for Oculus, who risk their careful, evolutionary, prototype after prototype approach being upended by an unusually nimble and disruptive challenge from Sony. If this is the case (and I’ve heard little but good things about Morpheus, which suggests Sony’s gamble may indeed pay off) then the Facebook deal could be either a blessing or a curse. A blessing, if it allows Oculus to continue to work on evolving and developing VR tech, shielding them from the impact of losing first-mover advantage to Sony; a curse, if that failure to score a clear win in the first round spooks Facebook’s management and investors and causes them to pull the plug. That’s one that could go either way; given the quality of the innovative work Oculus is doing, even if Sony’s approach proves victorious, everyone should hope that the Oculus team gets an opportunity to keep plugging away.

It’s exciting and interesting to see Sony taking this kind of risk. These gambles don’t always pay off, of course – the company placed bets on 3D TV in the PS3 era which never came to fruition, for example – but that’s the nature of innovation and we should never criticise a company for attempting something truly interesting, innovative and even disruptive, as long as it passes the most basic of Devil’s Advocate tests. Sony has desperately needed a Devil’s Advocate in the past – Rolly, anyone? UMD? – but Morpheus is a clear pass, an interesting and exciting product with the potential to truly turn around the company’s fortunes.

I just hope that in the company’s enthusiasm, it understands the absolute importance of getting this right, not just being first. This is a quality Sony was famed for in the past; rather than trying to be first to market in new sectors, it would ensure that it had by far the best product when it launched. This is one of the things which Steve Jobs, a huge fan of Sony, copied from the company when he created the philosophies which still guide Apple (a company that rarely innovates first, but almost always leapfrogs the competition in quality and usability when it does adopt new technology and features). For an experience as intimate as VR – complete immersion in a headset, screens mere centimetres from your eyes – that’s a philosophy which must be followed. When these headsets reach the market, what will be most important isn’t who is first; it isn’t even who is cheapest. The consumer’s first experience must be excellent – nothing less will do. Oculus seems to get that. Sony, in its enthusiasm to disrupt, must not lose sight of the same goal.


Courtesy-GI.biz

Will Oculus Go Into The Mobile Space?

September 25, 2014 by Michael  
Filed under Gaming

At the first ever Oculus Connect conference, with the beats and chatter of a cocktail reception just next door, Max Cohen is being brutally honest about the company’s mobile-based virtual reality headset.

“I can spend ten minutes talking about the problems with this device. We’re not afraid of them,” the VP of mobile says with a smile.

“It overheats if you run it too long. It is 60 Hertz low persistence, which means some people will notice flicker. The graphical quality is obviously a lot less than the PC. Battery life is a concern. There’s no positional tracking.

“We could try to say this is the be-all end-all of VR. We’d be lying. That’s a bad thing. We would hurt where we can get to the be-all end-all of VR. Everyone, Samsung, Facebook, Oculus, we’re all aligned with making a damn good product that we put out in the market and then working on improving it. Really soon, maybe even sooner than you think, we’ll get to that amazing VR experience for everyone.”

Cohen’s talking about the Gear VR, the Samsung-backed headset that offers a more portable and accessible entry into the virtual reality world for developers and users alike. It’s John Carmack’s passion project at the company and clearly it’s Cohen’s too.

“The first thing they did was to put me in the HD prototype with the Tuscany demo. I was floored, of course,” he remembers.

“Then I got to see the Valve room and then he showed me this mobile project. It was running on a Galaxy S4 at the time. It crashed a little bit. There were a lot of problems with it, but I just thought this was so amazing. I went back and was talking to a friend of mine who’s an entrepreneur. He said it’s rare that you have the opportunity to work on transformational hardware, and that’s really what this was.”

The story of the Gear VR is a simple one; Oculus went to the Korean company hoping to work with them on screens for the PC-based Rift and found Samsung had been working on a headset you could simply slide a Samsung Galaxy phone into to experience virtual reality. Now the companies are working together on both devices, with Samsung fielding calls from Carmack on a regular basis.

“It’s a collaboration. It’s not we tell them what to do or they tell us what to do,” Cohen continues. “We’re the software platform, so when you put that on, you’re in Oculus, but that wouldn’t be possible without maximizing the hardware. Carmack and our team works very closely with their engineering team. They make suggestions about UI as well. We’re working together to make the best possible experience. If it wasn’t collaborative, this thing just honestly wouldn’t function because this is really hard to do.”

The focus of Oculus Connect isn’t the media or sales or even recruitment, but developers. Supporting them, showing them the technology, offering them advice on the new territory that is virtual reality. Cohen, like everyone else I speak to at the weekend, believes developers and their content are absolutely key to the success of the hardware.

“At the end of the day, we want to make the developers’ lives as easy as possible so they can make cool content.”

That content will be supported by an app store, and Cohen wants it to be a place where developers can make a living, rather than just a showcase of free demos. Jason Holtman, former director of business development at Valve, is overseeing its creation.

“We’re going to launch initially with a free store, but maybe a month later, follow along with commerce,” says Cohen.

“At the end of the day, as great as doing the art for free and sharing that is, we will have a hundred times more content when people can actually monetize it. This is a business. There’s nothing wrong with that. People need to be able to feed themselves. Our job is to make the platform as friendly for developers as we can so that it’s painless. You don’t have to worry about a bunch of overhead.”

There’s a sense that the Facebook money, that headline-grabbing $2 billion, has given the team the luxury of time and the chance to recruit the people they need to make sure this time virtual reality lives up to its promises. Other than that, Facebook seems to be letting Oculus just get on with it.

“That’s the thing… a lot of people, with the Facebook acquisition, asked how that would impact us and the answer is it hasn’t, in terms of our culture, and Facebook’s actually supportive of the way Oculus is because we know that content makes or breaks a platform,” says Cohen.

“They invested in the platform. They didn’t buy it. What they did is they’re also committing money to make sure it’s successful on an ongoing basis. We could have continued to raise a lot of venture capital. It would have been very expensive to do it right. Now we have replaced our board of directors with Facebook, but that’s completely fine. They are helping us. They are accelerating our efforts.”

No one at Oculus is talking about release dates for consumer units yet, and Cohen is no different. It’s clear that he and the team are hungry for progress as he talks about skipping minor updates and making major advances. He talks about “awesome” ideas that he’s desperate to get to, and pushing the envelope, but what matters most is getting it right.

“I think everyone understands that with a little bit more magic, VR can be ubiquitous. Everyone needs it. I think a lot of people understand what we need to do to get there, but it takes hard work to actually solve those things. Oculus and Facebook have lined up the right team to do it, but I want us to actually have time to do that,” says Cohen.

“We’re not trying to sell millions now. We’re trying to get people and early adopters, tech enthusiasts and all that interested in it.”

Courtesy-GI.biz

Intel Sampling Xeon D 14nm

September 15, 2014 by Michael  
Filed under Computing

Intel has announced that it is sampling its Xeon D 14nm processor family, a system on chip (SoC) optimized to deliver Intel Xeon processor performance for hyperscale workloads.

Announcing the news on stage during a keynote at IDF in San Francisco, Intel SVP and GM of the Data Centre Group, Diane Bryant, said that the Intel Xeon processor D, which was initially announced in June, will be based on 14nm process technology and be aimed at mid-range communications.

“We’re pleased to announce that we’re sampling the third generation of the high density [data center system on a chip] product line, but this one is actually based on the Xeon processor, called Xeon D,” Bryant announced. “It’s 14nm and the power levels go down to as low as 15 Watts, so very high density and high performance.”

Intel believes that its Xeon D will serve the needs of high-density, optimized servers as that market develops; in networking it will serve mid-range routers and other network appliances, and it will also serve entry-level and mid-range storage. So, Intel claimed, you will get all of the benefits of Xeon-class reliability and performance, but in a very small footprint with the high integration of an SoC.

This first-generation Xeon D chip will also showcase high levels of I/O integration, including 10Gb Ethernet, and will scale Intel Xeon processor performance, features and reliability to lower power design points, according to Intel.

The Intel Xeon processor D product family will also include data centre processor features such as error correcting code (ECC).

“With high levels of I/O integration and energy efficiency, we expect the Intel Xeon processor D product family to deliver very competitive TCO to our customers,” Bryant said. “The Intel Xeon processor D product family will also be targeted toward hyperscale storage for cloud and mid-range communications market.”

Bryant said that the product is not yet available, but it is being sampled, and the firm will release more details later this year.

This announcement comes just days after Intel launched its Xeon E5 v3 processor family for servers and workstations.

Courtesy-TheInq

Vendors Testing New Intel Xeon Processors

September 3, 2014 by Michael  
Filed under Computing

Intel is cooking up a hot batch of Xeon processors for servers and workstations, and system vendors have already designed systems that are ready and raring to go as soon as the chips become available.

Boston is one of the companies doing just that, and we know this because it gave us an exclusive peek into its labs to show off what these upgraded systems will look like. While we can’t share any details about the new chips involved yet, we can preview the systems they will appear in, which are awaiting shipment as soon as Intel gives the nod.

Based on chassis designs from Supermicro, with which Boston has a close relationship, the systems comprise custom-built solutions for specific user requirements.

On the workstation side, Boston is readying a mid-range and a high-end system with the new Intel Xeon chips, both based on the two-socket Xeon E5-2600v3 rather than the single-socket E5-1600v3 versions.

First up is the mid-range Venom 2301-12T, which comes in a mid-tower chassis and ships with an Nvidia Quadro K4000 card for graphics acceleration. It comes with 64GB of memory and a 240GB SSD as a boot device, plus two 1TB Sata drives configured as a Raid array for data storage.

For extra performance, Boston has also prepared the Venom 2401-12T, which will ship with faster Xeon processors, 128GB of memory and an Nvidia Quadro K6000 graphics card. This also has a 240GB SSD as a boot drive, with two 2TB drives configured as a Raid array for data storage.

Interestingly, Intel’s new Xeon E5-2600v3 processors are designed to work with 2133MHz DDR4 memory instead of the more usual DDR3 RAM; DDR4 DIMM modules have slightly longer connectors towards the middle.

For servers, Boston has prepared a 1U rack-mount “pizza box” system, the Boston Value 360p. This is a two-socket server with twin 10Gbps Ethernet ports, support for 64GB of memory and 12Gbps SAS Raid. It can also be configured with NVM Express (NVMe) SSDs connected to the PCI Express bus rather than a standard drive interface.

Boston also previewed a multi-node rack server, the Quattro 12128-6, which is made up of four separate two-socket servers inside a 2U chassis. Each node has up to 64GB of memory, with 12Gbps SAS Raid storage plus a pair of 400GB SSDs.

Courtesy-TheInq