Last week an alleged leaked slide purporting to show AMD’s desktop roadmap appeared, and it was quickly picked up by most tech sites. Not us, of course, since we knew it was rubbish; we’ve got a couple of inboxes littered with similar fakes.
The slide indicated that AMD was about to ditch big-core FX processors, something that has been rumoured for a couple of years. This is not the case. AMD will not pull the plug on FX products in 2015.
AMD Manager of APU/CPU Product Reviews James Prior told Gamers Nexus that the slide was fake and that FX parts aren’t going anywhere. The actual AMD roadmap doesn’t even cover 2015. Prior said it was “rare” to see roadmaps that go more than a year into the future.
That is odd, because we got three such roadmaps over the weekend. One of them is an AMD ARM consumer roadmap for 2014-2016. Sounds legit. Perhaps we should publish it just to see how many clickbait-loving news editors would fall for it.
Anyway, you can submit your fake roadmaps any day of the week, including Sunday. We accept death threats only on weekdays, 9AM to 6PM. Nick Farrell’s astral initiation rituals are available every weekend. Bring your own chicken (BYOC).
With the release of Grand Theft Auto Online, Rockstar has taken its blockbuster franchise in an ambitious new direction. The multiplayer world, complete with in-game economy, certainly has many of the hallmarks of a Free-2-Play title, but could GTA Online actually make it as a standalone F2P game?
Given the seismic shift the games industry has already made towards F2P, no one would be surprised if Rockstar made this next step. However, there is a lot at stake, and creating a successful F2P game isn’t simply a case of throwing in some in-app purchases and giving a £40 game away for free.
F2P is already established as the dominant business model for mobile and PC games. Reasons for this include the prevalence of micro-transactions and the fact that these platforms make it relatively easy for publishers and developers to integrate analytics and use that data to make informed real-time game design changes that keep players engaged and increase retention. The transition onto console has been a slower burn – designing successful F2P games requires an understanding and skill set which isn’t necessarily native to publishers with a long heritage in designing games to ship in a box.
As a result, many F2P console games have come up short, offering a poor tutorial and onboarding process, plus a monetisation structure that is much closer to a used-car salesman than an enjoyable experience that puts control in the users’ hands. However, the data capabilities of the Xbox One and PS4 mean that F2P on console finally looks set to take off, with an impressive list of F2P titles already set for release, including Little Big Planet, Planetside 2 and War Thunder.
To better understand the potential of console transition we thought we’d take a theoretical look at GTA Online as a standalone F2P title.
Our in-house design team applied GamesAnalytics’ proprietary evidence-based research methodology to benchmark key aspects of its game design against best-practice F2P game design from over 80 titles.
Focusing on six main categories, including Monetisation, Retention, Engagement and Virality, and analysing 50 key criteria, the team found, unsurprisingly, that GTA Online surpassed the best-in-genre score for Retention, Game Mechanics, Engagement and Game Overview, clearly reflecting the high quality of the game. However, if GTA Online were going F2P it would need to look at its mechanics around Monetisation and Virality.
Based on these data findings, here are five recommendations to improve the F2P potential of GTA Online:
1. Improve the currency structure
Currently GTA Online has a single currency. This is fine when the game is not relying on that currency as part of the monetisation, but a true F2P game would want to extend it to provide greater flexibility. Adding a premium currency is generally the way of giving games more flexibility in delivering the F2P mechanic. Making the currency a part of the world so it feels natural is vital in making sure the monetisation doesn’t jar with the surrounding game.
There are a number of ways that people are encouraged to spend money both in the real and the virtual world. Especially for a game like GTA, it is vital that it feels natural and intuitive. Discounts and bundles are obvious incentives for getting people to invest in in-game economies, but rental and test drives are also a good way of letting players get a taste for the high life and incentivising them to keep grinding or splash the cash.
These ‘try before you buy’ mechanics are good ways of easing players onto the paying path while keeping the barrier low and the incentive high.
Giving players the ability to buy luxury vanity items using a premium currency is exactly the way you would expect Rockstar to monetise its players. The game has always been about getting rich quick and showing off the proceeds of your crimes. This is not about honest hard slog, so it’s fitting that players should be given a quick route to the high life through whatever means are at their disposal. A successful free-to-play GTA Online should also include consumables: things that the player will spend money on that give them a short-term advantage or simply let them show off.
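To make the dual-currency idea concrete, here is a minimal sketch – our illustration, not anything Rockstar has announced – of a wallet that keeps the grind currency and a hypothetical premium currency separate, so items can be priced in either:

```python
from dataclasses import dataclass

@dataclass
class Wallet:
    cash: int = 0   # grind currency, earned through missions and jobs
    gold: int = 0   # hypothetical premium currency, bought with real money

    def earn_cash(self, amount: int) -> None:
        self.cash += amount

    def buy_gold(self, amount: int) -> None:
        # A real store would route this through the platform's IAP flow.
        self.gold += amount

    def purchase(self, price: int, use_gold: bool = False) -> bool:
        """Spend either currency on an item; return True if the purchase succeeds."""
        if use_gold:
            if self.gold < price:
                return False
            self.gold -= price
        else:
            if self.cash < price:
                return False
            self.cash -= price
        return True

# Example: grind for an item, or skip the grind with premium currency.
wallet = Wallet()
wallet.earn_cash(5_000)
wallet.buy_gold(100)
print(wallet.purchase(80, use_gold=True))   # True -- the quick route to the high life
print(wallet.purchase(1_000_000))           # False -- keep grinding (or keep spending)
```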
2. Introduce a VIP structure to fast track progress and reward members
There is no game that is more about being king of the hill than GTA, so a full VIP structure is essential. Imagine the retention value of being the only player that can drive around the hills of Los Santos in a purple Ferrari with gold trim.
VIP membership could offer:
Rank Point/Job Point boosts
Monthly $/Gold allowance
Access to premium clothes, vehicle paint jobs and vanity items
Special members store accessible through the iFruit with daily/weekly member offers
3. Utilise no-lose gambling
We’ve already touched on the repetition which exists within GTA Online – completing mission after mission to build up your cash and accessory stockpiles. One alternative to a life of hard graft and long hours is gambling, an easy-to-implement F2P mechanic which fits with Rockstar’s vision and GTA’s ‘feel’. Mechanics such as magic boxes offer players a no-lose gamble: spending some money guarantees something cool. There can be no better way of taking the easy route than making sure the odds are stacked in your favour.
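As a rough sketch of how a magic box guarantees a prize on every pull, here is an illustrative weighted draw; the item names and drop weights are invented for the example:

```python
import random

# Hypothetical prize table of (item, weight); every entry has some value,
# so a pull can never "lose" -- only the rarity of the reward varies.
PRIZES = [
    ("custom paint job", 50),
    ("rare vehicle", 30),
    ("gold-trimmed supercar", 15),
    ("penthouse apartment", 5),
]

def open_magic_box() -> str:
    """Weighted random draw from the prize table: a guaranteed win."""
    items, weights = zip(*PRIZES)
    return random.choices(items, weights=weights, k=1)[0]

print(open_magic_box())
```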
4. Introduce a trading mechanism to help increase community aspects
If gambling isn’t your thing then a bit of business on the side can help you make it to the top. Trading in F2P games inevitably encourages a black market, but unlike other F2P games where there is a clear split between grind currency and premium currency, GTA Online F2P should allow this secondary market to exist.
Letting players trade whatever they want will encourage a free-form economy that will favour the adventurous, the ruthless and the downright corrupt. The mechanic will drive the economy and build player loyalty.
Players will buy and sell from each other, and with rare items it is also possible to use data analytics to monitor price elasticity as players bid for them. Items can trade for 100x their original value in F2P games, and that data can be useful for defining pricing as well as for delivering value and incentivising players.
5. Build in reward mechanics for better social sharing
GTA is such a well-known franchise that it pretty much sells itself. However, giving players rewards for inviting other players to join is a well-established mechanism and can help to double your player base for little or no cost.
Giving players an incentive to invite is key; there would be nothing better than being able to pimp your friends by taking a cut of the money they spend, as your just reward for getting them into the game in the first place.
Take-Two Interactive Software has repurchased all of the Icahn Group’s stock, a deal worth $203.5 million and involving 12.02 million shares.
“This share repurchase reflects our confidence in the Company’s outlook for record results in fiscal 2014 and continued Non-GAAP profitability every year for the foreseeable future,” said Take-Two CEO Strauss Zelnick.
“With our ample cash and strong expected cash flow, we are able to pursue a variety of investment opportunities, including repurchasing our Company’s stock. On behalf of our board and management team, I would like to thank Brett, James and Sung for their support, dedication and service to our organisation. They leave Take-Two better positioned than ever for continued success.”
The move was funded by cash and cash equivalents on hand and Take-Two explained the move is “part of an ongoing strategy to buy back its shares.”
Take-Two and Icahn gave no reason for the sale of the shares, but as previously agreed, Icahn’s Brett Icahn, Jim Nelson and SungHwan Cho have resigned from the Take-Two board.
The Icahn Group is overseen by activist investor Carl Icahn, whom Forbes this year named one of its 40 highest-earning hedge fund managers. In the past he has tried to acquire Dell and Marvel Comics, and he owns a ten percent stake in Netflix.
[UPDATE]: Investors did not greet the news warmly, as Take-Two shares traded at twice their average volume and ended the trading day down 5.49 percent to $16.
Intel says it wants to expand its contract foundry work, even if that means it will be the first to manufacture ARM’s 64-bit chips. Intel CEO Brian Krzanich said he would expand the company’s small contract manufacturing business, paving the way for more chipmakers to tap into the world’s most advanced process technology.
Krzanich told analysts that he planned to step up the company’s foundry work, effectively giving Intel’s process technology to its rivals. He said that companies who can use Intel’s leading edge to build computing capabilities that are better than anyone else’s are good candidates for the foundry service. Krzanich added that the slumping personal computer industry, Intel’s core market, was showing signs of bottoming out.
Intel also unveiled two upcoming mobile chips from its Atom line, designed so that features can be interchanged to create different versions of the component. A high-end version of the new chip, code-named Broxton, is due out in mid-2015. SoFIA, a low-end chip, was shown as an example of Intel’s pragmatism and willingness to change how it does business. Krzanich said that in the interest of speed, SoFIA would be manufactured outside Intel, with the goal of bringing it to market next year.
Intel will eventually move production of SoFIA chips to its own 14-nanometer manufacturing lines, Krzanich added.
Researchers have made a quantum leap in the search for ultra-fast computing.
Scientists at Simon Fraser University managed to keep information in a quantum memory state for 39 minutes at room temperature, smashing the previous world record.
Previous attempts yielded results of under 30 seconds at room temperature and just under three minutes in cryogenic conditions.
The global race to harness the power of qubits has high stakes – the ability to create computers capable of calculating many times faster. Qubits are able to exist simultaneously in a superposition of '0' and '1'.
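In the standard notation, a qubit’s state is a weighted combination of both basis values at once, with the weights’ squared magnitudes summing to one:

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^{2} + |\beta|^{2} = 1
```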
This experiment involved a new type of silicon that could, scientists believe, be the secret to creating long-term memory in quantum systems.
Speaking to Sky News, co-author of the paper Stephanie Simmons of Oxford University said, “Thirty-nine minutes may not seem very long but as it only takes one-hundred-thousandth of a second to flip the nuclear spin of a phosphorus ion – the type of operation used to run quantum calculations – in theory over two million operations could be applied in the time it takes for the superposition to naturally decay by one percent.”
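The back-of-the-envelope arithmetic behind that claim checks out: one percent of 39 minutes, divided by roughly ten microseconds per spin flip, gives a little over two million operations.

```latex
0.01 \times 39\ \text{min} \approx 23.4\ \text{s},
\qquad
\frac{23.4\ \text{s}}{10^{-5}\ \text{s per flip}} \approx 2.3 \times 10^{6}\ \text{operations}
```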
The next stage will be to find a way to manipulate the qubits to talk to each other in a meaningful way so that information can be passed between them during their short, glorious lives.
Although there is a significant amount of research to come before quantum computing provides an effective alternative to traditional methods, this has been a huge leap forward for the concept, and it’s widely expected that eventually the next leap will be the leap home.
Cash-strapped chipmaker AMD has had to arrange a short-term borrowing facility to help slow its financial decline. The outfit has raised a half-billion-dollar line of credit from a group of lenders, with Bank of America acting as agent.
All this is happening as AMD struggles with the economic downturn and a decline in PC sales. The outfit is good for the cash, though. After all, it is focusing on game consoles and the associated royalties; in its fiscal third-quarter earnings, that business unit’s revenue increased 110 percent on the previous quarter and 96 percent year-over-year.
AMD said the proceeds of the five-year secured revolving line of credit, which runs to November 2018, may be used for general corporate purposes, such as working capital needs.
The PS4 and Xbox One are about to go on sale and both consoles are powered by custom AMD silicon. Analysts are expecting strong sales and AMD is bound to ship millions of Jaguar-based custom parts for Sony’s and Redmond’s latest consoles.
As a result, AMD is gaining market share in the x86 space. This is hardly surprising given the sheer volume of next-gen consoles that will be produced over the next few quarters, although AMD still lacks competitive x86 parts in the mid-range and high-end segments.
Mercury Research principal analyst Dean McCarron told IDG that millions of new consoles will sell in the coming weeks, boosting AMD’s numbers in the process. Intel, on the other hand, still relies on shipments of PC and server parts, so the PC slump is taking its toll.
McCarron argues that AMD’s long-term goal is to get outside the PC market, and AMD is already seeing growth thanks to custom chips in the non-PC space. Meanwhile, Intel is hoping to seize more tablet market share with Bay Trail parts. Neither AMD nor Intel has a significant smartphone presence at this point, although Intel is slowly getting there.
Intel ended Q3 with an 80.2 percent share of the overall x86 market, down from 83.3 percent a year ago, while AMD climbed to 19.3 percent from 16.1 percent. However, in the PC-only space Intel actually gained share, while AMD’s share dropped from 16.1 to 15.8 percent.
AMD is unlikely to score big design wins for custom chips in the short run, but with emerging technologies like HSA, its upcoming APU-based server parts and their custom derivatives could become a bit more interesting.
Intel has acquired educational software developer Kno to add to its Education division.
Writing on the company blog, Intel Sales and Marketing Group VP John Galvin explained that in a world where kids are being bombarded by technology, Intel Education’s mission is to support the rollout of technology in the classroom.
Galvin said, “The Kno platform provides administrators and teachers with the tools they need to easily assign, manage and monitor their digital learning content and assessments.”
This acquisition brings Intel’s global digital content library to over 225,000 [higher education] and K-12 titles from 74 education publishers. “We’re looking forward to combining our expertise with Kno’s rich content so that together, we can help teachers create classroom environments and personalized learning experiences that lead to student success,” Galvin added.
Over the past decade, Intel Education has worked with more than 10 million teachers, helping them to integrate technology into education.
In the UK alone there have been tremendous strides in educational software over the past 30 years, dating back to the government’s pledge to provide a computer in every school, which led to the creation of the BBC Microcomputer, designed specifically for that purpose.
Today, not only is ICT a dedicated lesson in its own right, but it forms one of the key skills that educators are expected to incorporate into all lesson plans, putting it on a par with English and Maths and showing just how far we’ve come from making Venn diagrams with ASCII art.
As evidenced by the 40-foot console constructed in a Vancouver parking lot recently, Microsoft expects Xbox One to be big. Microsoft Canada’s Xbox director of marketing Craig Flannagan put the November 22 launch into perspective.
“I’ve been here for the launch of Xbox 360. I was here for the launch of Kinect. This is far and away the biggest launch we’ve ever done,” Flannagan said. “It’s the most hardware we’ve ever produced. It’s the most we’ve ever pre-sold. We’re preselling a little over 2-to-1 from what we did with Xbox 360. The momentum on launch has been really good. And we didn’t have a 40-foot console at the launch of the 360, either.”
As for how Xbox One will fare against the PlayStation 4 and Wii U, Flannagan pointed to Xbox Live and the company’s focus on social integration as two differentiating factors that will give it the edge. He also said he was proud of the game lineup, saying Xbox One exclusives walked out of E3 with twice the awards of both competitors.
“Xbox One is going to start ahead, in terms of the experience we can deliver,” Flannagan said. “And because we’re built for the future, we’re going to stay ahead. I think there is not a better experience you can buy this holiday, and there will not be a time this generation where there’s a better experience you can buy than Xbox One…And it’s probably going to be a pretty long generation. We’re probably here for a while because we’re built for the future. This is a console that will last you, conservatively a decade, if I had to put a bet down today.”
The idea of a launch Xbox One lasting a decade brings to mind the Red Ring of Death and Microsoft’s notoriously unreliable Xbox 360 launch hardware. When asked if he’s heard consumers expressing concerns about the Xbox One’s durability, Flannagan said, “Not really.”
“We feel great about where the hardware is at right now,” Flannagan said. “Our yields are good. It’s allowing us to produce more consoles than we ever have for a launch. We feel great about how the hardware is performing.”
While Flannagan expects the hardware purchased this month to keep running years into the future, he doesn’t expect it to offer the same experience. Just as the Xbox 360 went through multiple dashboards and overhauled feature sets over the course of the last eight years, so too will the Xbox One evolve.
“Much like 360, Xbox One’s not going to look a whole lot five years from now like it does on November 22, 2013. I don’t know where it’s going to go, but that’s kind of fun because we’re built for the future. We do have a connection; we can change what things look like and how it performs.”
According to a new report from research firm Strategy Analytics, global shipments of mobile SoCs in the second quarter of 2013 were 44 percent higher than in the same period last year.
Qualcomm still dominates the market, with a 53-percent revenue share. Apple ranks second at 15 percent, while MediaTek got the bronze with an 11-percent share. Samsung came in fourth, trailed by Spreadtrum, a fabless Chinese chipmaker that specializes in TD-SCDMA 3G-enabled parts.
So how did it end up so high in the rankings? Well, Spreadtrum is the third biggest player in China, a market traditionally dominated by MediaTek. While Qualcomm and MediaTek started to focus on mid-range parts for the local market, Spreadtrum decided to churn out cheaper, low-end parts.
“Strategy Analytics estimates that low-cost suppliers MediaTek and Spreadtrum together captured over one-third volume share in the smartphone applications processor market in Q2 2013, thanks to the smartphone boom in emerging markets,” said Sravan Kundojjala, Senior Analyst, Strategy Analytics. “MediaTek and Spreadtrum’s improving global footprint coupled with their maturing product portfolio could spell a threat to global players such as Qualcomm, Broadcom, NVIDIA and Intel.”
Qualcomm still has a virtual monopoly in the LTE market and it is expected to further strengthen its position as Snapdragon 800 parts end up in more designs. However, it is losing share in China.
Although Samsung and Apple rank relatively high, they don’t exactly have a habit of selling their chips to competing handset makers, and even if they did, they don’t exactly make cheap chips, so the China market is practically up for grabs.
Japanese gamers are playing a slightly different version of Rockstar’s Grand Theft Auto V (GTA 5) than players in the US and Europe.
Gaming website Kotaku has posted a video that compares the two versions, and we can see that most of the differences relate to Trevor, who is perhaps the most colorful of all the GTA characters.
In the game Trevor performs various acts that can be viewed as unsavory. In one he tortures a chap, while in another he drops his pants.
In the case of the latter, someone, probably the waggish Japanese censor, has installed a second pair of trousers on the Trevor character. This means that when he flashes his genitals in the UK he only shows off a clean pair of pants in Japan. It doesn’t make much sense, but nor does most of Trevor’s behavior.
A bigger difference is seen in the Japanese telling of the torture scene, a part of the game that some people found stretched things too far anyway. In the version seen in the US and the UK Trevor is tasked with selecting an implement with which he must rearrange the bone structure of a man from whom he needs information. In the Japanese version he doesn’t, and those gameplay parts are just skipped.
A couple of sex scenes have also been excised from the game. One of these features Trevor, who is also watching TV at the same time, while another happens during one of the paparazzi missions.
The Windows 8.1 launch didn’t get much attention, which probably has something to do with the fact that it’s basically Windows 8 done right. However, users of AMD APUs could have a good reason to celebrate.
According to AMD’s senior marketing manager Clarice Simmons, Windows 8.1 is a lot better than Windows 8 when it comes to harnessing the potential of AMD silicon. Writing in her blog, Simmons said the new OS could deliver performance gains of up to 9.5 percent on some PCs based on AMD APUs.
However, her numbers are for the A10-6800K, and the 9.5 percent gain only applies when comparing against a machine with an outdated video driver. With the same driver on both operating systems, the difference is actually 3.5 percent, which still isn’t bad but isn’t nearly as good as 9.5 percent.
“Our work with Microsoft includes development on the essential operating system ‘plumbing’ that enables Windows to directly leverage AMD technology in order to run more efficiently. The two companies also cooperate on the development and tuning of the latest AMD video drivers,” wrote Simmons.
“Of course AMD’s fast CPU and GPU cores contribute to high performance, but having software that is optimized to take advantage of the AMD hardware architecture is a significant advantage. Tuning our device drivers to simultaneously suit AMD hardware, software applications, and Windows 8.1 makes systems more streamlined.”
Simmons also pointed out that AMD Wireless Display works better on Windows 8.1, due to better architectural implementation and support for Miracast, better ecosystem support and new solutions that enable the OS to tap low latency display encode paths available in Radeons.
Michael Dailly is head of development at YoYo Games in Dundee. He is also the chap who gave the world Lemmings and GTA.
He has been posting updates to his GTA project on Twitter and has shared some imagery. It doesn’t look much like the Grand Theft Auto we have seen in GTA 5, but it’s still cool.
The original Grand Theft Auto was released in 1997 on the PSOne, Windows PC and Nintendo Game Boy Color. It was followed two years later by a London version that added the Sega Dreamcast to its hardware list.
Dailly said that it has some gaps, but explained that he does not have the rights to the maps and can only work from what he has. What he has runs in HTML5 and WebGL at 60 frames per second, according to the USgamer website. This should mean that, while you won’t be able to play it, you will be able to use it through your web browser for virtual tours.
@gnysek I can’t really give it out I’m afraid….I don’t own the assets, and the extraction tool isn’t complete. I had to do bits by hand
— Michael Dailly (@mdf200) October 16, 2013
In tweeted messages he said that the original CMP files are imported into GameMaker: Studio – that’s YoYo Games’ software – and stored as a “2D Grid with 1D arrays in it (so 3D map)”.
“It’s the original .CMP map format the game uses. I just import that and convert to a 3D model on load,” he added. “Taking stock, looks like all I’m missing from the rendering of the #GTA level now, is slopes. You can see the large holes in the map. Shouldn’t take too long to fix, but probably only when I get back from holiday.”
He has gone on holiday now, so will not be doing any more work on his project in the short term.
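For readers curious about the structure Dailly describes, here is a rough illustration – not his code, and not the real .CMP layout – of how a 2D grid of 1D columns amounts to a 3D tile map that can be converted into geometry on load:

```python
# Illustrative only: a 2D grid whose cells are 1D columns of block IDs,
# which together make a 3D tile map. Names and sizes are invented;
# the real .CMP format and Dailly's code will differ.
EMPTY, ROAD, BUILDING = 0, 1, 2

MAP_W, MAP_H, MAP_DEPTH = 256, 256, 6   # city footprint and column height

def make_city_grid():
    """grid[y][x] is a 1D list of blocks for that column, from ground level up."""
    return [[[EMPTY] * MAP_DEPTH for _ in range(MAP_W)] for _ in range(MAP_H)]

def blocks_to_cubes(grid):
    """Walk every column and emit one cube per solid block --
    a crude stand-in for 'convert to a 3D model on load'."""
    cubes = []
    for y, row in enumerate(grid):
        for x, column in enumerate(row):
            for z, block in enumerate(column):
                if block != EMPTY:
                    cubes.append((x, y, z, block))
    return cubes

city = make_city_grid()
city[10][20][0] = ROAD                   # a ground-level road tile at (20, 10)
city[10][21][0:3] = [BUILDING] * 3       # a three-block-tall building next door
print(len(blocks_to_cubes(city)))        # 4 cubes to render
```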
Not too long ago, Sony and Microsoft laid bare the engines of their eighth-generation consoles. CPU clock speeds and DDR3 RAM numbers were bandied about, GHz were brought to bear, teraflops flaunted salaciously. When the dust eventually settled and the media guns lay relatively silent once more, a fairly predictable treaty was agreed upon: to all but the most technically minded of consumers, there’s little to choose between the raw grunt of the two machines. The company behind each machine has its priorities, its foibles and its USPs, and we’re discounting Kinect and the first-party exclusives, but in the on-paper battle of boxes, you can expect much of a muchness.
Except Microsoft’s John Bruno is now telling me that those numbers aren’t the full story, and it might not be unfeasible that they’ll one day become almost completely irrelevant. Microsoft, owner of one of the largest and most powerful arrays of computational servers in the known universe, is putting it to use on Xbox Live.
Now, we’ve all heard promises about cloud processing and non-local computation. For a while, it seemed like it might be the future. Then it seemed like perhaps it might not. The public, burned by an experience which promised so much and delivered so little, returned to thinking of the cloud purely as a handy place to keep save games and MP3s. Now, says Bruno, that might all be about to change with the advent of Xbox Live Compute, a service which “is specifically designed to enable game creators to utilize the scalable computing resources that Microsoft deploys within our regional datacenters, to enhance their game experiences beyond what is generally possible with the finite resources of a console.”
What that means is not just convenience or multi-device access to content, but a significant extension of the power and scope which the Xbox One can offer developers and players. It means persistent worlds, improved AI, better rendering and dedicated servers for every multiplayer game on the platform. And it’s all being offered to developers for free.
“Essentially what we did, about a year and a half ago, was sit down with a big group of game devs, some of whom have talked about their development on the platform,” Bruno explains to me. He’s the lead program manager of Xbox Live, a role which involves overseeing product direction as well as the engineering teams that build the Compute services.
“We really tried to understand how we could help them on the server side, we have this huge asset of lots of available computing power in the cloud. The intent was to build a platform which takes away a lot of the heavy lifting from server development. Things like scalability, things like peer distribution, things like being able to monitor and keep servers healthy: things that don’t really do a lot for game development, but if we were to take that problem away from them and enable them to focus on building better games, think of the amazing things they’d be able to do with the additional compute power.
“So really what the service is intended to do is to provide more of the infrastructure type services and deliver the on-demand compute features to developers so that they can build that into their games from the outset. What we’ve seen, from a feature function benefit perspective, at least in v1.0, is that dedicated server multiplayer is a lot easier to build on Xbox One than it has been in previous years. So that was an obvious key benefit and there are a lot of key benefits to multiplayer gaming from that. We’ve also seen things like Forza, where they’ve done a lot with Drivatar and a lot of AI computations in the cloud. The cloud can just get smarter about the player and the game.
“One of the other things we’ve really been trying to push on is games as a service, we’ve seen this with other online games, but from a console view we saw it as a real opportunity to get games to be more adaptive, with more updates directly from the cloud. Building a game configuration in from the outset, so that game developers can tweak and tune the game without having to update the physical bits actually on the box.
“So again, building that sort of infrastructure to make those scenarios easier for developers was sort of our initial goal. We see a lot of opportunity in the future, there’s a large number of things we’re considering for the future, but right now we’re obviously laser-focused on making it a really great launch.”
That’s an understandably fuzzy picture of the future, considering the program’s nascent qualities, but will it sell to the customer? So far, the cloud seems to be cut from the same cloth as the clothes of the proverbial Emperor. It’s everywhere, but doing relatively little of practical use. What actual difference is this going to make to players?
“From a computing perspective, server computing is evolving at a rapid rate,” Bruno offers. “We expect that, over time, there’ll be tons and tons more power that comes online from a server point of view. The physical box, with the chips in it that it has, well there’s no easy way to upgrade that. So we do expect that over time we’ll see more and more offloading of intensive CPU processing to the cloud.
“Now what that buys game developers is that, as you can imagine, they’re going to make trade-offs in their game as to what they’re going to use the local CPU for versus the remote CPU. We believe that there’s going to be higher fidelity experiences over time, because of having that ability to offload those tasks that they often have to trade off with local resource. So we do expect higher fidelity games over time, we do expect that the cloud will just be better from a pure computing point of view.”
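To illustrate the kind of trade-off Bruno is describing, here is a very rough sketch – with invented function names and an arbitrary frame budget – of a game deciding at runtime whether an expensive AI step runs on the local CPU or is offloaded to a remote compute session. It models the idea only, not the actual Xbox Live Compute API:

```python
import time

def local_ai_update(world_state: dict) -> dict:
    """Run the expensive AI/simulation step on the console's own CPU."""
    time.sleep(0.002)  # stand-in for real work
    return {**world_state, "npc_behaviour": "computed locally"}

def remote_ai_update(world_state: dict) -> dict:
    """Pretend to offload the same step to a cloud compute session.
    A real title would call whatever RPC its platform SDK exposes;
    this stub just models the idea."""
    time.sleep(0.0005)  # illustrative network + server time
    return {**world_state, "npc_behaviour": "computed in the cloud"}

def update_ai(world_state: dict, online: bool, frame_budget_ms: float) -> dict:
    """Offload when a connection exists and the local frame budget is too tight;
    otherwise fall back to local computation so offline play still works."""
    if online and frame_budget_ms < 4.0:
        return remote_ai_update(world_state)
    return local_ai_update(world_state)

print(update_ai({"players": 12}, online=True, frame_budget_ms=2.5))
print(update_ai({"players": 12}, online=False, frame_budget_ms=2.5))
```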
Suspecting that this might not be specific enough for some, I try to nail Bruno down to a specific measure of the improvements we can expect. Is this going to be on the order of magnitude of a jump from 30fps to 60, for example, or the switch from SD to HD?
“That’s not a question that’s actually that easy for me to answer,” he tells me, diplomatically. “Mostly because a lot of that depends on how the game is built. What I can tell you is what we’ve seen with some of our developers, in the case of someone like Respawn, is that adding that additional CPU resource for them in the cloud has made a huge difference in terms of what they can do locally on the box. So we’re super excited about what we can do in the short term, but in the long term there’s a lot of opportunities. Especially when you look at what our launch footprint looks like from a datacentre perspective and what that can grow to over a number of years.”
Obviously, being a remote resource, utilising the Compute network is going to require a reliable, always-on internet connection. Last time Microsoft tried to introduce something along those lines, it ended in something of a backpedal. What makes Bruno convinced that the announcement of Xbox Live Compute isn’t going to result in a similar outcry?
“I think it comes down to a couple of things. One is that users who want to play multiplayer games are going to play them online. So for argument’s sake we can assume that there’s a connection there. The game itself can make the decision about what sort of experience it wants to deliver online vs. offline. I think that obviously there are some benefits to being online, and there are some benefits to being offline, but I generally think that it will be additive to users that are online.”
It’s a tricky proposition, and aiming the advantages at those who are already going to be permanently connected is a canny way to get around it. Bruno tells me that “at launch the experiences will be predominantly multiplayer,” but there will be more to come on the single-player side in the future, if the developers decide to use it. For now, however, it’s going to be the blockbuster multiplayer games like Titanfall and Forza 5 which are going to be the big beneficiaries.
“We’ve had Forza 5 working on it from day one,” Bruno confirms. “We’ve had Titanfall working on it in the more recent months. I’d say Titanfall is definitely pushing on the additional computing resources; they’re doing a good job of taking advantage of what’s in the box and what’s on the cloud. The Forza guys have done a really good job of providing a good multiplayer story as well as the AI technology for Drivatar in the cloud as well. So we’ve definitely had a great partnership from our development shops, both first and third party.
“We are giving this resource away to them for free, so there is a huge incentive to utilise it on Xbox One as much as possible. I don’t think that game developers of that magnitude, the Activisions and EAs, are going to put all their eggs into that basket. I think that any good service infrastructure is going to pick and choose the way that they architect the system in the way that’s most beneficial to them. I think there’ll be cases where developers will want services that the Compute isn’t designed for, things like database services or CDNs, things that are going to provide different experiences that are unique to the way that they want to build the game.
“But I do think there will be advantages for the smaller game shops that had previously been spooked about getting into server development because of the financial obstacle or the development obstacle there. That was one of the big intents, to take this barrier to entry of server development away and let these developers really explore what they could do with the cloud without having to worry about allocating financial resources or server developers to the problem.
“We’ve even heard stories where the developers have had that and wanted to shut down games and servers over time and that really does disrupt their communities. One of the big advantages of our service is that it’s completely on demand, so that as games wax and wane in popularity so do the resources that get applied to it from Compute. Providing that elastic scale at a really beneficial cost price point is a big benefit to developers.”
Giving those big-hitters new toys to play with might be a good thing for the end user who wants to while away endless hours in the worlds of Titanfall or Battlefield, but it doesn’t do too much for Microsoft’s reputation with the indies. Presumably there’s not going to be much need to utilise Compute unless you’re already stretching the Xbox One’s internal organs, but is this extra dimension reserved only for major publishers, or can anyone get a piece of the action? At the bottom line, is the extent to which you can take advantage of Compute tied to success?
“Technically we have developer policies that we apply for any of our assets for Xbox Live, we don’t make a lot of those public – but I should say the intent is to incentivise developers to do great things with the computing power but obviously not run away with it. So we have put some minimal guidance in place, we’re trying to encourage this environment where developers can iterate and do more with the server and so we don’t want to be limiting but at the same time we want to make sure there are some guardrails to keep cost somewhat under control.”
So far, so free-market capitalism, but I feel we’ve not really reached the end of the list of potential gripes which consumers are going to raise. What about dropped connections, server side crashes, lost data and unavailable services? Bruno is surprisingly honest and pragmatic.
“Well, there are always some risks associated with any internet connection, right? But we are trying to provide facilities to developers to help them mitigate those types of things. One of the great things about building on the server is that lost connections are something that the server can smartly detect and deal with from a state-saving perspective. We have also included this notion of storing a state for a game session, so a game similar to Minecraft, for example, with a number of players participating in a shared objective, that can be stored in the cloud in the event of disconnects.”
Potentially, then, this could be something whose effect increases exponentially over time. If there’s the power to turn the Xbox One into what is essentially a terminal, streaming content processed on a different continent, surely this is going to extend the lifecycle of the machine tremendously?
“I don’t know that it’s true or untrue,” Bruno admits. “I guess at the end of the day we believe that the cloud is going to augment the Xbox One experience pretty well and it’s obviously going to get better over time. Does that extend the life of the box? Potentially, I guess we’re going to have to wait and see.”
When is a blink not a natural blink? For Google the question has such ramifications that it has devoted a supercomputer to solving the puzzle.
SlashGear reports that the internet giant is using its $10 million quantum computer to find out how products like Google Glass can differentiate between a natural blink and a deliberate blink used to trigger functionality.
The supercomputer based at Google’s Quantum Artificial Intelligence Lab is a joint venture with NASA and is being used to refine the algorithms used for new forms of control such as blinking. The supercomputer uses D-Wave chips kept at as near to absolute zero as possible, which makes it somewhat impractical for everyday wear but amazingly fast at solving brainteasers.
A Redditor reported earlier this year that Google Glass is capable of taking pictures by responding to blinking; however, the feature is disabled in the software code because the technology has not advanced enough to differentiate between a natural impulse and an intentional request.
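The classification problem itself is easy to state, even if solving it well is not. A naive classical heuristic – with an invented threshold, and nothing like whatever Google’s quantum-assisted models actually do – might simply compare how long the eye stays closed against the length of a typical involuntary blink:

```python
# Purely illustrative heuristic with an invented threshold -- not how Glass
# or Google's Quantum AI Lab actually tells natural blinks from commands.
INVOLUNTARY_BLINK_MAX_S = 0.4   # natural blinks tend to be shorter than this

def classify_blink(closure_duration_s: float) -> str:
    """Treat long, deliberate eyelid closures as a command and short ones as noise."""
    if closure_duration_s >= INVOLUNTARY_BLINK_MAX_S:
        return "intentional: trigger the camera"
    return "natural: ignore"

print(classify_blink(0.15))   # natural: ignore
print(classify_blink(0.60))   # intentional: trigger the camera
```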
It is easy to see the potential of blink control. Imagine being able to capture your life as you live it, exactly the way you see it, without anyone ever having to stop and ask people to say “cheese”.
Google Glass is due for commercial release next year but for the many beta testers and developers who already have one this research could lead to an even richer seam of touchless functionality.
If nothing else you can almost guarantee that Q will have one ready for Daniel Craig’s next James Bond outing.