Take-Two Interactive Software has repurchased all of the Icahn Group’s stock, a deal worth $203.5 million and involving 12.02 million shares.
“This share repurchase reflects our confidence in the Company’s outlook for record results in fiscal 2014 and continued Non-GAAP profitability every year for the foreseeable future,” said Take-Two CEO Strauss Zelnick.
“With our ample cash and strong expected cash flow, we are able to pursue a variety of investment opportunities, including repurchasing our Company’s stock. On behalf of our board and management team, I would like to thank Brett, James and Sung for their support, dedication and service to our organisation. They leave Take-Two better positioned than ever for continued success.”
The move was funded by cash and cash equivalents on hand and Take-Two explained the move is “part of an ongoing strategy to buy back its shares.”
Take-Two and Icahn gave no reason for the sale of the shares, but as previously agreed, Icahn’s Brett Icahn, Jim Nelson, and SungHwan Cho have resigned from the Take-Two board.
The Icahn Group is overseen by activist investor Carl Icahn, whom Forbes this year named one of its 40 highest-earning hedge fund managers. In the past he has tried to acquire Dell and Marvel Comics, and he owns a ten percent stake in Netflix.
[UPDATE]: Investors did not greet the news warmly, as Take-Two shares traded at twice their average volume and ended the trading day down 5.49 percent to $16.
The phone is a variant, though not an outright successor, of the Lumia 520, and helps Nokia offer Windows Phone at a more accessible price to a larger number of users, a spokeswoman said via email.
The smartphone will go on sale before the end of the year in China, Vietnam, Hong Kong, Cambodia, Singapore and Russia. In China, it is priced at 1,099 yuan ($180) before taxes and subsidies. It will then go on sale in Australia, New Zealand, Ukraine, Kazakhstan and parts of Africa during the first quarter of next year, according to Nokia.
During the third quarter, Lumia sales increased by 19 percent quarter-on-quarter to 8.8 million units, reflecting strong demand particularly for the Lumia 520, Nokia said. The Lumia 525 and the expanded distribution it brings are therefore important to Nokia.
Apart from having 1GB of RAM rather than 512MB, the specs of the Lumia 525 are identical to the Lumia 520’s. That includes a 4-inch screen with a resolution of 800 by 480 pixels, a 5-megapixel camera and a dual-core 1GHz processor. There is also 8GB of integrated storage and a microSD card slot.
The market for sub-$200 smartphones is at a crossroads, mostly thanks to Google’s efforts. The recently announced Moto G from Google-owned Motorola Mobility costs as much as the Lumia 525, but is powered by a 1.2GHz quad-core processor and has a 4.5-inch 720p screen.
Even though the Lumia 520 has helped increase the popularity of Windows Phone, Nokia and Microsoft can’t afford to rest. Their main priority should now be to bring down the cost of Windows Phones to below $100 without a contract, said Pete Cunningham, principal analyst at Canalys.
Nokia shareholders last week voted to approve Microsoft’s acquisition of “substantially all” of the company’s Devices & Services business. The deal is expected to close during the first quarter of next year.
The Galaxy Grand 2 has a number of hardware improvements over the first Grand, which was announced last December. The Grand 2’s processor has four cores, twice as many as its predecessor’s, but runs at the same speed, 1.2GHz. Its screen measures 5.25 inches across the diagonal and can display HD video with a resolution of 1280 by 720 pixels, an improvement on its predecessor’s 5-inch, 800-by-480-pixel screen.
To power those performance improvements, Samsung has increased the battery capacity from 2,100 mAh to 2,600 mAh. The bigger battery and screen have had little effect on the size and weight of the Grand 2 compared to its predecessor. It is one gram heavier and a couple of millimeters longer and wider, yet slightly thinner at 8.9 millimeters versus 9.5 millimeters, according to Samsung’s spec sheets.
The Grand 2 has the same resolution and basic processor configuration as the recently announced Moto G from Google-owned Motorola Mobility, which set a new performance benchmark for devices costing around $180 without a contract.
Like its predecessor, the Grand 2 has an 8-megapixel camera, while the Moto G only has a 5-megapixel camera. Neither device supports LTE. For storage, Samsung has stuck with 8GB of integrated storage and a microSD card slot, while the Motorola device is available with 8GB or 16GB of built-in storage, but no card slot.
Both devices run Android 4.3, but while Motorola has said it will upgrade the Moto G to version 4.4 in January, Samsung is mum on its upgrade plans. Samsung also isn’t saying what the Grand 2 will cost, so for now it’s hard to say which is the better value.
Sony is looking to make $250 million worth of cuts in its entertainment business, including shifting movie investment to TV production and media networks, and reducing the output of Columbia Pictures.
The company’s largest investor has suggested Sony should consider selling off parts of its business, but at a meeting of investors yesterday CEO Kaz Hirai made the case for keeping its entertainment divisions as one.
“I know that the whole of Sony is greater than the sum of its parts,” he said, as reported by Bloomberg. “Sony Entertainment is a core part of Sony and is crucial to our future growth.”
He pointed to the introduction of exclusive Sony content for the PlayStation 4 and the adoption of Blu-ray in the PS3 as examples of synergy in the business.
But according to CEO of the entertainment division Michael Lynton, “no cost is too sacred to cut”.
The business is currently looking at $150 million of overhead and operational efficiencies and $100 million of procurement savings, according to the report.
Researchers have made a quantum leap in the search for ultra-fast computing.
Scientists at Simon Fraser University managed to keep information in a quantum memory state for 39 minutes, smashing the previous world record.
Previous attempts yielded results of under 30 seconds at room temperature and just under three minutes in cryogenic conditions.
The global race to harness the power of qubits has high stakes – the ability to create computers capable of calculating many times faster. Qubits are able to exist simultaneously in a superposition of ‘0’ and ‘1’.
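In the standard notation, a qubit’s state is a weighted combination of both basis values (this is textbook material, not something specific to the Simon Fraser experiment):

$$|\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1$$

Measuring the qubit collapses it to ‘0’ with probability $|\alpha|^2$ or ‘1’ with probability $|\beta|^2$; it is this ability to hold both values at once that quantum algorithms exploit.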
This experiment involved a new type of silicon that could, scientists believe, be the secret to creating long-term memory in quantum systems.
Speaking to Sky News, co-author of the paper Stephanie Simmons of Oxford University said, “Thirty-nine minutes may not seem very long but as it only takes one-hundred-thousandth of a second to flip the nuclear spin of a phosphorus ion – the type of operation used to run quantum calculations – in theory over two million operations could be applied in the time it takes for the superposition to naturally decay by one percent.”
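Simmons’ arithmetic checks out, at least as a back-of-the-envelope estimate that treats the first one percent of decay as roughly linear:

$$\frac{0.01 \times 39 \times 60\ \text{s}}{10^{-5}\ \text{s per flip}} \approx 2.3 \times 10^{6}\ \text{operations}$$

That is, one percent of 39 minutes is about 23 seconds, and at one hundred-thousandth of a second per spin flip, that window accommodates over two million operations.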
The next stage will be to find a way to manipulate the qubits to talk to each other in a meaningful way so that information can be passed between them during their short, glorious lives.
Although there is a significant amount of research to come before quantum computing provides an effective alternative to traditional methods, this has been a huge leap forward for the concept, and it’s widely expected that eventually the next leap will be the leap home.
Cash-strapped chipmaker AMD has applied for a short-term loan to help slow its financial decline. The outfit has raised a half-billion-dollar line of credit from a group of lenders, with Bank of America acting as agent.
All this is happening as AMD struggles with the economic downturn and a decline in PC sales. The outfit should be good for the cash. After all, it is focusing on game consoles and associated royalties, and in its fiscal third-quarter earnings that business unit’s revenue increased 110 percent on the previous quarter and 96 percent year-over-year.
AMD said the proceeds of the five-year secured revolving line of credit, which matures in November 2018, may be used for general corporate purposes, such as working capital needs.
As evidenced by the 40-foot console constructed in a Vancouver parking lot recently, Microsoft expects Xbox One to be big. Microsoft Canada’s Xbox director of marketing Craig Flannagan put the November 22 launch into perspective.
“I’ve been here for the launch of Xbox 360. I was here for the launch of Kinect. This is far and away the biggest launch we’ve ever done,” Flannagan said. “It’s the most hardware we’ve ever produced. It’s the most we’ve ever pre-sold. We’re preselling a little over 2-to-1 from what we did with Xbox 360. The momentum on launch has been really good. And we didn’t have a 40-foot console at the launch of the 360, either.”
As for how Xbox One will fare against the PlayStation 4 and Wii U, Flannagan pointed to Xbox Live and the company’s focus on social integration as two differentiating factors that will give it the edge. He also said he was proud of the game lineup, saying Xbox One exclusives walked out of E3 with twice the awards of both competitors.
“Xbox One is going to start ahead, in terms of the experience we can deliver,” Flannagan said. “And because we’re built for the future, we’re going to stay ahead. I think there is not a better experience you can buy this holiday, and there will not be a time this generation where there’s a better experience you can buy than Xbox One…And it’s probably going to be a pretty long generation. We’re probably here for a while because we’re built for the future. This is a console that will last you, conservatively a decade, if I had to put a bet down today.”
The idea of a launch Xbox One lasting a decade brings to mind the Red Ring of Death and Microsoft’s notoriously unreliable Xbox 360 launch hardware. When asked if he’s heard consumers expressing concerns about the Xbox One’s durability, Flannagan said, “Not really.”
“We feel great about where the hardware is at right now,” Flannagan said. “Our yields are good. It’s allowing us to produce more consoles than we ever have for a launch. We feel great about how the hardware is performing.”
While Flannagan expects the hardware purchased this month to keep running years into the future, he doesn’t expect it to offer the same experience. Just as the Xbox 360 went through multiple dashboards and overhauled feature sets over the course of the last eight years, so too will the Xbox One evolve.
“Much like 360, Xbox One’s not going to look a whole lot five years from now like it does on November 22, 2013. I don’t know where it’s going to go, but that’s kind of fun because we’re built for the future. We do have a connection; we can change what things look like and how it performs.”
Japanese gamers are playing a slightly different version of Rockstar’s Grand Theft Auto V (GTA 5) than players in the US and Europe.
Gaming website Kotaku has posted a video that compares the two versions, and we can see that most of the differences relate to Trevor, who is perhaps the most colorful of all the GTA characters.
In the game Trevor performs various acts that can be viewed as unsavory. In one he tortures a chap, while in another he drops his pants.
In the case of the latter, someone, probably the waggish Japanese censor, has installed a second pair of trousers on the Trevor character. This means that where he flashes his genitals in the UK version, in Japan he shows off only a clean pair of pants. It doesn’t make much sense, but nor does most of Trevor’s behavior.
A bigger difference is seen in the Japanese telling of the torture scene, a part of the game that some people found stretched things too far anyway. In the version seen in the US and the UK Trevor is tasked with selecting an implement with which he must rearrange the bone structure of a man from whom he needs information. In the Japanese version he doesn’t, and those gameplay parts are just skipped.
A couple of sex scenes have also been excised from the game. One of these features Trevor, who is watching TV at the same time, while another happens during one of the paparazzi missions.
The Windows 8.1 launch didn’t get much attention, which probably has something to do with the fact that it’s basically Windows 8 done right. However, users of AMD APUs could have a good reason to celebrate.
According to AMD’s senior marketing manager Clarice Simmons, Windows 8.1 is a lot better than Windows 8 when it comes to harnessing the potential of AMD silicon. Writing in her blog, Simmons said the new OS could deliver performance gains of up to 9.5 percent on some PCs based on AMD APUs.
However, her numbers are for the A10-6800K, and the 9.5 percent gain only applies to machines running an outdated video driver. With the same up-to-date driver on both operating systems, the difference is actually 3.5 percent, which still isn’t bad but is nowhere near as good as 9.5 percent.
“Our work with Microsoft includes development on the essential operating system “plumbing” that enables Windows to directly leverage AMD technology in order to run more efficiently. The two companies also cooperate on the development and tuning of the latest AMD video drivers,” wrote Simmons.
“Of course AMD’s fast CPU and GPU cores contribute to high performance, but having software that is optimized to take advantage of the AMD hardware architecture is a significant advantage. Tuning our device drivers to simultaneously suit AMD hardware, software applications, and Windows 8.1 makes systems more streamlined.”
Simmons also pointed out that AMD Wireless Display works better on Windows 8.1, due to better architectural implementation and support for Miracast, better ecosystem support and new solutions that enable the OS to tap low latency display encode paths available in Radeons.
The Lumia 1320 and Lumia 1520, revealed at the Nokia World event in Abu Dhabi on Tuesday, both have 6-inch screens. The 1520 is the high-end model, with a full HD screen, LTE and a quad-core Snapdragon 800 processor. The device has 32GB of storage, which can be expanded by another 64GB using a microSD card slot, something that has been missing from recent Nokia smartphones.
Nokia is leaning on its camera technology to differentiate its products from rivals. The Lumia 1520 has a 20-megapixel camera with optical image stabilization. Nokia has also developed a new app called Camera that lets users access settings more easily, the company said.
The Lumia 1520 will start shipping this quarter in Hong Kong, Singapore, the U.S., China, the U.K., France, Germany and Finland. The price will be $749 before taxes and subsidies.
The Lumia 1320 will be cheaper at $339 before taxes and subsidies, but only has a dual-core processor and 720p screen resolution. It also has a simpler 5-megapixel camera, but users can still access the Internet using LTE. Nokia expects to start shipping it in the first quarter of 2014 in China and Vietnam, followed by other Asian markets, India and Europe.
The lower price will make the smartphone a good fit for the Chinese market, according to Pete Cunningham, principal analyst at Canalys.
Both devices will run a new version of Windows Phone 8 called General Distribution Release 3, to which Nokia has added enhancements such as its Camera app. The software will also be offered to users of existing Lumia devices via an update called Black.
Instagram and Vine will soon be available on Lumia devices too, Nokia announced. App availability is still Windows Phone’s Achilles heel, but the availability of those two third-party apps is a step in the right direction.
Michael Dailly is head of development at Yoyo Games in Dundee. He is also the chap who gave the world Lemmings and GTA.
He has been posting updates to his GTA project on Twitter and has shared some imagery. It doesn’t look much like the Grand Theft Auto we have seen in GTA 5, but it’s still cool.
The original Grand Theft Auto was released in 1997 on the PlayStation, Windows PC and Nintendo Game Boy Color. It was followed two years later by a London version that added the Sega Dreamcast to its hardware list.
Dailly said that it has some gaps, but explained that he does not have the rights to the maps, and can only work from what he has. What he has runs in HTML5 and WebGL at 60 frames per second, according to the USgamer website. This should mean that while you won’t be able to play it, you will be able to use it through your web browser for virtual tours.
@gnysek I can’t really give it out I’m afraid….I don’t own the assets, and the extraction tool isn’t complete. I had to do bits by hand
— Michael Dailly (@mdf200) October 16, 2013
In tweeted messages he said that the original CMP files are loaded into GameMaker Studio – that’s Yoyo Games’ software – and stored as a “2D Grid with 1D arrays in it (so 3D map)”.
“It’s the original .CMP map format the game uses. I just import that and convert to a 3D model on load,” he added. “Taking stock, looks like all I’m missing from the rendering of the #GTA level now, is slopes. You can see the large holes in the map. Shouldn’t take too long to fix, but probably only when I get back from holiday.”
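The storage scheme Dailly describes, a 2D grid whose cells each hold a 1D array, is a common way to represent a 3D tile map. Here is a minimal illustrative sketch in Python; the names and block values are hypothetical and are not taken from his converter:

```python
# Hypothetical sketch of a "2D grid of 1D arrays" tile map, as Dailly
# describes it: grid[x][y] holds a vertical column of block IDs.
EMPTY, ROAD, BUILDING = 0, 1, 2          # illustrative block IDs
WIDTH, DEPTH, HEIGHT = 256, 256, 8       # illustrative map dimensions

# A 2D grid where each cell is a 1D column array -- effectively a 3D map.
grid = [[[EMPTY] * HEIGHT for _ in range(DEPTH)] for _ in range(WIDTH)]

def set_block(x, y, z, block_id):
    """Place a block at horizontal cell (x, y), height z."""
    grid[x][y][z] = block_id

set_block(10, 20, 0, ROAD)       # street level
set_block(10, 21, 3, BUILDING)   # a building block three levels up
print(grid[10][21])              # [0, 0, 0, 2, 0, 0, 0, 0]
```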
He has gone on holiday now, so will not be doing any more work on his project in the short term.
Not too long ago, Sony and Microsoft laid bare the engines of their eighth generation consoles. CPU clock speeds and DDR3 RAM numbers were bandied about, GHz were brought to bear, teraflops flaunted salaciously. When the dust eventually settled and the media guns lay relatively silent once more, a fairly predictable treaty was agreed upon: to all but the most technically minded of consumers, there’s little to choose between the raw grunt of the two machines. The company supporting each machine has its priorities, its foibles and its USPs, and discounting Kinect and the first-party exclusives, in the on-paper battle of boxes you can expect much of a muchness.
Except Microsoft’s John Bruno is now telling me that those numbers aren’t the full story, and that it’s not unfeasible they’ll one day become almost completely irrelevant. Microsoft, owner of one of the largest and most powerful arrays of computational servers in the known universe, is putting it to use on Xbox Live.
Now, we’ve all heard promises about cloud processing and non-local computation. For a while, it seemed like it might be the future. Then it seemed like perhaps it might not. The public, burned by an experience which promised so much and delivered so little, returned to thinking of the cloud purely as a handy place to keep save games and MP3s. Now, says Bruno, that might all be about to change with the advent of Xbox Live Compute, a service which “is specifically designed to enable game creators to utilize the scalable computing resources that Microsoft deploys within our regional datacenters, to enhance their game experiences beyond what is generally possible with the finite resources of a console.”
What that means is not just convenience or multi-device access to content, but a significant extension of the power and scope which the Xbox One can offer developers and players. It means persistent worlds, improved AI, better rendering and dedicated servers for every multiplayer game on the platform. And it’s all being offered to developers for free.
“Essentially what we did, about a year and a half ago, was sit down with a big group of game devs, some of whom have talked about their development on the platform,” Bruno explains to me. He’s the lead program manager of Xbox Live, a role which involves overseeing product direction as well as the engineering teams that build the Compute services.
“We really tried to understand how we could help them on the server side, we have this huge asset of lots of available computing power in the cloud. The intent was to build a platform which takes away a lot of the heavy lifting from server development. Things like scalability, things like peer distribution, things like being able to monitor and keep servers healthy: things that don’t really do a lot for game development, but if we were to take that problem away from them and enable them to focus on building better games, think of the amazing things they’d be able to do with the additional compute power.
“So really what the service is intended to do is to provide more of the infrastructure type services and deliver the on-demand compute features to developers so that they can build that into their games from the outset. What we’ve seen, from a feature function benefit perspective, at least in v1.0, is that dedicated server multiplayer is a lot easier to build on Xbox One than it has been in previous years. So that was an obvious key benefit and there are a lot of key benefits to multiplayer gaming from that. We’ve also seen things like Forza, where they’ve done a lot with Drivatar and a lot of AI computations in the cloud. The cloud can just get smarter about the player and the game.
“One of the other things we’ve really been trying to push on is games as a service, we’ve seen this with other online games, but from a console view we saw it as a real opportunity to get games to be more adaptive, with more updates directly from the cloud. Building a game configuration in from the outset, so that game developers can tweak and tune the game without having to update the physical bits actually on the box.
“So again, building that sort of infrastructure to make those scenarios easier for developers was sort of our initial goal. We see a lot of opportunity in the future, there’s a large number of things we’re considering for the future, but right now we’re obviously laser-focused on making it a really great launch.”
That’s an understandably fuzzy picture of the future, considering the program’s nascent qualities, but will it sell to the customer? So far, the cloud seems to be cut from the same cloth as the clothes of the proverbial Emperor. It’s everywhere, but doing relatively little of practical use. What actual difference is this going to make to players?
“From a computing perspective, server computing is evolving at a rapid rate,” Bruno offers. “We expect that, over time, there’ll be tons and tons more power that comes online from a server point of view. The physical box, with the chips in it that it has, well there’s no easy way to upgrade that. So we do expect that over time we’ll see more and more offloading of intensive CPU processing to the cloud.
“Now what that buys game developers is that, as you can imagine, they’re going to make trade-offs in their game as to what they’re going to use the local CPU for versus the remote CPU. We believe that there’s going to be higher fidelity experiences over time, because of having that ability to offload those tasks that they often have to trade off with local resource. So we do expect higher fidelity games over time, we do expect that the cloud will just be better from a pure computing point of view.”
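As a rough illustration of the trade-off Bruno describes (a hypothetical pattern, not anything from the Xbox Live Compute API), a game might route an expensive task to a remote service when it can, and fall back to a cheaper local version when it can’t:

```python
# Hypothetical sketch of the local-vs-remote CPU trade-off; none of
# these names come from Xbox Live Compute.

def compute_ai_remotely(world_state):
    """Stand-in for an expensive AI job run in a datacenter."""
    raise ConnectionError("no network in this sketch")

def compute_ai_locally(world_state):
    """Cheaper fallback that fits the console's fixed CPU budget."""
    return {"npc_plans": "coarse", "cost": "low"}

def update_ai(world_state, online):
    # Offload when online; degrade gracefully rather than stall a frame.
    if online:
        try:
            return compute_ai_remotely(world_state)
        except ConnectionError:
            pass
    return compute_ai_locally(world_state)

print(update_ai({}, online=True))   # falls back to the local version here
```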
Suspecting that this might not be specific enough for some, I try to nail Bruno down to a specific measure of the improvements we can expect. Is this going to be on the order of magnitude of a jump from 30fps to 60, for example, or the switch from SD to HD?
“That’s not a question that’s actually that easy for me to answer,” he tells me, diplomatically. “Mostly because a lot of that depends on how the game is built. What I can tell you is what we’ve seen with some of our developers, in the case of someone like Respawn, is that adding that additional CPU resource for them in the cloud has made a huge difference in terms of what they can do locally on the box. So we’re super excited about what we can do in the short term, but in the long term there’s a lot of opportunities. Especially when you look at what our launch footprint looks like from a datacentre perspective and what that can grow to over a number of years.”
Obviously, being a remote resource, utilising the Compute network is going to require a reliable, always-on internet connection. Last time Microsoft tried to introduce something along those lines, it ended in something of a backpedal. What makes Bruno convinced that the announcement of Xbox Live Compute isn’t going to result in a similar outcry?
“I think it comes down to a couple of things. One is that users who want to play multiplayer games are going to play them online. So for argument’s sake we can assume that there’s a connection there. The game itself can make the decision about what sort of experience it wants to deliver online vs. offline. I think that obviously there are some benefits to being online, and there are some benefits to being offline, but I generally think that it will be additive to users that are online.”
It’s a tricky proposition, and aiming the advantages at those who are already going to be permanently connected is a canny way to get around it. Bruno tells me that “at launch the experiences will be predominantly multiplayer,” but there will be more to come on the single-player side in the future, if the developers decide to use it. For now, however, it’s going to be the blockbuster multiplayer games like Titanfall and Forza 5 which are going to be the big beneficiaries.
“We’ve had Forza 5 working on it from day one,” Bruno confirms. “We’ve had Titanfall working on it in the more recent months. I’d say Titanfall is definitely pushing on the additional computing resources; they’re doing a good job of taking advantage of what’s in the box and what’s on the cloud. The Forza guys have done a really good job of providing a good multiplayer story as well as the AI technology for Drivatar in the cloud as well. So we’ve definitely had a great partnership from our development shops, both first and third party.
“We are giving this resource away to them for free, so there is a huge incentive to utilise it on Xbox One as much as possible. I don’t think that game developers of that magnitude, the Activisions and EAs, are going to put all their eggs into that basket. I think that any good service infrastructure is going to pick and choose the way that they architect the system in the way that’s most beneficial to them. I think there’ll be cases where developers will want services that the Compute isn’t designed for, things like database services or CDNs, things that are going to provide different experiences that are unique to the way that they want to build the game.
“But I do think there will be advantages for the smaller game shops that had previously been spooked about getting into server development because of the financial obstacle or the development obstacle there. That was one of the big intents, to take this barrier to entry of server development away and let these developers really explore what they could do with the cloud without having to worry about allocating financial resources or server developers to the problem.
“We’ve even heard stories where developers have had to shut down games and servers over time, and that really does disrupt their communities. One of the big advantages of our service is that it’s completely on demand, so that as games wax and wane in popularity, so do the resources that get applied to them from Compute. Providing that elastic scale at a really beneficial price point is a big benefit to developers.”
Giving those big-hitters new toys to play with might be a good thing for the end user who wants to while away endless hours in the worlds of Titanfall or Battlefield, but it doesn’t do too much for Microsoft’s reputation with the indies. Presumably there’s not going to be much need to utilise Compute unless you’re already stretching the Xbox One’s internal organs, but is this extra dimension reserved only for major publishers, or can anyone get a piece of the action? At the bottom line, is the extent to which you can take advantage of Compute tied to success?
“Technically we have developer policies that we apply for any of our assets for Xbox Live, we don’t make a lot of those public – but I should say the intent is to incentivise developers to do great things with the computing power but obviously not run away with it. So we have put some minimal guidance in place, we’re trying to encourage this environment where developers can iterate and do more with the server and so we don’t want to be limiting but at the same time we want to make sure there are some guardrails to keep cost somewhat under control.”
So far, so free-market capitalism, but I feel we’ve not really reached the end of the list of potential gripes which consumers are going to raise. What about dropped connections, server side crashes, lost data and unavailable services? Bruno is surprisingly honest and pragmatic.
“Well, there are always some risks associated with any internet connection, right? But we are trying to provide facilities to developers to help them mitigate those types of things. One of the great things about building on the server is that lost connections are something that the server can smartly detect and deal with from a state-saving perspective. We have also included this notion of storing a state for a game session, so a game like Minecraft, for example, with a number of players participating in a shared objective, that can be stored in the cloud in the event of disconnects.”
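A minimal sketch of that session-state idea, with invented names rather than anything from the actual service, might snapshot shared state periodically so a dropped session can be restored:

```python
# Illustrative sketch of cloud-saved session state (invented names,
# not the Xbox Live Compute API): snapshot on demand, restore on rejoin.
import copy
import time

class GameSession:
    def __init__(self):
        self.state = {"players": {}, "objective_progress": 0}
        self.snapshots = []  # stands in for durable cloud storage

    def save_snapshot(self):
        self.snapshots.append((time.time(), copy.deepcopy(self.state)))

    def restore_latest(self):
        if self.snapshots:
            self.state = copy.deepcopy(self.snapshots[-1][1])

session = GameSession()
session.state["objective_progress"] = 42
session.save_snapshot()
session.state["objective_progress"] = 0  # simulate a disconnect wiping progress
session.restore_latest()
print(session.state["objective_progress"])  # 42
```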
Potentially, then, this could be something whose effect compounds over time. If there’s the power to turn the Xbox One into what is essentially a terminal, streaming content processed on a different continent, surely this is going to extend the lifecycle of the machine tremendously?
“I don’t know that it’s true or untrue,” Bruno admits. “I guess at the end of the day we believe that the cloud is going to augment the Xbox One experience pretty well and it’s obviously going to get better over time. Does that extend the life of the box? Potentially, I guess we’re going to have to wait and see.”
When is a blink not a natural blink? For Google the question has such ramifications that it has devoted a supercomputer to solving the puzzle.
Slashgear reports that the internet giant is using its $10 million quantum computer to find out how products like Google Glass can differentiate between a natural blink and a deliberate blink used to trigger functionality.
The supercomputer based at Google’s Quantum Artificial Intelligence Lab is a joint venture with NASA and is being used to refine the algorithms used for new forms of control such as blinking. The supercomputer uses D-Wave chips kept at as near to absolute zero as possible, which makes it somewhat impractical for everyday wear but amazingly fast at solving brainteasers.
A Redditor reported earlier this year that Google Glass is capable of taking pictures by responding to blinking; however, the feature is disabled in the software code because the technology has not advanced enough to differentiate between a natural impulse and an intentional request.
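One plausible starting point for such differentiation, a toy heuristic rather than Google’s actual algorithm, is that deliberate blinks tend to last longer than reflexive ones, so a simple classifier could threshold on eyelid-closure duration:

```python
# Toy heuristic, not Google's algorithm: deliberate blinks tend to last
# longer than reflexive ones, so threshold on eyelid-closure duration.
DELIBERATE_THRESHOLD_S = 0.4  # illustrative cutoff; a real system is subtler

def classify_blink(duration_s):
    """Label a blink as 'deliberate' or 'natural' by its duration."""
    return "deliberate" if duration_s >= DELIBERATE_THRESHOLD_S else "natural"

for duration in (0.1, 0.15, 0.5):
    print(duration, classify_blink(duration))  # first two natural, last deliberate
```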
It is easy to see the potential of blink control. Imagine being able to capture your life as you live it, exactly the way you see it, without anyone ever having to stop and ask people to say “cheese”.
Google Glass is due for commercial release next year but for the many beta testers and developers who already have one this research could lead to an even richer seam of touchless functionality.
If nothing else you can almost guarantee that Q will have one ready for Daniel Craig’s next James Bond outing.
Sources close to the Eurogamer website and games house Rockstar have revealed that the PC version of Grand Theft Auto V (GTA 5) should be released next year.
So far we are filing a PC release of the game under “very likely”. There is a petition that is slowly approaching one million signatures.
Rockstar has had some problems on its hands, however, and has been scrambling to fix GTA Online, the online version of GTA 5. That is supposedly all settled now, and it is possible that Rockstar will turn an eye toward the PC desktop version.
This summer, in a question and answer post, it didn’t give any clues about its intentions and has not since.
“The only versions of the game that we have announced are for the Xbox 360 and PlayStation 3 which are set for a September 17th worldwide release,” read the post, which remains unchanged.
“We don’t have anything to share about the possibility of a next-gen or a PC platform release at this time and we are completely focused on delivering the best possible experience for the consoles people have right now.”
Eurogamer’s sources come from the industry, it said. They reckon the PC version will be out in the first three months of 2014.
The petition wants 1,000,000 signatures. So far it has 597,736 supporters. Previous title Grand Theft Auto IV (GTA 4) for the PC came out around eight months after the console versions.
The online portion of the GTA 5 console game is now fixed, we think, and we have installed a third Xbox 360 update.
In a post Rockstar explained some of the most recent fixes, including one that prevents players from losing all-important guns and ammo.
The independent gaming scene has been growing by leaps and bounds, so it makes sense that the events designed to celebrate it are keeping step. This weekend’s IndieCade Festival in Culver City, California (on the west side of Los Angeles) is the largest in the event’s seven-year history. IndieCade founder and CEO Stephanie Barish said the event is expecting to draw more than 5,000 people to Culver City, which has a population of around 39,000.
Much like the indie scene it promotes, the show has also been getting increased attention from the mainstream gaming industry of late. Sony has been a primary sponsor of the event for years, but the 2013 show sees Nintendo chip in for the first time, with Microsoft returning to the list after taking 2012 off. Activision is also on the list of sponsors, as well as Epic Games (for the Unreal Engine), Unity, and 20 more companies. Barish said some of the event’s more recent sponsors saw how Sony benefitted from its overtures to independent developers and have been following suit.
“[Sony has] put four or five years of effort now into the indie development sector and it’s really paid off for them,” Barish said. “Developers are really interested in meeting with them. They see there are possibilities, that Sony has proven [indies] can do well and are treated well. More and more the fact that independent games are interesting to a broader public is becoming apparent to the larger publishers. As well, there’s a huge creative energy and force and momentum coming out of the independent sector, and they don’t want to not be part of the future.”
That future is a big part of the attraction for IndieCade. Attendees to this year’s show will be able to try out a handful of games on upcoming hardware like the Oculus Rift and PlayStation 4. In all, IndieCade 2013 features 36 “official selections” for the festival, with dozens more games on show. Barish expects that crop of games to not only produce some of the next big hits, but also draw attention to the next crop of important developers. In the past, she said IndieCade has served as a coming out party for indie hits like Braid and Everyday Shooter, or developers like Telltale Games (who would go on to create the multiple Game of the Year award-winning The Walking Dead series). It’s also been a place to debut games that think outside the set-top box, like Johann Sebastian Joust, a six-player game that uses music and PlayStation Move controllers, but no screen.
“It’s really important for the mainstream to see what’s at the cutting edge, and we just continue to bring things in that are more cutting edge, that are more different than publishers or other mainstream things would even think to look at yet,” Barish said. “We’re really a window into what’s going to happen.”
Among this year’s selections are That Dragon, Cancer (a narrative-driven game set in a children’s hospital over three years), Perfect Woman (a “strategic dancing game” for the Kinect), and [code] (a PC game in which players delve into ersatz programming code to solve puzzles). While some of the IndieCade games will almost certainly prove to be lucrative for their creators, Barish stressed that isn't the only way to measure their success.
"There's definitely a desire for the Cinderella story, but having seen so many of the games, they're really good," Barish said. "So even if they're not commercially successful, they're impacting the way mainstream games are designed, the directions and the trends for those."
The trend for IndieCade looks to be continued growth. This year saw the event spawn an IndieCade East sister show in New York City, a second installment of which is confirmed for February 14-16, 2014 at the Museum of the Moving Image. Beyond that, Barish said there has been talk about expanding the festival even further with a European event.