Intel Sampling Xeon D 14nm

September 15, 2014 by Michael  
Filed under Computing

Intel has announced that it is sampling its Xeon D 14nm processor family, a system on chip (SoC) optimized to deliver Intel Xeon processor performance for hyperscale workloads.

Announcing the news on stage during a keynote at IDF in San Francisco, Intel SVP and GM of the Data Centre Group Diane Bryant said that the Intel Xeon processor D, which was initially announced in June, will be based on 14nm process technology and be aimed at mid-range communications.

“We’re pleased to announce that we’re sampling the third generation of the high density [data center system on a chip] product line, but this one is actually based on the Xeon processor, called Xeon D,” Bryant announced. “It’s 14nm and the power levels go down to as low as 15 Watts, so very high density and high performance.”

Intel believes that its Xeon D will serve the needs of high density, optimized servers as that market develops, and for networking it will serve mid-range routers as well as other network appliances, while it will also serve entry and mid-range storage. So, Intel claimed, you will get all of the benefits of Xeon-class reliability and performance, but you will also get a very small footprint and high integration of SoC capability.

This first generation Xeon D chip will also showcase high levels of I/O integration, including 10Gb Ethernet, and will scale Intel Xeon processor performance, features and reliability to lower power design points, according to Intel.

The Intel Xeon processor D product family will also include data centre processor features such as error correcting code (ECC).

“With high levels of I/O integration and energy efficiency, we expect the Intel Xeon processor D product family to deliver very competitive TCO to our customers,” Bryant said. “The Intel Xeon processor D product family will also be targeted toward hyperscale storage for cloud and mid-range communications market.”

Bryant said that the product is not yet available, but it is being sampled, and the firm will release more details later this year.

This announcement comes just days after Intel launched its Xeon E5 v3 processor family for servers and workstations.

Courtesy-TheInq

FreeSync Only For New AMD Processors

September 5, 2014 by Michael  
Filed under Computing

AMD has explained that its new FreeSync technology will only work in new silicon.

FreeSync is AMD’s initiative to enable variable-refresh display technology for smoother in-game animation and was supposed to give Nvidia’s G-Sync technology a good kicking.
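
To see why variable refresh matters, consider the rough sketch below. It is purely illustrative (the 21ms frame time is a made-up figure, not anything AMD has published): with a fixed 60Hz display and vsync, a frame that misses a refresh window has to wait for the next one, so presentation intervals bounce between 16.7ms and 33.3ms, while a variable-refresh display simply shows each frame the moment the GPU finishes it.

    # Illustrative sketch only: fixed 60Hz refresh with vsync versus variable refresh.
    import math

    REFRESH_MS = 1000 / 60   # fixed 60Hz refresh interval, about 16.7ms
    RENDER_MS = 21           # hypothetical GPU frame time, roughly 48fps

    prev_fixed = prev_vrr = 0.0
    for frame in range(1, 7):
        ready = frame * RENDER_MS                           # when the GPU finishes the frame
        fixed = math.ceil(ready / REFRESH_MS) * REFRESH_MS  # vsync waits for the next refresh slot
        vrr = ready                                         # variable refresh shows it immediately
        print(f"frame {frame}: fixed interval {fixed - prev_fixed:5.1f}ms, "
              f"variable interval {vrr - prev_vrr:5.1f}ms")
        prev_fixed, prev_vrr = fixed, vrr

The uneven mix of 16.7ms and 33.3ms presentation intervals on the fixed-refresh side is the judder that FreeSync and G-Sync set out to eliminate.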

G-Sync has already made its way into some top gaming monitors, like the Asus ROG Swift PG278Q.

However, AMD said that only its newest GPU silicon will support FreeSync displays. Specifically, the Hawaii GPU that drives the Radeon R9 290 and 290X will be compatible with FreeSync monitors, as will the Tonga GPU in the Radeon R9 285.

The Bonaire chip that powers the Radeon R7 260X and HD 7790 cards could support FreeSync, but that is not certain yet.

That would be fine if the current Radeon lineup weren’t populated by a mix of newer and older GPU technology. What AMD is saying is that some brand-new graphics cards on sale today will not support FreeSync monitors when they arrive.

The list of products that won’t work with FreeSync includes anything based on the older revision of the GCN architecture used in chips like Tahiti and Pitcairn.

So if you have splashed out on a Radeon R9 280, 280X, 270 or 270X hoping it will be FreeSync-capable, you are out of luck. Neither will any older Radeons in the HD 7000 and 8000 series work.

Nvidia’s G-Sync works with GeForce graphics cards based on the Kepler architecture, which include a broad swath of current and past products dating back to the GeForce GTX 600 series.

Courtesy-Fud

Lenovo Adds More Features To Its $199 Tablet

September 4, 2014 by mphillips  
Filed under Consumer Electronics

Lenovo has decided to upgrade the features in low-cost Android tablets with the Tab S8 tablet, which will start selling this month for $199.

The tablet, which runs on Google’s Android 4.4 OS, has Intel’s quad-core Atom chip, code-named Bay Trail. The chip is capable of running PC-class applications and rendering high-definition video.

The 8-inch S8 offers 1920 x 1200-pixel resolution, which is also on Google’s 7-inch Nexus 7. The S8 is priced lower than the Nexus 7, which sells for $229.

The Tab S8 is 7.87 millimeters thick, weighs 294 grams, and runs for seven hours on a single battery charge. It has a 1.6-megapixel front camera and 8-megapixel back camera. Other features include 16GB of storage, Wi-Fi and Bluetooth. LTE is optional.

The Tab S8 will ship in multiple countries. Most of Lenovo’s tablets worldwide with screen sizes under 10 inches run on Android.

Lenovo also announced its largest gaming laptop. The Y70 Touch has a 17.3-inch touchscreen, and can be configured with Intel’s Core i7 processors and Nvidia’s GeForce GTX 860M graphics card. It is 25.9 millimeters thick and is priced starting at $1,299. It will begin shipping next month.

The company also announced the Erazer X315 gaming desktop with Advanced Micro Devices processors code-named Kaveri. It can be configured with up to 32GB of DDR3 DRAM and 4TB of hard drive storage or 2TB of hybrid solid-state/hard drive storage. It will ship in November in the U.S. with prices starting at $599.

The products were announced ahead of the IFA trade show in Berlin. Lenovo is holding a press conference at IFA where it is expected to announce more products.

 

 

Are ARM 64-bit Processors Making Gains?

September 4, 2014 by Michael  
Filed under Computing

ARM claims it has seen growing momentum for its 64-bit ARMv8-A processor designs, announcing it has signed 50 licensing agreements with silicon partners to fab chips based on the architecture.

ARM said that a total of 27 companies have signed agreements for the company’s ARMv8-A technology, including all of the silicon vendors selling application processors for smartphones plus most of those targeting enterprise networking and servers.

The firm did not disclose which company signed the 50th licence, telling The INQUIRER that it was up to the licensees themselves whether to announce their plans. However, it claimed that while the first wave of ARMv8-A licences was for silicon targeting smartphones and tablets, the latest wave includes many aimed at enterprise infrastructure as well.

ARM unveiled its 64-bit processor architecture in 2011, followed a year later by the Cortex-A53 and Cortex-A57 core designs based on it. These provide backwards compatibility with existing 32-bit ARM software, but add a new 64-bit execution state that delivers more capabilities, including support for 64-bit data and a larger memory address space that is required if ARM chips are to make their way into servers and other enterprise hardware.

“ARMv8-A technology brings multiple benefits, including 64-bit capability alongside improved efficiency of existing 32-bit applications,” said Noel Hurley, GM of ARM’s processor division.

While ARM’s chips are already widely used in smartphones and tablets thanks to their low power consumption, they have also been getting attention in recent years for use in the data centre, as service providers and enterprises alike have become concerned about the amount of power being consumed by IT infrastructure.

The list of silicon vendors developing chips based on the ARMv8-A architecture already includes Samsung, Qualcomm, Broadcom and AMD, the latter of which is set to bring to market a series of ARM-based server chips, the Opteron A1100 Series processors, codenamed Seattle.

Meanwhile, software vendors including Red Hat and Ubuntu Linux developer Canonical are working on a 64-bit software ecosystem to power ARM-based servers.

ARM recently announced that the 50 billionth chip containing an ARM processor core had been shipped by partners, and said the momentum in 64-bit ARM architecture is a key component in the journey toward the next 100 billion chips.

Courtesy-TheInq

Vendors Testing New Intel Xeon Processors

September 3, 2014 by Michael  
Filed under Computing

Intel is cooking up a hot batch of Xeon processors for servers and workstations, and system vendors have already designed systems that are ready and raring to go as soon as the chips become available.

Boston is one of the companies doing just that, and we know this because it gave us an exclusive peek into its labs to show off what these upgraded systems will look like. While we can’t share any details about the new chips involved yet, we can preview the systems they will appear in, which are awaiting shipment as soon as Intel gives the nod.

Based on chassis designs from Supermicro, with which Boston has a close relationship, the systems comprise custom-built solutions for specific user requirements.

On the workstation side, Boston is readying a mid-range and a high-end system with the new Intel Xeon chips, both based on the two-socket Xeon E5-2600v3 rather than the single-socket E5-1600v3 versions.

The mid-range Venom 2301-12T comes in a mid-tower chassis and ships with an Nvidia Quadro K4000 card for graphics acceleration. It comes with 64GB of memory and a 240GB SSD as a boot device, plus two 1TB Sata drives configured as a Raid array for data storage.

For extra performance, Boston has also prepared the Venom 2401-12T, which will ship with faster Xeon processors, 128GB of memory and an Nvidia Quadro K6000 graphics card. This also has a 240GB SSD as a boot drive, with two 2TB drives configured as a Raid array for data storage.

Interestingly, Intel’s new Xeon E5-2600v3 processors are designed to work with 2133MHz DDR4 memory instead of the more usual DDR3 RAM; DDR4 DIMM modules have slightly longer connectors towards the middle.

For servers, Boston has prepared a 1U rack-mount “pizza box” system, the Boston Value 360p. This is a two-socket server with twin 10Gbps Ethernet ports, support for 64GB of memory and 12Gbps SAS Raid. It can also be configured with NVM Express (NVMe) SSDs connected to the PCI Express bus rather than a standard drive interface.

Boston also previewed a multi-node rack server, the Quattro 12128-6, which is made up of four separate two-socket servers inside a 2U chassis. Each node has up to 64GB of memory, with 12Gbps SAS Raid storage plus a pair of 400GB SSDs.

Courtesy-TheInq

AMD Confirms Custom ARM Server Processors

August 14, 2014 by Michael  
Filed under Computing

As we expected, AMD will make custom ARM server chips for customers, much as it made custom chips for the Xbox One and PlayStation 4 game consoles.

Speaking during a presentation at the Hot Chips conference in Cupertino, California, AMD engineer Sean White said his outfit will consider customizing its 64-bit ARM server processor to meet specific customer needs as the market for the new type of servers evolves and the company gets better visibility of usage models.

ARM chips are unproven in servers, but the low-power processors are a good fit for web-hosting and cloud workloads. AMD’s ARM server chips could go into dense servers and process such applications while saving power, White said.

“There are more and more of those applications that are showing up in big data centers,” White said. “They don’t want traditional high-end… database type workloads.”

AMD does seem to think there is more mileage in providing customized chips for those who want something specific in an SoC or want to include some unique IP. White gave the example of possibly customizing I/O and ports for specific customers. AMD last year also started putting more emphasis on the custom chip business after the PC market declined. The company is already recording strong custom chip revenue thanks to the game consoles, which are shipping in the millions.

AMD also shared the technical details of its first 64-bit ARM processor, the Opteron A1100, code-named Seattle, at Hot Chips. The company has already started shipping the chips to server makers for testing. The first Seattle servers are expected to ship by the end of this year or early next year. One of the first servers with the new chip could be AMD’s own SeaMicro server.

The Seattle server chip has two DDR3/DDR4 memory channels, half the four memory channels typical of AMD’s x86 server chips. The ARM chip will have up to 4MB of L2 cache, with each pair of cores sharing 1MB, and a total of 8MB of L3 cache accessible to all eight cores.

The chip will also give ARM processors ECC memory support, which is important in servers for correcting data errors; 32-bit ARM processors did not have ECC memory. Each CPU core on the Opteron A1100 will support up to 128GB of memory, totaling up to 1TB across the eight cores, whereas the 32-bit ARM chips supported only up to 4GB of memory.
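
As a back-of-the-envelope check on those figures, the arithmetic below (illustrative only, using nothing beyond the numbers quoted above) shows where the old 4GB ceiling and the 1TB total come from.

    # Back-of-the-envelope check on the memory figures above (illustrative only).
    GIB = 2 ** 30

    # A 32-bit address space tops out at 2^32 bytes, hence the 4GB ceiling of older ARM chips.
    print(2 ** 32 // GIB, "GiB addressable with 32-bit addresses")   # 4

    # 128GB per core across the Opteron A1100's eight cores gives the quoted 1TB total.
    print(8 * 128, "GB total, or", 8 * 128 / 1024, "TB")             # 1024 GB, or 1.0 TB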

 

Courtesy-Fud

Will Developers Screw Up Virtual Reality Based Gaming?

August 4, 2014 by Michael  
Filed under Gaming

Whether you think it’s a fad or the next big thing, there’s no denying that the return of virtual reality, this time backed up by competent technology and plausible price-points, has caught the imagination of developers and their customers alike. Projects for Sony’s Morpheus and the Oculus Rift are popping up everywhere, from the modest to the monumental.

As yet, though, none of the major publishers have publicly committed much to the new platforms, leaving it to smaller studios to test the waters of what could potentially form an entirely new frontier for games. Many of those smaller studios are changing their models and working methods entirely to focus on the new technology, preparing to hit the ground running once consumers are finally able to get their hands on the headsets.

One of those studios is Patrick O’Luanaigh’s nDreams. A studio which has always enjoyed a broad remit, nDreams now has “around five [VR] projects on the go”, including forthcoming title The Assembly: a 3D VR adventure game which will see players investigating a ground-breaking scientific organisation which has started to push some ethical boundaries.

“We decided that an adventure game would make sense because we don’t have the budget to draw tons of environments that you run through at top speed,” Patrick tells me. “Adventure games work well because we’ve found that, when people play with VR, they want to really look around and explore. They want to examine the walls, everything, in a way you might not in a FPS.

“The game is split into sections of about 10-15 minutes long, which we thought makes sense for VR. We still don’t know what the final consumer versions will be like, but 10-15 minutes seems sensible. People can either do a chapter then take a break, or they can play through the entire game.

“We spent around six months prototyping lots of experiments with VR. What happens when your avatar wears glasses? What would it be like if it’s cold and you have frosty breath? What about different sized characters? That tested really nicely – Madeline is 5’1″ and Joel is 6 foot and you really notice that. You notice the breathing, the speed they walk at, the perspective. It’s all very different. You feel like you’re playing those roles.

“We’ve also got lots of specific things for VR, microscopes, binoculars, night vision goggles, things like that. They work really well. We’ve also got plenty of puzzles and other bits like vertigo and fear sections that we think are great for VR, so it’s a real medley.”

The Assembly is a definite step up for the developer in terms of scope and ambition, so I ask O’Luanaigh if the resource costs were pushed up even further by the technology they’re working with. In short, is making a VR game more expensive?

“I don’t know, honestly,” he admits. “It’s probably slightly more for VR, but there’s not a lot of difference. We’ve kind of picked our battle here and chosen a game we think would be great for VR, but one that we can also afford to make. This seemed like the right genre and approach. We’re taking influence from games like Gone Home and Dear Esther – with more puzzles, but still about exploring a great environment. I guess if we’d just done it as a Steam game it might have been a bit cheaper, but not a big difference.”

The Shahid Effect: Sony’s indie push & VR

Being PC-based, the Oculus Rift has a clear advantage in attracting indie developers: working on an open platform with little or no restriction. That said, Sony has made a very strong argument to small studios this generation, something it will need to continue if it wants to recruit the most exciting VR ideas. O’Luanaigh agrees, and says that there’s no need for concern on that front.

“Sony has been fantastic,” he says, enthusiastically. “We’re very lucky in that we’ve been working on Home for a number of years, so we have a good relationship with Sony. Our account manager happens to be the evangelist for Morpheus as well, so they’ve been great. They’ve been very supportive.

“We saw the Morpheus very early, it was one of the things that persuaded us to pivot away from what we were doing and spend so much time and money on VR. They’ve been really open, really helpful. I’ve got nothing but positive things to say about Sony. I can’t wait to see the final hardware that’s going to launch to consumers.”

“It’s more about the design, doing things the right way. There are a lot of ways you can mess up VR really easily. We’ve figured out what works and what doesn’t and designed the game with that in mind. It’s working really nicely.”

The Assembly is due for release on both the Oculus Rift and Sony’s Morpheus headset, currently the two mindshare leaders of virtual reality tech. Whilst neither is likely to admit it, each has a vested interest in the success of the other – a reason which was floated to explain Valve passing on some of its own VR research to Oculus last year: if the tech is to succeed it needs to attract developers. To do that, a rough ‘gold standard’ needs to be established, giving developers a technological target to aim at for cross-platform games. Having used both the Oculus and Morpheus and found them to be roughly equivalent, I’m interested to know if O’Luanaigh sees parity in the two visors.

“They are very, very similar, technology-wise,” he confirms. “Obviously with Oculus being on PC it’s a lot more open, there’s more freedom to mess around, but it’s also easier for people to just stick stuff out, to make bad VR. That’s one of the big risks – it’s very easy to make people feel ill. You have to have good software as well as hardware. I think it’s easier for Sony to control that, because it’s a closed platform. They can say, do this, do that; to make sure people don’t do stupid stuff. I suspect that Oculus will do something similar, but obviously it’s open, so people can put what they want up online.

“In terms of specs, though, they’re really very similar. We’re creating this game for both and there’s not a big difference. There are a few little things involved in supporting the PS4: the Dualshock and some of the ways that PSN works, but by and large they’re very similar.”

Moving away from comfortable ground is an essential part of growing almost any company, but when you’re relying on a third party, such as a platform holder, for your success, there’s an additional risk. nDreams must be confident about the future of virtual reality to put such a stake in it, so I ask Patrick if there’s a sales point when they’ll breathe a little more easily.

“We’ve kind of come at it the other way,” he counters. “We believe it will work. We’ve got financial models and projections but it’s all a bit finger-in-the-air, it’s very hard to know. We’re committed to doing it though, we’ve got a lot of launch titles and we’re going to be pushing and growing those. We’re lucky in that we’re financially secure enough to do that without too much stress.

“We’ve been looking at things like previous install bases of hardware on consoles. If you look at the Kinect install base, which was amazing, really – something like 35-40 per cent on the 360 – we’ve made projections on a conservative install base over time. I actually think that it’s going to be better than that, given the excitement around VR and the customer reaction when they see it, but we’re being fairly conservative. With Oculus they’ve spoken about trying to sell a million, by a set point. We’ve been working along those lines. Again, we think it’s going to do really well.

“There’s going to be other headsets out there as well, that haven’t been announced, we think those are going to be very exciting. There’s not going to just be two headsets, there’ll be a number of things over the next few years. We’re going to try and work out as best we can what we think they can sell, but we want to be there at launch with products so we can build and learn what people like and don’t like.

“It’s definitely going to be more of a core audience at launch, but I think Facebook’s acquisition of Oculus means that it’s going to be a bit cheaper than it would have been. I think they can afford to give it away at cost, which is brilliant. But it’s really hard to put a finger on how much that market is going to be worth. We think it’s going to be a couple of billion within two years, but we’ll see. We may be massively over-egging, or hugely under-estimating it. What’s clear is that there’s massive potential here, it could really explode. When you get a great VR experience it’s really special.

“I was at E3 playing Alien Isolation on Oculus and, although I’m slightly embarrassed to admit it, when it came to the end I ripped my headset off because I was so scared. You really feel like the Alien is there and actually attacking you. I’ve never done that with Dead Space or Resident Evil or anything. It really heightens your emotions.”

I can attest to just how absorbing that experience can be, having lost myself in the Morpheus demo at GDC in March. Even surrounded by other gawking journalists and nervous PR, dropping that helmet on was, in many senses, completely akin to teleportation. That demonstration wasn’t exactly a road-test, though. These were first-party, highly polished demonstrations designed to show off the potential for the new technology in a short, well-controlled session. Had my first experience been a shoddy, half-finished or poorly-executed demo instead, I might never have been interested at all. For O’Luanaigh, the responsibility for audience growth is firmly on the shoulders of developers.

“For me, it’s really important,” he tells me when I ask whether VR needs to get it right this time around. “I’m utterly convinced that VR is now a technology that’s caught up to an amazing idea and can make it work. The only thing that can ruin that is dreadful games. It’s easy to make a rubbish VR game with a bad framerate that takes control of the camera and does stupid things. That’s the worst thing that could happen, and I think that both Oculus and Sony get that. I think everyone entering the VR space gets it, but we just need to keep an eye on it.

“I hope that the press plays its part as well and makes sure that, if there’s one rogue VR game that’s snuck out and it’s dreadful, that they won’t use that to argue that VR is awful.”

Good games might be the things that get people queuing in the shops, or, more likely, clicking online, but there are clear possibilities for virtual reality which fall well outside our sphere, particularly for Oculus’ Rift. Will nDreams be dipping a toe in those waters?

“At least one or two of the projects we’re working on are non-traditional games, it’s definitely quite different. You’ll see VR spread into different areas over the next few years, although it’ll definitely start with games. Oculus aren’t showing off Facebook social pop-up sims, they’re showing off great games.

“I don’t think Facebook has changed that but I think you’ll notice them start to add stuff in over the next few years. You might see spaces where people can hang out with their friends, stuff like that. If you’ve ever read Snow Crash, I think that sort of thing is why Facebook bought Oculus. They’ve got more money now, but it’s the same people with the same values. It’s very cool to be rude about Facebook, but I think a lot of the people who were being rude about Facebook when it bought Oculus were doing it on Facebook, which is pretty ironic.”

Courtesy-GI.biz

Is Free-To-Play Always The Best Bet?

July 18, 2014 by Michael  
Filed under Gaming

To hear the likes of Electronic Arts and Gameloft tell it, premium apps are all but a relic of the past, the obsolete progenitor to mobile’s free-to-play future. But some smaller developers have found that future isn’t all it’s made out to be, and have been finding more success back on the premium side of the fence.

Kitfox Games and Double Stallion, two Montreal studios from Jason Della Rocca’s Execution Labs incubator, launched Shattered Planet and Big Action Mega Fight, respectively, on mobile in the last year. However, both titles struggled to rake in revenue, and the studios have since released more successful premium versions of the two. Kitfox’s Tanya X. Short and Double Stallion’s Nicolas Barrière-Kucharski spoke with GamesIndustry International this week to discuss their forays into free-to-play, and why more traditional business models worked better for them.

In Double Stallion’s case, part of the problem was that Big Action Mega Fight proved an awkward fit for the free-to-play format.

“We picked a genre, fighting, that was very content-driven,” Barrière-Kucharski said. “It was really very arduous to keep up and engage the audience with new levels, new enemies, and new types of content. We couldn’t compete at our size and budget with other, more established free-to-play studios and games.”

Beyond that, the genre may have been a poor fit for the audience. Barrière-Kucharski said that the people who would appreciate Big Action Mega Fight’s skill-based gameplay and faithful take on the beat-’em-up genre simply weren’t the same people interested in free-to-play games.

“I think the overlap between audiences was just too small to sustain a thriving community around the game,” Barrière-Kucharski said.

With Shattered Planet, Short said genre wasn’t a problem. She thinks the games-as-a-service model is actually a perfect fit for roguelikes like Shattered Planet, where a few new items and systems can exponentially increase the potential content for players to experience. However, Shattered Planet still didn’t fit the free-to-play mold for a few reasons.

“Free-to-play is not always suitable to single-player games,” Short said. “I think it’s best suited to multiplayer games in which it being free is actually of value to players because they can have more people to play with. That’s one philosophy we’ve developed, that if we ever do free-to-play again, we would only do it for multiplayer.”

On top of that, Shattered Planet was designed to be a tough game for players. But Short said in the free-to-play business model, difficulty can be “a dangerous thing.”

“We made a difficult game, and the fact that it was free made people suspicious, and rightfully so,” Short said. “I think they had every right to be a little bit paranoid about why the game was difficult. And in a business model where difficulty generally does often make people spend more, I think a designer’s hands are tied as to how and when a game can be difficult and when it’s ethical. So we felt a lot more comfortable about making a premium game, and me as the designer, I was happier because we could say sincerely that it’s exactly as difficult as we wanted it to be and you can’t say it was greedy or whatever.”

Both games have found more success since they were released as premium versions. Big Action Mega Fight was re-launched last month as a $3 app ($2 during a first-week sale); those who downloaded the free-to-play version received the upgrade to the premium version as a free title update. Even though the free version of the game was downloaded about 400,000 times, Barrière-Kucharski said the revenues from Big Action Mega Fight’s first week as a paid app topped the total lifetime income from the free-to-play version since its November debut. To date the company has sold about 3,600 copies of Big Action Mega Fight on iOS, Android, Amazon Fire, and Ouya.

Kitfox took a different approach to the premium switch, continuing to run the free-to-play Shattered Planet mobile app while also releasing a premium PC version on Steam with a $15 price tag and no monetization beyond that. The results were similarly positive, as Short said the studio made as much on Steam in one day as it had on mobile in two months. In its first week, Shattered Planet sold 2,500 copies on Steam. Short is happy to see the game bringing in more money, but she confessed to being a little bit torn on the trade-off it required.

“It really was great seeing that we had 300,000 downloads on mobile,” Short said. “We had 300,000 people play Shattered Planet on iOS and Android, and that’s amazing. Sure, it looks like we’re going to make two to five to 10 times more money on Steam, but it’s only going to be 1 percent of the amount of people that could see it if we tried to release it free, in theory… It’s a little bit sad that you monetize better with fewer people. When you’re trying to get your brand and your name out there, it is sad we couldn’t have another few hundred thousand people.”

Beyond the trade-off of settling for a smaller but more supportive audience, Kitfox has encountered some negative effects of releasing Shattered Planet as a free-to-play mobile title and then as a PC premium game.

“For us, a lot of people remained skeptical of the quality of the game if they knew the mobile version existed,” Short said. “I don’t think that really has that much to do with free-to-play and more to do with platform snobbery. It’s just kind of a general feeling of console and PC gamers that if a game was ever on mobile, it couldn’t possibly be as feature-rich or as deep, as strategic or anything like that.”

On top of that, there was some customer confusion over the game and its business model. Short said the game’s forums on Steam had some angry users saying they wouldn’t buy the game because it had in-app purchases (which it didn’t). Although the developers were able to post in the threads and clear things up, that sort of inconsistency has convinced them that if they ever do return to mobile platforms, they will stick to a free demo or companion app rather than something monetized.

“It’s just so dominated by giant players,” Short said of the mobile scene. “It’s such a completely different market that I think you really have to focus on it, and that’s not my team’s expertise. For us, we’re definitely going to be focused on PC and console; I think that’s where our talents are.”

Barrière-Kucharski agreed, saying that even if a niche audience is willing to pay for a certain experience, there just aren’t good ways for developers to connect to that audience.

“It’s really hard to be found or be discovered by players,” Barrière-Kucharski said. “I’m really looking forward to all the curation issues that are going to be tackled in the next year or so on iOS 8 and the Steam Greenlight update.”

But even if those initiatives follow through on their promises of improving discoverability, Barrière-Kucharski worries that the problem could still get worse as the gains made won’t be enough to offset the flood of new developers entering the field. Short also saw discoverability as a key problem facing developers right now, but stressed that finding a solution is in the best interests of the platform holders.

“Whatever platform figures out discoverability first will have a huge advantage because there are these thousands of developers that as soon as they hear there is any discoverability, that’s where they’re going to flood for sure,” Short said. “So it is almost a race at the moment between Steam and Apple and Google.”

Courtesy-GI.biz

 

MediaTek Shows Off 64-bit SoC LTE Chip

July 16, 2014 by Michael  
Filed under Computing

Mediatek has unveiled what it claims is the “world’s first” 64-bit octa-core LTE smartphone system on chip (SoC) with 2K display support.

Named the Mediatek MT6795, the chip is designed for use by high-end device makers for upcoming Android 64-bit mobile operating systems like the recently announced Android L, with support for 2K display up to 2560×1600 resolution.

The chip also features a clock speed of up to 2.2GHz along with Corepilot, Mediatek’s technology that aims to deliver higher performance per watt, increasing battery life on mobile devices without sacrificing performance while bringing the power of eight cores on board.

The SoC also provides 4G LTE support, Mediatek said, as well as dual-channel LPDDR3 clocked at 933MHz for “top-end memory bandwidth” in a smartphone.

Mediatek VP and GM for Europe Siegmund Redl told The INQUIRER in a media briefing that the announcement is in line with the industry’s growth in the smartphone arena.

“There has been a discussion about ‘how many cores do you really need’ and what is the benefit [of octo-core],” Redl said. “Quad-core is pretty much mainstream today and application developers are exploiting the fact they can do multithreading and pipelining and parallel computing with handheld devices.

“This will not change with octa-core. When we started to introduce the first octa-core we were showing off a game with very intense graphics and processing that needed the support of multiple cores and again this is the way the industry is going; you bring out the hardware and the software development follows that and takes advantage of it and the user experience is a smoother one.”

The firm claims that the SoC features multimedia subsystems that support many technologies “never before possible or seen in a smartphone”, including support for 120Hz displays.

“With multimedia we raised the bar in terms of recording frames per second, such as slow motion replay with 480 frames per second, for much better user experience,” Redl added.

Multi-mode wireless charging is also supported by the SoC’s companion multi-mode wireless power receiver chip.

The Mediatek MT6795, dubbed the chip for “power users”, joins the firm’s MT6752 SoC for mainstream users and MT6732 SoC for entry level users. It’s the 64-bit version of the 32-bit MT6595 SoC that was launched at Mobile World Congress earlier this year, which features four ARM Cortex A17 cores and four Cortex A7 cores as well as Imagination Technologies PowerVR Series6 GPU for “high-performance graphics”.

Redl said that existing customers that use the MT6595 today for devices that are soon to be hitting the market can reuse the designs they have for the older chip as “they have a pin compatible drop-in with a 64-bit architecture”.

Redl said Mediatek will make the MT6795 chip commercially available by the end of the year, for commercial devices coming in early January or February.

Courtesy-TheInq

 

Panasonic Goes Intel For SoC

July 9, 2014 by Michael  
Filed under Computing

Panasonic and Intel have just announced that Panasonic’s SoC chips will be made using Intel’s 14nm process.

Panasonic is joining Altera, Achronix Semiconductor, Tabula, Netronome and Microsemi on an ever growing list of Intel foundry clients. We expect the list to expand over time. There have been some rumours that Cisco is planning to make its chips at Intel, too.

Keep the fabs busy

In our recent conversations with a few industry insiders we learned that Intel wants to keep its fabs busy and occupied. This is rather obvious and makes perfect sense, as investing in a transition to a new manufacturing node costs a few billion dollars on a good day.

Intel has announced its Core M Broadwell processors that are coming in the latter part of this year and this will be just a fraction of what Intel plans to manufacture in its new 14nm fabs. Intel Airmont, Morganfield as well as Cherryview and Willowview Atoms, all 14nm designs, will also try to keep the fabs busy.

Lower power with 14nm SoC

Panasonic is planning to make 14nm next-generation SoCs that will target audio-visual equipment markets and will enable higher levels of performance, power efficiency and viewing experience for consumers.

The 14nm low power process technology with second generation Tri-Gate transistors will help Panasonic to decrease the overall power consumption of its devices. We expect these SoCs to be used in future 4K TVs as well as set-top boxes, and possibly for upscaling in Blu-ray players.

TSMC will start making 20nm chips later this year and Nvidia might be among the first clients to use it for its upcoming Maxwell 20nm GPUs. Other players will follow as Qualcomm has started making modems in 20nm and will soon move some of its SoC production to this new manufacturing node. Of course, AMD’s 20nm GPUs are in the works, too.

Intel’s 14nm is still significantly more power optimised than the 20nm process offered by TSMC and the GlobalFoundries-Samsung alliance, but Intel is probably not offering its services for pennies either.

Intel is known as a high margin company and we don’t see this changing overnight. One of Intel’s biggest challenges in 2015 and beyond is to keep the fabs busy at all times. It will try to win more mobile phone business and it is really pushing to win its spot in the wearable technology market, as the phone market seems oversaturated.

Wearables offer a clean start for Intel, but first steps in any new market are usually hard. Android Wear ARM-based watches that just hit the market will complement or replace wearable wristbands like the Fitbit, which is based on an ARM Cortex-M3. Intel wants to make shirts with chips inside, and the more success it has in bringing cheap SoCs into our lives, the more chips it can sell. Panasonic will just help keep the fabs busy until Intel manages to fill them with its own chips.

Courtesy-Fud

ARM Launches Juno Hardware Development Program

July 7, 2014 by Michael  
Filed under Computing

ARM has announced two programs to assist Android’s ascent into the 64-bit architecture market.

The first of those is a port of the Android Open Source Project to the 64-bit ARMv8-A architecture, carried out by Linaro. ARM said the port was done on a development board codenamed “Juno”, which is the second initiative to help Android reach the 64-bit market.

The Juno hardware development platform includes a system on chip (SoC) powered by a quad-core ARM Cortex-A53 CPU and a dual-core ARM Cortex-A57 CPU in an ARM big.LITTLE processing configuration.

Juno is said to be an “open, vendor neutral ARMv8 development platform” that will also feature an ARM Mali-T624 graphics processor.

Alongside the news of the 64-bit initiatives, ARM also announced that Actions Semiconductor of China signed a license agreement for the 64-bit ARM Cortex-A50 processor family.

“Actions provides SoC solutions for portable consumer electronics,” ARM said. “With this IP license, Actions will develop 64-bit SoC solutions targeting the tablet and over-the-top (OTT) set-top box markets.”

The announcements from ARM come at an appropriate time, as it was only last week that Google announced the latest version of its Android mobile operating system, Android L, which comes with support for 64-bit processors. ARM’s latest developments mean that Android developers are likely to take advantage of them in the push to take Android to the 64-bit market.

Despite speculation that it would launch as Android 5.0 Lollipop, Google outed its next software iteration on Wednesday last week as simply Android L, touting the oddly-named iteration as “the largest update to the operating system yet”.

Courtesy-TheInq

 

Intel Discusses Knights Landing

June 26, 2014 by Michael  
Filed under Computing

Intel has disclosed more details about its next generation Xeon Phi processor for high-performance computing (HPC), codenamed Knights Landing.

Knights Landing chips are due to be available in the second half of 2015, along with a new interconnect fabric known as Intel Omni Scale.

The chipmaker showed off the updates to its Xeon Phi many integrated core (MIC) platform at the International Supercomputing Conference (ISC) 2014 conference in Leipzig, Germany.

Intel announced Knights Landing at last year’s event, but gave away few details other than that the chip will be a 14nm part, will be able to operate as a standalone CPU rather than a co-processor, and will have integrated on-package memory.

Now, Intel has disclosed that this chip will be based on a version of the Silvermont core used in Intel’s Atom processors, with HPC enhancements including a low-latency mesh for inter-core communication. The first commercial systems with it are likely to ship in the second half of 2015.

Knights Landing will also have “at least as many compute cores” as the existing Xeon Phi products, according to Charles Wuischpard, VP for Workstations and HPC in Intel’s Data Centre Group. This means at least 61 cores, while rumours have already indicated it may in fact be a 72-core chip.

The first Knights Landing chips will have up to 16GB of on-package memory, which offers five times the bandwidth of DDR4 memory, but this is expected to be in addition to DDR4 memory on the motherboard, not replacing it.

“One of the choke points in many applications used today is I/O and memory bandwidth, and this is specifically designed to remove that bottleneck,” Wuischpard explained.

Knights Landing will in fact offer three times the performance of the current Knights Corner Xeon Phi product, offering over three teraflops in a single processor socket, Intel claimed.

However, with the Silvermont Atom cores it also continues Intel’s approach to HPC, which is to keep as much compatibility as possible between its Xeon Phi architecture and its existing x86 chips with their huge installed base of software.

Knights Landing will also feature a new interconnect fabric that will be integrated onto the chip, and which Intel is referring to as Omni Scale. This will be the fabric used in future Xeon Phi chips, according to Wuischpard.

Intel is not giving away too much detail on Omni Scale yet, but said it is different from the current True Scale fabric in Knights Corner, which is based on quad data rate (QDR) InfiniBand technology, while maintaining software compatibility.

It will use Intel’s Silicon Photonics fibre-optic technology, and will encompass a full suite of offerings including PCI Express adapters and switch hardware. Intel will provide an upgrade path from True Scale to Omni Scale, Wuischpard said.

Courtesy-TheInq

 

Will AMD’s Mantle See Success On Linux?

June 20, 2014 by Michael  
Filed under Computing

AMD is planning to bring its new Mantle API to Linux in the near future. Although Linux is not a big gaming platform at the moment, SteamOS could change all that starting next year.

AMD’s Richard Huddy says the decision was prompted by requests from developers who would like to see Mantle on Linux. However, he stopped short of specifying a launch date. Huddy confirmed that AMD plans to dedicate resources to bringing Mantle to Linux, but other than that we don’t have much to go on.

Mantle on SteamOS makes a lot of sense

Mantle is designed to cut CPU overhead and offer potentially significant performance improvements on certain hardware configurations. This basically means gamers can save a few pennies on their CPU and use them towards a better GCN-based graphics card.

However, aside from enthusiasts who build their own gaming rigs, the world of PC gaming is also getting a lot of attention from vendors specialising in out-of-the-box gaming PCs and laptops. Many of them have already announced plans to jump on the SteamOS bandwagon with Steam Machines of their own.

Should Mantle become available on Linux and SteamOS, it would give AMD a slight competitive edge, namely in the value department. In theory vendors should be able to select a relatively affordable APU and discrete GPU combo for their Steam boxes.

AMD already tends to provide good value in the CPU department. The prospect of using mainstream APUs backed by cheap discrete Radeons (or even Dual Graphics systems) sounds interesting.

It will take a while but the potential is there

Huddy told PC World that Mantle has some clear advantages over DirectX. Microsoft’s new DirectX 12 API has already been announced, but the first games to support it won’t arrive until late 2015.

“It (Mantle) could provide some advantages on Steam boxes,” said Huddy. “We are getting requests to deliver this high-performance layer.”

While DirectX 12 will be very relevant in the PC space, the same obviously cannot be said of Linux and SteamOS. Therefore Mantle on Linux makes a lot of sense. However, it all depends on AMD’s timetable.

Last month Valve announced Steam Machines would be pushed back to 2015. They were originally supposed to launch this summer and the first announcements were made months ago. The first designs were based on Intel and Nvidia silicon, but support for AMD hardware was added just a bit later.

When Valve announced the delay we argued that it could have a silver lining for AMD. It simply gives AMD more time to improve its drivers or add Mantle support, something Nvidia and Intel do not have to worry about.

It still remains to be seen whether Steam Machines can make a big dent in the gaming market. PC gaming is going through a renaissance, but the latest consoles are doing well, too (apart from the Wii U). The concept is very attractive on more than one level, but it is very difficult to make any predictions yet, since we are still about 15 months away from launch.

Courtesy-Fud

20nm SoCs En Route To Production

June 20, 2014 by Michael  
Filed under Computing

The transition to 20nm has been anything but fast and much of the industry has been stuck at 28nm for a while, but the first 20nm products are coming as we speak.

TSMC’s 20nm process is almost ready for prime time, but volume production is still a couple of months away. However, some outfits do not need great yields and huge volumes, and one maker of bitcoin mining ASICs says it will become the first outfit to ship 20nm products this week. Sweden-based KnCMiner received the first batch of 20nm Neptune ASICs earlier this week and it says it should start shipping finalized mining rigs by the end of the week.

Most ARM-based SoCs and practically every GPU on the market today are 28nm designs. The first 20nm SoCs should arrive by the end of the year, courtesy of Qualcomm and possibly Apple. Nvidia and AMD were expected to introduce 20nm GPUs sometime in the second half of 2014, but it is becoming increasingly apparent that we won’t see them until a bit later, with volume production slated for 2015.

The KnCMiner Neptune is a relatively big chip, with 1440 cores in a 55x55mm package, but there is no word on die size. The miner will use five chips and churn out 3TH/s while consuming 2.1kW. Although KnCMiner does not talk about the foundry, it appears that we are looking at TSMC silicon. However, this does not mean we will see mainstream chips manufactured on the new node anytime soon.
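
Taking the quoted figures at face value, a quick calculation (illustrative only) puts the rig’s efficiency at roughly 0.7 joules per gigahash:

    # Rough efficiency estimate from the figures quoted above (illustrative only).
    power_w = 2100         # 2.1kW quoted power draw
    hashrate_gh_s = 3000   # 3TH/s, i.e. 3000GH/s quoted hash rate

    # A watt is a joule per second, so W divided by GH/s gives joules per gigahash.
    print(f"~{power_w / hashrate_gh_s:.2f} J/GH")   # ~0.70 J/GH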

Cryptocurrency mining is a relatively small niche and many miners are willing to take big risks and pay through the nose for the latest kit in an effort to gain an upper hand in the mining hardware arms race. Mining ASICs don’t require great yields or big volumes, as the designers can operate with much higher margins than consumer chip outfits.

It is a risky space that has already seen a number of spectacular flops, but the promise of quick cash and downright ridiculous ROI is still attracting a lot of (greedy) risk takers. As a result there is a lot of demand and pre-orders are the norm.

Regardless of the controversy surrounding this very risky industry, it is hard not to be impressed by KnC’s feat, as the company states it has managed to beat big chipmakers to 20nm – and it has, albeit in a very tight niche.

Courtesy-Fud

Intel’s New Core i7 Chip Capable Of Hitting 5GHz

June 4, 2014 by mphillips  
Filed under Computing

Intel is shipping a new Core i7 chip for the serious gamer that runs at 4.4GHz — and can be overclocked to 5GHz.

The Core i7-4790K is a quad-core chip based on the Haswell microarchitecture. It draws 88 watts of power and has 8MB of cache, integrated graphics, memory controllers and support for the latest I/O technologies. It also supports Hyper-Threading, which allows each core to process two tasks at a time.
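
That simultaneous multithreading is why the operating system sees eight logical processors on a four-core part. A quick way to see the distinction (a sketch that assumes the third-party psutil package is installed) is:

    # Sketch: physical cores versus logical processors on a Hyper-Threading CPU.
    # Assumes the third-party psutil package is installed (pip install psutil).
    import psutil

    physical = psutil.cpu_count(logical=False)   # physical cores, e.g. 4 on a Core i7-4790K
    logical = psutil.cpu_count(logical=True)     # hardware threads, e.g. 8 with Hyper-Threading
    print(f"{physical} physical cores, {logical} logical processors")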

The chip, now Intel’s flagship PC processor, is mainly for gaming and enthusiast desktops.

It’s Intel’s first chip capable of running at over 4GHz under normal conditions. It can be overclocked to 5GHz in air-cooled systems, said Renee James, president of Intel, during a keynote speech at the Computex trade show in Taipei.

Intel’s not the first chip company to reach 5GHz though: Advanced Micro Devices offers FX chips for gamers with clock speeds of up to 5GHz.

Chip makers moved away from cranking up chip clock speeds in favor of adding cores as a way to boost performance about a decade ago. Bumping up clock speeds generated more heat and consumed more electricity. Performance improvements over time have also come by shrinking chips and integrating more components such as graphics cores.
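
The usual explanation is that dynamic power in CMOS logic scales roughly with capacitance x voltage^2 x frequency, and higher clocks generally demand higher voltage, so power climbs much faster than performance. The toy numbers below are made up purely to illustrate that compounding, not to describe any particular chip:

    # Toy illustration of CMOS dynamic power scaling: P is roughly C * V^2 * f.
    # The numbers are invented; the point is that raising the clock usually also
    # means raising the voltage, so power grows far faster than frequency.
    def dynamic_power(capacitance, voltage, frequency):
        return capacitance * voltage ** 2 * frequency

    base = dynamic_power(1.0, 1.0, 1.0)        # baseline chip
    faster = dynamic_power(1.0, 1.15, 1.25)    # 25% higher clock needing 15% more voltage
    print(f"+25% clock, +15% voltage -> {faster / base:.2f}x the dynamic power")   # about 1.65x

Doubling the core count at the base clock, by contrast, roughly doubles throughput for roughly double the power, which is why the industry shifted its focus to multi-core designs and tighter integration instead.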

But AMD and Intel haven’t given up on clock speed altogether: They continue the battle on their flagship chips with the aim of capturing the performance crown.