Does Samsung Fear A Processor War?

October 15, 2014 by Michael  
Filed under Computing

Samsung Electronics CEO Kwon Oh-hyun has said he is not worried about a price war in the semiconductor industry next year, even though the firm is rapidly expanding its production volume.

“We’ll have to wait and see how things will go next year, but there definitely will not be any game of chicken,” said Kwon, according to Reuters, suggesting the firm will not take chip rivals head on.

Samsung has reported strong profits for 2014 owing to better-than-expected demand for PC and server chips. Analysts have also forecast similar results for the coming year, so things are definitely looking good for the company.

It emerged last week that Samsung will fork out almost $15bn on a new chip facility in South Korea, representing the firm’s biggest investment in a single plant.

Samsung hopes the investment will bolster profits in its already well-established and successful semiconductor business, and help to maintain its lead in memory chips and grow beyond the declining sales of its smartphones.

According to sources, Samsung expects its chip production capacity to increase by a “low double-digit percentage” once the facility begins production, which sits awkwardly alongside the CEO’s insistence that it is not looking for a price war.

Last month, European regulators fined Samsung €138m for its involvement in a price-fixing racket with a group of other chip makers stretching back over a decade.

An antitrust investigation into chips used in mobile device SIM cards found that Infineon, Philips and Samsung colluded to artificially manipulate the price of SIM card chips.

Courtesy-TheInq

Will The Chip Industry Take A Fall?

October 14, 2014 by Michael  
Filed under Computing

Microchip Technology has managed to scare Wall Street by warning of an industry downturn. This follows rumours that a number of US semiconductor makers with global operations are seeing reduced demand for chips in regions ranging from Asia to Europe.

Microchip Chief Executive Steve Sanghi warned that the correction will spread more broadly across the industry in the near future. Microchip expects to report sales of $546.2 million for its fiscal second quarter ending in September. The company had earlier forecast revenue in a range of $560 million to $575.9 million. Semiconductor companies’ shares are volatile at the best of times and news like this is the sort of thing that investors do not want to hear.
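
To put the warning in perspective, here is a quick back-of-the-envelope calculation, using only the figures quoted above, of how far the pre-announcement falls short of the earlier guidance:

    // Sizing Microchip's warning: pre-announced sales vs. earlier guidance.
    // All figures in millions of dollars, taken from the article above.
    #include <cstdio>

    int main() {
        const double actual = 546.2, low = 560.0, high = 575.9;
        const double mid = (low + high) / 2.0;
        printf("shortfall vs low end of guidance: %.1f%%\n", (low - actual) / low * 100.0);  // ~2.5%
        printf("shortfall vs guidance midpoint:   %.1f%%\n", (mid - actual) / mid * 100.0);  // ~3.8%
        return 0;
    }

A miss of roughly 2.5 to 4 per cent sounds small on paper, but coming from a company the market treats as an industry barometer, it was enough to rattle the whole sector.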

Trading in Intel, which is due to report third-quarter results tomorrow, was 2.6 times the usual volume. Micron, which makes dynamic random access memory, or DRAM, was the third-most traded name in the options market. All this seems to suggest that the market is a bit spooked, and much will depend on what Chipzilla tells the world tomorrow as to whether it goes into a nosedive.

Courtesy-Fud

Will Sony’s Morpheus Succeed?

September 30, 2014 by Michael  
Filed under Gaming

PS4 is going gangbusters, 3DS continues to impress, Steam and Kickstarter have between them overseen an extraordinary revitalisation of PC gaming, and mobile gaming goes from strength to strength; yet it’s absolutely clear where the eager eyes of most gamers are turned right now. Virtual reality headsets are, not for the first time, the single most exciting thing in interactive entertainment. At the Tokyo Game Show and its surrounding events, the strongest contrast to the huge number of mobile titles on display was the seemingly boundless enthusiasm for Sony’s Morpheus and Oculus’ Rift headsets; at Oculus’ own conference in California the same week, developers were entranced by the hardware and its promise.

VR is coming; this time, it’s for real. Decades of false starts, disappointments and dodgy Hollywood depictions will finally be left behind. The tech and the know-how have finally caught up with the dreams. Immersion and realism are almost within touching distance, a deep, involved experience that will fulfil the childhood wishes of just about every gamer and SF aficionado while also putting clear blue water between core games and more casual entertainment. The graphical fidelity of mobile devices may be rapidly catching up to consoles, but the sheer gulf between a VR experience and a mobile experience will be unmistakeable.

That’s the promise, anyway. There’s no question that it’s a promise which feels closer to fulfilment than ever before. Even in the absence of a final consumer product or even a release date, let alone a killer app, the prototypes and demos we’ve seen thus far are closer to “true” virtual reality than many of us had dared to hope. Some concerns remain; how mainstream can a product that relies on strapping on a headset to the exclusion of the real world actually become? (I wouldn’t care to guess on this front, but would note that we already use technology in countless ways that would have seemed alien, anti-social or downright weird to people only a generation ago.) Won’t an appreciable portion of people get motion sickness? (Perhaps; only widespread adoption will show us how widespread this problem really is.) There’s plenty to ponder even as the technology marches inexorably closer.

One thing I found myself pondering around TGS and Oculus Connect was the slightly worrying divergence in the strategies of Sony and Oculus. A year or even six months ago, it felt like these companies, although rivals, were broadly marching in lock step. Morpheus and Rift felt like very similar devices – Rift was more “hobbyist” yet a little more technically impressive, while Morpheus was more clearly the product of an experienced consumer products company, but in essence they shared much of the same DNA.

Now, however, there’s a clear divergence in strategy, and it’s something of a concern. Shuhei Yoshida says that Morpheus is 85% complete (although anyone who has worked in product development knows that the last 10% can take a hell of a lot more than 10% of the effort to get right); Sony is seemingly feeling reasonably confident about its device and has worked out various cunning approaches to make it cost effective, from using mobile phone components through to repurposing PlayStation Move as a surprisingly effective VR control mechanism.

By contrast, Oculus Connect showed off a new prototype of Rift which is still clearly in a process of evolution. The new hardware is lighter and more comfortable – closer to being a final product, in short – but it’s also still adding new features and functionality to the basic unit. Oculus, unlike Sony, still doesn’t feel like a company that’s anywhere close to having a consumer product ready to launch. It’s still hunting for the “right” level of hardware capabilities and functionality to make VR really work.

I could be wrong; Oculus could be within a year of shipping something to consumers, but if so, they’ve got a damned funny way of showing it. Based on the tone of Oculus Connect, the firm’s hugely impressive technology is still in a process of evolution and development. It barely feels any closer to being a consumer product this year than it did last year, and its increasingly complex functionality implies a product which, when it finally arrives, will command a premium price point. This is still a tech company in a process of iteration, discovering the product they actually want to launch; for Luckey, Carmack and the rest of the dream team assembled at Oculus, their VR just isn’t good enough yet, even though it’s moving in the right direction fast.

Sony, by contrast, now feels like it’s about to try something disruptive. It’s seemingly pretty happy with where Morpheus stands as a VR device; now the challenge is getting the design and software right, and pushing the price down to a consumer friendly level by doing market-disruptive things like repurposing components from its (actually pretty impressive) smartphones. Again, it’s possible that the mood music from both companies is misleading, but right now it feels like Sony is going to launch a reasonably cost-effective VR headset while Oculus is still in the prototyping phase.

These are two very different strategic approaches to the market. The worrying thing is that they can’t both be right. If Oculus is correct and VR still needs a lot of fine-tuning, prototyping and figuring out before it’s ready for the market, then Sony is rushing in too quickly and risks seriously damaging the market potential of VR as a whole with an underwhelming product. This risk can’t be overstated; if Morpheus launches first and it makes everyone seasick, or is uncomfortable to use for more than a short period of time, or simply doesn’t impress people with its fidelity and immersion, then it could see VR being written off for another decade in spite of Oculus’ best efforts. The public are fickle and VR has cried wolf too many times already.

If, on the other hand, Sony is correct and “good enough” VR tech is pretty much ready to go, then that’s great for VR and for PS4, but potentially very worrying for Oculus, who risk their careful, evolutionary, prototype after prototype approach being upended by an unusually nimble and disruptive challenge from Sony. If this is the case (and I’ve heard little but good things about Morpheus, which suggests Sony’s gamble may indeed pay off) then the Facebook deal could be either a blessing or a curse. A blessing, if it allows Oculus to continue to work on evolving and developing VR tech, shielding them from the impact of losing first-mover advantage to Sony; a curse, if that failure to score a clear win in the first round spooks Facebook’s management and investors and causes them to pull the plug. That’s one that could go either way; given the quality of the innovative work Oculus is doing, even if Sony’s approach proves victorious, everyone should hope that the Oculus team gets an opportunity to keep plugging away.

It’s exciting and interesting to see Sony taking this kind of risk. These gambles don’t always pay off, of course – the company placed bets on 3D TV in the PS3 era which never came to fruition, for example – but that’s the nature of innovation and we should never criticise a company for attempting something truly interesting, innovative and even disruptive, as long as it passes the most basic of Devil’s Advocate tests. Sony has desperately needed a Devil’s Advocate in the past – Rolly, anyone? UMD? – but Morpheus is a clear pass, an interesting and exciting product with the potential to truly turn around the company’s fortunes.

I just hope that in the company’s enthusiasm, it understands the absolute importance of getting this right, not just being first. This is a quality Sony was famed for in the past; rather than trying to be first to market in new sectors, it would ensure that it had by far the best product when it launched. This is one of the things which Steve Jobs, a huge fan of Sony, copied from the company when he created the philosophies which still guide Apple (a company that rarely innovates first, but almost always leapfrogs the competition in quality and usability when it does adopt new technology and features). For an experience as intimate as VR – complete immersion in a headset, screens mere centimetres from your eyes – that’s a philosophy which must be followed. When these headsets reach the market, what will be most important isn’t who is first; it isn’t even who is cheapest. The consumer’s first experience must be excellent – nothing less will do. Oculus seems to get that. Sony, in its enthusiasm to disrupt, must not lose sight of the same goal.

Courtesy-GI.biz

Will Oculus Go Into The Mobile Space?

September 25, 2014 by Michael  
Filed under Gaming

At the first ever Oculus Connect conference, with the beats and chatter of a cocktail reception just next door, Max Cohen is being brutally honest about the company’s mobile-based virtual reality headset.

“I can spend ten minutes talking about the problems with this device. We’re not afraid of them,” the VP of mobile says with a smile.

“It overheats if you run it too long. It is 60 Hertz low persistence, which means some people will notice flicker. The graphical quality is obviously a lot less than the PC. Battery life is a concern. There’s no positional tracking.

“We could try to say this is the be-all end-all of VR. We’d be lying. That’s a bad thing. We would hurt where we can get to, the be-all end-all of VR. Everyone, Samsung, Facebook, Oculus, we’re all aligned with making a damn good product that we put out in the market and then working on improving it. Really soon, maybe even sooner than you think, we’ll get to that amazing VR experience for everyone.”

Cohen’s talking about the Gear VR, the Samsung-backed headset that offers a more portable and accessible entry into the virtual reality world for developers and users alike. It’s John Carmack’s passion project at the company and clearly it’s Cohen’s too.

“The first thing they did was to put me in the HD prototype with the Tuscany demo. I was floored, of course,” he remembers.

“Then I got to see the Valve room and then he showed me this mobile project. It was running on a Galaxy S4 at the time. It crashed a little bit. There were a lot of problems with it, but I just thought this was so amazing. I went back and was talking to a friend of mine who’s an entrepreneur. He said it’s rare that you have the opportunity to work on transformational hardware, and that’s really what this was.”

The story of the Gear VR is a simple one; Oculus went to the Korean company hoping to work with them on screens for the PC-based Rift and found Samsung had been working on a headset you could simply slide a Samsung Galaxy phone into to experience virtual reality. Now the companies are working together on both devices, with Samsung fielding calls from Carmack on a regular basis.

“It’s a collaboration. It’s not we tell them what to do or they tell us what to do,” Cohen continues. “We’re the software platform, so when you put that on, you’re in Oculus, but that wouldn’t be possible without maximizing the hardware. Carmack and our team works very closely with their engineering team. They make suggestions about UI as well. We’re working together to make the best possible experience. If it wasn’t collaborative, this thing just honestly wouldn’t function because this is really hard to do.”

The focus of Oculus Connect isn’t the media or sales or even recruitment, but developers: supporting them, showing them the technology, offering them advice on the new territory that is virtual reality. Cohen, like everyone else I speak to over the weekend, believes developers and their content are absolutely key to the success of the hardware.

“At the end of the day, we want to make the developers’ lives as easy as possible so they can make cool content.”

That content will be supported by an app store, and Cohen wants it to be a place where developers can make a living, rather than just a showcase of free demos. Jason Holtman, former director of business development at Valve, is overseeing its creation.

“We’re going to launch initially with a free store, but maybe a month later, follow along with commerce,” says Cohen.

“At the end of the day, as great as doing the art for free and sharing that is, we will have a hundred times more content when people can actually monetize it. This is a business. There’s nothing wrong with that. People need to be able to feed themselves. Our job is to make the platform as friendly for developers as we can so that it’s painless. You don’t have to worry about a bunch of overhead.”

There’s a sense that the Facebook money, that headline-grabbing $2 billion, has given the team the luxury of time and the chance to recruit the people they need to make sure this time virtual reality lives up to its promises. Other than that, Facebook seems to be letting Oculus just get on with it.

“That’s the thing… a lot of people, with the Facebook acquisition, asked how that would impact us and the answer is it hasn’t, in terms of our culture, and Facebook’s actually supportive of the way Oculus is because we know that content makes or breaks a platform,” says Cohen.

“They invested in the platform. They didn’t buy it. What they did is they’re also committing money to make sure it’s successful on an ongoing basis. We could have continued to raise a lot of venture capital. It would have been very expensive to do it right. Now we have replaced our board of directors with Facebook, but that’s completely fine. They are helping us. They are accelerating our efforts.”

No one at Oculus is talking about release dates for consumer units yet, and Cohen is no different. It’s clear that he and the team are hungry for progress as he talks about skipping minor updates and making major advances. He talks about “awesome” ideas that he’s desperate to get to, and pushing the envelope, but what matters most is getting it right.

“I think everyone understands that with a little bit more magic, VR can be ubiquitous. Everyone needs it. I think a lot of people understand what we need to do to get there, but it takes hard work to actually solve those things. Oculus and Facebook have lined up the right team to do it, but I want us to actually have time to do that,” says Cohen.

“We’re not trying to sell millions now. We’re trying to get people and early adopters, tech enthusiasts and all that interested in it.”

Courtesy-GI.biz

Will AMD’s FreeSync Appear In Early 2015?

September 22, 2014 by Michael  
Filed under Computing

Last week in San Francisco we spent some time with Richard Huddy, AMD’s chief gaming scientist, to get a glimpse of what is going on in the world of AMD graphics. Of course we touched on Mantle, AMD’s future in graphics and FreeSync, the company’s alternative to Nvidia’s G-Sync.

Now, a week later, AMD is ready to announce that scaler manufacturers MStar, Novatek and Realtek are readying silicon for DisplayPort Adaptive-Sync and AMD’s Project FreeSync. They should be done by the end of the year, with monitors shipping in Q1 2015.

FreeSync will prevent frame tearing, which occurs because the graphics card often pushes more (or fewer) frames than the monitor can draw; this lack of synchronisation creates quite annoying frame tears.

FreeSync will allow Radeon gamers to synchronise display refresh rates and GPU frame rates, enabling tear-free and stutter-free gaming along with low input latency. We still do not have the specs or names of the new monitors, but we can confirm that they will use robust DisplayPort receivers from MStar, Novatek and Realtek, appearing in 144Hz QHD 2560×1440 panels and in UHD 3840×2160 panels at up to 60Hz.
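
To see why synchronising the two rates matters, here is a toy model of our own (not AMD code) of a GPU delivering frames at varying speeds to a fixed 60Hz panel; a frame that completes mid-scan-out tears with vsync off and stalls with vsync on, while an adaptive-sync panel simply begins its refresh when the frame is ready:

    // Toy model: variable GPU frame times against a fixed 60Hz scan-out.
    // Illustrative only -- real drivers, scalers and panels are far more involved.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double refresh_ms = 1000.0 / 60.0;                         // fixed 60Hz panel
        const double frame_ms[] = {14.0, 21.0, 12.5, 30.0, 16.7, 19.0};  // variable GPU load
        double now = 0.0;
        int mid_refresh = 0;
        for (double ft : frame_ms) {
            now += ft;                                                   // frame finishes here
            double next_tick = std::ceil(now / refresh_ms) * refresh_ms; // next scan-out boundary
            if (next_tick - now > 0.01)
                ++mid_refresh;  // vsync off: visible tear; vsync on: stall until next_tick
            // With adaptive sync the panel starts scanning out at 'now' instead
            // (within its supported refresh range), so neither artefact occurs.
        }
        printf("%d of 6 frames completed mid-refresh\n", mid_refresh);
        return 0;
    }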

It took Nvidia quite some time to get G-Sync monitors off the ground. We expect to see the first 4K G-Sync monitors shipping shortly, while QHD 2560×1440 ones have been available for a few months. Since these are gaming monitors with a 144Hz refresh rate they don’t come cheap, but they are nice to look at and should accompany a high-end graphics card such as the Geforce GTX 980, or several of them.

Radeon lovers will get FreeSync, but the monitors will take a bit more time: AMD promises Project FreeSync-ready monitors through a media review programme in Q1 2015 and doesn’t actually tell us much about retail/etail availability.

Courtesy-Fud

Intel Sampling Xeon D 14nm

September 15, 2014 by Michael  
Filed under Computing

Intel has announced that it is sampling its Xeon D 14nm processor family, a system on chip (SoC) optimized to deliver Intel Xeon processor performance for hyperscale workloads.

Announcing the news on stage during a keynote at IDF in San Francisco, Intel SVP and GM of the Data Centre Group Diane Bryant said that the Intel Xeon processor D, which was initially announced in June, will be based on 14nm process technology and be aimed at mid-range communications.

“We’re pleased to announce that we’re sampling the third generation of the high density [data center system on a chip] product line, but this one is actually based on the Xeon processor, called Xeon D,” Bryant announced. “It’s 14nm and the power levels go down to as low as 15 Watts, so very high density and high performance.”

Intel believes that its Xeon D will serve the needs of high density, optimized servers as that market develops, and for networking it will serve mid-range routers as well as other network appliances, while it will also serve entry and mid-range storage. So, Intel claimed, you will get all of the benefits of Xeon-class reliability and performance, but you will also get a very small footprint and high integration of SoC capability.

This first generation Xeon D chip will also showcase high levels of I/O integrations, including 10Gb Ethernet, and will scale Intel Xeon processor performance, features and reliability to lower power design points, according to Intel.

The Intel Xeon processor D product family will also include data centre processor features such as error correcting code (ECC).

“With high levels of I/O integration and energy efficiency, we expect the Intel Xeon processor D product family to deliver very competitive TCO to our customers,” Bryant said. “The Intel Xeon processor D product family will also be targeted toward hyperscale storage for cloud and mid-range communications market.”

Bryant said that the product is not yet available, but it is being sampled, and the firm will release more details later this year.

This announcement comes just days after Intel launched its Xeon E5 v3 processor family for servers and workstations.

Courtesy-TheInq

FreeSync Only For New AMD GPUs

September 5, 2014 by Michael  
Filed under Computing

AMD has explained that its new FreeSync technology will only work in new silicon.

FreeSync is AMD’s initiative to enable variable-refresh display technology for smoother in-game animation and was supposed to give Nvidia’s G-Sync technology a good kicking.

G-Sync has already found its way into some top gaming monitors in production, like the Asus ROG Swift PG278Q.

However, AMD said that only the newest GPU silicon from AMD will support FreeSync displays. Specifically, the Hawaii GPU that drives the Radeon R9 290 and 290X will be compatible with FreeSync monitors, as will the Tonga GPU in the Radeon R9 285.

The Bonaire chip that powers the Radeon R7 260X and HD 7790 cards could support FreeSync, but that is not certain yet.

That would be fine, except that the current Radeon lineup is populated by a mix of newer and older GPU technology. What AMD is saying is that there are some brand-new graphics cards selling today that will not support FreeSync monitors when they arrive.

The list of products that won’t work with FreeSync includes anything based on the older revision of the GCN architecture used in chips like Tahiti and Pitcairn.

So if you have splashed out on the Radeon R9 280, 280X, 270 or 270X hoping it will be FreeSync-capable, you will be out of luck. Nor will any older Radeons in the HD 7000 and 8000 series work.

Nvidia’s G-Sync works with GeForce graphics cards based on the Kepler architecture, which include a broad swath of current and past products dating back to the GeForce GTX 600 series.

Courtesy-Fud

Lenovo Adds More Features To Its $199 Tablet

September 4, 2014 by mphillips  
Filed under Consumer Electronics

Lenovo has decided to upgrade the features in low-cost Android tablets with the Tab S8 tablet, which will start selling this month for $199.

The tablet, which runs on Google’s Android 4.4 OS, has Intel’s quad-core Atom chip, code-named Bay Trail. The chip is capable of running PC-class applications and rendering high-definition video.

The 8-inch S8 offers a 1920 x 1200-pixel resolution, the same as Google’s 7-inch Nexus 7. The S8 is priced lower than the Nexus 7, which sells for $229.

The Tab S8 is 7.87 millimeters thick, weighs 294 grams, and runs for seven hours on a single battery charge. It has a 1.6-megapixel front camera and 8-megapixel back camera. Other features include 16GB of storage, Wi-Fi and Bluetooth. LTE is optional.

The Tab S8 will ship in multiple countries. Most of Lenovo’s tablets worldwide with screen sizes under 10 inches run on Android.

Lenovo also announced its largest gaming laptop. The Y70 Touch has a 17.3-inch touchscreen, and can be configured with Intel’s Core i7 processors and Nvidia’s GTX-860M graphics card. It is 25.9 millimeters thick and is priced starting at $1,299. It will begin shipping next month.

The company also announced the Erazer X315 gaming desktop with Advanced Micro Devices processors code-named Kaveri. It can be configured with up to 32GB of DDR3 DRAM and 4TB of hard drive storage or 2TB of hybrid solid-state/hard drive storage. It will ship in November in the U.S. with prices starting at $599.

The products were announced ahead of the IFA trade show in Berlin. Lenovo is holding a press conference at IFA where it is expected to announce more products.

Are ARM 64-bit Processors Making Gains?

September 4, 2014 by Michael  
Filed under Computing

ARM claims it has seen growing momentum for its 64-bit ARMv8-A processor designs, announcing it has signed 50 licensing agreements with silicon partners to fab chips based on the architecture.

ARM said that a total of 27 companies have signed agreements for the company’s ARMv8-A technology, including all of the silicon vendors selling application processors for smartphones plus most of those targeting enterprise networking and servers.

The firm did not disclose which company signed the 50th licence, telling The INQUIRER that it was up to the licensees themselves whether to announce their plans. However, it claimed that while the first wave of ARMv8-A licences were for silicon targeting smartphones and tablets, the latest wave includes many aimed at enterprise infrastructure as well.

ARM unveiled its 64-bit processor architecture in 2011, followed a year later by the Cortex-A53 and Cortex-A57 core designs based on it. These provide backwards compatibility with existing 32-bit ARM software, but add a new 64-bit execution state that delivers more capabilities, including support for 64-bit data and a larger memory address space that is required if ARM chips are to make their way into servers and other enterprise hardware.
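
The split between the two execution states is visible right down at compile time. Here is a minimal sketch using the predefined macros that GCC and Clang emit for ARM targets (the macros are standard; the rest is our illustration):

    // Report which ARM execution state this binary was built for.
    // __aarch64__ and __arm__ are predefined by GCC and Clang.
    #include <cstdio>

    int main() {
    #if defined(__aarch64__)
        const char* state = "AArch64: the ARMv8-A 64-bit execution state";
    #elif defined(__arm__)
        const char* state = "AArch32: 32-bit ARM code, still supported on ARMv8-A";
    #else
        const char* state = "not an ARM target";
    #endif
        printf("%s (pointer width: %zu bits)\n", state, sizeof(void*) * 8);
        return 0;
    }

The same source compiled for AArch64 gets 64-bit pointers, and with them the larger address space that server workloads demand.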

“ARMv8-A technology brings multiple benefits, including 64-bit capability alongside improved efficiency of existing 32-bit applications,” said Noel Hurley, GM of ARM’s processor division.

While ARM’s chips are already widely used in smartphones and tablets thanks to their low power consumption, they have also been getting attention in recent years for use in the data centre, as service providers and enterprises alike have become concerned about the amount of power being consumed by IT infrastructure.

The list of silicon vendors developing chips based on the ARMv8-A architecture already includes Samsung, Qualcomm, Broadcom and AMD, the latter of which is set to bring to market a series of ARM-based server chips, the Opteron A1100 Series processors, codenamed Seattle.

Meanwhile, software vendors including Red Hat and Ubuntu Linux developer Canonical are working on a 64-bit software ecosystem to power ARM-based servers.

ARM recently announced that the 50 billionth chip containing an ARM processor core had been shipped by partners, and said the momentum in 64-bit ARM architecture is a key component in the journey toward the next 100 billion chips.

Courtesy-TheInq

Vendors Testing New Intel Xeon Processors

September 3, 2014 by Michael  
Filed under Computing

Intel is cooking up a hot batch of Xeon processors for servers and workstations, and system vendors have already designed systems that are ready and raring to go as soon as the chips become available.

Boston is one of the companies doing just that, and we know this because it gave us an exclusive peek into its labs to show off what these upgraded systems will look like. While we can’t share any details about the new chips involved yet, we can preview the systems they will appear in, which are awaiting shipment as soon as Intel gives the nod.

Based on chassis designs from Supermicro, with which Boston has a close relationship, the systems comprise custom-built solutions for specific user requirements.

On the workstation side, Boston is readying a mid-range and a high-end system with the new Intel Xeon chips, both based on the two-socket Xeon E5-2600v3 rather than the single-socket E5-1600v3 versions.

First up is the mid-range Venom 2301-12T, which comes in a mid-tower chassis and ships with an Nvidia Quadro K4000 card for graphics acceleration. It comes with 64GB of memory and a 240GB SSD as a boot device, plus two 1TB Sata drives configured as a Raid array for data storage.

For extra performance, Boston has also prepared the Venom 2401-12T, which will ship with faster Xeon processors, 128GB of memory and an Nvidia Quadro K6000 graphics card. This also has a 240GB SSD as a boot drive, with two 2TB drives configured as a Raid array for data storage.

Interestingly, Intel’s new Xeon E5-2600v3 processors are designed to work with 2133MHz DDR4 memory instead of the more usual DDR3 RAM; DDR4 DIMM modules are recognisable by connectors that are slightly longer towards the middle.

For servers, Boston has prepared a 1U rack-mount “pizza box” system, the Boston Value 360p. This is a two-socket server with twin 10Gbps Ethernet ports, support for 64GB of memory and 12Gbps SAS Raid. It can also be configured with NVM Express (NVMe) SSDs connected to the PCI Express bus rather than a standard drive interface.
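
For the curious, NVMe controllers are easy to spot from software on a Linux server: the kernel exposes each one under /sys/class/nvme. A small sketch of ours (assuming an NVMe-capable kernel; not Boston's tooling) that lists them:

    // List NVMe controllers via the Linux sysfs tree (POSIX dirent API).
    #include <cstdio>
    #include <dirent.h>

    int main() {
        DIR* dir = opendir("/sys/class/nvme");
        if (!dir) {
            puts("no NVMe controllers visible");
            return 0;
        }
        for (struct dirent* e; (e = readdir(dir)) != nullptr; ) {
            if (e->d_name[0] == '.') continue;           // skip . and ..
            printf("NVMe controller: %s\n", e->d_name);  // e.g. nvme0
        }
        closedir(dir);
        return 0;
    }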

Boston also previewed a multi-node rack server, the Quattro 12128-6, which is made up of four separate two-socket servers inside a 2U chassis. Each node has up to 64GB of memory, with 12Gbps SAS Raid storage plus a pair of 400GB SSDs.

Courtesy-TheInq

AMD Confirms Custom ARM Server Processors

August 14, 2014 by Michael  
Filed under Computing

As we expected, AMD will make custom ARM server chips for customers, much as it made custom chips for the Xbox One and PlayStation 4 game consoles.

According to Sean White, an AMD engineer speaking at the Hot Chips conference in Cupertino, California, his outfit will consider customising its 64-bit ARM server processor to meet specific customer needs as the market for the new type of servers evolves and the company gets better visibility of usage models.

ARM chips are unproven in servers, but the low-power processors have obvious web-hosting and cloud uses. AMD’s ARM server chips could go into dense servers and process such applications while saving power, White said.

“There are more and more of those applications that are showing up in big data centers,” White said. “They don’t want traditional high-end… database type workloads.”

AMD does seem to think there is more mileage in providing customised chips for those who want something specific in a SoC or some unique IP included. White gave the example of possibly customising I/O and ports for specific customers. AMD also started putting more emphasis on the custom chip business last year after the PC market declined, and the company is already recording strong custom chip revenue thanks to the game consoles, which are shipping in the millions.

AMD also shared the technical details of its first 64-bit ARM processor called Opteron A1100, code-named Seattle, at Hot Chips. The company has already started shipping the chips to server makers for testing. The first Seattle servers are expected to ship by the end of this year or early next year. One of the first servers with the new chip could be AMD’s own SeaMicro server.

The Seattle server chip has two memory channels supporting DDR3 and DDR4, half the typical four memory channels of its x86 server chips. The ARM chip will have up to 4MB of L2 cache, with two cores sharing 1MB, and a total of 8MB of L3 cache accessible to all eight cores.

Seattle will also give ARM processors ECC memory, which is important in servers to correct data errors; 32-bit ARM processors did not have it. Each Seattle CPU core will support up to 128GB of memory, totalling up to 1TB across the eight CPU cores on the Opteron A1100. The 32-bit ARM chips supported only up to 4GB of memory.
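
To see what ECC actually buys you, here is the textbook idea in miniature: a Hamming(7,4) code that can correct any single flipped bit in four data bits. This sketch is ours for illustration; real ECC DIMMs apply the same principle in hardware, typically as SECDED codes across 64-bit words:

    // Hamming(7,4): 4 data bits protected by 3 parity bits,
    // enough to locate and correct any single-bit error.
    #include <cstdio>

    // Encode data bits d3..d0 into a 7-bit codeword.
    unsigned encode(unsigned d) {
        unsigned d0 = d & 1, d1 = (d >> 1) & 1, d2 = (d >> 2) & 1, d3 = (d >> 3) & 1;
        unsigned p1 = d0 ^ d1 ^ d3;   // covers codeword positions 1,3,5,7
        unsigned p2 = d0 ^ d2 ^ d3;   // covers positions 2,3,6,7
        unsigned p3 = d1 ^ d2 ^ d3;   // covers positions 4,5,6,7
        // Positions 1..7 hold: p1 p2 d0 p3 d1 d2 d3
        return p1 | (p2 << 1) | (d0 << 2) | (p3 << 3) | (d1 << 4) | (d2 << 5) | (d3 << 6);
    }

    // Decode a codeword, silently correcting a single flipped bit.
    unsigned decode(unsigned c) {
        unsigned s1 = (c ^ (c >> 2) ^ (c >> 4) ^ (c >> 6)) & 1;
        unsigned s2 = ((c >> 1) ^ (c >> 2) ^ (c >> 5) ^ (c >> 6)) & 1;
        unsigned s3 = ((c >> 3) ^ (c >> 4) ^ (c >> 5) ^ (c >> 6)) & 1;
        unsigned syndrome = s1 | (s2 << 1) | (s3 << 2);  // 1-based error position, 0 = clean
        if (syndrome) c ^= 1u << (syndrome - 1);         // flip the bad bit back
        return ((c >> 2) & 1) | (((c >> 4) & 1) << 1) | (((c >> 5) & 1) << 2) | (((c >> 6) & 1) << 3);
    }

    int main() {
        unsigned word = 0xB;                 // 1011 in binary
        unsigned stored = encode(word);
        stored ^= 1u << 4;                   // a stray bit-flip in "memory"
        printf("stored %u, read back %u after correction\n", word, decode(stored));
        return 0;
    }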

Courtesy-Fud

Will Developers Screw Up Virtual Reality Based Gaming?

August 4, 2014 by Michael  
Filed under Gaming

Whether you think it’s a fad or the next big thing, there’s no denying that the return of virtual reality, this time backed up by competent technology and plausible price-points, has caught the imagination of developers and their customers alike. Projects for Sony’s Morpheus and the Oculus Rift are popping up everywhere, from the modest to the monumental.

As yet, though, none of the major publishers have publicly committed much to the new platforms, leaving it to smaller studios to test the waters of what could potentially form an entirely new frontier for games. Many of those smaller studios are changing their models and work-methods entirely to focus on the new technology, preparing to hit the ground running once consumers are finally able to get their hands on the headsets.

One of those studios is Patrick O’Luanaigh’s nDreams. A studio which has always enjoyed a broad remit, nDreams now has “around five [VR] projects on the go”, including forthcoming title The Assembly: a 3D VR adventure game which will see players investigating a ground-breaking scientific organisation which has started to push some ethical boundaries.

“We decided that an adventure game would make sense because we don’t have the budget to draw tons of environments that you run through at top speed,” Patrick tells me. “Adventure games work well because we’ve found that, when people play with VR, they want to really look around and explore. They want to examine the walls, everything, in a way you might not in a FPS.

“The game is split into sections of about 10-15 minutes long, which we thought makes sense for VR. We still don’t know what the final consumer versions will be like, but 10-15 minutes seems sensible. People can either do a chapter then take a break, or they can play through the entire game.

“We spent around six months prototyping lots of experiments with VR. What happens when your avatar wears glasses? What would it be like if it’s cold and you have frosty breath? What about different sized characters? That tested really nicely – Madeline is 5’1″ and Joel is 6 foot and you really notice that. You notice the breathing, the speed they walk at, the perspective. It’s all very different. You feel like you’re playing those roles.

“We’ve also got lots of specific things for VR, microscopes, binoculars, night vision goggles, things like that. They work really well. We’ve also got plenty of puzzles and other bits like vertigo and fear sections that we think are great for VR, so it’s a real medley.”

The Assembly is a definite step up for the developer in terms of scope and ambition, so I ask O’Luanaigh if the resource costs were pushed up even further by the technology they’re working with. In short, is making a VR game more expensive?

“I don’t know, honestly,” he admits. “It’s probably slightly more for VR, but there’s not a lot of difference. We’ve kind of picked our battle here and chosen a game we think would be great for VR, but one that we can also afford to make. This seemed like the right genre and approach. We’re taking influence from games like Gone Home and Dear Esther – with more puzzles, but still about exploring a great environment. I guess if we’d just done it as a Steam game it might have been a bit cheaper, but not a big difference.”

The Shahid Effect: Sony’s indie push & VR

Being PC-based, the Oculus Rift has a clear advantage in attracting indie developers: working on an open platform with little or no restriction. That said, Sony has made a very strong argument to small studios this generation, something it will need to continue if it wants to recruit the most exciting VR ideas. O’Luanaigh agrees, and says that there’s no need for concern on that front.

“Sony has been fantastic,” he says, enthusiastically. “We’re very lucky in that we’ve been working on Home for a number of years, so we have a good relationship with Sony. Our account manager happens to be the evangelist for Morpheus as well, so they’ve been great. They’ve been very supportive.

“We saw the Morpheus very early, it was one of the things that persuaded us to pivot away from what we were doing and spend so much time and money on VR. They’ve been really open, really helpful. I’ve got nothing but positive things to say about Sony. I can’t wait to see the final hardware that’s going to launch to consumers.”

“It’s more about the design, doing things the right way. There are a lot of ways you can mess up VR really easily. We’ve figured out what works and what doesn’t and designed the game with that in mind. It’s working really nicely.”

The Assembly is due for release on both the Oculus Rift and Sony’s Morpheus headset, currently the two mindshare leaders of virtual reality tech. Whilst neither is likely to admit it, each has a vested interest in the success of the other – a reason which was floated to explain Valve passing on some of its own VR research to Oculus last year: if the tech is to succeed it needs to attract developers. To do that, a rough ‘gold standard’ needs to be established, giving developers a technological target to aim at for cross-platform games. Having used both the Oculus and Morpheus and found them to be roughly equivalent, I’m interested to know if O’Luanaigh sees parity in the two visors.

“They are very, very similar, technology-wise,” he confirms. “Obviously with Oculus being on PC it’s a lot more open, there’s more freedom to mess around, but it’s also easier for people to just stick stuff out, to make bad VR. That’s one of the big risks – it’s very easy to make people feel ill. You have to have good software as well as hardware. I think it’s easier for Sony to control that, because it’s a closed platform. They can say, do this, do that; to make sure people don’t do stupid stuff. I suspect that Oculus will do something similar, but obviously it’s open, so people can put what they want up online.

“In terms of specs, though, they’re really very similar. We’re creating this game for both and there’s not a big difference. There are a few little things involved in supporting the PS4: the Dualshock and some of the ways that PSN works, but by and large they’re very similar.”

Moving away from comfortable ground is an essential part of growing almost any company, but when you’re relying on a third party, such as a platform holder, for your success, there’s an additional risk. nDreams must be confident about the future of virtual reality to put such stake in it, so I ask Patrick if there’s a sales point when they’ll breathe a little more easily.

“We’ve kind of come at it the other way,” he counters. “We believe it will work. We’ve got financial models and projections but it’s all a bit finger-in-the-air, it’s very hard to know. We’re committed to doing it though, we’ve got a lot of launch titles and we’re going to be pushing and growing those. We’re lucky in that we’re financially secure enough to do that without too much stress.

“We’ve been looking at things like previous install bases of hardware on consoles. If you look at the Kinect install base, which was amazing, really – something like 35-40 per cent on the 360 – we’ve made projections on a conservative install base over time. I actually think that it’s going to be better than that, given the excitement around VR and the customer reaction when they see it, but we’re being fairly conservative. With Oculus they’ve spoken about trying to sell a million by a set point. We’ve been working along those lines. Again, we think it’s going to do really well.

“There’s going to be other headsets out there as well, that haven’t been announced, we think those are going to be very exciting. There’s not going to just be two headsets, there’ll be a number of things over the next few years. We’re going to try and work out as best we can what we think they can sell, but we want to be there at launch with products so we can build and learn what people like and don’t like.

“It’s definitely going to be more of a core audience at launch, but I think Facebook’s acquisition of Oculus means that it’s going to be a bit cheaper than it would have been. I think they can afford to give it away at cost, which is brilliant. But it’s really hard to put a finger on how much that market is going to be worth. We think it’s going to be a couple of billion within two years, but we’ll see. We may be massively over-egging, or hugely under-estimating it. What’s clear is that there’s massive potential here, it could really explode. When you get a great VR experience it’s really special.

“I was at E3 playing Alien Isolation on Oculus and, although I’m slightly embarrassed to admit it, when it came to the end I ripped my headset off because I was so scared. You really feel like the Alien is there and actually attacking you. I’ve never done that with Dead Space or Resident Evil or anything. It really heightens your emotions.”

I can attest to just how absorbing that experience can be, having lost myself in the Morpheus demo at GDC in March. Even surrounded by other gawking journalists and nervous PRs, dropping that helmet on was, in many senses, completely akin to teleportation. That demonstration wasn’t exactly a road-test, though. These were first-party, highly polished demonstrations designed to show off the potential of the new technology in a short, well-controlled session. Had my first experience been a shoddy, half-finished or poorly-executed demo instead, I might never have been interested at all. For O’Luanaigh, the responsibility for audience growth rests firmly on the shoulders of developers.

“For me, it’s really important,” he tells me when I ask whether VR needs to get it right this time around. “I’m utterly convinced that VR is now a technology that’s caught up to an amazing idea and can make it work. The only thing that can ruin that is dreadful games. It’s easy to make a rubbish VR game with a bad framerate that takes control of the camera and does stupid things. That’s the worst thing that could happen, and I think that both Oculus and Sony get that. I think everyone entering the VR space gets it, but we just need to keep an eye on it.

“I hope that the press plays its part as well and makes sure that, if there’s one rogue VR game that’s snuck out and it’s dreadful, that they won’t use that to argue that VR is awful.”

Good games might be the things that get people queuing in the shops, or, more likely, clicking online, but there are clear possibilities for virtual reality which fall well outside our sphere, particularly for Oculus’ Rift. Will nDreams be dipping a toe in those waters?

“At least one or two of the projects we’re working on are non-traditional games, it’s definitely quite different. You’ll see VR spread into different areas over the next few years, although it’ll definitely start with games. Oculus aren’t showing off Facebook social pop-up sims, they’re showing off great games.

“I don’t think Facebook has changed that but I think you’ll notice them start to add stuff in over the next few years. You might see spaces where people can hang out with their friends, stuff like that. If you’ve ever read Snow Crash, I think that sort of thing is why Facebook bought Oculus. They’ve got more money now, but it’s the same people with the same values. It’s very cool to be rude about Facebook, but I think a lot of the people who were being rude about Facebook when it bought Oculus were doing it on Facebook, which is pretty ironic.”

Courtesy-GI.biz

Is Free-To-Play Always The Best Bet?

July 18, 2014 by Michael  
Filed under Gaming

To hear the likes of Electronic Arts and Gameloft tell it, premium apps are all but a relic of the past, the obsolete progenitor to mobile’s free-to-play future. But some smaller developers have found that future isn’t all it’s made out to be, and have been finding more success back on the premium side of the fence.

Kitfox Games and Double Stallion, two Montreal studios from Jason Della Rocca’s Execution Labs incubator, launched Shattered Planet and Big Action Mega Fight, respectively, on mobile in the last year. However, both titles struggled to rake in revenue, and the studios have since released more successful premium versions of the two. Kitfox’s Tanya X. Short and Double Stallion’s Nicolas Barrière-Kucharski spoke with GamesIndustry International this week to discuss their forays into free-to-play, and why more traditional business models worked better for them.

In Double Stallion’s case, part of the problem was that Big Action Mega Fight proved an awkward fit for the free-to-play format.

“We picked a genre, fighting, that was very content-driven,” Barrière-Kucharski said. “It was really very arduous to keep up and engage the audience with new levels, new enemies, and new types of content. We couldn’t compete at our size and budget with other, more established free-to-play studios and games.”

Beyond that, the genre may have been a poor fit for the audience. Barrière-Kucharski said that the people who would appreciate Big Action Mega Fight’s skill-based gameplay and faithful take on the beat-’em-up genre simply weren’t the same people interested in free-to-play games.

“I think the overlap between audiences was just too small to sustain a thriving community around the game,” Barrière-Kucharski said.

With Shattered Planet, Short said genre wasn’t a problem. She thinks the games-as-a-service model is actually a perfect fit for roguelikes like Shattered Planet, where a few new items and systems can exponentially increase the potential content for players to experience. However, Shattered Planet still didn’t fit the free-to-play mold for a few reasons.

“Free-to-play is not always suitable to single-player games,” Short said. “I think it’s best suited to multiplayer games in which it being free is actually of value to players because they can have more people to play with. That’s one philosophy we’ve developed, that if we ever do free-to-play again, we would only do it for multiplayer.”

On top of that, Shattered Planet was designed to be a tough game for players. But Short said in the free-to-play business model, difficulty can be “a dangerous thing.”

“We made a difficult game, and the fact that it was free made people suspicious, and rightfully so,” Short said. “I think they had every right to be a little bit paranoid about why the game was difficult. And in a business model where difficulty generally does often make people spend more, I think a designer’s hands are tied as to how and when a game can be difficult and when it’s ethical. So we felt a lot more comfortable about making a premium game, and me as the designer, I was happier because we could say sincerely that it’s exactly as difficult as we wanted it to be and you can’t say it was greedy or whatever.”

Both games have found more success since they were released as premium versions. Big Action Mega Fight was re-launched last month as a $3 app ($2 during a first-week sale); those who downloaded the free-to-play version received the upgrade to the premium version as a free title update. Even though the free version of the game was downloaded about 400,000 times, Barrière-Kucharski said the revenues from Big Action Mega Fight’s first week as a paid app topped the total lifetime income from the free-to-play version since its November debut. To date the company has sold about 3,600 copies of Big Action Mega Fight on iOS, Android, Amazon Fire, and Ouya.

Kitfox took a different approach to the premium switch, leaving the free-to-play Shattered Planet mobile app alone but also releasing a premium PC version on Steam with a $15 price tag and no monetization beyond that. The results were similarly positive, as Short said the studio made as much on Steam in one day as it had on mobile in two months. In its first week, Shattered Planet sold 2,500 copies on Steam. Short is happy to see the game bringing in more money, but she confessed to being a little bit torn on the trade-off it required.

“It really was great seeing that we had 300,000 downloads on mobile,” Short said. “We had 300,000 people play Shattered Planet on iOS and Android, and that’s amazing. Sure, it looks like we’re going to make two to five to 10 times more money on Steam, but it’s only going to be 1 percent of the amount of people that could see it if we tried to release it free, in theory… It’s a little bit sad that you monetize better with fewer people. When you’re trying to get your brand and your name out there, it is sad we couldn’t have another few hundred thousand people.”

Beyond the trade-off of settling for a smaller but more supportive audience, Kitfox has encountered some negative effects of releasing Shattered Planet as a free-to-play mobile title and then as a PC premium game.

“For us, a lot of people remained skeptical of the quality of the game if they knew the mobile version existed,” Short said. “I don’t think that really has that much to do with free-to-play and more to do with platform snobbery. It’s just kind of a general feeling of console and PC gamers that if a game was ever on mobile, it couldn’t possibly be as feature-rich or as deep, as strategic or anything like that.”

On top of that, there was some customer confusion over the game and its business model. Short said the game’s forums on Steam had some angry users saying they wouldn’t buy the game because it had in-app purchases (which it didn’t). Although the developers were able to post in the threads and clear things up, that sort of inconsistency has convinced them that if they ever do return to mobile platforms, they will stick to a free demo or companion app rather than something monetized.

“It’s just so dominated by giant players,” Short said of the mobile scene. “It’s such a completely different market that I think you really have to focus on it, and that’s not my team’s expertise. For us, we’re definitely going to be focused on PC and console; I think that’s where our talents are.”

Barrière-Kucharski agreed, saying that even if a niche audience is willing to pay for a certain experience, there just aren’t good ways for developers to connect to that audience.

“It’s really hard to be found or be discovered by players,” Barrière-Kucharski said. “I’m really looking forward to all the curation issues that are going to be tackled in the next year or so on iOS 8 and the Steam Greenlight update.”

But even if those initiatives follow through on their promises of improving discoverability, Barrière-Kucharski worries that the problem could still get worse as the gains made won’t be enough to offset the flood of new developers entering the field. Short also saw discoverability as a key problem facing developers right now, but stressed that finding a solution is in the best interests of the platform holders.

“Whatever platform figures out discoverability first will have a huge advantage because there are these thousands of developers that as soon as they hear there is any discoverability, that’s where they’re going to flood for sure,” Short said. “So it is almost a race at the moment between Steam and Apple and Google.”

Courtesy-GI.biz

MediaTek Shows Off 64-bit SoC LTE Chip

July 16, 2014 by Michael  
Filed under Computing

Mediatek has unveiled what it claims is the “world’s first” 64-bit octa-core LTE smartphone system on chip (SoC) with 2K display support.

Named the Mediatek MT6795, the chip is designed for use by high-end device makers for upcoming Android 64-bit mobile operating systems like the recently announced Android L, with support for 2K display up to 2560×1600 resolution.

The chip also features a clock speed of up to 2.2GHz along with Corepilot, Mediatek’s technology for delivering higher performance per watt, increasing battery life on mobile devices without sacrificing the performance of the eight cores on board.

The SoC also provides 4G LTE support, Mediatek said, as well as dual-channel LPDDR3 clocked at 933MHz for “top-end memory bandwidth” in a smartphone.
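
That “top-end memory bandwidth” claim is easy to sanity-check. Assuming standard 32-bit LPDDR3 channels and double data rate, i.e. two transfers per 933MHz clock (our assumptions, not Mediatek’s figures), the theoretical peak works out to roughly 14.9GB/s:

    // Back-of-the-envelope peak bandwidth for dual-channel LPDDR3 at 933MHz.
    #include <cstdio>

    int main() {
        const double transfers_per_sec = 933e6 * 2.0;  // DDR: two transfers per clock
        const int channels = 2;                        // dual-channel, per the spec
        const int bytes_per_transfer = 32 / 8;         // 32-bit LPDDR3 channel (assumed)
        const double peak_gbs = transfers_per_sec * channels * bytes_per_transfer / 1e9;
        printf("theoretical peak: %.1f GB/s\n", peak_gbs);  // ~14.9 GB/s
        return 0;
    }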

Mediatek VP and GM for Europe Siegmund Redl told The INQUIRER in a media briefing that the announcement is in line with the industry’s growth in the smartphone arena.

“There has been a discussion about ‘how many cores do you really need’ and what is the benefit [of octa-core],” Redl said. “Quad-core is pretty much mainstream today and application developers are exploiting the fact they can do multithreading and pipelining and parallel computing with handheld devices.

“This will not change with octa-core. When we started to introduce the first octa-core we were showing off a game with very intense graphics and processing that needed the support of multiple cores and again this is the way the industry is going; you bring out the hardware and the software development follows that and takes advantage of it and the user experience is a smoother one.”
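
The multithreading and parallel computing Redl describes needs nothing exotic on the software side. Here is a generic C++11 sketch (ours, not MediaTek-specific) that splits one workload across however many cores the device reports:

    // Split a summation across all available cores with std::thread.
    #include <cstdio>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        unsigned cores = std::thread::hardware_concurrency();  // e.g. 4 or 8 on these SoCs
        if (cores == 0) cores = 1;                             // fallback if unknown
        const long long n = 80000000;
        std::vector<long long> partial(cores, 0);              // one slot per thread, no locks
        std::vector<std::thread> pool;
        for (unsigned t = 0; t < cores; ++t) {
            pool.emplace_back([&partial, t, cores, n] {
                for (long long i = t; i < n; i += cores)       // strided slice of the range
                    partial[t] += i;
            });
        }
        for (auto& th : pool) th.join();
        long long total = std::accumulate(partial.begin(), partial.end(), 0LL);
        printf("summed 0..%lld on %u threads: %lld\n", n - 1, cores, total);
        return 0;
    }

On an octa-core part like this, scheduling technology such as Corepilot then decides which cores those threads actually land on and at what clock, which is where the performance-per-watt claim comes in.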

The firm claims that the SoC features multimedia subsystems that support many technologies “never before possible or seen in a smartphone”, including support for 120Hz displays.

“With multimedia we raised the bar in terms of recording frames per second, such as slow motion replay with 480 frames per second, for much better user experience,” Redl added.

Multi-mode wireless charging is also supported by the SoC’s companion multi-mode wireless power receiver chip.

The Mediatek MT6795, dubbed the chip for “power users”, joins the firm’s MT6752 SoC for mainstream users and MT6732 SoC for entry level users. It’s the 64-bit version of the 32-bit MT6595 SoC that was launched at Mobile World Congress earlier this year, which features four ARM Cortex A17 cores and four Cortex A7 cores as well as Imagination Technologies PowerVR Series6 GPU for “high-performance graphics”.

Redl said that existing customers that use the MT6595 today for devices that are soon to be hitting the market can reuse the designs they have for the older chip as “they have a pin compatible drop-in with a 64-bit architecture”.

Redl said Mediatek will make the MT6795 chip commercially available by the end of the year, for commercial devices coming in early January or February.

Courtesy-TheInq

Panasonic Goes Intel For SoC

July 9, 2014 by Michael  
Filed under Computing

Panasonic and Intel have just announced that Panasonic’s SoC chips will be made using Intel’s 14nm process.

Panasonic is joining Altera, Achronix Semiconductor, Tabula, Netronome and Microsemi on an ever-growing list of Intel foundry clients, and we expect the list to expand over time. There have been some rumours that Cisco is planning to make its chips at Intel, too.

Keep the fabs busy

In our recent conversations with a few industry insiders we learned that Intel wants to keep its fabs busy and occupied. This is rather obvious and makes perfect sense, as investing in a transition to a new manufacturing node costs a few billion dollars on a good day.

Intel has announced its Core M Broadwell processors, which are coming in the latter part of this year, and these will be just a fraction of what Intel plans to manufacture in its new 14nm fabs. Intel’s Airmont and Morganfield designs, as well as the Cherryview and Willowview Atoms, all 14nm parts, will also help keep the fabs busy.

Lower power with 14nm SoC

Panasonic is planning 14nm next-generation SoCs that will target audio-visual equipment markets and enable higher performance, lower power and a better viewing experience for consumers.

The 14nm low-power process technology with second-generation Tri-Gate transistors will help Panasonic decrease the overall power consumption of its devices. We expect these SoCs to be used in future 4K TVs as well as set-top boxes, and possibly for upscaling in Blu-ray players.

TSMC will start making 20nm chips later this year, and Nvidia might be among the first clients to use the process for its upcoming Maxwell 20nm GPUs. Other players will follow: Qualcomm has started making modems at 20nm and will soon move some of its SoC production to the new manufacturing node. AMD’s 20nm GPUs are in the works, too.

Intel’s 14nm is still significantly more power-optimised than the 20nm process offered by TSMC and the GlobalFoundries-Samsung alliance, but Intel is probably not offering its services for pennies either.

Intel is known as a high-margin company and we don’t see this changing overnight. One of Intel’s biggest challenges in 2015 and beyond is to keep the fabs busy at all times. It will try to win more mobile phone business, and it is pushing hard to win a spot in the wearable technology market as the phone market seems oversaturated.

Wearables offer a clean start for Intel, but first steps in any new market are usually hard. The Android Wear ARM-based watches that just hit the market will complement or replace wearable wristbands like the Fitbit, based on an ARM Cortex-M3. Intel wants to make shirts with chips inside, and the more success it has in bringing cheap SoCs into our lives, the more chips it can sell. Panasonic will just help keep the fabs busy until Intel manages to fill them with its own chips.

Courtesy-Fud