Will AMD Bring Two New GPUs To Market In 2016?

November 18, 2015 by Michael  
Filed under Computing

AMD’s head graphics guy, Raja Koduri, has promised that AMD will have two new GPUs out next year.

Koduri was talking to Forbes about how AMD needed some new architectural designs and to get brand-new GPUs into the shops.

He added that this is something that AMD has been pretty pants about lately.

He promised two brand-new GPUs in 2016, both of which will hopefully be built on a 14nm/16nm FinFET process from GlobalFoundries or TSMC and should make Advanced Micro Devices more competitive on power and die size.

AMD’s GPU architectures have gotten rather elderly, he said.

AMD also wants to increase its share in professional graphics. Apparently this is so low that any competition it brings Nvidia could significantly help its market share in this high-margin business. The company has hired Sean Burke to help drive this forward. Sean was a president at Flex and Nortek and a senior executive at Hewlett-Packard, Compaq and Dell. For those who came in late, he was the father of Dell’s Dimension and Compaq’s Prolinea.

Koduri’s cunning plan to capture consumer and professional graphics is to provide fully immersive experiences that range from education and medicine to gaming and virtual reality, with plenty of overlap in between.

He is also interested in expanding into “instinctive computing” applications, which involve medicine, factory automation, automotive and security. These are computing applications that blend into the environment, are less obvious to the user, and should come across as natural user experiences.

Koduri has three main attack plans. The first is to gain discrete GPU market share in 2016 and 2017, as well as win the next generation of consoles, which will be 4K. Ironically, the AMD chips in the consoles currently on the market can handle 4K, but they don’t.

Koduri wants console makers to stick with Radeon IP for their next-generation consoles, which would give Advanced Micro Devices an even bigger advantage in the gaming space.

DirectX 12 in the latest shipping version of Windows does seem to give Radeon GPUs a significant performance uplift against Nvidia, he said.



HP Inc. Releases ZBook Studio Laptop With 4K Screen

November 13, 2015 by mphillips  
Filed under Computing

The newly formed HP Inc. has announced its first product release, the ZBook Studio, a feature-packed, 15.6-in. laptop with a 4K screen.

The laptop can be configured to be as speedy as a gaming laptop, but is targeted at mobile workers.

The laptop marks the first product launched by HP Inc., which officially commenced operations last week after Hewlett-Packard split into two companies: HP Inc. and Hewlett-Packard Enterprise. More laptops, hybrids and tablets are expected to be released by HP Inc. in the coming months.

The ZBook Studio is 18 millimeters thick and weighs 1.99 kilograms (4.6 pounds). It can be configured with Nvidia Quadro graphics cards, which are aimed more at professional graphics and engineering applications.

The laptop has up to 2TB of storage capacity, but HP is selling a separate dock with a Thunderbolt 3 port, which will make it easy to add external storage drives.

Beyond Intel Core chips, the ZBook Studio is one of the few laptops that can be configured with a server-class Xeon chip. Starting at $1,699, the laptop will ship in December.

A cheaper option would be HP’s new ZBook 15u, which starts at $1,099. It has a 1080p screen, up to 1.5TB of storage and can be configured with an AMD FirePro graphics processor, which competes with Nvidia’s Quadro.

A 4K screen can be included in HP’s ZBook 15 and 17 laptops, which have 15.6-in. and 17.3-in. screens, respectively. The ZBook 15 has up to 3TB of storage, while the ZBook 17 offers up to 4TB of storage.

The laptops have up to 64GB of memory and can be configured with Intel Xeon or Core chips. The laptops are scheduled for release in January; prices weren’t immediately available.





Will AMD Go Back To Black In 2016?

November 12, 2015 by Michael  
Filed under Computing

AMD’s EMEA component sales manager Neil Spicer is “confident” his outfit can return to profitability in 2016.

Talking to CRN, Spicer said he is sure that profitability will return as long as the company sticks to its principles.

“From a personal stance, I am confident [AMD can be profitable]. I believe we are working with exactly the right customers, and over the last few years we have become much simpler to execute and do business with.”

He said that in order to achieve profit, the company must ensure it is investing in the right areas.

“Moving forwards to 2016, we have to have profitable share growth,” he said. “So it’s choosing the right business to go after, both with the company itself and the ecosystem of partners. There is no point in us as a vendor chasing unprofitable partners.

“We want to focus [in the areas] we are good at – that’s where we are going to invest heavily. That’s things like winning the graphics battle with gaming and so forth, and we want to be part of this Windows 10 upgrade cycle.”

Spicer has been a little optimistic so far this year: he thought that Windows 10 would drive an upgrade refresh, particularly as AMD works so well with the new OS.

He also thinks that the combination of Windows 10, the advent of e-sports – competitive online gaming – and new technology and products AMD is launching, means “PC is an exciting market”.

Of course, Spicer was extremely enthusiastic about Zen, which he thinks will help AMD’s play in the high-end desktop space and the server area. More cynical observers think that Zen will be AMD’s last roll of the dice.



Oracle Goes Elastic To Take On Amazon’s AWS

November 3, 2015 by Michael  
Filed under Computing

Oracle has launched a direct rival to the Amazon Web Services (AWS) public cloud with its own Elastic Compute Cloud.

The product was revealed amid a flurry of cloud-related product announcements, including five in the infrastructure-as-a-service (IaaS) space, at the OpenWorld show in San Francisco on Tuesday.

Oracle Elastic Compute Cloud adds to the Dedicated Compute service the firm launched last year. The latest service lets customers make use of elastic compute capabilities to run any workload in a shared cloud compute zone, a basic public cloud offering.

“Last year we had dedicated compute. You get a rack, it’s elastic but it’s dedicated to your needs,” said Thomas Kurian, president of Oracle Product Development.

“We’ve now added in Elastic Compute, so you can just buy a certain number of cores and it runs four different operating systems: Oracle Linux, Red Hat, Ubuntu or Windows, and elastically scale that up and down.”

Oracle has yet to release pricing details for the Elastic Compute Cloud service, but chairman and CTO Larry Ellison said on Sunday that it will be priced at or below equivalent AWS pricing. For the dedicated model, Ellison revealed on Tuesday at OpenWorld that firms will pay half as much for Oracle Dedicated Compute as for the equivalent AWS shared compute option.

It is not surprising that Oracle would like the opportunity to have a piece of the public cloud pie. AWS earned its owner $2.08bn in revenue in the quarter ending 30 September.

Kurian shared current use details for the Oracle Cloud as evidence of the success it has seen so far. The firm manages 1,000PB of cloud storage, and in September alone processed 34 billion transactions on its cloud. This was a result of the 35,000 companies signed up to the Oracle Cloud, which between them account for 30 million users logging in actively each day.

However, Oracle’s chances of knocking Amazon off its cloud-leader perch, or even making a slight dent in its share, seem low. The AWS revenue was only made possible by the fact that Amazon owns 30 percent of the cloud infrastructure service market, with second and third-ranked Microsoft and IBM lagging behind at 10 and seven percent respectively.

Google and Salesforce have managed to capture less than five percent each. Indeed, realising how competitive the market is and Amazon’s dominant position, HP has just left the public cloud market.

Despite Oracle going head to head with AWS in the public cloud space, Amazon has been attempting to attract Oracle customers to its own platform.

“AWS and Oracle are working together to offer enterprises a number of solutions for migrating and deploying their enterprise applications on the AWS cloud. Customers can launch entire enterprise software stacks from Oracle on the AWS cloud, and they can build enterprise-grade Oracle applications using database and middleware software from Oracle,” the web giant notes on its site.

Amazon describes EC2 as letting users “increase or decrease capacity within minutes, not hours or days. You can commission one, hundreds or even thousands of server instances simultaneously”, making Oracle Elastic Compute Cloud a direct competitor.
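
To put that elasticity claim in concrete terms, here is a minimal sketch using the AWS SDK for Python (boto3); the region, AMI ID and instance type are placeholders for illustration, not anything Amazon or Oracle has published:

```python
import boto3

# Connect to EC2 in a chosen region (placeholder region).
ec2 = boto3.client("ec2", region_name="us-east-1")

# One API call can commission anywhere from MinCount to MaxCount
# instances simultaneously; this is the "elastic" part of EC2.
response = ec2.run_instances(
    ImageId="ami-12345678",    # placeholder AMI ID
    InstanceType="m4.large",
    MinCount=1,
    MaxCount=100,
)
instance_ids = [i["InstanceId"] for i in response["Instances"]]

# Scaling back down is a single call as well.
ec2.terminate_instances(InstanceIds=instance_ids)
```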

Oracle has also added a hierarchical storage option for its archive storage cloud service, aimed at automatically moving data that requires long-term retention such as corporate records, scientific archives and cultural preservation content.

Ellison noted that this archiving service is priced at a 10th of the cost of Amazon’s S3 offering.

Kurian explained of the archive system: “I’ve got data I need to put into the cloud but I don’t need a recovery time objective. So you get it very, very cheap”, adding that it costs $1/TB per month.

The firm also launched what Kurian dubbed as its “lorry service” for bulk data transfer. This will see Oracle ship a storage appliance to a customer’s site, where they can then do a huge data transfer directly onto that machine at a much quicker rate than streaming it to the cloud. The appliance is then sent back to Oracle via DHL or FedEx, Kurian explained, for Oracle to then do the transfer on-site to the cloud for storage.

“This is much faster if you’re moving a huge amount of data. One company is moving 250PB of data. To stream that amount of data to the cloud would take a very long time,” he said.
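
Kurian’s “very long time” is easy to sanity-check with back-of-the-envelope arithmetic. A quick sketch, assuming a sustained 10Gbps uplink (our assumption, and a generous one for most sites):

```python
# How long would streaming 250PB to the cloud take over a sustained
# 10Gbps uplink? (The uplink speed is an assumption for illustration.)
data_bits = 250e15 * 8               # 250PB expressed in bits
uplink_bps = 10e9                    # 10Gbps sustained
seconds = data_bits / uplink_bps
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:.1f} years")          # prints roughly 6.3 years
```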

Bulk data transfer will be available from November, while the archive service is available now.

“You can go up to it as a customer, enter a credit card and you can buy the service, all the PaaS services and the storage service. We’re adding compute over the next couple of weeks,” Kurian explained.

“You pay for it by credit card or an invoice if you’re a corporate customer and pay for it by hour or month, by processor or by per gigabyte per hour or month for storage.”

Oracle Container Cloud, meanwhile, lets firms run apps in Docker containers and deploy them in the Oracle Compute Cloud, supporting better automation of app implementations using technologies like Kubernetes.
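
Oracle has not published code for the service, but the Kubernetes-style automation it alludes to generally amounts to declaring a desired number of container replicas and letting the orchestrator keep them running. A rough sketch using the official Kubernetes Python client; the image name, labels and replica count are invented for illustration:

```python
from kubernetes import client, config

# Load credentials from the local kubeconfig file.
config.load_kube_config()

# Declare a Deployment: three replicas of a Docker image.
deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="demo-app"),
    spec=client.V1DeploymentSpec(
        replicas=3,
        selector=client.V1LabelSelector(match_labels={"app": "demo"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "demo"}),
            spec=client.V1PodSpec(
                containers=[client.V1Container(name="demo", image="nginx:1.9")]
            ),
        ),
    ),
)

# The orchestrator now keeps three containers running, restarting or
# rescheduling them as needed; that is the automation being sold.
client.AppsV1Api().create_namespaced_deployment(
    namespace="default", body=deployment
)
```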

Oracle also launched additional applications that sit in its cloud, including the Data Visualisation Cloud Service. This makes visual analytics accessible to general business users who do not have access to Hadoop systems or the data warehouse.

“All you need is a spreadsheet to load your data and a browser to do the analysis,” Kurian explained.

Several new big data cloud services are also aimed at letting users more easily prepare and analyse data using Hadoop as the data store, for example Big Data Preparation and Big Data Discovery.

“With Big Data Preparation you can move data into your data lake, you can enrich the data, prepare it, do data wrangling, cleanse it and store it in the data lake. Big Data Discovery lets a business user sit in front of Hadoop, and through a browser-based dashboarding environment search the environment, discover patterns in the data, do analysis and curate subsets of the data for other teams to look at. It’s an analytic environment and complete Hadoop stack,” Kurian said.
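
Kurian’s description is marketing-speak, but the workflow behind it (load raw data, cleanse and enrich it, store a curated copy in the data lake) is a familiar pattern. A rough PySpark sketch of that kind of pipeline; the paths and column names are invented, and this is generic Hadoop-stack code, not Oracle’s actual service:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("data-prep").getOrCreate()

# Move raw data into the lake environment.
raw = spark.read.csv("hdfs:///landing/sales.csv",
                     header=True, inferSchema=True)

# Cleanse and enrich: drop incomplete rows, normalise a text column,
# derive a field for downstream analysis.
prepared = (
    raw.dropna(subset=["customer_id", "amount"])
       .withColumn("region", F.upper(F.col("region")))
       .withColumn("amount_usd", F.col("amount") * 1.1)  # illustrative rate
)

# Store the curated subset for other teams to query.
prepared.write.mode("overwrite").parquet("hdfs:///curated/sales")
```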



Will 2016 Be The Year For Virtual Reality Games?

October 15, 2015 by Michael  
Filed under Gaming

As the end of 2015 rapidly approaches (seriously, how on earth is it October already?), the picture of what we can expect from VR in 2016 is starting to look a little less fuzzy around the edges. There’s no question that next year is the Year of VR, at least in terms of mindshare. Right now it looks like no fewer than three consumer VR systems will be on the market during calendar 2016 – Oculus Rift, PlayStation VR and Valve / HTC Vive. They join Samsung’s already released Gear VR headset, although that device has hardly set the world on fire; it’s underwhelming at best and in truth, VR enthusiasts are all really waiting for one of the big three that will arrive next year.

Those fuzzy edges, though; they’re a concern, and as they come into sharper focus we’re starting to finally understand what the first year of VR is going to look like. In the past week or so, we’ve learned more about pricing for the devices – and for Microsoft’s approach, the similar but intriguingly different Hololens – and the aspect that’s brought into focus is simple; VR is going to be expensive. It’s going to be expensive enough to be very strictly limited to early adopters with a ton of disposable income. It’s quite likely going to be expensive enough that the market for software is going to struggle for the first couple of years at least, and that’s a worry.

Oculus Rift, we’ve learned, will cost “at least” $350. That’s just for the headset; you’ll also need a spectacularly powerful PC to play games in VR. No laptop will suffice, and you’re certainly out of luck with a Mac; even for many enthusiasts, the prospect of adding a major PC purchase or upgrade to a $350 headset is a hefty outlay for an early glimpse of the future. It’s likely (though as yet entirely unconfirmed) that Valve’s Vive headset will have a similar price tag and a similarly demanding minimum PC specification. The cheap end of the bunch is likely to be PlayStation VR – not because the headset will be cheap (Sony has confirmed that it is pricing it as a “platform” rather than a peripheral, suggesting a $300 or so price tag) but because the system you attach it to is a $350 PS4 rather than a much more expensive PC.

It is unreasonable, of course, to suggest that this means that people will be expected to pay upwards of $600 for Sony’s solution, or $1500 for the PC based solution. A great many people already own PS4s; quite a few own PCs capable of playing VR titles. For these people, the headset alone (and perhaps some software) is the cost of entry. That is still a pretty steep cost – enough to dissuade people with casual interest, certainly – but it’s tolerable for early adopters. The large installed base of PS4s, in particular, makes Sony’s offering interesting and could result in a market for PlayStation VR ramping up significantly faster than pessimistic forecasts suggest. On the PC side, things are a little more worrying – there’s the prospect of a standards war between Valve and Oculus, which won’t be good for consumers, and a question mark over how many enthusiasts actually own a PC powerful enough to run a VR headset reliably, though of course, the cost of PCs that can run VR will fall between now and the 2016 launch.

All the same, the crux of the matter remains that VR is going to be expensive enough – even the headsets alone – to make it into an early-adopter only market during its first year or so. It’s not just the cost, of course; the very nature of VR is going to make it into a slightly tough sell for anyone who isn’t a devoted enthusiast, and more than almost any other type of device, I think VR is going to need a pretty big public campaign to convince people to try it out and accept the concept. It’s one thing to wax lyrical about holodecks and sci-fi dreams; it’s quite another to actually get people to buy into the notion of donning a bulky headset that blocks you off from the world around you in the most anti-social way imaginable. If you’re reading a site like this one, you almost certainly get that concept innately; you may also be underestimating just how unattractive and even creepy it will seem to a large swathe of the population, and even to some of the gamer and enthusiast market VR hopes (needs!) to capture.

The multi, multi million dollar question remains, as it has been for some time – what about software? Again, Sony has something of an advantage in this area as it possesses very well regarded internal studios, superb developer relations and deep pockets; combined with its price and market penetration advantages, these ought to more than compensate for the difference in power between the PS4 and the PCs being used to power Rift and Vive, assuming (and it’s a big assumption) that the PS4’s solution actually works reliably and consistently with real games despite its lack of horsepower. The PC firms, on the other hand, need to rely on the excitement, goodwill and belief of developers and publishers to provide great games for VR in its early days. A handful of teams have devoted themselves to VR already and will no doubt do great things, but it’s a matter of some concern that a lot of industry people you talk to about PC VR today are still talking in terms of converting their existing titles to simply work in 3D VR; that will look cool, no doubt, but a conversion lacking the attention to controls, movement and interaction that’s required to make a VR world work will cause issues like motion sickness and straight-up disappointment to rear their ugly heads.

If VR is going to be priced as a system, not just a toy or a peripheral, then it needs to have software that people really, really want. Thus far, what we’ve seen are demos or half-hearted updates of old games. Even as we get close enough to consumer launches for real talk about pricing to begin, VR is still being sold off the back of science fiction dreams and long-held technological longings, not real games, real experiences, real-life usability. That desperately needs to change in the coming months.

At least Hololens, which this week revealed an eye-watering $3000 developer kit to ship early next year, has something of a roadmap in this regard; the device will no doubt be backed up by Microsoft’s own studios (an advantage it shares, perhaps to a lesser degree, with Sony) but more importantly, it’s a device not aimed solely at games, one which will in theory be able to build up a head of steam from sales to enterprise and research customers prior to making a splash in consumer markets with a more mature, less expensive proposition. I can’t help wondering why VR isn’t going down this road; why the headlong rush to get a consumer device on the market isn’t being tempered at least a little by a drive to use the obvious enterprise potential of VR to get the devices out into the wild, mature, established and affordable before pushing them towards consumers. I totally understand the enthusiasm that drives this; I just don’t entirely buy the business case.

At the very least, one would hope that if 2016 is the year of VR, it’s also the year in which we start to actually see VR in real-life applications beyond the gaming dens of monied enthusiasts. It’s a technology that’s perfectly suited to out-of-home situations; the architect who wants to give clients a walkthrough of a new building design; the museum that wants to show how a city looked in the past; the gaming arcade or entertainment venue that wants to give people an experience that most of them simply can’t have at home on their consoles. VR is something that a great many consumers will want to have access to given the right software, the right price point and crucially, the right experience and understanding of its potential. Getting the equipment into the hands of consumers at Tokyo Games Show or EGX is a start, but only a first step. If VR’s going to be a big part of the industry’s future, then come next year, VR needs to be everywhere; it needs to be unavoidable. It can’t keep running on dreams; virtual reality needs to take a step into reality.

Can Samsung Compete With Intel In The x86 Chip Space?

October 9, 2015 by Michael  
Filed under Computing

Samsung is not doing that well in smartphones. To be fair, no one is, but Samsung has the ability to become something much more interesting – it could replace AMD as Intel’s rival.

Actually, AMD is pretty cheap right now, and if it were not for the pesky arrangement that prevents AMD’s buyer from getting its x86 technology, it would have been snapped up a while ago. But with or without AMD, Samsung could still make a good fist of chipmaking if it put its mind to it. At the moment its chipmaking efforts are one of the better things on its balance sheet.

Its high-margin semiconductor business is more than making up for the shortfall in smartphones. Selling chips to rivals would be more lucrative if Samsung were not spinning its own mobile business. Its chip products are worth $11.7 billion this year, more than half the company’s total.

Growing demand for chips and thin-film displays is probably the main reason that Samsung now expects operating profit to have reached $6.3 billion. After applying Samsung’s 16 percent corporate tax rate, its chip division is likely to bring in net income of slightly less than $10 billion.

To put this figure into perspective, Intel expects to earn $10.5 billion this year. Samsung is also sitting on a $48 billion net cash pile. Samsung could treat its handset and consumer electronics business as a sideline and just focus on bumping off Intel.

The two sides of such a war would be fascinating. Intel has its roots in the PC chip market which is still suffering while Samsung is based in the mobile chip market which is growing. Intel has had no luck crossing into the mobile market, but Samsung could start looking at server and PC chips.

AMD is still dying and unable to offer Intel any challenge but there is a large market for those PC users who do not want to buy Intel. What Samsung should have done is use its huge cash pile to buy its way into the PC market. It might have done so with the IBM tech which went to Lenovo. It is still not out of the running on that front. Lenovo might be happy to sell IBM tech to Samsung.

Another scenario is that it might try to buy an x86 licence from Intel. With AMD dying, Intel is sitting on a huge monopoly on PC technology. It is only a matter of time before an anti-trust suit appears. Intel might think it is worthwhile to have a reliable rival to head off those allegations. Samsung would be a dangerous rival, but it would take a while before it got itself established. Intel might do well to consider it. Of course, Samsung might buy AMD, which could sweeten that deal for Intel.

Samsung could try adapting its mobile chip technology for the PC/server market – it has the money to do it. Then it has a huge job marketing itself as the new Intel.
It might just work.



Is Sony Dropping Morpheus?

September 21, 2015 by Michael  
Filed under Gaming

Sony has pulled back the curtains on its virtual reality headset, giving it an official introduction to the wild and a real-life name.

That name is PlayStation VR, which is an obvious but uninspired choice. The name that the unit had earlier, Morpheus, which was probably a nod towards starts-great-but-ends-badly film series The Matrix, had a bit more glamour about it.

The firm has shown off the hardware to the local journalistic crowd at the Tokyo Game Show, and provided the general press with information, details and specifications.

PlayStation VR was first discussed in March 2014 when it had the cooler name. Since then the firm has been hard at work getting something ready to announce and sell, according to a post on the PlayStation blog.

A game show in Tokyo would seem the most likely place for such an announcement.

Sony said that the system is “unique”, apparently because of a special sound system, and makes the most of the Sony PS4 and its camera. The firm is expecting the device to have a big impact on PlayStation gamers and gaming.

“The name PlayStation VR not only directly expresses an entirely new experience from PlayStation that allows players to feel as if they are physically inside the virtual world of a game, but reflects our hopes that we want our users to feel a sense of familiarity as they enjoy this amazing experience,” said Masayasu Ito, EVP and division president of PlayStation product business.

“We will continue to refine the hardware from various aspects, while working alongside third-party developers and publishers and SCE Worldwide Studios, in order to bring content that delivers exciting experiences only made possible with VR.”

Specifications are available, but they relate to a prototype and are subject to change. Sony said that the system has a 100-degree field of view, a 5.7in OLED display, a 120Hz refresh rate, and a panel resolution of 960×RGB×1080 per eye.

This will not put it at the high end of the market, as the field of view is only 10 degrees greater than with Google Cardboard, and 10 degrees under that of Oculus Rift. Some rivals go as wide as 210 degrees.
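
One way to compare the specs is horizontal pixels per degree of field of view. The PSVR numbers come from the spec sheet above; the Rift figures (roughly 1080 pixels per eye across about 110 degrees) are our assumptions for comparison, not Sony’s:

```python
# Horizontal pixels per degree of FOV, per eye. The Rift numbers
# are assumptions for comparison; PSVR's come from the spec above.
headsets = {
    "PlayStation VR": (960, 100),
    "Oculus Rift (assumed)": (1080, 110),
}
for name, (pixels, fov_degrees) in headsets.items():
    print(f"{name}: {pixels / fov_degrees:.1f} px/degree")
# PlayStation VR: 9.6 px/degree
# Oculus Rift (assumed): 9.8 px/degree
```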

And no, no release date or price has been mentioned. We predict that these will be 2016 and expensive.


MediaTek Goes Power Management

September 15, 2015 by Michael  
Filed under Computing

Beancounters at Digitimes Research claim that MediaTek’s move to buy Richtek Technology will help the analog IC vendor push its power management (PWM) gear into the smartphone and TV panel sectors.

Lately Richtek has been moving away from its notebook and motherboard segments and into PWM ICs for telecommunication and consumer applications. The idea was to avoid being damaged too much by the slump in the PC market.

Richtek has scored orders from Samsung Electronics covering entry-level and mid-range smartphone production, and has also been ramping its PWM solutions into China’s LCD TV panel sector.

MediaTek has also announced plans to acquire LCD driver IC maker Ilitek, which means that the outfit will build up a comprehensive supply chain for the TV industry in China, Digitimes noted.

MediaTek could also use the planned 12-inch joint venture fab to be built by Powerchip Technology, which is the parent company of Ilitek, in Hefei, China.

The JV fab will provide foundry services for LCD driver ICs in 2018-2019.



Will MediaTek Buy RichTek?

September 11, 2015 by Michael  
Filed under Computing

MediaTek is planning to write a cheque for a 51 per cent stake in analogue IC maker Richtek Technology and might even buy the whole company.

The company will offer US$5.94 for each Richtek common share. After completing the tender offer and going through the relevant legal procedures, the company will move forward with taking over the remaining shares of Richtek. The follow-up acquisition of Richtek shares is expected to complete in the second quarter of 2016.

Ming-Kai Tsai, MediaTek chairman and CEO, said that Richtek is a leader in analogue ICs and provides comprehensive power management solutions to satisfy various customer demands, backed by an experienced management and R&D team.

“We believe, through the deal, the competitive edges of both companies will be leveraged to maximize the platform synergy, strengthen MediaTek in Internet of Things segment and further enhance MediaTek’s competitiveness in the fast-changing and ever-competitive global semiconductor market,” he said.

Richtek chairman Kenneth Tai claimed the two outfits are complementary in power management IP and products, which creates a leadership position in this field.

He said that by using MediaTek’s platform leadership, Richtek could optimize power management performance on the system level to enable competitive products for customers and further expand analogue IC offerings to propel the company into its next stage of growth.


MediaTek’s Helio X30 Processor Comes To Light

September 3, 2015 by Michael  
Filed under Computing

The rumored Helio X30 is real, and if you thought that the X20 was not enough to see off the Snapdragon 820, it looks like the Helio X30 has a much better chance.

The all-new deca-core Helio X20 has two A72 cores at 2.5GHz, four A53 cores at 2.0GHz and four A53 cores at 1.4GHz. CorePilot 3.0, a smart scheduler, decides which core gets what task.

This processor has every chance of being faster than the Snapdragon 620 from Qualcomm, which comes with four A72 cores at 1.8GHz and four A53 cores at 1.4GHz, but we are unsure how the Helio X20 will match up against the Snapdragon 820 with its custom quad Kryo cores.

But the Helio X30 has four A72 cores at 2.5GHz, two A72 cores clocked at 2GHz, two Cortex-A53 cores clocked at 1.5GHz and two low-power A53 cores at 1GHz. A senior executive from MediaTek told us that not all cores were created equal.

Despite the fact that the word “A53” on one box looks the same as “A53” on the other box, one is optimized for performance and the other for low power. It is unclear if the A53-based cluster from MediaTek is the same as the A53 cluster from Qualcomm.

As you can read at Fudzilla, we spent quite some time learning about the potential gains of having three clusters. The X20 can cut power consumption by 30 to 40 per cent simply by being smart about how it uses all ten cores across its three clusters.

With the Helio X30 you will gain more performance, with six out of ten cores being based on the A72. Having ten cores in four clusters raises another question: how efficient will the four-cluster approach be versus the three-cluster approach?
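
CorePilot’s internals are not public, but the basic idea of a cluster-aware scheduler can be sketched: put each task on the cheapest cluster that still meets its performance demand, which is where the claimed 30 to 40 per cent power saving comes from. A toy illustration, loosely modelled on the X30 line-up above; the capacity and power numbers are invented:

```python
# Toy cluster-aware scheduler: assign each task to the lowest-power
# cluster that can satisfy its performance demand. The clusters mirror
# the rumored X30 line-up; capacity/power figures are invented.
CLUSTERS = [
    # (name, relative performance capacity, relative power cost)
    ("2x A53 @ 1.0GHz (low power)", 1.0, 1),
    ("2x A53 @ 1.5GHz",             1.5, 2),
    ("2x A72 @ 2.0GHz",             3.0, 5),
    ("4x A72 @ 2.5GHz",             4.0, 8),
]

def place(demand):
    """Return the cheapest cluster whose capacity covers the demand."""
    for name, capacity, power in CLUSTERS:
        if capacity >= demand:
            return name, power
    name, _, power = CLUSTERS[-1]   # saturate on the biggest cluster
    return name, power

for demand in (0.5, 1.2, 3.5):
    name, power = place(demand)
    print(f"demand {demand}: {name} (relative power {power})")
```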

MediaTek has not officially confirmed or launched the Helio X30, but we expect that this will happen soon. The X30 should be shipping in devices in early 2016; at least, that is what we would expect if it is to be placed well against the Snapdragon 820.


Does The Xbox One Mini Exist?

September 1, 2015 by Michael  
Filed under Gaming

The rumor mill might have been a bit broken when it was announced that Microsoft was about to launch an Xbox One Mini.

The rumor claimed that Microsoft would be holding a launch event in October where people could expect the company to launch the Surface Pro 4, Lumia flagships and an “Xbox One Mini.”

It was claimed that the Xbox One Mini would be a third the size of the current console and lack a Blu-ray drive.

However, Microsoft’s Phil Spencer has now debunked this theory, stating that the rumors are simply “not real” – although he didn’t say the project doesn’t exist, just that the rumor of an October launch was “not real.”

Given the nature of reality, and theories that the universe is a holographic game being played by two-dimensional gods, we are not ready to dismiss the idea out of hand just yet.

While the Xbox One Mini definitely won’t be happening, the Lumia flagships (Cityman and Talkman), new Surface tablets including the Surface Pro 4, the eagerly awaited Band 2 and perhaps even a slimmer Xbox One are still possibilities at the event.


AMD Still Losing Ground

August 21, 2015 by Michael  
Filed under Computing

AMD is continuing to lose market share to Nvidia, despite the fact that its best new video card, the Fury, is out.

AMD always had a get-out-of-jail card when the last lot of GPU market share numbers came out, on the basis that it had not released anything new. At the time Nvidia had 76 per cent of the discrete GPU market, when its best card was the GeForce GTX 980.

A lot has happened since then. There was the release of the Titan X in March, followed by the GTX 980 Ti in June. AMD had its Hawaii architecture inside the R9 290X, and a dual-GPU card in the form of the R9 295X2. It was expected that the R9 390X might turn AMD’s luck around, but that turned out to be another rebrand. Then came the arrival of the R9 Fury X.

AMD has new products on the market: the R9 Fury X, R9 Fury, R9 390X and a bunch of rebranded 300 series video cards. But according to Mercury Research’s latest data, Nvidia has jumped from 76 per cent of the discrete GPU market in Q4 2014 to 82 per cent in Q2 2015.

AMD has 18 per cent of the dGPU market share, even after the release of multiple new products.

It is not that the Fury X isn’t selling well, but because of yield problems there will be only 30,000 units made over the entire year.

AMD also rebranded nearly its entire product stack, leaving no reason to buy an R9 390X if you own an R9 290X.

Sure there is 8GB of GDDR5 on board compared to the 4GB offered on most R9 290X cards, but that’s not enough to push someone to upgrade their card.

Tweaktown noted that there was a big issue with the HBM-powered R9 Fury X not really offering any performance benefit over the GDDR5-powered GeForce GTX 980 Ti from Nvidia, with the 980 Ti beating the Fury X in some tests where it should not have.

Nvidia has plenty of GM200 GPUs to go around, with countless GTX 980 Ti models from a bunch of AIB partners. There is absolutely no shortage of GTX 980 Ti cards. Even if you wanted to get your paws on a Fury X, AMD has made it difficult.

Now it seems that next year could be a lot worse for AMD. Nvidia will have its GP100 and GP104 out next year, powered by Pascal, which will cane AMD’s Fiji architecture. Then Nvidia will swap to a 16nm process when its Maxwell architecture is already power efficient. Then there is the move to HBM2, where we should see around 1TB/sec of memory bandwidth.
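
The 1TB/sec figure is straightforward arithmetic rather than magic: each HBM2 stack exposes a 1024-bit interface at up to 2Gbps per pin, and a high-end GPU is expected to carry four stacks.

```python
# HBM2 bandwidth sanity check: 1024-bit interface per stack at up to
# 2Gbps per pin, with four stacks on a high-end GPU.
bits_per_stack = 1024
gbps_per_pin = 2
stacks = 4

gbytes_per_stack = bits_per_stack * gbps_per_pin / 8   # 256 GB/s per stack
total = gbytes_per_stack * stacks                      # 1024 GB/s
print(f"{total:.0f} GB/s, i.e. about 1 TB/s")
```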

All up the future does not look that great for AMD.


More Details Uncovered On AMD’s ZEN Cores

August 17, 2015 by Michael  
Filed under Computing

Our well informed industry sources have shared a few more details about AMD’s 2016 Zen cores, and it now appears that the architecture won’t use a shared FPU like Bulldozer did.

The new Zen uses SMT, just like Intel’s Hyperthreading: each core can process two threads at once. AMD has told a special few that it is dropping the “core pair” approach that was a foundation of Bulldozer. This means that there will not be a shared FPU anymore.

Zen will use a scheduling model that is similar to Intel’s and it will use competitive hardware and simulation to define any needed scheduling or NUMA changes.

Two cores will still share the L3 cache but not the FPU. This is because at 14nm there is enough space for an FPU inside each Zen core, and this approach might be faster.
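
Readers who want to see SMT in practice can inspect Linux’s CPU topology, where the sibling hardware threads sharing one physical core are listed per logical CPU; a small sketch that works today on Hyperthreaded Intel machines, and should apply equally to Zen when it ships:

```python
import glob

# On Linux, each logical CPU lists the SMT siblings it shares a
# physical core with in its thread_siblings_list topology file.
seen = set()
paths = glob.glob(
    "/sys/devices/system/cpu/cpu[0-9]*/topology/thread_siblings_list")
for path in sorted(paths):
    with open(path) as f:
        siblings = f.read().strip()
    if siblings not in seen:          # print each physical core once
        seen.add(siblings)
        print(f"physical core shared by logical CPUs: {siblings}")
```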

We mentioned this in late April, when we released a few details about the 16-core, 32-thread Zen-based processor with a Greenland-based graphics stream processor.

Zen will apparently be ISA-compatible with Haswell/Broadwell-style compute, and existing software will run without requiring any programming changes.

Zen also focuses on various compiler optimisations, including GCC, with a target SPECint v6-based score at common compiler settings, and Microsoft Visual Studio, with a target of parity with Intel on supported ISA features.

For benchmarking, the performance-oriented LLVM compiler targets a SPECint v6 rate score at performance compiler settings.

We cannot predict any instructions-per-clock (IPC) improvement over Intel’s Skylake, but it helps that Intel will replace Skylake with another 14nm processor in the later part of 2016. If Zen makes it to market in 2016, AMD might have a fighting chance of narrowing the performance gap with Intel’s greatest offerings.


Console Software Sales Strong And Growing

August 13, 2015 by Michael  
Filed under Gaming

As the 7th console generation was coming to an end several years ago, there was much pessimism regarding the impending launch of the 8th generation. Just as 7th generation software sales were starting to lag, mobile gaming exploded, and PC gaming experienced a renaissance. It was easy to think that the console players were going to be going elsewhere to find their gaming entertainment by the time the new consoles hit the scene. However, the 8th generation consoles have had a successful launch. In fact, the Sony and Microsoft consoles are as successful as ever.

A comparison of the year over year console software sales suggests that the 8th generation is performing better than the 7th generation – provided you exclude the Nintendo consoles. The following graph shows physical and digital software sales for years 1 through 3 of each generation for the Xbox and PlayStation platforms.

The annual numbers take into account the staggered launch cycle, so year 1 comprises different sales years for Xbox 360 and PS3. The data shows that the Sony and Microsoft platforms have outperformed their 7th generation counterparts, especially in the first two years of the cycle. The 8th generation outperforms the 7th generation even in an analysis that excludes DLC, which now accounts for an additional 5-10 percent of software sales.

However, the picture is far different if we include the Nintendo platforms. The graph below shows the same data, but now includes the Wii and Wii U in their respective launch years.

The data shows how much the “Wii bubble” contributed to the explosive growth in software sales in 2008, the year the Wii really took off as a family and party device. This data corroborates a broader theme EEDAR has seen across our research – new, shortened gaming experiences that have added diversity to the market, especially mobile, have cannibalized the casual console market, not the core console market. People will find the best platform to play a specific experience, and for many types of experiences, that is still a sofa, controller, and 50 inch flat-screen TV.

The shift in consoles to core games is further exemplified by an analysis of sales by genre in the 7th vs. 8th generation. The graph below shows the percentage of sales by genre in 2007 versus 2014, ordered from more casual genres to more core genres. Casual genres like General Entertainment and Music over-indexed in 2007 while core genres like Action and Shooter over-indexed in 2014.

It has become trendy to call this console generation the last console generation. EEDAR believes one needs to be very specific when making these claims. While this might be the last generation with a disc delivery and a hard drive in your living room, EEDAR does not believe the living room, sit-down experience is going away any time soon.

AMD’s x86 16-core Heterogeneous EHP Processor Spotted

August 6, 2015 by Michael  
Filed under Computing

It was rumored back in April that AMD had an upcoming Exascale Heterogeneous Processor (EHP) with 16 cores and Greenland graphics, and now it seems that the rest of the world has caught up with the news.

A paper submitted to the IEEE mentioned, for the first time, sixteen Zen cores wrapped around the GPU and powered by HBM2 memory. We believe that this is a 16-core processor with 32-thread support, not 32 cores as many reported. We will know soon enough, and then we can have another “we told you so” headline.

We would not be surprised to hear more about this AMD processor at the Hot Chips conference on August 23rd. The EHP computing solution uses a silicon interposer and an APU chip that, almost as a raison d’être for AMD over the past several years, packs a GPU and CPU into one well-tuned package, all surrounded by die-stacked DRAM.

The Italian website that brought this news back to life claims that AMD expects to ship the product between 2016 and 2017. That is the sort of timing you can expect for the rest of the Zen-based cores on the market. One can only hope that it will happen sooner rather than later. AMD needs to grab more of the high-performance compute market and earn some profits.

The IEEE article sheds a bit of light on AMD’s exascale computing strategy:

Exascale computing requires very high levels of performance capabilities while staying within very stringent power budgets. Hardware optimized for specific functions is much more energy efficient than implementing those functions with general purpose cores. However, there is a strong desire for supercomputer customers to not have to pay for custom components designed only for high-end HPC systems, and therefore high-volume GPU technology becomes a natural choice for energy-efficient data-parallel computing. To fully realize the capabilities of the GPU, we envision exascale compute nodes comprised of integrated CPUs and GPUs (i.e., accelerated processing units or APUs) along with the hardware and software support to enable scientists to effectively run their scientific experiments on an exascale system.  [In the paper submitted to IEEE...] We discuss the hardware and software challenges in building a heterogeneous exascale system, and we describe on-going research efforts at AMD to realize our exascale vision.