Pricing For AMD’s Ryzen 1300X Leaked

July 20, 2017
Filed under Computing

According to a Reddit post, upcoming Ryzen 3 SKUs, the Ryzen 3 1300X and the Ryzen 3 1200, will be hitting the market at US $129 and US $109 (exc. VAT), respectively.

While AMD revealed plenty of details about its two upcoming quad-core Ryzen 3 SKUs yesterday, including the clocks and the launch date, we are still missing a few key details, including the TDP, the amount of cache and the price.

In case you missed it yesterday, both Ryzen 3 SKUs are quad-core parts without SMT (Simultaneous Multi-Threading) support, so they will stick to “just” four threads. The Ryzen 3 1300X works at 3.5GHz base and 3.7GHz Turbo clocks, while the Ryzen 3 1200 works at 3.1GHz base and 3.5GHz Turbo clocks. As rumored earlier, the Ryzen 3 lineup should retain the same 2MB/8MB (L2/L3) cache as the Ryzen 5 series and should have the same 65W TDP, although these details have yet to be confirmed.

Luckily, a Reddit user has managed to get unconfirmed details regarding the price of these two SKUs, suggesting that they should launch at US $129 for the Ryzen 3 1300X and US $109 for the Ryzen 3 1200, excluding VAT. While the price of the Ryzen 3 1300X sounds about right, and similar to what we heard before, we have our doubts regarding the Ryzen 3 1200 price, which we suspect would be closer to the US $100 mark.

In any case, we’ll know for sure in about two weeks when these parts are scheduled to hit retail/e-tail shelves. It will be quite interesting to see these Ryzen 3 SKUs compared to some Intel Core i3 Kaby Lake dual-core parts as we are quite sure that these will give Intel a hard time in that part of the market, offering significantly higher performance for much less money.


Is Virtual Reality Too Expensive For The Masses

July 20, 2017
Filed under Gaming

The current generation of virtual reality is not dead, but it’s not exactly full of life, either. What once was a pulsating buzz has faded into the background of an industry, not because there are newer, shinier toys to play with, but simply because for all the newness and shine of VR, there has been little evidence that a significant audience exists for the experiences we can deliver at this time.

Earlier this week, Oculus instituted a temporary $200 price cut of the Rift, dropping the headset and its Touch controllers to a $400 bundle that comes packed with seven free games (including Lucky’s Tale, Medium, Toybox, and Robo Recall) and an Xbox One controller for good measure. That’s in addition to the $200 price cut Oculus rolled out in March for the headset and Touch combo, meaning the company has slashed the price by 50% in just four months.
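Working backwards from the article’s rounded figures, the 50% claim checks out; a quick back-of-the-envelope calculation (the $200/$400 numbers are the article’s, the implied $800 starting price is just their sum):

```python
# Two successive $200 cuts brought the Rift + Touch bundle down to $400,
# implying a starting price of roughly $800 on these rounded figures.
current_price = 400
cuts = [200, 200]  # the March cut, then this week's temporary cut
original_price = current_price + sum(cuts)

reduction = 1 - current_price / original_price
print(f"${original_price} -> ${current_price}: {reduction:.0%} off")  # 50% off
```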

On its own, this could actually be an encouraging sign, but taken in context of the rest of the news coming out of the VR sector, it’s more concerning than convincing. For one, Oculus looks to be bringing up the rear among the three major high-end VR options on the market, despite being a first mover and having the significant financial backing of Facebook. Through the first half of this year, tracking firm Superdata put the Rift’s installed base at just 383,000 units, compared to HTC Vive’s 667,000 units and PlayStation VR’s 1.8 million.

Even ignoring its relative sales position, Oculus is already in a tough spot in the enthusiast VR fight: technologically a step behind the more expensive Vive, but still more expensive (when considering the cost of a VR-capable PC) and less mass market than the PSVR. That’s a difficult marketing problem for any product, doubly so when what you’re selling is an experience that by its nature needs to be experienced to be fully understood, and triply so when you’re drastically scaling back the number of demo units in retail locations where interested customers could get their first taste of VR.

I also question Oculus’ decision to shutter its in-house Story Studio, which was set up with Pixar veterans to show how VR could shift the medium of film as much as it could games. The studio’s Henry won an Emmy in 2016. Its follow-up, Dear Angelica, premiered at Sundance earlier this year to rave reviews and has been submitted for Emmy consideration at this year’s awards, which are still a few months away. In short, Story Studio was exactly the sort of investment in a potentially disruptive medium you would expect a company with long-term ambitions to keep. Instead, they cut it loose, with head of content Jason Rubin essentially saying it was time for external filmmakers to pick up the narrative VR ball (albeit with some $50 million in funding from Oculus).

There’s a bit of a theme there. Just a couple of months before closing Story Studio, Rubin pointed out at GDC that Facebook – and by extension, Oculus – isn’t a content creation company.

“Facebook’s not a media company,” Rubin said. “So there may be a day where Facebook says we’re going to head towards our core competency… That’s why I don’t have internal teams. I have exactly one group of three people besides Story Studios because that didn’t exist outside.”

Facebook didn’t pay $2 billion for Oculus in 2014 because it wanted to make games. It wanted VR to be a popular thing it could leverage for its social network. If HTC Vive or Sony or Microsoft can make VR work better than Oculus, that still gets VR where the social network wants it to be. That’s not ideal for Facebook, but after the Rift’s slow start, the hundreds of millions it already owes in court judgments, the hundreds of millions more it might be made to pay in the future, and seeing the face of the VR revolution leave under a cloud of controversy, one could understand if the company’s commitment to VR began to waver.

Speaking of the competition, I’m not terribly optimistic with what they’re bringing to the table. Sony’s PSVR is leading the pack, but I’m still skeptical whether the company’s interest in the hardware will be any longer lasting than its support for Vita, or Wonderbook, or PlayStation TV, or Move, or EyeToy, or stereoscopic 3D. Sony’s E3 conference featured some promising games in Polyarc’s Moss, two titles from Until Dawn developer Supermassive, and Skyrim VR, but little that stands out as a system-seller the way that Resident Evil 7, or even the prospect of last year’s Batman and Star Wars VR experiences might have. When asked at E3 about whether that lineup would boost PSVR adoption, Sony’s Jim Ryan was unsure.

“I think we are still really just learning about VR,” Ryan said. “When hopefully we meet in a year’s time, I will be able to give you a better answer to this question. It still won’t be a perfect answer, but I’ll know more.”

That’s not exactly an overwhelming vote of confidence from PlayStation’s chief marketer. I’m not sure I want to bet the future health of VR on Sony’s continued support for a market that is (for now, at least) peripheral to its core business.

The situation with HTC and the Vive underscores another issue when trying to establish an emerging field like VR. Vive launched at the cutting edge, but has since rolled out object-tracker peripherals and a wireless adaptor, respectively giving developers more options and addressing a key complaint about high-end VR. In both cases, these would arguably be better served as part of the core hardware package rather than as optional add-ons for what is already the most expensive option on the market. For the next generation of VR, perhaps they’ll be standard.

But who will invest in the next generation of enthusiast VR–on either the consumer side or the manufacturer side–if this generation disappoints? How long does a VR generation need to be before someone who spent $800 on a Vive (not to mention the cost of a VR-capable PC) feels they got their money’s worth and would re-up for a successor? How many great games does it need to have? How many generations does an HTC or Facebook need to take a bath on before the business turns around and justifies the continued investment?

Then there’s Microsoft, which will enter the fray this holiday season with its “mixed reality” VR headsets for Windows that are cheaper and require less setup than Oculus or Vive, but appear to make compromises on the technical side to get there. It’s telling that even with Microsoft launching the high-end, VR-capable Xbox One X this year, it is forgoing any sort of console VR push and relying on higher resolutions and better frame rates for Xbox One games as the sales pitch for a One X. Phil Spencer told us at E3 that VR was still years away from the mainstream for gamers, suggesting the company was waiting to launch its console VR until it had a proper wireless solution ready.

At this point, it seems more likely to me that the current enthusiast VR market is an expensive R&D exercise that won’t produce successful systems, but will lay the groundwork for the actual mass market VR, which will instead evolve both in audience and use-cases from the mobile VR world. (We call it mobile VR, but I don’t think I’m alone in having never once seen someone using a mobile VR headset on the subway, in the security line at the airport, or in the waiting room at a dentist.)

A number of the VR developers I’ve spoken to have mentioned wires, price, system-selling software, and installed base as key issues VR needs to tackle to become truly mainstream. As Google Daydream and the Oculus-powered Gear VR have shown, the first two are all but solved problems in mobile VR thanks to the use of existing smartphones. As for the other two, when your system is only $100 or so, the definition of a system-seller changes dramatically, which then has plenty of beneficial implications for the installed base. (Promotions like Samsung giving away Gear VR with new Galaxy phone purchases don’t hurt, either.)

All mobile VR really needs are better interfaces and more powerful phones. The Gear VR motion controller is a good first step for the former, and the latter is improving all the time. If VR is really going to go mass market, doesn’t it make more sense for it to grow not from the high-end early adopter market who would have dropped $600 on a PS3, but from the masses who made a compelling novelty like the $250 Wii a phenomenon?

Is Intel Worried About AMD’s Epyc Processor

July 19, 2017
Filed under Computing

Intel is clearly feeling a little insecure about AMD’s new Epyc server processor range, based on the Ryzen technology.

Intel’s press office retreated to the company safe and pulled out its favorite pink handbag and emerged swinging.

It did a direct comparison between the two, and in one slide, it mentioned that the Epyc processor was ‘inconsistent’, and called it ‘glued together’.

Intel noted that it required a lot of optimisations to get it to work effectively, comparing it to the rocky start AMD had with Ryzen on the desktop. That is pretty much fighting talk, and it has gone down rather badly.

TechPowerUp noted that even though Epyc did contain four dies, it offered some advantages as well, like better yields. On top of that, they noted: “So AMD’s server platform will require optimisations as well because Ryzen did, for incomparably different workloads? History does inform the future, but not to the extent that Intel is putting it here to, certainly. Putting things in the same perspective, is Intel saying that their Xeon ecosystem sees gaming-specific optimizations?”

Intel still has a healthy lead on AMD in the server space. However, since the launch of Ryzen, Intel has seen a significant drop in support in the desktop market.

Trash talking is usually a sign that there is not much difference between products and it never really works – other than to amuse.

AMD announced its line of Epyc processors last month. The range consists of chips with between eight and 32 cores, all of which support eight channels of DDR4-2666 memory. Pricing was announced as starting from $400.


AMD Goes EPYC To Take On Intel In The Server Space

June 30, 2017
Filed under Computing

AMD has unveiled the first generation of its Zen-based Epyc server processors as it looks to take on Intel in the data centre market.

We knew this was coming, and AMD on Monday showed off its AMD Epyc 7000 series at an event in Austin, Texas. The lowest-spec offering is the Epyc 7251, which offers eight cores supporting 16 simultaneous threads, and a base frequency of 2.1GHz that tops out at 2.9GHz at maximum boost.

The Epyc 7601 is the firm’s top-of-the-line chip, packing 32 cores, 64 threads and a base frequency of 2.2GHz, with maximum boost at 3.2GHz. AMD claims that, compared to Intel’s comparable Xeon processors – which offer up to 24 cores – the new Epyc 7601 offers 47 per cent higher performance.
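Taking AMD’s claim at face value, a rough calculation (my arithmetic, not AMD’s, and it assumes perfect scaling across cores, which real workloads never achieve) shows how much of that 47 per cent comes from simply having more cores:

```python
# 32-core Epyc 7601 vs a 24-core Xeon: split the claimed 47% speedup
# into the core-count advantage and the implied per-core advantage.
epyc_cores, xeon_cores = 32, 24
claimed_speedup = 1.47

core_count_ratio = epyc_cores / xeon_cores            # ~1.33x from cores alone
per_core_ratio = claimed_speedup / core_count_ratio   # what's left over, ~1.10x

print(f"core-count advantage: {core_count_ratio:.2f}x")
print(f"implied per-core advantage: {per_core_ratio:.2f}x")
```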

What’s more, AMD claims that each Zen core is about 52 per cent faster per clock cycle than the previous generation, and boasts that the chips are more competitive in integer, floating point, memory bandwidth, and I/O benchmarks and workloads.

“With our Epyc family of processors, AMD is delivering industry-leading performance on critical enterprise, cloud, and machine intelligence workloads,” said Lisa Su, president and CEO of AMD.

“Epyc processors offer uncompromising performance for single-socket systems while scaling dual-socket server performance to new heights, outperforming the competition at every price point. We are proud to bring choice and innovation back to the datacenter with the strong support of our global ecosystem partners.”

Each Epyc processor – of which there are nine different models – also offers eight memory channels supporting up to 2666MHz DDR4 DRAM, 2TB of memory and 128 PCIe lanes. 

Server manufacturers have been quick to introduce products based on AMD Epyc 7000-series processors, including HPE, Dell, Asus, Gigabyte, Inventec, Lenovo, Sugon, Supermicro, Tyan, and Wistron, while the likes of Microsoft, Dropbox and Bloomberg also announced support for Epyc in the data centre. 

Monday’s launch marks the company’s first major foray back into servers and the data centre in almost a decade. The Opteron line of server microprocessors from AMD, first launched in 2003, found its way into an increasing number of the world’s top-100 most powerful supercomputers, peaking in 2010 and 2011 when 33 of the top 100 were powered by AMD Opteron.

Clearly feeling the heat, Intel has taken the bizarre approach of responding to AMD’s Epyc launch, saying that its rival’s approach could lead to “inconsistent performance”.

“We take all competitors seriously, and while AMD is trying to re-enter the server market segment, Intel continues to deliver 20+ years of uninterrupted data center innovations while maintaining broad ecosystem investments,” the firm said in a statement.

“Our Xeon CPU architecture is proven and battle tested, delivering outstanding performance on a wide range of workloads and specifically designed to maximise data centre performance, capabilities, reliability, and manageability. With our next-generation Xeon Scalable processors, we expect to continue offering the highest core and system performance versus AMD.

“AMD’s approach of stitching together 4 desktop die in a processor is expected to lead to inconsistent performance and other deployment complexities in the data centre.”


Is AMD’s Ryzen 1950X Ready To Hit The Market

June 26, 2017
Filed under Computing

AMD’s Ryzen ThreadRipper 1950X CPU engineering sample, a 16-core/32-thread SKU, has been spotted on Geekbench running at 3.4GHz base clock.

This should be the flagship SKU and it appears it won’t have the 1998X model number, as previously rumored. The engineering sample works at 3.4GHz base clock and was running on an ASRock X399 Professional Gaming motherboard with 16GB of DDR4-2133 memory.

The ThreadRipper 1950X, as it is currently called, packs a massive 32MB of L3 cache and 8MB of L2 cache. Since this is an engineering sample, bear in mind that the performance figures are far from final: AMD will probably further optimize performance, and the sample was running with lower-clocked DDR4-2133 memory, with no details on whether it was in a quad- or dual-channel configuration.

According to the results posted on Geekbench, the ThreadRipper 1950X managed to score 4,167 points in the single-thread benchmark and 24,539 points in the multi-thread benchmark.

The CPU was compared to Intel’s Xeon E5-2697A v4, which is also a 16-core/32-thread CPU, based on the Broadwell architecture, and which scores 3,651 points in single-thread and 30,450 points in multi-thread performance.
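For context, here is how those reported scores compare in relative terms (my arithmetic on the article’s numbers; remember the AMD part is an engineering sample):

```python
# Reported Geekbench scores from the article.
threadripper = {"single": 4167, "multi": 24539}   # ThreadRipper 1950X ES
xeon         = {"single": 3651, "multi": 30450}   # Xeon E5-2697A v4

for bench in ("single", "multi"):
    delta = (threadripper[bench] - xeon[bench]) / xeon[bench]
    print(f"{bench}-thread: ThreadRipper is {delta:+.1%} vs the Xeon")
```

The sample leads in single-thread but trails in multi-thread, which fits the note above that the figures are far from final.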


Will nVidia’s Next GeForce Go HBM2

June 22, 2017
Filed under Computing

Volta is out for artificial intelligence and machine learning applications, and it will be shipping in DGX-1 systems, mainly for deep learning and AI. The next GeForce will be a completely separate chip.

Of course, Nvidia won’t jump in and manufacture a high-end GeForce card with 21 billion transistors – that would be the Volta that Nvidia CEO Jensen Huang launched back in May, and it would be both risky and expensive. One of the key reasons is that Nvidia doesn’t really have to push the technology envelope, as the GP102-based 1080 Ti and Titan Xp look really good.

Our well-informed sources tell us that the next GeForce will not use HBM2 memory. It is too early for that, and HBM2 is still expensive – at least if you ask Nvidia; AMD, by contrast, has been committed to an HBM2 GPU, codenamed Vega, for more than a year now. Back with Maxwell, Nvidia committed to a better memory-compression path and continued down it with Pascal.

The next GeForce – its actual codename is still secret – will use GDDR5X memory as the best solution around, and we can only speculate whether the card will even be based on the Volta architecture. The big chip that would replace the 1080 Ti could end up with the Gx104 codename. It is still too early for the rumored GDDR6, which will arrive next year at the earliest.

All eyes are on AMD, as we still have to see the Vega 10 launching. At the last financial analyst conference call, the company committed to launch the HBM 2 based Vega GPU at Siggraph. This year, Siggraph takes place between July 30 and August 3.

AMD’s lack of a higher-end card doesn’t really help its financial cause, as you need high-margin cards to improve your overall profits. The fact that the Radeon RX 570 and 580 are selling to miners definitely helps the RTG: the Radeon Technologies Group is selling everything it can make, and that is a good place to be. The delay for Vega is less appealing, but again, if this card ends up being a good miners’ card, gamers might have a hard time getting hold of one at all.


Apple And Ikea Team Up On Virtual Furniture

June 21, 2017
Filed under Around The Net

Ikea, the famous flat-pack furniture manufacturer, is developing an app that will digitally overlay true-to-size furniture using Apple’s new ARKit technology. Looking through the window of an iPhone or iPad, you’ll be able to see how Ikea’s furniture could look in your home before you have to buy or assemble anything.

Apple unveiled ARKit at its WWDC conference earlier this month, naming the Swedish furniture company as one of its partners, but other details were scarce. Now, thanks to an interview with Ikea digital transformation manager Michael Valdsgaard at Di Digital, we’re getting a little more information on the fruits of that partnership.

According to Valdsgaard, the app will have realistic 3D renders of 500-600 pieces of furniture upon its launch, with items added sporadically. Ikea also hopes to add a feature that lets you buy furniture from the app after you virtually map it out in your house.

Just don’t be surprised if the app doesn’t have the exact rocking chair you want at launch — Ikea’s full catalog includes tens of thousands of items.

We already knew that Apple CEO Tim Cook is a big fan of AR, calling the technology “huge” and claiming it has more potential than VR. But in order to get behind Cook’s excitement, we’ll need to see some real world applications of the technology, besides just catching Pokemon. The Ikea app, which is reportedly aiming to launch in the autumn when iOS 11 is available, could be a good example.

Ikea did not immediately respond to a request for comment.

Was Apple’s “Planet Of The Apps” A Good Idea

June 19, 2017
Filed under Around The Net

Apple’s debut into the world of original television programming shows that its self-obsession and lack of self-awareness make for dire telly.

Apple felt that the world was ready for it to show off its skills and make some original content. It had to be “Apple friendly” and still interest those users that it really does not like. What could go wrong?

Well according to even the most sympathetic reviewers in the Tame Apple press, the show is really dire.

The idea is to bring app developers into a competition to win mentoring and assistance from hosts Jessica Alba, Gwyneth Paltrow and entrepreneur Gary Vaynerchuk. Now call me odd, but I really would not think Gwyneth and Jessica know that much about programming and, try as I may, I have never been able to find the point of

Contestants describe their proposals as they ride an escalator down onto a stage where the judges sit, and then fire questions at the app developer.

Variety said “Planet of the Apps” feels like something that was developed at a cocktail party, and not given much more rigorous thought or attention after the pitcher of mojitos was drained. It’s a bland, tepid, barely competent knock-off of “Shark Tank,” moaned the magazine.

“Apple made its name on game-changing innovations, but this show is decidedly not one of them. The program’s one slick innovation is the escalator pitch,” it added.

The show makes too many assumptions. The first is that you will know who everyone is, when they are only famous for supporting Apple to the point of naming their own children after the company. The second assumption is that you will care about Apple’s development process and believe it is a way to make money. Most developers of apps for the fruity cargo cult would rather be doing something else, like gouging their own eyes out with spoons. Apple does not make the creation process that easy and takes away a big chunk of money for its lack of co-operation.

All up, this shows the arrogance and narcissism of Apple in its rotting-corpse glory. A sensible outfit would have spent the money having someone who knew what they were doing produce a show. Apple gets more positive publicity from its product placement on shows like Grimm than it will ever get from this pile of tosh.


Could AMD’s Threadripper Undercut Intel’s 7900X

June 15, 2017
Filed under Computing

According to a fresh report, AMD’s entry-level 16-core Threadripper CPU could cost as little as US $849.

According to the report, the entry-level 16-core/32-thread Threadripper SKU, also known as the Threadripper 1998 – which works at 3.2GHz base and 3.6GHz Turbo clocks, lacks the eXtended Frequency Range (XFR) feature and has a 155W TDP – could launch at a US $849 price.

If this rumor turns out to be true, AMD will significantly hurt Intel as this Threadripper will end up cheaper than Intel’s 10-core 7900X, which has a US $999 price tag (tray 1KU).
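On a crude price-per-core basis (list prices from the article; this obviously ignores per-core performance differences), the gap is stark:

```python
# Rumored Threadripper 1998 vs Intel's 10-core 7900X, price per core.
# The $999 Intel figure is the 1KU tray price quoted above.
threadripper_price, threadripper_cores = 849, 16
intel_price, intel_cores = 999, 10

print(f"Threadripper: ${threadripper_price / threadripper_cores:.2f} per core")
print(f"Core i9-7900X: ${intel_price / intel_cores:.2f} per core")
```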

Although it could end up being slower than Intel’s 10-core chip in some scenarios, like gaming, the sheer number of cores and threads it offers would make it a great CPU for some CPU intensive tasks.

Hopefully, AMD will manage to bring more competition to the CPU market, as that would both drive prices down and most likely bring better CPUs in the future.


Is Apple Still The King Of Innovation

June 8, 2017
Filed under Around The Net

Apple co-founder Steve Wozniak has said that Apple’s days of coming up with anything new are gone.

Chatting to Bloomberg on what are likely to be the biggest tech breakthroughs in the coming years, and which companies are likely to make them, Woz didn’t list Apple as a contender.

Woz appeared to be ruling out Apple because it’s too big to come up with new tech.

“If you look at the companies like Google and Facebook and Apple and Microsoft that changed the world – and Tesla included – they usually came from young people. They didn’t spring out of big businesses,” he said.

Small businesses, he argued, take bigger risks, and their founders create the products they really want, without the dilution that occurs with multiple decision-makers.

He thought AI was the hottest field right now, with self-driving cars at the top of the list of things likely to have a huge impact on our lives within the next five years or so.

On that basis, he expects Tesla is the most likely to succeed, even though every major car manufacturer is working on autonomous cars.

“I think Tesla is on the best direction right now. They’ve put an awful lot of effort into very risky things […] I’m going to bet on Tesla,” he said.

He is not the only one. Already two major venture capitalists have suggested that Tesla is the closest thing to Apple during the pre-iPhone era.

The Tame Apple press of course does not think that Woz was saying anything bad. After all a technology company does not have to invent anything to be super, cool and great.


HTC Says Vive Virtual Reality Headset Will Work With Apple’s New OS

June 7, 2017
Filed under Consumer Electronics

Taiwanese consumer electronics giant HTC Corp has confirmed that its virtual reality (VR) headset will be compatible with Apple Inc’s High Sierra operating system (OS), which is scheduled for release later this year.

HTC’s Vive headset works in conjunction with Valve’s SteamVR virtual reality system, and Apple is working with Valve to make SteamVR compatible with its new OS, the U.S. tech firm said in a separate statement on Monday.

Compatibility with Apple’s Macintosh computers would greatly expand HTC’s VR reach, which has so far been focused on personal computers such as those powered by Microsoft Corp’s Windows 10.

HTC has also worked in VR with Intel Corp and Alphabet Inc’s Google.

“With this, Apple brings support for HTC Vive and SteamVR to the 100 million active Mac users,” said David Dai, a senior analyst of Asian Emerging Technologies at researcher Sanford C. Bernstein. “That’s certainly good for the company.”

Apple used the Vive headset in a demonstration at the Worldwide Developers Conference on Monday, the first day of a five-day event, an HTC spokesperson told Reuters.


Are NAND SSDs A Security Risk

June 7, 2017
Filed under Computing

NAND flash memory chips used in solid-state drives (SSDs) include what could be called “programming vulnerabilities” that can be exploited to alter stored data or shorten the SSD’s lifespan.

According to Bleeping Computer, the programming logic powering MLC NAND flash memory chips – the tech used for the latest generation of SSDs – is vulnerable to at least two types of attacks.

The first attack is called “program interference” and takes place when an attacker manages to write data with a certain pattern to a target’s SSD.

Writing this data repeatedly and at high speeds causes errors in the SSD, which then corrupts data stored on nearby cells. This attack is similar to the infamous Rowhammer attack on RAM chips.

The second attack is called “read disturb” and in this scenario, an attacker’s exploit code causes the SSD to perform a large number of read operations in a very short time, which causes a phenomenon of “read disturb errors” that alters the SSD’s ability to read data from nearby cells, even long after the attack stops.
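The mechanics of read disturb can be sketched with a toy simulation. This is emphatically not a real exploit, and every number in it (the ECC strength, the per-read flip probability) is made up for illustration; the point is only the shape of the problem described above: each read slightly disturbs neighbouring cells, and once accumulated bit errors exceed what the drive’s ECC can correct, the neighbouring data is effectively lost.

```python
import random

# Toy model: every read of an "aggressor" page has a tiny chance of
# flipping a bit in a neighbouring "victim" page. The drive's ECC can
# correct a limited number of bad bits per page; beyond that, the
# victim page becomes unreadable. All constants are illustrative.
ECC_CORRECTABLE_BITS = 40   # assumed ECC strength per page
DISTURB_PROB = 1e-4         # assumed per-read bit-flip chance in a neighbour

def read_disturb_sim(reads, rng):
    """Return the number of bit errors accumulated in the victim page."""
    errors = 0
    for _ in range(reads):
        if rng.random() < DISTURB_PROB:
            errors += 1
    return errors

rng = random.Random(42)
for reads in (10_000, 1_000_000):
    errs = read_disturb_sim(reads, rng)
    status = "correctable" if errs <= ECC_CORRECTABLE_BITS else "UNCORRECTABLE"
    print(f"{reads:>9} reads -> {errs:3d} bit errors ({status})")
```

At a low read count the handful of errors stays within ECC limits; hammer the same region with enough reads in a short window and the error count sails past what ECC can fix, which is the phenomenon the researchers describe.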

The research was first mentioned in a paper with the catchy title Vulnerabilities in MLC NAND Flash Memory Programming: Experimental Analysis, Exploits, and Mitigation Techniques, authored by six researchers from Carnegie Mellon University, Seagate, and the Swiss Federal Institute of Technology in Zurich.


Can Free-To-Play Dominate Virtual Reality Gaming

June 5, 2017
Filed under Gaming

Premium games will always be consumers’ first choice when it comes to virtual reality titles, says Oskar Burman.

Speaking at last week’s Nordic Game Conference, the former general manager of Rovio and CEO of Fast Travel Games said he does not believe free-to-play will take over the VR market the way it has on mobile – although it will have a significant presence.

“Free-to-play is going to become more common as the userbase grows,” he explained. “However, I don’t think free-to-play is going to be as dominant as it is on mobile. I think there will be more of a balanced mix. Even if free-to-play is eventually the majority, I still think there’s going to be a market for premium games – a little bit like it is on the PC today.”

With that in mind, he urged VR developers to build premium games, titles with a set price tag. He said that there is plenty of room to experiment with microtransactions and selling more substantial DLC, but studios certainly shouldn’t be afraid to charge for their work – particularly in the higher end of the VR space.

“It just makes sense,” he insisted. “If you spent $1000 or more on equipment, on a PC and a VR headset, then you wouldn’t mind spending a little bit more to actually buy a couple of good games.”

While there’s a lot of attention for virtual reality centred around the PC and console experiences, Burman reminded attendees that the target audience on mobile is “definitely higher”, estimating it to be close to 10m based on his studio’s own research. However, these potential customers cannot be targeted in the same way.

“The audience there, we call them semi-traditional gamers,” said Burman. “There are definitely gamers in there but there are also users who are not that familiar with gaming, who just want to try out this new VR thing. They might be trying some games, but also trying a lot of other stuff.”

His advice to any studio looking to raise awareness for their VR game was to approach influencers rather than traditional media – “because that’s just the way the market is” – and consider promoting them in much the same way they do PC and console titles.

Burman believes premium games with rapid start-up times and even microtransactions will thrive in the VR market

Burman took time to ponder about the changing genres we might see in virtual reality. He attributed the current wave of shorter action, shooter and puzzle games to a combination of the smaller teams and limited budgets behind most releases but expects this to expand as the audience grows. He suggested sports games might become more common, a relatively unexplored genre when it comes to VR, but says sandboxes could provide the real killer app.

“I’m really curious to see what’s happening with the creation genre and VR,” he said. “What’s going to be the next Minecraft VR? I’m pretty sure we’re going to see something spectacular happen in that space.”

The talk primarily dealt with learnings VR developers could take from other platforms, with Burman’s own experiences in mobile proving to be particularly fruitful. In some ways, VR titles need to be as accessible and instantly gratifying as games for smartphones.

“One thing we have learned is that start-up times really do matter in VR, both on mobile VR and wired,” said Burman. “Because if you have a VR game where it’s loading and you wait and wait and wait, it really breaks the immersion and takes you out of that experience. There are quite a few games in VR today that do have that problem. Learn from mobile: you have to get the game up and running as soon as possible. Try to avoid long loading times – that’s something mobile excels at.

“Another learning from mobile is those short bursts – the player logs in, does some farming and then goes out – that’s probably not going to work as well in VR, because it’s much more difficult to set up so it doesn’t make sense to do a very, very quick burst of gaming.”

On the subject of session lengths, Burman shared the results of a study Fast Travel Games carried out into how long players spend with their various platforms. Traditional PC and console games engage players for between 90 and 150 minutes per session, a couple of times per week. Mobile, predictably, sees much shorter bursts of around two to four minutes, with users playing multiple times a day. Virtual reality falls somewhere in between.

“When we looked at VR games with the user studies that we’ve done, we see that players ideally spend 20 to 30 minutes per session,” said Burman. “And they do this with roughly the same frequency as console players.

“Some learnings from this is that gameplay depth is something we can bring over from PC and console. People do want to immerse themselves in the same kind of worlds that they find in traditional games, and the creation of these universes is going to be even more important for VR.”

A member of the audience pressed this further at the end of the session, asking if Burman expects VR to cut into the time already spent on traditional PC and console games.

“Yeah, it might eat into that time,” he answered. “People have a limited amount of time, and I think we’ll see some of those players moving over to VR.

“Mobile VR is a different beast. Mobile gaming is something you do when you’re on the toilet. You might do VR on the toilet – I haven’t tried it.”


Will Ryzen’s High Yields Pay Off For AMD

June 2, 2017 by  
Filed under Computing

The dark satanic rumor mill has manufactured a hell on earth yarn which claims that AMD’s 14nm manufacturing process has a yield of 80 percent.

AMD is knocking out Ryzen CPUs so effectively that a decrease in ASPs is likely to follow

The information comes from the Italian website Bitsandchips, which has not named its source. However, the source told them that 80 percent of CPUs are produced with all eight cores fully functional.

The high yield of the Ryzen CPUs means that AMD can sell its EPYC (formerly known as Naples) datacentre processors (based on Ryzen) at a low cost.

Bitsandchips says that Dropbox is interested, quoting VP of infrastructure Akhil Gupta: “The combination of core performance, memory bandwidth, and I/O support make EPYC a unique offering. We look forward to continuing to evaluate EPYC as an option for our infrastructure.”

This, coupled with AMD’s extensive Wafer Supply Agreement – which was amended to act as insurance against foundry disaster – means the company is doing well as far as fabrication is concerned.

The magazine also said that Dropbox was evaluating AMD EPYC CPUs in-house, with Gupta adding that he was impressed with the initial performance the company has seen across workloads in single-socket configurations.


Tablet Shipments Continue Their Downward Spiral

May 30, 2017 by  
Filed under Around The Net

The death of the “game changing” keyboard-less netbook continues as Digitimes Research predicts that only 35 million tablets will be shipped globally during the second quarter of 2017.

This is a 5.7 percent fall compared to last quarter and a 13.7 percent drop on the year.

Apple, which is responsible for the technology fad, will account for only about a quarter of the world’s tablets, while tablets launched or to be launched by other international vendors will account for 47.3 percent and white-box units for 27.3 percent.

Following Apple is Samsung with 15.1 percent, Amazon with 7.4 percent, Huawei Technologies with 6.6 percent, Lenovo with 6.6 percent, Asustek Computer with 3.3 percent, TCL with 1.7 percent, Microsoft with 1.5 percent and Acer with 1.3 percent.

Taiwan-based ODMs and OEMs will together ship 11.5 million tablets in the quarter, slipping 3.4 percent on the quarter and 16.1 percent on the year.

The real winner from tablet sales is Foxconn Electronics which makes 73.1 percent of all tablets while Compal makes 11.4 percent.
