Are Light Powered Transistors On The Horizon?

February 5, 2016 by Michael  
Filed under Computing

A team of researchers has emerged from its smoke-filled labs claiming to have invented a transistor that runs on light rather than applied voltage.

According to Technology Review, researchers at the University of North Carolina at Charlotte say the new transistor lets electrons flow through it when light shines on it and switches itself off when it gets dark.

This means that devices can be made smaller than field-effect transistors, because they do not require doping in the same way and can be squeezed into smaller spaces. They are also faster.

Apparently the idea is not rocket science: it builds on the long-established fact that some materials are photoconductive.

What the team has done is create a device that uses a ribbon of cadmium and selenium a couple of atoms thick. It can conduct more than a million times more current when on than when off, which is about the same on/off ratio as a regular transistor.

Of course, it is years away from being a product. The team still has not worked out how to deliver light to each transistor, or whether doing so will cost more power.

Courtesy-Fud

 

Will VR Headsets Work With Your PC?

January 5, 2016 by Michael  
Filed under Computing

Virtual reality (VR) will not be supported on most consumer computers as the technology booms and manufacturers prepare to introduce it on a consumer level this year, Nvidia has warned.

Jason Paul, the firm’s general manager of Shield, gaming and VR, told VentureBeat that running VR requires graphics processing roughly seven times more powerful than a standard PC game needs, and that only about 13 million PCs on the market will be powerful enough by next year, when the first major PC-based VR headsets ship.

However, Nvidia said that this number could be extended to 25 million if VR game makers use Nvidia’s GameWorks VR software (of course), which is said to make VR processing more efficient.

GameWorks VR is aimed at games and applications developers, and includes a feature called VR SLI, which increases performance for VR applications by allowing individual GPUs to be assigned to a specific eye, dramatically accelerating stereo rendering.

The software also delivers specific features for VR headset developers, including Context Priority, which provides control over GPU scheduling to support advanced VR features such as asynchronous time warp. This cuts latency and quickly adjusts images as gamers move their heads, without the need to re-render a new frame.

There’s also a feature in the SDK called Direct Mode, which treats VR headsets as head-mounted displays accessible only to VR applications, rather than a typical Windows monitor, providing better plug-and-play support and compatibility for VR headsets.

Nvidia said that GameWorks VR is already being integrated into leading game engines, such as those from Epic Games, which has announced support for GameWorks VR features in an upcoming version of the popular Unreal Engine 4.3. However, considering Paul’s comments, it mustn’t be getting implemented as much as the firm would like.

VR is becoming increasingly prevalent as device manufacturers try to offer enhanced experiences, especially in gaming. Oculus has been showing off what it can do for some time, and it seems its official debut is not too far away. But it was Oculus that seemed to kick-start this upward trend and, since it hit the headlines, we’ve seen a number of big technology companies giving it a go, especially smartphone makers.

The HTC Vive is one, for example. But, like Oculus, the headset is still in the initial rollout phase and not yet on sale commercially, requiring any developers wanting to have a pop at writing code for it to enter a selection process for distribution, which began only this summer.

Sony, another smartphone maker, has also dipped its toe into the world of VR via Project Morpheus, a headset that, like HTC’s Vive, looks to enhance gaming experiences, but specifically as an accessory for the PlayStation 4. We assume it won’t come with the concerns Nvidia has raised, as it should work with the console right out of the box.

Courtesy-TheInq

 

Epic Looks Into AMD Issues

December 22, 2015 by Michael  
Filed under Gaming

Epic Games said it is investigating an issue with Unreal Engine 4 and AMD CPUs.

The problem appears in Squad, which is the first big, publicly available game using Epic Games’ Unreal Engine 4. The game has only just gone up on Steam, so complaints about the AMD performance have been somewhat vocal.

The engine appears to perform poorly on AMD CPUs because of an audio component of the engine. The issue has been reported before, but no one took it that seriously. In fact, some of the issues here seem to stem from a communication problem between Squad and Epic.

Squad developer Offworld Industries told TweakTown that there was little it could do about this besides wait for Epic to fix it and release the fix in an engine patch.

However, Epic’s senior marketing manager Dana Cowley said she didn’t even know about the problem until she was contacted by the media.

She said she was getting on the blower with the Squad team to investigate and see how Epic could help.

There is a workaround being suggested on the blogs which might help. Navigate to C:\Users\<your username>\AppData\Local\Squad\Saved\Config\WindowsNoEditor, back up the Engine.ini file, then open it with Notepad, find the [Audio] section, change MaxChannels from 128 to 96, 64, or 32, and save.
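For anyone who prefers to script it, here is a minimal sketch that applies the same tweak automatically. It is illustrative only: the config path is an assumption based on the workaround above, and the script simply rewrites any MaxChannels=128 line it finds.

```python
# Illustrative sketch of the manual Engine.ini tweak described above.
# The config path is an assumption based on the workaround; a backup is kept first.
import os
import re
import shutil

ini_path = os.path.expandvars(r"%LOCALAPPDATA%\Squad\Saved\Config\WindowsNoEditor\Engine.ini")

shutil.copy2(ini_path, ini_path + ".bak")  # back up Engine.ini, as the workaround advises

with open(ini_path, "r", encoding="utf-8") as f:
    text = f.read()

# Drop MaxChannels from 128 to 64 (96 and 32 are the other suggested values).
new_text, count = re.subn(r"(?im)^(MaxChannels\s*=\s*)128\s*$", r"\g<1>64", text)

if count:
    with open(ini_path, "w", encoding="utf-8") as f:
        f.write(new_text)
else:
    print("MaxChannels=128 not found; edit the [Audio] section of Engine.ini by hand instead.")
```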

Courtesy-Fud

 

Is The Steam OS Really Good?

November 19, 2015 by Michael  
Filed under Computing

Benchmarks for Valve’s Steam machines are out, and it does not look like the Linux-powered OS is stacking up well against Windows.

According to Ars Technica, SteamOS gaming comes with a significant performance hit on a number of benchmarks.

The OS was put through Geekbench 3, which has a Linux version. The site also used some mid-to-late-2014 releases that had SteamOS ports suitable for testing, including Middle-earth: Shadow of Mordor and Metro: Last Light Redux.

Both were intensive 3D games with built-in benchmarking tools and a variety of quality sliders to play with (including six handy presets in Shadow of Mordor’s case).

On SteamOS both games took a sizable frame rate hit. We are talking about 21 to 58 percent fewer frames per second, depending on the graphical settings. On Ars Technica’s test hardware, running Shadow of Mordor at Ultra settings and HD resolution, the OS change alone was the difference between a playable 34.5 fps average on Windows and a 14.6 fps mess on SteamOS.
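As a quick sanity check, that Shadow of Mordor result sits right at the worst end of the quoted 21 to 58 percent range. A minimal calculation using only the figures above:

```python
# Sanity check using only the frame rates quoted in the article.
windows_fps = 34.5
steamos_fps = 14.6

drop = (windows_fps - steamos_fps) / windows_fps
print(f"SteamOS delivers {drop:.0%} fewer frames per second")  # prints ~58%, the top of the quoted range
```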

You would think that Valve’s own games wouldn’t have this problem, but Portal, Team Fortress 2, and DOTA 2 all took massive frame rate dips on SteamOS compared to their Windows counterparts.

Left 4 Dead 2 showed comparable performance between the two operating systems, but nothing like the advantage Valve suggested it would have a couple of years ago.

Courtesy-Fud

 

Samsung Sells LCD Business

November 19, 2015 by Michael  
Filed under Consumer Electronics

Samsung has sold a large LCD display operation in order to concentrate full time on OLED-based products.

A report in Business Korea says that the facility in Cheonan, South Chungcheong Province, has shut down its L5 line, the fifth generation of LCD displays, and begun selling the equipment to other manufacturers.

The age of the equipment meant it was only suitable for notebook and small monitor displays. With OLED now rolling out in phones such as the recent Samsung Galaxy S6 Edge, and big-screen TVs, it seems that the company has decided to make a break with the past.

The Korean manufacturer sold off its fourth generation production line to a Chinese company last year. A spokesman for Samsung Display confirmed: “The company shut down the L5 line last month and is seeking companies that are willing to acquire idle equipment.”

Although the equipment and the products it produces may seem outdated, there is still a huge market for this stuff in lower-end electronics. Some analysts believe that there are tens of billions of Korean won in any sale. Ten billion won is about £5.6m, which doesn’t sound nearly as much but is still better than a poke in the eye.

The Cheonan factory is likely to be converted to make OLED products, with talk of deals for AMOLED phone displays for Huawei and even an acceleration of its on-again-off-again Ernie and Bert relationship with Apple said to be at the heart of the decision to ramp up production.

Samsung still operates three LCD production lines, but analysts question if this is the beginning of a move to OLED production only, and if so, what effect that will have on the company as demand for cheaper LCD screens continues to grow, with production ramping up in China.

Samsung has lost market share in the end user market with recent Galaxy products failing to sell as well as their predecessors. As such these component deals are the lifeblood of the business, with a contract to produce high-end screens for Apple alone worth billions.

Courtesy-TheInq

 

Oracle Goes Elastic To Take On Amazon’s AWS

November 3, 2015 by Michael  
Filed under Computing

Oracle has launched a direct rival to the Amazon Web Services (AWS) public cloud with its own Elastic Compute Cloud.

The product was revealed amid a flurry of cloud-related product announcements, including five in the infrastructure-as-a-service (IaaS) space, at the OpenWorld show in San Francisco on Tuesday.

Oracle Elastic Compute Cloud adds to the Dedicated Compute service the firm launched last year. The latest service lets customers make use of elastic compute capabilities to run any workload in a shared cloud compute zone, a basic public cloud offering.

“Last year we had dedicated compute. You get a rack, it’s elastic but it’s dedicated to your needs,” said Thomas Kurian, president of Oracle Product Development.

“We’ve now added in Elastic Compute, so you can just buy a certain number of cores and it runs four different operating systems: Oracle Linux, Red Hat, Ubuntu or Windows, and elastically scale that up and down.”

Oracle has yet to release pricing details for the Elastic Compute Cloud service, but chairman and CTO Larry Ellison said on Sunday that it will be priced at or below the equivalent AWS offering. For the dedicated model, Ellison revealed on Tuesday at OpenWorld that firms will pay half as much for Oracle Dedicated Compute as for the equivalent AWS shared compute option.

It is not surprising that Oracle would like the opportunity to have a piece of the public cloud pie. AWS earned its owner $2.08bn in revenue in the quarter ending 30 September.

Kurian shared current use details for the Oracle Cloud as evidence of the success it has seen so far. The firm manages 1,000PB of cloud storage, and in September alone processed 34 billion transactions on its cloud. This was a result of the 35,000 companies signed up to the Oracle Cloud, which between them account for 30 million users logging in actively each day.

However, Oracle’s chances of knocking Amazon off its cloud-leader perch, or even making a slight dent in its share, seem low. The AWS revenue was only made possible by the fact that Amazon owns 30 percent of the cloud infrastructure service market, with second and third-ranked Microsoft and IBM lagging behind at 10 and seven percent respectively.

Google and Salesforce have managed to capture less than five percent each. Indeed, realising how competitive the market is and Amazon’s dominant position, HP has just left the public cloud market.

Despite Oracle going head to head with AWS in the public cloud space, Amazon has been attempting to attract Oracle customers to its own platform.

“AWS and Oracle are working together to offer enterprises a number of solutions for migrating and deploying their enterprise applications on the AWS cloud. Customers can launch entire enterprise software stacks from Oracle on the AWS cloud, and they can build enterprise-grade Oracle applications using database and middleware software from Oracle,” the web giant notes on its site.

Amazon describes EC2 as letting users “increase or decrease capacity within minutes, not hours or days. You can commission one, hundreds or even thousands of server instances simultaneously”, making Oracle Elastic Compute Cloud a direct competitor.

Oracle has also added a hierarchical storage option for its archive storage cloud service, aimed at automatically moving data that requires long-term retention such as corporate records, scientific archives and cultural preservation content.

Ellison noted that this archiving service is priced at a 10th of the cost of Amazon’s S3 offering.

Kurian explained of the archive system: “I’ve got data I need to put into the cloud but I don’t need a recovery time objective. So you get it very, very cheap”, adding that it costs $1/TB per month.

The firm also launched what Kurian dubbed its “lorry service” for bulk data transfer. This will see Oracle ship a storage appliance to a customer’s site, where they can then do a huge data transfer directly onto that machine at a much quicker rate than streaming it to the cloud. The appliance is then sent back to Oracle via DHL or FedEx, Kurian explained, for Oracle to then complete the transfer to the cloud at its own site for storage.

“This is much faster if you’re moving a huge amount of data. One company is moving 250PB of data. To stream that amount of data to the cloud would take a very long time,” he said.
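A rough back-of-the-envelope calculation shows why shipping an appliance wins at that scale. Only the 250PB figure comes from the article; the sustained 1Gbps link speed is an assumption for illustration.

```python
# Rough illustration of why bulk shipping beats streaming at this scale.
# 250PB is the figure quoted above; the 1 Gbit/s sustained link is an assumption.
data_bytes = 250 * 10**15        # 250 PB (decimal petabytes)
link_bits_per_sec = 10**9        # 1 Gbit/s sustained upload

seconds = data_bytes * 8 / link_bits_per_sec
years = seconds / (60 * 60 * 24 * 365)
print(f"Streaming 250PB over a 1Gbps link would take roughly {years:.0f} years")  # ~63 years
```

Even at a sustained 10Gbps the same transfer would still take more than six years, which is why loading an appliance and sending it by courier is the faster option.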

Bulk data transfer will be available from November, while the archive service is available now.

“You can go up to shop.oracle.com as a customer, enter a credit card and you can buy the service, all the PaaS services and the storage service. We’re adding compute over the next couple of weeks,” Kurian explained.

“You pay for it by credit card or an invoice if you’re a corporate customer and pay for it by hour or month, by processor or by per gigabyte per hour or month for storage.”

Oracle Container Cloud, meanwhile, lets firms run apps in Docker containers and deploy them in the Oracle Compute Cloud, supporting better automation of app implementations using technologies like Kubernetes.

Oracle also launched additional applications that sit in its cloud, including the Data Visualisation Cloud Service. This makes visual analytics accessible to general business users who do not have access to Hadoop systems or the data warehouse.

“All you need is a spreadsheet to load your data and a browser to do the analysis,” Kurian explained.

Several new big data cloud services are also aimed at letting users more easily prepare and analyse data using Hadoop as the data store, for example Big Data Preparation and Big Data Discovery.

“With Big Data Preparation you can move data into your data lake, you can enrich the data, prepare it, do data wrangling, cleanse it and store it in the data lake. Big Data Discovery lets a business user sit in front of Hadoop, and through a browser-based dashboarding environment search the environment, discover patterns in the data, do analysis and curate subsets of the data for other teams to look at. It’s an analytic environment and complete Hadoop stack,” Kurian said.

Courtesy-TheInq

 

Will 2016 Be The Year For Virtual Reality Games?

October 15, 2015 by Michael  
Filed under Gaming

As the end of 2015 rapidly approaches (seriously, how on earth is it October already?), the picture of what we can expect from VR in 2016 is starting to look a little less fuzzy around the edges. There’s no question that next year is the Year of VR, at least in terms of mindshare. Right now it looks like no fewer than three consumer VR systems will be on the market during calendar 2016 – Oculus Rift, PlayStation VR and Valve / HTC Vive. They join Samsung’s already released Gear VR headset, although that device has hardly set the world on fire; it’s underwhelming at best and in truth, VR enthusiasts are all really waiting for one of the big three that will arrive next year.

Those fuzzy edges, though; they’re a concern, and as they come into sharper focus we’re starting to finally understand what the first year of VR is going to look like. In the past week or so, we’ve learned more about pricing for the devices – and for Microsoft’s approach, the similar but intriguingly different Hololens – and the aspect that’s brought into focus is simple; VR is going to be expensive. It’s going to be expensive enough to be very strictly limited to early adopters with a ton of disposable income. It’s quite likely going to be expensive enough that the market for software is going to struggle for the first couple of years at least, and that’s a worry.

Oculus Rift, we’ve learned, will cost “at least” $350. That’s just for the headset; you’ll also need a spectacularly powerful PC to play games in VR. No laptop will suffice, and you’re certainly out of luck with a Mac; even for many enthusiasts, the prospect of adding a major PC purchase or upgrade to a $350 headset is a hefty outlay for an early glimpse of the future. It’s likely (though as yet entirely unconfirmed) that Valve’s Vive headset will have a similar price tag and a similarly demanding minimum PC specification. The cheap end of the bunch is likely to be PlayStation VR – not because the headset will be cheap (Sony has confirmed that it is pricing it as a “platform” rather than a peripheral, suggesting a $300 or so price tag) but because the system you attach it to is a $350 PS4 rather than a much more expensive PC.

It is unreasonable, of course, to suggest that this means that people will be expected to pay upwards of $600 for Sony’s solution, or $1500 for the PC based solution. A great many people already own PS4s; quite a few own PCs capable of playing VR titles. For these people, the headset alone (and perhaps some software) is the cost of entry. That is still a pretty steep cost – enough to dissuade people with casual interest, certainly – but it’s tolerable for early adopters. The large installed base of PS4s, in particular, makes Sony’s offering interesting and could result in a market for PlayStation VR ramping up significantly faster than pessimistic forecasts suggest. On the PC side, things are a little more worrying – there’s the prospect of a standards war between Valve and Oculus, which won’t be good for consumers, and a question mark over how many enthusiasts actually own a PC powerful enough to run a VR headset reliably, though of course, the cost of PCs that can run VR will fall between now and the 2016 launch.

All the same, the crux of the matter remains that VR is going to be expensive enough – even the headsets alone – to make it into an early-adopter only market during its first year or so. It’s not just the cost, of course; the very nature of VR is going to make it into a slightly tough sell for anyone who isn’t a devoted enthusiast, and more than almost any other type of device, I think VR is going to need a pretty big public campaign to convince people to try it out and accept the concept. It’s one thing to wax lyrical about holodecks and sci-fi dreams; it’s quite another to actually get people to buy into the notion of donning a bulky headset that blocks you off from the world around you in the most anti-social way imaginable. If you’re reading a site like GamesIndustry.biz, you almost certainly get that concept innately; you may also be underestimating just how unattractive and even creepy it will seem to a large swathe of the population, and even to some of the gamer and enthusiast market VR hopes (needs!) to capture.

The multi, multi million dollar question remains, as it has been for some time – what about software? Again, Sony has something of an advantage in this area as it possesses very well regarded internal studios, superb developer relations and deep pockets; combined with its price and market penetration advantages, these ought to more than compensate for the difference in power between the PS4 and the PCs being used to power Rift and Vive, assuming (and it’s a big assumption) that the PS4’s solution actually works reliably and consistently with real games despite its lack of horsepower. The PC firms, on the other hand, need to rely on the excitement, goodwill and belief of developers and publishers to provide great games for VR in its early days. A handful of teams have devoted themselves to VR already and will no doubt do great things, but it’s a matter of some concern that a lot of industry people you talk to about PC VR today are still talking in terms of converting their existing titles to simply work in 3D VR; that will look cool, no doubt, but a conversion lacking the attention to controls, movement and interaction that’s required to make a VR world work will cause issues like motion sickness and straight-up disappointment to rear their ugly heads.

If VR is going to be priced as a system, not just a toy or a peripheral, then it needs to have software that people really, really want. Thus far, what we’ve seen are demos or half-hearted updates of old games. Even as we get close enough to consumer launches for real talk about pricing to begin, VR is still being sold off the back of science fiction dreams and long-held technological longings, not real games, real experiences, real-life usability. That desperately needs to change in the coming months.

At least Hololens, which this week revealed an eye-watering $3000 developer kit to ship early next year, has something of a roadmap in this regard; the device will no doubt be backed up by Microsoft’s own studios (an advantage it shares, perhaps to a lesser degree, with Sony) but more importantly, it’s a device not aimed solely at games, one which will in theory be able to build up a head of steam from sales to enterprise and research customers prior to making a splash in consumer markets with a more mature, less expensive proposition. I can’t help wondering why VR isn’t going down this road; why the headlong rush to get a consumer device on the market isn’t being tempered at least a little by a drive to use the obvious enterprise potential of VR to get the devices out into the wild, mature, established and affordable before pushing them towards consumers. I totally understand the enthusiasm that drives this; I just don’t entirely buy the business case.

At the very least, one would hope that if 2016 is the year of VR, it’s also the year in which we start to actually see VR in real-life applications beyond the gaming dens of monied enthusiasts. It’s a technology that’s perfectly suited to out-of-home situations; the architect who wants to give clients a walkthrough of a new building design; the museum that wants to show how a city looked in the past; the gaming arcade or entertainment venue that wants to give people an experience that most of them simply can’t have at home on their consoles. VR is something that a great many consumers will want to have access to given the right software, the right price point and crucially, the right experience and understanding of its potential. Getting the equipment into the hands of consumers at Tokyo Games Show or EGX is a start, but only a first step. If VR’s going to be a big part of the industry’s future, then come next year, VR needs to be everywhere; it needs to be unavoidable. It can’t keep running on dreams; virtual reality needs to take a step into reality.

 

Courtesy-GI.biz

Is Sony Dropping Morpheus?

September 21, 2015 by Michael  
Filed under Gaming

Sony has pulled back the curtains on its virtual reality headset, giving it an official introduction to the wild and a real-life name.

That name is PlayStation VR, which is an obvious but uninspired choice. The name that the unit had earlier, Morpheus, which was probably a nod towards starts-great-but-ends-badly film series The Matrix, had a bit more glamour about it.

The firm has shown off the hardware to the local journalistic crowd at the Tokyo Game Show, and provided the general press with information, details and specifications.

PlayStation VR was first discussed in March 2014 when it had the cooler name. Since then the firm has been hard at work getting something ready to announce and sell, according to a post on the PlayStation blog.

A game show in Tokyo would seem the most likely place for such an announcement.

Sony said that the system is “unique”, apparently because of a special sound system, and makes the most of the Sony PS4 and its camera. The firm is expecting the device to have a big impact on PlayStation gamers and gaming.

“The name PlayStation VR not only directly expresses an entirely new experience from PlayStation that allows players to feel as if they are physically inside the virtual world of a game, but reflects our hopes that we want our users to feel a sense of familiarity as they enjoy this amazing experience,” said Masayasu Ito, EVP and division president of PlayStation product business.

“We will continue to refine the hardware from various aspects, while working alongside third-party developers and publishers and SCE Worldwide Studios, in order to bring content that delivers exciting experiences only made possible with VR.”

Specifications are available, but they relate to a prototype and are subject to change. Sony said that the system has a 100-degree field of view, a 5.7in OLED display, a 120Hz refresh rate, and a panel resolution of 960×RGB×1080 per eye.

This will not put it at the high end of the market, as the field of view is only 10 degrees greater than with Google Cardboard, and 10 degrees under that of Oculus Rift. Some rivals go as wide as 210 degrees.

And no, no release date or price has been mentioned. We predict that these will be 2016 and expensive.

Courtesy-TheInq

Console Software Sales Strong And Growing

August 13, 2015 by Michael  
Filed under Gaming

As the 7th console generation was coming to an end several years ago, there was much pessimism regarding the impending launch of the 8th generation. Just as 7th generation software sales were starting to lag, mobile gaming exploded, and PC gaming experienced a renaissance. It was easy to think that the console players were going to be going elsewhere to find their gaming entertainment by the time the new consoles hit the scene. However, the 8th generation consoles have had a successful launch. In fact, the Sony and Microsoft consoles are as successful as ever.

A comparison of year-over-year console software sales suggests that the 8th generation is performing better than the 7th generation – provided you exclude the Nintendo consoles. The following graph shows physical and digital software sales for years 1 through 3 of each generation for the Xbox and PlayStation platforms.

The annual numbers take into account the staggered launch cycle, so year 1 comprises different sales years for Xbox 360 and PS3. The data shows that the Sony and Microsoft platforms have outperformed their 7th generation counterparts, especially in the first two years of the cycle. The 8th generation outperforms the 7th generation even in an analysis that excludes DLC, which now accounts for an additional 5-10 percent of software sales.

However, the picture is far different if we include the Nintendo platforms. The graph below shows the same data, but now includes the Wii and Wii U in their respective launch years.

The data shows how much the “Wii bubble” contributed to the explosive growth in software sales in 2008, the year the Wii really took off as a family and party device. This data corroborates a broader theme EEDAR has seen across our research – new, shortened gaming experiences that have added diversity to the market, especially mobile, have cannibalized the casual console market, not the core console market. People will find the best platform to play a specific experience, and for many types of experiences, that is still a sofa, controller, and 50 inch flat-screen TV.

The shift in consoles to core games is further exemplified by an analysis of sales by genre in the 7th vs. 8th generation. The graph below shows the percentage of sales by genre in 2007 versus 2014, ordered from more casual genres to more core genres. Casual genres like General Entertainment and Music over-indexed in 2007 while core genres like Action and Shooter over-indexed in 2014.

It has become trendy to call this console generation the last console generation. EEDAR believes one needs to be very specific when making these claims. While this might be the last generation with a disc delivery and a hard drive in your living room, EEDAR does not believe the living room, sit-down experience is going away any time soon.

Courtesy-GI.biz

Does Steam Have A Security Issue?

July 28, 2015 by Michael  
Filed under Gaming

A security problem with the Steam gaming on-demand system means that players and their personal details are at risk.

It is possible that one day we will report on which companies made it through the night without being hacked or without exposing their users.

For now, though, the opposite is the norm and today we are reporting about a problem with gaming system Steam that, you guessed it, has dangled the personal details of punters within the reach of ne’er-do-wells.

The news is not coming out of Steam, or parent Valve, directly, but it is running rampant across social networks and the gaming community. The problem, according to reports and videos, was a bad one and made the overtaking of user accounts rather a simple job.

No badass end-of-level boss to beat here, just a stage in the authentication process. A video posted online demonstrates the efforts required, while some reports – with access to Steam’s PR hot air machine – say that the problem is fixed.

A statement released to gaming almanac Kotaku finds the firm in apologetic clean-up mode.

Steam told the paper that some users would have their passwords reset, those being the ones who might have seen their log-in changed under suspicious circumstances, and that in general users should already be protected from the risks at hand.

“To protect users, we are resetting passwords on accounts with suspicious password changes during that period or may have otherwise been affected,” the firm said.

“Relevant users will receive an email with a new password. Once that email is received, it is recommended that users log-in to their account via the Steam client and set a new password.

“Please note that, while an account password was potentially modified during this period, the password itself was not revealed. Also, if Steam Guard was enabled, the account was protected from unauthorized log-ins even if the password was modified.”

The firm added its apologies to the community.

Courtesy-TheInq

 

Intel’s 2nd Quarter Profits Up

July 17, 2015 by Michael  
Filed under Computing

The maker of chips and bits, Intel, is doing better than the cocaine nose jobs of Wall Street expected.

Intel issued its quarterly results today and saw growth in its data centers and Internet-of-Things businesses offset weak demand for personal computers that use the company’s chips.

Intel said that it was expanding its line-up of higher-margin chips used in data centers to counter slowing demand from the PC industry. Its cunning plan to buy Altera for $16.7 billion in April was all about trying to do this.

Revenue from the data centres grew 9.7 percent to $3.85 billion in the second quarter from a year earlier, helped by cloud services companies and demand for data analytics.

Chief Financial Officer Stacy Smith predicted robust growth rates for the data center group, the Internet of Things group and the NAND businesses.

Revenue from the PC business, which is still Intel’s largest, fell 13.5 percent to $7.54 billion in the quarter ended June 27.

However there was more doom about the PC market which Smith said was going to be weaker than previously expected.

Research firm Gartner thinks global PC shipments will fall 4.5 percent to 300 million units in 2015, and life is going to be pretty pants until 2016.

Intel forecast current-quarter revenue of $14.3 billion, plus or minus $500 million, while Wall Street had predicted revenue of $14.08 billion.

The company’s net income fell to $2.71 billion from $2.80 billion a year earlier.

Net revenue fell 4.6 percent to $13.19 billion, but edged past the average analyst estimate of $13.04 billion. Intel’s stock has fallen about 18 percent this year.

Courtesy-Fud

Can Consoles Ever Crack The Chinese Market?

July 14, 2015 by Michael  
Filed under Gaming

The launch of Sony’s PS4 and Microsoft’s Xbox One consoles in China hasn’t attracted much fanfare, perhaps because both firms were aware from the outset of what an uphill struggle this would be, and how much potential for disappointment there was if expectations were set too high. Last week saw the first stab at estimating figures, from market intelligence firm Niko Partners, who reckon that the two platforms combined will sell a little over half a million units this year; not bad, but a tiny drop in the ocean that is China’s market for videogames.

These are not confirmed sales figures, it’s important to note; market intelligence firms essentially make educated guesses, and some of those guesses are a damn sight more educated than others, so treating anything they publish as hard data is inadvisable. Nonetheless, the basic conclusion of Niko Partners’ report is straightforward and seems to have invited no argument; the newly launched game consoles are making little impact on the Chinese market.

There are lots of reasons why this is happening. For a start, far from being starved of a much desired product, the limited pre-existing market for game consoles in China is actually somewhat saturated; the country is host to a thriving grey import market for systems from Hong Kong, Taiwan and Japan. This market hasn’t gone away with the official launch of the consoles, not least because the software made officially available in China is extremely limited. Anyone interested in console gaming will be importing games on the grey market anyway, which makes it more likely that they’ll acquire their console through the same means.

Moreover, there’s a big cultural difference to overcome. Game consoles are actually a pretty tough sell, especially to families, in countries where they’re not already well-established. Their continued strength in western markets is largely down to the present generation of parents being accustomed to game consoles in the home; cast your mind back to the 1980s and 1990s in those markets, though, and you may recall that rather a lot of parents were suspicious of game consoles not just because of tabloid fury over violent content, but because these machines were essentially computers shorn of all “educational” value. I didn’t own a console until I bought a PlayStation, because my parents – otherwise very keen for us to use and learn about computers, resulting in a parade of devices marching through the house, starting from the Amstrad CPC and ending up with a Gateway 2000 PC in which I surreptitiously installed a Voodoo 3D graphics board – wouldn’t countenance having a SNES in the house. That’s precisely the situation consoles in China now face with much of their target audience; a situation amplified even further by the extremely high-pressure nature of Chinese secondary education, which probably makes parents even more reluctant than mine when it comes to installing potentially time-sucking entertainment devices in their homes.

Besides; Chinese people, teens and adults alike, already play lots of games. PC games are enormously popular there; mobile games are absolutely huge. This isn’t virgin territory for videogames, it’s an extremely developed, high-value, complex market, and an expensive new piece of hardware needs to justify its existence in very compelling terms. Not least due to local content restrictions, neither PS4 nor Xbox One is doing that, nor are they particularly likely to do so in the future; the sheer amount of content and momentum that would be needed to make an impression upon such a mature landscape is likely to be beyond the scope of all but a truly herculean effort at local engagement and local development by either company – not just with games, but also with a unique local range of services and products beyond gaming – and neither is truly in a position to make that effort. It’s altogether more likely that both Sony and Microsoft will simply sell into China to satisfy pre-existing local demand as much as possible, without creating or fulfilling any expectations higher than that.

Is this important? Well, it’s important in so much as China is the largest marketplace in the world, with a fast-growing middle class whose appetite for luxury electronics is well-established. Apple makes increasingly large swathes of its revenue in China; companies with high-end gaming hardware would like to do something similar, were the barriers to success not raised so high. Without building a market in China, the global growth potential of the console business is fairly severely limited – the established rich nations in which consoles are presently successful have a pretty high rate of market penetration as it is, and growing sales there is only going to get tougher as birth-rates fall off (a major factor in Japan already, but most European and North American states are within spitting distance of the Japanese figures, which is worth bearing in mind next time someone shares some moronic clickbait about sexless Japan on your Facebook feed). So yes, the failure of consoles to engage strongly in China would be a big deal.

The deal looks even bigger, though, if you view China as something of a bellwether. It’s a unique country in many regards – regulations, media environment, culture, sheer scale – but in other regards, it’s on a developmental track that’s not so different from many other nations who are also seeing the rise of an increasingly monied urban middle class. If the primary difficulty in China is regulations and content restrictions, then perhaps Sony and Microsoft will find more luck in Brazil, in India, in Indonesia, in the Philippines and in the many other nations whose rapid development is creating larger and larger audiences with disposable income for entertainment. In that case, China may be the outlier, the one nation where special conditions deny consoles a chance at market success.

If the problem with China is more fundamental, though, it spells trouble on the road. If the issue is that developing nations are adopting other gaming platforms and systems long before consoles become viable for launch there, creating a huge degree of inertia which no console firm has the financial or cultural clout to overcome, then the chances are that consoles are never going to take root in any significant degree in the new middle class economies of the world. Games will be there, of course; mobile games, PC games, games on devices that haven’t even been invented yet (though honestly, Niko Partners’ tip of SmartTV games as a growth market is one that I simply can’t view from any angle that doesn’t demand instant incredulity; still, who knows?). Consoles, though, would then find themselves restricted geographically to the markets in which they already hold sway, which creates a really big limit on future growth.

That’s not the end of the world. The wealthy nations which consume consoles right now aren’t likely to go anywhere overnight, and the chances are that they’ll continue to sustain a console audience of many tens of millions – perhaps well over 100 million – for years if not decades to come. Moreover, the future of games is inevitably more fragmented than its present; different cultures, different contexts and different tastes will mean that it will be a truly rare game which is played and enjoyed to a large degree in all quadrants of the globe. There’ll still be a market for a game which “just” does great business in North America, Europe and so on; but it’ll be an increasingly small part of an ever-growing market, and its own potential for growth will be minimal. That, in the end, is a fairly hard cap on console development costs – you can’t spend vastly more money making something unless your audience either gets bigger, or more willing to pay, and there’s little evidence of either of those things in the console world right now.

The real figures from China, if and when they’re finally announced, will be interesting to see – but it’s unlikely that Niko Partners’ projections are terribly far from the truth. Whether any console company truly decides to put their weight behind a push in China, or in another developing country, over the coming years may be a deciding factor in the role consoles will play in the future of the industry as a whole.

Courtesy-GI.biz

Is Amazon Serious About PC Gaming?

June 9, 2015 by Michael  
Filed under Around The Net

Amazon has looked at the gaming market and decided that it is an area where it can make a pile of dosh.

So far its games have been restricted to mobile devices. But it looks like that’s about to change: Amazon Game Studios is currently hiring for what it describes as an “ambitious new PC game project using the latest technology.”

It looks like this will be Amazon’s first ever PC release. Amazon hired notable developers like Kim Swift, designer of Portal, as well as Clint Hocking, who previously worked on franchises like Far Cry and Splinter Cell.
It has spent a small fortune licensing CryEngine, the same engine used to make high-end PC games like Crysis 3, bought the game streaming service Twitch last August for $970 million, and made gaming a big focus of its Fire TV media box.

“We believe that games have just scratched the surface in their power to unite players,” the job posting reads, “and will produce some of the future’s most influential voices in media and art.”

Courtesy-Fud

Are Paid Mods On The Horizon For Gamers?

May 5, 2015 by Michael  
Filed under Gaming

Valve is no stranger to its ventures having a somewhat rocky start. Remember when the now-beloved Steam first appeared, all those years ago? Everyone absolutely loathed it; it only ever really got off the ground because you needed to install it if you wanted to play Half-Life 2. It’s hard now to imagine what the PC games market would look like if Valve hadn’t persisted with their idea; there was never any guarantee that a dominant digital distribution platform would appear, and it’s entirely plausible that a messy collection of publisher-owned storefronts would instead loom over the landscape, with the indie and small developer games that have so benefited from Steam’s independence being squeezed like grass between paving stones.

That isn’t to say that Valve always get things right; most of the criticisms leveled at Steam in those early days weren’t just Luddite complaints, but were indeed things that needed to be fixed before the system could go on to be a world-beater. Similarly, there have been huge problems that needed ironing out with Valve’s other large feature launches over the years, with Steam Greenlight being a good example of a fantastic idea that has needed (and still needs) a lot of tweaking before the balance between creators and consumers is effectively achieved.

You know where this is leading. Steam Workshop, the longstanding program allowing people to create mods (or other user-generated content) for games on Steam, opened up the possibility of charging for Skyrim mods earlier this month. It’s been a bit of a disaster, to the extent that Valve and Skyrim publisher Bethesda ended up shutting down the service after, as Gabe Newell succinctly phrased it, “pissing off the Internet”.

There were two major camps of those who complained about the paid mods system for Skyrim; those who objected to the botched implementation (there were cases of people who didn’t own the rights to mod content putting it up for sale, of daft pricing, and a questionable revenue model that awarded only 25% to the creators), and those who object in principle to the very concept of charging for mods. The latter argument, the more purist of the two, sees mods as a labour of love that should be shared freely with “the community”, and objects to the intrusion of commerce, of revenue shares and of “greedy” publishers and storefronts into this traditionally fan-dominated area. Those who support that point of view have, understandably, been celebrating the forced retreat of Valve and Bethesda.

Their celebrations will be short-lived. Valve’s retreat is a tactical move, not a strategic one; the intention absolutely remains to extend the commercial model across Steam Workshop generally. Valve acknowledges that the Skyrim modding community, which is pretty well established (you’ve been able to release Steam Workshop content for Skyrim since 2012), was the wrong place to roll out new commercial features – you can’t take a content creating community that’s been doing things for free for three years, suddenly introduce experimental and very rough payment systems, and not expect a hell of a backlash. The retreat from the Skyrim experiment was inevitable, with hindsight. With foresight, the adoption of paid mods more broadly is equally inevitable.

Why? Why must an area which has thrived for so long without being a commercial field suddenly start being about money? There are a few reasons for the inevitability of this change – and, indeed, for its desirability – but it’s worth saying from the outset that it’s pretty unlikely that the introduction of commercial models is going to impact upon the vast majority of mod content. The vast majority of mods will continue to be made and distributed for free, for the same reasons as previously; because the creator loves the game in question and wants to play around with its systems; because a budding developer wants a sandbox in which to learn and show off their skills to potential employers; because making things is fun. Most mods will remain small-scale and will, simply, not be of commercial value; a few creators will chance their arm by sticking a price tag on such things, but the market will quickly dispose of such behaviour.

Some mods, though, are much more involved and in-depth; to realise their potential, they impact materially and financially upon the working and personal lives of their creators. For that small slice out of the top of the mod world, the introduction of commercial options will give creators the possibility of justifying their work and focus financially. It won’t make a difference at all to very many, but to the few talented creative people who will be impacted, the change to their lives could be immense.

This is, after all, not a new rule that’s being introduced, but an old, restrictive one that’s being lifted. Up until now, it’s effectively been impossible to make money from the majority of mods. They rely upon someone else’s commercial, copyrighted content; while not outright impossible technically, the task of building a mod that’s sufficiently unencumbered with stuff you don’t own for it to be sold legally is daunting at best. As such, the rule up until now has been – you have to give away your mod for free. The rule that we’ll gradually see introduced over the coming years will be – you can still give away your mod for free, but if it’s good enough to be paid for, you can put a price tag on it and split the revenue with the creator of the game.

That’s not a bad deal. The percentages certainly need tweaking; I’ve seen some not unreasonable defences of the 25% share which Bethesda offered to mod creators, but with 30% being the standard share taken by stores and other “involved but not active” parties in digital distribution deals, I expect that something like 30% for Steam, 30% for the publisher and 40% for the mod creator will end up being the standard. Price points will need to be thrashed out, and the market will undoubtedly be brutal to those who overstep the mark. There’s a deeply thorny discussion about the role of F2P to be had somewhere down the line. Overall, though, it’s a reasonable and helpful freedom to introduce to the market.

It’s also one which PC game developers are thirsting for. Supporting mod communities is something they’ve always done, on the understanding that a healthy mod scene supports sales of the game itself and that this should be reward enough. By and large, this will remain the rationale; but the market is changing, and the rising development costs of the sort of big, AAA games that attract modding communities are no longer being matched by the swelling of the audience. Margins are being squeezed and new revenue streams are essential if AAA games are going to continue to be sustainable. It won’t solve the problems by itself, or overnight; but for some games, creating a healthy after-market in user-generated content, with the developer taking a slice off the top of the economy that develops, could be enough to secure the developer’s future.

Hence the inevitability. Developers need the possibility of an extra revenue stream (preferably without having to compromise the design of their games). A small group of “elite” mod creators need the possibility of supporting themselves through their work, especially as the one-time goal of a studio job at a developer has lost its lustre as the Holy Grail of a modder’s work. The vast majority of gamers will be pretty happy to pay a little money to support the work of someone creating content they love, just as it’s transpired that most music, film and book fans are perfectly happy to pay a reasonable amount of money for content they love when they’re given flexible opportunities to do so.

Paid mods are coming, then; not to Skyrim and probably not to any other game that’s already got an established and thriving mod community, but certainly to future games with ambitions of being the next modding platform. Valve and its partners will have to learn fast to avoid “pissing off the Internet” again; but for those whose vehement arguments are based on the non-commercial “purity” of this corner of the gaming world, enjoy it while it lasts; the reprieve won this week is a temporary one.

Courtesy-GI.biz

Will The Gaming Industry Pass $90 Billion In Sales This Year?

April 27, 2015 by Michael  
Filed under Gaming

It’s going to be another big year for games, as Newzoo is projecting that 2015 will see global gaming revenues jump 9.4 percent year-over-year to $91.5 billion. The future looks bright as well, with the research firm’s upcoming Global Games Market Report projecting worldwide revenues to reach $107 billion in 2017.
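Taken together, those two projections imply compound growth of roughly eight percent a year between 2015 and 2017. A quick check, using only the figures quoted above:

```python
# Implied compound annual growth rate from the Newzoo projections quoted above.
revenue_2015 = 91.5   # $bn, projected global games revenue for 2015
revenue_2017 = 107.0  # $bn, projected global games revenue for 2017

cagr = (revenue_2017 / revenue_2015) ** (1 / 2) - 1   # two years of growth
print(f"Implied growth of about {cagr:.1%} per year from 2015 to 2017")  # ~8.1% per year
```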

As the overall market grows, the distribution of where that money is coming from will also shift. Newzoo’s projections for this year have a surging Chinese market narrowly overtaking the US as the single biggest revenue contributor, bringing in $22.2 billion (up 23 percent) compared to the American market’s $22 billion (up 3 percent). As far as regions go, Asia-Pacific is far and away the largest source of gaming revenue, accounting for $43.1 billion (up 15 percent). Latin America is the smallest of the four major markets with just $4 billion in revenues, but it is also growing the quickest, up 18 percent year-over-year.

The platforms on which people spend money gaming are also in flux. Tablet revenues are expected to be up 27 percent year-over-year to $9.4 billion, with smartphone and watch revenues jumping 21 percent to $20.6 billion. However, PCs are the most popular platform for games, bringing in $27.1 billion (up 8 percent) from standard titles and MMOs, while casual webgames will draw an additional $6.6 billion (up 2 percent). Newzoo grouped TV, consoles, and VR devices into their own category, projecting them to bring in $25.1 billion (up 2 percent) in game revenues. The only market segment not seeing growth at the moment is the dedicated handheld, which Newzoo expects to bring in $2.7 billion in revenue this year (down 16 percent).

While the firm’s grouping of VR and smartwatch revenues in other categories may be unusual, it said both segments are too small to report for now.

“Short- to medium-term VR revenues will be limited and largely cannibalize on current console and PC game spending as a share of game enthusiasts invest in the latest technology and richest experience that VR offers,” Newzoo said. “Smartwatches will be a success but not add significant ‘new’ revenues to the $20.6 billion spent on smartphones this year.”

Courtesy-GI.biz