
AMD Develops The Excavator Processor Specifically For Gamers

February 5, 2016 by Michael  
Filed under Computing

AMD has unveiled a handful of new processors as part of its 2016 desktop refresh, including the first chip based on the Excavator core to target desktop PCs. The firm will also release new motherboards with high-speed USB 3.1 ports and connectors to support M.2 SATA SSDs.

AMD’s new desktop processors are available now, and aimed chiefly at the enthusiast and gamer markets. They comprise three chips fitting into the firm’s FM2+ processor socket infrastructure for mainstream systems.

Two of these chips are based on the Godavari architecture and are APUs featuring Steamroller CPU cores and Graphics Core Next GPU cores. The A10-7860K has four CPU cores and eight GPU cores with a clock speed of 3.6GHz, while the A6-7470K has dual CPU cores and four GPU cores at a clock speed of 3.7GHz. Both have a maximum Turbo speed of 4GHz.

The A10-7860K is not AMD’s top-end chip, coming in below the A10-7870K and the A10-7890K, but it does replace three existing chips in the A10 line-up, the A10-7850K, A10-7700K and A10-7800.

“The interesting thing about the A10-7860K is that it delivers the same high 4GHz Turbo speed, but it is a 65W part, so it delivers comparable performance to the A10-7850K, but we’re dropping 30W,” said AMD client product manager Don Woligroski.

 

The third chip is badged under AMD’s Athlon brand, as it has CPU cores only and does not qualify as an APU. The Athlon X4 845 features four of the new Excavator cores used in the mobile Carrizo platform, clocked at 3.5GHz with a Turbo speed of up to 3.8GHz.

The Athlon X4 845 is not at the top of the Athlon stack either, but it is “more of an efficient, really great low-cost part”, according to Woligroski.

AMD will also deliver new motherboards to complement the latest processors sometime during the first quarter of 2016. These bring support for USB 3.1 Gen2 ports with the new Type-C connector, offering 10Gbps data rates, plus connectors for M.2 SATA SSD modules. M.2 modules are more usually seen in laptop and mobile systems because of their compact size.

Future AMD desktop chips will converge on a common socket infrastructure known as AM4, according to Woligroski. The first processors to use this are likely to be the upcoming Summit Ridge desktop chip and Bristol Ridge APU.

AMD also announced a new heatsink and fan combination for cooling the chips. The AMD Wraith Cooler is claimed to deliver 34 percent more airflow while generating less than a tenth of the noise of its predecessor, at 39dBA (a tenth of the sound power corresponds to roughly a 10dB reduction).

Courtesy-TheInq

 

Regardless Of The Rhetoric, Windows Tablets Doing OK

February 4, 2016 by Michael  
Filed under Computing

For years Microsoft held a torch for the tablet even while everyone else mocked them. When Apple turned the concept into a gimmick and everyone bought one, Microsoft was mocked for not really understanding the tablet.

Now it seems that Redmond is the only one making tablets that people want again, as the market slowly shrinks back toward where it was before Jobs claimed “his” invention was a “game changer.”

Strategy Analytics said that the final quarter of 2015 witnessed the worst year-on-year decline for the product category that it has ever seen.

The company’s ‘Preliminary Global Tablet Shipments and Market Share by Operating System: Q4 2015’ report estimates that tablet shipment numbers fell to 69.9 million units in Q4, a record drop of 11 per cent. Over the full year of 2015, shipments reached 224 million units, a drop of 8 per cent.

TrendForce estimated a bigger drop over the course of the full year, with a 12.2 per cent decline compared to 2014’s shipment numbers.

However, Strategy Analytics said that the only one to do well was Microsoft. Windows tablets witnessed growth of 59 per cent in Q4 compared to the previous year.

Part of this is because 2-in-1 PCs are doing well and are expected to do better. Strategy Analytics observed a huge 379 per cent leap in year-on-year growth in Q4 2015.

Eric Smith, Senior Analyst, Tablet & Touchscreen Strategies service at Strategy Analytics, said: “2-in-1 Detachable Tablets have reached an inflection point in 2015 as computing needs continue to trend more and more mobile and Tablets with Windows 10 can compete against iOS in the premium and high price bands and equally well against Android in the mid and lower price bands.

“The Q4 2015 launch of Surface Pro 4 and Surface Book was met with many ‘Surface clones’ by Microsoft’s OEM partners at lower price points. This variety of devices will bolster momentum of Windows Tablets going forward.”

Apple is still the top tablet vendor with a share of 23.1 per cent in Q4 of last year, but it fell heavily from 27.3 per cent the previous year. Cupertino’s shipment numbers dropped from 21.4 million units to 16.1 million units over the same period.

Samsung was in second place with a 12.9 per cent market share, down from 13.9 per cent the previous year. Lenovo saw slight growth in third place with an increase from 4.7 per cent to a 5.7 per cent share in Q4 2015, with Amazon slipping to fourth place, dropping from 4.9 per cent to 4.4 per cent.

Courtesy-Fud

 

Microsoft Giving Yammer To All Office 365 Subscribers

February 4, 2016 by mphillips  
Filed under Computing

Microsoft is ramping up its efforts to expand the reach of its Yammer work social network – and better compete with other workplace collaboration tools – announcing that any organization with an Office 365 subscription will gain access to the service and have it automatically activated.

The service will start rolling out to users in waves. The automatic activation will allow businesses to quickly spin up online communities for their workers.

Microsoft will also let users sign in to Yammer with the same username and password they use to access all of their other Office 365 apps and services. System administrators will, however, have the ability to prevent users from accessing Yammer.

The first Yammer rollout will target businesses with fewer than 150 licenses and that have an Office 365 subscription that includes Yammer.

Microsoft bought Yammer in 2012 for $1.2 billion. At the time, it was a high-flying technology startup in the hot enterprise social network space, although it hasn’t been taken up widely since. Microsoft said that more than 500,000 businesses are using it, up from 200,000 at the time of its acquisition.

Yammer faces increased competition in the workplace collaboration space. Rival Slack’s real-time chat capabilities have made it a popular choice, though that software doesn’t replicate the message board and information feed aspects of Yammer’s product. However, when Facebook for Work becomes publicly available — it’s in a closed beta test — that offering will more closely compete with Yammer’s core functionality.

 

Will MediaTek’s Helio Debut This Year?

February 4, 2016 by Michael  
Filed under Computing

MediaTek Senior Vice President and Chief Financial Officer David Ku has confirmed that the company plans to ship the X30 in 2016.

The X30 has been an ephemeral product for quite some time, although it had been expected that the X30 would eventually follow the X20. Ku expects that phones based on the Helio P10 and X20 should start to arrive this quarter.

The majority of the design wins for performance and mainstream phones in the first half of 2016 will go to last year’s flagship, the Helio X10, the upcoming Helio P10 and the soon-to-be flagship Helio X20.

Ku mentioned during the company’s fourth-quarter 2015 results that MediaTek will launch the Helio X30 in the second half of the year. This was the time of year when MediaTek launched the X20.

The X30 will be released in 2016 but the phones will only show up in 2017.

The normal design cycle of a phone usually lasts 12 to 18 months. The Helio P20, the company’s first 16nm SoC, is expected in the second half of 2016. With some luck, we might see some devices shipping with this new SoC before the end of the year.

MediaTek didn’t give any additional information about the Helio X30, other than to acknowledge its existence. Let’s first see how the Helio X20 and P10 do this year.

Courtesy-Fud

 

AMD Goes Virtual With GPUs

February 3, 2016 by Michael  
Filed under Computing

AMD has revealed what it claims are the world’s first hardware virtualized GPU products — AMD FirePro S-Series GPUs with Multiuser GPU (MxGPU) technology.

The big idea is to have a product for remote workstation, cloud gaming, cloud computing, and Virtual Desktop Infrastructure (VDI).

In the virtualization ecosystem, key components like the CPU, network controller and storage devices are being virtualized in hardware to deliver optimal user experiences. So far the GPU has been off the list.

AMD MxGPU technology, for the first time, brings the modern virtualization industry standard to the GPU hardware.

AMD MxGPU technology is based on SR-IOV (Single Root I/O Virtualization), a PCI Express standard, and brings hardware GPU scheduling logic to the user.
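
For the curious, SR-IOV works by letting one physical PCIe device advertise multiple “virtual functions” (VFs), each of which can be passed through to a different virtual machine as if it were its own card. On a Linux host the kernel exposes this through standard sysfs attributes; the following minimal Python sketch (generic SR-IOV, not specific to AMD’s MxGPU driver) lists how many VFs each capable device currently has enabled.

from pathlib import Path

def list_sriov_devices(pci_root: str = "/sys/bus/pci/devices") -> None:
    """Print each PCI device that advertises SR-IOV, with its VF counts."""
    for dev in sorted(Path(pci_root).iterdir()):
        total = dev / "sriov_totalvfs"   # standard kernel attribute on physical functions
        active = dev / "sriov_numvfs"    # how many virtual functions are currently enabled
        if total.exists():
            print(f"{dev.name}: {active.read_text().strip()} of "
                  f"{total.read_text().strip()} virtual functions enabled")

if __name__ == "__main__":
    list_sriov_devices()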

The outfit claims that it preserves the data integrity of virtual machines (VMs) and their application data through hardware-enforced memory isolation logic, preventing one VM from being able to access another VM’s data.

It also exposes all graphics functionality of the GPU to applications, allowing for full virtualization support not only for graphics APIs like DirectX and OpenGL but also for GPU compute APIs like OpenCL.

The new AMD FirePro S7150 and AMD FirePro S7150 x2 server graphics cards will combine with OEM offerings to create high-performance virtual workstations and address IT needs of simple installation and operation, critical data security and outstanding performance-per-dollar.

Typical VDI use cases include Computer-Aided Design (CAD), Media and Entertainment, and office applications powered by the industry’s first hardware-based virtualized GPU.

Sean Burke, corporate vice president and general manager of AMD’s Radeon Technologies Group, said that the AMD hardware virtualization GPU product line is another example of its commitment to offering customers exceptional cutting-edge graphics in conjunction with fundamental API software support.

“We created the innovative AMD FirePro S-series GPUs to deliver a precise, secure, high performance and enriched graphics user experience — all provided without per user licensing fees required to use AMD’s virtualized solution.”

Jon Peddie, president of Jon Peddie Research, said: “The move to virtualization of high-performance graphics capabilities typically associated with standalone workstations only makes sense, and will likely gain significant traction in the coming years.”

Pat Lee, senior director, Remote Experience for Desktop and Application Products, VMware, said that AMD FirePro S7150 and AMD FirePro S7150 x2 GPUs complement VMware Horizon by giving more users a richer, more compelling user experience. “Systems equipped with AMD FirePro cards can provide VMware Horizon users with enhanced video and graphics performance, benefiting especially those installations that focus on CAD and other 3D intensive applications.”

A single AMD FirePro S7150 GPU card, which features 8 GB of GDDR5 memory, can support up to 16 simultaneous users, while up to twice as many simultaneous users (32 in total) can be supported by a single AMD FirePro S7150 x2 card, which includes a total of 16 GB of GDDR5 memory (8GB per GPU); either way, that works out to roughly 512MB of dedicated graphics memory per user. Both models feature a 256-bit memory interface.

Based on AMD’s Graphics Core Next (GCN) architecture to optimize utilization and maximize performance, the AMD FirePro S7150 and S7150 x2 server GPUs feature:

• AMD Multiuser GPU (MxGPU) technology to enable consistent, predictable and secure performance from virtualized workstations. These are the world’s first hardware-based virtualized GPU products, giving users workstation-class experiences backed by full ISV certifications.

• GDDR5 GPU Memory to help accelerate applications and process computationally complex workflows with ease.

• Error Correcting Code (ECC) Memory to ensure the accuracy of computations by correcting any single or double bit error as a result of naturally occurring background radiation.

• OpenCL 2.0 support to help professionals tap into the parallel computing power of modern GPUs and multicore CPUs to accelerate compute-intensive tasks in leading CAD/CAM/CAE and Media & Entertainment applications that support OpenCL, allowing developers to take advantage of new GPU features.

• AMD PowerTune is an intelligent power management system that monitors both GPU activity and power draw. AMD PowerTune optimizes the GPU to deliver low power draw when GPU workloads do not demand full activity and delivers the optimal clock speed to ensure the highest possible performance within the GPU’s power budget for high intensity workloads.

AMD FirePro S7150 and S7150 x2 server GPUs are expected to be available from server technology providers in the first half of 2016.

The AMD FirePro S-Series GPUs with MxGPU technology are being exhibited in a Dell server system at SolidWorks World 2016 in Dallas, Texas at the moment.

Courtesy-Fud

 

Is Nvidia’s Pascal Finally Coming In April?

February 3, 2016 by Michael  
Filed under Computing

The dark satanic rumour mill has been flat out manufacturing hell on earth yarns that Nvidia is about to release a new Pascal GPU soon.

The logic is that Nvidia has the time to counter AMD’s Polaris by pushing out a Pascal GPU sooner than anyone expected.

Kotaku claims that Nvidia looks set to beat AMD’s Polaris architecture when the new GPU appears. In fact, it hinted that AMD brought down the price of the Radeon R9 Nano to $499 to counter this move in the high end of the market.

The latest rumor is that Nvidia will be churning out the Pascal architecture in all its GPUs from April. When the new GPUs arrive they will be marketed as “TITAN-grade”, which suggests they will replace the current offerings marketed under the “TITAN” brand. The main GP100 chip will reportedly come with 32GB of VRAM.

These rumors about the GPUs with the Pascal architecture are currently based on shipping manifests that have been spotted on the Zauba database in India, which tracks products imported to or exported from the country.

It is thought that Nvidia’s CEO Jen-Hsun Huang will unveil the Pascal GPU in April during the GPU Technology Conference. In fact, it is likely that Huang will announce it during his April 4 keynote on the conference’s first day.

Courtesy-Fud

 

Microsoft Cloud Email Winning The Enterprise

February 2, 2016 by Michael  
Filed under Computing

Enterprises of all sizes are willingly surrendering their emails to the cloud, according to the analysts at Gartner, and the bulk of them are relying on Microsoft to keep them up in the air and spinning.

The cloud, in case you missed it, is everywhere. Even your nan uploads her photos to the cloud. Cloud email services have been embraced by consumers, but have been welcomed more cautiously in the business world. Until now, that is, according to a new Gartner cloud and email report.

The leading firms in this area are Google and Microsoft. The latter seems to have the edge, perhaps because Microsoft solutions are as entrenched in business as tedious meetings. Google is getting its game together, however, thanks to a mix of improvement and marketing.

“Although it is still early days for cloud email adoption, Microsoft and Google have achieved significant traction among enterprises of different sizes, industries and geographies,” said Nikos Drakos, a research vice president at Gartner.

“Companies considering cloud email should question assumptions that public cloud email is not appropriate in their region, size or industry. Our findings suggest that many varied organisations are already using cloud email, and the number is growing rapidly.”

Party like it’s 1999, because Microsoft has the market locked down and Gartner reckons that it is well in use in industries where regulation is a strong consideration. Google is more obviously installed at more relaxed locations.

“Among public companies using cloud-based email, Microsoft is more popular with larger organisations and has more than an 80 per cent share of companies using cloud email with revenue above $10bn,” added Jeffrey Mann, research vice president at Gartner.

“Google’s popularity is better among smaller companies, approaching a 50 per cent share of companies with revenue less than $50m.”

Courtesy-TheInq

 

Microsoft Goes Deep With Underwater Data Center

February 2, 2016 by mphillips  
Filed under Around The Net

Technology giants are finding some of the strangest places for data centers these days.

Facebook, for example, built a data center in Lulea in Sweden because the icy cold temperatures there would help cut the energy required for cooling. A proposed Facebook data center in Clonee, Ireland, will rely heavily on locally available wind energy. Google’s data center in Hamina in Finland uses sea water from the Bay of Finland for cooling.

Now, Microsoft is looking at locating data centers under the sea.

The company is testing underwater data centers with an eye to reducing data latency for the many users who live close to the sea and also to enable rapid deployment of a data center.

Microsoft designed, built, and deployed its own subsea data center in the ocean in the period of about a year. It started working on the project in late 2014, a year after Microsoft employee Sean James, who served on a U.S. Navy submarine, submitted a paper on the concept.

A prototype vessel, named the Leona Philpot after an Xbox game character, operated on the seafloor about 1 kilometer from the Pacific coast of the U.S. from August to November 2015, according to a Microsoft page on the project.

The subsea data center experiment, called Project Natick after a town in Massachusetts, is in the research stage and Microsoft warns it is “still early days” to evaluate whether the concept could be adopted by the company and other cloud service providers.

“Project Natick reflects Microsoft’s ongoing quest for cloud datacenter solutions that offer rapid provisioning, lower costs, high responsiveness, and are more environmentally sustainable,” the company said.

Using undersea data centers helps because they can serve the 50 percent of people who live within 200 kilometers of the ocean. Microsoft said in an FAQ that deployment in deepwater offers “ready access to cooling, renewable power sources, and a controlled environment.” Moreover, a data center can be deployed from start to finish in 90 days.

MediaTek Goes LTE CAT 6 On Low End SoCs

January 29, 2016 by Michael  
Filed under Computing

MediaTek appears to be ready to give three more entry-level processors LTE Cat 6 so they can manage 300Mbit/s downloads and 50Mbit/s uploads. We already knew that the high-end deca-core X20 and mainstream eight-core P10 were getting LTE Cat 6.

According to the Gizchina website, the three new SoCs carry the catchy titles of MT6739, MT6750 and MT6750T.

The MT6739 will probably replace the MT6735. Both have quad A53 cores, but the MT6739 will get an upgrade from Cat 4 to Cat 6. The MT6739 supports clock speeds of up to 1.5GHz, 512KB of L2 cache, 1280×720 displays at 60fps, video decoding up to 1080p 30fps with H.264, and a 13-megapixel camera. This makes it an entry-level SoC for phones that might fit into the $100 price range.

The MT6750 and MT6750T look like twins; only the T version supports full HD 1920×1080 displays. The MT6750 has eight cores, four A53 clocked at 1.5GHz and four A53 clocked at 1.0GHz, and is manufactured on TSMC’s new 28nm High Performance Mobile Computing process. This is the same manufacturing process MediaTek is using for the Helio P10 SoC. The new process allows lower leakage and better overall transistor performance at lower voltage.

The MT6750 SoC supports single-channel LPDDR3 at 666MHz and eMCP up to 4GB. The SoC supports eMMC 5.1, a 16-megapixel camera, and 1080p 30fps decoding with both H.264 and H.265. It comes with an upgraded ARM Mali-T860 MP2 GPU at 350MHz and display support for 1280×720 (HD720) at 60fps. This means the biggest upgrade is the move to Cat 6, and it makes sense: most European and American networks now demand a Cat 6 or higher modem that supports carrier aggregation.

This new SoC looks like a slowed-down version of the Helio P10 and should be popular for entry-level Android phones.

Courtesy-Fud

AMD’s GPU Goes Open

January 29, 2016 by Michael  
Filed under Computing

AMD’s top gaming guy Nicolas Thibieroz is starting to open up GPU technology to make sure it keeps evolving.

He said that GPUOpen is the beginning of a new philosophy at AMD and would continue the initiative started with Mantle; now is the time to do even more for developers. Apparently the creation of the Radeon Technologies Group led by Raja Koduri was key in getting GPUOpen off the ground.

Thibieroz said that innovative results were only possible via the exchange of knowledge that happens within the game development community. While whole conferences are dedicated to this information sharing, it is often in more modest settings that inspiration takes form. Dinner conversations, plan files, developer forums or chats are common catalysts to graphics greatness.

However there are hurdles getting in the way of productivity and innovation. Developers can’t use their R&D investment on both consoles and PC because of the disparity between the two platforms.

Console games use low-level GPU features that may not be exposed on PC at the same level of functionality. This causes less efficient code paths to be implemented on PC instead. Proprietary libraries or toolchains with “black box” APIs prevent developers from accessing the code for maintenance, porting or optimisation purposes.

“Game development on PC needs to scale to multiple quality levels, including vastly different screen resolutions. Triple monitor setups, 4K support or dual renders for VR rendering require vast amounts of GPU processing power yet brute force rendering only gets you so far. There is still a vast amount of graphics performance still untapped, and it’s time to explore smarter ways to intelligently render those increasing numbers of pixels. “Opening up” the GPU is how we solve this,” Thibieroz wrote.

GPUOpen is composed of two areas: Games & CGI for game graphics and content creation, and Professional Compute for high-performance GPU computing in professional applications. GPUOpen will provide code and documentation allowing PC developers to exert more control over the GPU.

Current and upcoming GCN architectures, such as Polaris, include many features not exposed today in PC graphics APIs, and GPUOpen aims to empower developers with ways to use some of those features.

In addition to generating quality or performance advantages, such access will also enable easier porting from current-generation consoles to the PC platform. GPUOpen will also make a commitment to open source software.

“The game and graphics development community is an active hub of enthusiastic individuals who believe in the value of sharing knowledge. Full and flexible access to the source of tools, libraries and effects is a key pillar of the GPUOpen philosophy. Only through open source access are developers able to modify, optimize, fix, port and learn from software,” he said.

This will encourage innovation and the development of amazing graphics techniques and optimisations in PC games.

AMD will start a collaborative engagement with the developer community. GPUOpen software will be hosted on public source code repositories such as GitHub as a way to enable sharing and collaboration. Engineers from different functions will also regularly write blog posts about various GPU-related topics, game technologies or industry news.

Courtesy-Fud

 

Mozilla Debuts Push Notifications With Firefox 44

January 28, 2016 by mphillips  
Filed under Computing

Mozilla has released Firefox 44, following in rival Chrome’s footsteps by adding push notifications to the browser.

The update to version 44 lets users opt in to receive notifications from websites even when a site’s tab has been closed or never opened.

Google added the same functionality to Chrome in April with version 42.

Both Chrome and Firefox rely on the World Wide Web Consortium’s (W3C) under-construction “web push” protocol and the associated Push API (application programming interface) for the feature.
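
In practice, the server side of web push is an encrypted HTTP POST to an endpoint the browser hands back when the user opts in. As a rough sketch, Mozilla’s pywebpush library wraps the protocol details; the subscription dictionary, key file and email address below are placeholders standing in for values you would get from the browser’s Push API and your own VAPID key generation.

# Server-side sketch using Mozilla's pywebpush library (pip install pywebpush).
from pywebpush import webpush, WebPushException

# Placeholder subscription, as returned by the browser's PushManager.subscribe()
# call and uploaded to the server; the endpoint and keys here are dummies.
subscription = {
    "endpoint": "https://updates.push.services.mozilla.com/wpush/v2/EXAMPLE",
    "keys": {"p256dh": "<client-public-key>", "auth": "<client-auth-secret>"},
}

try:
    webpush(
        subscription_info=subscription,
        data="Example notification payload",
        vapid_private_key="private_key.pem",             # hypothetical key file
        vapid_claims={"sub": "mailto:admin@example.com"},
    )
except WebPushException as exc:
    print("Push failed:", exc)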

Other changes in Firefox 44 include deprecation of support for the RC4 cipher over HTTPS connections safeguarded by TLS (Transport Layer Security), a move announced in September by all browser makers, who promised to drop RC4 support in 2016. The action was prompted by research that showed RC4 was easily cracked.

Google dropped RC4 with the update to Chrome 48 last week; Microsoft has said it will do the same for Internet Explorer (IE) and Edge at some point “in early 2016.”

Firefox 44 also added support for the Google-created “Brotli” compression algorithm, which is up to 25% more efficient in squeezing files delivered to browsers, resulting in faster page loading times and reduced data consumption for those on capped connection plans.
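
The saving is easy to demonstrate on compressible data. Here is a minimal sketch comparing Brotli against the standard library’s DEFLATE (the algorithm behind gzip), using the third-party brotli bindings; the exact ratio depends heavily on the payload, so the 25% figure should be read as a best case.

import zlib     # stdlib DEFLATE, the algorithm behind gzip
import brotli   # third-party bindings: pip install brotli

# Repetitive sample payload; real web assets compress less dramatically.
payload = b"Lorem ipsum dolor sit amet, consectetur adipiscing elit. " * 200

deflate_size = len(zlib.compress(payload, 9))
brotli_size = len(brotli.compress(payload, quality=11))

print(f"original: {len(payload)} bytes")
print(f"deflate:  {deflate_size} bytes")
print(f"brotli:   {brotli_size} bytes "
      f"({100 * (1 - brotli_size / deflate_size):.1f}% smaller than deflate)")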

More information about Firefox’s push notifications can be found on Mozilla’s website.

Will eSports Grow To A Half Billion Dollar Market This Year?

January 28, 2016 by Michael  
Filed under Gaming

According to Newzoo’s 2016 Global eSports Market Report, this year is expected to be a “pivotal” one for the eSports sector. The firm said that last year’s tally for worldwide eSports revenues came to $325 million, and this year the full eSports economy should grow 43 percent to $463 million; Newzoo said this correlates with an audience of 131 million eSports enthusiasts and another 125 million “occasional viewers who tune in mainly for the big international events.” Overall, Newzoo’s report states that global and local eSports markets should jointly generate $1.1 billion in 2019.

Looking a bit deeper, Newzoo found that investment into and advertising associated with eSports continue to grow at a rapid clip. “This year has been dominated by the amount of investors getting involved in eSports. An increasing amount of traditional media companies have become aware of the value of the eSports sphere and have launched their first eSports initiatives. With these parties getting involved, there will be an increased focus on content and media rights. All major publishers have increased their investment into the space, realizing that convergence of video, live events and the game itself are providing consumers the cross-screen entertainment they desire from their favorite franchises,” Newzoo commented.

Online advertising in particular is the fastest growing revenue segment within eSports, jumping up 99.6 percent on a global scale compared to 2014. North America is expected to lead the charge worldwide.

“In 2016, North America will strengthen its lead in terms of revenues with an anticipated $175 million generated through merchandise, event tickets, sponsorships, online advertising and media rights. A significant part of these revenues flows back to the game publisher, but across all publishers, more money is invested into the eSports economy than is directly recouped by their eSports activities,” said Newzoo’s eSports Analyst, Pieter van den Heuvel.

“China and Korea together will represent 23 percent of global esports revenues, totalling $106 million in 2016. Audience-wise, the situation is different, with Asia contributing 44 percent of global eSports enthusiasts. Growth in this region is, for a large part, fuelled by an explosive uptake in Southeast Asia.”

While eSports is certainly on a good path for growth, game companies would be wise not to get too caught up in the hype. The average annual revenue per eSports enthusiast was $2.83 in 2015 and is expected to grow to $3.53 this year, Newzoo said, but that’s still a factor of four lower than a mainstream sport such as basketball, which generates revenues of $15 per fan per year.

Peter Warman, CEO at Newzoo added, “The initial buzz will settle down and the way forward on several key factors, such as regulations, content rights and involvement of traditional media, will become more clear. The collapse of MLG was a reminder that this market still has a long road to maturity and we need to be realistic about the opportunities it provides. In that respect, it is in nobody’s interest that current market estimates differ so strongly. Luckily, when zooming in on the highest market estimates of more than $700 million, the difference is explainable by an in-depth look. This estimate only differs in the revenues generated in Asia (Korea in particular), and by taking betting revenues into account. At Newzoo, we believe betting on eSports should not be mixed into direct eSports revenues as the money does not flow into the eSports economy. Similarly, sports betting is not reported in sports market reports.”

Courtesy-GI.biz

 

VMware Eliminates 800 Jobs As Part Of Transition Strategy

January 28, 2016 by mphillips  
Filed under Around The Net

VMware is eliminating approximately 800 jobs as it transitions from its traditional products to newer, emerging technologies.

The company expects 2016 will be a key transition year as “we expect the effect of our new products to outweigh the decline in our compute products,” CEO Pat Gelsinger said on its latest earnings conference call.

The company has been facing challenges in its software business as its customers are increasingly using public cloud providers like Amazon Web Services and Microsoft Azure.

“Public cloud providers do provide VMware, but for many of the newer, cloud workloads, many are opting for containers or even OpenStack which doesn’t require what’s considered expensive VMware licenses,” wrote Patrick Moorhead, president and principal analyst at Moor Insights & Strategy, in an email.

VMware reported that its total revenue under generally accepted accounting principles (GAAP) for 2015 was $6.57 billion, an increase of 9 percent from 2014, or up 12 percent year-over-year on a constant currency basis. The company expects 2016 revenue will be up to $6.935 billion, an increase of as much as 4 percent from 2015.

The guidance was influenced by concern about business from weakening economies like Russia, Brazil and China.

Gelsinger said during the conference call that the company recognizes that its blockbuster compute products are reaching maturity, and will play a decreasing role in the business. But the company expects newer, emerging products will pick up the slack.

One of the company’s new focus areas is on extending customers’ private cloud workloads into the public cloud via vCloud Air Network and vCloud Air. But Gelsinger clarified that its vCloud Air cloud computing service will have a narrower focus, providing specialized cloud software and services distinct from other public cloud providers, suggesting that the company does not want to take the big players head-on in the commodity cloud business.

Qualcomm Goes 4.5 LTE Pro

January 27, 2016 by Michael  
Filed under Computing

Qualcomm has published a new corporate presentation detailing its path for LTE networks from 3GPP’s “Release 13” (March 2016) onward – also known as 4.5G LTE “Advanced Pro” – with a development timeframe between 2016 and 2020.

This will be an “intermediate” standard before the wireless industry continues with “Release 15” in 2020 and beyond, also known as 5G technology. The company intends to make LTE Advanced Pro an opportunity to use up more spectrum before 5G networks launch next decade and wants it to support further backwards-compatibility with existing LTE deployments, extremely low latencies, and unlicensed spectrum access, among many other new features.

In its new 4.5G presentation, Qualcomm has highlighted ten major bullet points that it expects to be present in its next-generation LTE “Advanced Pro” specification. The first point describes delivering fiber-like speeds by using Carrier Aggregation (introduced with “LTE Advanced” networks in 2013) to aggregate both licensed and unlicensed spectrum across more carriers, and to use simultaneous connections to different cell types for higher spectral efficiency (for example: using smaller, single-user pCells combined with large, traditional cell towers).

Qualcomm’s second bullet point is to make native use of Carrier Aggregation with LTE Advanced Pro by supporting up to 32 carriers at once across a much fatter bandwidth pipe. This will primarily be achieved using a new development called “Licensed Assisted Access.”

In short, Licensed Assisted Access (LAA) is the 3GPP’s effort to standardize LTE use inside of 5GHz WiFi spectrum. It was introduced in 2015 and allows mobile users to simultaneously use both licensed and unlicensed spectrum bands at the same time. This makes sense from an economic scarcity standpoint, as a fairly large number of channels are available for use in unlicensed bands (more than 500MHz in many regions). This should ultimately allow carriers with “low interference” in unlicensed spectrums to aggregate with licensed band carriers to make the most efficient use of all locally-available spectrum.

Qualcomm says that network traffic can be distributed across both licensed and unlicensed carriers when unlicensed bands are being lightly used. The result is that Licensed Assisted Access (LAA) users win by getting higher throughput and lower latency. In 3GPP Release 14 and beyond, Qualcomm eventually anticipates improving upon LAA with “Enhanced License Assisted Access” (eLAA). This second-generation design will include features such as uplink / downlink aggregation, dual unlicensed-and-licensed connectivity across small cells and large traditional cells, and a further signal complexity reduction for more efficient channel coding and higher data rates.

The company’s third bullet point for LTE Advanced Pro is to achieve “significantly lower latency” – up to ten times lower, to be precise – yet still be able to operate on the same bands as current LTE towers. They expect to achieve this primarily through a new Frequency Division Duplexing (FDD) / Time Division Duplexing (TDD) design with significantly lower round-trip times (RTTs) and transmission time intervals (TTIs). We are looking at around 70 microseconds to transmit 14 OFDM data symbols versus the current LTE / LTE-A timeframe of 1 millisecond for the same amount of data. The company also expects significantly lower latency against current TCP/UDP throughput limitations (from current LTE-A peak rates), significantly lower latency in VoIP applications, and significantly lower latency for next-gen automotive LTE connection needs.

The fourth, and very important, bullet point is to increase traffic flexibility by converting uplink resources to offload downlink traffic. In much more technical terms, Qualcomm will utilize a new “Flexible Duplex” design that has self-contained TDD subframes and a dynamic uplink / downlink pattern that adapts to real-time network traffic instead of a static split. We can expect to see this implemented around 3GPP Release 14.

Qualcomm’s fifth bullet point for 4.5G LTE Advanced Pro is to enable many more antennas at the base station level to significantly increase capacity and coverage. In 3GPP Release 13 this will be called “Full Dimension MIMO.” This technology uses a 2D antenna array to exploit elevation (3D) beamforming. Later down the road in 3GPP Releases 14 and beyond, we can expect support for what the company calls higher-order “Massive MIMO.” This will consist of more than 16 antennas in an array and should enable devices to connect to even higher spectrum bands.

The sixth bullet point deals with increasing efficiency for Internet of Things applications, also known as “LTE IoT.” One element of this strategy includes enhanced power-save modes (extended DRX sleep cycles) for small devices. More importantly, this also means beyond 10 years of battery life for certain use cases. The company wants to use more narrowband operating modes (see: 1.4MHz and 180kHz) in order to reduce device costs, and wants to deploy “deeper cellular coverage” with up to 20dB of additional signal attenuation. Previously, regular LTE and LTE-Advanced would top out around ~50dB for most carriers. The extra 20dB will certainly make a noticeable difference for many indoor, multi-floor users in corporate environments and for those around heavy foliage, mountain ranges and hillsides in both urban and suburban environments.

The seventh bullet point deals with integrating LTE into the connected cars of the future. Qualcomm calls this “Vehicle-to-Everything” Communications, or V2X for short. The goal is to connect cars at higher-than-gigabit LTE Advanced Pro speeds to one another, and to also connect them with nearby pedestrians and IoT-connected objects in the world around them. Privacy and political issues aside, this will supposedly make our collective driving experiences “safer and more autonomous.” Specifics include “always-on sensing,” vehicle machine learning, vehicle computer vision, “bring your own driver” and vehicle-to-infrastructure communication, all from within the car. The company calls the result of V2X automotive integration “on-device intelligence.”

To further things along with ubiquitous gigabit LTE, Qualcomm also eventually wants you to completely ditch your cable / satellite / fiber optic (FTTN and FTTP) television subscriptions and leverage the speeds of its LTE Advanced Pro technology for a “converged digital TV network.” This means television broadcasts over LTE to multiple devices, simultaneously – basically, an always-on LTE Advanced Pro TV broadcast stream to 4K home televisions, tablets and smartphones for the whole family, all at once and at any time of day.

In the ninth bullet point, Qualcomm is boasting LTE-Advanced Pro’s capability for proximity sensing – without the use of GPS – autonomously. This includes using upgraded cell towers for knowing when friends are nearby and for discovering retail services and events, all without triggering WiFi or GPS modules on your device.

The tenth bullet point is an extension of the last one and uses LTE technologies at large for advanced public safety services (including 9-1-1 emergencies) – all without triggering WiFi or GPS modules for proximity data. This new “LTE Emergency Safety System” deployment will deliver both terrestrial emergency information as well as automotive road hazard information. Qualcomm expects this to emulate current Professional / Land Mobile Radio (PMR / LMR) push-to-talk systems on walkie-talkies.

For now, LTE Category 12 at 600Mbps (an upgrade to the current 3GPP Release 12) comes in 2016.

While the gigabit-and-higher speeds of 3GPP Release 13 and beyond are still a couple years off, Qualcomm wants to kick things off with an update to 3GPP Release 12 (launched Q2 2014) with 600Mbps downlinks and 150Mbps uplinks achieved through the carrier aggregation technique.

During CES 2016, Qualcomm showed off its new “X12 LTE” modem add-on for the Samsung-made Snapdragon 820 Automotive SoC family, or “Snapdragon 820A” Series. The unit features LTE-Advanced (LTE-A) carrier aggregation (3x in the downlink and 2x in the uplink), comes with a new dual LTE FDD/TDD “Global Mode” capability, and supports dual SIM cards.

The X12 LTE modem features UE Category 12 on the downlink with speeds up to 600Mbps (75MBps), achieved through a transition from 64QAM (Quadrature Amplitude Modulation) in the older UE Category 9 specification (see: Snapdragon 810 modem) to a much higher 256QAM. It is also possible to enable up to 4x4 MIMO on the downlink carrier, which results in better bandwidth and improved coverage. The new modem uses UE Category 13 on the uplink side for speeds up to 150Mbps (18.75MBps) with 64QAM. The unit also has LTE-U support (LTE in unlicensed spectrum) to allow it to operate on 2.4GHz and 5GHz unlicensed channels for additional spectrum. Additionally, it can bond both LTE and WiFi links together to boost download speeds with LTE + WiFi Link Aggregation (LWA).
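
That downlink jump follows directly from the denser modulation: 256QAM carries 8 bits per symbol where 64QAM carries 6. A back-of-the-envelope sketch, assuming Category 9’s nominal 450Mbps peak on the same three aggregated 20MHz carriers:

import math

# Peak-rate scaling from denser modulation: 64QAM carries 6 bits per symbol,
# 256QAM carries 8, so the same aggregated carriers gain a factor of 8/6.
bits_64qam = math.log2(64)      # 6 bits per symbol
bits_256qam = math.log2(256)    # 8 bits per symbol

cat9_peak_mbps = 450            # nominal UE Category 9 peak (3x20MHz CA, 2x2 MIMO)
cat12_peak_mbps = cat9_peak_mbps * bits_256qam / bits_64qam

print(f"estimated Category 12 peak: {cat12_peak_mbps:.0f} Mbps")  # -> 600 Mbps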

Wikipedia.org – History of 3GPP LTE User Equipment Category releases

Qualcomm has recorded a webinar with FierceWireless all about its roadmap from 4.5G LTE Advanced Pro technology (2016 – 2020) to next-next generation LTE 5G technology (2020 and beyond), which can be found here.

The company will also be present at Mobile World Congress 2016 between February 22nd and 25th in Barcelona, Spain to demonstrate new features of LTE Advanced Pro – including “Enhanced Licensed Assisted Access” (eLAA) and MuLTEfire (“multi-fire”) – at its exhibition booth. Our staff is sure to be present at the event and we look forward to sharing more hands-on demos very soon.

Courtesy-Fud

Is Capcom Finally Getting Into eSports?

January 26, 2016 by Michael  
Filed under Gaming

On February 16, Street Fighter V will launch on PlayStation 4 and PC. It will not be launching on Xbox One, thanks to an exclusivity deal signed with Sony. And as Capcom director of brand marketing and eSports Matt Dahlgren told GamesIndustry.biz recently, there are a few reasons for that.

Dahlgren called the deal “the largest strategic partnership that fighting games have ever seen,” and said it addressed several problems the publisher has had surrounding its fighting games for years.

“Basically every SKU of a game we released had its own segmented community,” he said. “No one was really able to play together and online leaderboards were always segmented, so it was very difficult to find out who would be the best online and compare everybody across the board.”

Street Fighter V should alleviate that problem as it’s only on two platforms, and gamers on each will be able to play with those on the other. Dahlgren said it will also help smooth over problems that stemmed from differences between platforms. For example, the Xbox 360 version of Street Fighter IV had less input lag than the PS3 version. That fraction of a second difference between button press and action on-screen might have been unnoticeable to most casual players, but it was felt by high-level players who know the game down to the last frame of animation.

“There were varying degrees of input lag, so when those players ended up playing each other, it wasn’t necessarily on an equal playing field,” Dahlgren said. “This time around, by standardizing the platform and making everyone play together, there will be a tournament standard and everyone is on an equal playing field.”

Finally, Dahlgren said the deal with Sony will help take Street Fighter to the next level when it comes to eSports. In some ways, it’s a wonder it’s not there already.

“I think fighting games are one of the purest forms of 1v1 competition,” Dahlgren said. “A lot of the other eSports games out there are team-based, and while there’s an appeal to those, there’s something about having a single champion and having that 1v1 showdown that’s just inherently easy for people to understand.”

Street Fighter has a competitive gaming legacy longer than League of Legends or DOTA, but isn’t mentioned in the same breath as those hits on the eSports scene. In some ways, that legacy might have stymied the franchise’s growth in eSports.

“A lot of our community was really built by the fans themselves,” Dahlgren said. “Our tournament scene was built by grassroots tournament organizers, really without the help of Capcom throughout the years. And I would say a lot of those fans have been somewhat defensive [about expanding the game's appeal to new audiences]. It hasn’t been as inclusive as it could have been. With that said, I do definitely feel a shift in our community. There’s always been a talking point with our hardcore fans as to whether or not Street Fighter is an eSport, and what eSports could do for the scene. Could it potentially hurt it? There’s been all this controversy behind it.”

Even Capcom has shifted stances on how to handle Street Fighter as an eSport.

“In the past, we were actually against partnering up with any sort of corporations or companies out there that were treating eSports more like a business,” Dahlgren said. “And that has to do out of respect for some of our long-term tournament organizers… Our fear was that if we go out and partner up with companies concerned more about making a profit off the scene instead of the values that drive the community, then it could end up stomping out all these tournament organizers who are very passionate and have done so much for our franchise.”


So instead of teaming with the MLGs or ESLs of the world, Capcom teamed with Twitch and formed its own Pro Tour in 2014. Local tournament organizers handle the logistics of the shows and retain the rights to their brands, while Capcom provides marketing support and helps with production values.

“I can’t say Capcom wouldn’t partner up with some of the other, more established eSports leagues out there,” Dahlgren said. “I do think there’s a way to make both of them exist, but our priority in the beginning was paying homage to our hardcore fans that helped build the scene, protecting them and allowing them to still have the entrepreneurial spirit to grow their own events. That comes first, before partnering with larger organizations.”

Just as Capcom’s stance toward tournaments has changed to better suit Street Fighter’s growth as an eSport, so too has the business model behind the game. The company has clearly looked at the success of many free-to-play eSports favorites and incorporated elements of them (except the whole “free-to-play” thing) into Street Fighter V. Previously, Capcom would release a core Street Fighter game, followed by annual or bi-annual updates with a handful of new fighters and balancing tweaks. Street Fighter V will have no such “Super” versions, with all new content and tweaks made to the game on a rolling basis.

“We are treating the game now more as a platform and a service, and are going to be continually adding new content post-launch,” Dahlgren said. “This is the first time we’re actually having our own in-game economy and in-game currency. So the more you play the game online, you’re going to generate fight money, and then you can use that fight money to earn DLC content post-launch free of charge, which is a first in our franchise. So essentially we’re looking at an approach that takes the best of both worlds. It’s not too far away from what our players really expect from a SF game, yet we get some of the benefits of continually releasing content post-launch and giving fans more of what they want to increase engagement long-term.”

Even if it’s not quite free-to-play, Street Fighter V may at least be cheaper to play. Dahlgren said that pricey arcade stick peripherals are not as essential for dedicated players as they might have seemed in the past.

“Since Street Fighter comes from an arcade heritage, a lot of people have this general belief that arcade sticks are the premier way of playing,” Dahlgren said. “I think now that the platform choice has moved more towards consoles, pad play has definitely become much more prevalent. I would believe that at launch you’re probably going to have more pad players than you actually have stick players. And in the competitive scene, we’ve seen the rise of a lot of very impressive pad players, which has pretty much shown that Street Fighter is a game that’s not necessarily dictated by the controller you play with; it’s the strategies and tactics you employ. And both of them are essentially on equal playing ground.”

Courtesy-GI.biz