Are Light Powered Transistors On The Horizon?

February 5, 2016 by Michael  
Filed under Computing

A team of researchers has emerged from its smoke-filled labs claiming to have invented a transistor that runs on light rather than applied voltage.

According to Technology Review, researchers at the University of North Carolina at Charlotte say the new transistor controls the flow of electrons through it so that it switches on when light shines on it and turns itself off when it goes dark.

This means the devices can be made smaller than field-effect transistors, because they don’t require doping in the same way and can be squeezed into smaller spaces. They are also faster.

Apparently the idea is not rocket science: it builds on the long-known fact that some materials are photoconductive.

What the team has done is create a device which uses a ribbon of cadmium selenide a couple of atoms thick. This can conduct more than a million times more current when on than off, which is about the same on/off ratio as regular transistors.

Of course, it is years away from being a product. The team still has not worked out how to deliver light to each transistor, or whether doing so will cost more power than it saves.

Courtesy-Fud

 

Qualcomm Goes 4.5 LTE Pro

January 27, 2016 by Michael  
Filed under Computing

Recently, Qualcomm published a new corporate presentation detailing its path for LTE networks from 3GPP’s “Release 13” (March 2016) and beyond – also more conveniently known as 4.5G LTE “Advanced Pro” – with a development timeframe between 2016 and 2020.

This will be an “intermediate” standard before the wireless industry continues with “Release 15” in 2020 and beyond, also known as 5G technology. The company intends to make LTE Advanced Pro an opportunity to use up more spectrum before 5G networks launch next decade and wants it to support further backwards-compatibility with existing LTE deployments, extremely low latencies, and unlicensed spectrum access, among many other new features.

In its new 4.5G presentation, Qualcomm has highlighted ten major bullet points that it expects to be present in its next-generation LTE “Advanced Pro” specification. The first point describes delivering fiber-like speeds by using Carrier Aggregation (introduced with “LTE Advanced” networks in 2013) to aggregate both licensed and unlicensed spectrum across more carriers, and to use simultaneous connections to different cell types for higher spectral efficiency (for example: using smaller, single-user pCells combined with large, traditional cell towers).

Qualcomm’s second bullet point is basically to make native use of Carrier Aggregation in LTE Advanced Pro by supporting up to 32 carriers at once across a much fatter bandwidth pipe. This will primarily be achieved using a new development called “Licensed Assisted Access.”
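As a back-of-envelope sketch of that fatter pipe (assuming the standard 20MHz maximum width of an LTE component carrier, a figure not spelled out in the presentation itself):

```python
# Rough carrier aggregation arithmetic: each LTE component carrier is at
# most 20 MHz wide, so aggregating up to 32 of them fattens the pipe from
# a plain 2x LTE-A setup (40 MHz) to as much as 640 MHz.
CARRIER_MHZ = 20    # max LTE component-carrier bandwidth (assumed figure)
MAX_CARRIERS = 32   # LTE Advanced Pro aggregation limit from the article

def aggregated_bandwidth_mhz(n_carriers, carrier_mhz=CARRIER_MHZ):
    """Total aggregated bandwidth for n component carriers."""
    return n_carriers * carrier_mhz

print(aggregated_bandwidth_mhz(2))             # basic LTE-A 2x CA
print(aggregated_bandwidth_mhz(MAX_CARRIERS))  # the Advanced Pro ceiling
```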

In short, Licensed Assisted Access (LAA) is the 3GPP’s effort to standardize LTE use inside 5GHz WiFi spectrum. It was introduced in 2015 and allows mobile users to use licensed and unlicensed spectrum bands simultaneously. This makes sense from an economic scarcity standpoint, as a fairly large number of channels are available in unlicensed bands (more than 500MHz in many regions). This should ultimately allow carriers with “low interference” in unlicensed spectrum to aggregate with licensed-band carriers to make the most efficient use of all locally available spectrum.

Qualcomm says that network traffic can be distributed across both licensed and unlicensed carriers when unlicensed bands are being lightly used. The result is that Licensed Assisted Access (LAA) users win by getting higher throughput and lower latency. In 3GPP Release 14 and beyond, Qualcomm eventually anticipates improving upon LAA with “Enhanced License Assisted Access” (eLAA). This second-generation design will include features such as uplink / downlink aggregation, dual unlicensed-and-licensed connectivity across small cells and large traditional cells, and a further signal complexity reduction for more efficient channel coding and higher data rates.

The company’s third bullet point for LTE Advanced Pro is to achieve “significantly lower latency” – up to ten times lower, to be precise – yet still operate on the same bands as current LTE towers. It expects to achieve this primarily through a new Frequency Division Duplexing (FDD) / Time Division Duplexing (TDD) design with significantly lower round-trip times (RTTs) and transmission time intervals (TTIs). We are looking at around 70 microseconds to transmit 14 OFDM data symbols, versus the current LTE / LTE-A timeframe of 1 millisecond for the same amount of data. The company also expects significantly lower latency against current TCP/UDP throughput limitations (from current LTE-A peak rates), significantly lower latency in VoIP applications, and significantly lower latency for next-gen automotive LTE connection needs.
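Those figures pin down how much the transmission interval shrinks; a quick arithmetic check on the numbers quoted (the air interface gets roughly 14x faster, consistent with an "up to ten times lower" end-to-end latency claim once scheduling and processing overheads are added back):

```python
# TTI comparison from the quoted figures: current LTE/LTE-A sends 14 OFDM
# symbols in a 1 ms interval; the proposed low-latency design sends the
# same 14 symbols in roughly 70 microseconds.
SYMBOLS = 14
LTE_TTI_US = 1000.0   # 1 ms, current LTE / LTE-A
PRO_TTI_US = 70.0     # proposed LTE Advanced Pro figure

per_symbol_lte = LTE_TTI_US / SYMBOLS   # microseconds per OFDM symbol now
per_symbol_pro = PRO_TTI_US / SYMBOLS   # microseconds per symbol proposed
speedup = LTE_TTI_US / PRO_TTI_US       # how much shorter the interval is

print(round(per_symbol_lte, 1), per_symbol_pro, round(speedup, 1))
```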

The fourth bullet point, and a very important one, is to increase traffic flexibility by converting uplink resources to offload downlink traffic. In more technical terms, Qualcomm will use a new “Flexible Duplex” design with self-contained TDD subframes and a dynamic uplink / downlink pattern that adapts to real-time network traffic instead of a fixed split. We can expect to see this implemented around 3GPP Release 14.

Qualcomm’s fifth bullet point for 4.5G LTE Advanced Pro is to enable many more antennas at the base-station level to significantly increase capacity and coverage. In 3GPP Release 13 this is called “Full Dimension MIMO,” which uses a 2D antenna array to perform elevation beamforming and so exploit all three dimensions. Later down the road, in 3GPP Release 14 and beyond, we can expect support for what the company calls higher-order “Massive MIMO.” This will consist of more than 16 antennas in an array and should enable devices to connect on even higher spectrum bands.

The sixth bullet point deals with increasing efficiency for Internet of Things applications, also known as “LTE IoT.” One element of this strategy is enhanced power-save modes (extended DRX sleep cycles) for small devices; more importantly, this also means beyond 10 years of battery life for certain use cases. The company wants to use more narrowband operating modes (see: 1.4MHz and 180kHz) in order to reduce device costs, and wants to deploy “deeper cellular coverage” that tolerates up to 20dB of additional signal attenuation. Previously, regular LTE and LTE-Advanced would top out at around ~50dB for most carriers. An extra 20dB will certainly make a noticeable difference for many indoor, multi-floor users in corporate environments, and for those around heavy foliage, mountain ranges and hillsides in both urban and suburban environments.
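For a sense of scale, decibels map to linear power ratios exponentially, so tolerating 20dB of extra path loss means hearing a signal one hundred times weaker. A one-liner to check (generic dB arithmetic, not from Qualcomm’s material):

```python
# Decibels to a linear power ratio: every 10 dB is a 10x factor.
def db_to_linear(db):
    return 10 ** (db / 10)

print(db_to_linear(10))  # 10 dB  -> 10x
print(db_to_linear(20))  # 20 dB  -> 100x weaker signal still usable
```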

The seventh bullet point deals with integrating LTE into the connected cars of the future. Qualcomm calls this “Vehicle-to-Everything” Communications, or V2X for short. The goal is to connect cars at higher-than-gigabit LTE Advanced Pro speeds to one another, and to also connect them with nearby pedestrians and IoT-connected objects in the world around them. Privacy and political issues aside, this will supposedly make our collective driving experiences “safer and more autonomous.” Specifics include “always-on sensing,” vehicle machine learning, vehicle computer vision, “bring your own driver” and vehicle-to-infrastructure communication, all from within the car. The company calls the result of V2X automotive integration “on-device intelligence.”

To further things along with ubiquitous gigabit LTE, Qualcomm also eventually wants you to completely ditch your cable / satellite / fiber optic (FTTN and FTTP) television subscriptions and leverage the speeds of its LTE Advanced Pro technology for a “converged digital TV network.” This means television broadcasts over LTE to multiple devices, simultaneously – basically, an always-on LTE Advanced Pro TV broadcast stream to 4K home televisions, tablets and smartphones for the whole family, all at once and at any time of day.

In the ninth bullet point, Qualcomm boasts of LTE Advanced Pro’s capability for autonomous proximity sensing without the use of GPS. This includes using upgraded cell towers to know when friends are nearby and to discover retail services and events, all without triggering the WiFi or GPS modules on your device.

The tenth bullet point is an extension of the last one and uses LTE technologies at large for advanced public safety services (including 9-1-1 emergencies) – all without triggering WiFi or GPS modules for proximity data. This new “LTE Emergency Safety System” deployment will deliver both terrestrial emergency information as well as automotive road hazard information. Qualcomm expects this to emulate current Professional / Land Mobile Radio (PMR / LMR) push-to-talk systems on walkie-talkies.

For now, LTE Category 12 at 600Mbps (an upgrade to the current 3GPP Release 12) comes in 2016

While the gigabit-and-higher speeds of 3GPP Release 13 and beyond are still a couple years off, Qualcomm wants to kick things off with an update to 3GPP Release 12 (launched Q2 2014) with 600Mbps downlinks and 150Mbps uplinks achieved through the carrier aggregation technique.

During CES 2016, Qualcomm showed off its new “X12 LTE” modem add-on for the Samsung-made Snapdragon 820 Automotive SoC family, or “Snapdragon 820A” Series. The unit features LTE-Advanced (LTE-A) carrier aggregation (3x in the downlink and 2x in the uplink), comes with a new dual LTE FDD/TDD “Global Mode” capability, and supports dual SIM cards.

The X12 LTE modem features UE Category 12 on the downlink with speeds up to 600Mbps (75MBps), achieved through a transition from the 64QAM (Quadrature Amplitude Modulation) of the older UE Category 9 specification (see: Snapdragon 810 modem) to a much denser 256QAM. It is also possible to enable up to 4 x 4 MIMO on the downlink carrier, which results in better throughput and improved coverage. The new modem uses UE Category 13 on the uplink side for speeds up to 150Mbps (18.75MBps) with 64QAM. The unit also has LTE-U support (LTE in unlicensed spectrum), allowing it to operate on 2.4GHz and 5GHz unlicensed channels for additional spectrum. Additionally, it can bond LTE and WiFi links together to boost download speeds with LTE + WiFi Link Aggregation (LWA).
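The jump from 64QAM to 256QAM, and the Mbps-to-MBps conversion used above, work out as follows (standard modulation arithmetic, nothing vendor-specific):

```python
import math

# Bits carried per modulation symbol for a square QAM constellation:
# 64QAM carries 6 bits/symbol, 256QAM carries 8, a ~33% gain per symbol
# before MIMO and carrier aggregation multiply it further.
def bits_per_symbol(qam_order):
    return int(math.log2(qam_order))

gain = bits_per_symbol(256) / bits_per_symbol(64)
print(bits_per_symbol(64), bits_per_symbol(256), round(gain, 2))

# And the unit conversion: 600 megabits/s over 8 bits per byte = 75 MB/s,
# matching the "600Mbps (75MBps)" figures above.
print(600 / 8)
```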

Wikipedia.org – History of 3GPP LTE User Equipment Category releases

Qualcomm has recorded a webinar with FierceWireless all about its roadmap from 4.5G LTE Advanced Pro technology (2016 – 2020) to next-next generation LTE 5G technology (2020 and beyond), which can be found here.

The company will also be present at Mobile World Congress 2016 between February 22nd and 25th in Barcelona, Spain to demonstrate new features of LTE Advanced Pro – including “Enhanced Licensed Assisted Access” (eLAA) and MuLTEfire (“multi-fire”) – at its exhibition booth. Our staff is sure to be present at the event and we look forward to sharing more hands-on demos very soon.

Courtesy-Fud

 

 

AMD Launches Opteron Server SoC

January 19, 2016 by Michael  
Filed under Computing

AMD has launched the Seattle ARM server system on chip (SoC) under the new Opteron A1100 brand.

The firm revealed the first details about the SoC in June 2013 and has now launched it in collaboration with software and hardware partners to “accelerate time-to-deployment of ARM-based systems and drive forward ecosystem support for ARM in the data center”.

AMD’s processor is said to be a step forward for customers looking for a data center-class ARM solution.

“The macro trend of convergence between networking, storage and servers is an important catalyst in this evolution,” said Scott Aylor, AMD’s VP and GM of enterprise solutions.

“Customers now have access to 64-bit ARM processors from the only silicon provider that has decades of experience delivering professional enterprise and embedded products.”

The Seattle chip is the firm’s first 64-bit ARM processor based on the Cortex A57 architecture. AMD didn’t reveal at what frequency the eight cores will run, but did say that there will be 4MB of shared Level 2 and 8MB of shared Level 3 cache as well as an integrated 10Gbps Ethernet controller and “extensive offload engines”, such as encryption, to increase power efficiency.

It’s been a long time coming, so a lot rides on Seattle, as AMD is the only well-established server chip vendor developing ARM-based processors. The firm is putting a lot of effort into making Seattle competitive, not just through the ARM architecture but by integrating SeaMicro’s Freedom Fabric and connectivity for storage.

“The AMD Opteron A1100 processor brings a new choice in scalability across network infrastructure and data centres,” said Lakshmi Mandyam, ARM’s director of server systems and ecosystems.

“AMD brings recognised expertise in the server and embedded markets, making them an ideal partner to deliver a 64-bit ARM processor with the impressive balance of performance and power-efficiency to address an increasingly diverse set of workloads.”

The Opteron A1100 SoC has been in advanced development for several quarters and is now available in mass production quantities.

Courtesy-TheInq

 

Does AMD Have A Bright Future Ahead?

January 11, 2016 by Michael  
Filed under Computing

Most of the investment has been speculative and short term, as traders are hoping for AMD to move above $3.50 before July – which is a little odd, since AMD hasn’t traded above $3.50 since September 2014 and sits at $2.94 now.

Last year was an up-and-down year for AMD. The shares struggled during the summer months but rallied over 70 per cent during the fourth quarter. Until Zen arrives, though, we can’t see much on the horizon for investors to get excited about.

But in share prices it is all about timing, and if AMD’s cunning plan pays off there are going to be some seriously wealthy people out there. AMD has slowly entered the data centre market with a longer roadmap and an enhanced architecture, “Zen,” targeted at high-growth markets such as data centres and HPC (high-performance computing). Currently these are controlled by Intel, which will once again find itself facing a better and cheaper rival. If it works, AMD’s stock would jump onto a growth trajectory. The advantage is that with the shares this low, the gains are going to be huge.

AMD is revamping its GPU product line to make it competitive with Nvidia. AMD has also secured three design wins for its semi-custom SoCs. These wins will start earning revenue beginning in the second half of 2016 with the first two wins expected to generate a combined revenue of $1 billion over a period of three years. The third design win is rumoured to be with Nintendo for its NX game console.

PricewaterhouseCoopers expects worldwide console game sales to reach $28 billion by the end of 2016. However, it expects PC games, where Nvidia is king, to overtake console games. That assumes AMD does not pull a rabbit out of its hat with discrete GPUs too and manage to claw back some sales fast.

Of course it could all go tits up. Zen might not work, or might not deliver what is claimed, in which case there will be a lot of investors who lose money. But with AMD’s share price this low, they are not going to lose that much.

Courtesy-Fud

 

Will VR Headsets Work With Your PC?

January 5, 2016 by Michael  
Filed under Computing

Virtual reality (VR) will not be supported on most consumer computers as the technology booms and manufacturers prepare to introduce it on a consumer level this year, Nvidia has warned.

Jason Paul, the firm’s general manager of Shield, gaming and VR, told Venturebeat that graphics processors need to be about seven times more powerful to run VR than a standard PC game, and that only about 13 million PCs on the market will be powerful enough by next year, when the first major PC-based VR headsets ship.
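A rough sanity check of that figure, assuming the announced first-generation Rift/Vive panel spec of 2160×1200 (both eyes combined) at 90Hz versus a typical 1080p game at 60Hz (the resolutions are our assumption, not from Paul’s remarks): raw pixel throughput alone accounts for just under 2x, with the rest of the 7x being headroom for the consistently low latency VR needs, where a dropped frame that is a hiccup on a monitor is nausea in a headset.

```python
# Pixel throughput comparison under assumed display specs.
def pixel_rate(width, height, hz):
    """Pixels per second a display must be fed."""
    return width * height * hz

vr = pixel_rate(2160, 1200, 90)        # assumed Rift/Vive spec, both eyes
desktop = pixel_rate(1920, 1080, 60)   # typical 1080p60 desktop game

print(round(vr / desktop, 2))          # under 2x from raw pixels alone
```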

However, Nvidia said that this number could be extended to 25 million if the VR game makers use Nvidia’s GameWorks VR software (of course), which is said to make the VR processing more efficient.

GameWorks VR is aimed at games and applications developers, and includes a feature called VR SLI, which provides increased performance for VR applications where multiple GPUs can be assigned to a specific eye to dramatically accelerate stereo rendering.

The software also delivers specific features for VR headset developers, including Context Priority, which provides control over GPU scheduling to support advanced VR features such as asynchronous time warp. This cuts latency and quickly adjusts images as gamers move their heads, without the need to re-render a new frame.
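A toy sketch of the asynchronous timewarp idea described above (the function and names here are illustrative, not the actual GameWorks VR API): when a new frame misses the display deadline, the last rendered image is re-shown warped to the newest head pose rather than stalling, so head motion stays responsive even when rendering falls behind.

```python
# Simulate the scheduling decision timewarp makes each vsync.
def timewarp_schedule(frame_ready, render_pose, latest_pose):
    """Decide what to scan out this vsync.

    frame_ready: did the renderer finish in time?
    render_pose: head angle (degrees) the frame was rendered for
    latest_pose: head angle sampled just before scan-out
    Returns (image source, pose the displayed image corresponds to).
    """
    if frame_ready:
        return ("fresh frame", render_pose)
    # Late: reproject the previous image by the pose delta accumulated
    # since it was rendered, so the view tracks the user's head anyway.
    return ("warped previous frame", latest_pose)

print(timewarp_schedule(True, 10.0, 10.4))   # on time: show as rendered
print(timewarp_schedule(False, 10.0, 12.0))  # late: warp to newest pose
```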

There’s also a feature in the SDK called Direct Mode, which treats VR headsets as head-mounted displays accessible only to VR applications, rather than a typical Windows monitor, providing better plug-and-play support and compatibility for VR headsets.

Nvidia said that GameWorks VR is already being integrated into leading game engines, such as those from Epic Games, which has announced support for GameWorks VR features in an upcoming version of the popular Unreal Engine 4.3. However, considering Paul’s comments, it mustn’t be getting implemented as much as the firm would like.

VR is becoming increasingly prevalent as device manufacturers try to offer enhanced experiences, especially in gaming. Oculus has been showing off what it can do for some time, and it seems its official debut is not too far away. But it was Oculus that seemed to kick-start this upward trend and, since it hit the headlines, we’ve seen a number of big technology companies giving it a go, especially smartphone makers.

The HTC Vive is one, for example. But, like Oculus, the headset is still in the initial rollout phase and not yet on sale commercially, requiring any developers wanting to have a pop at writing code for it to enter a selection process for distribution, which began only this summer.

Sony, another smartphone maker, has also dipped its toe in the world of VR via Project Morpheus, a headset like HTC’s Vive that looks to enhance gaming experiences, but specifically as an accessory for the PlayStation 4, which we assume won’t come with the concerns Nvidia has as it should work with the console right out of the box.

Courtesy-TheInq

 

Will AMD Bring Two New GPUs To Market In 2016?

November 18, 2015 by Michael  
Filed under Computing

AMD’s head graphics guy, Raja Koduri, has promised that AMD will have two new GPUs out next year.

Koduri was talking to Forbes about how AMD needed to come up with some new architectural designs and get brand new GPUs into the shops.

He added that this is something that AMD has been pretty pants about lately.

He promised two brand new GPUs in 2016, which are hopefully going to both be 14nm/16nm FinFET from GlobalFoundries or TSMC and will help make Advanced Micro Devices more power and die size competitive.

AMD’s GPU architectures have gotten rather elderly, he said.

AMD also wants to increase its share in professional graphics. Apparently this is so low that any competition it brings Nvidia could significantly help its market share in this high-margin business. The company has hired Sean Burke to help drive this forward. Burke was a president at Flex and Nortek and a senior executive at Hewlett-Packard, Compaq and Dell. For those who came in late, he was the father of Dell’s Dimension and Compaq’s ProLinea.

Koduri’s cunning plan to capture consumer and professional graphics is to provide fully immersive experiences that range from education and medicine to gaming and virtual reality, with plenty of overlap in between.

He is also interested in expanding into “instinctive computing” applications which involve medicine, factory automation, automotive and security. These are computing applications that are more natural to the environment and less obvious to the user and should come as natural user experiences.

Koduri has three main attack plans. The first is to gain discrete GPU market share in 2016 and 2017, as well as win the next generation of consoles, which will be 4K. Ironically, the AMD chips in the consoles on the market at the moment can handle 4K, but they don’t.

Koduri hopes console makers will continue to stick with Radeon IP for their next-generation consoles and give Advanced Micro Devices an even bigger advantage in the gaming space.

DirectX 12 in the latest shipping version of Windows does seem to give Radeon GPUs a significant performance uplift against Nvidia, he said.

Courtesy-Fud

 

HP Inc. Releases ZBook Studio Laptop With 4K Screen

November 13, 2015 by mphillips  
Filed under Computing

The newly formed HP Inc. has announced its first product release, the ZBook Studio, a feature-packed, 15.6-in. laptop with a 4K screen.

The laptop can be configured to be as speedy as a gaming laptop, but is targeted at mobile workers.

The laptop marks the first product launched by HP Inc., which officially commenced operations last week after Hewlett-Packard split into two companies: HP Inc. and Hewlett-Packard Enterprise. More laptops, hybrids and tablets are expected to be released by HP Inc. in the coming months.

The ZBook Studio is 18 millimeters thick and weighs 1.99 kilograms (4.6 pounds). It can be configured with Nvidia Quadro graphics cards, which are aimed more at professional graphics and engineering applications.

The laptop has up to 2TB of storage capacity, but HP is selling a separate dock with a Thunderbolt 3 port, which will make it easy to add external storage drives.

Beyond the Intel Core chips, the ZBook Studio is one of the few laptops that can be configured with a Xeon server-class chip. Starting at $1,699, the laptop will ship in December.

A cheaper option would be HP’s new ZBook 15u, which starts at $1,099. It has a 1080p screen, up to 1.5TB of storage and can be configured with an AMD FirePro graphics processor, which competes with Nvidia’s Quadro.

A 4K screen can be included in HP’s ZBook 15 and 17 laptops, which have 15.6-in. and 17.3-in. screens, respectively. The ZBook 15 has up to 3TB of storage, while the ZBook 17 offers up to 4TB of storage.

The laptops have up to 64GB of memory and can be configured with Intel Xeon or Core chips. The laptops are scheduled for release in January; prices weren’t immediately available.


Will AMD Go Back To Black In 2016?

November 12, 2015 by Michael  
Filed under Computing

AMD’s EMEA component sales manager Neil Spicer is “confident” his outfit can return to profitability in 2016.

Talking to CRN (http://www.channelweb.co.uk/crn-uk/news/2433958/amd-confident-profitability-will-return), Spicer said he is sure that profitability will return as long as the company sticks to its principles.

“From a personal stance, I am confident [AMD can be profitable]. I believe we are working with exactly the right customers, and over the last few years we have become much simpler to execute and do business with.”

He said that in order to achieve profit, the company must ensure it is investing in the right areas.

“Moving forwards to 2016, we have to have profitable share growth,” he said. “So it’s choosing the right business to go after, both with the company itself and the ecosystem of partners. There is no point in us as a vendor chasing unprofitable partners.

“We want to focus [in the areas] we are good at – that’s where we are going to invest heavily. That’s things like winning the graphics battle with gaming and so forth, and we want to be part of this Windows 10 upgrade cycle.”

Spicer so far has been a little optimistic this year. He thought that Windows 10 would drive an upgrade refresh, particularly as AMD works so well with the new OS.

He also thinks that the combination of Windows 10, the advent of e-sports – competitive online gaming – and new technology and products AMD is launching, means “PC is an exciting market”.

Of course Spicer was extremely enthusiastic about Zen which he thinks will help its play in the high-end desktop space, and the server area. More cynical observers think that Zen will be AMD’s last roll of the dice.

Courtesy-Fud

 

Oracle Goes Elastic To Take On Amazon’s AWS

November 3, 2015 by Michael  
Filed under Computing

Oracle has launched a direct rival to the Amazon Web Services (AWS) public cloud with its own Elastic Compute Cloud.

The product was revealed amid a flurry of cloud-related product announcements, including five in the infrastructure-as-a-service (IaaS) space, at the OpenWorld show in San Francisco on Tuesday.

Oracle Elastic Compute Cloud adds to the Dedicated Compute service the firm launched last year. The latest service lets customers make use of elastic compute capabilities to run any workload in a shared cloud compute zone, a basic public cloud offering.

“Last year we had dedicated compute. You get a rack, it’s elastic but it’s dedicated to your needs,” said Thomas Kurian, president of Oracle Product Development.

“We’ve now added in Elastic Compute, so you can just buy a certain number of cores and it runs four different operating systems: Oracle Linux, Red Hat, Ubuntu or Windows, and elastically scale that up and down.”

Oracle has yet to release pricing details for the Elastic Compute Cloud service, but chairman and CTO Larry Ellison said on Sunday that it will be priced at or below equivalent AWS rates. For the dedicated model, Ellison revealed on Tuesday at OpenWorld that firms will pay half as much for Oracle Dedicated Compute as for the equivalent AWS shared compute option.

It is not surprising that Oracle would like the opportunity to have a piece of the public cloud pie. AWS earned its owner $2.08bn in revenue in the quarter ending 30 September.

Kurian shared current use details for the Oracle Cloud as evidence of the success it has seen so far. The firm manages 1,000PB of cloud storage, and in September alone processed 34 billion transactions on its cloud. This was a result of the 35,000 companies signed up to the Oracle Cloud, which between them account for 30 million users logging in actively each day.
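Put in per-second terms, that transaction figure works out as follows (straight division of the numbers quoted, assuming an even spread across the month):

```python
# "34 billion transactions in September" as an average rate.
SECONDS_IN_SEPTEMBER = 30 * 24 * 3600  # 30-day month

rate = 34e9 / SECONDS_IN_SEPTEMBER
print(round(rate))   # average transactions per second across the month
```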

However, Oracle’s chances of knocking Amazon off its cloud-leader perch, or even making a slight dent in its share, seem low. The AWS revenue was only made possible by the fact that Amazon owns 30 percent of the cloud infrastructure service market, with second and third-ranked Microsoft and IBM lagging behind at 10 and seven percent respectively.

Google and Salesforce have managed to capture less than five percent each. Indeed, realising how competitive the market is and Amazon’s dominant position, HP has just left the public cloud market.

Despite Oracle going head to head with AWS in the public cloud space, Amazon has been attempting to attract Oracle customers to its own platform.

“AWS and Oracle are working together to offer enterprises a number of solutions for migrating and deploying their enterprise applications on the AWS cloud. Customers can launch entire enterprise software stacks from Oracle on the AWS cloud, and they can build enterprise-grade Oracle applications using database and middleware software from Oracle,” the web giant notes on its site.

Amazon describes EC2 as letting users “increase or decrease capacity within minutes, not hours or days. You can commission one, hundreds or even thousands of server instances simultaneously”, making Oracle Elastic Compute Cloud a direct competitor.

Oracle has also added a hierarchical storage option for its archive storage cloud service, aimed at automatically moving data that requires long-term retention such as corporate records, scientific archives and cultural preservation content.

Ellison noted that this archiving service is priced at a 10th of the cost of Amazon’s S3 offering.

Kurian explained of the archive system: “I’ve got data I need to put into the cloud but I don’t need a recovery time objective. So you get it very, very cheap”, adding that it costs $1/TB per month.
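At the quoted rate, archive costs scale like this (assuming decimal terabytes, the usual cloud billing unit):

```python
# Cost arithmetic for the quoted archive price of $1 per TB per month.
PRICE_PER_TB_MONTH = 1.0

def monthly_archive_cost(petabytes):
    """Monthly bill in dollars for the given archive size."""
    terabytes = petabytes * 1000  # 1 PB = 1000 TB (decimal units)
    return terabytes * PRICE_PER_TB_MONTH

print(monthly_archive_cost(1))    # 1 PB of archive per month
print(monthly_archive_cost(250))  # the 250PB example elsewhere in the piece
```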

The firm also launched what Kurian dubbed its “lorry service” for bulk data transfer. This will see Oracle ship a storage appliance to a customer’s site, where they can do a huge data transfer directly onto that machine at a much quicker rate than streaming it to the cloud. The appliance is then sent back to Oracle via DHL or FedEx, Kurian explained, and Oracle transfers the data into its cloud storage.

“This is much faster if you’re moving a huge amount of data. One company is moving 250PB of data. To stream that amount of data to the cloud would take a very long time,” he said.
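A quick estimate shows why; assuming, say, a sustained 10Gbit/s link (the article gives no line rate, so this is purely illustrative), streaming 250PB would take years:

```python
# How long streaming a bulk archive would take over an assumed link.
def transfer_days(petabytes, gbits_per_sec):
    """Days needed to stream the data at a sustained line rate."""
    bits = petabytes * 1e15 * 8            # decimal petabytes -> bits
    seconds = bits / (gbits_per_sec * 1e9)
    return seconds / 86400

days = transfer_days(250, 10)
print(round(days))   # thousands of days, i.e. several years
```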

Bulk data transfer will be available from November, while the archive service is available now.

“You can go to shop.oracle.com as a customer, enter a credit card and you can buy the service, all the PaaS services and the storage service. We’re adding compute over the next couple of weeks,” Kurian explained.

“You pay for it by credit card or an invoice if you’re a corporate customer and pay for it by hour or month, by processor or by per gigabyte per hour or month for storage.”

Oracle Container Cloud, meanwhile, lets firms run apps in Docker containers and deploy them in the Oracle Compute Cloud, supporting better automation of app implementations using technologies like Kubernetes.

Oracle also launched additional applications that sit in its cloud, including the Data Visualisation Cloud Service. This makes visual analytics accessible to general business users who do not have access to Hadoop systems or the data warehouse.

“All you need is a spreadsheet to load your data and a browser to do the analysis,” Kurian explained.

Several new big data cloud services are also aimed at letting users more easily prepare and analyse data using Hadoop as the data store, for example Big Data Preparation and Big Data Discovery.

“With Big Data Preparation you can move data into your data lake, you can enrich the data, prepare it, do data wrangling, cleanse it and store it in the data lake. Big Data Discovery lets a business user sit in front of Hadoop, and through a browser-based dashboarding environment search the environment, discover patterns in the data, do analysis and curate subsets of the data for other teams to look at. It’s an analytic environment and complete Hadoop stack,” Kurian said.

Courtesy-TheInq

 

Will 2016 Be The Year For Virtual Reality Games?

October 15, 2015 by Michael  
Filed under Gaming

As the end of 2015 rapidly approaches (seriously, how on earth is it October already?), the picture of what we can expect from VR in 2016 is starting to look a little less fuzzy around the edges. There’s no question that next year is the Year of VR, at least in terms of mindshare. Right now it looks like no fewer than three consumer VR systems will be on the market during calendar 2016 – Oculus Rift, PlayStation VR and Valve / HTC Vive. They join Samsung’s already released Gear VR headset, although that device has hardly set the world on fire; it’s underwhelming at best and in truth, VR enthusiasts are all really waiting for one of the big three that will arrive next year.

Those fuzzy edges, though; they’re a concern, and as they come into sharper focus we’re starting to finally understand what the first year of VR is going to look like. In the past week or so, we’ve learned more about pricing for the devices – and for Microsoft’s approach, the similar but intriguingly different Hololens – and the aspect that’s brought into focus is simple; VR is going to be expensive. It’s going to be expensive enough to be very strictly limited to early adopters with a ton of disposable income. It’s quite likely going to be expensive enough that the market for software is going to struggle for the first couple of years at least, and that’s a worry.

Oculus Rift, we’ve learned, will cost “at least” $350. That’s just for the headset; you’ll also need a spectacularly powerful PC to play games in VR. No laptop will suffice, and you’re certainly out of luck with a Mac; even for many enthusiasts, the prospect of adding a major PC purchase or upgrade to a $350 headset is a hefty outlay for an early glimpse of the future. It’s likely (though as yet entirely unconfirmed) that Valve’s Vive headset will have a similar price tag and a similarly demanding minimum PC specification. The cheap end of the bunch is likely to be PlayStation VR – not because the headset will be cheap (Sony has confirmed that it is pricing it as a “platform” rather than a peripheral, suggesting a $300 or so price tag) but because the system you attach it to is a $350 PS4 rather than a much more expensive PC.

It is unreasonable, of course, to suggest that this means that people will be expected to pay upwards of $600 for Sony's solution, or $1500 for the PC-based solution. A great many people already own PS4s; quite a few own PCs capable of playing VR titles. For these people, the headset alone (and perhaps some software) is the cost of entry. That is still a pretty steep cost – enough to dissuade people with casual interest, certainly – but it's tolerable for early adopters. The large installed base of PS4s, in particular, makes Sony's offering interesting and could result in the market for PlayStation VR ramping up significantly faster than pessimistic forecasts suggest. On the PC side, things are a little more worrying – there's the prospect of a standards war between Valve and Oculus, which won't be good for consumers, and a question mark over how many enthusiasts actually own a PC powerful enough to run a VR headset reliably, though of course, the cost of PCs that can run VR will fall between now and the 2016 launch.

All the same, the crux of the matter remains that VR is going to be expensive enough – even the headsets alone – to make it into an early-adopter only market during its first year or so. It’s not just the cost, of course; the very nature of VR is going to make it into a slightly tough sell for anyone who isn’t a devoted enthusiast, and more than almost any other type of device, I think VR is going to need a pretty big public campaign to convince people to try it out and accept the concept. It’s one thing to wax lyrical about holodecks and sci-fi dreams; it’s quite another to actually get people to buy into the notion of donning a bulky headset that blocks you off from the world around you in the most anti-social way imaginable. If you’re reading a site like GamesIndustry.biz, you almost certainly get that concept innately; you may also be underestimating just how unattractive and even creepy it will seem to a large swathe of the population, and even to some of the gamer and enthusiast market VR hopes (needs!) to capture.

The multi, multi million dollar question remains, as it has been for some time – what about software? Again, Sony has something of an advantage in this area as it possesses very well regarded internal studios, superb developer relations and deep pockets; combined with its price and market penetration advantages, these ought to more than compensate for the difference in power between the PS4 and the PCs being used to power Rift and Vive, assuming (and it's a big assumption) that the PS4's solution actually works reliably and consistently with real games despite its lack of horsepower. The PC firms, on the other hand, need to rely on the excitement, goodwill and belief of developers and publishers to provide great games for VR in its early days. A handful of teams have devoted themselves to VR already and will no doubt do great things, but it's a matter of some concern that a lot of industry people you talk to about PC VR today are still talking in terms of converting their existing titles to simply work in 3D VR; that will look cool, no doubt, but a conversion lacking the attention to controls, movement and interaction that's required to make a VR world work will cause issues like motion sickness and straight-up disappointment to rear their ugly heads.

If VR is going to be priced as a system, not just a toy or a peripheral, then it needs to have software that people really, really want. Thus far, what we’ve seen are demos or half-hearted updates of old games. Even as we get close enough to consumer launches for real talk about pricing to begin, VR is still being sold off the back of science fiction dreams and long-held technological longings, not real games, real experiences, real-life usability. That desperately needs to change in the coming months.

At least Hololens, which this week revealed an eye-watering $3000 developer kit to ship early next year, has something of a roadmap in this regard; the device will no doubt be backed up by Microsoft’s own studios (an advantage it shares, perhaps to a lesser degree, with Sony) but more importantly, it’s a device not aimed solely at games, one which will in theory be able to build up a head of steam from sales to enterprise and research customers prior to making a splash in consumer markets with a more mature, less expensive proposition. I can’t help wondering why VR isn’t going down this road; why the headlong rush to get a consumer device on the market isn’t being tempered at least a little by a drive to use the obvious enterprise potential of VR to get the devices out into the wild, mature, established and affordable before pushing them towards consumers. I totally understand the enthusiasm that drives this; I just don’t entirely buy the business case.

At the very least, one would hope that if 2016 is the year of VR, it’s also the year in which we start to actually see VR in real-life applications beyond the gaming dens of monied enthusiasts. It’s a technology that’s perfectly suited to out-of-home situations; the architect who wants to give clients a walkthrough of a new building design; the museum that wants to show how a city looked in the past; the gaming arcade or entertainment venue that wants to give people an experience that most of them simply can’t have at home on their consoles. VR is something that a great many consumers will want to have access to given the right software, the right price point and crucially, the right experience and understanding of its potential. Getting the equipment into the hands of consumers at Tokyo Games Show or EGX is a start, but only a first step. If VR’s going to be a big part of the industry’s future, then come next year, VR needs to be everywhere; it needs to be unavoidable. It can’t keep running on dreams; virtual reality needs to take a step into reality.

 

Courtesy-GI.biz

Can Samsung Compete With Intel In The x86 Chip Space?

October 9, 2015 by Michael  
Filed under Computing

Samsung is not doing that well in smartphones. To be fair, no one is, but Samsung has the ability to become something much more interesting – it could replace AMD as Intel’s rival.

Actually AMD is pretty cheap right now, and if it were not for the pesky arrangement that prevents AMD's buyer from getting its x86 technology, it would have been snapped up a while ago. But with, or without AMD, Samsung could still make a good fist of chipmaking if it put its mind to it. At the moment its chipmaking efforts are one of the better things on its balance sheet.

Its high-margin semiconductor business is more than making up for the shortfall in smartphones. Selling chips to rivals would be even more lucrative if Samsung were not running its own mobile business. Its chip products are worth $11.7 billion this year, more than half the company's total.

Growing demand for chips and thin-film displays is probably the main reason that Samsung now expects operating profit to have reached $6.3 billion. After applying Samsung’s 16 percent corporate tax rate, its chip division is likely to bring in net income of slightly less than $10 billion.

To put this figure into perspective, Intel expects to earn $10.5 billion this year. Samsung is also sitting on a $48 billion net cash pile. Samsung could treat its handset and consumer electronics business as a sideline and just focus on bumping off Intel.

The two sides of such a war would be fascinating. Intel has its roots in the PC chip market which is still suffering while Samsung is based in the mobile chip market which is growing. Intel has had no luck crossing into the mobile market, but Samsung could start looking at server and PC chips.

AMD is still dying and unable to offer Intel any challenge but there is a large market for those PC users who do not want to buy Intel. What Samsung should have done is use its huge cash pile to buy its way into the PC market. It might have done so with the IBM tech which went to Lenovo. It is still not out of the running on that front. Lenovo might be happy to sell IBM tech to Samsung.

Another scenario is that it might try to buy an x86 licence from Intel. With AMD dying, Intel is sitting on a huge monopoly for PC technology. It is only a matter of time before an anti-trust suit appears. Intel might think it is worthwhile to get a reliable rival to stop those allegations taking place. Samsung would be a dangerous rival, but it would take a while before it got itself established. Intel might do well to consider it. Of course Samsung might buy AMD which could sweeten that deal for Intel.

Samsung could try adapting its mobile chip technology for the PC/server market – it has the money to do it. Then it has a huge job marketing itself as the new Intel.
It might just work.

Courtesy-Fud

 

Is Sony Dropping Morpheus?

September 21, 2015 by Michael  
Filed under Gaming

Sony has pulled back the curtains on its virtual reality headset, giving it an official introduction to the wild and a real-life name.

That name is PlayStation VR, which is an obvious but uninspired choice. The name that the unit had earlier, Morpheus, which was probably a nod towards starts-great-but-ends-badly film series The Matrix, had a bit more glamour about it.

The firm has shown off the hardware to the local journalistic crowd at the Tokyo Game Show, and provided the general press with information, details and specifications.

PlayStation VR was first discussed in March 2014 when it had the cooler name. Since then the firm has been hard at work getting something ready to announce and sell, according to a post on the PlayStation blog.

A game show in Tokyo would seem the most likely place for such an announcement.

Sony said that the system is “unique”, apparently because of a special sound system, and makes the most of the Sony PS4 and its camera. The firm is expecting the device to have a big impact on PlayStation gamers and gaming.

“The name PlayStation VR not only directly expresses an entirely new experience from PlayStation that allows players to feel as if they are physically inside the virtual world of a game, but reflects our hopes that we want our users to feel a sense of familiarity as they enjoy this amazing experience,” said Masayasu Ito, EVP and division president of PlayStation product business.

“We will continue to refine the hardware from various aspects, while working alongside third-party developers and publishers and SCE Worldwide Studios, in order to bring content that delivers exciting experiences only made possible with VR.”

Specifications are available, but they relate to a prototype and are subject to change. Sony said that the system has a 100-degree field of view, a 5.7in OLED display, a 120Hz refresh rate, and a panel resolution of 960×RGB×1080 per eye.

This will not put it at the high end of the market, as the field of view is only 10 degrees greater than with Google Cardboard, and 10 degrees under that of Oculus Rift. Some rivals go as wide as 210 degrees.
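To put those prototype figures in rough perspective, here is a hedged back-of-the-envelope sketch of angular pixel density. The PlayStation VR numbers come from Sony's published specs above; the assumption that pixels are spread evenly across the field of view is ours, and real headset optics distort the image, so treat this as a crude estimate rather than a measured value.

```python
# Rough angular resolution estimate for the PlayStation VR prototype,
# using Sony's published figures: 960 horizontal pixels per eye and a
# 100-degree field of view. Lens distortion is ignored (an assumption),
# so this is a naive back-of-the-envelope number, not a measurement.

def pixels_per_degree(h_pixels: int, fov_degrees: float) -> float:
    """Naive horizontal pixels-per-degree, ignoring lens distortion."""
    return h_pixels / fov_degrees

psvr = pixels_per_degree(960, 100)
print(f"PlayStation VR: ~{psvr:.1f} px/deg")  # ~9.6 px/deg
```

A wider field of view spreads the same panel over more degrees, which is why headsets that chase very wide FOVs need correspondingly higher panel resolutions to look as sharp.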

And no, no release date or price has been mentioned. We predict that these will be 2016 and expensive.

Courtesy-TheInq

MediaTek Goes Power Management

September 15, 2015 by Michael  
Filed under Computing

Beancounters at Digitimes Research claim that MediaTek's move to buy Richtek Technology will help the analog IC vendor push its power management (PWM) ICs into the smartphone and TV panel sectors.

Lately Richtek has been moving away from its notebook and motherboard segments and into PWM ICs for telecommunication and consumer applications. The idea was to avoid being damaged too much by the slump in the PC market.

Richtek has scored orders from Samsung Electronics for production of entry-level and mid-range smartphones and has also been ramping its PWM solutions to China’s LCD TV panel sector.

MediaTek has also announced plans to acquire LCD driver IC maker Ilitek, which means that the outfit will build up a comprehensive supply chain for the TV industry in China, Digitimes noted.

MediaTek could also use the planned 12-inch joint venture fab to be built by Powerchip Technology, which is the parent company of Ilitek, in Hefei, China.

The JV fab will provide foundry services for LCD driver ICs in 2018-2019.

Courtesy-Fud

 

Will MediaTek Buy RichTek?

September 11, 2015 by Michael  
Filed under Computing

MediaTek is planning to write a cheque for a 51 per cent stake in analogue IC maker Richtek Technology and might even buy the whole company.

The company will offer US$5.94 for each Richtek common share. After completing the tender offer and going through the relevant legal procedures, the company will move forward with taking over the remaining shares of Richtek. The follow-up acquisition of Richtek shares is expected to complete in the second quarter of 2016.

Ming-Kai Tsai, MediaTek chairman and CEO, said that Richtek is a leader in analogue ICs and provides comprehensive power management solutions to satisfy various customer demands, backed by an experienced management and R&D team.

“We believe, through the deal, the competitive edges of both companies will be leveraged to maximize the platform synergy, strengthen MediaTek in Internet of Things segment and further enhance MediaTek’s competitiveness in the fast-changing and ever-competitive global semiconductor market,” he said.

Richtek chairman Kenneth Tai claimed the two outfits were complementary in power management IP and products, which creates a leadership position in this field.

He said that by using MediaTek’s platform leadership, Richtek could optimize power management performance on the system level to enable competitive products for customers and further expand analogue IC offerings to propel the company into its next stage of growth.

Courtesy-Fud

MediaTeks Helio X30 Processor Comes To Light

September 3, 2015 by Michael  
Filed under Computing

The rumored Helio X30 is real, and if you thought the X20 was not enough to see off the Snapdragon 820, it looks like the Helio X30 has a much better chance.

The all-new deca-core Helio X20 has two A72 cores at 2.5GHz, four A53 cores at 2.0GHz and four A53 cores at 1.4GHz. CorePilot 3.0 is its smart scheduler, which decides which core gets which task.

This processor has every chance of being faster than the Snapdragon 620 from Qualcomm. The Snapdragon 620 comes with four A72 cores at 1.8GHz and four A53 cores at 1.4GHz, but we are unsure how the Helio X20 will match up against the Snapdragon 820 with its custom quad Kryo cores.

But the Helio X30 has four A72 cores at 2.5GHz, two A72 cores clocked at 2GHz, two Cortex A53 cores clocked at 1.5GHz and two low-power A53 cores at 1GHz. A senior executive from MediaTek told us that not all cores are created equal.

Despite the fact that the word "A53" on one box looks like "A53" on the other box, one is optimized for performance and the other for low power. It is unclear if the A53-based cluster from MediaTek is the same as the A53 cluster from Qualcomm.

As you can read at Fudzilla, we spent quite some time learning about the potential gains of having three clusters. The X20 can consume 30 to 40 percent less power simply by being smart about how it uses all ten cores across three clusters.

With the Helio X30 you will gain more performance, with six out of ten cores being based on the A72. Having ten cores in four clusters raises another question: how efficient will the four-cluster approach be versus the three-cluster approach?
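The power-saving idea behind multi-cluster designs – run light tasks on a slow, frugal cluster and only wake the fast, power-hungry cores when demand requires it – can be sketched as a toy scheduler. To be clear, this is not MediaTek's CorePilot algorithm; the cluster names echo the configurations described above, but the relative performance and power figures and the selection rule are illustrative assumptions only.

```python
# Toy sketch of big.LITTLE-style cluster selection, loosely inspired by
# the tri-cluster idea discussed above. NOT MediaTek's CorePilot: the
# relative performance/power numbers and the policy are assumptions.

CLUSTERS = [
    # (name, relative performance, relative power cost) -- ordered
    # from cheapest to most expensive.
    ("A53 @ 1.4GHz", 1.0, 1.0),   # low-power cluster
    ("A53 @ 2.0GHz", 1.4, 1.8),   # mid cluster
    ("A72 @ 2.5GHz", 2.5, 4.0),   # performance cluster
]

def pick_cluster(demand: float) -> str:
    """Pick the cheapest cluster whose performance meets the demand."""
    for name, perf, _power in CLUSTERS:
        if perf >= demand:
            return name
    return CLUSTERS[-1][0]  # saturate at the fastest cluster

print(pick_cluster(0.5))  # light load  -> "A53 @ 1.4GHz"
print(pick_cluster(3.0))  # heavy load  -> "A72 @ 2.5GHz"
```

The power win comes from the first branch taken: most of the time demand is low, so the frugal cluster handles it and the A72s stay dark. Adding a fourth cluster, as the X30 does, gives the policy finer steps to choose from, at the cost of more complex scheduling decisions.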

MediaTek has not officially confirmed or launched the Helio X30, but we expect that this will happen soon. The X30 should be shipping in devices in early 2016, at least that is what we would expect if it is to be placed well against the Snapdragon 820.

Courtesy-Fud