Can Nvidia Beat Google?

April 24, 2017 by  
Filed under Computing

Google’s internal benchmarks of its own TPU, or tensor processing unit, indicated that its purpose-built AI board cleaned Nvidia’s clock when it came to number crunching and power consumption.

However, this week Nvidia blogged that Google’s numbers fail to take into account how wonderful its new boards are.

Google compared its board to the older, Kepler-based, dual-GPU K80 rather than to Pascal-based GPUs.

Nvidia moaned that Google’s team released technical information about the benefits of TPUs this past week but did not compare the TPU to the current generation Pascal-based P40.

The claim that the TPU has 13x the performance of the K80 is provisionally true, but there’s a snag: that 13x figure is the geometric mean of all the various workloads combined.
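
To see why a single geometric-mean figure can hide a wide spread of results, here is a minimal Python sketch; the per-workload speedups are invented for illustration and are not Google’s numbers.

```python
# Hypothetical per-workload TPU-vs-K80 speedups (invented for illustration).
import math

speedups = [41.0, 18.5, 3.5, 1.2]
geo_mean = math.prod(speedups) ** (1 / len(speedups))
print(f"Geometric mean speedup: {geo_mean:.1f}x")
# The headline figure sits well below the best individual workload and well
# above the worst one, so it says little about any specific use case.
```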

Nvidia’s argument is that Pascal has a much higher memory bandwidth and far more resources for inference performance than K80. As a result, the P40 offers 26x more inference performance than one die of a K80.

As Extreme Tech points out, there are all sorts of things that are “unclear” about Nvidia’s claims.

For example, it is unclear if Nvidia’s claim takes Google’s tight latency caps into account. At the small batch sizes Google requires for its 8ms latency threshold, K80 utilization is just 37 percent of maximum theoretical performance. The vagueness of the claims makes it difficult to evaluate them for accuracy.

Google’s enormous lead in incremental performance per watt will be difficult to overcome. Google said that its boffins modelled the expected performance improvement of a TPU with GDDR5 instead of DDR3, with more memory bandwidth.

Scaling memory bandwidth up by 4x would improve overall performance by 3x, at the cost of ~10% more die space. So, it is saying that it can boost the TPU side of the equation as well.
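
A quick back-of-the-envelope check of that trade-off, using only the figures quoted above:

```python
# Figures quoted above: ~3x overall performance for ~10% more die area.
perf_gain = 3.0
area_cost = 1.10
print(f"Performance per unit die area: {perf_gain / area_cost:.2f}x")  # about 2.7x
```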

No one is saying that the P40 is slower than the K80, but Google’s data shows a huge advantage in TPU performance-per-watt compared with GPUs, particularly once host server power is subtracted from the equation.

Basically, a GPU has lots of hardware that a chip like Google’s TPU simply doesn’t need.

Courtesy-Fud

Bose Headphones Accused Of Spying On Users

April 21, 2017 by  
Filed under Consumer Electronics

Bose Corp spies on its wireless headphone owners by using an app that tracks the music, podcasts and other audio they listen to, and violates their privacy rights by selling such data without permission, a lawsuit charged.

The complaint filed by Kyle Zak in federal court in Chicago seeks an injunction to stop Bose’s “wholesale disregard” for the privacy of customers who download its free Bose Connect app from Apple Inc or Google Play stores to their smartphones.

“People should be uncomfortable with it,” Christopher Dore, a lawyer representing Zak, said in an interview. “People put headphones on their head because they think it’s private, but they can be giving out information they don’t want to share.”

Bose did not respond on Wednesday to requests for comment on the proposed class action case. The Framingham, Massachusetts-based company has said annual sales top $3.5 billion.

Zak’s lawsuit was the latest to accuse companies of trying to boost profit by quietly amassing customer information, and then selling it or using it to solicit more business.

After paying $350 for his QuietComfort 35 headphones, Zak said he took Bose’s suggestion to “get the most out of your headphones” by downloading its app, and providing his name, email address and headphone serial number in the process.

But the Illinois resident said he was surprised to learn that Bose sent “all available media information” from his smartphone to third parties such as Segment.io, whose website promises to collect customer data and “send it anywhere.”

Audio choices offer “an incredible amount of insight” into customers’ personalities, behavior, politics and religious views, the complaint said, citing as an example that a person who listens to Muslim prayers might “very likely” be a Muslim.

“Defendants’ conduct demonstrates a wholesale disregard for consumer privacy rights,” the complaint said.

Zak is seeking millions of dollars of damages for buyers of headphones and speakers, including QuietComfort 35, QuietControl 30, SoundLink Around-Ear Wireless Headphones II, SoundLink Color II, SoundSport Wireless and SoundSport Pulse Wireless.

He also wants a halt to the data collection, which he said violates the federal Wiretap Act and Illinois laws against eavesdropping and consumer fraud.

Dore, a partner at Edelson PC, said customers do not see the Bose app’s user service and privacy agreements when signing up, and the privacy agreement says nothing about data collection.

Edelson specializes in suing technology companies over alleged privacy violations.

NAND Prices Appear To Be Skyrocketing

April 21, 2017 by  
Filed under Computing

NAND flash prices have been inflating excessively lately, and Phison Electronics chairman Khein Seng Pua warned that prices are set to go up again in the third quarter as end-market demand surges.
He told Digitimes that while prices might decrease a little in the second quarter, chipmakers’ ongoing transition from 2D to 3D NAND memory has led to tight supply and inflated chip prices.

System OEMs are reluctant to deliver their products as the more they sell the more they lose due to soaring NAND flash costs, Pua warned.

Meanwhile, chipmakers’ supply to channel distributors has been falling short of demand, prompting the distributors to promote lower-capacity storage devices.

“Channel distributors particularly those in China have turned to promote 96GB SSDs instead of 128GB ones due to insufficient chip supply,” Pua said.

Distributors have even experienced tight supply of 8GB and 4GB eMMC devices.

Pua believes NAND flash prices will soon see a correction following excessive gains, but Apple’s new iPhone will take a lot of NAND flash off the market and push prices up again.

Chipmakers’ transition to 3D NAND memory will become smooth in general between May and June, which will help ease the supply shortages, Pua indicated.
The industry’s output of 64-layer 3D NAND will account for more than half of total output in the fourth quarter of 2017, Pua said.

Courtesy-Fud

Can AMD Go Wireless In The Virtual Reality Space?

April 20, 2017 by  
Filed under Computing

You might have seen that we’ve written about millimeter waves several times, usually in the context of 5G. AMD has just acquired Nitero, a millimeter wave company that wants to use this technology to cut the cord on your VR and AR headset.

AMD has figured out that cables are a very limiting factor in virtual reality and augmented reality. This is not a big secret: even if you have only had a few minutes to play with a headset, you quickly realize that making things wireless would be more comfortable.

The acquisition provides AMD with a broader portfolio of IP capable of enabling VR headset and solution providers with key technology required to create more immersive computing experiences.

Mark Papermaster, AMD chief technology officer and senior vice president said:

“Unwieldy headset cables remain a significant barrier to drive widespread adoption of VR. Our newly acquired wireless VR technology is focused on solving this challenge, and is another example of AMD making long-term technology investments to develop high-performance computing and graphics technologies that can create more immersive computing experiences.”

Nitero has designed a phased-array beamforming millimeter wave chip to address the challenges facing wireless VR and AR. This is the same frequency that Intel and Qualcomm will use for Wi-Gig. This enables very fast speeds within a room, but due to its high frequency the signal won’t really penetrate any walls.

This is not that important for the VR and AR markets as we don’t see a case where you need to leave an office or a room with the VR / AR headset on.

The 60GHz technology has the potential to enable multi-gigabit transmit performance with low latency in room-scale VR environments. It will rely heavily on the beamforming characteristics to solve the requirement for line-of-sight associated with traditional high-frequency mm-wave systems. The main goal is potentially eliminating wired VR headsets and letting users to become more easily immersed in virtual and augmented worlds.
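
A rough way to see why 60GHz stays in the room is the textbook free-space path loss formula; the sketch below (distances chosen arbitrarily) compares 60GHz against a 5GHz Wi-Fi band before any wall attenuation is even counted.

```python
# Free-space path loss: FSPL(dB) = 20*log10(4*pi*d*f / c)
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    c = 3e8  # speed of light in m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

for freq in (5e9, 60e9):
    print(f"{freq / 1e9:.0f} GHz over 5 m: {fspl_db(5, freq):.1f} dB")

# 60GHz loses about 21.6 dB more than 5GHz over the same distance
# (20*log10(60/5)), which is why it favors line-of-sight, in-room links.
```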

Nitero co-founder and CEO Pat Kelly said:

“Our world class engineering team has been focused on solving the difficult problem of building wireless VR technologies that can be integrated into next-generation headsets. We are excited to play a role in furthering AMD’s long-term technology vision.”

Kelly joined AMD as corporate vice president, Wireless IP, highlighting the importance of the acquisition and the potential of the technology. Fudzilla calls this a step in the right direction.

Courtesy-Fud

AMD Goes Custom Power With Ryzen

April 19, 2017 by  
Filed under Computing

AMD has released a new custom “balanced” power plan for those using a Ryzen CPU on Windows 10.

Until today, AMD Ryzen CPU users were limited to using the “high performance” plan in Windows 10, at least if they wanted to get the most performance out of their Ryzen CPU. Now, AMD has released a tweaked “balanced” power plan that should provide a compromise between performance and power efficiency, one that “automatically balances performance with energy consumption on capable hardware”.

According to the explanation posted by AMD’s Robert Hallock, the new power plan reduces the timers and thresholds for P-state transitions in order to improve clockspeed ramping, and disables core parking for “more wakeful cores”.

These tweaks are apparently enough for the new plan to provide similar performance to Microsoft’s “high performance” power plan, at least according to AMD’s own slides. As far as power is concerned, the new balanced plan does not change how the processor handles low-power idle states, so you’ll basically get the additional performance without compromising power efficiency.

The new balanced plan is quite simple to install, and you can find both the download link and further explanation over at AMD’s community blog. AMD will also include the final power plan with the next AMD chipset drivers for Ryzen CPUs.

Courtesy-Fud

Is AMD’s Ryzen 3 Coming In The 2H Of 2017?

April 18, 2017 by  
Filed under Computing

After launching the Ryzen 7 CPU lineup, AMD will launch its mainstream Ryzen 5 lineup in just under a week, but today we have additional information about an entry-level Ryzen 3 SKU, the Ryzen 3 1200.

Scheduled to launch sometime in the second half of this year, the Ryzen 3 lineup will compete against Intel’s dual-core Core i3 lineup. It is still not clear if AMD will include dual-core SKUs in its Ryzen 3 lineup, but most likely all will be quad-core SKUs, with and without SMT enabled. Earlier rumors also suggested that there will be a Ryzen 3 1200X SKU that should be similar but with support for XFR (eXtended Frequency Range) technology, which may give it a further overclocking boost.

According to details leaked by ASRock’s support page and originally spotted by Computerbase.de, the Ryzen 3 1200 SKU runs at a 3.1GHz base frequency (most likely with a 3.4GHz turbo) and has a 65W TDP.

Courtesy-Fud

T-Mobile And Dish Networks Score Most In FCC’s Wireless Spectrum Bids

April 17, 2017 by  
Filed under Mobile

T-Mobile US Inc bid $8 billion and Dish Network Corp $6.2 billion to acquire the bulk of the broadcast airwaves spectrum for sale in a government auction, according to a statement from the U.S. Federal Communications Commission.

The two carriers accounted for most of the $19.8 billion in winning bids, the FCC said. Comcast Corp agreed to acquire $1.7 billion in spectrum, AT&T Inc bid $910 million and investment firm Columbia Capital offered $1 billion.

The FCC said 175 broadcast stations were selling airwaves to 50 wireless and other telecommunications companies. Companies plan to use the spectrum to build new networks or improve existing coverage.

The spectrum auction’s end is widely expected to kick off a wave of deal-making in the telecom industry. Until now, companies participating in the auction have been restrained by a quiet period, but that will end after April 27, when down payments are due from auction winners.

T-Mobile said its $8 billion winning bid would enable it “to compete in every single corner of the country.” The company, controlled by Deutsche Telekom AG, said the investment will quadruple its low-band spectrum holdings.

Verizon Communications Inc and Sprint Corp opted not to bid.

“What is most interesting to us was (Verizon) was nowhere to be found,” Jennifer Fritzsche, an analyst at Wells Fargo, said in a research note, adding that “we continue to believe Verizon’s interests lay in the higher band spectrum assets.”

Craig Moffett, an analyst at MoffettNathanson, said in an email that there were three surprises in the results: “Comcast bought less than expected, Dish Network bought more, and Verizon bought nothing at all.”

Moffett said Dish’s spectrum spending underscored “the growing importance of the company’s valuation as it relates to their spectrum holdings.”

Comcast sold spectrum from three of its NBCUniversal owned stations in New York, Philadelphia and Chicago for $481.6 million.

The FCC also announced new channel assignments for 957 non-winning stations that must change channels to clear the new wireless airwaves for use.

Of the $19.8 billion bid, more than $7 billion will go to reduce the U.S. deficit and $10.05 billion to broadcasters relinquishing spectrum. Up to $1.75 billion will go to broadcasters that incur costs in changing channels.

Sellers had initially sought $86.4 billion for 126 megahertz. Many analysts had expected broadcasters to earn more and sell more spectrum.

Was Apple Really Selling Bricked Phones?

April 17, 2017 by  
Filed under Mobile

Australian users have a bit of a DIY mentality – like New Zealanders they can’t see the point of paying a fortune for something that they can get a mate to fix cheaper. Normally they would only take a device in to Apple if the problem cannot be fixed with masking tape and number eight fencing wire. Apple has a huge problem with this. It makes a fortune charging fees to have its spotty blue shirts repair things that most users could fix with a screwdriver and WD40.

According to the Australian Competition and Consumer Commission, Apple thought it would be a rather super, cool, and revolutionary thing to brick iPhones which had not been repaired by its Genii. That way users would have to return the phone to Apple to be fixed.

Australia’s consumer watchdog has sued Apple, claiming that the bricking happened in a software update that disabled iPhones whose cracked screens had been fixed by third parties, and that Apple then refused to unlock them on the grounds that customers had had the devices serviced by non-Apple repairers.

The Australian Competition and Consumer Commission told the court that consumer guarantee rights under the Australian Consumer Law exist independently of any manufacturer’s warranty and are not extinguished simply because a consumer has goods repaired by a third party.

Of course Apple is not saying anything. We have no doubt that its acolytes really believe that they are saving customers’ souls from the dangers of cheap repairs. Everyone knows that the phones don’t really belong to the users but are given in sacred trust, for large amounts of cash, on the assumption that they will never be touched without the blessing of the church.

The regulator said that between September 2014 and February 2016, Apple customers who downloaded software updates then connected their devices to their computers received a message saying the device “could not be restored and the device had stopped functioning”.

Apple engaged in “misleading or deceptive conduct and made false or misleading representations to consumers” about its software updates and customers’ rights to have their products repaired by the company, the commission said.

As well as fines, the ACCC said it was seeking injunctions, declarations, compliance program orders, corrective notices, and costs.

Courtesy-Fud

Samsung’s Bixby Digital Voice Assistant Being Delayed

April 14, 2017 by  
Filed under Mobile

Samsung’s highly promoted Bixby voice assistant will not be available for the Galaxy S8 smartphone on April 21, as previously announced.

The company released a statement that said Bixby will be available in the U.S. on the Galaxy S8 “later in the spring.” Samsung didn’t explain the delay.

Bixby will join a pack of artificial intelligence assistants, including Amazon’s Alexa, Apple’s Siri and the Google Assistant, that are changing the way people interact with their devices.

Some U.S.-based reviewers and analysts had noticed that the Bixby feature wasn’t fully demonstrated when the S8 was announced March 29.

Also, some news reports said Bixby encountered voice recognition problems in English compared to its performance with the Korean language.

The shipment delay applies only to the voice feature in Bixby; Samsung said other key features, like Vision, Home and Reminder, will be available in the global launch of the Galaxy S8 on April 21.

Samsung went out of its way to promote Bixby well in advance of the Galaxy S8 launch. It was announced in a blog on March 20, nine days before the phone’s launch, by Injong Rhee, executive vice president of software and services for Samsung Electronics.

Rhee pointed out a physical button on the side of the phone that would activate Bixby, differentiating it from Alexa, Siri and others that are activated by a spoken trigger word. Bixby would offer a “deeper experience” than some others, including support for touch commands. Also, Bixby is designed to know the current state of an app, allowing users to carry out work in progress without further explanation. Rhee said the Bixby interface is “much more natural and easier to use.”

Bixby was already two years behind those digital assistants as well as Google Assistant, analysts said. “Bixby is going to be playing catch up,” said Gartner analyst Werner Goertz in March.

One analyst forgave the Bixby delay. “I commend Samsung for trying to get it right rather than just launching and hoping for the best,” said Jack Gold, an analyst at J. Gold Associates.

“It’s never a good idea to put out less than great software on a consumer device. So in this case, if Samsung can delay a few weeks and get a better product, it makes sense to do so. That said, voice recognition generally is not all that easy to do. It’s not just the recognition software itself, but the whole voice chain that has to be tailored. That includes everything from the microphone through the audio channel on the phone to the recognition algorithms and the user interface. If they tested and it wasn’t at their expected level of accuracy, then it’s better to get it right than to get it out fast.”

Study Reveals Cyber Attacks Have Cost Company Shareholders Billions

April 13, 2017 by  
Filed under Computing

Cyber security breaches permanently diminish businesses’ share prices, with financial firms the worst hit, a study issued by IT consultancy CGI and Oxford Economics has revealed.

Severe cyber security breaches, such as those with legal or regulatory consequences, those involving the loss of hundreds of thousands of records, or those that hurt the firm’s brand, caused share prices to fall by an average of 1.8 percent on a permanent basis, the analysis of 65 companies affected globally since 2013 found.

Investors in a typical FTSE 100 firm would be worse off by an average of £120 million after such a breach, the report said. Overall, the cost to shareholders of these 65 companies would be in excess of 42 billion pounds ($52.4 billion).

CGI’s analysis compared each company’s share price against a cohort of similar companies to isolate the impact of cyber breaches from other market movements, during incidents detailed in a breach index compiled by Dutch security firm Gemalto.
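
As a rough illustration of that method, the sketch below (pandas, with invented returns) subtracts a peer-group average from a breached firm’s daily returns to estimate the breach’s own contribution.

```python
import pandas as pd

# Invented daily returns (%) around a breach; not data from the CGI study.
returns = pd.DataFrame({
    "breached_firm": [0.4, -2.6, -0.9, 0.3],
    "peer_a":        [0.5,  0.2, -0.3, -0.4],
    "peer_b":        [0.3, -0.1,  0.0, -0.2],
})

peer_avg = returns[["peer_a", "peer_b"]].mean(axis=1)
abnormal = returns["breached_firm"] - peer_avg
print(f"Cumulative abnormal return: {abnormal.sum():.1f}%")
# A negative cumulative abnormal return means the firm underperformed its
# cohort over the window, isolating the breach from broader market moves.
```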

Two-thirds of firms had their share price adversely impacted after suffering a cyber breach. Financial firms were the worst affected, followed closely by communications firms.

“Financial services experience the greatest burden in terms of impact, reflecting the high levels of regulation, the importance of customer confidence and the potential for financial fraud to be a facet of the breach,” the report said.

Those least affected were retail, hospitality and travel companies.

Hacking attacks and other cyber security breaches have impacted companies across the world in recent years, from retailer Target in the United States in 2013 to British communications firm TalkTalk in 2015.

Hacked Dallas Emergency Sirens Add Extra Encryption

April 13, 2017 by  
Filed under Around The Net

Dallas city officials have added extra encryption and other security measures to the outdoor warning sirens that were hacked last week.

The hack also prompted the city to evaluate critical systems for potential vulnerabilities, City Manager T.C. Broadnax said in a statement late Monday. City officials are reviewing security for financial systems, a flood warning system, police-fire dispatch and the 911/311 system.

Broadnax told reporters separately on Monday that the hack came over a radio frequency and not over a wired computer network. The attack was “not a system software issue; it was a radio issue,” he told the Dallas Observer and others.

The city believes the hack came from the Dallas area, but officials haven’t detailed how it occurred. Dallas police are working with the FBI and the Federal Communications Commission (FCC) to validate what they think happened and find the source. The hack caused all 156 emergency sirens to activate for about 90 minutes, scaring some residents and doubling the number of calls to 911.

Radio security experts theorized the incident may have been a simple “replay attack” where the hacker recorded the radio signal sent out on April 5 at noon as part of a monthly test of the emergency siren system. Then, the hacker could have played that signal back repeatedly early Saturday. It would take a hacker with a software defined radio (SDR) or other off-the-shelf radio frequency test equipment to pull off the attack, said Chris Risley, CEO of Bastille Networks, a company that remediates radio frequency vulnerabilities.

Frequencies used for outdoor sirens are public and are managed by the FCC. Various security techniques, including encryption, are used to protect signals sent by radio.
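
One common defensive technique is to authenticate each command with a shared key and reject stale messages; the Python sketch below is a generic illustration of that idea (an HMAC plus a timestamp), not a description of the protocol Dallas or any siren vendor actually uses.

```python
import hashlib
import hmac
import time

SECRET_KEY = b"key-known-only-to-the-siren-controller"  # hypothetical shared key
MAX_AGE_S = 30  # reject commands older than 30 seconds

def sign(command: str) -> bytes:
    """Append a timestamp and an HMAC tag to the command."""
    msg = f"{command}|{int(time.time())}".encode()
    tag = hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest().encode()
    return msg + b"|" + tag

def verify(packet: bytes) -> bool:
    """Accept only authentic, recent packets; replayed recordings fail the age check."""
    msg, _, tag = packet.rpartition(b"|")
    expected = hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(tag, expected):
        return False  # forged or corrupted packet
    timestamp = int(msg.rsplit(b"|", 1)[1])
    return time.time() - timestamp <= MAX_AGE_S

packet = sign("ACTIVATE_SIRENS")
print(verify(packet))  # True: fresh, authentic command
# A packet recorded during a monthly test and replayed weeks later would
# carry an old timestamp and be rejected.
```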

Even if a “replay attack” was not used, the regularly scheduled siren test would allow an attacker to make multiple recordings of the “activate sirens” radio stream over several months and then analyze it for specific commands to trigger the alert, he added. SDRs are becoming cheaper and more capable and there is an abundance of open source software that can decode activation protocols.

Risley said other cities are probably just as vulnerable as Dallas.

The Dallas incident highlights how vulnerable and unprotected U.S. enterprises and government authorities are, said Matt Little, chief product officer for encryption provider PKWare. “Traditional security perimeters are breaking down. This attack reaffirms how necessary encryption is,” he said.

Many siren systems are decades old and Dallas may have been relying on low-level encryption, perhaps even 64-bit encryption based on the Data Encryption Standard (DES) from the late 1970s, he said.

“Sirens are analogous to a lot of aging critical infrastructure that was built for high availability, and always has to be online, so security took a back seat to that,” Little said.

Dallas may have decided after the hack to upgrade encryption or improve the authentication system regarding who gets access to encryption keys, Little said.

Are Smartphones On The Way Out?

April 13, 2017 by  
Filed under Mobile

This is not an extremely late April 1st joke, and we admit that it is a little early given that the smartphone’s replacement has not shown up yet, but we predict that it will go the way of the dodo, the Norwegian Blue, the bleeper and the Crackberry.

OK, it is probably a few years off, but the technology is so pervasive that its death will be more drawn out than the exit of a hero in a South American soap opera.

For a while now smartphone sales have slowed. Basically, the structure developed by Nokia, stolen by Apple and copied everywhere has run out of places to go. There is no more innovation in smartphones, despite what is claimed, particularly by the Tame Apple Press. Chip speeds have increased slightly and are about as fast as they are going to get. Even if someone gets a chip to the speed of a PC, it is not going to make a hell of a lot of difference.

What is coming next is being sorted out by the likes of Microsoft, Facebook, Amazon, Google along with Elon Musk. Apple of course is waiting for the next biggest thing to be developed by others before it takes a risk.

So what will get rid of it? The Tame Apple Press thinks it will be something more like the Amazon Echo, Sony PlayStation VR or the smartwatch, but that is mostly because that is pretty much Apple’s current agenda.

No doubt AR and VR could be the way it is going: some sort of interface which projects detailed 3D images straight into your eyes while you interact with your environment. So instead of typing this on a screen, I would be typing it on a nice ergonomic bit of rubber while the words appear before my eyes. A more portable version would project a keyboard onto any surface.

Microsoft thinks that is the way things will go and that the tech will replace the smartphone, the TV, sex, and anything else with a screen, with sound going in through headphones.

As artificial intelligence systems like Apple’s Siri, Amazon’s Alexa, Samsung’s Bixby, and Microsoft’s Cortana get smarter, there is going to be a rise not just in talking to computers, but having them talk back.

All this makes the smartphone redundant and limited. Sure it will be a good decade before this brave new world takes off and it will be a slow slide rather than anything great, but we are seeing the change start happening now. The world is bored with smartphones and they are just not having the impact they used to.

Courtesy-Fud

Will The A.I. Space Heat-Up This Year?

April 13, 2017 by  
Filed under Computing

Major component manufacturers in the artificial intelligence (AI) market have all increased their efforts to develop more aggressive processors for AI-fueled markets in 2017, including autonomous vehicles, enterprise drones, medical care, smart factories, image recognition, and general neural network research and development.

Intel’s Nervana platform is a $400 million investment in AI

Back in November, Intel announced what it claims is a comprehensive AI platform for data center and compute applications called Nervana, with its focus aimed directly at taking on Nvidia’s GPU solutions for enterprise users. The platform is the result of the chipmaker’s $400 million acquisition in August of Nervana Systems, a 48-person startup led by former Qualcomm researcher Naveen Rao. Built using FPGA technology and designed for highly-optimized AI solutions, Intel claims Nervana will deliver up to a 100-fold reduction in the time it takes to train a deep learning model within the next three years.

The company intends to integrate Nervana technology into its Xeon and Xeon Phi processor lineups. During Q1, it will test the Nervana Engine chip, codenamed “Lake Crest,” and make it available to key customers later in the year. The chip will be specifically optimized for neural networks to deliver the highest performance for deep learning, with unprecedented compute density and a high-bandwidth interconnect.

For its Xeon Phi processor lineup, the company says that its next generation series codenamed “Knights Mill” is expected to deliver up to four times better performance for deep learning. Intel has also announced a preliminary version of Skylake-based Intel Xeon processors with support for AVX-512 instructions to significantly boost performance of inference tasks in machine learning workloads.
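
For context, inference on a CPU largely boils down to dense matrix math; the toy forward pass below (NumPy) shows the kind of float32 arithmetic that wider vector units such as AVX-512 are meant to push through faster. It is an illustration of the workload, not Intel code.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal((1024, 1024), dtype=np.float32)
activations = rng.standard_normal((1, 1024), dtype=np.float32)

# One fully connected layer's forward pass: a matrix product plus a ReLU.
# Wider SIMD units process more of these float32 lanes per instruction.
output = np.maximum(activations @ weights, 0.0)
print(output.shape)  # (1, 1024)
```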

“Intel is committed to AI and is making major investments in technology and developer resources to advance AI for business and society,” said Intel CEO Brian Krzanich.

Nvidia partners with Microsoft on AI cloud computing platform

Earlier last month, Nvidia showed no signs of slowing its AI cloud computing efforts by announcing a partnership with Microsoft for a hyperscale GPU accelerator called HGX-1. The partnership includes integration with Microsoft Project Olympus, an open, modular, very flexible hyperscale cloud hardware platform that includes a universal motherboard design (1U/2U chassis), high-density storage expansion, and a broad ecosystem of compliant hardware products developed by the OCP community.

Nvidia claims that HGX-1 establishes an industry standard for cloud-based AI workloads similar to what the ATX form factor did for PC motherboards more than two decades ago. The HGX-1 is powered by eight Tesla P100 accelerators connected through NVLink and the PCI-E standard. Nvidia’s hyperscale GPU accelerator will, it claims, allow cloud service providers to easily adopt Tesla and Quadro accelerator cards to meet the surging demand for AI computing. The company plans to host another GPU technology conference in May 2017 where it is expected to unveil more updates on its AI plans.

On the consumer front, Nvidia’s Shield platform integrates with Amazon Echo, Nest and Ring to provide customers with a “connected home experience”, while Spot is its direct take on Amazon Echo and brings ambient AI assistance into the living room. For drivers, the company’s latest AI car supercomputer is called Xavier and is powered by an 8-core custom ARM64 processor and a 512-core Volta-based GPU. The unit is designed to the ASIL D safety rating, the most stringent automotive safety integrity level, and can deliver 30 tera-ops of deep learning performance in a 30W design.

Qualcomm’s acquisition of NXP signals investment in AI market

Back in October, San Diego-based Qualcomm bought NXP, the leader in high-performance, mixed-signal semiconductor electronics – and a leading solutions supplier to the automotive industry – for $47 billion. The two companies, joined into a single entity, now represent what is considered a strong contender in the automotive, IoT, and security and networking industries. With several automotive safety sensor IPs from the acquisition, including radar microcontroller units (MCUs), anti-lock braking systems, MCU-enabled airbags and real-time tire pressure monitoring, Qualcomm is now positioned to be a “one-stop solution” for many automotive customers.

With its Snapdragon and Adreno graphics capabilities, the company is well positioned to compete with Nvidia in the automotive market and stands a much better chance of developing its self-driving car platform with the help of NXP and Freescale IP in its product portfolios.

AMD targets AI learning workloads with Radeon Instinct accelerators

Back in December, AMD also announced its strategy to dramatically push its presence into the AI-related server computing business with the launch of new Radeon Instinct accelerator cards and MIOpen, a free, comprehensive open-source library for developing deep learning and machine intelligence applications.

The company’s Radeon Instinct series is expected to be a general-purpose solution for developing AI-based applications using deep learning frameworks. This includes autonomous vehicles, HPC, nano-robots, personal assistance, autopilot drones, telemedicine, smart home, financial services and energy, among other sectors. Some analysts note, however, that AMD is the only company uniquely positioned with both x86 and GPU processor technologies, allowing it to fulfill the role of meeting a variety of data center solutions on demand. The company has been developing what it claims is an efficient connection between both application processor types to meet the growing needs of AI’s technological demands.

The MIOpen deep learning library was expected to be released in Q1, but may have been delayed by a couple of weeks. Meanwhile, AMD’s Radeon Open Compute (ROCm) platform lets programmers focus on training neural networks through Caffe, Torch 7 and TensorFlow, rather than wasting time on mundane, low-level performance tuning tasks.
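
The point of that framework support is that training code stays at this level of abstraction regardless of whether CUDA or ROCm sits underneath; a minimal TensorFlow/Keras example with synthetic data (not an AMD sample) looks like this:

```python
import numpy as np
import tensorflow as tf

# Synthetic data: label is 1 when the feature sum exceeds half the feature count.
x = np.random.rand(256, 32).astype("float32")
y = (x.sum(axis=1) > 16).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(32,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=3, batch_size=32, verbose=0)

# The framework dispatches the same model to whichever backend is installed;
# no device-specific tuning appears in user code.
```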

Courtesy-Fud

FCC Won’t Support Move To Allow In-flight Mobile Calls

April 12, 2017 by  
Filed under Mobile

The chairman of the Federal Communications Commission has announced that he is proposing to pull the plug on a 2013 regulatory proceeding that had sought to lift the ban on mobile phones on U.S. airlines.

The FCC said in 2013 that it would consider allowing air travelers to make mobile phone calls but never finalized it.

“I stand with airline pilots, flight attendants, and America’s flying public against the FCC’s ill-conceived 2013 plan to allow people to make cellphone calls on planes. I do not believe that moving forward with this plan is in the public interest,” FCC Chairman Ajit Pai said in a statement.

Pai needs the backing of the two other commissioners for the 2013 proposal to be formally abandoned.

In 2013, the FCC said special equipment could be installed on planes to allow in-flight calls and said it had already been deployed successfully in other countries without incident.

The FCC under then chairman Tom Wheeler said there were “no technical reasons to prohibit such technology to operate” but proposed leaving it to airlines whether to allow mobile phone calls.

In 2013, Pai said Wheeler’s proposal would likely require airlines to become commercial mobile radio station carriers to offer in-flight calling.

The U.S. Transportation Department said in December that it was considering allowing passengers to make in-flight calls using Wi-Fi. The agency also sought comment on whether it should ban all voice calls on all U.S. flights.

Last month, major airlines said the Trump administration should delay action on the in-air mobile call proposal as it reviews other pending regulatory issues.

Are Big Changes In Store For DDR5 In 2018?

April 12, 2017 by  
Filed under Computing

Last week, the JEDEC Solid State Technology Association announced that it is developing the standard for the widely anticipated DDR5 memory, which is expected to arrive in June 2018, along with new hybrid DIMM technologies such as NVDIMM-P, which is intended to give servers the ability to retain RAM data between reboots.

DDR5 to use 3D chip stacking with TSVs

As with any new forecasted memory standard, the association says that DDR5 (Double Data Rate 5) memory will offer improved performance with greater power efficiency as compared to previous generation DRAM technologies. DDR5 will provide double the bandwidth and density of DDR4 along with improved channel efficiency, making it ideal for high performance combined with improved power management and cost savings.

DDR5 is expected to become the industry’s first DIMM approach to include 3D chip stacking using through-silicon vias (TSVs), similar to what Toshiba has been doing with NAND flash since 2015. Since TSVs can be placed anywhere on the chip rather than just at the edge, it is easier to implement a wide data bus with higher performance and lower power thanks to shorter signal distances.

NVDIMM-P: Combining NVM or NAND flash with DRAM memory space

Mian Quddus, Chairman of the JEDEC Board of Directors, said that increasing server performance requirements are driving the need for more advanced technologies, and the standardization of next generation memory such as DDR5 and the new generation persistent modules NVDIMM-P will be essential to fulfilling those needs.

The organization has announced that it is also working on a standard for non-volatile hybrid memory called NVDIMM-P, or “Non-Volatile Dual In-line Memory Module, Persistent,” which would basically map DRAM and NAND to the same memory space. The proposed standard effectively provides both byte-level and block-level access.
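
As a loose analogy for the byte- versus block-level distinction (not the JEDEC interface itself), the sketch below uses mmap for byte-addressable updates to a file that was written in whole blocks:

```python
import mmap

PAGE = 4096

# Block-style access: write a whole page at once.
with open("scratch.bin", "wb") as f:
    f.write(b"\x00" * PAGE)

# Byte-style access: flip a single byte in place without rewriting the block.
with open("scratch.bin", "r+b") as f:
    with mmap.mmap(f.fileno(), PAGE) as m:
        m[128] = 0xFF
        print(m[127:130])  # b'\x00\xff\x00'
```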

Courtesy-Fud
