Smartwatch Shipments Forecasted To Surge In 2015

April 1, 2015 by mphillips  
Filed under Consumer Electronics

Smartwatch shipments will swell by more than 500% this year, fueled by interest in the coming Apple Watch and its impact on other smartwatches already on the market, according to market research firm IDC.

IDC also lowered its Apple Watch forecast to 15.9 million shipments in 2015, down from 22 million, a 28% reduction. That’s partly because more details have surfaced and the sales date of April 24 was later than IDC had previously expected, according to IDC analyst Ryan Reith in an email.

Even with that reduction, the Apple Watch will account for 62% of the smartwatch market in 2015, Reith said. IDC had expected the 22 million in Apple Watch shipments for this year in a forecast conducted last August, well before the shipment date and other details were announced at Apple’s Spring Forward event on March 9, Reith said.

A February forecast by CCS Insight put potential Apple Watch sales at 20 million in 2015. Apple Watch goes on sale April 24, starting at $349, but some gold-encased models will start at $10,000. Apple will begin taking pre-orders April 10.

IDC grouped the Apple Watch with Motorola’s Moto 360, several Samsung Gear watches and others under a category it calls smart wearables, or devices capable of running third-party apps. For all of 2015, IDC said 25.7 million smart wearables will ship, up 511% from the 4.2 million shipped in 2014.
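A quick check of those figures, using the numbers quoted above:

```python
# Sanity-check the IDC figures quoted above
shipped_2014 = 4.2    # million smart wearables shipped in 2014
forecast_2015 = 25.7  # million forecast for 2015
apple_watch = 15.9    # million Apple Watch shipments forecast for 2015

growth = (forecast_2015 - shipped_2014) / shipped_2014 * 100
share = apple_watch / forecast_2015 * 100
print(f"growth: {growth:.0f}%, Apple Watch share: {share:.0f}%")
# ~512% growth (IDC rounds to 511%) and ~62% share, matching the article
```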

The IDC smartwatch forecast is much lower than a recent prediction by Gartner, which says 40 million smartwatches will ship in 2015, up from about 5 million in 2014.

Will AMD Answer Nvidia’s Pascal With Greenland?

April 1, 2015 by Michael  
Filed under Computing

The world is still awaiting the arrival of the High Bandwidth Memory (HBM) equipped Fiji GPU, and the first cards based on the new chip should launch in late June, or the end of Q2 2015 if you prefer.

AMD is looking ahead, though, and its engineers are working hard on the company’s next-generation HBM card, currently codenamed Greenland. We are not sure if this is the name of the whole generation or simply a single HBM-backed GPU that will also end up in APUs.

Like we said, we doubt that Fiji will actually launch on the Pacific island of Fiji or that the Greenland launch event will be held in Greenland (a Danish territory), but we can confirm that the Greenland GPU will use HBM memory. There is still no confirmation of the manufacturing process, but we would expect Greenland to end up on either GlobalFoundries’ 14nm process or TSMC’s 16nm process. Greenland will be a part of AMD’s next-generation K12 APU, which means that this multiple-Zen-core APU will get some great graphics performance. It is not clear whether Greenland is part of the Caribbean Islands (Fiji) generation or belongs to a successor generation.

At this time we cannot confirm (or deny) whether Greenland will launch as a desktop card too, and we can only speculate that Greenland is a shrunk derivative of the Fiji-generation architecture.

Nvidia’s first HBM card, based on the Pascal architecture, is coming by early 2016. Pascal will use the 2.5D HBM approach and probably HBM 2 memory, and we expect that AMD’s Fiji successor will use HBM 2 as well.

Details are limited, apart from the fact that Greenland can end up in a next-generation APU such as K12, making the architecture quite scalable. High Bandwidth Memory combined with new K12 cores might create the fastest integrated product of all time, and let’s not forget that AMD is putting a lot of emphasis on Heterogeneous System Architecture (HSA) and the compute side of things. With the help of an HBM-powered Greenland that could end up with 500GB/s of bandwidth, along with multiple 64-bit Zen CPU cores, you can expect quite a lot of compute performance from this new integrated chip.
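As a rough sanity check on that 500GB/s figure, here is the arithmetic for a Fiji-style configuration of four first-generation HBM stacks (our assumption; the article does not specify Greenland’s memory configuration):

```python
# Back-of-the-envelope HBM bandwidth (assumed first-gen HBM figures)
stacks = 4             # Fiji-style four-stack layout (assumption)
bus_width_bits = 1024  # interface width per HBM1 stack
pin_speed_gbps = 1.0   # data rate per pin for HBM1

per_stack_gbs = bus_width_bits * pin_speed_gbps / 8  # 128 GB/s per stack
total_gbs = stacks * per_stack_gbs                   # 512 GB/s
print(per_stack_gbs, total_gbs)  # in line with the ~500GB/s quoted above
```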

Courtesy-Fud

Will Intel Buy Altera?

April 1, 2015 by Michael  
Filed under Computing

Shares in Intel have surged after the news leaked that it is trying to buy fellow chipmaker Altera for a cool $10 billion.

If it goes ahead it will be Intel’s biggest purchase ever and the latest merger in the quickly consolidating semiconductor sector.

For those who came in late: Altera makes programmable chips (FPGAs) widely used in mobile phone towers, the military and other industrial applications. Altera’s value to Intel lies in those programmable chips, which are increasingly being used in data centres, where they are customized for specialized functions such as serving web-search results or updating social networks.

It is seen as part of Intel Chief Executive Officer Brian Krzanich’s grand plan to seek out new markets and new technologies, and to boldly go where no Intel has gone before.

Earlier this month, Intel slashed nearly $1 billion from its first-quarter revenue forecast to $12.8 billion, plus or minus $300 million, as small businesses put off upgrading their personal computers.

Altera is one of the few semiconductor companies with better gross margins than Intel, and it derives about two-thirds of its revenue from telecom, wireless and military/aerospace customers.

The story has yet to be confirmed by anyone other than the business press which probably means it is true.

Intel’s biggest deal to date was the $7.7 billion purchase of security software maker McAfee in 2011.

Courtesy-Fud

Chat Tool Slack Admits To Being Hacked

March 31, 2015 by mphillips  
Filed under Around The Net

The popular group chat tool Slack had its central database hacked in February, according to the company, potentially compromising users’ profile information like log-on data, email addresses and phone numbers.

The database also holds any additional information users may have added to their profiles like their Skype IDs.

The passwords were not stored in plaintext; they were hashed, using bcrypt with a randomly generated salt for each password, Slack Technologies said in a blog post. There was no indication the hackers were able to recover passwords from the hashes, and no financial or payment information was accessed or compromised, the company said.
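For readers unfamiliar with salted hashing, here is a minimal sketch of the approach using the Python bcrypt library (illustrative only; Slack’s implementation details beyond the blog post are not public):

```python
import bcrypt

# At signup: gensalt() produces a random salt that is embedded in the
# resulting hash, so identical passwords yield different stored values.
password = b"correct horse battery staple"
stored_hash = bcrypt.hashpw(password, bcrypt.gensalt())

# At login: checkpw re-derives the hash using the salt stored in stored_hash.
print(bcrypt.checkpw(password, stored_hash))        # True
print(bcrypt.checkpw(b"wrong guess", stored_hash))  # False
```

Because the hash is deliberately slow to compute and each salt is unique, an attacker who steals the database cannot cheaply test password guesses in bulk.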

The unauthorized access took place over about four days in February. The company said it has made changes to its infrastructure to prevent future incidents.

Slack was contacting a “very small number” of individual users who had suspicious activity tied to their accounts, or whose messages may have been accessed. Slack did not say how many users it thinks may have been affected in this way. A company spokeswoman declined to comment further.

There’s been strong interest in Slack’s business chat app since it launched last year, and its user base now tops 500,000.

To beef up security, Slack added a two-factor authentication feature on Friday. If it’s enabled, users must enter a verification code in addition to their normal password whenever they sign in to Slack. The company recommends that all users turn it on.
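Slack has not said how its codes are generated, but two-factor codes of this kind are typically time-based one-time passwords (TOTP, RFC 6238). A minimal sketch with the pyotp library:

```python
import pyotp

# A shared secret is established when the user enables two-factor auth
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

code = totp.now()        # six-digit code that rotates every 30 seconds
print(totp.verify(code)) # True: the server checks this alongside the password
```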

Slack has also released a password kill-switch feature, to let team owners and administrators reset passwords for an entire team at once. Barring that, users can reset their passwords in their profile settings.

Phone Movements, Gestures May Be Key To Fighting Mobile Malware

March 31, 2015 by mphillips  
Filed under Mobile

Mobile malware is a growing problem, but researchers from the University of Alabama at Birmingham (UAB) have developed a new way of detecting when suspicious mobile apps start trouble, such as trying to call premium-rate numbers unbeknownst to a phone’s owner.

The technique relies on using the phone’s motion, position and ambient sensors to learn the gestures that users typically make when they initiate phone calls, take pictures or use the phone’s NFC reader to scan credit cards.

Some mobile malware programs already abuse these services and security researchers expect their number will only increase.

The technology developed by the UAB researchers can monitor those three services and can check whether attempts to access them are accompanied by the natural gestures users are expected to make. If they’re not, they were likely initiated by malware.

The research, which involved collecting data from real-life scenarios to train the technology, showed that detecting different gestures and using them to differentiate between user-initiated actions and automated ones can be done with a high degree of accuracy. As such, the technique can be a viable malware defense.
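UAB has not published its exact model here, but the general approach amounts to training a classifier on motion-sensor features and flagging service requests that arrive without a matching gesture. A toy sketch with scikit-learn (the features and data are invented for illustration):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Each row: features from accelerometer/gyroscope around a phone-call request,
# e.g. [mean |acceleration|, acceleration variance, rotation range] (invented).
X_train = np.array([
    [9.9, 2.10, 1.40],   # user lifts phone to ear: real gesture
    [10.2, 1.80, 1.10],  # real gesture
    [9.8, 0.01, 0.00],   # phone flat on a table: no gesture
    [9.8, 0.02, 0.01],   # no gesture
])
y_train = [1, 1, 0, 0]   # 1 = user-initiated, 0 = likely automated

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

# Malware silently dialing a premium number produces no matching gesture:
suspicious_call = [[9.79, 0.015, 0.0]]
if clf.predict(suspicious_call)[0] == 0:
    print("No matching gesture detected -- block or flag the call")
```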

The technology doesn’t require root access on the device and it’s better than the signature-based approach used by most mobile antivirus programs, according to Nitesh Saxena, director of UAB’s Security and Privacy In Emerging computing and networking Systems Lab.

Toshiba And SanDisk To Launch 48-Layer 3D Flash Chip

March 31, 2015 by Michael  
Filed under Computing

Toshiba has announced the world’s first 48-layer Bit Cost Scalable (BiCS) flash memory chip.

The BiCS is a two-bit-per-cell, 128Gb (16GB) device with a 3D-stacked cell structure that improves density and significantly reduces the overall size of the chip.
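A quick check of the capacity arithmetic (using decimal gigabits for simplicity):

```python
# Sanity-check the quoted BiCS figures
capacity_gbit = 128
capacity_gbyte = capacity_gbit / 8   # 16 GB, as quoted
cells = capacity_gbit * 10**9 / 2    # two bits per cell: 64 billion cells
cells_per_layer = cells / 48         # spread across the 48 layers
print(capacity_gbyte, f"{cells:.2e}", f"{cells_per_layer:.2e}")
```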

Toshiba is already using 15nm dies so, despite the layering, the finished product will be competitively thin.

Twenty-four hours after the first announcement, SanDisk made a matching announcement of its own. The two companies share a fabrication plant and usually make such announcements in close succession.

“We are very pleased to announce our second-generation 3D NAND, which is a 48-layer architecture developed with our partner Toshiba,” said Dr Siva Sivaram, executive vice president of memory technology at SanDisk.

“We used our first generation 3D NAND technology as a learning vehicle, enabling us to develop our commercial second-generation 3D NAND, which we believe will deliver compelling storage solutions for our customers.”

Samsung has been working on its own 3D stacked memory for some time and has released a number of iterations. Production began last May, following a 10-year research cycle.

Moving away from the traditional floating-gate design, BiCS uses a ‘charge trap’, which stops electrons leaking between layers and improves the reliability of the product.

The chips are aimed primarily at the solid state drive market, as the 48-layer stacking process is said to enhance reliability, write speed and read/write endurance. However, the BiCS is said to be adaptable to a number of other uses.

All storage manufacturers are facing a move to 3D because, unless you want your flash drives very long and flat, real estate on chips is getting more expensive per square inch than a bedsit in Soho.

Micron has been talking in terms of 3D NAND since an interview with The INQUIRER in 2013 and, after signing a deal with Intel, has predicted 10TB in a 2mm chip by the end of this year.

Production of the chips will roll out initially from Fab 5 before moving in early 2016 to Fab 2 at the firm’s Yokkaichi Operations plant.

This is in stark contrast to Intel, which mothballed its Fab 42 chip fabrication plant in Chandler, Arizona before it even opened, as the semiconductors for computers it was due to produce have fallen in demand by such a degree.

The Toshiba and SanDisk BiCS chips are available for sampling from today.

Courtesy-TheInq

Alibaba, BMG Sign Music Distribution Deal

March 31, 2015 by mphillips  
Filed under Around The Net

Germany’s BMG music rights company announced that it has signed a digital music distribution deal with China’s Alibaba Group Holding Ltd, as the world’s largest e-commerce company firms up its bid to become a digital media empire.

The deal, one of the first in China made by a major music publisher rather than a label, will bring more than 2.5 million copyrights to Alibaba, whose music platforms already had many of the songs from artists including Kylie Minogue, the Rolling Stones and Jean-Michel Jarre, an Alibaba spokeswoman said.

Alibaba has set its eyes on becoming an online-media powerhouse, with music, film and television. The $210-billion firm has touted the potential for selling digital products as well as physical products in China, despite the country’s track record of users not paying for media content.

In the process, it is vying with Tencent Holdings Ltd, China’s biggest social networking and online entertainment firm, and search leader Baidu Inc and its online video unit, iQiyi.

For BMG, the tie-up is both a chance to boost earnings by its artists in China and part of its attempt to “grow the legitimate music market in China”, the company said.

BMG last November linked up with Chinese independent company Giant Jump to manage publishing and recording rights both at home and overseas.

Alibaba’s Digital Entertainment arm will “promote BMG writers and artists through channels such as its streaming apps Xiami and TTPod” and “monitor and take action against digital and mobile services who may infringe the rights of BMG clients,” the subsidiary of Bertelsmann AG, Europe’s largest media company, said in a statement.

“Internet and particular mobile media are quickly providing an answer to the music industry’s long-time challenge of how to monetize the vast untapped potential of the Chinese market,” BMG Chief Executive Hartwig Masuch said in Monday’s statement.

Can Small Developers Profit In The Mobile Space?

March 31, 2015 by Michael  
Filed under Mobile

Is there a future for smaller developers on mobile devices? That’s an awful question to have to ask, yet it’s one being asked – with some variation in the phrasing or the approach – in quite a lot of contexts recently. Mobile platforms, once seen as a safe refuge from the drastic collapse of the mid-range PC and console market, are now themselves displaying the same symptoms: soaring budgets, dependency on franchise or licensed IP, and a market increasingly dominated by the tiny percentage of games which “hit”, leaving not even scraps for the vast number of games that “miss”.

We watched that happen to console games over the course of the last hardware generation. It’s a vicious cycle and it can be hard to tell where it begins; did consumers stop buying sub-AAA titles, leading publishers to stop funding them? Or did publishers, spooked by rising development budgets, throw all their weight behind a handful of “sure-fire” hit titles, starving sub-AAA development of finances, resources and ultimately, existence? A little from column A, a little from column B, perhaps; though I suspect it had more to do with column B, for reasons which are now being recycled in the mobile industry.

There’s no doubt that costs at the top end of the mobile business are soaring, with development and – more notably – marketing costs trending upwards at a rate which makes the rise in console development costs through the 2000s look positively leisurely. Companies like King and Supercell are spending up to half a billion dollars a year on marketing alone, and margins are tumbling as costs outpace revenue growth. This creates a twofold barrier to entry for newcomers. Firstly, the rapid advancement of mobile technology has pushed up basic development costs – say what you like about the console model, which freezes hardware advancement in five or six year increments, but it does at least give a long-term level playing field that gives developers an opportunity to master and ultimately profit from the systems.

Secondly, even if you can afford the now much more expensive development process for a top-end mobile game, the chances are you can’t afford to market it – not when you’re facing an onslaught of expensive marketing from the likes of King, or takeovers of some of the world’s most famous public spaces by Supercell. Winning mindshare from those giants isn’t within the grasp of a plucky startup; much as we’d all love to pretend that it’s all about the quality of the games, the reality is that at the top end of the market, it’s more to do with brand recognition, amplified by the “flocking” behaviour that’s endemic to social games.

To some degree, these companies are victims of their own past success. Much of the marketing spend that’s inflating their operating costs is not aimed at supporting the launch of new games, but at sustaining the growth of games that first launched three or four years ago. Clash of Clans remains Supercell’s big hit; Candy Crush Saga is still King’s biggest revenue stream. GungHo has Puzzle&Dragons, Zynga has Farmville. These are all old games, yet they are the marketing focus of their respective companies. This is actually not just a feature of those companies, but a feature of the market overall; a look at the top charts on the iOS App Store reveals that many of the games which remain most popular now are games that date back several years. That forces mobile developers to view sustaining the growth of their old hits as being just as important as developing new hits; more important, actually, because the old game is already a proven success, while all money spent on new development is, by definition, speculative.

It’s a tough balance to strike – focusing resources on your existing games, risking having no new titles to take up the slack when the old star finally fades; or focusing on developing new games, effectively re-rolling the dice that gave you boxcars in the past, but with the ever-present risk that you’ll never see anything but snake eyes again. This has echoes, actually, of a similar balancing problem that console publishers faced – which brings us back to the question of how the vicious cycle that killed off mid-range development got rolling in the first place. Faced with rising development costs, publishers had two choices. They could keep funding lots of games, knowing that plenty of them would turn out to be sub-AAA and might lose money; or they could dump all of their resources into a handful of titles, “guaranteeing” that each one would be an AAA title through sheer force of finance, spending money to bulk out the feature lists in a quest to stamp out every market risk imaginable.

We all know which way that went. We sometimes say that companies like game publishers are averse to “risk”, but that’s not the whole truth; they’re actually averse to one specific type of risk, operational risk – the risk that a game will bomb and lose money – while being entirely too relaxed about another form of risk, financial risk – the risk that grows as a game’s development and launch gets more expensive. Essentially, publishers (and big companies in general) are remarkably comfortable about spending enormous amounts of money on things; they’ll pump endless amounts of cash into the development of a game in order to tick every box on the feature list, telling themselves that they’re reducing operational risk (“it’s got multiplayer now, people love multiplayer! It’s much less likely to fail with multiplayer!”) while ignoring the now catastrophically high level of financial risk which means that, should the game actually fail, it threatens to take the whole company with it.

I digress, but this is as much about the careers of the company’s managers and producers as it is about the health or culture of the company itself; managers make a perfectly rational calculation that operational risk is much more dangerous to their careers than financial risk. Being in charge of a small game that flopped looks pretty bad, and being in charge of a small game that did well is a career positive but not an enormous one; while being in charge of a big game that succeeded is a career-making move, and being in charge of a big game that flopped is actually not all that bad either, perhaps no worse than being in charge of a smaller flop, since the industry tends to respect experience of managing large projects, no matter how badly they did in the end.

That’s what happened in the console and PC games business, and it was nothing short of apocalyptic for the small development studios which had once been the backbone of the industry. Many of those studios and their staff saw mobile development as a liferaft; yet here we are again. The raft is sinking into the same sea that consumed the mid-range development sector on console and PC. Budgets are rising, launches are getting more expensive and, from what I can gather, publishers are responding just the same as before – throwing more and more money at a smaller and smaller selection of products, trying vainly to insulate themselves from operational risk while all the while constructing a huge, shaky tower of financial risk. If anything, it’s even worse on mobile than it was on console, since one of the best things about mobile – the long tail that means successful games can keep on being successful for ages – conspires to give publishers a whole new way of avoiding operational risk, by dumping money into old, proven games rather than new, risky ventures.

If that sounds bleak, though, I’d urge you to consider the flipside of what happened on PC and console. The mid-range disappeared, yes; it was upsetting, it wrecked people’s lives in many cases, and it narrowed the kind of games available for quite a long while. Yet at the same time a whole new low-end of games (and I mean “low-end” in budget terms, not quality) emerged. The indie scene blossomed, from a new generation of bedroom coders through to a whole new wave of small, creative, innovative studios who have done more to push the medium of games forward than almost anything else in the past decade. Not every indie game was a success, but most of them were sufficiently low-budget that their failure could be chalked up to experience and the creators could move on to the next thing. “Fail again. Fail better”; Beckett’s words could be the slogan of the indie game scene.

Moreover, and crucially for how we choose to perceive the change in the mobile games industry, these “low-end” creators have a different concept of success to a large publisher or studio. Sure, Notch made billions and bought a Hollywood mansion; but while much of our attention gets focused on such enormous successes, the truth is that for most indie creators and small developers, “success” looks like the bills being paid, the paycheques being sent out and the funds for working on the next title being secured. Never mind billions; many smaller games would comfortably cover their costs and pay their creators a healthy wage with a few hundred thousand dollars in revenue, or perhaps even less.

On mobile, too, far away from the giant marketing budgets and multi-million-dollar development plans of the big players, such a sector can and will thrive. We are rapidly approaching the point where everyone in the developed world, and a healthy percentage of people elsewhere, will carry a smartphone; that’s a whole lot of niches to be filled, a whole lot of niches to be satisfied, an almost unimaginable number of daily moments to be brightened with a spark of entertainment. It’s nonsense to imagine that King, or Supercell, or any of the other huge companies dominating the top end of the market, are going to release games that satisfy and enthral everyone with a phone in their pocket. For those plucky, interesting developers who challenge themselves to fill the cracks and flow into the niches, there’s absolutely a living – a good living – to be made. There are challenges, the greatest of which may be discovery, but just as PCs and even consoles have seen the flowering of indie talent, the end of the mobile device “gold rush” doesn’t mean the doors are shut for smaller developers; as long as your dream is to make a living from creating games, rather than to buy a Hollywood mansion and a private jet, your dream is still alive.

Courtesy-GI.biz

GPS To Become Obsolete?

March 30, 2015 by mphillips  
Filed under Uncategorized

Finding GPS unreliable in certain situations, the U.S. government is focusing on developing a more reliable real-time position tracking technology whose signals won’t disappear in blind spots and can’t be jammed.

The Defense Advanced Research Projects Agency is developing “radically” new technologies to deliver a more advanced position- and navigation-tracking system that is more reliable and accurate than GPS, according to a recently released document on DARPA research projects.

DARPA — which is a part of the U.S. Department of Defense — thinks that new real-time positioning technology would give the U.S. military an advantage over rivals. GPS technology has provided a strategic advantage, but it isn’t foolproof, as it can be jammed by opponents or also be inaccessible in some parts of the world.

“The need to be able to operate effectively in areas where GPS is inaccessible, unreliable or potentially denied by adversaries has created a demand for alternative precision timing and navigation capabilities,” DARPA said in the document.

Beyond the military, GPS has had a significant impact on individuals, business and economies. GPS has helped deliver customized content and services to mobile users, and also helped in the timely delivery of goods. But GPS isn’t flexible, and DARPA wants to make its alternative more flexible and customizable with the help of algorithms.

New types of self-contained instruments are under development that could better track position, time and direction of motion, which are critical aspects of GPS. DARPA is developing high-precision clocks, self-calibrating gyroscopes and accelerometers, and high-precision navigation instruments that can track position for long periods without relying on external sources.
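Self-contained navigation of this kind is essentially inertial dead reckoning: integrate acceleration to get velocity, then integrate velocity to get position, with high-precision clocks and gyroscopes keeping error growth in check. A toy one-dimensional sketch (real systems fuse many sensors and correct for drift):

```python
# Toy 1-D dead reckoning: position from repeated accelerometer readings
dt = 0.01        # seconds between samples (assumed sample rate)
velocity = 0.0   # m/s
position = 0.0   # m

# m/s^2: accelerate, cruise, then brake (two seconds each)
accel_samples = [0.5] * 200 + [0.0] * 200 + [-0.5] * 200

for a in accel_samples:
    velocity += a * dt          # integrate acceleration -> velocity
    position += velocity * dt   # integrate velocity -> position

print(f"position after {len(accel_samples) * dt:.0f}s: {position:.2f} m")
# Tiny sensor biases integrate into large position errors over time, which is
# why DARPA wants high-precision clocks and self-calibrating gyroscopes.
```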

DARPA is also researching new technologies that could make real-time tracking possible through a number of sources. It is developing sensors that “use signals of opportunity” such as television, radio, cell towers, satellites, and even lightning, for real-time tracking. The effort, called ASPN (All Source Positioning and Navigation), alleviates issues related to fixing locations in buildings, in deep foliage, underwater or underground, where GPS access can be limited.

The ultimate goal is to develop a compact navigation system that could be given to soldiers, put on tanks or implemented in guidance systems.

Amazon Unleashes Unlimited Storage For $5 A Month

March 30, 2015 by mphillips  
Filed under Around The Net

Amazon has upped the ante: unlimited cloud storage for individuals for $5 a month ($59.99 per year).

Amazon’s Unlimited Everything Plan allows users to store an unlimited number of photos, videos, files, documents, movies and music in its Cloud Drive.

The site also announced a separate $12 per year plan for unlimited photos. People who subscribe to Amazon Prime already get unlimited capacity for photos. Both the Unlimited Everything Plan and the Photos Plan have three-month free trial periods.

Online storage and file sharing service providers, such as Google Drive, Dropbox, and iCloud, have been engaged in a pricing war over the past year. Last fall, Dropbox dropped its Pro plan pricing for individuals to $9.99 per month for 1TB of capacity. Dropbox offers 2GB of capacity for free.

Dropbox also offers members 500MB of storage each time they get a friend to sign up; there’s a 16GB max on referrals, though. With Dropbox Pro, members can get 1GB instead of 500MB each time they refer someone.

Google Drive offers 15GB of capacity for free and charges $1.99 per month for 100GB and $9.99 per month for 1TB.

Apple’s iCloud offers 5GB of capacity for free, and charges 99 cents per month for 20GB, $3.99 per month for 200GB and $9.99 per month for 1TB.

Microsoft’s OneDrive offers 15GB of capacity for free, and charges $1.99 per month for 100GB, $3.99 per month for 200GB and $6.99 per month for 1TB.
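Comparing the 1TB tiers quoted above (Amazon’s plan is unlimited, so its effective per-terabyte cost depends entirely on how much you store):

```python
# Monthly price for 1TB of paid storage, per the figures quoted above
plans = {
    "Dropbox Pro (1TB)": 9.99,
    "Google Drive (1TB)": 9.99,
    "Apple iCloud (1TB)": 9.99,
    "Microsoft OneDrive (1TB)": 6.99,
    "Amazon Unlimited Everything": 59.99 / 12,  # ~$5.00/month, no cap
}

for name, price in sorted(plans.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${price:.2f}/month")
```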

While Amazon offers unlimited file size uploads for desktop users, it limits file sizes to 2GB for mobile devices.

Will Intel Challenge nVidia In The GPU Space?

March 30, 2015 by Michael  
Filed under Computing

Intel has released details of its next-generation Xeon Phi processor, and it is starting to look like Intel is gunning for a chunk of Nvidia’s GPU market.

That impression comes from a briefing by Avinash Sodani, Knights Landing chief architect at Intel, a product update by Hugo Saleh, marketing director of Intel’s Technical Computing Group, an interactive technical Q&A, and a lab demo of a Knights Landing system running on an Intel reference design.

Knights Landing is leagues apart from prior Phi products and more flexible for a wider range of uses. Unlike more specialized processors, Intel describes Knights Landing as taking a “holistic approach” to new breakthrough applications.

Unlike the current-generation Phi design, which operates as a coprocessor, Knights Landing incorporates x86 cores and can directly boot and run standard operating systems and application code without recompilation.

The test system, which had a socketed CPU and memory modules, was running a stock Linux distribution. Modified Atom Silvermont x86 cores form the basis of a Knights Landing ’tile’, the chip’s basic design unit, which consists of dual x86 and vector execution units alongside cache memory and intra-tile mesh communication circuitry.

Each multi-chip package includes a processor with 30 or more tiles and eight high-speed memory chips.

Intel said the on-package memory, totaling 16GB, is made by Micron with custom I/O circuitry and might be a variant of Micron’s announced, but not yet shipping Hybrid Memory Cube.

The high-speed memory is similar to the GDDR5 devices used on GPUs like Nvidia’s Tesla.

It looks like Intel saw that Nvidia was making great leaps into the high performance arena with its GPU and thought “I’ll be having some of that.”

The internals of a GPU and Xeon Phi are different, but share common ideas.

Nvidia has a big head start. It has already announced the price and availability of a Titan X development box designed for researchers exploring GPU applications to deep learning. Intel has not done that yet for Knights Landing systems.

But Phi is also a hybrid that includes dozens of full-fledged 64-bit x86 cores, which could make it better at some parallelizable application categories that use vector calculations.
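To illustrate the kind of wide-vector, data-parallel work both camps are chasing, here is a trivial vectorized computation in numpy (standing in for the hardware’s vector units; not Intel or Nvidia code):

```python
import numpy as np

# A data-parallel kernel: the same arithmetic applied across millions of
# elements. Wide vector units (Xeon Phi) and many-core GPUs both exist to
# run this style of loop-free computation quickly.
x = np.random.rand(10_000_000)
y = np.random.rand(10_000_000)

result = 2.5 * x + y             # multiply-add across the whole array
print(result[:3], np.dot(x, y))  # dot product: a classic vector workload
```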

Courtesy-Fud

Panasonic On The Hunt For Acquisition Targets

March 27, 2015 by mphillips  
Filed under Consumer Electronics

Japanese electronics giant Panasonic Corp said it is gearing up to spend 1 trillion yen ($8.4 billion) on acquisitions over the next four years, bolstered by a stronger profit outlook for its automotive and housing technology businesses.

Chief Executive Kazuhiro Tsuga said at a briefing on Thursday that Panasonic doesn’t have specific acquisition targets in mind for now. But he said the firm will spend around 200 billion yen on M&A in the fiscal year that kicks off in April alone, and pledged to improve on Panasonic’s patchy track record on big deals.

“With strategic investments, if there’s an opportunity to accelerate growth, you need funds. That’s the idea behind the 1 trillion yen figure,” he said. Tsuga has spearheaded a radical restructuring at the Osaka-based company that has made it one of the strongest turnaround stories in Japan’s embattled technology sector.

Tsuga previously told Reuters that the company was interested in M&A deals in the European white goods market, a sector where Panasonic has comparatively low brand recognition.

The firm said on Thursday it’s targeting operating profit of 430 billion yen in the next fiscal year, up nearly 25 percent from the 350 billion yen it expects for the year ending March 31.

Panasonic’s earnings have been bolstered by moving faster than peers like Sony Corp and Sharp Corp to overhaul business models squeezed by competition from cheaper Asian rivals and caught flat-footed in a smartphone race led by Apple Inc and Samsung Electronics. Out has gone reliance on mass consumer goods like TVs and smartphones, and in has come a focus on areas like automotive technology and energy-efficient home appliances.

Tsuga also sought to ease concerns that an expensive acquisition could set back its finances, which took years to recover from the deal agreed in 2008 to buy cross-town rival Sanyo for a sum equal to about $9 billion at the time.

Microsoft Confirms Windows 10 Will Support 8K Resolution

March 27, 2015 by Michael  
Filed under Computing

Software King of the World Microsoft’s Windows 10 operating system will support screen resolutions that will not be available on commercial displays for years.

At the WinHEC conference, Microsoft revealed that Windows 10 will support 8K (7680*4320) resolution for monitors, a resolution unlikely to show up on the market this year or next.

It also showed off the minimum and maximum resolutions supported by the upcoming Windows 10. The new operating system will support 6″+ phone and tablet screens at up to 4K (3840*2160) resolution, 8″+ PC displays at up to 4K resolution, and 27″+ monitors at up to 8K (7680*4320) resolution.
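For scale, 8K carries exactly four times the pixels of 4K; assuming 32-bit colour at 60Hz (our assumptions, not figures from Microsoft), the raw framebuffer traffic alone is substantial:

```python
# Pixel counts and raw (uncompressed) framebuffer bandwidth at 8K
pixels_4k = 3840 * 2160   # ~8.3 million
pixels_8k = 7680 * 4320   # ~33.2 million

bytes_per_pixel = 4       # 32-bit colour (assumption)
refresh_hz = 60           # (assumption)
gbytes_per_s = pixels_8k * bytes_per_pixel * refresh_hz / 1e9

print(pixels_8k / pixels_4k)       # 4.0
print(f"{gbytes_per_s:.1f} GB/s")  # ~8.0 GB/s of raw pixel data
```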

To put this in some perspective, the boffins at NHK (Nippon Hōsō Kyōkai, the Japan Broadcasting Corp.) think that the 8K ultra-high-definition television format will be the last 2D format, as 7680*4320 (and similar resolutions) is the highest 2D resolution that the human eye can process.

This means that 8K and similar resolutions will stay around for a long time and it makes sense to add their support to hardware and software.

NHK is already testing broadcasting in 8K ultra-high-definition resolutions, VESA has ratified DisplayPort and embedded DisplayPort standards to connect monitors with up to 8K resolution to graphics adapters, and a number of upcoming games will ship with textures for 8K UHD displays.

However, monitors that support 8K will not be around for some time, because display makers will have to produce new types of panels for them.

Redmond will thus be ready for the advanced UHD monitors well before they hit the market; many criticized Microsoft for the poor support of 4K UHD resolutions in Windows 8.

Courtesy-Fud

AMD Shows Plans For ARM Servers

March 27, 2015 by Michael  
Filed under Computing

Buried in AMD’s shareholders’ report was some surprising detail about the outfit’s first 64-bit ARM server SoCs.

For those who came in late, they are supposed to be going on sale in the first half of 2015.

We know that the ARM Cortex-A57 architecture based SoC has been codenamed ‘Hierofalcon.’

AMD started sampling these Embedded R-series chips last year and is aiming to release the chipset in the first half of this year for embedded data center applications, communications infrastructure, and industrial solutions.

It looks like the Hierofalcon SoC will include eight Cortex-A57 cores with 4MB of L2 cache and will be manufactured on a 28nm process. It will support two 64-bit DDR3/4 memory channels with ECC at up to 1866MHz, and up to 128GB of memory per CPU. Connectivity options will include two 10GbE KR ports, eight SATA 3 6Gb/s ports, eight lanes of PCIe Gen 3, and SPI, UART and I2C interfaces. The chip will have a TDP between 15 and 30W.
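A quick back-of-the-envelope peak memory bandwidth figure from those specs:

```python
# Peak theoretical memory bandwidth for Hierofalcon's quoted configuration
channels = 2
bytes_per_transfer = 64 // 8  # each 64-bit channel moves 8 bytes per transfer
transfers_per_s = 1866e6      # DDR3/4-1866: 1866 mega-transfers per second

bandwidth_gbs = channels * bytes_per_transfer * transfers_per_s / 1e9
print(f"{bandwidth_gbs:.1f} GB/s peak")  # ~29.9 GB/s
```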

The highly integrated SoC includes 10Gb KR Ethernet and PCI-Express Gen 3 for high-speed network connectivity, making it ideal for control-plane applications. The chip also features a dedicated security processor that enables AMD’s TrustZone technology, plus a dedicated cryptographic co-processor on board, addressing the increased need for networked, secure systems.

Soon after Hierofalcon is out, AMD will be launching the SkyBridge platform, which will feature interchangeable 64-bit ARM and x86 processors. Then in 2016 the company will launch the K12 chip, its custom high-performance 64-bit ARM core.

Courtesy-Fud

Facebook Opening Parse For IoT Development

March 27, 2015 by mphillips  
Filed under Around The Net

Facebook is opening up Parse, its suite of back-end software development tools, to create Internet of Things apps for items like smart home appliances and activity trackers.

By making Parse available for IoT, Facebook hopes to strengthen its ties to a wider group of developers in a growing industry via three new software development kits aimed specifically at IoT, unveiled Wednesday at the company’s F8 developer conference in San Francisco.

The tools are aimed at making it easier for outside developers to build apps that interface with Internet-connected devices. Garage door manufacturer Chamberlain, for example, uses Parse for its app to let people open and lock their garage door from their smartphones.

Or, hypothetically, the maker of a smart gardening device could use Parse to incorporate notifications into their app to remind the user to water their plants, said Ilya Sukhar, CEO of Parse, during a keynote talk at F8.
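As a rough illustration of what that looks like in practice, here is the kind of call an app could make against the classic Parse REST back end (the class name, fields and credentials below are hypothetical placeholders):

```python
import requests

# Hypothetical: a gardening device posts a soil-moisture reading to the
# app's Parse back end; the app can then notify the user to water the plants.
headers = {
    "X-Parse-Application-Id": "YOUR_APP_ID",   # placeholder credentials
    "X-Parse-REST-API-Key": "YOUR_REST_KEY",
    "Content-Type": "application/json",
}
reading = {"deviceId": "planter-42", "soilMoisture": 0.18}

resp = requests.post("https://api.parse.com/1/classes/SoilReading",
                     json=reading, headers=headers)
print(resp.status_code, resp.json())  # 201 and the new object's id on success
```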

Facebook bought Parse in 2013, putting itself in the business of selling application development tools. Parse provides a hosted back-end infrastructure to help third party developers build their apps. Over 400,000 developers have built apps with Parse, Sukhar said on Wednesday.

Parse’s new SDKs are available on GitHub as well as on Parse’s site.