Panasonic On The Hunt For Acquisition Targets

March 27, 2015 by mphillips  
Filed under Consumer Electronics

Japanese electronics giant Panasonic Corp said it is gearing up to spend 1 trillion yen ($8.4 billion) on acquisitions over the next four years, bolstered by a stronger profit outlook for its automotive and housing technology businesses.

Chief Executive Kazuhiro Tsuga said at a briefing on Thursday that Panasonic doesn’t have specific acquisition targets in mind for now. But he said the firm will spend around 200 billion yen on M&A in the fiscal year that kicks off in April alone, and pledged to improve on Panasonic’s patchy track record on big deals.

“With strategic investments, if there’s an opportunity to accelerate growth, you need funds. That’s the idea behind the 1 trillion yen figure,” he said. Tsuga has spearheaded a radical restructuring at the Osaka-based company that has made it one of the strongest turnaround stories in Japan’s embattled technology sector.

Tsuga previously told Reuters that the company was interested in M&A deals in the European white goods market, a sector where Panasonic has comparatively low brand recognition.

The firm said on Thursday it’s targeting operating profit of 430 billion yen in the next fiscal year, up nearly 25 percent from the 350 billion yen it expects for the year ending March 31.

Panasonic’s earnings have been bolstered by moving faster than peers like Sony Corp and Sharp Corp to overhaul business models squeezed by competition from cheaper Asian rivals and caught flat-footed in a smartphone race led by Apple Inc and Samsung Electronics. Out has gone reliance on mass consumer goods like TVs and smartphones, and in has come a focus on areas like automotive technology and energy-efficient home appliances.

Tsuga also sought to ease concerns that an expensive acquisition could set back its finances, which took years to recover from the deal agreed in 2008 to buy cross-town rival Sanyo for a sum equal to about $9 billion at the time.

 

 

AMD Shows Plans For ARM Servers

March 27, 2015 by Michael  
Filed under Computing

Buried in AMD’s shareholders’ report was some surprising detail about the outfit’s first 64-bit ARM server SoCs.

For those who came in late, they are supposed to be going on sale in the first half of 2015.

We know that the SoC, based on the ARM Cortex-A57 architecture, is codenamed ‘Hierofalcon.’

AMD started sampling these Embedded R-series chips last year and is aiming to release the chipset in the first half of this year for embedded data center applications, communications infrastructure, and industrial solutions.

But it looks like the Hierofalcon SoC will include eight Cortex-A57 cores with 4MB of L2 cache and will be manufactured on a 28nm process. It will support two 64-bit DDR3/DDR4 memory channels with ECC at up to 1866MHz and up to 128GB of memory per CPU. Connectivity options will include two 10GbE KR links, eight SATA 3 6Gb/s ports, eight lanes of PCIe Gen 3, and SPI, UART and I2C interfaces. The chip will have a TDP between 15 and 30W.

The highly integrated SoC’s 10Gb KR Ethernet and PCIe Gen 3 links provide the high-speed network connectivity that makes it a good fit for control plane applications. The chip also features a dedicated security processor supporting ARM’s TrustZone technology, along with an on-board cryptographic co-processor, addressing the growing need for secure, networked systems.
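
For a sense of what those memory figures mean, here is a back-of-the-envelope peak bandwidth calculation, assuming the quoted 1866MHz refers to 1866 MT/s (the usual DDR3/DDR4 convention); this is an illustration, not an AMD-quoted number.

```python
# Rough peak memory bandwidth for the quoted Hierofalcon spec -- illustrative only.
transfers_per_sec = 1866e6      # assuming "1866MHz" means 1866 MT/s
bytes_per_transfer = 64 / 8     # one 64-bit channel moves 8 bytes per transfer
channels = 2

peak_bw_gb_s = transfers_per_sec * bytes_per_transfer * channels / 1e9
print(f"Theoretical peak bandwidth: {peak_bw_gb_s:.1f} GB/s")  # ~29.9 GB/s
```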

Soon after Hierofalcon is out, AMD will be launching the SkyBridge platform, which will feature interchangeable 64-bit ARM and x86 processors. Later, in 2016, the company will launch the K12 chip, its custom high-performance 64-bit ARM core.

Courtesy-Fud

Lexmark Scoops Up Kofax For Nearly $1B

March 26, 2015 by mphillips  
Filed under Around The Net

Lexmark International Inc, known for its printers, said it plans to acquire Kofax Ltd in a deal worth about $1 billion that would double the size of its enterprise software business.

PC and printer makers have struggled in the recent past as companies reduced printing to cut costs and consumers shifted to mobile devices from PCs.

Hewlett-Packard Co plans to separate its computer and printer businesses from its corporate hardware and services operations this year.

Xerox Corp has also increasingly focused on IT services to make up for the falling sales of its copiers and printers.

Lexmark divested its inkjet printer business in 2013 and has since boosted its enterprise software business.

The Kofax deal will help the company’s Perceptive Software business achieve its revenue target of $500 million in 2016, Lexmark said.

The business makes software to scan everything from spreadsheets to medical images and provides services to banking, healthcare, insurance and retail companies. It contributed about 8 percent to Lexmark’s revenue in 2014 and has grown at more than 30 percent in the past two years.

Kofax provides data services to financial, insurance and healthcare companies such as Citigroup Inc, MetLife Inc and Humana Inc.

Lexmark said it expects the deal to “significantly” expand operating margins in its enterprise software business, which would now be worth about $700 million. It will also add about 10 cents per share to the company’s adjusted profit in 2015.

 

 

Vessel Launches Early Access, Paid Subscription Video Service

March 25, 2015 by mphillips  
Filed under Consumer Electronics

Online video platform Vessel officially debuted its paid subscription service on Tuesday, offering programming at least three days before other websites in a bid to reshape an industry dominated by free content on Google Inc’s YouTube.

Vessel, which costs viewers $3 a month, was founded by former Hulu Chief Executive Jason Kilar and Chief Technology Officer Richard Tom. They aim to create an early window for a selection of web video, similar to the way movies are released in theaters before they arrive on cable TV or the Internet.

“Early access is very valuable,” Kilar said in an interview. “There are a lot of consumers who would love to see something early.”

More than 130 creators will provide early access to content on Vessel. After the exclusive period ends, videos can go to YouTube, Vimeo, Vevo or other free, ad-supported sites, and are free on Vessel.

YouTube stars such as Ingrid Nilsen, Rhett & Link and Shane Dawson are among creators whose videos will make their debut on Vessel. Other programming comes from online networks such as food-oriented Tastemade and celebrities such as Alec Baldwin.

Video creators on Vessel keep 70 percent of ad revenue, compared with 55 percent that is typical on YouTube, plus 60 percent of Vessel subscription revenue.
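
To put those splits in concrete terms, here is a rough illustration of what a creator might take home under each model (the revenue figures are hypothetical, not numbers from Vessel or YouTube):

```python
# Hypothetical illustration of the revenue splits described above --
# the ad and subscription revenue figures are made up for the example.
ad_revenue = 10_000   # hypothetical ad revenue generated by a creator's videos
sub_revenue = 5_000   # hypothetical Vessel subscription revenue attributed to them

youtube_take_home = 0.55 * ad_revenue
vessel_take_home = 0.70 * ad_revenue + 0.60 * sub_revenue

print(f"Typical YouTube split: ${youtube_take_home:,.0f}")  # $5,500
print(f"Vessel split:          ${vessel_take_home:,.0f}")   # $10,000
```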

With those incentives, the new service will be an easier sell to creators than to viewers, who are used to watching videos for free, said Brett Sappington, director of research at Parks Associates.

“Vessel must rely on content creators’ popularity and self-marketing to entice their loyal viewers into paying a monthly fee,” he said.

The service is free for one year for viewers who sign up within the first three days.

It is unlikely YouTube will lose significant revenue from a migration to Vessel, Sappington said. YouTube made its debut a decade ago and has more than 1 billion users.

 

 

Can MediaTek Bring The Cortex-A72 To Market In The Fall?

March 23, 2015 by Michael  
Filed under Computing

MediaTek became the first chipmaker to publicly demo a SoC based on ARM’s latest Cortex-A72 CPU core, but the company’s upcoming chip still relies on the old 28nm manufacturing process.

We had a chance to see the upcoming MT8173 in action at the Mobile World Congress a couple of weeks ago.

The next step is to bring the new Cortex-A72 core to a new node and into mobiles. This is what MediaTek is planning to do by the end of the year.

Cortex-A72 smartphone parts coming in Q4

It should be noted that MediaTek’s 8000-series parts are designed for tablets, and the MT8173 is no exception. However, the new core will make its way to smartphone SoCs later this year, as part of the MT679x series.

According to Digitimes Research, MediaTek’s upcoming MT679x chips will utilize a combination of Cortex-A53 and Cortex-A57 cores. It is unclear whether MediaTek will use the planar 20nm node or 16nm FinFET for the new part.

By the looks of it, this chip will replace the 32-bit MT6595, which is MediaTek’s most successful high-performance part yet, with a few relatively big design wins, including Alcatel, Meizu, Lenovo and Zopo. The new chip will also supplement, and possibly replace, the recently introduced MT6795, a 64-bit Cortex-A53/Cortex-A72 part used in the HTC Desire 826.

More questions than answers

Digitimes also claims the MT679x Cortex-A72 parts may be the first MediaTek products to benefit from AMD technology, but details are scarce. We can’t say whether or not the part will use AMD GPU technology, or some HSA voodoo magic. Earlier this month we learned that MediaTek is working with AMD and the latest report appears to confirm our scoop.

The other big question is the node. The chip should launch toward the end of the year, so we probably won’t see any devices prior to Q1 2016. While 28nm is still alive and kicking, by 2016 it will be off the table, at least in this market segment. Previous MediaTek roadmap leaks suggested that the company would transition to 20nm on select parts by the end of the year.

However, we are not entirely sure 20nm will cut it for high-end parts in 2016. Huawei has already moved to 16nm with its latest Kirin 930 SoC, Samsung stunned the world with the 14nm Exynos 7420, and Qualcomm’s upcoming Snapdragon 820 will be a FinFET part as well.

It is obvious that TSMC’s and Samsung’s 20nm nodes will not be used on most, if not all, high-end SoCs next year. With that in mind, it would be logical to expect MediaTek to use a FinFET node as well. On the other hand, depending on the cost, 20nm could still make sense for MediaTek – provided it ends up significantly cheaper than FinFET. While a 20nm chip wouldn’t deliver the same level of power efficiency and performance, with the right price it could find its way to more affordable mid-range devices, or flagships designed by smaller, value-oriented brands (especially those focusing on Chinese and Indian markets).

Courtesy-Fud

Will nVidia’s Pascal Make It To Market In 2016?

March 20, 2015 by Michael  
Filed under Computing

Pascal is Nvidia’s next generation architecture and it is coming after Maxwell of course. The company says it will launch next year, but details are still sketchy.

According to Nvidia CEO Jen-Hsun Huang, it is coming with mixed precision, and this is the new architecture that will succeed Maxwell. Nvidia claims that the new GPU core has its own architectural benefits.

3D memory, or High Bandwidth Memory (HBM), is a big thing, and Jen-Hsun Huang claims 32GB is possible with the new architecture, compared to 12GB on the new Maxwell-based Titan X. This is a staggering increase from the current standard of 4GB per card, to 12GB with the Titan X, and probably up to 32GB with Pascal. NVLink should enable a very fast interconnect with five times the performance of the PCI Express we all use right now. More memory and more bandwidth are obviously needed for 4K/UHD gaming.

Huang also shared some very rough estimates: convolution compute performance will be four times faster with FP16 precision in mixed-precision mode, and the 3D memory offers a six-fold increase in GPU-to-memory bandwidth.

Convolution and bandwidth at the front, and bandwidth to convolution at the back of the GPU, should be five times faster than on Maxwell cards. It is complex fuzzy logic that is hard to explain with so few details shared by Nvidia about the Pascal architecture.

The wider NVLink interconnect should deliver a twofold performance increase, and when you multiply these two numbers, Nvidia arrives at a 10x compute performance increase compared to Maxwell, at least in what the Nvidia CEO calls the “CEO bench”.

He warned the audience that this is a very rough estimate. The 10x number mainly targets deep learning, as Pascal should be able to train a deep learning network ten times faster. This doesn’t mean the GPU offers 10 times the gaming performance of Maxwell, not even close, we predict.
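
Taken at face value, the “CEO bench” figure is just the product of the claimed speed-ups; a minimal sketch of that arithmetic, using only the factors quoted above:

```python
# Rough "CEO bench" arithmetic from the figures quoted above -- illustrative only.
convolution_gain = 5   # claimed front-to-back convolution/bandwidth speed-up vs Maxwell
interconnect_gain = 2  # claimed NVLink interconnect speed-up

ceo_bench = convolution_gain * interconnect_gain
print(f"Claimed compound speed-up vs Maxwell: {ceo_bench}x")  # 10x, for deep learning training
```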

Volta made it back to the roadmap and currently it looks like the new architecture will be introduced around 2018, or about three years from now.

Courtesy-Fud

HP Copies SanDisk With StoreVirtual

March 19, 2015 by Michael  
Filed under Computing

HP has announced a series of new storage offerings aimed at mid-sized businesses, becoming the latest player in the rush to snap up flash-hungry enterprise customers.

Following on from the recent announcement of SanDisk InfiniFlash, the new StoreVirtual 4335 hybrid flash array uses Adaptive Optimization tiering to deliver 12 times more storage, with a 90 percent power and footprint saving per two-node cluster, compared with traditional hard disks.

The system is compatible with HP StoreVirtual Virtual Storage Appliance (VSA) and HP’s Helion OpenStack.

HP says mid-sized businesses can adopt the hybrid storage platform cost-effectively, with just a couple of mouse clicks and zero downtime.

Also new is StoreOnce Backup, which is also aimed at small and medium-scale business deployments. The StoreOnce 2900 protects up to 70TB of data in a single 12-hour window and can restore 41TB in the same time, with up to 31.5TB available in a 2U rackspace footprint. It is fully compatible with HP Recovery Manager Central.
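
For a sense of scale, those backup and restore figures imply roughly the following sustained throughput, assuming the full 12-hour window is used (a rough calculation, not an HP-quoted rate):

```python
# Implied sustained throughput for the quoted StoreOnce 2900 figures,
# assuming the full 12-hour window is used -- rough, not an HP-quoted rate.
window_hours = 12
backup_tb, restore_tb = 70, 41

backup_rate = backup_tb * 1000 / (window_hours * 3600)   # GB/s, decimal terabytes
restore_rate = restore_tb * 1000 / (window_hours * 3600)
print(f"Backup:  ~{backup_rate:.2f} GB/s")   # ~1.6 GB/s
print(f"Restore: ~{restore_rate:.2f} GB/s")  # ~0.95 GB/s
```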

HP Virtual Store Appliance (VSA) offers a fully software-defined storage option with deduplicated disk backup features and support for hypervisors including VMware vSphere, Microsoft Hyper-V and Linux KVM.

Though up to 50TB is available, users can start small with a free 1TB service to try it out.

Finally, the StoreEasy NAS storage range has been refreshed with new products, the 1450, 1650 and 1850, all based on HP ProLiant Gen9 servers, with 25 times faster RAID rebuilds and disaster recovery options.

By combining StoreEasy and LiveVault TurboRestore, customers can create a hybrid cloud backup with continuous redundant protection at two disparate locations, ideal for disaster recovery.

All of these products will be available by the end of March, with prices starting at $5,500.

Courtesy-TheInq

Will TSMC Win Apple’s A9 Business?

March 18, 2015 by Michael  
Filed under Computing

TSMC is reportedly getting the majority of Apple A9 orders, which would be a big coup for the company.

An Asian brokerage firm released a research note, claiming that disputes over the number of Apple A9 orders from TSMC and Samsung are “coming to an end.”

The unnamed brokerage firm said TSMC will gain more orders due to its superior yield-ramp and “manufacturing excellence in mass-production.”

This is not all, as the firm also claims TSMC managed to land orders for all Apple A9X chipsets, which will power next generation iPads. With the A9X, TSMC is expected to supply about 70 percent of all Apple A9-series chips, reports Focus Taiwan.

While Samsung managed to beat other mobile chipmakers (and TSMC), and roll out the first SoC manufactured on a FinFET node, TSMC is still in the game. The company is already churning out 16nm Kirin 930 processors for Huawei, and it’s about to get a sizable chunk of Apple’s business.

TSMC should have no trouble securing more customers for its 16FF process, which will be supplemented by the superior 16FF+ process soon. In addition, TSMC is almost certain to get a lot of business from Nvidia and AMD once their FinFET GPUs are ready.

Courtesy-Fud

Is Valve’s Steam Machine A Flop?

March 17, 2015 by Michael  
Filed under Gaming

There’s not a lot to argue with in the consensus view that Valve had the biggest and most exciting announcement of GDC this year, in the form of the Vive VR headset it’s producing with hardware partner HTC. It may not be the ultimate “winner” of the battle between VR technologies, but it’s done more than most to push the whole field forwards – and it clearly sparked the imaginations of both developers and media in San Francisco earlier this month. Few of those who attended GDC seem particularly keen to talk about anything other than Vive.

From Valve’s perspective, that might be just as well – the incredibly strong buzz around Vive meant that it eclipsed Valve’s other hardware-related announcement at GDC, the unveiling of new details of the Steam Machines initiative. Ordinarily, it might be an annoying (albeit very high-quality) problem to have one of your announcements completely dampen enthusiasm for the other; in this instance, it’s probably welcome, because what trickled out of GDC regarding Steam Machines is making this look like a very stunted, unloved and disappointing project indeed.

To recap briefly; Steam Machines is Valve’s attempt to create a range of attractive, small-form-factor PC hardware from top manufacturers carrying Valve’s seal of approval (hence being called “Steam Machines” and quite distinctly not “PCs”), running Valve’s own gaming-friendly flavour of the Linux OS, set up to connect to your living room TV and controlled with Valve’s custom joypad device. From a consumer standpoint, they’re Steam consoles; a way to access the enormous library of Steam content (at least the Linux-friendly parts of it) through a device that’s easy to buy, set up and control, and designed from the ground up for the living room.

That’s a really great idea, but one which requires careful execution. Most of all, if it’s going to work, it needs a fairly careful degree of control; Valve isn’t building the machines itself, but since it’s putting its seal of approval on them (allowing them to use the Steam trademark and promoting them through the Steam service), it ought to have the power to enforce various standards related to specification and performance, ensuring that buyers of Steam Machines get a clear, simple, transparent way to understand the calibre of machine they’re purchasing and the gaming performance they can expect as a result.

Since the announcement of the Steam Machines initiative, various ways of implementing this have been imagined; perhaps a numeric score assigned to each Machine allowing buyers to easily understand the price to performance ratio on offer? Perhaps a few distinct “levels” of Steam Machine, with some wiggle room for manufacturers to distinguish themselves, but essentially giving buyers a “Good – Better – Best” set of options that can be followed easily? Any such rating system could be tied in to the Steam store itself, so you could easily cross-reference and find out which system is most appropriate for the kind of games you actually want to play.
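
Purely as an illustration of the kind of numeric score imagined above (not anything Valve has proposed; the machine names and figures here are made up), a crude price-to-performance metric might look like this:

```python
# A purely hypothetical price-to-performance score of the kind imagined above --
# not a Valve scheme; machine names and numbers are invented for illustration.
machines = {
    "Machine A": {"benchmark_fps": 60, "price_usd": 499},
    "Machine B": {"benchmark_fps": 95, "price_usd": 899},
}

for name, m in machines.items():
    score = m["benchmark_fps"] / (m["price_usd"] / 100)  # frames per second per $100
    print(f"{name}: {score:.1f} fps per $100")
```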

In the final analysis, it would appear that Valve’s decision on the myriad possibilities available to it in this regard is the worst possible cop-out, from a consumer standpoint; the company’s decided to do absolutely none of them. The Steam Machines page launched on the Steam website during GDC lists 15 manufacturers building the boxes; many of those manufacturers are offering three models or more at different price and performance levels. There is absolutely no way to compare or even understand performance across the different Steam Machines on offer, short of cross-referencing the graphics cards, processors, memory types and capacities and drive types and capacities used in each one – and if you’ve got the up-to-date technical knowledge to accurately balance those specifications across a few dozen different machines and figure out which one is the best, then you’re quite blatantly going to be the sort of person who saves money by buying the components separately and wouldn’t buy a Steam Machine in a lifetime.

“Valve seems to have copped out entirely from the idea of using its new systems to make the process of buying a gaming PC easier or more welcoming for consumers”

In short, unless there’s a pretty big rabbit that’s going to be pulled out of a hat between now and the launch of the first Steam Machines in the autumn, Valve seems to have copped out entirely from the idea of using its new systems to make the process of buying a gaming PC easier or more welcoming for consumers – and in the process, appears to have removed pretty much the entire raison d’etre of Steam Machines. The opportunity for the PC market to be grown significantly by becoming more “console-like” isn’t to do with shoving PC components into smaller boxes; that’s been happening for years, occasionally with pretty impressive results. Nor is it necessarily about reducing the price, which has also been happening for some years (and which was never going to happen with Steam Machines anyway, as Valve is of no mind to step in and become a loss-leading platform holder).

Rather, it’s about lowering the bar to entry, which remains dizzyingly high for PC gaming – not financially, but in knowledge terms. A combination of relatively high-end technical knowledge and of deliberate and cynical marketing-led obfuscation of technical terminology and product numbering has meant that the actual process of figuring out what you need to buy in order to play the games you want at a degree of quality that’s acceptable is no mean feat for an outsider wanting to engage (or re-engage) with PC games; it’s in this area, the simplicity and confidence of buying a system that you know will play all the games marketed for it, that consoles have an enormous advantage over the daunting task of becoming a PC gamer.

Lacking any guarantee of performance or simple way of understanding what sort of system you’re buying, the Steam Machines as they stand don’t do anything to make that process easier. Personally, I ought to be slap bang in the middle of the market for a Steam Machine; I’m a lapsed PC gamer with a decent disposable income who is really keen to engage with some of the games coming out in the coming year (especially some of the Kickstarted titles which hark back to RPGs I used to absolutely adore), but I’m totally out of touch with what the various specifications and numbers mean. A Steam Machine that I could buy with the confidence that it would play the games I want at decent quality would be a really easy purchase to justify; yet after an hour flicking over and back between the Steam Machines page launched during GDC and various tech websites (most of which assume a baseline of knowledge which, in my case, is a good seven or eight years out of date), I am no closer to understanding which machine I would need or what kind of price point is likely to be right for me. Balls to it; browser window full of tabs looking at tech spec mumbo-jumbo closed, PS4 booted up. Sale lost.

This would be merely a disappointment – a missed opportunity to lower the fence and let a lot more people enjoy PC gaming – were it not for the extra frisson of difficulty posed by none other than Valve’s more successful GDC announcement, the Vive VR headset. You see, one of the things that’s coming across really clearly from all the VR technology arriving on the market is that frame-rate – silky-smooth frame-rate, at least 60FPS and preferably more if the tech can manage it – is utterly vital to the VR experience, making the difference between a nauseating, headache-inducing mess and a Holodeck wet dream. Suddenly, the question of PC specifications has become even more important than before, because PCs incapable of delivering content of sufficient quality simply won’t work for VR. One of the appealing things about a Steam Machine ought to be the guarantee that I’ll be able to plug in a Vive headset and enjoy Valve’s VR, if not this year then at some point down the line; yet lacking any kind of certification that says “yes, this machine is going to be A-OK for VR experiences for now”, the risk of an expensive screw-up in the choice of machine to buy seems greater than ever before.

I may be giving Steam Machines a hard time unfairly; it may be that Valve is actually going to slap the manufacturers into line and impose a clear, transparent way of measuring and certifying performance on the devices, giving consumers confidence in their purchases and lowering the bar to entry to PC gaming. I hope so; this is something that only Valve is in a position to accomplish and that is more important than ever with VR on the horizon and approaching fast. The lack of any such system in the details announced thus far is bitterly disappointing, though. Without it, Steam Machines are nothing more than a handful of small form-factor PCs running a slightly off-kilter OS; of no interest to hobbyists, inaccessible to anyone else, and completely lacking a compelling reason to exist.

Courtesy-Gi.biz

Can Linux Ever Succeed On The Desktop?

March 16, 2015 by Michael  
Filed under Computing

Every three years I install Linux and see if it is ready for prime time yet, and every three years I am disappointed. What is so disappointing is not so much that the operating system is bad (it never has been), it is that whoever designs it refuses to think of the user.

To be clear I will lay out the same rider I have for my other three reviews. I am a Windows user, but that is not out of choice. One of the reasons I keep checking out Linux is the hope that it will have fixed the basic problems in the intervening years. Fortunately for Microsoft it never has.

This time my main computer had a serious outage caused by a dodgy Corsair (which is now a c word) power supply and I have been out of action for the last two weeks. In the meantime I had to run everything on a clapped-out Fujitsu notebook which took 20 minutes to download a webpage.

One Ubuntu Linux install later it was behaving like a normal computer. This is where Linux has always been far better than Windows – making rubbish computers behave. So I could settle down to work, right? Well, not really.

This is where Linux has consistently disqualified itself from prime-time every time I have used it. Going back through my reviews, I have been saying the same sort of stuff for years.

Coming from Windows 7, where a user can install the system and start work with no learning curve, the difference is stark. Ubuntu can’t match that. There is a ton of stuff you have to download before you can get anything that passes for an ordinary service, and this downloading is far too tricky for anyone who is used to Windows.

It is not helped by the Ubuntu Software Centre, which is supposed to make life easier for you. Say you need to download a flash player. Adobe has a flash player you can download for Ubuntu. Click on it and Ubuntu asks if you want to open the file with the Ubuntu Software Centre to install it. You would think you would want this, right? The thing is that pressing yes opens the Software Centre but does not download Adobe Flash Player. The Centre then says it can’t find the software on your machine.

Here is the problem which I wrote about nearly nine years ago – you can’t download Flash or anything proprietary because that would mean contaminating your machine with something that is not Open Sauce.

Sure, Ubuntu will download all those proprietary drivers, but you have to know to ask – an issue which has been around for so long now it is silly. The issue of proprietary drivers is only a problem for the hardcore open saucers, and there are not enough of them to keep an operating system in the dark ages for a decade. However, they have managed it.

I downloaded LibreOffice and all those other things needed to get a basic “windows experience” and discovered that all those typefaces you know and love are unavailable. They should have been in the proprietary pack but Ubuntu has a problem installing them. This means that I can’t share documents in any meaningful way with Windows users, because all my formatting is screwed.

LibreOffice is not bad, but it really is not Microsoft Word and anyone who tries to tell you otherwise is lying.

I downloaded and configured Thunderbird for mail and for a few good days it actually worked. However, yesterday it disappeared from the sidebar and I can’t find it anywhere. I am restricted to webmail and I am really hating Microsoft’s Outlook experience.

The only thing that is different between this review and the one I wrote three years ago is that there are now games which actually work, thanks to Steam. I have not tried this out yet because I am too stressed with the work backlog caused by having to work on Linux without my regular software, but there is a sense that Linux is at last moving to a point where it can be a little bit useful.

So what are the main problems that Linux refuses to address? Usability, interface and compatibility.

I know Ubuntu is famous for its shit interface, and Gnome is supposed to be better, but both look and feel dated. I also hate Windows 8’s interface, which requires you to navigate a touchscreen tablet layout when you have neither a touchscreen nor a tablet. It should have been an opportunity for open saucers to trump Windows with a nice interface – it wasn’t.

You would think that all the brains in the Linux community could come up with a simple, easy-to-use interface which gives you access to all the files you need without much trouble. The problem here is that Linux fans like to tinker; they don’t want usability and they don’t have problems with command screens. Ordinary users, particularly more recent generations, will not go near a command screen.

Compatibility issues for games have been pretty much resolved, but other key software is missing and Linux developers do not seem keen to get it on board.

I do a lot of layout and graphics work. When you complain about not being able to use Photoshop, Linux fanboys proudly point to GIMP and say it does the same things. You want to grab them by the throat and stuff their heads down the loo and flush. GIMP does less than a tenth of what Photoshop can do and it does it very badly. There is nothing available on Linux that can do what Creative Suite or any real desktop publishing package can do.

Proprietary software designed for real people using a desktop tends to trump anything open saucy, even when the open alternative is a technical marvel.

So in all these years, Linux has not attempted to fix any of the problems which have effectively crippled it as a desktop product.

I will look forward to next week when the new PC arrives and I will not need another Ubuntu desktop experience. Who knows, maybe they will have sorted it out in another three years’ time.

Courtesy-Fud

 

LG, Huawei Soon To Offer High-end Smartphones

March 13, 2015 by mphillips  
Filed under Mobile

HTC and Samsung Electronics impressed Mobile World Congress attendees with new high-end smartphones, but they won’t be the only game in town for long: LG Electronics and Huawei Technologies are gearing up to announce new devices next month.

The shortage of new flagship smartphones at the show was a bit of a disappointment. But for those who weren’t entirely convinced by the HTC One M9, Samsung’s Galaxy S6 or the Galaxy S6 edge, more devices are on the way for buyers who aren’t afraid of pricier products.

The most highly anticipated is the successor to the LG G3, which unsurprisingly is expected to be called the G4. LG has so far kept quiet on when the smartphone will be unveiled, but an event is expected to take place in April. To steal some of Samsung’s thunder, the company would do well to at least start posting teasers before April 10, which is when the Galaxy S6 and S6 edge go on sale.

In light of the growing focus on design at Mobile World Congress, it wouldn’t be surprising if LG uses better materials for the G4 than it did for the G3. But don’t necessarily bet on a nice high-end, all-metal design or a metal frame combined with a glass back (which the Galaxy S6 has).

The G3 might be made of plastic but it looked much better than the Galaxy S5. So, LG isn’t under as much pressure as Samsung was to update the looks of its flagship. Also, sticking with plastic allows the company to keep the price down.

The specifications for LG’s new smartphone are the subject of multiple rumors, and include a screen with a 1620 x 2880 pixel resolution. But I am keeping my fingers crossed for a 5.3-inch screen that keeps the G3’s 1440 x 2560 pixel resolution.

That would mean shrinking the screen size from 5.5 to 5.3 inches, which might seem like a strange move, but to me the G3 feels a bit too wide. Also, LG has shown it isn’t averse to the concept: the G Flex2 has a 5.5-inch screen instead of the 6-inch screen on the G Flex.
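
For context on what those rumored resolutions mean in practice, the pixel densities work out roughly as follows (a back-of-the-envelope calculation, not figures from the article):

```python
import math

# Rough pixel-density comparison for the rumored G4 options -- back-of-the-envelope,
# not figures quoted by LG or the article.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f"1440 x 2560 at 5.5 in: {ppi(1440, 2560, 5.5):.0f} ppi")  # the G3, ~534 ppi
print(f"1440 x 2560 at 5.3 in: {ppi(1440, 2560, 5.3):.0f} ppi")  # ~554 ppi
print(f"1620 x 2880 at 5.5 in: {ppi(1620, 2880, 5.5):.0f} ppi")  # rumored, ~601 ppi
```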

While LG is quiet on its plans for the G4 launch, Huawei has started to post teasers for an event on April 8. The date likely isn’t a random pick, since the company is expected to present the P8. It also comes before the Samsung ship date.

 

 

SUSE Goes OpenStack Cloud 5

March 13, 2015 by Michael  
Filed under Computing

SUSE has released OpenStack Cloud 5, the latest version of its infrastructure-as-a-service private cloud distro.

Version 5 puts the OpenStack brand front and centre, and its credentials are based on the latest Juno build of the OpenStack open source platform.

This version includes enhanced networking flexibility, with additional plug-ins available and the addition of distributed virtual routing. This enables individual compute nodes to handle routing tasks themselves or, if need be, to cluster together.

Increased operational efficiency comes in the form of a new seamless integration with existing servers running outside the cloud. In addition, log collection is centralized into a single view.

As you would expect, SUSE OpenStack 5 is designed to fit perfectly alongside the company’s other products, including the recently launched Suse Enterprise Storage and Suse Linux Enterprise Server 12 as well as nodes from earlier versions.

Deployment has also been simplified as part of a move to standardise “as-a-service” models.

Also included is the company’s new Sahara data processing project designed to run Hadoop and Spark on top of OpenStack without degradation. MapR has released support for its own service by way of a co-branded plug-in.

“Furthering the growth of OpenStack enterprise deployments, Suse OpenStack Cloud makes it easier for customers to realise the benefits of a private cloud, saving them money and time they can use to better serve their own customers and business,” said Brian Green, managing director, UK and Ireland, at Suse.

“Automation and high availability features translate to simplicity and efficiency in enterprise data centers.”

Suse OpenStack Cloud 5 becomes generally available from today.

Courtesy-TheInq

 

Will AMD FreeSync Appear Next Week?

March 12, 2015 by Michael  
Filed under Computing

While AMD FreeSync-capable monitors are now available in select regions, AMD has given a short update saying that the FreeSync driver will be coming on March 19th.

According to AMD, FreeSync monitors are now available in some countries in the EMEA (Europe, Middle East and Africa) region, and since “gamers are excited to bring home an incredibly smooth and tearing-free PC gaming experience powered by AMD Radeon GPUs and AMD A-Series APUs”, AMD has announced that a FreeSync-capable driver for single-GPU configurations will be available on March 19th. Unfortunately, those running an AMD CrossFire system will have to wait until April.

Plenty of manufacturers, including Acer, LG, BenQ, Iiyama and many more, have already announced their own FreeSync monitors, so there will be plenty of choice when it comes to screen size and resolution.

In case you missed it earlier, FreeSync is AMD’s response to Nvidia’s G-Sync and syncs the refresh rate of the monitor with the rendering rate of the AMD Radeon GPU, thus removing screen tearing and reducing stuttering in games. We had a chance to check it out during CES 2015 and it looked pretty good.

Most manufacturers have announced that their FreeSync monitors will be available during this month, so we will finally have a chance to see AMD’s FreeSync push in retail/e-tail.

Courtesy-Fud

MediaTek To Go With AMD GPUs

March 11, 2015 by Michael  
Filed under Computing

One of the hottest things we learned at the Mobile World Congress is that MediaTek is working with AMD on mobile SoC graphics.

This is a big deal for both companies, as it means that AMD is getting back into the ultra-low power graphics market, while MediaTek might finally get faster graphics and gain more appeal in the high-end segment. The choice of ARM Mali or Imagination Technologies GPUs is available to anyone, but as most of you know, Qualcomm has its own in-house Adreno graphics, while Nvidia uses ultra-low power Maxwell GPUs for its latest SoCs.

Since Nvidia exited the mobile phone business, it is now a two horse race between the ever dominant Qualcomm and fast growing MediaTek. The fact that MediaTek will get AMD graphics just adds fuel to the fire.

We have heard that key AMD graphics people are in continuous contact with MediaTek and that they have been working on an SoC graphics solution for a while.

MediaTek can definitely benefit from faster graphics, as the recently pictured MT8173 tablet SoC, powered by two Cortex-A72 cores clocked at up to 2.4GHz and two Cortex-A53 cores, has PowerVR GX6250 graphics (two clusters). The most popular tablet chip, Apple’s A8X, has PowerVR Series 6XT GXA6850 graphics (eight clusters), which should end up significantly faster, but at the same time significantly more expensive.

The MediaTek MT6795 is a 28nm eight-core part with a 2.2GHz clock and a PowerVR G6200 GPU at 700MHz, which is 100MHz faster than the one we tested in the Meizu MX4, one of the fastest SoCs until Qualcomm’s Snapdragon 810 came out in late February.

AMD and MediaTek declined to comment on this upcoming partnership, but our industry sources say both have been working on new graphics for future chips that will be announced at a later date. It’s cool to see that AMD will return to this market, especially as the company sold off its Imageon graphics back in 2009 – for a lousy $65 million to Qualcomm. Imageon by ATI was the foundation for Adreno graphics.

We were reassured some 18 months ago by senior AMD graphics people that “AMD didn’t forget how to make good ultra-low power graphics”, and we guess that this cooperation proves it.

Courtesy-Fud

 

Fuel Cells Could Offer Hope For Smartphone Batteries

March 10, 2015 by mphillips  
Filed under Mobile

While processors, memory and other components have advanced in leaps and bounds, progress in smartphone battery technology has been much slower over the last couple of decades.

All those people you see charging their phones at airports, coffee shops and other public places are a testament to how often batteries die out during the day. So while engineers are fighting against basic chemistry and physics to improve current Lithium Ion cells, is there a better way to recharge?

One answer might be fuel cells, which generate electricity through a chemical reaction and provide instant power anywhere. Unlike portable battery packs, they don’t need to be charged in advance. You just need a fuel cell cartridge.

The promise has been there for some time. A few years ago, electronics companies tried to popularize fuel cells based on methanol but they failed to take off. This time around, the focus is on hydrogen.

As hydrogen gas enters the fuel cell through a membrane, the electrons are stripped off and travel through an external circuit — that’s the flow of electricity. Upon exiting the fuel cell, the electrons are recombined with the ionized hydrogen and oxygen from the air, so the only by-product is water.
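
The description above matches a proton-exchange-membrane (PEM) style cell; for reference, the standard half-reactions (the article itself does not spell them out) are:

```latex
\begin{aligned}
\text{Anode:}   &\quad \mathrm{H_2 \;\rightarrow\; 2\,H^+ + 2\,e^-} \\
\text{Cathode:} &\quad \mathrm{O_2 + 4\,H^+ + 4\,e^- \;\rightarrow\; 2\,H_2O} \\
\text{Overall:} &\quad \mathrm{2\,H_2 + O_2 \;\rightarrow\; 2\,H_2O}
\end{aligned}
```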

There’s already one hydrogen fuel cell on the market, with another promised for this year. Both were on show at this year’s Mobile World Congress in Barcelona.

The main difference between them is in how the hydrogen is packaged so it’s safe to handle.

Intelligent Energy’s Upp stores it in a metal hydride compound that’s contained in a cartridge that snaps onto the fuel cell with magnets. Each cartridge is good for about 5 recharges of a smartphone and once exhausted should be returned to an exchange station for a fresh one.

The fuel cell, which is already on sale at Apple Stores in the U.K., costs £149 (US$228) and each cartridge is £6 (US$9). One downside: it’s heavy. The fuel cell and cartridges weigh 620 grams (1.3 pounds), and that’s not something you want to carry in your bag all the time.
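
At the quoted prices, and taking the stated figure of about five charges per cartridge, the per-charge cost works out roughly as follows (an illustrative calculation; the hardware write-off period is an arbitrary assumption):

```python
# Rough per-charge cost for the Upp at the quoted prices -- illustrative only.
fuel_cell_gbp = 149
cartridge_gbp = 6
charges_per_cartridge = 5  # "good for about 5 recharges", per the article

fuel_cost_per_charge = cartridge_gbp / charges_per_cartridge
print(f"Fuel cost per smartphone charge: ~£{fuel_cost_per_charge:.2f}")  # ~£1.20

# If the hardware is written off over, say, 100 cartridges' worth of use
# (an arbitrary assumption), the all-in cost per charge would be:
cartridges_over_lifetime = 100
all_in = fuel_cost_per_charge + fuel_cell_gbp / (cartridges_over_lifetime * charges_per_cartridge)
print(f"All-in cost per charge: ~£{all_in:.2f}")  # ~£1.50
```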