Panasonic On The Hunt For Acquisition Targets

March 27, 2015 by mphillips  
Filed under Consumer Electronics

Japanese electronics giant Panasonic Corp said it is gearing up to spend 1 trillion yen ($8.4 billion) on acquisitions over the next four years, bolstered by a stronger profit outlook for its automotive and housing technology businesses.

Chief Executive Kazuhiro Tsuga said at a briefing on Thursday that Panasonic doesn’t have specific acquisition targets in mind for now. But he said the firm will spend around 200 billion yen on M&A in the fiscal year that kicks off in April alone, and pledged to improve on Panasonic’s patchy track record on big deals.

“With strategic investments, if there’s an opportunity to accelerate growth, you need funds. That’s the idea behind the 1 trillion yen figure,” he said. Tsuga has spearheaded a radical restructuring at the Osaka-based company that has made it one of the strongest turnaround stories in Japan’s embattled technology sector.

Tsuga previously told Reuters that the company was interested in M&A deals in the European white goods market, a sector where Panasonic has comparatively low brand recognition.

The firm said on Thursday it’s targeting operating profit of 430 billion yen in the next fiscal year, up nearly 25 percent from the 350 billion yen it expects for the year ending March 31.

Panasonic’s earnings have been bolstered by moving faster than peers like Sony Corp and Sharp Corp to overhaul business models squeezed by competition from cheaper Asian rivals and caught flat-footed in a smartphone race led by Apple Inc and Samsung Electronics. Out has gone reliance on mass consumer goods like TVs and smartphones, and in has come a focus on areas like automotive technology and energy-efficient home appliances.

Tsuga also sought to ease concerns that an expensive acquisition could set back Panasonic’s finances, which took years to recover from the 2008 deal to buy cross-town rival Sanyo for a sum equal to about $9 billion at the time.

Oracle Launches OpenStack Platform With Intel

March 27, 2015 by Michael  
Filed under Computing

Oracle and Intel have teamed up for the first demonstration of carrier-grade network function virtualization (NFV), which will allow communication service providers to use a virtualized, software-defined model without degradation of service or reliability.

The Oracle-led project uses the Intel Open Network Platform (ONP) to create a robust service over NFV, using intelligent direction of software to create viable software-defined networking that replaces the clunky equipment still prevalent in even the most modern networks.

Barry Hill, Oracle’s global head of NFV, told The INQUIRER: “It gets us over one of those really big hurdles that the industry is desperately trying to overcome: ‘Why the heck have we been using this very tightly coupled hardware and software in the past if you can run the same thing on standard, generic, everyday hardware?’. The answer is, we’re not sure you can.

“What you’ve got to do is be smart about applying the right type and the right sort of capacity, which is different for each function in the chain that makes up a service.

“That’s about being intelligent with what you do, instead of making some broad statement about generic vanilla infrastructures plugged together. That’s just not going to work.”

Oracle’s answer is to use its Communications Network Service Orchestration Solution to control the OpenStack system and shrink and grow networks according to customer needs.

Use cases could be scaling out a carrier network for a rock festival, or transferring network priority to a disaster recovery site.
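Oracle has not published code for the orchestrator, but the basic move it describes, growing and shrinking a pool of virtual network functions through the standard OpenStack compute API, can be sketched with python-novaclient. Everything below is a hypothetical illustration: the auth endpoint, credentials, image and flavor IDs are placeholders, not part of the Oracle or Intel announcement.

# Hypothetical sketch only: scaling a pool of virtualised network functions
# (VNFs) up or down through the standard OpenStack compute API. This is not
# Oracle's orchestrator; the endpoint, credentials, image and flavor IDs are
# placeholders.
from keystoneauth1.identity import v3
from keystoneauth1 import session
from novaclient import client

def connect_to_nova():
    auth = v3.Password(auth_url="http://controller:5000/v3",
                       username="demo", password="secret", project_name="demo",
                       user_domain_id="default", project_domain_id="default")
    return client.Client("2.1", session=session.Session(auth=auth))

def scale_vnf_pool(nova, prefix, target, image_id, flavor_id):
    """Grow or shrink the set of servers whose names start with `prefix`."""
    pool = sorted((s for s in nova.servers.list() if s.name.startswith(prefix)),
                  key=lambda s: s.name)
    if len(pool) < target:            # e.g. a rock festival: scale the service out
        for i in range(len(pool), target):
            nova.servers.create(name="%s-%02d" % (prefix, i),
                                image=image_id, flavor=flavor_id)
    else:                             # demand drops or failover ends: scale back in
        for server in pool[target:]:
            server.delete()

nova = connect_to_nova()
scale_vnf_pool(nova, prefix="vfirewall", target=8,
               image_id="IMAGE-UUID", flavor_id="FLAVOR-UUID")

A production orchestrator would layer service chaining, placement hints (for example pinning a function to specific cores, as the Intel demonstration does) and monitoring on top of calls like these.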

“Once you understand the extent of what we’ve actually done here, you start to realize just how big an announcement this is,” said Hill.

“On the fly, you’re suddenly able to make these custom network requirements instantly, just using off-the-shelf technology.”

The demonstration configuration optimizes the performance of an Intel Xeon E5-2600 v3 processor designed specifically for networking, and shows for the first time a software-defined solution which is comparable to the hardware-defined systems currently in use.

In other words, it can orchestrate services from the management and orchestration level right down to a single core of a single processor, and then hyperscale them using resource pools to mimic the specialized characteristics of a network appliance, such as large memory pages.

“It’s kind of like the effect that mobile had on fixed line networks back in the mid-nineties where the whole industry was disrupted by who was providing the technology, and what they were providing,” said Hill.

“Suddenly you went from 15-year business plans to five-year business plans. The impact of virtualization will have the same level of seismic change on the industry.”

Today’s announcement is fundamentally a proof of concept, but the technology that powers this kind of next-generation network is already finding its way into live networks.

Hill explained that carrier demand had led to the innovation. “The telecoms industry had a massive infrastructure that works at a very slow pace, at least in the past,” he said.

“However, this whole virtualization push has really been about the carriers, not the vendors, getting together and saying: ‘We need a different model’. So it’s actually quite advanced already.”

NFV appears to be the next gold rush area for enterprises, and other consortia are expected to make announcements about their own solutions within days.

The Oracle/Intel system is based around OpenStack, and the company is confident that it will be highly compatible with other systems.

The ‘Oracle Communications Network Service Orchestration Solution with Enhanced Platform Awareness using the Intel Open Network Platform’ – or OCNSOSWEPAUTIONP as we like to think of it – is currently on display at Oracle’s Industry Connect event in Washington DC.

The INQUIRER wonders whether there is any way the marketing department can come up with something a bit more catchy than OCNSOSWEPAUTIONP before it goes on open sale.

Courtesy-TheInq

USB 3.1 To Arrive With New Desktops Later This Year

March 27, 2015 by mphillips  
Filed under Computing

The emerging USB 3.1 standard is on track to reach desktops as hardware companies release motherboards with ports that can transfer data twice as fast as the previous USB technology.

MSI recently announced a 970A SLI Krait motherboard that will support AMD processors and the USB 3.1 protocol. Motherboards with USB 3.1 ports have also been released by Gigabyte, ASRock and Asus, but those boards support Intel chips.

USB 3.1 can shuffle data between a host device and peripheral at 10Gbps, twice as fast as the 5Gbps of USB 3.0. USB 3.1 is also generating excitement for the reversible Type-C cable, which is the same on both ends so users don’t have to worry about plug orientation.
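For a rough sense of what the jump means in practice, the raw signalling rates can be converted into usable throughput once line-encoding overhead is taken into account (8b/10b for USB 3.0, 128b/132b for USB 3.1 Gen 2). The short calculation below is illustrative only and ignores protocol overhead; the 25GB file is just an example.

# Back-of-the-envelope throughput comparison; ignores protocol overhead.
def usable_bytes_per_second(line_rate_gbps, payload_bits, total_bits):
    """Usable bytes/s after line-encoding overhead."""
    return line_rate_gbps * 1e9 * (payload_bits / float(total_bits)) / 8

usb30 = usable_bytes_per_second(5, 8, 10)       # ~500 MB/s (8b/10b encoding)
usb31 = usable_bytes_per_second(10, 128, 132)   # ~1,212 MB/s (128b/132b encoding)

file_size = 25e9  # an example 25 GB transfer
print("USB 3.0: %.0f s, USB 3.1: %.0f s" % (file_size / usb30, file_size / usb31))
# Roughly 50 seconds versus 21 seconds for the same transfer.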

The motherboards with USB 3.1 technology are targeted at high-end desktops. Some enthusiasts like gamers seek the latest and greatest technologies and build desktops with motherboards sold by MSI, Asus and Gigabyte. Many of the new desktop motherboards announced have the Type-C port interface, which is also in recently announced laptops from Apple and Google.

New technologies like USB 3.1 usually first appear in high-end laptops and desktops, then make their way down to low-priced PCs, said Dean McCarron, principal analyst of Mercury Research.

PC makers are expected to start putting USB 3.1 ports in more laptops and desktops starting later this year.

Microsoft Confirms Windows 10 Will Support 8K Resolution

March 27, 2015 by Michael  
Filed under Computing

Software King of the World Microsoft’s Windows 10 operating system will support screen resolutions that will not be available on commercial displays for years.

At the WinHEC conference Microsoft revealed that Windows 10 will support 8K (7680*4320) resolution for monitors, a resolution unlikely to show up on the market this year or next.

It also showed off minimum and maximum resolutions supported by its upcoming Windows 10. It looks like the new operating system will support 6″+ phone and tablet screens with up to 4K (3840*2160) resolution, 8″+ PC displays with up to 4K resolution and 27″+ monitors with 8K (7680*4320) resolution.
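Those headline figures translate into very large pixel counts. The quick arithmetic below is illustrative only, not Microsoft’s numbers: 1080p is included purely for context, and the frame sizes assume uncompressed 32-bit colour.

# Pixel counts and uncompressed 32-bit frame sizes for the resolutions above.
resolutions = {"1080p": (1920, 1080), "4K UHD": (3840, 2160), "8K UHD": (7680, 4320)}

for name, (w, h) in sorted(resolutions.items(), key=lambda kv: kv[1][0]):
    pixels = w * h
    frame_mib = pixels * 4 / (1024.0 ** 2)   # 4 bytes per pixel
    print("%-7s %4d*%d = %10s pixels, ~%5.1f MiB per frame"
          % (name, w, h, format(pixels, ","), frame_mib))

# 8K has four times the pixels of 4K UHD and sixteen times 1080p, which is
# why new panel types and display interfaces are needed before it ships.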

To put this in some perspective, the boffins at NHK (Nippon Hōsō Kyōkai, Japan Broadcasting Corp.) think that the 8K ultra-high-definition television format will be the last 2D format, as 7680*4320 (and similar resolutions) is the highest 2D resolution that the human eye can process.

This means that 8K and similar resolutions will stay around for a long time, so it makes sense to add support for them to hardware and software.

NHK is already testing broadcasting in 8K ultra-high-definition resolution, VESA has ratified DisplayPort and embedded DisplayPort standards that can connect monitors with up to 8K resolution to graphics adapters, and a number of upcoming games will ship with textures suited to 8K UHD displays.

However, monitors that support 8K will not be around for some time because display makers will have to produce new types of panels for them.

Redmond will be ready for the advanced UHD monitors well before they hit the market. Many have criticized Microsoft for poor support of 4K UHD resolutions in Windows 8.

Courtesy-Fud

Facebook Opening Parse For IoT Development

March 27, 2015 by mphillips  
Filed under Around The Net

Facebook is opening up Parse, its suite of back-end software development tools, so that developers can create Internet of Things apps for items like smart home appliances and activity trackers.

By making Parse available for IoT, Facebook hopes to strengthen its ties to a wider group of developers in a growing industry via three new software development kits aimed specifically at IoT, unveiled Wednesday at the company’s F8 developer conference in San Francisco.

The tools are aimed at making it easier for outside developers to build apps that interface with Internet-connected devices. Garage door manufacturer Chamberlain, for example, uses Parse for its app to let people open and lock their garage door from their smartphones.

Or, hypothetically, the maker of a smart gardening device could use Parse to incorporate notifications into their app to remind the user to water their plants, said Ilya Sukhar, CEO of Parse, during a keynote talk at F8.
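Parse did not share code for the gardening example on stage, but its hosted back end is reachable over a plain REST call as well as through the new SDKs, so the idea can be sketched in a few lines of Python. The application keys, channel name and moisture figures below are invented for illustration.

# Hypothetical sketch of the smart-gardening example: when soil moisture drops
# below a threshold, ask Parse's hosted push service to remind the owner to
# water the plants. App keys and the channel name are placeholders.
import json
import requests

PARSE_APP_ID = "YOUR_APP_ID"        # placeholder
PARSE_REST_KEY = "YOUR_REST_KEY"    # placeholder

def send_watering_reminder(moisture_percent, threshold=20.0):
    if moisture_percent >= threshold:
        return  # soil is still damp enough, nothing to do
    response = requests.post(
        "https://api.parse.com/1/push",
        headers={"X-Parse-Application-Id": PARSE_APP_ID,
                 "X-Parse-REST-API-Key": PARSE_REST_KEY,
                 "Content-Type": "application/json"},
        data=json.dumps({"channels": ["garden-alerts"],
                         "data": {"alert": "Soil moisture is at %d%%: time to water your plants."
                                           % moisture_percent}}),
    )
    response.raise_for_status()

send_watering_reminder(moisture_percent=12)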

Facebook bought Parse in 2013, putting itself in the business of selling application development tools. Parse provides a hosted back-end infrastructure to help third party developers build their apps. Over 400,000 developers have built apps with Parse, Sukhar said on Wednesday.

Parse’s new SDKs are available on GitHub as well as on Parse’s site.

Google Said To Be Developing Bill Payment Service For Gmail

March 26, 2015 by mphillips  
Filed under Around The Net

Google reportedly is working on a service that will allow users to pay their bills directly from their Gmail accounts.

The service, dubbed Pony Express, would ask users to provide personal information, including credit card and Social Security numbers, to a third-party company that would verify their identity, according to a Re/code report on Tuesday.

Google also would work with vendors that distribute bills on behalf of service providers like insurance companies, telecom carriers and utilities, according to the article, which was based on a document seen by Re/code that describes the service.

It’s not clear whether Pony Express is the actual name of the service or if Google will change the name once it launches. It’s planned to launch by the end of the year, according to the report.

A Google spokeswoman declined to comment.

A handful of vendors such as Intuit, Invoicera and BillGrid already offer e-billing payment and invoicing software. Still, a Google service, especially one within Gmail, could be useful and convenient to consumers if the company is able to simplify the online payment process.

A benefit for Google could be access to valuable data about people’s e-commerce activities, although there would be privacy issues to sort out. Google already indexes people’s Gmail messages for advertising purposes.

Plus, the service could give Google an entry point into other areas of payment services. The company has already launched a car insurance shopping service for California residents, which it plans to expand to other states.

It’s unclear who Google’s partners would be for the service, but screen shots published by Re/Code show Cascadia Financial, a financial planning company, and food delivery service GreatFoods.

HP Takes Helion OpenStack Private

March 26, 2015 by Michael  
Filed under Computing

HP has announced its first off-the-shelf configured private cloud based on OpenStack and Cloud Foundry.

HP Helion Rack continues the Helion naming convention for HP’s cloud offerings, and will, it is hoped, help enterprise IT departments speed up cloud deployment by offering a solid template system and removing the months of design and build.

Helion Rack is a “complete” private cloud with integrated infrastructure-as-a-service and platform-as-a-service capabilities that mean it should be a breeze to get it working with cloud-dwelling apps.

“Enterprise customers are asking for private clouds that meet their security, reliability and performance requirements, while also providing the openness, flexibility and fast time-to-value they require,” said Bill Hilf, senior vice president of product management for HP Helion.

“HP Helion Rack offers an enterprise-class private cloud solution with integrated application lifecycle management, giving organisations the simplified cloud experience they want, with the control and performance they need.”

HP cites the key features of its product as rapid deployment, simplified management, easy scaling, workload flexibility, faster native-app development and, of course, the open architecture of OpenStack and Cloud Foundry, providing a vast support network for implementation, use cases and customisation.

The product is built on HP ProLiant DL servers, and is assembled by HP and configured with the HP Helion OpenStack and Development Platform. HP and its partners can then work alongside customers to find the best way to exploit the product knowing that it is up and running from day one.

HP Helion Rack will be available in April with prices varying by configuration. Finance is available for larger configurations.

Suse launched its own OpenStack Cloud 5 with Sahara data processing earlier this month, just one of many OpenStack implementations designed to help roll out the cloud revolution quickly to enterprises, but offering a complete 360-degree package is something that HP is pioneering.

Courtesy-TheInq

Lexmark Scoops Up Kofax For Nearly $1B

March 26, 2015 by mphillips  
Filed under Around The Net

Lexmark International Inc, known for its printers, said it plans to acquire Kofax Ltd in a deal worth about $1 billion that would double the size of its enterprise software business.

PC and printer makers have struggled in the recent past as companies reduced printing to cut costs and consumers shifted to mobile devices from PCs.

Hewlett-Packard Co plans to separate its computer and printer businesses from its corporate hardware and services operations this year.

Xerox Corp has also increasingly focused on IT services to make up for the falling sales of its copiers and printers.

Lexmark divested its inkjet printer business in 2013 and has since boosted its enterprise software business.

The Kofax deal will help the company’s Perceptive Software business achieve its revenue target of $500 million in 2016, Lexmark said.

The business makes software to scan everything from spreadsheets to medical images and provides services to banking, healthcare, insurance and retail companies. It contributed about 8 percent to Lexmark’s revenue in 2014 and has grown at more than 30 percent in the past two years.

Kofax provides data services to financial, insurance and healthcare companies such as Citigroup Inc, MetLife Inc and Humana Inc.

Lexmark said it expects the deal to “significantly” expand operating margins in its enterprise software business, which would now be worth about $700 million. It will also add about 10 cents per share to the company’s adjusted profit in 2015.

Broadband Providers File Suit Against FCC

March 25, 2015 by mphillips  
Filed under Around The Net

Several U.S. broadband providers have filed lawsuits against the Federal Communications Commission’s recently approved net neutrality rules, launching what is expected to be a series of legal challenges.

Broadband industry trade group USTelecom filed a lawsuit against the FCC in the U.S. Court of Appeals for the District of Columbia, which has in the past twice rejected the FCC’s net neutrality regulations.

The group argues the new rules are “arbitrary, capricious, and an abuse of discretion” and violate various laws, regulations and rulemaking procedures.

Texas-based Internet provider Alamo Broadband Inc challenged the FCC’s new rules in the U.S. Court of Appeals for the Fifth Circuit in New Orleans, making a similar argument.

The rules, approved in February and posted online on March 12, treat both wireless and wireline Internet service providers as more heavily regulated “telecommunications services,” more like traditional telephone companies.

Broadband providers are banned under the rules from blocking or slowing any traffic and from striking deals with content companies for smoother delivery of traffic to consumers.

USTelecom President Walter McCormick said in a statement that the group’s members supported enactment of “open Internet” principles into law but not using the new regulatory regime that the FCC chose.

“We do not believe the Federal Communications Commission’s move to utility-style regulation … is legally sustainable,” he said.

Industry sources have previously told Reuters that USTelecom and two other trade groups, CTIA-The Wireless Association and the National Cable and Telecommunications Association, were expected to lead the legal challenges.

Verizon Communications Inc, which won the court challenge against the FCC’s 2010 net neutrality rules, is likely to hold back from filing an individual lawsuit this time around, an industry source familiar with Verizon’s plan has told Reuters.

FCC officials have said they were prepared for lawsuits and the new rules were on much firmer legal ground than previous iterations. The FCC said Monday’s petitions were “premature and subject to dismissal.”

Cisco Uncovers Malware Targeting POS Systems

March 25, 2015 by Michael  
Filed under Computing

Cisco has revealed details of a new point of sale (PoS) attack that could part firms from money and users from personal data.

The threat has been called PoSeidon by the Cisco team and comes at a time when eyes are on security breaches at firms like Target.

Cisco said in a blog post that PoSeidon is a new threat that has the ability to breach machines and scrape them for credit card information.

Credit card numbers and keylogger data are sent to an exfiltration server, while the mechanism is able to update itself and presumably evade some detection.

Cisco’s advice is for the industry to keep itself in order and network admins to keep systems up to date.

“PoSeidon is another malware targeting PoS systems that demonstrates the sophisticated techniques and approaches of malware authors. Attackers will continue to target PoS systems and employ various obfuscation techniques in an attempt to avoid detection,” said the firm.

“As long as PoS attacks continue to provide returns, attackers will continue to invest in innovation and development of new malware families. Network administrators will need to remain vigilant and adhere to industry best practices to ensure coverage and protection against advancing malware threats.”

The security industry agrees that PoS malware is a cash cow for cyber thieves, highlighting the importance of vigilance and keeping systems up to date.

“PoS malware has been extremely productive for criminals in the last few years, and there’s little reason to expect that will change anytime soon,” said Tim Erlin, director of product management at Tripwire.

“It’s no surprise that, as the information security industry updates tools to detect this malicious software, the authors will continue to adjust and innovate to avoid detection.

“Standards like the PCI Data Security Standard can only lay the groundwork for protecting retailers and consumers from these threats. A standard like PCI can specify a requirement for malware protection, but any specific techniques included may become obsolete as malware evolves.

“Monitoring for new files and changes to files can detect when malware installs itself on a system, as PoSeidon does.”
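The file monitoring Erlin refers to is file-integrity monitoring: take a baseline of file hashes and flag anything new or changed. A bare-bones illustration of the idea, which is not Tripwire’s product and is far simpler than any real FIM tool, might look like this.

# Minimal file-integrity check: hash every file under a directory, compare the
# result with a saved baseline, report new or changed files, then refresh the
# baseline. Illustrative only; real FIM products do much more.
import hashlib, json, os, sys

def snapshot(root):
    hashes = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as handle:
                hashes[path] = hashlib.sha256(handle.read()).hexdigest()
    return hashes

if __name__ == "__main__":
    root, baseline_path = sys.argv[1], sys.argv[2]
    current = snapshot(root)
    if os.path.exists(baseline_path):
        with open(baseline_path) as handle:
            baseline = json.load(handle)
        new_files = sorted(set(current) - set(baseline))
        changed = sorted(p for p in current if p in baseline and current[p] != baseline[p])
        print("new files:", new_files)
        print("changed files:", changed)
    with open(baseline_path, "w") as handle:
        json.dump(current, handle)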

Courtesy-TheInq

SAP Mobile App May Have Allowed Hackers To Upload Fake Medical Data

March 24, 2015 by mphillips  
Filed under Around The Net

SAP has fixed two security flaws in a mobile medical app, one of which could have allowed an attacker to upload fake patient data.

The issues were found in SAP’s Electronic Medical Records (EMR) Unwired, which stores clinical data about patients including lab results and images, said Alexander Polyakov, CTO of ERPScan, a company based in Palo Alto, Calif., that specializes in enterprise application security.

Researchers with ERPScan found a local SQL injection flaw that could allow other applications on a mobile device to get access to an EMR Unwired database. That’s not supposed to happen, as mobile applications are usually sandboxed to prevent other applications from accessing their data.

“For example, you can upload malware to the phone, and this malware will be able to get access to this embedded database of this health care application,” Polyakov said in a phone interview.
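ERPScan has not published the vulnerable code, but local SQL injection of this kind usually comes down to concatenating untrusted input into a query string. The snippet below, using Python’s bundled sqlite3 purely for illustration (the table and data are invented, not SAP’s), shows the anti-pattern and the parameterised fix.

# Illustration only: the generic anti-pattern behind local SQL injection and
# the parameterised alternative. This is not SAP's code.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lab_results (patient_id TEXT, result TEXT)")
conn.executemany("INSERT INTO lab_results VALUES (?, ?)",
                 [("P-001", "normal"), ("P-002", "abnormal")])

untrusted = "P-001' OR '1'='1"  # attacker-controlled input

# UNSAFE: string concatenation lets the input rewrite the query and
# return every patient's results instead of one.
rows = conn.execute("SELECT result FROM lab_results WHERE patient_id = '"
                    + untrusted + "'").fetchall()
print("concatenated query returned", len(rows), "rows")   # 2

# SAFE: a parameterised query treats the input as a value, not as SQL.
rows = conn.execute("SELECT result FROM lab_results WHERE patient_id = ?",
                    (untrusted,)).fetchall()
print("parameterised query returned", len(rows), "rows")  # 0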

The company also found another issue in EMR Unwired, where an attacker could tamper with a configuration file and then change medical records stored on the server, according to an ERPScan advisory.

“You can send fake information about the medical records, so you can imagine what can be done after that,” Polyakov said. “You can say, ‘This patient is not ill’.”

SAP fixed both of the issues about a month ago, Polyakov said.

The German software giant also fixed another flaw about a week ago found by ERPScan researchers, which affected its mobile device management software, a mobile client that allows access to the company’s other business applications.

Juniper Networks Goes OpenStack

March 24, 2015 by Michael  
Filed under Computing

Juniper and Mirantis are getting close, with news that they are to form a cloud OpenStack alliance.

The two companies have signed an engineering partnership that the companies believe will lead to a reliable, scalable software-defined networking solution.

Mirantis OpenStack will now interoperate with Juniper Contrail Networking, as well as OpenContrail, an open source software-defined networking system.

The two companies have published a reference architecture for deploying and managing Juniper Contrail Networking with Mirantis OpenStack to simplify deployment and reduce the need for third-party involvement.

Based on OpenStack Juno, Mirantis OpenStack 6.0 will be enhanced by a Fuel plugin in the second quarter that will make it even easier to deploy large-scale clouds in house.

However, Mirantis has emphasized that the arrival of Juniper to the fold is not a snub to the recently constructed integration with VMware.

Nick Chase of Mirantis explained, “…with this Juniper integration, Mirantis will support BOTH VMware vCenter Server and VMware NSX AND Juniper Networks Contrail Networking. That means that even if they’ve got VMware in their environment, they can choose to use NSX or Contrail for their networking components.

“Of course, all of that begs the question, when should you use Juniper, and when should you use VMware? Like all great engineering questions, the answer is ‘it depends’. How you choose is going to be heavily influenced by your individual situation, and what you’re trying to achieve.”

Juniper outlined its goals for the tie-up as:

- Reduce cost by enabling service providers and IT administrators to easily embrace SDN and OpenStack technologies in their environments

- Remove the complexity of integrating networking technologies in OpenStack virtual data centres and clouds

- Increase the effectiveness of their operations with fully integrated management for the OpenStack and SDN environments through Fuel and Juniper Networks® Contrail SDN Controller

The company is keen to emphasise that this is not meant to be a middle finger at VMware, but rather a demonstration of the freedom of choice offered by open source software. However, it serves as another demonstration of how even the FOSS market is growing increasingly proprietary and competitive.

Courtesy-TheInq

Google Updates Android Smart Lock With On-body Detection

March 24, 2015 by mphillips  
Filed under Mobile

Google is adding a feature to Android’s smart lock that could significantly reduce the number of times users need to key in a passcode to unlock their phones.

On-body detection uses the accelerometer in the phone to detect when it’s being held or carried. If enabled, the feature requires a passcode the first time the phone is accessed but then keeps the device unlocked until it is placed down.

That means, for example, that users walking down the street won’t have to unlock their phones every time they take them out of their pockets.
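Google has not said how on-body detection decides the phone is still being carried, but accelerometer-based approaches generally watch how much the acceleration signal varies over a short window. The toy heuristic below is purely illustrative and is not Android’s implementation; the window size and threshold are arbitrary.

# Toy "is the phone being carried?" heuristic: keep a short window of
# accelerometer magnitudes and treat high variance as motion. Illustrative
# only; window size and threshold are arbitrary, not Android's values.
from collections import deque
from statistics import pvariance

WINDOW = 50              # number of recent samples to keep
MOTION_THRESHOLD = 0.5   # variance of |acceleration| (in (m/s^2)^2) treated as "carried"

samples = deque(maxlen=WINDOW)

def feed(accel_magnitude):
    """Record one |acceleration| reading from the accelerometer."""
    samples.append(accel_magnitude)

def seems_carried():
    """True once enough samples show the device moving with someone."""
    return len(samples) == WINDOW and pvariance(samples) > MOTION_THRESHOLD

A real implementation would also have to cope with a phone resting on a moving bus, or being handed to another person, which is exactly the caveat Google flags.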

The feature wasn’t widely announced by Google, but it began operating in some phones on Friday.

Like the other elements of smart lock, it should be used with caution as it can’t detect who is carrying the phone.

“If you unlock your device and hand it to someone else, your device also stays unlocked as long as the other person continues to hold or carry it,” reads a message displayed on phones with the new feature.

The smart lock feature was introduced with Android 5.0 Lollipop and allows users to set zones around trusted places, such as a home or office, and Wi-Fi or Bluetooth devices, such as a computer or car radio. When the phone is in those zones it will remain unlocked once it’s been unlocked the first time.

It can also recognize faces and remain unlocked when it sees a trusted face.

NASA Testing Virtual Reality Smart Glasses

March 23, 2015 by mphillips  
Filed under Consumer Electronics

NASA is testing virtual reality smart glasses that may one day assist astronauts as they travel to an asteroid or even Mars.

The space agency is using glasses from Osterhout Design Group (ODG), a San Francisco-based company that develops wearables for enterprises and government use. NASA engineers and astronauts are set to test the company’s smart glasses, which are equipped with augmented reality and virtual reality technologies. The glasses are being tested using NASA applications and software.

“The intended purpose and usefulness of glasses like this are unlimited,” said Jay Bolden, a NASA spokesman, in an email to Computerworld. “Advanced glasses could aid in navigation, where cockpit displays are broadcast on the goggles in much the same way fighter pilot heads up displays operate today.”

Bolden also noted that astronauts on a journey to an asteroid or Mars could use the smart glasses to access chart, map and technical information, instead of having to carry many pounds of technical journals and papers with them.

“For a two-hour flight on a 737 from Cleveland to Dallas, each pilot carries 15 pounds of manuals and that weight isn’t really a big deal in the grand scheme,” he noted. “However, for a multiple-week mission to an asteroid or the moon, or a multi-year mission to Mars, every pound saved means additional life-critical supplies — food, water, oxygen, or fuel — can be shipped in their place.”

The smart glasses also could give more information to NASA engineers and scientists working on Earth.

“Real time applications also include the ability for ground support teams to see first hand what astronauts discover and video,” Bolden said. “Instead of bringing a 50-pound boulder back for ground analysis, the astronaut can use glasses to scan, measure and catalog where it was found and then chip off a 5-pound sample for ground analysis.”

Medical Data Becoming More Valuable To Hackers

March 23, 2015 by mphillips  
Filed under Around The Net

The personal information stored in health care records fetches increasingly impressive sums on underground markets, making any company that stores such data a very attractive target for attackers.

“Hackers will go after anyone with health care information,” said John Pescatore, director of emerging security trends at the SANS Institute, adding that in recent years hackers have increasingly set their sights on EHRs (electronic health records).

With medical data, “there’s a bunch of ways you can turn that into cash,” he said. For example, Social Security numbers and mailing addresses can be used to apply for credit cards or get around corporate antifraud measures.

This could explain why attackers have recently targeted U.S. health insurance providers. Last Tuesday, Premera Blue Cross disclosed that the personal details of 11 million customers had been exposed in a hack that was discovered in January. Last month, Anthem, another health insurance provider, said that 78.8 million customer and employee records were accessed in an attack.

Both attacks exposed similar data, including names, Social Security numbers, birth dates, telephone numbers, member identification numbers, email addresses and mailing addresses. In the Premera breach, medical claims information was also accessed.

If the attackers try to monetize this information, the payout could prove lucrative.

Credentials that include Social Security numbers can sell for a couple of hundred dollars since the data’s lifetime is much longer compared to pilfered credit card numbers, said Matt Little, vice president of product development at PKWARE, an encryption software company with clients that include health care providers. Credit card numbers, which go for a few dollars, tend to work only for a handful of days after being reported stolen.