Panasonic On The Hunt For Acquisition Targets

March 27, 2015 by mphillips  
Filed under Consumer Electronics

Japanese electronics giant Panasonic Corp said it is gearing up to spend 1 trillion yen ($8.4 billion) on acquisitions over the next four years, bolstered by a stronger profit outlook for its automotive and housing technology businesses.

Chief Executive Kazuhiro Tsuga said at a briefing on Thursday that Panasonic doesn’t have specific acquisition targets in mind for now. But he said the firm will spend around 200 billion yen on M&A in the fiscal year that kicks off in April alone, and pledged to improve on Panasonic’s patchy track record on big deals.

“With strategic investments, if there’s an opportunity to accelerate growth, you need funds. That’s the idea behind the 1 trillion yen figure,” he said. Tsuga has spearheaded a radical restructuring at the Osaka-based company that has made it one of the strongest turnaround stories in Japan’s embattled technology sector.

Tsuga previously told Reuters that the company was interested in M&A deals in the European white goods market, a sector where Panasonic has comparatively low brand recognition.

The firm said on Thursday it’s targeting operating profit of 430 billion yen in the next fiscal year, up nearly 25 percent from the 350 billion yen it expects for the year ending March 31.

Panasonic’s earnings have been bolstered by moving faster than peers like Sony Corp and Sharp Corp to overhaul business models squeezed by competition from cheaper Asian rivals and caught flat-footed in a smartphone race led by Apple Inc and Samsung Electronics. Out has gone reliance on mass consumer goods like TVs and smartphones, and in has come a focus on areas like automotive technology and energy-efficient home appliances.

Tsuga also sought to ease concerns that an expensive acquisition could set back its finances, which took years to recover from the deal agreed in 2008 to buy cross-town rival Sanyo for a sum equal to about $9 billion at the time.

 

 

Oracle Launches OpenStack Platform With Intel

March 27, 2015 by Michael  
Filed under Computing

Oracle and Intel have teamed up for the first demonstration of carrier-grade network function virtualization (NFV), which will allow communication service providers to use a virtualized, software-defined model without degradation of service or reliability.

The Oracle-led project uses the Intel Open Network Platform (ONP) to create a robust service over NFV, using intelligent direction of software to create viable software-defined networking that replaces the clunky equipment still prevalent in even the most modern networks.

Barry Hill, Oracle’s global head of NFV, told The INQUIRER: “It gets us over one of those really big hurdles that the industry is desperately trying to overcome: ‘Why the heck have we been using this very tightly coupled hardware and software in the past if you can run the same thing on standard, generic, everyday hardware?’. The answer is, we’re not sure you can.

“What you’ve got to do is be smart about applying the right type and the right sort of capacity, which is different for each function in the chain that makes up a service.

“That’s about being intelligent with what you do, instead of making some broad statement about generic vanilla infrastructures plugged together. That’s just not going to work.”

Oracle’s answer is to use its Communications Network Service Orchestration Solution to control the OpenStack system and shrink and grow networks according to customer needs.

Use cases could be scaling out a carrier network for a rock festival, or transferring network priority to a disaster recovery site.
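The orchestrator's core job in these scenarios can be caricatured in a few lines: watch per-instance load and grow or shrink a virtual network function's pool to match demand. The policy below is purely illustrative; the names and thresholds are not part of Oracle's actual product API.

```python
# Illustrative autoscaling policy for a virtual network function (VNF)
# pool -- the kind of decision an NFV orchestrator automates. Names and
# thresholds are hypothetical, not Oracle's API.

def scale_decision(current_instances, load_per_instance, target_load=0.7,
                   min_instances=2, max_instances=64):
    """Return the instance count that brings per-instance load near the
    target, clamped to the pool's floor and ceiling."""
    total_load = current_instances * load_per_instance
    needed = max(1, round(total_load / target_load))
    return max(min_instances, min(max_instances, needed))

# A festival spikes traffic: 8 instances each running at 140% load.
print(scale_decision(8, 1.4))   # 16 -- the pool doubles
# Overnight lull: 16 instances idling at 20% load.
print(scale_decision(16, 0.2))  # 5 -- the pool shrinks back
```

The same policy, run against a disaster-recovery site's resource pool instead of the primary one, covers the failover use case.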

“Once you understand the extent of what we’ve actually done here, you start to realize just how big an announcement this is,” said Hill.

“On the fly, you’re suddenly able to make these custom network requirements instantly, just using off-the-shelf technology.”

The demonstration configuration optimizes the performance of an Intel Xeon E5-2600 v3 processor designed specifically for networking, and shows for the first time a software-defined solution which is comparable to the hardware-defined systems currently in use.

In other words, it can orchestrate services from the management and orchestration level right down to a single core of a single processor, and then hyperscale it using resource pools to mimic the specialized characteristics of a network appliance, such as a large memory page.

“It’s kind of like the effect that mobile had on fixed line networks back in the mid-nineties where the whole industry was disrupted by who was providing the technology, and what they were providing,” said Hill.

“Suddenly you went from 15-year business plans to five-year business plans. The impact of virtualization will have the same level of seismic change on the industry.”

Today’s announcement is fundamentally a proof-of-concept, but the technology that powers this kind of next-generation network is already evolving its way into networks.

Hill explained that carrier demand had led to the innovation. “The telecoms industry had a massive infrastructure that works at a very slow pace, at least in the past,” he said.

“However, this whole virtualization push has really been about the carriers, not the vendors, getting together and saying: ‘We need a different model’. So it’s actually quite advanced already.”

NFV appears to be the next gold rush area for enterprises, and other consortia are expected to make announcements about their own solutions within days.

The Oracle/Intel system is based around OpenStack, and the company is confident that it will be highly compatible with other systems.

The ‘Oracle Communications Network Service Orchestration Solution with Enhanced Platform Awareness using the Intel Open Network Platform’ – or OCNSOSWEPAUTIONP as we like to think of it – is currently on display at Oracle’s Industry Connect event in Washington DC.

The INQUIRER wonders whether there is any way the marketing department can come up with something a bit more catchy than OCNSOSWEPAUTIONP before it goes on open sale.

Courtesy-TheInq

 

USB 3.1 To Arrive With New Desktops Later This Year

March 27, 2015 by mphillips  
Filed under Computing

The emerging USB 3.1 standard is on track to reach desktops as hardware companies release motherboards with ports that can transfer data twice as fast as the previous USB technology.

MSI recently announced a 970A SLI Krait motherboard that will support AMD processors and the USB 3.1 protocol. Motherboards with USB 3.1 ports have also been released by Gigabyte, ASRock and Asus, but those boards support Intel chips.

USB 3.1 can shuffle data between a host device and peripheral at 10Gbps, which is two times faster than USB 3.0. USB 3.1 is also generating excitement for the reversible Type-C cable, which is the same on both ends so users don’t have to worry about plug orientation.
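The doubled signaling rate is easy to put in concrete terms. Note that real-world throughput is lower than the headline figure: USB 3.0 uses 8b/10b line encoding (20% overhead) while USB 3.1 Gen 2 moves to 128b/132b (about 3%), before any protocol overhead. A back-of-the-envelope calculation, with an assumed 25GB file:

```python
# Rough transfer times at the two signaling rates, accounting only for
# line-encoding overhead (8b/10b for USB 3.0, 128b/132b for USB 3.1
# Gen 2); protocol overhead would lower real throughput further.

def transfer_seconds(size_gb, rate_gbps, encoding_efficiency=1.0):
    bits = size_gb * 8e9
    return bits / (rate_gbps * 1e9 * encoding_efficiency)

movie = 25  # GB -- an assumed large file for illustration
print(round(transfer_seconds(movie, 5, 0.8), 1))       # 50.0 seconds on USB 3.0
print(round(transfer_seconds(movie, 10, 128/132), 1))  # 20.6 seconds on USB 3.1 Gen 2
```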

The motherboards with USB 3.1 technology are targeted at high-end desktops. Some enthusiasts like gamers seek the latest and greatest technologies and build desktops with motherboards sold by MSI, Asus and Gigabyte. Many of the new desktop motherboards announced have the Type-C port interface, which is also in recently announced laptops from Apple and Google.

New technologies like USB 3.1 usually first appear in high-end laptops and desktops, then make their way down to low-priced PCs, said Dean McCarron, principal analyst of Mercury Research.

PC makers are expected to start putting USB 3.1 ports in more laptops and desktops starting later this year.

 

 

 

Microsoft Confirms Windows 10 Will Support 8K Resolution

March 27, 2015 by Michael  
Filed under Computing

Software King of the World Microsoft’s Windows 10 operating system will support screen resolutions that will not be available on commercial displays for years.

At the WinHEC conference Microsoft revealed that Windows 10 will support 8K (7680*4320) resolution for monitors, which is unlikely to show up on the market this year or next.

It also showed off minimum and maximum resolutions supported by its upcoming Windows 10. It looks like the new operating system will support 6″+ phone and tablet screens with up to 4K (3840*2160) resolution, 8″+ PC displays with up to 4K resolution and 27″+ monitors with 8K (7680*4320) resolution.
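The pixel counts behind those labels show why 8K is such a jump: each step quadruples the pixels, so 8K carries four times the pixels of 4K, which in turn is four times 1080p.

```python
# Pixel arithmetic behind the resolution labels Microsoft quoted.

def megapixels(width, height):
    return width * height / 1e6

uhd_4k = megapixels(3840, 2160)  # ~8.29 MP
uhd_8k = megapixels(7680, 4320)  # ~33.18 MP
print(uhd_8k / uhd_4k)           # 4.0 -- 8K is four 4K screens' worth of pixels
```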

To put this in some perspective, the boffins at the NHK (Nippon Hōsō Kyōkai, Japan Broadcasting Corp.) think that the 8K ultra-high-definition television format will be the last 2D format, as the 7680*4320 resolution (and similar resolutions) is the highest 2D resolution that the human eye can process.

This means that 8K and similar resolutions will stay around for a long time and it makes sense to add their support to hardware and software.

NHK is already testing broadcasting in 8K ultra-high-definition resolutions, VESA has ratified DisplayPort and embedded DisplayPort standards to connect monitors with up to 8K resolution to graphics adapters, and a number of upcoming games will be equipped with textures for 8K UHD displays.

However, monitors that support 8K will not be around for some time because display makers will have to produce new types of panels for them.

Redmond will be ready for the advanced UHD monitors well before they hit the market. Many have criticized Microsoft for poor support of 4K UHD resolutions in Windows 8.

Courtesy-Fud

 

AMD Shows Plans For ARM Servers

March 27, 2015 by Michael  
Filed under Computing

Buried in AMD’s shareholders’ report, there was some surprising detail about the outfit’s first ARM 64-bit server SoCs.

For those who came in late, they are supposed to be going on sale in the first half of 2015.

We know that the ARM Cortex-A57 architecture based SoC has been codenamed ‘Hierofalcon.’

AMD started sampling these Embedded R-series chips last year and is aiming to release the chipset in the first half of this year for embedded data center applications, communications infrastructure, and industrial solutions.

But it looks like the Hierofalcon SoC will include eight Cortex-A57 cores with 4MB L2 cache and will be manufactured on a 28nm process. It will support two 64-bit DDR3/4 memory channels with ECC up to 1866MHz and up to 128GB per CPU. Connectivity options will include two 10GbE KR, 8x SATA 3 6Gb/s, eight lanes of PCIe Gen 3, and SPI, UART and I2C interfaces. The chip will have a TDP between 15 and 30W.
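The memory figures above imply a respectable theoretical ceiling for an embedded part. A quick calculation of the peak bandwidth from two 64-bit channels at 1866 MT/s (a back-of-the-envelope figure, not an AMD-published number):

```python
# Theoretical peak memory bandwidth implied by the reported spec:
# channels * bytes-per-transfer * transfers-per-second.

def peak_bandwidth_gbs(channels, bus_bits, megatransfers):
    return channels * (bus_bits / 8) * megatransfers * 1e6 / 1e9

print(peak_bandwidth_gbs(2, 64, 1866))  # 29.856 -- just under 30 GB/s
```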

The highly integrated SoC includes 10Gb KR Ethernet and PCI-Express Gen 3 for high-speed network connectivity, making it ideal for control plane applications. It also features a dedicated security processor that enables AMD’s TrustZone technology, along with an on-board cryptographic co-processor, addressing the increased need for networked, secure systems.

Soon after Hierofalcon is out, AMD will be launching the SkyBridge platform that will feature interchangeable 64-bit ARM and x86 processors. Later in 2016, the company will be launching the K12 chip, its custom high performance 64-bit ARM core.

Courtesy-Fud

Facebook Opening Parse For IoT Development

March 27, 2015 by mphillips  
Filed under Around The Net

Facebook is opening up Parse, its suite of back-end software development tools, to create Internet of Things apps for items like smart home appliances and activity trackers.

By making Parse available for IoT, Facebook hopes to strengthen its ties to a wider group of developers in a growing industry via three new software development kits aimed specifically at IoT, unveiled Wednesday at the company’s F8 developer conference in San Francisco.

The tools are aimed at making it easier for outside developers to build apps that interface with Internet-connected devices. Garage door manufacturer Chamberlain, for example, uses Parse for its app to let people open and lock their garage door from their smartphones.

Or, hypothetically, the maker of a smart gardening device could use Parse to incorporate notifications into their app to remind the user to water their plants, said Ilya Sukhar, CEO of Parse, during a keynote talk at F8.

Facebook bought Parse in 2013, putting itself in the business of selling application development tools. Parse provides a hosted back-end infrastructure to help third party developers build their apps. Over 400,000 developers have built apps with Parse, Sukhar said on Wednesday.

Parse’s new SDKs are available on GitHub as well as on Parse’s site.
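At the level of Parse's public REST API, which the native SDKs wrap, saving a reading from a connected device amounts to an authenticated JSON POST. The sketch below only builds the request rather than sending it; the endpoint and header names follow Parse's 2015 REST documentation, while the app keys and the "Moisture" class are placeholders for illustration.

```python
# Sketch of saving a device reading via Parse's REST API (which the
# SDKs wrap). APP_ID, REST_KEY and the "Moisture" class are
# placeholders, not real credentials or a real schema.
import json

def build_save_request(app_id, rest_key, class_name, fields):
    """Return the (url, headers, body) triple for a Parse object save."""
    url = "https://api.parse.com/1/classes/%s" % class_name
    headers = {
        "X-Parse-Application-Id": app_id,
        "X-Parse-REST-API-Key": rest_key,
        "Content-Type": "application/json",
    }
    return url, headers, json.dumps(fields)

url, headers, body = build_save_request(
    "APP_ID", "REST_KEY", "Moisture",
    {"sensor": "garden-01", "level": 0.31})
print(url)  # https://api.parse.com/1/classes/Moisture
```

A smart gardening app of the kind Sukhar described would pair a POST like this from the device with a push notification to the owner's phone when the level drops.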

 

Verizon To Bolster 100G Metro Fiber-optic Network

March 26, 2015 by mphillips  
Filed under Consumer Electronics

Verizon has announced new technology to bolster its super-fast 100 Gbps fiber-optic network serving metro areas, but didn’t reveal where the work will be done or other details.

The vague announcement raised the question of whether Verizon is simply trying to show its competitive value against Google and AT&T, which have both announced fiber Internet services in a number of cities.

“I think Verizon is trying to play catch up to the others without saying it that way,” said independent analyst Jeff Kagan. “The only question I still have is will Verizon be a real competitor or is this mostly just talk to cover their butts in the rapidly changing marketplace?”

What Verizon did disclose in a news release was that it will be modernizing undisclosed portions of its so-called 100G (for 100 Gbps) metro optical network using packet-optimized networking gear from Ciena and Cisco. Testing and deployment of the Ciena 6500 optical switch and Cisco’s Network Convergence System will happen this year, with plans to go live in 2016.

“We are not announcing specific geographies at this time,” Verizon spokeswoman Lynn Staggs said in an email. She said the new equipment is not directly related to fiber connections to the premises of homes or businesses. By comparison, both Google Fiber and AT&T GigaPower are designed with 1 Gbps connections to homes, schools and businesses in mind.

Staggs said Verizon is upgrading connectivity between central Verizon offices and the backbone network. On top of that service, there is generally an “access” network for the last mile to connect the customer and the metro network, she added.

No matter how Verizon describes the ultimate purpose of its metro network, it is clear to analysts and others that Verizon’s metro upgrades could be used to prepare for last-mile fiber connections to businesses, schools and even homes to take on Google and AT&T directly. “Deploying a new coherent, optimized and highly scalable metro network means Verizon stays ahead of the growth trajectory while providing an even more robust network infrastructure for future demand,” said Lee Hicks, vice president of Verizon network planning, in a statement.

 

 

Google Said To Be Developing Bill Payment Service For Gmail

March 26, 2015 by mphillips  
Filed under Around The Net

Google reportedly is working on a service that will allow users to pay their bills from their Gmail accounts.

The service, dubbed Pony Express, would ask users to provide personal information, including credit card and Social Security numbers, to a third-party company that would verify their identity, according to a Re/code report on Tuesday.

Google also would work with vendors that distribute bills on behalf of service providers like insurance companies, telecom carriers and utilities, according to the article, which was based on a document seen by Re/code that describes the service.

It’s not clear whether Pony Express is the actual name of the service or if Google will change the name once it launches. It’s planned to launch by the end of the year, according to the report.

A Google spokeswoman declined to comment.

A handful of vendors such as Intuit, Invoicera and BillGrid already offer e-billing payment and invoicing software. Still, a Google service, especially one within Gmail, could be useful and convenient to consumers if the company is able to simplify the online payment process.

A benefit for Google could be access to valuable data about people’s e-commerce activities, although there would be privacy issues to sort out. Google already indexes people’s Gmail messages for advertising purposes.

Plus, the service could give Google an entry point into other areas of payment services. The company has already launched a car insurance shopping service for California residents, which it plans to expand to other states.

It’s unclear who Google’s partners would be for the service, but screen shots published by Re/Code show Cascadia Financial, a financial planning company, and food delivery service GreatFoods.

 

 

Azul Goes Java Embedded

March 26, 2015 by Michael  
Filed under Computing

Azul Systems, the company behind the wildly popular Zing and Zulu runtimes for Java, has been discussing its latest product, Zulu Embedded.

Azul specializes in bespoke open source Java runtimes and has announced that it is expanding into embedded product lines.

Scott Sellers, CEO and co-founder, and Howard Green, VP of marketing, were keen to extol the virtues of an embedded system.

“If you go with an Oracle system, not only do you have to pay a license fee but you are restricted to off-the-peg solutions,” explains Sellers.

“Because we are an open source solution we can create exactly what the customer needs, then feed that expertise back into the community where it will eventually end up in the official builds of Java.”

Oracle now bases its products around the open source community before releasing its own stable, closed source editions, so Zulu Embedded will often contain cutting edge functionality which is not available to standard (and paying) Java users.

“Our products are built out of a customer need. It’s not just about cost, but about finding new ways to use the Java runtime, which is still the most popular programming language in the world, and creating ways of getting it to do new things,” says Green.

The arrival of Zulu Embedded will open a whole host of opportunities for Internet of Things (IoT) building, but Sellers is keen for the product to be seen as more than just an IoT platform.

“Of course, by creating customized solutions we are able to strip out the libraries that are unnecessary and make a more nimble runtime with a smaller footprint, which makes it ideal for the IoT, but there is far more to it than that – everything from routers, to set-top boxes to ATMs,” explains Green.

The product officially launches today, but has been subject to a significant amount of testing in the field with selected customers.

“In actual fact, it has been available on a limited basis since last September and there are already over two million units running Zulu Embedded in the field,” says Green.

The product will be monetized by offering enterprise-grade support options to customers, while the product itself is freely available.

“We see the end-of-life schedule of Java SE as a major selling point for our own product,” says Green.

Oracle’s support for Java SE 7 has already expired, and it’s another two years before version 8 also reaches end-of-life. Azul, meanwhile, remains committed to its open source products indefinitely.

“Compared to all the alternatives which are either limited in lifespan or have large upfront licensing costs, we’re sure that, combined with our ongoing support, we’re the right choice for anyone wanting flexible deployment of Java,” says Sellers.

Zulu Embedded works across a huge number of platforms, including Mac, Windows and Linux, on Intel and AMD x64 architectures with ARM compatibility to follow.

It is also compatible with physical servers such as Windows Server, hypervisors including VMware and Hyper-V and cloud solutions like Microsoft Azure, Red Hat, Suse and Docker.

For Java as a language, however, Zulu Embedded is something of a return to its roots.

“Sun Microsystems [the original owners of Java] were very successful in the embedded market and paved the way for the vast number of applications that already have a Java runtime. With the end of support for Java 7, many people will be looking at where to go next,” explains Sellers.

Consumer users of Java have repeatedly lashed out at Oracle for its use of bundleware in Java installations, which recently spread to Mac users.

Zulu is available immediately from the Azul website, along with details on working with the Embedded version.

We’ve come a long way in the past nine years, when Sun and Azul were counter-suing over patents. Today, open source is the beating heart of Java, though many won’t realize it.

Courtesy-TheInq

HP Takes Helion OpenStack Private

March 26, 2015 by Michael  
Filed under Computing

HP has announced its first off-the-shelf configured private cloud based on OpenStack and Cloud Foundry.

HP Helion Rack continues the Helion naming convention for HP’s cloud offerings, and will, it is hoped, help enterprise IT departments speed up cloud deployment by offering a solid template system and removing the months of design and build.

Helion Rack is a “complete” private cloud with integrated infrastructure-as-a-service and platform-as-a-service capabilities that mean it should be a breeze to get it working with cloud-dwelling apps.

“Enterprise customers are asking for private clouds that meet their security, reliability and performance requirements, while also providing the openness, flexibility and fast time-to-value they require,” said Bill Hilf, senior vice president of product management for HP Helion.

“HP Helion Rack offers an enterprise-class private cloud solution with integrated application lifecycle management, giving organisations the simplified cloud experience they want, with the control and performance they need.”

HP cites the key features of its product as rapid deployment, simplified management, easy scaling, workload flexibility, faster native-app development and, of course, the open architecture of OpenStack and Cloud Foundry, providing a vast support network for implementation, use cases and customisation.

The product is built on HP ProLiant DL servers, and is assembled by HP and configured with the HP Helion OpenStack and Development Platform. HP and its partners can then work alongside customers to find the best way to exploit the product knowing that it is up and running from day one.

HP Helion Rack will be available in April with prices varying by configuration. Finance is available for larger configurations.

Suse launched its own OpenStack Cloud 5 with Sahara data processing earlier this month, just one of many implementations of OpenStack designed to help roll out the cloud revolution quickly to enterprises. But offering a complete 360-degree package is something that HP is pioneering.

 

Courtesy-TheInq

Lexmark Scoops Up Kofax For Nearly $1B

March 26, 2015 by mphillips  
Filed under Around The Net

Lexmark International Inc, known for its printers, said it plans to acquire Kofax Ltd in a deal worth about $1 billion that would double the size of its enterprise software business.

PC and printer makers have struggled in the recent past as companies reduced printing to cut costs and consumers shifted to mobile devices from PCs.

Hewlett-Packard Co plans to separate its computer and printer businesses from its corporate hardware and services operations this year.

Xerox Corp has also increasingly focused on IT services to make up for the falling sales of its copiers and printers.

Lexmark divested its inkjet printer business in 2013 and has since boosted its enterprise software business.

The Kofax deal will help the company’s Perceptive Software business achieve its revenue target of $500 million in 2016, Lexmark said.

The business makes software to scan everything from spreadsheets to medical images and provides services to banking, healthcare, insurance and retail companies. It contributed about 8 percent to Lexmark’s revenue in 2014 and has grown at more than 30 percent in the past two years.

Kofax provides data services to financial, insurance and healthcare companies such as Citigroup Inc, Metlife Inc and Humana Inc.

Lexmark said it expects the deal to “significantly” expand operating margins in its enterprise software business, which would now be worth about $700 million. It will also add about 10 cents per share to the company’s adjusted profit in 2015.

 

 

Facebook To Open Messenger App To Third-Party Developers

March 25, 2015 by mphillips  
Filed under Around The Net

Facebook’s Messenger app has mostly been used for keeping in touch with friends. Now people can also use it to send each other money. In the future, it could become a platform that other apps could use, if recent rumors prove true.

This Wednesday and Thursday at its F8 conference in San Francisco, Facebook will show off new tools to help third-party developers build apps, deploy them on Facebook and monetize them through Facebook advertising.

Among those tools might be a new service for developers to publish content or features of their own inside Messenger, according to a TechCrunch article. Facebook did not respond to requests for comment.

Such a service could make Messenger more useful, if the right developers sign on. Search features, photo tools or travel functions could be incorporated into Messenger and improve users’ chats around events or activities.

However, Messenger already lets users exchange money, and it also handles voice calls. Layer on more services and Messenger could become bloated and inconvenient to use.

In other words, making Messenger a platform would be a gamble.

A more versatile Messenger could generate new user data Facebook could leverage for advertising, helping it counter a user growth slowdown in recent quarters. It could also boost Facebook’s perennial efforts to increase participants in its developer platform and the number of users of its third-party apps.

Even if Facebook doesn’t turn Messenger into a platform at F8, it will likely do so in the future, said John Jackson, an IDC analyst focused on mobile business strategies. For the same reasons Facebook might turn Messenger into a platform, it could do the same for other apps like WhatsApp or Instagram, he said.

“The objective is to enrich and multiply the nature of interactions on the platform,” providing valuable data along the way, he said.

 

Broadband Providers File Suit Against FCC

March 25, 2015 by mphillips  
Filed under Around The Net

Several U.S. broadband providers have filed lawsuits against the Federal Communications Commission’s recently approved net neutrality rules, launching what is expected to be a series of legal entanglements.

Broadband industry trade group USTelecom filed a lawsuit against the FCC in the U.S. Court of Appeals for the District of Columbia, which has in the past twice rejected the FCC’s net neutrality regulations.

The group argues the new rules are “arbitrary, capricious, and an abuse of discretion” and violate various laws, regulations and rulemaking procedures.

Texas-based Internet provider Alamo Broadband Inc challenged the FCC’s new rules in the U.S. Court of Appeals for the Fifth Circuit in New Orleans, making a similar argument.

The rules, approved in February and posted online on March 12, treat both wireless and wireline Internet service providers as more heavily regulated “telecommunications services,” more like traditional telephone companies.

Broadband providers are banned under the rules from blocking or slowing any traffic and from striking deals with content companies for smoother delivery of traffic to consumers.

USTelecom President Walter McCormick said in a statement that the group’s members supported enactment of “open Internet” principles into law but not using the new regulatory regime that the FCC chose.

“We do not believe the Federal Communications Commission’s move to utility-style regulation … is legally sustainable,” he said.

Industry sources have previously told Reuters that USTelecom and two other trade groups, CTIA-The Wireless Association and the National Cable and Telecommunications Association, were expected to lead the expected legal challenges.

Verizon Communications Inc, which won the 2010 lawsuit against the FCC, is likely to hold back from filing an individual lawsuit this time around, an industry source familiar with Verizon’s plan has told Reuters.

FCC officials have said they were prepared for lawsuits and the new rules were on much firmer legal ground than previous iterations. The FCC said Monday’s petitions were “premature and subject to dismissal.”

 

 

Cisco Uncovers Malware Targeting POS Systems

March 25, 2015 by Michael  
Filed under Computing

Cisco has revealed details of a new point of sale (PoS) attack that could part firms from money and users from personal data.

The threat has been called PoSeidon by the Cisco team and comes at a time when eyes are on security breaches at firms like Target.

Cisco said in a blog post that PoSeidon is a new threat that has the ability to breach machines and scrape them for credit card information.

Credit card numbers and keylogger data are sent to an exfiltration server, while the mechanism is able to update itself and presumably evade some detection.
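Memory scrapers of this kind typically sift process memory for digit runs and keep only candidates that pass the Luhn check, the checksum that all major payment card numbers satisfy. A minimal validator shows how cheap that filtering step is for an attacker (the card number below is a well-known test value, not a real account):

```python
# Luhn checksum -- the filter PoS scrapers use to separate card
# numbers from random digit runs found in memory.

def luhn_valid(digits):
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:       # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9       # equivalent to summing the two digits
        total += d
    return total % 10 == 0

print(luhn_valid("4111111111111111"))  # True  -- standard Visa test number
print(luhn_valid("4111111111111112"))  # False -- one digit off fails the check
```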

Cisco’s advice is for the industry to keep itself in order and network admins to keep systems up to date.

“PoSeidon is another malware targeting PoS systems that demonstrates the sophisticated techniques and approaches of malware authors. Attackers will continue to target PoS systems and employ various obfuscation techniques in an attempt to avoid detection,” said the firm.

“As long as PoS attacks continue to provide returns, attackers will continue to invest in innovation and development of new malware families. Network administrators will need to remain vigilant and adhere to industry best practices to ensure coverage and protection against advancing malware threats.”

The security industry agrees that PoS malware is a cash cow for cyber thieves, highlighting the importance of vigilance and keeping systems up to date.

“PoS malware has been extremely productive for criminals in the last few years, and there’s little reason to expect that will change anytime soon,” said Tim Erlin, director of product management at Tripwire.

“It’s no surprise that, as the information security industry updates tools to detect this malicious software, the authors will continue to adjust and innovate to avoid detection.

“Standards like the PCI Data Security Standard can only lay the groundwork for protecting retailers and consumers from these threats. A standard like PCI can specify a requirement for malware protection, but any specific techniques included may become obsolete as malware evolves.

“Monitoring for new files and changes to files can detect when malware installs itself on a system, as PoSeidon does.”

Courtesy-TheInq

SAP Mobile App May Have Allowed Hackers To Upload Fake Medical Data

March 24, 2015 by mphillips  
Filed under Around The Net

SAP has fixed two security flaws in a mobile medical app, one of which could have allowed an attacker to upload fake patient data.

The issues were found in SAP’s Electronic Medical Records (EMR) Unwired, which stores clinical data about patients including lab results and images, said Alexander Polyakov, CTO of ERPScan, a company based in Palo Alto, Calif., that specializes in enterprise application security.

Researchers with ERPScan found a local SQL injection flaw that could allow other applications on a mobile device to get access to an EMR Unwired database. That’s not supposed to happen, as mobile applications are usually sandboxed to prevent other applications from accessing their data.

“For example, you can upload malware to the phone, and this malware will be able to get access to this embedded database of this health care application,” Polyakov said in a phone interview.
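The class of flaw ERPScan describes arises whenever SQL is assembled by string concatenation, so crafted input escapes the intended query. The details of EMR Unwired's embedded database are not public, so sqlite3 stands in below, with an illustrative schema:

```python
# SQL injection in miniature: string-built SQL leaks every row, while
# a parameterized query treats the same input as inert data. sqlite3
# and this schema stand in for the app's embedded database.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE records (patient TEXT, result TEXT)")
db.execute("INSERT INTO records VALUES ('alice', 'normal')")
db.execute("INSERT INTO records VALUES ('bob', 'abnormal')")

malicious = "alice' OR '1'='1"

# Vulnerable: the input is spliced into the SQL text itself.
leaked = db.execute(
    "SELECT result FROM records WHERE patient = '%s'" % malicious).fetchall()
print(len(leaked))  # 2 -- every patient's record comes back

# Safe: the input is bound as a parameter, never parsed as SQL.
bound = db.execute(
    "SELECT result FROM records WHERE patient = ?", (malicious,)).fetchall()
print(len(bound))   # 0 -- no patient has that literal name
```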

The company also found another issue in EMR Unwired, where an attacker could tamper with a configuration file and then change medical records stored on the server, according to an ERPScan advisory.

“You can send fake information about the medical records, so you can imagine what can be done after that,” Polyakov said. “You can say, ‘This patient is not ill’.”

SAP fixed both of the issues about a month ago, Polyakov said.

The German software giant also fixed another flaw about a week ago found by ERPScan researchers, which affected its mobile device management software, a mobile client that allows access to the company’s other business applications.