Microsoft Unveils Hologram Visor

January 23, 2015 by mphillips  
Filed under Consumer Electronics

Microsoft Corp surprised the tech world with the unveiling of a prototype hologram visor that can bring the Minecraft video game, Skype calls and even the landscape of Mars to three-dimensional life.

The veteran tech pioneer, which long ago lost the mantle of the world’s most inventive company, is making a bold play to regain that title in the face of stiff competition from Google Inc and Apple Inc.

Virtual or enhanced reality is the next frontier in computing interaction, with Facebook Inc focusing on its Oculus virtual reality headset and Google working on its Glass project.

Microsoft said its wire-free Microsoft HoloLens device will be available around the same time as Windows 10 this autumn. Industry analysts were broadly excited at the prospect, but skeptical that it could produce a working model at a mass-market price that soon.

“That was kind of an ‘Oh wow!’ moment,” said Mike Silver, an analyst at Gartner who tried out the prototype on Wednesday. “You would expect to see a relatively high-priced model this year or next year, then maybe it’ll take another couple of years to bring it down to a more affordable level.”

Microsoft does not have a stellar record of bringing ground-breaking technology to life. Its Kinect motion-sensing game device caused an initial stir but never gripped the popular imagination.

The company showed off a crude test version of the visor – essentially jerry-rigged wires and cameras pulled over the head – to reporters and industry analysts at a gathering at its headquarters near Seattle.

It did not allow any photographs or video of the experience, but put some images on its website.

ARM Develops IoT Course For Students

January 23, 2015 by Michael  
Filed under Computing

ARM has created a course to teach IoT skills to students at University College London (UCL).

The course is designed to encourage graduates in science, technology, engineering and maths (Stem) to seek careers in IT.

The IoT Education Kit will teach students how to use the Mbed IoT operating system to create smartphone apps that control mini-robots or wearable devices.

Students are expected to be interested in building their own IoT business, or joining IoT-focused enterprises like ARM. The course will also try to limit the number of Stem graduates pursuing non-technology careers.

ARM cited a 2012 study by Oxford Policy and Research showing how many graduates end up in non-Stem jobs: 36 percent of male and 51 percent of female engineering graduates, 44 and 53 percent of technology graduates, and 64 and 66 percent of computer scientists.

The IoT Education Kit will be rolled out by UCL’s Department of Electronics from September 2015, with a week-long module for full-time and continuing professional development students.

The Kit comprises a complete set of teaching materials, Mbed-enabled hardware boards made by Nordic Semiconductor, and software licensed from ARM. A second teaching module for engineering graduates is being developed for 2016.

“Students with strong science and mathematical skills are in demand and we need to make sure they stay in engineering,” said ARM CTO Mike Muller.

“The growth of the IoT gives us a great opportunity to prove to students why our profession is more exciting and sustainable than others.”

UCL professor Izzat Darwazeh also highlighted the importance of Stem skills, saying that “many students are not following through to an engineering career and that is a real risk to our long-term success as a nation of innovators”.

Courtesy-TheInq

LG Denies Claims Of Overheating Qualcomm Phone Processors

January 23, 2015 by mphillips  
Filed under Mobile

South Korean smartphone maker LG Electronics Inc said on Thursday that it has not experienced any overheating problems with Qualcomm Inc’s new Snapdragon processor that is powering a curved-screen device going on sale later this month.

“I am very much aware of the various concerns in the market about the (Snapdragon) 810, but the chip’s performance is quite satisfactory,” Woo Ram-chan, LG vice president for mobile product planning, told reporters at a press event for the company’s G Flex2 smartphone.

The comment came after Bloomberg reported a day earlier that Samsung Electronics Co Ltd, the world’s top smartphone maker, decided not to use the new Qualcomm processor for the next flagship Galaxy S smartphone after the chip overheated during testing. Samsung and Qualcomm have declined to comment on the report, which cited unidentified sources.

Samsung is widely expected to unveil the new Galaxy S smartphone in early March, and Bloomberg reported that the Korean firm will use its own processors instead.

But LG’s Woo said on Thursday that internal tests for the G Flex2, powered by the new Qualcomm processor, show that the new product emits less heat than other existing devices. The new phone is scheduled to start selling in South Korea on Jan. 30.

“I don’t understand why there is an issue over heat,” he said.

Rumors Say Samsung Is Quietly Seeking To Buy BlackBerry

January 23, 2015 by Michael  
Filed under Mobile

For a while, the rumor mill has churned out yarns claiming that Samsung is set to buy the Canadian smartphone maker BlackBerry.

Each time, the deal has seemed to fall through, and it has never happened.

However the Financial Post has found evidence that this time Samsung is actively pursuing a plan to take over or buy a significant stake in BlackBerry.

The story is still a rumour because both companies have denied that such a plan is in the works, but a document obtained by the Financial Post, prepared for Samsung by New York-based independent investment bank Evercore Partners, outlines the case for, and the potential structure of, a possible purchase of BlackBerry.

The document is somewhat dated, having been written in the last quarter of 2014, but a source familiar with the matter said that Samsung remains very interested in acquiring all or part of BlackBerry for the right price.

J.K. Shin, Samsung’s co-chief executive, told The Wall Street Journal that his company is in talks to use some of BlackBerry’s technology in the South Korean company’s devices, but is not interested in an acquisition. “We want to work with BlackBerry and develop this partnership, not acquire the company.”

But it appears that Samsung was caught off guard by a Reuters leak earlier this week. It had hoped to move in quickly on BlackBerry while the company’s share price stayed low. When the news broke and the share price rose, its bid looked a little weak.

BlackBerry appears to have learned of the price Samsung was hoping to pay through the Reuters leak, before the company could make a formal offer. This is the sort of thing Samsung wanted to avoid.

BlackBerry believed that, over five years, the returns from the turnaround strategy implemented by John Chen would be worth more than the cash it would receive from a sale today.

Still, the source maintains that Samsung is still keen on making a deal happen. The talk earlier this week about Samsung extending its cooperation with BlackBerry, which was notably lacking in specifics, is “just setting it up,” the source said. “Samsung hasn’t walked away” from an acquisition. “They’re leaning towards it.”

Courtesy-Fud

Google, The Wireless Carrier?

January 23, 2015 by mphillips  
Filed under Mobile

Google has put in place the framework for its own cellular service by acquiring capacity on the networks of Sprint and T-Mobile USA, according to news reports.

The sprawling search company would sell the service directly to consumers, according to The Wall Street Journal, which cited unnamed sources. Tech news site The Information reported on the deals earlier this week.

Google is heavily involved in mobile through its Android operating system, the world’s most widely used mobile OS, as well as through selling mobile advertising, and is pushing to make more radio spectrum available for wireless services. But the partnerships with Sprint and T-Mobile would bring the company into the cellular business itself, offering Google phone plans directly to consumers.

The deals would make Google an MVNO (mobile virtual network operator), a carrier that doesn’t build or operate its own network but sells services that run on the partners’ infrastructure. Sprint is the third-largest U.S. mobile carrier and T-Mobile is the fourth largest.

As a powerful and well-heeled newcomer, Google might disrupt the cellular industry, just as it has the wired broadband business with its Google Fiber service. The U.S. mobile industry has been wracked by new business models and falling prices in recent years.

It’s not clear whether the company will launch a full-scale national effort or a more limited rollout. There are terms in Google’s contract with Sprint that would allow for renegotiation if Google draws a huge number of subscribers, the Journal said.

Facebook Going After Hoaxes, Fake News Stories

January 22, 2015 by mphillips  
Filed under Around The Net

Facebook Inc said it has put measures in place to clamp down on “hoaxes” and fake news stories that can spread rapidly on its 1.35-billion-member online social network.

The company said it had introduced an option allowing Facebook users to flag a story as “purposefully fake or deceitful news” to reduce the distribution of news stories reported as hoaxes.

Facebook said it will not remove fake news stories from its website. Instead, the company’s algorithm, which determines how widely user posts are distributed, will take into account hoax reports.

“A post with a link to an article that many people have reported as a hoax or chose to delete will get reduced distribution in the News Feed,” Facebook explained.
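Facebook has not published the mechanics of this demotion, but the behaviour it describes — reducing a post’s distribution as hoax reports and post-share deletions accumulate — can be sketched roughly. The function name, the 10 percent threshold, and the maximum 50 percent penalty below are illustrative assumptions, not Facebook’s actual algorithm:

```python
def distribution_score(base_score: float, hoax_reports: int,
                       deletes_after_share: int, viewers: int) -> float:
    """Illustrative down-weighting of a post's News Feed distribution.

    A post that many viewers flagged as a hoax (or deleted after
    sharing) gets a reduced score: it is demoted, never removed,
    matching the behaviour Facebook describes.
    """
    if viewers == 0:
        return base_score
    # Fraction of viewers who reported the link as fake or deleted their share.
    signal = (hoax_reports + deletes_after_share) / viewers
    # Hypothetical penalty: ramps up to a 50% reduction once ~10% of viewers flag it.
    penalty = min(1.0, signal / 0.10) * 0.5
    return base_score * (1.0 - penalty)
```

An unflagged post keeps its full score, while a heavily reported one is distributed far less widely — consistent with Facebook’s note that flagged stories stay on the site but reach fewer feeds.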

Facebook has become an increasingly important source of news, with 30 percent of adults in the U.S. consuming news on the world’s largest social network, according to a 2013 study by the Pew Research Center in collaboration with the John S. and James L. Knight Foundation.

Facebook cited stories about dinosaur sightings and research supposedly proving the existence of Santa Claus as examples of fake news stories.

Facebook said “satirical” content, such as news stories “intended to be humorous, or content that is clearly labeled as satire,” should not be affected.

Samsung Dumps Qualcomm Processors In Next Galaxy S Phone

January 22, 2015 by mphillips  
Filed under Mobile

Samsung Electronics Co Ltd will not use Qualcomm Inc’s processors for the next version of the South Korean technology giant’s flagship Galaxy S smartphone, according to Bloomberg.

Such an outcome would be a blow for Qualcomm’s prospects for 2015, with the company already having guided for weaker-than-usual annual revenue growth in a five-year outlook issued in November. Samsung, the world’s No.1 smartphone maker, has been one of the U.S. company’s top customers.

Qualcomm’s new Snapdragon 810 chip overheated during Samsung’s testing, Bloomberg reported. The South Korean company will use its own processors instead, Bloomberg said.

A Qualcomm spokesman declined to comment on the report. A Samsung spokeswoman said the company does not comment on rumours.

Analysts have said the Snapdragon 810 chip has been dealing with a variety of performance issues that may not be corrected in time for the launch of Samsung’s next Galaxy S smartphone.

The South Korean firm is widely expected to unveil the device on the sidelines of the Mobile World Congress trade show in early March. Samsung will need to ensure that the phone does not disappoint in order to keep its global market share from slipping further, analysts said.

Samsung has already used its own Exynos processors in flagship devices such as the Galaxy S5 to some extent, though analysts said Qualcomm’s Snapdragon chips were more widely used. Greater adoption of Exynos chips in Samsung smartphones would help boost sales for the struggling foundry business.

“Samsung will likely show off the new Galaxy S phone in about a month and a half, so one would have to assume that the chips have been tested a fair amount in order for them to be used,” said HMC Investment analyst Greg Roh.

The ESA Goes With Red Hat For Cloud Services

January 22, 2015 by Michael  
Filed under Computing

The European Space Agency (ESA) has deployed a private, on-premise cloud platform designed to serve its community in Europe. The infrastructure is partly based on a custom version of Red Hat Enterprise Linux (RHEL).

The ESA Cloud needs to be constantly available to the space agency’s large user base, ensuring high levels of reliability and flexibility and the management capabilities of a modern IT environment, according to Red Hat.

Hosted applications include software development and testing, satellite data processing, document management and “more traditional” corporate IT services used during day-to-day operations.

The ESA Cloud infrastructure is based on systems from VCE, including a blade architecture with x86 CPUs, and cloud management software from Orange Business Services.

RHEL is one of the platforms supported within the ESA Cloud, and the space agency worked closely with Red Hat to customise the enterprise OS.

The customisation and implementation phase was particularly important, the ESA said, because its requirements are “dramatically” different to those of any other enterprise.

The scenarios Red Hat and the ESA IT team had to deal with were quite often “absolutely new”, the company stated.

The ESA Cloud is designed to provide complex virtual environments “within minutes” to end users, shortening the time needed to reach an organisation’s business and scientific targets.

Monitoring computing resources consumed in real time is another important feature of ESA’s private cloud, allowing the IT team to optimise the available capacity to support specific agency projects.

The first ESA Cloud data center is ready for production in Frascati, Italy, and the space agency has already completed a similar site in Darmstadt, Germany.

Future targets include increasing the number of available services, and disaster recovery capabilities to face “any possible large-scale calamity”.

Courtesy-TheInq

Do Game Developers Have Unrealistic Expectations?

January 22, 2015 by Michael  
Filed under Gaming

Over the last few years, the industry has seen budget polarization on an enormous scale. The cost of AAA development has ballooned, and continues to do so, pricing out all but the biggest war chests, while the indie and mobile explosions are rapidly approaching the point of inevitable over-saturation and consequent contraction. Stories about the plight of mid-tier studios are ten-a-penny, with the gravestones of some notable players lining the way.

For a company like Ninja Theory, in many ways the archetypal mid-tier developer, survival has been a paramount concern. Pumping out great games (Ninja Theory has a collective Metacritic average of 75) isn’t always enough. Revitalizing a popular IP like DMC isn’t always enough. Working on lucrative and successful external IP like Disney Infinity isn’t always enough. When the fence between indie and blockbuster gets thinner and thinner, it becomes ever harder to balance upon.

Last year, Ninja Theory took one more shot at the upper echelons. For months the studio had worked on a big budget concept which would sit comfortably alongside the top-level, cross-platform releases of the age: a massive, multiplayer sci-fi title that would take thousands of combined, collaborative hours to exhaust. Procedurally generated missions and an extensive DLC structure would ensure longevity and engagement. Concept art and pre-vis trailers in place, the team went looking for funding. Razor was on its way.

Except the game never quite made it. Funding failed to materialize, and no publisher would take the project on. It didn’t help that the search for a publishing deal arrived almost simultaneously with the public announcement of Destiny. Facing an impossible task, the team abandoned the project and moved on with other ideas. Razor joined a surprisingly large pile of games that never make it past the concept stage.

Sadly, it’s not a new story. In fact, at the time, it wasn’t even a news story. But this time Ninja Theory’s reaction was different. This was a learning experience, and learning experiences should be shared. Team lead and co-founder Tameem Antoniades turned the disappointment not just into a lesson, but a new company ethos: involve your audience at an early stage, retain control, fund yourself, aim high, and don’t compromise. The concept of the Independent AAA Proposition, enshrined in a GDC presentation given by Antoniades, was born.

Now the team has a new flagship prospect, cemented in this fresh foundation. In keeping with the theme of open development and transparency, Hellblade is being created with the doors to its development held wide open, with community and industry alike invited to bear witness to the minutiae of the process. Hellblade will be a cross-platform game with all of the ambition for which Ninja Theory is known, and yet it is coming from an entirely independent standpoint. Self-published and self-governed, Hellblade is the blueprint for Ninja Theory’s future.

“We found ourselves as being one of those studios that’s in the ‘squeezed middle’,” project lead Dominic Matthews says. “We’re about 100 people, so we kind of fall into that space where we could try to really diversify and work on loads of smaller projects, but indie studios really have an advantage over us, because they can do things with far lower overheads. We have been faced with this choice of, do we go really, really big with our games and become the studio that is 300 people or even higher than that, and try to tick all of these boxes that the blockbuster AAA games need now.

“We don’t really want to do that. We tried to do that. When we pitched Razor, which we pitched to big studios, that ultimately didn’t go anywhere. That was going to be a huge game; a huge game with a service that would go on for years and would be a huge, multiplayer experience. Although I’m sure it would have been really cool to make that, it kind of showed to us that we’re not right to try to make those kinds of games. Games like Enslaved – trying to get a game like that signed now would be impossible. The way that it was signed, there would be too much pressure for it to be…to have the whole feature set that justifies a $60 price-tag.

“That $60 price-tag means games have to add multiplayer, and 40 hours of gameplay minimum, and a set of characters that appeal to as many people as they possibly can. There’s nothing wrong with games that do that. There’s some fantastic games that do, AAA games. Though we do think that there’s another space that sits in-between. I think a lot of indie games are super, super creative, but they can be heavily stylised. They work within the context of the resources that people have.

“We want to create a game that’s like Enslaved, or like DMC, or like Heavenly Sword. That kind of third-person, really high quality action game, but make it work in an independent model.”

Cutting out the middle-man is a key part of the strategy. But if dealing with the multinational machinery of ‘big pubs’ is what drove Ninja Theory to make such widespread changes, there must surely have been some particularly heinous deals that pushed it over the edge?

“I think it’s just a reality of the way that those publisher/developer deals work,” Matthews says. “In order for a publisher to take a gamble on your game and on your idea, you have to give up a lot. That includes the IP rights. It’s just the realities of how things work in that space. For us, I think any developer would say the same thing, being able to retain your IP is a really important thing. So far, we haven’t been out to do that.

“With Hellblade, it’s really nice that we can be comfortable in the fact that we’re not trying to appeal to everyone. We’re not trying to hit unrealistic forecasts. Ultimately, I think a lot of games have unrealistic forecasts. Everyone knows that they’re unrealistic, but they have to have these unrealistic forecasts to justify the investment that’s going into development.

“Ultimately, a lot of games, on paper, fail because they don’t hit those forecasts. Then the studios and the people that made those games, they don’t get the chance to make any more. It’s an incredibly tough market. Yes, we’ve enjoyed working with our publishers, but that’s not to say that the agreements that developed are all ideal, because they’re not. The catalyst to us now being able to do this is really digital distribution. We can break away from that retail $60 model, where every single game has to be priced that way, regardless of what it is.”

With publishers driven to fund only games that will comfortably shift five or six million units, Matthews believes they have no choice but to stick to the safe bets, a path that eventually winnows diversity down to the point of stagnation, where only a few successful genres ever get made: FPS, sports, RPG, maybe racing. Those genres become less and less distinct, while simultaneously shoe-horning in mechanics that prove popular elsewhere and shunning true innovation.

While perhaps briefly sustainable, Matthews sees that as a creative cul-de-sac. Customers, he feels, are too smart to put up with it.

“I think consumers are going to get a bit wary. Get a bit wary of games that have hundreds of millions of dollars spent on them. I think gamers are going to start saying, ‘For what?’

“The pressures are for games to appeal to more and more people. It used to be if you sold a million units, then that was OK. Then it was three million units. Now it’s five million units. Five million units is crazy. We’ve never sold five million units.”

It’s not just consumers who are getting wise, though. Matthews acknowledges that the publishers also see the dead-end approaching.

“I think something has to be said for the platform holders now. Along with digital distribution, the fact that the platform holders are really opening their doors and encouraging self-publishing and helping independent developers to take on some of those publishing responsibilities has changed things for us. I think it will change things for a lot of other developers.

“Hellblade was announced at the GamesCom PlayStation 4 press conference. My perception of that press conference was that the real big hitters in it were all independent titles. It’s great that the platform holders have recognised that. There’s a real appetite from their players for innovative, creative games.

“It’s a great opportunity for us to try to do things differently. Like on Hellblade, we’re questioning everything that we do. Not just on development, but also how we do things from a business perspective as well. Normally you would say, ‘Well, you involve these types of agencies, get these people involved in this, and a website will take this long to create.’ The next thing that we’re doing is, we’re saying, ‘Well, is that true? Can we try and do these things a different way,’ because you can.

“There’s definitely pressure for us to fill all those gaps left by a publisher, but it’s a great challenge for us to step up to. Ultimately, we have to transition into a publisher. That’s going to happen at some point, if we want to publish our own games.”

Courtesy-GI.biz

IBM Seeking Cloud Expansion Through Acquisitions

January 22, 2015 by mphillips  
Filed under Computing

IBM will favor purchases that strengthen its cloud services, the company’s CFO said Tuesday, as it seeks ways to expand its business after 11 straight quarters of declining revenue.

“Most of our acquisitions will probably be on an ‘as a service’ basis, as opposed to an on-premise model,” CFO Martin Schroeter said during IBM’s quarterly earnings call, in response to a question.

“That’s the nature of the market and where we have a lot of opportunity, because we don’t play in some of those areas today,” he said.

IBM could use the growth. On Tuesday it said revenue for the last quarter declined across all major segments — hardware, software and services. Profits were down as well, though they beat the forecast of financial analysts polled by Thomson Reuters.

IBM sees cloud services as one of its best chances for growth, as sales of its more traditional products, including mainframes and Unix servers, continue to decline.

Two years ago it bought SoftLayer to help it compete with Amazon Web Services, and last year it bought Cloudant, which provides a database as a service, and Lighthouse Security Group, another cloud provider. This year, it looks like more cloud deals will be in the works.

Meanwhile, CEO Ginni Rometty has been selling off businesses that produce little or no profit. In October, she announced a plan to sell IBM’s chip manufacturing business for US$1.3 billion to Global Foundries, and before that she sold its x86 server business to Lenovo.

So IBM’s revenue is shrinking in part by design, but it needs to expand its other, more profitable businesses to compensate for the losses. And that isn’t yet happening at a fast enough rate.

Technology Has Had Negative Impact On Privacy, Survey Reveals

January 21, 2015 by mphillips  
Filed under Around The Net

Internet users in the U.S., France, Germany and other nations are increasingly concerned about the impact technology has on privacy, and feel legal protections are insufficient.

In 11 of the 12 countries surveyed as part of a report published by Microsoft, respondents said that technology’s effect on privacy was mostly negative. Most concerned were people in Japan and France, where 68 percent of the respondents thought technology has had a mostly negative impact on privacy.

A majority want better legal protections and say the rights of Internet users should be governed by local laws irrespective of where companies are based.

Internet users in India, Indonesia and Russia were the least concerned, according to the survey. In general, those in developing countries were less bothered.

Surveys like this one should always be looked at with a healthy dose of skepticism. But there is little doubt that people are wary of how their personal data is used by companies and governments, according to John Phelan, communications officer at European consumer organization BEUC.

That people shouldn’t take privacy for granted has been highlighted on several occasions in just the last week.

Shortly after the horrific Paris shootings, British Prime Minister David Cameron was criticized for saying that authorities should have the means to read all encrypted traffic.

Also, U.S. mobile operator Verizon Wireless found itself in hot water over the way one of its advertising partners used the Unique Identifier Headers Verizon embeds in its customers’ Internet traffic to recreate tracking cookies that had been deleted by users. Online advertising company Turn defended its practices, but still said on Friday it would stop using the method by next month.

Worries about privacy aren’t likely to subside anytime soon, with more devices becoming connected as part of the expected Internet of Things boom.

The “Views from Around the Globe: 2nd Annual Poll on How Personal Technology is Changing our Lives” survey queried 12,002 Internet users in the U.S., China, India, Brazil, Indonesia, South Africa, South Korea, Russia, Germany, Turkey, Japan and France.

NSA Reportedly Hijacked And Repurposed Third-Party Malware

January 21, 2015 by mphillips  
Filed under Around The Net

In addition to having its own bag of digital tricks, the U.S. National Security Agency reportedly hijacks and repurposes third-party malware.

The NSA is using its network of servers around the world to monitor botnets made up of thousands or millions of infected computers. When needed, the agency can exploit features of those botnets to insert its own malware on the already compromised computers, through a technology codenamed Quantumbot, German news magazine Der Spiegel reported.

One of the secret documents leaked by former NSA contractor Edward Snowden and published by Der Spiegel contains details about a covert NSA program called DEFIANTWARRIOR that’s used to hijack botnet computers and use them as “pervasive network analysis vantage points” and “throw-away non-attributable CNA [computer network attack] nodes.”

This means that if a user’s computer is infected by cybercriminals with some malware, the NSA might step in, deploy their own malware alongside it and then use that computer to attack other interesting targets. Those attacks couldn’t then be traced back to the NSA.

According to the leaked document, this is only done for foreign computers. Bots that are based in the U.S. are reported to the FBI Office of Victim Assistance.

The NSA also intercepts and collects data that is stolen by third-party malware programs, especially those deployed by other foreign intelligence agencies, if it is valuable. It refers to this practice as “fourth party collection.”

In 2009, the NSA tracked a Chinese cyberattack against the U.S. Department of Defense and was eventually able to infiltrate the operation. It found that the Chinese attackers were also stealing data from the United Nations so it continued to monitor the attackers while they were collecting internal UN data, Der Spiegel reported.

It goes deeper than that. One leaked secret document contains an NSA worker’s account of a case of fifth party collection. It describes how the NSA infiltrated the South Korean CNE (computer network exploitation) program that targeted North Korea.

“We found a few instances where there were NK officials with SK implants on their boxes, so we got on the exfil [data exfiltration] points, and sucked back the data,” the NSA staffer wrote in the document. “However, some of the individuals that SK was targeting were also part of the NK CNE program. So I guess that would be the fifth party collect you were talking about.”

In other words, the NSA spied on a foreign intelligence agency that was spying on a different foreign intelligence agency that had interesting data of its own.

Sometimes the NSA also uses the servers of unsuspecting third parties as scapegoats, Der Spiegel reported. When exfiltrating data from a compromised system, the data is sent to such servers, but it is then intercepted and collected en route through the NSA’s vast upstream surveillance network.

Canonical To Jump Deeper Into The IoT With Ubuntu

January 21, 2015 by Michael  
Filed under Computing

Canonical has announced a new version of the Ubuntu operating system designed to bring a united front to the Internet of Things (IoT), after a preview alpha was trialed late last year.

The super-stripped down, lightweight Snappy Ubuntu Core is designed to allow developers to create IoT applications quickly and easily and release them securely across the network.

This means that many devices whose firmware would previously have gone unpatched after vulnerabilities such as Heartbleed can now be updated quickly, easily and silently.

Apps are at the heart of the infrastructure, with app store functionality able to offer off-the-peg firmware, applications and runtime libraries to help facilitate common standards across the IoT.

“We found that the IoT required a way of installing apps similar to the way you do on your phone,” Maarten Ectors, Ubuntu VP for the IoT, told The INQUIRER.

“Developers can have app stores for things that don’t have app stores today. That could be your vacuum cleaner, it could be your robot, it could be a drone.”

The company hopes that robotics will be a large part of Snappy’s success, and is working closely with a range of start-ups and Kickstarter projects to bring home automation and intelligent robotics to life.

“As people add more items and add complexity to their home networks, they want stuff to just work and to keep working, no matter what vulnerabilities we discover in the huge mountain of open source software that is powering all of it,” added Mark Shuttleworth, founder of Ubuntu.

“Many of these items that you’ll be buying will be Ubuntu anyway, but Snappy will allow them to be fully robust, fully automated and fully secure.”

Ubuntu Core has a tiny footprint. It can run with as little as a 600MHz processor and 128MB of RAM, with suitable ARM baseboards starting at $35 retail.

Snappy is also x86-compatible, and this flexibility means IoT products could eventually be mass produced for a matter of pennies.

Last year Broadcom offered a similar device called the Wiced Sense, a $20 kit aimed at helping to design IoT prototypes.

The first Snappy Ubuntu Core products are expected to be announced in the second quarter. Expect to see a lot of them on Christmas lists for 2015.

Courtesy-TheInq

 

Microsoft Looking Into Light Beams

January 21, 2015 by Michael  
Filed under Around The Net

Microsoft Researchers have worked out a way that means you will never have to plug in your phone again.

Yunxin Liu, Zhen Qin and Chunshui Zhao from Microsoft Research’s Beijing campus have developed a new system they call AutoCharge.

The researchers’ paper said that “wireless power methods have several disadvantages, preventing them from being used in our targeted usage scenarios”.

The electromagnetic radiation involved in wireless power transfer is much stronger than that of wireless communications such as Wi-Fi or 3G, so safety to human bodies is a big issue. As a result, wireless power is usually used only in extreme scenarios such as outer space, military applications, or very short ranges.

Because the radio frequencies used in wireless power are much lower than the frequencies of light, it is hard to confine the radio waves to a straight beam. This wastes energy if the receiver is not large enough, and makes it hard to ensure safety.

The current crop of wireless charging solutions for smartphones typically requires special phone cases and ‘charging pads’, and works using electromagnetic induction. Power is transmitted only over a few centimetres.

However, the researchers came up with a way of using solar power techniques to charge smartphones.

Indoor ambient light is usually much weaker than sunlight and so cannot charge a smartphone. Instead of relying on the sun, the team built a prototype charger that can be mounted on a ceiling, automatically locate a smartphone lying on a table, and charge it using a directed beam of light.

The light charger has two modes. In the ‘detection’ mode, it uses a camera and image recognition software to detect objects with the size and shape of a smartphone lying on a table. The charger will rotate until it detects an object that looks like a smartphone.

The device then enters charging mode and turns on its light. The prototype used an UltraFire CREE XM-L T6 Focusing LED Flashlight.
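The paper does not publish the detection code, but the idea of the ‘detection’ mode is simple: scan the camera image for a bright region whose proportions match a smartphone. A toy Python sketch of that check (thresholds and function names are illustrative assumptions, not the researchers’ implementation):

```python
import numpy as np

def find_phone_like_object(frame, min_aspect=1.7, max_aspect=2.4):
    """Return the bounding box (top, left, height, width) of a bright
    region whose aspect ratio looks like a smartphone, or None.

    `frame` is a 2-D intensity image; typical phones are roughly twice
    as tall as they are wide, hence the default aspect-ratio window.
    """
    ys, xs = np.nonzero(frame > 0)          # pixels brighter than the table
    if len(ys) == 0:
        return None
    top, bottom = ys.min(), ys.max()
    left, right = xs.min(), xs.max()
    h, w = bottom - top + 1, right - left + 1
    aspect = max(h, w) / min(h, w)
    if min_aspect <= aspect <= max_aspect:
        return (int(top), int(left), int(h), int(w))
    return None

# A 100x100 "table" image with a 40x20 phone-shaped bright patch on it.
table = np.zeros((100, 100))
table[10:50, 30:50] = 1.0
print(find_phone_like_object(table))   # prints (10, 30, 40, 20)
```

In the real prototype this check would run repeatedly as the charger rotates, and a positive match is what switches the device into charging mode.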

Courtesy-Fud

Twitter Acquires Marketing Start-up ZipDial

January 21, 2015 by mphillips  
Filed under Mobile

Twitter Inc announced plans to acquire Indian mobile phone marketing start-up ZipDial, reportedly for $30 million to $40 million, as the U.S. microblogging service looks to expand in the world’s second-biggest mobile market.

Bengaluru-based ZipDial gives clients phone numbers for use in marketing campaigns. Consumers call the numbers and hang up before connecting and incurring charges, and then receive promotion-related text messages.
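The flow is essentially a tiny protocol: drop the call before it connects, log the caller, reply by SMS. A toy Python simulation of that interaction (the phone number and message are made-up placeholders, not ZipDial data):

```python
def handle_missed_call(caller_number, brand, outbox):
    """Simulate ZipDial's missed-call flow.

    The inbound call is rejected before it connects, so the caller
    pays nothing; the number is logged and a promotional text message
    is queued in reply.
    """
    connected = False  # call is always dropped before pickup
    outbox.append((caller_number, f"Thanks for dialling! {brand} has an offer for you."))
    return connected

# Demo: one consumer gives the campaign number a missed call.
outbox = []
handle_missed_call("+91-0000000000", "KFC", outbox)
print(len(outbox))   # prints 1
```

The business insight is in the first line of the function: because the call never connects, the entire opt-in costs the consumer nothing, which is what made the mechanism so popular in price-sensitive markets.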

The start-up’s clients include International Business Machines Corp, Yum! Brands Inc’s KFC and Procter & Gamble Co’s Gillette.

The service capitalizes on a local tradition of communicating through so-called missed calls. A person may give a friend a missed call to signal arrival at an agreed destination, for instance, without having to pay the cost of a phone call.

Such “unique behavior” was behind ZipDial, the start-up said in a statement announcing the Twitter deal.

Twitter did not disclose terms of the purchase. TechCrunch, citing unidentified sources, reported the deal at $30 million to $40 million.

“This acquisition significantly increases our investment in India, one of the countries where we’re seeing great growth,” Twitter said in a statement.

The acquisition is the latest in India by global tech giants who have snapped up companies in a fledgling startup scene, concentrated in the tech hub of Bengaluru in southern India.

Last year, Facebook Inc bought Little Eye Labs, a start-up that builds performance analysis and monitoring tools for mobile apps. Yahoo! Inc bought Bookpad, whose service allows developers to add document viewing and editing to their own applications.