Oracle Identifies Its Products Affected By Heartbleed, But No Estimates On Fixes

April 18, 2014 by mphillips  
Filed under Around The Net

Oracle has issued a comprehensive list of its software that may or may not be affected by the Heartbleed vulnerability in OpenSSL, the widely used Secure Sockets Layer (SSL) library, while warning that fixes are not yet available for some likely affected products.

The list includes well over 100 products that appear to be in the clear, either because they never used the version of OpenSSL reported to be vulnerable to Heartbleed, or because they don’t use OpenSSL at all.

However, Oracle is still investigating whether another roughly 20 products, including MySQL Connector/C++, Oracle SOA Suite and Nimbula Director, are vulnerable.

Oracle determined that seven products are vulnerable and is offering fixes. These include Communications Operation Monitor, MySQL Enterprise Monitor, MySQL Enterprise Server 5.6, Oracle Communications Session Monitor, Oracle Linux 6, Oracle Mobile Security Suite and some Solaris 11.2 implementations.

Another 14 products are likely to be vulnerable, but Oracle doesn’t have fixes for them yet, according to the post. These include BlueKai, Java ME and MySQL Workbench.

Users of Oracle’s growing family of cloud services may also be able to breathe easy. “It appears that both externally and internally (private) accessible applications hosted in Oracle Cloud Data Centers are currently not at risk from this vulnerability,” although Oracle continues to investigate, according to the post.

Heartbleed, which was revealed by researchers last week, can allow attackers who exploit it to steal information on systems thought to be protected by OpenSSL encryption. A fix for the vulnerable version of OpenSSL has been released and vendors and IT organizations are scrambling to patch their products and systems.
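
For administrators doing their own triage, the affected version range is narrow: OpenSSL 1.0.1 through 1.0.1f are vulnerable, while 1.0.1g and the older 1.0.0 and 0.9.8 branches are not. A minimal first-pass check, sketched here in Python for illustration only (not an official detection tool):

    # Flags a locally installed OpenSSL build if its version falls in the
    # range reported vulnerable to Heartbleed (CVE-2014-0160): 1.0.1-1.0.1f.
    import re
    import subprocess

    VULNERABLE = re.compile(r"^1\.0\.1([a-f]|$)")  # 1.0.1 through 1.0.1f

    def openssl_version() -> str:
        # `openssl version` prints e.g. "OpenSSL 1.0.1f 6 Jan 2014"
        out = subprocess.check_output(["openssl", "version"], text=True)
        return out.split()[1]

    if __name__ == "__main__":
        version = openssl_version()
        if VULNERABLE.match(version):
            print("OpenSSL %s: LIKELY VULNERABLE, upgrade to 1.0.1g or later" % version)
        else:
            print("OpenSSL %s: not in the affected version range" % version)

Note that a version check is only a heuristic; some distributions backport the fix without changing the version string, so vendor advisories remain the authoritative source.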

Observers consider Heartbleed one of the most serious Internet security vulnerabilities in recent times.

Meanwhile, this week Oracle also shipped 104 patches as part of its regular quarterly release.

The patch batch includes security fixes for Oracle Database 11g and 12c, Fusion Middleware 11g and 12c, Fusion Applications, WebLogic Server and dozens of other products. Some 37 patches target Java SE alone.

A detailed rundown of the vulnerabilities’ relative severity has been posted to an official Oracle blog.

 

Does Samsung Fingerprint Sensor Work?

April 18, 2014 by Michael  
Filed under Computing

Security experts from Germany’s Security Research Labs have broken Samsung’s fingerprint technology by taking a fingerprint smudge from the smartphone and creating a “wood glue dummy” finger with it. Apparently the S5 falls for the fake every time.

The problem is that the scanner carries such a high trust rating within the phone that a thief who fools it also gains access to the owner’s PayPal account; neither unlocking the device nor authorizing a payment requires an additional password. PayPal has said that while it was taking the findings from Security Research Labs seriously, it was confident that fingerprint authentication offers an easier and more secure way to pay on mobile devices than passwords or credit cards.

The scan unlocks a secure cryptographic key that serves as a password replacement for the phone; that key can be deactivated for a lost or stolen device and a new one created. PayPal also uses sophisticated fraud and risk management tools to try to prevent fraud before it happens.
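
The design PayPal describes, where the fingerprint never leaves the phone and merely unlocks a per-device key that the service can individually revoke, can be sketched roughly as follows. The class and method names here are hypothetical illustrations, not PayPal’s actual implementation:

    # Hypothetical sketch of a revocable, device-bound credential registry.
    # The fingerprint unlocks a key pair stored on the phone; the service
    # keeps only the public half and can disable it if the phone is lost,
    # without the account password ever being exposed to the device.
    class CredentialRegistry:
        def __init__(self):
            self._keys = {}  # device_id -> (public_key, active)

        def enroll(self, device_id, public_key):
            # Run once when the user sets up fingerprint payments on a device.
            self._keys[device_id] = (public_key, True)

        def revoke(self, device_id):
            # Run from account settings when the phone is lost or stolen; a
            # spoofed fingerprint on the old device is then useless.
            public_key, _ = self._keys[device_id]
            self._keys[device_id] = (public_key, False)

        def is_trusted(self, device_id, public_key):
            stored_key, active = self._keys.get(device_id, (None, False))
            return active and stored_key == public_key

The weakness Security Research Labs exploited sits a layer below this scheme: if the sensor accepts a wood-glue dummy, the phone will happily unlock the key on the attacker’s behalf.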

However, you would think someone would have learned by now: a similar method was used to break the iPhone 5S’s fingerprint scanner last year. A better method was to cut the iPhone owner’s finger off; it was messier, but a lot more satisfying.

 

 

Courtesy-Fud

Did Mercury Experience Volcanism?

April 18, 2014 by Michael  
Filed under Around The Net

Explosive volcanic eruptions apparently shaped Mercury’s surface for billions of years — a surprising finding, given that until recently scientists had thought the phenomenon was impossible on the sun-scorched planet.

This discovery could shed new light on the origins of Mercury, investigators added.

On Earth, explosive volcanic eruptions can lead to catastrophic damage, such as when Mount St. Helens detonated in 1980 in the deadliest and most economically destructive volcanic event in U.S. history.

Explosive volcanism happens because Earth’s interior is rich in volatiles — water, carbon dioxide and other compounds that vaporize at relatively low temperatures. As molten rock rises from the depths toward Earth’s surface, volatiles dissolved within it vaporize and expand, increasing pressure so much that the crust above can burst like an overinflated balloon.

Mercury was long thought to be bone-dry when it came to volatiles. As such, researchers thought explosive volcanism could not happen there.

However, in 2008, after the initial flyby of Mercury by NASA’s MESSENGER spacecraft (short for MErcury Surface, Space ENvironment, GEochemistry, and Ranging), researchers found unusually bright reflective material dotting the planet’s surface.

This stuff appears to be pyroclastic ash, which is a sign of volcanic explosions. The large number of these deposits suggested that Mercury’s interior was not always devoid of volatiles, as scientists had long assumed.

It was unclear from MESSENGER’s first flybys over what time periods those explosions had occurred. Now scientists find Mercury’s volatiles did not escape in a rash of explosions early in the planet’s history. Instead, explosive volcanism apparently lasted for billions of years on Mercury.

Investigators analyzed 51 pyroclastic sites across Mercury’s surface using data from MESSENGER collected after the spacecraft began orbiting around the innermost planet in the solar system in 2011. These orbital readings provided a far more detailed view of the deposits and the vents that spewed them out compared with data from the initial flybys.

The orbital data revealed that some of the vents were much more eroded than others, indicating that the explosions did not all happen at the same time.

If the explosions did happen over a brief period and then stopped, “you’d expect all the vents to be degraded by approximately the same amount,” study lead author Timothy Goudge, a planetary scientist at Brown University, said in a statement. “We don’t see that; we see different degradation states. So the eruptions appear to have been taking place over an appreciable period of Mercury’s history.”

The researchers noted that about 90 percent of these ash deposits are located within craters formed by meteorite impacts. These deposits must have accumulated after each crater formed; if a deposit were laid down before a crater formed, it would have been destroyed by the impact that formed the crater.

Scientists can estimate the age of an impact crater by looking at how eroded its rims and walls are. Using that method, Goudge and his colleagues found pyroclastic deposits in craters ranging in age from 1 billion to more than 4 billion years old. Explosive volcanic activity was thus not confined to a brief time after Mercury’s formation about 4.5 billion years ago, researchers said.

“The most surprising discovery was the range of ages over which these deposits appear to have formed, as this really has implications for how long Mercury retained volatiles in its interior,” Goudge told Space.com.

Earlier models of how Mercury formed suggested most of its volatiles would not have survived the planet-formation process. For instance, since Mercury has an unusually large iron core, past models posited that the planet might have once been much larger, but had its outer layers and its volatiles removed by a huge impact early in the planet’s history.

This scenario now seems unlikely given these new findings, in combination with other data collected by MESSENGER showing traces of the volatiles sulfur, potassium, and sodium on Mercury’s surface.

Future research will aim to identify more of these pyroclastic deposits and their source vents.

“More detailed observations and studies of single vents and associated deposits will elucidate some of the detailed aspects of what pyroclastic activity might have been like on Mercury,” Goudge said.

The scientists detailed their findings online March 28 in the Journal of Geophysical Research: Planets.

Courtesy-Space

 

Intel Looks To Android To Boost Tablet Business

April 17, 2014 by mphillips  
Filed under Consumer Electronics

It’s becoming more obvious lately that Intel and Microsoft are no longer joined at the hip. Intel is trying desperately to make a dent in the tablet market, and with Windows struggling on those devices, Android is where it’s at.

Intel hopes to see its processors used in 40 million tablets this year, and 80% to 90% of those will be running Google’s Android OS, CEO Brian Krzanich said on Tuesday.

“Our mix of OSes reflects pretty much what you see in the marketplace,” Krzanich said during Intel’s quarterly earnings call.

Most Intel-powered tablets running Android today use the older Medfield and Clover Trail+ chips. More Android tablets running the latest Atom processor, called Bay Trail, will ship later this quarter.

That’s not to say Intel is abandoning Windows — far from it. It’s just going where the market is today. Krzanich said he expects Windows to “grow and gain traction,” and more Intel-based tablets running both Android and Windows will be shown in June at the massive Computex trade show in Taipei.

The first Android-based Bay Trail tablet, the DreamTab, was announced in January, but it hasn’t shipped yet.

Intel is chasing ARM, the U.K. company whose processor designs are used in most tablets today, including those running both Android and Apple’s iOS.

The 40 million Intel tablets that will ship this year will give the company 15% to 20% of the tablet market, Intel CFO Stacy Smith said on the earnings call.

Intel is providing discounts and development funds to tablet makers to reduce the cost of using its chips. It’s looking for growth with the white-box Chinese tablet makers, which are expected to ship up to 130 million tablets this year.

Intel chips are available in some tablets now priced under $99, but most will be priced between $125 and $250, Krzanich said.

Microsoft hasn’t made much of a dent yet in Google’s and Apple’s share of the market, but IDC estimated last month that Windows would have 10.2% of the tablet market by 2017. Dell, Toshiba, Lenovo and Hewlett-Packard have launched Windows 8 tablets with Bay Trail, and Microsoft’s own Surface Pro 2 uses an Intel Core processor, but the tablets haven’t sold well.

 

 

 

Google Glass One Day Sale Proves Successful

April 17, 2014 by mphillips  
Filed under Consumer Electronics

Google’s one-day sale of Google Glass appears to have been a success with all units sold out, a blog post by the technology titan suggests.

“All spots in the Explorer Program have been claimed for now, but if you missed it this time, don’t worry,” the Google Glass team wrote on its blog on Wednesday.

“We’ll be trying new ways to expand the Explorer program in the future.”

Google did not respond to a request for more information, but an earlier post about the one-day sale spoke of brisk sales of the $1,500 Internet-enabled headset.

“We’ve sold out of Cotton (white), so things are moving really fast,” the team wrote.

Aside from the white version, Glass was being offered in shades marketed as Charcoal, Tangerine, Shale (grey) and Sky (blue). Buyers had the choice of their favorite shade or frame. Google announced the one-day sale available to all U.S. residents over 18 last week, adding it wasn’t ready to bring the gizmo to other countries. Shoppers who missed it have to sign up for updates at the Glass website.

Only a few thousand early adopters and developers had Glass before the one-day sale, which coincided with a major software update for the heads-up display that put video calling on hold.

An official launch of Google Glass may happen later this year.

 

Reddit Going After More Users, Advertisers With New Feature

April 17, 2014 by mphillips  
Filed under Around The Net

Reddit, a website with a retro-’90s look and a space-alien mascot, which covers everything from online news to celebrity Q&As, is trying to attract even more followers, and advertising, by allowing members of its passionate community to post their own news more quickly and easily.

Reddit, majority owned by Conde Nast parent Advance Publications, last month unveiled a new feature that lets users of the nine-year-old site post live updates, allowing them to report in real time.

The live updates allow selected users, dubbed “reporters” by Reddit, to instantly stream unlimited posts during the course of an event such as the conflict in Ukraine, an earthquake in Los Angeles, or a game played in real time, without having to refresh the page.

The capability is still in testing mode. So far only users selected on a case-by-case basis can create a live thread. The feature has attracted attention. For example, live threads linked to “Twitch plays Pokemon,” in which users of the Twitch website played an old Nintendo game, garnered 2 million page views in 30 days.

“Reddit members are doing amazing things with very minimal tools and were hitting some barriers,” said Erik Martin, general manager.

Martin, who said the site is not yet profitable and declined to give specific revenue figures, added: “We want to give people a more powerful way to make updates.”

Reddit’s push to let users update fluidly is the latest move in a battle among social media sites including Facebook, Twitter and LinkedIn to use news to engage users and attract more ad dollars.

Before, Reddit users could not update in real time. The new feature is similar to how people instantly send tweets but keeps the updates together through one thread or “subreddit.”
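
Delivering posts without a page refresh implies a push channel, such as a WebSocket, rather than the browser repeatedly polling the server. A minimal sketch of such a client in Python, using a placeholder endpoint (Reddit’s actual live-thread wire protocol is not documented here):

    # Minimal live-update client sketch: hold one WebSocket open and render
    # each update as the server pushes it, instead of re-fetching the page.
    # The URL is a hypothetical placeholder, not a documented Reddit endpoint.
    import asyncio
    import json

    import websockets  # third-party: pip install websockets

    async def follow_live_thread(url):
        async with websockets.connect(url) as ws:
            async for raw in ws:  # waits until the server pushes a message
                update = json.loads(raw)
                print(update.get("author", "?"), ":", update.get("body", ""))

    if __name__ == "__main__":
        asyncio.run(follow_live_thread("wss://example.com/live/some-event"))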

Reddit, which also gets revenue through e-commerce, has ramped up efforts of late to attract more advertisers. Next week, it plans to unveil city and country targeting capabilities that allow advertisers to address users by geographic market.

One recent ad, specific to Reddit, featured the actors Jeff Goldblum and Bill Murray, stars of the movie “The Grand Budapest Hotel,” as individual threads.

Some 62 percent of Reddit users get their news through the platform while about half of all Facebook and Twitter users do the same, according to a recent report on the State of the News Media from the Pew Research Center.

“Reddit is all about the community, that is the value they brought to the site as they created it,” said Kelly McBride, a senior faculty member at the Poynter Institute, who has been following Reddit since it was founded.

“News has always been really important to Reddit,” she said.

Reddit has more than 114 million unique visitors worldwide and has doubled its traffic in 12 months, said Martin. Facebook has more than 1 billion users and Twitter has more than 240 million.

 

MediaTek Shows Off New LTE SoC

April 17, 2014 by Michael  
Filed under Computing

MediaTek has shown off one of its most interesting SoC designs to date at the China Electronic Information Expo. The MT6595 was announced a while ago, but this is apparently the first time MediaTek showcased it in action.

It is a big.LITTLE octa-core with integrated LTE support. It has four Cortex-A17 cores backed by four Cortex-A7 cores and it can hit 2.2GHz. The GPU of choice is the PowerVR G6200. It supports 4K2K video playback and recording, as well as H.265. It can deal with a 20-megapixel camera, too.

The really interesting bit is the modem. It can handle TD-LTE/FDD-LTE/WCDMA/TD-SCDMA/GSM networks, hence the company claims it is the first octa-core with on-board LTE. Qualcomm has already announced an LTE-enabled octa-core, but it won’t be ready anytime soon; the MT6595 will be, and it is expected to show up in actual devices very soon.

Of course, MediaTek is going after a different market. Qualcomm is building the meanest possible chip with four 64-bit Cortex A57 cores and four A53 cores, while MediaTek is keeping the MT6595 somewhat simpler, with smaller 32-bit cores.

Courtesy-Fud

Microsoft Updates Office Online

April 16, 2014 by mphillips  
Filed under Computing

Microsoft is updating its Web-based Office Online suite, closing the features gap with the main Office 365 and Office 2013 suites installed on users’ devices.

“We know you want features that allow you to move as seamlessly as possible between Office Online and the desktop,” wrote Kaberi Chowdhury, an Office Online technical product manager, in a blog post Monday.

Improvements to Excel Online include the ability to insert new comments, edit and delete existing comments, and properly open and edit spreadsheets that contain Visual Basic for Applications (VBA) code.

Meanwhile, Word Online has a new “pane” where users can see all comments in a document, and reply to them or mark them as completed. It also has a refined lists feature that is better able to recognize whether users are continuing a list or starting one. In addition, footnotes and end notes can now be added more conveniently inline.

PowerPoint Online has a revamped text editor that offers a layout view that more closely resembles the look of finished slides, according to Microsoft. It also has improved performance and video functionality, including the ability to play back embedded YouTube videos.

For users of OneNote Online, Microsoft is now adding the ability to print out the notes they’ve created with the application.

Microsoft is also making Word Online, PowerPoint Online and OneNote Online available via Google’s Chrome Web Store so that Chrome browser users can add them to their Chrome App launcher. Excel Online will be added later.

The improvements in Office Online will be rolled out to users this week, starting Monday.

Office Online, which used to be called Office Web Apps, competes directly against Google Docs and other browser-based office productivity suites. It’s meant to offer users a free, lightweight, Web-based version of these four applications if they don’t have the desktop editions on the device they’re using at that moment.

 

Google Reveals Email Scanning Practices In Revised Terms Of Service

April 16, 2014 by mphillips  
Filed under Around The Net

Google Inc updated its terms of service earlier this week, informing users that their incoming and outgoing emails are automatically analyzed by software to create targeted ads.

The revisions more explicitly spell out the manner in which Google software scans users’ emails, both when messages are stored on Google’s servers and when they are in transit, a controversial practice that has been at the heart of litigation.

Last month, a U.S. judge decided not to combine several lawsuits that accused Google of violating the privacy rights of hundreds of millions of email users into a single class action.

Users of Google’s Gmail email service have accused the company of violating federal and state privacy and wiretapping laws by scanning their messages so it could compile secret profiles and target advertising. Google has argued that users implicitly consented to its activity, recognizing it as part of the email delivery process.

Google spokesman Matt Kallman said in a statement that the changes “will give people even greater clarity and are based on feedback we’ve received over the last few months.”

Google’s updated terms of service added a paragraph stating that “our automated systems analyze your content (including emails) to provide you personally relevant product features, such as customized search results, tailored advertising, and spam and malware detection. This analysis occurs as the content is sent, received, and when it is stored.”

 

Mt. Gox Founder Refuses To Appear In U.S. Regarding Bankruptcy

April 16, 2014 by mphillips  
Filed under Around The Net

Mark Karpeles, the founder of Mt. Gox, has refused to come to the United States to answer questions about the Japanese bitcoin exchange’s U.S. bankruptcy case, Mt. Gox lawyers told a federal judge on Monday.

In the court filing, Mt. Gox lawyers cited a subpoena from the U.S. Department of Treasury’s Financial Crimes Enforcement Network (FinCEN), which has closely monitored virtual currencies like bitcoin.

“Mr. Karpeles is now in the process of obtaining counsel to represent him with respect to the FinCEN Subpoena. Until such time as counsel is retained and has an opportunity to ‘get up to speed’ and advise Mr. Karpeles, he is not willing to travel to the U.S.,” the filing said.

The subpoena requires Karpeles to appear and provide testimony in Washington, D.C., on Friday.

The court papers also said a Japanese court had been informed of the issue and that a hearing was scheduled on Tuesday in Japan.

Bitcoin is a digital currency that, unlike conventional money, is bought and sold on a peer-to-peer network independent of central control. Its value has soared in the last year, and the total worth of bitcoins minted is now about $7 billion.

Mt. Gox, once the world’s biggest bitcoin exchange, filed for bankruptcy protection in Japan last month, saying it may have lost nearly half a billion dollars worth of the virtual coins due to hacking into its computer system.

According to Monday’s court filings, the subpoena did not specify topics for discussion.

In the court filings, Karpeles’ lawyers asked the court to delay the bankruptcy deposition to May 5, 2014, but said that Mt. Gox could not guarantee that Karpeles would attend then either.

 

Are Deep Discounts Good For Gaming?

April 16, 2014 by Michael  
Filed under Gaming

Double Fine has warned indies of the dangers of devaluing their products, citing its new publishing initiative as a way of protecting against that outcome.

In an interview with USgamer, COO Justin Bailey expressed concern over the harmful side-effects of low price-points and deep discounting for indie games. By giving away too much for too little, he warned, indie developers could end up in a situation similar to that found in the casual market.

“I think what indies really need to watch out for is not becoming the new casual games,” he said. “I don’t think that’s a problem from the development side. Indies are approaching it as an artform and they’re trying to be innovative, but what’s happening in the marketplace is indies are being pushed more and more to have a lower price or have a bunch of games bundled together.”

Double Fine is publishing MagicalTimeBean’s Escape Goat 2, the first occasion it has assisted another developer in that way, and it won’t be the last. According to Bailey, what seems to be a purely business decision on the surface has a strong altruistic undercurrent.

“Double Fine wants to keep indies premium. You see that in our own games and how we’re positioning them. We fight the urge to just completely drop the price. That’s one of the things we want to encourage in this program. Getting people to stick to a premium price point and to the platforms that allow you to do that.”

“We’re not looking to replace… we’re trying to augment the system,” he said. “We’re making small strides right now. Costume Quest 2 is a high-budget game. It’s one that I thought it was best to have a publishing partner who can also spend some marketing funds around it.”

Double Fine is not the first developer to express concern over the tendency among indies to drastically lower prices.

In January, Jason Rohrer published an article imploring developers to consider the loyal fans who buy their games full-price only to see them on sale at a huge discount just a few weeks or months later. Last month, Positech Games’ Cliff Harris went further, suggesting that low price-points actually change the way players see and interact with the games they purchase.

Courtesy-GI.biz

 

Can Lasers Find Gravitational Waves?

April 16, 2014 by Michael  
Filed under Around The Net

The ripples from violent cosmic collisions can be felt far across the universe, and thanks to a new, sensitive detector expected to start collecting data next year, scientists might be able to see evidence of those gravitational waves from Earth for the first time.

When two neutron stars (remnants of supernova explosions) merge or when a black hole merges with a neutron star, the reverberations of the merger can extend throughout the cosmos. Light, however, only tells us so much. To learn more about the mass and motion of the collision, astronomers want to use gravitational waves, ripples in space-time created during these massive crashes.

Next year, astrophysicists are set to switch on one of the most sensitive gravitational-wave detectors ever created. The observatory is called the Laser Interferometer Gravitational-Wave Observatory (LIGO, for short). It originally had six observing runs between 2004 and 2010, and has been offline for half a decade to make upgrades. The return, its backers say, will be worth it.

“It opens us up to [viewing] a larger number of astrophysical events,” said David Reitze, the LIGO Laboratory’s principal investigator and director. One improvement will be better sensitivity in lower frequencies, which will let astronomers look for black holes of between 100 and 500 times the mass of the sun if they exist.

A new documentary about LIGO, titled “LIGO, A Passion for Understanding,” is set to premiere on Space.com April 15. You can watch it on Space.com or directly from the filmmaker Kai Staats here: http://www.kaistaats.com/film/ligo/.

From the Big Bang to big star explosions

Gravitational waves hit the headlines in March when the scientific instrument BICEP2 (short for Background Imaging of Cosmic Extragalactic Polarization) found the first direct evidence of cosmic inflation, or the huge expansion of the cosmos that happened shortly after the Big Bang.

LIGO, however, searches for waves at higher frequencies, in the 10 hertz to 10 kilohertz band. The primordial waves discovered by BICEP2, Reitze said, are 20 orders of magnitude lower in frequency.

While LIGO was not really designed to look for primordial waves, scientists did search for them. Researchers described what was then the most accurate upper limit on the primordial gravitational-wave background at high frequencies. The results were published in the journal Nature in 2009 (and have since been superseded by BICEP2, Reitze said).

Another prominent result came when scientists measured how round pulsars — super-dense, tiny, spinning remnants of supernovas — are by tracking asymmetries on their surfaces. “If it has a bump and the bump is big enough, it will generate a gravitational wave,” Reitze said.

Measurements of the Crab Nebula’s pulsar by LIGO yielded no “mountains” higher than one meter (3.3 feet).

Stopping for trains and earthquakes

LIGO originally consisted of two interferometers (L-shaped laser detectors) at Hanford, near Richland, Wash., and one in Livingston, La. The $205 million Advanced LIGO has one interferometer in each location, with the third going somewhere off-continent, likely India. The government there is calling this project “one of the hallmark science detectors,” Reitze said, and site evaluation is underway.

Wherever the interferometer goes, it has to be a region that is not too prone to earthquakes, lest the sensor get mixed up. Despite advanced stabilization technology, shaking does happen. “And we monitor the shaking to make sure it doesn’t corrupt the data,” Reitze said.

There’s a procedure at Hanford and Livingston to stop science work when earthquakes occur. Observations at Livingston also must stop temporarily if a 50-car cargo train rumbles by on a track about 1.5 miles (2.4 kilometers) away.

“There are automatic protocols (using sensors and software) which monitor earthquakes and take the interferometers out of ‘science mode’ during seismic disturbances from earthquakes,” Reitze said.

Each interferometer works by injecting a laser beam into a vacuum system, where a beam splitter divides it into two beams traveling at right angles to each other. Each beam goes to a mirror about 2.5 miles (4 kilometers) away, which reflects it back.

Gravitational waves cause tiny but measurable distortions in the laser beams, producing a “changing interference pattern” in the photo sensors that read the laser reflections, Reitze said. “In comparative scale, if you take the nucleus of an atom and take its diameter and divide that by 10,000, that’s the type of a distance change we’re looking at.”
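
A quick order-of-magnitude check of that analogy, taking an atomic nucleus to be roughly 10^-15 meters across (the exact figure varies by element), shows the dimensionless strain the instrument has to resolve:

    # Back-of-the-envelope version of Reitze's analogy, in round numbers.
    nucleus_diameter_m = 1e-15                     # order of magnitude for a nucleus
    displacement_m = nucleus_diameter_m / 10_000   # the quoted "divide by 10,000"
    arm_length_m = 4_000                           # each LIGO arm is about 4 km

    strain = displacement_m / arm_length_m
    print("displacement to resolve ~ %.1e m" % displacement_m)   # ~1.0e-19 m
    print("fractional arm-length change ~ %.1e" % strain)        # ~2.5e-23

A relative length change of a few parts in 10^23 is why the instrument needs kilometer-scale arms, seismic isolation, and pauses for passing trains and earthquakes.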

LIGO will be 10 times more sensitive than before in searching for binary neutron star mergers, and it will be better able to pick up on many other cosmic phenomena as well, such as black holes and supernovas. The principal funder was the National Science Foundation, and the California Institute of Technology leads laboratory operations.

 

Courtesy-Space

Intel Shows Off New Hybrid Laptop Geared Towards Schools

April 15, 2014 by mphillips  
Filed under Computing

Intel unveiled a laptop-tablet hybrid with Windows 8.1 for the education market, where Chromebooks and tablets are also vying for customers.

The Intel Education 2-in-1 hybrid has a 10.1-inch screen that can detach from a keyboard base to turn into a tablet. Intel makes reference designs, which are then replicated by device makers and sold to educational institutions.

The 2-in-1 has a quad-core Intel Atom processor Z3740D, which is based on the Bay Trail architecture. The battery lasts about eight hours in tablet mode, and three more hours when docked with the keyboard base, which has a second battery.

Intel did not immediately return requests for comment on the estimated price for the hybrid or when it would become available.

Education is a hotly contested market among computer makers, as Apple pushes its iPads and MacBooks while PC makers like Dell, Hewlett-Packard and Lenovo hawk their Chromebooks.

Some features in the Intel 2-in-1 are drawn from the company’s Education tablets, which also run on Atom processors, but have the Android OS.

The 2-in-1 hybrid has front-facing and rear-facing cameras, and a snap-on magnification lens that allows students to examine items at a microscopic level.

The computer can withstand a drop of 70 centimeters, a feature added as protection for instances in which children mishandle laptops and let them fall. The keyboard base also has a handle.

The screen can be swiveled and placed on the keyboard, giving it the capability of a classic convertible laptop. This feature has been drawn from Intel’s Classmate series of education laptops.

The 2-in-1 has software intended to make learning easier, including tools for the arts and science. Intel’s Kno app provides access to 225,000 books. Typically, some of the books available via Kno are free, while others are fee-based.

 

 

BlackBerry Plans To Release Patch For ‘Heartbleed’ Vulnerability

April 15, 2014 by mphillips  
Filed under Mobile

BlackBerry Ltd said it will release security updates for messaging software for Android and iOS devices by Friday to address vulnerabilities in programs related to the “Heartbleed” security threat.

Researchers last week warned they uncovered Heartbleed, a bug that targets the OpenSSL software commonly used to keep data secure, potentially allowing hackers to steal massive troves of information without leaving a trace.

Security experts initially told companies to focus on securing vulnerable websites, but have since warned about threats to technology used in data centers and on mobile devices running Google Inc’s Android software and Apple Inc’s iOS software.

Scott Totzke, BlackBerry senior vice president, told Reuters on Sunday that while the bulk of BlackBerry products do not use the vulnerable software, the company does need to update two widely used products: Secure Work Space corporate email and BBM messaging program for Android and iOS.

He said the apps are vulnerable to attack if hackers gain access to them through either WiFi connections or carrier networks.

Still, he said, “The level of risk here is extremely small,” because BlackBerry’s security technology would make it difficult for a hacker to succeed in gaining data through an attack.

“It’s a very complex attack that has to be timed in a very small window,” he said, adding that it was safe to continue using those apps before an update is issued.

Google spokesman Christopher Katsaros declined comment. Officials with Apple could not be reached.

Security experts say that other mobile apps are also likely vulnerable because they use OpenSSL code.

Michael Shaulov, chief executive of Lacoon Mobile Security, said he suspects that apps that compete with BlackBerry in an area known as mobile device management are also susceptible to attack because they, too, typically use OpenSSL code.

He said mobile app developers have time to figure out which products are vulnerable and fix them.

“It will take the hackers a couple of weeks or even a month to move from ‘proof of concept’ to being able to exploit devices,” said Shaulov.

Technology firms and the U.S. government are taking the threat extremely seriously. Federal officials warned banks and other businesses on Friday to be on alert for hackers seeking to steal data exposed by the Heartbleed bug.

Companies including Cisco Systems Inc, Hewlett-Packard Co, International Business Machines Corp, Intel Corp, Juniper Networks Inc, Oracle Corp and Red Hat Inc have warned customers they may be at risk. Some updates are out, while other vendors, like BlackBerry, are rushing to get them ready.

 

Can Micro-Consoles Compete With True Gaming Consoles?

April 15, 2014 by Michael  
Filed under Gaming

With Amazon’s Fire TV device the first out the door, the second wave of microconsoles has just kicked off. Amazon’s device will be joined in reasonably short order by one from Google, with an app-capable update of the Apple TV device also likely in the works. Who else will join the party is unclear; Sony’s Vita TV, quietly soft-launched in Japan last year, could be a fascinating contender if it had the right messaging and services behind it, but for now it’s out of the race. One thing seems certain, though: at least this time we’re actually going to have a party.

“Second wave”, you see, rather implies the existence of a first wave of microconsoles, but last time out the party was disappointing, to say the least. In fact, if you missed the first wave, don’t feel too bad; you’re in good company. Despite enthusiasm, Kickstarter dollars and lofty predictions, the first wave of microconsole devices tanked. Ouya, Gamestick and their ilk just turned out to be something few people actually wanted or needed. Somewhat dodgy controllers and weak selections of a sub-set of Android’s game library merely compounded the basic problem – they weren’t sufficiently cheap or appealing compared to the consoles reaching their end-of-life and armed with a vast back catalogue of excellent, cheap AAA software.

That was always the reality which deflated the most puffed-up “microconsoles will kill consoles” argument; the last wave of microconsoles sucked compared to consoles, not just for the core AAA gamer but for just about everyone else as well. Their hardware was poor, their controllers uncomfortable, their software libraries anaemic and their much-vaunted cost savings resulting from mobile game pricing rather than console game pricing tended to ignore the actual behaviour of non-core console gamers – who rarely buy day-one software and as a result get remarkably good value for money from their console gaming experiences. Comparing mobile game pricing or F2P models to $60 console games is a pretty dishonest exercise if you know perfectly well that most of the consumers you’re targeting wouldn’t dream of spending $60 on a console game, and never have to.

Why is the second wave of microconsoles going to be different? Three words: Amazon, Google, Apple. Perhaps Sony; perhaps even Samsung or Microsoft, if the wind blows the right direction for those firms (a Samsung microconsole, sold separately and also bundled into the firm’s TVs, as Sony will probably do with Vita TV in future Bravia televisions, would make particular sense). Every major player in the tech industry has a keen interest in controlling the channel through which media is consumed in the living room. Just as Sony and Microsoft originally entered the games business with a “trojan horse” strategy for controlling living rooms, Amazon and Google now recognise games as being a useful way to pursue the same objective. Thus, unlike the plucky but poorly conceived efforts of the small companies who launched the first wave of microconsoles, the second wave is backed by the most powerful tech giants in the world, whose titanic struggle with each other for control of the means of media distribution means their devices will have enormous backing.

To that end, Amazon has created its own game studios, focusing their efforts on the elusive mid-range between casual mobile games and core console games. Other microconsole vendors may take a different approach, creating schemes to appeal to third-party developers rather than building in-house studios (Apple, at least, is almost guaranteed to go down this path; Google could yet surprise us by pursuing in-house development for key exclusive titles). Either way, the investment in software will come. The second wave of microconsoles will not be “boxes that let you play phone games on your TV”; at least not entirely. Rather, they will enjoy dedicated software support from companies who understand that a hit exclusive game would be a powerful way to drive installed base and usage.

Moreover, this wave of microconsoles will enjoy significant retail support. Fire TV’s edge is obvious; Amazon is the world’s largest and most successful online retailer, and it will give Fire TV prime billing on its various sites. The power of being promoted strongly by Amazon is not to be underestimated. Kindle Fire devices may still be eclipsed by the astonishing strength of the iPad in the tablet market, but they’re effectively the only non-iPad devices in the running, in sales terms, largely because Amazon has thrown its weight as a retailer behind them. Apple, meanwhile, is no laggard at retail, operating a network of the world’s most profitable stores to sell its own goods, while Google, although the runt of the litter in this regard, has done a solid job of balancing direct sales of its Nexus handsets with carrier and retail sales, work which it could bring to bear effectively on a microconsole offering.

In short, the second wave microconsoles will enjoy all the advantages their predecessors did not. They’ll be backed by significant money, marketing and development effort, and will have a major presence at retail. Moreover, they’ll be “trojan horse” devices in more ways than one, since their primary purpose will be as media devices, streaming content from Amazon, Google Play, iTunes, Hulu, Netflix and so on, while also serving as solid gaming devices in their own right. Here, then, is the convergence that microconsole advocates (and the rather less credible advocates of Smart TV) have been predicting all along; a tiny box that will stream all your media off the network and also build in enough gaming capability to satisfy the mainstream of consumers. Between the microconsole under the TV and the phone in your pocket, that’s gaming all sewn up, they reckon; just as a smartphone camera is good enough for almost everyone, leaving digital SLRs and their ilk to the devoted hobbyist, the professional and the poseur, a microconsole and a smartphone will be more than enough gaming for almost everyone, leaving dedicated consoles and gaming PCs to a commercially irrelevant hardcore fringe.

There are, I think, two problems with that assessment. The first is the notion that the “hardcore fringe” who will use dedicated gaming hardware is small enough to be commercially irrelevant; I’ve pointed out before that the strong growth of a new casual gaming market does not have to come at the cost of growth in the core market, and may even support it by providing a new stream of interested consumers. This is not a zero-sum game, and will not be a zero-sum game until we reach a point where there are no more non-gaming consumers out there to introduce to our medium. Microconsoles might do very well and still cause not the slightest headache to PlayStation, Xbox or Steam.

The second problem with the assessment is a problem with the microconsoles themselves – a problem which the Fire TV suffers from very seriously, and which will likely be replicated by subsequent devices. The problem is control.

Games are an interactive experience. Having a box which can run graphically intensive games is only one side of the equation – it is, arguably, the less important side of the equation. The other side is the controller, the device through which the player interacts with the game world. The most powerful graphics hardware in the world would be meaningless without some enjoyable, comfortable, well-designed method of interaction for players; and out of the box, Fire TV doesn’t have that.

Sure, you can control games (some of them, anyway) with the default remote control, but that’s going to be a terrible experience. I’m reminded of terribly earnest people ten years ago trying to convince me that you could have fun controlling complex games on pre-smartphone phones, or on TV remote controls linked up to cable boxes; valiant efforts ultimately doomed not only by a non-existent business ecosystem but by a terrible, terrible user experience. Smartphones heralded a gaming revolution not just because of the App Store ecosystem, but because it turned out that a sensitive multi-touch screen isn’t a bad way of controlling quite a lot of games. It still doesn’t work for many types of game; a lot of traditional game genres are designed around control mechanisms that simply can’t be shoehorned onto a smartphone. By and large, though, developers have come to grips with the possibilities and limitations of the touchscreen as a controller, and are making some solid, fun experiences with it.

With Fire TV, and I expect with whatever offering Google and Apple end up making, the controller is an afterthought – both figuratively and literally. You have to buy it separately, which keeps down the cost of the basic box but makes it highly unlikely that the average purchaser will be able to have a good game experience on the device. The controller itself doesn’t look great, which doesn’t help much, but simply being bundled with the box would make a bold statement about Fire TV’s gaming ambitions. As it is, this is not a gaming device. It’s a device that can play games if you buy an add-on; the notion that a box is a “gaming device” just because its internal chips can process game software, even if it doesn’t have the external hardware required to adequately control the experience, is the kind of notion only held by people who don’t play or understand games.

This is the Achilles’ Heel of the second generation of microconsoles. They offer a great deal – the backing of the tech giants, potentially huge investment and enormous retail presence. They could, with the right wind in their sales, help to bring “sofa gaming” to the same immense, casual audience that presently enjoys “pocket gaming”. Yet the giant unsolved question remains; how will these games be controlled? A Fire TV owner, a potential casual gamer, who tries to play a game using his remote control and finds the experience frustrating and unpleasant won’t go off and buy a controller to make things better; he’ll shrug and return to the Hulu app, dismissing the Games panel of the device as being a pointless irrelevance.

The answer doesn’t have to be “bundle a joypad”. Perhaps it’ll be “tether to a smartphone”, a decision which would demand a whole new approach to interaction design (which would be rather exciting, actually). Perhaps a simple Wiimote-style wand could double as a remote control and a great motion controller or pointer. Perhaps (though I acknowledge this as deeply unlikely) a motion sensor like a “Kinect Lite” could be the solution. Many compelling approaches exist which deserve to be tried out; but one thing is absolutely certain. While the second generation of microconsoles is going to do very well in sales terms, these devices will primarily be bought as media streaming boxes, and they will never be an important games platform until the question of control gets a good answer.

Courtesy-GI.biz