Flickering Hard Drive LED Can Be Used By Hackers

February 24, 2017 by  
Filed under Computing

The mostly ignored blinking lights on servers and desktop PCs may give away secrets if a hacker can hijack them with malware.

Researchers in Israel have come up with an innovative hack that turns a computer’s LED light into a signaling system that shows passwords and other sensitive data.

The researchers at Ben-Gurion University of the Negev demonstrated the hack in a YouTube video posted Wednesday. It shows a hacked computer broadcasting the data through a computer’s LED light, with a drone flying nearby reading the pattern.

The researchers designed the scheme to underscore vulnerabilities of air-gapped systems, or computers that have been intentionally disconnected from the internet.

Air-gapped systems generally carry highly confidential information or operate critical infrastructure. But the researchers have been coming up with sneaky ways to extract data from these computers, like using the noise from the PC’s fan or hard drive to secretly broadcast the information to a nearby smartphone.

Their latest hack leverages the hard disk drive’s LED activity light, which is found on many servers and desktop PCs and indicates when the drive is being read from or written to.

The researchers found that with malware, they could control the LED light to emit binary signals by flashing on and off. That flickering could send out a maximum of 4,000 bits per second, or enough to leak out passwords, encryption keys and files, according to their paper. It’s likely no one would notice anything wrong.
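
To make the idea concrete, here is a toy sketch in Python of what the transmit side of such a covert channel could look like. It is not the researchers’ implementation: the file name, bit timing and framing are arbitrary choices for illustration, and a real exfiltration tool would run far faster and take more care to bypass the operating system’s caching.

```python
import os
import time

BIT_DURATION = 0.02  # seconds per bit; far slower than the ~4,000 bit/s reported in the paper

def transmit(data: bytes, scratch_path: str = "led_scratch.bin") -> None:
    """Leak `data` one bit at a time through hard-drive activity.

    A '1' bit is signalled by forcing physical disk writes for one bit
    window (keeping the drive's activity LED lit); a '0' bit by idling
    (LED dark). A camera watching the LED can recover the bit stream.
    """
    fd = os.open(scratch_path, os.O_WRONLY | os.O_CREAT, 0o600)
    try:
        for byte in data:
            for i in range(8):
                bit = (byte >> (7 - i)) & 1
                deadline = time.monotonic() + BIT_DURATION
                if bit:
                    # Keep the disk busy: write a little and flush it so the
                    # OS cache cannot absorb the activity.
                    while time.monotonic() < deadline:
                        os.write(fd, b"\0" * 4096)
                        os.fsync(fd)
                    os.lseek(fd, 0, os.SEEK_SET)
                else:
                    time.sleep(BIT_DURATION)
    finally:
        os.close(fd)
        os.remove(scratch_path)

if __name__ == "__main__":
    transmit(b"pw:hunter2")
```

A receiver only needs to film the LED and threshold its brightness in each bit window to recover the stream, which is exactly the role the drone plays in the researchers’ demonstration.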

“The hard drive LED flickers frequently, and therefore the user won’t be suspicious about changes in its activity,” said Mordechai Guri, who led the research, in a statement.

To read the signals from the LED light, all that’s needed is a camera or an optical sensor to record the patterns. The researchers found they could read the signal from 20 meters away, outside the building. With an optical zoom lens, that range could be even longer.

It wouldn’t be easy for hackers to pull off this trick. They’d have to design malware to control the LED light and then somehow place it on an air-gapped system, which typically is heavily protected.

They’d also need to find a way to read the signals from the LED light. To do so, a bad actor might hijack a security camera inside the building or fly a drone to spy through a window at night.

The danger of a hijacked LED is, however, easy to address. The researchers recommend placing a piece of tape over the light, or disconnecting it from the computer.

Verizon Bringing 5G To 11 U.S. Cities

February 24, 2017 by  
Filed under Mobile

Verizon Communications Inc has announced that it will begin offering its high-speed wireless 5G network to certain customers in 11 U.S. cities in the first half of 2017.

Verizon will begin pilot testing 5G “pre-commercial services” in cities, including Atlanta, Dallas, Denver, Houston, Miami, Seattle and Washington, D.C.

The company said last July that it planned to conduct trials of its 5G network this year.

New 5G networks are expected to provide speeds at least 10 times, and perhaps up to 100 times, faster than today’s 4G networks, with the potential to connect at least 100 billion devices at download speeds of up to 10 gigabits per second.

AT&T Inc said in January that it planned to test its high-speed wireless 5G network for customers of its online streaming television service, DirecTV Now, in Austin, Texas.

Google’s Jigsaw Rolls Out Initiative To Combat Online Abuse

February 24, 2017 by  
Filed under Around The Net

Alphabet Inc’s Google and its subsidiary Jigsaw rolled out a new technology to help news organizations and online platforms identify abusive comments on their websites.

The technology, called Perspective, will review comments and score them based on how similar they are to comments people said were “toxic” or likely to make them leave a conversation.

It has been tested on the New York Times, and the companies hope to extend it to other news organizations, such as The Guardian and The Economist, as well as other websites.

“News organizations want to encourage engagement and discussion around their content, but find that sorting through millions of comments to find those that are trolling or abusive takes a lot of money, labor, and time. As a result, many sites have shut down comments altogether,” Jared Cohen, President of Jigsaw, which is part of Alphabet, wrote in a blog post.

“But they tell us that isn’t the solution they want. We think technology can help.”

Perspective examined hundreds of thousands of comments that had been labeled as offensive by human reviewers to learn how to spot potentially abusive language.

CJ Adams, Jigsaw Product Manager, said the company was open to rolling out the technology to all platforms, including larger ones such as Facebook and Twitter, where trolling can be a major headache.

The technology could in the future be expanded to try to identify personal attacks or off-topic comments too, Cohen said.

Perspective will not decide what to do with comments it finds potentially abusive; rather, publishers will be able to flag them to their moderators or develop tools to help commenters understand the impact of what they are writing.
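
As a rough illustration of that publisher-side workflow, the sketch below scores a batch of comments against a toxicity-scoring endpoint and queues anything above a threshold for human review. The endpoint URL, request and response shapes, and the 0.8 threshold are assumptions made for illustration and should be checked against Perspective’s own documentation rather than read as its confirmed API.

```python
import requests

# Assumed endpoint and payload shape, for illustration only.
API_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
API_KEY = "YOUR_API_KEY"      # hypothetical credential
FLAG_THRESHOLD = 0.8          # arbitrary cut-off for sending a comment to moderators

def toxicity_score(comment: str) -> float:
    """Return a 0..1 toxicity score for a single comment."""
    payload = {
        "comment": {"text": comment},
        "requestedAttributes": {"TOXICITY": {}},
    }
    resp = requests.post(API_URL, params={"key": API_KEY}, json=payload)
    resp.raise_for_status()
    return resp.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

def triage(comments: list[str]) -> list[tuple[str, float]]:
    """Collect the comments that should go to a human moderation queue."""
    flagged = []
    for text in comments:
        score = toxicity_score(text)
        if score >= FLAG_THRESHOLD:
            flagged.append((text, score))
    return flagged

if __name__ == "__main__":
    for text, score in triage(["Thanks, great article!", "You are an idiot."]):
        print(f"{score:.2f}  {text}")
```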

Cohen said a significant portion of abusive comments came from people who were “just having a bad day”.

The initiative against trolls follows efforts by Google and Facebook to combat fake news stories in France, Germany and the United States after they came under fire during the U.S. presidential vote when it became clear they had inadvertently fanned false news reports.

The debate surrounding fake news has led to calls from politicians for social networks to be held more liable for the content posted on their platforms.

The Perspective technology is still in its early stages and “far from perfect”, Cohen said, adding he hoped it could be rolled out for languages other than English too.

The ESA Plan To Launch PLATO in 2024 To Search For Alien Planets

February 24, 2017 by  
Filed under Around The Net

PLATO is a European Space Agency telescope set to launch in 2024. The name is an acronym for “PLAnetary Transits and Oscillations of stars.” The overall goal of the mission is to figure out under what conditions planets form and whether those conditions are favorable for life. 

To do this, PLATO will seek out and investigate Earth-size exoplanets, especially planets that orbit in the habitable zone around sun-like stars. (The habitable zone is usually defined as the region around a star where there is enough energy for liquid water to exist on a planet’s surface, although habitability also depends on other factors such as stellar variability.) PLATO will measure the planets’ radii; verify their masses with the help of ground-based observatories; use asteroseismology, or “starquakes,” to learn about a star’s mass, radius and age; and identify bright targets for atmospheric spectroscopy with other telescopes. If all goes to plan, the mission should provide detailed information on hundreds of rocky and giant planets, giving a better picture of how solar systems form in general.

PLATO’s primary mission is expected to last four years. However, the telescope is designed to last 6.5 years and its consumables, such as fuel, are expected to last about eight years. This means that the telescope could continue operations if its science mission was extended.

PLATO, which is named after the Ancient Greek philosopher Plato, was first proposed in 2007 after ESA put out a call for its Cosmic Vision 2015–2025 program. Cosmic Vision is the name of the current phase of ESA’s long-term space science missions. 

ESA, like NASA, solicits opinions from the science community (ahead of selecting missions) to see what areas of space should be studied next. ESA then puts out calls for missions for a launch opportunity, attracting competitors that must present their science case.

PLATO was first proposed in 2007 as part of Cosmic Vision, finishing its assessment and definition phases in 2009 and 2010, respectively. ESA then put out a call in 2010 for a medium-class mission launch opportunity.

PLATO and two other missions, Solar Orbiter and Euclid (a mission to investigate dark energy and dark matter), were selected as the finalists for this competition. Subsequently, Solar Orbiter was chosen for a 2017 launch date and Euclid for a 2020 launch date.

In February 2011, PLATO went up against four other medium-class mission candidates for a 2024 launch date. The others were EChO (the Exoplanet CHaracterization Observatory), LOFT (the Large Observatory For X-ray Timing), MarcoPolo-R (to collect and return a sample from a near-Earth asteroid) and STE-Quest (Space-Time Explorer and QUantum Equivalence principle Space Test). 

PLATO was selected in 2014 for the launch opportunity, which is also called M3 (for the third medium-class mission under Cosmic Vision). The spacecraft is now in its design phase, which will take several years before the design is finalized for construction.

The spacecraft will be launched from Earth on a Soyuz-Fregat rocket bound for a location called a Lagrange point, a spot where the gravitational pulls of the sun and Earth balance so that a spacecraft can hold its position with relatively little fuel. PLATO will specifically be targeted at the L2 Lagrange point, which lies on the “dark” side of the Earth (meaning that the sun is always in the opposite direction).

L2 has been used before for the Wilkinson Microwave Anisotropy Probe (WMAP) and Planck spacecraft, and is also the region where the James Webb Space Telescope will operate. Since L2 is relatively unstable, the spacecraft will follow a Lissajous orbit, which is a path around the Lagrange point, and periodically use fuel to stay in a consistent orbit.

The payload and science are contributed by a PLATO mission consortium (funded by European national funding agencies) while ESA provides the spacecraft, the CCDs, mission operations and part of the science operations.

PLATO’s goal is to watch a large sample of bright stars for months or years, and to measure them to high precision. By watching the stars for long periods, PLATO will be able to discern each star’s light curve, or the variation in its brightness over time.

Since PLATO will last four years (at the least), the primary science mission will have it observe two regions of the sky for two years each. It’s possible, however, that the telescope could instead do one long-duration observation of three years, and then move around in the sky for the fourth year of its primary science mission. 

“In view of the exceptionally fast development of exoplanet science, the final observing strategy will be investigated throughout the mission development and decided two years before launch,” ESA said.

The long-term goal of many planetary observers is to find planets like Earth, and to seek signs of habitability on those other planets. While examining the atmospheres of these tiny planets will require a more advanced observatory, knowing where they are is a first step. 

Other space observatories looking for Earth-like planets include Kepler Space Telescope (in operation since 2009), the forthcoming Transiting Exoplanet Survey Satellite (TESS) and to a lesser extent, the forthcoming James Webb Space Telescope (JWST). Both TESS and JWST should launch in 2018.

PLATO has 24 normal cameras on board, arranged in four groups of six. Each of these groups has the same field of view, ESA said, but they are offset by a 9.2-degree angle from the vertical axis of the spacecraft. Additionally, the spacecraft will have two “fast” cameras that will be used for brighter stars.

If an exoplanet passes in front of a star from the telescope’s perspective, it causes the light from the star to dim slightly, leaving a dip in the light curve. Other things can mimic planets, however, such as starspots that are darker than the surrounding surface and also block some of the light.

To verify any planets, PLATO will rely on backup observations from ground-based observatories. These observatories can measure the radial velocity of the star, or the velocity of the star along the line of sight from the observer. If slight tugs or movements are seen in the star, this would imply the presence of a planet due to the effect of the planet’s gravity on the star.

“The key scientific requirement [is] to detect and characterize a large number of terrestrial planets around bright stars,” ESA wrote in a statement. Terrestrial planets, being small, are tough to see because they don’t dim their star’s light as much as giant planets do. The hope is that by observing closer, brighter stars, the tiny dips that terrestrial planets produce will be easier to pick out and confirm.
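
To see why small planets are so much harder to spot, consider the depth of the dip a transit produces, which scales with the square of the planet-to-star radius ratio. The short sketch below uses rounded published radii for the Sun, Earth and Jupiter; it is purely illustrative and shows an Earth-size planet dimming a Sun-like star by roughly 0.008 percent, against about 1 percent for a Jupiter-size planet.

```python
# Approximate radii in kilometres (rounded published values).
R_SUN = 695_700
R_EARTH = 6_371
R_JUPITER = 69_911

def transit_depth(r_planet_km: float, r_star_km: float = R_SUN) -> float:
    """Fractional drop in starlight when the planet crosses the stellar disc."""
    return (r_planet_km / r_star_km) ** 2

for name, radius in [("Earth", R_EARTH), ("Jupiter", R_JUPITER)]:
    depth = transit_depth(radius)
    print(f"{name}-size planet: {depth:.6f} ({depth * 1e6:.0f} parts per million)")
# Earth-size planet: 0.000084 (84 parts per million)
# Jupiter-size planet: 0.010098 (10098 parts per million)
```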

Courtesy-Space

New Qualcomm, Intel Modems Boast LTE Speeds Greater Than 1Gbps

February 23, 2017 by  
Filed under Mobile

With every new generation of smartphone, LTE connections gain more speed. That’s because the devices have faster modems that can transfer data at previously unheard of download speeds.

The top modem providers are Intel and Qualcomm, whose cellular chips are used in the iPhone. Both have announced modems that will push LTE connections to speeds well over those of regular home internet connections.

Qualcomm unveiled the X20 LTE chipset, which can transfer data at speeds of up to 1.2Gbps. Intel announced the XMM 7560 LTE modem, which can download data at speeds of up to 1Gbps.

However, cellular networks aren’t yet designed to handle such fast speeds. One exception is Telstra, an Australian telecommunications company, which has launched a gigabit LTE service for commercial use in that country.

Gigabit LTE will slowly start appearing in mobile devices and networks this year, said Jim McGregor, principal analyst at Tirias Research.

“This is making 4G what it was intended to be — a true wireless broadband solution,” McGregor said.

These performance bumps are important as users handle more data,  McGregor said.

“We’ve seen this with microprocessors for years,” McGregor said.

Qualcomm said its Snapdragon X20 modem will become available next year, and McGregor estimated it will be in devices soon after. Intel said its XMM 7560 is ready, but couldn’t say when handsets would come out.

Most users may not need LTE speeds of 1.2Gbps, especially when using apps like Uber, Snapchat and WhatsApp. But more PCs are getting LTE connectivity, and could use the speed for high-end applications.

Qualcomm, a modem pioneer, is trying to stay a step ahead of Intel in the rat race to rev up LTE modems. Intel is speeding up modem development as wireless connectivity becomes an essential part of computing, said Aicha Evans, senior vice president and general manager of the Communication and Devices Group at Intel.

The new modems are also a stepping stone to 5G, the next-generation cellular network technology that Evans estimated could deliver speeds of more than 45Gbps. Beyond mobile devices, 5G will be used for machine-to-machine communications and will be a standard feature in a wide range of devices including PCs, robots, drones and internet of things devices.

The Snapdragon X20 LTE chipset is a CAT 18 modem and supports a wide range of cellular technologies that could make it work in most countries worldwide. The chip supports carrier aggregation and data transfers over multiple streams. It works with 40 cellular frequency bands and supports technologies like Voice over LTE (VoLTE) and LTE broadcast.

Intel’s XMM 7560 is a CAT 16 modem and supports carrier aggregation across multiple spectrum bands. The company has already announced its first 5G modem and says it now has silicon ready for that chip.

Yahoo Agrees To Verizon’s Discounted Acquisition Deal

February 23, 2017 by  
Filed under Around The Net

Verizon Communications Inc reconfirmed plans to acquire Yahoo Inc’s  core business for $4.48 billion, lowering its original offer by $350 million in the wake of two massive cyber attacks at the internet company.

The closing of the deal, which was first announced in July, had been delayed as the companies assessed the fallout from two data breaches that Yahoo disclosed last year. The No. 1 U.S. wireless carrier had been trying to persuade Yahoo to amend the terms of the agreement following the attacks.

Verizon and Yahoo signed the deal on Sunday evening after weeks of talks that included calls with Yahoo CEO Marissa Mayer and a meeting between Verizon CEO Lowell McAdam and Yahoo director Tom McInerney in New York earlier this month to agree on the amount of the price reduction, a person involved in the talks said.

The two sides had reached an agreement in principle about a week earlier that included a liability-sharing arrangement, something Verizon decided early on that it needed in order to reach a deal.

Verizon conducted brand studies and found that Yahoo’s reputation was holding up after the hacks, the person said. The company decided to proceed in part because it continued to believe that the deal made strategic sense and that users were loyal and engaged.

The companies said on Tuesday they expect the deal to close in the second quarter. The data breach may delay some integration of Yahoo with Verizon after the closing, the person said.

The deal brings Verizon more than 1 billion Yahoo users and a wealth of data it can use to offer more targeted advertising. Verizon will combine Yahoo’s advertising technology tools, as well as its search, email and messenger assets, with its AOL unit, purchased for $4.4 billion in 2015.

Verizon’s shares rose 0.3 percent to $49.33 in afternoon trading, while Yahoo’s shares were up 0.8 percent at $45.48.

Under the amended terms, Yahoo and Verizon will split cash liabilities related to some government investigations and third-party litigation related to the breaches.

Yahoo, however, will continue to be responsible for liabilities from shareholder lawsuits and SEC investigations.

Yahoo said in December that data from more than 1 billion user accounts was compromised in August 2013, making it the largest breach in history.

This followed the company’s disclosure in September that at least 500 million accounts were affected in another breach in 2014.

Is The Independent Game Developer’s Future Bleak?

February 23, 2017 by  
Filed under Gaming

Never more than a stopgap that was hugely inadequate to the gap in question, Steam Greenlight is finally set to disappear entirely later this Spring. The service has been around for almost five years, and while it was largely greeted with enthusiasm, the reality has never justified that optimism. The amassing of community votes for game approval turned out to be no barrier to all manner of grafters who launched unfinished, amateurish games (even using stolen assets in some cases) on the service, but enough of a barrier to be frustrating and annoying for many genuine indie developers. As an attempt to figure out how to prevent a storefront from drowning in the torrent of rubbish that has flooded the likes of the App Store and Google Play, it was a worthy experiment, but not one that ought to have persisted for five years, really.

Moreover, Greenlight isn’t disappearing because Valve has solved this problem to its satisfaction. The replacement, Direct, is in some regards a step backwards; it’ll see developers being able to publish directly on the system simply by confirming their identity (company or personal) through submission of business documents and paying a fee for each game they submit. The fee in question hasn’t been decided yet, but Valve says it’s thinking about everything from $100 to $5000.

The impact of Direct is going to depend heavily on what that fee ends up being. It’s worth noting that developers for iOS, for example, already pay around $100 a year to be part of Apple’s developer programme, and trawling through the oceans of unloved and unwanted apps released on the App Store every day shows just how little that $100 price does to dissuade the worst kind of shovelware. At $5000, meanwhile, quite a lot of indie developers will find themselves priced out of Steam, especially those at the more arthouse end of the scene, or new creators getting started out. Ironically, though, the chances are that many of the cynical types behind borderline-scam games with ripped off assets and design will calculate that $5000 is a small price to pay for a shot at sales on Steam, especially if the high fees are thinning out the number of titles launching.

It’s worth noting that, for the majority of Steam’s consumers, the loss of arthouse indie games and fringe titles from new creators won’t be of huge concern. Steam, like all storefronts, sells huge numbers at the top end and that falls off rapidly as you come down the charts; the number of consumers who are actively engaging with smaller niche titles on the service is pretty small. However, that doesn’t mean that locking out those creators wouldn’t be damaging – both creatively and commercially.

Plenty of creators are actually making a living at the low end of the market; they’re not making fortunes or buying gigantic mansions to hang around being miserable in, but they’re making enough money from their games to sustain themselves and keep up their output. Often, they’re working in niches that have small audiences of devoted fans, and locking them out of Steam with high submission costs would both rob them of their income (there are quite a few creators out there for whom $5000 represents a large proportion of their average revenue from a game) and rob audiences of their output, or at least force them to look elsewhere.

Sometimes, a game from a creator like that becomes a break-out hit, the game the whole world is talking about for months on end – sometimes, but not very often. It’s tempting to argue that Steam should be careful about its “low-end” indies (a term I use in the commercial sense, not as any judgement of quality; there’s great, great stuff lurking around the bottom of the charts) because otherwise it risks missing the Next Big Thing, but that’s not really a good reason. Steam is just about too big to ignore, and the Next Big Thing will almost certainly end up on the platform anyway.

Rather, the question is over what Valve wants Steam to be. If it’s a platform for distributing big games to mainstream consumers, okay; it is what it is. If they’re serious about it being a broad church, though, an all-encompassing platform where you can flick seamlessly between AAA titles with budgets in the tens of millions and arthouse, niche games made as a labour of love by part-timers or indie dreamers, then Direct as described still doesn’t solve the essential conflict in that vision.

In replacing publishers with a storefront through which creators can directly launch products to consumers, Valve and other store operators have asserted the value of pure market forces over curation – the fine but flawed notion of greatness rising to the top while bad quality products sink to the bottom simply through the actions of consumers making buying choices. This, of course, doesn’t work in practice, partially because in the real world free markets are enormously constrained and distorted by factors like the paucity of information (a handful of screenshots and a trailer video doth not a perfectly informed and rational purchasing decision make), and more importantly because free markets can’t actually make effective assessments of something as subjective as the quality of a game.

Thus, even as their stores have become more and more inundated with tides of low quality titles – perhaps even to the extent of snuffing out genuinely good quality games – store operators have tried to apply algorithmic wizardry to shore up marketplaces they’ve created. Users can vote, and rate things; elements of old-fashioned curation have even been attempted, with rather limited success. Tweaks have been applied to the submission process at one end and the discovery process at the other. Nothing, as yet, presents a very satisfying solution.

One interesting possibility is that we’re going to see the pendulum start to swing back a little – from the extreme position of believing that Steam and its ilk would make publishers obsolete, to the as yet untested notion that digital storefronts will ultimately do a better job of democratising publishing than they have done of democratising development. We’ve already seen the rise of a handful of “boutique” publishers who specialise in working with indie developers to get their games onto digital platforms with the appropriate degree of PR and marketing support; if platforms like Steam start to put up barriers to entry, we can expect a lot more companies like that to spring up to act as middlemen.

Like the indie developers themselves, some will cater to specific niches, while others will be more mainstream, but ultimately they will all serve a kind of curation role; their value will lie not just in PR, marketing and finance, but also in the ability to say to platforms and consumers that somewhere along the line, a human being has looked at a game in depth and said “yes, this is a good game and we’re willing to take a risk on it.” There’s a value to that simple function that’s been all too readily dismissed in the excitement over Steam, the App Store and so on, and as issues of discovery and quality continue to plague those storefronts, that value is only becoming greater.

Whatever Valve ultimately decides to do with Direct – whether it sets a low price that essentially opens the floodgates, or a high one that leaves some developers unable to afford the cost of entry – it will not provide a panacea to Steam’s issues. It might, however, lay the ground for a fresh restructuring of the industry, one that returns emphasis to the publishing functions that were trampled underfoot in the initial indie gold-rush and, into the bargain, helps to provide consumers with clearer assurances of quality. A new breed of publisher may be the only answer to the problems created by storefronts we were once told were going to make publishers extinct.

Courtesy-GI.biz

Is The U.S. Tech Industry Headed In A Downward Spin?

February 23, 2017 by  
Filed under Computing

Layoffs at computer, electronics, and telecommunications companies rose by 21 percent last year.

Beancounters at the global outplacement outfit Challenger, Gray & Christmas said that some 96,017 US jobs were cut in 2016, compared with 79,315 the prior year.

Tech layoffs accounted for 18 percent of the total 526,915 US job cuts announced in 2016.

Of the 2016 total, some 66,821 of the layoffs came from computer companies, up seven percent year over year.
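
For anyone who wants to see how those percentages fall out of the raw figures, here is a quick back-of-the-envelope check using only the numbers quoted above:

```python
tech_cuts_2016 = 96_017
tech_cuts_2015 = 79_315
all_us_cuts_2016 = 526_915

# Year-over-year rise in tech-sector cuts (the "21 percent" figure).
print(f"{tech_cuts_2016 / tech_cuts_2015 - 1:.0%}")  # -> 21%

# Tech's share of all announced US job cuts in 2016 (the "18 percent" figure).
print(f"{tech_cuts_2016 / all_us_cuts_2016:.0%}")    # -> 18%
```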

Challenger attributed much of that increase to cuts made by Dell, which merged with EMC. In preparation for that combination, layoffs were instituted across EMC and its constituent companies, including VMware.

But Dell was not entirely to blame: Intel, IBM, Cisco and Microsoft all saw the mighty HR axeman coming down the corridors.

For example, Chipzilla’s move to axe 12,000 people, or 11 percent of its workforce, was made because the company has struggled in the mobile device market.

John Challenger, chief executive of the outplacement firm, added that networking giant Cisco cut 5,500 jobs, or seven percent of its headcount, to better compete with cloud rivals like Amazon Web Services.

It does not look like things are going to get much better either. The industry is gutting itself as companies shift focus to cloud-based computing and mobile systems, Challenger warned.

Courtesy-Fud

Are Low-Profile Radeon RX 460 Cards Forthcoming?

February 23, 2017 by  
Filed under Computing

MSI has unveiled yet another HTPC-friendly graphics card, a low-profile Radeon RX 460 that will come in both 2GB and 4GB versions.

Featuring a dual-slot, low-profile, dual-fan cooler and a low-profile PCB to match, both the MSI RX 460 2GT LP and 4GT LP graphics cards will run at the reference 1090MHz GPU base and 1200MHz GPU boost clocks, with GDDR5 memory working at 1750MHz on a 128-bit memory interface.

Each card comes with one DVI and one HDMI display output.

In case you missed it, the Radeon RX 460 is based on AMD’s Polaris 11 GPU with 896 Stream Processors, 48 TMUs and 16 ROPs and should pack enough punch for a decent casual gaming experience.

Unfortunately, neither the price nor the availability date has been revealed, but we are sure these two will appear in retail/e-tail soon at around US $100/€100.

Courtesy-Fud

Astronomers Find The Building Blocks Of Life On Ceres

February 23, 2017 by  
Filed under Around The Net

The dwarf planet Ceres keeps looking better and better as a possible home for alien life.

NASA’s Dawn spacecraft has spotted organic molecules — the carbon-containing building blocks of life as we know it — on Ceres for the first time, a study published today (Feb. 16) in the journal Science reports.

And these organics appear to be native, likely forming on Ceres rather than arriving via asteroid or comet strikes, study team members said.

“Because Ceres is a dwarf planet that may still preserve internal heat from its formation period and may even contain a subsurface ocean, this opens the possibility that primitive life could have developed on Ceres itself,” Michael Küppers, a planetary scientist based at the European Space Astronomy Centre just outside Madrid, said in an accompanying “News and Views” article in the same issue of Science.

“It joins Mars and several satellites of the giant planets in the list of locations in the solar system that may harbor life,” added Küppers, who was not involved in the organics discovery.

The $467 million Dawn mission launched in September 2007 to study Vesta and Ceres, the two largest objects in the main asteroid belt between Mars and Jupiter. 

Dawn circled the 330-mile-wide (530 kilometers) Vesta from July 2011 through September 2012, when it departed for Ceres, which is 590 miles (950 km) across. Dawn arrived at the dwarf planet in March 2015, becoming the first spacecraft ever to orbit two different bodies beyond the Earth-moon system.

During its time at Ceres, Dawn has found bizarre bright spots on crater floors, discovered a likely ice volcano 2.5 miles (4 km) tall and helped scientists determine that water ice is common just beneath the surface, especially near the dwarf planet’s poles.

The newly announced organics discovery adds to this list of achievements. The carbon-containing molecules — which Dawn spotted using its visible and infrared mapping spectrometer instrument — are concentrated in a 385-square-mile (1,000 square km) area near Ceres’ 33-mile-wide (53 km) Ernutet crater, though there’s also a much smaller patch about 250 miles (400 km) away, in a crater called Inamahari. 

And there could be more such areas; the team surveyed only Ceres’ middle latitudes, between 60 degrees north and 60 degrees south. 

“We cannot exclude that there are other locations rich in organics not sampled by the survey, or below the detection limit,” study lead author Maria Cristina De Sanctis, of the Institute for Space Astrophysics and Space Planetology in Rome, told Space.com via email.

Dawn’s measurements aren’t precise enough to nail down exactly what the newfound organics are, but their signatures are consistent with tar-like substances such as kerite and asphaltite, study team members said.

“The organic-rich areas include carbonate and ammoniated species, which are clearly Ceres’ endogenous material, making it unlikely that the organics arrived via an external impactor,” co-author Simone Marchi, a senior research scientist at the Southwest Research Institute in Boulder, Colorado, said in a statement. 

In addition, the intense heat generated by an asteroid or comet strike likely would have destroyed the organics, further suggesting that the molecules are native to Ceres, study team members said.

The organics might have formed via reactions involving hot water, De Sanctis and her colleagues said. Indeed, “Ceres shows clear signatures of pervasive hydrothermal activity and aqueous alteration,” they wrote in the new study.

Such activity likely would have taken place underground. Dawn mission scientists aren’t sure yet how organics generated in the interior could make it up to the surface and leave the signatures observed by the spacecraft.

“The geological and morphological settings of Ernutet are still under investigation with the high-resolution data acquired in the last months, and we do not have a definitive answer for why Ernutet is so special,” De Sanctis said.

It’s already clear, however, that Ceres is a complex and intriguing world — one that astrobiologists are getting more and more excited about.

“In some ways, it is very similar to Europa and Enceladus,” De Sanctis said, referring to ocean-harboring moons of Jupiter and Saturn, respectively. 

“We see compounds on the surface of Ceres like the ones detected in the plume of Enceladus,” she added. “Ceres’ surface can be considered warmer with respect to the Saturnian and Jovian satellites, due to [its] distance from the sun. However, we do not have evidence of a subsurface ocean now on Ceres, but there are hints of subsurface recent fluids.” 

Courtesy-Fud

Toshiba Wants Nearly $9B For Memory Chip Business

February 22, 2017 by  
Filed under Around The Net

Japan’s Toshiba Corp wants to raise at least 1 trillion yen ($8.8 billion) by selling most of its flash memory chip business, seeking to create a buffer against any fresh financial problems, a source with direct knowledge of the matter said.

The beleaguered conglomerate was pressured to abandon an initial plan to sell just under 20 percent by its main creditor banks, which are worried about potential writedowns that may come on top of a $6.3 billion hit to its U.S. nuclear unit, financial sources also said.

Toshiba, the world’s biggest NAND chip producer after Samsung Electronics Co Ltd, said last week it is now prepared to sell a majority stake or even all of its chip business. The company has also been rocked by the emergence of fresh problems at its Westinghouse unit that have delayed the release of its earnings.

The company has not decided on the size of the stake to be sold, preferring to focus on the amount that can be raised but would like to retain a one-third holding as that would give it a degree of control over the business, the source with direct knowledge said.

Its willingness to relinquish so much of the unit underscores not only the depths of its financial woes but also resignation on the part of management to becoming a much smaller company.

The sale “is the best and the only way Toshiba can raise a large amount of funds and wipe out concerns about its credit risk,” said the source, adding that the sale should be completed by the end of March next year.

It wants to restart the sale process as soon as possible and may sell to multiple buyers rather than one bidder with interest already received from investment funds, other chipmakers and client companies, he also said.

A separate person with knowledge of the matter said Toshiba will outline terms of the sale by the end of February, conduct a first round of bids in March and aim to have chosen a preferred bidder or bidders by the end of May. The person also said Toshiba valued the chips business at around 1.5 trillion yen.

A Toshiba spokeswoman said the company cannot comment on the specifics of the sale process. Sources declined to be identified as they were not authorized to speak to the media.

TransferWise Users Can Now Send Cash Via Facebook Messenger

February 22, 2017 by  
Filed under Around The Net

Money transfer company TransferWise debuted a new service that allows users to send money internationally through Facebook Inc’s chat application, as competition in the digital payments landscape intensifies.

The London-based startup said on Tuesday that it had developed a Facebook Messenger “chatbot”, or an automated program that can help users communicate with businesses and carry out tasks such as online purchases.

TransferWise’s chatbot enables customers to send money to friends and family to and from the United States, Britain, Canada, Australia and Europe from Facebook Messenger. It can also be used to set up exchange rate alerts.

Facebook already allows its users to send money domestically in the United States via its Messenger app, but has not yet launched similar services internationally. TransferWise said its service will be the first to enable international money transfers entirely within Messenger.

Facebook opened up its Messenger app to developers to create chatbots in April in a bid to expand its reach in customer service and enterprise transactions.
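
To give a sense of what “opening Messenger to developers” involves in practice, here is a bare-bones sketch of a webhook-style echo bot. The URL, field names and token handling follow the general shape of Facebook’s Messenger platform as publicly documented at the time, but they should be treated as illustrative assumptions rather than a verified integration, and the port and environment variable names are arbitrary.

```python
import os

import requests
from flask import Flask, request

app = Flask(__name__)

# Hypothetical configuration; real values come from your Facebook app settings.
PAGE_ACCESS_TOKEN = os.environ.get("PAGE_ACCESS_TOKEN", "")
SEND_API_URL = "https://graph.facebook.com/v2.6/me/messages"  # API version is an assumption

@app.route("/webhook", methods=["POST"])
def webhook():
    """Receive message events from Messenger and echo the text back."""
    body = request.get_json(force=True)
    for entry in body.get("entry", []):
        for event in entry.get("messaging", []):
            sender_id = event.get("sender", {}).get("id")
            text = event.get("message", {}).get("text")
            if sender_id and text:
                send_text(sender_id, f"You said: {text}")
    return "ok", 200

def send_text(recipient_id: str, text: str) -> None:
    """Deliver a plain-text reply through the Send API."""
    payload = {"recipient": {"id": recipient_id}, "message": {"text": text}}
    requests.post(SEND_API_URL,
                  params={"access_token": PAGE_ACCESS_TOKEN},
                  json=payload).raise_for_status()

if __name__ == "__main__":
    app.run(port=5000)
```

A production bot such as TransferWise’s layers the actual conversational logic, the webhook verification handshake and request-signature checks on top of a skeleton like this.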

Chatbots have become a hot topic in enterprise technology over the past year because recent advances in artificial intelligence have made them better at interacting. Businesses, including banks, are hoping that they can be used to improve and reduce the cost of their customer service operations.

One of Europe’s most well-known fintech companies, TransferWise was launched in 2011 by Estonian friends Taavet Hinrikus and Kristo Käärmann out of frustration with the high fees they were being charged by banks for international money transfers.

The company, which is valued at more than $1 billion, is backed by several high profile investors including Silicon Valley venture fund Andreessen Horowitz, Virgin Group founder Sir Richard Branson, and PayPal co-founders Max Levchin and Peter Thiel, through his fund Valar Ventures.

Customers in more than 50 countries send roughly $1 billion through its website every month.

While the TransferWise chatbot is now only available in Facebook Messenger, it can be adapted to work with other popular chat services, said Scott Miller, head of global partnerships for TransferWise. He said the service would eventually be extended to other countries and money transfer routes that the company operates in.

Is Blackberry Taking Nokia To Court?

February 22, 2017 by  
Filed under Around The Net

A patent war is being fought between two of the industry’s smartphone leaders of yesteryear: Nokia and Blackberry.

Blackberry filed a patent-infringement lawsuit against Nokia Oyj, demanding royalties on the Finnish company’s mobile network products that use an industry wide technology standard.

Blackberry moaned that Nokia’s Flexi Multiradio base stations, radio network controllers and Liquid Radio software are using technology covered by as many as 11 patents owned by BlackBerry.

It added that Nokia was “encouraging the use” of the standard-compliant products without a license from Blackberry.

Blackberry did not say how much it wanted Nokia to cough up, but the lawsuit would appear to be part of Chief Executive Officer John Chen’s drive to find new ways to pull revenue out of Blackberry’s technology.

He’s used acquisitions to add a suite of software products and negotiated licensing agreements to take advantage of the company’s thick book of wireless technology patents.

Nokia is aware of the inventions because the company has cited some of the patents in some of its own patent applications, BlackBerry said.

Some of the patents were owned by Nortel and Nokia had at one point tried to buy them as part of a failed bid for Nortel’s business in 2009, according to Blackberry.

BlackBerry was part of a group called Rockstar Consortium that bought Nortel’s patents out of bankruptcy for $4.5 billion in 2011. The patents were split up between the members of the group, which included Apple and Microsoft.

Since Blackberry contends that the patents cover essential elements of a mobile telecommunications standard known as 3GPP, it has pledged to license them on fair and reasonable terms.

Courtesy-Fud

Why Are The NPD Games Sales Figures Kept Private?

February 22, 2017 by  
Filed under Gaming

When I first began my career in the games industry I wrote a story about an impending digital download chart.

It was February 2008 and Dorian Bloch – who was leader of UK physical games data business Chart-Track at the time – vowed to have a download Top 50 by Christmas.

It wasn’t for want of trying. Digital retailers, including Steam, refused to share the figures and insisted it was down to the individual publishers and developers to do the sharing (in contrast to the retail space, where the stores are the ones that do the sharing). This led to an initiative in the UK where trade body UKIE began using its relationships with publishers to pull together a chart. However, after some initial success, the project ultimately fell away once the sheer scale of the work involved became apparent.

Last year in the US, NPD managed to get a similar project going, and it is thus far the only public chart that combines physical and digital data from accurate sources. However, although many big publishers are contributing to the figures, there remain some notable absentees and a lack of smaller developers and publishers.

In Europe, ISFE is just ramping up its own project and has even begun trialling charts in some territories (behind closed doors); however, it currently lacks physical retail data in most major markets. This overall lack of information has seen a rise in the number of firms trying to plug the hole in our digital data knowledge. Steam Spy uses a Web API to gather data from Steam user profiles to track download numbers, a job it does fairly accurately (albeit not all of the time).
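
Steam Spy’s own code is not public, but the sampling idea behind it is simple enough to sketch: poll a random sample of public Steam profiles through the Steam Web API, count how often a given appid shows up, and extrapolate to the whole user base. The snippet below is a toy version of that approach; the API key, the sample of SteamIDs, the total-accounts figure and the appid are placeholders, the endpoint path and parameters reflect the public Steam Web API as generally documented, and real estimates would also need error bars and handling for private profiles.

```python
import requests

API_KEY = "YOUR_STEAM_WEB_API_KEY"  # placeholder
OWNED_GAMES_URL = "https://api.steampowered.com/IPlayerService/GetOwnedGames/v1/"

def owns_app(steamid: str, appid: int) -> bool:
    """True if the public profile `steamid` owns `appid`."""
    resp = requests.get(OWNED_GAMES_URL, params={
        "key": API_KEY,
        "steamid": steamid,
        "include_played_free_games": 1,
    })
    resp.raise_for_status()
    games = resp.json().get("response", {}).get("games", [])
    return any(g["appid"] == appid for g in games)

def estimate_owners(sample_steamids: list[str], appid: int, total_accounts: int) -> float:
    """Extrapolate ownership of `appid` from a random sample of public profiles."""
    hits = sum(owns_app(sid, appid) for sid in sample_steamids)
    return hits / len(sample_steamids) * total_accounts

# Usage (placeholder values): estimate the owners of a given appid from a
# pre-drawn random sample of public SteamIDs.
# estimate = estimate_owners(sample_ids, appid=413150, total_accounts=200_000_000)
```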

SuperData takes point-of-sale and transaction information from payment service providers, plus some publishers and developers, which means it can track actual spend. It’s strong on console, but again, it’s not 100% accurate. The mobile space has a strong player in App Annie collecting data, although developers in the space find the cost of accessing this information high.

It feels unusual to be having this conversation in 2017. In a market that is now predominantly digital, the fact we have no accurate way of measuring our industry seems absurd. Film has almost daily updates of box office takings, the music market even tracks streams and radio plays… we don’t even know how many people downloaded Overwatch, or where Stardew Valley would have charted. So what is taking so long?

“It took a tremendous amount of time and effort from both the publisher and NPD sides to make digital sales data begin to flow,” says Mat Piscatella, NPD’s US games industry analyst. NPD’s monthly digital chart is the furthest the industry has come to accurate market data in the download space.

“It certainly wasn’t like flipping a switch. Entirely new processes were necessary on both sides – publishers and within NPD. New ways of thinking about sales data had to be derived. And at the publishers, efforts had to be made to identify the investments that would be required in order to participate. And of course, most crucially, getting those investments approved. We all had to learn together, publishers, NPD, EEDAR and others, in ways that met the wants and needs of everyone participating.

“Over time, most of the largest third-party publishers joined the digital panel. It has been a remarkable series of events that have gotten us to where we are today. It hasn’t always been smooth; and keep in mind, at the time the digital initiative began, digital sales were often a very small piece of the business, and one that was often not being actively managed. Back then, publishers may have been letting someone in a first-party operation, or brand marketing role post the box art to the game on the Sony, Microsoft and Steam storefronts, and that would be that. Pricing wouldn’t be actively managed, sales might be looked at every month or quarter, but this information certainly was not being looked at like packaged sales were. The digital business was a smaller, incremental piece of the pie then. Now, of course, that’s certainly changed, and continues to change.”

SuperData’s Joost van Dreunen points to several reasons why publishers hold back: “For one, the majors are publicly traded firms, which means that any shared data presents a financial liability. Across the board the big publishers have historically sought to protect the sanctity of their internal operations because of the long development cycles and high capital risks involved in AAA game publishing. But, to be honest, it’s only been a few years that especially legacy publishers have started to aggregate and apply digital data, which means that their internal reporting still tends to be relatively underdeveloped. Many of them are only now building the necessary teams and infrastructure around business intelligence.”

Indeed, both SuperData and NPD believe that progress – as slow as it may be – has been happening. And although some publishers are still holding out or refusing to get involved, that resolve is weakening over time.   “For us, it’s about proving the value of participation to those publishers that are choosing not to participate at this time,” Piscatella says. “And that can be a challenge for a few reasons. First, some publishers may believe that the data available today is not directly actionable or meaningful to its business. The publisher may offer products that have dominant share in a particular niche, for example, which competitive data as it stands today would not help them improve.

“Second, some publishers may believe that they have some ‘secret sauce’ that sharing digital sales data would expose, and they don’t want to lose that perceived competitive advantage. Third, resources are almost always stretched thin, requiring prioritisation of business initiatives. For the most part, publishers have not expanded their sales planning departments to keep pace with all of the overwhelming amount of new information and data sources that are now available. There simply may not be the people power to effectively participate, forcing some publishers to pass on participating, at least for now.

“So I would certainly not classify this situation as companies ‘holding out’ as you say. It’s that some companies have not yet been convinced that sharing such information is beneficial enough to overcome the business challenges involved. Conceptually, the sharing of such information seems very easy. In reality, participating in an initiative like this takes time, money, energy and trust. I’m encouraged and very happy so much progress has been made with participating publishers, and a tremendous amount of energy is being applied to prove that value to those publishers that are currently not participating.”

NPD’s achievement is significant because it has managed to convince a good number of bigger publishers, and those with particularly successful IP, to share figures. And this has long been seen as a stumbling block, because for those companies performing particularly well, the urge to share data is reduced. I’ve heard countless comments from sales directors who have said that ‘sharing download numbers would just encourage more competitors to try what we’re doing.’ It’s why van Dreunen has noted that “as soon as game companies start to do well, they cease the sharing of their data.”

Indeed, it is often fledgling companies, and indie studios, that need this data more than most. It’s part of the reason behind the rise of Steam Spy, which prides itself on helping smaller outfits.

“I’ve heard many stories about indie teams getting financed because they managed to present market research based on Steam Spy data,” boasts Sergey Galyonkin, the man behind Steam Spy. “Just this week I talked to a team that got funded by Medienboard Berlin-Brandenburg based on this. Before Steam Spy it was harder to do a proper market research for people like them.

“Big players know these numbers already and would gain nothing from sharing them with everyone else. Small developers have no access to paid research to publish anything.

“Overall I’d say Steam Spy helped to move the discussion into a more data-based realm and that’s a good thing in my opinion.”

The games industry may be behaving in an unusually backwards capacity when it comes to sharing its digital data, but there are signs of a growing willingness to be more open. A combination of trade body and media pressure has convinced some larger publishers to give it a go. Furthermore, publishers are starting to feel obligated to share figures anyway, especially when the likes of SuperData and Steam Spy are putting out information whether they want them to or not.

Indeed, although the chart Dorian promised me 9 years ago is still AWOL, there are at least some figures out there today that gives us a sense of how things are performing.

“When we first started SuperData six years ago there was exactly zero digital data available,” van Dreunen notes. “Today we track the monthly spending of 78 million digital gamers across platforms, in spite of heavy competition and the reluctance from publishers to share. Creating transparency around digital data is merely a matter of market maturity and executive leadership, and many of our customers and partners have started to realize that.”

He continues: “The current inertia comes from middle management that fears new revenue models and industry changes. So we are trying to overcome a mindset rather than a data problem. It is a slow process of winning the confidence and trust of key players, one at a time. We’ve managed to broker partnerships with key industry associations, partner with firms like GfK in Europe and Kadokawa Dwango in Japan, to offer a complete market picture, and win the trust of big publishers. As we all move into the next era of interactive entertainment, the need for market information will only increase, and those that have shown themselves willing to collaborate and take a chance are simply better prepared for the future.”

NPD’s Piscatella concludes: “The one thing I’m most proud of, and impressed by, is the willingness of the participating publishers in our panel to work through issues as they’ve come up. We have a dedicated, positive group of companies working together to get this information flowing. Moving forward, it’s all about helping those publishers that aren’t participating understand how they can benefit through the sharing of digital consumer sales information, and in making that decision to say “yes” as easy as possible.

“Digital selling channels are growing quickly. Digital sales are becoming a bigger piece of the pie across the traditional gaming market. I fully expect participation from the publishing community to continue to grow.”

Courtesy-GI.biz

Is Kepler Capable Of Finding Exomoons?

February 22, 2017 by  
Filed under Around The Net

 

In 2012, a team of scientists from the Kepler mission announced they would start to hunt for moons orbiting distant exoplanets. While Kepler has discovered thousands of extrasolar planets, the hunt for these so-called “exomoons” has so far come up empty.

The major problem has been that for a moon to be detectable in the Kepler data, it would have to be about 10 percent the mass of Earth, or roughly the mass of Mars. This is about ten times larger than the largest moons in our own solar system.

While the formation of planetary satellites seems to be a natural by-product of planet formation, scientist Amy Barr of the Planetary Science Institute (PSI) wondered if it would be possible for large moons — possibly even Earth-like habitable moons — to form. And if so, could they possibly be common in the galaxy?

Using modeling and simulations, Barr and her fellow researchers found it is theoretically possible for super-sized moons to form around both rocky and gas planets, but only if the planets themselves are sufficiently large. Large rocky moons could be created by collisions between super-Earth-sized rocky worlds, and exomoons around gas giants may be able to form by co-accretion or capture.

“Our results are the first to demonstrate the masses of the moons that could form in the varied set of impact conditions possible within exoplanetary systems,” said Barr, a senior scientist at PSI. “Most importantly, we have shown that it is possible to form exomoons with masses above the theoretical detection limits of the ongoing Hunt for Exomoons with Kepler survey, moons of more than a tenth of an Earth mass.”

Just as the Kepler spacecraft used the transit method to detect planets passing in front of the disc of the parent star (which causes a temporary drop in brightness), the transit method should also be the best and most direct method for detecting exomoons. That’s why a team at the Harvard-Smithsonian Center for Astrophysics started the Hunt for Exomoons with Kepler (HEK) project. But finding exomoons has been a fruitless challenge, mostly because of the size needed for a moon to be detectable.

However, the solar systems found by Kepler are quite different from our own, and the most common size of planet in the Kepler data is a new class of planets called super-Earths. These are planets between the size of Earth and Neptune, something we don’t have in our own cosmic neighborhood.

“Very little is known about how the satellite formation processes proposed for our solar system might scale to different planetary masses and stellar conditions,” wrote Barr in her paper.

Using hydrodynamical simulations — which have been used to study how Earth’s moon may have formed by a large impact — Barr was able to determine how much material would be launched into orbit by the collision of two rocky super-Earth exoplanets. Collisions between rocky planets with masses of two to seven Earth masses can launch into orbit enough mass to create a satellite large enough to be detected in Kepler transit data.

“These outcomes are broadly similar to the Moon-forming impact, but when two super-earths collide, the disk is much hotter and more massive,” said Barr in a press release.

Her paper, “Formation of Massive Rocky Exomoons by Giant Impact,” explains that the models suggest detectable rocky exomoons can be produced for a variety of impact conditions and may be associated with host planets of various sizes.

A second paper, “Formation of Exomoons: A Solar System Perspective,” demonstrates how large exomoons could form by co-accretion around growing gas giant planets, or by capturing wandering bodies, or other processes that did not take place in our Solar System.

Barr also looked at current theories of how moons form in our Solar System, and how those theories might apply to the formation of exomoons.

“Some of the old theories about the formation of Earth’s Moon, for example, fission, could operate in other solar systems,” said Barr. “With new observatories coming online soon, this is a good time to revisit some of the old ideas, and see if we might be able to predict how common exomoons might be, and what it would take to detect them.”

Barr said that these studies of the types of exotic moon-forming events has “yielded promising initial results, relevant to the current efforts to observe exomoons,” and that the models suggest that detectable exomoons can be produced in a variety of conditions and may be associated with host planets of various sizes.

As of this writing, the combined Kepler and K2 missions have found 2,476 confirmed planets, with an additional 5,216 planet “candidates,” meaning they have yet to be confirmed. The exomoon count is currently at zero, but the work by Barr and her colleagues provides hope that discovering exomoons could be the next big thing.

Courtesy-Space

 
