Flickering Hard Drive LED Can Be Used By Hackers

February 24, 2017
Filed under Computing

The mostly ignored blinking lights on servers and desktop PCs may give away secrets if a hacker can hijack them with malware.

Researchers in Israel have come up with an innovative hack that turns a computer’s LED light into a signaling system that shows passwords and other sensitive data.

The researchers at Ben-Gurion University of the Negev demonstrated the hack in a YouTube video posted Wednesday. It shows a hacked computer broadcasting the data through a computer’s LED light, with a drone flying nearby reading the pattern.

The researchers designed the scheme to underscore vulnerabilities of air-gapped systems, or computers that have been intentionally disconnected from the internet.

Air-gapped systems generally carry highly confidential information or operate critical infrastructure. But the researchers have been coming up with sneaky ways to extract data from these computers, like using the noise from the PC’s fan or hard drive to secretly broadcast the information to a nearby smartphone.

Their latest hack leverages the LED activity light for the hard disk drive, which can be found on many servers and desktop PCs and is used to indicate when data is read from or written to the disk.

The researchers found that with malware, they could control the LED light to emit binary signals by flashing on and off. That flickering could send out a maximum of 4,000 bits per second, or enough to leak out passwords, encryption keys and files, according to their paper. It’s likely no one would notice anything wrong.

“The hard drive LED flickers frequently, and therefore the user won’t be suspicious about changes in its activity,” said Mordechai Guri, who led the research, in a statement.

To read the signals from the LED light, all that’s needed is a camera or an optical sensor to record the patterns. The researchers found they could read the signal from 20 meters away from outside a building. With an optical zoom lens, that range could be even longer.
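
The scheme the researchers describe is essentially on/off keying: each bit is signalled by holding the LED lit (1) or dark (0) for a fixed time slot, and a receiver that samples brightness once per slot can recover the bits. The sketch below simulates that channel in Python; the 4,000 bit/s rate comes from the paper, while the framing (plain 8-bit bytes, most significant bit first, no error correction) is an illustrative assumption.

```python
BIT_RATE = 4000          # bits per second, the maximum reported in the paper
SLOT = 1.0 / BIT_RATE    # seconds the LED holds each state

def encode(data: bytes) -> list[int]:
    """Turn bytes into a flat list of LED states (1 = lit, 0 = dark)."""
    bits = []
    for byte in data:
        for i in range(7, -1, -1):   # most significant bit first
            bits.append((byte >> i) & 1)
    return bits

def decode(samples: list[int]) -> bytes:
    """Rebuild bytes from one brightness sample per bit slot."""
    out = bytearray()
    for i in range(0, len(samples) - 7, 8):
        byte = 0
        for bit in samples[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

secret = b"hunter2"
states = encode(secret)           # what the malware would flash
assert decode(states) == secret   # what the drone's camera would recover
print(round(len(states) * SLOT, 3))  # 56 bits leak in 0.014 seconds
```

At that rate a 256-bit encryption key fits into well under a tenth of a second of flickering, which is why the researchers argue nobody watching the machine would notice anything amiss.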

It wouldn’t be easy for hackers to pull off this trick. They’d have to design malware to control the LED light and then somehow place it on an air-gapped system, which typically is heavily protected.

They’d also need to find a way to read the signals from the LED light. To do so, a bad actor might hijack a security camera inside the building or fly a drone to spy through a window at night.

However, the danger of an LED light being hijacked can be easy to solve. The researchers recommend placing a piece of tape over the light, or disconnecting it from the computer.

Linux To Support Virtual GPUs

February 24, 2017
Filed under Computing

Open source’s Mr Sweary Linus Torvalds announced the general availability of the Linux 4.10 kernel series, which includes virtual GPU (Graphics Processing Unit) support.

“On the whole, 4.10 didn’t end up as small as it initially looked,” Linus wrote in the announcement.

The kernel has a lot of improvements, security features and support for the newest hardware components, which makes it more than just a routine update.

Most importantly, there is virtual GPU (Graphics Processing Unit) support, a new “perf c2c” tool that can be used to analyse cacheline contention on NUMA systems, support for Intel Cache Allocation Technology (which manages the L2/L3 caches of Intel processors), eBPF hooks for cgroups, hybrid block polling, and better writeback management.

A new “perf sched timehist” feature has been added in Linux kernel 4.10 to provide detailed history of task scheduling, and there’s experimental writeback cache and FAILFAST support for MD RAID5.

It looks like Ubuntu 17.04 will be the first stable OS to ship with Linux 4.10.

Courtesy-Fud

EU Data Protection Advocates Still Unhappy With Windows 10 Privacy Settings

February 22, 2017
Filed under Computing

European Union data protection watchdogs are indicating they are still concerned about the privacy settings of Microsoft’s Windows 10 operating system despite the U.S. company announcing changes to the installation process.

The watchdogs, a group made up of the EU’s 28 authorities responsible for enforcing data protection law, wrote to Microsoft last year expressing concerns about the default installation settings of Windows 10 and users’ apparent lack of control over the company’s processing of their data.

The group – referred to as the Article 29 Working Party – asked for more explanation of Microsoft’s processing of personal data for various purposes, including advertising.

“In light of the above, which are separate to the results of ongoing inquiries at a national level, even considering the proposed changes to Windows 10, the Working Party remains concerned about the level of protection of users’ personal data,” the group said in a statement which also acknowledged Microsoft’s willingness to cooperate.

Microsoft was not immediately available to comment.

A number of national authorities have already begun enquiries into Windows 10, including France which in July ordered Microsoft to stop collecting excessive user data.

The EU privacy group said that despite a new installation screen presenting users with five options to limit or switch off Microsoft’s processing of their data, it was not clear to what extent users would be informed about the specific data being collected.

Microsoft uses data collected through Windows 10 for different purposes, including advertising, the group said in its statement.

“Microsoft should clearly explain what kinds of personal data are processed for what purposes. Without such information, consent cannot be informed, and therefore, not valid.”

Is Blackberry Taking Nokia To Court?

February 22, 2017
Filed under Around The Net

A patent war is being fought between two of the smartphone industry’s leaders of yesteryear – Nokia and BlackBerry.

Blackberry filed a patent-infringement lawsuit against Nokia Oyj, demanding royalties on the Finnish company’s mobile network products that use an industry wide technology standard.

Blackberry moaned that Nokia’s Flexi Multiradio base stations, radio network controllers and Liquid Radio software are using technology covered by as many as 11 patents owned by BlackBerry.

It added that Nokia was “encouraging the use” of the standard-compliant products without a license from BlackBerry.

BlackBerry did not say how much it wanted Nokia to cough up, but the lawsuit would appear to be part of Chief Executive Officer John Chen’s drive to find new ways to pull revenue out of BlackBerry’s technology.

He’s used acquisitions to add a suite of software products and negotiated licensing agreements to take advantage of the company’s thick book of wireless technology patents.

Nokia is aware of the inventions because the company has cited some of the patents in some of its own patent applications, BlackBerry said.

Some of the patents were owned by Nortel and Nokia had at one point tried to buy them as part of a failed bid for Nortel’s business in 2009, according to Blackberry.

BlackBerry was part of a group called Rockstar Consortium that bought Nortel’s patents out of bankruptcy for $4.5 billion in 2011. The patents were split up between the members of the group, which included Apple and Microsoft.

Since BlackBerry contends that the patents cover essential elements of a mobile telecommunications standard known as 3GPP, it has pledged to license them on fair and reasonable terms.

Courtesy-Fud

Why Are The NPD Games Sales Kept Private?

February 22, 2017
Filed under Gaming

When I first began my career in the games industry I wrote a story about an impending digital download chart.

It was February 2008 and Dorian Bloch – who was leader of UK physical games data business Chart-Track at the time – vowed to have a download Top 50 by Christmas.

That chart never materialised, though it wasn’t for want of trying. Digital retailers, including Steam, refused to share the figures and insisted it was down to the individual publishers and developers to do the sharing (in contrast to the retail space, where the stores are the ones that do the sharing). This led to an initiative in the UK where trade body UKIE began using its relationships with publishers to pull together a chart. However, after some initial success, the project ultimately fell away once the sheer scale of the work involved became apparent.

Last year in the US, NPD managed to get a similar project going and is thus far the only public chart that combines physical and digital data from accurate sources. However, although many big publishers are contributing to the figures, there remain some notable absentees, along with a lack of smaller developers and publishers.

In Europe, ISFE is just ramping up its own project and has even begun trialling charts in some territories (behind closed doors); however, it currently lacks the physical retail data in most major markets. This overall lack of information has seen a rise in the number of firms trying to plug the hole in our digital data knowledge. Steam Spy uses a Web API to gather data from Steam user profiles to track download numbers – a job it does fairly accurately (albeit not all of the time).

SuperData takes point-of-sale and transaction information from payment service providers, plus some publishers and developers, which means it can track actual spend. It’s strong on console, but again, it’s not 100% accurate. The mobile space has a strong player in App Annie collecting data, although developers in the space find the cost of accessing this information high.

It feels unusual to be having this conversation in 2017. In a market that is now predominantly digital, the fact we have no accurate way of measuring our industry seems absurd. Film has almost daily updates of box office takings, the music market even tracks streams and radio plays… we don’t even know how many people downloaded Overwatch, or where Stardew Valley would have charted. So what is taking so long?

“It took a tremendous amount of time and effort from both the publisher and NPD sides to make digital sales data begin to flow,” says Mat Piscatella, NPD’s US games industry analyst. NPD’s monthly digital chart is the furthest the industry has come to accurate market data in the download space.

“It certainly wasn’t like flipping a switch. Entirely new processes were necessary on both sides – publishers and within NPD. New ways of thinking about sales data had to be derived. And at the publishers, efforts had to be made to identify the investments that would be required in order to participate. And of course, most crucially, getting those investments approved. We all had to learn together, publishers, NPD, EEDAR and others, in ways that met the wants and needs of everyone participating.

“Over time, most of the largest third-party publishers joined the digital panel. It has been a remarkable series of events that have gotten us to where we are today. It hasn’t always been smooth; and keep in mind, at the time the digital initiative began, digital sales were often a very small piece of the business, and one that was often not being actively managed. Back then, publishers may have been letting someone in a first-party operations or brand marketing role post the box art to the game on the Sony, Microsoft and Steam storefronts, and that would be that. Pricing wouldn’t be actively managed, sales might be looked at every month or quarter, but this information certainly was not being looked at like packaged sales were. The digital business was a smaller, incremental piece of the pie then. Now, of course, that’s certainly changed, and continues to change.”

“For one, the majors are publicly traded firms, which means that any shared data presents a financial liability,” says SuperData CEO Joost van Dreunen. “Across the board the big publishers have historically sought to protect the sanctity of their internal operations because of the long development cycles and high capital risks involved in AAA game publishing. But, to be honest, it’s only been a few years that especially legacy publishers have started to aggregate and apply digital data, which means that their internal reporting still tends to be relatively underdeveloped. Many of them are only now building the necessary teams and infrastructure around business intelligence.”

Indeed, both SuperData and NPD believe that progress – as slow as it may be – has been happening. And although some publishers are still holding out or refusing to get involved, that resolve is weakening over time.   “For us, it’s about proving the value of participation to those publishers that are choosing not to participate at this time,” Piscatella says. “And that can be a challenge for a few reasons. First, some publishers may believe that the data available today is not directly actionable or meaningful to its business. The publisher may offer products that have dominant share in a particular niche, for example, which competitive data as it stands today would not help them improve.

“Second, some publishers may believe that they have some ‘secret sauce’ that sharing digital sales data would expose, and they don’t want to lose that perceived competitive advantage. Third, resources are almost always stretched thin, requiring prioritisation of business initiatives. For the most part, publishers have not expanded their sales planning departments to keep pace with all of the overwhelming amount of new information and data sources that are now available. There simply may not be the people power to effectively participate, forcing some publishers to pass on participating, at least for now.

“So I would certainly not classify this situation as companies ‘holding out’ as you say. It’s that some companies have not yet been convinced that sharing such information is beneficial enough to overcome the business challenges involved. Conceptually, the sharing of such information seems very easy. In reality, participating in an initiative like this takes time, money, energy and trust. I’m encouraged and very happy so much progress has been made with participating publishers, and a tremendous amount of energy is being applied to prove that value to those publishers that are currently not participating.”

NPD’s achievement is significant because it has managed to convince a good number of bigger publishers, and those with particularly successful IP, to share figures. This has long been seen as a stumbling block, because for those companies performing particularly well, the urge to share data is reduced. I’ve heard countless comments from sales directors who have said that ‘sharing download numbers would just encourage more competitors to try what we’re doing.’ It’s why van Dreunen has noted that “as soon as game companies start to do well, they cease the sharing of their data.”

Indeed, it is often fledgling companies, and indie studios, that need this data more than most. It’s part of the reason behind the rise of Steam Spy, which prides itself on helping smaller outfits.

“I’ve heard many stories about indie teams getting financed because they managed to present market research based on Steam Spy data,” boasts Sergey Galyonkin, the man behind Steam Spy. “Just this week I talked to a team that got funded by Medienboard Berlin-Brandenburg based on this. Before Steam Spy it was harder to do proper market research for people like them.

“Big players know these numbers already and would gain nothing from sharing them with everyone else. Small developers have no access to paid research to publish anything.

“Overall I’d say Steam Spy helped to move the discussion into a more data-based realm and that’s a good thing in my opinion.”

The games industry may be unusually backward when it comes to sharing its digital data, but there are signs of a growing willingness to be more open. A combination of trade body and media pressure has convinced some larger publishers to give it a go. Furthermore, publishers are starting to feel obligated to share figures anyway, especially when the likes of SuperData and Steam Spy are putting out information whether publishers want them to or not.

Indeed, although the chart Dorian promised me nine years ago is still AWOL, there are at least some figures out there today that give us a sense of how things are performing.

“When we first started SuperData six years ago there was exactly zero digital data available,” van Dreunen notes. “Today we track the monthly spending of 78 million digital gamers across platforms, in spite of heavy competition and the reluctance from publishers to share. Creating transparency around digital data is merely a matter of market maturity and executive leadership, and many of our customers and partners have started to realize that.”

He continues: “The current inertia comes from middle management that fears new revenue models and industry changes. So we are trying to overcome a mindset rather than a data problem. It is a slow process of winning the confidence and trust of key players, one at a time. We’ve managed to broker partnerships with key industry associations, partner with firms like GfK in Europe and Kadokawa Dwango in Japan to offer a complete market picture, and win the trust of big publishers. As we all move into the next era of interactive entertainment, the need for market information will only increase, and those that have shown themselves willing to collaborate and take a chance are simply better prepared for the future.”

NPD’s Piscatella concludes: “The one thing I’m most proud of, and impressed by, is the willingness of the participating publishers in our panel to work through issues as they’ve come up. We have a dedicated, positive group of companies working together to get this information flowing. Moving forward, it’s all about helping those publishers that aren’t participating understand how they can benefit through the sharing of digital consumer sales information, and in making that decision to say ‘yes’ as easy as possible.

“Digital selling channels are growing quickly. Digital sales are becoming a bigger piece of the pie across the traditional gaming market. I fully expect participation from the publishing community to continue to grow.”

Courtesy-GI.biz

Android Apps Pose Security Risk For Cars

February 20, 2017
Filed under Around The Net

Android applications that allow millions of drivers to remotely locate and unlock their vehicles are missing security features that could prevent tampering by hackers.

Researchers from antivirus vendor Kaspersky Lab took seven of the most popular Android apps that accompany connected cars from various manufacturers, and analyzed them from the perspective of a compromised Android device. The apps and manufacturers have not been named.

The researchers looked at whether such apps use any of the available countermeasures that would make it hard for attackers to hijack them when the devices they’re installed on are infected with malware. Other types of applications, such as banking apps, have such protections.

The analysis revealed that none of the tested applications used code obfuscation to make it harder for attackers to reverse-engineer them, and none of them used code integrity checks to prevent malicious manipulation.

Two applications didn’t encrypt the login credentials stored locally and four encrypted only the password. None of the apps checked if the devices they’re running on are rooted, which could indicate that they’re insecure and possibly compromised.
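
A root check of the kind the apps omitted is cheap to implement. On Android it would be written in Java or Kotlin, but the logic amounts to asking whether any well-known su binaries or root-manager artefacts exist on disk. The sketch below shows that logic in Python; the path list is an assumption drawn from common practice, not taken from the (unnamed) apps in the study.

```python
import os

# Paths where the su binary or root-manager apps commonly live on
# rooted Android devices (illustrative, not exhaustive).
COMMON_ROOT_ARTEFACTS = [
    "/system/bin/su",
    "/system/xbin/su",
    "/sbin/su",
    "/system/app/Superuser.apk",
]

def device_looks_rooted(paths=COMMON_ROOT_ARTEFACTS) -> bool:
    """Return True if any known root artefact is present on disk."""
    return any(os.path.exists(p) for p in paths)
```

An app that found such artefacts could, for example, decline to cache credentials locally rather than trust a possibly compromised device.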

Finally, none of the tested applications used overlay protections to prevent other apps from drawing over their screens. Some malware apps display fake log-in screens on top of other apps to trick users into exposing their log-in credentials.

While compromising connected-car apps might not directly enable theft, it could make it easier for would-be thieves. Most such apps, or the credentials they store, can be used to remotely unlock the vehicle and disable its alarm system.

Also, the risks are not “limited to mere car theft,” the Kaspersky researchers said in a blog post. “Accessing the car and deliberate tampering with its elements may lead to road accidents, injuries, or death.”

While manufacturers are rushing to add smart features to cars that are meant to improve the experience for car owners, they tend to focus more on securing the back-end infrastructure and the communications channels. However, the Kaspersky researchers warn that client-side code, such as the accompanying mobile apps, should not be ignored, as it is the easiest target for attackers and most likely the most vulnerable spot.

“Being an expensive thing, a car requires an approach to security that is no less meticulous than that of a bank account,” the researchers said.

Will Politics Bring Down The Gaming Industry?

February 20, 2017
Filed under Gaming

If you’re someone who makes a living from videogames – as most readers of this site are – then political developments around the world at the moment should deeply concern you. I’m sure, of course, that a great many of you are concerned about things ranging from President Trump’s Muslim travel ban to the UK Parliament’s vote for “Hard Brexit” or the looming elections in Holland and France simply on the basis of being politically aware and engaged. However, there’s a much more practical and direct way in which these developments and the direction of travel which they imply will impact upon us. Regardless of personal ideology or beliefs, there’s no denying that the environment that seems to be forming is one that’s bad for the medium, bad for the industry, and will ultimately be bad for the incomes and job security of everyone who works in this sector.

Video games thrive in broadly the same conditions as any other artistic or creative medium, and those conditions are well known and largely undisputed. Creative mediums benefit from diversity; a wide range of voices, views and backgrounds being represented within a creative industry feeds directly into a diversity of creative output, which in turn allows an industry to grow by addressing new groups of consumers. Moreover, creative mediums benefit from economic stability, because when people’s incomes are low or uncertain, entertainment purchases are often among the first to fall.

Once upon a time, games had such strong underlying growth that they were “recession proof,” but this is no longer the case. Indeed, it was never entirely an accurate reading anyway, since broader recessions undoubtedly did slow down – though not reverse – the industry’s growth. Finally, as a consequence of the industry’s broad demographic reach, expansion overseas is now the industry’s best path to future growth, and that demands continued economic progress in the developing world to open up new markets for game hardware and software.

What is now happening on a global basis threatens all of those conditions, and therefore poses a major commercial threat to the games business. That threat must be taken especially seriously given that many companies and creators are already struggling with the enormous challenges that have been thrown up by the messy and uneven transition towards smart devices, and the increasing need to find new revenue streams to support AAA titles whose audience has remained largely unchanged even as development budgets have risen. Even if the global economic system looked stable and conditions were ideal for creative industries, this would be a tough time for games; the prospect of restrictions on trade and hiring, and the likelihood of yet another deep global recession and a slow-down in the advances being made by developing economies, make this situation outright hazardous.

Consider the UK development industry. For well over a decade, if you asked just about any senior figure in the UK industry what the most pressing problem they faced was, they’d give you the same answer: skills shortages. Hiring talented staff is tough in any industry, but game development demands highly skilled people from across a range of fields, and assembling that kind of talent isn’t cheap or easy – even when you have access to the entire European Union as a hiring base, as UK companies did. Now UK companies face having to fill their positions with a much smaller pool of talent to draw from, and hiring from abroad will be expensive, complex and, in many cases, simply impossible.

The US, too, looks like it may tighten visa regulations for skilled hires from overseas, which will have a hugely negative impact on game development there. There are, of course, many skilled creatives who work within the borders of their own country, but the industry has been built on labour flows; centres of excellence in game development, like the UK and parts of the US, are sustained and bolstered by their ability to attract talent from overseas. Any restriction on that will impact the ability of companies to create world-class games – it will make them poorer creatively and throw hiring roadblocks in the path of timely, well-polished releases.

Then there’s the question of trade barriers; not only tariffs, which seem likely to make a comeback in many places, but non-tariff barriers in terms of diverse regulations and standards that will make it harder for companies to operate across national borders. The vast majority of games are multinational efforts; assets, code, and technology are created in different parts of the world and brought together to create the final product. Sometimes this is because of outsourcing, other times it’s because of staff who work remotely, and very often it’s simply because a certain piece of technology is licensed from a company overseas.

If countries become more hostile to free trade, all of that will become more complex and expensive. And that’s even before we start to think about what happens to game hardware, from consoles that source components from across Asia before assembly in China or Japan, to PC and smart device parts that flow out of China, Korea, Taiwan and, increasingly, from developing nations in South-East Asia. If tariff barriers are raised, all of those things will get a lot more expensive, limiting the industry’s consumer base at the most damaging time possible.

Such trade barriers – be they tariff or non-tariff barriers – would disproportionately impact developing countries. Free trade and globalisation have had negative externalities, unquestionably, but by and large they have contributed to an extraordinary period of prosperity around the world, with enormous populations of people being lifted out of poverty in recent decades and many developing countries showing clear signs of a large emerging middle class. Those are the markets game companies desperately want to target in the coming decade or so. In order for the industry to continue to grow and prosper, the emerging middle class in countries like India, Brazil and Indonesia needs to be cultivated as a new wave of game consumers, just as many markets in Central and Eastern Europe were a decade ago.

The current political attacks on the existing order of world trade threaten to cut those economies off from the system that has allowed them to grow and develop so quickly, potentially hurling them into deep recession before they have an opportunity to cement stable, sustainable long-term economic prosperity. That’s an awful prospect on many levels, of course (it goes without saying that many of the things under discussion threaten human misery and catastrophe that far outweighs the impact on the games business), but one consequence will likely be a hard stop to the games industry’s capacity to grow in the coming years.

It’s not just developing economies whose consumers are at risk from a rise of protectionism and anti-trade sentiments, however. If we learned anything from the 2008 crash and the recession that followed, it should be that the global economy largely runs not on cash, but on confidence. The entire edifice is built on a set of rules and standards that are designed to give investors confidence; the structure changes over time, of course, but only slowly, because stability is required to allow people to invest and to build businesses with confidence that the rug won’t be tugged out from underneath them tomorrow. From the rhetoric of Donald Trump to the hardline Brexit approach of the UK, let alone the extremist ideas of politicians like Marine Le Pen and Geert Wilders, the current political movement deeply threatens that confidence. Only too recently we’ve seen what happens to ordinary consumers’ job security and incomes when confidence disappears from the global economy; a repeat performance now seems almost inevitable.

Of course, the games industry isn’t in a position to do anything about these political changes – not alone, at least. The same calculations, however, apply to a wide variety of industries, and they’re all having the same conversations. Creative industries are at the forefront for the simple reason that they will be the first to suffer should the business environment upon which they rely turn negative, but in opposing those changes, creative businesses will find allies across a wide range of industries and sectors.

Any business leader that wants to throw their weight behind opposing these changes on moral or ethical grounds is more than welcome to, of course – that’s a laudable stance – but regardless of personal ideology, the whole industry should be making its voice heard. The livelihoods of everyone working in this industry may depend on the willingness of the industry as a whole to identify these commercial threats and respond to them clearly and powerfully.

Courtesy-GI.biz

AT&T Has Plans For IoT Network Later This Year

February 16, 2017
Filed under Mobile

AT&T is accelerating the planned debut of LTE-M, an IoT network that’s already being used to track shipping containers and pallets, monitor water use and connect vehicle fleets to the internet.

The carrier said Tuesday it will have nationwide LTE-M coverage in the U.S. by the middle of this year, six months ahead of schedule. Previously, AT&T had said LTE-M would cover the U.S. by year’s end.

That means everywhere in the country that AT&T has an LTE network, it will also offer LTE-M. By the end of the year, it will have LTE-M across Mexico too, creating a broad coverage area for businesses that operate on both sides of the border.

LTE-M is one of several LPWANs (low-power, wide-area networks) that are emerging to link sensors and other devices to the internet of things. It’s not as fast as the LTE that smartphones use, but it’s designed to allow for longer battery life, lower cost, smaller parts and better coverage. LTE-M has a top speed of around 1Mbps upstream and downstream and a range of up to 100 kilometers (62 miles), including better penetration through walls.

AT&T is part of a wave of mobile operators considering or rolling out LTE-M. Others include Orange in France and SoftBank in Japan. AT&T launched its first commercial trial of LTE-M last October in San Ramon, California, and has since opened another in Columbus, Ohio.

Several companies are already using the network for enterprise and consumer applications, AT&T said. They include Capstone Metering, a supplier of wireless water meters; RM2, which makes storage pallets with sensors for monitoring inventory; and PepsiCo, which is using LTE-M to collect usage data from soda fountains. Consumers can dispense their own blends of soda from these fountains, and PepsiCo uses sensors to keep the fountains stocked and learn what blends are popular.

There are already several emerging LPWAN systems from mobile operators and other service providers. The growing LoRaWAN, Sigfox and Ingenu technologies come from outside the traditional mobile industry.

LTE-M and another technology, NB-IoT, are based on LTE and are designed to run over carriers’ licensed spectrum. They may be the best options for enterprises concerned about interference and security, Ovum analyst Daryl Schoolar said.

 

Is The Gaming Industry Due For An Overhaul?

February 16, 2017 by  
Filed under Gaming

Physical retailers are calling for a change in how video game pre-orders are conducted.

They are speaking to publishers and platform holders about the possibility of selling games before the release date. Consumers could pick up the disc one to three weeks before launch, but it would remain ‘locked’ until launch day.

The whole concept stems from the pre-loading service available in the digital space. Today, consumers can download a game via Steam, Xbox Live and PSN before it’s out, and the game becomes unlocked at midnight on launch day for immediate play (after the obligatory day one patch).
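Mechanically, the ‘locked until launch’ model reduces to a timestamp gate. A minimal sketch, with a hypothetical launch date (a real system would check a trusted server clock and a DRM entitlement, not just the local clock):

```python
from datetime import datetime, timezone

# Hypothetical launch timestamp; a real system would validate against a
# trusted server clock, not the console's local clock.
LAUNCH_TIME = datetime(2017, 3, 1, 0, 0, tzinfo=timezone.utc)

def is_unlocked(now=None):
    """Return True once the launch moment has passed."""
    if now is None:
        now = datetime.now(timezone.utc)
    return now >= LAUNCH_TIME

# A pre-loaded (or pre-purchased) copy stays locked the night before launch...
print(is_unlocked(datetime(2017, 2, 28, 23, 59, tzinfo=timezone.utc)))  # False
# ...and unlocks at midnight on launch day.
print(is_unlocked(datetime(2017, 3, 1, 0, 0, tzinfo=timezone.utc)))     # True
```

The gate itself is trivial; the hard parts, as the retailers’ proposal acknowledges, are manufacturing lead times and server-side enforcement.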

It makes sense to roll this out to other distribution channels. The idea of going into a shop to order a game, and then returning a month later to buy it, always seemed frankly antiquated.

Yet it’s not only consumer-friendly; it’s potentially retailer- and publisher-friendly, too.

For online retailers, the need to hit an embargo is costly: games need to be turned around rapidly to get them into consumers’ hands on day one.

For mainstream retailers, it would clear up a lot of confusion. These stores are not naturally built for pre-ordering product, with staff that are more used to selling bananas than issuing pre-order receipts. The fact you can immediately take the disc home would help – it could even boost impulse sales.

Meanwhile, specialist retailers will be able to make a longer ‘event’ of the game coming out, and avoid the situation of consumers cancelling pre-orders or simply not picking up the game.

Yet when retail association ERA approached some companies about the prospect of doing this, it struggled to find much interest from the publishing community. So what’s the problem?

There are a few challenges.

There are simple logistical obstacles. Games often go Gold just a few weeks before they’re launched, and then it’s over to the disc manufacturers, the printers, the box makers and the distributors to get that completed code onto store shelves. This process can take two weeks in itself. Take the recent Nioh. That game was available to pre-download just a few days before launch – so how difficult would it be to get that into a box, onto a lorry and into a retailer in advance of release?

It also benefits some retailers more than others – particularly online ones, and those with strong distribution channels.

For big games, there’s a potential challenge when it comes to bandwidth. If everyone who pre-ordered Call of Duty goes online straight away at 12:01, that would put a lot of pressure on servers.

Piracy may also be an issue, because it makes the code available ahead of launch.

The end of the midnight launch may be happening anyway, but not for all games. If consumers can get their game without standing in the cold for 2 hours, then they will. And those lovely marketable pictures of snaking queues will be a thing of the past.

None of these obstacles are insurmountable. Finishing games further ahead of launch is something most big publishers are already trying to do, and this mechanism would help force the issue. Of course, the disc doesn’t actually have to contain a game at all. It could be an unlock mechanism for a download, which would allow the discs to be ready far in advance of launch. That strategy is significantly riskier, though, considering the consumer reaction when Xbox proposed the same model back in 2013.

As for midnight events, there are still ways to generate that big launch ‘moment’. Capcom released Resident Evil 7 with an experiential haunted house experience that generated lots of media attention and attracted a significant number of fans. Pokémon last year ran a big fan event for Sun and Moon, complete with a shop, activities, signing opportunities and the chance to download Mew.

So there are other ways of creating launch theatre than inviting consumers to wait outside a shop. If anything, having the game available in advance of launch will enable these theatrical marketing events to last longer. And coupled with influencer activity, it would actually drive pre-release sales – not just pre-release demand.

However, the reality is this will work for some games and not for others, and here lies the heart of the challenge.

Pre-ordering is already a relatively complex matter, so imagine what it’ll be like if some games can be taken home in advance and others can’t. How many instances can we expect of people complaining that ‘their disc doesn’t work’?

If this is going to work, it needs cross-industry support, which isn’t going to happen. This is a business that can’t even agree on a digital chart, don’t forget.

What we may well see is someone giving this concept a go. Perhaps a digital native publisher, like Blizzard or Valve, who can make it part of their PR activity.

Because if someone like that can make the idea work, then others will follow.

Courtesy-GI.biz

Apple’s iCloud Retained Deleted Browser History

February 14, 2017 by  
Filed under Around The Net

Apple’s iCloud appears to have been retaining users’ deleted internet browsing histories, including records over a year old.

Moscow-based forensics firm Elcomsoft noticed it was able to pull supposedly deleted Safari browser histories from iCloud accounts, including the date and time each site was visited and when the record was deleted.

“In fact, we were able to access records dated more than one year back,” wrote Elcomsoft’s CEO Vladimir Katalov in a blog post.

Users can set iCloud to store their browsing history so that it’s available from all connected devices. The researchers found that when a user deletes that history, iCloud doesn’t actually erase it but keeps it in a format invisible to the user.

The company discovered the issue with its Phone Breaker product, a forensic tool designed to streamline extracting files from an iCloud account.

Keeping a copy of a user’s browser history can certainly be “invaluable for surveillance and investigations,” Katalov said. But it’s unclear if Apple knew that its iCloud service was storing the deleted records.

On Thursday, Apple didn’t immediately respond to a request for comment, but since Elcomsoft’s blog post went live, Apple appears to be “purging” older browser history records from iCloud, the forensics firm said.

“For what we know, they could be just moving them to other servers, making deleted records inaccessible from the outside,” the blog post said. But now only deleted records up to two weeks old can be extracted, the company said.

Elcomsoft has previously found that Apple was saving users’ call histories to iCloud while offering no explicit way to turn the syncing on or off. At the time, Apple responded that its call-syncing function was designed for convenience, allowing customers to return phone calls from any device.

For users concerned about their privacy, Elcomsoft said they can opt out of syncing their Safari browsing history with iCloud.

FBI Used Malware To Hack Computers In 120 Countries

February 14, 2017 by  
Filed under Around The Net

Privacy advocates have alleged in court that an FBI hacking operation to bust up a child pornography site was unconstitutional and violated international law.

That’s because the operation involved the FBI hacking 8,700 computers in 120 countries, based on a single warrant, they said.

“How will other countries react to the FBI hacking in their jurisdictions without prior consent?” wrote Scarlet Kim, a legal officer with U.K.-based Privacy International.

On Friday, that group, along with the Electronic Frontier Foundation (EFF) and the American Civil Liberties Union of Massachusetts, filed briefs in a lawsuit involving the FBI’s hacking operation against Playpen. The child pornography site was accessible through Tor, a browser designed for anonymous web surfing. But in 2014, the FBI managed to take it over.

In a controversial move, the agency then decided to use the site to essentially infect visitors with malware as a way to track them down.

As a result, the FBI is prosecuting hundreds of people found visiting the site, but in the process it also hacked into computers in 120 countries.

On Friday, the three privacy groups filed briefs in a case involving Alex Levin, a suspect in the FBI’s Playpen investigation who’s appealing the way the agency used malware to gather evidence against him.

Privacy International claims that the warrant the FBI used to conduct the hacking is invalid. This is because the U.S. was overstepping its bounds by conducting an investigation outside its borders without the consent of affected countries, the group said.

According to Privacy International, the case also raises important questions: What if a foreign country had carried out a similar hacking operation that affected U.S. citizens? Would the U.S. welcome this?

The EFF and ACLU also claim that the FBI’s warrant was invalid, but they cite the U.S. Constitution, which protects citizens from unreasonable searches.

“Here, on the basis of a single warrant, the FBI searched 8,000 computers located all over the world,” EFF attorney Mark Rumold wrote in a blog post. “If the FBI tried to get a single warrant to search 8,000 houses, such a request would unquestionably be denied.”

A key concern is that a warrant to hack into so many computers will set a precedent. “Even serious crimes can’t justify throwing out our basic constitutional principles,” Rumold said.

Take-Two Goes Up But Misses The Mark

February 14, 2017 by  
Filed under Gaming

Take-Two today reported its financial results for the three months ended December 31, and they paint a mixed picture of the company’s performance for the holiday season.

Speaking with GamesIndustry.biz, Take-Two chairman and CEO Strauss Zelnick touted the company’s holiday slate of releases, mostly updating numbers revealed around Take-Two’s last earnings report. Mafia III has now sold-in approximately 5 million copies, while Civilization VI has surpassed 1.5 million units sold-in. NBA 2K17 has sold-in nearly 7 million units (up about 10% year-over-year), while Grand Theft Auto V continues to move copies, with sell-in now topping 75 million. The company’s recurrent consumer spending business (virtual currency, microtransactions, and DLC) has also done well, Zelnick said, noting that Grand Theft Auto Online posted a record number of players in December.

Despite some of those gaudy numbers, the quarter was not an unqualified success. The publisher reported GAAP net revenues of $476.5 million, up 15% year-over-year but near the low end of its $475 million to $525 million guidance. Additionally, Take-Two’s guidance called for a net income of $17 million to $30 million, but it ultimately posted a net loss of $29.9 million for the quarter.

“I know it’s a bit clouded by GAAP reporting, which requires us to defer revenues, and requires us to accelerate costs related to those deferred revenues, so we have a mismatch,” Zelnick explained. “It can look like, from a GAAP point of view, that we’re not doing as well as we’re doing from a bookings and cash flow point of view.”

Total bookings for the quarter did indeed jump 51% year-over-year to $719 million, with the aforementioned titles and WWE 2K17 serving as the largest contributors to that number. Bookings from recurrent consumer spending did particularly well, growing 55% year-over-year and making up 23% of the company’s total bookings.

The holiday quarter also saw the release of Take-Two’s first VR efforts, Carnival Games VR and NBA2K VR Experience. The company didn’t provide any performance metrics for those titles, but it’s clear Zelnick wasn’t counting on them to contribute too much.

“We were happy to bring the titles to market because it was a reflection of the fact we have the R&D abilities to address video games in a VR format if and when that’s a meaningful part of the business,” he said. “I have expressed skepticism in the past, and I think that’s been borne out by the fact that the market for VR in video games remains quite small.”

Zelnick also addressed the company’s $250 million acquisition of Social Point, the Barcelona-based mobile developer of Dragon City and Monster Legends. As for how the new studio will be integrated into the company, Zelnick said the goal was more to support them to continue doing what they’ve already been successful doing, while being mindful not to mess with what works.

“What we like about Social Point is they have multiplicity, it’s not just one [hit] and that distinguishes them from a lot of people in this space,” Zelnick said. “And they know how to monetize those hits and interact with their audience. I’m hoping we can help them grow even faster, but minimally, we want to be supportive so they can keep doing what brought them to this place in the first place… the way we tend to integrate new creative acquisitions is we want those companies to retain their identity and their independence, and to continue to do what works in the market.”

That’s not to say the company is abandoning all hope of synergy. Zelnick said he hopes Take-Two can help lend its experience in Asian markets to help Social Point find success in those territories, while acknowledging that Take-Two can probably learn a few things about monetizing in a free-to-play environment that could be brought to bear on titles like NBA 2K Online and WWE Supercard.

Courtesy-GI.biz

AT&T Teams Up With IBM, Nokia To Tackle IoT Vulnerabilities

February 10, 2017 by  
Filed under Computing

Some of the most notable firms in security and the internet of things, including AT&T and Nokia, are joining forces to solve problems that they say make IoT vulnerable in many areas.

The IoT Cybersecurity Alliance, formed Wednesday, also includes IBM, Symantec, Palo Alto Networks, and mobile security company Trustonic. The group said it won’t set standards but will conduct research, educate consumers and businesses, and influence standards and policies.

As IoT technologies take shape, there’s a danger of new vulnerabilities being created in several areas. Consumer devices have been in the security spotlight thanks to incidents like the DDoS attacks last year that turned poorly secured set-top boxes and DVRs into botnets. But the potential weaknesses are much broader, spanning the network, cloud, and application layers, the new group said in a press release.

AT&T says over the past three years it has detected an increase of more than 3,000 percent in attackers scanning IoT devices for weaknesses. Enterprises aren’t confident their devices are secure, AT&T says.

“That combination of attacker interest and customer concern could damage or even derail the rosy future most vendors see for the Internet of Things,” Pund-IT analyst Charles King said in an email.

The Alliance vows to jointly research problems in those areas and in major IoT use cases such as connected cars, healthcare, industrial IoT, and so-called smart cities.

The group’s goals and methods are similar to what the Industrial Internet Consortium has been doing since 2014. IIC, which includes AT&T and IBM, also aims to define best practices and influence IoT standards in several areas, including security. But it’s focused specifically on industrial IoT.

Ransomware Becomes More Popular As Malware Declines

February 9, 2017 by  
Filed under Around The Net

A recently released global cyberthreat report found that 2016 was a mixed bag: malware was down slightly, but ransomware attacks soared to 167 times the number recorded in 2015.

In addition to that huge increase in ransomware, 2016 saw a new line of cybercrime: large-scale DDoS attacks launched through internet of things devices. The principal case occurred in October, when the Mirai botnet harnessed unprotected IoT devices, such as internet-ready cameras, resulting in a DDoS attack on Dyn’s servers.

The 2016 report, by cybersecurity company SonicWall, looked at data from daily network feeds sent from more than 1 million sensors in nearly 200 countries.

During all of 2016, SonicWall found that unique samples of malware fell to 60 million samples, down from 64 million in 2015, a 6.25 percent decrease. Total malware attempts also fell to 7.87 billion from 8.19 billion, a 4 percent decrease.

However, ransomware-as-a-service (RaaS), where ransomware is provided by cybercriminals to other bad guys as a service, rose, offering quick payoffs to cybercrooks, SonicWall found. Ransomware is malicious software designed to block access to a computer system until a ransom is paid to the attacker.

Ransomware attacks rose from 3.8 million in 2015 to 638 million in 2016, an increase of 167 times year over year. SonicWall theorized that ransomware was easier to obtain in 2016 and that criminals faced a low risk of getting caught or punished.
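The report’s headline multipliers check out against its raw counts, all taken from the figures above:

```python
# Sanity check of SonicWall's reported figures (numbers from the article).
ransomware_2015 = 3.8e6    # 3.8 million attacks in 2015
ransomware_2016 = 638e6    # 638 million attacks in 2016
print(int(ransomware_2016 / ransomware_2015))  # 167, the cited multiplier

malware_2015 = 64e6        # 64 million unique samples in 2015
malware_2016 = 60e6        # 60 million unique samples in 2016
print(round((malware_2015 - malware_2016) / malware_2015 * 100, 2))  # 6.25
```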

Ransomware was the “payload of choice for malicious email campaigns and exploits,” SonicWall said.

In 2016, the most popular malicious email campaigns were based on ransomware, typically Locky, which was deployed in more than 500 million total attacks throughout the year. No industry was spared: the mechanical and industrial engineering industry received 15% of ransomware hits, pharmaceuticals and financial services companies each received 13%, and real estate companies received 12%.

BlackBerry Announces New Line Of Business

February 9, 2017 by  
Filed under Mobile

BlackBerry unveiled a new line of business to provide developers with a secure, cloud-based, mobile communications platform for texting, voice, video and file sharing.

Developers can insert these capabilities into their existing custom apps and services using the new BBM Enterprise SDK (software developer kit), BlackBerry said. The SDK will be sold as a per-user license on a subscription basis to developers, including those employed at enterprises, and to independent software vendors (ISVs).

BlackBerry didn’t say what the licenses would cost, but said the price would be affordable, especially compared with communications products from competitors that usually charge on a usage basis for texts, voice and video calls. The SDK will be available worldwide later in February for apps running on iOS and Android.

All communications in the new platform will be highly secure and encrypted with keys kept under the management of the application developers, not BlackBerry, said Frank Cotter, vice president of enterprise products, in a conference call.
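The article doesn’t document the SDK’s actual API, but the key-ownership model Cotter describes can be illustrated with a generic standard-library sketch: the application vendor, not the messaging provider, holds the key, so the provider can relay messages without being able to forge or read into them. (This is an illustration of the concept, not BBM Enterprise SDK code; a real product would also encrypt the payload with FIPS 140-2 validated modules.)

```python
import hmac
import hashlib
import secrets

# Hypothetical developer-held key: generated and stored by the application
# vendor, never shared with the messaging provider.
developer_key = secrets.token_bytes(32)

def sign_message(payload: bytes) -> bytes:
    """Authenticate a payload so the relay can't forge or alter it."""
    return hmac.new(developer_key, payload, hashlib.sha256).digest()

def verify_message(payload: bytes, tag: bytes) -> bool:
    """Constant-time check that the payload arrived unmodified."""
    return hmac.compare_digest(sign_message(payload), tag)

tag = sign_message(b"patient status update")
print(verify_message(b"patient status update", tag))  # True
print(verify_message(b"tampered update", tag))        # False
```

Because only the developer holds `developer_key`, the transport operator can neither fabricate a valid message nor alter one in flight without detection, which is the custody arrangement BlackBerry is advertising.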

These communications will be transmitted via the Internet Protocol and not the SMS channel typically used by competitors. The communications also will be compliant with the Federal Information Processing Standard 140-2 that the U.S. government uses for approving cryptographic modules in devices, Cotter said.

Using the new BlackBerry platform will allow physicians who text patient information to stay within the requirements of HIPAA (Health Insurance Portability and Accountability Act), Cotter said. “Other vendors sidestep HIPAA and say they are just a pipe and that HIPAA doesn’t apply,” he said.

In one example, Cotter said an emergency room physician could use the communications platform to reach out to another doctor via a text, then quickly escalate that text to a voice or video call and transmit a picture of a patient’s injuries while continuing the call. “We bolt [our software] into existing workflows and apps,” he said.

In another example, Cotter said a dashboard tablet used by a police officer during a high-speed chase could quickly be turned to a secure channel with a dispatcher showing video from the scene and the police cruiser’s location.

BlackBerry already works with developers in a partnership program that has created more than 4,000 third-party enterprise apps, said Marty Beard, chief operating officer of BlackBerry. The new SDK promises to build on those apps, he said.
