Cisco Warns Of Bug In Virtual App

June 30, 2015 by Michael  
Filed under Computing

Cisco has warned of a default Secure Shell vulnerability in three of its virtual applications.

The flaw could allow attackers to decrypt traffic exchanged in the services, and has been detailed in a Cisco security advisory.

It affects Cisco’s Web Security Virtual Appliance (WSAv), Email Security Virtual Appliance (ESAv) and Security Management Virtual Appliance (SMAv), all of which are already commercially available.

Cisco said that it “is not aware of any public announcements or malicious use of the vulnerabilities”, but warned that attackers who got hold of the private keys could decrypt communications with a man-in-the-middle attack.

The default private encryption keys were preinstalled on all three of the products, a move which is considered bad security practice.

“Successfully exploiting this vulnerability on Cisco SMAv allows an attacker to decrypt communication toward SMAv, impersonate SMAv, and send altered data to a configured content appliance,” the advisory said.

“An attacker can exploit this vulnerability on a communication link toward any content security appliance that was ever managed by any SMAv.”

Cisco has released a patch which deletes the preinstalled SSH keys and explains how customers can correct the problem.

The Cisco-sa-20150625-ironport SSH Keys Vulnerability Fix comes as part of several product upgrades, and must be manually installed from a command line interface.

Cisco’s advisory said that the patch is not required for physical hardware appliances, or for virtual appliance downloads or upgrades after 25 June.
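
For administrators who want to satisfy themselves that an appliance is no longer presenting a shared default host key, one rough check is to compare each appliance’s SSH host key fingerprint against the known default. The sketch below is a minimal illustration of that idea, not Cisco’s documented remediation procedure; the hostnames and the default fingerprint are placeholders.

```python
"""Sketch: check whether an appliance still presents a known default SSH host key.

The appliance hostnames and DEFAULT_KEY_SHA256 below are placeholders you would
fill in from your own environment; this is not Cisco's official procedure.
"""
import base64
import hashlib
import subprocess

# Placeholder fingerprint of the shared default key (hypothetical value).
DEFAULT_KEY_SHA256 = "REPLACE_WITH_KNOWN_DEFAULT_FINGERPRINT"

APPLIANCES = ["smav.example.com", "esav.example.com", "wsav.example.com"]

def host_key_fingerprint(host: str) -> str:
    """Return the SHA-256 fingerprint of the host's RSA SSH key via ssh-keyscan."""
    out = subprocess.run(
        ["ssh-keyscan", "-t", "rsa", host],
        capture_output=True, text=True, check=True,
    ).stdout
    # ssh-keyscan key lines look like: "<host> ssh-rsa <base64-key>"
    key_b64 = out.split()[2]
    digest = hashlib.sha256(base64.b64decode(key_b64)).digest()
    return base64.b64encode(digest).decode().rstrip("=")

for host in APPLIANCES:
    fingerprint = host_key_fingerprint(host)
    status = "STILL USING DEFAULT KEY" if fingerprint == DEFAULT_KEY_SHA256 else "unique key"
    print(f"{host}: SHA256:{fingerprint} ({status})")
```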

Cisco revealed details of a new point of sale attack earlier this year that could part firms from money and customers from personal data.

The threat, called PoSeidon by the Cisco team, came at a time when eyes were on security breaches at firms like Target.

Cisco said in a blog post that PoSeidon is a threat that has the ability to breach machines and scrape them for credit card information.

Courtesy-TheInq

RedHat Goes PaaS With Linux

June 30, 2015 by Michael  
Filed under Computing

Red Hat has announced the release of OpenShift Enterprise (OSE) 3, a new version of its Platform-as-a-Service offering.

Based on Red Hat Enterprise Linux (RHEL) 7, OpenShift is built on Docker Linux containers with Kubernetes orchestration, using technology developed in collaboration with Google.

The news comes in a busy week for Red Hat, which has also announced a new productivity tie-up with Samsung and taken a leading role in the formation of a new alliance known as the Open Container Project to standardise containers.

Users will have access to a wide range of apps via the Red Hat Container Certification Programme. Middleware solutions, including Red Hat JBoss Enterprise Web Server (Tomcat) and JBoss A-MQ messaging, are also included.

Included are a number of tools to help developers create and collaborate, with web, command line and integrated development environment interfaces. Options include direct code pushes from Git and source-to-image builds, along with flexible deployment, rollback and integration.

In addition, a preview of OpenShift Dedicated has been released. The public cloud service, based on OpenShift 3, will succeed OpenShift Online, which already hosts 2.5 million applications, allowing businesses to quickly build, launch and deploy bespoke apps.

Ashesh Badani, vice president and general manager of OpenShift at Red Hat, said: “This release of OpenShift Enterprise 3 employs open source containers and orchestration practices to change the developer experience and move the platform in the direction of what customers are asking for – a flexible platform for a microservices architecture.

“Our continued upstream work in the Docker and Kubernetes communities enable us to deliver the most updated technology platform for developers and operators, enabling them to remain competitive through quicker innovation.”

To assist users, Red Hat is offering a range of enterprise administrator courses to teach users how to deploy, configure and manage the system, which can result in a Red Hat Certificate of Expertise in Platform as a Service – a worthy certificate for any office wall.

OpenShift 3 is available now, with bespoke pricing models based on socket and core pairings.

Courtesy-TheInq

Samsung To Stop Disabling Windows Updates On PCs, Tablets

June 29, 2015 by mphillips  
Filed under Computing

Samsung agreed to stop disabling Windows Update on its PCs and tablets, bowing to a chorus of complaints — including Microsoft’s — that it had interfered with the way users intended the patch service to work on their devices.

“We will be issuing a patch through the Samsung Software Update notification process to revert back to the recommended automatic Windows Update settings within a few days,” a Samsung spokesperson said in an emailed statement Friday afternoon.

Samsung’s pledge put an apparent end to the week’s kerfuffle, which began when Patrick Barker, a crash-debugging and reverse-engineering expert, and a Microsoft MVP (Most Valuable Professional), charged the Korean company with silently changing how Windows Update delivers bug fixes and security patches to customers.

Samsung’s own SW Update — a tool used to update its branded personal computers and tablets with new drivers and refresh third-party, pre-installed software — changed Windows Update’s settings to prevent it from automatically downloading and installing updates, the default setting that Microsoft recommends. Instead, SW Update switched the setting to “Check for updates but let me choose whether to download and install them.”
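
For readers curious where that setting lives, Windows stores the automatic update behaviour as a registry value commonly documented as AUOptions. The sketch below reads and interprets it; the key path and value meanings are assumptions based on public documentation for Windows 8.1-era systems, not something taken from Samsung’s SW Update code.

```python
"""Sketch: report the local Windows Update behaviour (Windows only).

The registry path and value meanings below are assumptions based on commonly
documented Windows Update settings, not on Samsung's or Microsoft's code.
"""
import winreg

AU_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\WindowsUpdate\Auto Update"
MEANINGS = {
    1: "Never check for updates",
    2: "Check for updates but let me choose whether to download and install them",
    3: "Download updates but let me choose whether to install them",
    4: "Install updates automatically (Microsoft's recommended default)",
}

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, AU_KEY) as key:
    au_options, _ = winreg.QueryValueEx(key, "AUOptions")

print(f"AUOptions = {au_options}: {MEANINGS.get(au_options, 'unknown setting')}")
```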

Microsoft didn’t care for that one bit. “We do not recommend disabling or modifying Windows Update in any way as this could expose a customer to increased security risks,” the company said Wednesday. “We are in contact with Samsung to address this issue.”

Samsung first said it was, like Microsoft, looking into Barker’s findings, but subsequently denied that it had blocked a Windows 8.1 update — a red herring, since that had never been alleged — and at the same time admitted it manipulated Windows Update.

By Friday, whatever conversations occurred between Microsoft and Samsung made the latter change its mind on messing with the former’s patch service. “Samsung has a commitment to security and we continue to value our partnership with Microsoft,” the Samsung statement read.

Seagate Goes OneDrive

June 26, 2015 by Michael  
Filed under Computing

Seagate has announced a tie-up with Microsoft’s OneDrive cloud service, offering users of Seagate Backup Plus drives 200GB of cloud storage for two years after redemption.

The offer is redeemable via the bundled Seagate Dashboard app, which also includes an interface to back up to OneDrive as well as Dropbox and Google Drive.

Seagate has also announced a new version of its 4TB Backup Plus drive, superseding the previous two-platter behemoth with a 20.5mm single-platter version.

This will be easier to carry around, and should be a bit more stable than the cross-volume edition currently in circulation.

“Seagate Technology continues to innovate at a fast pace based on the ever changing needs of its customers,” said Jingwen Li, research analyst for storage systems at IDC.

“With features such as OneDrive cloud storage, and the new 4TB capacity, the Seagate Backup Plus family addresses the growing need for data storage and backup with flexible data access and simple data management in the personal storage market.”

The updated 4TB version will arrive next month at $239, while the existing 500GB to 2TB options are already available starting at $79.99. Desktop versions stretch up to an 8TB version at $359.99.

After a frantic morning of registering the Seagate drives in the office to no avail, we confirmed with Seagate that the offer in fact applies only to drives manufactured after January 2015, so loyal existing customers don’t need to get all excited.

Microsoft opened up its API for OneDrive in February to make it easier for manufacturers to integrate.

More recently the firm opened up the API for Microsoft Office so that other cloud storage providers can integrate their own offerings directly, blowing the productivity market wide open.
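
As a rough illustration of the sort of integration the opened-up OneDrive API enables, the sketch below lists the contents of a user’s OneDrive root folder over HTTPS. The endpoint shape reflects the public OneDrive API of the period and the OAuth access token is a placeholder, so treat the details as assumptions rather than a supported recipe.

```python
"""Sketch: list items in a OneDrive root folder via the public OneDrive API.

The access token is a placeholder; obtaining one requires an OAuth flow that is
out of scope here. The endpoint form is an assumption based on the 2015-era
OneDrive API documentation.
"""
import requests

ACCESS_TOKEN = "REPLACE_WITH_OAUTH_ACCESS_TOKEN"  # placeholder

resp = requests.get(
    "https://api.onedrive.com/v1.0/drive/root/children",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=10,
)
resp.raise_for_status()

for item in resp.json().get("value", []):
    kind = "folder" if "folder" in item else "file"
    print(f"{kind:6} {item['name']} ({item.get('size', 0)} bytes)")
```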

Courtesy-TheInq

Cyber Criminals Find Financial Industry Most Lucrative

June 25, 2015 by Michael  
Filed under Computing

Banks, stock brokerages and credit card firms are becoming ever more desirable targets for money-hungry cyber crooks, with the finance sector now facing 300 percent more attacks than any other industry.

That’s according to the latest Finance Industry Drill Down report by security firm Websense, which says that because of the value at stake in compromising hosts in the finance sector, criminals are spending “a tremendous amount of time” on the investigation and lure stages of attacks.

This is to make sure attempts are sophisticated enough to be successful against any defences a business might have in place.

Based on telemetry data from the Websense ThreatSeeker Network and the Websense Advanced Classification Engine, the findings reveal that the financial sector is one of the biggest targets of cyber attacks, and that these attacks are increasing in volume and sophistication.

Some of these attacks can earn big bucks for those launching them. We spoke to Websense principal security analyst Carl Leonard, who told us that one of the most popular malware campaigns launched against the financial sector over the past year managed to steal a whopping £100,000.

Leonard added that these hackers usually work in small groups but, interestingly, are less skilled than they would have been traditionally because of the rise of “malware-as-a-service” (MaaS).

MaaS refers to a coveted underground network of cyber crooks and hackers who are always on the lookout for new tools to exploit. They exchange source code for existing malware and customise it with only basic skills, enabling even entry-level threat actors to create and launch data theft attacks.

This widespread availability of exploit kits for sale on underground cyber markets, combining old techniques with newer ones, results in attacks that are difficult to trace back to their source. Old threats are thus recycled into new ones and launched through email and web channels, “challenging even the most robust defensive postures”.

“The barriers to entry are reducing due to malware-as-a-service because you don’t need to have skills of previous years,” explained Leonard. “You can acquire the skills and malware kits that allow you to conduct tools as a malware author.

“These authors can subscribe to a service, which [constantly] provides them with very recent versions of malware.”

The most dangerous malware tools found by Websense to be wreaking havoc in the financial sector are:

Rerdom, an attack vector from the Asprox family of malware, is responsible for a huge number of attacks against every industry. The group behind it is often known as a spam generator, but the malware has broad functionality and is increasingly used to target financial services customers through malicious emails, click fraud, and the harvesting of FTP, browser and email credentials. It accounts for 30 percent of threats found in the financial services industry, Websense said.

Vawtrak, a malicious banking trojan, accounts for 13 percent of all threats in the sector and is used in attempts to steal a wide range of victims’ credentials. Websense said it was created to gather personal information and steal it without leaving traces. The banking trojan can easily take over passwords, digital certificates, browser history and cookies, and its defense mechanism tries to detect any installed antivirus software and disable it using the Windows mechanism called Software Restriction Policies (a defensive check for that kind of policy abuse is sketched after this list).

The third most popular malware threat seen in the financial sector is Geodo, which Websense said appears in the sector 400 percent more often than in any other industry. The security firm describes it as an update of the Cridex attack, with a separate email worm that seeks to steal credentials and self-perpetuate, thereby initiating more lures.
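
On the Vawtrak point about Software Restriction Policies, the sketch below shows the kind of defensive check an administrator might run: enumerating “Disallowed” SRP path rules in the registry to spot entries that block security software. The registry layout used here is the commonly documented SRP location and is an assumption, not something taken from a malware sample.

```python
"""Sketch: list Software Restriction Policy 'Disallowed' path rules (Windows only).

Vawtrak is reported to abuse SRP to block security software, so unexpected path
rules pointing at antivirus directories are worth a look. The registry layout
below is the commonly documented SRP location and is an assumption.
"""
import winreg

# "0" is the Disallowed security level in the documented SRP layout.
DISALLOWED_PATHS = r"SOFTWARE\Policies\Microsoft\Windows\Safer\CodeIdentifiers\0\Paths"

try:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISALLOWED_PATHS) as paths:
        index = 0
        while True:
            try:
                rule_name = winreg.EnumKey(paths, index)
            except OSError:
                break  # no more rules
            with winreg.OpenKey(paths, rule_name) as rule:
                item_data, _ = winreg.QueryValueEx(rule, "ItemData")
                print(f"Disallowed path rule: {item_data}")
            index += 1
except FileNotFoundError:
    print("No SRP 'Disallowed' path rules present.")
```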

Courtesy-TheInq

Oracle Extends Cloud Offerings, Takes On Amazon

June 24, 2015 by mphillips  
Filed under Around The Net

Oracle Corp founder and Executive Chairman Larry Ellison announced that his database company is expanding its cloud-computing offerings, bringing Oracle into more direct competition with Amazon.com Inc.

“We’re prepared to compete with Amazon.com on price,” said Ellison in a webcast presentation, after announcing that Oracle would offer online storage and capability for customers to run their applications entirely in Oracle’s cloud.

The expansion is a major new step for Oracle, which is shifting its traditional database and customer relationship management businesses to the cloud.

“This is a really big deal,” said Ellison, who stepped aside in 2014 as chief executive of the company.

Amazon Web Services is the market leader in providing cloud computing capability to customers, followed by Microsoft Corp’s Azure service and International Business Machines Corp.

Oracle, which calls its cloud offering the Oracle Cloud Platform, will provide a cost-effective alternative to Amazon, said Ellison.

“Our new archive storage service goes head-to-head with Amazon Glacier and it’s one-tenth their price,” said Ellison. Amazon did not immediately return a request for comment.

Oracle’s cloud business is growing quickly, running at a rate of about $2.3 billion a year in revenue, based on last quarter’s figures.

By comparison, Amazon and Microsoft get about $6.3 billion each in cloud revenue per year.

Should Nintendo Drop E3?

June 24, 2015 by Michael  
Filed under Computing

You know a company has had a particularly miserable E3 when, before the show is even over, one senior executive finds himself having to officially deny that another senior executive has apologized for the state of their E3 offerings. That’s exactly the situation Reggie Fils-Aime found himself in earlier this week, as the disappointment at Nintendo’s extremely weak showing crystallized around a single tweet sent by company president Satoru Iwata. The tweet was in Japanese; various translations floated around, some more accurate than others, and the media gleefully seized on an interpretation which had Iwata promising to “do better” at E3 in future. It was the perfect stick with which to beat Nintendo for failing to live up to the standards set by Microsoft and, even more spectacularly, by Sony on the previous day; look, even the company’s own president thinks it was rubbish!

As it happens, Fils-Aime is quite right; Iwata did not apologize for Nintendo’s conference. He said that the company was listening closely to feedback and would work hard, in future, to meet the expectations of even more people. This was prefaced with a comment related to the extremely late hour at which the show was broadcast in Japan (it didn’t start until 1am JST; the Sony conference the previous day was at a rather more comfortable 10am JST, and nobody in Japan really cares about the Microsoft conference). In context (and context is king in the Japanese language), Iwata’s comment is clearly a generic “thanks for your feedback, we’ll work hard in future too”, coupled with a tacit promise to try not to mess up the scheduling for Japanese viewers in future.

Iwata didn’t apologize. Of course he bloody didn’t; the Nintendo boss is often frank and refreshingly direct in his manner, but the content of his statements is always, always on-message. The idea that he was going to take to Twitter to say “sorry, that was a load of old bollocks wasn’t it?” after his company’s event is ludicrous. Yet, at the same time, the fact that it seemed plausible to so many people is a reflection of something troubling; Nintendo’s event was genuinely bad enough to make an apology from Iwata himself seem, if not realistic, then at least not ridiculous.

Nintendo, or at least a part of Nintendo – perhaps the Japanese part – didn’t want to be at E3. That’s partially related to NX; the company is the only platform holder which has acknowledged that it’s working on future hardware, but isn’t going to say anything further about it until 2016. It’s also too early to talk about its mobile titles (and E3 probably isn’t the venue for that anyway), and Iwata confirmed prior to the event that it wouldn’t talk about its health, lifestyle and education-related projects at a purely gaming event like E3. Nonetheless, there’s plenty that Nintendo could have talked about but didn’t. The choice to reveal only games that are locked in for release within the next 10 months or so isn’t confirmation of a time-of-death being decided for Wii U (they did the same thing for 3DS, which has an installed base twice the size of the PS4’s and isn’t going anywhere any time soon); it’s a decision of a piece with the choice to do an online broadcast rather than a live event – cutting out the whooping crowds and the spectacle that usually defines an E3 conference.

These are decisions which say, “we’re not playing your game” – the game in question being E3 itself. Nintendo doesn’t feel like it fits well with E3 right now. It’s not just troubled by the dismal sales of the Wii U, it’s also deeply uncomfortable with being the only major company in the industry that’s still seriously committed to family entertainment. It knows that no matter how wonderful its software and franchises are – and I maintain that Nintendo is in a genuine golden age regarding the quality of its games – they make problematic bedfellows for the mainstream of distinctly adult-focused games and the monetization of violent nostalgia for thirty-somethings. I think it’s genuinely wonderful that the games industry’s wings are spread so wide, even in the AAA space, that it can accommodate both the charming, gentle fun of Yoshi’s Wooly World and the gut-wrenching, visceral violence of the Doom reboot; at the same time, I can understand why the creators of the former don’t see much value in investing heavily in promoting it alongside the latter. Wrong place, wrong time, wrong audience. It’s no accident that one of the very few third-party games to appear in the Nintendo event was Skylanders, a hugely successful franchise that’s equally uncomfortable standing shoulder to shoulder with Call of Duty and Assassin’s Creed.

By going digital rather than having a staged event, by replacing its executives with loveable puppets, by giving developers lengthy, meandering videos to chat about their creative process after showing off their new trailers, by refusing to talk about anything but the immediate future of its software line-up – by all these decisions and more, Nintendo said “we’re not playing the E3 game” and attempted to dodge the inevitably negative contrasts with Sony and Microsoft.

It didn’t work. It didn’t work because it’s an intrinsically dishonest approach, one which not only failed to establish a “Nintendo difference” that forestalled negative contrasts, but which also robbed the company of the chance to make a decent fist of its showing. Nintendo hobbled its own event, making it even more disappointing than it needed to be, and all it achieved was to make itself look even weaker, even more troubled, next to the might of Sony and Microsoft.

Here’s what Nintendo should have done – should have had the courage to do – nothing. They should have held no digital event. Some of Nintendo of America’s activities, like the entertaining and light-hearted Nintendo World Championships, fit nicely with the week, but the digital event shouldn’t have happened at all. The company is absolutely correct to think that its approach and its products don’t fit E3 as it stands, but absolutely wrong to think that it can avoid the resulting negativity by just down-scaling its involvement. Pick a lane and stick with it; given the choice to go big or go home, Nintendo’s decision ought to have been “go home”, not “can’t we just go a bit small and hope for the best?”

This would not be unprecedented. Faced with a similar disconnect between their games and much of the rest of the industry’s direction, Nintendo – by far the largest games company in Japan – has spurned involvement in the Tokyo Game Show for many, many years. Being at TGS makes no sense for the company. It can achieve better exposure for its games in a more positive environment by holding its own event, digital or otherwise, at a different time; a month or two before the show, or after the show. This decision has never hurt Nintendo one jot – not in the way that a rubbish, half-hearted TGS conference every year would have.

Precisely the same logic applies to E3. Imagine if Nintendo had skipped E3 entirely; sure, there would have been a bit of hand-wringing and pearl-clutching in the media over it, but it would have been over soon, and a few people writing “Nintendo were conspicuous by their absence” in their show reports is hardly the end of the world. Then this week’s digital event could have been held as an ordinary digital event a month or six weeks later; call it “Nintendo’s preview of the next six months”, or whatever. In that context, it would actually have been a pretty great show. Tack on a few seconds of new footage from the upcoming open-world Zelda game and one of Miyamoto’s work-in-progress Gamepad titles, and you’d have a digital event that everyone would consider pretty strong, instead of an E3 show that everyone considered awful and weak.

To make this work, though, Nintendo needs to commit to the strategy. This year, it tried to have its cake and eat it; to participate in E3 without committing to it, without making a big deal of it. It failed so miserably that the Internet spent a few hours genuinely believing that Iwata had apologized for the whole sorry affair. Skipping E3 entirely – or at the very least, dropping all pretense of holding a conference during E3 week – would have been preferable, and ought to be the company’s strategy for the future.

Courtesy-GI.biz

The Linux Foundation Donates To Open Source Security

June 24, 2015 by Michael  
Filed under Computing

The Linux Foundation’s Core Infrastructure Initiative (CII) has announced a $500,000 investment in three projects designed to improve the security of open source technology and services.

The money will fund the Reproducible Builds, Fuzzing Project and False-Positive-Free Testing initiatives.

The $200,000 Reproducible Builds funding aims to support Debian developers Holger Levsen and Jérémy Bobbio in their efforts to improve the security of the Debian and Fedora operating systems by letting developers independently verify the authenticity of binary distributions.

The feature will help people working on the systems to avoid introducing flaws during the build process and reduce unneeded variations in distribution code.
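
The core idea is easy to demonstrate: if a build is reproducible, anyone can rebuild the source and get a bit-identical binary, so the hashes match. The sketch below is a toy comparison with placeholder file names, not the Reproducible Builds tooling itself.

```python
"""Sketch: verify that two independently built artifacts are bit-identical.

File paths are placeholders; in a real reproducible-builds check the second
artifact would come from an independent rebuild of the same source.
"""
import hashlib
from pathlib import Path

def sha256sum(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with Path(path).open("rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

official = sha256sum("package_1.0_amd64.deb")         # vendor-distributed binary
rebuilt = sha256sum("rebuilt/package_1.0_amd64.deb")  # your own rebuild

if official == rebuilt:
    print("Build is reproducible: hashes match:", official)
else:
    print("Hashes differ: the distributed binary cannot be independently verified.")
```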

The $60,000 Fuzzing Project investment will aid security researcher Hanno Böck’s efforts to coordinate and improve the fuzzing software testing technique that identifies security problems in software or computer systems.

It has been used successfully to find flaws in high-profile technologies including GnuPG and OpenSSL.
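
To give a flavour of the technique, the snippet below is a deliberately naive random fuzzer that throws junk bytes at a toy parser and records the inputs that make it blow up. Real projects use coverage-guided tools rather than blind randomness, and the parser here is purely illustrative.

```python
"""Sketch: a deliberately naive random fuzzer for a parsing function.

`parse_config` is a stand-in target; real fuzzing efforts use coverage-guided
tools such as AFL or libFuzzer rather than blind random inputs.
"""
import random

def parse_config(data: bytes) -> dict:
    """Toy parser: expects lines of 'key=value' text."""
    text = data.decode("utf-8")          # may raise UnicodeDecodeError
    pairs = (line.split("=", 1) for line in text.splitlines() if line)
    return {key: value for key, value in pairs}   # may raise ValueError

crashes = []
random.seed(0)
for _ in range(10_000):
    blob = bytes(random.randrange(256) for _ in range(random.randrange(64)))
    try:
        parse_config(blob)
    except Exception as exc:             # any unexpected exception is a finding
        crashes.append((blob, repr(exc)))

print(f"{len(crashes)} crashing inputs found; first few:")
for blob, error in crashes[:3]:
    print(error, blob[:16])
```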

The final $192,000, for False-Positive-Free Testing, will go to Pascal Cuoq, chief scientist and co-founder of TrustInSoft, for his work on building an open source TIS Interpreter that will reduce false positive TIS Analyser threat detections.

The overall funding will be overseen by Linux security expert Emily Ratliff, who expects the initiative to centralise the open source community’s security efforts.

“I’m excited to join the Linux Foundation and work on the CII because improving the security of critical open source infrastructure is a bigger problem than any one company can tackle on its own,” she said.

“I’m looking forward to working with CII members to more aggressively support underfunded projects and work to change the way the industry protects and fortifies open source software.”

The funding follows the discovery of several critical bugs in widely used open source technologies, one of the biggest of which was Heartbleed.

Heartbleed is a flaw in the OpenSSL implementation of the TLS protocol used by open source web servers such as Apache and Nginx, which host around 66 percent of all sites.

The funding is one of many initiatives launched by the Linux Foundation designed to prevent future Heartbleed-level flaws. The Linux Foundation announced an open audit of OpenSSL’s security in March.

Courtesy-TheInq

Is It Game Over For The Playstation Vita?

June 23, 2015 by Michael  
Filed under Gaming

Sony is denying that its PlayStation Vita is dead in the water, despite ignoring it during its E3 2015 presentation.

The slim PlayStation Vita went on sale in February and was greeted with a loud yawn by the handheld gaming community. Since then we have heard very little about it and, like most of the world, Sony did not seem to really care.

PlayStation Europe boss Jim Ryan insisted to GameSpot that the system is still selling well and has around a hundred games in development.

“We’re still selling respectable quantities. We have a hundred games in development, and you might say, ‘Well yeah but they’re all indie games’, but many of these games review very highly. Also the PS4′s Remote Play feature is something that is valued a lot.”

Ryan also insists that the handheld market still exists, despite being gutted by tablets and smartphones.

He admitted that it was not as big as it used to be, but hell what these days is.

“A much smaller market than when the DS and PSP were in their glory days. But that market still does exist,” he added.

Despite his enthusiasm, we don’t hold out much hope.

Courtesy-Fud

Samsung To Fix Swiftkey Smartphone Security Flaw

June 22, 2015 by mphillips  
Filed under Mobile

Samsung agreed to update the security software on its Galaxy smartphones to fix a flaw that researchers warned could let attackers access people’s devices.

Last week, researchers at NowSecure, a mobile security company, identified the flaw in SwiftKey, a keyboard application that comes preloaded on Galaxy smartphones. The flaw could be exploited even when SwiftKey was not used as the default keyboard, NowSecure said.

On Thursday, Samsung said it would issue a fix that would roll out over the coming days to owners of the Galaxy S4, released in 2013, and later models. Those devices have Samsung’s Knox security platform installed by default and can receive over-the-air security policy updates. Users must have automatic updates activated in their phone’s settings, Samsung said on its website.

For earlier Galaxy phones that don’t come with Knox, Samsung said it was working on an expedited firmware update. Availability will vary depending on the model, region and service carrier.

SwiftKey’s app, which predicts words as users type, is also available from the Google Play and Apple App stores. But those versions of the app were not affected by the vulnerability, a SwiftKey spokeswoman said last Thursday.

Oracle Appears To Be Sliding

June 22, 2015 by Michael  
Filed under Computing

Oracle said weak sales of its traditional database software licenses were made worse by a strong US dollar, which lowered the value of foreign revenue.

Shares of Oracle, often seen as a barometer for the technology sector, fell 6 percent to $42.15 in extended trading after the company’s earnings report on Wednesday.

Shares of Microsoft and Salesforce.com, two of Oracle’s closest rivals, were close to unchanged.

Daniel Ives, an analyst at FBR Capital Markets, said that the announcement speaks to the headwinds Oracle is facing in the field as its legacy database business sees slowing growth.

It also shows that, while the cloud business has seen pockets of strength, it is not doing as well as many thought.

Oracle, like other established tech companies, is looking to move its business to the cloud-computing model, essentially providing services remotely via data centres rather than selling installed software.

The 38-year-old company has had some success with the cloud model, but is not moving fast enough to make up for declines in its traditional software sales.

Oracle, along with German rival SAP, has been losing market share in customer relationship management software in recent years to Salesforce.com, which only offers cloud-based services.

Because of lower software sales and the strong dollar, Oracle’s net income fell to $2.76 billion, or 62 cents per share, in the fourth quarter ended May 31, from $3.65 billion, or 80 cents per share, a year earlier.

Revenue fell 5.4 percent to $10.71 billion. Revenue rose 3 percent on a constant currency basis. Analysts had expected revenue of $10.92 billion, on average.

Sales from Oracle’s cloud-computing software and platform service, an area keenly watched by investors, rose 29 percent to $416 million.

Courtesy-Fud

Greek Crisis Driving Bitcoin Surge

June 18, 2015 by mphillips  
Filed under Around The Net

Bitcoin continues to surge and is on track for its longest winning streak in 18 months, as fears that Greece could exit the euro drove speculators and Greek depositors into the decentralized digital currency.

Prime Minister Alexis Tsipras lashed out at Greece’s creditors on Tuesday as he defied a string of warnings that Europe is preparing for a “Grexit”. The debt-stricken country faces 1.6 billion euros ($1.8 billion) in repayments to the International Monetary Fund by the end of June.

Bitcoin, a web-based “cryptocurrency” invented six years ago, is not backed by or controlled by any government or central bank and floats freely, fluctuating according to user demand.

Though bitcoin’s value has previously been highly volatile, it has stabilized over the past six months and is increasingly treated as a legitimate and potentially valuable asset by major financial institutions, and even by governments such as Britain’s.

Joshua Scigala, co-founder of Vaultoro.com, a firm that holds bitcoin for its customers and allows them to exchange it for gold and vice versa, said that Greeks were buying the currency as their trust in the authorities waned. It is also unclear what currency would be used if a Grexit does occur — another potential factor driving Greek demand for bitcoin.

“Some people aren’t waiting for the government to figure out an exit plan and are doing it for themselves,” said Scigala.

“You have people worrying about their families’ wealth or their life savings, and worrying that their money might be locked up in banks … They’d rather hold money in a private asset like gold or bitcoin.”

Scigala said over the past two months, with Greece locked in talks with its creditors, the company had seen a 124 percent pick-up in inflows from Greek IP addresses – numerical labels that identify computers and other internet-enabled devices.

Bitcoin traded as high as $252.05 on the Bitstamp exchange on Tuesday, its strongest in over two months, before easing a little to $245.21, still up around 4 percent on the day. That marked its sixth straight session of gains — its best run since January 2014.
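
Anyone wanting to track the price themselves can query Bitstamp’s public ticker. The sketch below does exactly that; the endpoint and field names reflect Bitstamp’s documented public API and should be treated as assumptions, and no API key is required.

```python
"""Sketch: fetch the current BTC/USD price from Bitstamp's public ticker.

The endpoint and field names are assumptions based on Bitstamp's documented
public API; no authentication is required.
"""
import requests

resp = requests.get("https://www.bitstamp.net/api/v2/ticker/btcusd/", timeout=10)
resp.raise_for_status()
ticker = resp.json()

print(f"Last trade: ${float(ticker['last']):,.2f}")
print(f"24h high:   ${float(ticker['high']):,.2f}")
print(f"24h low:    ${float(ticker['low']):,.2f}")
```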

Microsoft Set To Encrypt All Bing Search Traffic By Default

June 17, 2015 by mphillips  
Filed under Around The Net

Microsoft will beef up Bing’s security when it begins encrypting all of its search traffic by default this summer.

Bing has offered HTTPS encryption for the past year and a half as an opt-in feature, but now Microsoft will default to locking down everybody’s search queries.

Providing encryption gives a new layer of protection to Bing users and helps guard their traffic from snooping.

With this move, Microsoft catches up to its peers in the search market. In 2011, Google began encrypting searches by default for users who were signed in to their Google account. Starting in 2013, the search giant moved all search traffic through HTTPS. Yahoo, Microsoft’s search alliance partner, began encrypting search traffic from its homepage by default in early 2014.

With the switch to encrypted traffic, Microsoft is also changing the way that webmasters get information about searches that lead to their websites. The company will still offer a referrer string so that website operators and marketers can see that the encrypted traffic is coming from Bing, but won’t provide the exact search term that led people to a page.
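
To make the change concrete, the sketch below shows how a site’s analytics might previously have pulled the search term out of a Bing referrer, and what it sees once the term is stripped. The referrer URLs are hypothetical examples.

```python
"""Sketch: extracting a search term from a Bing referrer, and why the move to
encrypted, stripped referrers removes it. The example referrer URLs are
hypothetical."""
from typing import Optional
from urllib.parse import urlparse, parse_qs

def search_term_from_referrer(referrer: str) -> Optional[str]:
    """Return the 'q' query parameter from a Bing referrer, if present."""
    parsed = urlparse(referrer)
    if "bing.com" not in parsed.netloc:
        return None
    return parse_qs(parsed.query).get("q", [None])[0]

# Old-style referrer still carrying the query term (hypothetical example).
print(search_term_from_referrer("http://www.bing.com/search?q=seagate+backup+plus"))
# New-style referrer: confirms the visit came from Bing, but the term is gone.
print(search_term_from_referrer("https://www.bing.com/"))
```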

Instead, Bing Webmaster Tools will continue to provide aggregated keyword and ranking data so that website operators can keep track of what draws users to their websites along with how they compare with the competition. Advertisers will be able to see what search queries triggered their Bing ads using the Search Query Terms Report, which also provides information on other performance metrics like clicks, impressions and conversions.

Online Password Company LastPass Reports Hacking

June 17, 2015 by mphillips  
Filed under Around The Net

LastPass users will be required to change their master passwords after the online password locker company reported that its network was breached last Friday.

The company announced the breach in a blog post Monday after investigating “suspicious activity” discovered by its security team. According to LastPass, the investigation did not reveal any evidence that the attackers stole encrypted data from users’ password vaults, nor did the intruders gain access to LastPass users’ accounts. That said, the attackers were able to steal account email addresses, password reminders, per-user server salts and authentication hashes.

Those last two pieces of data make it possible for the attacker to figure out any weak master passwords, though CEO Joe Siegrist said LastPass’s protections (which include running 100,000 rounds of PBKDF2-SHA256 server-side) make it “difficult to attack the stolen hashes with any significant speed.”
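
Key stretching of that kind is easy to illustrate. The snippet below derives a hash with PBKDF2-SHA256 at two different iteration counts and times each one, showing how higher counts slow every guess an attacker must make. It is a generic sketch, not LastPass’s actual implementation.

```python
"""Sketch: PBKDF2-SHA256 key stretching, as a generic illustration of the kind
of hardening LastPass describes (not its actual implementation)."""
import hashlib
import os
import time

password = b"correct horse battery staple"
salt = os.urandom(16)   # per-user random salt

for rounds in (1_000, 100_000):
    start = time.perf_counter()
    derived = hashlib.pbkdf2_hmac("sha256", password, salt, rounds)
    elapsed = time.perf_counter() - start
    print(f"{rounds:>7} rounds -> {derived.hex()[:16]}... in {elapsed:.3f}s per guess")
```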

Because of that, everyone who uses the service will have to set a new master password for their vault, and anyone logging into their account from a new device or IP address will need to verify their identity via email if they don’t have two-factor authentication enabled.

LastPass lets people store passwords for numerous sites and can then log them into those sites automatically, so they don’t need to remember individual passwords. It includes tools for generating strong passwords with complex character strings, and users need only remember their master password to use the service. But its security, of course, depends on LastPass itself not getting hacked.

LastPass is among the most popular password lockers in use today, in part because the company offers its service for free to users who want to protect their login credentials and other secure information. At this point, it’s not clear how many users are affected by the attack.

Adobe Goes Stock

June 17, 2015 by Michael  
Filed under Around The Net

Adobe has announced a bunch of major updates to its Creative Cloud software, bringing new tools and services in the 2015 edition, including new features in Photoshop CC, Illustrator CC, Premiere Pro CC and InDesign CC.

However, one of the more interesting parts of the announcement is the introduction of the firm’s stock content service called Adobe Stock, which will be integrated into the firm’s Creative Cloud offering.

Adobe said that the new service “radically simplifies buying and using stock content”, including photos, illustrations and graphics.

It’s a growing collection of over 40 million curated, high-quality images that can be accessed by launching Adobe Stock directly within CC desktop software.

Watermarked images can be added to Creative Cloud Libraries, and then used across multiple desktop tools.

“When creatives are ready to license the image for finished work, they can do it directly within the desktop software they are working in,” the firm said.

Adobe Stock will be available as a standalone stock service too, but for a price. This will allow designers and marketers who are not yet Creative Cloud members to download, purchase and sell stock images.

The release of Adobe Stock is bound to shake up the stock image market, as Adobe customers are active contributors to stock image services and regular purchasers of stock content. Adobe is very aware of this.

“An estimated 85 percent of creatives who buy stock content use Adobe tools, and more than 90 percent of stock content sellers use Adobe software in the preparation of their photos and images,” said Adobe.

Creatives can also contribute work to Adobe Stock, and the software will offer rates to photographers and designers contributing content to the growing database.

Adobe Stock is available today in 36 countries and 13 languages, but it doesn’t come cheap. Pricing for Creative Cloud individual and team customers is £7.19 for a single image, £23.99 per month for 10 images, and £143.99 per month for 750 images.

Separate prices are available for Adobe Stock customers who are not Creative Cloud members, Adobe said.
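
A quick back-of-the-envelope comparison of the Creative Cloud tiers quoted above, assuming the full monthly allowance is used (simple division, not Adobe’s published per-image rates):

```python
# Effective per-image cost of the quoted Adobe Stock tiers for Creative Cloud
# members, assuming the full monthly allowance is used (simple division only).
tiers = {
    "single image": (7.19, 1),
    "10 images/month": (23.99, 10),
    "750 images/month": (143.99, 750),
}

for name, (price_gbp, images) in tiers.items():
    print(f"{name:>16}: £{price_gbp / images:.2f} per image")
```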

The updates across Adobe’s existing Creative Cloud software include new features as well as speed upgrades across 15 CC desktop applications.

One highlight is the 25-year-old Photoshop CC, which introduces Artboards, a way to design cross-device user experiences in a single Photoshop document and quickly preview them on a device.

This new version also includes a preview release of Photoshop Design Space, a work environment focused on the needs of mobile app and website designers.

Lightroom CC and Photoshop CC also gain a new Dehaze feature which eliminates fog and haze from photos, including underwater shots, for clearer images.

Premiere Pro CC sees the addition of a Lumetri Color panel for better colour correction using sliders and other simple controls, as well as Morph Cut, which makes it easier to deliver polished interview content by smoothing out jump cuts in talking-head shots.

Meanwhile, Adobe said that Illustrator CC is now 10 times faster and 10 times more precise than CS6 owing to a boost to its Mercury Performance Engine, which means users can now pan and zoom smoothly without delays.

Courtesy-TheInq