Ransomware Now Has Customer Service

July 31, 2017 by  
Filed under Around The Net

Hackers behind some of the most infamous ransomware out there are taking some hints from legit Wall Street companies.

Malware strains like Locky and Cerber helped make ransomware a $25 million industry in 2016, and its operators are starting to behave like conventional corporations, with “customer” service staff and outsourced resources, researchers explained Wednesday at Black Hat.

Ransomware has devastated hospitals, universities, banks, and essentially any computer network with weak security over the last 10 years, but attacks have become even more prevalent as infection rates and payments grow. The malware encrypts files on a victim’s computer and demands a payment, in one case as much as $1 million, if the victim ever wants the data back.

Researchers at Google, Chainalysis, New York University and University of California San Diego followed the money trail and got a look at the evolving ecosystem of ransomware. During the presentation at the Las Vegas conference, the team showed a new professional side to ransomware.

Instead of working as criminals, ransomware attackers are treating their victims as “customers” and bringing in support staff to deal with their “sales.” Yes, just as your phone provider and bank have customer service, so now does ransomware.

“It’s become a well-oiled machine,” said Elie Bursztein, Google’s anti-abuse research team lead. “It operates like a real company, that shows how mainstream it’s become and how much it’s here to stay.”

Customer service reps help victims find out how to buy cryptocurrency, like bitcoin, to pay the ransom and negotiate with victims to decrypt specific files. They also offer immunity packages to ensure victims can’t get hit again.

Bursztein said the development has been staggering, as ransomware has evolved into organized crime. Cybercriminals have even hired graphic designers to give their websites and malware a more inviting aesthetic.

Google’s research team also found that ransomware attackers have been outsourcing much of the heavy lifting to massive botnets to get people infected. Locky and Cerber both rented the Necurs botnet to spam millions of emails in the hope of spreading their ransomware around the world.

The outsourcing paid off, as Locky made $7.8 million in 2016, while Cerber raked in $6.9 million that year.

Cerber also lets criminals who can’t code malware get in on the action by renting out its ransomware, Bursztein said. Low-tech crooks can buy Cerber’s ransomware as a service and take a cut based on how many people they’ve infected.

The strategy helped Cerber earn more than $200,000 a month and become the fastest-rising ransomware of 2017.

“Ransomware as a service has become a dominant model,” Bursztein said. “All you have to do is infect people, and then you get a cut.”

The researchers also found new variations of the Cerber ransomware that have been tweaked to get past anti-virus scanners. In 2017 there have been 23,000 new binaries for the Cerber ransomware, while Locky has had 6,000 new variations.

Hackers are working around the clock to stay ahead of the competition and make as much money as possible. These sophisticated attacks, with business-minded infrastructure, make ransomware like WannaCry and NotPetya, which last month locked up devices at multibillion-dollar companies, look like impostors.

IBM Will Use Apache Spark To Find E.T.

October 2, 2015 by  
Filed under Computing

IBM is using Apache Spark to analyse radio signals for signs of extra-terrestrial intelligence.

Speaking at Apache: Big Data Europe, Anjul Bhambhri, vice president of big data products at IBM, talked about how the firm has thrown its weight behind Spark.

“We think of [Spark] as the analytics operating system. Never before have so many capabilities come together on one platform,” Bhambhri said.

Spark is a key project because of its speed and ease of use, and because it integrates seamlessly with other open-source components, Bhambhri explained.

“Spark is speeding up even MapReduce jobs, even though they are batch oriented, by two to six times. It’s making developers more productive, enabling them to build applications in less time and with fewer lines of code,” she claimed.
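The “fewer lines of code” point is easy to see even without a cluster. The sketch below is plain Python, not Spark: it computes the same word count twice, once with the explicit map/shuffle/reduce phases a MapReduce job spells out, and once as the kind of single chained expression that Spark’s RDD API (flatMap, map, reduceByKey) encourages. The input lines are invented sample data.

```python
from collections import Counter, defaultdict

lines = ["to be or not to be", "to do is to be"]

# MapReduce style: explicit map, shuffle, and reduce phases.
mapped = [(word, 1) for line in lines for word in line.split()]

shuffled = defaultdict(list)
for key, value in mapped:
    shuffled[key].append(value)

counts_mr = {key: sum(values) for key, values in shuffled.items()}

# Spark-style: the same computation collapsed into one expression, in the
# spirit of textFile(...).flatMap(split).map(w -> (w, 1)).reduceByKey(+).
counts_pipeline = dict(Counter(word for line in lines for word in line.split()))

assert counts_mr == counts_pipeline
print(counts_mr["to"])  # "to" appears four times across the two lines
```

On a real cluster the speedup Bhambhri cites comes mainly from Spark keeping intermediate data in memory between stages, where MapReduce writes it to disk.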

She revealed IBM is working with Nasa and Seti to analyse radio signals for signs of extra-terrestrial intelligence, using Spark to process the 60Gbit of data generated per second by various receivers.

Other applications IBM is working on with Spark include genome sequencing for personalised medicine via the Adam project at UC Berkeley in California, and early detection of conditions such as diabetes by analysing patient medical data.

“At IBM, we are certainly sold on Spark. It forms part of our big data stack, but most importantly we are contributing to the community by enhancing it,” Bhambhri said.

The Apache: Big Data Europe conference also saw Canonical founder Mark Shuttleworth outline some of the key problems in starting a big data project, such as simply finding engineers with the skills needed just to build the infrastructure for operating tools such as Hadoop.

“Analytics and machine learning are the next big thing, but the problem is there are just not enough ‘unicorns’, the mythical technologists who know everything about everything,” he explained in his keynote address, adding that the blocker is often just getting the supporting infrastructure up and running.

Shuttleworth went on to demonstrate how the Juju service orchestration tool developed by Canonical could solve this problem. Juju enables users to describe the end configuration they want, and will automatically provision the servers and software and configure them as required.
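As a sketch of that declarative style, a Juju session might look like the following; the charm names here are illustrative assumptions rather than a tested recipe.

```shell
# Describe the end state; Juju provisions and configures the machines.
# (Charm names below are hypothetical examples.)
juju deploy hadoop-master
juju deploy -n 3 hadoop-slave
juju add-relation hadoop-master hadoop-slave
juju status    # watch the units come up and converge
```

The point Shuttleworth is making is that the operator states *what* should exist, not *how* to build it, so the "unicorn" infrastructure knowledge lives in the charm rather than in the team.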

This could be seen as a pitch for Juju, but Shuttleworth’s message was that the open-source community is delivering tools that can manage the underlying infrastructure so that users can focus on the application itself.

“The value creators are the guys around the outside who take the big data store and do something useful with it,” he said.

“Juju enables them to start thinking about the things they need for themselves and their customers in a tractable way, so they don’t need to go looking for those unicorns.”

The Apache community is working on a broad range of projects, many of which are focused on specific big data problems, such as Flume for handling large volumes of log data or Flink, another processing engine that, like Spark, is designed to replace MapReduce in Hadoop deployments.



AMD Demos Hadoop On ARM

October 3, 2014 by  
Filed under Computing

AMD announced that it would demonstrate the first implementation of Apache Hadoop on an ARM Cortex-A57 part at the JavaOne conference.

The chip in question is of course an A-series Opteron. AMD recently announced the Opteron A1100 and it is the company’s first ARM-based server part.

The presentation was delivered by AMD corporate fellow Leendert van Doorn and Henrik Stahl, VP of Java product management and IoT at Oracle.

“This demonstration showcases AMD’s leadership in the development of a robust, standards-based ecosystem for ARM servers,” said van Doorn. “Servers powered by AMD Opteron A-Series processors are well-suited for Hadoop, offering an efficient scale-out compute platform that can also double as an economical persistent storage platform.”

The demo showed an A1100 dev platform running Apache Hadoop on the Oracle JDK. AMD said it would continue its collaboration with ARM, Oracle, Red Hat, Linaro and SUSE in order to boost ARM development in the server space.


Experts Calling On The Government To Act Against Cyber Threats

August 12, 2014 by  
Filed under Computing

Alarmed by the frequency of cyber threats around the world and across industries, a growing number of security experts see aggressive government action as the best hope for averting disaster.

Even though some experts are outraged by the extent of U.S. Internet spying exposed by former NSA contractor Edward Snowden, they are even more concerned about technologically sophisticated enemies using malware to sabotage utilities, wipe out data stored on computer drives, and steal defense and trade secrets.

Such fears and proposals on new laws and executive action to counter these threats were core topics this week in Las Vegas at Black Hat and Def Con, two of the world’s largest gatherings for security professionals and hackers.

At Black Hat, the keynote speech by respected researcher Dan Geer went straight for national and global policy issues. He said the U.S. government should require detailed reporting on major cyber breaches, in the same way that deadly diseases must be reported to the Centers for Disease Control and Prevention.

Critical industries should be subjected to “stress tests” like the banks, Geer said, so regulators can see if they can survive without the Internet or with compromised equipment.

Geer also called for exposing software vendors to product liability suits if they do not share their source code with customers and bugs in their programs lead to significant losses from intrusion or sabotage.

“Either software houses deliver quality and back it up with product liability, or they will have to let their users protect themselves,” said Geer, who works for In-Q-Tel, a venture capital firm serving U.S. intelligence agencies. Geer said he was speaking on his own behalf.

“The current situation – users can’t see whether they need to protect themselves and have no recourse to being unprotected – cannot go on,” he said.

Several of Geer’s proposals are highly ambitious given the domestic political stalemate and the opposition of major businesses and political donors to new regulation, Black Hat attendees said. In an interview, Geer said he had seen no encouraging signs from the White House or members of Congress.

But he said the alternative would be waiting until a “major event” that he hoped would not be catastrophic.

Chris Inglis, who retired this year as deputy director of the National Security Agency, said disaster could be creeping instead of sudden, as broad swaths of data become unreliable.

In an interview, he said some of Geer’s ideas, including product liability, deserved broader discussion.

“Doing nothing at all is a worse answer,” said Inglis, who now advises security firm Securonix.


Apache Goes To Hadoop Clusters

June 30, 2014 by  
Filed under Computing

Apache Spark, a high-speed analytics engine for the Hadoop distributed processing framework, is now available to plug into the YARN resource management tool.

This development means that it can now be easily deployed along with other workloads on a Hadoop cluster, according to Hadoop specialist Hortonworks.

Released as version 1.0.0 at the end of May, Apache Spark is a high-speed engine for large-scale data processing, created with the aim of being much faster than Hadoop’s better-known MapReduce function, but for more specialised applications.

Hortonworks vice president of Corporate Strategy Shaun Connolly told The INQUIRER, “Spark is a memory-oriented system for doing machine learning and iterative analytics. It’s mostly used by data scientists and high-end analysts and statisticians, making it a sub-segment of Hadoop workloads but a very interesting one, nevertheless.”

As a relatively new addition to the Hadoop suite of tools, Spark is getting a lot of interest from developers using the Scala language to perform analysis on data in Hadoop for customer segmentation or other advanced analytics techniques such as clustering and classification of datasets, according to Connolly.

With Spark certified as YARN-ready, enterprise customers will be able to run memory and CPU-intensive Spark applications alongside other workloads on a Hadoop cluster, rather than having to deploy them in a separate cluster.

“Since Spark has requirements that are much heavier on memory and CPU, YARN-enabling it will ensure that the resources of a Spark user don’t dominate the cluster when SQL or MapReduce users are running their application,” Connolly explained.
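In practice that isolation starts at submission time: the job is handed to YARN with explicit caps on what it may consume. A minimal, illustrative spark-submit invocation follows; the script name and resource figures are arbitrary examples, not recommendations.

```shell
# Submit a Spark application to an existing YARN cluster rather than a
# dedicated Spark cluster; the flags bound what YARN will grant it, so
# SQL and MapReduce tenants keep their share of the cluster.
spark-submit \
  --master yarn \
  --num-executors 4 \
  --executor-memory 4g \
  --executor-cores 2 \
  my_analysis.py
```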

Meanwhile, Hortonworks is also collaborating with Databricks, a firm founded by the creators of Apache Spark, in order to ensure that new tools and applications built on Spark are compatible with all implementations of it.

“We’re working to ensure that Apache Spark and its APIs and applications maintain a level of compatibility, so as we deliver Spark in our Hortonworks Data Platform, any applications will be able to run on ours as well as any other platform that includes the technology,” Connolly said.


Additional Security Concerns Keep Focus on eBay

May 29, 2014 by  
Filed under Around The Net

EBay’s security team could be in for a bumpy ride.

Following an attack disclosed last week that exposed sensitive information of up to 145 million people, the auction giant is scrambling to repair several other problems reported in its vast network by security enthusiasts.

“As a company, we take all vulnerabilities reported to us very seriously, evaluating any reported issue within the context of our entire security infrastructure,” wrote Ryan Moore, lead manager of eBay’s business communications, in an email to IDG News Service.

EBay has long been a target for cybercriminals. It is the seventh most visited site in the U.S., according to statistics from Amazon’s Alexa Web analytics unit. Its combination of a marketplace and payments platform, PayPal, means it holds sensitive data and presents opportunities for fraudsters.

Three U.S. states — Connecticut, Florida and Illinois — are jointly investigating eBay’s data breach, a sign that regulators and law enforcement are taking a keen interest in how consumer data is protected following Target’s data breach last year.

EBay’s size puts it in the league of companies such as Facebook, Google and Microsoft. All run large networks constantly prodded by “black hat” hackers, those who are seeking to damage a company or profit from attacks, and “white hats,” who alert companies to problems.

Yasser Ali, a 27-year-old who lives in Luxor, Egypt, said it took him all of three minutes last week to find a serious vulnerability that could let him take over anyone’s eBay account if he knows a person’s user name, which is public information.

Ali shared a video with eBay showing how the flaw could be exploited, he said in a phone interview Tuesday night. He hasn’t received a response from eBay, but said the video was viewed by company officials 17 times, according to a statistics counter on the clip. Moore said eBay has now fixed the bug, and Ali plans to release details of it.



Does Apache Need To Be Patched?

April 30, 2014 by  
Filed under Computing

Apache Software Foundation released an advisory warning that a patch issued in March for a zero-day vulnerability in Apache Struts did not fully fix the bug. Apparently, the patch for the patch is in development and will likely be released within the next 72 hours.

Rene Gielen of the Apache Struts team said that once the release is available, all Struts 2 users are strongly recommended to update their installations. In the meantime, ASF provided a temporary mitigation that users are urged to apply. On March 2, a patch was made available for a ClassLoader vulnerability affecting earlier versions of Struts; all it took was an attacker manipulating the ClassLoader via request parameters. However, Apache admitted that its fix was insufficient to repair the vulnerability. An attacker exploiting the vulnerability could also cause a denial-of-service condition on a server running Struts 2.

The advisory explains: “The default upload mechanism in Apache Struts 2 is based on Commons FileUpload version 1.3 which is vulnerable and allows DoS attacks. Additional ParametersInterceptor allows access to ‘class’ parameter which is directly mapped to getClass() method and allows ClassLoader manipulation.”
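The underlying bug class is worth a small illustration. A framework that reflectively binds dotted request-parameter names onto object properties will, unless it filters names, walk straight into class.* and reach internals it never meant to expose; Apache’s mitigation was essentially a denylist on parameter names. The toy below is Python invented for illustration, not Struts code (in Java the dangerous hop is getClass()).

```python
import re

# Parameter names matching this pattern are refused, mirroring the kind of
# name denylist shipped as a mitigation for the Struts bug.
DENYLIST = re.compile(r"(^|\.)class(\.|$)", re.IGNORECASE)

class Account:
    def __init__(self):
        self.owner = "alice"

def bind_parameter(target, name, value, *, denylist=None):
    """Naively walk a dotted parameter name and set the final attribute,
    the way a reflective parameter interceptor binds request input."""
    if denylist and denylist.search(name):
        raise ValueError(f"parameter {name!r} rejected")
    *path, leaf = name.split(".")
    for part in path:
        target = getattr(target, part)  # in Java, "class" reaches getClass()
    setattr(target, leaf, value)

acct = Account()
bind_parameter(acct, "owner", "mallory", denylist=DENYLIST)  # legitimate
assert acct.owner == "mallory"

blocked = False
try:
    bind_parameter(acct, "class.something", "x", denylist=DENYLIST)
except ValueError:
    blocked = True
assert blocked
```

The fix that proved insufficient was, in effect, a too-narrow version of this pattern: it rejected some dangerous names but not every path that ultimately reached the ClassLoader.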

It will be the third time that Struts has been updated this year. In February, the Apache Struts team urged developers to upgrade Struts 2-based projects to use a patched version of the Commons FileUpload library to prevent denial-of-service attacks.



The U.S. Is Not The Worst Cyber Snooper

June 24, 2013 by  
Filed under Around The Net

The Indian government’s cyber snooping program is becoming so pervasive that it makes the US Prism operation look harmless. India is giving its security agencies, and even income tax officials, the ability to tap directly into e-mails and phone calls without oversight by courts or parliament, several sources said.

The excuse is that the move will help safeguard national security, because that excuse is always trotted out when governments do evil things. The Central Monitoring System (CMS) was announced in 2011 but there has been no public debate and the government has said little about how it will work or how it will ensure that the system is not abused.

The government started to quietly roll the system out state by state in April this year, according to government officials. Eventually it will be able to target any of India’s 900 million landline and mobile phone subscribers and 120 million Internet users.

Cynthia Wong, an Internet researcher at New York-based Human Rights Watch said that if India doesn’t want to look like an authoritarian regime, it needs to be transparent about who will be authorized to collect data, what data will be collected, how it will be used, and how the right to privacy will be protected.


Chinese Hackers Appear To Be At It Again

May 22, 2013 by  
Filed under Around The Net

Three months after hackers working for a cyberunit of China’s People’s Liberation Army went silent, they appear to have resumed their attacks using different techniques.

The Obama administration had bet that “naming and shaming” the groups, first in industry reports and then in the Pentagon’s own detailed survey of Chinese military capabilities, might prompt China’s new leadership to crack down on the military’s team of hackers. But it appears that Unit 61398 is back in business, according to American officials and security companies.

Mandiant, a private security company that helps companies and government agencies defend themselves from hackers, said the attacks had resumed but would not identify the targets, though they were many of the same ones the unit had attacked before. Mandiant said that the Chinese hackers had stopped their attacks after they were exposed in February and removed their spying tools from the organisations they had infiltrated.

But in the last two months, they have begun attacking the same victims from new servers and have reinserted many of the tools that enable them to seek out data without detection. The subject of Chinese attacks is expected to be a central issue in an upcoming visit to China by President Obama’s national security adviser, Thomas Donilon. However, little is expected to come of it; the Chinese have always denied that they have hacked anyone, ever.


Anonymous Went After North Korea Again

April 16, 2013 by  
Filed under Around The Net

Anonymous has restarted its attack against North Korea and once again is using a North Korean Twitter account to announce website scalps.

The Twitter account @uriminzok was the scene of announcements about the hacked websites during the last stage of Op North Korea, and reports have tipped up there again.

The first wave of attacks saw a stream of websites defaced or altered with messages or images that were very much not in favour of the latest North Korean hereditary leader, Kim Jong-un.

They were supported by a Pastebin message signed by Anonymous that called for some calming of relations between North Korea and the US, and warned of cyber attacks in retaliation.

“Citizens of North Korea, South Korea, USA, and the world. Don’t allow your governments to separate you. We are all one. We are the people. Our enemies are the dictators and regimes, our goals are freedom and peace and democracy,” read the statement. “United as one, divided by zero, we can never be defeated!”

Before the attacks restarted, the last Twitter message promised that more was to come. It said, “OpNorthKorea is still to come. Another round of attack on N.Korea will begin soon.” Anonymous began delivering on that threat in the early hours this morning.

More of North Korean websites are in our hand. They will be brought down.

— uriminzokkiri (@uriminzok) April 15, 2013

We’ve counted nine websites downed, defacements and hacks, and judging by the stream of confirmations they happened over a two hour period. No new statement has been released other than the above.…

— uriminzokkiri (@uriminzok) April 15, 2013

Downed websites include the glorious, a North Korean news destination. However, when we tried it we had intermittent access.

Last time around the Anonymous hackers had taken control of North Korea’s Flickr account. This week we found the message, “This member is no longer active on Flickr.”


Anonymous Latest CyberAttack Fails

April 10, 2013 by  
Filed under Computing

A cyberattack campaign dubbed #OpIsrael by the hacking group Anonymous failed to bring down Israeli government websites over the weekend.

Yitzhak Ben Yisrael, of the government’s National Cyber Bureau, said that while the attack did take place, it did hardly any damage. Ben Yisrael said that Anonymous lacked the skills to damage the country’s vital infrastructure, and if that had been its intention, it wouldn’t have announced the attack beforehand.

“It wants to create noise in the media about issues that are close to its heart,” he said, as quoted by the Associated Press news agency.

Posters using the name of the hacking group Anonymous had warned they would launch a massive attack on Israeli sites in a strike they called #OpIsrael starting April 7. Last week, a leading hacker going by the handle of “Anon Ghost” said that “the hacking teams have decided to unite against Israel as one entity…Israel should be getting prepared to be erased from the Internet,” according to Israeli media reports.

Israel’s Bureau of Statistics was down on Sunday morning but it was unclear if it had been hacked. The Defense and Education Ministries as well as banks had come under attack the night before, but security shrugged it off. Anonymous did have a crack at the stock market website and the Finance Ministry website, but no one there noticed.

Where Anonymous was successful was when it targeted small businesses. Some homepage messages were replaced with anti-Israel slogans, media said. Israeli hackers hit sites of radical Islamist groups and splashed them with pro-Israel messages.


Are Smart TVs Next For Hackers?

March 1, 2013 by  
Filed under Consumer Electronics

The growing variety of smart devices is bringing with it glaring holes in network security, according to researchers.

Speaking at the 2013 RSA conference in San Francisco, Cylance CEO Stuart McClure noted how devices ranging from industrial controllers to smart television sets can be manipulated to act as gateways to corporate networks and facilities.

McClure demonstrated a number of attacks that used relatively simple and low-tech processes to exploit smart devices and manipulate both the devices themselves and the networks that connect them.

Some of the exploits used uncommon means for accessing networks. Researchers showed how a common universal remote could be modified to access the infrared port on a smart TV and manipulate network security settings. When the settings were disabled, the researchers then accessed the TV from a PC and from there viewed the network itself.

In a second demonstration, the researchers described how an attacker can use web controls to access industrial control systems. By exploiting first a privilege escalation flaw and then a second vulnerability, an attacker can gain control over industrial control hardware and manipulate software and network credentials, or cause real-world damage by instructing the unit to operate in unsafe conditions.

McClure said that part of the problem is the nature of smart devices themselves. In bolting network technology onto traditionally solitary devices, vendors have not only neglected security but in making devices accessible they have also created new opportunities for abuse.

“They say these are features, that we designed it this way,” McClure said.

“I say yes, but features can kill.”

Other hacking techniques can compromise companies with little to no technology. Cylance researchers showed how an attacker can exploit the emergency key lock-box units on facilities by duplicating the regional keys used by police and fire departments. In such a scenario an attacker would be able to unlock a facility and potentially steal hardware or intellectual property without triggering alarm systems.

McClure said that while the prospects for securing embedded systems can at first seem daunting, in many cases simple solutions can secure the devices. Methods ranging from electrical tape over the infrared ports on TV sets to connecting lock boxes with fire and security alarms can thwart the attacks described by researchers.

The key to securing embedded systems, said McClure, is for firms to change their thinking and open their eyes to the vulnerabilities around them.

“What we are proposing is to look back at prevention being first, we just need to get back to that mindset,” he explained.

“Being able to choke it at that point and having a secure process for managing all the inputs, you will go a long way to preventing all these attacks.”



Intel Goes Apache Hadoop

February 28, 2013 by  
Filed under Computing

Intel has released its Apache Hadoop distribution, claiming significant performance benefits through its hardware and software optimisation.

Intel’s push into the datacentre has largely been visible with its Xeon chips but the firm works pretty hard on software as well, including contributing to open source projects such as the Linux kernel and Apache’s Hadoop to ensure that its chips win benchmark tests.

Now Intel has released its Apache Hadoop distribution, the third major revision of its work on Hadoop, citing significant performance benefits and claiming it will open source much of its work and push it back upstream into the Hadoop project.

According to Intel, most of the work it has done in its Hadoop distribution is open source; however, the firm said it will retain the source code for the Intel Manager for Apache Hadoop, the cluster management part of the distribution. Intel said it will use this to offer support services to datacentres that deploy large Hadoop clusters.

Boyd Davis, VP and GM of Intel’s Datacentre Software Division said, “People and machines are producing valuable information that could enrich our lives in so many ways, from pinpoint accuracy in predicting severe weather to developing customised treatments for terminal diseases. Intel is committed to contributing its enhancements made to use all of the computing horsepower available to the open source community to provide the industry with a better foundation from which it can push the limits of innovation and realise the transformational opportunity of big data.”

Intel trotted out some impressive industry partners that it has been working with on the Hadoop distribution. While the firm’s direct income from the distribution will come from support services, the indirect income from Xeon chip sales is likely what Intel is most looking towards as Hadoop adoption grows to manage the extremely large data sets that the industry calls “big data”.


Dell Links Up With The Apache Foundation

October 26, 2012 by  
Filed under Computing

Dell is offering access to its Zinc ARM based server to the Apache Software Foundation for development and testing purposes.

Dell had already shown off its Copper ARM based server earlier this year and said it intends to bring ARM servers to market “at the appropriate time”. Now the firm has allowed the Apache Software Foundation access to another Calxeda ARM based server codenamed Zinc.

Dell’s decision to give the Apache Software Foundation access to the hardware is not surprising as it is the organisation that oversees development of the popular Apache HTTPD, Hadoop and Cassandra software products, all applications that are widely regarded as perfect for ARM based servers. The firm said its Zinc server is accessible to all Apache projects for the development and porting of applications.

Forrest Norrod, VP and GM of Server Solutions at Dell said, “With this donation, Dell is further working hand-in-hand with the community to enable development and testing of workloads for leading-edge hyperscale environments. We recognize the market potential for ARM servers, and with our experience and understanding of the market, are enabling developers with systems and access as the ARM server market matures.”

Dell didn’t give any technical details on its Zinc server and said it won’t be generally available. However, the firm reiterated its goal of bringing ARM based servers to the market. Given that it is helping the Apache Foundation, a good indicator of ARM server viability will be when the Apache web server project has been ported to the ARM architecture and has matured to production status.




IBM Goes After Apache’s Tomcat

May 3, 2012 by  
Filed under Computing

Java developers looking for a mobile-friendly platform could be happy with the next release of IBM’s Websphere Application Server, which is aimed at offering a lighter, more dynamic version of the application middleware.

Shown off at the IBM Impact show in Las Vegas on Tuesday, Websphere Application Server 8.5, codenamed Liberty, has a footprint of just 50MB. This makes it small enough to run on machines such as the Raspberry Pi, according to Marie Wieck, GM for IBM Application and Infrastructure Middleware.

Updates and bug fixes can also be done on the fly with no need to take down the server, she added.

The Liberty release will be launched this quarter, and already has 6,000 beta users, according to Wieck.

John Rymer of Forrester said that the compact and dynamic nature of the new version of Websphere Application Server could make it a tempting proposition for Java developers.

“If you want to install version seven or eight, it’s a big piece of software requiring a lot of space and memory. The installation and configuration is also tricky,” he explained.

“Java developers working in the cloud and on mobile were moving towards something like Apache Tomcat. It’s very light, starts up quickly and you can add applications without having to take the system down. IBM didn’t have anything to respond to that, and that’s what Liberty is.”

For firms needing to update applications three times a year, for example, the dynamic capability of Liberty will make it a much easier process.

“If developers want to run Java on a mobile device, this is good,” Rymer added.

The new features are also backwards compatible, meaning current Websphere users will be able to take advantage of the improvements.

However, IBM could still have difficulty competing in the app server space on a standalone basis, according to Rymer.

“Red Hat JBoss costs considerably less, and there’s been an erosion for IBM as it’s lost customers to Red Hat and Apache. Liberty might have an effect here,” he said.

“But IBM wins where the customer isn’t just focused on one product. It will never compete on price, but emphasises the broader values of a platform or environment.”

IBM will be demoing Websphere running on Raspberry Pi at Impact today.



