Espionage Hacking On The Increase, According To Verizon Study

April 23, 2014 by mphillips  
Filed under Around The Net

Hacking for espionage purposes is drastically rising, with groups or national governments from Eastern Europe playing a growing role, according to one of the most comprehensive annual studies of computer intrusions.

Spying intrusions traced back to any country in 2013 were blamed on residents of China and other East Asian nations 49 percent of the time, but Eastern European countries, especially Russian-speaking nations, were the suspected launching sites for 21 percent of breaches, Verizon Communications Inc said in its annual Data Breach Investigations Report.

Those were by far the most active areas detected in the sampling, which drew more than half of its data from victims in the United States. About 25 percent of spying incidents could not be attributed to attackers from any country, according to the authors of the report.

Though the overall number of spying incidents studied tripled to 511 from the total in the 2013 Verizon report, most of that increase is due to the addition of new data sources. Even among the same contributors as before, however, espionage cases grew, said Verizon investigator Bryan Sartin.

Not all electronic spying was blamed on governments. Investigators from Verizon, Intel Corp’s McAfee, Kaspersky Labs and other private companies and public agencies contributing data ascribed 11 percent of espionage attacks to organized criminals and 87 percent to governments.

In some cases, the criminal gangs were probably looking to sell what they found to governments or competitors of the victims.

“We do see a slight merging between the classic organized criminal and the espionage crook,” Sartin said, adding that he expected that trend to continue.

If the rise of detected Eastern European spying comes as a surprise to those mainly familiar with accusations against China, a bigger surprise might be the study’s findings about attacks on retailers.

Though recent breaches at Target Corp and other retailers through their point-of-sale equipment have dominated the headlines and prompted congressional hearings in the past few months, fewer such intrusions have been reported to the Verizon team than in past years, even as the number of report contributors has multiplied.

“The media frenzy makes quite a splash, but from a frequency standpoint, this largely remains a small-and-medium business issue,” the study says.

 

Intel And MediaTek Gain Traction In China

April 23, 2014 by Michael  
Filed under Computing

Intel and MediaTek don’t have much in common, but it appears that they are locking horns in China, in the white-box tablet business of all places. Both companies are vying for a slice of the booming white-box tablet space, which is starting to resemble the vanilla PC market of the eighties.

MediaTek drew first blood last November, when it announced plans for the introduction of several tablet-centric chips. The company is apparently planning to double its tablet SoC shipments this year for a grand total of more than 40 million units. Intel is doing the exact same thing. It wants to quadruple its tablet SoC shipments and hit the 40 million mark, too.

However, that’s where the similarities end. The two companies are going about it in very different ways, and their processors aren’t what we would call similar, but there is still plenty of overlap.

Intel contra revenue vs. MediaTek organic growth

Intel’s only hope of getting into the cutthroat white-box space is through generous deals offered to vendors who choose Intel parts over the competition. The strategy is working, but at the same time it is also costing Intel a lot of cash. Analysts believe Intel could burn as much as $1bn on tablet subsidies this year, although the chipmaker really doesn’t like it when people use the dirty S-word.

MediaTek is taking a different approach. It is rolling out a number of value chips, ranging from quad- and octa-core Cortex A7 parts to mid-range and even performance parts based on A12, A15 and A17 cores, including big.LITTLE designs.

It appears that both strategies are working. Digitimes reports Intel and MediaTek are getting a lot of love from Chinese tablet makers. MediaTek has competitive products and it brings 3G and 4G support to the table. Intel’s subsidies are also doing the trick – and luckily Intel has some good tablet parts to offer, which wasn’t the case in the past.

Intel’s Atom Z2520 and Z3735G appear to be the main weapons in the chipmaker’s push behind the Bamboo Curtain. A 7-inch Intel tablet can leave the factory for as little as $50, while a MediaTek 3G-enabled white-box tablet has an ex-factory price of $39.90. Demand for Intel and MediaTek solutions is going up, according to industry sources.
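Some back-of-the-envelope arithmetic (ours, not from the industry sources) shows why the subsidies sting: spreading the analysts’ $1bn estimate across Intel’s 40-million-unit target implies a per-tablet outlay equal to half the quoted $50 ex-factory price.

```python
# Rough per-unit subsidy estimate, assuming the $1bn analyst figure
# and Intel's 40-million-unit tablet target are both on the mark.
subsidy_total = 1_000_000_000   # analyst estimate, USD
units_target = 40_000_000       # Intel's stated 2014 goal

subsidy_per_unit = subsidy_total / units_target
print(subsidy_per_unit)                      # 25.0 USD per tablet

# Compare with the quoted ex-factory prices.
intel_tablet_exw = 50.0     # 7-inch Intel tablet, USD
mediatek_tablet_exw = 39.9  # MediaTek 3G white-box tablet, USD
print(subsidy_per_unit / intel_tablet_exw)   # 0.5 -> half the device's ex-factory price
```

On those assumptions, every subsidized $50 tablet carries roughly $25 of Intel money, which is why the dirty S-word keeps coming up.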

What about the competition?

Chinese white-box outfits tended to use Rockchip and Allwinner parts, along with chips from Amlogic and smaller chip designers. The companies are fighting back, but they don’t appear to be having much success.

Rockchip recently rolled out a new quad-core SoC, Allwinner has the octa-core A80, while Amlogic is talking up its M802, with UHD/4K support – not that it’s very relevant for white-box tablets.

What about the big players? Samsung is not interested nor does it have any SoCs that would fit the bill for white-box tablets. Nvidia is focusing on high-end SoCs with powerful graphics, overkill for cheap tablets. Qualcomm, the elephant in the room, is going after smartphones, with affordable 4G-enabled parts.

AMD’s Temash parts are out of the running, too. They will soon be replaced by Mullins APUs, but AMD does not want to pursue the low-end tablet market. During the company’s latest conference call CEO Rory Read criticised Intel’s contra revenue approach, saying that it’s “foreign” to AMD. Of course, AMD knows a thing or two about Intel subsidies and it simply does not want to go toe to toe with Intel, not when there’s no level playing field.

Intel started talking about $99 tablets last year and some analysts were baffled by the company’s decision to join the SoC race to the bottom. Why bother with a high-volume, low-margin market that can only be conquered with quarterly subsidies in the hundreds of millions dollars? It still looks like a strange market for Intel to compete in, but the sheer amount of money and effort involved in the company’s tablet push indicates that this was a strategic decision rather than a sideshow designed to appease investors and analysts.

Intel knows what it’s doing. It’s waging a proxy war against the ARM alliance and it’s picking its fights wisely. Going after a potentially huge virgin market controlled by relatively small players should be easy, Intel could have gotten away with it almost uncontested had it not been for MediaTek. However, going after white-box tablets is still a lot easier than trying to enter the incredibly competitive smartphone SoC business, especially now that Apple, Samsung and LG are developing in-house SoC designs.

For Intel, Chinese white-box tablets are a back door, the easiest way to boost market share and enter this segment without taking on the biggest players.


Courtesy-Fud

Alibaba Set To Offer Mobile Phone Service Beginning In May

April 22, 2014 by mphillips  
Filed under Mobile

Alibaba’s Tmall and Taobao sites already sell everything from clothes and furniture to car tires and medicines. But soon they’ll also be offering 3G data and voice call plans as well, according to the Chinese tech giant.

User registration for mobile phone numbers will begin in May.

Alibaba is among the Chinese companies that received mobile virtual network operator (MVNO) licenses back in December. These allow them to resell wireless services from the nation’s state-controlled mobile carriers China Mobile, China Unicom and China Telecom.

It won’t be hard for Alibaba to find customers. Taobao and Tmall are two of China’s largest online retail sites. In addition, the company is aggressively expanding into mobile services, developing its own operating system for smartphones, along with a mobile chatting app called Laiwang.

As smartphones become the number one way Chinese go online, local tech companies are trying to corner a part of the mobile Internet market. In Alibaba’s case, the company has been on a spending spree, buying a stake in Chinese social networking site, Weibo.com, and moving to acquire the country’s largest online mapping provider.

Offering data and voice services could help Alibaba attract more users to its e-commerce services. As China only has three mobile carriers, there’s plenty of room for MVNOs to grow, according to analysts. But Alibaba won’t be the only e-commerce company offering mobile phone services.

JD.com, another major online retailer in China, has also received an MVNO license. The company plans to offer its telecom services in the second quarter of this year.

JD.com has the second largest business-to-consumer retail site behind Tmall.com, according to research firm Analysys International. The company is set to grow even faster after Chinese Internet giant Tencent bought a 15 percent stake in it.

As part of the deal, JD.com will take over two of Tencent’s online retail businesses. It will also gain access to Tencent’s WeChat app, a mobile messaging app with 300 million users.

 

Most Sites Have Fixed Heartbleed Flaw, Many Remain Exposed

April 22, 2014 by mphillips  
Filed under Around The Net

The world’s top 1,000 websites have been updated to protect their servers against the “Heartbleed” vulnerability, but up to 2% of the top million remained unprotected as of last week, according to a California security firm.

On Thursday, Menifee, Calif.-based Sucuri Security scanned the top 1 million websites as ranked by Alexa Internet, a subsidiary of Amazon that collects Web traffic data.

Of the top 1,000 Alexa sites, all were either immune or had been patched with the newest OpenSSL libraries, confirmed Daniel Cid, Sucuri’s chief technology officer, in a Sunday email.

Heartbleed, the nickname for the flaw in OpenSSL, an open-source cryptographic library that implements SSL (Secure Sockets Layer) and TLS (Transport Layer Security) encryption, was discovered independently by Neel Mehta, a Google security engineer, and researchers from security firm Codenomicon earlier this month.

The bug had been introduced in OpenSSL in late 2011.
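For context (our own note, with a sketch to match): the affected releases are OpenSSL 1.0.1 through 1.0.1f, and 1.0.1g carries the fix, so a crude triage of a host can be done by inspecting its `openssl version` banner.

```python
# Toy triage check for Heartbleed-affected OpenSSL releases.
# Vulnerable: 1.0.1 through 1.0.1f (CVE-2014-0160). Fixed: 1.0.1g and later;
# the released 0.9.8 and 1.0.0 branches were never affected.
import re

def heartbleed_vulnerable(version_banner: str) -> bool:
    """Return True if an `openssl version` banner looks vulnerable."""
    m = re.search(r"OpenSSL (\d+)\.(\d+)\.(\d+)([a-z]?)", version_banner)
    if not m:
        raise ValueError("unrecognised banner: %r" % version_banner)
    branch = (int(m.group(1)), int(m.group(2)), int(m.group(3)))
    letter = m.group(4)
    if branch != (1, 0, 1):
        return False      # only the 1.0.1 release branch was affected
    return letter < "g"   # 1.0.1 (no letter) through 1.0.1f are vulnerable

print(heartbleed_vulnerable("OpenSSL 1.0.1f 6 Jan 2014"))   # True
print(heartbleed_vulnerable("OpenSSL 1.0.1g 7 Apr 2014"))   # False
print(heartbleed_vulnerable("OpenSSL 0.9.8y 5 Feb 2013"))   # False
```

A real audit would of course also account for vendor backports, where a distribution patches the bug without bumping the version letter.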

Because of OpenSSL’s widespread use by websites — many relied on it to encrypt traffic between their servers and customers — and the stealthy nature of its exploit, security experts worried that cyber criminals either had captured, or could capture, usernames, passwords and even the encryption keys used by site servers.

The OpenSSL project issued a patch for the bug on April 7, setting off a rush to patch the software on servers and in some client operating systems.

The vast majority of vulnerable servers had been patched as of April 17, Sucuri said in a blog post that day.

While all of the top 1,000 sites ranked by Alexa were immune to the exploit by then, as Sucuri went down the list and scanned smaller sites, it found an increasing number still vulnerable. Of the top 10,000, 0.53% were vulnerable, as were 1.5% of the top 100,000 and 2% of the top 1 million.

Other scans found similar percentages of websites open to attack: On Friday, San Diego-based Websense said about 1.6% of the top 50,000 sites as ranked by Alexa remained vulnerable.

Since it’s conceivable that some sites’ encryption keys have been compromised, security experts urged website owners to obtain new SSL certificates and keys, and advised users to be wary of browsing to sites that had not done so.

Sucuri’s scan did not examine whether sites had been issued new certificates, but Cid said that another sweep of the Web, perhaps this week, would. “I bet the results will be much, much worse on that one,” Cid said.
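Such a follow-up sweep could, in principle, flag certificates whose validity period began before the April 7 disclosure, since those cannot have been reissued afterwards. A minimal sketch using Python’s standard `ssl` module (our own illustration, not Sucuri’s method; the date parsing matches the `notBefore` format that `ssl.getpeercert()` returns):

```python
# Sketch: flag certificates that predate the Heartbleed disclosure
# (April 7, 2014) and therefore cannot have been reissued since.
import socket
import ssl
from datetime import datetime

DISCLOSURE = datetime(2014, 4, 7)

def issued_after_disclosure(not_before: str) -> bool:
    """Parse a notBefore string as returned by ssl.getpeercert()."""
    started = datetime.strptime(not_before, "%b %d %H:%M:%S %Y %Z")
    return started >= DISCLOSURE

def cert_not_before(host: str, port: int = 443) -> str:
    """Fetch a live certificate's notBefore field (needs network access)."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()["notBefore"]

print(issued_after_disclosure("Apr  8 12:00:00 2014 GMT"))  # True  - reissued after the bug went public
print(issued_after_disclosure("Mar  1 00:00:00 2014 GMT"))  # False - predates the disclosure
```

Note the check is one-directional: a post-disclosure start date suggests reissuance, but an old date alone doesn’t prove the site’s keys were ever exposed.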

 

 

Can AMD Grow In A Down PC Market?

April 22, 2014 by Michael  
Filed under Computing

AMD posted some rather encouraging Q1 numbers last night, but slow PC sales are still hurting the company, along with the rest of the sector.

When asked about the PC market slump, AMD CEO Rory Read confirmed that the PC market was down sequentially 7 percent. This was a bit better than the company predicted, as the original forecast was that the PC market would decline 7 to 10 percent.

Rory pointed out that AMD can grow in the PC market, as there is a lot of ground that can be taken from the competition. The commercial market did better than expected, and Rory claims that AMD’s diversification strategy is taking off. AMD is trying to win market share in the desktop and commercial segments, hence it sees an opportunity to grow PC revenue in the coming quarters. Rory also expects tablets to continue to cannibalize the PC market, and that is not going to change soon.

Kaveri and Kabini will definitely help this effort, as both are solid parts priced quite aggressively. Kabini is also available in AMD’s new AM1 platform, which we believe is an interesting concept with plenty of mass-market potential. Desktop and notebook ASPs are flat, which the financial community really appreciated; it would not have been unusual for average selling prices to fall, given that the global PC market was down.

Kaveri did well in the desktop high-end market in Q1 2014 and there will be some interesting announcements in the mobile market in Q2 2014 and beyond.

Courtesy-Fud

 

GlobalFoundries And Samsung Team Up On FinFET

April 22, 2014 by Michael  
Filed under Computing

GlobalFoundries should be rolling out 20nm chips later this year, and we hope that some AMD 20nm products might actually launch this year as well. The foundry failed to conquer the world with its 28nm process, but after some delays it sorted out the problems and managed to ship some high-volume parts based on this process.

GlobalFoundries is manufacturing AMD’s new Kaveri APUs, while TSMC is making the Jaguar-based 28nm parts. We are not sure who is making the new server parts such as Seattle or Berlin, both 28nm designs. It is expected that GlobalFoundries should commence volume production of some 20nm parts later this year and the company has big plans for a faster transition to 14nm.

GlobalFoundries cozying up to Samsung

It is no secret that Intel leads the way in new process transitions, and Intel plans to ship 14nm parts while TSMC and GlobalFoundries are still struggling to ship their first 20nm parts.

GlobalFoundries has now announced that it will start a strategic collaboration with none other than Samsung for its 14nm transition. It is easy to see that these two big players need each other in order to fight against bigger competitors like Intel and TSMC. GlobalFoundries and Samsung don’t have much overlap, either.

This joint venture should result in faster time-to-market for 14nm FinFET-based products, and we see at least two advantages. According to Ana Hunter, Vice President of Product Management at GlobalFoundries, the process design kits are available today and the foundry should be ready to manufacture 14nm FinFET products by the end of 2014. This sounds a bit optimistic, as we have heard such bold announcements before, especially since neither company has really started shipping 20nm parts yet, at least not in high-volume, high-performance designs. It should be noted that Samsung joined the 28nm club quite late and shipped its first 28nm SoC just a year ago, in the Galaxy S4.

Sawn Han, vice president of foundry marketing at Samsung Electronics, calls the partnership a ‘game changer’, as it will enable 14nm production at a total of four fabs worldwide: three from Samsung and one from GlobalFoundries. Samsung will offer 14nm FinFET from its S2 fab in Austin, Texas, its S3 fab in Hwaseong, South Korea, and its S1 fab in Giheung, South Korea. GlobalFoundries is preparing its Fab 8 in Saratoga County, New York, for the 14nm push.

14nm FinFET crucial for next-gen SoC designs

The companies say 14nm FinFET technology features a smaller contact gate pitch for higher logic packing density and smaller SRAM bitcells to meet the increasing demand for memory content in advanced SoCs, while still leveraging the proven interconnect scheme from 20nm to offer the benefits of FinFET technology with reduced risk and the fastest time-to-market.

The 14nm LPE process should deliver 20 percent more performance than 20nm parts, while power consumption should sink by 35 percent versus 20nm LPE parts. It should also save 15 percent of die space compared to 20nm, making it possible to cram more components into the same die size.

We have yet to see the first mobile 20nm parts in actual products. Qualcomm announced its first Snapdragons based on the new process a few weeks ago, but they won’t be ready for months. You can expect that an SoC manufactured on 14nm could end up 40 to 50 percent faster than its 28nm predecessor, and that the power requirement could go down by 50 to 70 percent at best.
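For what it’s worth, the 40-to-50-percent figure is consistent with simply compounding per-node gains. A quick sanity check (the 28nm-to-20nm numbers below are our own assumptions, not the foundries’):

```python
# Compounding per-node gains as a sanity check on the 28nm -> 14nm claims.
# Quoted above: 14nm LPE gives 20% more performance and 35% less power
# than 20nm. Assumed by us (not quoted): the 28nm -> 20nm step gives
# roughly 20% more performance at roughly 70% of the power.
perf_28_to_20 = 1.20     # assumption
power_28_to_20 = 0.70    # assumption
perf_20_to_14 = 1.20     # quoted: +20% performance
power_20_to_14 = 0.65    # quoted: -35% power

perf_28_to_14 = perf_28_to_20 * perf_20_to_14      # ~1.44, i.e. +44%
power_28_to_14 = power_28_to_20 * power_20_to_14   # ~0.455, i.e. roughly half

print(f"performance vs 28nm: +{perf_28_to_14 - 1:.0%}")
print(f"power vs 28nm: -{1 - power_28_to_14:.0%}")
```

With those assumptions the compounded result lands at about 44 percent more performance and roughly half the power, squarely inside the ranges quoted above.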

The total market for mobility, wireless and computer network storage is expected to hit around $20 billion by 2017, and of course everyone wants a piece of that action. The collaboration will offer both the 14nm LPE (Low Power Early) and 14nm LPP (Low Power Plus) processes.

All we need now are design wins from high-volume customers, and if we were to bet we would place our money on Samsung, namely its Exynos processors. We would be pleasantly surprised to see 14nm SoCs in mobile phones and tablets in 2015, but it is a possibility. Keep in mind that we are still waiting to see the first 20nm SoCs and GPUs in action.

Courtesy-Fud

NASA To Test Laser Communications System

April 21, 2014 by mphillips  
Filed under Around The Net

The SpaceX Dragon cargo spacecraft is set to carry the equipment that astronauts on the International Space Station will need to test optical laser communications, a new concept NASA has big plans for.

The SpaceX Dragon cargo craft’s scheduled launch last week was scrubbed because of a helium leak in the Falcon 9 rocket that will carry it aloft.

Optical laser communications, also dubbed lasercom, is one of the emerging technologies that NASA is focused on trying out.

With lasercom, data is transmitted via laser beams; the technology potentially offers much higher data rates than the space agency is able to achieve with current radio frequency transmissions.

“Optical communications have the potential to be a game-changer,” said mission manager Matt Abrahamson, in a statement. “It’s like upgrading from dial-up to DSL. Our ability to generate data has greatly outpaced our ability to downlink it. Imagine trying to download a movie at home over dial-up. It’s essentially the same problem in space, whether we’re talking about low-Earth orbit or deep space.”

Abrahamson noted that many of the latest deep space missions send data back and forth at 200 to 400 kilobits per second. The new laser technology is expected to transmit data at 50 megabits per second.

Since 50 megabits per second is 50,000 kilobits per second, that means the new link should be roughly 125 to 250 times faster than those deep space rates.
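The speed-up follows directly from the quoted rates; a quick check of our own, noting that the decimal and binary kilobit conventions give slightly different factors:

```python
# Speed-up implied by the quoted data rates (decimal units: 1 Mbit = 1,000 kbit).
deep_space_kbps = (200, 400)   # current deep-space missions, kilobits/s
lasercom_kbps = 50 * 1000      # 50 megabits/s lasercom target

for rate in deep_space_kbps:
    print(f"{lasercom_kbps / rate:.0f}x faster than {rate} kbps")
# Prints 250x and 125x. Using the binary convention (1 Mbit = 1,024 kbit)
# instead gives 50 * 1024 / 200 = 256x, a figure sometimes quoted.
```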

Once the Dragon spacecraft rendezvouses with the space station, the station’s robotic arm will remove the laser instrument from the ship’s cargo bay and then attach it to the outside of the station. The laser test is expected to last at least three months.

A ground telescope will be used to test the new communication tool.

As the space station moves in its orbit around Earth, the ground telescope will track it and transmit a laser beacon carrying a video uplink in 100-second bursts to the orbiting instrument. The tests will help scientists better calculate the ability to point the laser, along with beam acquisition and tracking — all while the space station is traveling at approximately 17,500 miles per hour.
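To put the pointing challenge in numbers (our own back-of-the-envelope estimate with an assumed station altitude of about 420 km; NASA’s figures may differ): for a pass directly overhead, the telescope’s angular rate at closest approach is simply orbital speed divided by altitude.

```python
import math

# Angular tracking rate at the zenith of an overhead pass: omega = v / h.
# Assumption (not from the article): ISS altitude of roughly 420 km.
speed_mph = 17_500
speed_ms = speed_mph * 0.44704    # miles/hour -> metres/second
altitude_m = 420_000              # assumed station altitude

omega_rad_s = speed_ms / altitude_m
print(f"{math.degrees(omega_rad_s):.2f} deg/s at closest approach")  # about 1.07 deg/s
```

Roughly one degree per second may not sound like much, but a narrow laser beam must hold that slew continuously while the target recedes and the rate changes, which is why beam acquisition and tracking are central to the test.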

The new laser communications initiative is a key part of NASA’s Space Technology Mission Directorate, an arm of the space agency focused on developing technology for future space missions, as well as for life here on Earth.

 

 

AMD Not Chasing The Sub-$100 Tablet Market

April 21, 2014 by mphillips  
Filed under Consumer Electronics

Advanced Micro Devices doesn’t want its processors in low-end tablets, and is eager to avoid a battle with Intel or ARM, whose chips have driven tablet prices down to under $100.

Growth in the tablet market is driven by low-end devices and Android, but AMD’s tablet strategy is driven by Windows and high-performance machines. So AMD’s avoidance of the low end of the market narrows options for people looking for name-brand chips in low-price machines.

AMD chips are in just a handful of tablet models. The AMD chips currently available for tablets are essentially watered-down PC chips with strong graphics capabilities. But the company plans to introduce new tablet chips, code-named Beema and Mullins, based on a new core and designed to deliver more performance and longer battery life.

“If we miss out on some units in the low end, so be it,” said Lisa Su, general manager of AMD’s global business units, during the first-quarter earnings call on Thursday.

AMD executives said they didn’t want to buy their way into the tablet market like Intel, which has been subsidizing tablet makers to use its x86 chips through its “contra revenue” program. Instead, AMD wants to be selective in its product mix, and focus on high-margin and high-value products.

“This idea of contra revenue is foreign to us,” said Rory Read, CEO of AMD, during the call.

AMD could go after tablets priced at $300, but won’t go under that, said Nathan Brookwood, principal analyst at Insight 64.

“They are not chasing bad business,” Brookwood said.

AMD doesn’t have the financial resources to provide subsidies to tablet makers to use its chips, Brookwood said.

Though the tablet market is important, AMD is more concerned about generating revenue from custom chips and other areas, Brookwood said.

AMD makes custom chips for game consoles like Microsoft’s Xbox One and Sony’s PlayStation 4, which helped drive up revenue by 28 percent in the first fiscal quarter of 2014. AMD’s revenue in the PC, server and tablet chip business declined.

Addressing the wide tablet market isn’t a good idea for AMD and its bottom line, said Dean McCarron, principal analyst at Mercury Research. AMD is directing more resources out of tablets and into consoles, where there is more financial reward, McCarron said.

But it does need one or two big customers to help its tablet business, he said.

“They are being very judicious in what part of the product stack they are playing in,” McCarron said. “They are working on home-run customers.”

 

 

Will Plastic Replace Silicon In Computers?

April 21, 2014 by mphillips  
Filed under Computing

Can plastic materials morph into computers? A research breakthrough recently published brings such a possibility closer to reality.

Researchers are looking at the possibility of making low-power, flexible and inexpensive computers out of plastic materials. Plastic is not normally a good conductive material. However, researchers said this week that they have solved a problem related to reading data.

The research, which involved converting electricity from magnetic film to optics so data could be read through plastic material, was conducted by researchers at the University of Iowa and New York University. A paper on the research was published in this week’s Nature Communications journal.

More research is needed before plastic computers become practical, acknowledged Michael Flatte, professor of physics and astronomy at the University of Iowa. Problems related to writing and processing data need to be solved before plastic computers can be commercially viable.

Plastic computers, however, could conceivably be used in smartphones, sensors, wearable products, small electronics or solar cells, Flatte said.

The computers would have basic processing, data gathering and transmission capabilities but won’t replace silicon used in the fastest computers today. However, the plastic material could be cheaper to produce as it wouldn’t require silicon fab plants, and possibly could supplement faster silicon components in mobile devices or sensors.

“The initial types of inexpensive computers envisioned are things like RFID, but with much more computing power and information storage, or distributed sensors,” Flatte said. One such implementation might be a large agricultural field with independent temperature sensors made from these devices, distributed at hundreds of places around the field, he said.

The research breakthrough this week is an important step in giving plastic computers the sensor-like ability to store data, locally process the information and report data back to a central computer.

Mobile phones, which demand more computing power than sensors, will require more advances because communication requires microwave emissions usually produced by higher-speed transistors than have been made with plastic.

It’s difficult for plastic to compete in the electronics area because silicon is such an effective technology, Flatte acknowledged. But there are applications where the flexibility of plastic could be advantageous, he said, raising the possibility of plastic computers being information processors in refrigerators or other common home electronics.

“This won’t be faster or smaller, but it will be cheaper and lower power, we hope,” Flatte said.

 

Dell And Red Hat Join Forces In The Cloud

April 21, 2014 by Michael  
Filed under Computing

The Dell Red Hat Cloud solution, a co-engineered, enterprise grade private cloud, was unveiled at the Red Hat Summit on Thursday.

The Openstack-based service also includes an extension of the Red Hat partnership into the Dell Openshift Platform as a Service (PaaS) and Linux Container products.

Dell and Red Hat said their cloud partnership is intended to “address enterprise customer demand for more flexible, elastic and dynamic IT services to support and host non-business critical applications”.

The integration of Openshift with Red Hat Enterprise Linux is a move towards container enhancements based on the Docker platform, which the companies said will enable a write-once culture, making programs portable across public, private and hybrid cloud environments.

Paul Cormier, president of Products and Technologies at Red Hat said, “Cloud innovation is happening first in open source, and what we’re seeing from global customers is growing demand for open hybrid cloud solutions that meet a wide variety of requirements.”

Sam Greenblatt, VP of Enterprise Solutions Group Technology Strategy at Dell, added, “Dell is a long-time supporter of Openstack and this important extension of our commitment to the community now will include work for Openshift and Docker. We are building on our long history with open source and will apply that expertise to our new cloud solutions and co-engineering work with Red Hat.”

Dell Red Hat Cloud Solutions are available from today, with support for platform architects available from Dell Cloud Services.

Earlier this week, Red Hat announced Atomic Host, a new fork of Red Hat Enterprise Linux (RHEL) specifically tailored for containers. Last year, the company broke bad with its Fedora Linux distribution, codenamed Heisenbug.

Courtesy-TheInq

Oracle Identifies Its Products Affected By Heartbleed, But No Estimates On Fixes

April 18, 2014 by mphillips  
Filed under Around The Net

Oracle issued a comprehensive list of its software that may or may not be affected by the OpenSSL vulnerability known as Heartbleed, while warning that no fixes are yet available for some likely affected products.

The list includes well over 100 products that appear to be in the clear, either because they never used the version of OpenSSL reported to be vulnerable to Heartbleed, or because they don’t use OpenSSL at all.

However, Oracle is still investigating whether another roughly 20 products, including MySQL Connector/C++, Oracle SOA Suite and Nimbula Director, are vulnerable.

Oracle determined that seven products are vulnerable and is offering fixes. These include Communications Operation Monitor, MySQL Enterprise Monitor, MySQL Enterprise Server 5.6, Oracle Communications Session Monitor, Oracle Linux 6, Oracle Mobile Security Suite and some Solaris 11.2 implementations.

Another 14 products are likely to be vulnerable, but Oracle doesn’t have fixes for them yet, according to the post. These include BlueKai, Java ME and MySQL Workbench.

Users of Oracle’s growing family of cloud services may also be able to breathe easy. “It appears that both externally and internally (private) accessible applications hosted in Oracle Cloud Data Centers are currently not at risk from this vulnerability,” although Oracle continues to investigate, according to the post.

Heartbleed, which was revealed by researchers last week, can allow attackers who exploit it to steal information on systems thought to be protected by OpenSSL encryption. A fix for the vulnerable version of OpenSSL has been released and vendors and IT organizations are scrambling to patch their products and systems.

Observers consider Heartbleed one of the most serious Internet security vulnerabilities in recent times.

Meanwhile, this week Oracle also shipped 104 patches as part of its regular quarterly release.

The patch batch includes security fixes for Oracle database 11g and 12c, Fusion Middleware 11g and 12c, Fusion Applications, WebLogic Server and dozens of other products. Some 37 patches target Java SE alone.

A detailed rundown of the vulnerabilities’ relative severity has been posted to an official Oracle blog.

 

Lavaboom Offers New Encrypted Webmail Service

April 18, 2014 by mphillips  
Filed under Around The Net

A new webmail service named Lavaboom promises to provide easy-to-use email encryption without ever learning its users’ private encryption keys or message contents.

Lavaboom, based in Germany and founded by Felix Müller-Irion, is named after Lavabit, the now-defunct encrypted email provider believed to have been used by former NSA contractor Edward Snowden. Lavabit decided to shut down its operations in August in response to a U.S. government request for its SSL private key that would have allowed the government to decrypt all user emails.

Lavaboom designed its system for end-to-end encryption, meaning that only users will be in possession of the secret keys needed to decrypt the messages they receive from others. The service will only act as a carrier for already encrypted emails.

Lavaboom calls this feature “zero-knowledge privacy” and implemented it in a way that allows emails to be encrypted and decrypted locally using JavaScript code inside users’ browsers instead of its own servers.

The goal of this implementation is to protect against upstream interception of email traffic as it travels over the Internet, and to prevent Lavaboom from being able to produce plain-text emails or encryption keys if a government requests them. While this would protect against some passive data collection efforts by intelligence agencies like the NSA, it probably won’t protect against other attack techniques and exploits that such agencies have at their disposal for obtaining data from computers and browsers after it has been decrypted.

Security researchers have yet to weigh in on the strength of Lavaboom’s implementation. The service said on its website that it considers making parts of the code open source and that it has a small budget for security audits if any researchers are interested.

Those interested in trying out the service can request to be included in its beta testing period, scheduled to start in about two weeks.

Free Lavaboom accounts will come with 250MB of storage space and will use two-factor authentication based on the public-private keypair and a password. A premium subscription will cost €8 (around US$11) per month and will provide users with 1GB of storage space and a three-factor authentication option.

 

AT&T To Offer Smartwatches This Year

April 18, 2014 by mphillips  
Filed under Consumer Electronics

Smartwatches for use on AT&T’s network will be out this year, so says one of the company’s executives.

“I think you’ll see wide-area, high-bandwidth [smart]watches this year at some point,” said Glenn Lurie, president of emerging devices at AT&T, in an interview.

The company has a group working in Austin, Texas, on thousands of wearable-device prototypes, and is also looking at certifying third-party devices for use on its network, Lurie said.

“A majority of stuff you’re going to see today that’s truly wearable is going to be in a watch form factor to start,” Lurie said. If smartwatch use takes off — “and we believe it can,” Lurie said — then those devices could become hubs for wearable computing.

Right now smartwatches lack LTE capabilities, so they are largely reliant on smartphones for apps and notifications. With a mobile broadband connection, a smartwatch becomes an “independent device,” Lurie said.

“We’ve been very, very clear in our opinion that a wearable needs to be a stand-alone device,” Lurie said.

AT&T and Filip Technologies in January released the Filip child-tracker wristwatch, which also allows a parent to call a child over AT&T’s network. The Filip could be improved, Lurie said, but it is the kind of wearable product AT&T wants to bring to market.

Wearables for home health care are also candidates for LTE connections, Lurie said, but fitness trackers may be too small for LTE connectivity, at least for now.

Lurie couldn’t say when smartglasses would be certified to work on AT&T’s network. Google last year said adding cellular capabilities to its Glass eyewear wasn’t in the plans because of battery use. But AT&T is willing to experiment with devices to see where LTE would fit.

“It’s one thing if I’m buying it to go out for a jog, it’s another thing if I’m going to wear it every day. Those are the things people are debating right now — how that’s all going to come out,” Lurie said. “There’s technology and there’s innovation happening, and those things will get solved.”

Lurie said battery issues are being resolved, but there are no network capacity issues. Wearable devices don’t use too much bandwidth as they relay short bursts of information, unless someone is, for instance, listening to Pandora radio on a smartwatch, Lurie said.

But AT&T is building out network capacity, adding Wi-Fi networks, and virtualizing networks to accommodate more devices.

“We don’t have network issues, we don’t have any capacity issues,” Lurie said. “The key element to adding these devices is a majority of [them] aren’t high-bandwidth devices.”

AT&T wants to make wearables work with its home offerings like the Digital Life home automation and security system. AT&T is also working with car makers for LTE integration, with wearables interacting with vehicles to open doors and start ignitions.

 

Ubuntu Goes 64-bit On ARM

April 18, 2014 by Michael  
Filed under Computing

Canonical has announced its latest milestone server release, Ubuntu 14.04 LTS.

The company, which is better known for its open source Ubuntu Linux desktop operating system, has supplied a server flavor of Ubuntu since 2006; it is used by companies including Netflix and Snapchat.

Ubuntu 14.04 Long Term Support (LTS) claims to be the most interoperable Openstack implementation, designed to run across multiple environments using Icehouse, the latest iteration of Openstack.

Canonical product manager Mark Baker told The INQUIRER, “The days of denying Ubuntu are over, and the cloud is where we can make a difference.”

Although Canonical regularly issues incremental releases of Ubuntu, LTS releases such as this one represent landmarks for the operating system, and come about only every two years. LTS releases are also supported for a full five years.

New in this Ubuntu 14.04 LTS release are Juju and Maas orchestration and automation tools and support for hyperscale ARM 64-bit computing such as the server setup recently announced by AMD.

Baker continued, “We’re not an enterprise vendor in the traditional sense. We’ve got a pretty good idea of how to do it by now. Openstack is gaining a more formal status as enterprise evolves to adopt cloud based solutions, and we are making a commitment to support it.

“Openstack Icehouse is also considered LTS and as such will be supported for five years.”

Scalability is another key factor. Baker said, “We look at performance. For the majority of our customers it’s about efficiency – how rapidly we can scale up and scale in, and that’s something Ubuntu does incredibly well.”

Ubuntu 14.04 LTS will be available to download from Thursday.

Courtesy-TheInq

Google Glass One Day Sale Proves Successful

April 17, 2014 by mphillips  
Filed under Consumer Electronics

Google’s one-day sale of Google Glass appears to have been a success with all units sold out, a blog post by the technology titan suggests.

“All spots in the Explorer Program have been claimed for now, but if you missed it this time, don’t worry,” the Google Glass team wrote on its blog on Wednesday.

“We’ll be trying new ways to expand the Explorer program in the future.”

Google did not respond to a request for more information, but an earlier post about the one-day sale spoke of brisk sales of the $1,500 Internet-enabled headset.

“We’ve sold out of Cotton (white), so things are moving really fast,” the team wrote.

Aside from the white version, Glass was being offered in shades marketed as Charcoal, Tangerine, Shale (grey) and Sky (blue). Buyers had the choice of their favorite shade or frame. Google announced the one-day sale available to all U.S. residents over 18 last week, adding it wasn’t ready to bring the gizmo to other countries. Shoppers who missed it have to sign up for updates at the Glass website.

Only a few thousand early adopters and developers had Glass before the one-day sale, which coincided with a major software update for the heads-up display that put video calling on hold.

An official launch of Google Glass may happen later this year.