Oracle Shows Off New SPARC Processor

August 27, 2015 by Michael  
Filed under Computing

Oracle has been sharing a few details about its SPARC processor code-named Sonoma. Sonoma is not a sleeping Italian mama at all but a place where Americans grow wine that Europeans will not touch.

Sonoma is supposed to be a “low-cost SPARC processor for enterprise workloads.” The chip combines the SPARC M7 design, DDR4 memory interfaces, PCIe electronics and InfiniBand interfaces in a single package. It packs eight fourth-generation SPARC cores, hooks directly into system RAM, and is built on a 20nm process with 13 metal layers.

Each package has a shared 8MB L3 cache, shared 512KB L2 caches (one per core pair) and private 32KB L1 caches.

There are two DDR4 memory controllers, each with four DDR4-2133/2400 channels, up to two DIMMs per channel, and up to 1TB of DRAM per socket. Oracle says it can manage 77GB/s of bandwidth with the wind behind it and if it is going downhill.
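For the curious, that 77GB/s figure can be sanity-checked against the theoretical peak of the memory subsystem described above. A minimal sketch, assuming (Oracle didn't specify) DDR4-2400 and a standard 64-bit (8-byte) data bus per channel:

```python
# Sanity check on Oracle's 77GB/s claim against theoretical peak.
# Assumptions (not from Oracle): DDR4-2400, 64-bit (8-byte) bus per channel.
MT_PER_S = 2400e6                  # transfers per second per channel
BYTES_PER_TRANSFER = 8
CHANNELS = 2 * 4                   # two controllers x four channels each

peak_per_channel = MT_PER_S * BYTES_PER_TRANSFER / 1e9   # GB/s
peak_per_socket = peak_per_channel * CHANNELS            # GB/s

print(f"theoretical peak: {peak_per_socket:.1f} GB/s")          # 153.6
print(f"claimed 77GB/s is {77 / peak_per_socket:.0%} of peak")  # ~50%
```

Roughly half of theoretical peak as sustained bandwidth is a plausible figure for a real workload, which suggests the claim is not pure marketing.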

Basant Vinaik, Oracle’s senior principal engineer of CPU and I/O verification, told the Hot Chips conference that Sonoma contains a crypto-unit with user-level crypto instructions.

“The cache has been optimized to reduce latency and increase throughput. Sonoma achieves low latency with its integrated memory controller. We use speculative memory read to do this. Software can tune this using threshold registers.”

Courtesy-Fud

Intel To Invest Heavily In Mirantis For OpenStack

August 26, 2015 by Michael  
Filed under Computing

Intel has teamed up with OpenStack distribution provider Mirantis to push adoption of the OpenStack cloud computing framework.

The deal, which includes a $100m investment in Mirantis from Intel Capital, will provide technical collaboration between the two companies and look to strengthen the open source cloud project by speeding up the introduction of more enterprise features as well as services and support for customers.

The funding will also bring on board Goldman Sachs as an investor for the first time, the firm said, alongside collaboration from the companies’ engineers in the community on OpenStack high availability, storage, network integration and support for big data.

“Intel is actually providing us with cash, so they’ve bought a co-development subscription from us. Then, in addition, we’ve strengthened our balance sheet by putting more equity financing dollars into the company. So overall the total funds are at $100m,” said Mirantis president and co-founder Alex Freedland.

“With Intel as our partner, we’ll show the world that open design, open development and open licensing is the future of cloud infrastructure software. Mirantis’ goal is to make OpenStack the best way to deliver cloud software, surpassing any proprietary solutions.”

Freedland added that there is nothing proprietary in the arrangement: the collaboration flows directly into open source, and no intellectual property is going to Intel.

“All this is community-driven, so everyone will be able to take advantage of it,” he added.

The move is part of the Cloud for All initiative announced by Intel in July.

Intel is becoming increasingly involved in OpenStack. The company said at the OpenStack Summit in May that it is making various contributions, including improving the security of containerised applications in the cloud using the VT-x extensions in Intel processors.

Other big companies are also backing the open source software. Google announced in July that it had joined the OpenStack Foundation as a corporate sponsor in a bid to promote open source and open cloud technologies.

Working closely with other members of the OpenStack community, Google said that the move will bring its expertise in containers and container management to OpenStack while sharing its work with innovative open source projects like Kubernetes.

Courtesy-TheInq

Is HP’s Forthcoming Split A Good Idea?

August 25, 2015 by Michael  
Filed under Computing

HP has released its financial results for the third quarter, and they make for somewhat grim reading.

The company has seen drops in key parts of the business and an overall drop in GAAP net revenue of eight percent year on year to $25.3bn, compared with $27.6bn in 2014.

The company failed to meet its projected net earnings per share, which it had put at $0.50-$0.52, with an actual figure of $0.47.
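The headline figures are easy to verify with a quick back-of-the-envelope check:

```python
# Quick check of HP's reported Q3 numbers.
q3_2015_revenue = 25.3e9
q3_2014_revenue = 27.6e9
drop = (q3_2014_revenue - q3_2015_revenue) / q3_2014_revenue
print(f"year-on-year revenue drop: {drop:.1%}")   # ~8.3%, reported as "eight percent"

guidance_midpoint = (0.50 + 0.52) / 2             # midpoint of projected EPS range
shortfall = guidance_midpoint - 0.47              # actual EPS was $0.47
print(f"EPS shortfall vs guidance midpoint: ${shortfall:.2f}")
```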

The figures reflect a time of deep uncertainty at the company as it moves ever closer to its demerger into HP and Hewlett Packard Enterprise. The latter began filing registration documents in July to assert its existence as a separate entity, while the boards of both companies were announced two weeks ago.

Dell CEO Michael Dell slammed the move in an exclusive interview with The INQUIRER, saying he would never do the same to his company.

The big boss at HP remained upbeat, despite the drop in dividend against expectations. “HP delivered results in the third quarter that reflect very strong performance in our Enterprise Group and substantial progress in turning around Enterprise Services,” said Meg Whitman, chairman, president and chief executive of HP.

“I am very pleased that we have continued to deliver the results we said we would, while remaining on track to execute one of the largest and most complex separations ever undertaken.”

To which we have to ask: “Which figures were you looking at, lady?”

Breaking down the figures by business unit, Personal Systems revenue was down 13 percent year on year, while notebook sales fell three percent and desktops 20 percent.

Printing was down nine percent, but with a 17.8 percent operating margin. HP has been looking at initiatives to create loyalty among print users such as ink subscriptions.

The Enterprise Group, soon to be spun off, was up two percent year on year, but Business Critical Systems revenue dropped by 21 percent, cancelled out by networking revenue, which climbed 22 percent.

Enterprise Services revenue dropped 11 percent with a six percent margin, while software dropped six percent with a 20.6 percent margin. Software-as-a-service revenue dropped by four percent.

HP Financial Services was down six percent, with a two percent decrease in net portfolio assets and a two percent decrease in financing volume.

Courtesy-TheInq

Web.com Latest Victim of Credit Card Hacking

August 21, 2015 by mphillips  
Filed under Around The Net

Hackers gained unauthorized access to the computers of Internet services provider Web.com Group and stole credit card information belonging to 93,000 customers.

According to a website set up by the company to share information about the incident, Web.com discovered the security breach on Aug. 13 as part of its ongoing security monitoring.

Attackers compromised credit card information for around 93,000 accounts, as well as the names and addresses associated with them. No other customer information like social security numbers was affected, the company said.

According to the company, the verification codes for the exposed credit cards were not leaked. However, there are websites on the Internet that don’t require such codes for purchases.

Web.com has notified affected customers via email and will also follow up with letters sent through the U.S. Postal Service. Those users can sign up for a one-year free credit monitoring service.

The company did not specify how the intruders gained access to its systems, but has hired a “nationally recognized” IT security firm to conduct an investigation.

Web.com provides a variety of online services, including website and Facebook page design, e-commerce and marketing solutions, domain registration and Web hosting. The company claims to have over 3.3 million customers and owns two other well known Web services companies: Register.com and Network Solutions.

Register.com and Network Solutions customers were not impacted by this breach unless they also purchased services directly from Web.com.

AMD Still Losing Ground

August 21, 2015 by Michael  
Filed under Computing

AMD is continuing to lose market share to Nvidia, despite the fact that its best new video card, the Fury X, is out.

AMD always had a get-out-of-jail card when the last round of GPU market share numbers came out, on the basis that it had not released anything new. At the time Nvidia had 76 per cent of the discrete GPU market. This was when Nvidia’s best card was the GeForce GTX 980.

A lot happened in that time. There was the release of the Titan X in March, followed by the GTX 980 Ti in June. AMD had its Hawaii architecture inside the R9 290X, and a dual-GPU card in the form of the R9 295X2. It was expected that the R9 390X might turn AMD’s luck around, but that turned out to be another rebrand. Then there was the arrival of the R9 Fury X.

AMD has new products on the market: the R9 Fury X, R9 Fury, R9 390X and a bunch of rebranded 300 series video cards. But according to Mercury Research’s latest data, Nvidia has jumped from 76 per cent of the discrete GPU market in Q4 2014 to 82 per cent in Q2 2015.

AMD has 18 per cent of the dGPU market share, even after the release of multiple new products.

It is not that the Fury X isn’t selling well, but because of yield problems there will be only 30,000 units made over the entire year.

AMD also rebranded nearly its entire product stack, leaving no reason to buy an R9 390X if you own an R9 290X.

Sure there is 8GB of GDDR5 on board compared to the 4GB offered on most R9 290X cards, but that’s not enough to push someone to upgrade their card.

Tweaktown noted a big issue: the HBM-powered R9 Fury X does not really offer any performance benefit over the GDDR5-powered GeForce GTX 980 Ti from Nvidia. The 980 Ti beat the Fury X in some tests which it should not have.

Nvidia has plenty of GM200 GPUs to go around, with countless GTX 980 Ti models from a bunch of AIB partners. There is absolutely no shortage of GTX 980 Ti cards. Even if you wanted to get your paws on a Fury X, AMD has made it difficult.

Now it seems that next year could be a lot worse for AMD. Nvidia will have its GP100 and GP104 out next year, powered by Pascal. This will cane AMD’s Fiji architecture. Then Nvidia will swap to a 16nm process when its Maxwell architecture is already power efficient. Then there is the move to HBM2, where we should see around 1TB/sec of memory bandwidth.

All up, the future does not look that great for AMD.

Courtesy-Fud

Google Unveils New Wi-Fi Router For Home

August 20, 2015 by mphillips  
Filed under Consumer Electronics

Google Inc has unveiled a Wi-Fi router, the latest move in the company’s efforts to get ready for the connected home and draw more users to its services.

The cylinder-shaped router, named OnHub, can be pre-ordered for $199.99 at online retailers including the Google Store, Amazon.com Inc and Walmart.com.

The router comes with in-built antennas that will scan the airwaves to spot the fastest connection, Google said in a blog post.

With the router, users will be able to prioritize a device so that they can get the fastest Internet speeds for data-heavy activities such as downloading content or streaming a movie.

The router can be hooked up with Google’s On app, available on Android and iOS, to run network checks and keep track of bandwidth use among other things.

Google said OnHub automatically updates with new features and the latest security upgrades, just like the company’s Android OS and Chrome browser.

The router is being manufactured by network company TP-LINK, Google said, hinting that ASUS could be the second manufacturing partner for the product.

The product launch comes days after Google restructured itself by creating Alphabet Inc, a holding company to pool its many subsidiaries and separate the core web advertising business from newer ventures like driverless cars.

Making products for the smart home is one such venture.

Google last year bought Nest, a smart thermostat maker, for $3.2 billion, aiming to lead the way on how household devices link to each other and to electricity grids.

The global market for “Internet of Things”, the concept of connecting household devices to the Internet, will nearly triple to $1.7 trillion by 2020, research firm International Data Corp said in June.

Is Microsoft Besting Sony In Video Game Software Space?

August 20, 2015 by Michael  
Filed under Gaming

The validity of framing the console market as a ‘race’ or a ‘war’ is open to question, but there’s no doubt that it’s a lot more fun when you do. The notion that there is a hard, immovable line between winning and losing simply doesn’t make much sense from a business perspective, but it makes for lively debate and – from an entirely selfish perspective – good copy.

For the first six months of this console generation that was certainly the case: the Xbox One tripping, stumbling and backtracking, with the PlayStation 4 marketing department lying in wait, pointed comments at the ready. Microsoft is dealing with the fallout from that disastrous period even now, its own reluctance to disclose hardware sales figures compounded by Sony’s eagerness to provide an update at every opportunity. At the last count, in July, the PlayStation 4 had sold more than 25 million units. The Xbox One, on the other hand, has sold… well, we haven’t been given an official worldwide figure in 2015 so far.

In terms of sales, then, it’s very clear which console is ‘winning’ the generation, and it has been from the very first day. In terms of content, though, the debate is more nuanced, the outcome far less certain. Sony’s development resources have long been regarded as a unique strength when compared to Microsoft, effectively guaranteeing a superior crop of exclusive games regardless of how well the PlayStation hardware is selling. Whether that’s still true in terms of first-party studios is almost beside the point, because in terms of available, exclusive games there’s a strong argument that the Xbox has been a more attractive platform since the launch of Titanfall more than a year ago. By the end of this year, that point may well be beyond debate.

“I wouldn’t even say the gap has closed,” says Kudo Tsunoda, one of the leading executives in the Xbox games business. “We’ve got a lot more exclusive games than any other platform.”

Tsunoda and the various studios he oversees are celebrating the second Xbox showcase in less than two months. The first, at E3, is generally regarded as a key battleground within the console war, and a significant proportion of those who watched this year believed that Microsoft emerged victorious despite an impressive showing from Sony. The second, at Gamescom, was an Xbox victory by default, with Sony electing to steer clear of the event for the first time in years. Even so, Microsoft presided over 90 minutes of new games, not all of which were exclusive to the Xbox One, but none of which were on show at E3. Whether those exclusives came from first-party studios (Halo and Gears of War) or via chequebook-and-pen (Tomb Raider and Quantum Break) is largely irrelevant. For perhaps the first time in this console generation Xbox owners have an undeniable right to feel smug.

“There’s a reason we’re able to put on two shows of content together,” Tsunoda continues. “We’ve got seven exclusives coming this holiday, and then everything coming in 2016. Not just the blockbusters, but the ID@Xbox games, the indie games. We’re giving people a lot more.”

Microsoft’s early mistakes have been formative for the Xbox One, its underlying strategy switching from closed and controlled to open and inclusive. Sony recorded several huge PR victories by simply responding to those initial bad choices, but Microsoft has since proved more committed to the stance that Sony initially claimed as its own. An early indicator was Sony’s refusal to allow EA Access onto the PlayStation Network due to stated concerns that it didn’t offer “good value” to the consumer, but just as likely down to competition with its own planned streaming service, PlayStation Now. Microsoft allowed its customers to make that choice for themselves. Had you been asked to guess the stance each company would adopt even a few months before, it’s likely those roles would have been reversed.

Tsunoda repeats the idea that Microsoft is ‘listening to the fans’ throughout our interview, making it quite clear that it’s a message the company wants us to hear. However, while it would be naive to believe that any multinational corporation is motivated principally by altruism, the strategy for Xbox One is increasingly guided by consumer demand.

Two incoming services perfectly illustrate the degree to which Microsoft has pivoted since the days of mandatory online checks and a prohibition on used games. Xbox Preview is a more tightly controlled version of Steam Early Access, and just the sort of concept that walled gardens were formed to exclude. Backwards compatibility, meanwhile, demands little in the way of explanation. Equally, its importance cannot be overstated, to the consumers who spend so much on games every console generation, and to those who believe that companies like Microsoft should be treating their creative heritage with more respect.

“With backwards compatibility, it isn’t something that we just think gamers might want,” Tsunoda says. “We know. We’re looking for and soliciting that feedback. It was the number one most requested feature for Xbox One by far.”

Sony has no plans to match Microsoft in this respect, and the possibility of monetising those games through PlayStation Now makes it very unlikely that it ever will. For Microsoft, it’s part of a broader view of gaming with Windows 10 at its core, which should, in theory, unite the previously disparate tendrils of Microsoft’s sprawling organisation. PC and console, past and present, existing in harmony, each interacting with and complementing the other. Cross-Buy, Cross-Play, console to PC streaming; one might say that Microsoft should have been doing this for years already. According to Tsunoda, this is a first step.

“For a long time we’ve had PC gamers and console gamers who weren’t really able to play together,” Tsunoda says. “That’s why Cross-Play is still such a powerful idea. You should be able to play what you love, and play together, regardless of what device you’re playing on. It’s about connecting people.

“With backwards compatibility, it isn’t something that we just think gamers might want. We know”

“It’s a really unique value that only we can offer. You still need very gamer-focused values, but there are lots of things you can do with our technology. We’ve really got a lot more going on [than our competitors]. We’re doing things that can’t be done on any other console.”

If Microsoft is pushing towards a more holistic approach to its games business, then a few reminders of its clumsier past still remain. One is perched just below the television directly to our left: Kinect, a device once positioned as an integral part of the future of Xbox, a future that Tsunoda was instrumental in selling to the press and public. These days, though, it feels additive, and that’s being kind. In more than 150 minutes of press conferences across E3 and Gamescom Kinect barely merited a single mention, while a new announcement, the Chatpad, offered a core-friendly alternative to the search and chat functions that represent a huge chunk of why anyone might still use it.

“I don’t think it’s an alternative [to Kinect]. It’s just about giving people a choice in how they can do things,” Tsunoda replies. “There’s still a lot of great voice capabilities that you can use with Kinect, but there’s also a lot of great possibilities for communication with the Chatpad. You can also customise a lot, with specific buttons for specific functions. With everything we do, we’re trying to give people the choice.”

In terms of games, though, Tsunoda offers only Just Dance 2016 as a specific example – which is developed and published by Ubisoft – accompanied by the vague promise that, “There’s still Kinect games coming as well.” This may be what ‘choice’ starts to look like when Microsoft loses faith in one of its possible futures. It should be noted that Kinect is now listed under the “More” section on the Xbox One Accessories page, beneath “Controllers,” beneath “Headsets and Communication,” grouped in the same vague category as the Xbox One Digital TV Tuner and the Xbox One Media Remote.

The fear of obsolescence created by the doldrums in which Kinect now resides also haunts the HoloLens, another promising device that Microsoft has just finished thrusting into the public eye. It stole the show at E3 with an immaculately orchestrated Minecraft demo, only for its limited field-of-view to be scrutinised by the press, and its early utility as gaming hardware to be questioned by none other than the CEO of Microsoft, Satya Nadella.

For Tsunoda, who is also closely involved with the development of HoloLens, the difference between watching a demonstration and actually experiencing it first-hand is more pronounced than any product he’s ever worked on – including Kinect. However, there is more common ground between the two devices than one might think.

“You should think about it in the same way that you would a phone or your computer. It does a lot of things,” Tsunoda says. “Obviously, gaming is a big part of what you do on those machines as well. But that’s what it is: an untethered holographic computer. You can do a lot in the gaming and entertainment space, but it has a lot of other functionality as well.

“Microsoft is a leader in depth-sensing technology: with Kinect, but also the stuff we’re doing with HoloLens as well. A big part of what we’re doing there is an environmental understanding that comes from having pushed our knowledge in depth-sensing. That’s what you’ll see us do as a company. [Kinect] is still a part of the platform, and there’s still Kinect games coming of course, but then also we’re pushing that depth-sensing technology forward with what we’re doing with HoloLens.”

It’s all a part of Microsoft’s future of gaming, whatever that turns out to be. Right now, though, Xbox might finally have emerged from PlayStation’s shadow.

Courtesy-GI.biz

Did VW Sit On Megamos Crypto Security Issue?

August 20, 2015 by Michael  
Filed under Computing

Volkswagen (VW) has watched as a security vulnerability in a key system on a range of vehicles has been released from the garage and put on the news road.

VW was first notified about the problem two years ago, but has worked to keep it under the bonnet. Well, not all of it, just a single line – not a yellow line – has been contentious. The line is still controversial, and has been redacted from the full, now released, report.

VW secured an injunction in the UK high court two years ago. The firm argued at the time that the information would make it easy to steal vehicles that come from its factories and forecourts. That might be true, but that is often the case with vulnerabilities.

The news that VW has suppressed the report for this amount of time is interesting, but it does remind us that not everyone in the industry appreciates third-party information about weaknesses.

VW has a lot of cars under its hood and, according to the report, a lot of different vehicles are affected. These run from Alfa Romeo through to Volvo, and take in midlife crisis mobility vehicles like the Maserati and Porsche.

The report is entitled Dismantling Megamos Crypto: Wirelessly Lockpicking a Vehicle Immobilizer (PDF), and is authored by Roel Verdult of Radboud University in the Netherlands and Flavio Garcia of the University of Birmingham in the UK.

Megamos Crypto sounds like a sci-fi bad guy, maybe a rogue Transformer, but it is actually designed to be a good thing. The security paper said that it is a widely deployed “electronic vehicle immobiliser” that prevents a car starting without the close association of its key and included RFID tag.

The researchers described how they were able to reverse engineer the system and carry out three attacks on systems wirelessly. They mention several weaknesses in the design of the cipher and in the key-update mechanisms. Attacks, they said, can take as little as 30 minutes to carry out, and recovering a 96-bit encryption key is a relatively simple process.
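Some context on why those design weaknesses matter: a 96-bit key is far beyond brute force, so any 30-minute attack has to come from flaws in the cipher and key handling, not raw compute. A rough sketch, assuming a hypothetical (and generous) guess rate:

```python
# Why recovering a 96-bit key in 30 minutes implies broken crypto:
# exhaustive search is hopeless even at an assumed billion guesses/sec.
keyspace = 2 ** 96
rate = 1e9                          # assumed guesses per second (hypothetical)
seconds_per_year = 365.25 * 24 * 3600
years = keyspace / rate / seconds_per_year
print(f"brute force would take ~{years:.1e} years")   # trillions of years
```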

This could be considered bad news if you are a car driver. It may even be worse news for pedestrians. Concerned car owners should find their keys (try down the back of the sofa cushion) and assess whether they have keyless ignition. The researchers said that they told VW about the findings in 2012, and that they understand that measures have been taken to prevent attacks.

We have asked VW for an official statement on the news, but so far it isn’t coughing. Ready to talk, though, is the security industry, and it is giving the revelation the sort of disapproving look that people give cats when they forget what that sand tray is for.

Nicko Van Someren, CTO at Good Technology, suggested that this is another example of what happens when you go from first gear to fourth while going up a hill (this is our analogy). He described it in terms of the Internet of Things (IoT), and in respect of extending systems before they are ready to be extended.

“This is a great example of what happens when you take an interface that was designed for local access and connect it to the wider internet,” he said.

“Increasingly, in the rush to connect ‘things’ for the IoT, we find devices that were designed with the expectation of physical access control being connected to the internet, the cloud and beyond. If the security of that connection fails, the knock-on effects can be dire and potentially even fatal.”

Courtesy-TheInq

IBM Shows Off Linux Only Mainframe

August 19, 2015 by Michael  
Filed under Computing

IBM has announced a new Linux-only mainframe aimed at both ends of the enterprise market.

The new mainframe range is led by two servers, branded LinuxONE, which Biggish Blue claims make up the “world’s most advanced Linux system.” Both possess the fastest processor in the industry and are designed for the “new application economy” and the hybrid cloud era.

At the top of the range sits the LinuxONE Emperor, which is based on the IBM z13. It has a new processor design, faster I/O and the ability to address up to 10TB of memory — three times as much as its predecessor. It can house up to 141 processor units in a single system and run as many as 8,000 virtual servers, the company says.

At a maximum of 5GHz, the z13’s processor is slower in terms of clock speed than the chip in the z12, but IBM says it more than compensates for that with other improvements. The chip has eight cores compared with six for its predecessor, and it’s manufactured on a newer, 22 nanometer process, which should mean smaller, faster transistors.
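A rough way to see IBM’s point is to compare the aggregate cycle budget of the two chips. A quick sketch, assuming (the article doesn’t state it) that the z12-generation chip ran at 5.5GHz:

```python
# Aggregate clock budget: z13 chip vs its predecessor.
# Assumption (not in the article): the z12-generation chip ran at 5.5GHz.
z13_cores, z13_ghz = 8, 5.0
z12_cores, z12_ghz = 6, 5.5

z13_total = z13_cores * z13_ghz    # 40.0 GHz-cores
z12_total = z12_cores * z12_ghz    # 33.0 GHz-cores
gain = z13_total / z12_total - 1
print(f"aggregate cycles up ~{gain:.0%} despite the lower clock")
```

That ignores per-core improvements, cache and I/O changes, so the real gap should be larger, but it shows why a lower clock need not mean a slower chip.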

It also has the security and advanced encryption features needed by enterprises. The LinuxONE Rockhopper is an entry-level product that offers the speed, security and availability benefits of the mainframe.

IBM is giving shedloads of access to developers. As part of the Linux Foundation’s ‘Open Mainframe Project’ it has contributed some 500,000 lines of code including code related to IT predictive analytics that are on the lookout for unusual system behaviour to stop issues becoming failures.

The LinuxONE Developer Cloud acts as a virtual R&D engine for creating, testing and piloting of emerging applications such as links to engagement systems, mobile apps and hybrid cloud apps.

LinuxONE is provisioned as a virtual machine using the open standards-based KVM hypervisor.

It is all in the shops now. IBM doesn’t give pricing for its new mainframes, but expect to pay more than $100,000.

Courtesy-Fud

Can Mobile Phone Calls Be Hacked?

August 19, 2015 by Michael  
Filed under Mobile

Billions of mobile phone users are at risk from a signalling flaw that allows hackers to intercept all voice calls and track locations.

Australian TV programme 60 Minutes is claiming the scoop, showing in a special report how hackers were able to record the mobile phone conversations of a prominent politician and track his movements from a base thousands of miles away in Germany.

This is because of a flaw in the architecture of the signalling system, known as SS7, which is used to enable mobile phone roaming across telecoms providers, according to the programme.

A hacker can use this information to listen in to any mobile phone conversation by forwarding all calls to an online recording device and then re-routing the call back to its intended recipient, a so-called man-in-the-middle attack.

It also allows the movements of a mobile phone user to be tracked on applications such as Google Maps, and 60 Minutes claimed that it throws the security of SMS verification used by banking apps, for example, into doubt.

“Verification by SMS message is useless against a determined hacker with access to the SS7 portal because they can intercept and use the SMS code before it gets to the bank customer,” the report said.
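The rerouting described above can be modelled in a few lines. This is a toy simulation, not real SS7 signalling; it only illustrates the core flaw, which is that the network trusts whoever last registered a delivery route for a subscriber:

```python
# Toy model (NOT real SS7) of the man-in-the-middle rerouting described:
# the attacker overwrites the network's routing entry for the victim,
# records traffic, then forwards it so nothing looks wrong.
class ToyNetwork:
    def __init__(self):
        self.routes = {}          # subscriber -> delivery endpoint

    def register(self, subscriber, endpoint):
        # Real SS7 trusts location updates from any connected peer;
        # that misplaced trust is the flaw being modelled here.
        self.routes[subscriber] = endpoint

    def deliver(self, subscriber, message):
        self.routes[subscriber](message)

network = ToyNetwork()
victim_inbox, attacker_log = [], []

network.register("+61400000000", victim_inbox.append)   # legitimate route

def intercept(message):
    attacker_log.append(message)          # record a copy
    victim_inbox.append(message)          # forward, so the victim suspects nothing

network.register("+61400000000", intercept)   # attacker re-registers the victim
network.deliver("+61400000000", "bank SMS code: 123456")
print(attacker_log)   # attacker has the code; the victim still received it
```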

It’s worth noting, however, that the German hackers who carried out the demonstration, in which they intercepted and recorded a conversation between a 60 Minutes reporter and independent Australian senator Nick Xenophon, were given legal access to SS7 by the government, something most hackers won’t have.

Even so, the disclosures have led to calls for an immediate public inquiry in Australia, amid concerns that the security and intelligence services have long been aware of the SS7 security vulnerabilities.

Senator Xenophon said in response to the report: “This is actually quite shocking because it affects everyone. It means anyone with a mobile phone can be hacked, can be bugged, can be harassed.

“The implications are enormous and what we find shocking is that the security services, the intelligence services, they know about this vulnerability.”

What’s more, security outfit Adaptive Mobile said that such flaws should be taken seriously, as attacks can be launched anywhere in the world on any individual connected to the global SS7 network.

The firm published a blog post following the high-profile attack on Hacking Team when it first became concerned about SS7.

“Security in the SS7 network has become of paramount importance for the mobile community, so knowing how these surveillance companies regard and use SS7 is essential,” Adaptive Mobile said.

“Based on the information that has become available, it seems that there is a wider group of commercial entities selling systems that allow surveillance over SS7, and that these systems are for offer today.”

Courtesy-TheInq

U.S. Rule Over ICANN Extended

August 19, 2015 by mphillips  
Filed under Around The Net

The United States’ management of the Internet Corporation for Assigned Names and Numbers (ICANN), the coordinator of the Internet’s domain name system, will continue through September next year and perhaps even beyond.

The Internet’s global multi-stakeholder community needs time to complete its work, have the plan reviewed by the U.S. government and then put it into action if approved, the U.S. Department of Commerce said Monday.

The U.S. National Telecommunications and Information Administration (NTIA) said in March last year it planned to let its contract with ICANN to operate key domain-name functions expire in September 2015, passing the oversight of the agency to a global governance model.

The Internet Assigned Numbers Authority (IANA) functions, operated by ICANN under contract with the Department of Commerce, are responsible for the coordination of the DNS (Domain Name System) root, IP addressing and other Internet Protocol resources.

In May the department asked the groups developing the transition documents for an indication of how long it would take to finish and implement the proposals. The community estimated it would take until at least September next year, wrote Lawrence Strickling, NTIA administrator and assistant secretary of commerce for communications and information, in a blog post on Monday.

The Department of Commerce informed Congress on Friday that it plans to extend the IANA contract with ICANN for one year to Sept. 30, 2016. “Beyond 2016, we have options to extend the contract for up to three additional years if needed,” he added.

The move by the U.S. to hand over supervision of the IANA functions to a wider forum has raised concerns that other governments, some of them dictatorial, would take control of ICANN. Many countries have already asked for a greater say in managing the Internet.

Mozilla Working On Improving Stealth Mode For Firefox

August 18, 2015 by mphillips  
Filed under Around The Net

Mozilla has set a goal to make private browsing truly private.

The company is testing updates to private browsing in Firefox designed to block website elements that third parties could use to track browsing behavior across sites. Most major browsers, Firefox included, have a “Do Not Track” option, though many companies do not honor it.

Mozilla’s experimental tool is designed to block outside parties like ad networks or analytics companies from tracking users through cookies and browser fingerprinting.

It’s available in the Firefox Developer Edition on Windows, Mac and Linux, and Firefox Aurora on Android, Mozilla said.

The tool is in pre-beta, but it may be incorporated into future versions of the main Firefox browser.

The tool might cause some data-hungry websites to not load properly, Mozilla said. Users can unblock specific websites if they wish.
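The blocking logic described above boils down to a simple rule: a request is stopped when it is third-party relative to the page being visited and its host appears on a known tracker list. Here is a minimal sketch of that rule; the hostnames, the `TRACKER_HOSTS` set and the `should_block` helper are illustrative assumptions, not Mozilla's actual blocklist or code.

```python
# Sketch of list-based third-party tracker blocking.
# The blocklist entries below are made-up examples.
from urllib.parse import urlparse

TRACKER_HOSTS = {"ads.example-network.com", "metrics.example-analytics.net"}

def should_block(page_url: str, request_url: str) -> bool:
    """Block a request if it is third-party and its host is on the blocklist."""
    page_host = urlparse(page_url).hostname
    request_host = urlparse(request_url).hostname
    third_party = page_host != request_host
    return third_party and request_host in TRACKER_HOSTS
```

Note that a first-party request is never blocked, even when its host is on the list, which is why unblocking a specific site (as Firefox allows) is a per-page decision rather than a list edit.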

The enhancements also better identify unsafe browser add-ons that could install malware or collect user information.

“We’ve worked with developers and created a process that attempts to verify that add-ons installed in Firefox meet the guidelines and criteria we’ve developed to ensure they’re safer for you,” Mozilla said in a blog post.

Web tracking provides fuel to the lucrative business of targeted ads. A recent report showed that the usage of ad-blocking software is on the rise, costing publishers billions of dollars.

Other browser extensions designed to block tracking and targeted ads include Ghostery and AdBlock Plus.

The Electronic Frontier Foundation, meanwhile, is trying to develop a new standard for the “Do Not Track” browser setting to make it more effective.

 
More Details Uncovered On AMD’s ZEN Cores

August 17, 2015 by Michael  
Filed under Computing

Our well-informed industry sources have shared a few more details about AMD's 2016 Zen cores, and it now appears that the architecture won't use a shared FPU like Bulldozer.

The new Zen uses simultaneous multithreading (SMT), much like Intel's Hyper-Threading, so each core can process two threads at once. AMD has told a select few that it is dropping the "core pair" approach that was the foundation of Bulldozer, which means there will no longer be a shared FPU.

Zen will use a scheduling model similar to Intel's, and AMD will use competitive hardware and simulation to define any needed scheduling or NUMA changes.

Two cores will still share the L3 cache, but not the FPU. This is because at 14nm there is enough space for an FPU inside each Zen core, and this approach might be faster.

We mentioned this in late April, when we released a few details about the 16-core, 32-thread Zen-based processor with a Greenland-based graphics stream processor.

Zen will apparently be ISA-compatible with Haswell/Broadwell-style compute, so existing software will run without requiring any programming changes.

Zen also focuses on various compiler optimisations, including GCC, with a target SPECint v6-based score at common compiler settings, and Microsoft Visual Studio, with a target of parity with Intel in supported ISA features.

For benchmarking, the performance compiler LLVM targets a SPECint v6 rate score at performance compiler settings.

We cannot predict any instructions-per-clock (IPC) improvement over Intel's Skylake, but it helps that Intel is set to follow Skylake with another 14nm processor in the latter part of 2016. If Zen makes it to market in 2016, AMD might have a fighting chance to narrow the performance gap with Intel's greatest offerings.

Courtesy-Fud

Dropbox Beefs Up Security

August 14, 2015 by mphillips  
Filed under Around The Net

Two-factor authentication is widely regarded as a best practice for security in the online world, but Dropbox has announced a new feature that’s designed to make it even more secure.

Whereas two-step verification most commonly involves the user's phone as the second authentication method, Dropbox now lets users authenticate with Universal 2nd Factor (U2F) security keys instead.

What that means is that users can now use a USB key as an additional means to prove who they are.

“This is a very good advancement and adds extra security over mobile notifications for two-factor authentication,” said Rich Mogull, Securosis CEO.

“Basically, you can’t trick a user into typing in credentials,” Mogull explained. “The attacker has to compromise the exact machine the user is on.”

For most users, phone-based, two-factor authentication is “totally fine,” he said. “But this is a better option in high-security environments and is a good example of where the FIDO standard is headed.”

Security keys provide stronger defense against credential-theft attacks like phishing, Dropbox said.

“Even if you’re using two-step verification with your phone, some sophisticated attackers can still use fake Dropbox websites to lure you into entering your password and verification code,” the company explained in a blog post. “They can then use this information to access your account.”

Security keys, on the other hand, use cryptographic communication and will only work when the user is signing in to the legitimate Dropbox website.
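The reason this defeats fake-website attacks is that the browser includes the site's origin in the data the key signs, so an assertion produced while on a phishing domain can never verify against the real service. Here is a rough sketch of that origin binding; real U2F uses ECDSA key pairs and registered key handles, so the HMAC, the `DEVICE_SECRET` value and the phishing hostname below are stand-in assumptions for illustration only.

```python
# Sketch of U2F-style origin binding: the signature covers both the
# server's challenge and the origin the browser says it is talking to.
import hashlib
import hmac

DEVICE_SECRET = b"per-site-key-material"  # held inside the USB key

def sign_assertion(origin: str, challenge: bytes) -> bytes:
    """The key signs the server's challenge together with the browser-reported origin."""
    return hmac.new(DEVICE_SECRET, origin.encode() + challenge, hashlib.sha256).digest()

def verify(expected_origin: str, challenge: bytes, signature: bytes) -> bool:
    """The server only accepts signatures computed over its own origin."""
    expected = hmac.new(DEVICE_SECRET, expected_origin.encode() + challenge,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)
```

Even if a victim plugs in their key on a look-alike site, the signature is bound to the fake origin and is rejected by the legitimate server, which is exactly the property a stolen verification code lacks.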

Dropbox users who want to use the new feature will need a security key that follows the FIDO Alliance’s Universal 2nd Factor (U2F) standard. That U2F key can then be set up with the user’s Dropbox account along with any other U2F-enabled services, such as Google.


Will Apple Release A Cheaper iPhone?

August 14, 2015 by Michael  
Filed under Computing

Apple is about to spike plans to make a cheaper, plastic iPhone 6C.

The Tame Apple Press became all moist at news that Apple was going to put a plastic body and a 4in screen in an iPhone 6C. This would mean that Apple would not only have three phones coming out this year, but would actually have one that it could sell in cheaper markets.

We have heard that logic before, and it never really worked. And now it looks like Apple has abandoned the plan (if it even had it in the first place).

A marketing firm claims it has seen testing data for just two new iPhones, which strongly suggests that an iPhone 6C launch is not imminent.

Fiksu had access to data that shows identifiers for models in testing. Its logs recently showed two new iPhones, which showed up as "iphone8,1" and "iphone8,2" – most likely codenames for the upcoming iPhone 6s (or 7, depending on Apple's choice of moniker) and the iPhone 6s Plus (or 7 Plus).

If the phone is launched, it might be at a much later date, but so far it looks like Apple will stick to launching just two models.

Courtesy-Fud