Kingston’s HyperX Division Deals A Blow To The SSD Market

April 29, 2015 by Michael  
Filed under Computing

Kingston’s HyperX division has announced the arrival of its Savage SATA drive for the premium and gaming laptop market.

The SATA3-based 2.5in solid-state drive (SSD) is built around the Phison S10 quad-core, eight-channel controller, making it capable of read speeds of up to 560MB/s and write speeds of up to 530MB/s.

Input/output operations per second (IOPS) clock in at up to 100,000 for reads and 89,000 for writes, with a life expectancy of one million hours between failures.

“We are proud to introduce the HyperX Savage as our latest and fastest SATA-based SSD,” said Tony Hollingsbee, business manager for SSDs at Kingston.

“The HyperX 3K drive has been a core part of our SSD offerings since 2012, and now with Savage we are unleashing even higher performance and capacities to satisfy the most demanding consumers, enthusiasts and gamers.”

Available in capacities of 120GB to 960GB, the drive will be available as standalone, or in an upgrade kit with USB 3.0 enclosure, bracket and mounting screws, SATA data cable, cloning software, multi-bit screwdriver and an adapter that allows the 7mm thick drive to work in older 9mm cavities.

The price includes a three-year warranty with tech support, and the drive is fully backwards compatible with SATA2, so you’ll still see some of the benefit in an older configuration.

Kingston has been gradually augmenting its HyperX division in recent years. The company’s DDR4 chips broke the overclocking world record with a frequency of 4351MHz using a 3333MHz module and a lot of coolant.

In a separate experiment, a previous SSD iteration, the Kingston HyperX 3K, became one of two drive models still functioning after rewrites of more than 2PB, far more than was ever intended and suggesting a lifespan of hundreds of years under normal use.

The company has also been augmenting its line of gaming accessories with the HyperX Cloud II headset, with its own built-in 7.1 surround sound chip, which debuted at the start of the year.


Microsoft May Take Massive Write-off From Nokia Acquisition

April 28, 2015 by mphillips  
Filed under Around The Net

Microsoft has hinted that it may take a huge write-off of its Nokia acquisition, perhaps as early as July.

In the 10-Q filed with the U.S. Securities and Exchange Commission (SEC) last week, Microsoft said that its Phone Hardware division, which is based largely on the Nokia assets acquired last year for approximately $7.9 billion, lost money in the March quarter.

With revenue at $1.4 billion for the period, Microsoft said, cost of revenue exceeded sales by $4 million, meaning the company lost about 12 cents — even before marketing, R&D and other expenses were factored in — on each phone sold.
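As a quick sanity check, those figures imply a unit count of roughly 33 million phones for the quarter; the back-of-envelope calculation below uses that inferred number, which is not stated in the filing excerpt quoted here.

```python
# Back-of-envelope check on the per-phone loss implied by the 10-Q figures.
# The unit count is inferred from the article's numbers, not taken from the filing.
gross_loss = 4_000_000       # cost of revenue exceeded sales by $4 million (USD)
units_sold = 33_300_000      # inferred phone shipments consistent with ~12 cents/phone

loss_per_phone = gross_loss / units_sold
print(f"Gross loss per phone: ${loss_per_phone:.2f}")  # ~$0.12, before marketing and R&D
```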

More importantly, Microsoft also warned investors that it may need to write off some of the Nokia acquisition.

“Given its recent performance, the Phone Hardware reporting unit is at an elevated risk of impairment,” Microsoft said, using the accounting term for a situation in which the market value of a business is less than the value carried on the books. In such scenarios, corporations are required to balance the accounts by taking a charge against earnings equal to the difference.

“Declines in expected future cash flows, reduction in future unit volume growth rates, or an increase in the risk-adjusted discount rate used to estimate the fair value of the Phone Hardware reporting unit may result in a determination that an impairment adjustment is required, resulting in a potentially material charge to earnings [emphasis added],” the company continued.

Ben Thompson, an independent analyst who reported on Microsoft’s 10-Q statement on Friday, translated the accounting-speak. “A very, very big write-off, and associated quarterly loss, is coming soon. What a disaster!” Thompson wrote.

Microsoft currently carries $5.46 billion in “goodwill” from the Nokia acquisition on its books, as well as another $4.51 billion in intangible assets. The Redmond, Wash. company had attributed the Nokia goodwill to “increased synergies that are expected to be achieved from the integration of NDS [Nokia Corp.'s Devices and Services business].”
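To illustrate how such a charge would be sized, an impairment charge is simply the excess of carrying value over fair value. The sketch below uses the carrying values from the article and a purely hypothetical fair value; Microsoft had announced no write-off at the time.

```python
# Hypothetical impairment calculation. The carrying values come from the
# article; the fair value is an invented figure for illustration only.
goodwill = 5.46e9            # Nokia goodwill on Microsoft's books, USD
intangibles = 4.51e9         # related intangible assets, USD
carrying_value = goodwill + intangibles

fair_value = 4.0e9           # hypothetical market value of the reporting unit

impairment_charge = max(0.0, carrying_value - fair_value)
print(f"Charge against earnings: ${impairment_charge / 1e9:.2f} billion")
```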

That value may now be greatly overstated, Microsoft acknowledged.






Dell Working On 120Gbps Deep Packet Inspection Firewall

April 27, 2015 by Michael  
Filed under Computing

Dell’s security division has announced that it is working on a next-generation firewall (NGFW) that it claims is the first to deliver deep packet inspection (DPI) speeds of up to 120Gbps.

The company will demonstrate these speeds at the RSA conference in San Francisco this week, and said that the NGFW cluster enables an “easy migration path” for the future growth of enterprise networks.

Dubbed a “firewall sandwich” of high DPI performance, better security efficiency and N+1 resiliency, the NGFW architecture is also said to lower the cost of demanding data centre operations.

“SSL decryption and inspection are critical NGFW capabilities required to effectively uncover malware deeply hidden inside encrypted web sessions and provide deeper perimeter network security,” said Dell.

“In this network design, the Dell SuperMassive NGFW with onboard SSL decryption can be incrementally deployed and horizontally scaled infinitely to address SSL performance loss and increase SSL decryption and inspection performance.”

The company will show off the technology at RSA in collaboration with Array Networks and Spirent Communications to give a demo of a highly resilient, scalable ‘Open Firewall Sandwich’ layer 3 architecture.

Dell will be joined by Ixia in demonstrating a network-based model for scaling the NGFW with DPI speeds of above 100Gbps.

Dell also unveiled several updates to the SecureWorks offering, which it claims will help firms increase network security and grow their business.

Updates include improved services in Dell Secure Mobile Access (SMA) solutions to increase mobile productivity for remote workers while protecting critical data from cyber threats.

The new SMA 11.2 release adds secure access to more resources using a standard HTML 5 browser, which Dell said allows easier access for most smartphone, tablet and laptop users while reducing reliance on Java and ActiveX components.

The new release adds HTML 5 browser access to Citrix XenDesktop and XenApp ICA support.

Dell said that new SMA 6200 and 7200 appliances also offer increased scalability. The SMA 6200 entry-level platform supports up to 2,000 concurrent users, while the SMA 7200 mid-range platform supports up to 10,000 concurrent users.

The SMA updates arrive six months after Dell revealed the SuperMassive 9800 firewall, which it claimed would protect against high-profile bugs such as Shellshock and Heartbleed.

Touted at the time as the most powerful in the SuperMassive 9000 line-up, the 9800 offered Dell’s Reassembly-Free DPI single-pass threat prevention engine, and advanced DPI with speeds up to 20Gbps. That’s a whopping 100Gbps less than the speed it is about to go for at RSA.


Mozilla’s Firefox Coming To iPhone

April 21, 2015 by mphillips  
Filed under Mobile

Mozilla will offer Firefox for Apple’s iPhone “soon,” according to a company announcement of an open marketing position.

As the senior mobile marketing manager, the candidate will “lead marketing for Firefox on both Android and iOS,” the listing stated, adding that “a new Firefox for iOS application [will be] arriving soon.”

Mozilla, which had previously staunchly declined to create a version of its iconic browser for iOS, changed its tune last December, when a company manager said that the open-source developer would “get Firefox on iOS.”

Although Mozilla confirmed that it was working on Firefox for iOS, at the time it gave no hint of a timeline. “We are in the early stages of experimenting with something that allows iOS users to be able to choose a Firefox-like experience,” Mozilla said in a Dec. 2 blog.

The phrase “Firefox-like experience” was crucial: Apple allows only those browsers into the App Store that are built atop its own rendering and JavaScript engines, WebKit and Nitro, which power Safari. Mozilla relies on its own technologies for both. Firefox on iOS, then, will be a user interface (UI) layer atop WebKit and Nitro.

Mozilla’s Github repository for iOS Firefox confirmed that.

The reasons for Mozilla’s renewed interest in iOS likely stem from Firefox’s decline in browser user share. Over the last 12 months, Firefox has shed 31% of its desktop user share by metrics vendor Net Applications’ count, and now has less than half the share of Google’s Chrome.

Mozilla has put its shoulder behind other mobile initiatives. But Firefox OS, an open-source mobile operating system based on the browser, has not yet gained significant traction and its Firefox browser for Android hasn’t moved the needle. According to Net Applications, Firefox’s usage share on mobile was just 0.7% last month, or about one sixty-sixth that of Safari.




Qualcomm Gives Snapdragon More Oomph

April 20, 2015 by Michael  
Filed under Computing

Qualcomm has released a new Trepn Profiler app for Android which will profile Snapdragon processors and tinker with them.

The Trepn Profiler app identifies apps that overwork the CPU or eat too much data, and pinpoints which apps drain the battery fastest.

The data gathered by the app can tell you which programs are slowing down your phone.

Most Android phone users will not give a damn, but developers will find it useful. Those who are interested in testing ROMs, custom kernels and their own apps can use the data gathered by the Trepn Profiler.

Developers can measure optimisation and performance on Snapdragon-powered mobile devices. Real-time data includes network usage, battery power, GPU frequency and load, and per-core CPU load. Key features also include six fast-loading profiling presets and an advanced mode for manually selecting data points and saving them for analysis.

The advanced mode allows profiling of a single app or the whole device, offline data analysis, and adjustment of the data collection interval. It also allows longer profiling sessions, displaying two data points in one overlay, and viewing of profile data.

All up, this should enable developers to come up with more Snapdragon-friendly apps.


AMD To Power Samsung’s Digital Media

April 20, 2015 by Michael  
Filed under Computing

AMD’s Embedded R-Series accelerated processing unit, previously codenamed “Bald Eagle,” is powering Samsung’s latest set-back-box digital media players.

Bald Eagle was designed for high performance at low power with broad connectivity but mostly for digital signage.

It seems that the new Samsung SBB-B64DV4 is intended for demanding signage applications that transform Samsung SMART Signage Displays into digital tools for a wide range of business needs.

The chipmaker claimed that, by using its Embedded R-Series APUs, Samsung SBB media players for digital signage can deliver HD graphics performance and support multiple video streams across up to two displays, in a power-efficient and ultra-compact form factor.

Scott Aylor, corporate vice president and general manager, AMD Embedded Solutions said that digital signage is a key vertical for the AMD Embedded business.

“The AMD Embedded R-Series APU enables leading digital signage providers to harness high levels of compute and graphics performance within a low-power design envelope. AMD Embedded Solutions help designers at Samsung achieve aggressive form factor goals and drive down system costs while providing the rich multimedia their digital signage customers demand,” he said.

The AMD Embedded RX-425BB APU combines an x86 CPU with an integrated, discrete-class AMD Radeon R6 graphics processing unit in a low-power configuration to minimize heat dissipation constraints and meet energy efficiency requirements.

The processor uses AMD’s latest Graphics Core Next architecture, created for advanced graphics applications and parallel processing capabilities.



Microsoft Unveils Office Touch Apps For Windows Phones

April 20, 2015 by mphillips  
Filed under Mobile

Microsoft Corp has finally rolled out a long-awaited suite of touch-friendly Office apps that allow Windows phone users to work on Word, PowerPoint and Excel documents on their phones with touch commands and to transfer them easily between devices.

Test versions of what Microsoft is calling its Office Universal apps are available to download immediately and full versions will be available by the end of the month, Microsoft said.

Many Office users have waited months for Microsoft to introduce the apps, which adapt their look and commands to the device being used, whether Windows Phone or tablet.

Microsoft, in a departure from tradition, has already released similar touch-friendly Office apps for Apple Inc’s  iPad and iPhone, and for tablets running Google Inc’s Android.

The company’s reasoning was that those popular devices, which have dominated mobile computing, represented a bigger and more lucrative market for its Office products than its own Windows mobile devices.

Basic functions are free for everyone, but for advanced editing features, users must pay for a subscription to Office 365, Microsoft’s cloud-based version of Office.

Microsoft is set to release a new version of Office for desktop PCs, and a new version of Windows, later this year.



Will Moore’s Law Become More Important In The Next Twenty Years?

April 15, 2015 by Michael  
Filed under Computing

Moore’s Law will be more relevant in the 20 years to come than it was in the past 50 as the Internet of Things (IoT) creeps into our lives, Intel has predicted.

The chip maker is marking the upcoming 50th anniversary of Moore’s Law on 19 April by asserting that the best is yet to come, and that the law will become more relevant in the next two decades as everyday objects become smaller, smarter and connected.

Moore’s Law has long been touted as responsible for most of the advances in the digital age, from personal computers to supercomputers, despite Intel admitting in the past that it wasn’t enough.

Named after Gordon Moore, co-founder of Intel and Fairchild Semiconductor, Moore’s Law is the observation that the number of transistors in a dense integrated circuit will double approximately every two years.

Moore wrote a paper in 1965 describing a doubling every year in the number of components per integrated circuit. He revised the forecast in 1975, doubling the time to two years, and his prediction has proved accurate.
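The compounding behind that forecast is easy to underestimate; a minimal sketch of the doubling model looks like this (the starting count of 1 is illustrative, not a specific chip):

```python
# Moore's Law as a simple doubling model: component counts double roughly
# every two years (Moore's revised 1975 forecast).
def transistor_count(start_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Projected count after `years`, doubling every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# Fifty years at a two-year doubling period is 2**25, a ~33.5-million-fold increase.
print(f"Growth over 50 years: {transistor_count(1, 50):,.0f}x")
```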

The law now is used in the semiconductor industry to guide long-term planning and to set targets for research and development.

Many digital electronic devices and manufacturing developments are strongly linked to Moore’s Law, whether it’s microprocessor prices, memory capacity or sensors, all improving at roughly the same rate.

More recently, Intel announced the development of 3D NAND memory, which the company said was guided by Moore’s Law.

Intel senior fellow Mark Bohr said on a recent press call that, while Moore’s Law has been going strong for 50 years, he doesn’t see it slowing down, adding that Moore himself didn’t realise it would hold true for 50 years. Rivals such as AMD have also had their doubts.

“[Moore] thought it would push electronics into new spaces but didn’t realise how profound this would be, for example, the coming of the internet,” said Bohr.

“If you’re 20-something [the law] might seem somewhat remote and irrelevant to you, but it will be more relevant in the next 20 years than it was in the past 50, and may even dwarf this importance.

“We can see about 10 years ahead, so our research group has identified some promising options [for 7nm and 5nm] not yet fully developed, but we think we can continue Moore’s Law for at least another 10 years.”

Intel believes that upcoming tech will be so commonplace that it won’t even be a ‘thing’ anymore. It will “disappear” into all the places we inhabit and into clothing, into ingestible devices that improve our health, for example, and “it will just become part of our surroundings” without us even noticing it.

“We are moving to the last squares in the chess board, shrinking tech and making it more power efficient meaning it can go into everything around us,” said Bohr.

The Intel fellow described the law as a positive move forward, but he also believes that we need to think hard about security once products become smart, as they can become targets for digital attacks.

“Once you put intelligence in every object round you, the digital becomes physical. [For example] if your toaster becomes connected and gets a virus it’s an issue, but not so important as if your car does,” he said.

“We have to think how we secure these endpoints and make sure security and privacy are considered upfront and built into everything we deploy.”

Bohr explained that continuing Moore’s Law isn’t just a matter of making chips smaller; the technology industry has to continually innovate device structures to ensure that it continues.

“Moore’s Law is exponential and you haven’t seen anything yet. The best is yet to come. I’m glad to hand off to the next generation entering the workforce; to create new exciting experiences, products and services to affect the lives of billions of people on the planet,” added Bohr.

“Moore’s Law is the North Star guiding Intel. It is the driving force for the people working at Intel to continue the path of Gordon’s vision, and will help enable emerging generations of inventors, entrepreneurs and leaders to re-imagine the future.”


Chromebooks With Intel’s Braswell Chips Coming Soon

April 8, 2015 by mphillips  
Filed under Computing

A new crop of low-cost Chromebooks will be unveiled, powered by the Intel Braswell chips that are expected to debut later this week.

The new Braswell chips include new Celeron and Pentium processors, which will support both Chrome OS and Windows, said sources familiar with Intel’s product plans. More details on Braswell will be shared at the Intel Developer Forum in Shenzhen this week.

New Chromebooks running Braswell are expected in the coming months from top PC makers, as well as from low-cost manufacturers in China that might bring the price point down to less than $200. Braswell will also appear in low-cost Windows laptops, desktops and tablets.

Intel first announced the Braswell chips a year ago, but shipments were delayed due to problems with the company’s 14-nanometer manufacturing process.

Chromebooks, favored by some who do most of their computing on the Internet, are powered by a range of Intel or ARM processors. Most Chromebooks priced from $200 to $300 have aging Celeron processors based on the Bay Trail architecture, which Braswell will replace. The fastest and most expensive Chromebooks, such as Google’s Chromebook Pixel, have Intel’s Core chips, which pack more horsepower than Celeron or Pentium processors.

The new Celeron and Pentium chips could also be Intel’s answer to last week’s release of sub-$200 ARM-based Chromebooks from Haier, HiSense and Asustek. Chromebook shipments are rising in a flat PC market, and have become a new battleground for Intel and ARM, who also compete in servers and mobile devices.

Braswell should deliver better graphics performance, though battery life may not get a boost. The chips may be a good fit for Chromebooks, in which the speed of the wireless connection matters most, since the bulk of processing happens not locally but on remote servers hosting applications. That may change as Google makes more offline-capable applications available.



Stellar Evolution’s Missing Link Found In Young Star

April 8, 2015 by Michael  
Filed under Around The Net

A massive young star may prove to be the missing link between two stages of star formation.

While most stars have winds that pile the gas around them into columns streaming from their poles, some stars expel spherical winds of expanding material. A real-time study over almost two decades reveals for the first time a star in the process of changing from spherical winds of charged particles to streaming columns of them, linking the two structures together.

Describing how scientists understood stars with spherical expanding winds, Carlos Carrasco-González, of the Centre of Radio Astronomy and Astrophysics in Mexico, said, “We were speculating that these stars were in a younger stage, and that they would develop collimated winds in the future. But this has been proposed by theoretical works, and we had not actually obtained proof of this.” Carrasco-González served as lead author on a study that examined the massive young star W75N(B)-VLA2 over 18 years, and a second study that examined the star in 2014.

“With this work, we have obtained a link between the two stages, the spherical and the collimated one,” Carrasco-González said.

Sunlike stars are abundant and easily observable in the galaxy, so formation theories about them are fairly well established. But massive young stars remain more challenging. Because they are rare, these stars tend to lie farther from the sun, making them harder to observe in great detail. Furthermore, they are often embedded in the dusty clouds where they form, making them difficult to observe in the optical wavelengths for telescopes like NASA’s Hubble Space Telescope.

As a result, a number of theoretical problems plague scientists’ understanding of how these stars form.

“The main problem is that the strong light that arises from these massive protostars can push out the material which is falling onto it, and at some point the star stops growing,” Carrasco-González said.

According to theory, this growth ends before the star becomes the kind of behemoth that scientists observe. Yet scientists are observing these stars, so some physical mechanism must allow the objects to continue to gather material before pushing it all away.

In 1996, Carrasco-González and his team used the Karl G. Jansky Very Large Array (JVLA) to observe a massive young star located 4,200 light-years from Earth. At the time, the star had a circular ring of material around it stretching 185 astronomical units in diameter. (An astronomical unit is the distance from the Earth to the sun — 93 million miles or 150 million kilometers). Scientists interpreted the disk as material heated up by the circular winds flowing evenly from the star.

While the scientists continued to study other characteristics of the star with different instruments, it wasn’t until 2014 that the team used the JVLA again, and realized the star had changed significantly. The new image revealed that the material no longer encircled the star; instead, an elongated jet of material stretched outward. The estimated age for the expanding shell is about 25 years.

For a star, with a lifetime of tens of billions of years, a quarter of a century is barely the blink of an eye. So, these observations allowed astronomers the rare opportunity to study star evolution in real time.

“If the changes are due to either the turn-on of a new jet or a blob of gas being ejected in the jet, then these would be very rare events,” Melvin Hoare, of the University of Leeds in the United Kingdom, said by email. Hoare, who was not part of the research, wrote the Perspectives article that accompanied Carrasco-González’s research.

“The likelihood of catching one is rare,” Hoare wrote.

The research and the Perspectives article were both published online today (Thursday, April 2) in the journal Science.

Most stars tend to emit strong winds, though these winds may originate from a variety of processes. The magnetic field may play a role in extracting material from the stellar atmosphere, as is the case of the sun, or in gathering material from the surrounding disk of material.

Massive young stars are hot and bright; W75N(B)-VLA2 shines about 300 times as brightly as the sun. Because it is a form of energy, the starlight pushes against the cold molecular cloud of material that surrounds it, heating and exciting it to create the signatures Carrasco-González and his team observed over time.

When the jet of wind hits the cold material, it forms a bow shock as it slows down, like a wave breaking off the front of a boat. Slowly, it pushes the material away. Eventually, the cloud of material stretches from its circular physique to create an outflow along its rotation axis, the axis around which the star spins.

The winds themselves may be sporadic, occurring at random times, or they may occur periodically, repeating on a regular schedule. Because VLA2 is part of a three-star system, Carrasco-González suggests that the occurrence is periodic, taking place as the stars draw closer together, allowing the winds to become stronger.

“We think that the behavior observed in this star should also be periodic because, if not, we would be very lucky to catch this moment,” he said.

In other words, because the process lasts for less than two decades, it is very unlikely to be observed if it is a random event. On the other hand, if the episodes are periodic, behaving in the same way at different points in time, “it is more likely to be observed,” Carrasco-González said.

The team cautions that the change may not be as radical as it appears when studying the images. After the hot, young star was observed in 1996, the JVLA underwent an upgrade that allows it to take a more in-depth view of the signature of the ionized winds. Therefore, it is possible that the wind was blowing a column that the instruments simply couldn’t measure 18 years ago. However, if the star had already begun to form a column along its axis, that column would have been weak, the team said in its research.

If the hot, young star is truly evolving, then it has a good chance of helping scientists improve their understanding of how the winds evolve.

“The next step should be to study the behavior of the magnetic field in this star,” Carrasco-González said.

“We know from theoretical models and some observational studies that magnetic fields should play an important role in the formation and collimation of outflows,” he said. “But we still do not have good observational information on how magnetic fields work in these winds.”



Microsoft Has 1.5M People Testing Windows 10 Preview

April 8, 2015 by mphillips  
Filed under Computing

Although Microsoft has stated its Windows 10 preview program has some 2.8 million participants, just over half of those are using the early version on a regular basis, according to Web metrics estimates.

For March, U.S.-based analytics company Net Applications pegged Windows 10’s user share, a rough projection of the fraction of online users running the OS, at 0.09%, or nine PC users out of every 10,000.

That represented 0.1% of all Windows PC users; the slight difference is due to the fact that Windows does not account for all personal computer operating systems, but instead about 91% of them.

Windows 10’s user share of all Windows PCs last month was slightly less than Windows 8’s 0.12% in March 2012, seven months before the latter’s official launch. Microsoft ran a preview program for Windows 8 in the year prior to the OS’s October 2012 on-sale date.

This time, however, Microsoft will ship the upgrade at least one, perhaps several, months sooner on the calendar than it did Windows 8: The Redmond, Wash. company has promised to release Windows 10 this summer, a wide window that could mean as early as June or as late as September.

Windows 10’s user share translated into approximately 1.5 million users, assuming about 1.52 billion Windows PCs in operation across the globe.
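The estimate reduces to a single multiplication; reproducing it with the two figures from the article:

```python
# Reproducing the article's estimate: 0.1% of roughly 1.52 billion
# Windows PCs in operation worldwide.
windows_pcs = 1.52e9
win10_share = 0.001          # 0.1% of all Windows PCs

win10_users = windows_pcs * win10_share
print(f"Estimated Windows 10 users: {win10_users / 1e6:.2f} million")  # ~1.52 million
```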

Microsoft last gave an Insider participant count a month ago, when Stephen Elop, formerly the CEO of Nokia and now the head of Microsoft’s Device and Services division, said there were 2.8 million registered testers.

The gap between 2.8 million Insider participants and 1.5 million Windows 10 users is not unusual in beta testing circles: More people download and try a preview than run it regularly after that.




Did AMD Commit Fraud?

April 6, 2015 by Michael  
Filed under Computing

AMD must face claims that it committed securities fraud by hiding problems with the bungled 2011 launch of Llano that eventually led to a $100 million write-down, a US court has decided.

According to Techeye, US District Judge Yvonne Gonzalez Rogers said the plaintiffs had a case that AMD officials misled them with statements made in the spring of 2011, and the company will have to face a full trial.

The lawsuit was over the Llano chip, which AMD had claimed was “the most impressive processor in history.”

AMD originally said that the product launch would happen in the fourth quarter of 2010, but sales of the Llano were delayed because of problems at the company’s chip manufacturing plant.

The then Chief Financial Officer Thomas Seifert told analysts on an April 2011 conference call that problems with chip production for the Llano were in the past, and that the company would have ample product for a launch in the second quarter.

Press officers for AMD continued to insist that there were no problems with supply, concealing the fact that it was only shipping Llanos to top-tier computer manufacturers because it did not have enough chips.

By the time AMD ramped up Llano shipments in late 2011, no one wanted them any more, leading to an inventory glut.

AMD disclosed in October 2012 that it was writing down $100 million of Llano inventory as not shiftable.

Shares fell nearly 74 percent from a peak of $8.35 in March 2012 to a low of $2.18 in October 2012 when the market learned the extent of the problems with the Llano launch.
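The quoted decline checks out against the standard percent-change formula:

```python
# Verifying the quoted decline in AMD's share price.
peak = 8.35      # USD, March 2012 peak
trough = 2.18    # USD, October 2012 low

decline_pct = (peak - trough) / peak * 100
print(f"Decline: {decline_pct:.1f}%")  # ~73.9%, i.e. "nearly 74 percent"
```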


Windows 7 Continues To Gain User Share Of PC Market

April 3, 2015 by mphillips  
Filed under Computing

Windows 7 last month powered nearly two-thirds of all personal computers running a version of Microsoft’s OS, according to recently released data from researchers.

Net Applications’ monthly user share tracking — an estimate of the percentage of all systems that rely on a specific operating system — pegged Windows 7 at 63.7% of all Windows PCs in March.

That was a 2.6 percentage point jump from February.

The climb of Windows 7’s user share has been remarkable. An older operating system — Windows 7 debuted in 2009 — typically loses share when a successor appears on the scene. Even in the dark days of Windows Vista, the OS tagged as a flop for Microsoft, Vista stole share from the then-overwhelmingly-dominant Windows XP.

Instead, Windows 7 has gained significant user share since the October 2012 launch of Windows 8. In the intervening 29 months, Windows 7’s share of all Windows PCs has climbed nearly 15 percentage points, representing an increase of almost a third.

Notable, too, has been Windows 8/8.1’s stagnation: In the last four months, Microsoft’s latest OS has grown by just six-tenths of a percentage point, reaching 15.4% of all Windows PCs in March. In the same span, Windows 7’s share of all Windows machines jumped 2.2 points.
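The distinction between percentage points and relative growth is worth spelling out; a quick sketch, where the earlier ~49% share is inferred by subtracting the roughly 15-point gain from the current figure:

```python
# Percentage points vs relative increase for Windows 7's share of Windows PCs.
earlier_share = 49.0    # percent, inferred (63.7 minus the ~15-point gain)
current_share = 63.7    # percent, March figure from Net Applications

point_gain = current_share - earlier_share      # percentage points
relative_gain = point_gain / earlier_share      # fraction of the starting share
print(f"{point_gain:.1f} points, {relative_gain:.0%} relative increase")
```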

Microsoft would prefer that Windows 7 not repeat Windows XP’s trajectory. The 2001 OS still powered more than 30% of all Windows PCs in April 2014, when free support ceased.

However, analysts have already predicted that Windows 7 will reprise XP’s late-to-leave behavior. Net Applications’ data suggests that their forecasts are on the money.





Qualcomm Shuts Down Snapdragon 815 Rumor

April 2, 2015 by Michael  
Filed under Computing

A lot of rumors regarding an alleged upcoming Qualcomm Snapdragon 815 SoC have been floating around, and now the chipmaker has informed us that no such chip exists.

Qualcomm’s Senior Director of Public Relations Jon Carvill said that there is no Snapdragon 815 in the works. Carvill was clear:

“There are no plans for a Snapdragon 815 processor.”

Snapdragon 815 filed under creative journalism

The Snapdragon 815 rumours spread like wildfire, but since they didn’t make much sense, we decided not to carry them. Basically the alleged Snapdragon 815 was supposed to be a 16nm SoC with four Cortex-A72 and Cortex-A53 cores, but the rest of the spec was hard to swallow.

Long story short, there is no such thing as a Snapdragon 815. The company never had such a product, and if you know a thing or two about SoC development, it takes years to make a new SoC design from scratch – you don’t just design a new one for a new node out of the blue.

It would be very convenient if the company managed to pull off something like this, but it’s simply not possible.

Qualcomm’s next flagship is the Snapdragon 820

Now that we’ve debunked this rumor, we should focus on Qualcomm’s real next-generation flagship SoC – the Snapdragon 820.

The company mentioned the Snapdragon 820 at the Mobile World Congress in Barcelona, but it looks like it will be a while before we see this chip shipping in actual devices. Qualcomm expects the new part to sample sometime in the second half of the year, so in the best-case scenario we might see the first devices by the end of the year, but most products based on the new chip will start shipping in early 2016.

The 20nm Snapdragon 810 is not overheating, it works just fine, and we tested it inside the HTC One M9. We can confirm that it ends up significantly faster than the Snapdragon 801, which we had a chance to try in a few phones.



Will Nvidia Release Pascal Early Next Year?

April 1, 2015 by Michael  
Filed under Computing

The world is still expecting the birth of the High Bandwidth Memory (HBM) boosted Fiji GPU, and the first cards based on the new chip should launch in late June, or the end of Q2 2015 if you prefer.

AMD is looking ahead, and its engineers are working hard on the company’s next-generation HBM card, currently codenamed Greenland. We are not sure if this is the name of the whole generation or simply a single HBM-backed GPU that will end up in APUs.

Like we said, we doubt that Fiji will actually launch on the Pacific island of Fiji or that the Greenland launch event will be held in Greenland (Denmark), but we can confirm that the Greenland GPU will use HBM memory. There is still no confirmation of the manufacturing process, but we would expect Greenland to end up on either GlobalFoundries’ 14nm process or TSMC’s 16nm process. Greenland will be a part of AMD’s next-generation K12 APU, which means that this multiple-Zen-core APU will get some great graphics performance. It is not clear whether Greenland is part of the Caribbean Islands (Fiji) generation or belongs to a successor generation.

At this time we cannot confirm (or deny) whether Greenland will launch as a desktop card too, and we can only speculate that Greenland is a shrunk derivative of the Fiji-generation architecture.

Nvidia’s first HBM card, based on Pascal, is coming by early 2016. Pascal will use the 2.5D HBM approach and probably HBM2 memory, and we expect that AMD’s Fiji successor will use HBM2 memory as well.

Details are limited, apart from the fact that Greenland can end up in a next-generation APU such as K12, making the architecture quite scalable. High Bandwidth Memory combined with new K12 cores might create the fastest integrated product of all time, and let’s not forget that AMD is putting a lot of emphasis on Heterogeneous System Architecture (HSA) and the compute side of things. With the help of an HBM-powered Greenland that could end up with 500GB/s of bandwidth, along with multiple Zen 64-bit CPU cores, you can expect quite a lot of compute performance from this new integrated chip.