While research groups like IDC and Gartner have shown an overall 15.6 percent decrease in worldwide tablet shipments in 2016, the market has not gone entirely belly-up, as Amazon continues to pull ahead with a phenomenal 99.4 percent increase in annual tablet shipments over the same period.
According to a report by the folks at TrendForce, Amazon managed to ship 11 million Fire-series tablets over the course of 2016 even as global tablet shipments fell by 6.6 percent from the previous year. While the sales numbers were impressive, the company still fell behind Apple at 27 percent of the market and Samsung at 17.2 percent, yet managed to beat expectations as a result of strong year-end holiday sales.
Apple also pulled ahead with strong tablet sales last year and retained its top spot, selling 42 million devices to Samsung’s 27 million. A few weeks ago, we wrote that IDC may have regretted telling the media that the fruit-themed device company’s tablets would drive traditional PC sales into decline by 2015. While traditional PC sales dropped 5.7 percent to 260.2 million units in 2016, they remain an impressive part of the overall device market and have not fallen as quickly as tablets have over the past year.
TrendForce expects tablet sales to continue declining from 157.4 million units in 2016 to around 147.8 million units in 2017. While Amazon nearly doubled its annual shipments and Apple enjoyed strong iPad sales over the holiday season, other brands such as Microsoft are expected to fall into 7th place as the company experiences panel shortages for its Surface Pro series.
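As a quick sanity check on the figures above, Amazon’s 11 million units against TrendForce’s 157.4 million total works out to roughly a 7 percent share, and the 2017 forecast implies a decline of around 6 percent:

```python
# Figures quoted above (TrendForce), in millions of units
total_2016 = 157.4
forecast_2017 = 147.8
amazon_2016 = 11.0

amazon_share = amazon_2016 / total_2016 * 100
decline = (forecast_2017 - total_2016) / total_2016 * 100

print(f"Amazon's 2016 share: {amazon_share:.1f}%")    # ~7.0%
print(f"Forecast change for 2017: {decline:.1f}%")    # ~-6.1%
```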
For a limited time, Amazon will occasionally offer its 7-inch 8GB Fire Essentials bundle and its 16GB Fire Essentials Bundle at discounted prices. For instance, the former had been available for $33.33 in November and $49.99 until earlier this month, along with free Prime shipping. The company is expected to offer similar deals throughout the year in an effort to strengthen its sales base from loyal Prime customers.
Open source’s Mr Sweary Linus Torvalds announced the general availability of the Linux 4.10 kernel series, which includes virtual GPU (Graphics Processing Unit) support.
“On the whole, 4.10 didn’t end up as small as it initially looked,” Linus wrote in the announcement.
The kernel has a lot of improvements, security features, and support for the newest hardware components, which make it more than just a routine update.
Most importantly, there is virtual GPU support, a new “perf c2c” tool that can be used for analysis of cacheline contention on NUMA systems, support for the L2/L3 caches of Intel processors (Intel Cache Allocation Technology), eBPF hooks for cgroups, hybrid block polling, and better writeback management.
A new “perf sched timehist” feature has been added in Linux kernel 4.10 to provide detailed history of task scheduling, and there’s experimental writeback cache and FAILFAST support for MD RAID5.
It looks like Ubuntu 17.04 will be the first stable OS to ship with Linux 4.10.
MediaTek is planning to launch the 10nm Helio X30 later this year, but news from Taiwan indicates that some key customers haven’t ordered the new flagship 10-core chip.
One of the main reasons might be increased competition in the Chinese market, where companies cannot afford to maintain two designs of the same phone, one with a Qualcomm chip and one with a MediaTek chip inside. The rumor is that Xiaomi, MediaTek’s big customer, might be coming up with its own Pinecone SoC, and this will put some additional pressure on MediaTek’s high end. There might be two Pinecone SoCs, targeted at the mainstream and high-end markets.
LeEco, another big MediaTek customer, is going through tough financial times and was not interested in making big orders. Oppo, the number one smartphone vendor in China, is usually a big customer. Another big one that usually goes with MediaTek is the current number three in China, Vivo. The number two, Huawei, has its own Kirin SoC, while the number four, the fruity Apple, has its own SoC.
Oppo is MediaTek’s big hope as is Vivo. Oppo and Vivo are expected to sell 120 million and 100 million smartphones respectively in 2017.
The upcoming Snapdragon 835 SoC is also going to give MediaTek some bother. It is shaping up to become one of the best, if not the best, phone SoCs of all time. MediaTek usually has a pricing advantage over most of its competitors, so it might compete against it on price.
The Helio X30 is manufactured by TSMC, building on MediaTek’s long relationship with the world’s biggest chip foundry, which sits across the street from MediaTek’s headquarters in Hsinchu, Taiwan. The end result might be a massive cancellation of 10nm wafer orders at TSMC, as there won’t be anyone who wants to buy them. The timing could not be worse, as this is the first time MediaTek has taken a leap of faith and bet the farm on the latest and greatest 10nm process. Now it looks like it will have to cancel a lot of its 10nm orders. Still, a few phones with the deca-core Helio X30 will hit the market.
The top modem providers are Intel and Qualcomm, whose cellular chips are used in the iPhone. Both have announced modems that will push LTE connections to speeds well over those of regular home internet connections.
Qualcomm unveiled the X20 LTE chipset, which can transfer data at speeds of up to 1.2Gbps. Intel announced the XMM 7560 LTE modem, which can download data at speeds of up to 1Gbps.
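To put those peak rates in perspective, here is a back-of-the-envelope calculation (ideal conditions only; real-world LTE throughput is far lower) of how long a 5GB download would take on each modem:

```python
def download_seconds(size_gb: float, link_gbps: float) -> float:
    """Ideal transfer time for a file of size_gb gigabytes over a
    link_gbps gigabit-per-second link (8 bits per byte, no overhead)."""
    return size_gb * 8 / link_gbps

print(download_seconds(5, 1.2))  # X20 peak rate: ~33 seconds
print(download_seconds(5, 1.0))  # XMM 7560 peak rate: 40 seconds
```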
However, cellular networks aren’t yet designed to handle such fast speeds. One exception is Telstra, an Australian telecommunications company, which has launched a gigabit LTE service for commercial use in that country.
Gigabit LTE will slowly start appearing in mobile devices and networks this year, said Jim McGregor, principal analyst at Tirias Research.
“This is making 4G what it was intended to be — a true wireless broadband solution,” McGregor said.
These performance bumps are important as users handle more data, McGregor said.
“We’ve seen this with microprocessors for years,” McGregor said.
Qualcomm said its Snapdragon X20 modem will become available next year, and McGregor estimated it will be in devices soon after. Intel said its XMM 7560 is ready, but couldn’t say when handsets would come out.
Most users may not need LTE speeds of 1.2Gbps, especially when using apps like Uber, Snapchat and WhatsApp. But more PCs are getting LTE connectivity, and could use the speed for high-end applications.
Qualcomm, a modem pioneer, is trying to stay a step ahead of Intel in the rat race to rev up LTE modems. Intel is speeding up modem development as wireless connectivity becomes an essential part of computing, said Aicha Evans, senior vice president and general manager of the Communication and Devices Group at Intel.
The new modems are also a stepping stone to 5G, the next-generation cellular network technology that Evans estimated could deliver speeds of more than 45Gbps. Beyond mobile devices, 5G will be used for machine-to-machine communications and will be a standard feature in a wide range of devices including PCs, robots, drones and internet of things devices.
The Snapdragon X20 LTE chipset is a CAT 18 modem and supports a wide range of cellular technologies that could make it work in most countries worldwide. The chip supports carrier aggregation and data transfers over multiple streams. It works with 40 cellular frequency bands and supports technologies like Voice over LTE (VoLTE) and LTE broadcast.
Intel’s XMM 7560 is a CAT 16 modem and supports carrier aggregation across multiple spectrums. The chip maker has also announced its first 5G modem, and says it now has silicon ready for that chip.
Never more than a stopgap that was hugely inadequate to the gap in question, Steam Greenlight is finally set to disappear entirely later this Spring. The service has been around for almost five years, and while it was largely greeted with enthusiasm, the reality has never justified that optimism. The amassing of community votes for game approval turned out to be no barrier to all manner of grafters who launched unfinished, amateurish games (even using stolen assets in some cases) on the service, but enough of a barrier to be frustrating and annoying for many genuine indie developers. As an attempt to figure out how to prevent a storefront from drowning in the torrent of rubbish that has flooded the likes of the App Store and Google Play, it was a worthy experiment, but not one that ought to have persisted for five years, really.
Moreover, Greenlight isn’t disappearing because Valve has solved this problem to its satisfaction. The replacement, Direct, is in some regards a step backwards; it’ll see developers being able to publish directly on the system simply by confirming their identity (company or personal) through submission of business documents and paying a fee for each game they submit. The fee in question hasn’t been decided yet, but Valve says it’s thinking about everything from $100 to $5000.
The impact of Direct is going to depend heavily on what that fee ends up being. It’s worth noting that developers for iOS, for example, already pay around $100 a year to be part of Apple’s developer programme, and trawling through the oceans of unloved and unwanted apps released on the App Store every day shows just how little that $100 price does to dissuade the worst kind of shovelware. At $5000, meanwhile, quite a lot of indie developers will find themselves priced out of Steam, especially those at the more arthouse end of the scene, or new creators getting started out. Ironically, though, the chances are that many of the cynical types behind borderline-scam games with ripped off assets and design will calculate that $5000 is a small price to pay for a shot at sales on Steam, especially if the high fees are thinning out the number of titles launching.
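The economics here are easy to sketch. Assuming a store revenue share of around 30 percent (the commonly cited figure; Valve has not confirmed what Direct’s cut will be) and a hypothetical $10 game, the number of copies a developer must sell just to recoup the submission fee looks like this:

```python
import math

def breakeven_copies(fee: float, price: float, store_cut: float = 0.30) -> int:
    """Copies needed just to recoup a submission fee, given the store's
    revenue share. The 30% cut is an assumption, not a confirmed figure."""
    net_per_copy = price * (1 - store_cut)
    return math.ceil(fee / net_per_copy)

print(breakeven_copies(100, 10.0))    # low-end fee: a handful of sales
print(breakeven_copies(5000, 10.0))   # high-end fee: hundreds of sales
```

At the $5000 end, a niche title that might only ever sell a few thousand copies would hand a large slice of its lifetime revenue straight back to the platform before earning anything.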
It’s worth noting that, for the majority of Steam’s consumers, the loss of arthouse indie games and fringe titles from new creators won’t be of huge concern. Steam, like all storefronts, sells huge numbers at the top end and that falls off rapidly as you come down the charts; the number of consumers who are actively engaging with smaller niche titles on the service is pretty small. However, that doesn’t mean that locking out those creators wouldn’t be damaging – both creatively and commercially.
Plenty of creators are actually making a living at the low end of the market; they’re not making fortunes or buying gigantic mansions to hang around being miserable in, but they’re making enough money from their games to sustain themselves and keep up their output. Often, they’re working in niches that have small audiences of devoted fans, and locking them out of Steam with high submission costs would both rob them of their income (there are quite a few creators out there for whom $5000 represents a large proportion of their average revenue from a game) and rob audiences of their output, or at least force them to look elsewhere.
Sometimes, a game from a creator like that becomes a break-out hit, the game the whole world is talking about for months on end – sometimes, but not very often. It’s tempting to argue that Steam should be careful about its “low-end” indies (a term I use in the commercial sense, not as any judgement of quality; there’s great, great stuff lurking around the bottom of the charts) because otherwise it risks missing the Next Big Thing, but that’s not really a good reason. Steam is just about too big to ignore, and the Next Big Thing will almost certainly end up on the platform anyway.
Rather, the question is over what Valve wants Steam to be. If it’s a platform for distributing big games to mainstream consumers, okay; it is what it is. If they’re serious about it being a broad church, though, an all-encompassing platform where you can flick seamlessly between AAA titles with budgets in the tens of millions and arthouse, niche games made as a labour of love by part-timers or indie dreamers, then Direct as described still doesn’t solve the essential conflict in that vision.
In replacing publishers with a storefront through which creators can directly launch products to consumers, Valve and other store operators have asserted the value of pure market forces over curation – the fine but flawed notion of greatness rising to the top while bad quality products sink to the bottom simply through the actions of consumers making buying choices. This, of course, doesn’t work in practice, partially because in the real world free markets are enormously constrained and distorted by factors like the paucity of information (a handful of screenshots and a trailer video doth not a perfectly informed and rational purchasing decision make), and more importantly because free markets can’t actually make effective assessments of something as subjective as the quality of a game.
Thus, even as their stores have become more and more inundated with tides of low quality titles – perhaps even to the extent of snuffing out genuinely good quality games – store operators have tried to apply algorithmic wizardry to shore up marketplaces they’ve created. Users can vote, and rate things; elements of old-fashioned curation have even been attempted, with rather limited success. Tweaks have been applied to the submission process at one end and the discovery process at the other. Nothing, as yet, presents a very satisfying solution.
One interesting possibility is that we’re going to see the pendulum start to swing back a little – from the extreme position of believing that Steam and its ilk would make publishers obsolete, to the as yet untested notion that digital storefronts will ultimately do a better job of democratising publishing than they have done of democratising development. We’ve already seen the rise of a handful of “boutique” publishers who specialise in working with indie developers to get their games onto digital platforms with the appropriate degree of PR and marketing support; if platforms like Steam start to put up barriers to entry, we can expect a lot more companies like that to spring up to act as middlemen.
Like the indie developers themselves, some will cater to specific niches, while others will be more mainstream, but ultimately they will all serve a kind of curation role; their value will lie not just in PR, marketing and finance, but also in the ability to say to platforms and consumers that somewhere along the line, a human being has looked at a game in depth and said “yes, this is a good game and we’re willing to take a risk on it.” There’s a value to that simple function that’s been all too readily dismissed in the excitement over Steam, the App Store and so on, and as issues of discovery and quality continue to plague those storefronts, that value is only becoming greater.
Whatever Valve ultimately decides to do with Direct – whether it sets a low price that essentially opens the floodgates, or a high one that leaves some developers unable to afford the cost of entry – it will not provide a panacea to Steam’s issues. It might, however, lay the ground for a fresh restructuring of the industry, one that returns emphasis to the publishing functions that were trampled underfoot in the initial indie gold-rush and, into the bargain, helps to provide consumers with clearer assurances of quality. A new breed of publisher may be the only answer to the problems created by storefronts we were once told were going to make publishers extinct.
The fatal clock timing flaw that causes switches, routers and security appliances to die after about 18 months of service is apparently a feature of some Juniper products.
Cisco was the first vendor to post a notice about the problem earlier this month, saying it covers some of the company’s most widely deployed products, such as certain models of its 4000 Series Integrated Services Routers, Nexus 9000 Series switches, ASA security devices and Meraki Cloud Managed Switches.
Juniper is telling its customers something similar:
“Although we believe the Juniper products with this component are performing normally as of February 13, 2017, the [listed] Juniper products could after the product has been in operation for at least 18 months begin to exhibit symptoms such as the inability to boot, or cease to operate. Recovery in the field is not possible. Juniper products with this supplier’s component were first placed into service on January 2016. Juniper is working with the component supplier to implement a remediation. In addition, Juniper’s spare parts depots will be purged and updated with remediated products.”
The products in the warning comprise 13 Juniper switches, routers and other products including the MPC7E 10G, MPC7E (multi rate), MX2K-MPC8E, EX 920 Ethernet switches and PTX3000 integrated photonic line card.
So far neither Cisco nor Juniper have blamed Intel for the fault. However, Chipzilla did describe a flaw on its Atom C2000 chip which is under the bonnet of shedloads of net gear.
Intel said that problems with its Atom chip would hurt its 2016 Q4 earnings. CFO Robert Swan said that Intel was seeing a product quality issue in the fourth quarter, with slightly higher expected failure rates under certain use and time constraints.
Swan said the problem would be addressed with a minor design fix, which Intel was working with its clients to roll out.
Intel had hoped to see the back of its short-lived low-power Atom server chips, which were used in micro servers as well as in networking equipment from several companies.
HPE and Dell are keeping quiet about the clock technology, though both are rumoured to use it. They might be hoping that Intel will come up with a fix so they can pretend it never happened.
Announced officially by AMD and to be held on February 28th at Ruby Skye in San Francisco, the new Capsaicin and Cream event promises “a feature-packed show highlighting the hottest new graphics and VR technologies propelling the games industry forward”.
Streamed live, the event will include the main Capsaicin & Cream segment, which will hopefully include a few more details on the actual lineup of graphics cards based on the new Vega GPU, as well as the Cream developer sessions, which promise “inspiring talks focused on rendering ideas and new paths forward, driven by game industry gurus from multiple companies including Epic and Unity”.
The event will start at 10:00 AM PST, while the livestream is scheduled to start at 10:30 AM PST (19:30 CET).
For many, the success of Resident Evil 7 and its atmospheric campaign has offered a glimpse of what a “killer app” for virtual reality might look like; the game that shifts the common perception of VR from an intriguing glimpse of the future, to an essential part of contemporary entertainment. The term will be familiar to anyone who has seen the launch of a new console, but, as a panel of experts discussed today at Casual Connect Europe, VR defies such easy categorization.
The discussion was triggered by nDreams CEO Patrick O’Luanaigh, who was in the crowd to watch a panel that included representatives from Valve and Nvidia. When asked to pin down his definition of the term “Killer App,” O’Luanaigh said, “it’s less about revenue, more something that everybody talks about. A lot of people say that VR hasn’t had that killer game yet.
“There’s lots of cool stuff out there, but nothing that really makes you feel, ‘Oh my god, this is so amazing, I have to go and buy a headset.’ We’re all saying that we want games like that to come, and as budgets go up hopefully that will happen. It’s really about where that game might come from.”
For Chet Faliszek, who has become the globe-trotting representative for Valve’s VR efforts, the very notion of a ‘Killer App’ seemed to belong more to traditional game hardware – the consoles made by Nintendo, Sega, Sony and Microsoft. “We have so few data points to extrapolate from to figure out what this is,” he said. “If we look to the consoles we might say, ‘You have to have your Mario, or your Sonic.’ But do you?”
Faliszek referred to a talk he gave the previous day, in which he suggested smartphones as a more appropriate comparison for VR technology. “What was the killer app for the App Store?” he asked the crowd. “I would argue it was flexibility; the ability to become different for each person. If you’d have asked me 20 years ago what feature do I most want on my phone, I probably would say something about making phone calls; now I rarely make a phone call.”
Faliszek emphasized this point again, and suggested that some of the difficulty analysts have faced in grappling with the VR market relates to this kind of misunderstanding. “That’s why there’s slower growth in virtual reality than other people predicted – the analysts,” he continued. “Whereas I think people in the [VR] industry have the understanding that, if you demo ten individual things, out of those one person would say, ‘Why is this thing in there?’ And the next person would go, ‘That’s the best thing ever.’
“You have these personal reactions… Everybody finds that thing in there that they want to have.”
It was telling that, when asked about the most impressive applications for virtual reality right now, Faliszek listed tools for creativity: Google’s Tilt Brush, and the VR development capabilities offered by engines from Unity and Epic. There is a desire for a fully formed consumer market for VR to hurry up and arrive already, but the truth may be that, even a year after the launch of Oculus Rift and HTC Vive, the space is still best defined by its creators and the broad range of use cases they are attempting to discover.
However, one basic truth was mentioned on several occasions, starting with O’Luanaigh’s original question about the importance of positional head-tracking and motion controls becoming standard in mobile VR. These are core features of the current high-end of VR hardware – including, but not limited to, the HTC Vive – but Faliszek also believes this is the smartest target for any developer wanting to reach the largest possible audience.
“If you want to make the most money in VR, you should make [games] for the largest addressable market,” he said. “The largest addressable market right now may be headsets that are rotational only, but they will be museum pieces in a couple of years. If you make something that has positionally tracked head and motion controls you can probably still be selling that game years from now – or some version of that. If you did rotational only? Someone has to pull a headset out of the closet to experience that. The shelf life of that product is going to be much shorter.”
Faliszek made a similar point the day before, advising Casual Connect’s attendees that, “today’s high-end becomes tomorrow’s mainstream. If you really want to think about the largest addressable market, it’s not about the number of headsets out there for any one platform. It’s what will become the standard. If you develop for the high-end, you know that’s going to have the longest tail.”
Despite the probable advantage in the number of headset owners, then, mobile VR may have to reach a better technological standard to be a better commercial opportunity. No part of the VR market offers a huge installed base at present anyway, and, as Faliszek pointed out, “a game that works on 5 million [mobile] headsets this year isn’t necessarily going to work on 50 million headsets in a few years’ time.”
It has been reported a few times that Zen and its desktop incarnation Ryzen are a crucial part of AMD’s future strategy. The fact that our sources confirm that Ryzen will compete well against the Core i7 Extreme Edition will definitely help AMD’s stock.
John Taylor, AMD’s Corporate Vice President of Worldwide Marketing, showed Zen running at Computex on June 1st 2016, and the stock market reacted favorably. AMD stock has grown tremendously, from roughly $1.90 USD in early January last year to $13.42 USD now, and the price could well rise further.
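For the record, the share price figures quoted above amount to roughly a sevenfold rise over the year:

```python
# Share prices quoted above, in USD
start, now = 1.90, 13.42
gain_pct = (now / start - 1) * 100
print(f"{gain_pct:.0f}% gain in roughly a year")  # 606% gain
```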
It can be anticipated that Ryzen will be in high demand and that every AMD fan will want an AMD Zen based Ryzen machine. The reason is simple – people want AMD to succeed and the price will be much more competitive. We have readers in our community who never gave up hope that AMD would one day return to its K7 Athlon glory days. Well, Ryzen is the closest it has come to that goal.
AMD will quickly win back some desktop CPU market share, but we anticipate that demand will exceed supply. Wall Street likes what AMD has been doing and it will most likely react very favorably to Ryzen reviews and shipments.
Lisa Su, AMD’s CEO, has already confirmed that you can expect to see Ryzen shipping this quarter, and the closest we have heard to a launch date is the first few days of March. It is happening rather soon, and this is the single most important launch for AMD in the last decade. Intel is working on a response, but AMD fanboys will embrace Zen even if it ends up slightly slower than Intel’s parts.
The positive financial impact will help AMD become more competitive in both the CPU and GPU areas, which is great news for the market. Intel has been left almost unchallenged for long enough, and it is about to get a taste of its own medicine.
Oracle has decided that it is not going to give up trying to convince the world that Google owes it billions for Android software.
For the last seven years, Google and Oracle have been slugging it out over the copyright status of Java APIs, which Oracle insists are key to making Android run. The case has gone through two federal trials and bounced around the appeals courts, including a brief stop at the US Supreme Court. Oracle has sought as much as $9 billion in the case.
Other than one loss, which was successfully appealed, Google has won. Now Oracle’s lawyers have decided it is time for another round and filed an appeal with the US Court of Appeals for the Federal Circuit that seeks to overturn a federal jury’s decision last year.
In the trial last year in San Francisco, the jury ruled Google’s use of 11,000 lines of Java code was allowed under “fair use” provisions in federal copyright law.
In its 155-page appeal on Friday, Oracle called Google’s copying “classic unfair use” and said “Google reaped billions of dollars while leaving Oracle’s Java business in tatters”.
Oracle’s brief also argues that “When a plagiarist takes the most recognizable portions of a novel and adapts them into a film, the plagiarist commits the ‘classic’ unfair use”.
So all Oracle has to do is prove that the APIs are the most recognisable part of Java and that they were adapted into a new product.
Industry veteran journalist Kyle Bennett wrote back in December that Intel might launch a CPU powered by Radeon technology. This comes in the final quarter before Nvidia and Intel’s GPU cross-licensing deal expires.
Just recently, Kyle said that there might be a CPU with Radeon graphics coming this year, but more importantly, from April 1 Intel will not have a valid GPU license from Nvidia or AMD. None of the three companies has spoken publicly about a possible GPU licensing deal, and as far as Fudzilla is aware Nvidia hasn’t reached a deal with Intel to extend the license.
As part of the original deal and the terms and conditions of the patent cross license agreement, Intel agreed to pay Nvidia licensing fees which in the aggregate will amount to $1.5 billion, payable in annual installments, as follows: a $300 million payment on each of January 18, 2011, January 13, 2012 and January 15, 2013 and a $200 million payment on each of January 15, 2014, 2015 and 2016.
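Summing the installments listed above confirms the aggregate figure:

```python
# Annual installments from the Intel-Nvidia agreement, in $ millions:
# three payments of $300M (2011-2013) and three of $200M (2014-2016)
payments = [300, 300, 300, 200, 200, 200]
total = sum(payments)
print(f"${total}M in aggregate, i.e. $1.5 billion")  # $1500M
```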
The original document states that the “Capture Period” shall mean any time on or prior to March 31, 2017, indicating that this is the last date on which the license is still valid.
There are a few possible scenarios going forward, and one very likely scenario, which Fudzilla suggested a while ago, is that AMD will license its GPU technology to Intel and get some much-needed cash. Nvidia is always the more expensive choice. If you have been following Nvidia and AMD long enough, you will recognize the pattern: both PlayStation and Xbox stayed away from Nvidia simply because AMD was the more affordable choice. Good fellow Jen-Hsun Huang, the CEO of Nvidia, is all about making more money, something that has resulted in a surge in the stock price.
AMD doesn’t want to talk about it. Fudzilla asked many contacts inside the company on and off the record, but no one seems to want to touch this touchy topic. Where there is smoke, there might be fire.
The bottom line is that Intel needs a license or it faces a potential lawsuit. If it gets the GPU patent licensing from AMD, Nvidia would probably stay away from potential legal action.
Nvidia and AMD borrow GPU related ideas from each other left, right and center, and we are quite sure that they don’t plan to sue each other over GPU related patents anytime soon.
We would expect to see some announcements related to a potential AMD – Intel deal in the next few months. While many will argue that AMD is hardly going to benefit from making Intel a bigger competitor and losing its edge in GPU performance, AMD would be making some additional cash, something that it desperately needs.
Long-standing rumors surrounding the possibility of wireless charging being a hot feature in Apple’s upcoming iPhone 8 this year are now receiving some confirmation, thanks to the company’s recent decision to join the 213-member Wireless Power Consortium group.
According to the wireless industry group’s website, as of last week Apple has been officially listed as one of the latest members to take part in and promote the widespread adoption of the Qi wireless interface standard, which has been used for wireless charging across a number of products.
Early last year, we wrote that the company had filed a patent with the U.S. Patent and Trademark Office (July 2015) describing a near-field magnetic resonance (NFMR) power supply arranged to provide wireless power to a number of devices over 1 meter in distance. Since the basic physics dictates that the efficiency of power transfer decreases with distance, the company was said to be developing an aluminum casing for its upcoming iPhone devices with a window made from a non-conductive material, allowing RF waves to pass through to the wireless charging receiver inside.
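To illustrate why distance is the sticking point, here is a toy model of how sharply transfer efficiency can fall off with range. The cubic exponent and the 5cm reference range are purely illustrative assumptions; real near-field falloff depends on coil geometry, tuning and frequency, and this is in no way Apple’s actual design math:

```python
def relative_efficiency(d_m: float, d_ref: float = 0.05, exponent: float = 3.0) -> float:
    """Toy model: transfer efficiency relative to the value at the
    reference distance d_ref (metres). Exponent is illustrative only."""
    if d_m <= d_ref:
        return 1.0
    return (d_ref / d_m) ** exponent

print(relative_efficiency(0.05))  # on the pad: 1.0
print(relative_efficiency(1.0))   # at 1 metre: a tiny fraction of on-pad efficiency
```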
Qi wireless charging more likely than long-range RF for upcoming iPhone 8
But with recent developments in the industry, the possibility of long-range RF charging coming to this year’s iPhone now seems more distant, as the company is more likely to adopt the Qi inductive coupling method instead. During CES, word that a company within Apple’s supply chain had partnered with Energous, a developer of RF-based charging solutions, was the first evidence that the more long-range solution featuring transmitters for the home, car and office would make its way into the hands of consumers in 2017. Unfortunately, Energous then announced that plans had changed after a “key strategic partnership” was made with another partner, which will now be the first to ship the technology inside its own mobile devices.
While it appears Apple was indeed focused on developing a long-range charging method for its mobile devices, some analysts now point out that it needed to bring a practical solution to market sooner in order to avoid missing out on a feature that has been standard in the Android community for at least 24 months.
“The success of wireless charging adoption from Apple’s competitors is something that Apple can no longer ignore,” says analyst Vicky Yussuff at IHS Technology. “Consumer survey data shows over 90% of consumers want wireless charging on their next device.”
Although Apple already uses the Qi standard in its watch, which was released in Q4 2015, it is unclear whether the upcoming iPhone will implement the technology’s full specification, as the smartwatch currently uses a modified version that works only with Apple’s own chargers.
Nevertheless, Apple’s status as an active member of the Wireless Power Consortium allows it to participate in and contribute knowledge and ideas to the community responsible for developing some of the world’s most widely available wireless charging standards. The company says it “looks forward to working together with the WPC and its members,” according to a statement given to Business Insider.
Facebook is closing around 200 of its 500 Oculus Rift virtual-reality demo stations at Best Buy locations across the US.
Apparently the move is because of poor “store performance” which is spin for the fact that few people are even trying the technology out.
Business Insider claims it is common for the stations to go days without giving a single demonstration.
Oculus spokeswoman Andrea Schubert insisted that the closings were due to “seasonal changes”.
“You can still request Rift demos at hundreds of Best Buy stores in the US and Canada. We still believe the best way to learn about VR is through a live demo,” she enthused.
Best Buy said stores that no longer offer demos will continue to sell the Oculus Rift headset and accompanying Touch controllers, but apparently interest in the headsets dried up after Christmas.
One worker from California said that Oculus software bugs would often render his demo headsets unusable.
Moscow-based forensics firm Elcomsoft noticed it was able to pull supposedly deleted Safari browser histories from iCloud accounts, including the date and time each site was visited and when the record was deleted.
“In fact, we were able to access records dated more than one year back,” wrote Elcomsoft’s CEO Vladimir Katalov in a blog post.
Users can set iCloud to store their browsing history so that it’s available from all connected devices. The researchers found that when a user deletes that history, iCloud doesn’t actually erase it but keeps it in a format invisible to the user.
The company discovered the issue with its Phone Breaker product, a forensic tool designed to streamline extracting files from an iCloud account.
Keeping a copy of a user’s browser history can certainly be “invaluable for surveillance and investigations,” Katalov said. But it’s unclear if Apple knew that its iCloud service was storing the deleted records.
Apple didn’t immediately respond to a request for comment on Thursday, but since Elcomsoft’s blog post went live, the company appears to be “purging” older browser history records from iCloud, the forensics firm said.
“For what we know, they could be just moving them to other servers, making deleted records inaccessible from the outside,” the blog post said. Now only deleted records up to two weeks old can be extracted, the company said.
Elcomsoft previously found that Apple was saving users’ call history to iCloud while offering no explicit way to turn the syncing on or off. At the time, Apple responded that its call syncing function was designed for convenience, allowing customers to return phone calls from any device.
Elcomsoft said users concerned about their privacy can opt out of syncing their Safari browsing history to iCloud.
While an Nvidia graphics chip seems to be hanging the office laptop’s Outlook, the company has seen its quarterly revenue surge more than 50 percent for the second straight quarter and beat expectations.
Apparently it is seeing rising demand for its graphics chips and strength in rapidly growing areas such as self-driving systems and artificial intelligence.
The company also forecast revenue of $1.90 billion, plus or minus 2 percent, for the current quarter, marginally higher than the $1.88 billion the cocaine nose jobs of Wall Street predicted.
Revenue in the company’s graphics processing unit business, which contributes more than three-quarters of its total revenue, rose 57 percent to $1.85 billion in the fourth quarter.
Revenue from the company’s fast-growing data center business, which counts Amazon’s AWS, Microsoft Azure and Alibaba Group’s cloud business among its customers, more than tripled to $296 million in the quarter.
The business is also expected to grow sequentially, Nvidia Chief Financial Officer Colette Kress said on a conference call.
Nvidia’s automotive business, which produces the DRIVE PX 2 self-driving system used by Tesla, reported a 37.6 percent rise in revenue to $128 million.
Analysts had expected revenue of $135.3 million from the business. Nvidia’s total revenue rose to $2.17 billion from $1.40 billion, beating the average analyst estimate of $2.11 billion.
The company’s net income more than tripled to $655 million.
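The headline growth claim checks out against the figures quoted above; a quick bit of arithmetic:

```python
# Sanity check: total revenue rose to $2.17 billion from $1.40 billion
# year over year (figures from the article).
prev, curr = 1.40, 2.17
growth = (curr - prev) / prev * 100
print(f"{growth:.1f}%")  # 55.0% -- consistent with "more than 50 percent"
```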