The company is expected to make more job cuts this month, including from other locations in the U.S., further lowering the ranks of its 33,000-person work force. Since January, the company has cut its ranks by about 5,000, from 38,000.
The latest headquarters cuts were in IT and portfolio management and Sprint’s network, technology and product areas, according to a statement by spokesperson Roni Singleton. Some employees will work their last day on Nov. 7 and others will finish Nov. 14.
“Sprint is focused on competing aggressively in the marketplace,” Singleton said. “We want our customers to pay less for a better value on a new network. As part of this plan, we have to more closely align our cost structure with that of our competitors.”
CEO Marcelo Claure signaled there would be job cuts in August shortly after taking on his new role. Claure also inaugurated a round of pricing reductions.
Even so, analysts expect the company to lose more subscribers and fall into fourth place among the nation’s top carriers, behind T-Mobile.
An earnings call is expected in late October, although the date hasn’t been scheduled, Singleton said.
Sprint’s more than 5,000 job cuts in 2014 put it behind Cisco, with 6,000 job cuts (8%) announced for the year, and Microsoft, with 18,000 job cuts (14%) planned for the year.
The WiGig standard has been around since 2009, but we haven’t really seen it hit many retail devices. Back at IDF 2014, Intel demonstrated WiGig 802.11ad video, peripherals and 4K video transfer, and it promised that Skylake-based laptops would come with the technology out of the box.
WiGig will let you transfer up to 7Gbps of audio, video or data via the 2.4, 5 or 60GHz bands and is as fast as eight-antenna 802.11ac and nearly 50 times faster than the highest 802.11n rate. It is backward compatible with WiFi standards, but due to its high frequency it is limited to short distances, usually up to 10 meters, and cannot really penetrate walls, though it can propagate by reflecting off walls, ceilings or objects using beamforming.
Qualcomm has now showcased this technology for the first time and promised it inside Snapdragon 810 based devices. Qualcomm demonstrated a peer-to-peer connection and the transfer of 4K video between two 20nm Snapdragon 810 based tablets. One of the tablets acted as the sink side and was connected directly to a 4K TV, and it was clear that you could play content on one tablet and stream it to the second one.
WiGig’s 7Gbps translates to 875MB per second in the best case scenario. The Qualcomm demo showed a Plutonium MSM8994 based tablet hitting up to 187MB a second (1.5Gbit per second) available for data transfer, with 4K multi-device streaming on the side. WiGig could also extend to external storage, enabling faster NAS systems and future peripherals such as keyboards and mice, and in the longer run it could completely eliminate the need for docking stations. It will take some time, but this is the grand idea.
It remains to be seen when we will be able to buy the first Snapdragon 810 devices with 802.11ad WiGig abilities. Qualcomm mentioned 2015 a number of times, but there’s nothing more specific than that. A potential problem for this standard might be the speed of the flash storage used in tablets and phones today. According to Androbench, the HTC One M8 can sequentially read at 92.29 MB/s but sequentially write at only 17 MB/s, while Nvidia’s Shield tablet can sequentially read at 67.75 MB/s and write at only 14.09 MB/s.
The performance gets even less impressive with smaller files, but with the numbers we are getting from the latest 2014 devices, flash would have to get up to 10 times faster just to be ready to write files at 150MB/s. For the theoretical maximum of a ridiculously fast 875 MB/s, we would need memory about 50 times faster than the 14-17MB/s write speeds available in the current generation of high-end mobile devices.
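The gap described above comes down to simple unit conversion; a quick sketch, using only the figures quoted in this article:

```python
# Quick sanity check of the WiGig-vs-flash throughput gap described above.
WIGIG_PEAK_GBPS = 7.0   # WiGig theoretical peak
DEMO_GBPS = 1.5         # throughput seen in the Qualcomm tablet demo

def gbps_to_mb_per_s(gbps):
    """Convert gigabits per second to megabytes per second (decimal units)."""
    return gbps * 1000 / 8

peak_mb_s = gbps_to_mb_per_s(WIGIG_PEAK_GBPS)   # 875.0 MB/s
demo_mb_s = gbps_to_mb_per_s(DEMO_GBPS)         # 187.5 MB/s

# Sequential write speeds from the Androbench figures above (MB/s)
flash_write = {"HTC One M8": 17.0, "Nvidia Shield tablet": 14.09}

for device, write_speed in flash_write.items():
    gap = peak_mb_s / write_speed
    print(f"{device}: WiGig peak is roughly {gap:.0f}x its flash write speed")
```

The ratios land in the 50-60x range, matching the article's "about 50 times faster" estimate.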
Google didn’t elaborate on the price increase after announcing the Nexus 6, but several analysts said Google may be intending to push the Nexus as a premium brand that can compete with the iPhone 6 and other high-end phones.
Google originally developed Android to be inclusive and global, and indeed, it is the world’s largest OS by far. The company developed the Nexus line in 2010 to show Android phone manufacturers, and the public, how a pure Android phone could look and feel without the added features and bloatware installed by phone makers.
Meanwhile, the four national carriers are expected to sell the Nexus 6 with a subsidized price of as low as $200 with a two-year contract, and separate pricing for installment plans. AT&T will be a Nexus provider for the first time, and Verizon Wireless will carry the phone despite a spotty history with the Nexus line.
Such a carrier push to sell Nexus 6 phones with a subsidy seems to indicate that Google is intent on driving the wider adoption of its pure Nexus line that it has so far failed to achieve. Google has long described Android as an operating system for all, but it also wants to promote a more refined Android device, which it is trying to do with its Nexus line.
The $649 Nexus 6, which will run Android 5.0 Lollipop with support for 64-bit architecture, is a better phone than the $349 Nexus 5 that runs Android 4.4 KitKat. Nexus 6 also starts with 32 GB storage, double the capacity of its predecessor the Nexus 5. (A 64 GB Nexus 6 will run $699 unlocked on Google Play.)
But all the enhancements in the new Nexus 6, including its 5.96-in. Quad HD display and Snapdragon 805 quad-core processor, still don’t fully account for the 86% increase in starting price for the unlocked model, analysts said.
Sundar Pichai, senior vice president of Android at Google, noted in a blog post that wireless carriers will offer the Nexus 6 on monthly contracts or installment plans. A number of industry sources predicted the two-year contract price will start at $200, a common industry price for high-end smartphones, including the new iPhone 6.
The four major carriers, Google and Motorola, which is the Nexus 6 manufacturer, all refused to discuss the prices that carriers will charge. They also would not disclose the November release date.
The credit-card company showed a prototype of the card in London on Friday along with Zwipe, the Norwegian company that developed the fingerprint recognition technology.
The contactless payment card has an integrated fingerprint sensor and a secure data store for the cardholder’s biometric data, which is held only on the card and not in an external database, the companies said.
The card also has an EMV chip, used in European payment cards instead of a magnetic stripe to increase payment security, and a MasterCard application to allow contactless payments.
The prototype shown Friday is thicker than regular payment cards to accommodate a battery. Zwipe said it plans to eliminate the battery by harvesting energy from contactless payment terminals and is working on a new model for release in 2015 that will be as thin as standard cards.
Thanks to its fingerprint authentication, the Zwipe card has no limit on contactless payments, said a company spokesman. Other contactless cards can only be used for payments of around €20 or €25, and some must be placed in a reader and a PIN entered once the transaction reaches a certain threshold.
Norwegian bank Sparebanken DIN has already tested the Zwipe card, and plans to offer biometric authentication and contactless communication for all its cards, the bank has said.
MasterCard wants cardholders to be able to identify themselves without having to use passwords or PINs. Biometric authentication can help with that, but achieving simplicity of use in a secure way is a challenge, it said.
The latest version of the cloud computing stack contains 342 new features, 3,219 bug fixes, almost 500,000 lines of modified documentation and a new Architecture Design Guide.
1,419 unique contributors including representatives from 133 companies made it all happen over six months.
Last month it was revealed that HP had overtaken Red Hat in terms of overall contributions to Juno, and is closing in on Red Hat’s overall lead.
However, Red Hat has shifted focus more towards the cloud market in recent strategy announcements, so that lead could widen again.
The new version adds storage policies and data processing provisioning for Hadoop and Spark, and takes the initial steps towards becoming a platform for Network Function Virtualisation (NFV) in a future release, meaning that it would be capable of managing a number of network functions currently fulfilled by expensive dedicated hardware.
Other new features in the Nova compute service include an improved rescue mode with the option to boot from alternative images via locally attached disks, scheduling updates and internationalisation updates.
For networking, the Neutron module includes IPv6 and third-party driver testing, plug-ins, and migration support from Nova to Neutron.
The Keystone identity service allows users to share credentials for private and public OpenStack clouds.
The Heat engine, which manages orchestration, includes advanced rollback options in the event of failed deployment and the option for administrators to delegate creation of resources to non-admins.
The Horizon Dashboard now offers Hadoop deployment in a few clicks, enabling rapidly scalable data processing with custom parameters.
Finally, the Trove database service allows users to manage relational database services in the OpenStack environment.
Of course, OpenStack waits for no-one. With this release safely out, work now begins on the next version, codenamed Kilo, which is due in April 2015.
U.S. Federal Communications Commissioner Jessica Rosenworcel, on Friday, stated that U.S. regulators will look “to infinity and beyond” to harness new technology that can help build a new generation of mobile wireless connections.
The FCC on Friday voted unanimously to open a so-called “notice of inquiry” into what it and the industry can do to turn a new swath of very high-frequency airwaves, previously deemed unusable for mobile networks, into mobile-friendly frequencies.
The FCC’s examination would serve as a regulatory backdrop for research into the next generation of wireless technology, sometimes referred to as 5G and which may allow wireless connections to carry a thousand times more traffic.
“Today we’re stepping in front of the power curve,” FCC Chairman Tom Wheeler said on Friday at the meeting.
In question are frequencies above 24 gigahertz (GHz), sometimes called millimeter waves, that have previously been deemed technically unwieldy for mobile connections, though they have the potential to carry large amounts of data and promise lightning-fast speeds.
Millimeter waves work best over short distances and have required a direct line-of-sight connection to a receiver. They are now largely used for point-to-point microwave connections.
The FCC said it will study what technologies could help get around the technological and practical obstacles and what kind of regulatory regime could help a variety of technologies to flourish on those airwaves, including the potential for services other than mobile.
The U.S. wireless industry continues to work on deploying 4G connections, though some equipment manufacturers, such as Samsung, are already testing data transmission on the higher frequencies.
Gartner and IDC both recently dramatically lowered their tablet shipment and sales estimates for 2014 and coming years, citing primarily the longer-than-expected time customers keep their existing tablets. (That phenomenon is called the “refresh rate.”)
Gartner said it had originally expected 13% tablet sales growth for the year globally; it has now lowered that growth rate to 11%. IDC’s forecast change was even more dire: In June, it predicted shipment growth this year would be 12.1%, but in September it cut that number to 6.5%.
In the U.S., things are worse, because more than half of households have a tablet and may hold onto it for more than three years, well beyond analysts’ earlier expectations.
IDC said in its latest update that tablet growth in the U.S. this year will be just 1.5%, and will slow to 0.4% in 2015. After that, it expects negative growth through 2018. Adding in 2-in-1 devices, such as a Surface Pro with a keyboard, the situation in the U.S. improves, although overall growth for both tablets and 2-in-1s will still only reach 3.8% in 2014, and just 0.4% by 2018, IDC said.
“Tablet penetration is high in the U.S. — over half of all households have at least one — which leads to slow growth…,” Mikako Kitagawa, an analyst at Gartner, said in an interview. “A smartphone is a must-have item, but a tablet is not. You can do the same things on a laptop as you do with a tablet, and these are all inter-related.”
Tablets are a “nice-to-have and not a must-have, because phones and PCs are enough to get by,” added Carolina Milanesi, chief of research at Kantar Worldpanel.
In a recent Kantar survey of 20,000 potential tablet buyers, only 13% said they definitely or probably would buy a tablet in the next year, while 54% said they would not, Milanesi said. Of those planning not to buy a tablet, 72% said they were happy with their current PC.
At IDC, analyst Tom Mainelli reported that the first half of 2014 saw tablet growth slow to 5.8% (from a growth rate of 88% in the first half of 2013). Mainelli said the meteoric pace of past years has slowed dramatically due to long device refresh cycles and pressure from sales of large phones, including the new iPhone 6 Plus. That phone has a 5.5-in. display, which is close to some smaller tablets with 7-in. displays.
Juniper Research now estimates smartwatch shipments will hit 100 million by 2019. The firm expects several high-profile products to launch over the next year or so, helping boost mainstream awareness.
However, the figures are anything but encouraging.
The report, titled ‘Smart Watches: Market Dynamics, Vendor Strategies & Scenario Forecasts 2014-2019’, expects growth to decelerate from 2016 onwards. The first batch of devices will ride the hype, but beyond that they won’t do much for mainstream adoption.
However, the forecast also examines the possibility of sustaining 2014-2015 growth in the long term.
If consumers discover a ‘key use case’ or cases for smartwatches, backed by more product releases on the back of higher demand, higher growth could be sustained. In plain English, if people actually find a use for smartwatches, they will see more growth.
Unfortunately that case is hard to make at this point. Smartwatches face a number of hardware limitations and software support is still limited, which means they are not very useful at the moment. Juniper expects more vendors to integrate GPS, NFC and other technologies, but the downside is that smartwatches are not expected to become very cheap. The firm estimates that premium branding and high functionality will keep prices at $200+ until the end of the decade.
Europeans not too keen
One possible application that could generate more demand comes in the form of mobile payments. Apple Pay is coming to the Apple Watch, but the service will be limited to the US for quite a while and Apple won’t have an easy time launching it in other markets, where it enjoys a much lower market share.
The problem with mobile digital wallets is that they have not taken off yet. What’s more, new research indicates that Europeans are not sold on the idea of smartwatch wallets.
The survey, carried out by German market research firm GfK, found that just 20 percent of Germans and 27 percent of Britons are interested in contactless payments built into a watch. However, Chinese and American consumers are more open to the idea, with 40 percent and 54 percent, respectively, saying they are interested.
Most consumers said they are interested in health applications and many said they would store identification data on their smartwatches.
Google Inc is gearing up to test new technology that may provide the foundation for a wireless version of its high-speed “Fiber” Internet service, according to telecommunication experts who scrutinized the company’s regulatory filings.
In a public but little-noticed application with the U.S. Federal Communications Commission on Monday, Google asked the agency for permission to conduct tests in California across different wireless spectrums, including a rarely-used millimeter-wave frequency capable of transmitting large amounts of data.
It is unclear from the heavily redacted filing what exactly Google intends to do, but it does signal the Internet giant’s broader ambition of controlling Internet connectivity. The technology it seeks to test could form the basis of a wireless connection that can be broadcast to homes, obviating the need for an actual ground cable or fiber connection, experts say.
By beaming Internet services directly into homes, Google would open a new path now thoroughly dominated by Verizon, AT&T, Comcast and other entrenched cable and broadband providers. It could potentially offer a quicker and cheaper way to deliver high-speed Internet service, a potential threat to the cable-telecoms oligopoly, experts said.
“From a radio standpoint it’s the closest thing to fiber there is,” said Stephen Crowley, a wireless engineer and consultant who monitors FCC filings, noting that millimeter frequencies can transmit data over short distances at speeds of several gigabits per second.
“You could look at it as a possible wireless extension of their Google Fiber wireless network, as a way to more economically serve homes. Put up a pole in a neighborhood, instead of having to run fiber to each home,” said Crowley.
Craig Barratt, the head of the Google Access and Energy division leading the effort to offer high-speed fiber networks in Kansas City and other locations, signed off as the authorized person submitting Google’s FCC application.
The world’s No.1 Internet search engine has expanded into providing consumers with services such as Internet access. The company said it wants to roll out its high-speed Internet service to more than 30 U.S. cities, and in 2013 it struck a deal to provide free wireless Internet access to 7,000 Starbucks cafes across America.
Earlier this year, technology news website The Information reported that Google was exploring ways to offer a full-fledged wireless service, with voice and Internet access, in markets where the company already offers its Fiber service.
“Earning an Oracle certification is a well-respected achievement,” the company said on its website. “However, as products age and are removed from Oracle standard support maintenance, the technology becomes less relevant, devaluing the associated credential(s).”
While that may seem like a reasonable enough conclusion, one question on an FAQ page on the site notes that “Oracle has stated that certification is permanent” and the policy change “seems to go against that.”
The change “helps maintain the integrity of our certification program and the value of your certification,” the site states.
The policy covers certifications for Oracle database versions ranging from 7.3, which dates to the mid-1990s, up to 10g, which was released in 2003.
DBAs certified on those versions must recertify on a newer version of the database by either November 2015 or March 2016 if they want to keep their credentials in an “active” status. Oracle recommends that DBAs upgrade their certification to version 11g or later, the site states.
Oracle stands to benefit financially from the recertifications, given the fees charged to take the tests.
Still, one longtime Oracle DBA, who asked to remain anonymous, praised Oracle’s decision.
“It was never a good idea that certifications were permanent,” the DBA said via email. “Changes in features and architecture, for example 12c multi-tenant, should render previous certifications null and void. Will it ruffle some feathers? Yeah probably. Should it? No. In my opinion certifications should apply to a single release and nothing more.”
There could be more news on this front yet to come. A decision on whether to require all product certifications to be recertified is “currently under discussion,” according to the Oracle FAQ.
Gartner is warning that tablet sales could fall prey to cheaper and bigger smartphones. Gartner’s Q3 and annual figures for device sales worldwide — covering smartphones and tablets as well as PCs of all sizes — show that tablet sales in 2014 will see only 11 percent growth over last year, compared to growth of 55 percent the year before.
This works out to a projected 229 million tablets selling in 2014, or 9.5% of overall worldwide device sales, which will total 2.4 billion devices for the year, and 2.5 billion in 2015. In short, the novelty is wearing off and tablets are getting a good kicking from Android smartphones. Devices built on Google’s mobile operating system will see sales of 1.2 billion units this year, working out to more than half of all devices sold.
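Gartner's quoted shares follow directly from its projected unit counts; a quick check of the arithmetic:

```python
# Reproducing the market shares implied by Gartner's projections quoted above.
tablets_2014 = 229_000_000        # projected tablet sales
all_devices_2014 = 2_400_000_000  # all devices: smartphones, tablets, PCs
android_2014 = 1_200_000_000      # projected Android-based device sales

tablet_share = tablets_2014 / all_devices_2014 * 100
android_share = android_2014 / all_devices_2014 * 100

print(f"Tablet share of all devices: {tablet_share:.1f}%")    # matches the 9.5% figure
print(f"Android share of all devices: {android_share:.0f}%")  # half of all devices sold
```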
Ultramobiles, the not-quite-PC and not-quite-tablet and not-quite-phone category, will remain niche but continue growing: there will be 37.6 million of these sold this year, and as befits a fast-growing but still-small category, ultramobiles will grow the fastest, doubling in sales in 2015 while the other categories continue to see only modest rises. Ultramobiles are also suffering from the same issue as tablets. People are simply not replacing them as much.
“In the tablets segment, the downward trend is coming from the slowdown of basic ultramobiles,” Gartner concludes.
The life cycle of tablets and ultramobiles is around three years, so this year’s buyers won’t replace their devices until 2018. Gartner projects 83 million fewer new tablet purchasers in 2014-2015 and 155 million fewer tablet replacements through 2018.
Roberta Cozza, a Gartner analyst and co-author of the report, said there are too many solid devices out there and users don’t have a reason to upgrade to new units. Cozza also confirmed that Samsung is head and shoulders above all other OEMs.
If you look at PCs, ultramobiles and phones, Samsung is still number one, with around a 20 percent share this quarter. Samsung’s fortunes are driven by Android, and its share in the PC category is “tiny.”
Apple is in second place at around 10 percent, with Nokia in third just behind it and Lenovo in fourth in the overall category.
The official cessation of discussions to merge two of the tech industry’s largest enterprise-oriented firms may come as a disappointment to activist investor Elliott Management, which has pushed hard for storage products maker EMC to pursue merger or spinoff opportunities.
Pressure is building on EMC as rival technology companies, such as eBay Inc and Symantec, begin spinning off operations in an attempt to unlock shareholder value, become more agile, and capitalize on faster-growing businesses.
It is unclear when talks ended following months-long discussions, the people said on condition of anonymity because the talks were private.
Executives from the two companies were still trying to hammer out a deal as recently as last week, but talks bogged down on price and are now dead, the people said.
HP has temporarily suspended its stock buyback program ahead of its Nov. 25 earnings because the company said it is in possession of material non-public information. When pressed by stock analysts, Chief Financial Officer Cathie Lesjak noted on a conference call that the non-public information pertains to a possible acquisition.
HP and EMC declined to comment on Tuesday.
It is also unclear what specifically was discussed. A straight-up merger of the two companies would have created one of the industry’s largest providers of data storage and formed a computing giant with deep penetration in the business of providing computing hardware and services to corporations.
Brian Krebs wrote on his blog that he found companies and organizations that failed to password protect WebEx meetings, which allowed “anyone to join daily meetings about apparently internal discussions and planning sessions.”
Meeting schedules for organizations were available through WebEx’s “Event Center,” he wrote.
Cisco has a variety of options for WebEx that are intended to accommodate sensitive meetings and ones intended for the public.
For example, Cisco requires a password to be set by default for a meeting, but that option can be turned off, wrote Aaron Lewis, who works in global social media marketing, on a company blog.
“The most secure meetings will always be protected by a complex password,” Lewis wrote.
Companies may publicly list a meeting for webinars that anyone can join, but “if your WebEx site administrator or IT department allows listed meetings, then we recommend listing your meeting only if there is a true business reason,” Lewis wrote.
Another tip is to disable the “join before host” option, which gives the host visibility into who has joined. Also, setting the “host as presenter” option prevents someone else from joining the meeting and sharing content, Lewis wrote.
Krebs wrote he found meetings not protected by a password from a host of companies and organizations, including Charles Schwab, CSC, CBS, CVS, The U.S. Department of Energy, Fannie Mae, Jones Day, Orbitz, Paychex Services and Union Pacific.
The feature, part of the Google+ Helpouts online collaboration video service that launched a year ago, allows healthcare workers to share expertise through live video and provide real-time advice from their computers or mobile devices.
“When you’re searching for basic health information — from conditions like insomnia or food poisoning — our goal is to provide you with the most helpful information available. We’re trying this new feature to see if it’s useful to people,” a Google spokesperson said in an email response to Computerworld.
The new Helpouts feature offers a link to a video service that a physician or other healthcare worker has established for advising patients who’ve used a particular search query, such as “congestive heart failure” or “shoulder injury.”
Video chat services and other forms of remote communications with healthcare workers have increased 400% from 2012 levels.
This year in the U.S. and Canada, 75 million out of 600 million appointments with general practitioners will involve electronic visits, or eVisits, according to new research from Deloitte.
With an aging Baby Boomer population and broadband bandwidth improved a hundredfold from a decade ago, telemedicine is exploding as a convenient and less costly alternative to the traditional visit to the doctors’ office.
Dell is showing off “enterprise class” security for small to medium businesses with the launch of the SuperMassive 9800 next-generation firewall, which it claims will protect against high-profile bugs such as Shellshock and Heartbleed.
Touted as the most powerful in the fresh 9000 line-up, and sounding a little like a gang of rappers, the SuperMassive 9800 offers services such as advanced Deep Packet Inspection with speeds up to 20Gbps, and Dell’s patented Reassembly-Free Deep Packet Inspection (RFDPI) single-pass threat prevention engine.
RFDPI scans multiple application types and protocols to spot internal and external attacks and application vulnerabilities, Dell said, making it better at detecting attacks.
The SuperMassive 9800 is also bundled with Dell’s Global Management System 8.0, a tool designed to manage systems and offer real-time event monitoring, analytics and reporting from a single centralised dashboard.
Dell claims that this makes it easier to meet compliance regulations while managing and monitoring network security processes.
The firm claimed that the SuperMassive 9800 provides 97.9 percent “security effectiveness” and helps to protect customers from Shellshock and Heartbleed-level vulnerabilities.
“The recent disclosures of the ShellShock and HeartBleed industry-wide vulnerabilities demonstrate that organisations are literally a few well-formed packets away from infrastructure disaster, proving the need for instant and automated security scaled to meet the needs of the network,” said executive director of Dell Security, Patrick Sweeney.
“The SuperMassive 9800 provides that level of instant security on a flexible, feature-rich platform.”
Shellshock was uncovered in September, and some experts claim that it could be more serious than the Heartbleed SSL bug uncovered in April.
The Bash bug, as implied by its name, is a vulnerability that allows unscrupulous users to take control of the Bourne Again Shell (Bash), the command shell used on many Unix-like systems.
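The mechanism can be illustrated with the widely published Shellshock probe string. The minimal sketch below (assuming a `bash` binary is installed locally) passes the probe through an environment variable; a patched Bash simply ignores the trailing command:

```python
import shutil
import subprocess

# The classic Shellshock probe: a Bash function definition in an environment
# variable with an extra command appended. A vulnerable Bash executes the
# trailing `echo VULNERABLE` while importing the variable from the
# environment; a patched Bash does not.
bash = shutil.which("bash") or "/bin/bash"
env = {"testvar": "() { :; }; echo VULNERABLE"}

result = subprocess.run(
    [bash, "-c", "echo probe-done"],
    env=env, capture_output=True, text=True,
)

status = "vulnerable" if "VULNERABLE" in result.stdout else "patched"
print(f"This bash appears to be: {status}")
```

This is why the bug was so dangerous: any service that passes attacker-controlled data into environment variables and then spawns Bash (CGI web scripts being the canonical case) could be made to run arbitrary commands.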
Researchers at FireEye and Trend Micro warned later in September that hackers were still mounting cyber attacks across the globe using exploits of Bash bug vulnerabilities, made worse by an incomplete initial patch.