Has Sony Lost The Handheld Gaming Market?

October 6, 2015 by Michael  
Filed under Gaming

Sony’s Worldwide Studios boss Shuhei Yoshida was only stating the obvious when he told the audience at EGX that the “climate is not healthy” for a successor to the company’s struggling handheld console, the PlayStation Vita, but sometimes even the obvious makes for an interesting statement, depending upon who’s stating it.

The likelihood of another handheld console from Sony turning up in the foreseeable future is considered incredibly low by almost everyone, and it’s notable that there’s never been so much as a whisper about what such a successor might look like or comprise; it seems so vanishingly unlikely to come to pass that there’s little point in speculating about what might be. Yet for commentators and analysts to dismiss the notion of Sony carrying on in handhelds is one thing; for such a senior figure at the company to seemingly join in that dismissal is another. The final step of the long and strange handheld journey which Sony started with the announcement of the PSP’s development all the way back in 2003 won’t come until the Vita reaches its official end-of-life, but Yoshida’s statement is the moment when we learned for certain that the company itself reckons the handheld market is past saving.

It’s not that there’s any lack of affection for the Vita within Sony, including Yoshida himself, whose Twitter feed confirms that he is an avid player of the system. Even as weak sales have essentially rendered AAA development for the Vita financially unsustainable, the firm has done a great job of turning it into one of the platforms of choice for break-out indie hits, and much of the success of the PS4 as a platform for indie games can be traced back to the sterling work Sony’s team did on building relationships and services for indies on the Vita. For that alone, it’s a shame that the console will apparently be the last of its line; there are some games that simply work better on handhelds than on home consoles, and some developers who are more comfortable working within the limitations of handheld systems.

Yoshida is right, though; mobile phones are the handheld killer. They may not be as good at controlling the kind of games that the PSP and Vita excelled at, but mobile devices are more powerful, more frequently updated, carried everywhere and heavily subsidised by networks for most users. Buttons and sticks make for wonderful game controllers, as Yoshida noted, but when the competition has a great multi-touch screen and accelerometer, a processor faster than most laptops only a few years ago, and is replaced every couple of years with a better model, the best set of buttons and sticks on earth just can’t compete for most consumers. Even if Sony could release a Vita 2 tomorrow which leapfrogged the iPhone 6S, within a year Apple, Samsung and others would be back out in front.

That’s not to say that this battle can’t be won. Nintendo has still managed to shift a dramatic number of 3DS consoles despite the advent of the smartphone era – though in typical Nintendo style, it chose not to play the competition at their own game, favouring a continuation of the DS’ odd form-factor, a 3D screen and a low-cost, low-power chipset over an arms race with smartphones (and, indeed, with the Vita). Crucially, Nintendo also pumped out high quality software on the 3DS at a breathtaking pace, at one point coming close to having a must-buy title on the system every month. Nintendo’s advantage, as ever, is its software – and at least in part, its longevity in the handheld market is down to the family-friendly nature of that software, which has made the 3DS popular with kids, who usually (at least in Japan, the 3DS’ best performing market) do not carry smartphones and generally can’t engage with F2P-style transactions even if they do. Vita, by comparison, aimed itself at a more adult market which has now become saturated with phones and tablets.

So: is that the end of Sony’s handheld adventure? Trounced by Nintendo twice over, first with the DS’ incredibly surprising (if utterly obvious in hindsight) dominance over the PSP, then with the 3DS’ success over the Vita, Sony nonetheless carved out an impressive little market for the PSP, at least. Vita has failed to replicate that success, despite being an excellent piece of hardware, and 12 years after news of the PSP first reached gamers’ eager ears, it looks like that failure and the shifting sands of the market mean Sony’s ready to bail out of handhelds. With the stunning success of PS4 and the upcoming PlayStation VR launch keeping the company busy, there’s seemingly neither time, nor inclination, nor resources to try to drive a comeback for the Vita – and any such effort would be swimming against the tide anyway.

I would not go so far as to say that Sony is dropping out of handheld and portable gaming entirely, though. I think it’s interesting, in the context of Yoshida’s comments, to note what the company did at TGS last month – where a large stand directly facing the main PlayStation booth was entirely devoted to the Sony Xperia range of phones and tablets, and more specifically to demonstrating their prowess when it comes to interacting with a PS4. The devices can be hooked up to a PS4 controller and used for remote play on the console; it’s an excellent play experience, actually significantly better in some games than using the Vita (whose controls do not perfectly map to the controller). I use my Vita to do simple tasks in Final Fantasy XIV on my PS4 while the TV is in use, but it wouldn’t be up to the task of more complex battles or dungeons; I’d happily do those on an Xperia device with a proper controller, though.

Remember when the Vita launched and much of the buzz Sony tried to create was about how it was going to interact with the PS4? That functionality, a key selling point of the Vita, is now on Xperia, and it’s even better than it was on the dedicated handheld. Sony’s phones also play Android games well and will undoubtedly be well-optimized for PlayStation Now, which means that full-strength console games will be playable on them. In short, though the Vita may be the last dedicated handheld to carry the Sony brand, the company has come a long way towards putting the core functions of Vita into its other devices. It’s not abandoning handheld gaming; it’s just trying to evolve its approach to match what handheld gaming has become.

It’s not a perfect solution. Not everyone has or wants an Xperia device – Japan is the best performing market for Sony phones, and even there Apple is absolutely dominant, with iPhones holding more than half of the market share for smartphones. If Sony is being clever, though, it will recognize that the success of the PS4 is a great basis from which to build smartphone success; if the Xperia devices can massively improve the user experience of the PS4, many owners of those devices may well consider a switch, if not to a new phone then at least to one of the Xperia tablets. It might also be worth the company’s time to think a little about the controllers people will hook up to the Xperia to play games; I love the PS4 controller, but it’s bulky to carry in a bag, let alone a pocket. If the firm is serious about its phones and tablets filling the handheld gap, a more svelte controller designed specifically for Xperia (but still recognizably and functionally a PS4 pad) would be an interesting and worthwhile addition to the line-up.

Nonetheless, what’s happening with Xperia – in terms of remote play, PS Now, and so on – is an interesting look at how consoles and smartphones might co-exist in the near future. The broad assumption that smart devices will kill off consoles doesn’t show any sign of coming true; PS4 and Xbox One are doing far, far better than PS3 and Xbox 360 did, and while the AAA market is struggling a little with its margins, the rapid rise of very high quality indie titles to fill the gap left by the decline of mid-range games in the previous generation means the software market is healthier than it’s been for years. If consoles aren’t going away, then we need to be thinking about how they’ll interact with smart devices – and if that’s what Sony’s doing with Xperia and PlayStation, it’s a strategy that could pay off handsomely down the line.

AMD Giving More People The Boot

October 5, 2015 by Michael  
Filed under Computing

While we want AMD to do well to balance the Intel and Nvidia empires, it does seem that the outfit cannot get a break.  Today it announced that it is letting 500 staff go and will begin another wave of restructuring.

Of course, we predicted  this would happen. The company is betting the farm on its coming Zen chip, but this will not appear until next year.  Meanwhile it is facing shrinking sales and nearly impossible competition.

Under the restructuring AMD will outsource some IT and application development services.  It will give 500 people, or five percent of its staff, their pink slips and P45s. The company will take a charge of $42 million, with $41 million of that recorded in the just-ended third quarter. AMD said it expected savings of about $58 million in 2016 from the restructuring plan.

This is about the same time AMD hopes to clean up with its Zen chips.

AMD said it will cut white-collar jobs and is not shutting or idling any fabricating operations. The jobs will be lost across AMD’s global operations, including Austin, Texas, and company headquarters in Sunnyvale, California. AMD had only 9,700 employees at the end of last year, so 500 is a sizeable chunk.
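The figures above are easy to sanity-check; here is a quick sketch using the article’s numbers (the payback estimate is my own back-of-the-envelope derivation, not AMD’s):

```python
# Cross-checking the restructuring figures quoted above.
cuts = 500                  # jobs being cut
fraction_of_staff = 0.05    # "five percent of its staff"
implied_headcount = cuts / fraction_of_staff
print(implied_headcount)    # 10000.0 -- roughly the 9,700 reported at year-end

charge = 42e6               # one-off restructuring charge
savings_2016 = 58e6         # expected savings in 2016
years_to_payback = charge / savings_2016
print(f"Charge recouped in about {years_to_payback:.1f} years of savings")
```

In other words, if the projected savings materialise, the one-off charge pays for itself in well under a year.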

AMD reported its second-quarter revenue fell 35 percent from the year-earlier period, claiming that no one wanted to buy PCs.

The company has been shifting to gaming consoles and low-power servers, but it really has not moved fast enough or come up with the sort of “wow” technology which is needed to see off Intel.



Windows 10 Breaks 100M Device Mark

October 5, 2015 by mphillips  
Filed under Computing

Windows 10's user share growth slowed considerably in September, but by the month’s end approximately 110 million customers were running the new OS, according to data released today.

According to U.S. analytics company Net Applications, Windows 10's user share — a measure of the fraction of unique users who ran the OS when they went online — grew 1.4 percentage points in September to 6.6%.

Microsoft launched Windows 10 on July 29, making September the second full month that the upgrade for Windows 7 or Windows 8.1 devices was available to download and install.

September’s user share increase was substantially smaller than August’s record-setting 4.8 percentage points.

Windows 10 accounted for 7.3% of all Windows devices in September, slightly higher than its raw user share because Windows powered “just” 90.5%, not 100%, of all systems tallied by Net Applications. During September, Windows 10's share of all Windows devices climbed by 1.6 percentage points.

Net Applications’ data represented 110 million Windows 10 PCs, assuming a total of 1.5 billion Windows devices globally, the figure Microsoft typically trumpets.

Microsoft has not publicized a Windows 10 download or installed data point since late August, when it said that 75 million devices worldwide were running the OS.

Net Applications’ Windows 10 user share portrait backed up the findings of another analytics developer, Ireland’s StatCounter, which has also portrayed the OS’s growth as slowing after its first month of availability.

By StatCounter’s measurements, Windows 10 gained 5.9 percentage points of usage share — more of an activity indicator, as it counts web page views by OS — in the first four weeks after its launch. During the most recent four weeks, or from Aug. 31 to Sept. 27, Windows 10 grew by a much smaller 1.4 points.

Net Applications’ numbers also validate the slowdown in a different way. During the final three weeks of August, an average of 1.8 million devices were added to Windows 10's rolls daily. But in September, the average daily increase dropped to less than half of that, to about 794,000 devices.

Even so, Windows 10 continued to best Windows 7's performance during a similar stretch. In 2009, the then-new OS had accumulated a 6.2% share of all Windows personal computers through its second full month, or more than a point under Windows 10 at the same post-launch moment.

With about 110 million devices now running Windows 10, Microsoft is at the 7% mark toward reaching its goal of putting the OS on 1.5 billion systems by mid-2018.
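The relationship between Net Applications’ percentages and the 110 million figure is straightforward arithmetic. A sketch using the numbers quoted above (the 1.5 billion device base is the assumption Microsoft itself typically cites, not a measured figure):

```python
# How the ~110 million estimate falls out of Net Applications' shares.
user_share = 0.066       # Windows 10 share of all systems tallied in September
windows_share = 0.905    # share of tallied systems running any version of Windows

# Windows 10's share of Windows devices exceeds its raw user share because
# Windows powers "just" 90.5% of systems, not 100%.
share_of_windows = user_share / windows_share
print(f"Share of Windows devices: {share_of_windows:.1%}")  # 7.3%

# Scale that share up to an assumed base of 1.5 billion Windows devices.
total_windows_devices = 1.5e9
win10_devices = share_of_windows * total_windows_devices
print(f"Windows 10 devices: ~{win10_devices / 1e6:.0f} million")
```

The result lands just under 110 million, which Net Applications rounds to the headline figure.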




Microsoft, Google Cease Fire In Global Patent Deal

October 2, 2015 by mphillips  
Filed under Around The Net

Microsoft has been pursuing a more collaborative approach under CEO Satya Nadella, engaging longtime rivals like Salesforce, VMware and Apple. There hasn’t been much love between Microsoft and Google, but an announcement on Wednesday points towards an easing of those tensions.

Google and Microsoft have reached a broad agreement on patent matters, with a legal settlement ending some 20 lawsuits between the companies in the U.S. and Germany. Financial terms weren’t disclosed, but the deal brings a laundry list of lawsuits to a close.

“Microsoft and Google are pleased to announce an agreement on patent issues,” they said in a joint statement. “As part of the agreement, the companies will dismiss all pending patent infringement litigation between them, including cases related to Motorola Mobility.”

They also agreed to collaborate on patent matters and work together “to benefit our customers.”

The suits that have been settled include those related to mobile phones, video encoding and Wi-Fi technologies. That doesn’t mean Microsoft has given up its campaign to collect royalties from Android device makers for the mobile operating system’s alleged infringement of Microsoft patents.

It’s not clear from the statement what patent matters the companies will be working on together in the future, but changes have already begun. The two companies agreed earlier this month to work together (alongside other firms like Netflix and Mozilla) on a royalty-free video codec.

It remains to be seen if the settlement will lead to more work between Microsoft and Google in other areas. A major sticking point for consumers has been the lack of a Google-made YouTube app for smartphones and tablets running Windows.



MediaTek Building Ecosystem To Power IoT

October 2, 2015 by Michael  
Filed under Computing

MediaTek is quietly building an ecosystem to drive its IoT strategy and push its system-on-chip (SoC) shipments across multiple devices.

The fabless chipmaker is signing partnerships with Amazon, Tinitell, Apple, and People Power.

MediaTek is starting to come out of the shadows in the West with its SoC designs. It sees the IoT as a way to push more of its chips.

It has put in a tender to buy power management outfit Richtek Technology to expand its leadership in power management integrated circuits (PMICs) and strengthen its overall capabilities for the IoT business model. The deal is expected to close in Q2 2016.

It has provided funding to People Power, a user engagement company providing apps, cloud and mobile services for the IoT, to accelerate its penetration of the IoT market in both the U.S. and China and to develop new IoT products based on People Power’s Fabrux and Influx software architectures.

It has also released two software development kits (SDKs) for Apple HomeKit, the framework in iOS 8 for communicating with and controlling connected accessories in a user’s home.

This is on top of its partnership with Amazon for the latter’s newest devices. Amazon Fire TV is powered by MediaTek’s MT8173, a 64-bit quad-core processor and the world’s first multimedia SoC with ARM’s Cortex-A72 cores, while the Fire HD 8 and Fire HD 10 tablets are powered by the MT8135, a quad-core processor running at up to 1.5GHz that delivers a fast, fluid user interface, smooth HD video and high frame-rate games.

Chief Marketing Officer Johan Lodenius said the company’s cunning plan is to innovate widely available technology that provides integrated connectivity, while investing in and nurturing developers and the maker community to deliver practical yet innovative solutions.



IBM Will Use Apache Spark To Find E.T.

October 2, 2015 by Michael  
Filed under Computing

IBM is using Apache Spark to analyse radio signals for signs of extra-terrestrial intelligence.

Speaking at Apache: Big Data Europe, Anjul Bhambrhi, vice president of big data products at IBM, talked about how the firm has thrown its weight behind Spark.

“We think of [Spark] as the analytics operating system. Never before have so many capabilities come together on one platform,” Bhambrhi said.

Spark is a key project because of its speed and ease of use, and because it integrates seamlessly with other open-source components, Bhambrhi explained.

“Spark is speeding up even MapReduce jobs, even though they are batch oriented, by two to six times. It’s making developers more productive, enabling them to build applications in less time and with fewer lines of code,” she claimed.

She revealed IBM is working with Nasa and Seti to analyse radio signals for signs of extra-terrestrial intelligence, using Spark to process the 60Gbit of data generated per second by various receivers.
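To put that throughput in perspective, converting the quoted 60Gbit per second into daily volume is a one-line calculation:

```python
# Converting the quoted 60 gigabits per second into bytes per day.
bits_per_second = 60e9
bytes_per_second = bits_per_second / 8           # 7.5 GB every second
seconds_per_day = 86_400
bytes_per_day = bytes_per_second * seconds_per_day
print(f"{bytes_per_day / 1e12:.0f} TB per day")  # 648 TB per day
```

That is well over half a petabyte of receiver data every day, which goes some way to explaining why a fast, distributed engine like Spark is attractive for the job.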

Other applications IBM is working on with Spark include genome sequencing for personalised medicine via the Adam project at UC Berkeley in California, and early detection of conditions such as diabetes by analysing patient medical data.

“At IBM, we are certainly sold on Spark. It forms part of our big data stack, but most importantly we are contributing to the community by enhancing it,” Bhambrhi said.

The Apache: Big Data Europe conference also saw Canonical founder Mark Shuttleworth outline some of the key problems in starting a big data project, such as simply finding engineers with the skills needed just to build the infrastructure for operating tools such as Hadoop.

“Analytics and machine learning are the next big thing, but the problem is there are just not enough ‘unicorns’, the mythical technologists who know everything about everything,” he explained in his keynote address, adding that the blocker is often just getting the supporting infrastructure up and running.

Shuttleworth went on to demonstrate how the Juju service orchestration tool developed by Canonical could solve this problem. Juju enables users to describe the end configuration they want, and will automatically provision the servers and software and configure them as required.

This could be seen as a pitch for Juju, but Shuttleworth’s message was that the open-source community is delivering tools that can manage the underlying infrastructure so that users can focus on the application itself.

“The value creators are the guys around the outside who take the big data store and do something useful with it,” he said.

“Juju enables them to start thinking about the things they need for themselves and their customers in a tractable way, so they don’t need to go looking for those unicorns.”

The Apache community is working on a broad range of projects, many of which are focused on specific big data problems, such as Flume for handling large volumes of log data or Flink, another processing engine that, like Spark, is designed to replace MapReduce in Hadoop deployments.



AMD Goes Pro With APUs

October 1, 2015 by Michael  
Filed under Computing

Troubled chipmaker AMD has quietly launched its Pro APUs with just one major customer so far: HP, the maker of expensive printer ink.

Based on the Godavari and Carrizo chips, the line adds AMD’s Secure Processor for corporate peace of mind. The new Pro chips include the new AMD Pro A12, which runs at 3.4GHz. All of the new Pro chips are APUs, which means they combine graphics with the CPU cores. The A12 integrates 12 compute cores (4 CPU cores and 8 GPU cores), with the GPU based on Radeon R7 graphics technology running at 800MHz.

What differentiates the new PRO chips from the more conventional models is what AMD calls the AMD Secure Processor, an embedded core that enables the ARM TrustZone secure environment to run on top of the chip. Theoretically, at least, the technology should supply an added layer of security to sensitive apps.

AMD PRO A-Series mobile processors (formerly codenamed “Carrizo PRO”) are aimed at the commercial laptop market. They were made in collaboration with HP, ExactTrak, and Qualcomm. HP is set to flog a few of them in its HP EliteBooks range.

David Bennett, corporate vice president and general manager of Commercial Products at AMD, said the AMD PRO processors enable performance, reliability and opportunity for today’s businesses by giving customers choice and affordability to meet their specific business needs.

The AMD PRO A-Series processors are purpose-designed for business, offering long-term value commercial enterprises can depend on, including a 24-month longevity commitment, 18-month image stability, commercial-grade quality assurance and available extended OEM warranty support for up to 36 months.

They also promise protection against modern security threats, with new enterprise-class security features including Device Guard, Enterprise Data Protection, and Windows Hello biometric authentication.

The AMD PRO A-Series processors are claimed to enable greater management flexibility in a multi-vendor client environment at what AMD calls a business-friendly price.

The HP EliteBook 705 G3 series pairs the PRO chips with Qualcomm’s Snapdragon X5 LTE modem to provide 4G connectivity and location capabilities.

Fram Akiki, senior director of product management at Qualcomm Technologies said that the closer co-operation between AMD, HP, and Qualcomm on the HP EliteBook 705 G3 Series will benefit enterprise users.

The AMD PRO A-Series mobile processors are available today through online resellers and are currently offered on HP EliteBook 705 G3 Series PCs, including the HP EliteBook 725, 745 and 755.

The HP EliteBook 705 G3 series ships with the new Pro chips inside. The business notebooks weigh from 2.78 pounds and are available with 12.5-inch, 14.0-inch and 15.6-inch displays.

The new Pro chips also contain features that were launched with the earlier chips, such as Heterogeneous System Architecture (HSA 1.0) compliance to allow programmers to more easily program the CPU, as well as an integrated HEVC video decoder.


Can eSports Become A Billion Dollar Industry?

October 1, 2015 by Michael  
Filed under Gaming

Over the last few years, competitive gaming has made huge strides, building a massive fanbase, supporting the rise of entire genres of games and attracting vast prize pots for the discipline’s very best. Almost across the board, the phenomenon has also seen its revenues gaining, as new sponsors come on board, including some major household names. Sustaining the rapidity of the growth of eSports is going to be key to its long term success, maintaining momentum and pushing it ever further into the public consciousness.

In order to do that, according to Newzoo, eSports needs to learn some lessons from its more traditional athletic counterparts. Right now, the research firm pegs eSports revenues at $2.40 per enthusiast per year, a number which is expected to bring the total revenue for the industry to $275 million for 2015 – a 43 per cent increase on last year. By 2018, the firm expects that per-user number to almost double, reaching $4.63.

That’s a decent number, representing very rapid growth, but it pales in comparison to Newzoo’s estimates of average earnings per fan for a sport like basketball, which generates $14 of revenue per fan – rising to $19 where only the major league, the NBA, is a factor. Catching up to numbers like this is going to take some time, but Newzoo’s research lists five factors it considers vital to achieving that aim.
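Newzoo’s headline figures also imply some useful derived numbers. The calculations below use only the figures quoted above; the implied audience and prior-year revenue are my own derivations, not Newzoo’s:

```python
# Derived figures from the Newzoo estimates quoted above.
revenue_2015 = 275e6   # forecast total eSports revenue for 2015
per_fan_2015 = 2.40    # revenue per enthusiast per year

# Dividing total revenue by per-fan revenue gives the implied audience.
enthusiasts = revenue_2015 / per_fan_2015
print(f"Implied enthusiasts: ~{enthusiasts / 1e6:.0f} million")  # ~115 million

# $275M is described as a 43 per cent increase on the prior year.
revenue_2014 = revenue_2015 / 1.43
print(f"Implied 2014 revenue: ~${revenue_2014 / 1e6:.0f} million")

# The gap to basketball's $14-per-fan figure.
print(f"Basketball earns {14 / per_fan_2015:.1f}x more per fan")  # 5.8x
```

Roughly 115 million enthusiasts each worth $2.40 a year: closing even part of that near-sixfold per-fan gap with basketball is where the real growth story lies.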

Genre diversity

Right now, MOBAs are undeniably the kings of the eSports scene, and one of the biggest genres in gaming. The king of MOBAs, League of Legends, is the highest-earning game in the world, whilst others like Valve’s DOTA 2 also represent huge audiences and revenues, including the prestigious annual International tournament. Shooters are also still big business here, with Activision Blizzard recently announcing the formation of a new Call of Duty League.

Nonetheless, MOBAs are still the mainstay and if you don’t like them, you’re not going to get too deeply into competitive gaming as a fan. Although their popularity with the athletes is going to make them a difficult genre to shift, Newzoo says that broadening the slate is a key factor to growth.

Geographic reach

The major tournaments bring players, and audiences, from all over the world, but it’s often only the very top tier of players who can find themselves a foothold in regular competition. Major territories like the US, South Korea and Europe have some local structure, but again League of Legends stands almost alone in its provision of local infrastructure. By expanding a network of regular leagues and competitions to more countries, eSports stands a much better chance of building a grassroots movement and capturing more fans.

Regulation

Already a problem very much on the radar of official bodies and players around the world, the introduction of regulation is always a tough transition for any industry. However, when you’re putting up millions of dollars in prize money, you can’t have any grey areas around doping, match fixing and player behaviour at events. These young players are frequently thrust into a very rapid acceleration of lifestyle, fame and responsibility – a heady mixture which can prove to be a damaging influence on many. Just like in other sports, stars need protecting and nurturing – and the competitions careful monitoring – in order for growth to occur without scandal and harm to its stars.

Media rights

Dishing out the rights to broadcast, promote and profit from eSports is a complex issue. Whilst games like football are worldwide concerns, with media rights a hotly contested and constantly shifting field, nobody owns the games themselves. With eSports, every single aspect of the games being played is a trademark in itself, with its owners understandably keen to protect them. However, with fan promotion such a key part of the sport’s growth, and services like Twitch a massive factor in organic promotion, governing the rights of distribution is only going to become a murkier and more complex business as time goes on. With major TV networks, well used to exclusivity, now starting to show an interest, expect this to become a hot topic.

Conflict between new and old media

That clash of worlds, between the fresh and agile formats of digital user-sourced broadcasting and the old network model, is also going to be a source of many problems of its own. One or the other, or even both, is going to have to adapt fast for there to be a convivial agreement which betters the industry as a whole. There’s currently considerable pushback from established media against the idea of eSports becoming accepted as a mainstream activity, fuelled in no small part by their audiences themselves, so a lot of attitudes need to change. Add to that the links between these media giants and many of the world’s richest advertisers and you can start to see the problem.


MediaTek’s Helio X20 Goes Neural

October 1, 2015 by Michael  
Filed under Computing

MediaTek has revealed that its latest-generation 10-core processor will be targeting neural networks and the deep learning market.

Nvidia was one of the first to go after this area, and Qualcomm wants “in” too. There will be a big scrap for what could be a huge market for all of these companies.

Kevin Jou, Senior Vice President and CTO of MediaTek, said:

“Cloud-based computing provides big data for training a neural network, but on a device deep learning enables privacy, instantaneous usability of personalized databases. It can speed up the search for the picture you want. This speeds up the search of your personal data including payments, pictures and everything else that we don’t want to have in the cloud. You can just ask Jennifer Lawrence how smart it was to have the nude pictures in the iCloud.”

Jou confirmed that MediaTek is developing a deep learning SDK that will support multiple cores. We have already seen that the company’s Core Pilot 3.0 scheduler can enable the CPU, GPU, DSP and ISP to work together.

MediaTek’s Chairman and CEO Tsai Ming-kai said that the company has serious IoT and automotive aspirations. You need deep learning to teach a car the difference between a human printed on a piece of paper and the actual human on a street. This is a painful process, but when solved will enable self-driving cars that are promised to hit our streets by 2020, just five years from now.


Will TSMC’s Revenue Drop In The Fourth Quarter?

September 28, 2015 by Michael  
Filed under Computing

TSMC has warned that its revenues for the fourth quarter will see a sequential decline, though its bottom line has been saved by the US dollar’s strength against the Taiwan dollar.

The company has announced that its third-quarter revenues will exceed its guidance given in mid-July, thanks to a more favorable US dollar exchange rate to the NT dollar. However, revenues for the third quarter will be about $6.42 billion. Gross margin and operating margin will still be within the previous guidance of 47-49 per cent.

TSMC’s revenues for the fourth quarter, however, will drop to around $6 billion, with profit margins similar to the prior quarter’s levels, the company said.

For all of 2015, TSMC expects to post a double-digit increase in revenues as the company guided previously.

The company had expected to gain more A9 processor orders from Apple Inc next quarter to offset customers’ inventory corrections, but one has to wonder whether TSMC now thinks its Apple sales are set to take a hit. Its other smartphone chip customers, such as Qualcomm and MediaTek, have been taking a hammering. TSMC will meet its full-year target, but only thanks to the currency move.

Global smartphone shipment growth is expected to slow this year and average 7.9 percent during the next five years, compared with 34 percent growth over the past five years. Sales of all types of devices are to decline by 1 percent this year, Gartner Inc said in a report yesterday.


Red Hat Releases Fedora 23 Beta To Address Security Issues

September 25, 2015 by Michael  
Filed under Computing

Red Hat has torn the roof off the sucker once again with the release of Fedora 23 in beta form.

Coming in three incredible versions, Fedora 23 Cloud, Fedora 23 Server and Fedora 23 Workstation, this new edition picks up where the old one left off and runs with it.

The biggest news for fans is the use of compiler flags to help improve security. These are designed to help protect Fedora 23 beta binaries against memory corruption vulnerabilities, buffer overflows and similar issues.

This is the latest iteration of Red Hat’s Linux-based operating system that likes to think of itself as the leading-edge open source operating system across all use cases. It’s hard to believe, but absolutely true.

The dazzling array of updates starts with Red Hat Fedora Server Beta, which offers a new role through the rolekit service in the form of a cache server for web applications, with the underlying functionality delivered by memcached.

Also new is the fact that rolekit can now be triggered by anaconda kickstart to determine what function should be started with the next reboot, and I think we can all agree that’s been a long time coming.

Cockpit also sees some big changes, including a basic cluster dashboard for Kubernetes, support for SSH key authentication, support for configuring user accounts with their authorised keys, and compatibility with multipath disks.

Meanwhile, in Fedora 23 Workstation Beta, the fun keeps coming with a preview of GNOME 3.18. Changes to the Software application will allow it to offer firmware updates and access to LibreOffice 5. Improvements have also been made to Wayland, with the ultimate aim of making it the default display server in a future release.

Sadly, that’s where the thrill ride ends, as the Cloud Beta contains very little that is new – but we are warned to stay tuned for news of Fedora 23 Atomic Host, said to be coming soon. We’re literally on the edge of our seats and will bring you the news as soon as we get it.


Microsoft Offering Office 365 For Half Price

September 24, 2015 by mphillips  
Filed under Computing

One of the obstacles Microsoft faces is that some users haven’t upgraded their Office suite for several years, despite the company releasing new versions.

Customers pay a relatively large sum for access to apps like Word and Excel, and once they have the functionality they need, there’s little incentive to upgrade. It makes sense from a consumer standpoint, but it means Microsoft isn’t making as much money as it could from the widespread use of Office.

That could be the impetus behind a promotion Microsoft announced Tuesday: Customers who have Windows 10 on their computer but are still running Office 2010 or earlier can now get a one-year Office 365 Personal subscription for $35. That’s half the price Microsoft usually charges for the Personal package, which lets users install its productivity suite on one tablet, one computer and one phone.

It comes the same day that Microsoft launched Office 2016, the latest update to its productivity suite. The update includes a variety of new features, such as integration with Bing search and a “Tell Me” search box that helps users find functionality inside the Office apps without having to comb through a maze of menus.

What’s more, the update enhances collaboration between Office users. One of the marquee features is real-time co-authoring in Word’s desktop app, which lets multiple people work on the same document at the same time and see the edits of other users in real time. The success of that feature relies on the use of Office 2016, which means it’s in Microsoft’s interest to get more people onto Office 365.

Of course, the promotion is only good for a year — users have to pay the full $70 for an Office 365 Personal subscription once their promotional period is up. It’s not clear when the deal will end, either, though it seems unlikely that Microsoft will keep it around forever.





Facebook At Work Gears Up For Enterprise Users

September 22, 2015 by mphillips  
Filed under Around The Net

It may not be long before you can use Facebook at work without getting into trouble for accessing social media.

Facebook is expanding the beta test of its social network for the enterprise and hopes to launch it in the next few months.

The company confirmed to Computerworld that Facebook at Work – a desktop service and mobile app designed to connect people in an enterprise – is expected to be available for free before the end of the year.

“We’re currently rolling out Facebook at Work to hundreds more companies,” a Facebook spokesperson said in an email. “We’re still in test mode but hope to launch more broadly in the coming months. Companies of all sizes are testing Facebook at Work and seeing early signs of increased productivity both on desktop and mobile.”

The enterprise-focused service has been in beta with about 100 companies since January, though the build-out of the project has been in the works for nearly a year.

Facebook at Work is reportedly designed to give users the same look, feel and functions as the personal social network they’re likely fairly familiar with. However, the company has said users’ enterprise pages won’t be connected with their personal pages and information won’t be shared between them.

The company, though it carries a lot of muscle as the world’s largest social network, is entering an enterprise collaboration market with some hefty competitors, including Cisco Systems, Microsoft’s Yammer, Slack, Jive Software and IBM.

While Facebook has tremendous name recognition, the question remains whether companies will welcome it into their walled-off confines, especially considering Facebook’s history of privacy gaffes and penchant for videos of puppies and kids.




Are the iPhone 6s Sales Really Off The Chart?

September 22, 2015 by Michael  
Filed under Mobile

The Apple Press has been flat-out claiming that the iPhone 6S is doing really well, based on the word of Apple CEO Tim Cook – however, one analyst is unconvinced.

Andy Hargreaves of Pacific Crest has warned his clients that demand for the iPhone 6S may be meaningfully lower than last year’s model.

Hargreaves based his figures on Google search volume, device shipment availability, and third-party surveys.

He added that a lack of quantitative statements from Apple and the wireless carriers also points to weak iPhone demand.

Hargreaves is one of the top analysts on Wall Street: his picks average a 33 percent one-year return with a 70 percent success rate, and he is ranked in the top 1 percent of all analysts.

“Apple’s statement appears to be a statement on supply. Relative to demand, the preponderance of data points suggests that demand for the iPhone 6s is lower than it was for the iPhone 6, possibly meaningfully so. This includes Google search data, device shipment times, third-party surveys, a lack of comments from carriers, and a lack of quantitative comment on pre-orders in Apple’s statement.”

Search volume for the iPhone 6S is 75 percent below that of last year’s iPhone 6, and 25 percent lower than even the iPhone 5S’s, according to Google Trends, Hargreaves said.

Needless to say, the Tame Apple Press is furious. Fortune Magazine said it was amazing that the analyst referred to Google search data as a way of calling BS on Tim Cook’s statement. After all, Apple always tells the truth and never lies to its users.

However, saner investment hacks who normally believe in Apple agree that Hargreaves is onto something.

It highlights fears on Wall Street that Apple relies too much on the iPhone. Weaker sales would send a signal to the market that the company has lost its innovative bent and is beginning to lose its “cool factor”; they could also signal the end of Jobs’ Mob’s marvellous run as a growth stock.



Apple Drops iCloud Storage Plan Prices

September 22, 2015 by mphillips  
Filed under Around The Net

For the second time in as many years, Apple dropped prices for its expanded iCloud storage plans, putting costs in line with rivals like Google, Microsoft and Dropbox.

Apple announced changes to iCloud extra storage pricing earlier this month at the event where it unveiled new iPhones, the larger iPad Pro and a revamped Apple TV.

Although the Cupertino, Calif., company did not boost the amount of free storage space — as Computerworld speculated it might — and instead continued to provide just 5GB of iCloud space gratis, it bumped up the $0.99 per month plan from 20GB to 50GB, lowered the price of the 200GB plan by 25% to $2.99 monthly, and halved the 1TB plan’s price to $9.99.

Apple also ditched last year’s 500GB plan, which had cost $9.99 monthly.

The new prices are in line with the competition; in one case, Apple’s was lower.

Google, for example, hands out 15GB of cloud-based Google Drive storage for free — triple Apple’s allowance — and charges $1.99 monthly for 100GB and $9.99 each month for 1TB. The smaller-sized plan is 33% more per gigabyte than Apple’s 200GB deal, and Google’s 1TB plan is priced the same as Apple’s.

Microsoft also gives away 15GB. Additional storage costs $1.99 monthly for 100GB — the same price as Google Drive — while 200GB runs $3.99 per month, 33% higher than Apple’s same-sized plan.
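Those per-gigabyte comparisons are easy to verify. A quick sketch using the prices quoted above (treating 1TB as 1,000GB for simplicity):

```python
# Quoted monthly prices (USD) and plan sizes (GB) from the article
plans = {
    "Apple 50GB":      (0.99, 50),
    "Apple 200GB":     (2.99, 200),
    "Apple 1TB":       (9.99, 1000),
    "Google 100GB":    (1.99, 100),
    "Google 1TB":      (9.99, 1000),
    "Microsoft 100GB": (1.99, 100),
    "Microsoft 200GB": (3.99, 200),
}

# Price per gigabyte for each plan
per_gb = {name: price / size for name, (price, size) in plans.items()}

# Google's 100GB plan versus Apple's 200GB plan, per gigabyte
premium = per_gb["Google 100GB"] / per_gb["Apple 200GB"] - 1
print(f"Google 100GB costs {premium:.0%} more per GB than Apple 200GB")
# → prints "Google 100GB costs 33% more per GB than Apple 200GB"

# Microsoft's 200GB plan versus Apple's same-sized plan
premium_ms = plans["Microsoft 200GB"][0] / plans["Apple 200GB"][0] - 1
print(f"Microsoft 200GB costs {premium_ms:.0%} more than Apple 200GB")
# → prints "Microsoft 200GB costs 33% more than Apple 200GB"
```

Both of the article’s “33%” figures check out, and the per-gigabyte framing makes it clear why Apple’s 200GB tier is the bargain of the bunch.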

Microsoft does not sell a separate 1TB OneDrive plan but instead directs customers to Office 365 Personal, the one-user subscription to the Office application suite. As part of the subscription, customers are given 1TB of OneDrive space. Office 365 Personal costs $6.99 monthly or $69.99 annually.