Will Big Huge Game Be Able To Make A Comeback?

October 30, 2014 by Michael  
Filed under Gaming

Brian Reynolds has bought the rights to Big Huge Games from the State of Rhode Island at auction, reopened the studio, and teamed with Nexon to deliver a new mobile title called DomiNations.

The game might be inspired by a lot of games, but the basic idea is that you are the leader of a Stone Age tribe and you have to guide your tribe through civilization and human history. You can form alliances, trade with friends, and raid your enemies.

Reynolds has not said what is next for the new Big Huge Games, but if DomiNations is successful, it could fund more complex projects for console or PC, according to our sources.

Courtesy-Fud

IBM To Offer Assistance To Battle Ebola

October 28, 2014 by Michael  
Filed under Around The Net

IBM is helping to contain the Ebola outbreak with tracking software that acts as a platform for sharing information about the disease.

Backed by supercomputer-powered, cloud-based software, IBM’s communications and data analysis system allows African citizens to communicate their concerns and report cases of the virus with voice calls or toll-free SMS directly to the government.

The data from the messages and locations can then be used by government agencies and health bodies to mobilise resources where they are most needed across the country.

It can also be used to find specific regions with growing numbers of suspected Ebola cases which require urgent supplies, as well as speeding up response times for body collection and burial.
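IBM has not published the system's internals, but the flow described above (SMS reports in, region-level hotspots out) can be sketched as a simple aggregation. Everything here is hypothetical: the region names, the threshold, and the data shape are illustrative assumptions, not details from IBM's deployment.

```python
from collections import Counter

def regions_needing_supplies(reports, threshold=3):
    """Count suspected-case reports per region and flag any region
    whose tally meets the threshold for urgent supplies.
    `reports` is a list of (region, message) tuples, e.g. parsed
    from toll-free SMS submissions."""
    tally = Counter(region for region, _ in reports)
    return sorted(r for r, n in tally.items() if n >= threshold)

# Hypothetical sample data: three reports from Kenema, one from Bo.
reports = [("Kenema", "suspected case"), ("Kenema", "suspected case"),
           ("Bo", "burial request"), ("Kenema", "suspected case")]
print(regions_needing_supplies(reports))  # ['Kenema']
```

In a real deployment the output would feed dashboards used by government agencies to direct supplies and burial teams, rather than a simple print.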

The software was set up via a partnership between IBM’s recently established Africa research lab and Sierra Leone’s Open Government Initiative.

IBM’s chief scientist at the African research centre, Dr Uyi Stewart, said that the firm saw the need to quickly develop a system to enable communities directly affected by Ebola to provide valuable insight about how to fight it.

“Using mobile technology, we have given them a voice and a channel to communicate their experiences directly to the government,” he said.

Affected countries such as Sierra Leone have already benefited from the system, which has seen expedited deliveries of essential items such as soap and electricity.

The system also takes advantage of radio broadcasts to encourage people to get in touch and express their opinions about the outbreak. The general public are being alerted to the entire programme via this medium.

“Radio is a powerful medium in Africa but its potential to gather and analyse audience feedback has not been fully seized,” added Dr Sharath Srinivasan, director of the Centre of Governance and Human Rights at Cambridge University.

“We are working with IBM to offer people across Sierra Leone a channel to voice their opinions and, crucially, to ensure that the data is rapidly analysed and turned into valuable insight about the effectiveness of public service announcements and possible public misconceptions about Ebola.”

IBM said it is currently looking to extend the work to analyse mobile phone signal data in order to monitor and track population movement, enabling scientists to map and predict the spread of disease.

Last week, it emerged that cyber criminals have been taking advantage of the recent Ebola outbreak to trick unsuspecting web users into downloading malware sent in emails that purport to come from the World Health Organisation (WHO).

Uncovered by security researchers at Trustwave, the malware was flagged when it appeared that criminals had crafted bogus WHO emails encouraging people to open a .RAR attachment to find out how they can protect themselves against Ebola.

Courtesy-TheInq

HP’s Helion Goes Commercial

October 27, 2014 by Michael  
Filed under Computing

HP has announced general availability of its Helion OpenStack cloud platform and Helion Development Platform based on Cloud Foundry.

The Helion portfolio was announced by HP earlier this year, when the firm disclosed that it was backing the OpenStack project as the foundation piece for its cloud strategy.

At the time, HP issued the HP Helion OpenStack Community edition for pilot deployments, and promised a full commercial release to follow, along with a developer platform based on the Cloud Foundry code.

HP revealed today that the commercial release of HP Helion OpenStack is now available as a fully supported product for customers looking to build their own on-premise infrastructure-as-a-service cloud, along with the HP Helion Development platform-as-a-service designed to run on top of it.

“We’ve now gone GA [general availability] on our first full commercial OpenStack product and actually started shipping it a couple of weeks ago, so we’re now open for business and we already have a number of customers that are using it for proof of concept,” HP’s CloudSystem director for EMEA, Paul Morgan, told The INQUIRER.

Like other OpenStack vendors, HP is offering more than just the bare OpenStack code. Its distribution is underpinned by a hardened version of HP Linux, and is integrated with other HP infrastructure and management tools, Morgan said.

“We’ve put in a ton of HP value add, so there’s a common look and feel across the different management layers, and we are supporting other elements of our cloud infrastructure software today, things like HP OneView, things like our Cloud Service Automation in CloudSystem,” he added.

The commercial Helion build has also been updated to include Juno, the latest version of the OpenStack framework released last week.

Likewise, the HP Helion Development Platform takes the open source Cloud Foundry platform and integrates it with HP’s OpenStack release to provide an environment for developers to build and deploy cloud-based applications and services.

HP also announced an optimised reference model for building a scalable object storage platform based on its OpenStack release.

HP Helion Content Depot is essentially a blueprint to allow organisations or service providers to put together a highly available, secure storage solution using HP ProLiant servers and HP Networking hardware, with access to storage provided via the standard OpenStack Swift application programming interfaces.

Morgan said that the most interest in this solution is likely to come from service providers looking to offer a cloud-based storage service, although enterprise customers may also deploy it internally.

“It’s completely customisable, so you might start off with half a petabyte, with the need to scale to maybe 2PB per year, and it is a certified and fully tested solution that takes all of the guesswork out of setting up this type of service,” he said.

Content Depot joins the recently announced HP Helion Continuity Services as one of the growing number of solutions that the firm aims to offer around its Helion platform, he explained. These will include point solutions aimed at solving specific customer needs.

The firm also last month started up its HP Helion OpenStack Professional Services division to help customers with consulting and deployment services to implement an OpenStack-based private cloud.

Pricing for HP Helion OpenStack comes in at $1,200 per server with 9×5 support for one year. Pricing for 24×7 support will be $2,200 per server per year.

“We see that is very competitively priced compared with what else is already out there,” Morgan said.

Courtesy-TheInq

Latest Ubuntu Server Goes After The Enterprise

October 27, 2014 by Michael  
Filed under Around The Net

Canonical has released Ubuntu Server 14.10 for data centre server and cloud applications, offering its latest technology for scale-out infrastructure.

The British software company claims that this latest release of Ubuntu Server features the fastest, most secure hypervisors available on bare metal, as well as the latest in container technologies with Docker 1.2.

Canonical says that Ubuntu Server 14.10 with Docker 1.2 is unique in that it offers user-level container management and includes support enabling higher-density cloud operations than a virtualisation layer allows.

The firm is targeting large enterprises that want to deploy what it calls “scale-out” cloud computing with this release.

Canonical says that Ubuntu 14.10 includes some of the most valuable and complex cloud software technologies in use today, including Cloud Foundry, ElasticSearch, Hadoop with Hive and PigLatin as well as real-time data analytics with Storm big data technology.

The firm says that an improved GUI for Juju service orchestration greatly simplifies deployment and scaling of these complex software infrastructures on public and private clouds, or on bare metal hardware through what it terms “metal as a service” (MaaS), claiming that full deployments take just minutes.

Canonical noted that its MaaS 1.6 hardware provisioning tool in Ubuntu Server 14.10 now supports a number of different operating systems as guests, including Windows Server with Hyper-V, CentOS and openSUSE.

Canonical also said that Ubuntu 14.10 presents a consistent operating system experience for all major hardware architectures: ARM, ARM64, x86, x86-64 and Power8. ARM64 support is added for the launch of next-generation hyperscale, hyperdense servers from HP and AMD.

The firm added that Ubuntu Server 14.10 includes the addition of bcache, which adds disk acceleration to extend SSD performance to large, cost-effective rotating disks.

For cloud deployments, Canonical said that Ubuntu Server 14.10 includes the latest OpenStack Juno, which includes more granular policy controls for object storage as well as initial support for network function virtualization.

Courtesy-TheInq

China Touts Homegrown Servers Amid Cybersecurity Concerns

October 27, 2014 by mphillips  
Filed under Computing

A Chinese firm has developed the country’s first homegrown servers, built entirely out of domestic technologies including a processor from local chip maker Loongson Technology.

China’s Dawning Information Industry, also known as Sugon, has developed a series of four servers using the Loongson 3B processor, the country’s state-run Xinhua News Agency reported Thursday.

“Servers are crucial applications in a country’s politics, economy, and information security. We must fully master all these technologies,” Dawning’s vice president Sha Chaoqun was quoted as saying.

The servers, including their operating systems, have all been developed from Chinese technology. The Loongson 3B processor inside them has eight cores made with a total of 1.1 billion transistors built using a 28-nanometer production process.

The Xinhua report quoted Li Guojie, a top computing researcher in the country, as saying the new servers would ensure that the security around China’s military, financial and energy sectors would no longer be in foreign control.

Dawning was contacted on Friday, but an employee declined to offer more specifics about the servers. “We don’t want to promote this product in the U.S. media,” she said. “It involves proprietary intellectual property rights, and Chinese government organizations.”

News of the servers is the latest in China’s ongoing push to build up its own homegrown technology. Work is being done on local mobile operating systems, supercomputing and chip making, with much of it government-backed. Earlier this year, China outlined a plan to make the country a major player in the semiconductor space.

But it also comes at a time when cybersecurity has become a major concern for the Chinese government, following revelations about the U.S. government’s own secret surveillance programs. “Without cybersecurity there is no national security,” declared China’s Xi Jinping in March, as he announced plans to turn the country into an “Internet power.”

Two months later, China threatened to block companies from selling IT products to the country if they failed to pass a new vetting system meant to comb out secret spying programs.

Dawning, which was founded using local government-supported research, is perhaps best known for developing some of China’s supercomputers. But it also sells server products built with Intel chips. In this year’s first quarter, it had an 8.7 percent share of China’s server market, putting it in 7th place, according to research firm IDC.

Is Unity Up to Something Big?

October 24, 2014 by Michael  
Filed under Computing

Earlier today Unity Technologies caused quite a stir in the games industry with the announcement that former Electronic Arts chief exec John Riccitiello would be taking over as CEO from David Helgason. While EA struggled to make shareholders happy, Unity has been seeing tremendous growth, becoming a favorite toolset for large and small publishers and especially indies. In fact, the company serves over 600,000 monthly developers. But what does Unity really have up its sleeve? Is the hiring of a notable leader like Riccitiello a sign that the company is indeed being groomed for a buyout or public offering?

“John Riccitiello’s corporate moves will rightfully inspire speculation about major changes in the companies involved and as Unity is the dominant independent development platform, what happens next could affect most developers and publishers outside of the top ten,” remarked independent analyst Billy Pidgeon. “An acquisition is very possible although Unity CTO Joachim Ante has denied this. Unity needs to be independent and available to all to retain and grow its value, so a sale to a major publisher or developer would sharply decrease the company’s revenue flow. But a buyer outside the industry could allow Unity to remain somewhat independent, although clients might be wary of doing business with Unity’s new owner.”

EEDAR’s Patrick Walker, head of insights and analytics, largely agreed with Pidgeon, commenting, “While the stature of Riccitiello as a hire and his interest in helming the Unity ship suggest that there are big plans in the works for the company, it is unlikely that these plans are focused on the short term, such as preparation for a near-term buyout. A buyout has been rumored for a while, and the Unity executive team, including founder David Helgason and CTO Joachim Ante, has been consistent in their messaging, focusing on the company mission rather than the pursuit of a buyout. More likely, Riccitiello is being brought on board to spur growth for a longer-term play, such as an eventual IPO or larger-scale buyout.”

Regardless of whether a longer-term buyout is in the cards, Riccitiello has the experience to help accelerate Unity’s growth in the next few years, most believe.

“Unity is a well-positioned company with several paths to increase growth. While game publishing is one route to spur growth, there is also an opportunity for the company to leverage the strengths, such as cross-platform flexibility, that have given it such broad penetration in the indie market to increase penetration in other development verticals,” Walker continued. “Riccitiello has an ideal background, having led major companies both inside and outside the games industry and having served on the Unity board for the past year, to drive partnerships that will help grow Unity as a major development platform across the full spectrum of publishers and developers.”

Wedbush Securities’ Michael Pachter added, “He is certainly capable of leading them, and also well equipped to sell the company. [But] I don’t know the reason for the change.”

Perhaps one major reason for the change is to offload some of the business responsibility from Helgason who may wish to focus more on product development.

“Unity has been growing quickly for several years. The company now has over 300 employees and its technology is being used by hundreds of thousands of developers on practically every platform out there. I suspect that Dave recognized some time ago that the company had to get an experienced business manager at the helm or risk flying off the rails at some point, and that’s exactly what JR is,” observed Lewis Ward, IDC’s gaming research director.

“Some people just aren’t cut out to be CEOs of big businesses – just look at Notch. I suspect that Dave is going to be happier staying focused on the core product strategy and building relationships with studios and indie developers. From JR’s perspective, it’s a great opportunity to ride the beast that has been Unity growth over the past 3+ years. It’s a remarkable story, and I think John is probably going to enjoy the role and stepping back into an important spotlight in the industry.”

Courtesy-TheInq

Survey Reveals Workers Using Personal Devices For Job-related Tasks

October 23, 2014 by mphillips  
Filed under Around The Net

Many workers use their personally owned smartphones and other technology devices for job tasks, but a new survey reveals a big percentage are doing so without their employer’s knowledge.

Market research firm Gartner surveyed 4,300 U.S. consumers in June who work at large companies (with more than 1,000 employees) and found 40% used personally owned smartphones, tablets, laptops or desktops as a primary or supplemental business device.

That 40% might not be unusual, but more surprisingly, Gartner found that 45% of workers not required to use a personal device for work were doing so without their employer’s knowledge.

“Almost half [are using their device] without their employer’s awareness,” said Gartner analyst Amanda Sabia in an interview.

“Are those without employer’s awareness violating a rule? That would depend on the employer,” Sabia added. “The point is that some CIOs are underestimating [the number of] employees using their devices and should be prepared for this.”

The Gartner survey found the most popular personally owned device used for work was a desktop computer, at 42%, closely followed by a smartphone, at 40%, a laptop, at 36%, and a tablet, at 26%.

“The lines between work and play are becoming more and more blurred as employees choose to use their own device for work purposes whether sanctioned by an employer or not,” Sabia said. “Devices once bought for personal use are increasingly used for work.”

U.S. Government Investigating Medical Device Hacking Threats

October 23, 2014 by mphillips  
Filed under Around The Net

The U.S. Department of Homeland Security is looking into about two dozen cases of suspected cybersecurity flaws in medical devices and hospital equipment that officials believe could be exploited by hackers, a senior official at the agency told Reuters.

The products under review by the agency’s Industrial Control Systems Cyber Emergency Response Team, or ICS-CERT, include an infusion pump from Hospira Inc and implantable heart devices from Medtronic Inc and St Jude Medical Inc, according to other people familiar with the cases, who asked not to be identified because the probes are confidential.

These people said they do not know of any instances of hackers attacking patients through these devices, so the cyber threat should not be overstated. Still, the agency is concerned that malicious actors may try to gain control of the devices remotely and create problems, such as instructing an infusion pump to overdose a patient with drugs, or forcing a heart implant to deliver a deadly jolt of electricity, the sources said.

The senior DHS official said the agency is working with manufacturers to identify and repair software coding bugs and other vulnerabilities that hackers can potentially use to expose confidential data or attack hospital equipment. He declined to name the companies.

“These are the things that shows like ‘Homeland’ are built from,” said the official, referring to the U.S. television spy drama in which the fictional vice president of the United States is killed by a cyber attack on his pacemaker.

“It isn’t out of the realm of the possible to cause severe injury or death,” said the official, who did not want to be identified due to the sensitive nature of his work.

Hospira, Medtronic and St Jude Medical declined to comment on the DHS investigations. All three companies said they take cybersecurity seriously and have made changes to improve product safety, but declined to give details.

Google Launches Two-Factor Security Key

October 23, 2014 by Michael  
Filed under Computing

Google has added account authentication via USB stick to its two-step verification process, offering users a more convenient way to sign in to their accounts in a secure manner.

As detailed in a Google security blog post, a compatible USB Security Key can now be used to log in to Google accounts with two-step authentication.

The addition of the USB Security Key, Google claims, ensures that the log-in website is an actual Google website and not a fake.

Two-step authentication normally asks the user to enter a secret code sent to their phone in addition to entering their password online.

This process prevents potential attackers using passwords that might have been stolen or guessed in order to impersonate account holders, as presumably they won’t have the user’s phone to enter the code.

The USB Security Key adds another layer of protection to the process. Instead of entering a secret code, the user can simply insert their USB Security Key in their computer and tap when prompted in Google’s Chrome web browser.

Google said: “When you sign into your Google Account using Chrome and Security Key, you can be sure that the cryptographic signature cannot be phished.”

The USB Security Key implements the open Universal 2nd Factor protocol promoted by the FIDO Alliance, which means it can be used by other web browsers in addition to Chrome and other websites in addition to Google’s.
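Google's post doesn't spell out the protocol mechanics, but U2F's phishing resistance comes from the device signing the server's challenge together with the origin the browser reports. The sketch below illustrates that origin-binding idea, substituting an HMAC for the real ECDSA device signature; the key, URLs and function names are this sketch's assumptions, not the actual protocol.

```python
import hashlib
import hmac

DEVICE_KEY = b"secret-held-only-on-the-usb-key"  # never leaves the device

def sign(challenge: bytes, origin: str) -> bytes:
    # The device signs the server's challenge together with the origin
    # the browser reports, so a response minted on a phishing site
    # cannot be replayed against the real one.
    return hmac.new(DEVICE_KEY, challenge + origin.encode(), hashlib.sha256).digest()

challenge = b"random-server-nonce"
real = sign(challenge, "https://accounts.google.com")
fake = sign(challenge, "https://accounts.goog1e.example")
print(hmac.compare_digest(real, fake))  # False: the forged origin fails verification
```

This is why "the cryptographic signature cannot be phished": a look-alike domain produces a signature the real server will reject, with no code for the user to mistype into the wrong site.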

Google has recently enhanced the level of security it provides, and the extension of two-step authentication to include a physical security key is simply another step.

Courtesy-TheInq

Microsoft Releases First Windows 10 Update

October 23, 2014 by mphillips  
Filed under Computing

Microsoft has issued the first update for Windows 10 Technical Preview, launching its fast-paced release strategy.

The update, designated as Build 9860, followed the Oct. 1 release of the preview, which Microsoft has offered businesses and technology enthusiasts to give potential customers a look at the work in progress and collect feedback during development.

The Oct. 1 version of Windows 10 was labeled Build 9841.

“Sometimes [updates] will be more frequent and sometimes there will be longer gaps, but they will always be chock full of changes and improvements, as well as some bugs and things that are not quite done,” wrote Gabe Aul, of Microsoft’s Operating Systems Group on a company blog.

Aul said that Build 9860 had been handed to his group only a week ago, and repeated earlier warnings by other Microsoft managers that the preview remains incomplete and unpolished.

Although rapid iterations are nothing new to preview or beta software, Microsoft plans to accelerate the delivery of updates — ones that will include not only security patches and performance fixes, but also new features — once Windows 10 officially ships in mid-2015.

Updates will ship as often as monthly for consumers, while businesses will be able to choose between that and two additional tempos that Gartner has tagged as “near-consumer speed” and “long-term servicing.” The former will roll up the “consumer-speed” updates every four to six months to versions that fast-acting enterprises will test and deploy, while the latter will remain feature- and UI-static for as long as two to three years, receiving only security updates.

Other analysts have contended that Microsoft is pushing frequent updates to Windows 10 Technical Preview as much to test the process — both the back-end Windows Update service and the Windows 10 clients’ ability to absorb the changes and smoothly install the updates — as for the company’s stated reasons of gathering feedback and offering users an early look.

“Changes in Windows Update were put in place to make this possible,” Wes Miller, an analyst with Directions on Microsoft, said in an interview earlier this month. “The biggest question for Microsoft is how the updating process works with the Technical Preview.”

In the preview, customers have a choice of only two update frequencies: “Fast” or “Slow.”

Build 9860 will be delivered automatically to most PCs running Windows 10 within days, but users can manually initiate the process by going to “PC Settings,” choosing “Update and recovery” and then “Preview builds,” and finally clicking the “Check Now” button.

Aul said that the download would weigh in at between 2GB and 2.7GB, and that the reboot, the reconstruction of the OS’s search index, and the syncing of OneDrive would take “longer than normal” and “some time.”

Microsoft will ship a second consumer-oriented preview in early 2015, but it’s virtually certain that the firm will provide more-or-less-monthly updates to the Technical Preview between now and then.

The Xbox One Goes Social Next Month

October 22, 2014 by Michael  
Filed under Gaming

Microsoft has detailed its November Xbox One update, explaining that it will throw a bucketful of new features into the console.

The firm polishes the console experience on a monthly basis and this month sees it swathe the device in tweaks and social networking positives.

Whether you use the console to browse the internet, talk to people, do social networking, watch television, or even play games, you will see some sort of improvement, according to spokeschap Major Nelson.

“We’re bringing you new and exciting ways to watch TV and interact with the Xbox Live gaming community in this month’s Xbox One system update preview. Today, we will begin rolling out a ton of new features to members of the Xbox One preview programme,” said Nelson in a blog that also introduces an excited video walkthrough.

Cosmetic features include the ability to change the background on your Xbox One, and even use achievements from games in your wallpaper.

Braggish players will be able to add their best clips to their profile page and generally swagger around the place, while people who like to crow on a range of platforms will be able to tweet clips from games.

Users can also share their location in their biography pages, and through the Smartglass app can see when anyone has checked out their profile.

Smartglass users can also check out their friends’ activities on the Xbox One, and can line up downloads of content, for example the free titles provided to Gold level subscribers.

The Xbox One store has been improved and Microsoft said that this would make it “easier to find and download apps for your Xbox One”.

The November update will be out, unsurprisingly, next month.

Courtesy-TheInq

Will Google’s Algorithm Change Stop Piracy?

October 22, 2014 by Michael  
Filed under Around The Net

Nosey Google has updated its search engine algorithms in an attempt to restrict piracy web sites appearing high in its search rankings.

The update will mean piracy sites are less likely to appear when people search for music, films and other copyrighted content.

The decision to roll out the search changes was announced in a refreshed version of a How Google Fights Piracy report, which was originally published in September 2013.

However, this year’s updated report features a couple of developments, including changes to ad formats and an improved DMCA demotion search signal.

The move is likely to be a result of criticism received from the entertainment industry, which has argued that illegal sites should be “demoted” in search results because they enable people to find sites to download media illegally.

The biggest change in the Google search update will be new ad formats in search results on queries related to music and movies that help people find legitimate sources of media.

For example, for the relatively small number of queries for movies that include terms like ‘download’, ‘free’, or ‘watch’, Google has instead begun listing legal services such as Spotify and Netflix in a box at the top of the search results.

“We’re also testing other ways of pointing people to legitimate sources of music and movies, including in the right-hand panel on the results page,” Google added.

“These results show in the US only, but we plan to continue investing in this area and to expand it internationally.”

An improved DMCA demotion signal in Google search is also being rolled out as part of the refresh, which down-ranks sites for which Google has received a large number of valid DMCA notices.

“We’ve now refined the signal in ways we expect to visibly affect the rankings of some of the most notorious sites. This update will roll out globally starting next week,” Google said, adding that it will also be removing more terms from autocomplete, based on DMCA removal notices.
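Google has not published how the demotion signal is computed. The toy function below, with an invented log-damping formula, only illustrates the down-ranking idea: the more valid DMCA notices a site has accumulated, the lower its pages score in ranking.

```python
import math

def demoted_rank_score(base_score: float, valid_dmca_notices: int) -> float:
    """Toy ranking penalty: dampen a page's score as the number of
    valid DMCA notices against its site grows. The real signal is
    unpublished; this only illustrates the down-ranking idea."""
    return base_score / (1.0 + math.log1p(valid_dmca_notices))

clean = demoted_rank_score(10.0, 0)        # no notices: score untouched
notorious = demoted_rank_score(10.0, 5000) # notice-heavy site sinks in results
print(clean > notorious)
```

A logarithm is used here so that the penalty grows with notice volume but a handful of disputed claims cannot bury a mostly legitimate site; whether Google's refined signal behaves this way is not known.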

The new measures might be welcomed by the entertainment industry, and they are likely to encourage more people to use legal alternatives such as Spotify and Netflix rather than buying more physical media.

Courtesy-TheInq

MasterCard Testing A New Card With Fingerprint Reader

October 20, 2014 by mphillips  
Filed under Around The Net

MasterCard is trying out a contactless payment card with a built-in fingerprint reader that can authorize high-value payments without requiring the user to enter a PIN.

The credit-card company showed a prototype of the card in London on Friday along with Zwipe, the Norwegian company that developed the fingerprint recognition technology.

The contactless payment card has an integrated fingerprint sensor and a secure data store for the cardholder’s biometric data, which is held only on the card and not in an external database, the companies said.

The card also has an EMV chip, used in European payment cards instead of a magnetic stripe to increase payment security, and a MasterCard application to allow contactless payments.

The prototype shown Friday is thicker than regular payment cards to accommodate a battery. Zwipe said it plans to eliminate the battery by harvesting energy from contactless payment terminals and is working on a new model for release in 2015 that will be as thin as standard cards.

Thanks to its fingerprint authentication, the Zwipe card has no limit on contactless payments, said a company spokesman. Other contactless cards can only be used for payments of around €20 or €25, and some must be placed in a reader and a PIN entered once the transaction reaches a certain threshold.
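The approval behaviour described above can be sketched as a simple decision rule. The floor limit, parameter names and the biometric override below are this sketch's assumptions for illustration, not MasterCard's or Zwipe's published rules.

```python
def authorize_contactless(amount_eur: float, fingerprint_ok: bool,
                          pin_entered: bool, floor_limit: float = 25.0) -> bool:
    """Toy approval rule: ordinary contactless cards need a PIN above
    the floor limit, while a card with an on-card fingerprint match
    has no contactless ceiling."""
    if fingerprint_ok:
        return True  # biometric match stands in for the PIN at any amount
    return amount_eur <= floor_limit or pin_entered

print(authorize_contactless(19.0, False, False))   # True: under the floor limit
print(authorize_contactless(180.0, False, False))  # False: needs a PIN
print(authorize_contactless(180.0, True, False))   # True: fingerprint verified
```

The key design point is that the fingerprint match happens on the card itself, against a template stored only on the card, so the terminal and network never see biometric data.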

Norwegian bank Sparebanken DIN has already tested the Zwipe card, and plans to offer biometric authentication and contactless communication for all its cards, the bank has said.

MasterCard wants cardholders to be able to identify themselves without having to use passwords or PINs. Biometric authentication can help with that, but achieving simplicity of use in a secure way is a challenge, it said.

Openstack Releases Juno

October 20, 2014 by Michael  
Filed under Computing

Openstack has reached another major milestone today with the release of Juno, its newest version.

The latest version of the cloud computing stack contains 342 new features, 3,219 bug fixes, almost 500,000 lines of modified documentation and a new Architecture Design Guide.

Some 1,419 unique contributors, including representatives from 133 companies, made it all happen over six months.

Last month it was revealed that HP had overtaken Red Hat in terms of overall contributions to Juno, and is closing in on Red Hat’s overall lead.

However, Red Hat has shifted focus more towards the cloud market in recent strategy announcements, so that lead could widen again.

The new version adds storage policies, data processing provisioning for Hadoop and Spark and takes the initial steps towards being a platform for Network Function Virtualisation (NFV) in a future release, meaning that it would be capable of managing a number of functions currently fulfilled by expensive software.

Other new features include improvements to Nova Compute: a rescue-mode enhancement with the option to boot from alternative images via locally attached disks, update scheduling, and internationalisation updates.

For networking, the Neutron module includes IPv6 and third-party driver testing, plug-ins, and migration support from Nova to Neutron.

The Keystone identity service allows users to share credentials for private and public OpenStack clouds.

The Heat engine, which manages orchestration, includes advanced rollback options in the event of failed deployment and the option for administrators to delegate creation of resources to non-admins.

The Horizon Dashboard now offers Hadoop deployment in a few clicks, enabling rapidly scalable data processing with custom parameters.

Finally, the Trove database allows users to manage relational database services in the OpenStack environment.

Of course, OpenStack waits for no-one. With this release safely out, work now begins on the next version, codenamed Kilo, which is due in April 2015.

Courtesy-TheInq

FCC To Explore Next-Generation Wireless Networks

October 20, 2014 by mphillips  
Filed under Mobile

U.S. Federal Communications Commissioner Jessica Rosenworcel said on Friday that U.S. regulators will look “to infinity and beyond” to harness new technology that can help build a new generation of mobile wireless connections.

The FCC on Friday voted unanimously to open a so-called “notice of inquiry” into what it and the industry can do to turn a new swath of very high-frequency airwaves, previously deemed unusable for mobile networks, into mobile-friendly frequencies.

The FCC’s examination would serve as a regulatory backdrop for research into the next generation of wireless technology, sometimes referred to as 5G and which may allow wireless connections to carry a thousand times more traffic.

“Today we’re stepping in front of the power curve,” FCC Chairman Tom Wheeler said on Friday at the meeting.

In question are frequencies above 24 gigahertz (GHz), sometimes called millimeter waves, that have previously been deemed technically unwieldy for mobile connections, though they have the potential to carry large amounts of data and offer the promise of lightning-fast speeds.

Millimeter waves work best over short distances and have required a direct line-of-sight connection to a receiver. They are now largely used for point-to-point microwave connections.

The FCC said it will study what technologies could help get around the technological and practical obstacles and what kind of regulatory regime could help a variety of technologies to flourish on those airwaves, including the potential for services other than mobile.

The U.S. wireless industry continues to work on deploying 4G connections, though some equipment manufacturers, such as Samsung, are already testing data transmission on the higher frequencies.