Kuddle, a Norwegian photo-sharing app created for children, plans to roll out a child-safe tablet with Microsoft on Dec 1, and expects to sign funding deals with several venture capital firms within weeks, its chief executive said on Monday.
The Oslo-based company said it was on track to reach its goal of one million users by year-end and plans to soon raise another $5 million of fresh funds on top of the nearly $6 million it has already raised.
“We are working with Microsoft on several child-safe devices which will be sold on our online store,” Chief Executive Ole Vidar Hestaas said. “The first device will be an iPad Mini-sized tablet priced under $100 that will be ready ahead of the Kuddle Store launch.”
“This is a child-friendly device and it is not possible to download games like GTA (Grand Theft Auto) or apps like Snapchat,” Hestaas said.
Kuddle, which bills itself as a rival to Instagram, lets parents monitor what their children publish and keeps access to content restricted, preventing strangers from seeing and sharing pictures. There are no hashtags or comments to prevent online bullying and “likes” are anonymous.
Hestaas said the company is also in talks with Samsung and Microsoft’s Nokia phones unit on similar cooperation, and that it is working on deals with European telecoms operators Telenor and Vodafone for child-safe Kuddle SIM cards to be sold separately or linked up to one of its devices.
The app is now available in seven languages, with the most significant recent growth coming from Brazil and the US.
Hestaas said he expects to conclude funding deals with several major international venture capital funds within weeks.
The firm’s present investors include Norwegian golf ace Suzann Pettersen.
PS4 is going gangbusters, 3DS continues to impress, Steam and Kickstarter have between them overseen an extraordinary revitalisation of PC gaming, and mobile gaming goes from strength to strength; yet it’s absolutely clear where the eager eyes of most gamers are turned right now. Virtual reality headsets are, not for the first time, the single most exciting thing in interactive entertainment. At the Tokyo Game Show and its surrounding events, the strongest contrast to the huge number of mobile titles on display was the seemingly boundless enthusiasm for Sony’s Morpheus and Oculus’ Rift headsets; at Oculus’ own conference in California the same week, developers were entranced by the hardware and its promise.
VR is coming; this time, it’s for real. Decades of false starts, disappointments and dodgy Hollywood depictions will finally be left behind. The tech and the know-how have finally caught up with the dreams. Immersion and realism are almost within touching distance, a deep, involved experience that will fulfil the childhood wishes of just about every gamer and SF aficionado while also putting clear blue water between core games and more casual entertainment. The graphical fidelity of mobile devices may be rapidly catching up to consoles, but the sheer gulf between a VR experience and a mobile experience will be unmistakeable.
That’s the promise, anyway. There’s no question that it’s a promise which feels closer to fulfilment than ever before. Even in the absence of a final consumer product or even a release date, let alone a killer app, the prototypes and demos we’ve seen thus far are closer to “true” virtual reality than many of us had dared to hope. Some concerns remain; how mainstream can a product that relies on strapping on a headset to the exclusion of the real world actually become? (I wouldn’t care to guess on this front, but would note that we already use technology in countless ways that would have seemed alien, anti-social or downright weird to people only a generation ago.) Won’t an appreciable portion of people get motion sickness? (Perhaps; only widespread adoption will show us how widespread this problem really is.) There’s plenty to ponder even as the technology marches inexorably closer.
One thing I found myself pondering around TGS and Oculus Connect was the slightly worrying divergence in the strategies of Sony and Oculus. A year or even six months ago, it felt like these companies, although rivals, were broadly marching in lock step. Morpheus and Rift felt like very similar devices – Rift was more “hobbyist” yet a little more technically impressive, while Morpheus was more clearly the product of an experienced consumer products company, but in essence they shared much of the same DNA.
Now, however, there’s a clear divergence in strategy, and it’s something of a concern. Shuhei Yoshida says that Morpheus is 85% complete (although anyone who has worked in product development knows that the last 15% can take a hell of a lot more than 15% of the effort to get right); Sony is seemingly feeling reasonably confident about its device and has worked out various cunning approaches to make it cost effective, from using mobile phone components through to repurposing PlayStation Move as a surprisingly effective VR control mechanism.
By contrast, Oculus Connect showed off a new prototype of Rift which is still clearly in a process of evolution. The new hardware is lighter and more comfortable – closer to being a final product, in short – but it’s also still adding new features and functionality to the basic unit. Oculus, unlike Sony, still doesn’t feel like a company that’s anywhere close to having a consumer product ready to launch. It’s still hunting for the “right” level of hardware capabilities and functionality to make VR really work.
I could be wrong; Oculus could be within a year of shipping something to consumers, but if so, they’ve got a damned funny way of showing it. Based on the tone of Oculus Connect, the firm’s hugely impressive technology is still in a process of evolution and development. It barely feels any closer to being a consumer product this year than it did last year, and its increasingly complex functionality implies a product which, when it finally arrives, will command a premium price point. This is still a tech company in a process of iteration, discovering the product they actually want to launch; for Luckey, Carmack and the rest of the dream team assembled at Oculus, their VR just isn’t good enough yet, even though it’s moving in the right direction fast.
Sony, by contrast, now feels like it’s about to try something disruptive. It’s seemingly pretty happy with where Morpheus stands as a VR device; now the challenge is getting the design and software right, and pushing the price down to a consumer friendly level by doing market-disruptive things like repurposing components from its (actually pretty impressive) smartphones. Again, it’s possible that the mood music from both companies is misleading, but right now it feels like Sony is going to launch a reasonably cost-effective VR headset while Oculus is still in the prototyping phase.
These are two very different strategic approaches to the market. The worrying thing is that they can’t both be right. If Oculus is correct and VR still needs a lot of fine-tuning, prototyping and figuring out before it’s ready for the market, then Sony is rushing in too quickly and risks seriously damaging the market potential of VR as a whole with an underwhelming product. This risk can’t be overstated; if Morpheus launches first and it makes everyone seasick, or is uncomfortable to use for more than a short period of time, or simply doesn’t impress people with its fidelity and immersion, then it could see VR being written off for another decade in spite of Oculus’ best efforts. The public are fickle and VR has cried wolf too many times already.
If, on the other hand, Sony is correct and “good enough” VR tech is pretty much ready to go, then that’s great for VR and for PS4, but potentially very worrying for Oculus, who risk their careful, evolutionary, prototype after prototype approach being upended by an unusually nimble and disruptive challenge from Sony. If this is the case (and I’ve heard little but good things about Morpheus, which suggests Sony’s gamble may indeed pay off) then the Facebook deal could be either a blessing or a curse. A blessing, if it allows Oculus to continue to work on evolving and developing VR tech, shielding them from the impact of losing first-mover advantage to Sony; a curse, if that failure to score a clear win in the first round spooks Facebook’s management and investors and causes them to pull the plug. That’s one that could go either way; given the quality of the innovative work Oculus is doing, even if Sony’s approach proves victorious, everyone should hope that the Oculus team gets an opportunity to keep plugging away.
It’s exciting and interesting to see Sony taking this kind of risk. These gambles don’t always pay off, of course – the company placed bets on 3D TV in the PS3 era which never came to fruition, for example – but that’s the nature of innovation and we should never criticise a company for attempting something truly interesting, innovative and even disruptive, as long as it passes the most basic of Devil’s Advocate tests. Sony has desperately needed a Devil’s Advocate in the past – Rolly, anyone? UMD? – but Morpheus is a clear pass, an interesting and exciting product with the potential to truly turn around the company’s fortunes.
I just hope that in the company’s enthusiasm, it understands the absolute importance of getting this right, not just being first. This is a quality Sony was famed for in the past; rather than trying to be first to market in new sectors, it would ensure that it had by far the best product when it launched. This is one of the things which Steve Jobs, a huge fan of Sony, copied from the company when he created the philosophies which still guide Apple (a company that rarely innovates first, but almost always leapfrogs the competition in quality and usability when it does adopt new technology and features). For an experience as intimate as VR – complete immersion in a headset, screens mere centimetres from your eyes – that’s a philosophy which must be followed. When these headsets reach the market, what will be most important isn’t who is first; it isn’t even who is cheapest. The consumer’s first experience must be excellent – nothing less will do. Oculus seems to get that. Sony, in its enthusiasm to disrupt, must not lose sight of the same goal.
Red Hat has announced the Fedora 21 Alpha release for Fedora developers and any brave users who want to help test it.
Fedora is the leading edge – some might say bleeding edge – distribution of Linux that is sponsored by Red Hat. It is where Red Hat and other developers do new development work that eventually appears in Red Hat Enterprise Linux (RHEL) and other Red Hat based distributions, including CentOS and Scientific Linux, among others. Therefore, what Fedora does might also appear elsewhere eventually.
The Fedora project said the release of Fedora 21 Alpha is meant for testing in order to help it identify and resolve bugs, adding, “Fedora prides itself on bringing cutting-edge technologies to users of open source software around the world, and this release continues that tradition.”
Specifically, Fedora 21 will comprise three software products, all built on the same Fedora 21 base, each a subset of the entire release.
Fedora 21 Cloud will include images for use in private cloud environments like OpenStack, AMIs for use on Amazon, and Fedora Atomic Host, a new image streamlined for running Docker containers.
Fedora 21 Server will offer data centre users “a common base platform that is meant to run featured application stacks” for use as a web server, file server, database server, or as a base for offering infrastructure as a service, including advanced server management features.
Fedora 21 Workstation will be “a reliable, user-friendly, and powerful operating system for laptops and PC hardware” for use by developers and other desktop users, and will feature the latest GNOME 3.14 desktop environment.
Those interested in testing the Fedora 21 Alpha release can visit the Fedora project website.
When Titan first came to light in 2007, most people assumed it would be Blizzard’s next big thing, ultimately taking the place of World of Warcraft which was likely to see further declines in the years ahead. Fast forward seven years, WoW clearly has been fading (down to 6.8 million subs as of June 30) but Blizzard has no MMO lined up to replace it, and that fact was really hammered home today with the surprise cancellation of Titan. In fact, the developer stressed that it didn’t want to be known as an MMO company and one may not be in its future. Cancelling the project this late in the game may have cost Blizzard several tens of millions of dollars, analysts told GamesIndustry.biz.
“Development costs for Titan may have amounted to tens of millions, perhaps $50 million or more. This is not an unusual event, however. Blizzard has cancelled several games in various stages of development in the past. Costs for unreleased games can be significant, but launching substandard games can harm the reputation of a successful publisher such as Blizzard. Expenses for development can be considered R&D, and benefits can include invaluable training, IP and technology that can be applied to other games,” explained independent analyst Billy Pidgeon.
Wedbush Securities’ Michael Pachter estimated an even higher amount lost: “My guess is 100 – 200 people at $100,000 per year, so $70 – 140 million sunk cost. It’s pretty sad that it took so long to figure out how bad the game was. I expect them to go back to the drawing board.”
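Pachter’s figure is a straightforward back-of-the-envelope calculation: headcount times an assumed fully loaded cost per person, multiplied across Titan’s roughly seven years of development. A minimal sketch, using his estimates (not confirmed Blizzard figures):

```python
# Rough reconstruction of Pachter's sunk-cost estimate for Titan.
# Headcount range and per-person cost are his guesses, not confirmed figures.
YEARS_IN_DEVELOPMENT = 7          # Titan ran from roughly 2007 to 2014
COST_PER_PERSON_PER_YEAR = 100_000

def sunk_cost(headcount: int) -> int:
    """Estimated total spend for a team of this size over the project's life."""
    return headcount * COST_PER_PERSON_PER_YEAR * YEARS_IN_DEVELOPMENT

low, high = sunk_cost(100), sunk_cost(200)
print(f"${low // 1_000_000}M to ${high // 1_000_000}M")  # $70M to $140M
```

The low and high ends of his 100- to 200-person guess bracket exactly the $70–140 million range he quotes.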
Indeed, the market has changed considerably in the last seven years, and while MMOs like EA’s Star Wars: The Old Republic struggle to find a large audience, free-to-play games and tablet games like Blizzard’s own Hearthstone are finding success. Blizzard has no doubt been keenly aware of the market realities too.
“As far back as 2013, they had already stated Titan was not likely to be a subscription-based MMORPG. This is consistent with a market that is increasingly dominated by multiplayer games that are either free to play or are an expected feature included with triple-A games such as Call of Duty. Titanfall and Destiny sold as standalone games supplemented by paid downloadable add-ons. Blizzard maintains very high standards of quality, so expectations will be steep for new franchises as well as for sequels,” Pidgeon continued.
DFC Intelligence’s David Cole agreed, noting that after seven years of development in an industry where trends and technologies change at a rapid pace, Blizzard simply had to pull the plug on Titan.
“They realized that unless a big MMO is out-of-this-world unbelievable it won’t work in today’s market where it competes against a bunch of low cost options. If they felt that it just wasn’t getting to that point it makes sense to cut your losses,” he noted. “Also, you see games like League of Legends and their own Hearthstone which are doing very well on a much lower budget.”
“For Blizzard, I am expecting to see them continue to focus on high quality products but also focus on products with shorter development cycles and less cost. The market is just not in a place where you can have games with 7+ year development. It is changing too fast.”
For most developers, junking a seven-year long project would instantly spell turmoil, but thankfully for Blizzard, it’s part of the Activision Blizzard behemoth, which has a market cap of over $15 billion and, as of June 30, cash and cash equivalents of over $4 billion on hand. It’s a nice luxury to have.
At the first ever Oculus Connect conference, with the beats and chatter of a cocktail reception just next door, Max Cohen is being brutally honest about the company’s mobile-based virtual reality headset.
“I can spend ten minutes talking about the problems with this device. We’re not afraid of them,” the VP of mobile says with a smile.
“It overheats if you run it too long. It is 60 Hertz low persistence, which means some people will notice flicker. The graphical quality is obviously a lot less than the PC. Battery life is a concern. There’s no positional tracking.
“We could try to say this is the be-all end-all of VR. We’d be lying. That’s a bad thing. We would hurt where we can get to the be-all end-all of VR. Everyone, Samsung, Facebook, Oculus, we’re all aligned with making a damn good product that we put out in the market and then working on improving it. Really soon, maybe even sooner than you think, we’ll get to that amazing VR experience for everyone.”
“Samsung, Facebook, Oculus, we’re all aligned with making a damn good product”
Cohen’s talking about the Gear VR, the Samsung-backed headset that offers a more portable and accessible entry into the virtual reality world for developers and users alike. It’s John Carmack’s passion project at the company, and clearly it’s Cohen’s too.
“The first thing they did was to put me in the HD prototype with the Tuscany demo. I was floored, of course,” he remembers.
“Then I got to see the Valve room and then he showed me this mobile project. It was running on a Galaxy S4 at the time. It crashed a little bit. There were a lot of problems with it, but I just thought this was so amazing. I went back and was talking to a friend of mine who’s an entrepreneur. He said it’s rare that you have the opportunity to work on transformational hardware, and that’s really what this was.”
The story of the Gear VR is a simple one; Oculus went to the Korean company hoping to work with them on screens for the PC-based Rift and found Samsung had been working on a headset you could simply slide a Samsung Galaxy phone into to experience virtual reality. Now the companies are working together on both devices, with Samsung fielding calls from Carmack on a regular basis.
“It’s a collaboration. It’s not we tell them what to do or they tell us what to do,” Cohen continues. “We’re the software platform, so when you put that on, you’re in Oculus, but that wouldn’t be possible without maximizing the hardware. Carmack and our team works very closely with their engineering team. They make suggestions about UI as well. We’re working together to make the best possible experience. If it wasn’t collaborative, this thing just honestly wouldn’t function because this is really hard to do.”
The focus of Oculus Connect isn’t the media or sales or even recruitment, but developers. Supporting them, showing them the technology, offering them advice on the new territory that is virtual reality. Cohen, like everyone else I speak to at the weekend, believes developers and their content is absolutely key to the success of the hardware.
“At the end of the day, we want to make the developers’ lives as easy as possible so they can make cool content.”
“Facebook invested in the platform. They didn’t buy it. What they did is they’re also committing money to make sure it’s successful on an ongoing basis”
That content will be supported by an app store, and Cohen wants it to be a place where developers can make a living, rather than just a showcase of free demos. Jason Holtman, former director of business development at Valve, is overseeing its creation.
“We’re going to launch initially with a free store, but maybe a month later, follow along with commerce,” says Cohen.
“At the end of the day, as great as doing the art for free and sharing that is, we will have a hundred times more content when people can actually monetize it. This is a business. There’s nothing wrong with that. People need to be able to feed themselves. Our job is to make the platform as friendly for developers as we can so that it’s painless. You don’t have to worry about a bunch of overhead.”
There’s a sense that the Facebook money, that headline-grabbing $2 billion, has given the team the luxury of time and the chance to recruit the people they need to make sure this time virtual reality lives up to its promises. Other than that, Facebook seems to be letting Oculus just get on with it.
“That’s the thing… a lot of people, with the Facebook acquisition, asked how that would impact us and the answer is it hasn’t, in terms of our culture, and Facebook’s actually supportive of the way Oculus is because we know that content makes or breaks a platform,” says Cohen.
“They invested in the platform. They didn’t buy it. What they did is they’re also committing money to make sure it’s successful on an ongoing basis. We could have continued to raise a lot of venture capital. It would have been very expensive to do it right. Now we have replaced our board of directors with Facebook, but that’s completely fine. They are helping us. They are accelerating our efforts.”
No one at Oculus is talking about release dates for consumer units yet, and Cohen is no different. It’s clear that he and the team are hungry for progress as he talks about skipping minor updates and making major advances. He talks about “awesome” ideas that he’s desperate to get to, and pushing the envelope, but what matters most is getting it right.
“I think everyone understands that with a little bit more magic, VR can be ubiquitous. Everyone needs it. I think a lot of people understand what we need to do to get there, but it takes hard work to actually solve those things. Oculus and Facebook have lined up the right team to do it, but I want us to actually have time to do that,” says Cohen.
“We’re not trying to sell millions now. We’re trying to get people and early adopters, tech enthusiasts and all that interested in it.”
Facebook’s product is a redesigned version of Atlas Advertiser Suite, an ad management and measurement platform that the company bought from Microsoft Corp last year.
It is expected to help marketers target Facebook users more effectively by measuring which users have seen, interacted with or acted upon ads that appear on Facebook’s services and on third-party websites and apps.
The product will also provide a tool for marketers to buy ads to target Facebook users across the Web.
Microsoft took on Atlas with its $6.3 billion acquisition of digital ad agency aQuantive in 2007. Unable to make it work for its own purposes, Microsoft wrote off $6.2 billion of the aQuantive deal’s value in 2012.
The world’s No.1 Internet social network, which lags behind market leader Google Inc in the U.S. market for online display ads, did not reveal how much it paid for the technology.
Facebook counts 1.5 million advertising customers and the company’s ad business saw strong growth across all of its geographic regions, Chief Operating Officer Sheryl Sandberg told Reuters in July.
Mobile advertising revenue grew 151 percent year-over-year, accounting for roughly 62 percent of Facebook’s overall ad revenue in the second quarter.
For much of the year we were under the impression that second-generation Maxwell would end up as a 20nm chip.
First-generation Maxwell ended up branded as the GeForce GTX 750 and GTX 750 Ti, and second-generation Maxwell launched a few days ago as the GeForce GTX 980 and GTX 970, with both cards based on the 28nm GM204 GPU.
This is actually quite good news, as it turns out that Nvidia managed to optimize the chip’s power and performance, making it one of the most efficient chips manufactured on 28nm.
Nvidia 20nm chips coming in 2015
Still, people keep asking about the transition to 20nm, and it turns out that Nvidia’s first 20nm chip will be a mobile SoC.
That first 20nm part is most likely Erista, the successor to Logan (Tegra K1).
Our sources didn’t mention the exact codename, but it turns out that Nvidia wants to launch a mobile chip first and then expand 20nm to its graphics chips.
Unfortunately we don’t have any specifics to report.
AMD 20nm SoC in 2015
AMD is doing the same: its first 20nm chip, codenamed Nolan, is an entry-level APU targeting the tablet and detachable markets.
There is a strong possibility that Apple and Qualcomm simply bought a lot of 20nm capacity for their mobile modem chips and what was left was simply too expensive to make economic sense for big GPUs.
20nm will drive voltage down while allowing higher clocks and more transistors per square millimeter, enabling better chips overall.
Just remember that Nvidia’s Tegra 3, the world’s first quad-core mobile SoC, ran rather hot at 40nm, while making a quad-core at 28nm enabled higher performance and significantly better battery life. The same was true of other mobile chips of the era.
We expect a similar leap from going down to 20nm in 2015, and Erista might be the first chip to make it there. A Maxwell-derived architecture at 20nm should deliver even more efficiency. Needless to say, AMD plans to launch 20nm GPUs next year as well.
It looks like Nvidia’s 16nm FinFET Parker processor, based on the Denver CPU architecture and Maxwell graphics, won’t appear before 2016.
Red Hat has acquired FeedHenry, a maker of mobile application platforms for the enterprise market.
The company sees the acquisition as a key driver to offer cross-platform support for its existing software products, including Red Hat Enterprise Linux 7, which it released earlier this year.
FeedHenry uses a Node.js architecture to create mobile apps spanning both client and server, running natively across Android, iOS, Windows Phone and BlackBerry, as well as offering web apps in HTML5. It combines a wide range of toolkits and APIs, offering integration with existing systems and the most popular software applications from enterprise vendors like Salesforce, SAP and Oracle.
The purchase price is said to be approximately $82 million in cash, and the deal is expected to close in the third quarter of fiscal year 2015.
Craig Muzilla, SVP of the Application Platform Business group at Red Hat said, “The mobile application platform is one of the fastest growing segments of the enterprise software market. As mobile devices have penetrated into every aspect of enterprise computing, enterprise software customers are looking for easier and more efficient ways for their developers to build mobile applications that extend and enhance traditional enterprise applications.”
“FeedHenry will help us enable customers to take advantage of the capabilities of mobile with the security, scalability, and reliability of Red Hat enterprise software.”
Red Hat said that it will continue to sell and support FeedHenry’s products and work with its existing customer base. FeedHenry products will continue to offer a wide variety of cloud deployments, but under Red Hat’s ownership they are likely to see particular emphasis on OpenShift and OpenStack. Separately, at the end of last month, Red Hat’s long-serving CTO Brian Stevens left the firm, according to a brief press announcement.
IBM has launched a beta of Watson Analytics, an interactive Q&A service designed to answer questions and highlight trends within sets of enterprise data.
The service “is about putting powerful analytics in the hands of every business user,” said Eric Sall, IBM vice president of marketing for business analytics.
Traditional business intelligence tools remain too difficult to use for business managers, Sall said. “It is hard to get the data. It is hard to analyze the data if you’re not a specialist, and it is hard to use the tools,” he said. Watson Analytics attempts to streamline the process.
Natural language systems are becoming increasingly prevalent as a form of human-computer interface. Apple’s Siri, Google’s Google Now and Microsoft’s Cortana all act as virtualized personal assistants, able to answer a range of simple questions on behalf of their users.
Watson Analytics operates in a similar manner, in that it can offer responses to questions posed by the user in their chosen language, rather than forcing the user to develop a SQL query, master a complex statistical package or write data-parsing code to better understand some large set of data.
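To make the gap Sall describes concrete, a business question like “which region had the highest revenue last quarter?” traditionally forces a non-specialist to write SQL by hand. The following is a generic sketch against an invented schema, not Watson Analytics’ actual interface or data model:

```python
# A generic sketch of the kind of query a business user would otherwise
# have to write by hand; the table, columns and figures are invented
# for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, quarter TEXT, revenue REAL);
    INSERT INTO sales VALUES
        ('EMEA',     '2014Q2', 120.0),
        ('Americas', '2014Q2', 210.0),
        ('APAC',     '2014Q2',  95.0);
""")

# The natural-language question "which region had the highest revenue
# last quarter?" becomes an aggregate query with filtering, grouping
# and ordering:
top_region = conn.execute("""
    SELECT region, SUM(revenue) AS total
    FROM sales
    WHERE quarter = '2014Q2'
    GROUP BY region
    ORDER BY total DESC
    LIMIT 1
""").fetchone()

print(top_region)  # ('Americas', 210.0)
```

A natural-language front end’s job is to translate the plain-English question into something like this query automatically, so the user never sees the SQL.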
The effort is the latest move in IBM’s $1 billion initiative to commercialize Watson technologies.
IBM Research developed Watson to compete with human contestants on the “Jeopardy” game show in 2011, using natural language processing and analytics, as well as many sources of structured and unstructured data, to formulate responses to the show’s questions.
In the years since, the company has been working to commercialize the Watson technology by identifying industries that could benefit from this form of cognitive computing, such as health care, law enforcement and finance.
Earlier this year, IBM launched the Watson Discovery Advisor, which is customized for scientific researchers who need to deeply probe one specific body of scientific knowledge, such as chemistry or cellular biology.
Another service, the company’s Watson Engagement Advisor, uses the artificial intelligence technology to aid in customer support.
EA is considering developing games for wearables. The company already has two teams on the job, looking for ways to make wearable games. Their efforts are focused on the Apple Watch for now.
EA told CNET that the company has quite a relationship with Apple and Frank Gibeau, head of EA’s mobile gaming arm, said he is impressed with the new Apple A8 SoC. Gibeau added that Apple’s decision to include 128GB storage in flagship models is more good news for gamers, as it raises the bar for developers and gives them more room to play around with.
Gibeau said EA’s mobile division is “intrigued” by the prospect of gaming on wearables. He said wearables are eventually going to offer more performance and capability, thus enabling new gaming experiences. However, he cautioned that “it’s very early days” for wearable gaming.
“In fact, we have two teams prototyping wearable experiences that are not only standalone, but also some ideas where you can actually use the fitness component in the watch that can unlock capabilities in the game that might be on your iPhone. Or you could do crafting or some other auction trading on your watch that goes back into your tablet game that you might check out later when you get home,” he told CNET.
Intel has announced that it is sampling its Xeon D 14nm processor family, a system on chip (SoC) optimized to deliver Intel Xeon processor performance for hyperscale workloads.
Announcing the news on stage during a keynote at IDF in San Francisco, Intel SVP and GM of the Data Center Group, Diane Bryant, said that the Intel Xeon processor D, which was initially announced in June, will be based on 14nm process technology and be aimed at mid-range communications.
“We’re pleased to announce that we’re sampling the third generation of the high density [data center system on a chip] product line, but this one is actually based on the Xeon processor, called Xeon D,” Bryant announced. “It’s 14nm and the power levels go down to as low as 15 Watts, so very high density and high performance.”
Intel believes that its Xeon D will serve the needs of high density, optimized servers as that market develops, and for networking it will serve mid-range routers as well as other network appliances, while it will also serve entry and mid-range storage. So, Intel claimed, you will get all of the benefits of Xeon-class reliability and performance, but you will also get a very small footprint and high integration of SoC capability.
This first generation Xeon D chip will also showcase high levels of I/O integrations, including 10Gb Ethernet, and will scale Intel Xeon processor performance, features and reliability to lower power design points, according to Intel.
The Intel Xeon processor D product family will also include data centre processor features such as error correcting code (ECC).
“With high levels of I/O integration and energy efficiency, we expect the Intel Xeon processor D product family to deliver very competitive TCO to our customers,” Bryant said. “The Intel Xeon processor D product family will also be targeted toward hyperscale storage for cloud and mid-range communications market.”
Bryant said that the product is not yet available, but it is being sampled, and the firm will release more details later this year.
This announcement comes just days after Intel launched its Xeon E5 v3 processor family for servers and workstations.
Finally, Ubisoft has a release date for the Wii U version of Watch Dogs. While it’s unclear how many people are still waiting for the Wii U version, when it does arrive it could well end up being one of the last M-rated titles for the console.
The release date for the Wii U version of Watch Dogs appears to be November 18th in North America and November 21st in Europe. This ends the original release delay that Ubisoft announced for the Wii U version as resources were moved to prepare the other versions of the game for release.
Ubisoft has been one of the strongest software supporters of the Wii U, but recently it announced that it was done producing titles like Assassin’s Creed and Watch Dogs for the console because sales of these M-rated titles are just not there on the Wii U platform. It did indicate that it would focus on some of its other Wii U titles that continue to be popular on the console.
It is good news that Wii U owners are finally getting Watch Dogs, but it looks like we will not see many more games like it on the platform.
You can’t accuse eSports League CEO Ralf Reichert of always telling people what they want to hear. At last month’s FanExpo Canada in Toronto, Ontario, just a few blocks away from the Hockey Hall of Fame, Reichert told GamesIndustry.biz that he saw competitive gaming overtaking the local pastime.
“Our honest belief is it’s going to be a top 5 sport in the world,” Reichert said. “If you compare it to the NHL, to ice hockey, that’s not a first row sport, but a very good second-row sport. [eSports] should be ahead of that… It’s already huge, it’s already comparable to these traditional sports. Not the Super Bowl, but the NHL [Stanley Cup Finals].”
Each game of this year’s Stanley Cup Finals averaged 5 million viewers on NBC and the NBC Sports Network. The finals of the ESL Intel Extreme Masters’ eighth season, held in March in Katowice, Poland, drew 1 million peak concurrent viewers, and 10 million unique viewers over the course of the weekend. That’s comparing the US audience for hockey to a global audience for the IEM series, but Reichert said the events are getting larger all the time.
As for how eSports have grown in recent years, the executive characterized it as a mostly organic process, and one that sometimes happens in spite of the major players. One mistake he’s seen eSports promoters make time and again is trying to be too far ahead of the curve.
“There have been numerous attempts to do celebrity leagues as a way to grow eSports, to make it more accessible,” Reichert said. “And rather than focusing on the core of eSports, the Starcrafts and League of Legends of the world, people tried to use easy games, put celebrities on it, and make a classic TV format out of it.”
One such effort, DirecTV’s Championship Gaming Series, held an “inaugural draft” at the Playboy Mansion in Beverly Hills and featured traditional eSports staples like Counter-Strike: Source alongside arguably more accessible fare like Dead or Alive 4, FIFA 07, and Project Gotham Racing 3.
“They put in tens of millions of dollars in trying to build up a simplified eSports league, and it was just doomed because they tried to simplify it rather than embrace the beauty of the apparent complexity.”
Complexity is what gives established sports their longevity, Reichert said. And while he dismisses the idea that eSports are any more complex than American football or baseball, he also acknowledged there is a learning curve involved, and it’s steep enough that ESL isn’t worrying about bringing new people on board.
“It’s tough for generations who didn’t grow up with gaming to get what Starcraft is,” Reichert said. “They need to spend 2-10 hours with it, in terms of watching it, getting it explained, and getting educated around it, or else they still might have that opinion. Our focus is more to have the generations who grew up with it as true fans, rather than trying to educate people who are outside of this conglomerate… There have been numerous attempts to make European soccer easier to approach, or American football, or baseball, but they all kill the soul of the actual sport. Every attempt to do that is just doomed.”
Authenticity is what keeps the core of the audience engaged, Reichert said. And even though there will always be purists who fuss over every change–Reichert said changing competitive maps in Starcraft could spark a debate like instant replay in baseball–being true to the core of the original sport has been key for snowboarding, mixed martial arts, and every other successful upstart sport of the last 15 years.
“Like with every new sport, the biggest obstacle has been people not believing in it,” Reichert said. “And it goes across media, sponsorships, game developers, press, everyone. The acceptance of eSports was a hard fought battle over a long, long time, and there’s a tipping point where it goes beyond people looking at it like ‘what the hell is this?’ And to reach that point was the big battle for eSports… The thing is, once we started to fill these stadiums, everyone looking at the space instantly gets it. Games, stadiums, this is a sport. It’s such a simple messaging that no one denies it anymore who knows about the facts.”
That’s not to say everybody is convinced. ESPN president John Skipper recently dismissed eSports as “not a sport,” even though his network streamed coverage of Valve’s signature Dota 2 tournament earlier this year. Reichert admitted that mainstream institutions seem to be lagging behind when it comes to acceptance, particularly with sponsors. While companies within the game industry are sold on eSports, non-endemic advertisers are only beginning to get it.
“The very, let’s say progressive ones, like Red Bull, are already involved,” Reichert said. “But to get it into the T-Mobiles and other companies as a strategy piece, that will still take some time. The market in terms of the size and quality of events is still ahead of the sponsorship, but that’s very typical.”
Toronto was the second stop for ESL’s IEM Season 9 after launching in Shenzhen July 16. The league is placing an international emphasis on this year’s competition, with additional stops planned in the US, Europe, and Southeast Asia.
In July, Gamasutra’s annual developer salary survey reported that the best compensated job for hands-on game creators wasn’t programmer or producer, but audio professional. That didn’t sound right to the organizers of audio conference GameSoundCon, so they conducted their own survey aimed squarely at audio specialists in the gaming industry, the results of which they released today.
Gamasutra acknowledged its own numbers on audio professionals were likely skewed by a few factors: it had only 33 respondents; it counted only full-time professionals, even though audio work is frequently done on a freelance basis; and its survey base of Game Developers Conference attendees likely skewed toward more senior people, as studios might not invest in sending fresh recruits to the show. GameSoundCon’s survey drew 514 responses and, as might be expected, painted a less lucrative picture of the field.
“Most game audio jobs, whether they are composers or sound designers, are freelance,” said GameSoundCon executive director Brian Schmidt. “Game audio is increasingly an outsourced industry.”
According to the survey, the average salaried audio professional position in the game industry pays $70,532. However, only 37 percent of those who took the survey were salaried employees. About 12 percent of respondents said they were paid by the hour, day, or week.
For freelance work, the average project fee was $28,091. However, that number was skewed significantly by big-budget games, where per-project fees could come in greater than $250,000. For indie or casual games, the average project fee dropped to just $9,830. For projects where the audio contractor retained rights to their work, the average fee dipped still lower, to $4,481, with as many projects paying $1,500 or less as there were paying more.
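The gap between the overall average fee and the typical project illustrates how a handful of big-budget contracts pull the mean upward. A quick sketch with entirely hypothetical fee figures (not the survey’s data) shows why the median can sit far below the mean in a distribution like this:

```python
# Hypothetical project fees (illustrative only, not survey data):
# many small indie projects plus two big-budget contracts.
fees = [1500, 2000, 3000, 4000, 5000, 8000, 9000, 250000, 300000]

# Mean: pulled up sharply by the two large contracts.
mean_fee = sum(fees) / len(fees)

# Median: middle value of the sorted list (odd-length case),
# which still reflects the typical project.
sorted_fees = sorted(fees)
median_fee = sorted_fees[len(sorted_fees) // 2]

print(f"mean:   ${mean_fee:,.0f}")
print(f"median: ${median_fee:,.0f}")
```

Here the two outliers drag the mean above $60,000 while the median project still pays $5,000, which is the same pattern the survey describes when per-project averages are broken out by budget tier.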
“There does seem to be a good ‘career path’ in game audio,” Schmidt added. “You can start out as a composer for indie games, and end up with a 6-figure salary as an audio director. Being able to get technical definitely gives you a leg up; more than 60 percent of responders say they provided audio content as well as technical services for implementation of the audio.”
The survey also underscored some rarities in the field. Gender diversity is lacking among audio professionals, as 96 percent of respondents were male. Royalties are also rare, with only 2 percent of composers reporting per-unit payments for big-budget titles. Royalties were somewhat more common among indie and casual projects, with 17 percent reporting per-unit payments.
Soundtrack sales also didn’t do much to pad composers’ pockets, as only 5 percent of large-budget games included a clause paying out for soundtrack sales. However, that figure rose to 18 percent for indie or casual titles.
As Microsoft tries to revive MSN, the focus this time is on top content from the Web rather than original content. For the relaunch, the company has signed up more than 1,300 publishers worldwide, including The New York Times, The Wall Street Journal, Yomiuri, CNN and The Guardian.
A “Services Stripe” at the top of the MSN homepage gives users easy access to personal services including Outlook.com email, OneDrive, Office 365 and Skype, as well as popular third-party sites like Twitter and Facebook, according to an online preview launched by Microsoft on Sunday.
The new MSN also provides “actionable information” and content and personal productivity tools such as shopping lists, a savings calculator, a symptom checker, and a 3D body explorer. Readers will have access to content from 11 sections including sports, news, health and fitness, money, travel and video, wrote Frank Holland, corporate vice president of Microsoft Advertising, in a blog post.
The company said it has rebuilt MSN from the ground up for a mobile-first, cloud-first world. The new MSN helps people complete their key digital tasks across all of their devices, wrote Brian MacDonald, Microsoft’s corporate vice president for information and content experiences, in a blog post.
“Information and personalized settings are roamed through the cloud to keep users in the know wherever they are,” MacDonald added. Users worldwide can try out the new MSN preview.
In the coming months, Microsoft plans to release MSN apps across iOS and Android to complement its corresponding Windows and Windows Phone apps. “You only need to set your favorites once, and your preferences will be connected across MSN, Cortana, Bing and other Microsoft experiences,” MacDonald wrote.
Microsoft claims an audience of more than 437 million people across 50 countries for MSN.
MSN.com ranks number 26 among the top sites in the U.S., behind Microsoft’s own Bing site, Google’s search site, YouTube, Facebook and Yahoo’s portal, according to traffic estimates by Alexa.