Microsoft Unveils Hologram Visor

January 23, 2015 by mphillips  
Filed under Consumer Electronics

Microsoft Corp surprised the tech world with the unveiling of a prototype hologram visor that can bring the Minecraft video game, Skype calls and even the landscape of Mars to three-dimensional life.

The veteran tech pioneer, which long ago lost the mantle of the world’s most inventive company, is making a bold play to regain that title in the face of stiff competition from Google Inc and Apple Inc.

Virtual or enhanced reality is the next frontier in computing interaction, with Facebook Inc focusing on its Oculus virtual reality headset and Google working on its Glass project.

Microsoft said its wire-free Microsoft HoloLens device will be available around the same time as Windows 10 this autumn. Industry analysts were broadly excited at the prospect, but skeptical that the company could produce a working model at a mass-market price that soon.

“That was kind of a ‘Oh wow!’ moment,” said Mike Silver, an analyst at Gartner who tried out the prototype on Wednesday. “You would expect to see a relatively high-priced model this year or next year, then maybe it’ll take another couple of years to bring it down to a more affordable level.”

Microsoft does not have a stellar record of bringing ground-breaking technology to life. Its Kinect motion-sensing game device caused an initial stir but never gripped the popular imagination.

The company showed off a crude test version of the visor – essentially jerry-rigged wires and cameras pulled over the head – to reporters and industry analysts at a gathering at its headquarters near Seattle.

It did not allow any photographs or video of the experience, but put some images on its website.

Do Game Developers Have Unrealistic Expectations?

January 22, 2015 by Michael  
Filed under Gaming

Over the last few years, the industry has seen budget polarization on an enormous scale. The cost of AAA development has ballooned, and continues to do so, pricing out all but the biggest war chests, while the indie and mobile explosions are rapidly approaching the point of inevitable over-saturation and consequential contraction. Stories about the plight of mid-tier studios are ten-a-penny, with the gravestones of some notable players lining the way.

For a company like Ninja Theory, in many ways the archetypal mid-tier developer, survival has been a paramount concern. Pumping out great games (Ninja Theory has a collective Metacritic average of 75) isn’t always enough. Revitalizing a popular IP like DMC isn’t always enough. Working on lucrative and successful external IP like Disney Infinity isn’t always enough. When the fence between indie and blockbuster gets thinner and thinner, it becomes ever harder to balance upon.

Last year, Ninja Theory took one more shot at the upper echelons. For months the studio had worked on a big budget concept which would sit comfortably alongside the top-level, cross-platform releases of the age: a massive, multiplayer sci-fi title that would take thousands of combined, collaborative hours to exhaust. Procedurally generated missions and an extensive DLC structure would ensure longevity and engagement. Concept art and pre-vis trailers in place, the team went looking for funding. Razor was on its way.

Except the game never quite made it. Funding failed to materialize, and no publisher would take the project on. It didn’t help that the search for a publishing deal arrived almost simultaneously with the public announcement of Destiny. Facing an impossible task, the team abandoned the project and moved on with other ideas. Razor joined a surprisingly large pile of games that never make it past the concept stage.

Sadly, it’s not a new story. In fact, at the time, it wasn’t even a news story. But this time Ninja Theory’s reaction was different. This was a learning experience, and learning experiences should be shared. Team lead and co-founder Tameem Antoniades turned the disappointment not just into a lesson, but a new company ethos: involve your audience at an early stage, retain control, fund yourself, aim high, and don’t compromise. The concept of the Independent AAA Proposition, enshrined in a GDC presentation given by Antoniades, was born.

Now the team has a new flagship prospect, cemented in this fresh foundation. In keeping with the theme of open development and transparency, Hellblade is being created with the doors to its development held wide open, with community and industry alike invited to bear witness to the minutiae of the process. Hellblade will be a cross-platform game with all of the ambition for which Ninja Theory is known, and yet it is coming from an entirely independent standpoint. Self-published and self-governed, Hellblade is the blueprint for Ninja Theory’s future.

“We found ourselves as being one of those studios that’s in the ‘squeezed middle’,” project lead Dominic Matthews says. “We’re about 100 people, so we kind of fall into that space where we could try to really diversify and work on loads of smaller projects, but indie studios really have an advantage over us, because they can do things with far lower overheads. We have been faced with this choice of, do we go really, really big with our games and become the studio that is 300 people or even higher than that, and try to tick all of these boxes that the blockbuster AAA games need now.

“We don’t really want to do that. We tried to do that. When we pitched Razor, which we pitched to big studios, that ultimately didn’t go anywhere. That was going to be a huge game; a huge game with a service that would go on for years and would be a huge, multiplayer experience. Although I’m sure it would have been really cool to make that, it kind of showed to us that we’re not right to try to make those kinds of games. Games like Enslaved – trying to get a game like that signed now would be impossible. The way that it was signed, there would be too much pressure for it to be…to have the whole feature set that justifies a $60 price-tag.

“That $60 price-tag means games have to add multiplayer, and 40 hours of gameplay minimum, and a set of characters that appeal to as many people as they possibly can. There’s nothing wrong with games that do that. There’s some fantastic games that do, AAA games. Though we do think that there’s another space that sits in-between. I think a lot of indie games are super, super creative, but they can be heavily stylised. They work within the context of the resources that people have.

“We want to create a game that’s like Enslaved, or like DMC, or like Heavenly Sword. That kind of third-person, really high quality action game, but make it work in an independent model.”

Cutting out the middle-man is a key part of the strategy. But if dealing with the multinational machinery of ‘big pubs’ is what drove Ninja Theory to make such widespread changes, there must surely have been some particularly heinous deals that pushed it over the edge?

“I think it’s just a reality of the way that those publisher/developer deals work,” Matthews says. “In order for a publisher to take a gamble on your game and on your idea, you have to give up a lot. That includes the IP rights. It’s just the realities of how things work in that space. For us, I think any developer would say the same thing, being able to retain your IP is a really important thing. So far, we haven’t been able to do that.

“With Hellblade, it’s really nice that we can be comfortable in the fact that we’re not trying to appeal to everyone. We’re not trying to hit unrealistic forecasts. Ultimately, I think a lot of games have unrealistic forecasts. Everyone knows that they’re unrealistic, but they have to have these unrealistic forecasts to justify the investment that’s going into development.

“Ultimately, a lot of games, on paper, fail because they don’t hit those forecasts. Then the studios and the people that made those games, they don’t get the chance to make any more. It’s an incredibly tough market. Yes, we’ve enjoyed working with our publishers, but that’s not to say that the agreements that developed are all ideal, because they’re not. The catalyst to us now being able to do this is really digital distribution. We can break away from that retail $60 model, where every single game has to be priced that way, regardless of what it is.”

Matthews believes that publishers, driven into funding only games that will comfortably shift five or six million units, have no choice but to stick to the safe bets, a path that eventually winnows diversity down to the point of stagnation, where only a few successful genres ever end up getting made: FPS, sports, RPG, maybe racing. Those genres become less and less distinct, while simultaneously shoe-horning in mechanics that prove popular elsewhere and shunning true innovation.

While perhaps briefly sustainable, Matthews sees that as a creative cul-de-sac. Customers, he feels, are too smart to put up with it.

“Consumers are going to get a bit wary of games that have hundreds of millions of dollars spent on them”

“I think consumers are going to get a bit wary. Get a bit wary of games that have hundreds of millions of dollars spent on them. I think gamers are going to start saying, ‘For what?’

“The pressures are for games to appeal to more and more people. It used to be if you sold a million units, then that was OK. Then it was three million units. Now it’s five million units. Five million units is crazy. We’ve never sold five million units.”

It’s not just consumers who are getting wise, though. Matthews acknowledges that the publishers also see the dead-end approaching.

“I think something has to be said for the platform holders now. Along with digital distribution, the fact that the platform holders are really opening their doors and encouraging self-publishing and helping independent developers to take on some of those publishing responsibilities has changed things for us. I think it will change things for a lot of other developers.

“Hellblade was announced at the Gamescom PlayStation 4 press conference. My perception of that press conference was that the real big hitters in that were all independent titles. It’s great that the platform holders have recognised that. There’s a real appetite from their players for innovative, creative games.

“It’s a great opportunity for us to try to do things differently. Like on Hellblade, we’re questioning everything that we do. Not just on development, but also how we do things from a business perspective as well. Normally you would say, ‘Well, you involve these types of agencies, get these people involved in this, and a website will take this long to create.’ The next thing that we’re doing is, we’re saying, ‘Well, is that true? Can we try and do these things a different way,’ because you can.

“There’s definitely pressure for us to fill all those gaps left by a publisher, but it’s a great challenge for us to step up to. Ultimately, we have to transition into a publisher. That’s going to happen at some point, if we want to publish our own games.”

Courtesy-GI.biz

Is Dailymotion Going After Twitch?

January 16, 2015 by Michael  
Filed under Gaming

Dailymotion has hopped onto the game streaming bandwagon and launched a game streaming service.

Dailymotion Games will put the firm into a market that so far includes Twitch, a streamer that has cemented its place as a gaming add-on and a coveted option on the Xbox One and PlayStation 4 consoles.

The Dailymotion information does not dwell on Twitch, which has been a feature of many a gushing press release from console makers such as Sony, but it does say that the live streaming gaming platform has some decent credentials.

For example, the promotional information says that the platform is backed with “industry leading video and live streaming technology”.

The firm also reminds us that it has some history here, and has been e-gaming for some time.

“Since 2011, with the first Dailymotion Cup on Starcraft, Dailymotion has accompanied e-sport growth on the internet,” said Martin Rogard, Dailymotion’s chief operating officer.

“Dailymotion Games is entirely dedicated to e-sports fans and streamers who come together every evening to form an amazingly talented and gregarious community.

“Over the coming months, we will significantly increase our investment in the e-sports domain to ensure worldwide recognition of all our talented content producers.”

Dailymotion said that live streamers will be able to monetise their content, which Microsoft recently confirmed is fine for its users, and that streamers could run their own “controlled video advertisement” and use any number of social tools, including search and real-time communications. Android and iOS apps are available.

The service is currently in beta. Dailymotion said that it serves some 180 million game videos a month.

 
Courtesy-TheInq

Did Sony Learn Anything From Its Recent Breach?

January 13, 2015 by Michael  
Filed under Computing

Sony Pictures CEO Michael Lynton has told the Associated Press that the firm’s computer systems are still down, but, and thank someone for this, film and television production has not missed a beat.

Yes, Sony, the firm that brought us a remake of Annie and a very divisive blunt-edged political comedy called The Interview in the past couple of months, is still firing on production cylinders, if not sending emails.

A long interview with Associated Press finds Lynton not mentioning the North Korea words, but admitting that the hackers that have it between their teeth are pretty good at what they do and that Sony is going through a real learning experience.

“We are the canary in the coal mine, that’s for sure,” said the CEO. “There is no playbook for this, so you are in essence trying to look at the situation as it unfolds and make decisions without being able to refer to a lot of experiences you’ve had in the past or other people’s experiences. You’re on completely new ground.”

Despite what you might have been led to believe, the assault on Sony has not been very costly, according to Lynton, who said that the firm has not had much more than a ripple to contend with.

“What I’m hearing so far is that they’re very manageable,” he added. “They’re not disruptive to the economic well being of the company.”

There has been some internal disruption, though, and Lynton said that staffers are being paid with paper checks.

He confirmed that Sony’s technology people did scuttle about looking for workarounds and started using old BlackBerry handsets as part of a belt and braces response.

In the case of the latter, at least one firm was pleased about this news.

Courtesy-TheInq

 

Sony Says Recent Cyber Attack Will Have Minimal Financial Impact

January 8, 2015 by mphillips  
Filed under Around The Net

Sony Corp Chief Executive Kazuo Hirai said he does not expect the November cyber attack on the company’s film studio to have a significant financial impact, two weeks after the studio finally released the movie that spurred the attack.

The studio, Sony Pictures Entertainment, said separately that the film, “The Interview,” has generated revenue of $36 million.

Hirai told reporters at the Consumer Electronics Show in Las Vegas that he had signed off on all major decisions by the company in response to the attack, which the U.S. government has blamed on North Korea.

Sony’s network was crippled by hackers as the company prepared to release “The Interview,” a comedy about a fictional plot to assassinate North Korean leader Kim Jong Un. The attack was followed by online leaks of unreleased movies and emails that caused embarrassment to executives.

“We are still reviewing the effects of the cyber attack,” Hirai told reporters. “However, I do not see it as something that will cause a material upheaval on Sony Pictures business operations, basically, in terms of results for the current fiscal year.”

Sony Pictures said “The Interview,” which cost $44 million to make, has brought in $31 million in online, cable and satellite sales and was downloaded 4.3 million times between Dec. 24 and Jan. 4.

It has earned another $5 million at 580 independent theaters showing the movie in North America.

It is still unclear if Sony Pictures will recoup the costs of the film, starring Seth Rogen and James Franco, including an estimated $30 million to $40 million marketing bill.
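
Taken together, those figures make the break-even question easy to sketch. Below is a minimal back-of-the-envelope calculation using only the numbers reported here; the marketing spend is treated as a range because it remains an estimate rather than a confirmed figure.

```python
# Back-of-the-envelope check on whether "The Interview" has recouped its costs,
# using only the figures reported above (the marketing bill is an estimate).

revenue_so_far = 36_000_000        # $31M online/cable/satellite + $5M theatrical
production_cost = 44_000_000       # reported production budget
marketing_range = (30_000_000, 40_000_000)  # estimated marketing spend

for marketing in marketing_range:
    total_cost = production_cost + marketing
    shortfall = total_cost - revenue_so_far
    print(f"Total cost ${total_cost/1e6:.0f}M -> still ${shortfall/1e6:.0f}M short of break-even")
```

On those assumptions the film would still need roughly $38M to $48M more to cover its combined production and marketing costs, which is why recouping remains in doubt.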

On Monday, Hirai praised employees and partners of the Hollywood movie studio for standing up to “extortionist efforts” of hackers, his first public comments on the attack launched on Nov 21.


Will nVidia License Its GPU Tech?

January 7, 2015 by Michael  
Filed under Computing

Back in June 2013 Nvidia said it would start licensing the Kepler GPU core, but since then we haven’t seen any developments on this front. However, things may be about to change with Maxwell.

Maxwell is a lot more efficient than Kepler and it may be more attractive to potential clients. Digitimes Research expects Nvidia to “rely on the licensing business” and allow its GPU patents to penetrate the mobile GPU market.

Analysts pointed out that Maxwell offers a 160% performance per area gain over Kepler, along with higher flexibility in the location of the GPU on the die. This added flexibility means chip designers should have a much easier time integrating it in various SoC designs.

Nvidia officially announced the Tegra X1 (Erista) just hours ago and this is the company’s first SoC with Maxwell graphics, built on the 20nm node. The company claims its 256-core GPU can outpace the competition with 1-teraflop performance, while at the same time providing competitive performance per watt.

Licensing Maxwell would make sense, but we still do not know how Nvidia plans to go about this. Whether or not it will offer the latest and most powerful designs remains to be seen. It should also be noted that GPU IP and licensing costs are comparatively low, making up only a tiny fraction of the overall cost of the chip.

A year ago industry sources told Fudzilla that the cost of deploying a mainstream GPU on a SoC tends to be very low, as little as 1-2%, or about 10 cents per mid-range chip. Mobile GPU licensing will clearly not be a big cash cow for Nvidia in the short term, but in the long haul the company could benefit indirectly by offering proprietary technologies and services tied to its GPU technology.
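
Taken at face value, those two figures also imply a rough cost for the mid-range chips in question. The short sketch below simply inverts the quoted percentages; the resulting chip prices are an inference for illustration, not numbers from the report.

```python
# Inverting the licensing figures quoted above: if GPU IP costs about 10 cents
# per mid-range chip and that represents 1-2% of the chip's overall cost, the
# implied chip cost is easy to derive (an inference only, not a sourced figure).

gpu_ip_cost = 0.10                  # ~10 cents of GPU IP per mid-range chip
share_low, share_high = 0.01, 0.02  # GPU IP as 1-2% of the overall chip cost

print(f"Implied mid-range SoC cost: ${gpu_ip_cost / share_high:.2f} - ${gpu_ip_cost / share_low:.2f}")
# -> roughly $5 - $10 per chip
```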

Courtesy-Fud

 

Were PS4 Sales Flat Over The Holiday?

January 7, 2015 by Michael  
Filed under Gaming

While the Sony PlayStation 4 has been selling very well, it seems that Christmas was not really its season.

Sony said that the PlayStation 4 has sold more than 18.5 million units since the new generation of consoles launched. While that is good and makes the PS4 the fastest selling PlayStation to date, there was no peaking at Christmas.

You would think that the PS4 would sell well at Christmas as parents were forced to do grievous bodily harm to their credit cards to shut their spoilt spawn up during the school holidays. But apparently not.

Apparently, the weapon of choice against precious snowflakes being bored was an Xbox One which saw a Christmas spike in sales.

Sony said that its new numbers are pretty much on target; it sold at the expected rate of 2 million units per month.

Redmond will be happy with that result even if it still has a long way to go before it matches the PlayStation 4 on sales.

Courtesy-Fud

Will The Xbox One Go Virtual Reality This Year?

January 6, 2015 by Michael  
Filed under Gaming

While we can’t get a real handle on when Microsoft might reveal the VR headset it has had in development, we have learned from our sources that the project is well under way and that some selected developers already have development prototypes.

It is hard to say when Microsoft might actually reveal the new VR headset and technology, but it would seem that GDC or E3 would be the likely events to see it introduced. We do know that Microsoft is targeting 2015 to move the VR headset into mass production, and it is thought that we will see versions for both the Xbox One and PC, though we expect the PC version to come a little after the Xbox One version.

Rumor has it that the same development team that worked on the Surface tablet has taken on this project as well.

Courtesy-Fud

Acer Offering New Larger Chromebooks

January 6, 2015 by mphillips  
Filed under Computing

Acer is beefing up the size of chromebooks, offering the world’s first Chrome OS laptop with a 15.6-in. screen.

The Chromebook 15 is also the first device in the category with a processor based on Intel’s latest Broadwell circuit design. Starting at $249.99, the laptop will offer eight hours of battery life and be among the fastest chromebooks available.

Most chromebooks today have 11.6- or 13.3-in. screens, and are considered adequate for those who do most of their computing online. The Chromebook 15 will provide more screen space to read documents or watch movies.

An Internet connection is needed to run a majority of chromebook applications, but some can be used offline. Intel’s Broadwell-based Celeron processor will crank up the performance of both offline and online applications.

The Chromebook 15 will also deliver better graphics and video, which could be beneficial when watching movies. But the graphics quality won’t match that of Acer’s Chromebook 13, which has an Nvidia Tegra K1 chip that is capable of processing 4K video.

Chromebooks today are equipped with either ARM or Intel processors, and the fastest models have Intel’s Core i3 processors based on the Haswell microarchitecture. Broadwell is the successor to the Haswell microarchitecture. Intel is launching new Broadwell chips at International CES in Las Vegas, starting Jan. 6.

At 4.8 pounds, Acer’s Chromebook 15 could be a lighter and cheaper alternative to Windows desktop replacement laptops. The Chrome OS has built-in security features and will automatically update itself on a regular basis.

The Chromebook 15 has an unconventional design, with speaker bars placed next to the keyboard. The laptop supports up to 4GB of DRAM and 32GB of solid-state drive storage, much like the smaller-screen chromebooks. Other features include a webcam, 802.11ac Wi-Fi, Bluetooth 4.0 and SD card reader. The laptop also has USB 3.0, USB 2.0 and HDMI ports.

Acer said the Chromebook will be available in different regions. It will ship in the U.S. next month. The company will show the new Chromebook at CES.


Sony Offering Discounts After PlayStation Outage

January 5, 2015 by mphillips  
Filed under Gaming

If you received a PlayStation 4 for Christmas but network outages hampered you from using it, Sony wants to make it up to you.

Sony Computer Entertainment America will offer 10% off PlayStation Store purchases including games, TV shows and movies as a gesture of thanks for users’ patience following an outage of several days caused by distributed denial-of-service (DDoS) attacks.

In addition, PlayStation Plus members who had an active membership or free trial on Dec. 25 will receive a membership extension of five days, Eric Lempel of Sony Network Entertainment wrote in a blog post.

Judging from the comments to the post, many PlayStation Network (PSN) users were happy about the offer, but not all of them.

“What I would like, more than anything else, is an explanation from Sony about how and why this will never happen again,” wrote one user. “Use the money to strengthen and diversify the network infrastructure so these types of attacks become harder to make and easier to recover from.”

In another blog post, Sony had attributed the outages to an attack creating “artificially high levels of traffic designed to disrupt connectivity and online gameplay.”

The DDoS attacks, which also took down Microsoft’s Xbox Live game network, were apparently launched by hacker group Lizard Squad, which later took aim at anonymous network Tor.


Hackers Continue Attack On Tor

December 29, 2014 by mphillips  
Filed under Around The Net

Hackers who apparently attacked Sony’s PlayStation Network (PSN) and Microsoft’s Xbox Live on Christmas Day have turned their attention towards anonymous network Tor.

Lizard Squad, which claimed responsibility for the outage, on Friday tweeted, “To clarify, we are no longer attacking PSN or Xbox. We are testing our new Tor 0day.”

While at least one site that maps the Tor network showed numerous routers with the name “LizardNSA,” the extent of any attack was unclear.

Tor directs user traffic through thousands of relays to ensure anonymity. In a Dec. 19 blog post, Tor managers warned of a possible attack, saying, “There may be an attempt to incapacitate our network in the next few days through the seizure of specialized servers in the network called directory authorities.”

Sony engineers, meanwhile, continued to struggle to get PSN back online Friday following the suspected distributed denial-of-service (DDoS) attacks on Thursday.

Sony’s Twitter account for PSN asked frustrated gamers to be patient as staff worked to get the service back up and running, saying it did not know when PSN would be back online.

“We are aware that some users are experiencing difficulty logging into the PSN,” Sony said on its PlayStation support page, where the network was listed as offline.

In a Twitter post showing a chat with the alleged hackers, MegaUpload founder Kim Dotcom suggested he had convinced Lizard Squad to stop the attacks in return for lifetime memberships on his file-transfer site Mega.

Lizard Squad had taken credit for an apparent attack against PSN earlier this month, as well as an attack in August. The August incident came at the same time that a U.S. flight carrying Sony Online Entertainment President John Smedley was diverted for security reasons.


Are Buggy Games Getting A Pass?

December 23, 2014 by Michael  
Filed under Gaming

Recently, my smartphone started acting up. I think the battery is on the way out; it does bizarre things, like shutting itself off entirely when I try to take a picture on 60 per cent battery, or suddenly dropping from fully charged to giving me “10 per cent remaining, plug me in or else” warnings for no reason at all. I can get it fixed free of charge, but it’s an incredibly frustrating, bothersome thing, especially given how much money I’ve paid for this phone. Most of us have probably had an experience like this with a piece of hardware; a shoddy washing machine that mangled your favorite shirt, a shiny new LCD screen with an intensely irritating dead pixel, an Xbox 360 whose Red Ring of Death demanded a lengthy trip back to the service center. There are few of us who can’t identify with the utter frustration of having a consumer product that you’ve paid good money for simply fail to do its job properly. Sure, it’s a #FirstWorldProblem for the most part (unless it’s something like a faulty airbag in your Honda, obviously), but it’s intensely annoying and certainly makes you less likely to buy anything from that manufacturer again.

Given that we could all probably agree that a piece of hardware being faulty is utterly unacceptable, I’m not sure why software seems to get a free pass sometimes. Sure, there are lots of consumers who complain bitterly about buggy games, but by and large games with awful quality control problems tend to get slapped with labels like “flawed but great”, or have their enormous faults explained in a review only to see the final score reflect none of those problems. It’s not just the media that does this (and for what it’s worth, I don’t think this is corruption so much as an ill-considered aspect of media culture itself); for every broken game, there are a host of consumers out there ready to defend it to the hilt, for whatever reason.

I raise this problem because, while buggy games have always been with us – often hilariously, especially back in the early days of the PlayStation – the past year or so has seen a spate of high-profile, problematic games being launched, suggesting that even some of the industry’s AAA titles are no longer free from truly enormous technical issues. The technical problems that have become increasingly prevalent in recent years are causing genuine damage to the industry; from the botched online launches of games like Driveclub and Battlefield through to the horrendous graphical problems that plague some players of Assassin’s Creed Unity, they are giving consumers terrible experiences of what should be high points for the medium, creating a loud and outspoken group of disgruntled players who act to discourage others, and helping to drive a huge wedge between media (who, understandably, want to talk about the experience and context of a game rather than its technical details) and consumers (who consider a failure to address glaring bugs to be a sign of collusion between media and publishers, and a failure on the part of the media to serve their audience).

We can all guess why this is happening. I don’t wish in any way to underplay how complex and difficult it is to develop bug-free software; I write software tools to assist in my research work, and given how often those simple tools, developed by two or three people at most, have me tearing my hair out at 3am as I search for the single misplaced character that’s causing the whole project to behave oddly, I am absolutely the last person in the world who is going to dismiss the difficulty involved in debugging something as enormous and complex as a modern videogame. Debugging games has inevitably become harder as team sizes and technical complexity have grown; that’s to be expected.

However, just because something is harder doesn’t mean it shouldn’t be happening, and that’s the second part of this problem. Games are developed to incredibly tight schedules, sometimes even tighter today (given the culture of annual updates to core franchises) than they were in the past. Enormous marketing budgets are preallocated and planned out to support a specific release date. The game can’t miss that date; if there are show-stopping bugs, the game will just have to ship with those in place, and with a bit of luck they’ll be able to fix them in time to issue a day-one digital patch (and if your console isn’t online, tough luck).

Yet this situation is artificial in itself. It’s entirely possible to structure your company’s various divisions around the notion that a game will launch when it’s actually ready, and ensure that you only turn out high-quality software; Nintendo, in particular, manages this admirably. Certainly, some people criticise the company for delaying software and it does open up gaps in the release schedule, but compared to the enormous opprobrium which would be heaped upon the company if it turned out a Mario Kart game where players kept falling through the track, or a Legend of Zelda where Link’s face kept disappearing, leaving only eyes and teeth floating ghoulishly in negative space (sleep well, kids!), an occasional delay is a corporate cultural decision that makes absolute sense – not only for Nintendo, but for game companies in general.

It doesn’t even have to go as far as delaying games on a regular basis. There is a strong sense that some of the worst offenders in terms of buggy games simply aren’t taking QA seriously, which is something that absolutely needs to be fixed – and if not, deserves significant punishment from consumers and critics alike. Quality control has a bit of an image problem; there’s a standard stereotype of a load of pizza-fuelled youngsters in their late teens testing games for a few years as they try to break into a “real” games industry job. The image doesn’t come from thin air; for some companies, this is absolutely a reality. It is, however, utterly false to think that every company sees its QA in those terms. For companies that take QA seriously, it’s a division that’s respected and well-treated, with its own career progression tracks, all founded on the basic understanding that a truly good QA engineer is worth his or her weight in gold.

Not prioritising your QA department – not ensuring that it’s a division that’s filling up with talented, devoted people who see QA as potentially being a real career and not just a stepping stone – is exactly the same thing as not prioritising your consumers. Not building time for proper QA into your schedules, or failing to enact processes which ensure that QA is being properly listened to and involved, is nothing short of a middle finger raised to your entire consumer base – and you only get to do that so many times before your consumers start giving the gesture right back to you and your precious franchises.

Media does absolutely have a role to play in this – one to which it has, by and large, not lived up. Games with serious QA problems do not deserve critical acclaim. I understand fully that reviewers want to engage with more interesting topics than technical issues, but I think it’s worth thinking about how film reviewers would treat a movie with unfinished special effects or audio mixed such that voices can’t be heard; or perhaps how music reviewers would treat an album with a nasty recording hiss in the background, or with certain tracks accidentally dropping out or skipping. Regardless of the good intentions of the creative people involved in these projects, the resulting product would be slammed, and rightly so. It’s perhaps the very knowledge of the drubbing that they would receive that means that such awful movies and albums almost never see the light of day (and when they do, they become legendary in their awfulness; consider the unfinished CGI Scorpion King at the end of “The Mummy Returns”, which remains a watchword for terrible special effects many years later).

Game companies, by contrast, seem to feel unpleasantly comfortable with releasing games that don’t work and aren’t properly tested. Certain technical aspects probably contribute to this; journalists may be wary of slamming a game for bugs that may be fixed in a day-one patch, for instance. Yet it seems that there’s little choice but to make the criteria stricter in this regard. If media and consumers alike do not take to punishing companies severely for failing to pay proper respect to QA procedures for their games, this problem will only worsen as firms realize that they can get away with launching unfinished software.

We all want a world where technical issues are nothing but a footnote in the discussion of games; that will be the ultimate triumph of game technology, when it truly becomes transparent. We do not, however, live in that time yet, and the regular launches of games that don’t live up to even the most basic standards of quality are something nobody should be asked to tolerate. The move by some websites to stop reviewing online games until the servers are live and populated with real players is a good start; but the overall tolerance for bugs and willingness to forgive publishers for such transgressions (“we know the last game was a buggy mess, but we’re still going to publish half a dozen puff pieces that will push our readers to pre-order the sequel!”) needs to be fixed. If we want to talk about the things that are important about games (and we do!), it’s essential that we fix the culture that ignores QA and technical issues first.

Courtesy-GI.biz

 

Are Indie Developers Dying Out?

December 22, 2014 by Michael  
Filed under Gaming

For independent developers, the last decade has been an endless procession of migratory possibilities. The physical world was defined by compromise, dependence and strategically closed doors, but the rise of digital afforded freedom and flexibility in every direction. New platforms, new business models, new methods of distribution and communication; so many fresh options appeared in such a brief window of time that knowing where and when to place your bet was almost as important as having the best product. For a few years, right around 2008, there was promise almost everywhere you looked.

That has changed. No matter how pregnant with potential they once seemed, virtually every marketplace has proved unable to support the spiralling number of new releases. If the digital world is one with infinite shelf-space for games, it has offered no easy solutions on how to make them visible. Facebook, Android, iOS, Xbox Live Arcade, the PlayStation Network; all have proved to be less democratic than they first appeared, their inevitable flaws exposed as the weight of choice became heavier and heavier. As Spil Games’ Eric Goossens explained to me at the very start of 2014: “It just doesn’t pay the bills any more.”

Of course, Goossens was talking specifically about indie development of casual games. And at that point, with 2013 only just receding from view, I would probably have named one exception to the trend, one place where the balance between volume and visibility gave indies the chance to do unique and personal work and still make a decent living. That place would have been Steam, and if I was correct in my assessment for even one second, it wasn’t too long before the harsher reality became clear.

After less than five months of 2014 had passed, Valve’s platform had already added more new games than in the whole of the previous year. Initiatives like Greenlight and Early Access were designed to make Steam a more open and accessible platform, but they were so effective that some of what made it such a positive force for indies was lost in the process. Steam’s culture of deep-discounting has become more pervasive and intense in the face of this chronic overcrowding, stirring up impassioned debate over what some believe will be profound long-term effects for the perceived value of PC games. Every discussion needs balance, but in this case the back-and-forth seemed purely academic: for a lot of developers steep discounts are simply a matter of survival, and precious few could even entertain the notion of focusing on the greater good instead.

And the indie pinch was felt beyond Steam’s deliberately weakened walls. Kickstarter may be a relatively new phenomenon – even for the hyper-evolving landscape of the games industry – but it faced similar problems in 2014, blighted by the twin spectres of too much content and not enough money to go around. Anecdotally, the notion that something had changed was lurking in the background at the very start of the year, with several notable figures struggling to find enough backers within the crowd. The latter months of 2014 threw up a few more examples, but they also brought something close to hard evidence that ‘peak Kickstarter’ may already be behind us – fewer successful projects, lower funding targets, and less money flowing through the system in general. None of which was helped by a handful of disappointing failures, each one a blow for the public’s already flagging interest in crowdfunding. Yet another promising road for indies had become more treacherous and uncertain.

So are indies heading towards a “mass extinction event”? Overcrowding is certainly a key aspect of the overall picture, but the act of making and releasing a game is only getting easier, and the allure of development as a career choice seems to grow with each passing month. It stands to reason that there will continue to be a huge number of games jostling for position on every single platform – more than even a growing market can sustain – but there’s only so much to be gained from griping about the few remaining gatekeepers. If the days when simply being on Steam or Kickstarter made a commercial difference are gone, and if existing discovery tools still lack the nuance to deal with all of that choice, then it just shifts the focus back to where it really belongs: talent, originality, and a product worth an investment of time and money.

At GDC Europe this summer, I was involved in a private meeting with a group of Dutch independent game developers, all sharing knowledge and perspective on how to find success. We finished that hour agreeing on much the same thing. There are few guarantees in this or any other business, but the conditions have also never been more appropriate for personality and individuality to be the smartest commercial strategy. The world has a preponderance of puzzle-platformers, but there’s only one Monument Valley. We’re drowning in games about combat, but This War of Mine took a small step to the left and was greeted with every kind of success. Hell, Lucas Pope made an entire game about working as a border control officer and walked away with not just a hit, but a mantelpiece teeming with the highest honours.

No matter how crowded the market has become, strong ideas executed with care are still able to rise above the clamour, no huge marketing spend required. As long as that’s still possible, indies have all of the control they need.

Courtesy-GI.biz

Microsoft Opens Up Halo

December 16, 2014 by Michael  
Filed under Gaming

Project Orleans, the cloud engine that powers Xbox hits Halo: Reach and Halo 4, is going open source.

The engine, which has also played a vital role in the development of Microsoft’s Azure cloud computing platform, will be released under an MIT licence next year by Microsoft Research after being trailed at this year’s Microsoft Build Conference.

This is the latest in a long line of open-source announcements by Microsoft this year, as the company tries to reinvent itself for an age in which its stranglehold on the market has loosened and a wide variety of non-proprietary alternatives exist.

At the same Build conference, the company also announced that it will open source the .NET framework, on which most Windows applications depend.

The project, as described by the team itself, is “an implementation of an improved actor model that borrows heavily from Erlang and distributed objects systems, adds static typing, message indirection and actor virtualisation, exposing them in an integrated programming model”.

The team added that, whereas Erlang is a pure functional language with its own custom virtual machine, the Orleans programming model “directly leverages .NET and its object-oriented capabilities”.
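
Orleans itself is a .NET framework and its grains are written in C#, but the virtual-actor idea described above (actors addressed by type and key, activated on demand, with messages routed indirectly through a runtime) can be illustrated in a few lines. The sketch below is a deliberately simplified, single-process illustration in Python; the names Runtime, Grain and tell are invented for this example and are not the Orleans API.

```python
# A toy sketch of the "virtual actor" idea: callers address an actor by type
# and key; the runtime activates it on first use and routes messages to it.
# Names here (Runtime, Grain, tell) are illustrative only, not the Orleans API.

class Grain:
    """Base class for actors identified by a key."""
    def __init__(self, key):
        self.key = key

class CounterGrain(Grain):
    """Example actor holding a per-key counter."""
    def __init__(self, key):
        super().__init__(key)
        self.count = 0

    def increment(self, amount=1):
        self.count += amount
        return self.count

class Runtime:
    """Activates grains on demand and dispatches messages by (type, key)."""
    def __init__(self):
        self._activations = {}

    def grain(self, grain_type, key):
        # Virtual actors always "exist": activate lazily on first reference.
        ident = (grain_type, key)
        if ident not in self._activations:
            self._activations[ident] = grain_type(key)
        return self._activations[ident]

    def tell(self, grain_type, key, method, *args, **kwargs):
        # Message indirection: callers never hold a direct object reference.
        target = self.grain(grain_type, key)
        return getattr(target, method)(*args, **kwargs)

runtime = Runtime()
runtime.tell(CounterGrain, "player-42", "increment", 5)
print(runtime.tell(CounterGrain, "player-42", "increment"))  # -> 6
```

The real runtime also handles distribution across machines, persistence and single-threaded execution per activation, none of which this toy version attempts.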

One example available to try is an analysis of Twitter sentiment, gauging reaction to a given hashtag based on the language around it and creating visual representations of the mood of the web.

The code will be available as an extension to Visual Studio 2012 or 2013, with samples and supporting documentation already available, including for the Azure implementations. Non-Azure users can grab a free trial version before they buy.

Courtesy-TheInq

Is The Semiconductor Industry Really Growing?

December 5, 2014 by Michael  
Filed under Computing

The World Semiconductor Trade Statistics (WSTS) organization released its autumn 2014 industry forecast on Tuesday, predicting that the semiconductor market will continue to grow next year.

The WSTS reported that the global semiconductor market will see nine percent growth year over year in 2014 to $333bn, driven mainly by double digit growth in memory shipments and supported by growth in all other product categories.

The trade group said that the highest rates of growth this year are in memory products (17.3 percent), discrete products (12.3 percent) and analogue devices (10.3 percent).

Semiconductor shipments grew in all geographical regions this year, according to the WSTS, driven largely by strong demand in the smartphone and automotive markets.

Assuming that the global economic recovery will continue into 2015 and beyond and the strong semiconductor markets will continue to mature, the WSTS forecasts continuing steady, although moderating, market growth in all product categories and regions next year.

The WSTS forecasts that the worldwide semiconductor market will increase 3.4 percent in 2015 to $345bn, and 3.1 percent in 2016 to $355bn.
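
The forecast percentages and dollar figures can be cross-checked against each other by compounding from the 2014 base; the short sketch below does exactly that, purely as a sanity check on the numbers quoted above.

```python
# Sanity-check the WSTS figures: compound the forecast growth rates from the
# 2014 base and compare with the quoted market sizes.

market_2014 = 333  # $bn, after ~9% growth in 2014
growth = {"2015": 0.034, "2016": 0.031}

size = market_2014
for year, rate in growth.items():
    size *= 1 + rate
    print(f"{year}: ~${size:.0f}bn")
# Prints roughly $344bn for 2015 and $355bn for 2016; WSTS quotes $345bn and
# $355bn, the small gap presumably coming from rounding of the 2014 base.
```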

The automotive and communications product categories will show stronger growth than the global market as a whole, while consumer and computer product shipments will remain almost flat in the forecast period.

Asia-Pacific, which already accounts for nearly 60 percent of the global market, will continue to show the fastest growth in 2016, reaching a value of $209bn, according to the predictions.

In June, Gartner predicted that the global semiconductor market would increase to $336bn in 2014, which it reckoned would be 6.7 percent growth for the year.

The WSTS produces semiconductor industry forecasts in May and November each year.

Courtesy-TheInq