Do Game Developers Have Unrealistic Expectations?

January 22, 2015 by Michael  
Filed under Gaming

Over the last few years, the industry has seen budget polarization on an enormous scale. The cost of AAA development has ballooned, and continues to do so, pricing out all but those with the biggest war chests, while the indie and mobile explosions are rapidly approaching the point of inevitable over-saturation and consequential contraction. Stories about the plight of mid-tier studios are ten-a-penny, with the gravestones of some notable players lining the way.

For a company like Ninja Theory, in many ways the archetypal mid-tier developer, survival has been a paramount concern. Pumping out great games (Ninja Theory has a collective Metacritic average of 75) isn’t always enough. Revitalizing a popular IP like DMC isn’t always enough. Working on lucrative and successful external IP like Disney Infinity isn’t always enough. When the fence between indie and blockbuster gets thinner and thinner, it becomes ever harder to balance upon.

Last year, Ninja Theory took one more shot at the upper echelons. For months the studio had worked on a big budget concept which would sit comfortably alongside the top-level, cross-platform releases of the age: a massive, multiplayer sci-fi title that would take thousands of combined, collaborative hours to exhaust. Procedurally generated missions and an extensive DLC structure would ensure longevity and engagement. Concept art and pre-vis trailers in place, the team went looking for funding. Razor was on its way.

Except the game never quite made it. Funding failed to materialize, and no publisher would take the project on. It didn’t help that the search for a publishing deal arrived almost simultaneously with the public announcement of Destiny. Facing an impossible task, the team abandoned the project and moved on with other ideas. Razor joined a surprisingly large pile of games that never make it past the concept stage.

Sadly, it’s not a new story. In fact, at the time, it wasn’t even a news story. But this time Ninja Theory’s reaction was different. This was a learning experience, and learning experiences should be shared. Team lead and co-founder Tameem Antoniades turned the disappointment not just into a lesson, but a new company ethos: involve your audience at an early stage, retain control, fund yourself, aim high, and don’t compromise. The concept of the Independent AAA Proposition, enshrined in a GDC presentation given by Antoniades, was born.

Now the team has a new flagship prospect, cemented in this fresh foundation. In keeping with the theme of open development and transparency, Hellblade is being created with the doors to its development held wide open, with community and industry alike invited to bear witness to the minutiae of the process. Hellblade will be a cross-platform game with all of the ambition for which Ninja Theory is known, and yet it is coming from an entirely independent standpoint. Self-published and self-governed, Hellblade is the blueprint for Ninja Theory’s future.

“We found ourselves as being one of those studios that’s in the ‘squeezed middle’,” project lead Dominic Matthews says. “We’re about 100 people, so we kind of fall into that space where we could try to really diversify and work on loads of smaller projects, but indie studios really have an advantage over us, because they can do things with far lower overheads. We have been faced with this choice of, do we go really, really big with our games and become the studio that is 300 people or even higher than that, and try to tick all of these boxes that the blockbuster AAA games need now.

“We don’t really want to do that. We tried to do that. When we pitched Razor, which we pitched to big studios, that ultimately didn’t go anywhere. That was going to be a huge game; a huge game with a service that would go on for years and would be a huge, multiplayer experience. Although I’m sure it would have been really cool to make that, it kind of showed to us that we’re not right to try to make those kinds of games. Games like Enslaved – trying to get a game like that signed now would be impossible. The way that it was signed, there would be too much pressure for it to be…to have the whole feature set that justifies a $60 price-tag.

“That $60 price-tag means games have to add multiplayer, and 40 hours of gameplay minimum, and a set of characters that appeal to as many people as they possibly can. There’s nothing wrong with games that do that. There’s some fantastic games that do, AAA games. Though we do think that there’s another space that sits in-between. I think a lot of indie games are super, super creative, but they can be heavily stylised. They work within the context of the resources that people have.

“We want to create a game that’s like Enslaved, or like DMC, or like Heavenly Sword. That kind of third-person, really high quality action game, but make it work in an independent model.”

Cutting out the middle-man is a key part of the strategy. But if dealing with the multinational machinery of ‘big pubs’ is what drove Ninja Theory to make such widespread changes, there must surely have been some particularly heinous deals that pushed it over the edge?

“I think it’s just a reality of the way that those publisher/developer deals work,” Matthews says. “In order for a publisher to take a gamble on your game and on your idea, you have to give up a lot. That includes the IP rights. It’s just the realities of how things work in that space. For us, I think any developer would say the same thing, being able to retain your IP is a really important thing. So far, we haven’t been able to do that.

“With Hellblade, it’s really nice that we can be comfortable in the fact that we’re not trying to appeal to everyone. We’re not trying to hit unrealistic forecasts. Ultimately, I think a lot of games have unrealistic forecasts. Everyone knows that they’re unrealistic, but they have to have these unrealistic forecasts to justify the investment that’s going into development.

“Ultimately, a lot of games, on paper, fail because they don’t hit those forecasts. Then the studios and the people that made those games, they don’t get the chance to make any more. It’s an incredibly tough market. Yes, we’ve enjoyed working with our publishers, but that’s not to say that the agreements that developed are all ideal, because they’re not. The catalyst to us now being able to do this is really digital distribution. We can break away from that retail $60 model, where every single game has to be priced that way, regardless of what it is.”

Matthews believes that publishers, driven into funding only games that will comfortably shift five or six million units, have no choice but to stick to the safe bets, a path that eventually winnows down diversity to the point of stagnation, where only a few successful genres ever end up getting made: FPS, sports, RPG, maybe racing. Those genres become less and less distinct, while simultaneously shoe-horning in mechanics that prove popular elsewhere and shunning true innovation.

While perhaps briefly sustainable, Matthews sees that as a creative cul-de-sac. Customers, he feels, are too smart to put up with it.

“Consumers are going to get a bit wary of games that have hundreds of millions of dollars spent on them”

“I think consumers are going to get a bit wary. Get a bit wary of games that have hundreds of millions of dollars spent on them. I think gamers are going to start saying, ‘For what?’

“The pressures are for games to appeal to more and more people. It used to be if you sold a million units, then that was OK. Then it was three million units. Now it’s five million units. Five million units is crazy. We’ve never sold five million units.”

It’s not just consumers who are getting wise, though. Matthews acknowledges that the publishers also see the dead-end approaching.

“I think something has to be said for the platform holders now. Along with digital distribution, the fact that the platform holders are really opening their doors and encouraging self-publishing and helping independent developers to take on some of those publishing responsibilities has changed things for us. I think it will change things for a lot of other developers.

“Hellblade was announced at the Gamescom PlayStation 4 press conference. My perception of that press conference was that the real big hitters in that were all independent titles. It’s great that the platform holders have recognised that. There’s a real appetite from their players for innovative, creative games.

“It’s a great opportunity for us to try to do things differently. Like on Hellblade, we’re questioning everything that we do. Not just on development, but also how we do things from a business perspective as well. Normally you would say, ‘Well, you involve these types of agencies, get these people involved in this, and a website will take this long to create.’ The next thing that we’re doing is, we’re saying, ‘Well, is that true? Can we try and do these things a different way?’ Because you can.

“There’s definitely pressure for us to fill all those gaps left by a publisher, but it’s a great challenge for us to step up to. Ultimately, we have to transition into a publisher. That’s going to happen at some point, if we want to publish our own games.”

Courtesy-GI.biz

U.S. And Britain Ramping Up Joint Cyber Defense Efforts

January 20, 2015 by mphillips  
Filed under Around The Net

The U.S. and Britain are stepping up their collaboration to thwart digital threats. The two countries plan to launch more attacks against each other to test their defenses and to deter potential adversaries.

The U.S. and the U.K. have been working together to prevent cyber attacks for some time, but are going to deepen that collaboration. They will combine their expertise to set up “cyber cells” on both sides of the Atlantic to increase the sharing of threat information, work out how best to protect themselves, and make it clear to hostile states and organizations that they should not attack, U.K. prime minister David Cameron said in an interview published by the BBC.

Cyber attacks “are one of the biggest modern threats that we face,” according to Cameron, who is visiting Washington for talks with U.S. president Barack Obama. Digital security is high on the agenda.

The countries will increase the “war games” launched at each other to test defenses. “It is happening already but it needs to be stepped up,” Cameron said, adding that British intelligence service GCHQ and its U.S. equivalent, the NSA, have know-how that should be shared more.

“It is not just about protecting companies, it is also about protecting people’s data, about protecting people’s finances. These attacks can have real consequences to people’s prosperity,” he said.

The increased cooperation between the countries comes in the wake of the Sony hack and the apparent hacking of the U.S. Central Command’s Twitter account by ISIS (Islamic State of Iraq and Syria), which posted tweets threatening families of U.S. soldiers and claiming to have hacked into military PCs.

LG Stops OLED Production For Now

January 15, 2015 by Michael  
Filed under Around The Net

South Korea’s labor ministry has ordered LG to halt operations of an organic light-emitting diode (OLED) panel production line following a nitrogen gas leak.

The ministry, in a statement posted on its website yesterday, said the production ban will remain in place while authorities investigate a nitrogen gas leak that killed two workers.

An LG spokeswoman confirmed that production at the OLED TV panel line has been halted. She declined to specify the ban’s effect on sales or production and said the firm will work to resume operations as quickly as possible.

The leak happened days after LG unveiled its Best of CES-winning Art Slim OLED sets, and could delay a timely launch if the investigation drags on.

The leak occurred around 12:50 p.m. at the P8 factory in Paju, about 40 kilometers north of Seoul. Workers from LG Display and its subcontractor were carrying out routine maintenance on the ninth floor when the valve of a nitrogen gas cylinder was presumably opened by mistake.

One worker died at the scene, and another was pronounced dead on the way to the hospital, authorities said.

Courtesy-Fud

Did Sony Learn Anything From Its Recent Breach?

January 13, 2015 by Michael  
Filed under Computing

Sony Pictures CEO Michael Lynton has told the Associated Press that the firm’s computer systems are still down, but, and thank someone for this, film and television production has not missed a beat.

Yes, Sony, the firm that brought us a remake of Annie and a very divisive blunt-edged political comedy called The Interview in the past couple of months, is still firing on production cylinders, if not sending emails.

A long interview with the Associated Press finds Lynton not mentioning the North Korea words, but admitting that the hackers that have the firm between their teeth are pretty good at what they do and that Sony is on a real learning experience.

“We are the canary in the coal mine, that’s for sure,” said the CEO. “There is no playbook for this, so you are in essence trying to look at the situation as it unfolds and make decisions without being able to refer to a lot of experiences you’ve had in the past or other people’s experiences. You’re on completely new ground.”

Despite what you might have been led to believe, the assault on Sony has not been very costly, according to Lynton, who said that the firm has not had much more than a ripple to contend with.

“What I’m hearing so far is that they’re very manageable,” he added. “They’re not disruptive to the economic well being of the company.”

There has been some internal disruption, though, and Lynton said that staffers are being paid with paper checks.

He confirmed that Sony’s technology people did scuttle about looking for workarounds and started using old BlackBerry handsets as part of a belt and braces response.

In the case of the latter, at least one firm was pleased about this news.

Courtesy-TheInq

Sony Says Recent Cyber Attack Will Have Minimal Financial Impact

January 8, 2015 by mphillips  
Filed under Around The Net

Sony Corp Chief Executive Kazuo Hirai said he does not expect the November cyber attack on the company’s film studio to have a significant financial impact, two weeks after the studio finally released the movie that spurred the attack.

The studio, Sony Pictures Entertainment, said separately that the film, “The Interview,” has generated revenue of $36 million.

Hirai told reporters at the Consumer Electronics Show in Las Vegas that he had signed off on all major decisions by the company in response to the attack, which the U.S. government has blamed on North Korea.

Sony’s network was crippled by hackers as the company prepared to release “The Interview,” a comedy about a fictional plot to assassinate North Korean leader Kim Jong Un. The attack was followed by online leaks of unreleased movies and emails that caused embarrassment to executives.

“We are still reviewing the effects of the cyber attack,” Hirai told reporters. “However, I do not see it as something that will cause a material upheaval on Sony Pictures business operations, basically, in terms of results for the current fiscal year.”

Sony Pictures said “The Interview,” which cost $44 million to make, has brought in $31 million in online, cable and satellite sales and was downloaded 4.3 million times between Dec. 24 and Jan. 4.

It has earned another $5 million at 580 independent theaters showing the movie in North America.

It is still unclear if Sony Pictures will recoup the costs of the film, starring Seth Rogen and James Franco, including an estimated $30 million to $40 million marketing bill; against roughly $74 million to $84 million in combined production and marketing spend, the $36 million earned so far leaves a sizeable gap.

On Monday, Hirai praised employees and partners of the Hollywood movie studio for standing up to “extortionist efforts” of hackers, his first public comments on the attack launched on Nov 21.

Were PS4 Sales Flat Over The Holiday?

January 7, 2015 by Michael  
Filed under Gaming

While the Sony PlayStation 4 has been selling very well, it seems that Christmas was not really its season.

Sony said that the PlayStation 4 has sold more than 18.5 million units since the new generation of consoles launched. While that is a good result and makes the PS4 the fastest-selling PlayStation to date, there was no Christmas peak.

You would think that the PS4 would sell well at Christmas as parents were forced to do grievous bodily harm to their credit cards to shut their spoilt spawn up during the school holidays. But apparently not.

Apparently, the weapon of choice against precious snowflakes being bored was an Xbox One, which saw a Christmas spike in sales.

Sony said that its new numbers are pretty much on target; it sold at the expected rate of two million units per month.

Redmond will be happy with its Christmas result, even if it still has a long way to go before it matches the PlayStation 4 on sales.

Courtesy-Fud

Sony Offering Discounts After PlayStation Outage

January 5, 2015 by mphillips  
Filed under Gaming

If you received a PlayStation 4 for Christmas but network outages hampered you from using it, Sony wants to make it up to you.

Sony Computer Entertainment America will offer 10% off PlayStation Store purchases including games, TV shows and movies as a gesture of thanks for users’ patience following an outage of several days caused by distributed denial-of-service (DDoS) attacks.

In addition, PlayStation Plus members who had an active membership or free trial on Dec. 25 will receive a membership extension of five days, Eric Lempel of Sony Network Entertainment wrote in a blog post.

Judging from the comments to the post, many PlayStation Network (PSN) users were happy about the offer, but not all of them.

“What I would like, more than anything else, is an explanation from Sony about how and why this will never happen again,” wrote one user. “Use the money to strengthen and diversify the network infrastructure so these types of attacks become harder to make and easier to recover from.”

In another blog post, Sony had attributed the outages to an attack creating “artificially high levels of traffic designed to disrupt connectivity and online gameplay.”

The DDoS attacks, which also took down Microsoft’s Xbox Live game network, were apparently launched by hacker group Lizard Squad, which later took aim at anonymous network Tor.

Hackers Continue Attack On Tor

December 29, 2014 by mphillips  
Filed under Around The Net

Hackers who apparently attacked Sony’s PlayStation Network (PSN) and Microsoft’s Xbox Live on Christmas Day have turned their attention towards anonymous network Tor.

Lizard Squad, which claimed responsibility for the outage, on Friday tweeted, “To clarify, we are no longer attacking PSN or Xbox. We are testing our new Tor 0day.”

While at least one site that maps the Tor network showed numerous routers with the name “LizardNSA,” the extent of any attack was unclear.

Tor directs user traffic through thousands of relays to ensure anonymity. In a Dec. 19 blog post, Tor managers warned of a possible attack, saying, “There may be an attempt to incapacitate our network in the next few days through the seizure of specialized servers in the network called directory authorities.”

Sony engineers, meanwhile, continued to struggle to get PSN back online Friday following the suspected distributed denial-of-service (DDoS) attacks on Thursday.

Sony’s Twitter account for PSN asked frustrated gamers to be patient as staff worked to get the service back up and running, saying it did not know when PSN would be back online.

“We are aware that some users are experiencing difficulty logging into the PSN,” Sony said on its PlayStation support page, where the network was listed as offline.

In a Twitter post showing a chat with the alleged hackers, MegaUpload founder Kim Dotcom suggested he had convinced Lizard Squad to stop the attacks in return for lifetime memberships on his file-transfer site Mega.

Lizard Squad had taken credit for an apparent attack against PSN earlier this month, as well as an attack in August. The August incident came at the same time that a U.S. flight carrying Sony Online Entertainment President John Smedley was diverted for security reasons.

Will The Apple iWatch Be A Dud?

December 24, 2014 by Michael  
Filed under Consumer Electronics

It appears that Apple waited too long and relied too much on the press to keep interest in its iWatch vaporware product going. New research shows that interest in the device has been falling faster than a free fall team of parachuting elephants who have forgotten to pack the key ingredient of their act.

The Tame Apple press is beside itself with worry, as Apple does not like failure and might not invite them to one of its press launches again unless people get enthusiastic about the watch.

One tech press reporter seriously wrote “One would assume that ever since Apple announced the introduction of the Apple Watch, anticipation for the product would be steadily growing.”

Why would that be, Sherlock? The longer Apple leaves it, the more out of date it will be.

Investment firm Piper Jaffray asked 968 iPhone owners whether they were interested in purchasing an Apple Watch, and only seven percent said they planned to buy it. That figure is down from eight percent in September, when Apple first unveiled the product at its annual iPhone event. By the time the product is actually launched next year (maybe) that figure could drop even further.

Some analysts who have been drinking Apple’s Kool Aid, like Trip Chowdhry of Global Equities Research, have claimed that every iPhone user will also be an Apple Watch user. If Piper Jaffray’s figures prove right, GER should sack Chowdhry as a warning to other analysts who promote Apple at the expense of their company’s credibility.

Courtesy-Fud

Are Buggy Games Getting A Pass?

December 23, 2014 by Michael  
Filed under Gaming

Recently, my smartphone started acting up. I think the battery is on the way out; it does bizarre things, like shutting itself off entirely when I try to take a picture on 60 per cent battery, or suddenly dropping from fully charged to giving me “10 per cent remaining, plug me in or else” warnings for no reason at all. I can get it fixed free of charge, but it’s an incredibly frustrating, bothersome thing, especially given how much money I’ve paid for this phone. Most of us have probably had an experience like this with a piece of hardware; a shoddy washing machine that mangled your favorite shirt, a shiny new LCD screen with an intensely irritating dead pixel, an Xbox 360 whose Red Ring of Death demanded a lengthy trip back to the service center. There are few of us who can’t identify with the utter frustration of having a consumer product that you’ve paid good money for simply fail to do its job properly. Sure, it’s a #FirstWorldProblem for the most part (unless it’s something like a faulty airbag in your Honda, obviously), but it’s intensely annoying and certainly makes you less likely to buy anything from that manufacturer again.

Given that we could all probably agree that a piece of hardware being faulty is utterly unacceptable, I’m not sure why software seems to get a free pass sometimes. Sure, there are lots of consumers who complain bitterly about buggy games, but by and large games with awful quality control problems tend to get slapped with labels like “flawed but great”, or have their enormous faults explained in a review only to see the final score reflect none of those problems. It’s not just the media that does this (and for what it’s worth, I don’t think this is corruption so much as an ill-considered aspect of media culture itself); for every broken game, there are a host of consumers out there ready to defend it to the hilt, for whatever reason.

I raise this problem because, while buggy games have always been with us – often hilariously, especially back in the early days of the PlayStation – the past year or so has seen a spate of high-profile, problematic games being launched, suggesting that even some of the industry’s AAA titles are no longer free from truly enormous technical issues. The technical problems that have become increasingly prevalent in recent years are causing genuine damage to the industry; from the botched online launches of games like Driveclub and Battlefield through to the horrendous graphical problems that plague some players of Assassin’s Creed Unity, they are giving consumers terrible experiences of what should be high points for the medium, creating a loud and outspoken group of disgruntled players who act to discourage others, and helping to drive a huge wedge between media (who, understandably, want to talk about the experience and context of a game rather than its technical details) and consumers (who consider a failure to address glaring bugs to be a sign of collusion between media and publishers, and a failure on the part of the media to serve their audience).

We can all guess why this is happening. I don’t wish in any way to underplay how complex and difficult it is to develop bug-free software; I write software tools to assist in my research work, and given how often those simple tools, developed by two or three people at most, have me tearing my hair out at 3am as I search for the single misplaced character that’s causing the whole project to behave oddly, I am absolutely the last person in the world who is going to dismiss the difficulty involved in debugging something as enormous and complex as a modern videogame. Debugging games has inevitably become harder as team sizes and technical complexity have grown; that’s to be expected.

However, just because something is harder doesn’t mean it shouldn’t be happening, and that’s the second part of this problem. Games are developed to incredibly tight schedules, sometimes even tighter today (given the culture of annual updates to core franchises) than they were in the past. Enormous marketing budgets are preallocated and planned out to support a specific release date. The game can’t miss that date; if there are show-stopping bugs, the game will just have to ship with those in place, and with a bit of luck they’ll be able to fix them in time to issue a day-one digital patch (and if your console isn’t online, tough luck).

Yet this situation is artificial in itself. It’s entirely possible to structure your company’s various divisions around the notion that a game will launch when it’s actually ready, and ensure that you only turn out high-quality software; Nintendo, in particular, manages this admirably. Certainly, some people criticise the company for delaying software and it does open up gaps in the release schedule, but compared to the enormous opprobrium which would be heaped upon the company if it turned out a Mario Kart game where players kept falling through the track, or a Legend of Zelda where Link’s face kept disappearing, leaving only eyes and teeth floating ghoulishly in negative space (sleep well, kids!), an occasional delay is a corporate cultural decision that makes absolute sense – not only for Nintendo, but for game companies in general.

It doesn’t even have to go as far as delaying games on a regular basis. There is a strong sense that some of the worst offenders in terms of buggy games simply aren’t taking QA seriously, which is something that absolutely needs to be fixed – and if not, deserves significant punishment from consumers and critics alike. Quality control has a bit of an image problem; there’s a standard stereotype of a load of pizza-fuelled youngsters in their late teens testing games for a few years as they try to break into a “real” games industry job. The image doesn’t come from thin air; for some companies, this is absolutely a reality. It is, however, utterly false to think that every company sees its QA in those terms. For companies that take QA seriously, it’s a division that’s respected and well-treated, with its own career progression tracks, all founded on the basic understanding that a truly good QA engineer is worth his or her weight in gold.

Not prioritising your QA department – not ensuring that it’s a division that’s filling up with talented, devoted people who see QA as potentially being a real career and not just a stepping stone – is exactly the same thing as not prioritising your consumers. Not building time for proper QA into your schedules, or failing to enact processes which ensure that QA is being properly listened to and involved, is nothing short of a middle finger raised to your entire consumer base – and you only get to do that so many times before your consumers start giving the gesture right back to you and your precious franchises.

Media does absolutely have a role to play in this – one to which it has, by and large, not lived up. Games with serious QA problems do not deserve critical acclaim. I understand fully that reviewers want to engage with more interesting topics than technical issues, but I think it’s worth thinking about how film reviewers would treat a movie with unfinished special effects or audio mixed such that voices can’t be heard; or perhaps how music reviewers would treat an album with a nasty recording hiss in the background, or with certain tracks accidentally dropping out or skipping. Regardless of the good intentions of the creative people involved in these projects, the resulting product would be slammed, and rightly so. It’s perhaps the very knowledge of the drubbing that they would receive that means that such awful movies and albums almost never see the light of day (and when they do, they become legendary in their awfulness; consider the unfinished CGI at the end of “The Scorpion King”, which remains a watchword for terrible special effects many years later).

Game companies, by contrast, seem to feel unpleasantly comfortable with releasing games that don’t work and aren’t properly tested. Certain technical aspects probably contribute to this; journalists may be wary of slamming a game for bugs that may be fixed in a day-one patch, for instance. Yet it seems that there’s little choice but to make the criteria stricter in this regard. If media and consumers alike do not take to punishing companies severely for failing to pay proper respect to QA procedures for their games, this problem will only worsen as firms realize that they can get away with launching unfinished software.

We all want a world where technical issues are nothing but a footnote in the discussion of games; that will be the ultimate triumph of game technology, when it truly becomes transparent. We do not, however, live in that time yet, and the regular launches of games that don’t live up to even the most basic standards of quality are something nobody should be asked to tolerate. The move by some websites to stop reviewing online games until the servers are live and populated with real players is a good start; but the overall tolerance for bugs and willingness to forgive publishers for such transgressions (“we know the last game was a buggy mess, but we’re still going to publish half a dozen puff pieces that will push our readers to pre-order the sequel!”) needs to be fixed. If we want to talk about the things that are important about games (and we do!), it’s essential that we fix the culture that ignores QA and technical issues first.

Courtesy-GI.biz

Are Indie Developers Dying Out?

December 22, 2014 by Michael  
Filed under Gaming

For independent developers, the last decade has been an endless procession of migratory possibilities. The physical world was defined by compromise, dependence and strategically closed doors, but the rise of digital afforded freedom and flexibility in every direction. New platforms, new business models, new methods of distribution and communication; so many fresh options appeared in such a brief window of time that knowing where and when to place your bet was almost as important as having the best product. For a few years, right around 2008, there was promise almost everywhere you looked.

That has changed. No matter how pregnant with potential they once seemed, virtually every marketplace has proved unable to support the spiralling number of new releases. If the digital world is one with infinite shelf-space for games, it has offered no easy solutions on how to make them visible. Facebook, Android, iOS, Xbox Live Arcade, the PlayStation Network; all have proved to be less democratic than they first appeared, their inevitable flaws exposed as the weight of choice became heavier and heavier. As Spil Games’ Eric Goossens explained to me at the very start of 2014: “It just doesn’t pay the bills any more.”

Of course, Goossens was talking specifically about indie development of casual games. And at that point, with 2013 only just receding from view, I would probably have named one exception to the trend, one place where the balance between volume and visibility gave indies the chance to do unique and personal work and still make a decent living. That place would have been Steam, and if I was correct in my assessment for even one second, it wasn’t too long before the harsher reality became clear.

After less than five months of 2014 had passed, Valve’s platform had already added more new games than in the whole of the previous year. Initiatives like Greenlight and Early Access were designed to make Steam a more open and accessible platform, but they were so effective that some of what made it such a positive force for indies was lost in the process. Steam’s culture of deep-discounting has become more pervasive and intense in the face of this chronic overcrowding, stirring up impassioned debate over what some believe will be profound long-term effects for the perceived value of PC games. Every discussion needs balance, but in this case the back-and-forth seemed purely academic: for a lot of developers steep discounts are simply a matter of survival, and precious few could even entertain the notion of focusing on the greater good instead.

And the indie pinch was felt beyond Steam’s deliberately weakened walls. Kickstarter may be a relatively new phenomenon – even for the hyper-evolving landscape of the games industry – but it faced similar problems in 2014, blighted by the twin spectres of too much content and not enough money to go around. Anecdotally, the notion that something had changed was lurking in the background at the very start of the year, with several notable figures struggling to find enough backers within the crowd. The latter months of 2014 threw up a few more examples, but they also brought something close to hard evidence that ‘peak Kickstarter’ may already be behind us – fewer successful projects, lower funding targets, and less money flowing through the system in general. None of which was helped by a handful of disappointing failures, each one a blow for the public’s already flagging interest in crowdfunding. Yet another promising road for indies had become more treacherous and uncertain.

So are indies heading towards a “mass extinction event”? Overcrowding is certainly a key aspect of the overall picture, but the act of making and releasing a game is only getting easier, and the allure of development as a career choice seems to grow with each passing month. It stands to reason that there will continue to be a huge number of games jostling for position on every single platform – more than even a growing market can sustain – but there’s only so much to be gained from griping about the few remaining gatekeepers. If the days when simply being on Steam or Kickstarter made a commercial difference are gone, and if existing discovery tools still lack the nuance to deal with all of that choice, then it just shifts the focus back to where it really belongs: talent, originality, and a product worth an investment of time and money.

At GDC Europe this summer, I was involved in a private meeting with a group of Dutch independent game developers, all sharing knowledge and perspective on how to find success. We finished that hour agreeing on much the same thing. There are few guarantees in this or any other business, but the conditions have also never been more appropriate for personality and individuality to be the smartest commercial strategy. The world has a preponderance of puzzle-platformers, but there’s only one Monument Valley. We’re drowning in games about combat, but This War of Mine took a small step to the left and was greeted with every kind of success. Hell, Lucas Pope made an entire game about working as a border control officer and walked away with not just a hit, but a mantelpiece teeming with the highest honours.

No matter how crowded the market has become, strong ideas executed with care are still able to rise above the clamour, no huge marketing spend required. As long as that’s still possible, indies have all of the control they need.

Courtesy-GI.biz

Will Microsoft’s Arcadia Bring Streaming To The Xbox One?

December 19, 2014 by Michael  
Filed under Gaming

It’s already been widely reported that Microsoft is working on game-streaming technology, long enough that the company has apparently started over at least once. According to a new ZDNet report, Microsoft halted work on one such project called “Rio,” and has since begun building a new streaming service code-named “Arcadia.”

ZDNet’s Mary Jo Foley cites sources within Microsoft with the news that Arcadia is being worked on by a new team in the company’s Operating Systems Group. A job listing for the team says it will be working “to bring premium and unique experiences to Microsoft’s core platforms.”

Arcadia is said to run on Microsoft’s Azure cloud technology, and will let users stream apps as well as games. While there was talk of having Arcadia stream Android apps and games to Windows devices, Foley reported that particular feature has been tabled for the moment.

Courtesy-GI.biz

Is Borderlands Headed To The Xbox One And PS4?

December 17, 2014 by Michael  
Filed under Gaming

Sources are citing a rating that has appeared in the Australian classification database, which seems to point to an upcoming Remastered Edition of Borderlands for Xbox One and PlayStation 4. So far this remains unconfirmed by publisher 2K and franchise developer Gearbox.

The new remastered version is expected to be simply called “Borderlands Remastered Edition”, but with no confirmation from 2K and Gearbox it is difficult to say what it might contain, or whether it is simply a converted and compiled version of the first three games for the Xbox One and PlayStation 4.

Bottom line: if it is in fact a compiled remastered release of the first three games, this could actually be a good thing for those who own the new consoles.

Courtesy-Fud

LG Plans On Rolling Out Quantum Dot TVs In 2015

December 17, 2014 by mphillips  
Filed under Consumer Electronics

South Korea’s LG Electronics Inc will roll out a new range of high-tech TVs in early 2015, expanding its line-up while it strives to cut costs that make its prized organic light-emitting diode (OLED) sets too expensive for most consumers.

A spokesman for the world’s No. 2 TV maker after domestic rival Samsung Electronics Co Ltd said on Tuesday LG will start selling products using quantum dot technology early next year. He didn’t disclose details including pricing.

The technology incorporates a film of tiny light-emitting crystals into regular liquid crystal displays (LCD), boosting picture quality. LG will have 55-inch and 65-inch ultra-high definition quantum dot TVs on display at the major CES trade show next month in Las Vegas.

Japan’s Sony Corp is so far the only major TV maker selling quantum dot models.

LG was widely expected to launch quantum dot TVs next year, having declared its intention to use the products in a dual-track strategy as the firm and its affiliate LG Display Co Ltd try to push OLED prices down. Analysts say it may take the LG firms several years to meet that goal.

The OLED TV sets remain expensive: a 65-inch ultra-high definition model launched in Korea earlier this year was priced at 12 million won ($10,993). A comparable Sony quantum dot TV costs about $3,799, according to the Japanese firm’s website.

Samsung Electronics has said quantum dot is one of many technologies it is considering. Analysts expect Samsung Electronics to launch quantum dot TVs next year, and believe it could be more aggressive in pushing the products than LG, which remains committed to OLED.

The LG spokesman said Dow Chemical Co is supplying quantum dot material. Dow Chemical confirmed the supplier relationship in an emailed statement.

Dow is building a quantum dot factory in South Korea using technology from partner Nanoco Group Plc, with production starting in the first half of 2015.

Hacking Could Cost Sony Studios $100 Million

December 11, 2014 by mphillips  
Filed under Consumer Electronics

Sony Corp’s movie studio could face tens of millions of dollars in costs from the massive network breach that severely hindered its operations and exposed sensitive data, according to cybersecurity experts who have studied past breaches.

The tab will be less than the $171 million Sony estimated for the breach of its PlayStation Network in 2011 because it does not appear to involve customer data, the experts said.

Major costs for the attack by unidentified hackers include the investigation into what happened, computer repair or replacement, and steps to prevent a future attack. Lost productivity while operations were disrupted will add to the price tag.

The attack, believed to be the worst of its type on a company on U.S. soil, also hits Sony’s reputation for a perceived failure to safeguard information, said Jim Lewis, senior fellow at the Center for Strategic and International Studies.

“Usually, people get over it, but it does have a short-term effect,” said Lewis, who estimated costs for Sony could stretch to $100 million.

It typically takes at least six months after a breach to determine the full financial impact, Lewis said.

Sony has declined to estimate costs, saying it was still assessing the impact.

The company has insurance to cover data breaches, a person familiar with the matter said. Cybersecurity insurance typically reimburses only a portion of costs from hacking incidents, experts said.