From the mid-1970s and continuing into the 1980s, children and parents growing up in an era of economic upheaval and the early dawn of globalization watched the videocassette recorder rapidly become a common household item, prized for its ease of use as a television recording and playback method. The period was not without competition between two major standards – Sony’s Betamax and JVC’s Video Home System (VHS), the latter of which eventually won this first format war despite being introduced a year after its rival and offering less sophisticated recording quality.
Between 1982, when Philips and Sony commercialized the Compact Disc (CD) format, and the mid-1990s, when personal computer manufacturers and software developers began mass adoption of the CD-ROM standard for data storage, the consumer electronics (CE) industry was searching for an effective disc format for distributing digital video that could replace VHS. It would need to serve two major purposes: being more cost-effective than LaserDiscs, and preventing unauthorized recordings (unlike Video CDs).
After several years of research in the early 1990s, and with the industry promising to avoid another format war between two new optical disc formats – the Multimedia Compact Disc (MMCD) and the Super Density (SD) disc – the Digital Versatile Disc (DVD) standard was agreed upon. The format went on sale in Japan in 1996, in the United States in 1997, in Europe in 1998 and in Australia in 1999.
The format offers a 1x playback speed of 10.5 Mbit/s and was originally sold in a 4.7GB single-layer capacity. In 2003, the double-layer format launched, raising capacity to 8.5GB.
Roughly ten years later, the first Blu-ray Disc titles were released on June 20, 2006, and included 50 First Dates, The Fifth Element, Hitch, House of Flying Daggers, Twister, Underworld: Evolution, xXx and The Terminator. The new 1080p Full HD format’s launch was helped predominantly by Sony’s PlayStation 3, released in November 2006, which sold just over 6 million console units worldwide in its first year on the market. The format wasn’t without competition, however, sparking the industry’s second format war in three decades, this time between Blu-ray Disc and HD DVD. In February 2008, one of HD DVD’s main partners, Toshiba, announced that it would stop developing HD DVD players. That decision, along with the market share delivered by PS3 sales, effectively ceded the war to the Blu-ray Disc format.
The Blu-ray Disc format originally launched with 25GB single-layer and 50GB dual-layer disc capacities, later expanding to 100GB and 128GB with the BDXL format in June 2010. The PS3 features a 2x BD-ROM read speed of 72Mbit/s. Sony most likely chose 2x to save on console production costs, as the minimum data transfer rate required for Blu-ray movie playback is 54Mbit/s.
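The arithmetic behind those figures is easy to check. A quick sketch (my own back-of-the-envelope math, assuming the commonly cited 1x BD-ROM rate of 36 Mbit/s):

```python
# Blu-ray read-speed sanity check (illustrative, not from Sony's spec sheet).
BD_1X_MBITS = 36           # commonly cited 1x BD-ROM data rate
PS3_SPEED_FACTOR = 2       # the PS3's drive reads at 2x

ps3_rate = BD_1X_MBITS * PS3_SPEED_FACTOR
MOVIE_PLAYBACK_MIN = 54    # minimum rate for BD movie playback, i.e. 1.5x

print(ps3_rate)                        # 72 Mbit/s, matching the quoted figure
print(ps3_rate >= MOVIE_PLAYBACK_MIN)  # True: 2x comfortably clears the bar
```

Since 54 Mbit/s works out to exactly 1.5x, a 2x drive was the slowest whole-number multiple that still clears the playback requirement, which fits the cost-saving explanation.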
Once again, ten years after the launch of the 1080p Blu-ray Disc format, we now have its successor: the 2160p Ultra HD Blu-ray Disc format. On May 12, 2015, the Blu-ray Disc Association announced the completed specification and the official Ultra HD Blu-ray logo.
The initial 4K Blu-ray specification allows for three capacities – 50GB single-layer, 66GB dual-layer and 100GB triple-layer – with data read speeds of 82Mbit/s, 108Mbit/s and 128Mbit/s, respectively.
The new UHD 4K Blu-ray specification also moves from H.264 / AVC compression technology to the newer H.265 / HEVC (High Efficiency Video Coding) technology, allowing for noticeably more efficient data compression – roughly 25 to 35 percent lower bit rates at comparable image quality.
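Some back-of-the-envelope numbers put those capacities, read speeds and compression savings in perspective. A minimal sketch (decimal gigabytes assumed; the 30 Mbit/s AVC stream is a hypothetical example, not a figure from the spec):

```python
# Rough maximum playback time if each disc were read end-to-end at its
# peak rate, per the quoted 4K Blu-ray figures.
capacities_gb = {"single": 50, "dual": 66, "triple": 100}
read_mbits = {"single": 82, "dual": 108, "triple": 128}

for layer, gb in capacities_gb.items():
    seconds = gb * 8000 / read_mbits[layer]  # GB -> megabits, then / Mbit/s
    print(layer, round(seconds / 60, 1), "minutes")
# single 81.3, dual 81.5, triple 104.2 -- so feature films necessarily
# average well below the peak read speed.

# HEVC's quoted 25-35% bitrate reduction vs. AVC at comparable quality:
avc_mbps = 30  # hypothetical AVC stream
print(round(avc_mbps * 0.65, 1), "-", round(avc_mbps * 0.75, 1), "Mbit/s with HEVC")
```

The takeaway is that the extra capacity of the larger layers buys headroom for longer runtimes and higher sustained bitrates, while HEVC stretches that headroom further still.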
Marvell today announced an integrated dual-port 100 Gigabit per second (Gbps) Ethernet PHY transceiver based on the IEEE 802.3bj standard.
Dubbed the Alaska C 88X5121, the transceiver, Marvell claims, performs all the physical layer functions required to drive 100Gbps Ethernet over a variety of media, including optics, backplanes and passive copper cables.
The transceiver also supports 25 Gigabit Ethernet (GbE) applications, as well as non-Ethernet applications such as Fibre Channel. The Marvell Alaska 88X5121 is currently sampling to Marvell’s global customers.
Michael Zimmerman, Vice President and General Manager of Marvell’s Connectivity, Storage and Infrastructure (CSI) business unit, said that huge transitions are underway in the technology, from 10GbE to 25GbE and from 40GbE to 100GbE.
“Marvell’s 88X5121 transceiver provides a standards-compliant PHY solution that’s required to enable this transition in datacentres. The 88X5121 builds on Marvell’s legacy of providing best-in-class features that enable customers to expand their Ethernet applications across a broad range of applications and implementations.”
Analyst outfit Dell’Oro Group said that cloud providers are entering an expansion and mega-upgrade cycle, driven by increased demand for capacity and aging infrastructure, that will be served by 25 Gbps server technology and 100 Gbps switch technology.
The gear is made using 28nm lithography, in a 17mm by 17mm package footprint, enabling QSFP28-based high-density 100GbE and 25GbE line card designs.
The line interface of the 88X5121 is fully compliant with the IEEE 802.3bj standard, which defines the physical layer specifications for 100Gbps Ethernet transmission over backplanes and copper cables.
It supports the Reed-Solomon Forward Error Correction (FEC) function required for 100GBASE-CR4 and 100GBASE-SR4 operation, as well as the auto-negotiation and coefficient training protocols required by the IEEE 802.3 standards.
The 88X5121 connects to a MAC or switch on its host interface over a 4x25Gbps CAUI-4 link. The transmit driver and receiver equalization capabilities of the host interface are compliant with OIF CEI-25G-LR specifications, significantly exceeding CAUI-4 requirements.
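The lane arithmetic is worth spelling out. Each of the four CAUI-4 lanes carries 25 Gbit/s of payload, but 64b/66b line coding raises the signaling rate to 25.78125 GBd; on the RS-FEC path (which, as I understand IEEE 802.3 Clause 91, pairs a 256b/257b transcode with an RS(528,514) code), the overheads multiply out to exactly the same figure. A sketch of that check:

```python
from fractions import Fraction

DATA_RATE = Fraction(25)  # Gbit/s of payload per lane

# Plain 64b/66b line coding, as used on the CAUI-4 host side:
caui4_baud = DATA_RATE * Fraction(66, 64)          # 25.78125 GBd

# RS-FEC path: 64b/66b blocks are transcoded to 256b/257b, then
# RS(528,514) parity is added -- (257/256) * (528/514) == 66/64,
# so the per-lane baud rate is unchanged.
fec_baud = DATA_RATE * Fraction(257, 256) * Fraction(528, 514)

print(float(caui4_baud))       # 25.78125
print(caui4_baud == fec_baud)  # True
print(4 * float(DATA_RATE))    # 100.0 Gbit/s aggregate payload
```

This is why FEC can be added (or bypassed in the repeater mode described below) without altering the lane rate the SerDes has to support.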
On the line interface, the device supports a variety of media types including single mode and multimode optical modules, passive and active copper direct attach cables and copper backplanes.
For applications not requiring FEC functionality, the device also supports a low-latency repeater mode in which the Physical Coding Sublayer (PCS) and FEC functions are bypassed. In repeater mode, the device can drive backplanes and cables for non-Ethernet traffic types such as OTN and Fibre Channel. Each of the device’s eight lanes can operate independently of the others in this mode, enabling simultaneous support for multiple standards. The 88X5121’s wide operating range (1.25Gbps to 28.05Gbps) supports a wide variety of standards and rates.
Hideo Kojima has left the building. The New Yorker has confirmed that the famous game creator’s last day at Konami has come and gone, with a farewell party attended by colleagues from within and without the country – but not, notably, by Konami’s top brass. Only a couple of months after his latest game, Metal Gear Solid V: The Phantom Pain, clocked up the most commercially successful opening day’s sales of any media product in 2015, Kojima has left a studio facing shutdown – its extraordinary technology effectively abandoned, its talent scattered, seemingly unwanted, by a company whose abusive and aggressive treatment of its staff has now entered the annals of industry legend.
It’s not exaggerating to say that an era came to a close as Kojima walked out the door of the studio that bore his name for the last time. For all of Konami’s the-lady-doth-protest-too-much claims that it’s not abandoning the console market, actions matter far more than PR-moderated words, and shutting down your most famous studio, severing ties with your most successful creator in the process, is an action that shouts from the rooftops. Still, there’s some truth to Konami’s statements; it’s unlikely to abandon the console versions of Winning Eleven / Pro Evolution Soccer, or of Power Pro Baseball, any time soon, though more and more of the firm’s focus will be on the mobile incarnations of those franchises. The big, expensive, risky and crowd-pleasing AAA titles, though? Those are dead in the water. Metal Gear Solid, Silent Hill (whose reincarnation, with acclaimed horror director Guillermo del Toro teaming up with Kojima at the helm, is a casualty of this change of focus), Suikoden, Castlevania, Contra… Any AAA title in those franchises from now on will almost certainly be the result of a licensing deal, not a Konami game.
One can criticise the company endlessly for how this transition has been handled; Konami has shown nigh-on endless disrespect and contempt for its creative staff and, Kojima himself aside, for talented, loyal workers who have stuck by the firm for years if not decades. It richly deserves every brickbat it’s getting for how unprofessionally and unpleasantly it’s dealt with the present situation. It’s much, much harder to criticise the company for the broader strokes of the decisions being made. Mobile games based on F2P models are enormous in Japan, not just with casual players but with the core audience that used to consume console games. The transition to the “mid-core” that mobile companies talk about in western territories is a reality in Japan, and has been for years; impressively deep, complex and involved games boast startling player numbers and vastly higher revenue-per-user figures than most western mobile games could even dream of. Konami, like a lot of other companies, probably expects that western markets will follow the same path, and sees a focus on Japan’s mobile space today as a reasonable long-term strategy that will position it well for tomorrow’s mobile space in the west.
Mobile is the right business to be in if you’re a major publisher in Japan right now. It’s where the audience has gone, it’s where the revenues are coming from, and almost all of the cost of a mobile hit is marketing, not development. Look at this from a business perspective; if you want to develop a game on the scale of Metal Gear Solid V, you have to sink tens of millions of dollars (the oft-cited figure for MGSV is $80 million) into it before it’s even ready to be promoted and sold to consumers. That’s an enormous, terrifying risk profile; while the studio next door is working on mobile games that cost a fraction of that money to get ready for launch, with the bulk of the spend being in marketing and post-launch development, which can be stemmed rapidly if the game is underperforming badly. Sure, mobile games are risky as all hell and nobody really knows what the parameters for success and failure are just yet, but with the time and money taken to make a Metal Gear Solid, you can throw ten, twenty or thirty mobile games at the wall and see which one sticks. The logic is compelling, whether you like the outcome or not.
Here’s what nobody, honestly, wants to hear – that logic isn’t just compelling for Konami. Other Japanese publishers are perhaps being more circumspect about their transitions, but don’t kid yourself; those transitions are happening, and Konami will not be the last of the famous old publishers to excuse itself and slip away from the console market entirely. When Square Enix surveys the tortured, vastly expensive and time-consuming development process of its still-unfinished white elephant Final Fantasy XV, and then looks at the startling success it’s enjoyed with games like Final Fantasy Record Keeper or Heavenstrike Rivals on mobile, what thoughts do you think run through the heads of its executives and managers? Do you think Sega hasn’t noticed that its classic franchises are mostly critically eviscerated when they turn up as AAA console releases, but perform very solidly as mobile titles? Has Namco Bandai, a firm increasingly tightly focused on delivering tie-in videogames for Bandai’s media franchises, not noticed the disparity between costs and earnings on its console games as against its mobile titles? And haven’t all of these, and others besides, looked across from their TGS stands to see the gigantic, expensive, airship-adorned stands of games like mobile RPG GranBlue Fantasy and thought, “we’re in the wrong line of work”?
Kojima isn’t the first significant Japanese developer to walk out of a publisher that no longer wants his kind of game – but he’s the most significant thus far, and he’s certainly not going to be the last. The change that’s sweeping through the Japanese industry now is accelerating as traditional game companies react to the emergence of upstarts grabbing huge slices of market share; DeNA and Gree were only the first wave, followed now by the likes of GungHo, CyGames, Mixi and Colopl. If you’re an executive at a Japanese publisher right now, you probably feel like your company is already behind the curve. You’ve studied plenty of cases in business school in which dominant companies who appeared unassailable ended up disappearing entirely as newcomers took the lion’s share of an emerging market whose importance wasn’t recognised by the old firms until it was too late. You go home every evening (probably around midnight – it’s a Japanese company, after all) and eat your microwave dinner in front of TV shows whose ad breaks are packed with expensive commercials for mobile games from companies that hadn’t even appeared on your radar until a year or two ago, and none from the companies you’d always considered the “key players” in the industry. You’re more than a little bit scared, and you really, really want your company to be up to speed in mobile, like, yesterday – even if that means bulldozing what you’re doing on console in the process.
This is not entirely a bleak picture for fans of console-style games. Japanese mobile games really are pushing more and more towards mid-core and even hardcore experiences which, though the monetisation model may be a little uncomfortable, are very satisfying for most gamers; the evolution of those kinds of games in the coming years will be interesting to watch. Still, it will be a very long time before there’s a mobile Metal Gear Solid or a mobile Silent Hill; some experiences just don’t make sense in the context of mobile gaming, and there is a great deal of justification to the fears of gamers that this kind of game is threatened by the transition we’re seeing right now.
I would offer up two potential silver linings. The first is that not all companies are in a position to break away from console (and PC) development quite as dramatically as Konami has done. Sega, for example, is tied to those markets not least by its significant (and very successful) investments in overseas development studios, many of which have come about under the auspices of the firm’s overseas offices. Square Enix is in a similar position due to its ownership of the old Eidos studios and franchises, along with other western properties. Besides, despite the seemingly permanent state of crisis surrounding Final Fantasy XV, the firm likely recognises that the Final Fantasy franchise requires occasional major, high-profile console releases to keep it relevant, even if much of its profit is found in nostalgic retreads of past glories. Capcom, meanwhile, is deeply wedded to console development – it’s a much smaller company than the others and perhaps more content to stick to what it knows and does well, even if console ends up as a (large) niche market. (Having said that, if a mobile version of Monster Hunter springs to the top of the App Store charts, all bets are probably off.)
“Hideo Kojima left Konami because he wants to make a style of game that doesn’t fit on mobile F2P – and that’s, in the long run, probably a good thing”
The other silver lining is perhaps more substantial and less like cold comfort. Hideo Kojima left Konami because he wants to make a style of game that doesn’t fit on mobile F2P – and that’s, in the long run, probably a good thing. He joins a slow but steady exodus of talent from major Japanese studios over the past five years or more. The kind of games which people like Kojima – deeply involved with and influenced by literature, film and critical theory – want to make don’t fit with publishers terribly well any more, but that doesn’t mean those people have to stop making those games. It just means they have to find a new place to make them and a new way to fund them. Kojima’s non-compete with Konami supposedly ends in a few months and then I suspect we’ll hear more about what he plans; but plenty of former star developers from publishers’ internal studios have ended up creating their own independent studios and funding themselves either through publisher deals or, more recently, through crowdfunding. Konami’s never likely to make another game like Castlevania: Symphony of the Night, but that doesn’t stop Koji Igarashi from putting Bloodstained: Ritual of the Night on Kickstarter. Sega knocked Shenmue on the head, but a combination of Sony and Kickstarter has sent Yu Suzuki back to work on the franchise. Keiji Inafune also combined crowdfunding money with publisher funding for Mighty No. 9. Perhaps the most famous and successful of all breakaways from the traditional publishing world, though, is of a very different kind; Platinum Games, which has worked with many of the world’s top publishers in recent years while retaining its independence, is largely made up of veterans of Capcom’s internal studios.
Whichever of those avenues Kojima ends up following – the project-funding style approach of combining crowdfunding and publisher investment, or the Platinum Games approach of founding a studio and working for multiple publishers – there is no question of him walking away from making the kind of games he loves. Not every developer has his sway, of course, and many will probably end up working on mobile titles regardless of personal preference – but the creation of Japanese-style console and PC games isn’t about to end just because publishers are falling over themselves to transition to mobile. As long as the creators want to make this kind of game, and enough consumers are willing to pay for them (or even to fund their development), there’s a market and its demands will be filled. The words “A Hideo Kojima Game” will never appear on the front of a Konami title again; but they’ll appear somewhere, and that’s what’s truly important in the final analysis.
If Hideo Kojima really is on the outs at Konami, he’s at least going out with a bang. The embargo on Metal Gear Solid V: The Phantom Pain coverage lifted last night, and the first batch of reviews is glowing.
IGN’s Vince Ingenito gave the game a 10 out of 10, lavishing praise on the way it adapted the series’ stealth-action formula to an open-world environment.
“Right from the moment you’re told to get on your horse and explore the Afghan countryside, Phantom Pain feels intimidating, almost overwhelming in terms of the freedom its open world affords and the number of concepts it expects you to grasp,” Ingenito said. “It’s almost too much, especially given the relative linearity of previous Metal Gears. But what initially appeared to be an overly dense tangle of features to fiddle with instead unraveled into a well-integrated set of meaningful gameplay systems that provided me with a wealth of interesting decisions to make.”
Whether players choose to sneak their way to victory or go in guns blazing, The Phantom Pain affords them a number of avenues to do so. The game’s day/night cycle and changing weather systems can make certain strategies viable (or not) at any given time. At the same time, a private army management meta-game lets players raid battlefields for resources and new recruits, which can then be put to use researching new technologies or using their skills to open up a variety of other strategic alternatives.
However, a perfect score doesn’t mean a perfect game, and Ingenito did identify at least one weak point: the storytelling.
It’s a somewhat surprising criticism of the game, given Metal Gear Solid 4’s penchant for frequent and extended cutscenes larding the action with exposition and plot twists. While The Phantom Pain shows flashes of that approach (Ingenito noted the “spectacular” opening sequence), it ultimately produces a narrative he found “rushed and unsatisfying.”
Obviously, that failing was not enough to tarnish an otherwise fantastic game in Ingenito’s eyes.
“There have certainly been sandbox action games that have given me a bigger world to roam, or more little icons to chase on my minimap, but none have pushed me to plan, adapt, and improvise the way this one does,” he said. “Metal Gear Solid 5: The Phantom Pain doesn’t just respect my intelligence as a player, it expects it of me, putting it in a league that few others occupy.”
GameSpot’s Peter Brown likewise gave the game a 10 and praised its adaptable approach to missions, but enjoyed the story considerably more than his counterpart at IGN.
“After dozens of hours sneaking in the dirt, choking out enemies in silence, and bantering with madmen who wish to cleanse the world, The Phantom Pain delivers an impactful finale befitting the journey that preceded it,” Brown said. “It punches you in the gut and tears open your heart. The high-caliber cutscenes, filled with breathtaking shots and rousing speeches, tease you along the way. Your fight in the vast, beautiful, and dangerous open world gives you a sense of purpose. The story is dished out in morsels, so you’ll have to work for the full meal, but it’s hard to call it ‘work’ when controlling Big Boss feels so good, with so many possibilities at your fingertips.”
Brown said prior knowledge of the series isn’t a prerequisite to enjoying The Phantom Pain, but added that “Fans of the series will find their diligence rewarded in ways that newcomers can’t begin to imagine.” They’ll also, in his estimation, be enjoying the pinnacle of the franchise.
“There has never been a game in the series with such depth to its gameplay, or so much volume in content,” Brown said. “The best elements from the past games are here, and the new open-world gameplay adds more to love on top. When it comes to storytelling, there has never been a Metal Gear game that’s so consistent in tone, daring in subject matter, and so captivating in presentation. The Phantom Pain may be a contender for one of the best action games ever made, but is undoubtedly the best Metal Gear game there is.”
Eurogamer hasn’t published its full review yet, but Matt Wales weighed in with his impressions to date. Like Brown and Ingenito, Wales underscored the narrative approach as a major departure for the series.
“Beyond an outlandish, action-packed opening sequence… The Phantom Pain is a remarkably economical affair, telling its tale of ’80s cold war subterfuge through snatches of radio dialogue (courtesy of Ocelot), and the occasional return to Mother Base between missions,” Wales said. “It’s fascinating to see such restraint from Kojima, a man well known for his self-indulgence and excess, especially considering that The Phantom Pain is likely his Metal Gear swan song.”
On the gameplay side, Wales said The Phantom Pain “isn’t exactly a radical reinvention of the stealth genre,” but acknowledged the increased freedom players are given to accomplish the familiar assortment of objectives.
“Metal Gear Solid 5’s open world might not be vast, varied or stuffed full of things to do, but it’s a place of constant movement,” Wales said. “Night falls, day breaks, sandstorms sweep in, patrols come and go – and this organic sense of life means that missions are never predictable (no matter how often you play them) with tactical possibilities arising all the time. It’s a game of planning and reacting in a world that refuses to stand still, making every minute matter and every success feel earned.”
“The gameplay, storytelling, and protagonists in Metal Gear may shift with each new installment, but Kojima’s ability to surprise and enthrall gamers remains unchanged.”
He also applauded the way The Phantom Pain managed to adopt an open-world design without the genre’s standard glut of padding.
“[E]verything you do feels meaningful and consequential,” Wales said. “Guard posts and roaming patrols aren’t simply there for colour as you traverse the world: one careless move into hostile territory and every single enemy on the map will know you’re coming, with more search parties and increased security radically altering the way a mission unfolds. And while other games tout choice and consequence as a headline feature, the Phantom Pain just gets on with it. Even the smallest action can have unexpected consequences – some significant and others barely perceptible.”
Game Informer’s Joe Juba gave the game a 9.25, currently one of the lowest scores the game has received on Metacritic (where it has a 95 average based on 15 critic reviews). Like some of the above reviewers, Juba was a bit disappointed with The Phantom Pain’s approach to storytelling, but noted that having the narrative take a step into the background puts the focus on the game’s strongest point: its open-ended gameplay.
“A series can’t survive this long without evolving, and The Phantom Pain is a testament to the importance of taking risks,” Juba said. “An open world, a customizable base, a variable mission structure – these are not traditional aspects of Metal Gear, but they are what makes The Phantom Pain such an exceptional game. The gameplay, storytelling, and protagonists in Metal Gear may shift with each new installment, but Kojima’s ability to surprise and enthrall gamers remains unchanged.”
Marvell has just announced a new octa-core ARM Cortex-A53 chip with a rather strange name: the 5-mode 4G LTE ARMADA Mobile PXA1936 SoC.
The chip supports 5-mode LTE and the eight A53 cores are clocked at up to 1.5 GHz. The SoC supports 1080p displays, as well as high-def video encoding and decoding, while the improved image processor supports cameras between 13 and 16 megapixels.
Marvell claims the chip has an enhanced security processor, as well as advanced power management and an audio codec. The ARMADA Mobile PXA1936 supports both LP-DDR2 and DDR3 memory, eMMC storage, WiFi, Bluetooth, FM, GPS, SOIO and 5-mode 4G LTE.
The company also announced the quad-core ARMADA Mobile PXA1908, with A53 cores running at up to 1.2GHz. This is a cheaper 5-mode LTE chip, supporting 8- to 13-megapixel cameras and 720p displays, and it is meant to attack the Moto G market usually served by MediaTek SoCs or Qualcomm Snapdragon 400 parts.
We don’t think that it will be the fastest solution around, but we hope to see this chip in some important designs. The company claims that we should see the ARMADA Mobile PXA1936 shipping in early 2015.
Support for a union among game developers has grown, according to survey results released today by the International Game Developers Association. The group today announced the result of its Developers Satisfaction Survey from earlier this year, which found that more than half of respondents were in favor of unionization.
Of the more than 2,200 developers surveyed, 56 percent said yes when asked if they would vote to form a national union of game developers in their own countries today. That’s up from the group’s 2009 Quality of Life Survey, where just 35 percent of more than 3,300 developers said they would vote in favor of unionizing at that time.
As for whether the IGDA was considering a move in that direction, the group’s executive director Kate Edwards dismissed the notion.
“For the IGDA, we will always be a professional association,” Edwards told GamesIndustry International. “That’s what we exist for, and what we’ll always be. But if we are seeing that developers feel unionization is what they perceive to be a solution, then that’s something we’re going to pay attention to and see where it goes for them.”
“When we asked people how many jobs they’d had in the last five years and the average number was four, that was pretty eye-opening for us.”
IGDA head Kate Edwards
The survey also yielded new findings on gender diversity. While the group determined that men still “dominate” the industry, it isn’t to the same degree as before. The IGDA found 22 percent of respondents identified as female, up from 11.5 percent in 2009. Additionally, the 2009 survey only included “male” and “female” designations; this year’s poll found 2 percent of respondents identifying as male-to-female transgender, female-to-male transgender or “other.”
Edwards also found responses on the lack of job security in the industry notable, if not exactly surprising.
“When we asked people how many jobs they’d had in the last five years and the average number was four, that was pretty eye-opening for us,” Edwards said. “But I do think it basically confirms what a lot of us have sort of known and have been hearing anecdotally for a while now.”
The Developers Satisfaction Survey also polled people on their salary, and found that nearly half of developers earn less than $50,000 annually. That stands in stark contrast to the Gamasutra annual Game Developer Salary Survey, which found that last year the average developer made more than $84,000, with QA being the only discipline with a sub-$50,000 average salary (and even that was a little shy of $49,000). Edwards chalked the difference up to a high percentage of the IGDA survey respondents who identified themselves as independent developers, saying they were likely working in freelance or start-up capacities.
A little less than two-thirds of respondents (61 percent) said they planned to work in games indefinitely. Of those who saw themselves leaving at some point, the most frequently given reason (39 percent) was a desire for a better quality of life.
The IGDA will release a summary report of the survey next month, followed up by reports focusing on specific topics within the survey, like diversity, quality of life, and employment practices. The group has said it will use the findings to help identify what its members care about and prioritize its initiatives and advocacy efforts around those subjects. To keep up with members’ needs as they change, the IGDA is planning the Developer Satisfaction Survey as an annual exercise.
Intel has announced a new family of products aimed at the automotive industry. Intel’s platform is designed for entertainment, navigation and there are some “smartcar” features, too.
The first product is basically a board with an Intel processor on top, but its real value is in the software, not hardware. Intel is developing a Linux-based environment for auto applications and it does not appear to have much in common with Intel’s previous efforts in the field. Intel’s extensive experience in bringing new x86 platforms to market and backing them with the necessary software is unmatched. In addition, Intel should have no problem offering support for a wide range of software platforms down the road.
Significant investment, potentially huge market
Intel Capital started making significant investments in the automotive space two years ago, with the creation of the Intel Capital Connected Car Fund, a $100 million fund tasked with accelerating development in the automotive niche.
The automotive infotainment market is growing at a healthy rate. There is no consensus on the CAGR, but most research firms put it in double-digit territory. Growth is picking up, too. GSMA believes the market will grow threefold in just five years, eventually hitting $38 billion by 2018.
The automotive niche is getting a lot of attention from leading chipmakers such as Texas Instruments and Nvidia. In fact, Nvidia is in the process of reshaping its SoC strategy to better tap this market, shifting focus away from smartphones in the process.
The mobile market is overheating and growth is slowing down. As a result new niches such as wearables, IoT, home automation and automotive platforms are attracting more investment.
Speeding up time-to-market
Intel is touting speed as its key differentiator. The chipmaker believes it can drastically reduce infotainment development time, allowing carmakers to bring their solutions to market faster than the competition. Intel claims it can reduce development time by more than a year and cut costs by as much as 50 percent.
It is not just about music and navigation. Smart cars are the next big step and Intel wants to be a part of the self-driving car revolution.
“Our goal is to fuel the evolution from convenience features available in the car today to enhanced safety features of tomorrow and eventually self-driving capabilities,” said Doug Davis, Intel VP, IoT group.
In spite of the mobile boom witnessed over the past decade, most cars in showrooms today are ‘dumb’, not to mention older vehicles on the road. It is not just about making parallel parking a breeze. Smart automotive platforms promise to deliver huge improvements in terms of efficiency and safety. Convenience is just one small part of the puzzle.
In a new financial forecast, Sony has warned of heavy losses primarily due to its exit from the PC business and because “demand for physical media [is] contracting faster than anticipated.”
In two weeks, Sony will announce its financial results. The company expects to post a net loss.
A report released earlier this year by Generator Research showed revenue from DVD and Blu-ray sales will likely decrease by 38% over the next four years.
By comparison, online movie revenue is expected to grow 260% from $3.5 billion this year to $12.7 billion in 2018, the report states.
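The Generator Research numbers quoted above are internally consistent; a quick check, treating the four-year window as given (the exact years are an assumption here):

```python
# Sanity check on the Generator Research online-movie-revenue figures:
# growth from $3.5 billion to $12.7 billion over four years.

start_bn, end_bn = 3.5, 12.7
growth = (end_bn - start_bn) / start_bn          # total growth over the period
annual = (end_bn / start_bn) ** (1 / 4) - 1      # implied annual growth rate
print(f"Total growth: {growth:.0%}")             # ~263%, in line with the ~260% cited
print(f"Implied annual growth: {annual:.0%}")
```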
“Movie producers have little to fear from online distribution in the long term,” Generator Research said. “It is the distribution part of the movie business that should be worried, because online distribution will replace a sizable portion of their current industry.”
Paul Gray, director of TV Electronics & Europe TV Research at market research firm DisplaySearch, said consumers are now accustomed to the instant availability of online media, and “the idea of buying a physical copy seems quaint if you’re under 25.”
“Furthermore, e-tail has hollowed out the retail structure so that it’s largely [just the] latest titles in supermarkets. I suspect they are almost a gift format now,” Gray said.
About to put even more pressure on physical disc formats, Gray said, is the High Efficiency Video Coding (HEVC) video compression standard, which roughly halves the bitrate needed for a given picture quality, effectively doubling the amount of video that can be streamed over the same bandwidth while keeping the “high-definition” format. HEVC can support 8K Ultra-High Definition content with resolutions up to 8192×4320.
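The practical effect of that efficiency gain can be sketched with some back-of-the-envelope numbers. The H.264 bitrates below are illustrative assumptions, not figures from any standard; the point is simply that halving the bitrate at equal quality doubles what a fixed connection can carry.

```python
# Rough effect of HEVC's ~50% bitrate reduction versus H.264.
# The H.264 bitrates are illustrative assumptions only.

H264_BITRATES_MBPS = {"720p": 3.5, "1080p": 6.0, "2160p (4K)": 24.0}
HEVC_EFFICIENCY = 0.5   # HEVC targets roughly half the bitrate at equal quality

for res, h264 in H264_BITRATES_MBPS.items():
    hevc = h264 * HEVC_EFFICIENCY
    print(f"{res}: H.264 ~{h264} Mbit/s -> HEVC ~{hevc} Mbit/s")
```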
The Blu-ray Disc format simply never hit the market levels of the DVD format, which dominated the home entertainment landscape in 2004 with $21.9 billion in sales representing a whopping 96% of home entertainment spending.
Since that peak, optical disc sales have plummeted by about 30%, according to the Digital Entertainment Group. Surprisingly, DVDs still have respectable sales figures, driven mainly by kiosk-style rental machines such as Redbox.
It is starting to look like chip makers are getting cold feet about moving to the next technology for chipmaking. Fabricating chips on larger, 450-millimeter silicon wafers is the next step in the industry’s long cycle of transitions, but according to the Wall Street Journal chipmakers are mothballing their plans.
Companies have to make massive upfront outlays for plants and equipment, and they are balking: the latest change could push the cost of a single high-volume factory to as much as $10 billion, from around $4 billion today. Some companies have been reining in their investments, raising fears that the equipment needed to produce the new chips might be delayed by a year or more.
ASML, a maker of key machines used to define features on chips, recently said it had “paused” development of gear designed to work with the larger wafers. Intel said it has slowed some payments to the Netherlands-based company under a deal to help develop the technology.
Gary Dickerson, chief executive of Applied Materials, said that the move to larger wafers “has definitely been pushed out from a timing standpoint.”
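The appeal of the transition, despite the cost, comes down to wafer area. Assuming the move in question is the industry’s planned step from 300 mm to 450 mm wafers (consistent with the ASML and Intel programs mentioned above), a minimal sketch of the gain looks like this:

```python
import math

# Usable area gain from a 300 mm -> 450 mm wafer transition.
# Edge losses and die geometry are ignored, so this is an upper
# bound on the gain in dies per wafer.

def wafer_area_mm2(diameter_mm: float) -> float:
    """Area of a circular wafer of the given diameter, in mm^2."""
    return math.pi * (diameter_mm / 2) ** 2

ratio = wafer_area_mm2(450) / wafer_area_mm2(300)
print(f"Area ratio 450mm/300mm: {ratio:.2f}x")   # 2.25x the silicon per wafer
```

Roughly 2.25x the dies per wafer pass is what would justify the jump from a $4 billion to a $10 billion fab, if volumes materialize.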
Sony and Panasonic’s new professional optical disc format, named the Archival Disc, will have the same dimensions as Blu-ray discs and will also remain readable for at least 50 years.
The disc will have three layers per side and is expected to hit the market in 2015 with an initial capacity of 300GB, later expanded to 500GB and then 1TB.
The higher capacities will be achieved through signal-processing technologies including multi-level recording technology, the companies said.
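Given the three-layers-per-side, double-sided structure described above, and assuming the 300GB launch capacity widely reported for the format, the implied per-layer capacity at each point on the roadmap is simple arithmetic:

```python
# Implied per-layer capacity of the Archival Disc, assuming the
# widely reported 300GB launch capacity and the double-sided,
# three-layers-per-side structure described in the article.

LAYERS_PER_SIDE = 3
SIDES = 2
total_layers = LAYERS_PER_SIDE * SIDES   # 6 layers in total

for total_gb in (300, 500, 1000):
    per_layer = total_gb / total_layers
    print(f"{total_gb}GB disc -> ~{per_layer:.0f}GB per layer")
```

The jump from ~50GB to ~167GB per layer is where the multi-level recording and signal-processing technologies the companies mention come in.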
Sony and Panasonic are pushing the optical discs for cloud service companies and archival services amid the explosion in online data. The companies will market the discs separately under their brands.
“As a type of archival media, optical discs have numerous advantages over current mainstream HDD and tape media, such as their ability to be stored for a long time while still maintaining readability,” a Panasonic spokesman said. “We hope to develop demand for archives that use optical discs.”
The discs do not need a special storage environment with constant temperature or humidity and do not require air conditioning, the spokesman said, adding that users can also benefit from reduced power consumption compared to using linear tape-open technology (LTO), a magnetic tape storage format.
While LTO cartridges have greater capacity, typical lifetimes can be a lot less than the 50 years for optical discs. HP’s LTO-5 Ultrium 3TB cartridges, for instance, are warranted to last 30 years.
Hard drives can have even shorter shelf lives. Failure rates in one study were at nearly 12 percent after three years.
Sony and Panasonic said that as optical disc formats evolve, inter-generational compatibility ensures that older discs can still be read by corporate storage systems. However, the companies are not positioning the discs as a medium for consumer storage.
“The development is specifically for professional archiving,” the Panasonic spokesman said. “We are not currently considering optical discs for household consumer use.”
Marvell reported a larger-than-expected 112 percent rise in profit, helped by strong demand from storage and networking companies, and said it expected its mobile business to pick up in the current quarter.
Marvell forecast first-quarter revenue of between $870 million and $910 million, above what Wall Street analysts had predicted. Chief Executive Sehat Sutardja said he expects some revenue and unit growth for the company’s 4G LTE mobile platform from multiple customers in the first quarter. Marvell’s mobile business was weaker in the fourth quarter, as some customers delayed product launches.
The company, which also makes communications and processor products used in mobile phones, said net income doubled to $106.6 million, or 21 cents per share, in the quarter ended February 1 from $50.2 million, or 9 cents per share, a year earlier.
Revenue rose to $931.7 million, beating analysts’ estimate of $901.1 million.
Marvell’s biggest customer is Western Digital, which reported better-than-expected quarterly results in January, citing strength in its gaming and notebook business.
Sony has promised to have “substantial” resupplies of the PlayStation 4 before the end of the year, but has given no indication as to what qualifies as substantial. Wedbush analyst Michael Pachter has stepped in to fill that information void, telling investors in a note this morning that he believes Sony is making PS4s at the rate of a million systems per month.
Pachter followed up on Sony’s announcement today that it had sold 2.1 million systems worldwide, saying that number fits well with previous estimates that Sony began manufacturing PS4s for retail on September 1, and that it faces a gap of up to three weeks from a system’s creation to the time it arrives on shelves.
“We expect Sony to continue to ship 1 million consoles per month, so as of the end of January, we believe Sony will have manufactured a cumulative 5 million consoles and will have shipped 4.25 – 4.5 million,” Pachter said. “We expect the 55 percent allocation to North America to continue through January, and then revert to a more normalized 40 percent of units once Sony launches in Japan and other countries. We think that Microsoft is on a similar production schedule, with similar allocations to North America.”
Pachter added that specialty retailer GameStop has been receiving roughly half of the systems shipped to North America, and that it will continue to take up that share of the allocations through December. In the New Year, Pachter expects the company’s share to be dialed back to a “more customary” 30 percent.
If the shipment projections are accurate, the PS4 would be more than holding up its part of publishers’ predictions that Sony and Microsoft would combine to ship 10 million units of their new systems by the end of March.
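Pachter’s figures are internally consistent; a minimal sketch of his arithmetic, using the September 1 manufacturing start, the one-million-per-month rate, and the up-to-three-week factory-to-shelf lag from the article:

```python
# A sketch of Pachter's PS4 production math: 1 million consoles per
# month from September 1, 2013, with shipments lagging manufacture
# by roughly half to three-quarters of a month (up to three weeks).

from datetime import date

RATE_PER_MONTH = 1_000_000
start = date(2013, 9, 1)
end_of_january = date(2014, 1, 31)

# Whole months of production: September through January inclusive.
months = (end_of_january.year - start.year) * 12 \
       + (end_of_january.month - start.month) + 1
manufactured = months * RATE_PER_MONTH
print(f"Cumulative manufactured by end of January: {manufactured:,}")  # 5,000,000

# Units still in the pipeline knock the shipped total down a notch.
shipped_low = manufactured - int(0.75 * RATE_PER_MONTH)
shipped_high = manufactured - int(0.5 * RATE_PER_MONTH)
print(f"Shipped: {shipped_low:,} - {shipped_high:,}")  # 4,250,000 - 4,500,000
```

That reproduces the 5 million manufactured and 4.25–4.5 million shipped figures in his note.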
With the PlayStation 4 and Xbox One on the scene, the next console generation has finally begun. While a new generation usually brings the promise of more graphical power, great graphics are only part of the gaming equation. What will these new consoles allow developers to do creatively?
In its last two titles, Dear Esther and Amnesia: A Machine for Pigs, independent developer The Chinese Room focused on pushing the first-person game away from the shooting mechanics that usually dominate. The studio’s next title, Everybody’s Gone to the Rapture, is coming to PlayStation 4 with some help from Sony Computer Entertainment. For The Chinese Room, next-gen helps their creative juices just by being easier to work with.
“The blunt reality is that easier production equals more creative freedom and opportunity”
The Chinese Room creative director Dan Pinchbeck
“I think the major thing, from the perspective of actually building games, is less for us about the power – that’s brilliant of course, and having significantly higher budgets makes a big difference – but it’s more about the ease of working with PS4,” The Chinese Room creative director Dan Pinchbeck told GamesIndustry International. “So far, it’s just been a dream bit of kit to work with. We’ve got the advantage of working with CryEngine, another great piece of tech of course, but even then it’s been remarkably smooth to get things up and running quickly. That’s worth its weight in gold from a production standpoint, and the blunt reality is that easier production equals more creative freedom and opportunity.”
According to Braid creator Jonathan Blow, aiming for a single, next-generation set of specifications allowed the team behind The Witness to settle on a single visual style for the game. That title is also heading to PlayStation 4 in 2014.
“Creatively, we build and we assume that we have enough power in rendering,” explained Blow. “When we were planning the look of the island, we had a couple of choices. Do we target the PlayStation/Xbox 360 class of machines or do we move to next-generation consoles? Because development was going long, we decided we were going to be in the next console cycle anyways.”
“If we’d ended up on lower-spec machines, it wouldn’t just be that [The Witness] would have lower-poly models. It would’ve affected the style all over the place; the style of the game would’ve been different. I don’t think it would’ve been as nice.”
For Ghost Games, the new shepherd of EA’s Need for Speed franchise, next-gen does come down to “more power”. This power – and the new set of expectations that come with it – frees the team to think outside of the box when it comes to gameplay innovation. A new generation allows developers to think about what’s possible instead of wringing more blood from a worn-out stone.
“It makes us think differently. Every time there is a transition we start thinking about what would be possible.”
Ghost Games executive producer Marcus Nilsson
“It makes us think differently,” said Ghost Games executive producer Marcus Nilsson. “Every time there is a transition we start thinking about what would be possible. We are not locked into old boundaries anymore. From that we get great innovations like AllDrive. The systems are giving us power to do more, more AI, more particles etc. Just turning everything up really.”
Nilsson also noted that the PlayStation 4 and Xbox One provide other options, including social networking features and second-screen modes, which “opens up creative solutions around cross-platform play.”
One of the highlights of Sony’s launch window slate for the PlayStation 4 is Infamous: Second Son from Sucker Punch. The game looks amazing, and the improved graphics and horsepower also mean the human element of Infamous can be pushed forward.
“[Infamous: Second Son] is all performance captured,” Sucker Punch co-founder and director of development Chris Zimmerman told us. “We actually use all kinds of cameras, with dots on the actors’ faces getting mapped through 3D scans. As you see people in the game, you’ll see their faces move in realistic ways.”
“See the wrinkles appear?” Zimmerman pointed out in a demo of Second Son. “We are actually animating 15,000 vertices in his face 30 times a second to get that to happen that well. The thing that really matters for a game like this is you can actually see the characters act. You can read his face. You have a million years of human evolution that’s trained you to read people’s expressions and their faces; now we can bring that to you. That is the expression that these actors had when they did the scene. If we show you the video of their faces and then show you the in-game footage, you’ll be like ‘that’s the expression that guy had on.’ It seems dumb, but it matters.”
In some cases, though, the PlayStation 4 and Xbox One will just allow what previous generations have allowed: more, better-looking things onscreen in our games. And even that can improve the player’s experience. For BioWare Edmonton and Montreal general manager Aaryn Flynn, next-gen means a more immersive and interactive game world for BioWare fans.
“With the next generation of consoles, the most important question we ask ourselves is ‘How does this help our storytelling?’ As we’ve worked with them, we think it starts with a density and dynamism that wasn’t possible previously,” said Flynn. “‘Density’ in the sense of more interesting things on the screen that help immerse you in the game world, and ‘dynamism’ in that they are more interactive than ever before.”
The generation has only just begun. Developers still have plenty of time to learn how to make the PlayStation 4 and Xbox One dance and sing. What’s been shown so far is pretty damn good, so let’s sit back and enjoy the future.
In less than a week, both the PlayStation 4 and the Xbox One will have launched in the world’s most lucrative console markets. If you had to plant a flag to mark the start of a new generation, you’d struggle to find a more appropriate spot.
Well, praise be. Microsoft was justifiably lambasted for its early direction and messaging, and the ill-feeling created by that string of fumbled choices persisted despite all subsequent attempts to retrench and appease. Since then, Sony has walked a blessed path; not exactly free of mistakes and questionable decisions, but bolstered by the knowledge that the scrutiny of both the press and the forum-dwelling public was focused elsewhere. Perhaps now hard numbers can replace the speculation and supposition. Perhaps now we will be able to see the true measure of the policy reversals and resolution deficiencies.
There is, after all, a bigger picture to consider. It can be fun to get lost in the manufactured rivalry of a console war, but both Sony and Microsoft understand that this generation must be about more than the chips in their little – and not so little – black boxes. Gaming has never been more popular, or more culturally prevalent, but a lot has changed since the console companies last played this billion-dollar crapshoot.
So much of the industry’s recent growth has happened away from the traditional world of AAA blockbusters, where audience gains have been handily outmatched by soaring expenses. The early debate may be dominated by familiar concerns over framerates and dots-per-inch, but the terms of this generation will be different from the last. Sony’s mistakes with the PlayStation 3’s esoteric architecture didn’t go unnoticed by either party, and it shows in the hardware.
“The last generation created a bunch of artificial work. You had to do things in a very different way and, in the end, it wasn’t like you got a massive amount of technical performance out of it. It was time that didn’t go into making the games better,” says Nick Button-Brown, general manager at Crytek.
“I like the fact that, this time, it’s all built on architecture that we can understand. If you look at the PS3, people only started to get the most out of the system at the end of the cycle, but that’s five or six years on. That’s terrible. I want to start getting the most out of it nearer the start. That’s the advantage with simpler and more similar architecture – we’ll be seeing much more from the first games out.”
Crytek is the studio responsible for Ryse: Son of Rome, a standard-bearer for the Xbox One. Button-Brown admits that, while setting a visual benchmark was not the main objective of the project, it was a side-mission of sorts, and the pride with which he describes Crytek’s work indicates that he considers the mission very much accomplished. The smoke, the fire, the beads of sweat running down the lined, wrinkled faces of the characters, the way those characters plant their feet; these are, he boldly claims, new heights for console gaming.
“I do think we’re going to set a visual benchmark; it’s going to be very difficult for anyone to beat our visual performance. We put a lot of work into facial, a lot of work into animation, just making it all feel much more real,” he says. “Is there further we can go? Definitely. We have some high-end cinema tools that don’t run in real-time even on high-end PCs now – we’re talking one, two frames per second. Eventually, we’ll be able to run those in real-time.”
In the absence of stiff competition, Ryse has as strong a claim to the pinnacle of visual excellence as any other launch title, but Button-Brown understands that such victories are short-lived. After all, in blockbuster development, a better looking game is always just over the next hump of the release schedule. Crytek will no doubt persist in that direction, but the impact of this generation’s visual performance will not be as profound as the jump to HD, and the differences between the PlayStation 4 and Xbox One hardware will matter less still. This time, exactly what constitutes the “cutting-edge” will be harder to pin down.
“There’s always more we can do [visually], but I do think you reach a point where, for the user, they feel that it looks as good as it’s going to get, and they’re not going to see a huge difference between [the consoles],” he says. “For us, the leap is about the details. It’s not about one or two big things. It’s about being able to do small things much better: more stuff on-screen, more AI, more physics.”
It would be churlish to ignore the fact that Ryse has failed to stir the imaginations of the critics, eliciting unanimous praise for its visual detail and precious little else. My interview with Button-Brown was conducted prior to the publication of those reviews, but even then he was cognisant of the gamble creating a launch title for this particular generation represented. In the past, there were obvious, powerful hooks for developers to work with – the advent of 3D graphics and HD graphics, the availability of a hard-drive, online play as a usable tool – but this generation is more diffuse.
“Going into launch, I don’t know whether we’ve spent the resources in the right place. I don’t know whether we’ve focused our efforts in the right place. I’m only going to know that when people get to buy it,” he says.
“We talk to publishers a lot, and one of the most painful questions is, ‘Tell me what next gen gameplay is gonna be?’ It’s not something you can define. Nobody delivers gameplay because it’s next gen; you’re delivering gameplay because it’s good. That’s one of the things we struggled with [in Ryse's E3 demo]. We showed a cut-down version of the gameplay and we were criticised for that. We didn’t see that coming. We were too close, and we cut it down further than people wanted to see.”
However, while the criticisms leveled at Ryse may well be justified, a part of the problem may be that, at the dawn of a new generation, nobody is quite sure what they want to see. They only know what has gone before, and will resist any attempt to smuggle what are regarded as the bad habits of the past into the $400 future. Ryse signalled its intent with combat that closely resembled a quick-time event (QTE). That was never likely to go down well with the press, who instantly suspected Crytek of trying to coast on graphics alone.
“The generational leap is not as clear cut now,” Button-Brown admits. “Maybe in a year’s time we’ll have a better understanding of what the leap really is this time, as people start playing things and we start to see what really matters. I think with hindsight we’ll be able to look back and see, ‘yeah, that was the big step.’”
Perhaps it’s naive to expect more clarity on what might define this generation from developers working so closely with the hardware, but in any case, that would be no slight against Crytek. Apart from Kinect 2.0 on the Xbox One – which may finally have the hardware to honour some of the promises made four years ago – in terms of new game experiences there isn’t an obvious wellspring for original ideas on either console. Indeed, the most obvious differences in the early days of the generation are likely to be found in the service layer: social integration, voice control, multimedia functions, and other areas often dismissed as secondary to the tasks for which a console should be designed.
This is one of the key ideas I took away from my conversation with Michiel van de Leeuw, technical director at Guerrilla Games. Essentially, the moment-to-moment experience of established genres will remain the same, but innovation will arise from, “a deeper, underlying layer.”
“It’s not like we have that one gizmo to make everything really good or different, but the way that the operating system and the games work together, it’s much more of a marriage of those two things,” says van de Leeuw. “It’s a much more holistic approach to the console. How do people use it? How do people want to use it? How do we make sure that every hour of using your console is an hour spent having fun? And almost nothing is more fun than sharing experiences with other people. It’s all integrated, and under the hood there’s a lot of complexity to make sure that you don’t notice it. A lot of magic is necessary to make it look simple.”
As a subsidiary of Sony Computer Entertainment and the developer of a key launch title, Guerrilla Games was part of the inner circle that formed around Mark Cerny during the PlayStation 4’s creation. The most taxing problem, the subject of the most meetings and debates, was how to improve the experience around and outside of the games – streaming, background downloads, switching between applications, and so on. For Cerny, “immediacy” was a watchword.
When it came to the fundamental hardware architecture, however, van de Leeuw says that the directive was relatively simple: “give us more…as many graphical gizmos as you can afford.” The extra power was a given rather than the main focus.
“I like to ask people about what the next generation should be about, and everyone says, ‘it has to be photo-realistic, and everything has to be more. There has to be thousands of people and blah, blah, blah.’ But why is that fun? If you have 1000 people around you, do you feel more attached to them than if you just had one or two? Technology does not immediately result in a more satisfying experience. The first layer that people think about is better graphics, more of everything. And then they think, ‘What do I need more of? I don’t know, really, but there must be more of something’.”
There it is again: the great, unknowable ‘something’ that, nevertheless, everyone is waiting impatiently to see. Killzone: Shadow Fall has fared better with the critics than Ryse, but the expectation of clear, identifiable progress is used as ammunition in the majority of its negative reviews. For van de Leeuw – who also spoke to me prior to the publication of his game’s review scores – launch titles are not necessarily supposed to alter the way people look at games as a whole, but he also makes no secret of the increasing complexity of productions on the scale of Killzone. More power can make life easier in some respects, but certainly not all.
“You have to focus on 1000 things at the same time, and at the same time as that you need to grow your company, because you need more people to focus on all of those things. That, by itself, becomes a problem, because it becomes difficult to manage the complexity brought by all of those extra people. It’s very challenging.
“We’re working with first-person shooters, and look at how incredibly complex these things are. You’re not just selling one game: you’re selling a movie, and a game, and a multiplayer experience that needs to fit with eSports, and it’s all packaged together. And it all has to be good, because the competition is incredibly, and increasingly, good.”
Indeed, it is the progress evident in individual games, rather than the super-charged hardware, that truly throws down a gauntlet at the feet of the industry’s developers. Umpteen gigabytes of GDDR5 memory is not nearly as powerful a motivator to do better work as the release of, say, The Last of Us or The Walking Dead. New hardware may give developers more options, but the real skill lies in making the right decisions. When there is enough of an installed base to offer a safety net, van de Leeuw says, the industry’s most talented developers will start taking creative risks, and new genres will emerge.
But will that innovation be exclusive to a specific platform? When a consumer makes their decision to buy either a PlayStation 4 or an Xbox One, is the potential for new ideas a relevant factor? From the developer side, van de Leeuw says, the differences in the hardware of this generation may not offer the sort of rewards that Naughty Dog and Guerrilla wrung out of the PlayStation 3’s distinctive Cell processor. Today, with teams spiralling into the hundreds, budgets on the rise and a dozen other platforms to consider, the emphasis is on efficient tools and flexible engines. Microsoft and Sony made a conscious choice to be more similar than different in terms of architecture, with developers’ needs firmly in mind.
“Being able to squeeze more out of the console by really focusing on it allowed us, in the past, to create experiences that couldn’t be done, or would be much harder to do if we had to split our focus. But I think we’re coming to the day where the amount of effort you have to put in to do that, it’s questionable whether it’s worth it.
“Our games are getting so big. We try to make our experiences richer for gamers, but at some point… there are pros and cons. Sometimes we wished that things were easier. The [PlayStation 3] was difficult to program for, but I still sometimes miss it because it was also very powerful. You could do a lot of stuff that’s still very difficult to replicate, but the time for bespoke architectures is slowly going away.
“If you look back, raw assembly and raw power were what enabled new experiences. Nowadays, experiences are defined or limited by how efficient our toolsets are, how smooth our workflow is, how quickly we can develop, and how much time we have to spend on mundane distractions… Bespoke architecture allows you to do cool and crazy stuff, and from a technical point-of-view I’m still in love with that sort of thing, but I have a 230-person studio that wants to make a killer title.”
Despite what many executives have claimed in calls to their investors, both van de Leeuw and Button-Brown either strongly imply or directly confirm that the cost of making those “killer titles” will rise this generation – not to the same degree as it did with the Xbox 360 and PS3, perhaps, but certainly beyond the already precarious conditions that exist today. While we pore over screenshot comparisons, declaring winners and losers over slight differences in observable visual performance, it’s worth considering what any third party would actually stand to gain from making one version of a game significantly better than another. Indeed, at companies like Epic, EA and Crytek, the emphasis has been on creating cost-saving tools that work seamlessly across all platforms, effectively glossing over aspects of the hardware that could lead to substantial gains in performance. First-party developers will still pursue that, of course, but, according to Button-Brown, for everyone else the base level of AAA acceptability now sits at a daunting height on both platforms.
“If anything is just okay, it’s now terrible. ‘Solid’ is a failure. You now have to be so good,” he says. “The teams are getting larger and the risks are getting higher. We’re trying to do a lot of procedural stuff in this next generation to keep costs under control. It’s one of the ways we’re trying to keep that down, but it’s still a cost increase. Each asset needs to be so much better, so much more defined, than it was in the previous generation. No amount of procedural is going to change the fact that your underlying asset just has to be that much better.”
All of that hardscrabble competition at the top end of the industry – essentially, fewer companies using more resources to create and market a smaller number of increasingly large games – will have a clear upside for independent developers. Indeed, right now, the beneficial ramifications of Sony’s decision to court indies as early as possible are arguably the most significant difference between the PlayStation 4 and the Xbox One. It always felt like a smart move, and that feeling will be further justified as the paucity of $60 blockbuster releases becomes more apparent.
Microsoft’s early digital strategies and the Xbox One’s evidently underpowered hardware may have monopolised the headlines, but Oddworld Inhabitants’ Lorne Lanning believes that it’s Microsoft’s belated effort to secure the diverse, free flow of content from the indie sector that has truly given Sony the advantage. That reluctance to open up the Xbox platform, he argues, is tied to a big-business mentality that no longer works in a connected entertainment medium – the very same mentality that led to the unanimously derided online check-ins and multimedia focus that dominated the Xbox One’s early messaging.
“ID@Xbox was a bittersweet victory,” Lanning says. “If you have your ear to the ground today, you could see that those policies were going to blow up in its face, particularly when you see what [Sony] was doing. That was an old way of thinking, a way of thinking that was all about control. It’s a trickle down from being a monopoly. There’s a reason there was a class-action suit [against Microsoft]. There’s a reason there was an SEC, antitrust thing. There’s a very good reason for that. They wanted to control everything. The people who made those policies were still thinking very much in that way, and it blew up in their faces.”
For Lanning, this will be a generation defined by consumers getting what they want, rather than what they’re given. The generation where consumers wrest control of gaming back from the companies that have controlled it for so long – platform holders, publishers, retailers – and seek satisfaction from the most agile creative forces. There may be some lingering resistance from those with vested interests in established models, but Lanning believes any company seeking to stand in the way of this intractable change is unlikely to emerge with much credit. There will be more products offering a wider variety of experiences than on any previous generation, with price-points to suit every wallet. The lines of communication are wide open. There is nowhere left to hide.
“As people are becoming more informed and more connected, the shenanigans are becoming more transparent. And with that, what we’ll get is more diversity,” Lanning says. “An industry made up of five publishers really isn’t that long ago, and now what’s going on? How many self-publishing indies are there that can get a 1.5x return on each game and keep building? Maybe they can’t grow and be 500 people by the next year, but they can add five more by the next year.”
I mention the prevailing fear that the marketplaces on the Xbox One and PlayStation 4 will become too crowded – that by making consoles a more accessible place for independent developers, they will lose the focus that created huge successes like Castle Crashers, Super Meat Boy and Braid. For Lanning, it’s a worthwhile trade, and one of the most important ways that indies need to “grow up” to take advantage of the incredible opportunity this generation represents. The Battlefields and the Assassin’s Creeds will continue to exist and thrive, but the average consumer knows that already. What they don’t know about are games like Octodad, Below and Everybody’s Gone to the Rapture, and more fool the studio that leaves it up to Microsoft or Sony to raise their profile.
“If we sell a game now for $10, we get $7 on digital networks. Once upon a time, we weren’t even getting $7 on a $60 game,” Lanning says. “It’s a whole different thing, but you have to bring your own visibility. That’s your responsibility. Beyond just designing the game, we have to design how to build the relationship with our audience. People know that they want the GTA and the Call of Duty, and they’re gonna be on both systems. But they also want the surprises, and they want to experiment with those surprises at below the $60 price range. The audience always wants more choice.
“The biggest earners are gonna be the big AAA titles, because they have the $100 million marketing campaigns. You can’t compete with that. But in the years to come, the big properties at E3, the $100 million properties, they will have started off in the indie space. They’re gonna innovate cheaper, faster and more with their audience right away. That’s a guarantee.”
In the full discussion below, you’ll read Yoshida’s thoughts on the launch review scores (which he joked afterwards that he was hoping I wouldn’t ask him about), how PlayStation is being redefined in the PS4 era, why Drive Club had to be delayed, why graphics and 1080p resolution absolutely matter, and his skepticism about Xbox One’s cloud computing tech. It’s a lengthy conversation, but well worth the read to absorb Yoshida’s refreshingly forthright answers.
Q: You’ve been with PlayStation from the very beginning, you’ve seen it all and played a part in the growth of the games business, so perhaps you’re the best person to answer this question. How would you compare this launch to the previous hardware launches? Has it been harder or easier and why?
Shuhei Yoshida: I think this is the most organized launch we’ve had as a company. The launch of PS4 reminds me a lot of the launch of PlayStation 1 because we were a very small company at that time. We had a small group of people trying to do almost everything. Because we were new, we tried to speak to the people in the industry, our partners and developers, and we tried to learn a lot. So we kind of stopped with that approach as we became successful and larger and more confident. The pace of change was not that fast during PS2 and even PS3. The PS3 era for us was the beginning of the network platform being integrated at a system level… but back then people didn’t really use smartphones and that all changed in three or four years and it was a huge change. That forced us back to basics almost, and it required us to really think through everything that we do from the hardware specifications to services to the overall business plans. We had to think about the use of new devices and what that means for us. When people use mobile devices, is that competition? Or are [mobile devices] tools for us? We had to redefine our platform almost, and we have come to conclude that this is the beginning of a new era of PlayStation, shifting more from a hardware focus to a service focus.
The PS4 generation is going to be the transitional generation. In a sense, it’s the completion of the evolution of the strong 3D capable consoles, but at the same time it’s at the maturing phase of our network platform and the beginning of our new service phase, like our cloud gaming that we are preparing to launch next year. And the use of mobile devices is part of our ecosystem. So all that considered, and the difficulty we had at the launch of the PS3, and very strong competition especially in North America, that made us really revisit everything we’ve been doing and redefine the company, almost like we’re re-entering this industry. Even across our teams, I think you now get more consistent messages [about PlayStation] compared to past generations, because we talk a lot more and get a lot of input [from all the teams] on different decisions.
In the past, it was very much [driven by] Tokyo. And now [Group CEO] Andrew House is playing a major role in getting the US and European groups integrated. And I’ve been playing a major role myself on the development side for the last five years… So, Andy and I can quickly decide for certain projects, “let’s get this person from the US team or this person from the European team” and put someone in charge of a global project. So it’s a much more integrated international team that we have now and we are always communicating. There’s been a great maturing of our organization compared to past generations.
Q: During Sony’s last earnings call, CFO Masaru Kato said that PS4 actually will contribute to the division’s profitability much earlier on than past consoles. How important is this to the continued sustainability of PlayStation as a business, and does this mean we should expect Sony to cut prices on PS4 to make it more affordable sooner?
Shuhei Yoshida: Yeah, I read an article where an executive of a major publisher said something about [prices coming down sooner]… Because Masaru Kato used to be CFO of Sony Computer Entertainment and he was the key guy on the business side when we launched the PS3 – he was the right-hand man for Ken Kutaragi – he had to go through that really tough time. During the PS2 era, we were very proud that we were generating like half the profit of Sony Group or something like that, but with the launch of PS3, we lost billions of dollars and we became a burden for Sony. So with Masaru’s comments, comparing to PS3 is too easy a benchmark. In a sense, we’re doing great because we’re not losing billions with the launch of PS4 – in fact, we’re pretty much breakeven in this launch year of PS4 – but looking forward, it’s fair that as CFO of Sony, and with his experience with previous PlayStation generations, he would expect a better financial performance… And of course, he’s in a position to really whip all of the business groups at Sony to get the best performance possible.
On the question of whether costs come down quicker, I think there are a couple of ways to answer that question. One is that our hardware teams have chosen more standardized components to create PlayStation 4, and that’s contributing to our launch price of $399 versus $599 for the PS3. When we need to source components to get more supply to the retailers, that approach definitely helps compared to some cutting-edge component that only one manufacturer can produce, like Blu-ray or the Cell processor. Those were big bottlenecks. It’s much better this time, and that’s all great, but it might mean that because we’re already using more standardized components, the room for costs to come down might actually be smaller than when we were starting with cutting-edge stuff.
Q: The PS4 software reviews so far have been average or in some cases, worse than average. As the head of Worldwide Studios, what’s your reaction to this? Are you worried about the impact on PS4? The PS3 suffered from a lack of great software but the system did well in the end, so how important is it to have that “system seller” at launch?
Shuhei Yoshida: Yeah, it’s disappointing to see some of the low scores. I haven’t spent enough time reading reviews, but I would characterize them as mixed. And with this launch there are lots of games coming out, so the media must be very busy going through the games quickly, and especially since the online functionality wasn’t ready until the last couple of days. So we have to look at how much time they spend on what aspect of the games and how that may be contributing to some of the lower scores. It’s disappointing but I don’t think it’s worrisome for the launch of the system. I’ve played through all of our games, Killzone, Knack and Resogun, and I totally enjoyed playing through these games. I’m now on my second run of Knack and Resogun at a higher difficulty – these games really grow on you when you play more. I’m very confident that once you purchase these games and play, you’ll be happy that you’ve done so.
Q: You mentioned Knack, and unfortunately that game got even lower scores than the others, and I’m wondering if that’s more frustrating since it came from Mark Cerny. Was Mark not able to devote his complete attention to Knack because of his responsibilities as PS4 system architect? Was he spread a bit too thin?
Shuhei Yoshida: No, I don’t think that’s right. He spent maybe a quarter of his time during the development of Knack and in his position of giving creative direction and overseeing development, it was appropriate… He was in Japan every month for a week, working with the team, so the communication was very good.
The game wasn’t designed [to meet specific] review scores – I was hoping Knack could score in the mid 70s and last I checked it’s around 59-60, so I’m hoping it goes up. The game uses only three buttons to play, so it’s not the type of game reviewers would score high for the launch of a next-gen system. The game was targeted as what we call a second purchase; you know, people may purchase PS4 for Call of Duty or Assassin’s Creed or Killzone, but if they also buy Knack, this is a game that you can play with your family or your significant other. It’s a message that as a platform we are not just trying to cater only to the hardcore, shooter audience – we are looking at all kinds of gamers – but Knack is a great game for core gamers as well because when you up the difficulty level it becomes a really tight, tense action brawler.
But the goal was to design it to be played by anyone, even someone who’s never played before. So it wasn’t aimed at high review scores, even though higher would be appreciated! Killzone is different – it’s definitely targeted to the core gaming audience and we’re still waiting on more reviews because some sites are saying they played single player but not enough multiplayer. So I’ll wait with my personal judgment until I read more reviews.
Q: Regarding the Drive Club delay, considering that the PS4 has been in development for 6 years, it’s odd that an internal studio like Evolution that knew the launch, the specs and everything else well in advance of even the closest third-party partner should miss the launch. Was there some miscommunication or what happened to cause the delay?
Shuhei Yoshida: It’s almost an amazing achievement for any studio to set a release date and achieve it, especially for the launch of a new system, because the hardware and software tools are always getting updated. So you always have to work with a moving target, so to speak. That said, PS4 has been praised for the ease of development and the stability of the dev kit by everyone – not just our teams but other developers and publishers. And it’s true that Evolution was also heavily in discussions about PS4 hardware features and network service features. Where the team missed the date and miscalculated the tasks was when they tried to do something they had not done before.
A launch title is especially tricky if you aim too high. When you try new things, you definitely have to prepare for multiple iterations… In order for a title to come out at launch, the ambition level has to kind of be kept in check; the team has to rely on tried and true mechanisms. That I think is the main reason for missing the launch date. Drive Club is exciting because it really goes aggressive into the integration of social features and the second-screen experience, and that’s a new addition for Evolution. The team has been making racing games for a long time, so they’re veterans when it comes to core racing…
Q: So it was the addition of social integration features that set them back?
Shuhei Yoshida: They always planned the game to have these social features but because these features are new, they found some technical matters or flaws in play testing, and that’s the reason we waited until the very end to announce the delay. They might have been able to hit the date, but in terms of both getting technical matters down and getting the game polished enough… we decided we wanted the team to go back to some of the features and spend some more time to get it done.
Q: This is a multi-part question. First, there’s been a lot of noise in the media lately about how Xbox One runs Call of Duty: Ghosts at 720p, not the full 1080p resolution that it plays on PS4. How important is this? Do you think the average consumer would really appreciate the difference? Second, how much will the average consumer notice a difference between last-gen PS3 and Xbox 360 games and what PS4 now offers? PS3 games look very good, so do graphics matter in next-gen? Why should consumers spend $400 on PS4?
Shuhei Yoshida: I can confidently say that graphics matter, because I played through Killzone: Shadow Fall. What I mean is, most people probably can’t tell looking at 720p or 1080p unless you’re in the industry or you’re a hardware nerd, but when you compare a game like Killzone: Shadow Fall to Killzone 3 on PS3, for example, the fact that the game is rendered and displayed at 1080p native means that every pixel is rendered, and in combination with the new Dual Shock 4 analog sticks and triggers, it’s great when you’re playing a shooter and you can see the enemy far away from you and you can move the crosshair to aim with pixel perfect precision.
When you talk to game designers at Guerrilla, they would tell you it’s kind of traditional for shooters on consoles to include some aim assist [function] because of the lack of accuracy of the control and the lack of clarity in the graphics, but with 1080p and the power of PS4 you don’t need that. So you actually have more control and the satisfaction level is higher. So when you’re shooting enemies, it’s all you. You don’t need to be able to spot the difference in resolution but it just feels great. That’s the difference; graphics aren’t just about making things look pretty, they can make the gameplay better. Another example is in racing games, like Gran Turismo, when you see a long road ahead and it curves to the left or right, you can tell what’s coming thanks to the resolution and power of graphics. The improved draw distance gives you anticipation for what’s to come. So the power of hardware and graphics in some areas is actually very related to great gameplay experiences.
Since the beginning of this year when we saw leaks [about the specs] of next-gen platforms, we immediately knew since the tech specs on PS4 were accurate that the Xbox specifications were likely accurate as well. So we knew at that point that we had much more raw power… So I was hoping from earlier this year that when games come out from third-parties – because that’s the best example, to look at the same game on different platforms – if there’s any slight performance difference on the two systems I’ll be very happy. I wasn’t expecting something like [what happened with] Call of Duty, 720p versus 1080p – that’s a significant difference. Or Battlefield 4, which is 900 versus 720 – 900 requires 50 percent more pixels to be rendered. I learned all this from the Digital Foundry site.
There are a lot of hidden powers in our system. You may be familiar with GPGPU and PS4 has a lot more GPGPU processing in it, which is difficult to learn and master, similar to a Cell processor. So every year the games on PS4 will perform better because most of the launch teams probably didn’t use GPGPU – they probably just used core graphics. So when the developers [use more of these] in two to three years the graphics will be really amazing. Resogun, by the way, is already using GPGPU… and that game is getting very good reviews!
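The resolution figures Yoshida cites can be sanity-checked with a quick calculation. The sketch below assumes the standard 16:9 frame sizes for 720p, 900p and 1080p; the exact ratio of 900p to 720p works out to about 56 percent more pixels, broadly in line with the “50 percent more” he quotes.

```python
# Pixel counts behind the resolution comparisons in the interview.
# Assumes the standard 16:9 frame sizes for each resolution.
resolutions = {
    "720p": (1280, 720),
    "900p": (1600, 900),
    "1080p": (1920, 1080),
}

# Total pixels per frame for each resolution.
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

# 900p renders 1,440,000 pixels vs 921,600 for 720p: a ratio of 1.5625,
# i.e. roughly 56% more pixels per frame.
print(f"900p vs 720p: {pixels['900p'] / pixels['720p']:.4f}x")

# 1080p renders 2.25x the pixels of 720p.
print(f"1080p vs 720p: {pixels['1080p'] / pixels['720p']:.2f}x")
```

Nothing here is from the interview itself; it is just the arithmetic implied by the named resolutions.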
Q: That may be the PS4 system seller you were looking for!
Shuhei Yoshida: At least we have one game that’s getting great reviews.
Q: It’s great for Sony to say that PS4 is more powerful than Xbox One, it’s a great marketing point but…
Shuhei Yoshida: Well, I always say “I believe” or “We believe.” I’m not saying that it is.
Q: Ok, but from an industry standpoint, in a way isn’t it good that both consoles are so similar, so that developers can easily create games for both and target a larger combined installed base? I’m wondering – and this may sound like an odd question – does Sony ever communicate with Microsoft to get a sense of where an industry “standard” for consoles might end up for another generation?
Shuhei Yoshida: No, no. We didn’t conspire [laughs]. But it’s very interesting how we came to the same selection of CPU and GPU vendors. It’s not exactly the same, as each company customized the processing choices, and so we ended up with more processing power, but the architecture is basically quite similar. If you talk to any third-party developer, they say it’s a wonderful thing because they really want to make the development process very efficient. So I think it’s great, because learning the Cell processor was very difficult and now with PS4 everything’s much easier – and at the same time, if you’re a multiplatform developer it’s going to be very easy to create PC, PS4 and Xbox One versions of a game because all three share the same kind of roots.
That said, each company, including Nintendo, has some unique additions to the core… So the multiplatform developers do have some decisions about how much customization and additional work they want to do to take advantage of the different unique aspects of the platforms. And by the way, I don’t think developers have to do much more to take advantage of the raw power of PS4, to get games to render at the highest resolution.
Q: Microsoft has talked a lot about their cloud computing and the extra power that gives the Xbox One to offload some of that processing to a server in games like Forza or Titanfall. Is this something Sony can compete with? Can Gaikai be used in a similar way? Is that realistic, or perhaps Sony and Microsoft view the cloud differently?
Shuhei Yoshida: We’ve been clear on what cloud gaming means, and that’s getting games to run on the server and sending that video signal to a distant device. The way they are using cloud computing seems very different and I totally don’t understand what they mean by that. So we can’t react to what they are saying because we don’t understand. The explanation I found personally was, again, an article on Digital Foundry. They went through all the computing tasks a game goes through and for each one they checked off if it can actually be done on the server versus the client, and most of the tasks a game has to perform, they said, cannot be done on the server because of the huge latency and the bandwidth. There’s so much data going back and forth between the CPU and memory and GPU inside the console compared to going through the internet… There were maybe four or five tasks that actually could be done on the server. So that was very educational to me. After reading the article, the Microsoft message was even more confusing to me.
Q: With PS4 launching, we haven’t touched on Vita at all, but I did want to ask if you think those two systems will feed off each other? The Vita business has been slower than Sony would like but do you think the interest in PS4 and features like Remote Play could help boost the Vita sales over the long-term?
Shuhei Yoshida: Yeah, I hope so. It’s been exciting these past couple of days when we saw the media experimenting with Remote Play. It’s very impressive. And the use case is if the main TV is occupied, then you can continue the game on Vita. If you live alone, maybe the use case is less, but even if you live alone there’s some value in it. For example, I like to play games before I sleep, so I use Vita in bed, and whether or not the TV is occupied it’s just very convenient for me to be able to continue to play. Unless I really need that accuracy with shooting that I talked about earlier – so maybe I wouldn’t play Killzone with Remote Play, but I totally enjoy playing Knack on Vita.
So that definitely makes your Vita much more valuable if you already own one, and if you don’t, once you get PS4 the potential value of Vita is much higher. We definitely hope people see that value and have a chance to see PS4 games running on Vita in person, because the combination of PS4’s power and the great display of PS Vita is awesome. It’s like mini cloud gaming, and actually Gaikai has worked on Remote Play. I’m very happy with the implementation – it’s a seamless experience.