AMD Appears To Be Pushing Its Boltzmann Plan

November 25, 2015 by Michael  
Filed under Computing

Troubled chipmaker AMD is putting a lot of its limited investment money into the “Boltzmann Initiative”, which uses heterogeneous system architecture (HSA) to harness both the CPU and AMD GPUs for compute efficiency through software.

VR-World says that stage one results are finished and were shown off this week at SC15. This included a Heterogeneous Compute Compiler (HCC); a headless Linux driver and HSA runtime infrastructure for cluster-class, High Performance Computing (HPC); and the Heterogeneous-compute Interface for Portability (HIP) tool for porting CUDA-based applications to a common C++ programming environment.

AMD hopes the tools will drive application performance from machine learning to molecular dynamics, and from oil and gas to visual effects and computer-generated imaging.

Jim Belak, co-lead of the US Department of Energy’s Exascale Co-design Center in Extreme Materials and senior computational materials scientist at Lawrence Livermore National Laboratory, said that AMD’s Heterogeneous-compute Interface for Portability enables performance portability for the HPC community.

“The ability to take code that was written for one architecture and transfer it to another architecture without a negative impact on performance is extremely powerful. The work AMD is doing to produce a high-performance compiler that sits below high-level programming models enables researchers to concentrate on solving problems and publishing groundbreaking research rather than worrying about hardware-specific optimizations.”

The new AMD Boltzmann Initiative suite includes an HCC compiler for C++ development, greatly expanding the field of programmers who can leverage HSA.

The new HCC C++ compiler is a key tool in enabling developers to easily and efficiently apply the hardware resources in heterogeneous systems. The compiler offers simplified development via single-source execution, with both the CPU and GPU code in the same file.

The compiler automates the placement of code that executes on both processing elements for maximum execution efficiency.
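To give a flavour of what the HIP tool produces, here is a minimal sketch of a CUDA-style saxpy kernel expressed through the publicly documented HIP runtime API. This is our own illustration rather than code AMD showed at SC15, and the names follow current HIP documentation.

```cpp
// Minimal HIP sketch (our illustration from public HIP documentation,
// not AMD's SC15 demo code). Kernel syntax carries over from CUDA almost verbatim.
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // same built-ins as CUDA
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> hx(n, 1.0f), hy(n, 2.0f);

    float *dx = nullptr, *dy = nullptr;
    hipMalloc(reinterpret_cast<void**>(&dx), n * sizeof(float));  // cudaMalloc becomes hipMalloc
    hipMalloc(reinterpret_cast<void**>(&dy), n * sizeof(float));
    hipMemcpy(dx, hx.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(dy, hy.data(), n * sizeof(float), hipMemcpyHostToDevice);

    // hipLaunchKernelGGL replaces CUDA's <<<grid, block>>> launch syntax.
    hipLaunchKernelGGL(saxpy, dim3((n + 255) / 256), dim3(256), 0, 0, n, 2.0f, dx, dy);
    hipDeviceSynchronize();

    hipMemcpy(hy.data(), dy, n * sizeof(float), hipMemcpyDeviceToHost);
    std::printf("y[0] = %f\n", hy[0]);  // expect 4.0

    hipFree(dx);
    hipFree(dy);
    return 0;
}
```

The same single source file builds with hipcc for AMD hardware, and HIP’s headers are designed so that identical code can also be compiled against the CUDA back end on Nvidia GPUs, which is the portability pitch in a nutshell.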



Samsung Debuts 10nm FinFET S-RAM

November 23, 2015 by Michael  
Filed under Computing

Samsung appears to have stolen a march on Intel and TSMC by coming up with a 10-nano FinFET-processed S-RAM.

According to Electronic Times, Intel and TSMC’s products are still being processed at 14-nano and 16-nano, so Samsung’s 10-nano S-RAM will open the way for a generation of Giga-Smartphones. S-RAM is faster than D-RAM and is used for a CPU’s cache memory.

It means that Samsung’s 10-nano technology will be mass-produced at full scale in early 2017. The theory is that a 10-nano AP will combine gigabit modem chips into one faster chip.

Samsung is presenting its plans at the ISSCC. The chips will have a 128 Megabyte (MB) capacity and a cell area of 0.040 µm². Compared to the 14-nano S-RAM (0.064 µm²) that Samsung Electronics introduced in the past, the cell area is reduced by 37.5 per cent.
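That 37.5 per cent figure follows directly from the two quoted cell areas; a throwaway check using only the numbers above:

```cpp
// Quick check of the quoted scaling figure, using only the cell areas in the article.
#include <cstdio>

int main() {
    const double cell_14nm = 0.064;  // µm², Samsung's earlier 14-nano S-RAM cell
    const double cell_10nm = 0.040;  // µm², the new 10-nano S-RAM cell
    const double reduction = (cell_14nm - cell_10nm) / cell_14nm * 100.0;
    std::printf("Cell area reduction: %.1f%%\n", reduction);  // prints 37.5%
    return 0;
}
```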

In an ISSCC scientific paper, Samsung said that it built a large-scale fast cache memory in the smallest area. An AP for a smartphone using this S-RAM can minimise the die area and improve performance.

All this means that Samsung Electronics has surpassed Taiwan’s TSMC and developed the next-generation system semiconductor.

Intel postponed its schedule for developing next-generation 10-nano system semiconductors from 2016 to 2017 due to rising production costs. Samsung Electronics is targeting the end of next year for commercialising 10-nano processing.

Samsung Electronics has also developed 14-nano flat-surface NAND-Flash, which is also an industry first. Toshiba, Micron and others have announced that after they finish developing 15 to 16-nano parts, they are giving up on flat-surface NAND-Flash.

It had been thought that 14-nano NAND-Flash, which reduces the floating gate area by about 12.5 per cent compared to 16-nano, would greatly help Samsung Electronics reduce NAND production costs by shrinking the silicon die.



GPU Shipments Appear To Be On The Rise

November 17, 2015 by Michael  
Filed under Computing

Beancounters at JPR have been adding up the numbers and dividing by their shoe size and worked out that GPU shipments are up for both Nvidia and AMD.

Over the last few months both have been busy with new releases. Nvidia has its GeForce GTX 950 and GTX 980 Ti, while AMD put its first HBM-powered cards, the Radeon R9 Fury X, Fury and the super-small R9 Nano, into the shops.

According to JPR, overall GPU shipments are up quarter-over-quarter – with AMD’s overall GPU shipments up 15.8 per cent. But before AMD fanboys get all excited by a surprise return to form from AMD, JPR said that Nvidia “had an exceptionally strong quarter”. Nvidia saw an uptick of 21.3 per cent.

The PC market as a whole increased by 7.5 per cent quarter-over-quarter but decreased 9 per cent year-over-year. Nvidia’s discrete GPU shipments were up 26.3 per cent according to JPR, while AMD’s discrete GPUs spiked by 33 per cent.

AMD’s mobile GPU shipments for notebooks increased by 17 per cent, while Nvidia’s rose by 14 per cent.



AMD And GloFo Have Successful 14nm FinFET Production

November 10, 2015 by Michael  
Filed under Computing

AMD said that GlobalFoundries has demonstrated silicon success on the first AMD products using GloFo’s 14nm FinFET process technology.

We are pretty sure that it is talking about the prototypes for Zen, but AMD is not being that specific. Nevertheless, AMD is being enthusiastic.

As a result of this milestone, GloFo’s silicon-proven technology is planned to be integrated into multiple AMD products that address the growing need for high-performance, power-efficient compute and graphics technologies across a broad set of applications, from personal computers to data centres to immersive computing devices, AMD said.

Er, that will be Zen then.

AMD said that it has taped out multiple products using 14nm Low Power Plus (14LPP) process technology and is currently conducting validation work on 14LPP production samples.
Today’s announcement represents another significant milestone towards reaching full production readiness of GlobalFoundries’ 14LPP process technology, which will reach high-volume production in 2016, AMD said.

The 14LPP platform taps the benefits of three-dimensional, fully-depleted FinFET transistors to enable customers like AMD to deliver more processing power in a smaller footprint for applications that demand the ultimate in performance.

Mark Papermaster, senior vice president and chief technology officer at AMD, said that FinFET technology is expected to play a critical foundational role across multiple AMD product lines, starting in 2016.

“GlobalFoundries has worked tirelessly to reach this key milestone on its 14LPP process. We look forward to GlobalFoundries’ continued progress towards full production readiness and expect to leverage the advanced 14LPP process technology across a broad set of our CPU, APU, and GPU products.”

Mike Cadigan, senior vice president of product management at GlobalFoundries, said that the 14nm FinFET technology is among the most advanced in the industry.

“Through our close design-technology partnership with AMD, we can help them deliver products with a performance boost over 28nm technology, while maintaining a superior power footprint and providing a true cost advantage due to significant area scaling.”

GlobalFoundries’ 14LPP FinFET is ramping with production-ready yields and excellent model-to-hardware correlation at its Fab 8 facility in New York.

AMD said that in January, the early-access version of the technology (14LPE) was successfully qualified for volume production, while achieving yield targets on lead customer products.
The performance-enhanced version of the technology (14LPP) was qualified in the third quarter of 2015, with the early ramp occurring in the fourth quarter of 2015 and full-scale production set for 2016.


Will AMD’s Zen Processor Hit The Mark?

November 6, 2015 by Michael  
Filed under Computing

A website has based an article on a single forum post claiming that AMD Zen meets all expectations, because the poster knows someone who works at AMD. So it must be true.

The fact that AMD Zen “meets all expectations” got us excited until we looked a bit deeper. It turns out that the report is based on a guy who swears he knows a guy that used to work for AMD on the K12 L2 cache design. It is not clear if he met the guy in a pub or not.

His other colleague, who still works there, tells him that the test chip has met all of the expectations and the team didn’t find any significant bottlenecks, which got the partners on the server side very excited. We have had our share of AMD Zen exclusive news, but it will take a while until this chip hits the market; we expect it in late 2016.

At this stage of development AMD should actually be in the end phase anyway, and if everything went fine the test chip should be running. The last few quarters are used to further optimise the design.

AMD definitely needs a break, and Zen is a new architecture on a new manufacturing node, which is the most complicated and complex thing you can do in chip development. If all continues to go well, the K12 might ship in limited quantities towards the end of 2016 and in serious quantities by 2017. Intel should have a Kaby Lake 14nm successor to Skylake launching in a similar timeframe, which gives AMD a fighting chance.

Intel has 99 per cent of the server market share, according to a Bloomberg report. If K12 gets even close to the performance of Intel’s desktop and server chips, AMD has a realistic chance of recovery. Server manufacturers don’t really like a one-player market, as increased competition could drive prices down.



AMD Replaces Catalyst With Crimson

November 5, 2015 by Michael  
Filed under Computing

AMD has announced Radeon Software Crimson, a “mini graphics operating system” to replace the firm’s Catalyst Driver and work alongside the standalone Radeon Technologies Group (RTG) chip division announced a month ago.

RTG was launched to give virtual reality and augmented reality a dedicated focus for amateur and professional gamers, and the group’s senior vice president, Raja Koduri, said that Crimson will have the same focus on software.

“We have been delivering graphics drivers for 20-plus years. These so-called drivers have evolved into a graphics mini operating system, Radeon Software, which we deliver on a regular basis to all users,” said Koduri.

Crimson loads games in less than one second, 10 times faster than the previous Catalyst Driver, according to the firm.

The OS has new features along with greater stability and improved performance. For example, one feature at the core of the software, Radeon Settings, is a user interface built on a new architecture. It was based on ‘three fundamental principles’ when it was designed: responsiveness, discoverability and ease of use.

“We combined these and you now have at your fingertips the ability to control your GPU through one modern application,” said RTG senior manager Terry Makedon.

Other features include a new brushed metal UX design, faster start-up, better navigation, a new Game Manager, an updated Overdrive feature and new video, display and Eyefinity tools.

Users can also use the software to set overclock settings for individual games, so they can run the video card at higher speeds than the factory defaults, and to launch a game with different preset graphics settings so that it runs at its optimum.

Radeon Software Crimson will arrive by the end of the year, AMD said, and has a feature set for all types of user, whether a dedicated gamer or developer.



AMD Has Mystery Product Named Magnum

October 30, 2015 by Michael  
Filed under Computing

It would appear that AMD has a mystery product called Magnum which seems to have something to do with FPGAs and DTV.

According to WCCFTECH, the Indian import/export tracking site Zauba has a shipping entry for an AMD product codenamed “Magnum.”


The part originated in Canada, where much of AMD’s GPU research and development is carried out. This would suggest that Magnum is a GPU rather than a CPU project.

It has been known that AMD wants to experiment with FPGAs, but it would appear that this one is connected to digital TV. AMD flogged its DTV business to Broadcom in 2008. Going back into that market would be a nightmare because it is a crowded field. AMD has a good SoC, so why would it need an FPGA?

The part is expensive at $342, which puts it out of the range of TV makers. It could be for a console or a device like Nvidia’s Shield, but you would not need an FPGA for that.

So Magnum is a mystery and will probably remain so.



Facebook Introduces ‘2G Tuesdays’ To Employees

October 30, 2015 by mphillips  
Filed under Around The Net

Facebook would like its employees to experience the pain of a slow, really slow, connection.

On Tuesdays, Facebook employees can opt in to use a slower mobile Internet connection to give them a better understanding – and a good dose of empathy – for users in emerging markets saddled with slow network speeds.

Called “2G Tuesdays,” the program is set up to ask employees, on their Facebook news feeds, if they want to use the slower connection. Of course, the company still wants its employees to be productive so the experiment only lasts for an hour.

Most smartphones today use 3G or 4G connections, enabling them to quickly download pages and stream video without interruptions or bobbles. In comparison, a 2G connection might mean it takes a minute or two to download a single web page.

“People are coming online at a staggering rate in emerging markets and, in most cases, are doing so on mobile via 2G connections,” a Facebook spokesperson wrote in an email to Computerworld. “But on the lower end of 2G networks, it can take about two minutes to download a webpage. We need to understand how people use Facebook in different Internet connections in all parts of the world so we can build the best experience for them.”

She made it clear that the program is voluntary but should offer a good lesson.

“We hope this will help us understand how people are using Facebook on slower connections, so we can build a better product for all of the people using it,” she added.

For the past few years, Facebook has been focused on bringing Internet connectivity to people without access through its Free Basics project.






AMD CPU Designer Heads To Samsung

October 28, 2015 by Michael  
Filed under Computing

The brains behind some key CPU designs at AMD and Apple has defected to Samsung.

Jim Keller, who has been behind some of the biggest CPU core design projects at AMD and Apple, has walked away from designing AMD’s Zen CPU.

The move should be a blow as we think AMD needs all the help it can get at the moment, and Zen is either going to save the company or be its swan song.

Samsung will use Keller to design newer, bigger chips, or perhaps even smaller, low-power chips, as the company gears up for the internet of things.

Keller’s CV is impressive. He worked at DEC until 1998 and helped design the Alpha 21164 and 21264 processors. In 1998 he moved to AMD, where he worked on the Athlon (K7) processor. He was the lead architect of the AMD K8 microarchitecture and designed the x86-64 instruction set and HyperTransport interconnect.

In 1999 he moved to SiByte, which was bought by Broadcom. He was chief architect there until 2004, when he moved to P.A. Semi. At the time P.A. Semi was making low-power mobile processors, and it was bought by Apple in 2008. As part of Jobs’ Mob, Keller was part of the team that designed the Apple A4 and A5 system-on-a-chip mobile processors, which ended up in the iPhone 4, 4S, iPad and iPad 2.

In 2012 Keller returned to AMD to work on Zen. It is not clear why he is going, just before the Zen chip gets its act together.



Facebook Goes TechPrep To Increase Diversity

October 26, 2015 by Michael  
Filed under Around The Net

Facebook is adding to its benevolence jar with a thing called TechPrep that it expects will help the poor and disadvantaged to embrace and master technology.

The cynics among us would link this to Facebook and the drive for more members, because members mean money, but fortunately they are out of the office.

This has been a good week for Facebook in terms of positive things, including CEO Mark Zuckerberg being crowned the industry’s biggest LGBT Ally, an award that he welcomed.

TechPrep is part of this inclusive side of Facebook, and fittingly it has been introduced by Maxine Williams, global director of diversity at the firm. TechPrep is pitched at technological learners, parents and guardians who need better computer skills, and Facebook is working with the McKinsey Institute on the initiative.

“At Facebook, we’re working on a number of initiatives to widen the pipeline and build an inclusive culture. After looking closely at the data, we realised that one challenge is a lack of exposure to computer science and careers in technology, as well as a lack of resources for parents, guardians and others who want to learn more,” she said.

“In the US, this lack of access is prevalent in a number of under-represented groups, including black and Hispanic communities. Today, we’re excited to introduce TechPrep, a resource hub where under-represented people and their parents and guardians can learn more about computer science and programming and find resources to get them started.”

Facebook will curate training packages to suit needs. The firm also referred to research which found that computer science education can empower people, and that parts of the community are easily disillusioned by training and teaching.

“77 percent of parents say they do not know how to help their child pursue computer science. This increases to approximately 83 percent for lower income and non-college graduate parents or guardians,” added Williams.

“Yet being encouraged to pursue computer science by a parent or guardian is a primary motivator for women, blacks and Hispanics. Lower awareness of computer science in blacks and Hispanics is driven by less access to people in computer science and computer science programs, and is a major driver of black and Hispanic drop-off when pursuing programming as a career path.”

Facebook is already in the process of boosting its in-house diversity, as are many of its peers. The technology industry is currently white man heavy, and this is not a good thing.



Will AMD’s Newest SoC Save The Company?

October 26, 2015 by Michael  
Filed under Computing

The troubled chipmaker AMD thinks it is onto a winner with its new AMD Embedded R-Series SOC processors.

Designed for demanding embedded needs, the processors incorporate the newest AMD 64-bit x86 CPU core (“Excavator”), plus third-generation Graphics Core Next GPU architecture, and better power management for reduced energy consumption.

AMD tells us that combined, these chips provide industry-leading graphics performance and key embedded features for next-generation designs. The SOC architecture enables simplified, small form factor board and system designs from AMD customers and a number of third party development platform providers.

What AMD brings to the party is its graphics and multimedia performance, including capability for hardware-accelerated decode of 4K video playback and support for the latest DDR4 memory.

Jim McGregor, principal analyst at TIRIAS Research, said that AMD’s push into x86 embedded platforms is paying off with an increasing number of customers and applications.

“There is a need for immersive graphics, high-quality visualization, and parallel computing in an increasing number of embedded applications. Across these fronts, the AMD Embedded R-Series SOC is a very compelling solution.”

Scott Aylor, corporate vice president and general manager of AMD Embedded Solutions, said that his outfit’s AMD Embedded R-Series SOC is a strong match for these needs in a variety of industries including digital signage, retail signage, medical imaging, electronic gaming machines, media storage, and communications and networking.

“The platform offers a strong value proposition for this next generation of high-performance, low-power embedded designs.”

The new AMD Embedded R-Series SOCs offer 22 percent improved GPU performance compared to the 2nd Generation AMD Embedded R-Series APU, and a 58 percent advantage over the Intel Broadwell Core i7 when running graphics-intensive benchmarks.

AMD released some of the specs for its integrated AMD Radeon graphics, including:

• Up to eight compute units and two rendering blocks

• GPU clock speeds up to 800MHz, resulting in 819 GFLOPS (see the quick check after this list)

• DirectX 12 support

• Fully HSA enabled
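As a sanity check on the 819 GFLOPS figure, the usual peak-throughput arithmetic works out, assuming (our assumption, not stated in AMD’s material) the standard GCN layout of 64 stream processors per compute unit and a fused multiply-add counted as two floating-point operations:

```cpp
// Peak single-precision throughput check for the quoted R-Series figures.
// Assumes 64 shaders per GCN compute unit and FMA = 2 FLOPs (our assumptions).
#include <cstdio>

int main() {
    const double compute_units   = 8;    // "up to eight compute units"
    const double shaders_per_cu  = 64;   // typical GCN compute unit (assumption)
    const double flops_per_clock = 2;    // fused multiply-add counted as two ops
    const double clock_ghz       = 0.8;  // "up to 800MHz"

    const double gflops = compute_units * shaders_per_cu * flops_per_clock * clock_ghz;
    std::printf("Peak: %.1f GFLOPS\n", gflops);  // prints 819.2, matching the quoted 819
    return 0;
}
```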

The AMD Embedded R-Series SOC was architected with embedded customers in mind and includes features such as industrial temperature support, dual-channel DDR3 or DDR4 support with ECC (Error Correction Code), Secure Boot, and a broad range of processor options.

It has a configurable thermal design power (cTDP) that allows designers to adjust the TDP from 12W to 35W in 1W increments for greater flexibility.

The SOC also has a 35 percent reduced footprint when compared to the 2nd Generation AMD Embedded R-Series APU, making it an excellent choice for small form factor applications.

AMD said that the range is the first embedded processor with dual-channel 64-bit DDR4 or DDR3 with Error-Correction Code (ECC), with speeds up to DDR4-2400 and DDR3-2133, and support for 1.2V DDR4 and 1.5V/1.35V DDR3.

Its dedicated AMD Secure Processor supports secure boot with AMD Hardware Validated Boot (HVB) and initiates a trusted boot environment before starting the x86 cores. It also has a high-performance integrated FCH featuring PCIe Gen3, USB 3.0, SATA3, SD, GPIO, SPI, I2S, I2C, and UART.

The AMD Embedded R-Series SOC provides industry-leading ten-year longevity of supply. The processors support Microsoft Windows 7, Windows Embedded 7 and 8 Standard, Windows 8.1, Windows 10, and AMD’s all-open Linux driver including Mentor Embedded Linux from Mentor Graphics and their Sourcery CodeBench IDE development tools.

It will be interesting to see if AMD can make up the ground it has lost on PCs and higher ticket items. Most of the company still appears to be in a holding pattern until Zen arrives.



ARM Adds Carbon Design To Its Collection

October 22, 2015 by Michael  
Filed under Computing

ARM has scooped up the product portfolio and other business assets of Carbon Design Systems, a supplier of cycle-accurate virtual prototyping solutions.

The deal, financial terms of which have not been disclosed, will see Carbon’s staff transfer to ARM, where the chip firm will make use of the Massachusetts-based outfit’s expertise in virtual prototypes.

This will enable ARM to iron out any bugs and make improvements to chips before they move to the foundry for production, in turn giving designers access to ARM IP earlier in the design cycle and getting new system-on-chip (SoC) solutions to market faster.

ARM also said that Carbon will help the firm enhance its capability in SoC architectural exploration, system analysis and software bring-up.

Carbon already has a library of ARM processor and system models that can be used to create cycle-accurate virtual prototypes of any new ARM-based SoC.

“Early stage virtual prototyping of complex SoCs is now mandatory for leading silicon vendors, as demonstrated by the success of ARM Fast Models,” said Hobson Bullman, general manager of ARM’s development solutions group.

“The integration of Carbon’s virtual prototyping products into the ARM portfolio will deliver access to ARM IP earlier in the design cycle. This builds on the current industry leading solutions to enable further design optimisation, time-to-market and cost-efficiency gains for our partners.”

ARM’s acquisition of Carbon comes after acquiring technology from electronic design outfit Cadence to improve its mobile processor designs.

Speaking out on the firm’s latest acquisition, Ziv Binyamini, corporate vice president of the Advanced Verification Solutions, System and Verification Group at Cadence, said: “ARM and Cadence have together brought innovation and efficiency to early OS bring-up, software-driven verification and interconnect performance analysis by connecting ARM Fast Models and ARM Interconnect IP to the Cadence Palladium Emulation platform and Interconnect Workbench.

“We are looking forward to expanding our collaboration further into cycle-accurate virtual prototyping and performance analysis, to jointly help customers achieve their performance and time-to-market targets.”


Did Japanese Game Development Leave With Kojima?

October 22, 2015 by Michael  
Filed under Gaming

Hideo Kojima has left the building. The New Yorker has confirmed that the famous game creator’s last day at Konami has come and gone, with a farewell party attended by colleagues from within and without the country – but not, notably, by Konami’s top brass. Only a couple of months after his latest game, Metal Gear Solid V: The Phantom Pain, clocked up the most commercially successful opening day’s sales of any media product in 2015, Kojima has left a studio facing shutdown – its extraordinary technology effectively abandoned, its talent scattered, seemingly unwanted, by a company whose abusive and aggressive treatment of its staff has now entered the annals of industry legend.

It’s not exaggerating to say that an era came to a close as Kojima walked out the door of the studio that bore his name for the last time. For all of Konami’s the-lady-doth-protest-too-much claims that it’s not abandoning the console market, actions matter far more than PR-moderated words, and shutting down your most famous studio, severing ties with your most successful creator in the process, is an action that shouts from the rooftops. Still, there’s some truth to Konami’s statements; it’s unlikely to abandon the console versions of Winning Eleven / Pro Evolution Soccer, or of Power Pro Baseball, any time soon, though more and more of the firm’s focus will be on the mobile incarnations of those franchises. The big, expensive, risky and crowd-pleasing AAA titles, though? Those are dead in the water. Metal Gear Solid, Silent Hill (whose reincarnation, with acclaimed horror director Guillermo del Toro teaming up with Kojima at the helm, is a casualty of this change of focus), Suikoden, Castlevania, Contra… Any AAA title in those franchises from now on will almost certainly be the result of a licensing deal, not a Konami game.

One can criticise the company endlessly for how this transition has been handled; Konami has shown nigh-on endless disrespect and contempt for its creative staff and, Kojima himself aside, for talented, loyal workers who have stuck by the firm for years if not decades. It richly deserves every brickbat it’s getting for how unprofessionally and unpleasantly it’s dealt with the present situation. It’s much, much harder to criticise the company for the broader strokes of the decisions being made. Mobile games based on F2P models are enormous in Japan, not just with casual players but with the core audience that used to consume console games. The transition to the “mid-core” that mobile companies talk about in western territories is a reality in Japan, and has been for years; impressively deep, complex and involved games boast startling player numbers and vastly higher revenue-per-user figures than most western mobile games could even dream of. Konami, like a lot of other companies, probably expects that western markets will follow the same path, and sees a focus on Japan’s mobile space today as a reasonable long-term strategy that will position it well for tomorrow’s mobile space in the west.

Mobile is the right business to be in if you’re a major publisher in Japan right now. It’s where the audience has gone, it’s where the revenues are coming from, and almost all of the cost of a mobile hit is marketing, not development. Look at this from a business perspective; if you want to develop a game on the scale of Metal Gear Solid V, you have to sink tens of millions of dollars (the oft-cited figure for MGSV is $80 million) into it before it’s even ready to be promoted and sold to consumers. That’s an enormous, terrifying risk profile; while the studio next door is working on mobile games that cost a fraction of that money to get ready for launch, with the bulk of the spend being in marketing and post-launch development, which can be stemmed rapidly if the game is underperforming badly. Sure, mobile games are risky as all hell and nobody really knows what the parameters for success and failure are just yet, but with the time and money taken to make a Metal Gear Solid, you can throw ten, twenty or thirty mobile games at the wall and see which one sticks. The logic is compelling, whether you like the outcome or not.

Here’s what nobody, honestly, wants to hear – that logic isn’t just compelling for Konami. Other Japanese publishers are perhaps being more circumspect about their transitions, but don’t kid yourself; those transitions are happening, and Konami will not be the last of the famous old publishers to excuse itself and slip away from the console market entirely. When Square Enix surveys the tortured, vastly expensive and time-consuming development process of its still-unfinished white elephant Final Fantasy XV, and then looks at the startling success it’s enjoyed with games like Final Fantasy Record Keeper or Heavenstrike Rivals on mobile, what thoughts do you think run through the heads of its executives and managers? Do you think Sega hasn’t noticed that its classic franchises are mostly critically eviscerated when they turn up as AAA console releases, but perform very solidly as mobile titles? Has Namco Bandai, a firm increasingly tightly focused on delivering tie-in videogames for Bandai’s media franchises, not noticed the disparity between costs and earnings on its console games as against its mobile titles? And haven’t all of these, and others besides, looked across from their TGS stands to see the gigantic, expensive, airship-adorned stands of games like mobile RPG GranBlue Fantasy and thought, “we’re in the wrong line of work”?

Kojima isn’t the first significant Japanese developer to walk out of a publisher that no longer wants his kind of game – but he’s the most significant thus far, and he’s certainly not going to be the last. The change that’s sweeping through the Japanese industry now is accelerating as traditional game companies react to the emergence of upstarts grabbing huge slices of market share; DeNA and Gree were only the first wave, followed now by the likes of GungHo, CyGames, Mixi and Colopl. If you’re an executive at a Japanese publisher right now, you probably feel like your company is already behind the curve. You’ve studied plenty of cases in business school in which dominant companies who appeared unassailable ended up disappearing entirely as newcomers took the lion’s share of an emerging market whose importance wasn’t recognised by the old firms until it was too late. You go home every evening (probably around midnight – it’s a Japanese company, after all) and eat your microwave dinner in front of TV shows whose ad breaks are packed with expensive commercials for mobile games from companies that hadn’t even appeared on your radar until a year or two ago, and none from the companies you’d always considered the “key players” in the industry. You’re more than a little bit scared, and you really, really want your company to be up to speed in mobile, like, yesterday – even if that means bulldozing what you’re doing on console in the process.

This is not entirely a bleak picture for fans of console-style games. Japanese mobile games really are pushing more and more towards mid-core and even hardcore experiences which, though the monetisation model may be a little uncomfortable, are very satisfying for most gamers; the evolution of those kinds of games in the coming years will be interesting to watch. Still, it will be a very long time before there’s a mobile Metal Gear Solid or a mobile Silent Hill; some experiences just don’t make sense in the context of mobile gaming, and there is a great deal of justification to the fears of gamers that this kind of game is threatened by the transition we’re seeing right now.

I would offer up two potential silver linings. The first is that not all companies are in a position to break away from console (and PC) development quite as dramatically as Konami has done. Sega, for example, is tied to those markets not least by its significant (and very successful) investments in overseas development studios, many of which have come about under the auspices of the firm’s overseas offices. Square Enix is in a similar position due to its ownership of the old Eidos studios and franchises, along with other western properties. Besides, despite the seemingly permanent state of crisis surrounding Final Fantasy XV, the firm likely recognises that the Final Fantasy franchise requires occasional major, high-profile console releases to keep it relevant, even if much of its profit is found in nostalgic retreads of past glories. Capcom, meanwhile, is deeply wedded to console development – it’s a much smaller company than the others and perhaps more content to stick to what it knows and does well, even if console ends up as a (large) niche market. (Having said that, if a mobile version of Monster Hunter springs to the top of the App Store charts, all bets are probably off.)

“Hideo Kojima left Konami because he wants to make a style of game that doesn’t fit on mobile F2P – and that’s, in the long run, probably a good thing”

The other silver lining is perhaps more substantial and less like cold comfort. Hideo Kojima left Konami because he wants to make a style of game that doesn’t fit on mobile F2P – and that’s, in the long run, probably a good thing. He joins a slow but steady exodus of talent from major Japanese studios over the past five years or more. The kind of games which people like Kojima – deeply involved with and influenced by literature, film and critical theory – want to make don’t fit with publishers terribly well any more, but that doesn’t mean those people have to stop making those games. It just means they have to find a new place to make them and a new way to fund them. Kojima’s non-compete with Konami supposedly ends in a few months and then I suspect we’ll hear more about what he plans; but plenty of former star developers from publishers’ internal studios have ended up creating their own independent studios and funding themselves either through publisher deals or, more recently, through crowdfunding. Konami’s never likely to make another game like Castlevania: Symphony of the Night, but that doesn’t stop Koji Igarashi from putting Bloodstained: Ritual of the Night on Kickstarter. Sega knocked Shenmue on the head, but a combination of Sony and Kickstarter has sent Yu Suzuki back to work on the franchise. Keiji Inafune also combined crowdfunding money with publisher funding for Mighty No. 9. Perhaps the most famous and successful of all breakaways from the traditional publishing world, though, is of a very different kind; Platinum Games, which has worked with many of the world’s top publishers in recent years while retaining its independence, is largely made up of veterans of Capcom’s internal studios.

Whichever of those avenues Kojima ends up following – the project-funding style approach of combining crowdfunding and publisher investment, or the Platinum Games approach of founding a studio and working for multiple publishers – there is no question of him walking away from making the kind of games he loves. Not every developer has his sway, of course, and many will probably end up working on mobile titles regardless of personal preference – but the creation of Japanese-style console and PC games isn’t about to end just because publishers are falling over themselves to transition to mobile. As long as the creators want to make this kind of game, and enough consumers are willing to pay for them (or even to fund their development), there’s a market and its demands will be filled. The words “A Hideo Kojima Game” will never appear on the front of a Konami title again; but they’ll appear somewhere, and that’s what’s truly important in the final analysis.

IBM Caves To Chinese Government

October 21, 2015 by Michael  
Filed under Computing

Biggish Blue is hoping to get the Chinese government onside by showing it its source code.

According to The Wall Street Journal the deal between IBM and the Chinese government is a completely new practice, which was implemented recently.

It allows the Chinese government to take a closer look at the source code behind some of IBM’s software, but does not allow the code to be copied or tampered with in any way. It does mean that if the US spooks have hidden any backdoors in IBM’s code, they will be spotted.

It is not clear how much time ministry officials have to look at the code and it is expected that strict procedures are in place within these technology demonstration centers to ensure that no software source code is released, copied or altered.

Chinese authorities have been pressing US companies looking to expand into China to give them the source code for review, to prove there are no security risks. Obviously all code has security risks, but they are specifically looking for CIA backdoors.

The deal appears to coincide with IBM’s recent announcement that it struck a deal with 21Vianet Group – one of China’s data center service providers.


Could Comets Have Assisted With Alien Life On Europa?

October 20, 2015 by Michael  
Filed under Around The Net

If alien life swims in the ocean beneath Europa’s icy surface, it might have got its start from comets cracking the icy shell to deliver vital pre-life ingredients, say researchers.

New simulations show that a specific family of comets have the mass, velocity and opportunity to do the job — penetrating the full range of likely Europan ice thicknesses.

“It’s one of the best candidates for an ecosystem,” said Rónadh Cox of Williams College, Mass., regarding Europa’s ocean. “But how do you get biological precursors into the ocean?”

To find out if icy, chemical-rich comets could do it, she and her team modeled impacts by the full range of comets that have been influenced by Jupiter and brought into relatively short orbits around the sun — so-called Jupiter Family Comets. The team simulated the collisions of these known comets into a range of ice thicknesses. Their results were surprising and led to new insights into the moon’s actual ice crust thickness.

“It turns out it doesn’t matter how thick the crust is,” said Cox, the lead author of a paper about the research in the latest issue of the Journal of Geophysical Research: Planets. That’s because it’s all a matter of the frequency of impacts over the very long window of opportunity — in this case, the almost five billion years that the solar system has been around.

If Europa’s ice, for instance, is at the very thick end of the spectrum — 40 kilometers (25 miles) deep — it would require a 5 to 7-kilometer (3-5 mile) diameter comet to breach it, she said.

They found that the odds of that happening with a Jupiter Family Comet are about once in 100 million years. That’s a virtual certainty over five billion years.
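To put a number on “virtual certainty”: if breaching impacts are treated as a simple Poisson process (our modelling shorthand, not necessarily the statistics used in the paper), a once-per-100-million-year event is expected about 50 times over five billion years, and the chance of it never happening is vanishingly small.

```cpp
// Rough Poisson-process sketch of the impact-rate argument (our modelling
// shorthand; the paper's own statistics may differ).
#include <cmath>
#include <cstdio>

int main() {
    const double rate_per_year = 1.0 / 100e6;  // one ice-breaching impact per 100 million years
    const double window_years  = 5e9;          // roughly the age of the solar system

    const double expected_hits  = rate_per_year * window_years;    // Poisson mean, lambda = 50
    const double p_at_least_one = 1.0 - std::exp(-expected_hits);  // P(N >= 1) = 1 - e^(-lambda)

    std::printf("Expected breaching impacts: %.0f\n", expected_hits);
    std::printf("Probability of at least one: %.15f\n", p_at_least_one);  // effectively 1
    return 0;
}
```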

The odds of getting through the ice get a lot better if the ice is thinner. Thinner ice would allow the more numerous smaller comets to break through the ice once every 10 million years, Cox explained.

“It means the crust is penetrated frequently in geological time,” said Cox.

It also means that the two dozen impact craters seen on Europa today can be compared to the simulated impact craters to see what size their impactors were. Those that left craters on Europa are failures, because if they had breached the ice they would have probably been flooded with fresh ice. So they provide a clue about how thick the ice really is.

“We got the best match if ice was 10 to 15 kilometers (6 to 9 miles) thick,” said Cox. This ice thickness agrees with other studies that estimate Europa’s ice thickness by entirely different methods, she said.

“I think that this is the most complete and careful work yet on the relation between the Europan ice shell thickness and the size and depth of impact craters on its surface,” said Purdue University’s Jay Melosh, an expert on planetary impacts. “I am particularly impressed that the authors can match the observed crater depth-diameter relation over the entire range of crater sizes on Europa and get an excellent fit for a 10 kilometer (6.2 mile) thick ice shell. There has been a long-standing controversy over whether the ice on Europa is thin (that is, 7 to 15 km) or thick (around 40 km).”

As for those comets that broke the ice, Cox says the evidence for those is there on Europa’s surface as well, although it’s harder to be certain, as there are a lot more analogues for studying impact craters than for ice-penetrating impacts.

“If impactors are going through the ice there has to be a scar,” Cox said. “Geomorphically, we don’t know what that is. What do those look like now that they are frozen?”

They might look like the patchy, complicated Europan icescape called chaos terrain. Very similar looking surface patterns have been created by re-freezing of ice after explosives have been used to break through sea ice on Earth, Cox said. And that’s about the only analogue she has seen for what has happened on Europa.

If so, and the ice is on the thin side, it’s good news for life on Europa and our chances of finding it.

“Cox and Bauer’s paper strongly supports the thin-shell side,” said Melosh. “This is good news for the astrobiology community because it means that exchange of material between the surface and underlying ocean is relatively easy, so that nutrients for a putative Europan biosphere can get in and samples of that biosphere may be ejected to the surface, within reach of future sample return missions.”