AMD’s Summit Ridge Processor Details Leaked

January 29, 2015 by Michael  
Filed under Computing

AMD’s first 14nm processors are codenamed Summit Ridge and they are reportedly based on an all-new architecture dubbed Zen.

Information on the new architecture and the Summit Ridge design is still very sketchy. According to Sweclockers, the chips will feature up to eight CPU cores, support for DDR4 memory and TDPs of up to 95W.

Summit Ridge will use a new socket, designated FM3. The FM designation suggests we are looking at A-series APUs, but there is no word on graphics, and the eight-core design points to proper FX-series CPUs – we simply do not know at this point. It is also possible that Summit Ridge is a Vishera FX replacement on an FM socket rather than an AM socket.

Of course, AMD Zen should end up in more than one product line, namely APUs and Opteron server parts. The new architecture has been described as a “high-performance” design and will be manufactured using the Samsung-GlobalFoundries 14nm node.

As for the launch date, don’t hold your breath – the new parts are expected to show up in the third quarter of 2016, roughly 18 months from now.

Courtesy-Fud

China Further Restricts Internet, Blocks VPN Access

January 26, 2015 by mphillips  
Filed under Around The Net

China is further tightening its grip on access to the Internet by blocking services that allow users to get around government censorship.

Several foreign-based operators of virtual private network (VPN) services said Friday that access to their services in China had been disrupted as a result of the crackdown, and that users are facing a harder time getting to some foreign websites.

Virtual private networks work by establishing an encrypted pipe between a computer or smartphone and a server in a foreign country. All communications are sent inside the pipe, effectively shielding Internet traffic from government filters that determine whether a site can be accessed. VPNs are used by Chinese citizens to get to external news sources and by resident foreigners and businesses for day-to-day communications.
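To make the “encrypted pipe” description concrete, here is a minimal Python sketch of the idea. The server name is a placeholder, and a real VPN tunnels whole IP packets using dedicated protocols rather than a single TLS socket; this is an illustration of the concept, not a working VPN client.

```python
import socket
import ssl

VPN_SERVER = "vpn.example.com"   # hypothetical relay in another country
PORT = 443

# Wrap an ordinary TCP connection in TLS: on-path filters between the
# client and the relay see only ciphertext, not the real destination.
context = ssl.create_default_context()
with socket.create_connection((VPN_SERVER, PORT)) as raw:
    with context.wrap_socket(raw, server_hostname=VPN_SERVER) as pipe:
        # Everything written here crosses the wire encrypted; the relay
        # decrypts it and forwards the request to the actual website.
        pipe.sendall(b"GET http://news.example.org/ HTTP/1.1\r\n"
                     b"Host: news.example.org\r\n\r\n")
        print(pipe.recv(4096))
```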

StrongVPN, a commercial provider that operates a network of servers around the world, said users in China had recently begun experiencing connection problems to some of its sites. Comments alongside a company blog post indicate the list of sites affected is changing and sites that might work one day are failing the following day.

Another VPN provider, Golden Frog, told customers they might have more success connecting to services in Hong Kong or The Netherlands than those in the United States or Australia.

The Chinese government appears to be using two techniques to disrupt service, said Andrew Staples, a spokesman for Golden Frog. One, deep packet inspection, examines the data in Internet packets to try to determine if it’s a VPN connection. The other, IP blocking, shuts off traffic destined for the Internet addresses used by VPN servers.
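As a rough illustration of those two techniques, the filter sketched below combines an address blocklist with a naive payload check. The addresses and the single OpenVPN-style fingerprint byte are assumptions made for the example; production-grade deep packet inspection is far more elaborate.

```python
# Illustrative only: example addresses and one handshake byte stand in
# for real blocklists and DPI signature databases.
BLOCKED_IPS = {"203.0.113.7", "198.51.100.22"}   # hypothetical VPN servers

# 0x38 is the first byte of a common OpenVPN UDP handshake
# (opcode 7, key id 0) -- one example of a protocol fingerprint.
VPN_FINGERPRINTS = [b"\x38"]

def should_drop(dst_ip: str, payload: bytes) -> bool:
    """Return True if the filter would drop this packet."""
    if dst_ip in BLOCKED_IPS:                    # IP blocking
        return True
    # Naive deep packet inspection: match known handshake bytes.
    return any(payload.startswith(sig) for sig in VPN_FINGERPRINTS)

print(should_drop("203.0.113.7", b""))           # True: address blocklisted
print(should_drop("192.0.2.1", b"\x38rest"))     # True: looks like a VPN handshake
```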

AMD’s Carrizo Coming In The Second Quarter

January 26, 2015 by Michael  
Filed under Computing

AMD released its earnings today, and one interesting question came up about the upcoming Carrizo mobile APU.

Lisa Su, the new AMD President and CEO, told MKM Partners analyst Ian Ing that Carrizo is coming in Q2 2015.

This is great news, and AMD’s Senior VP and outgoing general manager of the computing and graphics group, John Byrne, has already shared a few details and his excitement about Carrizo.

There are two Carrizo parts: one for big notebooks and all-in-ones called Carrizo, and a scaled-down version called Carrizo-L. We expect the slower Carrizo-L to arrive first, but Lisa was not specific. Carrizo-L, based on Puma+ CPU cores with AMD Radeon R-Series GCN graphics, is intended for mainstream configurations, with Carrizo targeting higher-performance notebooks.

Usually when a company says that something is coming in Q2 2015, that points to a Computex launch, and this Taipei-based trade show starts on June 2, 2015. We strongly believe that the first Carrizo products will be showcased at or around this date.

Lisa also pointed out that AMD has “significantly improved performance in battery life in Carrizo.” This is definitely good news, as this was one of the main issues with AMD APUs in the notebook space.

Lisa also said that AMD expects Carrizo to be beneficial for embedded and other businesses as well. If only it could have come a bit earlier; let’s hope AMD can get enough significant design wins with Carrizo. AMD has a lot of work to do to get its products to market faster, to catch up with Intel on power and performance, or simply to come up with innovative devices that will define its future. This is what we think Lisa is there for, but in chip design these things simply take time.

Courtesy-Fud

Do Game Developers Have Unrealistic Expectations?

January 22, 2015 by Michael  
Filed under Gaming

Over the last few years, the industry has seen budget polarization on an enormous scale. The cost of AAA development has ballooned, and continues to do so, pricing out all but the biggest warchests, while the indie and mobile explosions are rapidly approaching the point of inevitable over-saturation and consequential contraction. Stories about the plight of mid-tier studios are ten-a-penny, with the gravestones of some notable players lining the way.

For a company like Ninja Theory, in many ways the archetypal mid-tier developer, survival has been a paramount concern. Pumping out great games (Ninja Theory has a collective Metacritic average of 75) isn’t always enough. Revitalizing a popular IP like DMC isn’t always enough. Working on lucrative and successful external IP like Disney Infinity isn’t always enough. When the fence between indie and blockbuster gets thinner and thinner, it becomes ever harder to balance upon.

Last year, Ninja Theory took one more shot at the upper echelons. For months the studio had worked on a big budget concept which would sit comfortably alongside the top-level, cross-platform releases of the age: a massive, multiplayer sci-fi title that would take thousands of combined, collaborative hours to exhaust. Procedurally generated missions and an extensive DLC structure would ensure longevity and engagement. Concept art and pre-vis trailers in place, the team went looking for funding. Razor was on its way.

Except the game never quite made it. Funding failed to materialize, and no publisher would take the project on. It didn’t help that the search for a publishing deal arrived almost simultaneously with the public announcement of Destiny. Facing an impossible task, the team abandoned the project and moved on with other ideas. Razor joined a surprisingly large pile of games that never make it past the concept stage.

Sadly, it’s not a new story. In fact, at the time, it wasn’t even a news story. But this time Ninja Theory’s reaction was different. This was a learning experience, and learning experiences should be shared. Team lead and co-founder Tameem Antoniades turned the disappointment not just into a lesson, but a new company ethos: involve your audience at an early stage, retain control, fund yourself, aim high, and don’t compromise. The concept of the Independent AAA Proposition, enshrined in a GDC presentation given by Antoniades, was born.

Now the team has a new flagship prospect, cemented in this fresh foundation. In keeping with the theme of open development and transparency, Hellblade is being created with the doors to its development held wide open, with community and industry alike invited to bear witness to the minutiae of the process. Hellblade will be a cross-platform game with all of the ambition for which Ninja Theory is known, and yet it is coming from an entirely independent standpoint. Self-published and self-governed, Hellblade is the blueprint for Ninja Theory’s future.

“We found ourselves as being one of those studios that’s in the ‘squeezed middle’,” project lead Dominic Matthews says. “We’re about 100 people, so we kind of fall into that space where we could try to really diversify and work on loads of smaller projects, but indie studios really have an advantage over us, because they can do things with far lower overheads. We have been faced with this choice of, do we go really, really big with our games and become the studio that is 300 people or even higher than that, and try to tick all of these boxes that the blockbuster AAA games need now.

“We don’t really want to do that. We tried to do that. When we pitched Razor, which we pitched to big studios, that ultimately didn’t go anywhere. That was going to be a huge game; a huge game with a service that would go on for years and would be a huge, multiplayer experience. Although I’m sure it would have been really cool to make that, it kind of showed to us that we’re not right to try to make those kinds of games. Games like Enslaved – trying to get a game like that signed now would be impossible. The way that it was signed, there would be too much pressure for it to be…to have the whole feature set that justifies a $60 price-tag.

“That $60 price-tag means games have to add multiplayer, and 40 hours of gameplay minimum, and a set of characters that appeal to as many people as they possibly can. There’s nothing wrong with games that do that. There’s some fantastic games that do, AAA games. Though we do think that there’s another space that sits in-between. I think a lot of indie games are super, super creative, but they can be heavily stylised. They work within the context of the resources that people have.

“We want to create a game that’s like Enslaved, or like DMC, or like Heavenly Sword. That kind of third-person, really high quality action game, but make it work in an independent model.”

Cutting out the middle-man is a key part of the strategy. But if dealing with the multinational machinery of ‘big pubs’ is what drove Ninja Theory to make such widespread changes, there must surely have been some particularly heinous deals that pushed it over the edge?

“I think it’s just a reality of the way that those publisher/developer deals work,” Matthews says. “In order for a publisher to take a gamble on your game and on your idea, you have to give up a lot. That includes the IP rights. It’s just the realities of how things work in that space. For us, I think any developer would say the same thing, being able to retain your IP is a really important thing. So far, we haven’t been able to do that.

“With Hellblade, it’s really nice that we can be comfortable in the fact that we’re not trying to appeal to everyone. We’re not trying to hit unrealistic forecasts. Ultimately, I think a lot of games have unrealistic forecasts. Everyone knows that they’re unrealistic, but they have to have these unrealistic forecasts to justify the investment that’s going into development.

“Ultimately, a lot of games, on paper, fail because they don’t hit those forecasts. Then the studios and the people that made those games, they don’t get the chance to make any more. It’s an incredibly tough market. Yes, we’ve enjoyed working with our publishers, but that’s not to say that the agreements we developed are all ideal, because they’re not. The catalyst to us now being able to do this is really digital distribution. We can break away from that retail $60 model, where every single game has to be priced that way, regardless of what it is.”

Matthews believes that publishers, driven into funding only games that will comfortably shift five or six million units, have no choice but to stick to the safe bets, a path that eventually winnows diversity down to the point of stagnation, where only a few successful genres ever end up getting made: FPS, sports, RPG, maybe racing. Those genres become less and less distinct, while simultaneously shoe-horning in mechanics that prove popular elsewhere and shunning true innovation.

While perhaps briefly sustainable, Matthews sees that as a creative cul-de-sac. Customers, he feels, are too smart to put up with it.

“I think consumers are going to get a bit wary. Get a bit wary of games that have hundreds of millions of dollars spent on them. I think gamers are going to start saying, ‘For what?’

“The pressures are for games to appeal to more and more people. It used to be if you sold a million units, then that was OK. Then it was three million units. Now it’s five million units. Five million units is crazy. We’ve never sold five million units.”

It’s not just consumers who are getting wise, though. Matthews acknowledges that the publishers also see the dead-end approaching.

“I think something has to be said for the platform holders now. Along with digital distribution, the fact that the platform holders are really opening their doors and encouraging self-publishing and helping independent developers to take on some of those publishing responsibilities has changed things for us. I think it will change things for a lot of other developers.

“Hellblade was announced at the Gamescom PlayStation 4 press conference. My perception of that press conference was that the real big hitters in that were all independent titles. It’s great that the platform holders have recognised that. There’s a real appetite from their players for innovative, creative games.

“It’s a great opportunity for us to try to do things differently. Like on Hellblade, we’re questioning everything that we do. Not just on development, but also how we do things from a business perspective as well. Normally you would say, ‘Well, you involve these types of agencies, get these people involved in this, and a website will take this long to create.’ The next thing that we’re doing is, we’re saying, ‘Well, is that true? Can we try and do these things a different way,’ because you can.

“There’s definitely pressure for us to fill all those gaps left by a publisher, but it’s a great challenge for us to step up to. Ultimately, we have to transition into a publisher. That’s going to happen at some point, if we want to publish our own games.”

Courtesy-GI.biz

AMD Headed To The Facial Recognition Space

January 19, 2015 by Michael  
Filed under Computing

AMD has developed facial recognition technology to enable users to organize and search video clips based on the people featured in them.

AMD executive Richard Gayle demonstrated to Tom’s Guide how AMD Content Manager uses facial recognition to browse through a group of local videos to find specific faces.

There is an index that displays the faces of the people detected throughout the video clips.

The user can edit the names of the people as well as add keyword tags to help improve future searches for specific people.

For instance, if you are searching for videos that feature one person, you can click on his or her respective face to pull up the corresponding videos.

Additionally, if you want to narrow a search to a specific person combined with a keyword tag, you can drag the face icon and click on the desired keyword.

Once you click on the video you wish to view, a player appears in the right windowpane, along with a timeline displayed at the bottom with a list of all the people who appear in the video.

The timeline is separated into various coloured boxes to mark the exact moment in the video when each person first appears on screen, so you do not have to watch the entire video to see the bit you want.

The application also allows users to do some basic editing, such as compiling a single montage video of any individual or individuals.
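To make the indexing idea concrete, here is a hypothetical sketch in Python using the open-source face_recognition and OpenCV libraries rather than AMD’s Content Manager (whose code is not public). It samples frames from a video, stores face encodings with timestamps, and then looks up where a given person appears, much like the timeline described above.

```python
import cv2                  # OpenCV, for reading video frames
import face_recognition    # open-source face detection/encoding library

def index_video(path, every_n_frames=30):
    """Map detected face encodings to the timestamps where they appear."""
    capture = cv2.VideoCapture(path)
    fps = capture.get(cv2.CAP_PROP_FPS) or 30
    index, frame_no = [], 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if frame_no % every_n_frames == 0:       # sample, don't scan every frame
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            for encoding in face_recognition.face_encodings(rgb):
                index.append((encoding, frame_no / fps))
        frame_no += 1
    capture.release()
    return index

def find_person(index, known_encoding, tolerance=0.6):
    """Return timestamps where the known face appears, for a timeline view."""
    return [t for enc, t in index
            if face_recognition.compare_faces([known_encoding], enc,
                                              tolerance=tolerance)[0]]
```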

While this is pretty good technology, it probably does not have any major use yet on its own.

Gayle said it is unlikely that AMD will release Content Manager in its current form but will license it to OEMs that are able to rebrand the application before offering it on their respective systems.

He claimed that only AMD processors have sufficient power to operate the application, because of the processor’s ability to have the CPU, GPU and memory controller work closely together.

Courtesy-Fud

AMD’s Fiji GPU Goes High Bandwidth

January 16, 2015 by Michael  
Filed under Computing

New evidence from two LinkedIn profiles of AMD employees suggests that AMD’s upcoming Radeon R9 380X graphics card, which is expected to be based on the Fiji GPU, will actually use High-Bandwidth Memory.

Spotted by a member of the 3DCenter forums, the two LinkedIn profiles both mention the R9 380X by name and describe it as the world’s first 300W 2.5D discrete GPU SoC using stacked-die High-Bandwidth Memory and a silicon interposer. While the source of the leak is unusual, LinkedIn profiles are rather more reliable than mere rumors.

The first is the profile of Ilana Shternshain, an ASIC physical design engineer who has worked on the PlayStation 4 SoC, the Radeon R9 290X and the R9 380X, the last of which is described as the “largest in ‘King of the hill’ line of products.”

The second LinkedIn profile belongs to AMD system architect manager Linglan Zhang, who was involved in developing “the world’s first 300W 2.5D discrete GPU SOC using stacked die High Bandwidth Memory and silicon interposer.”

Earlier rumors suggest that AMD might launch the new graphics cards early this year as the company is under heavy pressure from Nvidia’s recently released, as well as the upcoming, Maxwell-based graphics cards.

Courtesy-Fud

Will 20nm GPUs Ever Make It To Market?

January 14, 2015 by Michael  
Filed under Computing

We want to make sure that you realize that 20nm GPUs won’t be coming at all. Despite the fact that Nvidia, Qualcomm, Samsung and Apple are doing 20nm SoCs, there won’t be any 20nm GPUs.

From what we know, AMD and Nvidia won’t ever release 20nm GPUs, as yields are so bad that manufacturing them would not make any sense. It is simply not economically viable to replace 28nm production with 20nm.

This means the real next big thing will come with 16nm and 14nm FinFET from TSMC and GlobalFoundries/Samsung respectively. In the meantime, we know that AMD is working on Caribbean Islands and Fiji, while Nvidia has been working on its new chip too.

This doesn’t mean that you cannot pull off a small miracle at 28nm. Nvidia did just that back in September 2014 with Maxwell, proving that you can make a big difference through optimization on the same manufacturing process when a new node is not an option.

Despite the lack of 20nm chips, we still think the next-generation Nvidia and AMD chips will bring some innovations and make you want to upgrade, whether to play the latest games on FreeSync or G-Sync monitors or at 4K/UHD resolutions.

Courtesy-Fud

Mozilla, Yahoo Partnership Has Been A Boon For Firefox

January 12, 2015 by mphillips  
Filed under Around The Net

Mozilla’s partnership with Yahoo has quadrupled the search provider’s usage by those running Firefox in the U.S., but the browser’s users still prefer Google, according to data from an Irish analytics company.

Data provided to Computerworld by StatCounter showed that in the U.S., Yahoo’s search engine referred more than four times as many page views to Firefox 34 as to the browser’s predecessor, Firefox 33.

Mozilla changed the default search from Google to Yahoo when it released Firefox 34 on Dec. 1. Firefox 33, which a small percentage of users continue to run, uses Google as its default search provider.

StatCounter’s numbers, described as usage share, are based on the number of page views each browser accumulates on the three million sites that deploy the firm’s analytics package, so they are more an indication of activity than a user tally. The company counts page referrals from search providers, not search queries.

As of Jan. 6, Yahoo’s search usage share on Firefox 34 was 32.2%, or more than four times the 7.5% that Yahoo had on Firefox 33 on the same day.

The Yahoo increase in Firefox 34 came at the expense of Google, which had a 60.8% share in that version, significantly lower than the 86.1% in Firefox 33. Meanwhile, Microsoft’s Bing search engine, at 5.5% in Firefox 34, was only slightly up from the 5.4% in Firefox 33.

On Jan. 6, StatCounter’s search provider usage shares for all browsers in the U.S. were 75.3% for Google, 12.4% for Bing and 10.5% for Yahoo. In other words, Firefox 34 users were more than three times likelier to reach a destination page from a Yahoo search than the U.S. average because of the new default.
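As a quick sanity check, the ratios quoted above follow directly from StatCounter’s shares; a back-of-the-envelope sketch using only the numbers in this article:

```python
# Yahoo's usage share on Firefox 34 vs. Firefox 33 (Jan. 6 figures)
yahoo_ff34, yahoo_ff33 = 32.2, 7.5
print(yahoo_ff34 / yahoo_ff33)   # ~4.3 -> "more than four times"

# Yahoo's share across all U.S. browsers, for the "likelier" comparison
yahoo_all = 10.5
print(yahoo_ff34 / yahoo_all)    # ~3.1 -> "more than three times likelier"
```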

Were PS4 Sales Flat Over The Holiday?

January 7, 2015 by Michael  
Filed under Gaming

While the Sony PlayStation 4 has been selling very well, it seems that Christmas was not really its season.

Sony said that the PlayStation 4 has sold more than 18.5 million units since the new generation of consoles launched. While that is good, and makes the PS4 the fastest-selling PlayStation to date, there was no Christmas peak.

You would think that the PS4 would sell well at Christmas as parents were forced to do grievous bodily harm to their credit cards to shut their spoilt spawn up during the school holidays. But apparently not.

Apparently, the weapon of choice against precious snowflakes being bored was an Xbox One which saw a Christmas spike in sales.

Sony said that its new numbers are pretty much on target, with sales running at the expected rate of 2 million units per month.

Redmond will be happy with that spike, even if the Xbox One still has a long way to go before it matches the PlayStation 4 on sales.

Courtesy-Fud

What Is Going On With Jupiter’s Moon Europa?

December 30, 2014 by Michael  
Filed under Around The Net

The huge geysers on Jupiter’s icy moon Europa have gone underground.

Late last year, scientists announced that NASA’s Hubble Space Telescope had detected plumes of water vapor spewing about 120 miles (200 kilometers) into space from Europa’s south pole in December 2012. The news was met with a great deal of excitement, as it suggested that a robotic probe may be able to sample Europa’s possibly life-supporting subsurface ocean without touching down.

The researchers have trained Hubble on Europa repeatedly since then, trying to confirm and characterize the plumes during observations in January, February, November and December of this year. But they’ve come up empty.

“We have not yet found any signals of water vapor in the new images so far,” team member Lorenz Roth, of the Southwest Research Institute in San Antonio, said Dec. 19 during a talk here at the annual fall meeting of the American Geophysical Union (AGU).

Other research teams have also failed to confirm the plumes. For example, a recent re-analysis of images gathered by NASA’s Galileo probe, which studied the Jupiter system up close from 1995 through 2003, turned up no evidence of their existence, said Cynthia Phillips of the SETI (Search for Extraterrestrial Intelligence) Institute in Mountain View, California.

Europa’s plumes are thus unlikely to resemble the famous powerhouse geysers that erupt continuously from the south pole of Saturn’s icy moon Enceladus, which also harbors an ocean of liquid water beneath its icy shell, she said. (The two moons’ oceans are kept liquid by heat-generating tidal forces, the same mechanism thought to power the geysers.)

“I find it hard to believe that if a plume that was similar to the plumes we see on Enceladus had been going off on Europa during the Galileo era — I find it really unlikely that we would have missed it,” Phillips said during her talk at AGU on Dec. 19. “I think we would have seen that thing.”

Further, researchers announced on Dec. 18 at AGU that NASA’s Cassini spacecraft, which flew by Jupiter in 2001 on its way to Saturn, also didn’t see any plume activity at Europa at the time.

“We found no evidence for water near Europa, even though we have readily detected it as it erupts in the plumes of Enceladus,” Larry Esposito of the University of Colorado at Boulder, team lead for Cassini’s ultraviolet imaging spectrograph instrument (UVIS), said in a NASA statement.

UVIS measurements also suggest that most of the hot gas surrounding the satellite originates from the neighboring volcanic moon Io, not Europa, and that Europa’s wispy atmosphere is 100 times less dense than thought, the study found.

However, none of this necessarily means that Europa’s geysers don’t exist.

“It is certainly still possible that plume activity occurs, but that it is infrequent or the plumes are smaller than we see at Enceladus,” Cassini UVIS team member and study co-author Amanda Hendrix, of the Planetary Science Institute in Tucson, Arizona, said in the NASA statement. “If eruptive activity was occurring at the time of Cassini’s flyby, it was at a level too low to be detectable by UVIS.”

Indeed, the plume’s discoverers had no expectation of constant and intense activity; Hubble observations in October 1999 and November 2012 did not detect any geysers, Roth said.

“It was clear from the beginning that this is a transient phenomenon,” he said.

Roth counseled patience, describing the Hubble plume hunt as a work in progress. (The current search campaign should continue through April 2015.) Phillips voiced similar sentiments, saying that weak and/or intermittent plumes could have gone undetected by Galileo.

“The end result here is, stay tuned,” Phillips said.

Courtesy-Space

Are Indie Developers Dying Out?

December 22, 2014 by Michael  
Filed under Gaming

For independent developers, the last decade has been an endless procession of migratory possibilities. The physical world was defined by compromise, dependence and strategically closed doors, but the rise of digital afforded freedom and flexibility in every direction. New platforms, new business models, new methods of distribution and communication; so many fresh options appeared in such a brief window of time that knowing where and when to place your bet was almost as important as having the best product. For a few years, right around 2008, there was promise almost everywhere you looked.

That has changed. No matter how pregnant with potential they once seemed, virtually every marketplace has proved unable to support the spiralling number of new releases. If the digital world is one with infinite shelf-space for games, it has offered no easy solutions on how to make them visible. Facebook, Android, iOS, Xbox Live Arcade, the PlayStation Network; all have proved to be less democratic than they first appeared, their inevitable flaws exposed as the weight of choice became heavier and heavier. As Spil Games’ Eric Goossens explained to me at the very start of 2014: “It just doesn’t pay the bills any more.”

Of course, Goossens was talking specifically about indie development of casual games. And at that point, with 2013 only just receding from view, I would probably have named one exception to the trend, one place where the balance between volume and visibility gave indies the chance to do unique and personal work and still make a decent living. That place would have been Steam, and if I was correct in my assessment for even one second, it wasn’t too long before the harsher reality became clear.

After less than five months of 2014 had passed, Valve’s platform had already added more new games than in the whole of the previous year. Initiatives like Greenlight and Early Access were designed to make Steam a more open and accessible platform, but they were so effective that some of what made it such a positive force for indies was lost in the process. Steam’s culture of deep-discounting has become more pervasive and intense in the face of this chronic overcrowding, stirring up impassioned debate over what some believe will be profound long-term effects for the perceived value of PC games. Every discussion needs balance, but in this case the back-and-forth seemed purely academic: for a lot of developers steep discounts are simply a matter of survival, and precious few could even entertain the notion of focusing on the greater good instead.

And the indie pinch was felt beyond Steam’s deliberately weakened walls. Kickstarter may be a relatively new phenomenon – even for the hyper-evolving landscape of the games industry – but it faced similar problems in 2014, blighted by the twin spectres of too much content and not enough money to go around. Anecdotally, the notion that something had changed was lurking in the background at the very start of the year, with several notable figures struggling to find enough backers within the crowd. The latter months of 2014 threw up a few more examples, but they also brought something close to hard evidence that ‘peak Kickstarter’ may already be behind us – fewer successful projects, lower funding targets, and less money flowing through the system in general. None of which was helped by a handful of disappointing failures, each one a blow for the public’s already flagging interest in crowdfunding. Yet another promising road for indies had become more treacherous and uncertain.

So are indies heading towards a “mass extinction event”? Overcrowding is certainly a key aspect of the overall picture, but the act of making and releasing a game is only getting easier, and the allure of development as a career choice seems to grow with each passing month. It stands to reason that there will continue to be a huge number of games jostling for position on every single platform – more than even a growing market can sustain – but there’s only so much to be gained from griping about the few remaining gatekeepers. If the days when simply being on Steam or Kickstarter made a commercial difference are gone, and if existing discovery tools still lack the nuance to deal with all of that choice, then it just shifts the focus back to where it really belongs: talent, originality, and a product worth an investment of time and money.

At GDC Europe this summer, I was involved in a private meeting with a group of Dutch independent game developers, all sharing knowledge and perspective on how to find success. We finished that hour agreeing on much the same thing. There are few guarantees in this or any other business, but the conditions have also never been more appropriate for personality and individuality to be the smartest commercial strategy. The world has a preponderance of puzzle-platformers, but there’s only one Monument Valley. We’re drowning in games about combat, but This War of Mine took a small step to the left and was greeted with every kind of success. Hell, Lucas Pope made an entire game about working as a border control officer and walked away with not just a hit, but a mantelpiece teeming with the highest honours.

No matter how crowded the market has become, strong ideas executed with care are still able to rise above the clamour, no huge marketing spend required. As long as that’s still possible, indies have all of the control they need.

Courtesy-GI.biz

Microsoft Opens Up Halo

December 16, 2014 by Michael  
Filed under Gaming

Project Orleans, the cloud engine that powers Xbox hits Halo: Reach and Halo 4, is going open source.

The engine, which has also played a vital role in the development of Microsoft’s Azure cloud computing platform, will be released under an MIT licence next year by Microsoft Technologies after being trailed at this year’s Microsoft Build Conference.

This is the latest in a long line of open-source announcements by Microsoft this year as the company tries to reinvent itself for the age where its stranglehold on the market has reduced and a wide variety of non-proprietary alternatives exist.

At the same Build conference, the company also announced that it will open source the .NET framework, on which most Windows applications depend.

The project, as described by the team itself, is “an implementation of an improved actor model that borrows heavily from Erlang and distributed objects systems, adds static typing, message indirection and actor virtualisation, exposing them in an integrated programming model”.

The team added that, whereas Erlang is a pure functional language with its own custom virtual machine, the Orleans programming model “directly leverages .NET and its object-oriented capabilities”.

One example available to try is an analysis of Twitter sentiment gauging reaction to a given hash-tag based on the language around it and creating visual representations of the mood of the web.
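Orleans itself exposes this model in .NET rather than Python, but the core idea translates. Below is a minimal, hypothetical sketch (not the Orleans API) of virtual actors applied to that hashtag-sentiment example: each hashtag gets an actor that is activated on first use, addressed by identity, and processes messages one at a time.

```python
import asyncio

class HashtagActor:
    """One actor per hashtag; messages are handled one at a time, so the
    actor's state needs no locks -- the property Orleans-style actors rely on."""
    def __init__(self, tag):
        self.tag = tag
        self.score = 0
        self.mailbox = asyncio.Queue()

    async def run(self):
        while True:
            delta = await self.mailbox.get()
            self.score += delta          # e.g. +1 positive, -1 negative mention

class Runtime:
    """'Virtual' actors: activated automatically on first message."""
    def __init__(self):
        self.actors = {}

    def get(self, tag):
        if tag not in self.actors:
            actor = HashtagActor(tag)
            asyncio.ensure_future(actor.run())   # activate on first use
            self.actors[tag] = actor
        return self.actors[tag]

async def main():
    runtime = Runtime()
    for tag, delta in [("#halo", 1), ("#halo", 1), ("#azure", -1)]:
        await runtime.get(tag).mailbox.put(delta)
    await asyncio.sleep(0.1)             # let the actors drain their mailboxes
    for tag, actor in runtime.actors.items():
        print(tag, actor.score)          # #halo 2, #azure -1

asyncio.run(main())
```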

The code will be available as an extension to Microsoft Visual Studio 12 or 13, with samples and supporting documentation already available, including for the Azure implementations. Non-Azure users can grab a free trial version before they buy.

Courtesy-TheInq

Samsung Finally Starts 14nm FinFET

December 15, 2014 by Michael  
Filed under Computing

A company insider has spilled the beans in Korea, claiming that Samsung has started Apple A9 production in 14nm FinFET.

The A9 is the next-generation SoC for Apple’s iPhone and iPad products, and it is manufactured on the Samsung-GlobalFoundries 14nm FinFET process. In other news, Samsung’s Kim Ki-nam, president of the company’s semiconductor business and head of its System LSI business, has confirmed that the company has started production of 14-nanometre FinFET chips.

The report mentions Austin as a possible site for Apple products, but we wonder whether GlobalFoundries’ Fab 8 in New York State could become one of the partners for 14nm FinFET manufacturing. Samsung didn’t officially reveal the client for the 14nm FinFET process, but Apple is the most obvious candidate. We also expect to see 14/16nm FinFET graphics chips from AMD and Nvidia, but most likely in the latter half of 2015 at best.

Qualcomm is likely to announce a new LTE modem based on 14nm FinFET, while its flagship SoC, the Snapdragon 810, is a 20nm chip. Qualcomm is manufacturing 810 chips as we speak to meet demand for the flagship Android phones coming in Q1 2015. Flagship Samsung, HTC and LG phones, among others, are likely to use the Snapdragon 810 as a replacement for this year’s Snapdragon 801, a high-end chip that ended up in millions of high-end phones.

The Samsung/GlobalFoundries 14nm FinFET process is 15 percent smaller, 20 percent faster and 35 percent more power efficient than the 20nm process. This definitely sounds exciting: it will bring more performance to phones, tablets and GPUs, and will significantly decrease power consumption. The move from 28nm is long overdue.

We believe that Qualcomm’s LTE modem might be the first chip to officially come with this manufacturing process and Apple will probably take most of the 14nm production for an update in its tablets and phones scheduled for 2015.

Courtesy-Fud

Samsung To Jump In The GPU Arena Next Year

December 8, 2014 by Michael  
Filed under Computing

Samsung is having another crack at building a GPU.

This is not the company’s first attempt at making a GPU, and this time it is meant to be used with its SoCs rather than in graphics cards. Samsung announced last year that it wants to base its System on Chips on an in-house 64-bit architecture, but we have yet to see one eventuate.

Samsung has been trying for years to make a GPU and enter the already crowded GPU IP market. Qualcomm uses Adreno, while Nvidia uses GeForce and wants to license it to others. Apple uses PowerVR, while MediaTek uses ARM-owned Mali graphics for its newer processors and PowerVR for some older parts. Intel uses the PowerVR G6430 for mobile processors such as the Atom Z3580 Moorefield, while AMD has its own graphics that it can use for future SoCs and APUs. Intel also has its own HD Graphics, which dominates the integrated graphics market, especially in notebooks.

Samsung currently uses Mali graphics, but this might change. If its team is successful, the company might come up with its own graphics and put them under the bonnet of its own Exynos processors by next summer.

All of a sudden, Nvidia’s lawsuit against Samsung makes more sense.

Samsung is trying to get into Nvidia’s space, and the company doesn’t like it. Even if Samsung manages to make a successful GPU, the competition is tough. Even after years of trying, Samsung mostly uses Exynos in its own tablets and some phones. Most high-end Samsung phones use Qualcomm Snapdragons, as these tend to have better LTE modems and are widely available.

According to Korean ZDNet, the company might talk about the GPU as early as February at the International Solid-State Circuits Conference (ISSCC), with an official announcement scheduled for summer 2015.

Courtesy-Fud

Media Company Intends To Test Drones For Gathering News

December 5, 2014 by mphillips  
Filed under Around The Net

A U.S. media company said it intends to be among the first broadcasters to roll out news-gathering drones once the Federal Aviation Administration (FAA) issues new policies governing the unmanned remote-control aircraft, with test flights beginning in Oregon.

Alpha Media, which owns close to 70 radio stations in U.S. media markets, intends to begin testing drones to gather video footage on highway traffic and concerts for its Portland radio station KXL-FM 101, once those new rules are in effect, executive vice president Scott Mahalick said.

“We’ve entered into an agreement with a drone manufacturer, and we’ll be flying them in Portland, subject to the FAA’s new guidelines,” Mahalick said, adding the company would like to expand drone use beyond Portland.

The FAA currently bans most commercial drone flights, but is required by Congress to integrate drones into the U.S. airspace in coming years. In September, it loosened restrictions, granting exemptions to a group of television and movie production companies.

Another 159 companies have applied for commercial drone authorization, largely for non-newsgathering purposes, though the FAA can’t estimate how long it will take to review these applications, said agency spokeswoman Alison Duquette.

Other media outlets have shown interest in drone use and regulation. CNN reported over the summer that it and a university would jointly study safe and effective drone operation.

Duquette said federal aviation officials were working to draft rules that would allow broader commercial use of drones weighing under 55 pounds (25 kg), eliminating the need for FAA approval, with the changes being implemented at some point in 2015.

Mahalick said Alpha Media may ultimately seek FAA permission for larger drones as well.