
AMD’s Carrizo Goes Mobile Only

July 30, 2014 by Michael  
Filed under Computing

AMD’s upcoming Carrizo APU might not make it to the desktop market at all.

According to Italian tech site bitsandchips.it, citing industry sources, AMD plans to limit Carrizo to mobile parts. Furthermore, the source claims Carrizo will not support DDR4 memory. We cannot confirm or deny the report at this time.

If the rumours turn out to be true, AMD will not have a new desktop platform next year. Bear in mind that Intel is doing the exact same thing by bringing 14nm silicon to mobile rather than desktop. AMD’s roadmap previously pointed to a desktop Carrizo launch in 2015.

AMD’s FM2+ socket and Kaveri derivatives would have to hold the line until 2016. The same goes for the AM3+ platform, which should also last until 2016.

Not much is known about Carrizo at the moment, hence we are not in a position to say much about the latest rumours. AMD’s first 20nm APU will be Nolan, but Carrizo will be the first 20nm big core. AMD confirmed a number of delays in a roadmap leaked last August.

The company recently confirmed its first 20nm products are coming next year. In all likelihood AMD will be selling 32nm, 28nm and 20nm parts next year.

Courtesy-Fud

AMD Tumbles

July 21, 2014 by Michael  
Filed under Computing

AMD’s debt load is causing huge problems for the chipmaker — this quarter it had another substantial loss. The tame Apple Press has been claiming that AMD’s woes are caused by the fact it did not move to mobile as was directed by the prophet Steve Jobs. They claim, along with some of the dafter analysts, that mobile computing has replaced the PC and that companies that stuck to the “old technology” suffered.

However, that does not explain how Intel made a stonking profit mostly because of its PC chip sales while its mobile division bled cash. The insistence that mobile was a replacement technology, rather than a parallel development which would not have been noticed if the economy had not tanked, is evidence of how many analysts and hacks drank the Jobs Kool-Aid.

AMD’s problems are a lot more obvious. Each quarter it has to pay $49 million to service its huge debt pile. If it did not have to do this, the company would have reported a non-GAAP operating profit of $67 million. In fact, AMD’s revenue rose 24 percent to $1.44 billion in the second quarter. The company said its third-quarter revenue would rise 2 percent, plus or minus 3 percent, from the June quarter. That would be about $1.47 billion. Analysts on average had expected revenue of $1.44 billion in the second quarter and $1.57 billion in the third quarter.
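
As a quick sanity check on the guidance maths, the range works out like this (a minimal sketch using the article’s figures; the rounding is ours):

    # AMD Q2 2014 revenue and Q3 guidance, per the article
    q2_revenue = 1.44e9   # $1.44 billion reported
    growth = 0.02         # guidance: up 2 percent...
    spread = 0.03         # ...plus or minus 3 percent

    low = q2_revenue * (1 + growth - spread)    # down 1 percent
    mid = q2_revenue * (1 + growth)             # up 2 percent
    high = q2_revenue * (1 + growth + spread)   # up 5 percent

    print(f"Q3 range: ${low / 1e9:.2f}B to ${high / 1e9:.2f}B, midpoint ${mid / 1e9:.2f}B")
    # -> Q3 range: $1.43B to $1.51B, midpoint $1.47B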

AMD’s stock fell 15 percent in extended trade after the outfit said it had a net loss of $36 million in the June quarter, compared with a loss of $74 million a year earlier. AMD has been expanding into non-PC markets like game consoles and low-power servers and it aims to obtain half of its revenue from those additional businesses by the end of 2015. It is also doing well in professional graphics.

Revenue in the Computing Solutions Group dropped 20 percent from a year ago, to $669 million, as microprocessor unit shipments declined. But notebook processor sales rose, while AMD sold fewer desktop processors and chipsets. GPU revenue declined as well, partially offset by a rise in chips sold into graphics workstations and add-on cards.

Courtesy-Fud

MediaTek Shows Off 64-bit SoC LTE Chip

July 16, 2014 by Michael  
Filed under Computing

Mediatek has unveiled what it claims is the “world’s first” 64-bit octa-core LTE smartphone system on chip (SoC) with 2K display support.

Named the Mediatek MT6795, the chip is designed for use by high-end device makers with upcoming 64-bit Android mobile operating systems like the recently announced Android L, and it supports 2K displays up to 2560×1600 resolution.

The chip also features a clock speed of up to 2.2GHz along with CorePilot, Mediatek’s technology that aims to deliver higher performance per watt, increasing battery life on mobile devices without sacrificing performance while bringing the power of eight cores on board.

The SoC also provides 4G LTE support, Mediatek said, as well as dual-channel LPDDR3 clocked at 933MHz for “top-end memory bandwidth” in a smartphone.
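
Mediatek did not quote a bandwidth figure, but a rough peak number is easy to estimate (a back-of-envelope sketch assuming the usual 32-bit LPDDR3 channels and double data rate; the real figure depends on the actual bus width):

    # Rough peak bandwidth for dual-channel LPDDR3 at a 933MHz clock
    clock_hz = 933e6                 # 933MHz I/O clock, per the article
    transfers_per_s = clock_hz * 2   # DDR: two transfers per clock
    bytes_per_channel = 4            # assumed 32-bit (4-byte) channels
    channels = 2                     # dual-channel, per the article

    peak_bw = transfers_per_s * bytes_per_channel * channels
    print(f"peak bandwidth: {peak_bw / 1e9:.1f} GB/s")   # -> 14.9 GB/s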

Mediatek VP and GM for Europe Siegmund Redl told The INQUIRER in a media briefing that the announcement is in line with the industry’s growth in the smartphone arena.

“There has been a discussion about ‘how many cores do you really need’ and what is the benefit [of octa-core],” Redl said. “Quad-core is pretty much mainstream today and application developers are exploiting the fact they can do multithreading and pipelining and parallel computing with handheld devices.

“This will not change with octa-core. When we started to introduce the first octa-core we were showing off a game with very intense graphics and processing that needed the support of multiple cores and again this is the way the industry is going; you bring out the hardware and the software development follows that and takes advantage of it and the user experience is a smoother one.”
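
As a generic illustration of the multi-core fan-out Redl is describing (splitting work across however many cores a device exposes), here is a minimal sketch; it is our example, not Mediatek code:

    import os
    from concurrent.futures import ProcessPoolExecutor

    def render_tile(tile_id: int) -> int:
        # Stand-in for per-tile game work (physics, decoding, filtering)
        return sum(i * i for i in range(100_000 + tile_id))

    if __name__ == "__main__":
        cores = os.cpu_count() or 1   # 4 on a quad-core, 8 on an octa-core
        with ProcessPoolExecutor(max_workers=cores) as pool:
            results = list(pool.map(render_tile, range(32)))
        print(f"processed {len(results)} tiles across {cores} cores")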

The firm claims that the SoC features multimedia subsystems that support many technologies “never before possible or seen in a smartphone”, including support for 120Hz displays.

“With multimedia we raised the bar in terms of recording frames per second, such as slow motion replay with 480 frames per second, for much better user experience,” Redl added.
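
To put that capture rate in context, the slow-motion factor is simple arithmetic (assuming a standard 30fps playback rate, which the article does not specify):

    capture_fps = 480    # per the article
    playback_fps = 30    # assumed playback rate
    print(f"{capture_fps // playback_fps}x slow motion")   # -> 16x slow motion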

Multi-mode wireless charging is also supported by the SoC’s companion multi-mode wireless power receiver chip.

The Mediatek MT6795, dubbed the chip for “power users”, joins the firm’s MT6752 SoC for mainstream users and MT6732 SoC for entry-level users. It’s the 64-bit version of the 32-bit MT6595 SoC that was launched at Mobile World Congress earlier this year, which features four ARM Cortex A17 cores and four Cortex A7 cores as well as an Imagination Technologies PowerVR Series6 GPU for “high-performance graphics”.

Redl said that existing customers that use the MT6595 today for devices that will soon be hitting the market can reuse the designs they have for the older chip, as “they have a pin-compatible drop-in with a 64-bit architecture”.

Redl said Mediatek will make the MT6795 chip commercially available by the end of the year, for commercial devices coming in early January or February.

Courtesy-TheInq

Will EA Mimic Mobile Developers?

July 9, 2014 by Michael  
Filed under Gaming

Late last year, Frank Gibeau switched roles at Electronic Arts, moving from president of the PC and console-focused EA Labels to be the executive vice president of EA Mobile. Speaking with GamesIndustry International at E3 last month, Gibeau said he was enticed by the vast opportunity for growth in the mobile world, and the chance to shape the publisher’s efforts in the space.

“One of the things I enjoy doing is building new groups, new teams and taking on cool missions,” Gibeau said. “The idea was that EA is known as a console company, and for our PC business. We’re not particularly well known for our mobile efforts, and I thought it would be an awesome challenge to go in and marshal all the talent and assets of EA and, frankly, build a mobile game company.”

It might sound a little odd to hear Gibeau speaking of building a mobile game company at EA. After all, he described EA as “the king of the premium business model” in the mobile world not too long ago, when the company was topping charts with $7 apps like The Sims 3 or raking it in with paid offerings like Tetris, Monopoly, or Scrabble.

“Two years ago, we were number one on feature phones with the premium business model,” Gibeau said. “Smart devices come in, freemium comes in, and we’re rebuilding our business. I think we’ve successfully gotten back into position and we see a lot of opportunity to grow the business going forward, but if you had talked to me about two years ago and tried to speculate there would be a company called Supercell with that much share and that many games, we wouldn’t even have come close.”

Gibeau expects that pace of upheaval to continue in the mobile market, but some things seem set in stone. For example, Gibeau is so convinced that the days of premium apps are done that he has EA Mobile working exclusively on freemium these days.

“If you look at how Asia operates, premium just doesn’t exist as a business model for interactive games, whether it’s on PC or mobile devices. If you look at the opportunity set, if you’re thinking globally, you want to go freemium so you can capture the widest possible audience in Japan, Korea, China, and so on… With premium games, you just don’t get the downloads you do with a free game. It’s better to get as many people into your experience and trying it. If they connect with it, that’s great, then you can carry them for very long periods of time. With premium, given that there are so many free offerings out there, it’s very difficult to break through.”

Unfortunately for EA, its prior expertise is only so relevant in the new mobile marketplace. Its decades of work on PCs and consoles translated well to premium apps that didn’t require constant updating, but Gibeau said running live services is a very different task – one EA needs to get better at.

“Our challenge frankly is just mastering the freemium live service component of what’s happening in mobile,” Gibeau said. “That’s where we’re spending a lot of our time right now. We think we have the right IP. We have the right talent. We’ve got great production values. Our scores from users are pretty high. It’s really about being able to be as good as Supercell, King, Gungho, or some of these other companies at sustained live services for long periods of time. We have a couple games that are doing really well on that front, like The Simpsons, Sims Freeplay, and Real Racing, but in general I think that’s where we need to spend most of our time.”

As Gibeau mentioned, EA has already had some successes on that front, but its record isn’t exactly unblemished. The company launched a freemium reboot of Dungeon Keeper earlier this year and the game was heavily criticized for its aggressive monetization approach. In May, EA shuttered original developer Mythic.

“Dungeon Keeper suffered from a few things,” Gibeau said. “I don’t think we did a particularly good job marketing it or talking to fans about their expectations for what Dungeon Keeper was going to be or ultimately should be. Brands ultimately have a certain amount of permission that you can make changes to, and I think we might have innovated too much or tried some different things that people just weren’t ready for. Or, frankly, were not in tune with what the brand would have allowed us to do. We like the idea that you can bring back a brand at EA and express it in a new way. We’ve had some successes on that front, but in the case of Dungeon Keeper, that just didn’t connect with an audience for a variety of reasons.”

The Dungeon Keeper reboot wasn’t successful, but EA continues to keep the game up and running, having passed the live service responsibilities to another studio. It’s not because the company is hoping for a turnaround story so much as it’s just one more adaptation to running games with a live service model.

“If you watch some of the things we’ve been doing over the last eight or nine months, we’ve made a commitment to players,” Gibeau said. “We’re sincere and committed to that. So when you bring in a group of people to Dungeon Keeper and you serve them, create a live service, a relationship and a connection, you just can’t pull the rug out from under them. That’s just not fair. We can sustain the Dungeon Keeper business at its level for a very long time. We have a committed group of people who are playing the game and enjoying it. So our view is going to be that we’ll keep Dungeon Keeper going as long as there’s a committed and connected audience to that game. Are we going to sequel it? Probably not. [Laughs] But we don’t want to just shut stuff off and walk away. You can’t do that in a live service environment.”

Much like EA’s institutional experience, there’s only so much of Gibeau’s past in the console and PC core gaming world that is directly relevant to today’s mobile space. But as the segment grows out of what he calls the “two guys in a garage” stage, EA’s organizational expertise will be increasingly beneficial.

“These teams are starting to become fairly sizeable,” Gibeau said, “and the teams and investment going into these games is starting to become much greater. Now they’re much, much less than you see on the console side, but there’s a certain rigor and discipline in approach from a technology and talent standpoint that’s very applicable… If you look at these devices, they will refresh their hardware and their computing power multiple times before you see a PlayStation 5. And as you see that hardware get increasing power and capability on GPU and CPU levels, our technology that we set up for gen 4 will be very applicable there. We’re going to be building technologies like Frostbite that operate on mobile devices so we can create richer, more immersive experiences on mobile.”

Even if mobile blockbusters like Candy Crush Saga aren’t exactly pushing the hardware, Gibeau said there’s still a need for all that extra horsepower. With the increased capabilities of multitasking on phones, he sees plenty of room for improvement before the industry runs up against diminishing returns on the CPU and GPU front. He likens today’s mobile titles to late-generation PS2 games, with PS3 and Xbox 360-level games just around the corner.

“As it relates to games, this is like black and white movies with no sound at this point, in terms of the type of games we’ve created,” Gibeau said. “We’re just starting to break through on the really big ideas is my personal view. If you look at games like Clash of Clans, Real Racing, even Candy Crush, they’re breaking through in new ways and spawning all types of new products that are opening up creativity and opportunities here. So I think computing power is just something we’ll continue to leverage.”

The best part for Gibeau is that the hard work of convincing people to buy these more powerful devices isn’t falling solely on the shoulders of game developers.

“The beauty of it is it’s not a single-use device,” Gibeau said, “so people will be upgrading them for a better camera, better video capability, different form factor, different user inputs, as a wearable… I think there’s so much pressure from an innovation standpoint between Samsung, Apple, Google, and Windows coming in, that they’ll continue to one up each other and there will be a very vibrant refresh cycle for a very long period of time. The screens get better, the computing power gets better, and I don’t have to worry about just games doing it like we were in the console business. Those were pretty much just games consoles; these are multi-use devices. And the beauty of it is there will be lots of different types of applications coming in and pushing that upgrade path.”

Courtesy-GI.biz

Panasonic Goes Intel For SoC

July 9, 2014 by Michael  
Filed under Computing

Panasonic and Intel have just announced that Panasonic will start making its SoC chips using Intel’s 14nm process.

Panasonic is joining Altera, Achronix Semiconductor, Tabula, Netronome and Microsemi on an ever-growing list of Intel foundry clients. We expect the list to expand over time. There have been some rumours that Cisco is planning to make its chips at Intel, too.

Keep the fabs busy

In our recent conversations with a few industry insiders we learned that Intel wants to keep its fabs busy. This is rather obvious and makes perfect sense, as investing in a transition to a new manufacturing node costs a few billion dollars on a good day.

Intel has announced its Core M Broadwell processors, which are coming in the latter part of this year, and these will be just a fraction of what Intel plans to manufacture in its new 14nm fabs. Intel’s Airmont, Morganfield, Cherryview and Willowview Atoms, all 14nm designs, will also help keep the fabs busy.

Lower power with 14nm SoC

Panasonic is planning to make 14nm next-generation SoCs that will target audio-visual equipment markets and enable higher levels of performance, power efficiency and viewing experience for consumers.

The 14nm low-power process technology with second-generation Tri-Gate transistors will help Panasonic decrease the overall power consumption of its devices. We expect these SoCs to be used in future 4K TVs as well as set-top boxes and possibly for upscaling in Blu-ray players.
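
The physics behind that claim is the standard dynamic power relation for CMOS logic, P ≈ αCV²f: a smaller process lowers the switched capacitance C and usually allows a lower supply voltage V, which enters squared. A rough, generic illustration (our numbers, not Panasonic’s):

    # Relative dynamic power, P ~ C * V^2 * f, with activity and frequency fixed
    c_scale = 0.8    # assume ~20 percent less switched capacitance at 14nm
    v_scale = 0.9    # assume the supply voltage drops by ~10 percent

    p_scale = c_scale * v_scale ** 2
    print(f"dynamic power: {p_scale:.0%} of the previous node")   # -> 65%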

TSMC will start making 20nm chips later this year and Nvidia might be among the first clients to use it for its upcoming Maxwell 20nm GPUs. Other players will follow as Qualcomm has started making modems in 20nm and will soon move some of its SoC production to this new manufacturing node. Of course, AMD’s 20nm GPUs are in the works, too.

Intel’s 14nm is still significantly more power optimised than the 20nm process offered by TSMC and the Global Foundries – Samsung alliance, but Intel is probably not offering its services for pennies either.

Intel is known as a high-margin company and we don’t see this changing overnight. One of Intel’s biggest challenges in 2015 and beyond is to keep the fabs busy at all times. It will try to win more mobile phone business and it is really pushing to win its spot in the wearable technology market, as the phone market seems oversaturated.

Wearables offer a clean start for Intel, but first steps in any new market are usually hard. Android Wear ARM-based watches that just hit the market will complement or replace wearable wristbands like the Fitbit, which is based on an ARM Cortex-M3. Intel wants to make shirts with chips inside, and the more success it has in bringing cheap SoCs into our lives, the more chips it can sell. Panasonic will just help keep the fabs busy until Intel manages to fill them with its own chips.

Courtesy-Fud

ARM Launches Juno Hardware Development Program

July 7, 2014 by Michael  
Filed under Computing

ARM has announced two programs to assist Android’s ascent into the 64-bit architecture market.

The first of those is a Linaro port of the Android Open Source Project to the 64-bit ARMv8-A architecture. ARM said the port was done on a development board codenamed “Juno”, which is the second initiative to help Android reach the 64-bit market.

The Juno hardware development platform includes a system on chip (SoC) powered by a quad-core ARM Cortex-A53 CPU and a dual-core ARM Cortex-A57 CPU in an ARM big.LITTLE processing configuration.

Juno is said to be an “open, vendor neutral ARMv8 development platform” that will also feature an ARM Mali-T624 graphics processor.
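
big.LITTLE pairs slow-but-efficient cores (the Cortex-A53s) with fast-but-hungry ones (the Cortex-A57s) and lets the OS move work between the clusters. A toy sketch of the placement idea (ours; real kernel schedulers are far more sophisticated):

    # Toy big.LITTLE placement policy: light tasks stay on the LITTLE
    # cluster, heavy tasks migrate to the big cluster.
    BIG_THRESHOLD = 0.6   # assumed utilisation cutoff; real systems tune this

    def place_task(utilisation: float) -> str:
        if utilisation > BIG_THRESHOLD:
            return "Cortex-A57 cluster (big)"
        return "Cortex-A53 cluster (LITTLE)"

    for load in (0.1, 0.45, 0.9):
        print(f"load {load:.0%} -> {place_task(load)}")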

Alongside the news of the 64-bit initiatives, ARM also announced that Actions Semiconductor of China signed a license agreement for the 64-bit ARM Cortex-A50 processor family.

“Actions provides SoC solutions for portable consumer electronics,” ARM said. “With this IP license, Actions will develop 64-bit SoC solutions targeting the tablet and over-the-top (OTT) set-top box markets.”

The announcements from ARM come at an appropriate time, as it was only last week that Google announced the latest version of its Android mobile operating system, Android L, which comes with support for 64-bit processors. ARM’s latest developments mean that Android developers are likely to take advantage of them in the push to take Android to the 64-bit market.

Despite speculation that it would launch as Android 5.0 Lollipop, Google outed its next software iteration on Wednesday last week as simply Android L, touting the oddly-named iteration as “the largest update to the operating system yet”.

Courtesy-TheInq

YouTube To Debut Weekly Radio Show On Sirius

June 27, 2014 by mphillips  
Filed under Around The Net

YouTube is making a foray into radio with a weekly show on satellite radio service Sirius XM that will feature the online video website’s most popular and emerging artists, the companies said on Thursday.

The show, called The YouTube 15, will be hosted by Jenna Marbles, one of YouTube’s most popular stars, whose videos on how to talk to your dog and other snippets from her life have drawn more than 13 million subscribers to her channel.

YouTube’s radio show will debut July 11 on the SiriusXM Hits 1 channel, which plays pop, R&B, rock and hip-hop.

It is the first time YouTube, owned by Google Inc, has partnered with another platform on a show about music.

The show is aimed at exposing listeners to a curated selection from the vast library of YouTube music videos, said Scott Greenstein, president and chief content officer for SiriusXM.

The selection of songs will reflect “what’s trending and very popular” to familiarize listeners with top hits on YouTube, he said. “Equally importantly, you are going to hear new and emerging music that many people for sure will not have heard.”

Nvidia Releases CUDA To Server Vendors

June 25, 2014 by Michael  
Filed under Computing

Nvidia has released CUDA – its parallel programming framework that lets developers run code on GPUs – to server vendors in order to get 64-bit ARM cores into the high performance computing (HPC) market.

The firm said today that ARM64 server processors, which are designed for microservers and web servers because of their energy efficiency, can now process HPC workloads when paired with GPU accelerators using the Nvidia CUDA 6.5 parallel programming framework, which supports 64-bit ARM processors.

“Nvidia’s GPUs provide ARM64 server vendors with the muscle to tackle HPC workloads, enabling them to build high-performance systems that maximise the ARM architecture’s power efficiency and system configurability,” the firm said.

The first GPU-accelerated ARM64 software development servers will be available in July from Cirrascale and E4 Computer Engineering, with production systems expected to ship later this year. The Eurotech Group also plans to ship production systems later this year.

Cirrascale’s system will be the RM1905D, a high density two-in-one 1U server with two Tesla K20 GPU accelerators, which the firm claims provides high performance and low total cost of ownership for private cloud, public cloud, HPC and enterprise applications.

E4’s EK003 is a production-ready, low-power 3U dual-motherboard server appliance with two Tesla K20 GPU accelerators designed for seismic, signal and image processing, video analytics, track analysis, web applications and MapReduce processing.

Eurotech’s system is an “ultra-high density”, energy efficient and modular Aurora HPC server configuration, based on proprietary Brick Technology and featuring direct hot liquid cooling.

Featuring Applied Micro X-Gene ARM64 CPUs and Nvidia Tesla K20 GPU accelerators, the new ARM64 servers will provide customers with an expanded range of efficient, high-performance computing options to drive compute-intensive HPC and enterprise data centre workloads, Nvidia said.

Nvidia added, “Users will immediately be able to take advantage of hundreds of existing CUDA-accelerated scientific and engineering HPC applications by simply recompiling them to ARM64 systems.”

ARM said that it is working with Nvidia to “explore how we can unite GPU acceleration with novel technologies” and drive “new levels of scientific discovery and innovation”.

Courtesy-TheInq

Will AMD’s Mantle See Success On Linux?

June 20, 2014 by Michael  
Filed under Computing

AMD is planning to bring its new Mantle API to Linux in the near future. Although Linux is not a big gaming platform at the moment, SteamOS could change all that starting next year.

AMD’s Richard Huddy says the decision was prompted by requests from developers who would like to see Mantle on Linux. However, he stopped short of specifying a launch date. Huddy confirmed that AMD plans to dedicate resources to bringing Mantle to Linux, but other than that we don’t have much to go on.

Mantle on SteamOS makes a lot of sense

Mantle is designed to cut CPU overhead and offer potentially significant performance improvements on certain hardware configurations. This basically means gamers can save a few pennies on their CPU and use them towards a better GCN-based graphics card.
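
The pitch, in numbers: if an API shaves CPU time off every draw call, a CPU-bound frame speeds up without touching the GPU. A toy model with invented figures (not AMD benchmarks):

    # Toy model of CPU-side draw-call submission cost per frame
    draw_calls = 10_000
    apis = {
        "thick API": 10,   # assumed microseconds of CPU work per draw call
        "thin API": 2,     # assumed cost with a lower-overhead API like Mantle
    }

    for name, cost_us in apis.items():
        frame_ms = draw_calls * cost_us / 1000
        print(f"{name}: {frame_ms:.0f} ms of CPU submission per frame")
    # -> thick API: 100 ms, thin API: 20 ms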

However, aside from enthusiasts who build their own gaming rigs, the world of PC gaming is also getting a lot of attention from vendors specialising in out-of-the-box gaming PCs and laptops. Many of them have already announced plans to jump on the SteamOS bandwagon with Steam Machines of their own.

Should Mantle become available on Linux and SteamOS, it would give AMD a slight competitive edge, namely in the value department. In theory vendors should be able to select a relatively affordable APU and discrete GPU combo for their Steam boxes.

AMD already tends to provide good value in the CPU department. The prospect of using mainstream APUs backed by cheap discrete Radeons (or even Dual Graphics systems) sounds interesting.

It will take a while but the potential is there

Huddy told PC World that Mantle has some clear advantages over DirectX. Microsoft’s new DirectX 12 API has already been announced, but the first games to support it won’t arrive until late 2015.

“It (Mantle) could provide some advantages on Steam boxes,” said Huddy. “We are getting requests to deliver this high-performance layer.”

While DirectX 12 will be very relevant in the PC space, the same obviously cannot be said of Linux and SteamOS. Therefore Mantle on Linux makes a lot of sense. However, it all depends on AMD’s timetable.

Last month Valve announced Steam Machines would be pushed back to 2015. They were originally supposed to launch this summer and the first announcements were made months ago. The first designs were based on Intel and Nvidia silicon, but support for AMD hardware was added just a bit later.

When Valve announced the delay we argued that it could have a silver lining for AMD. It simply gives AMD more time to improve its drivers or add Mantle support, something Nvidia and Intel do not have to worry about.

It still remains to be seen whether Steam Machines can make a big dent in the gaming market. PC gaming is going through a renaissance, but the latest consoles are doing well, too (apart from the Wii U). The concept is very attractive on more than one level, but it is very difficult to make any predictions yet, since we are still about 15 months away from launch.

Courtesy-Fud

20nm SoCs En Route To Production

June 20, 2014 by Michael  
Filed under Computing

The transition to 20nm has been anything but fast and much of the industry has been stuck at 28nm for a while, but the first 20nm products are coming as we speak.

TSMC’s 20nm process is almost ready for prime time, but volume production is still a couple of months away. However, some outfits do not need great yields and huge volumes, and one maker of bitcoin mining ASICs says it will become the first outfit to ship 20nm products this week. Sweden-based KnCMiner received the first batch of 20nm Neptune ASICs earlier this week and it says it should start shipping finalized mining rigs by the end of the week.

Most ARM-based SoCs and practically every GPU on the market today are 28nm designs. The first 20nm SoCs should arrive by the end of the year, courtesy of Qualcomm and possibly Apple. Nvidia and AMD were expected to introduce 20nm GPUs sometime in the second half of 2014, but it is becoming increasingly apparent that we won’t see them until a bit later, with volume production slated for 2015.

The KnCMiner Neptune is a relatively big chip, with 1440 cores in a 55x55mm package, but there is no word on die size. The miner will use five chips and churn out 3TH/s while consuming 2.1kW. Although KnCMiner does not talk about the foundry, it appears that we are looking at TSMC silicon. However, this does not mean we will see mainstream chips manufactured on the new node anytime soon.
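
From those figures, the rig’s efficiency is easy to work out (our arithmetic on the article’s numbers):

    # KnCMiner Neptune rig, per the article: five chips, 3TH/s, 2.1kW
    hashrate_ghs = 3_000   # 3 TH/s expressed in GH/s
    power_w = 2_100        # 2.1 kW
    chips = 5

    print(f"{power_w / hashrate_ghs:.1f} J/GH")          # -> 0.7 J/GH
    print(f"{hashrate_ghs / chips:.0f} GH/s per chip")   # -> 600 GH/s per chip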

Cryptocurrency mining is a relatively small niche and many miners are willing to take big risks and pay through the nose for the latest kit in an effort to gain an upper hand in the mining hardware arms race. Mining ASICs don’t require great yields or big volumes, as the designers can operate with much higher margins than consumer chip outfits.

It is a risky space that has already seen a number of spectacular flops, but the promise of quick cash and downright ridiculous ROI is still attracting a lot of (greedy) risk takers. As a result there is a lot of demand and pre-orders are the norm.

Regardless of the controversy surrounding this very risky industry, it is hard not to be impressed by KnC’s feat, as the company states it has managed to beat big chipmakers to 20nm – and it has, albeit in a very tight niche.

Courtesy-Fud

Intel Pushes For SIMD in JavaScript

June 18, 2014 by Michael  
Filed under Computing

Intel is leading efforts to enable SIMD (Single Instruction Multiple Data) capabilities in the official specification underlying JavaScript, in a move to speed up browser-based software.

Intel has been trying to increase performance of browser-based applications that need to access SIMD instructions on the host processor and use data parallelism.

Chipzilla has teamed up with Google and Mozilla to present its SIMD.js proposal in July to the ECMA International TC39 committee, which approves revisions to the ECMAScript standard underlying JavaScript.

Intel wants to get SIMD.js capabilities into ECMAScript 7, which may not arrive for some time. ECMAScript 6, for its part, should be ready later this year.

SIMD capabilities have already been implemented to an extent in Firefox Nightly, with full JIT support expected to follow in a couple of months. Intel has also submitted the full patch to Chromium’s V8 engine for review. SIMD.js has been included in Intel’s Crosswalk web runtime and distributed in the company’s XDK tool for HTML5 mobile application development.
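
SIMD.js exposes short fixed-width vectors, such as four float32 lanes, and applies one operation to every lane at once. A conceptual sketch of that lane model (Python standing in for the JavaScript API, whose exact surface was still before the committee):

    # Conceptual 4-lane SIMD add: one "instruction", four data lanes
    from typing import Tuple

    Float32x4 = Tuple[float, float, float, float]

    def simd_add(a: Float32x4, b: Float32x4) -> Float32x4:
        # Hardware performs all four lane additions in a single instruction;
        # this just spells out the lane-wise semantics.
        return (a[0] + b[0], a[1] + b[1], a[2] + b[2], a[3] + b[3])

    print(simd_add((1.0, 2.0, 3.0, 4.0), (10.0, 20.0, 30.0, 40.0)))
    # -> (11.0, 22.0, 33.0, 44.0)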

Courtesy-Fud

Oracle Starts Offering Database In-Memory Support

June 13, 2014 by Michael  
Filed under Computing

Oracle has eyed SAP HANA with the launch of an in-memory option for its Database 12c software, which promises to speed up the processing of specific workloads by up to 100 times.

Key for customers is that this has been integrated transparently, enabling existing workloads and applications that use Oracle Database 12c to take advantage of it.

The in-memory option for Oracle’s flagship Database 12c platform was first disclosed at the Oracle Openworld show in San Francisco last year. However, while the firm is announcing the technology today, the feature will actually be available as part of release 12.1.0.2 of Oracle Database 12c, due to ship within 60 days.

Oracle is pitching the fact that having in-memory capability integrated with its existing database is a key differentiator, especially against SAP’s rival HANA platform, because it offers compatibility with current applications along with existing features for robustness and to guarantee transactional integrity.

Tim Shetler, vice president of product management at Oracle, said: “It is completely transparent to implement for existing applications that work with the Oracle database. This is different from all the other [in-memory] offerings in the market today that either require changes to applications or limit functionality to a subset of full database functionality.”

Singling out SAP HANA, Shetler said that it is effectively an in-memory data store that is very fast, “but they are still trying to complete the rest of the database functionality around it,” he claimed.

“With Oracle, all of the features of the database are available, all of the applications that exist today, including third-party applications, custom-written applications, they will all work out of the box with the Oracle Database In-Memory option, so we think that’s really huge benefit to enabling companies to become real-time enterprises,” he added.

Another area where Oracle claims an advantage is in the size of the database that can be used with its in-memory technology. Users can allocate a region of memory to hold the database and specify which data they want to go into that region, whether this is an entire table, a portion of a table, or a subset of the columns in a table.

“It’s very typical for an analytics application to only look at a small number of columns. A report might have only 10 data items out of a table that might have 500 columns, so it’s important to conserve space to be able to identify precisely just the data that needs to be in memory, and we give you the ability to do that,” Shetler said.
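
That column-level selectivity is the core of the columnar in-memory idea: stored column-wise, an analytic scan reads only the columns it needs instead of dragging every full row through memory. A toy sketch of the difference (ours, not Oracle’s implementation):

    # Row store vs column store for "sum one column out of many"
    rows = [{"id": i, "amount": i * 1.5, "region": "EU"} for i in range(1000)]

    # Row-oriented: every full row is touched just to read one field
    total_rows = sum(r["amount"] for r in rows)

    # Column-oriented: the 'amount' column is one contiguous sequence
    amount_column = [r["amount"] for r in rows]
    total_column = sum(amount_column)   # scans only the data it needs

    assert total_rows == total_column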

The in-memory option also works with Oracle’s Real Application Clusters (RAC) feature, which allows a single database to be spread across multiple servers, enabling a much larger data set to be accommodated.

On Oracle’s Exadata systems, the in-memory option allows the data to be spread across memory, flash and disk and transparently accessed, according to Oracle, so once more the entire data set does not have to be present in memory at the same time.

Oracle said many of its application teams have been working to incorporate the in-memory option into their software, and from its experience, it believes developers will see a performance boost without changing their applications, but an even greater boost if they update their code to take best advantage of in-memory processing.

“Some applications such as Oracle PeopleSoft, financial applications or the E-Business Suite have seen several hundred and up to a thousand times speed-up through a combination of adding the in-memory option and doing some restructuring of their internal algorithms,” Shetler claimed.

Oracle is also extending its partner program to certify applications on Oracle Database 12c with the in-memory option.

The Oracle in-memory option will be available on all platforms where Oracle Database 12c is currently supported, but availability details have not yet been announced.

Courtesy-TheInq

Batman Arkham Knight Gets Pushed Back

June 6, 2014 by Michael  
Filed under Gaming

Those who have been eagerly waiting for October to experience the latest adventures of Batman from developer Rocksteady are going to be very disappointed to learn that the game will not make its originally announced October release.

Instead, developer Rocksteady has confirmed that the game will be released in 2015. An exact release date has not yet been decided upon. We are hearing, however, that a spring release for Arkham Knight is very likely.

The exact reasons behind the delay were not announced, but the game is much bigger than previous Batman titles that Rocksteady has done, and it is the first all next-generation title from the developer, both of which might be contributing to the delay. The game is still scheduled for release only on the Xbox One, PlayStation 4 and PC, so the next-generation status of the game has not changed.

Courtesy-Fud

Intel Joins Forces With Rockchip For SoC

June 2, 2014 by Michael  
Filed under Computing

Intel has joined forces with Chinese chip design firm Rockchip to develop next generation processors for the tablet market based on Intel Atom core technology and integrating 3G broadband communications.

Under the terms of the agreement, Intel and Fuzhou Rockchip Electronics (Rockchip) will work together on an Intel branded mobile system on chip (SoC) processor with the intention of enabling a range of entry-level Android tablets.

The chip is expected to ship in the first half of 2015, according to Intel, and will be based on a quad-core Atom processor design integrated with Intel’s 3G modem technology, which the firm gained through its acquisition of Infineon’s wireless business in 2010.

Rockchip, which is expected to contribute to the integrated graphics technology, will also help Intel bring the product to market faster than might otherwise be the case. The firm is a leading fabless semiconductor design company and already develops mobile SoCs, although its present designs are largely focused around the ARM architecture.

The agreement builds on announcements Intel made at an investor relations day last year, where chief executive Brian Krzanich disclosed the Intel Sofia family, of which the latest chip will form part, and conceded that the chipmaker needed to become more agile in order to gain traction in entry-level markets.

“The strategic agreement with Rockchip is an example of Intel’s commitment to take pragmatic and different approaches to grow our presence in the global mobile market by more quickly delivering a broader portfolio of Intel architecture and communications technology solutions,” Krzanich said.

With this announcement, the Intel Sofia family comprises three products, which are not shipping yet.

A dual-core 3G version is slated for the fourth quarter this year, the quad-core 3G version is due in the first half of 2015, and a version with 4G/LTE communication is also due in the first half of next year.

Courtesy-TheInq

Will Qualcomm Make An SoC Specifically For Smartwatches?

May 28, 2014 by Michael  
Filed under Computing

Qualcomm hasn’t neglected the smartwatch space. It launched the Toq watch late last year, and the Mirasol display-driven watch is quite nice. The company is actually one of the smartwatch pioneers. We can call the Toq a first-generation smartwatch; it runs on an unnamed 200MHz Cortex-M3-based processor, naturally from Qualcomm.

A few months back we talked to Rob Chandhok, Senior Vice President at Qualcomm Technologies and President of Qualcomm Interactive Platforms, and we heard that the Toq is basically a proof of concept. Qualcomm wanted to show the world what could be done with present-day tech.

Qualcomm is selling its smartwatch directly to customers, but the long term strategy is to sell chips to the other smartwatch manufacturers. The next step is to get into the Android Wear platform, as it has potential to become a very popular platform in this emerging market.

Android Wear will make a difference

We cannot confirm whether or not Qualcomm chips found their way into the Moto 360 or other Android Wear designs, but it would be a nice win. The Moto 360 should land at a quite attractive $249 price point, and we think that $249 or $199 is a realistic price for high-end smartwatches. The same price tactic was used by Samsung with its Galaxy Gear smartwatches and, of course, by Qualcomm with the Toq. It looks like the sweet spot.

Mediatek is one of the competitors to watch out for. The company has announced its Aster ARM7 wearable platform, described by EETimes as “the smallest SoC” with the highest integration for wearable devices. The Aster resides in a tiny 5.4 x 6 mm package; it has Bluetooth 4.0/Bluetooth Low Energy, a power management IC and memory (4MB of flash and 4MB of SRAM), and at the same time it can target devices that will cost as little as $50.

Qualcomm has a wearable chip in the works

This is not the market that Qualcomm wants to target, as $199+ smartwatches can offer more performance and power, and make a bit more money. Qualcomm CDMA Technologies Taiwan president Eddie Chang has mentioned that Qualcomm has been working on a wearable chip that is about to go into mass production. He didn’t mention any specific devices, but he promised that there will be devices shipping based on Qualcomm wearable chips later this year.

One can only assume that Qualcomm will win some Android Wear devices, but the competition in this segment has proven to be very fierce, as there are dozens of manufacturers fighting for this emerging market. Some big names, including Texas Instruments and STMicroelectronics, companies that don’t make much noise in the phone market, are hoping to be big players in the wearable market. STMicroelectronics is already inside the Fitbit wristband, which is quite a popular device in this market.

Courtesy-Fud