Troubled Japanese outfit Toshiba is considering splitting off part of its chip business in a bid to raise cash after its accounting scandal.
Toshiba needs a restructuring after revealing a number of unprofitable businesses that had been hidden by some creative accounting. It agreed in October to sell its image sensor business to Sony. However, it has been placed on a Tokyo Stock Exchange watch list and faces difficulty raising funds through the sale of shares or bonds.
Chief executive Masashi Muromachi told a news conference he was considering flogging every asset possible. NAND flash memory chips are a core business and will not be sold, he said, which effectively leaves system LSI and discrete chips as the options to split off.
The semiconductor business requires continuous investment to stay competitive against rivals such as Samsung Electronics, and the thinking is that when the bank manager is not returning your calls it is best to cut back. Some of Toshiba’s chips end up under the bonnet of the smartphones designed by the fruity cargo cult Apple.
Tosh already announced that it was flogging off its Malaysian chip assembly unit to US-based Amkor Technology as part of its strategy to consolidate chip operations. At the time it hinted at a big restructuring, but not an actual sell-off of its chipmaking empire.
The Pakistani government wanted the ability to monitor all BlackBerry Enterprise Service traffic in the country, including every BES email and BES BBM (BlackBerry Messenger) message, BlackBerry’s Chief Operating Officer Marty Beard wrote in a blog post on Monday.
BlackBerry has been under pressure in many countries including neighboring India to provide access to data on its enterprise services to law enforcement.
“We do not support ‘back doors’ granting open access to our customers’ information and have never done this anywhere in the world,” Beard wrote.
BlackBerry’s move is a response to a reported July notification by the Pakistan Telecommunication Authority to the country’s mobile phone operators that BlackBerry’s BES servers would not be allowed to operate in the country from December “for security reasons.”
The company had earlier on Monday said it would quit Pakistan on Nov. 30, but later said it was postponing by a month because of a delay by Pakistan of its shutdown order to Dec. 30. There are also indications that the two sides may arrive at a compromise. PTA spokesman Khurram Ali Mehran said the agency is still in contact with BlackBerry “to find out a solution.”
“Although the Pakistani government’s directive was aimed only at our BES servers, we have decided to exit the market altogether, because Pakistan’s demand for open access to monitor a significant swath of our customers’ communications within its borders left us no choice but to exit the country entirely,” Beard wrote.
Civil rights group Bytes for All, Pakistan posted on its site in July a leaked document of minutes of a PTA meeting, which advised operators Pakistan Mobile Company (Mobilink), Pakistan Telecom Mobile (Ufone) and Telenor Pakistan to ensure that all BES connections are closed on or before Nov. 30, 2015. The document cited “serious security concerns.”
To help identify faults or plan maintenance, manufacturers are able to gather performance data from connected cars such as the total distance travelled, or the length and number of trips made.
But drivers may be unaware of just how much other information such cars allow manufacturers to gather about them.
A study conducted by German motorists organization ADAC for European lobby group FIA Region 1 found that in addition to trip and distance data, one recent model reported maximum engine revolutions, the status of vehicle lights — and far more besides.
The car, a BMW 320d, also recorded the length of time the driver used different driving modes, and recorded when the seatbelt tightened due to sudden braking. More sinisterly, it also transmitted the latest destinations entered into the car’s navigation system, and personal information such as contacts synchronized from mobile phones.
ADAC only examined one car, and wants to extend the study to see how other brands behave, a spokeswoman said.
But FIA wants car manufacturers to come clean themselves, without waiting to be unmasked: It asked them to publish an easily understandable list for each model of all the data collected, processed, stored and transmitted externally.
With the risk that the data might be intercepted or the car hacked and the data taken, FIA wants carmakers to secure the data, and to make it possible for drivers to block the processing or transmission of non-essential data.
The motorists’ organizations have launched a campaign, My Car, My Data, to promote motorists’ privacy rights and their freedom to choose service providers.
Astronomers trained Hubble on the Milky Way’s dense central bulge and spotted a population of superdense stellar corpses called white dwarfs that are remnants of stars that formed about 12 billion years ago. These stars are archeological evidence of the first few billion years of the galaxy’s history, researchers said.
“It is important to observe the Milky Way’s bulge, because it is the only bulge we can study in detail,” study lead author Annalisa Calamida, of the Space Telescope Science Institute (STScI) in Baltimore, Maryland, said in a statement. “You can see bulges in distant galaxies, but you cannot resolve the very faint stars, such as the white dwarfs.”
Like other spiral galaxies, the Milky Way harbors a dense central bulge surrounded by wispy spiral arms. Scientists think that such bulges formed first, while the outer arms came later.
“The Milky Way’s bulge includes almost a quarter of the galaxy’s stellar mass,” Calamida said. “Characterizing the properties of the bulge stars can then provide important ways to understand the formation of the entire Milky Way galaxy and that of similar, more-distant galaxies.”
But studying the Milky Way’s core is a challenge; Earth’s sun orbits on one of the outlying arms, with stars lying between Earth and the galaxy’s star-packed heart.
Using Hubble, the team studied the motion of about 240,000 Milky Way stars over nearly a decade. By comparing how the positions of these stars changed over that time, the researchers were able to pick out 70,000 that inhabit the bulge.
The team found that the galactic center contains slightly more low-mass stars than the outskirts do.
“These results suggest that the environment in the bulge may have been different than the one in the disk, resulting in different star-formation mechanisms,” Calamida said.
The astronomers also identified 70 white dwarfs in the bulge sample by comparing the stars’ colors to those predicted for white dwarfs by theoretical models. Finding white dwarfs is no small feat; since these corpses no longer undergo fusion, they are quite dim. Indeed, NASA officials compared isolating a white dwarf from the background to searching for the glow of a pocket flashlight held by an astronaut on the moon.
But studying white dwarfs is worth the effort, the researchers said: doing so can reveal information about the stars that built the Milky Way’s core nearly 12 billion years ago. (For comparison, the universe is approximately 13.8 billion years old.) “These 70 white dwarfs represent the tip of the iceberg,” study leader Kailash Sahu, also of STScI, said in the same statement. “We estimate that the total number of white dwarfs is about 100,000 in this tiny Hubble view of the bulge.”
With Hubble pushing the limits of what can be seen, it will fall to other instruments to capture even fainter stars, Sahu said.
“Future telescopes such as NASA’s James Webb Space Telescope will allow us to count almost all of the stars in the bulge, down to the faintest ones, which today’s telescopes, even Hubble, cannot see,” Sahu said.
The researchers said they intend to analyze other portions of the same field of sky, ultimately leading to a more precise estimate for the age of the galactic heart.
The results were published in September in the Astrophysical Journal.
The video was released nearly two years after Amazon announced that it intended to use drones to deliver parcels through a new service called Prime Air.
But the drone showcased in the video is quite different from the one shown previously. The new drone, for example, carries the parcel in its fuselage rather than slung underneath.
The new hybrid drone rises vertically to nearly 400 feet and then takes a horizontal orientation to become a streamlined and fast airplane, according to the Amazon video. The device lands vertically, drops the package at the destination, and then again rises up vertically. The user in the video is alerted on a tablet about the delivery.
The previous drone design will also probably continue. In time, there will be a whole variety of designs, with different designs for different environments, said the narrator, automotive journalist Jeremy Clarkson.
The new drone can fly up to 15 miles (24 kilometers) and is able to “sense and avoid” obstacles both in the air and on the ground. Amazon said in its FAQ that it has developed more than a dozen prototypes in its research and development labs. The online retailer has Prime Air development centers in the U.S., the U.K. and Israel, and is testing the vehicles in multiple international locations.
The commercial rollout of the retailer’s program in the U.S. is likely to depend on when the U.S. Federal Aviation Administration finalizes rules for the commercial use of drones.
The FAA proposed rules earlier this year that could allow programs like those of Amazon.com for the commercial delivery of packages by drones to take off. The drones would still have to operate under restrictions such as a maximum weight of 55 pounds and follow rules that limit flights to daylight and visual line-of-sight operations.
Intel said 2016 sales will climb in the “mid single-digit” percent range and said it didn’t need a buoyant personal-computer market to make piles of dosh.
Chief Executive Officer Brian Krzanich told analysts that Intel’s growth was not dependent on its PC business.
Intel is facing a weaker PC market and said that its revenue has been bolstered by demand for high-powered processors that run servers, the building blocks of cloud-computing centers.
Additionally, orders for memory chips and processors used in new markets for Intel — such as automotive and factory automation — are helping to boost sales, the CEO said.
Intel predicted gross margin, or the percentage of sales remaining after deducting the cost of production, of about 62 percent for 2016. It’s budgeting about $10 billion for spending on new plants and equipment and raised its quarterly dividend payout by 2 cents a share, the company said in a filing today. The higher payout is in line with Bloomberg’s dividend forecast for Intel.
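The gross margin definition in the paragraph above is simple arithmetic: revenue minus production cost, divided by revenue. A minimal sketch (the dollar figures here are invented for illustration, not Intel's actual numbers):

```python
def gross_margin(revenue, cost_of_sales):
    """Fraction of revenue remaining after deducting the cost of production."""
    return (revenue - cost_of_sales) / revenue

# Illustrative (made-up) numbers: $100 of sales with $38 of production cost
# gives the roughly 62 percent margin Intel is guiding to for 2016.
margin = gross_margin(100.0, 38.0)  # 0.62
```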
IDC Corp predicted that PC shipments are on course to shrink 4.9 percent to below 300 million units this year, after peaking at 364 million in 2011.
Stacy Smith, Intel’s chief financial officer, said that even if the PC market shrinks 10 percent, Intel expects to grow in the low-single-digit percentage range. If the market is flat, Intel will grow in the high-single-digit percentage range, he said.
While Intel got more than twice as much revenue from selling PC chips as it did from its data-centre group in the recent period, the two units brought in almost the same amount of operating profit.
That change has been driven by Intel’s 99 percent market share in server chips and surging demand for the machines from operators of data centres, such as Amazon.com Inc. and Google, which are building up their capacity to provide computing power, storage and services via the Internet.
Bill Holt, Intel’s head of manufacturing, said that Chipzilla could reduce the cost of transistors, which makes it worth investing in new production techniques. The company is maintaining its lead over TSMC and Samsung.
Intel is also on track to cut losses at its mobile chip division and expects a reduction of about $1 billion this year, Smith said. In 2016 it is aiming to get another $800 million closer to profitability in that business, he said.
Nvidia has unveiled the first versions of two new virtual reality (VR) software development kits (SDKs).
The company said at the release of version 1.0 of GameWorks VR and DesignWorks VR that the SDKs will solve the power-guzzling problems associated with complex, immersive VR graphics processing.
“VR promises to dramatically change the way we experience everyday life, but delivering VR is a complex challenge, especially since it requires seven times the graphics processing power of traditional 3D apps and games,” said the firm.
The two SDKs aim to solve this by making use of the company’s GeForce and Quadro GPUs, providing developers with tools to create more engaging VR experiences by increasing performance, reducing latency, improving hardware compatibility and accelerating 360-degree video broadcasts. They also support Windows 10.
GameWorks VR is aimed at game and application developers, and includes a feature called VR SLI, which provides increased performance for VR applications where multiple GPUs can be assigned a specific eye to dramatically accelerate stereo rendering.
GameWorks VR also delivers specific features for VR headset developers, including Context Priority, which provides control over GPU scheduling to support advanced VR features such as asynchronous time warp. This cuts latency and quickly adjusts images as gamers move their heads, without the need to re-render a new frame.
There’s also a feature in the SDK called Direct Mode, which treats VR headsets as head-mounted displays accessible only to VR applications, rather than a typical Windows monitor, providing better plug-and-play support and compatibility for VR headsets.
Nvidia said that GameWorks VR is already being integrated into leading game engines, such as those from Epic Games, which has announced support for GameWorks VR features in an upcoming version of the popular Unreal Engine 4.3.
DesignWorks VR is aimed at developers of professional VR applications in markets such as manufacturing, media, entertainment, oil and gas, and medical imaging. It builds on the core GameWorks VR SDK with the addition of several powerful tools:
- Warp and Blend: new APIs that provide application-independent geometry corrections and intensity adjustments across entire desktops to create seamless VR CAVE environments, without introducing any latency.
- Synchronisation: a tool to prevent tearing and image misalignment when one large desktop is driven from multiple GPUs or clusters.
- GPU Affinity: provides dramatic performance improvements by managing the placement of graphics and rendering workloads across multiple GPUs.
- Direct for Video: enables VR and augmented reality environments such as head-mounted displays, CAVEs/immersive displays and cluster solutions.
Nvidia’s new SDKs come with a set of APIs and libraries for headset and app developers, including a new Multi-Res Shading technology. This is the first time the technology has been made available publicly, Nvidia said, and it is touted as an “innovative rendering technique” that increases performance by as much as 50 percent while maintaining image quality.
Developers can download the VR SDKs from the Nvidia developer website. The updated release of DesignWorks VR can be accessed by registering at the above link.
Nvidia unveiled DesignWorks in August, a suite of rendering tools aimed at changing the design industry and how designers work on creations.
The software allows rendering at large scale, and in VR, giving designers the ability to collaborate with others and incorporate live video. This means they can render and see their designs with greater accuracy, and share those designs with others.
Computer engineers down under have claimed a quantum computing breakthrough which suggests that silicon can be used as the foundation for a powerful quantum computer.
The breakthrough was made by researchers at Australia’s University of New South Wales (UNSW), who found that a quantum version of computer code can be written and manipulated using two quantum bits in a silicon microchip.
Using this theory, the scientists devised an experiment based on a phenomenon known as quantum entanglement, in which the measurement of one particle immediately affects another regardless of distance, even if the two are at opposite ends of the universe.
The experiments leading up to the discovery, published in the international journal Nature Nanotechnology on Tuesday, were performed by project leader Andrea Morello and lead authors Stephanie Simmons and Juan Pablo Dehollain in a UNSW laboratory.
“The effect [of quantum entanglement] is famous for puzzling some of the deepest thinkers in the field, including Albert Einstein who called it ‘spooky action at a distance’,” said Morello.
“Einstein was sceptical about entanglement, because it appears to contradict the principles of ‘locality’, which means that objects cannot be instantly influenced from a distance.”
Morello explained that because of this physicists have struggled for the past 50 years to establish a clear boundary between our everyday world and the quantum world, and that the best guide to that boundary has been a theorem called Bell’s Inequality, which states that no local description of the world can reproduce all of the predictions of quantum mechanics.
Bell’s Inequality demands a very stringent test to verify whether two particles are actually entangled, known as the ‘Bell test’.
“The key aspect of the Bell test is that it is extremely unforgiving: any imperfection in the preparation, manipulation and read-out protocol will cause the particles to fail the test,” explained Dehollain.
“Nevertheless, we have succeeded in passing the test, and we have done so with the highest ‘score’ ever recorded in an experiment.”
The experiment placed two quantum particles – an electron and the nucleus of a single phosphorus atom – inside a silicon microchip. Unlike in Einstein’s thought experiment, the two particles are right on top of each other – the electron orbits the phosphorus nucleus – so there is no complication arising from the “spookiness” of action at a distance.
However, the experiment that created these two-particle entangled states is tantamount to writing a type of computer code that does not exist in everyday computers. The team therefore claim to have demonstrated the ability to write a purely quantum version of computer code, using two quantum bits in a silicon microchip to achieve the highest scores.
The researchers see this as a “key plank in the quest for super-powerful quantum computers of the future”.
“Passing the Bell test with such a high score is the strongest possible proof that we have the operation of a quantum computer entirely under control,” said Morello. “In particular, we can access the purely-quantum type of code that requires the use of the delicate quantum entanglement between two particles.”
To put this in perspective, an everyday computer can write four possible code words: 00, 01, 10 and 11. A quantum computer can also write and use ‘superpositions’ of the classical code words, such as (01 + 10) or (00 + 11), which requires the creation of quantum entanglement between two particles.
“These codes are perfectly legitimate in a quantum computer, but don’t exist in a classical one,” said Simmons. “This is, in some sense, the reason why quantum computers can be so much more powerful: with the same number of bits, they allow us to write a computer code that contains many more words, and we can use those extra words to run a different algorithm that reaches the result in a smaller number of steps.”
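Simmons’ superposed code words can be written out in a few lines of ordinary state-vector arithmetic. A minimal illustrative sketch (not the UNSW team’s code): a two-qubit state is four amplitudes, and a pure state factorises into two independent qubits exactly when s00·s11 = s01·s10, a test the Bell-type states fail:

```python
from math import sqrt

# Two-qubit amplitudes in the basis order |00>, |01>, |10>, |11>
amp = 1 / sqrt(2)
phi_plus = [amp, 0.0, 0.0, amp]  # the (00 + 11) superposition, normalised
psi_plus = [0.0, amp, amp, 0.0]  # the (01 + 10) superposition, normalised

# Measurement probabilities are the squared amplitudes: (00 + 11) yields
# 00 or 11 with probability 1/2 each, and never 01 or 10 - the bits agree
probs = [a * a for a in phi_plus]

def is_product_state(s, tol=1e-12):
    """A two-qubit pure state factorises into two independent qubits
    exactly when s00*s11 == s01*s10; otherwise it is entangled."""
    return abs(s[0] * s[3] - s[1] * s[2]) < tol
```

For a classical code word such as 00 (amplitudes [1, 0, 0, 0]) the factorisation test passes; for (00 + 11) and (01 + 10) it fails, which is precisely what makes them entangled, purely-quantum code words.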
Morello highlighted the importance of achieving the breakthrough using a silicon chip. “What I find mesmerising about this experiment is that this seemingly innocuous ‘quantum computer code’ – (01 + 10) and (00 + 11) – has puzzled, confused and infuriated generations of physicists over the past 80 years,” he said.
“Now, we have shown beyond any doubt that we can write this code inside a device that resembles the silicon microchips you have on your laptop or your mobile phone. It’s a real triumph of electrical engineering.”
Amazon is making it a little, or a lot, harder for miscreants to make off with user accounts by adding two-factor authentication.
It has taken Amazon some time to fall into line on this. Two-factor authentication has become increasingly common in the past couple of years, and it is perhaps overdue at a firm that handles so much of people’s shopping and payment details.
Amazon is treating it like it’s new, and is offering to hold punters’ hands as they embrace the security provision.
“Amazon Two-Step Verification adds an additional layer of security to your account. Instead of simply entering your password, Two-Step Verification requires you to enter a unique security code in addition to your password during sign in,” the firm said.
The way the code is served depends on the user, who can choose to get the extra prompt in one of three ways. These may not appeal to those who do not like to over-share, as they require a personal phone number.
As is frequently the case, Amazon will send the supplementary log-in code to a phone via text message or voice call, or generate it through an authenticator app.
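Amazon has not said how its codes are generated, but authenticator apps of the kind it offers typically implement the standard TOTP algorithm (RFC 6238): a shared secret plus the current time yields a short-lived numeric code. A minimal stdlib sketch for illustration:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, step=30, digits=6, t=None):
    """Time-based one-time password per RFC 6238 (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if t is None else t) // step)
    msg = struct.pack(">Q", counter)             # 8-byte big-endian time counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                   # RFC 4226 dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Against the RFC 6238 test secret (the ASCII string "12345678901234567890", base32-encoded), this reproduces the published 8-digit vector 94287082 at t = 59 seconds.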
It’s an option, and you do not have to enable it. Amazon said that users can designate trusted computers that are spared the mobile phone step.
“Afterward, that computer or device will only ask for your password when you sign in,” explained the Amazon introduction, helpfully.
A number of other outfits already offer two-factor systems, and you might be advised to do your business through them: Apple, Microsoft, Google, Twitter, Dropbox, Facebook and many others offer the feature.
A website called TwoFactorAuth will let you check which of your providers support it.
Microsoft surprised the world when its new phone range failed to contain anything to interest business users – now it seems it is prepared to remedy that.
Microsoft promised that its Lumia range would cover the low-end, business and enthusiast segments, but while the Lumia 950 and Lumia 950 XL cover the enthusiast segment and the Lumia 650 should cover the low end, nothing has turned up for business users.
This was odd, given that business users want phones that play nice with their networks, something that Redmond should do much better than Google or Apple.
Microsoft’s CFO Amy Hood told the UBS Global Technology Conference that business versions of the Lumia were coming. She said:
“We launched a Lumia 950 and a 950 XL. They’re premium products, at the premium end of the market, made for Windows fans. And we’ll have a business phone, as well.”
There were no details, but we have been hearing rumours of a Surface phone being sighted on benchmarks. It was thought that this would be a Microsoft flagship, but with the launch of the Lumia 950/950 XL, it is possible that this Surface phone could be aimed at the business user. The word Surface matches nicely with Microsoft’s Surface Pro branding.
The project, called the Amazon Wind Farm US Central, is expected to generate about 320,000 megawatt hours (MWh) of wind power per year beginning in May 2017; that’s enough electricity to power more than 29,000 U.S. homes a year.
While AWS’s latest wind farm is dwarfed by previously announced projects, it is still large compared to those typically built by non-utility businesses.
For example, one of the largest wind farms to be completed this year was the 300MW Jumbo Road wind project located about 50 miles southwest of Amarillo, Texas. The project was commissioned by Berkshire Hathaway Energy subsidiary BHE Renewables, an electricity utility that sells power to Austin Energy. That wind farm cost more than $1 billion to build.
Amazon has launched a handful of wind farm projects and other renewable energy initiatives over the past two years as it moves toward a goal of 100% renewable energy use.
In April 2015, AWS announced that it was getting about 25% of its power from renewable energy sources; it plans to increase that level to 40% by the end of 2016.
In January 2015, Amazon announced a renewable project with the Amazon Wind Farm (Fowler Ridge) in Benton County, Indiana, which is expected to generate 500,000MWh of wind power annually.
Along with the new Amazon Wind Farm US Central, Amazon said its renewable projects will deliver more than 1.6 million MWh of renewable energy into electric grids across the central and eastern U.S., or roughly the equivalent amount of energy required to power 150,000 homes.
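The per-home figures above are easy to sanity-check: dividing annual output by homes powered implies roughly 11 MWh of electricity per US home per year, and Amazon's two claims are consistent with each other. A quick illustrative check:

```python
# Figures from the article: Amazon Wind Farm US Central
annual_mwh = 320_000        # expected annual output, MWh
homes_powered = 29_000      # US homes Amazon says that can power
per_home_mwh = annual_mwh / homes_powered      # ~11 MWh per home per year

# All announced Amazon renewable projects combined
total_mwh = 1_600_000
total_homes = 150_000
implied_per_home = total_mwh / total_homes     # ~10.7 MWh, consistent with above
```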
With Android and iOS controlling most of the mobile operating system market, it’s tough going for alternatives like Sailfish, now in survival mode as its maker, Jolla, moves to lay off a large part of its workers.
The first smartphone with the Linux-based OS shipped at the end of 2013. Adoption of Sailfish has been weak, however, and Jolla is selling only one smartphone model, via the company’s website, for about $303. It’s a Jolla-branded phone, made by a third-party contract manufacturer. A tablet is also available for preorder.
Jolla is restructuring debt in its home country, Finland, after a round of funding fell through. The company announced Friday that it will lay off “a big part” of its staff, without giving many details of future plans. The company did say it would be tailoring the OS to fit the needs of different clients, and that it has several “major and smaller potential clients.” It also said Sailfish is stable and ready for licensing.
For analysts, Jolla’s collapse wasn’t a surprise. Sailfish offers cool customization features, for example, but in a copycat market it doesn’t have the backing of device makers or carriers, which is crucial for survival.
The China market was a big focus for Jolla, but Xiaomi took the country by storm with end-to-end offerings including OS, user interface and hardware, along with the creation of a developer ecosystem, said Carolina Milanesi, chief of research and head of Kantar Worldpanel ComTech.
Many alternative mobile OSes like Ubuntu, Firefox, WebOS, Blackberry and others are in the same boat as Sailfish, trying to find a niche in a market ruled by Apple and Google. The biggest competitor to Android and iOS is Microsoft’s Windows Phone, which had just a 1.7 percent market share in mobile handsets, with 5.87 million units shipping during the third quarter this year, according to Gartner.
A Gartner analyst said Windows Phone could find adopters in the enterprise market. But Jolla doesn’t have the resources of Microsoft, of course, and this raises questions about the future of Sailfish.
Every few hundred days, the host star of Kepler-438b — an exoplanet just 12 percent wider than Earth that appears to be the right temperature to host life as we know it — blasts out “superflares” of high-energy radiation more powerful than any eruption ever recorded from our sun, the researchers said.
These flares by themselves likely do not have much of an impact on the planet’s habitability, the researchers said. But if the flares are accompanied by gigantic explosions of plasma known as coronal mass ejections (CMEs), as strong flares from the sun generally are, life might have a hard time getting a foothold on Kepler-438b, they noted.
“Large coronal mass ejections have the potential to strip away any atmosphere that a close-in planet like Kepler-438b might have, rendering it uninhabitable,” study co-author Chloe Pugh, of the University of Warwick in England, said in a statement. (Kepler-438b completes one orbit every 35 days but is still within the so-called “habitable zone,” because its host star is a red dwarf considerably dimmer and cooler than the sun.)
“With little atmosphere, the planet would also be subject to harsh UV [ultraviolet] and X-ray radiation from the superflares, along with charged particle radiation, all of which are damaging to life,” Pugh added.
Kepler-438b, which lies about 470 light-years from Earth, may be able to retain an atmosphere if it possesses a global magnetic field like Earth does, the researchers said.
“However, if it does not, or the flares are strong enough, it could have lost its atmosphere, be irradiated by extra-dangerous radiation and be a much harsher place for life to exist,” lead author David Armstrong, also of the University of Warwick, said in the same statement.
As its name suggests, Kepler-438b was discovered by NASA’s Kepler space telescope, which has found more than half of the 1,900 or so alien worlds known to date.
Kepler-438b has a 70 percent chance of being rocky, researchers have said. Indeed, it’s the most Earth-like alien world known, according to the Earth Similarity Index (ESI), a metric that takes into account an exoplanet’s size, density, surface temperature and other characteristics.
Kepler-438b’s ESI rating is 0.88, on a scale from 0 (no similarity to Earth) to 1 (a true Earth twin). In second place is Kepler-296e, a planet about 1,700 light-years away that sports an ESI of 0.85.
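The ESI is computed as a weighted geometric mean of per-property similarity terms, following the formulation of Schulze-Makuch et al. A minimal sketch; the property values and weights below are illustrative only (the published index uses several properties, including radius, density, escape velocity and surface temperature), not the actual Kepler-438b inputs:

```python
def esi(values, earth_values, weights):
    """Earth Similarity Index: weighted geometric mean of similarity terms,
    one per planetary property, in the Schulze-Makuch et al. form."""
    n = len(values)
    score = 1.0
    for x, x0, w in zip(values, earth_values, weights):
        score *= (1.0 - abs(x - x0) / (x + x0)) ** (w / n)
    return score

# Illustrative only: radius in Earth radii and surface temperature in kelvin,
# with example weights; a planet identical to Earth scores exactly 1.0.
score = esi([1.12, 276.0], [1.0, 288.0], [0.57, 5.58])  # between 0 and 1
```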
The new study appeared online today (Nov. 17) in the journal Monthly Notices of the Royal Astronomical Society.
Some iPad Pro owners have reported strange behavior in their new 12.9-inch tablets. Normally when you charge a device, unless the battery has completely died, the screen remains responsive. But some iPad Pros are completely freezing, then dying, after a recharge. The problem appears to be widespread — Apple’s support communities are filled with complaints about the issue.
Apple knows about the problem, but hasn’t said why it’s happening. There doesn’t seem to be a real fix for it, either — at least not yet. The company published a support document on Thursday advising Pro users to force restart their tablets to bring them back to life, but that’s not really a long-term solution, because the issue is ongoing.
“When I connect my iPad Pro to the charger for more than an hour, it goes dead,” one iPad Pro owner reported in the Apple support forum. “It takes multiple hard resets to bring it back to life.”
MacRumors first reported the iPad Pro issue last Monday, just days after the supersized tablets began shipping, and even experienced the problem with one of its own tablets. Apple employees are reportedly advising a range of solutions, from using iTunes to restore settings to performing a hard restart, as Apple is now officially recommending.
We’ll update this story when Apple pushes out a fix for the problem.
Samsung appears to have stolen a march on Intel and TSMC by coming up with a 10-nano FinFET S-RAM.
According to the Electronic Times, Intel and TSMC’s products are still being made on 14-nano and 16-nano processes, so Samsung’s 10-nano S-RAM will open the way for a generation of giga-smartphones. S-RAM is faster than D-RAM and is used for CPU cache memory.
It means that Samsung’s 10-nano technology will be mass-produced at full scale in early 2017. The theory is that a 10-nano AP (application processor) will combine gigabit modem chips into one faster chip.
Samsung is presenting its plans at the ISSCC. The chips will have a 128 Megabyte (MB) capacity and a cell area of 0.040µm². Compared to the 14-nano S-RAM (0.064µm²) that Samsung Electronics introduced previously, the cell area is reduced by 37.5 per cent.
In an ISSCC scientific paper, Samsung said that it had built a large, fast cache memory in the smallest area yet. A smartphone AP using the S-RAM can minimise die area and improve performance.
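The 37.5 per cent figure follows directly from the two quoted cell areas; a quick check:

```python
area_14nm = 0.064   # S-RAM cell area at 14-nano, in square micrometres
area_10nm = 0.040   # cell area at 10-nano
reduction = (area_14nm - area_10nm) / area_14nm   # 0.375, i.e. 37.5 per cent
```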
All this means that Samsung Electronics has surpassed Taiwan’s TSMC in developing the next generation of system semiconductors.
Intel postponed its schedule for developing next-generation 10-nano system semiconductors from 2016 to 2017 due to increased production costs. Samsung Electronics is targeting the end of next year for commercialising 10-nano processing.
Samsung Electronics has also developed 14-nano flat-surface (planar) NAND flash, another industry first. Toshiba, Micron and others have announced that after they finish developing their 15- to 16-nano generations, they are giving up on flat-surface NAND flash.
It had been thought that 14-nano NAND flash, which reduces the area of the floating gate by about 12.5 per cent compared to 16-nano, would greatly help Samsung Electronics cut NAND production costs by shrinking the silicon die.