Does Lexar Have The Fastest SD Cards?

September 18, 2014 by Michael  
Filed under Computing

Lexar has become the latest flash storage vendor to claim the title of the fastest SD card.

Just a day after we reported on the SanDisk Extreme Pro microSDXC UHS-I card, the Micron subsidiary has announced the Lexar Professional 2000x SDHC/SDXC UHS-II range.

The new UHS-II standard cards offer transfer speeds of up to 300MBps, with write speeds of 260MBps, making them up to three times the speed of the SanDisk UHS-I card.
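
For a sense of what those numbers mean in practice, here is a rough back-of-the-envelope sketch in Python (the 64GB capacity and the 95MBps UHS-I read rate are our assumptions for illustration, not Lexar’s figures):

```python
# Rough time to offload a full 64GB card, assuming the quoted peak
# rates are sustained for the whole transfer (real cards are slower).
CARD_GB = 64           # assumed card capacity
UHS2_READ_MBPS = 300   # Lexar Professional 2000x, quoted read speed
UHS1_READ_MBPS = 95    # typical top UHS-I card (assumption)

def offload_seconds(card_gb, rate_mbps):
    # Card vendors quote capacity in decimal units: 1GB = 1000MB.
    return card_gb * 1000 / rate_mbps

print(f"UHS-II: {offload_seconds(CARD_GB, UHS2_READ_MBPS):.0f}s")  # ~213s
print(f"UHS-I:  {offload_seconds(CARD_GB, UHS1_READ_MBPS):.0f}s")  # ~674s
```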

“UHS-II technology really raises the bar in terms of performance. The latest UHS-II additions to our Lexar Professional product portfolio provide users the ability to capture and offload work even faster, so they can get back to what’s important–capturing great images and video,” said Lexar director of card product marketing Adam Kaufman.

“Our multi-line UHS-II product offering gets performance into the hands of any user at a great value and with the Professional 2000x card, it comes right out of the box.”

UHS-II cards are backwards compatible with previous generations, automatically downgrading to UHS-I and Class 10 for compatibility, but the company is also offering the SR2 reader, an add-on device that allows you to use the 2000x series at its full potential.

The company has also added to its 1000x range, with read speeds of up to 150MBps and write speeds of up to 95MBps. All cards come with Image Rescue software and a limited lifetime warranty.

Both cards will be available in the last quarter of the year. The 2000x series will come in 32GB and 64GB flavors for $150.99 and $199.99. The 1000x series starts at $49.99 for 16GB, with increments all the way up to a 256GB version for $799.99. The Professional Workflow SR2 reader will cost $39.99.

Courtesy-Fud

IBM’s New Training Program Results In Pay Cut For Enrolled Employees

September 17, 2014 by mphillips  
Filed under Computing

IBM has started a new training program that will cut the pay of participating employees by 10%.

The program is described in an IBM internal memo dated Sept. 12. The memo, sent to affected employees, begins by telling the worker that an assessment has revealed “that some managers and employees have not kept pace with acquiring the skills and expertise needed to address changing client needs, technology and market requirements.”

It then tells the recipient that “you have been identified as one of these employees,” and says that from mid-October through the end of March, “you will dedicate up to one day per week,” or up to 23 working days total, “to focus on learning and development.”

But IBM is coupling this training with a six-month salary reduction. The key statement in the memo is this: “While you spend part of your workweek on learning and development activities, you will receive 90% of your current base salary.”
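
The memo’s terms are easy to put into dollars. A quick sketch, using a hypothetical $100,000 base salary that does not come from the memo:

```python
# What the memo's terms cost a hypothetical participant.
base_salary = 100_000   # hypothetical annual base in USD (not from the memo)
reduction = 0.10        # memo: 90% of current base salary is paid
months = 5.5            # mid-October through end of March, roughly

lost_pay = base_salary * reduction * (months / 12)
print(f"Forgone pay: ${lost_pay:,.0f}")   # about $4,583
```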

Salary will be restored to the full rate effective April 1, 2015.

Asked about the program, IBM spokeswoman Trink Guarino said the firm “is implementing a skills development program for a small number of U.S. strategic outsourcing employees. Under this program, these employees will spend one day a week developing skills in key growth areas such as cloud, analytics, mobile and social.”

There was negative reaction from some IBM employees.

One IBM IT professional, who asked not to be identified, said he was “shocked” to be added to the list, particularly since his work has been consistently praised by managers.

By reducing pay “by a significant amount,” IBM is acting “in the hopes that the employees won’t be able to sustain that pay and decide to quit, exempting IBM from letting them go and have to pay severance,” the employee said.

One source familiar with the program said the percentage of employees impacted is small, in the single digits.

While employees may see the pay cut as unfair, the salary reduction is viewed by management as a form of employee “co-investment” in training, and as a better alternative to laying off and hiring employees with the latest skills. It’s not that these employees lack skills, but they don’t necessarily have the ones that are needed today, the source said.

Will Intel’s NUC Sell A Million Units This Year?

September 17, 2014 by Michael  
Filed under Computing

Not everyone was happy with the Next Unit of Computing (NUC) brand that Intel came up with for its small form factor desktop replacements at IDF 2012. Intel started shipping these small desktops in early 2013.

NUC started off with Sandy Bridge-based parts codenamed Ski Lake (DCP847SK), and with the Celeron 847 it got quite a lot of attention thanks to more affordable pricing. A year later Intel launched multiple Core i3-based SKUs with Ivy Bridge, and this year it introduced models based on the Wilson Canyon platform and Haswell CPUs. Affordable Bay Trail models appeared as well.

The latest Intel NUC Kit D54250WYK measures a tiny 116.6mm x 112mm x 34.5mm and sells for about $370 in the States, €300 in Germany, or £278 in the UK. At IDF 2014, Intel’s biggest developer conference, people close to the NUC project told us that it has been a success since launch.

It started with 250,000 shipped units in the first generation and grew to half a million units with second-generation products. Intel’s ultimate goal is to sell as many as one million units this year, though shipments in the 750,000 to 1 million range might be more realistic. Even if Intel sells around 750,000 units, it will have managed to triple the market in a rather short time.

Braswell and Broadwell fourth-generation NUCs are coming in 2015, but Intel first needs to launch 15W TDP Broadwell parts, and as far as we know that happens in Q2 2015. We don’t know whether the Braswell NUC will arrive as soon as the Broadwell-U models or a bit later, but it is in the works.

This Braswell NUC should be really affordable and should replace the Bay Trail-M based DN2820FYKH powered by the Celeron N2820. Bear in mind that this entry-level Celeron kit costs a mere $144 at press time and only needs some RAM and an HDD to work. At its lowest spec, a 2GB SODIMM sells for as low as $10, and Toshiba has a 62GB mSATA drive for as low as $24.95.

This means a small, power-efficient machine that can run Windows goes for as little as $179. No wonder they are so popular.
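
The arithmetic behind that figure, using the prices quoted above (a quick sketch; street prices obviously move around):

```python
# Bill of materials for the entry-level build, at the article's prices.
parts = {
    "Intel NUC DN2820FYKH kit (Celeron N2820)": 144.00,
    "2GB SODIMM": 10.00,
    "Toshiba 62GB mSATA SSD": 24.95,
}
total = sum(parts.values())
print(f"Total: ${total:.2f}")   # $178.95, i.e. roughly $179
```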

Courtesy-Fud

Stanford University Develops Ant-sized Radios

September 16, 2014 by mphillips  
Filed under Consumer Electronics

Scientists at Stanford University have assembled ant-sized radios that could one day help track patients’ temperatures, turn on coffee makers in the morning and prevent forgery.

A Stanford engineering team has built a radio, equipped with sensors, computational units and antennas one-tenth the size of Wi-Fi antennas, that is able to gain all the power it needs from the same electromagnetic waves that carry signals to its receiving antenna. No batteries are required.

These radios, which are designed to compute, execute and relay commands, could be the key to linking gadgets together in the increasingly popular idea of the Internet of Things.

Today’s radios generally are the size of a quarter, according to Amin Arbabian, assistant professor of electrical engineering at Stanford and a researcher on the radio project. These new radios are much smaller. They’re 3.7 x 1.2 millimeters.

Radios that small could be added to everything from $100 bills to medical gauze, Band-Aids and home appliances. At just pennies per radio, that means a myriad of products could easily and cheaply become part of a linked network.

“This could be very important,” Arbabian told Computerworld. “When you think about the Internet of Things, you’re talking about needing a thousand radios per person. That counts all the radios and devices you’d need around you in your home and office environments. With 300 million people in the U.S., we’d have 300 billion radios.”

A Bluetooth-type radio works fine for smartphones but is too big and expensive to connect most of the objects in users’ lives.

“We needed the cost and size to go down, and you need scale,” said Arbabian, who began working on the project in 2011. “Do you want to put something the size of a Bluetooth radio on a Band-Aid? It’s too big. It costs a lot. The technology we have today for radios doesn’t meet any of these requirements.”

He explained that a tiny radio with a temperature sensor could be put on a bandage or piece of adhesive that’s applied to every patient that enters a hospital. The radio and its sensor would enable the medical staff to continuously track every patient’s temperature, a key health indicator, effortlessly and cheaply.

Sensors also could be used to measure air quality, to track medications from the manufacturer to the end user and to even keep track of tools and supplies in an operating room. For instance, Arbabian noted that a radio, encased in bio-safe material, could be attached to gauze or medical tools. With them, everything in an operating room could be tracked to ensure that nothing is left inside the patient at the end of surgery.

The radios also could be attached to everyday products inside the home, including appliances, doors and windows.

Will EA Make Games For Wearables?

September 16, 2014 by Michael  
Filed under Computing

EA is considering developing games for wearables. The company already has two teams on the job, looking for ways to make wearable games. Their efforts are focused on the Apple Watch for now.

EA told CNET that it has a strong relationship with Apple, and Frank Gibeau, head of EA’s mobile gaming arm, said he is impressed with the new Apple A8 SoC. Gibeau added that Apple’s decision to include 128GB of storage in flagship models is more good news for gamers, as it raises the bar for developers and gives them more room to play around with.

Gibeau said EA’s mobile division is “intrigued” by the prospect of gaming on wearables. He said wearables are eventually going to offer more performance and capability, thus enabling new gaming experiences. However, he cautioned that “it’s very early days” for wearable gaming.

“In fact, we have two teams prototyping wearable experiences that are not only standalone, but also some ideas where you can actually use the fitness component in the watch that can unlock capabilities in the game that might be on your iPhone. Or you could do crafting or some other auction trading on your watch that goes back into your tablet game that you might check out later when you get home,” he told CNET.

Courtesy-Fud

Ericsson Acquires Fabrix Systems For $95M

September 15, 2014 by mphillips  
Filed under Mobile

The distinction between TV and mobile services continues to blur, and in many cases the convergence happens in the cloud.

That’s the logic behind Ericsson’s planned $95 million acquisition of Fabrix Systems, which sells a cloud-based platform for delivering DVR (digital video recorder), video on demand and other services.

The acquisition is intended to help service providers deliver what Ericsson calls TV Anywhere, for viewing on multiple devices with high-quality and relevant content for each user. Cable operators, telecommunications carriers and other service providers are seeing rapid growth in video streaming and want to reach consumers on multiple screens. That content increasingly is hosted in cloud data centers and delivered via Internet Protocol networks.

Fabrix, which has 103 employees in the U.S. and Israel, sells an integrated platform for media storage, processing and delivery. Ericsson said the acquisition will make new services possible on Ericsson MediaFirst and Mediaroom as well as other TV platforms.

Stockholm-based Ericsson expects the deal to close in the fourth quarter. Fabrix Systems will become part of Ericsson’s Business Unit Support Solutions.

Other players usually associated with data networks are also moving into the once-specialized realm of TV. At last year’s CES, Cisco Systems introduced Videoscape Unity, a system for providing unified video services across multiple screens, and at this year’s show it unveiled Videoscape Cloud, an OpenStack-based video delivery platform that can be run on service providers’ cloud infrastructure instead of on specialized hardware.

 

 

HP Scoops Up Eucalyptus Software

September 15, 2014 by Michael  
Filed under Computing

HP is about to make one of its few purchases since it cocked up buying Autonomy in 2011.

HP wants to buy cloud software startup Eucalyptus, which provides open-source software for building private and hybrid clouds, or Internet-based computing services.

It is not a big sale — HP is tipped to pay less than $100 million. The acquisition is expected to close in the fiscal fourth quarter, after which Eucalyptus Chief Executive Officer Marten Mickos will join HP as senior vice president and head of its cloud business. He will report to CEO Meg Whitman and be tasked with running HP’s “Helion” cloud computing services. Martin Fink, who now leads the cloud business, will remain chief technology officer as well as director of HP Labs, which focuses on researching next-generation products.

Although it has been making losses, HP has a lot of cash in the bank. It ended the July 2014 quarter with $4.9 billion in operating company net cash. In August, Whitman told analysts HP was in a position to make acquisitions if needed, though it remains committed to returning half its cash to shareholders.

Courtesy-TheInq

HP May Be Looking To Unload Snapfish

September 15, 2014 by mphillips  
Filed under Around The Net

Hewlett-Packard Co is exploring a sale of its web-based photo sharing service Snapfish, and has held discussions with multiple private equity and industry buyers, a person with knowledge of the situation said.

Snapfish, which HP bought for more than $300 million in 2005 and currently sits within its printing and personal systems group, is considered non-core for the company, the person said, asking not to be named because the matter is not public.

A spokesman for HP declined to comment.

Last year, HP replaced the printing and personal systems business’s long-time head Todd Bradley with former Lenovo executive Dion Weisler. Bradley has since left the technology company to join Tibco Software Inc as its president.

Some of the parties that have been eyeing Snapfish have also expressed interest in buying another online photo-sharing services provider, Shutterfly Inc, the person said.

Shutterfly hired Frank Quattrone’s Qatalyst Partners over the summer to find a buyer, and is expected to wrap up the process in the next several weeks, people familiar with the matter have said previously.

Watch Dogs For Wii U Coming In November

September 15, 2014 by Michael  
Filed under Gaming

Finally, Ubisoft has a release date for the Wii U version of Watch Dogs. While we don’t know how many people are still waiting for the Wii U version, when it does arrive it could very well end up being one of the last M-rated titles for the console.

The release date for the Wii U version of Watch Dogs appears to be November 18th in North America and November 21st in Europe. This ends the delay Ubisoft announced when resources were moved to preparing the other versions of the game for release.

Ubisoft has been one of the strongest supporters of the Wii U, but it recently announced that it is done producing titles like Assassin’s Creed and Watch Dogs for the console because sales of M-rated titles are just not there on the Wii U platform. It did indicate that it would focus on some of its other Wii U titles that continue to be popular on the console.

The good news is that Wii U owners are finally getting Watch Dogs, but it looks like we will not see many more games like it on the platform.

Courtesy-Fud

Intel Shows Its Wireless Technology

September 12, 2014 by Michael  
Filed under Technology

Intel demoed its “no wires future” of wireless gigabit docking at its Intel Developer Forum (IDF) in California.

Intel wireless gigabit docking is a fully cable-free experience that includes wireless docking, wireless display and wireless charging. Intel demonstrated a reference design based on a next generation 14nm Intel processor on stage during its opening keynote on Tuesday.

Intel hopes to implement this technology by the end of 2015.

“Not only your wireless display, but storage, keyboard and mouse – all the other peripherals you have that have been weighing down our backpacks or strewn across our desk, we’re eliminating with one technology, and that’s wireless gigabit,” said an Intel expert on stage.

“It’s not only a secure and also localised connection – so you can use it in high dense areas such as in an office – but also extremely fast performing at over three times the performance of today’s WiFi.

“But while that’s cool we still have one more cord in our bag and let’s get rid of it: ditch that brick. That last thing that’s weighing us down is [resolved by] wireless power; the ease of use and installation it has is really going to be an advantage using the wireless resonance technology.”

The technology works over a simple receiver that goes into client devices, along with a resonance board that acts as a dock, which creates its own wireless hotspot.

Intel demonstrated how the standard will work using a laptop that automatically powered up and charged as soon as it reached the surface of the table due to the magnetic charging field built into the desk surface.

Intel said that this technology could also charge wireless Bluetooth earpieces, wearable devices, tablets and notebooks. However, it doesn’t have to be built into devices to work, as Intel said it can also be retrofitted into the cases of the devices we are carrying around.

Intel’s wireless gigabit technology is another push towards the firm’s vision of a cable-free future, meaning there’ll be no annoying wires or leads connecting computers to monitors, laptops to plug sockets or tablets to projectors.

The semiconductor giant first announced this view in August, saying that it’s looking to change the enterprise IT market with a strategy that will offer “three major experiences” in the office, that is, wireless display connectivity, wireless docking and wireless charging.

Courtesy-TheInq

SanDisk DAS Cache Going Into Dell PowerEdge

September 12, 2014 by Michael  
Filed under Computing

SanDisk has released more details about its joint venture with Dell, which will see SanDisk’s DAS Cache SSD caching software added to Dell’s PowerEdge servers.

SanDisk’s director of software marketing, Rich Petersen, told The INQUIRER: “We’re excited to be announcing a co-venture with a brand with Dell’s credentials that offers platform independent, brand independent caching.”

SanDisk DAS Cache is a pure software caching system that uses flash memory to improve latency at the server level by up to 37 times.

Network managers can choose to apportion part of an SSD, or a full dedicated SSD, for caching, with the software’s algorithms controlling the flow of data without the need for any additional hardware.

“An all-software solution allows anyone to take advantage of caching technology without the need for engineering knowledge or previous experience of configuration,” Petersen continued.

Users can create up to four cache pools with different prioritisations to provide quality of service (QoS) guarantees. Cache persistence ensures that the speed boost is maintained even if the server is rebooted.
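
SanDisk has not published its algorithms, but the general idea behind this kind of SSD read cache can be sketched in a few lines. The following is an illustrative LRU toy in Python, not DAS Cache’s actual logic:

```python
from collections import OrderedDict

class BlockCache:
    """Toy LRU read cache: hot blocks are served from fast flash,
    cold blocks fall through to the slower backing disk."""

    def __init__(self, capacity_blocks):
        self.capacity = capacity_blocks
        self.flash = OrderedDict()              # block_id -> data

    def read(self, block_id, read_from_disk):
        if block_id in self.flash:              # cache hit: fast path
            self.flash.move_to_end(block_id)
            return self.flash[block_id]
        data = read_from_disk(block_id)         # cache miss: slow path
        self.flash[block_id] = data
        if len(self.flash) > self.capacity:     # evict least recently used
            self.flash.popitem(last=False)
        return data

# Usage: cache = BlockCache(4); cache.read(7, lambda b: f"block-{b}")
```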

SanDisk DAS Cache will be available in the new range of Dell servers announced at IDF in San Francisco. However, users are not required to use SanDisk SSDs in the system, as the software works with all disk manufacturers’ products.

At launch the system supports Windows and Linux, with VMware support set to follow in 2015. The system also supports hypervisors including Microsoft Hyper-V.

SanDisk has made a number of advances in the enterprise market this year, including the first 4TB solid-state drive (SSD) and dedicated SSDs for business laptops.

Courtesy-Fud

Will eSports Overtake the NHL?

September 11, 2014 by Michael  
Filed under Gaming

You can’t accuse eSports League CEO Ralf Reichert of always telling people what they want to hear. At last month’s FanExpo Canada in Toronto, Ontario, just a few blocks away from the Hockey Hall of Fame, Reichert told GamesIndustry.biz that he saw competitive gaming overtaking the local pastime.

“Our honest belief is it’s going to be a top 5 sport in the world,” Reichert said. “If you compare it to the NHL, to ice hockey, that’s not a first row sport, but a very good second-row sport. [eSports] should be ahead of that… It’s already huge, it’s already comparable to these traditional sports. Not the Super Bowl, but the NHL [Stanley Cup Finals].”

Each game of this year’s Stanley Cup Finals averaged 5 million viewers on NBC and the NBC Sports Network. The finals of the ESL Intel Extreme Masters’ eighth season, held in March in Katowice, Poland, drew 1 million peak concurrent viewers, and 10 million unique viewers over the course of the weekend. That’s comparing the US audience for hockey to a global audience for the IEM series, but Reichert said the events are getting larger all the time.

As for how eSports have grown in recent years, the executive characterized it as a mostly organic process, and one that sometimes happens in spite of the major players. One mistake he’s seen eSports promoters make time and again is trying to be too far ahead of the curve.

“There have been numerous attempts to do celebrity leagues as a way to grow eSports, to make it more accessible,” Reichert said. “And rather than focusing on the core of eSports, the Starcrafts and League of Legends of the world, people tried to use easy games, put celebrities on it, and make a classic TV format out of it.”

One such effort, DirecTV’s Championship Gaming Series, held an “inaugural draft” at the Playboy Mansion in Beverly Hills and featured traditional eSports staples like Counter-Strike: Source alongside arguably more accessible fare like Dead or Alive 4, FIFA 07, and Project Gotham Racing 3.

“They put in tens of millions of dollars in trying to build up a simplified eSports league, and it was just doomed because they tried to simplify it rather than embrace the beauty of the apparent complexity.”

Complexity is what gives established sports their longevity, Reichert said. And while he dismisses the idea that eSports are any more complex than American football or baseball, he also acknowledged there is a learning curve involved, and it’s steep enough that ESL isn’t worrying about bringing new people on board.

“It’s tough for generations who didn’t grow up with gaming to get what Starcraft is,” Reichert said. “They need to spend 2-10 hours with it, in terms of watching it, getting it explained, and getting educated around it, or else they still might have that opinion. Our focus is more to have the generations who grew up with it as true fans, rather than trying to educate people who are outside of this conglomerate… There have been numerous attempts to make European soccer easier to approach, or American football, or baseball, but they all kill the soul of the actual sport. Every attempt to do that is just doomed.”

Authenticity is what keeps the core of the audience engaged, Reichert said. And even though there will always be purists who fuss over every change–Reichert said changing competitive maps in Starcraft could spark a debate like instant replay in baseball–being true to the core of the original sport has been key for snowboarding, mixed martial arts, and every other successful upstart sport of the last 15 years.

“Like with every new sport, the biggest obstacle has been people not believing in it,” Reichert said. “And it goes across media, sponsorships, game developers, press, everyone. The acceptance of eSports was a hard fought battle over a long, long time, and there’s a tipping point where it goes beyond people looking at it like ‘what the hell is this?’ And to reach that point was the big battle for eSports… The thing is, once we started to fill these stadiums, everyone looking at the space instantly gets it. Games, stadiums, this is a sport. It’s such a simple messaging that no one denies it anymore who knows about the facts.”

That’s not to say everybody is convinced. ESPN president John Skipper recently dismissed eSports as “not a sport,” even though his network streamed coverage of Valve’s signature Dota 2 tournament earlier this year. Reichert admitted that mainstream institutions seem to be lagging behind when it comes to acceptance, particularly with sponsors. While companies within the game industry are sold on eSports, non-endemic advertisers are only beginning to get it.

“The very, let’s say progressive ones, like Red Bull, are already involved,” Reichert said. “But to get it into the T-Mobiles and other companies as a strategy piece, that will still take some time. The market in terms of the size and quality of events is still ahead of the sponsorship, but that’s very typical.”

Toronto was the second stop for ESL’s IEM Season 9 after launching in Shenzhen July 16. The league is placing an international emphasis on this year’s competition, with additional stops planned in the US, Europe, and Southeast Asia.

Courtesy-GI.biz

Microsoft Unveils A Re-designed MSN

September 10, 2014 by mphillips  
Filed under Around The Net

Microsoft has unveiled its complete makeover of the MSN portal that now combines easy access to personal productivity tools and content from a large number of providers.

As the company tries to revive MSN, the focus this time is on top content from around the Web rather than original content. For the relaunch, the company has signed up over 1,300 publishers worldwide, including The New York Times, The Wall Street Journal, Yomiuri, CNN and The Guardian.

A “Services Stripe” at the top of the MSN homepage gives users easy access to personal services including Outlook.com email, OneDrive, Office 365 and Skype, as well as popular third-party sites like Twitter and Facebook, according to an online preview launched by Microsoft on Sunday.

The new MSN also provides “actionable information,” content, and personal productivity tools such as shopping lists, a savings calculator, a symptom checker, and a 3D body explorer. Readers will have access to content from 11 sections including sports, news, health and fitness, money, travel and video, wrote Frank Holland, corporate vice president of Microsoft Advertising, in a blog post.

The company said it has rebuilt MSN from the ground up for a mobile-first, cloud-first world. The new MSN helps people complete their key digital tasks across all of their devices, wrote Brian MacDonald, Microsoft’s corporate vice president for information and content experiences, in a blog post.

“Information and personalized settings are roamed through the cloud to keep users in the know wherever they are,” MacDonald added. Users worldwide can try out the new MSN preview.

In the coming months, Microsoft plans to release MSN apps across iOS and Android to complement its corresponding Windows and Windows Phone apps. “You only need to set your favorites once, and your preferences will be connected across MSN, Cortana, Bing and other Microsoft experiences,” MacDonald wrote.

Microsoft claims an audience of more than 437 million people across 50 countries for MSN.

MSN.com ranks number 26 among the top sites in the U.S., behind Microsoft’s own Bing site, Google’s search site, YouTube, Facebook and Yahoo’s portal, according to traffic estimates by Alexa.

IBM And Intel Move To Improve Cloud Security

September 10, 2014 by Michael  
Filed under Computing

IBM and Intel have announced that SoftLayer will be the first cloud platform to offer customers bare metal service that provides monitoring and security down to the microchip level. The move will tighten up security on cloud based systems just as Apple’s iCloud appeared to be hacked.

The IBM system works with Intel’s Trusted Execution Technology (TXT) which identifies if traffic is coming from a known location using trusted hardware. Intel TXT verifies components of a computing system from its operating system to its boot firmware and hardware and can then permit or deny a workload from running on that select server system. The increased security is also activated during boot up, meaning that it doesn’t add any performance overhead to applications.
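
Intel has not published TXT’s internals here, but the trust decision described above (measure each boot component, compare against known-good values, then permit or deny the workload) can be illustrated conceptually. The Python sketch below is a loose analogy, not Intel’s implementation:

```python
import hashlib

# Known-good measurements for each boot stage. In real Intel TXT these
# live in protected hardware, not a Python dict; the digests here are
# placeholders.
KNOWN_GOOD = {
    "boot_firmware": "placeholder-digest-1",
    "os_kernel": "placeholder-digest-2",
}

def measure(blob: bytes) -> str:
    return hashlib.sha256(blob).hexdigest()

def permit_workload(components: dict) -> bool:
    """components maps stage name -> raw bytes of that boot stage."""
    for name, blob in components.items():
        if measure(blob) != KNOWN_GOOD.get(name):
            return False    # measurement mismatch: deny the workload
    return True             # whole chain verified: workload may run
```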

It will also help organizations improve governance, compliance, audit, application security, privacy, identity and access management, and incident response.

Mark Jones, chief technology officer for SoftLayer, said that perceived security flaws were the biggest barrier to cloud adoption.

“SoftLayer is the only bare-metal cloud platform offering Intel TXT, leading the industry in enabling customers to build hybrid and cloud environments that can be trusted from end-to-end,” he added.

Courtesy-Fud

NTT Experimenting With 400Gbps Optical Technology For Internet Backbone

September 9, 2014 by mphillips  
Filed under Around The Net

NTT has successfully tested technology for optical Internet backbone connections capable of transmitting 400Gbps on a single wavelength.

Working with Fujitsu and NEC, the Japanese telecommunications giant verified the digital coherent optical transmission technology for distances of several thousand kilometers to 10,000 km. With it, a single wavelength of light can carry 400 Gbps, four times the capacity of previous systems. Each fiber can carry multiple wavelengths, and many fibers can be bundled into one cable.

The approach could more than double existing capacity to meet ever-increasing bandwidth demand, especially by heavy data users.

The technology could be used in the next generation of backbone links, which aggregate calls and data streams and send them over the high-capacity links that go across oceans and continents. The fiber in the network would stay the same, and only the equipment at either end would need to change.

While the current capacity on such links is up to 8Tbps (terabits per second) per fiber, the new technology would make a capacity of 24Tbps per fiber possible, according to NTT.

“As an example of the data size, 24 Tbps corresponds to sending information contained in 600 DVDs (4.7 GB per DVD) within a second,” an NTT spokesman wrote in an email. “The verification was done using algorithms which are ready to be implemented in CMOS circuits to show that these technologies are practically feasible.”
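
The headline figures hang together, as a quick check shows (the 60-wavelength count is our inference from the quoted numbers, not something NTT stated):

```python
PER_WAVELENGTH_GBPS = 400    # quoted per-wavelength rate
FIBER_TBPS = 24              # quoted per-fiber capacity

# Implied wavelength count per fiber.
print(FIBER_TBPS * 1000 / PER_WAVELENGTH_GBPS)    # 60.0 wavelengths

# NTT's DVD illustration: 24 Tbps is 3 TB per second, and
# 600 DVDs x 4.7 GB is 2.82 TB, roughly one second's worth.
print(FIBER_TBPS / 8 * 1000 / 4.7)                # ~638 DVDs per second
```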

To compensate for distortions along the optical fiber, researchers from the consortium developed digital backward propagation signal processing with an optimized algorithm. The result of this and other research is that the amount of equipment required for transmissions over long distances can be reduced, meaning the network could consume less electricity.

“We are extremely excited to show this groundbreaking performance surpassing 100 Gbps coherent optical transmission systems,” Masahito Tomizawa, executive manager of consortium leader NTT Network Innovation Labs, wrote in an email. “This new technology maintains the stability and reliability of our current 100 Gbps solutions while at the same time dramatically improving performance.”

The consortium said it is taking steps toward commercialization of the technology on a global scale but would not say when that might happen.