Tesla Electric Trucks Get Vote Of Confidence From PepsiCo

December 13, 2017 by  
Filed under Around The Net

PepsiCo Inc has reserved 100 of Tesla Inc’s new electric Semi trucks, the biggest known order of the big rig, as the maker of Mountain Dew soda and Doritos chips seeks to reduce fuel costs and fleet emissions, a company executive said on Tuesday.

Tesla has been trying to convince the trucking community that it can build an affordable electric big rig with the range and cargo capacity to compete with relatively low-cost, time-tested diesel trucks.

 Early orders reflect uncertainty over how the market for electric commercial vehicles will develop. About 260,000 heavy-duty Class-8 trucks are produced in North America annually, according to FTR, an industry economics research firm.

PepsiCo intends to deploy Tesla Semis for shipments of snack foods and beverages between manufacturing and distribution facilities and direct to retailers within the 500-mile (800-km) range promised by Tesla Chief Executive Elon Musk.

The semi-trucks will complement PepsiCo’s U.S. fleet of nearly 10,000 big rigs and are a key part of its plan to reduce greenhouse gas emissions across its supply chain by a total of at least 20 percent by 2030, said Mike O’Connell, the senior director of North American supply chain for PepsiCo subsidiary Frito-Lay.

PepsiCo is analyzing what routes are best for its Tesla trucks in North America but sees a wide range of uses for lighter loads like snacks or shorter shipments of heavier beverages, O’Connell said.

Tesla did not immediately reply to a request for comment.

 Tesla unveiled the Semi last month and expects the truck to be in production by 2019.

Broadcom, Qualcomm Merger May Face Resistance In China

November 8, 2017 by  
Filed under Consumer Electronics

The proposed mega-merger between chipmaker Broadcom Ltd and U.S. rival Qualcomm Inc is likely to face stern scrutiny from China, antitrust lawyers say, amid a strategic push by Beijing into semiconductors.

Broadcom made an unsolicited $103 billion bid for Qualcomm on Monday, aimed at creating a $200-billion-plus behemoth that could reshape the industry at the heart of mobile phone hardware.

But Chinese regulatory approval could be a hold-up. Beijing and Washington have sparred over technology deals, including in chips, with the Committee on Foreign Investment in the United States (CFIUS) knocking back a number of takeovers involving Chinese firms this year.

The thorny topic is likely to come up when U.S. President Donald Trump visits China this week – with Qualcomm executives in tow.

The merger would face a lengthy review from the anti-monopoly unit of China’s commerce ministry, due to strategic concerns, the huge size of the deal and because Qualcomm has come under fire before in the country over competition concerns.

“This is a critical industry for China and Qualcomm has been fined by the Ministry of Commerce (Mofcom) before so it’s on its radar,” said Wendy Yan, Shanghai-based partner at law firm Faegre Baker Daniels.

Qualcomm agreed to pay a record fine of $975 million in China in 2015 to end a probe into anti-competitive practices related to so-called “double dipping” by billing Chinese customers patent royalty fees in addition to charging for the chips.

China is making a major push to develop its own semiconductor industry under local champions such as Tsinghua Unigroup and Fujian Grand Chip Investment to help cut reliance on global operators including Qualcomm, Samsung Electronics Co Ltd and Intel Corp.

Can A Robot Manage People

October 24, 2017 by  
Filed under Around The Net

Up to a third of British workers would be happy to report to a robot boss given the option, but most thought that if a robot was the boss, it should pay tax.

The survey of 1,000 workers for accounting package FreeAgent found that 31 per cent of those surveyed said they would be happy to work for a robot, with 10 per cent believing it would be “just the same as answering to a human boss”.

Some 42 per cent said they would be “comfortable” taking orders from a robot. Men are more receptive than women, with 48 per cent of men saying yes, but just 36 per cent of women.

The most enthusiastic were the Welsh, where 38 per cent said they were down with a metallic master, whilst Northern Ireland was just behind with 37 per cent.

Bill Gates has said he believes that robot workers should pay tax like the rest of us (presumably subbed by pocket money from their masters – it’s a posh way of saying that there should be a robot levy to protect human workers). 57 per cent agree that “if they’re replacing the role of a person, the company owning the robot should be taxed the same”.

However, 43 per cent say that it would set a precedent for taxing technology, a view echoed by the EU in recent findings.

Ed Molyneux, CEO and co-founder of FreeAgent, said: “Although it might be many years before we see physical robots taking over the workforce, many workers are already anticipating the changes that automation will bring in the years ahead.”

“The shifting landscape of AI and new technology will have a major impact on people in employment, but I don’t think that this is a gloomy outlook for the workforce. Previous research we’ve carried out has suggested that many employed people are keen to quit their jobs and start their own businesses. So as automation takes a more prominent role in the workforce, it’s likely we could see a self-employment boom in the future.”

“In this scenario, automation will actually be a major benefit for these new businesses, as technological advances will make business admin and data management much easier to manage than ever before.”

Alternatively, it could just be bloody creepy and lead to mass unemployment. And then for others, they’ll probably never notice the ruddy difference. We’ll leave that for you to decide.

Courtesy-TheInq

Will RISC-V Finally Hit Linux Next Year

October 16, 2017 by  
Filed under Computing

Linux fanboys tend to announce a lot of “year of” events. There is the year of the desktop, which appears to be every year and still never happens, and now there is the year of the RISC-V Linux processor.

SiFive has declared that 2018 will be the year of the RISC-V Linux processor, so mark your penguin diaries accordingly. In the UK all sorts of events are planned, including guess-the-weight-of-Linus-Torvalds competitions, penguin tossing at Slough, a bring and buy sale at the over-80s Linux nudist club, and an open sauce bobsleigh event down the escalators of Covent Garden tube station.

SiFive released its first open-source system on a chip, the Freedom Everywhere 310, last year. At the time it said it was aiming to push the RISC-V architecture to transform the hardware industry in the way that Linux transformed the software industry.

This year it released its U54-MC Coreplex, the first RISC-V-based chip that supports Linux, Unix and FreeBSD. The latest release opens up a whole new world of use cases for the architecture and paves the way for RISC-V processors to compete with ARM cores and similar offerings in the enterprise and consumer space.

The outfit claims that next year companies looking to build SoCs around RISC-V will flock to the new developments.

Andrew Waterman, co-founder and chief engineer at SiFive, said the forthcoming silicon is going to enable much better software development for RISC-V.

Waterman said that, while SiFive had developed low-level software such as compilers for RISC-V, the company hopes that the open-source community will take a much broader role in pushing the technology forward.

“No matter how big of a role we would want to have we can’t make a dent. But what we can do is make sure the army of engineers out there are empowered.”

Courtesy-Fud

Samsung Goes 28nm With MRAM Chips

October 5, 2017 by  
Filed under Computing

Samsung Foundry will soon mass produce magnetoresistive random-access memory (MRAM) chips using 28nm fully depleted silicon-on-insulator (FD-SOI) process technology.

Samsung is reportedly teaming up with NXP and has completed the tape-out of its 28nm FD-SOI embedded MRAM, which will first be applied to NXP’s new low-power i.MX chips. The memory will be aimed at automotive, multimedia and display panel applications.

Synopsys announced that its Design Platform has been fully certified for use on Samsung Foundry’s 28nm FD-SOI process technology. It said that a PDK and a comprehensive reference flow compatible with Synopsys’ Lynx Design System, containing scripts, design methodologies and best practices, are now available.

Samsung’s foundry solutions team senior VP Jaehong Park said Samsung Foundry’s 28FD-SOI technology allows designs to operate both at high and low voltage making it ideal for IoT and mobile applications.

“The FD-SOI technology exhibits the best soft error immunity, and, therefore, is well suited for applications that require high reliability such as automotive,” Park said.

Courtesy-Fud

Is Open Source Winning

July 17, 2017 by  
Filed under Around The Net

Going way back, pretty much all software was effectively open source. That’s because it was the preserve of a small number of scientists and engineers who shared and adapted each other’s code (or punch cards) to suit their particular area of research. Later, when computing left the lab for the business, commercial powerhouses such as IBM, DEC and Hewlett-Packard sought to lock in their IP by making software proprietary and charging a hefty license fee for its use.

The precedent was set and up until five years ago, generally speaking, that was the way things went. Proprietary software ruled the roost and even in the enlightened environs of the INQUIRER office mention of open source was invariably accompanied by jibes about sandals and stripy tanktops, basement-dwelling geeks and hairy hippies. But now the hippies are wearing suits, open source is the default choice of business and even the arch nemesis Microsoft has declared its undying love for collaborative coding.

But how did we get to here from there? Join INQ as we take a trip along the open source timeline, stopping off at points of interest on the way, and consulting a few folks whose lives or careers were changed by open source software.

The GNU project
The GNU Project (for GNU’s Not Unix – a typically in-jokey open source moniker; it’s recursive, don’t you know?) was created in 1983 by archetypal hairy coder Richard Stallman, the man widely regarded as the father of open source. GNU aimed to replace the proprietary UNIX operating system with one composed entirely of free software – meaning code that could be used or adapted without having to seek permission.

Stallman also started the Free Software Foundation to support coders, litigate against those such as Cisco who broke license terms and defend open-source projects against attack from commercial vendors. In 1989 he also wrote the GNU General Public License (GNU GPL), a “copyleft” license, which means that derivative work can only be distributed under the same license terms. Now on its third iteration, GPLv3, it remains the most popular way of licensing open source software. Under the terms of the GPL, code may be used for any purpose, including commercial uses, and even as a tool for creating proprietary software.

PGP
Pretty Good Privacy (PGP) encryption was created in 1991 by anti-nuclear activist Phil Zimmermann, who was rightly concerned about the security of the online bulletin boards where he conversed with fellow protesters. Zimmermann decided to give his invention away for free. Unfortunately for him, it was deployed outside his native USA, a fact that nearly landed him with a prison sentence, digital encryption being classed as a munition and therefore subject to export regulations. However, the ever-resourceful Mr Zimmermann challenged the case against him by reproducing his source code in the form of a decidedly undigital hardback book which users could scan using OCR. Common sense eventually won the day and PGP now underpins much modern communications technology including chat, email and VPNs.
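
For the curious, the trick Zimmermann gave away is essentially hybrid encryption: the message is scrambled with a fast, throwaway symmetric key, and only that small key is encrypted with the recipient’s public key. Here is a minimal sketch of that pattern using Python’s cryptography package purely for illustration – it is not the OpenPGP format itself, and the key sizes and message are placeholders.

```python
# Hybrid encryption in the PGP style: symmetric key for the message,
# public-key wrapping for the symmetric key. Illustrative only, not OpenPGP.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.fernet import Fernet

recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Sender: encrypt the message with a fresh session key, then wrap that key.
session_key = Fernet.generate_key()
ciphertext = Fernet(session_key).encrypt(b"meet at the bulletin board at noon")
wrapped_key = recipient_key.public_key().encrypt(session_key, oaep)

# Recipient: unwrap the session key with the private key, then decrypt.
recovered_key = recipient_key.decrypt(wrapped_key, oaep)
print(Fernet(recovered_key).decrypt(ciphertext))
```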

“PGP represents the democratisation of privacy,” commented Anzen Data CIO and developer of security software, Gary Mawdsley.

Linux
In 1991 Finnish student and misanthrope Linus Torvalds created a Unix-like kernel, inspired by the educational operating system MINIX, as a hobby project. He opened up his project so that others could comment. And from that tiny egg, a mighty penguin grew.

Certainly, he could never have anticipated being elevated to the position of open-source Messiah. Unlike Stallman, Torvalds, who has said many times that he’s not a “people person” or a natural collaborator (indeed recent comments have made him seem more like a dictator – albeit a benevolent one), was not driven by a vision or an ideology. Making Linux open source was almost an accident.

“I did not start Linux as a collaborative project, I started it for myself,” Torvalds said in a TED talk. “I needed the end result but I also enjoyed programming. I made it publicly available but I had no intention to use the open-source methodology, I just wanted to have comments on the work.”

Nevertheless, like Stallman’s, the Torvalds name is pretty much synonymous with open source, and Linux quickly became the server operating system of choice, also providing the basis of Google’s Android and Chrome OS.

“Linux was and is an absolute game-changer,” says Chris Cooper of compliance software firm KnowNow. “It was the first real evidence that open could be as good as paid for software and it was the death knell of the OS having a value that IT teams would fight over. It also meant that the OS was no longer a key driver of architectural decisions: the application layer is where the computing investment is now made.”

Red Hat
Red Hat, established in 1995, was among the first proper enterprise open source companies, and it went public in 1999 with a highly successful IPO. Because it was willing to bet big on the success of open source at a time when others were not, Red Hat is the most financially buoyant open source vendor, achieving a turnover of $1bn 13 years after that IPO. Its business model revolves around offering services and certification around its own Linux distribution plus middleware and other open source enterprise software.

“Red Hat became successful by making open source stable, reliable and secure for the enterprise,” said Jan Wildeboer, open source affairs evangelist at the firm.

Courtesy-TheInq

 

Is Google Poaching From Apple To Help The Pixel Phone

June 23, 2017 by  
Filed under Around The Net

Google has reportedly scooped up veteran chip architect Manu Gulati from Apple, fuelling speculation that the firm is designing custom silicon for its Pixel smartphones. 

Gulati has confirmed his move to Google on his recently updated LinkedIn profile, where he’s now listed as ‘Lead SoC Architect’ at the Mountain View firm.

His profile doesn’t give much else away, but Variety reports that Google has roped in Gulati to help it build custom chips for its future Pixel smartphones, as it looks to ditch Qualcomm in a bid to better take on the iPhone. 

Gulati certainly has the experience, having been instrumental in Apple’s efforts in building custom chips for the iPad, iPhone, and Apple TV, from the single-core A4 chip found inside the original iPad to the six-core A10X Fusion processor powering the new iPad Pro. 

What’s more, prior to joining Apple, Gulati worked for almost 15 years at chip makers AMD and Broadcom, giving him a total of 27 years of experience in the industry.

Coinciding with Gulati’s hire, Google has posted a number of job advertisements for chip design-related positions, including one for a ‘Mobile SoC CPU Architect’ and a ‘Mobile SoC Architect,’ who will “help define the architecture of future generations of phone and tablet chips.”

As well as shifting to custom silicon for its homegrown smartphones, Google is shifting OEM partners, according to 9to5Google. It reports that HTC has been binned in favour of LG, which has been roped in to build the next-generation Pixel XL, codenamed ‘Taimen’. 

It’s unclear why Google has ditched HTC in favour of LG, but the report notes that the firm was perhaps dissatisfied with HTC’s manufacturing scale, given that both the Pixel and Pixel XL experienced severe shipping delays.

Courtesy-Fud

Is Samsung Readying MRAM?

May 2, 2017 by  
Filed under Computing

For those who came in late, MRAM is a nonvolatile memory that stores data using the electron spin of a magnetic material, writing and reading bits as changes in resistance when current is applied. It is as fast as DRAM but has other advantages.

Samsung’s cunning plan is to use MRAM as an embedded memory for system semiconductors. This means flogging the tech as intellectual property for its process technologies rather than selling individual products.

Samsung Electronics’ Device Solutions division’s System LSI business has finished a prototype SoC with MRAM built in and is now pitching it to customers.

At the Samsung Foundry Forum on 24 May, it is going to detail its process technology for MRAM embedded memory. Meanwhile, NXP and Samsung Electronics have agreed a foundry contract covering mass production of 28nm FD-SOI.

Starting this year, NXP’s i.MX SoC series will be mass-produced on the FD-SOI process for the Internet of Things. This year’s new products will have flash memory built in, while the MRAM embedded memory technology will be used for next year’s next-generation SoCs and MCUs.

With the FD-SOI process, customers can choose between flash memory and MRAM as the embedded memory technology.

Samsung says the production cost of embedded MRAM is lower than that of embedded flash memory. Ten masks are needed to build 45nm flash memory inside an SoC, and 20 masks are needed for 28nm flash memory.

However, only three or four masks are needed for MRAM, which lowers the number of process steps.
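
To put those mask counts side by side – a trivial sketch using the figures quoted above, with the upper end of the three-to-four mask range assumed for MRAM:

```python
# Mask counts quoted above for embedding each memory type in an SoC.
masks = {"45nm embedded flash": 10, "28nm embedded flash": 20, "embedded MRAM": 4}

flash_28nm = masks["28nm embedded flash"]
mram = masks["embedded MRAM"]
print(f"MRAM needs {mram / flash_28nm:.0%} of the masks of 28nm embedded flash")
# -> MRAM needs 20% of the masks of 28nm embedded flash, i.e. roughly an 80% reduction
```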

MRAM is also smaller than flash memory and faster than normal flash. Compared to SRAM, MRAM takes up only a third of the area.

As Samsung Electronics applies MRAM to its foundry business, it is likely to open up new memory markets. Intel and Micron have already commercialized a PRAM product called ‘3D XPoint’.

Courtesy-Fud

G.Skill Goes Superfast DDR4

April 26, 2017 by  
Filed under Computing

RAM maker G.SKILL has released a new DDR4-4333MHz 16GB (2x 8GB) memory kit.

The outfit said that it has managed to overclock it to 4500MHz using an Intel Core i5-7600K processor paired with an ASUS ROG Maximus IX Apex motherboard.

“The latest addition to the Trident Z series of extreme performance memory kit is the DDR4-4333MHz CL19-19-19-39 timing in 16GB (8GBx2) at 1.40V. This is the first DDR4-4333MHz memory kit on the market in the 8GBx2 configuration for a total of 16GB,” said G.SKILL.

The company said that continuing with the pursuit of extreme memory speeds on the latest hardware, G.SKILL has reached an extreme DDR4-4500MHz speed on the Intel Z270 platform, “achieving a stunning bandwidth write speed of 65GB per second in dual channel mode”.
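
As a rough sanity check on that figure, peak DDR4 bandwidth is just the transfer rate multiplied by the bus width and the number of channels. A back-of-envelope sketch, assuming the standard 64-bit DDR4 channel and the dual-channel setup described:

```python
def ddr4_peak_bandwidth_gbs(transfer_rate_mts, channels=2, bus_bytes=8):
    """Theoretical peak bandwidth in GB/s: transfers/s x bytes per transfer x channels."""
    return transfer_rate_mts * 1e6 * bus_bytes * channels / 1e9

print(ddr4_peak_bandwidth_gbs(4333))  # ~69.3 GB/s for the rated DDR4-4333 kit
print(ddr4_peak_bandwidth_gbs(4500))  # 72.0 GB/s for the overclocked DDR4-4500 run
# The quoted 65GB/s write speed is roughly 90 per cent of the 72GB/s theoretical ceiling.
```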

No word on price or release date yet.

Courtesy-Fud

Is JavaScript The Most Popular Language?

March 29, 2017 by  
Filed under Computing

Beancounters at RedMonk have taken time out from their busy prayer wheels to create a list of the world’s most popular programming languages.

The list is based on data from both GitHub and Stack Overflow and the Red Monks have chanted a top 10 list for 2017.

1: JavaScript
2: Java
3: Python
4: PHP
5: (tie) C# and C++
6: (tie) Ruby and CSS
7: C
8: Objective-C

While there was little change in the top ten, there were a few stat changes among the also-rans. This was mostly because GitHub data now counts the number of pull requests rather than the number of repositories.

As a result, Swift was a major beneficiary of the new GitHub process, jumping eight spots from 24 to 16.

For those who came in late, Swift was supposed to be the Great White Hope before enthusiasm gave way to scepticism. The language appears to be entering something of a trough of disillusionment, but the Red Monks note that Swift has reached a top 15 ranking faster than any other language they have tracked since the rankings began.

TypeScript also did well, moving up 17 places, and PowerShell moved from 36 to 19.

One of the biggest overall gainers of any of the measured languages, Rust leaped from 47 to 26, one spot behind Visual Basic.
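
RedMonk does not spell out its exact formula here, but the idea is to rank each language on GitHub activity (now pull requests rather than repositories) and on Stack Overflow discussion, then order languages by how the two ranks combine. A toy sketch of that approach – the counts below are made-up placeholders, not RedMonk’s data:

```python
# Hypothetical counts, for illustration only.
github_pull_requests = {"JavaScript": 980_000, "Java": 870_000, "Python": 820_000, "Swift": 160_000}
stackoverflow_tags   = {"JavaScript": 1_500_000, "Java": 1_400_000, "Python": 1_000_000, "Swift": 250_000}

def rank(counts):
    """Map each language to its 1-based rank, highest count first."""
    ordered = sorted(counts, key=counts.get, reverse=True)
    return {lang: i + 1 for i, lang in enumerate(ordered)}

gh_rank, so_rank = rank(github_pull_requests), rank(stackoverflow_tags)
combined = sorted(gh_rank, key=lambda lang: (gh_rank[lang] + so_rank[lang]) / 2)
print(combined)  # ['JavaScript', 'Java', 'Python', 'Swift']
```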

Courtesy-Fud

Linux To Support Virtual GPUs

February 24, 2017 by  
Filed under Computing

Open source’s Mr Sweary, Linus Torvalds, has announced the general availability of the Linux 4.10 kernel series, which includes virtual GPU (Graphics Processing Unit) support.

Announcing the release, Torvalds wrote: “On the whole, 4.10 didn’t end up as small as it initially looked”.

The kernel has a lot of improvements, security features and support for the newest hardware components, which makes it more than just a routine update.

Most importantly, there is virtual GPU support, a new “perf c2c” tool for analysing cacheline contention on NUMA systems, support for managing the L2/L3 caches of Intel processors (Intel Cache Allocation Technology), eBPF hooks for cgroups, hybrid block polling and better writeback management.

A new “perf sched timehist” feature has been added in Linux kernel 4.10 to provide a detailed history of task scheduling, and there is experimental writeback cache and FAILFAST support for MD RAID5.

It looks like Ubuntu 17.04 will be the first stable OS to ship with Linux 4.10.

Courtesy-Fud

G.Skill Goes Trident Z with DDR4

November 10, 2016 by  
Filed under Computing

G.Skill has announced its latest and fastest Trident Z DDR4 64GB (4x16GB) memory kit, working at 3600MHz with CL17 latency.

About two months after it announced its 32GB (8GBx4) DDR4-3866 Trident Z memory kit, G.Skill has now pushed the frequency limit on its 64GB memory kit by announcing the Trident Z 64GB kit working at 3600MHz and CL17-19-19-39 latency and at 1.35V.

The 64GB Trident Z DDR4-3600 memory kit features four 16GB memory modules based on high-performance Samsung 8Gb ICs and comes with the well-known Trident Z heatspreader design. G.Skill was quite keen to note that it has already tested the kit on an Asus Z170-Deluxe motherboard with an Intel Core i5-6600K CPU.

According to G.Skill, the Trident Z DDR4-3600 64GB memory kit should be available in retail in December but we weren’t given a precise price.

Courtesy-Fud

Singapore To Put Self-driving Buses On The Road

October 20, 2016 by  
Filed under Around The Net

Singapore has signed an agreement to begin testing self-driving buses, as the city-state pushes ahead with its vision of using autonomous technology to help deal with the challenges posed by its limited land and labor.

Countries around the world are encouraging the development of such technologies, and high-density Singapore is hoping driverless vehicles will prompt its residents to use more shared vehicles and public transport.

“They say big dreams start small, so we are collaborating with NTU (Nanyang Technological University) on an autonomous bus trial, starting with two electric hybrid buses,” Singapore’s transport regulator said in a Facebook post.

The Land Transport Authority hopes eventually to outfit existing buses with sensors and develop a self-driving system that can effectively navigate Singapore’s traffic and climate conditions.

It did not specify when the trial would start.

Earlier this week, Singapore said it would seek information from the industry and research institutes on the potential use of self-driving vehicles for street cleaning and refuse collection.

Self-driving vehicles are also being tested in another western Singapore district, where a driverless car collided with a truck on Tuesday when changing lanes. Developer nuTonomy, which started trials of the world’s first robo-taxis in August, said it was investigating the accident.

Is Oracle Hitting A Road Block With Java?

October 3, 2016 by  
Filed under Computing

Oracle’s attempt to get a new trial over the Java APIs has been rejected by a judge who happens to know about programming.

Oracle lost the case when a jury decided that the use of 37 Java APIs in Android was fair use.

Oracle thought those 37 APIs were worth $9 billion and asked for a new trial, claiming that Google had concealed information during discovery about its plans to integrate Android apps with the Chrome OS running on desktops and laptops, thus extending the scope of the infringement beyond smartphones and tablets.

District Judge William Alsup, who is a hobby programmer when he is not running the U.S. District Court for the Northern District of California, denied Oracle’s request for a new trial, which would have been the third time the case was tried.

In the decision he said that in 2015 Google began a new project, which it internally called ARC++. Its aim was to provide Chrome OS users with Play store Android apps without developer action.

“ARC++ would run an isolated instance of Android (with all of Android’s public APIs, including those reimplemented from Java) in order to allow users to run all Android apps on Chrome OS devices. Google planned to include its ‘Play Store’ — Google’s app wherein users could purchase and download other Android apps — as part of ARC++ to facilitate access to those apps,” wrote Judge Alsup in his order.

However, the judge did not agree that Google had “stonewalled” and completely concealed the ARC++ project. He noted that Google had produced at least nine documents discussing the goals and technical details of ARC++ in 2015, at least five months before trial.

He said that Oracle’s failure to review the ARC++ documents was its own fault.

Judge Alsup pointed out that the evidence on ARC++ would not have impacted the trial in May because any evidence relating to implementations of Android on devices other than smartphones and tablets fell outside its scope.

“It may well be true that the use of the copyrighted APIs in ARC++ (or any other later use) will not qualify as a fair use, but that will not and does not mean that Google’s argument on transformative use as to the original uses on trial (smartphones and tablets) was improper. That Oracle failed to detect the ARC++ documents in its possession had no consequence within the defined scope of our trial,” Judge Alsup wrote.

Google’s launch of the full Android system on Chrome OS is still in its preliminary stages and available only to developers on a limited set of devices, the judge said.

The Judge’s order also dismissed Oracle’s demand for a new trial because of the exclusion of what the judge described as “minor evidence and testimony” from Stefano Mazzocchi, a member of the board of directors of the Apache Software Foundation in 2008.

It also rejected Oracle’s contention that the court had improperly excluded its document containing replies to the European Commission, which had asked for an explanation of the dispute between Google and Sun Microsystems, the company that developed Java and was later acquired by Oracle.

This one will go on forever. We are expecting Oracle to take it to the Supreme Court.

Courtesy-Fud

Is The IoT Really Taking Off?

September 26, 2016 by  
Filed under Around The Net

While every IT company in the entire universe appears to be hyping the internet of things (IoT) as the next big thing, there’s very little substance to the claims.

That’s according to Malcolm Penn, chairman and CEO of semiconductor analyst firm Future Horizons.

Speaking at a conference in London, Penn said: “The IoT is overhyped but some common sense is entering. The IoT is not one space. Everybody has their own spin on it.”

He said that the next phase of the IoT is “mercifully not wearables” and that watches are not the right form factor. “Even Apple failed to break the [conundrum of the] 40-year-old digital watch.”

Penn thinks, however, that there is some merit to the IoT concept.

“Connectivity is the key and data analysis is getting some traction,” he said. He added that the technology has to be straightforward and simple. Where the IoT may come into its own, he suggested, was electronic cars.

“There are more components in a smartphone than an electronic car,” he said.

Courtesy-Fud
