A Majority Of Apple Pay Users Complain Of Issues

April 2, 2015 by mphillips  
Filed under Mobile

Apple Pay is not winning the hearts of many people who attempt to use the mobile payment service at the register.

A survey from research firm Phoenix Marketing International found that 68 percent of respondents who have used Apple Pay had encountered an issue when making an in-store purchase.

The leading complaint, made by nearly half of respondents, was that retailers’ sales terminals took too long to record a transaction. Other problems: employees who didn’t know how to process sales with the mobile wallet (42 percent); errors in how the sale posted (36 percent), such as a transaction appearing twice; and out-of-service Apple Pay terminals (27 percent). Almost half of the Apple Pay users surveyed (47 percent) found that the particular store they visited didn’t accept Apple Pay even though the retailer was supposed to support the service.

Apple Pay launched in October and is accepted at 700,000 locations and supported by 2,500 banks in the U.S., CEO Tim Cook said at an event earlier this month. Retailers that accept Apple Pay include Macy’s, Subway, Nike, Whole Foods and McDonald’s. Apple hasn’t shared details on when the service will be expanded internationally.

People appear eager to use Apple Pay, with 59 percent answering that they have asked store employees if the merchant accepts payments with the service. Using Apple Pay requires linking a credit or debit card to the service.

Respondents most commonly used the mobile payment system in Apple stores (46 percent), followed by McDonald’s (36 percent) and Macy’s (30 percent). Apple Pay was also popular at Nike stores, Whole Foods and Walgreens.

AMD Working On Asynchronous Shaders

April 2, 2015 by Michael  
Filed under Computing

AMD has been working closely with Microsoft on the upcoming DirectX 12 to create something it calls “Asynchronous Shaders,” a more efficient way of handling task queues.

In DirectX 11, task scheduling is synchronous: multi-threaded graphics with pre-emption and prioritization.

A GPU’s shaders draw the image, compute the game physics, handle post-processing and more, and they do this by being assigned various tasks. These are handled through a command stream, which is generated by merging individual command queues.

However, the merged queue ends up with gaps, because in multi-threaded graphics the commands are not generated in order. This means the shaders don’t reach their full potential and sit idle, stuck in dead-end jobs with demanding partners and screaming kids.

DirectX 12 enters

In DirectX 12, Asynchronous Shaders provide asynchronous multi-threaded graphics with pre-emption and prioritization. The Asynchronous Compute Engines (ACE) on AMD’s GCN-based GPUs will interleave the tasks, filling the gaps in one queue with tasks from another.

Even with this interleaving, the hardware can still move the main command queue aside to let priority tasks pass when necessary. It probably goes without saying that this leads to a performance gain.

On AMD’s GCN GPUs, each ACE can handle up to eight queues, with each one looking after its own shaders. Basic GPUs have just two ACEs, while more elaborate GPUs carry eight.
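To picture what the ACEs are doing, here is a toy Python sketch of the interleaving idea. It is a conceptual illustration only, not AMD’s hardware scheduler or any DirectX 12 API, and the queue contents are made up.

```python
from collections import deque

def merge_queues(graphics_queue, compute_queue):
    """Toy illustration of asynchronous interleaving: wherever the main
    graphics queue has a gap (None), pull a task from the compute queue
    instead of letting the shaders sit idle."""
    compute = deque(compute_queue)
    stream = []
    for slot in graphics_queue:
        if slot is None:
            if compute:                        # fill the gap with async work
                stream.append(compute.popleft())
        else:
            stream.append(slot)
    stream.extend(compute)                     # any leftover compute tasks
    return stream

# Gaps (None) in the graphics stream get filled with physics/post-processing jobs.
print(merge_queues(["draw0", None, "draw1", None], ["physics0", "post0"]))
# ['draw0', 'physics0', 'draw1', 'post0']
```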

AMD demonstrations show that running Asynchronous Shaders alongside post-processing costs little in frame rate while improving overall performance.

All this increased parallelism ensures that more frames make their way to the screen even faster, which can be especially interesting for purposes such as VR.

Courtesy-Fud

 

Smartwatch Shipments Forecasted To Surge In 2015

April 1, 2015 by mphillips  
Filed under Consumer Electronics

Smartwatch shipments will swell by an impressive 500% this year, fueled by the interest in the coming Apple Watch and its impact on other smartwatches already in the market, according to market research firm IDC.

IDC also lowered its Apple Watch forecast to 15.9 million shipments in 2015, down from 22 million, a 28% reduction. That’s partly because more details have surfaced and the sales date of April 24 was later than IDC had previously expected, according to IDC analyst Ryan Reith in an email.

Even with that reduction, the Apple Watch will account for 62% of the smartwatch market in 2015, Reith said. IDC had expected the 22 million in Apple Watch shipments for this year in a forecast conducted last August, well before the shipment date and other details were announced at Apple’s Spring Forward event on March 9, Reith said.

A February forecast by CCS Insight put potential Apple Watch sales at 20 million in 2015. Apple Watch goes on sale April 24, starting at $349, but some gold-encased models will start at $10,000. Apple will begin taking pre-orders April 10.

IDC grouped the Apple Watch with Motorola’s Moto 360, several Samsung Gear watches and others under a category it calls smart wearables, or devices capable of running third-party apps. For all of 2015, IDC said 25.7 million smart wearables will ship, up 511% from the 4.2 million shipped in 2014.

The IDC smartwatch forecast is much lower than a recent prediction by Gartner, which says 40 million smartwatches will ship in 2015, up from about 5 million in 2014.

 

 

IBM To Invest $3B In ‘Internet of Things’ Services

April 1, 2015 by mphillips  
Filed under Computing

International Business Machines Corp announced that it will invest $3 billion over the next four years in a new ‘Internet of Things’ unit, hoping to offer its expertise in gathering and making sense of the surge in real-time data.

The Armonk, New York-based technology company said its services will be based remotely in the cloud, and offer companies ways to make use of the new and multiplying sources of data such as building sensors, smartphones and home appliances to enhance their own products.

For its first major partnership, IBM said a unit of the Weather Co will move its weather data services onto IBM’s cloud, so that customers can use the data in tandem with IBM’s analytics tools.

As a result, IBM is hoping that companies will be able to combine live weather forecasting with a range of business data, so companies can quickly adapt to customer buying patterns or supply chain issues connected to the weather.

For example, insurance companies could send messages to policyholders in certain areas when hailstorms are approaching and tell them safe places to park, saving money all round.

Or retail stores could compare weather forecasts with past data to predict surges or drop-offs in customer buying due to extreme weather, and to adjust staffing and supply chain logistics accordingly.

IBM said it was already working with some large companies, such as German tire maker Continental AG and jet engine maker Pratt & Whitney to help them use data in their processes.

 

Will Intel Buy Altera?

April 1, 2015 by Michael  
Filed under Computing

Shares in Intel have surged after the news leaked that it is trying to buy fellow chipmaker Altera for a cool $10 billion.

If it goes ahead it will be Intel’s biggest purchase ever and the latest merger in the quickly consolidating semiconductor sector.

For those who came in late: Altera makes programmable chips that are widely used in mobile phone towers, the military and other industrial applications. Altera’s value to Intel is those programmable chips, which are increasingly being used in data centres, where they are customized for specialized functions such as providing web-search results or updating social networks.

It is seen as part of Intel Chief Executive Officer Brian Krzanich’s glorious plan to seek out new markets and new technologies, and to boldly go where no Intel has gone before.

Earlier this month, Intel slashed nearly $1 billion from its first-quarter revenue forecast to $12.8 billion, plus or minus $300 million, as small businesses put off upgrading their personal computers.

Altera is one of the only semiconductor companies with better gross margins than Intel, and it gets about two-thirds of its revenue from telecom, wireless and military/aerospace customers.

The story has yet to be confirmed by anyone other than the business press which probably means it is true.

Intel’s previous biggest deal was the $7.7 billion purchase of security software maker McAfee in 2011.

Courtesy-Fud

Sprint Picks Chicago To Roll Out Advanced LTE Network

April 1, 2015 by mphillips  
Filed under Mobile

Sprint plans to add more than 540 jobs and 115 retail locations in the Chicago area, along with its first LTE Advanced upgrade in the nation. LTE Advanced has the potential to deliver 100 Mbps wireless download speeds.

The impact on the city of Chicago proper will be 300 new jobs by the end of 2015. Sprint will also install hundreds of new cellular sites in the city at an expected cost of $45 million by the end of 2016, Sprint said in a statement.

LTE Advanced has become more common around the globe in the past year and is being used in more than 30 countries, including the U.S., according to the Global Mobile Suppliers Association. In some countries, LTE Advanced offers wireless download data speeds of up to 300 Mbps — potentially 30 times faster than basic LTE, which has download speeds of 10Mbps to 20Mbps.

LTE Advanced uses different technologies to enhance speeds, but the earliest approach uses carrier aggregation. That lets operators treat two or three radio channels in different frequency bands as if they were one to send data at higher speeds.
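As a rough illustration of the arithmetic (not Sprint’s or AT&T’s actual configuration; per-carrier peak rates vary with spectrum, modulation and device category), aggregating carriers simply multiplies the theoretical peak:

```python
# Commonly cited approximation: ~150 Mbps peak per 20MHz LTE carrier (Category 4 device).
PEAK_PER_CARRIER_MBPS = 150

def aggregated_peak_mbps(num_carriers, per_carrier=PEAK_PER_CARRIER_MBPS):
    """Carrier aggregation treats several channels as one, so the
    theoretical peak scales with the number of aggregated carriers."""
    return num_carriers * per_carrier

print(aggregated_peak_mbps(1))  # 150 -> a single carrier
print(aggregated_peak_mbps(2))  # 300 -> two carriers, the headline LTE Advanced figure
```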

AT&T has upgraded its network in several cities to LTE Advanced, including Chicago.

Sprint has the potential to reach 100Mbps in the Chicago upgrade by using channel aggregation, as well as MIMO (Multiple Input, Multiple Output) antennas and 8T8R (eight transmitters, eight receivers on a single radio), according to a spokeswoman. That 100Mbps would be about double the current Sprint LTE peak speeds of 50Mbps to 60Mbps.

New technology from companies such as Alcatel-Lucent, Qualcomm and Intel has made the expansion of LTE Advanced possible.

Sprint said faster speeds will support video and other bandwidth-rich apps such as online games, virtual reality and cloud services.

In Chicago, new cell towers and cellular equipment placed on buildings and other structures will cover areas around the Rush University Medical Center, along Chicago Transit Authority subway routes, and around Garfield Park on Chicago’s West Side.

Chat Tool Slack Admits To Being Hacked

March 31, 2015 by mphillips  
Filed under Around The Net

The popular group chat tool Slack had its central database hacked in February, according to the company, potentially compromising users’ profile information like log-on data, email addresses and phone numbers.

The database also holds any additional information users may have added to their profiles like their Skype IDs.

The passwords were protected using a hashing technique. There was no indication the hackers were able to crack the hashed passwords, Slack Technologies said in a blog post. No financial or payment information was accessed or compromised, it said.
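Slack’s post did not spell out which hashing scheme it uses, but a salted, deliberately slow password hash generally looks something like this sketch (PBKDF2 from Python’s standard library; the parameters here are illustrative, not Slack’s):

```python
import hashlib, hmac, os

def hash_password(password: str, salt: bytes = None, iterations: int = 200_000):
    """Derive a slow, salted hash so stolen hashes are expensive to reverse."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, digest

def verify_password(password: str, salt: bytes, iterations: int, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)   # constant-time comparison

salt, rounds, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, rounds, stored))  # True
print(verify_password("wrong guess", salt, rounds, stored))                   # False
```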

The unauthorized access took place over about four days in February. The company said it has made changes to its infrastructure to prevent future incidents.

Slack was contacting a “very small number” of individual users who had suspicious activity tied to their accounts, or whose messages may have been accessed. Slack did not say how many users it thinks may have been affected in this way. A company spokeswoman declined to comment further.

There’s been strong interest in Slack’s business chat app since it launched last year, and its user base now tops 500,000.

To beef up security, Slack added a two-factor authentication feature on Friday. If it’s enabled, users must enter a verification code in addition to their normal password whenever they sign in to Slack. The company recommends that all users turn it on.

Slack has also released a password kill-switch feature, to let team owners and administrators reset passwords for an entire team at once. Barring that, users can reset their passwords in their profile settings.

 

 

Toshiba And SanDisk To Launch 48-Layer 3D Flash Chip

March 31, 2015 by Michael  
Filed under Computing

Toshiba has announced the world’s first 48-layer Bit Cost Scalable (BiCS) flash memory chip.

The BiCS is a two-bit-per-cell, 128Gb (16GB) device with a 3D-stacked cell structure that improves density and significantly reduces the overall size of the chip.
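Back-of-the-envelope arithmetic from the figures quoted above (ignoring spare area and error-correction overhead):

```python
capacity_gbits = 128   # 128Gb per device, as announced
bits_per_cell = 2      # two-bit-per-cell (MLC)
layers = 48            # 48-layer BiCS stack

capacity_gbytes = capacity_gbits / 8
cells_billions = capacity_gbits / bits_per_cell   # in billions, since Gb = 10^9 bits

print(f"{capacity_gbits} Gb = {capacity_gbytes:.0f} GB per device")           # 16 GB
print(f"~{cells_billions:.0f} billion memory cells across {layers} layers")   # ~64 billion
```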

Toshiba is already using 15nm dies so, despite the layering, the finished product will be competitively thin.

Within 24 hours of Toshiba's announcement, SanDisk made one of its own. The two companies share a fabrication plant and usually make such announcements in close succession.

“We are very pleased to announce our second-generation 3D NAND, which is a 48-layer architecture developed with our partner Toshiba,” said Dr Siva Sivaram, executive vice president of memory technology at SanDisk.

“We used our first generation 3D NAND technology as a learning vehicle, enabling us to develop our commercial second-generation 3D NAND, which we believe will deliver compelling storage solutions for our customers.”

Samsung has been working on its own 3D stacked memory for some time and has released a number of iterations. Production began last May, following a 10-year research cycle.

Moving away from the more traditional design process, the BiCS uses a ‘charge trap’ which stops electrons leaking between layers, improving the reliability of the product.

The chips are aimed primarily at the solid state drive market, as the 48-layer stacking process is said to enhance reliability, write speed and read/write endurance. However, the BiCS is said to be adaptable to a number of other uses.

All storage manufacturers are facing a move to 3D because, unless you want your flash drives very long and flat, real estate on chips is getting more expensive per square inch than a bedsit in Soho.

Micron has been talking in terms of 3D NAND since an interview with The INQUIRER in 2013 and, after signing a deal with Intel, has predicted 10TB in a 2mm chip by the end of this year.

Production of the chips will roll out initially from Fab 5 before moving in early 2016 to Fab 2 at the firm’s Yokkaichi Operations plant.

This is in stark contrast to Intel, which mothballed its Fab 42 chip fabrication plant in Chandler, Arizona before it even opened, as the semiconductors for computers it was due to produce have fallen in demand by such a degree.

The Toshiba and SanDisk BiCS chips are available for sampling from today.

Courtesy-TheInq

 

Canonical Teams Up With Ericsson For Network Function Virtualization

March 31, 2015 by Michael  
Filed under Computing

Canonical and Ericsson have announced their entry into the cloud telecoms market after signing a three-year collaboration to develop Network Functions Virtualization (NFV) products for software-defined communications networks.

The deal will see Ericsson deploying the Ubuntu Server operating system as the host for all its cloud offerings.

John Zannos, VP of cloud alliances and channels at Canonical, told The INQUIRER: “It’s actually a very exciting time to be alive, with the pace of change in the marketplace. As we move toward software-defined solutions more and more, we’re going to see the accelerating pace of change more than ever.”

By working together, the companies hope to drive adoption of NFV products and accelerate research.

The news comes just a day after Oracle and Intel announced a similar deal based on an Oracle hypervisor to control expansion and contraction of communication network nodes at an intelligent level.

As with that announcement, the Canonical-Ericsson arrangement is based on the interoperability provided by OpenStack, meaning that the alignment between the two projects is set to be much closer than one might expect.

“What is most exciting for us is not just the chance to work with Ericsson, which already carries nearly 40 percent of the world’s mobile traffic, but the opportunities that working together brings for us to take these concepts to the next level,” said Zannos.

Ubuntu is used in 80 percent of OpenStack cloud deployments worldwide. Using Ubuntu Server means that the partnership should be able to deliver the newest ideas in open platform NFV.

“Our ability to offer scale-out solutions means that for the first time we can help meet the massive demand on telecoms in the future,” said Zannos.

“I don’t want to speculate on ‘infinite scalability’ because infinite is a pretty big number, but we’re certainly able to create solutions without the restraints of traditional hardware.”

The rollout of open platform NFV acts as a natural next step after the arrival of cloud communication. Virtualizing the workload of global communications, and reducing the natural lag of hardware controllers, allows providers to offer cheaper running costs, lower energy use and greater flexibility to grow and contract the network according to customer need.

Zannos added: “Organizations are struggling to keep pace with data, complexity, cost and compliance demands, so this partnership will help customers overcome many of these challenges.”

The Ericsson name disappeared from the consumer market after Sony acquired the joint Sony-Ericsson venture in 2012, but the Swedish company’s reach remains vast. A venture into virtual telecoms, alongside the biggest single Linux distribution, is bound to disrupt the market.

Ericsson recently became the latest company to join the alliance of Canonical’s Snappy Ubuntu Core for the Internet of Things.

Zannos also confirmed that there will be room for cross-fertilization between the two alliances in the coming months and years, particularly with the opportunities for the silent, seamless firmware upgrades that underpin the technology.

Courtesy-TheInq

Alibaba, BMG Sign Music Distribution Deal

March 31, 2015 by mphillips  
Filed under Around The Net

Germany’s BMG music rights company announced that it has signed a music digital distribution deal with China’s Alibaba Group Holding Ltd, as the world’s largest e-commerce giant firms up its bid to become a digital media empire.

The deal, one of the first in China made by a major music publisher rather than a label, will bring more than 2.5 million copyrights to Alibaba, whose music platforms already had many of the songs from artists including Kylie Minogue, the Rolling Stones and Jean-Michel Jarre, an Alibaba spokeswoman said.

Alibaba has set its eyes on becoming an online-media powerhouse, with music, film and television. The $210-billion firm has touted the potential for selling digital products as well as physical products in China, despite the country’s track record of users not paying for media content.

In the process, it is vying with Tencent Holdings Ltd, China’s biggest social networking and online entertainment firm, and search leader Baidu Inc and its online video unit, iQiyi.

For BMG, the tie-up is both a chance to boost earnings by its artists in China and part of its attempt to “grow the legitimate music market in China”, the company said.

BMG last November linked up with Chinese independent company Giant Jump to manage publishing and recording rights both at home and overseas.

Alibaba’s Digital Entertainment arm will “promote BMG writers and artists through channels such as its streaming apps Xiami and TTPod” and “monitor and take action against digital and mobile services who may infringe the rights of BMG clients,” the subsidiary of Bertelsmann AG, Europe’s largest media company, said in a statement.

“Internet and particularly mobile media are quickly providing an answer to the music industry’s long-time challenge of how to monetize the vast untapped potential of the Chinese market,” BMG Chief Executive Hartwig Masuch said in Monday’s statement.

 

GPS To Become Obsolete?

March 30, 2015 by mphillips  
Filed under Uncategorized

Finding GPS unreliable in certain situations, the U.S. government is focusing on developing a more reliable real-time position tracking technology whose signals won’t disappear in blind spots and can’t be jammed.

The Defense Advanced Research Projects Agency is developing “radically” new technologies to deliver a more advanced position- and navigation-tracking system that is more reliable and accurate than GPS, according to a recently released document on DARPA research projects.

DARPA, which is part of the U.S. Department of Defense, thinks that new real-time positioning technology would give the U.S. military an advantage over rivals. GPS technology has provided a strategic advantage, but it isn’t foolproof: it can be jammed by opponents or be inaccessible in some parts of the world.

“The need to be able to operate effectively in areas where GPS is inaccessible, unreliable or potentially denied by adversaries has created a demand for alternative precision timing and navigation capabilities,” DARPA said in the document.

Beyond the military, GPS has had a significant impact on individuals, business and economies. GPS has helped deliver customized content and services to mobile users, and also helped in the timely delivery of goods. But GPS isn’t flexible, and DARPA wants to make its alternative more flexible and customizable with the help of algorithms.

New types of self-contained instruments are under development that could better track position, time and direction of motion, which are critical aspects of GPS. DARPA is developing high-precision clocks, self-calibrating gyroscopes and accelerometers, and high-precision navigation instruments that can track position for long periods without relying on external sources.
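To see why purely self-contained navigation is hard, and why DARPA wants far more precise, self-calibrating instruments, here is a toy dead-reckoning sketch (an illustration of the general principle, not DARPA's designs): a tiny accelerometer bias is integrated twice, so the position error grows roughly with the square of time.

```python
def dead_reckon(accels, dt):
    """Integrate acceleration twice (velocity, then position)."""
    velocity, position = 0.0, 0.0
    for a in accels:
        velocity += a * dt
        position += velocity * dt
    return position

dt = 1.0                          # one-second steps
true_accel = [0.0] * 600          # device actually standing still for 10 minutes
bias = 0.001                      # 1 mm/s^2 constant accelerometer bias
measured = [a + bias for a in true_accel]

print(f"Apparent drift after 10 minutes: {dead_reckon(measured, dt):.0f} m")  # ~180 m
```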

DARPA is also researching new technologies that could make real-time tracking possible through a number of sources. It is developing sensors that “use signals of opportunity,” such as television, radio, cell towers, satellites and even lightning, for real-time tracking. The effort, called ASPN (All Source Positioning and Navigation), alleviates issues related to fixing locations in buildings, deep foliage, underwater or underground, where GPS access can be limited.

The ultimate goal is to develop a compact navigation system that could be given to soldiers, put on tanks or implemented in guidance systems.

Amazon Unleashes Unlimited Storage For $5 A Month

March 30, 2015 by mphillips  
Filed under Around The Net

Amazon upped the ante: unlimited cloud storage for individuals at $59.99 per year, which works out to about $5 a month.

Amazon’s Unlimited Everything Plan allows users to store an unlimited number of photos, videos, files, documents, movies and music in its Cloud Drive.

The site also announced a separate $12 per year plan for unlimited photos. People who subscribe to Amazon Prime already get unlimited capacity for photos. Both the Unlimited Everything Plan and the Photos Plan have three-month free trial periods.

Online storage and file sharing service providers, such as Google Drive, Dropbox, and iCloud, have been engaged in a pricing war over the past year. Last fall, Dropbox dropped its Pro plan pricing for individuals to $9.99 per month for 1TB of capacity. Dropbox offers 2GB of capacity for free.

Dropbox also offers members 500MB of storage each time they get a friend to sign up; there’s a 16GB max on referrals, though. With Dropbox Pro, members can get 1GB instead of 500MB each time they refer someone.

Google Drive offers 15GB of capacity for free and charges $1.99 per month for 100GB and $9.99 per month for 1TB.

Apple’s iCloud offers 5GB of capacity for free, and charges 99 cents per month for 20GB, $3.99 per month for 200GB and $9.99 per month for 1TB.

Microsoft’s OneDrive offers 15GB of capacity for free, and charges $1.99 per month for 100GB, $3.99 per month for 200GB and $6.99 per month for 1TB.

While Amazon offers unlimited file size uploads for desktop users, it limits file sizes to 2GB for mobile devices.
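For a rough comparison at the 1TB mark, here is the annual cost implied by the prices quoted above (illustrative arithmetic only; free tiers, promotions and the mobile upload limit are ignored):

```python
# Annual cost for roughly 1TB of storage, from the monthly/yearly prices above.
plans = {
    "Amazon Unlimited Everything": 59.99,       # flat yearly price
    "Dropbox Pro (1TB)":           9.99 * 12,
    "Google Drive (1TB)":          9.99 * 12,
    "Apple iCloud (1TB)":          9.99 * 12,
    "Microsoft OneDrive (1TB)":    6.99 * 12,
}
for name, yearly in sorted(plans.items(), key=lambda kv: kv[1]):
    print(f"{name:30s} ${yearly:7.2f} per year")
```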

 

 

Will Intel Challenge nVidia In The GPU Space?

March 30, 2015 by Michael  
Filed under Computing

Intel has released details of its next-generation Xeon Phi processor, and it is starting to look like Intel is gunning for a chunk of Nvidia’s GPU market.

According to a briefing from Avinash Sodani, Knights Landing Chief Architect at Intel, a product update by Hugo Saleh, Marketing Director of Intel’s Technical Computing Group, an interactive technical Q&A and a lab demo of a Knights Landing system running on an Intel reference-design system, Nvidia could be Intel’s target.

Knights Landing is leagues ahead of prior Phi products and more flexible for a wider range of uses. Unlike more specialized processors, Intel describes Knights Landing as taking a “holistic approach” to new breakthrough applications.

Unlike the current-generation Phi design, which operates as a coprocessor, Knights Landing incorporates x86 cores and can directly boot and run standard operating systems and application code without recompilation.

The test system, with a socketed CPU and memory modules, was running a stock Linux distribution. A modified version of the Atom Silvermont x86 core forms the basis of a Knights Landing ‘tile’, the chip’s basic design unit, which consists of dual x86 and vector execution units alongside cache memory and intra-tile mesh communication circuitry.

Each multi-chip package includes a processor with 30 or more tiles and eight high-speed memory chips.

Intel said the on-package memory, totaling 16GB, is made by Micron with custom I/O circuitry and might be a variant of Micron’s announced, but not yet shipping Hybrid Memory Cube.

The high-speed memory is similar to the GDDR5 devices used on GPUs like Nvidia’s Tesla.

It looks like Intel saw that Nvidia was making great leaps into the high performance arena with its GPU and thought “I’ll be having some of that.”

The internals of a GPU and Xeon Phi are different, but share common ideas.

Nvidia has a big head start. It has already announced the price and availability of a Titan X development box designed for researchers exploring GPU applications to deep learning. Intel has not done that yet for Knights Landing systems.

But Phi is also a hybrid that includes dozens of full-fledged 64-bit x86 cores. This could make it better at some parallelizable application categories that use vector calculations.

Courtesy-Fud

Are Free-To-Play Games Still In Their Infancy?

March 30, 2015 by Michael  
Filed under Gaming

During a presentation at the Game Developers Conference earlier this month, Boss Fight Entertainment’s Damion Schubert suggested the industry drop the term “whales,” calling it disrespectful to the heavy spenders who make the free-to-play business model possible. As an alternative, he proposed calling them “patrons,” as their largesse allows the masses to enjoy these works, which otherwise could not be made and maintained.

After his talk, Schubert spoke with GamesIndustry.biz about his own experiences with heavy spending customers. During his stint at BioWare Austin, Schubert was a lead designer on Star Wars: The Old Republic as it transitioned from its original subscription-based business model to a free-to-play format.

“I think the issue with whales is that most developers don’t actually psychologically get into the head of whales,” Schubert said. “And as a result, they don’t actually empathize with those players, because most developers aren’t the kind of person that would shell out $30,000 to get a cool speeder bike or whatnot… I think your average developer feels way more empathy for the free players and the light spenders than the whales because the whales are kind of exotic creatures if you think about them. They’re really unusual.”

Schubert said whales, at least those he saw on The Old Republic, don’t have uniform behavior patterns. They weren’t necessarily heavy raiders, or big into player-vs-player competition. They were just a different class of customer, with the only common attribute being that they apparently liked to spend money. Some free-to-play games have producers whose entire job is to try to understand those customers, Schubert said, setting up special message boards for that sub-community of player, or letting them vote on what content should be added to a game next.

“When you start working with these [customers], there’s a lot of concern that they are people who have gambling problems, or kids who have no idea of the concept of money,” Schubert said.

But from his experience on The Old Republic, Schubert came to understand that most of that heavy-spending population is simply people who are legitimately rich and don’t have a problem with devoting money to something they see as a hobby. Schubert said The Old Republic team was particularly mindful of free-to-play abuse, and had spending limits in place to protect people from credit card fraud or kids racking up unauthorized charges. If someone wanted to be a heavy spender on the game, they had to call up customer service and specifically ask for those limits to be removed.

“If you think about it, they wanted to spend money so much that they were willing to endure what was probably a really annoying customer service call so they could spend money,” Schubert said.

The Old Republic’s transition from a subscription-based model to free-to-play followed a wider shift in the massively multiplayer online genre. Schubert expects many of the traditional PC and console gaming genres like fighting games and first-person shooters to follow suit, one at a time. That said, free-to-play is not the business model of the future. Not the only one, at least.

“I think the only constant in the industry is change,” Schubert said when asked if the current free-to-play model will eventually fall out of favor. “So yeah, it will shift. And it will always shift because people find a more effective billing model. And the thing to keep in mind is that a more effective billing model will come from customers finding something they like better… I think there is always someone waiting in the wings with a new way of how you monetize it. But I do think that anything we’re going to see in the short term, at least, is probably going to start with a great free experience. It’s just so hard to catch fire; there are too many competitive options that are free right now.”

Two upstart business models Schubert is not yet sold on are crowdfunding and alpha-funding. As a consumer, he has reservations about both.

“The Wild West right now is the Kickstarter stuff, which is a whole bunch of companies that are making their best guess about what they can do,” Schubert said. “Many of them are doing it very, very poorly, because it turns out project management in games is something the big boys don’t do very well, much less these guys making their first game and trying to do it on a shoestring budget. I think that’s a place where there’s a lot more caveat emptor going on.”

Schubert’s golden rule for anyone thinking of supporting a Kickstarter is to only pledge an amount of money you would be OK losing forever with nothing to show for it.

“At the end of the day, you’re investing on a hope and a dream, and by definition, a lot of those are just going to fail or stall,” Schubert said. “Game development is by definition R&D. Every single game that gets developed is trying to find a core game loop, trying to find the magic, trying to find the thing that will make it stand out from the 100 other games that are in that same genre. And a lot of them fail. You’ve played 1,000 crappy games. Teams didn’t set out to make crappy games; they just got there and they couldn’t find the ‘there’ there.”

He wasn’t much kinder to the idea of charging people for games still in an early stage of development.

“I’m not a huge fan of Early Access, although ironically, I think the MMO genre invented it,” Schubert said. “But on the MMOs, we needed it because there are things on an MMO that you cannot test without a population. You cannot test a 40-man raid internally. You cannot test large-scale political systems. You cannot test login servers with real problems from different countries, server load and things like that. Early Access actually started in my opinion, with MMOs, with the brightest of hopes and completely and totally clean ideals.”

Schubert has funded a few projects in Early Access, but said he wound up getting unfinished games in return. Considering he works on unfinished games for a living, he doesn’t have much patience for them in his spare time, and has since refrained from supporting games in Early Access.

“I genuinely think there are very few people in either Kickstarter or Early Access that are trying to screw customers,” Schubert said. “I think people in both those spaces are doing it because they love games and want to be part of it, and it’s hard for me to find fault in that at the end of the day.”

Courtesy-GI.biz

Amazon Rumored To Be Pursuing Net-a-Porter

March 30, 2015 by mphillips  
Filed under Around The Net

Amazon.com is holding discussions to acquire online luxury retailer Net-a-porter in what could be the biggest acquisition yet for the e-commerce giant, but the negotiations are in early stages and could fall apart, Forbes reported, citing a person familiar with the matter.

The potential deal, first reported by Women’s Wear Daily, could value Net-a-Porter lower than the valuation of 2 billion euros ($2.16 billion) reported by the fashion industry trade journal, Forbes reported last Thursday, citing the person.

Seattle-based Amazon has long eyed the high-end fashion retail sector and any deal for Net-a-Porter would mean a new commitment in an area where the company lacks a strong presence, Forbes said.

“It’s Day 1 in the category,” Amazon Chief Executive Jeff Bezos told the New York Times in an interview in 2012, saying the company was making a “significant” investment in fashion to convince top brands that it wanted to work with them, not against them.

Media reports in 2014 said Amazon was in talks to buy Indian fashion retailer Jabong.com for $1.2 billion.

Net-a-Porter is owned by luxury goods group Richemont, which bought the London-based company for 392 million euros in 2010.

A spokeswoman for Net-a-Porter said the company does not comment on industry speculation.

Amazon.com and Richemont could not be immediately reached for comment outside regular business hours.