It looks like Qualcomm wants to make drones smarter, and the company plans to use its Snapdragon developer board to do so. We had a chance to see proof-of-concept drones that are capable of mapping and understanding their environment.
Hugo Swart, Senior Director and Head of IoE Consumer Electronics at Qualcomm, explained that the general direction of the smart drone market at this time is consumer electronics. Swart confirmed that the first drones powered by Qualcomm's Snapdragon Flight drone platform should be commercially available very soon.
The company sees drones as flying cameras, as most drones sold are used for video or aerial photography. The drone we saw demonstrated at Qualcomm's San Diego campus was powered by a Snapdragon 410c developer board, and it is a light device. The drone weighs just below 250 grams and is made from composite materials. It packs a few cameras, four rotors and a Snapdragon 410-based developer board that makes the drone smart.
The weight is an important detail, as drones under 250 grams do not have to be registered with the aviation authorities in the US. The demo showed a drone that used multiple cameras to map the world around it, making it aware of its surroundings.
The operator flew the drone from a tablet, and the software had some nice features, such as using GPS to mark a position; when necessary, the operator could simply press a button and the drone would find its way back to the marked position.
Since the drone uses multiple cameras to map the world around it, it can find a new path and avoid obstacles in its flight path. The demonstration we saw took place in a controlled environment with a huge rock in the middle, and the drone avoided the rock just as you would expect.
The drone was able to detect a wall, and it would not let you fly into it and damage the aircraft. The drone would simply stop and would not crash, no matter how hard you tried. Another nice feature was that the drone could find its own way to the position marked by GPS. It did not have to retrace the path you had already flown; it could find a shorter route to the marked position too.
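Qualcomm has not published how Snapdragon Flight actually plans routes, but the shorter-return-path behaviour described above can be sketched with a textbook breadth-first search over an occupancy grid. Everything here (the grid, the function name, the obstacle layout) is an illustrative assumption, not Qualcomm's implementation:

```python
from collections import deque

def shortest_return_path(grid, start, home):
    """Breadth-first search over an occupancy grid: 0 = free, 1 = obstacle.
    Returns the shortest list of cells from start back to the marked home
    position, or None if home is unreachable."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cell = queue.popleft()
        if cell == home:
            # Walk the parent links back to reconstruct the path.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# A "rock" (the column of 1s) sits between the drone and home;
# BFS routes around it instead of retracing the original flight.
world = [[0, 0, 1, 0, 0],
         [0, 0, 1, 0, 0],
         [0, 0, 0, 0, 0]]
path = shortest_return_path(world, (0, 0), (0, 4))
```

A real flight stack would work over continuous 3D space with a vision-built map rather than a toy grid, but the "find a shorter way home around an obstacle" idea is the same.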
Adding a Snapdragon SoC to a drone definitely makes flights safer and helps you avoid damaging the drone or things around it. If you fly big drones with big cameras, for example, you really do not want to crash one and potentially destroy hundreds of dollars' worth of equipment.
Swart believes that drones using Snapdragon Flight technology will first find their way into "flying camera" drones, while commercial applications of the platform may follow later. Yes, at some point in the future, drones powered by this technology should be able to deliver packages. That is one of the potential areas.
The only downside of this super-lightweight drone was its small battery, which allows only six to eight minutes of flight. Of course, if you make a larger drone with a larger battery, you can fly for longer, but as we said, this is a proof of concept designed to show the capabilities of these flying cameras. Qualcomm will have customers who make the actual devices; the drone we saw in the demo room was just there to show off the platform's capabilities.
Partners will design their own drones and use the developer board (or an integrated Snapdragon platform in an actual drone). The important part is the software, which combines the flying hardware and the visual compute into one smart flying drone. If you are into drones, this will definitely improve the overall experience.
Nvidia has been talking about its Tesla M10 GPU designed to run on the latest version of the company’s GRID technology.
For those who came in late, GRID technology is supposed to give servers a kick in the graphics back-end. It powers virtual desktops and supports cloud-powered gaming.
Nvidia says the Tesla M10 GPU can support up to 64 desktops per board and 128 per server with two boards. This means shedloads of virtual machines which are potentially dead and alive.
The new graphics card can support Citrix's XenApp and virtual PCs running Windows, or power virtual workstations that need the performance for professional graphics work.
The M10 is a bit like the M6 and M60 as a GPU accelerator – unlike the M10 motorway, which is a disappointingly short road connecting the M1 to the A414 just south of St Albans.
Companies making use of virtual machines or looking to substitute hardware for more efficient virtual systems can access the GRID and Tesla tech for less than $2 per month per user for use with virtual apps and remote desktop sessions, and the firm will provide virtual PCs for less than $6 per month per user.
The company was rumored to have been designing its own chip, based partly on job ads it posted in recent years. But until today it had kept the effort largely under wraps.
It calls the chip a Tensor Processing Unit, or TPU, named after the TensorFlow software it uses for its machine learning programs. In a blog post, Google engineer Norm Jouppi refers to it as an accelerator chip, which means it speeds up a specific task.
At its I/O conference Wednesday, CEO Sundar Pichai said the TPU provides an order of magnitude better performance per watt than existing chips for machine learning tasks. It is not going to replace CPUs and GPUs, but it can speed up machine learning processes without consuming a lot more energy.
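To make the claim concrete: performance per watt is just throughput divided by power draw, and "an order of magnitude" means roughly 10x. The figures below are made-up assumptions for illustration, not Google's published numbers:

```python
def perf_per_watt(ops_per_second, watts):
    """Throughput normalized by power draw: the metric Pichai cited."""
    return ops_per_second / watts

# Hypothetical numbers only: a chip delivering the same throughput at
# one tenth the power shows a 10x ("order of magnitude") advantage.
gpu = perf_per_watt(1e12, 250)  # assumed GPU: 1 TOPS at 250 W
tpu = perf_per_watt(1e12, 25)   # assumed TPU: 1 TOPS at 25 W
advantage = tpu / gpu           # 10x under these assumptions
```

In a data center running machine learning at Google's scale, that ratio translates directly into power and cooling costs, which is why the metric matters more than raw speed.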
As machine learning becomes more widely used in all types of applications, from voice recognition to language translation and data analytics, having a chip that speeds up those workloads is essential to maintaining the pace of advancement.
The TPU is in production use across Google’s cloud, including powering the RankBrain search result sorting system and Google’s voice recognition services. When developers pay to use the Google Voice Recognition Service, they’re using its TPUs.
Urs Hölzle, Google’s senior vice president for technical infrastructure, said during a press conference at I/O that the TPU can augment machine learning processes but that there are still functions that require CPUs and GPUs.
Google started developing the TPU about two years ago, he said.
Right now, Google has thousands of the chips in use. They’re able to fit in the same slots used for hard drives in Google’s data center racks, which means the company can easily deploy more of them if it needs to.
Orcs Must Die! Studio Robot Entertainment is a rare breed nowadays – in an age where you’re either indie or AAA, the Plano, Texas-based company (one of several Texas developers that rose from the ashes of Age of Empires studio Ensemble) has managed to succeed as a mid-sized outfit. When Robot was formed in 2009, the company operated on a small scale, but things really changed when it landed a major investment from Chinese media giant Tencent in 2014. That enabled Robot to scale up and to benefit from Tencent’s knowledge at the same time.
“We made the first Orcs Must Die! as a semi-indie studio. We were about 40-45 people. We’re about twice that size now. And we were able to do Orcs Must Die! and Orcs Must Die! 2 with that. We kind of kept following the franchise and following what the fans were asking for in that game and we knew the next version was going to be bigger. We had to make a strategic decision – were we going to stay small and try to do another small version of that game or did we want to be ambitious and try to do something a little bit bigger? And that was going to necessitate a different type of arrangement for us to find financing. Because, you know, just selling a $15 or $20 game on Steam over and over is tough to support a studio to make a bigger game,” Robot CEO Patrick Hudson told GamesIndustry.biz.
“We also did some licensing deals for this game. As an online game, we didn’t necessarily have an ambition of setting up a European publishing office or an Asian publishing office. So we went to Europe and we partnered up with GameForge and licensed the rights for them to publish the game for us. And that comes with some advances and license fees, which help us make the game. We did the same thing with Tencent in China and that led to an investment. So we are in that mid-space. I think you’re right that there are fewer people in that space right now. It would probably be harder for us to stay in that space if we didn’t have really strong partnerships with folks like GameForge and Tencent.”
Investments and partnerships can clearly make a difference to any game company, but it’s also easy to mismanage a studio’s growth. Before you know it, one department doesn’t know what the other is doing, and things spiral out of control.
“It’s all in how you manage it. You’re either afraid of that growth or you embrace it, put a process and structure in place to allow for that. There’s no question we have to run our studio differently at 90 people than we did at 45. There’s more structure in place, there are more layers of leadership to help the project along. We’ve done a decent job of managing the growth… We went through the same kind of growth curve at Ensemble and we actually spent a lot of time talking about what went well, what didn’t go well, ‘What did we learn from that experience that we could have managed the growth better, how do we apply that to Robot?’ So we try to be a little bit smarter about that. Talking to other friendly studios [helps also] – ‘Hey, what did you guys do through this kind of growth? What pains did you experience? What did you learn?’ So we’ll grow as much as it takes to support Orcs Must Die! or as little to support it,” Hudson continued.
While everyone was devastated when Microsoft seemingly shut down the successful Ensemble Studios for no good reason, Hudson takes it as a learning experience.
In Ensemble’s case, Hudson discovered that scale ultimately held back some of its better talent. “Age of Empires attracted a lot of really good game talent to the studio, either people who were starting fresh in the games industry and learned how to make great games inside of Ensemble or we recruited really talented people to Dallas to work on the Empires franchise and, ultimately, Halo Wars. So we had just a tremendous amount of pent up talent in what was not a huge studio. At its peak it was 120 people. So it was very densely populated with talent. When you’re a studio that size, you have a lead structure within each department, but not everybody gets a chance to take those leadership positions and do their own games. Once Ensemble went away, you saw all these talented people go off in different places and show what they were capable of,” he remarked.
Working at Ensemble instilled a certain level of dedication to quality in all the developers who worked there too. “We held ourselves to a really high standard of making games that everyone took with them to their next places. I would say, in addition to that… all of us worked for another six years for Microsoft post-acquisition, so we got to learn the industry as both indie developers and inside a publisher. We got to learn the entire space, how the whole ecosystem is close to the publishing side. So that was a very valuable experience that maybe a lot of other devs don’t get,” Hudson said.
There’s no animosity or regret about Ensemble either, as far as Hudson is concerned: “Six years is a long time to be with a company post-acquisition. It was actually, for the most part, six good years. Microsoft treated us well. I think we worked well with the people we worked with at Microsoft. You do see some [studios] that get acquired and they’re gone within a year or two. We didn’t have that experience. I kind of view six years as a nice success.”
Perhaps the greatest lesson that Hudson and Robot have learned, even before the rise of Kickstarter and Steam Early Access, is that listening and responding to a vibrant community is critical. Discoverability has become a nuisance to deal with, and you need the fans behind you in order to succeed. If you have expectations that a platform holder will feature you, your marketing strategy needs an overhaul.
“As some of those previous PC developers that came into mobile are now migrating back to PC, discoverability on PC has become not quite as bad as mobile, but it’s not easy. There’s a lot of content on Steam now. There’s no easy space. Games is more competitive and a harder business than it’s probably ever been. There’s just a lot of great developers out there making a lot of great content and there’s just no barriers to putting your content out there to players, and players move quickly from game to game. They’re going to seek the best content,” Hudson noted.
He continued, “When I talk to the Valve or Apple or Google folks, they know the problem. They see it. But it’s an almost impossible problem to solve… Everyone wants to be featured, right? It’s funny, when you talk to a new mobile developer and be like, ‘Hey, we’re gonna make this great game. We’re gonna be featured.’ Probably not. You’re probably not going to be featured. Unless you’re doing something really cool and innovative and very different that really shows off the platform.
“They all have different programs to try and help you get noticed but you can’t make that the core of your strategy. It’s really up to you to make a great game. If you don’t have a marketing budget to cultivate a community, start with a small community, really cultivate it and listen to them and speak to them and let them organically grow. It’s not the platform holder’s job to make it successful.”
Beyond building a robust community, selecting the right business model for your game is crucial. While free-to-play is almost the default option in today’s market, Hudson said that premium games are coming back too.
“We really do think of it as a case-by-case. There are interesting trends in the market where you’re seeing paid games come back in certain areas – even in China where we’re seeing an uptick in paid games, customers in China buying paid games. [That's] never happened before. So it’s really going to depend on the game, the needs of the game,” he commented.
For Orcs Must Die! Unchained, which just entered an open beta about a month ago, free-to-play just made sense for Robot, as it’s a big multiplayer MOBA-style tower defense game; Robot wants as many people online for matchmaking as possible. Hudson and Robot have tried free-to-play before with Hero Academy in 2012, but he fully admitted, “We made a ton of mistakes, we didn’t really know what we were doing. It was a very successful game critically. It probably should’ve been a little more successful for us commercially, but we learned those lessons and hopefully we’re applying some of those.
“[Unchained] will be our first big free-to-play PC title. And we get a lot out of our partners too. GameForge has been operating free-to-play titles forever. Tencent has been operating free-to-play titles forever and we really lean on their expertise and we ask them to be involved with us as we design the game. The nice thing about both of those partners is… monetization follows. They start with making a great game, get the players around, keep the players around, [and then] hopefully they’ll pay you down the road. But don’t solve for money up front. So we’ll see. This will be our first foray into it. We’ll make a few more mistakes I’m sure but hopefully we learn quickly.”
Right now Robot remains 100 percent committed to Orcs Must Die! and the studio is bringing the game to PS4 later this year, but that doesn’t mean it expects to be pigeonholed with that one franchise. Hudson said that Robot continues to brainstorm new IP ideas, but nothing has made it too far along in development to warrant a release. “We’ll definitely do a new IP again. We started a couple of prototypes in the past few years that haven’t panned out. It happens all the time, right?” he said, adding that the company also remains interested in mobile but is “very cautious.”
“I think what’s interesting about mobile over the last couple of years is how non-dynamic the market is as far as the top games. The games that have lived in the top charts have been there now for 2 or 3 years. They get there and they stay there and they’re really good at staying there and it’s hard to break in and become the new thing. There are some good case studies for that. Certainly not nearly as many as there are on PC,” he said.
Hudson on VR
Likewise, virtual reality, although enticing, is just too risky for a studio like Robot, Hudson noted.
“It comes back to a company our size and where we sit. For us to overinvest in a market where it’s hard to know what the growth curve is going to be would be pretty risky at our size. We can’t afford to be wrong on something this new and this different… We love the options it provides for new and compelling experiences in games. We’ve brainstormed plenty of ideas for Orcs Must Die! in VR and we’ve got some pretty good ones, but it’ll be a while before we seriously invest in it,” he said.
Hudson joked that Robot is “living vicariously” through a couple of ex-Ensemble studios in Dallas that are working on VR now.
A conservative and cautious approach is probably one of the reasons Robot has managed to survive in an increasingly challenging environment. Even for eSports – an area of the industry that Orcs Must Die! clearly could excel in – Hudson isn’t jumping in headfirst.
That being said, Hudson is definitely optimistic about eSports as a sector. “I think it’s going to become an increasingly large aspect of the industry. And there will be the games that work and the games that don’t work for it. There will be a lot of companies chasing it and probably crash on the rocks trying to get there, but it’s going to continue to grow. I think you’ll see it across platforms too. I think you’ll continue to see eSports be popular in mobile. It’ll continue to grow there. You think of it as a PC thing now but it’s not. I think it’s going to encompass all aspects of games,” he said.
The announcement was posted on a dark market website called TheRealDeal by a user who wants 5 bitcoins, or around $2,200, for the data set that supposedly contains user IDs, email addresses and SHA1 password hashes for 167,370,940 users.
According to the sale ad, the dump does not cover LinkedIn’s complete database. Indeed, LinkedIn claims on its website to have more than 433 million registered members.
Troy Hunt, the creator of Have I been pwned?, a website that lets users check if they were affected by known data breaches, said it is highly likely that the leak is legitimate. He had access to around 1 million records from the data set.
“I’ve seen a subset of the data and verified that it’s legit,” Hunt said.
LinkedIn suffered a data breach back in 2012, which resulted in 6.5 million user records and password hashes being posted online. It’s highly possible that the 2012 breach was actually larger than previously thought and that the rest of the stolen data is surfacing now.
LinkedIn did not immediately respond to a request for comment.
Attempts to contact the seller failed, but the administrators of LeakedSource, a data leak indexing website, claim to also have a copy of the data set and they believe that the records originate from the 2012 LinkedIn breach.
When the 6.5 million LinkedIn password hashes were leaked in 2012, hackers managed to crack over 60 percent of them. The same thing is likely true for the new 117 million hashes, so they cannot be considered safe.
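Why unsalted SHA1 hashes crack so easily can be shown in a few lines: because every user with the same password gets the same hash, an attacker can precompute hashes for common passwords once and then crack records with a simple lookup. This is a generic illustration using Python's standard library, not the actual leaked data:

```python
import hashlib

def sha1_hex(password):
    """Unsalted SHA-1, as reportedly used for the leaked LinkedIn hashes."""
    return hashlib.sha1(password.encode("utf-8")).hexdigest()

# An attacker hashes a wordlist of common passwords once up front...
wordlist = ["123456", "password", "linkedin", "letmein"]
lookup = {sha1_hex(word): word for word in wordlist}

# ...after which cracking any leaked hash is a dictionary lookup,
# not a brute-force search.
leaked_hash = sha1_hex("linkedin")  # stand-in for a real leaked record
cracked = lookup.get(leaked_hash)
```

Per-user salts defeat exactly this attack, since the same password then produces a different hash for every account, which is why unsalted SHA1 is considered negligent for password storage.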
Worse still, it is very likely that many LinkedIn users affected by this leak have not changed their passwords since 2012. Hunt was able to verify that for at least one HIBP subscriber whose email address and password hash were in the new data set that is now up for sale.
Many people affected by this breach are also likely to have reused their passwords in multiple places on the Web, Hunt said via email.
Moving forward with his attempt to attract Indian customers and developers, Apple’s CEO Tim Cook announced that the company was setting up a new development center for its Maps product in Hyderabad in south India.
Apple earlier on Wednesday announced it would set up a facility in Bangalore by early next year to help developers follow best practices and improve the design, quality and performance of their apps on the iOS platform.
Cook is on his first visit to India, where the company saw a 56 percent year-on-year growth in iPhone sales in the first quarter even as its global iPhone sales and overall revenue dropped.
Apple’s new center will focus on the development of Maps for Apple products such as the iPhone, iPad, Mac and Apple Watch. The investment will accelerate Maps development and create up to 4,000 jobs, the company said.
The Cupertino, California, company did not disclose the size of its investment in the center though some reports have placed the figure at $25 million.
A large number of U.S. companies, including Texas Instruments, Oracle, Microsoft and IBM, have set up software, chip design and product development centers in India, to tap the country’s large pool of engineers.
“The talent here in the local area is incredible and we are looking forward to expanding our relationships and introducing more universities and partners to our platforms as we scale our operations,” Cook said in a statement.
India is the third-largest smartphone market in the world, after China and the U.S., according to Gartner research director Anshul Gupta.
Alphabet’s Google Inc introduced us to its answer to Amazon’s Alexa virtual assistant along with new messaging and virtual reality products at its annual I/O developer conference on Wednesday, doubling down on artificial intelligence and machine learning as the keys to its future.
Google Chief Executive Sundar Pichai introduced Google Assistant, a virtual personal assistant, along with the tabletop speaker appliance Google Home.
He also unveiled Allo, a new messaging service that will compete with Facebook’s WhatsApp and Messenger products and feature a chatbot powered by the Google Assistant. Allo, like WhatsApp, will also have end-to-end encryption when it is rolled out this summer.
Amazon’s Echo, a surprise hit that has other tech giants racing to match it, uses a virtual assistant called Alexa, a cloud-based system that controls the Echo speaker and responds to voice-controlled commands by users.
Like Alexa, Google Assistant can search the internet and adjust your schedule. However, Pichai said Google Assistant can use images and other information to provide more intuitive results.
“You can be in front of this structure in Chicago and ask Google who designed this and it will understand in this context that the name of that designer is Anish Kapoor,” said Pichai, pointing toward a photo of Chicago’s Cloud Gate sculpture.
For Google Home, the Google Assistant merges with Chromecast and smart home devices to control televisions, thermostats and other products. Google did not offer a specific release date or pricing for Google Home, saying only that it will be available later this year.
Intel has scored a more significant chunk of the upcoming iPhone 7 which is due to be released this year.
Digitimes deep throats claim that Intel will supply half the modem chips for use in the new iPhones slated for launch in September 2016.
Intel will itself package the modem chips for the upcoming new iPhones, but has contracted Taiwan Semiconductor Manufacturing Company (TSMC) and tester King Yuan Electronics (KYEC) to manufacture the chips, the sources said.
Qualcomm is currently the supplier of LTE modem chips for the iPhone, but Apple has been keen to avoid depending on a single supplier. Still, the figure of half the iPhone 7's modem orders is much more than many expected. It is a pity for Intel that the iPhone 7 is not expected to be a big seller – mostly because there is little new under the bonnet and it looks the same as the iPhone 6S.
The ever-shrinking Biggish Blue is working on a cheaper alternative to DRAM by making it denser.
Dubbed phase-change memory (PCM), the technology could give enterprises and consumers faster access to data at lower cost. IBM says it has achieved a density of three bits per cell, which is 50 percent more than the two-bit form of PCM the company showed off in 2011. The denser the memory, the more capacity can be squeezed out of the pricey tech.
PCM works by changing a glass-like substance from an amorphous to a crystalline form using an electrical charge. Like NAND flash, it keeps storing data when a device is turned off. PCM responds to data requests faster than flash: In less than one microsecond, compared with 70 microseconds.
It also lasts longer than flash, to at least 10 million write cycles versus about 3,000 cycles for an average flash USB stick.
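Taking the article's figures at face value, a quick back-of-the-envelope comparison shows what those numbers buy. The "less than one microsecond" claim is rounded up to a flat 1 µs here, so these are conservative assumptions rather than measured results:

```python
# Figures quoted in the article (illustrative, rounded).
pcm_latency_us = 1           # "less than one microsecond", taken as 1 us
flash_latency_us = 70        # NAND flash response time, microseconds
pcm_write_cycles = 10_000_000  # PCM endurance, write cycles
flash_write_cycles = 3_000     # average flash USB stick endurance

latency_advantage = flash_latency_us / pcm_latency_us    # at least 70x faster
endurance_advantage = pcm_write_cycles / flash_write_cycles  # ~3,333x more writes
```

Those ratios are why PCM is pitched as a tier between DRAM (faster, but volatile and expensive) and flash (cheaper, but slower and quicker to wear out).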
Three-bit PCM could find its niche as a faster tier of storage within arrays, including all-flash arrays, so the most-used data gets to applications faster. It could also take the place of a lot of the DRAM in systems, cutting the cost of technologies like in-memory databases.
IBM said that a customer who stores their OS on three-bit PCM could have their phone up and running in a few seconds.
Three-bit PCM needs the backing of a chip maker. IBM wants it for its Power architecture, but that will make it less popular.
Biggish Blue isn’t predicting when three-bit PCM will be in mass-market systems, partly because the company doesn’t make memory and will have to find a partner. It might take two to three years for large-scale availability, the company said.
The move will open up new opportunities for designers of autonomous vehicles and security systems, among other connected things, according to ARM CEO Simon Segars. Computer vision is in its early stages, and Apical is at the forefront of embedding such technology, he said.
Apical’s technology is already used in 1.5 billion smartphones, according to ARM, although many of those phones may be using nothing more sophisticated than a display brightness control Apical calls Assertive Display. That technology also turned up in Samsung Electronics’ new laptop, the ATIV Book 9.
Assertive Camera is another of Apical’s developments: It’s a range of software packages and silicon-based image signal processors for reducing image noise, managing color and shooting high dynamic range images.
ARM makes its money by designing chips that others manufacture, or licensing its chip modules for others to incorporate in their own designs.
In that context, Apical’s Spirit silicon building blocks are perhaps where ARM sees the most opportunity for growth. The Spirit silicon blocks process raw sensor data or video into a machine-readable representation of an image in an energy-efficient way, so ARM and its partners can use them to add computer vision capabilities to future low-power devices.
Putting image analysis and interpretation capabilities in hardware could accelerate and simplify the design of a whole host of products, including self-driving cars and security systems.
ARM paid US$350 million for Apical, closing the deal Tuesday, it said.
AMD’s Polaris strategy is becoming a bit clearer, and even if we thought that the fabless chipmaker might have dropped the ball a bit, its cunning plan is starting to make sense.
Last week we saw Nvidia showing off its next-generation flagship GPUs, the GTX 1080 and the GTX 1070. The Green Goblin told us shedloads of things which, if true, would clean AMD’s clock in terms of performance.
It threw into question AMD’s decision to focus on the mainstream desktop and notebook markets with its upcoming GCN (Graphics Core Next) 4.0 GPUs, codenamed Polaris 10 and 11.
Normally GPU manufacturers release the flagship or ‘high-end’ products first to get all the attention and then release the mid-range chips for the great unwashed a lot later once they have sorted out yields.
But AMD’s cunning plan suggests that it is going to do the opposite. It is risky, but it could mean that the outfit could make more money quickly. This is because mainstream GPUs account for the majority of GPU sales.
Sure, while high-end, flagship-level graphics cards carry the largest profit margins, mainstream and performance-segment GPUs account for the vast majority of total graphics card sales. But that alone is not going to sort out AMD’s market share and profit woes.
AMD’s discrete GPU sales increased by 6.69 per cent in Q4 of 2015, which coincides with its release of the performance-segment R9 380X graphics card. Meanwhile Nvidia’s desktop discrete GPU shipments were down by 7.56 per cent from when it released its mainstream GTX 950.
Sure, this is small potatoes, but it means that AMD could take roughly seven per cent of Nvidia’s sales in a single quarter by releasing a graphics card in a price segment where Nvidia had nothing.
Now Nvidia is going to focus on the high-end first and will not release anything for the performance and mid-range segments for ages. But AMD will have its Polaris parts there and ready. In fact, it will be about six months ahead of Nvidia, which is more than enough time to drain a bit of the Green Goblin’s market share.
Then when AMD releases its flagship graphics card based on the HBM2 powered Vega 10 GPU, possibly as early as October 2016, it will arrive with a spec which is better than the GTX 1080 and is meant to go toe-to-toe with a possible GTX 1080 Ti or Titan X successor.
The plan requires nerves of steel, particularly as AMD’s bottom line is absolute pants at the moment, but it does make sense. However, it is not good news for consumers. AMD is deliberately avoiding competition with its plan, which means it can afford to charge a bit more until Nvidia pulls its finger out. Good for AMD, but it means prices will be higher because AMD does not have to undercut Nvidia.
Nokia has demonstrated the feasibility of 10Gbps symmetrical data speeds over traditional hybrid fibre-coaxial (HFC) cable networks, such as those operated by Virgin Media in the UK.
Trumping BT’s 5Gbps XG.fast trials, Nokia’s prototype technology, called XG-Cable, is still at the proof-of-concept stage, but should easily integrate into the DOCSIS 3.1 suite of specifications focused on providing cable operators with technology innovations to transform the industry.
DOCSIS is the set of standards governing data access over cable TV networks, and DOCSIS 3.1 was designed to enable capacities of 10Gbps downstream, but only 1Gbps upstream. Nokia has taken this a step further by demonstrating that symmetrical speeds of 10Gbps are possible.
The technology is still at an early stage of development and no in-service date has been even floated by Nokia, but the test by Nokia Bell Labs has apparently demonstrated that the technology is viable using existing HFC cable networks, where fibre-optic cable is used to connect to cabinets on the street and coaxial copper cable lines are used for last-leg distribution to the customer premises.
XG-Cable means that cable operators will at some point in the future be able to use existing HFC cables in the last 200 meters to provide upstream speeds never before achievable owing to the limited spectrum available, according to Nokia.
This will enable the provision of ultra-fast broadband services to consumer locations that were not physically or economically viable unless fiber was brought all the way to the premises.
“The XG-Cable proof-of-concept is a great example of our ongoing effort and commitment to provide the cable industry with the latest innovations and technology needed to effectively address the growing demand for gigabit services,” said Federico Guillén, president of fixed networks at Nokia.
“The proof-of-concept demonstrates that providing 10Gbps symmetrical services over HFC networks is a real possibility for operators. It is an important achievement that will define the future capabilities and ultra-broadband services cable providers are able to deliver.”
Corvex Management LP disclosed that it owns 9.9 percent of Pandora Media Inc and urged the internet music streaming company to consider being sold instead of pursuing a “costly and uncertain business plan.”
Corvex, a hedge fund run by Keith Meister, a protégé of billionaire activist investor Carl Icahn, said it had met with the company’s management and had withdrawn a plan to replace some of its board members. However, it now believes Pandora should hire an investment bank to help the company explore its strategic options including a sale.
“We believe there is likely to be significant strategic interest in the company at a substantial premium to the company’s recent stock price,” Corvex said, adding that large internet companies, handset makers and media companies could be potential buyers.
Pandora’s shares are down more than 25 percent in 2016 and more than 45 percent year-over-year. Corvex owns about 22.7 million shares in the company, making the hedge fund Pandora’s largest shareholder.
Pandora said in response that it is in constant dialogue with shareholders and committed to achieving long-term value for them.
“Pandora has a profitable core business, combined with a strong balance sheet. We are confidently investing to fully capture the massive opportunity ahead of us,” the company said in a statement.
Oakland, California-based Pandora has faced tough competition from music-streaming rivals such as Spotify, Apple Inc , Alphabet Inc’s Google and Amazon.com and has failed to turn an annual profit as a public company.
Analysts have said Pandora, which had a market capitalization of $2.29 billion on Monday, could be an acquisition target for larger media or internet companies looking to beef up their online music offerings.
Pandora co-founder Tim Westergren, a former musician who spearheaded Pandora’s music algorithm technology, returned to the company March 28 to become CEO, quashing some investors’ hopes that the company could be sold.
Westergren told Reuters on April 15, “If you want to sell a company, you don’t do that by spending half a billion on acquisitions and hiring a new CEO.”
Twitter Inc users will soon have more flexibility in posting tweets, as the company plans to stop counting photos and links against its 140-character limit, according to a Bloomberg report.
The social media platform has faced stagnant user growth. Months earlier, Twitter Chief Executive Jack Dorsey said the company would simplify its product in an effort to attract new users.
“We think there’s a lot of opportunity in our product to fix some broken windows that we know are inhibiting growth,” Dorsey said during a February earnings call.
Links currently take up 23 characters of a tweet, limiting the amount of commentary that users can offer when sharing articles or other content.
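The character accounting described above can be sketched roughly as follows. This is an illustration only, not Twitter’s actual implementation: it assumes every URL is wrapped by the t.co shortener and counted as a fixed 23 characters regardless of its real length, and the function names and the simplified URL regex are our own.

```python
import re

TCO_LENGTH = 23   # characters a wrapped t.co link counts for (assumed fixed)
LIMIT = 140       # the classic tweet limit

# Simplified URL matcher for illustration; Twitter's real extractor is
# far more elaborate.
URL_PATTERN = re.compile(r"https?://\S+")

def effective_length(tweet: str) -> int:
    """Length of a tweet after each URL is counted as a 23-char t.co link."""
    stripped = URL_PATTERN.sub("", tweet)
    n_links = len(URL_PATTERN.findall(tweet))
    return len(stripped) + n_links * TCO_LENGTH

def remaining(tweet: str) -> int:
    """Characters still available under the 140-character limit."""
    return LIMIT - effective_length(tweet)
```

Under this model, a very long URL costs no more than a short one, which is why dropping links from the count frees the same 23 characters for every shared article.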
Twitter’s shares have fallen more than 70 percent over the past year.
Twitter declined to comment on the report.
On Thursday, sources within Microsoft’s upstream supply chain reported that the second-generation refresh of the company’s Surface Book is expected to be delayed. The sources cited “design issues” for the launch setback, indicating that the company could be redesigning some critical areas of the final consumer product before launch.
The sources report that the device will launch sometime after 2016, but do not specify whether the design issues are hardware or software related. They also confirm that the second-generation Surface Book will be upgraded from a 3000x2000p display to a 4K UltraHD (3840x2160p) display, perhaps in an effort to adopt a more industry-standard resolution that scales well across connected displays.
The second-gen Surface Book, or “Surface Book 2,” will also feature at least one Thunderbolt 3 port based on Intel’s Alpine Ridge controller. This will provide up to 40Gbps of bi-directional bandwidth and the ability to daisy-chain up to six devices simultaneously – including up to dual 4K displays at 60Hz or a single 5K display (5120x2880p) at 60Hz.
Microsoft’s original Surface Book design
The current Surface Book’s design was influenced by the variety of 2-in-1 convertible tablets that have hit mainstream retail shops since they emerged as an industry trend at the 2012 Consumer Electronics Show in Las Vegas. Microsoft developed a special hinge on the keyboard that would maintain the device’s weight-to-balance ratio, a move that allows the device to be used similarly to a clipboard and as a traditional notebook.
The Surface Book and Surface Pro series are both constructed from a magnesium metallic “glass” that is melted in an oxygen-free environment and rapidly cooled to prevent crystallization. Magnesium is, of course, highly flammable once ignited, but some claim the devices would need to be heated to between 500 and 600C to see any real effects, temperatures far outside the rated device operating specs.
Perhaps Microsoft’s reported design issues with the second-generation Surface Book have more to do with cosmetics, hinges and weight ratios than with the construction material, but this is only an educated guess.
Current Surface Book Specifications
The current Surface Book, released in October 2015, measures 12.3 x 9.14 x 0.9 inches (312.4 x 232.2 x 22.9mm) and weighs 3.34 pounds (1.51kg) as a laptop, or just 0.3 inches (7.62mm) thick and 1.6 pounds (0.76kg) as a detachable tablet.
The device features a 13.5-inch 3000x2000p display (267ppi) and includes either a 2.4GHz Core i5 6300U (Skylake) or 2.6GHz Core i7 6600U (Skylake) CPU, 8 or 16GB of LPDDR3 RAM, an optional GeForce 940M 1GB GPU, 128GB to 1TB of SSD storage, dual-band 802.11n/ac Wi-Fi, Bluetooth 4.0, two USB 3.0 ports, a Mini DisplayPort, an SDXC card reader, an 8-megapixel rear 1080p camera, a 5-megapixel front camera, dual microphones and a 3.5mm headphone jack, and it is compatible with a variety of stylus pens.
Surface Book vs. Surface Pro sales still unknown
In January, Microsoft reported that it sold 2.5 million Surface-series devices in Q4 2015 (October through December), worth $888 million. However, it is unclear how many of those sales were Surface Book units versus Surface Pro 3 and 4 units. In total, the company sold 6 million Surface-series devices in 2015, compared with an earlier estimate of 4 million for the year from sources in the upstream supply chain.