Analysis: Why Google killed Glass
Mainstream media are surprised, but it was only a matter of time until Google took its Glass project out to the paddock and put a bullet in its head. That happened today.
Google Glass, RIP.
The concept will live on but Glass, as we know it, is dead.
We’re not surprised.
That said, we’d be very surprised if Google did not continue to develop video camera-based technology that can capture and make sense of its surroundings. One thing is certain today, though: whatever that looks like, it won’t be a camera protruding from a pair of glasses.
Of the three mainstream categories of wearable technology, smart glasses were always the ugly sister to smart wristbands and smartwatches.
The jury is still out on the whole topic of face-worn cameras, with many people uncomfortable that a complete stranger may be not only recording 30fps video of their surroundings, but also feeding that footage to internet-connected services for things like facial recognition, all in real time.
2014 was a bad year for Glass, with wearers being banned from cafes in hip areas of Seattle, attacked on the street and even dubbed “glassholes” for sporting the eyewear.
It’s not only Google that has been struggling against headwinds in this space. In 2013, much was made of companies offering consumers lapel-worn cameras that would capture a continuous photo stream of the wearer’s day. Today, versions that capture streaming video are increasingly being adopted by police forces, but consumers are not fighting in the aisles for products from vendors like Autographer and Narrative.
So why did Google kill Glass, as we know it?
Simple. The company is disciplined about cutting bait when it becomes clear that a project will not attain the hoped-for level of success, and will instead look for something ‘more Googley’.
Glass was a Google Moon shot
Glass was one of Google’s much-vaunted “Moon shot” projects, a concept driven by CEO Larry Page. There’s a great interview in Wired where Page talks about the thinking behind Moon shots – notably that the run-of-the-mill corporate goal of a 10% incremental improvement in a product isn’t nearly good enough. Instead, Google’s Moon shot projects aim for an improvement of 1,000 percent.
As Wired put it: “Thousand-percent improvement requires rethinking problems entirely, exploring the edges of what’s technically possible, and having a lot more fun in the process.”
Does that sound like Google Glass?
Compare it with other Google Moon shots, like the space exploration project and the self-driving car, and it’s clear why Google killed it: Glass didn’t have what it takes, at least in its current form.
What we expect to happen next – probably in 6-9 months – is for parts of Glass to start emerging in a different form: for example, as additional features for Android smartphones, such as video camera software that can detect not only where you are and who you’re with, but also – using data from a rapidly increasing number of sensors connected via the Internet of Things – work out what you’re likely to do next.
Just as Amazon is experimenting with predictive analytics to ship products to its frequent shoppers even before the customer has confirmed the purchase, we think Google is trying to predict the future based on both the past and the present.