What It Means To “Google” Something Just Changed Forever


Google doesn’t want to just answer questions anymore. It wants to guess the question before you ask it.

When was the last time we were this excited about an Apple event? It's honestly hard to remember. On stage at its I/O conference this week, Google perfectly distilled how technology has changed in the 20 or so years since the introduction of its ubiquitous search box. We now live in a world where you can point your phone at anything, tap a button, and learn more about it.

It’s Word Lens for everything–a search box for real life.

[Image: Google]

Technically, Google announced this breakthrough as a feature called Google Lens. It will launch inside the Google Assistant on Android and iOS phones, as well as inside your Google Photos library. Using Google's image recognition, which currently bests humans' own faculties, the system cross-references what it sees with Google Search and the company's other services.

[Image: Google]

That sounds painfully boring, yes. It's the amalgamation of machine learning and artificial intelligence, two fields of research that are as difficult to fathom as the stars in the universe. But the effect of Lens is akin to installing Google Search inside your own retinas. Hold your camera up to that building over there, tap Lens, and you'll learn it's the Willis Tower. Point it at a flower or a storefront, and it returns the name of the bloom, or restaurant reviews. In one onstage demo, Google even demonstrated how Lens could look at a Wi-Fi router's SKU and password, then automatically connect an Android phone to that network.

That last magic trick is almost staggering to consider when you try to see its strings. All you have to do is look at something through your phone and tap on it. It's the simplest possible interaction; one can imagine a Luddite succumbing to it out of pure desperation: "Maybe if I just aim my phone and tap, Google will figure it out!" And yet, that's exactly what happens.

On the back end, Google is juggling all sorts of probabilistic computations: scanning the SKU to identify the router model, using image recognition to pick out the login and password, and then, crucially, hopping into your Android phone's settings to take that last step and actually get you connected. Each of these tasks, taken on its own, is routine for Google. But string them together in just the right context for a person, and launch them with a tap? The whole world suddenly becomes Googleable, and the definition of what it means to "Google" something expands dramatically. Google isn't just answering questions anymore. It's inferring the question before we even ask it.
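
To make the middle of that pipeline concrete, here is a minimal sketch in Python of the credential-extraction step: parsing OCR'd router-label text for an SSID and password. The label format, the regular expressions, and the `extract_wifi_credentials` helper are all assumptions for illustration; Google hasn't published how Lens actually does this, and the final connect step would go through a platform API such as Android's WifiManager.

```python
import re

def extract_wifi_credentials(label_text):
    """Pull an SSID and password out of OCR'd router-label text.

    The Network/SSID and Password/Key line format is an assumption;
    real labels vary by manufacturer.
    """
    ssid = re.search(r"(?:SSID|Network)\s*[:=]\s*(\S+)", label_text, re.I)
    pwd = re.search(r"(?:Password|Passphrase|Key)\s*[:=]\s*(\S+)", label_text, re.I)
    if ssid and pwd:
        return ssid.group(1), pwd.group(1)
    return None

# Stand-in for the image-recognition step: in Lens, Google's OCR would
# produce text like this from the live camera frame.
label_text = "Model: AC-1900  SSID: HomeNet-5G  Password: hunter2!"

creds = extract_wifi_credentials(label_text)
if creds:
    ssid, password = creds
    # The last step is platform-specific and omitted here: hand the
    # credentials to the OS Wi-Fi API to join the network.
    print(f"Would connect to '{ssid}' with password '{password}'.")
```

The interesting part isn't any one step; it's that the user sees none of them. Recognition, parsing, and settings changes collapse into a single tap.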

A crucial aspect of Lens is that it isn't an anonymized tool; it's connected to individual users across their devices. That means it can learn who you are over time, customizing its easy button just for you. That evolving, customized profile suggests many possibilities for the service that Google hasn't talked about yet, but it's easy to envision plenty.

Imagine an architecture enthusiast, a bargain shopper, and a real estate agent all taking a photo of the same building. Lens might tell the first about the building's history, show the second the sales happening in its storefront shops, and pull up the city's records on the property for the third. Any interface this minimal, one that relies so heavily on machine learning, must by necessity make liberal assumptions about the user's intent every time they tap that Lens button. Knowing that user very, very well is vital to Lens's future success.
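
As a thought experiment, that disambiguation might look something like the toy sketch below: a per-user interest profile weighting which of several candidate answers gets surfaced for the same building. The profiles, categories, and scoring are invented for this illustration; Google has said nothing about how Lens would actually rank results.

```python
# Hypothetical candidate answers Lens might hold for one recognized
# building. Everything here only illustrates the idea that one image
# plus different user profiles can yield different answers.
CANDIDATE_ANSWERS = {
    "history": "Completed in 1931; a landmark of Art Deco architecture.",
    "shopping": "Two storefront sales running in the building this week.",
    "real_estate": "City records: commercially zoned parcel, last sold 2014.",
}

# Toy interest profiles, one per user, learned from past behavior.
USER_PROFILES = {
    "architecture_fan": {"history": 0.9, "shopping": 0.1, "real_estate": 0.2},
    "bargain_shopper":  {"history": 0.1, "shopping": 0.9, "real_estate": 0.1},
    "realtor":          {"history": 0.3, "shopping": 0.2, "real_estate": 0.9},
}

def best_answer(user: str) -> str:
    """Return the candidate answer this user's profile weights highest."""
    weights = USER_PROFILES[user]
    top_category = max(CANDIDATE_ANSWERS, key=lambda c: weights.get(c, 0.0))
    return CANDIDATE_ANSWERS[top_category]

for user in USER_PROFILES:
    print(f"{user}: {best_answer(user)}")
```

In the real system those weights would presumably come from search history, location, and behavior across devices, which is exactly why the cross-device profile matters so much.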

[Image: Google]

But just as importantly, Lens will create a consistent bridge defining who you are across your devices. Consider how iOS has no clue what apps you've been using on macOS, while any iPhone running Chrome or the Google app has a complete history of everything you've ever looked at online. Such is Google's advantage as the world's greatest infrastructure company, rather than the world's greatest industrial design studio. It can build a service that sucks up every last drop of data about our behavior across all of our devices, and use that granular, expansive profile to build a tool that is perfectly tailored to us.

Of course, Apple, Amazon, and Microsoft are all trying to be this cross-platform interface of the future with their own personal assistants: Siri, Alexa, and Cortana. But speech is only a sliver of the way we communicate about things we don't know. How do we learn when we're toddlers? We point, and we ask, "What's that?" That's why augmented reality is such an important piece of real estate to conquer first, even if it takes the form of a handy search tool on a smartphone rather than a headset or contact lens.

In case you’re still skeptical of the importance of Lens, keep your eyes peeled; over the next year or two, other companies are going to do (or claim to do) exactly what Google is doing with Google Lens.

In fact, Samsung, to some extent, beat Google to the punch with the augmented reality features it recently announced for its latest Galaxy phones, built around an assistant called Bixby. A few months ago, Bixby promised a world where you'd see a shoe you liked on the street, snap a photo, and be pointed to a store where you could buy it. But when Samsung's new phones actually shipped to reviewers, the world learned that Bixby would be delayed. Of course it was. Samsung doesn't learn from 3.5 billion searches every day. It doesn't house countless photos on its servers. Samsung hasn't mapped the world's streets, nor has it plugged itself into satellite imagery of the world. And you could say much the same about most of Google's other competitors. Consider why Amazon released a camera for your home called the Echo Look. What can it do today? It can tell you if you're hot or not and sell you clothes. It's an invasive novelty. Amazon simply lacks the data to do more, and as a result, the Echo Look looks more like a surveillance device than a must-have solution for your life.

No, Google isn't just showing us a concept or a promise with Lens. It's showing us a believable portrait of our present and near future. Google can ship this today, not as a standalone product but as a feature that will be woven into more and more of our lives. So don't be surprised when, in 5 or 10 years, somebody advises you to "Google it" and that old search box never even crosses your mind. By then, Google will have ceased to be the question at all. It will just be the answer.*

* Along with all of the ads that come with it.

This article first appeared on www.fastcodesign.com


About Author

Mark Wilson

Mark Wilson is a writer who started Philanthroper.com, a simple way to give back every day. His work has also appeared at Gizmodo, Kotaku, PopMech, PopSci, Esquire, American Photo and Lucky Peach.
