The next generation of search will change, well, everything.

Last week, while the media world wove through canapé servers at the upfronts, and everyone had an opinion about the return of Will and Grace, the Google I/O 2017 event unveiled a completely new way of search. No, really, don’t yawn, because this innovation could (and should) change how the media world thinks and plans. 

The I/O event is a must-watch for the media world. It’s an event that is less about devices and more about utility – and this year the launch of Google Lens was, in my mind, the biggest news. 

Imagine a world in which your phone’s camera replaces the text box on the Google search engine. Well, we’re nearly there. Here’s how Google Lens works: take a picture of a beautiful flower. Launched inside Google Assistant or the Google Photos library, Google’s image recognition cross-references what it sees with Google Search and the company’s other services, and in a text box on the photo, will tell you the species of the flower. Take a picture of a restaurant storefront – Google Lens will give you the most recent reviews. 


Here’s where Google Lens gets especially tricky – it applies machine learning to your habits over time, and the services that you use, and customizes your search results accordingly. It will anticipate your question before you’ve even thought of it. 

GULP. 

It’s clear to me that if this scales, and chances are it will, media planning will have to cope with some fundamental changes. Here are three ways off the top: 

1.   The physical world (and mobility) rise again in importance: 

The implication of Google Lens is that typed search – particularly mobile search – will fade away, probably rapidly. Marketers (and media planners) must reconsider how their products and stores cope with pictorial search. 

·     Physical product and the retail space will become much more important, upping the value of packaging, experiential and POS in the mix; 

·     Out of home in all its forms, once connected, rises in value; 

·     The link between taking a picture and purchase is going to be direct and trackable. 

2.   Seamless and easy is the name of the game: 

Google Lens will make search the ultimate in easy and intuitive. The bar has been set high for the rest of the brand experience. The entire media plan needs to feel as connected and effortless as the initial search effort, otherwise conversion is at risk. My guess is that media plans will need to move from the “phased over time” approach to an “in-the-moment,” fully integrated ecosystem approach (truly, it should be there anyway). The degree of difficulty behind Google Lens – machine learning talking to AI seamlessly over multiple applications – is immense. Our job as media planners is the same – consumers should never see the wizard behind the curtain. 

3.   Machine learning and behavioural targeting will get married, and that will probably change everything in digital media. 

The ultimate implication of Google Lens is that different people can take the same image, and they will be served different results based on machine learning. 

For example, three people take a picture of the same car. One is an auto enthusiast – Google Lens will serve them links to expert reviews. One hasn’t taken a picture of a car before – they’ll get served links to local dealers because they’re probably auto intenders. One will get served local restaurant recommendations because the photo’s location is in another city – they’re probably on a trip.  

One clear implication for planners is that they need to be explicit on the needs state of the consumer they’re targeting. Based on that needs state – sharing, saving a memory, searching to buy – the media plan should pave that behavioural path. 

One other clear implication is that behavioural targeting will get much, much better – but the systemic impacts on the digital media complex are significant.  

We’ve heard this story before: search will never be the same. 

But is it hyperbole?  Why would the launch of Google Lens be any different from the prognostications around Siri, Alexa or Cortana? Well, for starters, it’s non-verbal, making it instantly and intuitively universal. As always, there’s a healthy dose of skepticism when it comes to delivering on image recognition and machine learning, and admittedly, it needs some work. But secondly, and more powerfully, Google Lens is a natural extension of an already-scaled search universe. And if Google does anything superbly, it is infrastructure at scale. 

Is Google Lens ready to roll out right now? Not yet – and that’s probably a good thing because the media industry will need to adjust to some new realities. Whether it’s Google Lens or the next new take on search, the text box’s days are numbered as we move to a new, more intuitive, media universe. 

Sarah Ivey