The True Future of Search
When envisioning the future of search, a number of buzz-worthy topics come to mind: visual search, voice search, wearables, augmented reality, and virtual reality, to name a few. When thinking about how the future of search connects to retail, these same topics tend to bubble up.
What to Wear
The future of search will be a constantly shifting nebula of data, imagery, stimuli, and environmental cues tied closely to each individual consumer and the world around them. The key phrase there is the world around them.
To achieve its future state, Google will have to shed the proverbial shackles that have limited its potential. It is a prisoner of its own devices (no pun intended), currently existing solely within a screen-driven environment. It will need to break the fourth wall and escape its device-based chains, becoming a passive observer rather than something accessed only through a portal.
Once this happens, Google can move beyond its other limiting factor: the need for an input. As of now, Google acts only when it is engaged, be it through a text-based query, a voice-based question, or some other form of input. It needs to be interacted with before it can, in turn, interact.
The shift from a domain to a physical (either wearable or implantable) device is what will allow Google to move past these limitations. It will also require a fundamental shift in how people perceive Google (and search in general) as it evolves from a digital entity into a physically manifested device, one that becomes part of the user's daily life.
Learning from Past Mistakes
This isn't something Google hasn't already attempted. Everyone knows Google Glass was an epic failure, for reasons ranging from fashion to adoption to privacy. However, this type of technology has to be viewed as a stepping-stone rather than a hard stop. Other pioneers across social, research, and adjacent fields are exploring wearables and their viability because they understand the value.
Specific to search, instead of waiting for someone to enter a query, provide input, or give feedback, a wearable device can constantly absorb environmental cues from its immediate (and not-so-immediate) surroundings. Going the wearable route means Google is no longer relegated to a phone in your pocket or a laptop in your office.
This ties in neatly with retail, because both Google's and retail's futures involve an evolution beyond their current physical limitations.
Retail and Search’s Futures Intertwined
Retail has seen a few fundamental shifts specific to online. Browsing was historically conducted in-store or in print. The advent of the internet changed this, bringing first the browsing experience and then the transaction itself online. Once consumers became accustomed to shopping online, the mobile format took hold, and the idea of shopping via a device became mainstream.
But like search, retail was, until a few years ago, trapped waiting for an input before it could provide recommendations (outside of invasive cookie-based ad tech). Because of this, one could say Google and retail are inextricably connected in their need to gain access to the consumer's environment.
Instead of waiting for a consumer to take out their phone to text- or image-search an item, the device could anticipate such needs in-store, at home, or in the world at large. And it is Google's ability to capture product information and house it in one consolidated location that provides the foundation on which this next leap for retail can be built.
A quick and easy example of the puzzle pieces assembled would be a reaction to a simple visual cue, like someone waiting in line and noticing the jacket the person in front of them is wearing. Without a prompt, the device would be able, based on environmental signals alone, to do the following:
- Recognize what the product is (who makes it, color options, sizing comparisons)
- Gauge the viewer’s level of interest (based on length / intensity of glance, heart rate)
- Access their browsing / purchase history (understand stage in funnel, conversion propensity)
- Present visual, audio and text-based representations of the product (enhanced PDP)
- Assist their path to purchase (find retailers, compare prices, shipping options)
- Encourage basket-building (recommendations, offers, perks)
As noted, this torrent of outputs would stem from a single, unsolicited input from the consumer’s field of vision. It would obviously need to be AI-driven and user-specific to ensure it engages at the right moments without inundating the wearer. Executed effectively, this type of interaction embodies anticipatory search: a unique, intelligent device using every cue and stimulus at its disposal to anticipate queries instead of waiting for inputs.
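To make that flow a little more concrete, here is a minimal, purely illustrative sketch of how such an anticipatory pipeline might chain those steps together. Every name in it (VisualCue, recognize_product, the interest thresholds and weightings) is a hypothetical placeholder, not any real Google or retail API.

```python
from dataclasses import dataclass, field
from typing import Optional

# Everything below is a hypothetical sketch of an anticipatory-search flow,
# not a real product integration.

@dataclass
class VisualCue:
    """An unsolicited signal from the wearer's field of vision."""
    image_embedding: list        # output of an assumed on-device vision model
    gaze_seconds: float          # how long the wearer looked at the object
    heart_rate_delta: float      # change in bpm while looking

@dataclass
class ShopperProfile:
    purchase_history: list = field(default_factory=list)
    browsing_history: list = field(default_factory=list)

def recognize_product(cue: VisualCue) -> dict:
    """Stand-in for an image-to-product lookup against a shopping catalog."""
    return {"name": "Field Jacket", "brand": "ExampleBrand",
            "colors": ["olive", "navy"], "sizes": ["S", "M", "L"]}

def interest_score(cue: VisualCue) -> float:
    """Blend glance length and physiological response into a 0-1 score."""
    gaze = min(cue.gaze_seconds / 5.0, 1.0)            # cap at 5 seconds
    arousal = min(max(cue.heart_rate_delta, 0) / 10.0, 1.0)
    return 0.7 * gaze + 0.3 * arousal                  # illustrative weighting

def anticipate(cue: VisualCue, shopper: ShopperProfile) -> Optional[dict]:
    """Run the full flow; return nothing when interest is low, so the wearer
    is not inundated with unsolicited results."""
    if interest_score(cue) < 0.5:
        return None
    product = recognize_product(cue)
    already_owned = product["name"] in shopper.purchase_history
    return {
        "product": product,                            # enhanced PDP-style info
        "funnel_stage": "repeat" if already_owned else "discovery",
        "actions": ["find retailers", "compare prices", "shipping options"],
        "recommendations": ["matching boots", "similar jackets on sale"],
    }

if __name__ == "__main__":
    cue = VisualCue(image_embedding=[0.12, 0.87],
                    gaze_seconds=4.0, heart_rate_delta=6.0)
    shopper = ShopperProfile(purchase_history=["Running Shoes"])
    print(anticipate(cue, shopper))
```

The interesting design choice is the gating step: the device stays silent unless the inferred interest crosses a threshold, which is what separates anticipatory search from simple always-on advertising.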
All of the Google-specific retail advancements detailed previously would aid this process: Shopping Graph, Image Search, Recommendations, Buy on Google, and so on. They would all play a role in this new environment, which requires massive amounts of data tied to AI-led recognition and predictive decision-making.
Again, this is future-focused stuff. It doesn’t exist today in a consumer format, and one can only imagine the privacy concerns associated with such an approach. However, all signs point to this – or some variation – being the next iteration.
Where to Next?
Retailers have been experimenting with similar AR-driven efforts to bring their static, online product catalogs to life in the real world. The applications are endless – from digitally inserting a coffee table into a living room to trying on shades of eye shadow via an app enhanced with facial recognition.
The metaverse could play a key role in all of this as well, potentially helping vault over a few of the technological and adoption-specific gaps. Everything in the metaverse can be tagged and documented. Everyone in the metaverse is fair game. There likely wouldn’t be the same privacy concerns that would exist in the real world with someone wearing an AI-fortified camera capturing their field of vision.
Therefore, the metaverse can in many ways serve as a proving ground or sandbox for this next step in search. The approach can be tested in a less invasive format before the more nuanced, next-generation version is brought to life in the real world.
The options are truly endless. However, making the most of this future iteration requires a lot of things to happen – and a lot of behavior to adjust. This isn’t a scenario where a great product is introduced and immediately changes the world. Consumers must gain something from these advancements in order to fold them into their everyday lives.
The concept of search engines changed the way we look at the world. It changed how we access information, consume media, interact, and beyond. For this next evolution of the search engine to thrive, it will need to change the way we look at the world once more – and it wouldn’t hurt to bring retail along for the ride as well.