Key Points from Interview with Stefan Weitz
In this interview Stefan outlines the key areas where Bing sees itself diverging from Google. The discussion provides a clear and direct look at the way Bing plans to build its market share over time. There are three major components to Bing's strategy:
- Become a personal assistant for the searcher, one that knows enough about you to highly customize the search experience (see the discussion about Mission Impossible below)
- Move away from a search box and make it totally transparent (for more on this read the discussion of the Xbox Kinect below)
- Focus on partnerships instead of acquisitions, allowing Bing to leverage the creativity and accumulated data of others
Stefan also sees great algorithmic search as “table stakes”, but believes the real value add in the future will come through additional layers built on top of the raw algorithmic piece. These layers will handle personalization, embedding in different platforms, managing partnerships, and so on.
This summary covers the basics; read the full interview transcript below to get the complete discussion.
Full Interview Transcript
Eric Enge: During our last interview we talked about Google being very algorithm-focused, and how Bing was going to take a different path. I suspect that this divergence is beginning to grow. Is that right?
Stefan Weitz: Yes. Let me take you a step back, and cover some new stuff we haven't talked about before. This bifurcation you referred to is happening. There is a shift in how we view the web and what it can do versus the way Google views it, and neither view, by the way, is bad; both are necessary, but there is a difference. They have done great work focusing on index size, index freshness, speed optimizations, and user experience models. They offer a great keyword search experience, and that's good. On our side we obviously have to do an amazing job with that core index, with retrieval of URLs based on two and a half keywords per query.
All that stuff is table stakes, and we've always known that. With a couple of our more recent algorithmic updates, in many cases we actually outperform Google for algorithmic search and in almost all cases we are at least as good or better. But, there is this new thing, this notion that the web itself has changed and continues to change at an accelerating pace.
Search really is predicated on the structure of the web, and as it changes, search needs to change with it. Over time search looks less and less like a search box on a web site. It looks much more like a service, almost like a platform that spans across any input modality from a gesture on your Xbox, to voice on your phone, to touch on a tablet device, to even implicit queries that you don't even know necessarily are happening on your behalf.
Of course, the service has to be personally useful. We want to create something that takes into account who you are, who you know, what you do, all those types of things, and then focuses on delivering experiences that help turn those ideas of things you want into actions. To do this well, we have to do an amazing job of brokering resources from all across the web to enable new levels of functionality.