Bing, the Microsoft-owned search engine, has unveiled improvements to the web crawler technology that underpins its search results. The development of the “Bingbot” crawler will change the way Bing discovers and indexes information online. The company argues that this new technology will make Bing’s crawler the most efficient on the planet – even outcrawling Google.
The update means Bingbot will use a new algorithm that will:
“… determine which sites to crawl, how often, and how many pages to fetch from each site. The goal is to minimize bingbot crawl footprint on your websites while ensuring that the freshest content is available. How do we do that? The algorithmic process selects URLs to be crawled by prioritizing relevant known URLs that may not be indexed yet, and URLs that have already been indexed that we are checking for updates to ensure that the content is still valid (example not a dead link) and that it has not changed. We also crawl content specifically to [discover] links to new URLs that have yet to be discovered. Sitemaps and RSS/Atom feeds are examples of URLs fetched primarily to discover new links.”
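Bing has not published the algorithm itself, but the description reads like a priority-based URL scheduler with a per-site crawl budget: unindexed URLs are fetched first, indexed URLs are re-checked according to how stale they are, and sitemaps and feeds are fetched to discover new links. The Python sketch below is a minimal, hypothetical illustration of that idea; every name in it (CrawlScheduler, UrlKind, the budget and staleness heuristics) is an assumption made for illustration, not Bing’s actual code.

```python
import heapq
import time
from enum import Enum
from urllib.parse import urlparse

class UrlKind(Enum):
    KNOWN_UNINDEXED = 0   # relevant URLs not yet indexed: crawl these first
    INDEXED_RECHECK = 1   # already indexed: re-fetch to check for dead links/changes
    DISCOVERY_FEED = 2    # sitemaps and RSS/Atom feeds: fetched to find new URLs

class CrawlScheduler:
    """Toy priority-queue scheduler mirroring the behaviour described above."""

    def __init__(self, max_pages_per_site: int = 100):
        self._queue = []           # min-heap of (priority, counter, url, kind)
        self._counter = 0          # tie-breaker so heapq never compares enums
        self._site_budget = {}     # pages fetched per site, to limit crawl footprint
        self.max_pages_per_site = max_pages_per_site

    def add(self, url: str, kind: UrlKind, last_crawled: float | None = None) -> None:
        # Lower priority values pop first; staler indexed pages outrank fresher ones.
        staleness_days = (time.time() - last_crawled) / 86_400 if last_crawled else 0.0
        priority = kind.value - min(staleness_days / 10, 0.9)  # capped so kinds keep their order
        heapq.heappush(self._queue, (priority, self._counter, url, kind))
        self._counter += 1

    def next_batch(self, n: int) -> list[tuple[str, UrlKind]]:
        """Pop up to n URLs, skipping sites that have exhausted their crawl budget."""
        batch = []
        while self._queue and len(batch) < n:
            _, _, url, kind = heapq.heappop(self._queue)
            site = urlparse(url).netloc
            if self._site_budget.get(site, 0) < self.max_pages_per_site:
                self._site_budget[site] = self._site_budget.get(site, 0) + 1
                batch.append((url, kind))
        return batch

scheduler = CrawlScheduler()
scheduler.add("https://example.com/sitemap.xml", UrlKind.DISCOVERY_FEED)
scheduler.add("https://example.com/new-post", UrlKind.KNOWN_UNINDEXED)
scheduler.add("https://example.com/old-post", UrlKind.INDEXED_RECHECK,
              last_crawled=time.time() - 7 * 86_400)
for url, kind in scheduler.next_batch(3):
    print(kind.name, url)
```

In a real crawler, the feed fetches would also enqueue any newly discovered URLs back into the scheduler as unindexed entries, closing the discovery loop the quote describes.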
The company acknowledges that webmasters have long grumbled about Bing’s crawler being slow, and it presents this technological shift as part of the journey toward fixing that problem. The initiative, which Bing calls “crawl efficiency,” should benefit webmasters and search engine users alike by reducing wasted crawling on their sites while keeping Bing’s index fresh.