While most global users ignore Microsoft’s Bing Search, the service holds a small but real market share in the United States. Google Search remains the dominant engine in every market, but Microsoft is all that stands between Google and a complete search monopoly. That’s part of the reason why Bing still receives new features.
In fact, Microsoft has worked to make Bing Search a viable alternative to Google. In its latest push, the company is improving the accuracy of its web crawler.
Microsoft says the BingBot has been improved with a new algorithm that has been in development for 18 months.
Fabrice Canel, principal program manager for Bing Webmaster Tools, said the result is a bot with an improved understanding of “which sites to crawl, how often and how many pages to fetch from each site.”
BingBot is also receiving other updates, such as making its general web crawling more effective. Microsoft says it wanted to achieve “crawl efficiency” by balancing a light load on site servers against the need to pick up new content promptly.
“Our crawl efficiency north star is to crawl an URL only when the content has been added (URL not crawled before), updated (fresh on-page context or useful outbound links). The more we crawl duplicated, unchanged content, the lower our Crawl Efficiency metric is.”
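The idea behind the metric is that a fetch only adds value when the URL is new or its content has changed; refetching unchanged pages wastes server resources on both sides. A minimal sketch of that decision, where all names and the content-hashing scheme are illustrative assumptions rather than Bing's actual implementation:

```python
import hashlib

def content_fingerprint(body: bytes) -> str:
    """Hash the page body so unchanged content can be detected cheaply."""
    return hashlib.sha256(body).hexdigest()

class CrawlScheduler:
    """Hypothetical tracker for the 'crawl efficiency' idea described above."""

    def __init__(self):
        # Maps URL -> fingerprint of the content seen on the last crawl.
        self.seen: dict[str, str] = {}

    def fetch_was_useful(self, url: str, body: bytes) -> bool:
        """Return True if this fetch added value (new or updated content)."""
        fp = content_fingerprint(body)
        if self.seen.get(url) == fp:
            # Duplicated, unchanged content lowers the efficiency metric.
            return False
        self.seen[url] = fp
        return True

sched = CrawlScheduler()
print(sched.fetch_was_useful("https://example.com/", b"hello"))     # new URL
print(sched.fetch_was_useful("https://example.com/", b"hello"))     # unchanged
print(sched.fetch_was_useful("https://example.com/", b"hello v2"))  # updated
```

A real crawler would decide *before* fetching, using signals such as sitemaps, last-modified headers, and historical change frequency; this sketch only shows the after-the-fact bookkeeping the metric implies.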
Microsoft is presenting this update as an example of how it works with user feedback. However, the truth is a little different, as webmasters and SEO experts have complained that Bing Search’s web crawling is not effective.
Yes, Microsoft has listened to customers, but it was largely forced to make changes to improve the BingBot:
“We’ve heard concerns that bingbot doesn’t crawl frequently enough and their content isn’t fresh within the index; while at the same time we’ve heard that bingbot crawls too often causing constraints on the websites resources.”