Crawl-delay on msnbot can override bingbot rules in robots.txt

What this page documents

This page documents an observed robots.txt behaviour in Bing crawling based on server log analysis.

Key finding

Setting a Crawl-delay for the user-agent msnbot can override rules specified for bingbot, even when bingbot is matched only by User-agent: * or has its own explicit rule group in robots.txt.

Why this matters

This means Bing crawl behaviour may be influenced by a legacy user-agent group even when the expected effective rules appear to be defined elsewhere in robots.txt.

How it was observed

This behaviour was identified through server log analysis of Bing crawlers. When a Crawl-delay directive was applied to msnbot, the effective Bing crawl frequency decreased despite the absence of a Crawl-delay rule in the bingbot group.

Source

Observation documented by Gerk Mulder, Briljante Geesten.

Example robots.txt configuration

User-agent: *
Allow: /

User-agent: bingbot
Allow: /

User-agent: msnbot
Crawl-delay: 10

In observed cases, Bing respected the crawl delay defined for msnbot, effectively slowing the crawl rate even though bingbot had its own rule set.
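For contrast, here is a minimal sketch of what a spec-conformant robots.txt parser computes for the example above, using Python's standard-library urllib.robotparser: each crawler gets only its own group's rules, so bingbot receives no crawl delay at all. The observed Bing behaviour diverges from this.

```python
from urllib.robotparser import RobotFileParser

# The robots.txt from the example above.
robots_txt = """\
User-agent: *
Allow: /

User-agent: bingbot
Allow: /

User-agent: msnbot
Crawl-delay: 10
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A strict per-group reading assigns the delay only to msnbot:
print(rp.crawl_delay("bingbot"))  # None: no Crawl-delay in the bingbot group
print(rp.crawl_delay("msnbot"))   # 10
```

If Bing grouped the legacy msnbot token together with bingbot, the observed slowdown would follow; a strict per-group parser, as shown, would not produce it.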

Practical implications

For technical SEO and crawl management, robots.txt behaviour should be validated through server log analysis and not only interpreted theoretically from the file structure.
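One way to validate crawl rate from logs is to measure the gap between consecutive requests by a given crawler. The sketch below assumes Apache combined log format and uses hypothetical sample lines; the `median_interval` helper and the `token` matching are illustrative, not a production log parser.

```python
import re
from datetime import datetime

# Hypothetical sample lines in Apache combined log format.
LOG_LINES = [
    '13.66.139.0 - - [10/Jan/2024:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
    '13.66.139.0 - - [10/Jan/2024:10:00:12 +0000] "GET /a HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
    '13.66.139.0 - - [10/Jan/2024:10:00:23 +0000] "GET /b HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
]

TIMESTAMP = re.compile(r'\[([^\]]+)\]')

def median_interval(lines, token="bingbot"):
    """Median seconds between consecutive requests whose UA contains `token`."""
    times = []
    for line in lines:
        if token in line:
            match = TIMESTAMP.search(line)
            times.append(datetime.strptime(match.group(1), "%d/%b/%Y:%H:%M:%S %z"))
    times.sort()
    gaps = sorted((b - a).total_seconds() for a, b in zip(times, times[1:]))
    return gaps[len(gaps) // 2] if gaps else None

print(median_interval(LOG_LINES))  # 12.0 with the sample lines above
```

Comparing this measured interval before and after a robots.txt change is how the msnbot/bingbot interaction described above was surfaced in the first place.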

FAQ

Does Bing still use msnbot?
Bing’s primary crawler today is bingbot, but observed crawl behaviour suggests legacy user-agent handling may still affect robots.txt interpretation.

Is this officially documented by Microsoft?
This page describes an observed behaviour based on server log analysis. It should be treated as a technical observation, not as a formally published Microsoft specification.
