What does crawl-delay mean in robots.txt?
I tested my client's robots.txt and found a 'Crawl-delay' directive in it. I searched Google for answers, and from what I read, Google ignores crawl-delay, while some other search engines respect it. My question is: if Google ignores it, why is Google Search Console (GSC) giving me a warning about it? Is it something to worry about? For reference, I've included below roughly what the file contains.
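
The relevant lines look roughly like this (the user-agent and delay value here are illustrative, not my client's actual settings):

    # Example only - asks compliant crawlers to wait 10 seconds between requests
    User-agent: *
    Crawl-delay: 10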