The robots.txt file tells search engines which of your site's pages they can crawl. An invalid robots.txt configuration can cause two types of problems:
- It can keep search engines from crawling public pages, causing your content to show up less often in search results.
- It can cause search engines to crawl pages you may not want shown in search results.
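For example, a single rule can trigger the first problem by hiding your entire site. The two lines below are a hypothetical misconfiguration, shown only for illustration:

User-agent: *
Disallow: /

Because the wildcard user-agent matches every crawler and Disallow: / covers every path, no page on the site may be crawled, so public pages gradually drop out of search results.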
How the Lighthouse robots.txt audit fails
Lighthouse downloads your site's robots.txt file and fails this audit when the file can't be parsed. To fix it, create a robots.txt file in your site's root directory (public_html on most shared hosts), add the rules below, and swap the Sitemap URL at the end for your own:
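# An empty Disallow line gives the named crawler full access to the site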
User-agent: Googlebot
Disallow:
User-agent: googlebot-image
Disallow:
User-agent: googlebot-mobile
Disallow:
User-agent: MSNBot
Disallow:
User-agent: Slurp
Disallow:
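# Disallow: / blocks the named crawler from the entire site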
User-agent: Teoma
Disallow: /
User-agent: Gigabot
Disallow: /
User-agent: Nutch
Disallow: /
User-agent: ia_archiver
Disallow:
User-agent: baiduspider
Disallow: /
User-agent: naverbot
Disallow: /
User-agent: yeti
Disallow: /
User-agent: yahoo-mmcrawler
Disallow:
User-agent: psbot
Disallow: /
User-agent: yahoo-blogs/v3.9
Disallow:
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
Sitemap: https://www.inkhost.net/sitemap.xml

Replace the Sitemap line with the full URL of your own sitemap file (for example, https://your-domain.com/sitemap.xml).
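Once the file is live, it's worth confirming that it parses and behaves as intended. The sketch below uses Python's standard urllib.robotparser module; the example.com URLs are placeholders, so substitute your own domain before running it.

from urllib import robotparser

# Point the parser at the live robots.txt file (replace with your own domain).
rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # downloads and parses the file; a parse failure here mirrors a failing audit

# Spot-check the rules against the file recommended above.
print(rp.can_fetch("Googlebot", "https://www.example.com/"))           # expected: True
print(rp.can_fetch("SomeOtherBot", "https://www.example.com/cgi-bin/x"))  # expected: False
print(rp.crawl_delay("SomeOtherBot"))                                  # expected: 10

If the file uploaded correctly, the unlisted crawler falls through to the User-agent: * group, so the last two checks confirm the /cgi-bin/ block and the 10-second crawl delay are in effect.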