A robots.txt file isn't essential to your website. You'd only really need one if you wanted to block robots from certain parts of your site. Otherwise the robots just assume that you're happy for them to wander through as they please.
So you can just ignore that warning if you like.
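For reference, if you ever did want one, a robots.txt is just a plain text file sitting at the root of your site. Something like this (the /wp-admin/ path is just an example) tells all robots to stay out of one directory and crawl everything else:

    User-agent: *
    Disallow: /wp-admin/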
But one thing you might want to check is whether any duplicate pages on your site are blocked from robots. If you're using WordPress, it'll usually generate a bunch of archive pages (category, tag, date and author archives) containing your article content. The search engines can treat these as "duplicate content", which can hurt your SEO.
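If you did want to handle that with robots.txt, the rules would look something like this (the exact paths depend on your permalink settings, so treat these as examples rather than a copy-paste fix):

    User-agent: *
    Disallow: /tag/
    Disallow: /author/
    Disallow: /2014/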
You can block these with a robots.txt, but if you're using the Yoast SEO plugin there's an easier way: set those archive types to "noindex, follow" inside the plugin's settings (under the SEO menu down the left of the WordPress admin). With "noindex, follow", the search engines will still crawl the pages and follow the links on them; they just won't add them to the index, so they won't count as duplicate content.
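Under the hood, all that setting does is add a standard robots meta tag to the head of each archive page, something like:

    <meta name="robots" content="noindex, follow">

That's also why it's often a better fix than robots.txt for duplicate content: a robots.txt block only stops crawling, whereas the noindex tag actually keeps the page out of the index.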