- A blocking robots.txt file at the top level of a web server stops a crawler such as Googlebot from analyzing pages. As a result, the site's search results are never refreshed; an example follows below;
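For reference, this is what a fully blocking robots.txt at the web root looks like; the wildcard applies it to every crawler, including Googlebot:

```
# robots.txt at the web root: blocks every crawler from every page,
# so the search index is never refreshed
User-agent: *
Disallow: /
```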
- To use Google Search Console, you may need to (re-)verify ownership of the web domain, even if the only goal is to clean up search results;
- One-page clean-up (see the example files after these steps):
– Have the index.html file include in its <head> section: <meta name="robots" content="noindex">;
– Unblock crawling in robots.txt: 'User-agent: *' followed by an empty 'Disallow:' line allows everything; Google can only see the noindex directive on a page it is allowed to crawl;
– Log in to Google Search Console and request indexing of the "noindex" page, so that Google recrawls it and drops it from the search results.
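A minimal sketch of the two files for the single-page case; the page content is a placeholder:

```html
<!-- index.html: the noindex directive in the head tells Google to drop
     this page from its index on the next crawl -->
<!DOCTYPE html>
<html>
<head>
  <meta name="robots" content="noindex">
  <title>Page to remove from search results</title>
</head>
<body></body>
</html>
```

And the matching robots.txt; the empty Disallow allows all crawling, so Googlebot can actually reach the page and read the noindex directive:

```
# robots.txt: an empty Disallow blocks nothing
User-agent: *
Disallow:
```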
- Several-pages clean-up (see the example files after these steps):
– Use a plugin to generate a sitemap; submit the sitemap path in Google Search Console;
– Unblock crawling in robots.txt so the recrawl is not blocked: 'User-agent: *' followed by an empty 'Disallow:' line allows everything;
– Disable auto-generated directory listings in .htaccess with 'Options -Indexes', so stray folder contents do not show up in search results;
– Log in to Google Search Console so that only the pages in the sitemap stay indexed and the other search results are cleaned up.
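A minimal sketch of the supporting files for the multi-page case, assuming example.com and its two URLs are placeholders; only the pages listed in the sitemap should remain indexed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml: the pages that should stay in Google's index -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about.html</loc></url>
</urlset>
```

And the .htaccess directive from the step above:

```
# .htaccess: disable auto-generated directory listings so stray folder
# contents are not crawled and indexed
Options -Indexes
```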
- A rewrite rule in the .htaccess file, or a 301/302 redirect to another web domain, also affects search results: a 301 (permanent) transfers indexing to the new domain, while a 302 (temporary) keeps the old URLs indexed; a sketch follows below.
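A sketch of such a redirect in .htaccess, assuming example.com is a placeholder target and that mod_rewrite is enabled on the server:

```
# .htaccess: permanently redirect every request to the new domain;
# R=301 tells Google to transfer indexing to the target,
# R=302 would signal a temporary move that keeps the old URLs indexed
RewriteEngine On
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```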