Solve robots.txt issues

  • A blocking robots.txt file in the root folder of a web server stops a crawler such as Googlebot, which analyzes pages, from going any further;
  • For normal public websites a non-blocking robots.txt file lets crawlers index the pages and keeps search results stable (see the example after this list).
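  As an illustrative sketch, the two cases can look like this; the directives below are standard robots.txt syntax, and the exact rules for a given site are an assumption:

      # Blocking: forbids all crawlers from the whole site
      User-agent: *
      Disallow: /

      # Non-blocking: allows all crawlers to visit every page
      User-agent: *
      Disallow:

  The file must be reachable at the top level of the host (for example https://example.com/robots.txt, a placeholder address) to have any effect.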