Create a Robots.txt File
Adds a robots.txt file that disallows indexing of the entire website for all crawlers. #1262
committed by GitHub
parent 5358e28a68
commit db67127df3
public/robots.txt (new file, +2)
@@ -0,0 +1,2 @@
+User-agent: *
+Disallow: /
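The effect of these two lines can be checked with Python's standard-library `urllib.robotparser`, which implements the robots.txt rules crawlers follow. This is an illustrative sketch; the domain `example.com` is a placeholder, since the commit does not name the site:

```python
from urllib import robotparser

# Parse the exact rules added in this commit.
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# "User-agent: *" applies the rule to every crawler, and "Disallow: /"
# matches every path, so no URL may be fetched by any compliant bot.
print(rp.can_fetch("*", "https://example.com/"))                  # False
print(rp.can_fetch("Googlebot", "https://example.com/any/page"))  # False
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it does not enforce access control.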