forked from GitHub/gf-core
src/www/robots.txt: robot exclusion file for the GF cloud service,
to reduce the server load caused by search-engine bots.
src/www/robots.txt | 4 ++++ (new file)
@@ -0,0 +1,4 @@
+User-agent: *
+Disallow: /grammars
+Disallow: /robust
+Disallow: /*.pgf
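The effect of these rules can be sketched with Python's standard-library robots.txt parser. Note that `urllib.robotparser` does plain prefix matching, so the first two `Disallow` lines exclude everything under `/grammars` and `/robust`; the wildcard in `/*.pgf` is a Googlebot-style extension that this parser does not interpret (the service name and example paths below are assumptions for illustration).

```python
import urllib.robotparser

# The robots.txt content added in this commit.
ROBOTS_TXT = """\
User-agent: *
Disallow: /grammars
Disallow: /robust
Disallow: /*.pgf
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Prefix rules: any path under /grammars or /robust is excluded
# for all user agents ("*"); the site root remains crawlable.
print(rp.can_fetch("*", "/grammars/Foods.pgf"))  # False
print(rp.can_fetch("*", "/robust"))              # False
print(rp.can_fetch("*", "/"))                    # True
```

Crawlers that do support wildcard patterns (e.g. Googlebot) would additionally skip any `.pgf` grammar file regardless of its directory.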