I am configuring apache2 on Debian and would like to allow only robots.txt to be accessible to search engines, while all other .txt files are restricted. I tried adding the following to .htaccess, but no luck:
<Files robots.txt>
Order Allow,Deny
Allow from All
</Files>
<Files *.txt>
Order Deny,Allow
Deny from All
</Files>
Can anyone help or give me some hints? I am new to Apache, thanks a lot.
Best Answer
Use mod_rewrite
First, make sure the rewrite engine is enabled.
Next, use a negated match (!) to apply a condition to the RewriteRule that excludes any URI ending in "/robots.txt".
Lastly, if the URI ends in ".txt", issue a 403 Forbidden.
EDIT: Don't forget the comparison engine uses regex, so you need to escape special characters (e.g. the dot, ".").
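Putting those steps together, a minimal .htaccess sketch (assuming mod_rewrite is loaded and your AllowOverride setting permits rewrite directives in .htaccess):

RewriteEngine On
# Skip the block when the request is exactly /robots.txt (negated match)
RewriteCond %{REQUEST_URI} !^/robots\.txt$
# Any other URI ending in .txt returns 403 Forbidden
RewriteRule \.txt$ - [F]

Note the escaped dots (\.): both patterns are regular expressions, so a bare "." would match any character.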