
I am configuring apache2 on Debian and would like to allow only robots.txt to be accessed by search engines, while all other .txt files are restricted. I tried adding the following to .htaccess, but no luck:

<Files robots.txt>
Order Allow,Deny
Allow from All
</Files>

<Files *.txt>
Order Deny,Allow
Deny from All
</Files>

Can anyone help or give me some hints? I am new to Apache, thanks a lot.

1 Answer


Use mod_rewrite

RewriteEngine On
RewriteCond %{REQUEST_URI} !/robots\.txt$ [nocase]
RewriteRule \.txt$ - [forbidden,last]

First, make sure the rewrite engine is enabled.

Next, use a negated match (!) in a RewriteCond to exclude any URI ending in "/robots.txt" from the rule.

Lastly, if the URI ends in ".txt", issue a 403 Forbidden.
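If the server runs Apache 2.4 (Debian 8 and later), the same result can also be achieved without mod_rewrite, using FilesMatch sections with the 2.4-style Require directives rather than the older Order/Allow/Deny syntax from the question. A sketch, assuming the default mod_authz_core is loaded:

```apache
# Deny every .txt file first...
<FilesMatch "\.txt$">
    Require all denied
</FilesMatch>

# ...then re-allow robots.txt. Sections are merged in order of
# appearance, so this later section overrides the one above
# when both match the same file.
<FilesMatch "^robots\.txt$">
    Require all granted
</FilesMatch>
```

Note the ordering: the broad deny must come before the specific allow, since later matching sections override earlier ones — likely why the original attempt, which listed robots.txt first, did not work.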

EDIT: Don't forget that the comparison engine uses regex, so you need to escape special characters (i.e., the .)
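To illustrate why the escaping matters: in a regex an unescaped dot matches any single character, so the pattern would also match names you did not intend. A hypothetical comparison:

```apache
# Unescaped: "." matches any character, so this also excludes
# URIs like "/robots-txt" or "/robotsXtxt" from the rule.
RewriteCond %{REQUEST_URI} !/robots.txt$ [nocase]

# Escaped: "\." matches only a literal dot, so only
# "/robots.txt" itself is excluded.
RewriteCond %{REQUEST_URI} !/robots\.txt$ [nocase]
```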

  • Thanks! It worked!! Fully appreciate your help!!! Commented Aug 19, 2014 at 4:46
