
I have created a 'global' robots.txt file on my Ubuntu server and placed it at

/home/robots.txt

Then, at the bottom of /etc/apache2/apache.conf, I added:

Alias /robots.txt /home/robots.txt

and restarted Apache.

However, when I try to access myvirtualhost.com/robots.txt I get 403 Forbidden.

/home/robots.txt is owned by root and chmodded to 755 (testing with 777 gives the same error).

If I change Alias /robots.txt /home/robots.txt to Alias /robots.txt /home/myvirtualhost/public_html/robots.txt it works, confirming that the Alias rule is being applied.

I have another (older) server with this exact same setup (/home/robots.txt owned by root), and there it works for all virtual hosts and their users.

I wonder if myvirtualhost.com/robots.txt is being served as user myvirtualhost and therefore doesn't have access to /home/robots.txt, but I'm not sure, as I don't think my old server had any special permissions set up for this (I can't remember).

If I remove /home/robots.txt I still get 403 Forbidden rather than 404 Not Found.
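(Getting 403 even when the file is absent is consistent with Apache 2.4's default filesystem policy: the stock config denies access to the whole filesystem with `Require all denied` in `<Directory />`, and that access check fails before Apache ever looks for the file. A hedged sketch of one way to grant access to just this file, assuming Apache 2.4:)

```apache
# Sketch, assuming Apache 2.4 defaults (filesystem access denied globally).
# Grant access only to the shared robots.txt under /home, not all of /home.
<Directory "/home">
    <Files "robots.txt">
        Require all granted
    </Files>
</Directory>
```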

I have multiple virtual hosts, so I can't just change it to /home/myonevirtualhost/public_html/robots.txt.

Any help would be most appreciated.

Thanks

SOLVED:

Added this to the apache.conf file (note it needs Require all granted instead of Allow from all, as I'm running the latest version of Apache):

<Location "/robots.txt">
    SetHandler None
    Require all granted
</Location>
Alias /robots.txt /home/robots.txt

But I can't mark it as solved yet as I don't have enough points to solve within 8 hours :-)

1 Answer


Added this to the apache.conf file (note it needs Require all granted instead of Allow from all, as you're running the latest version of Apache):

<Location "/robots.txt">
    SetHandler None
    Require all granted
</Location>
Alias /robots.txt /home/robots.txt
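For context, a commented version of the same snippet (a sketch; `SetHandler None` clears any handler, such as a PHP handler, that an individual virtual host might otherwise apply to this path, and `Require all granted` is the Apache 2.4 replacement for the old `Order allow,deny` / `Allow from all` directives):

```apache
# Serve one shared /robots.txt for every virtual host.
<Location "/robots.txt">
    SetHandler None        # clear any handler (e.g. PHP) a vhost may have set
    Require all granted    # Apache 2.4 syntax; replaces "Allow from all"
</Location>
Alias /robots.txt /home/robots.txt
```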