I recently moved my CentOS 7 machine from an Apache web server to an nginx web server, and I'm still working out various issues as I learn the ins and outs of nginx. One issue I have stumbled upon is that text files, such as robots.txt, appear to be inaccessible in my public web root directory.
My server block configuration is as follows:
    server {
        listen 80;
        server_name example.com;

        root /var/www/example.com/public_html;
        index index.php;

        access_log /var/www/example.com/logs/example.com_access.log;
        error_log /var/www/example.com/logs/example.com_error.log error;

        location / {
            index index.php index.html;
        }

        location /reports {
            autoindex on;
        }

        location ~ \.php$ {
            try_files $uri =404;
            fastcgi_pass unix:/var/run/php-fpm/php-fpm.sock;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            include fastcgi_params;
        }

        error_page 404 /var/www/html/404.html;
        error_page 500 502 503 504 /var/www/html/50x.html;

        location ~* \.(js|css|png|jpg|jpeg|gif|ico)$ {
            expires max;
            log_not_found off;
        }

        location /robots.txt {
            #access_log off;
            #log_not_found off;
        }

        location ~ /\.ht {
            deny all;
        }
    }

I am confused about why plain text files would be inaccessible. My first thought was that the PHP handling was somehow interfering, but I couldn't determine what might be blocking access to the file.
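My understanding from the nginx documentation is that an empty location block like my `location /robots.txt` one should simply fall through to serving the static file from `root`. For comparison, here is a sketch of a more explicit handler for the file; the directives are standard nginx, but this exact block is an assumption on my part, not something I've confirmed fixes anything:

```nginx
# Hypothetical explicit match for robots.txt (exact-match "=" location).
# try_files serves the file from $document_root if it exists, else 404.
location = /robots.txt {
    try_files $uri =404;
    log_not_found off;
    access_log off;
}
```

If the file is still inaccessible with a block like this, I'd assume the problem is outside the location matching entirely (permissions or labeling rather than configuration).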
I also can't check my log files (yet), because this server block doesn't seem to be writing to my logs, but that's covered in another question.
What could be going on here?
Note: my site files live under /var/www/ in order to get SELinux to (more easily) recognize them as web server directories. This was an issue in the past, but as far as I know, SELinux should now be labeling the directory correctly.
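In case the labeling is the culprit after all, these are the standard CentOS 7 commands I believe would verify and repair it (the path is taken from my config above):

```shell
# Show the SELinux context of the file nginx is trying to serve;
# for web content the type should be httpd_sys_content_t.
ls -Z /var/www/example.com/public_html/robots.txt

# If the type is wrong, reapply the default file contexts.
# Needs root; -R recurses, -v prints each relabeled file.
restorecon -Rv /var/www/example.com/public_html
```

Is there anything else in the configuration above that would block plain text files, or should I be digging further into SELinux?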