
os: CentOS 7
nginx: 1.6.2
httpd: apache 2.4.6
cms: Drupal 7

After my server was compromised I removed everything from it, reinstalled the OS and software, and restored the data from backup. Now I am configuring all services with maximum security in mind.

After researching the access logs in detail, I decided to improve security by denying any requests for PHP files except index.php, which lives in the site document root.

The nginx access log contains a lot of records like:

azenv2.php az.php 

and

/*/wp-login.php /administrator/index.php /MyAdmin/index.php 

The first category is backdoors (one of them was used to hack my sites; somebody sent a huge amount of spam from my server).

The second is somebody probing for popular CMSes and utilities and trying login/password combinations like admin/123456.

My reasons for blocking both categories in nginx, by denying requests to PHP files, are:

  1. Even if somebody manages to upload a PHP shell, it will be impossible to use it.

  2. All these requests are 'not good' a priori, so refusing them in nginx spares Drupal (httpd + PHP + MySQL) from doing the work and wasting resources.
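Both goals can be met by relying on nginx's location matching order: an exact-match (`=`) location is selected before any regex location is evaluated, so a blanket regex deny for `.php` can coexist with an allowed `index.php`. A minimal sketch, assuming a local backend on a placeholder port:

```nginx
# Exact match wins over regex locations, so index.php is still served.
location = /index.php {
    proxy_pass http://127.0.0.1:8080;   # placeholder backend port
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
}

# Any other *.php request is refused by nginx itself,
# returning 403 without ever reaching Apache/PHP.
location ~ \.php$ {
    deny all;
}
```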

My current config for one virtual host:

server {
    listen <server-ip>;
    server_name <site-name>;

    location ~* /sites/default/files/styles/ {
        try_files $uri @imagestyles;
    }

    location @imagestyles {
        proxy_pass http://127.0.0.1:<port>;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        access_log off;
    }

    location ~* \.(jpg|jpeg|gif|png|ico|css|bmp|swf|js|pdf|zip|rar|mp3|flv|doc|xls)$ {
        root <site-documents-root>;
        access_log off;
    }

    location ~ (^|/)\. {
        deny all;
    }

    location / {
        proxy_pass http://127.0.0.1:<port>;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        access_log <path-to-log-folder>/nginx_access.log main;
    }
}

nginx.conf was not changed after installation.


UPDATE
Finally I created this config for the deny:

location ~ \.php$ {
    access_log /path/to/log/nginx_deny.log name_log;
    deny all;
}

and this config for proxy:

location = /index.php {
    proxy_pass http://127.0.0.1:<port>;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
}

location = /cron.php {
    proxy_pass http://127.0.0.1:<port>;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
}

location / {
    proxy_pass http://127.0.0.1:<port>;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
}

1) Full information about attack attempts is collected in the log.
2) The server does no extra work for bad requests.
3) Drupal cron can still run.
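Note that the deny location above references a named log format (`name_log`), which nginx requires to be defined with `log_format` in the `http {}` context before it can be used. A sketch of such a definition (the field selection here is an assumption; adjust to taste):

```nginx
# In the http {} context of nginx.conf: define the format referenced
# by the deny location's access_log directive ("name_log" is the
# placeholder name used above).
log_format name_log '$remote_addr - [$time_local] "$request" '
                    '$status "$http_user_agent"';
```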

  • A probably superfluous question: have you read the canonical serverfault.com/questions/218005/… Commented Dec 8, 2014 at 14:22
  • I have read a lot about security during the last two weeks :) I have no problem accepting the fact of the compromise; all actions are already done: new OS, new software, and a thorough cleanup of the site files. Now I need to protect the new server as much as possible. ssh, SELinux, httpd, and PHP seem secure; a stronger nginx configuration can improve security further. Commented Dec 8, 2014 at 14:46
  • Anyone able to give me feedback on the following? While what OP is doing is not too logical, I'd like to know if this would do what he wants: location ~ \.php$ { deny all } location \index.php { proxy_pass whatever } Commented Dec 8, 2014 at 14:46
  • @Peter unfortunately that would result in the denial of the request to index.php also, as the location ~\.php$ { deny all } directive would also match index.php Commented Dec 8, 2014 at 15:27
  • What exactly does which is in the site documents root for improving security mean? Where was it before? There's also no php handling at all in the above config - so not very clear. Commented Dec 8, 2014 at 15:30

1 Answer


You can achieve this in a number of ways.

Integrating quite directly with what you have in your config file, you may wish to simply include a section such as the following:

location ~ \.php$ {
    try_files $uri @error;
    fastcgi_pass ...;
    fastcgi_param SCRIPT_FILENAME /path/to$fastcgi_script_name;
    ...
}

location @error {
    [config of however you want to handle errors]
}

This will check for the existence of the requested file before allowing access/execution.
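One way to fill in the `@error` handler, as a sketch (returning 404 and the log path are assumptions; you might prefer 403, or no separate log at all):

```nginx
location @error {
    # Log probe attempts to a dedicated file, then answer with
    # a plain 404 so scanners learn nothing about what exists.
    access_log /var/log/nginx/php_denied.log;   # placeholder path
    return 404;
}
```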

Further to the above, however, I would personally recommend using fail2ban, which will provide more comprehensive security if configured correctly. You can set it to monitor your access logs in real time and ban IPs from accessing your server(s) by automatically creating iptables rules on the fly, with ban times that you specify.
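A minimal fail2ban jail for this scenario might look like the following sketch. The filter name `nginx-php-probe` and all paths are assumptions: you would need to create a matching filter file containing a failregex for the denied `.php` requests in your nginx log.

```ini
# /etc/fail2ban/jail.local (sketch; names and paths are placeholders)
[nginx-php-probe]
enabled  = true
port     = http,https
filter   = nginx-php-probe        # custom filter, must be created separately
logpath  = /var/log/nginx/access.log
maxretry = 3
bantime  = 86400                  # ban offenders for one day
```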

Personally I have my servers configured to use fail2ban with nginx as per this article (or at least based upon that - you may alter it as you wish).

  • fail2ban won't stop someone from exploiting a vulnerability in some web portal (like Drupal, for instance). Commented Dec 8, 2014 at 19:23
  • Of course not, but that is neither what was requested by OP nor what I detailed in my response. It is however a point worth noting - known vulnerabilities in packages should be tracked and software updated accordingly. Commented Dec 8, 2014 at 22:10
  • I tried your advice and got <invalid number of arguments in "try_files"> after reload. Commented Dec 21, 2014 at 20:54
  • Alright, it would seem that try_files requires at least two arguments. I would suggest setting a second argument pointing to an error location if you wish (answer updated to reflect this change above). Commented Dec 22, 2014 at 9:23
